Export/import as FDF, supported with Reader

There are two types of buttons: one is for export and the other is for import as FDF. How can I get these supported in Reader?
Would you help me?

Hi,
greenlnd34 wrote:
I wish to save the file as FDF to local disk with Reader. Is it possible?
thanks
Technically it is possible, but not with the Reader application alone.
As George Johnson has already stated, this requires the Adobe LiveCycle Reader Extensions product in order to be able to export FDF from the Reader application.
If you investigate the documentation at
http://livedocs.adobe.com/acrobat_sdk/10/Acrobat10_HTMLHelp/wwhelp/wwhimpl/js/html/wwhelp.htm?&accessible=true
under the API name there are 4 boxes:
Box 1 == 5.0 (this means that this API was added in Acrobat 5.0)
Box 2 == BLANK (this means that this API does not affect the state of the document)
Box 3 == S (this means that this API may only be available at certain times, to avoid possible problems)
Box 4 == F (this means that this API "Requires forms rights", and these rights can only be provided by the LC Reader Extensions product)
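As a minimal sketch, this is roughly what the export call looks like in Acrobat JavaScript, for example from a form button's Mouse Up action (the options shown are illustrative; in Reader the call only succeeds if the document has been Reader-extended with forms rights, as box 4 indicates):
// export the form data of the current document as FDF; with no cPath given,
// the viewer prompts the user for where to save the file
this.exportAsFDF({
    bAllFields: true,     // include fields even if they have no value
    bAnnotations: false   // leave annotations out of the FDF
});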
Hope this helps your understanding
Malcolm

Similar Messages

  • Problems Export/Import on Windows x64 with SQL2005 x64

    Hi All:
    After a successful export with R3Load of an ECC 5.00 system (kernel 6.40, patch level 171), we imported the data into SQL 2005 SP1. Although the import into the target database was also successful, we have some inconsistencies in several DYNPROs; we get errors like these when we run several transactions:
    BZW Error 2 during compression/decompression ( UNPAC)
    AB0 Run-time error "DYNPRO_SYNTAX_ERROR" occurred
    and in the trace files we found:
    Fri Feb 23 16:12:35 2007
    ***LOG BZW=> error 2          on compression/decompression using UNPACK     [dbdynp#1 @ 842] [dbdynp  0842 ]
    ERROR => Involved dynpro: SAPMF02D                                 - 0101
    dbdynp.c     842]
    ERROR => DY-SRC_READ BUFFER: any error 4 [dgdynp.c     1225]
    It seems the DYNPROSOURCE table contains bad entries, even though the export/import procedure was successful; we also executed "Tools for MS SQL" successfully.
    Any help will be greatly appreciated...!!!
    Regards,
    Federico

    Hi Federico
    It seems there is some problem with the kernel. I would suggest a solution: copy <putdir>\exe to $(DIR_EXE_ROOT)\$(OS_UNICODE)\NTAMD64 and repeat the phase.
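    A sketch of that copy as a Windows command, keeping the placeholders from the suggestion above (substitute the resolved put directory and kernel directory for your system before running it):
    xcopy "<putdir>\exe" "$(DIR_EXE_ROOT)\$(OS_UNICODE)\NTAMD64" /E /I /Y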
    Let me know if it works.
    Also check the instance number; it should not be more than 20 for Windows.
    Reward points

  • How to Export/Import/Create a VM with an existing OS-Drive.

    How do I do the same things that these PowerShell commands do with the REST-based API?
    Export-AzureVM
    Remove-AzureVM
    Import-AzureVM
    New-AzureVM
    Eventually I want to be able to export and import Virtual machines in a similar way to this:
    Export and Remove machine
    Export-AzureVM -ServiceName <ServiceName>  -Name <Name> -Path <FileName>
    Remove-AzureVM -ServiceName <ServiceName>  -Name <Name>
    Import Machine
    Import-AzureVM -Path <FileName> | New-AzureVM -ServiceName <ServiceName> -Location <Location>

    No, still not enough information.
    What I need is a direct mapping between PowerShell commands and REST API calls.
    Something like this:
    Export-AzureVM -ServiceName <ServiceName>  -Name <Name> -Path <FileName>
    MAPS TO
    the REST API call and the XML to go with it (with the parameters from the PowerShell command clearly marked in the XML).
    I thought the intention of the beta/preview was to have users/developers test the stuff. But if you want users to do it without any help or documentation, you can't really be serious about it. And having to use a tool like Reflector (which I would nowadays have to buy) to try to reverse engineer Microsoft code to figure out how it is done is just so wrong in so many ways.
    If you guys at Microsoft don't get your shit together very soon, then developers will start using documented platforms where stuff actually works.

  • Official oracle stance on portal 9.0.2 migrate/export/import

    I have been working with Oracle Support, scouring the forums and Deja and Metalink, applying patches, standing on my head... trying to export my 9.0.2 Portal applications, reports, pages, and such from a 9.0.2 implementation into an exactly identical 9.0.2 implementation. I have run into issues with the reports, issues with the pages, corrupt objects that can't be deleted, most everything stated on this forum, plus more. I have asked Oracle time and again when the migrate scripts will be ready [the FAQ still posted at Technet and PortalCenter STILL says the scripts will be available August/September] and have gotten no meaningful response. Surely Oracle cannot pretend that this export/import works, and I NEED A REPLY FROM ORACLE AS TO THEIR OFFICIAL STANCE on this and as to when we can expect a working solution. I am going to have to recommend against Oracle Portal because we cannot get from our development environment into a testing one.

    Thanks... I do hope they can give us some expectation of when this will be ready for primetime. I couldn't get my pages across because I get a "region not found" error, so I tried just to import my provider applications that include rwreps with parameters. I figured worst case I could manually recreate the pages. The export indicates no errors and the import check mode indicates no errors, but the actual import gives me a warning on every portlet parameter, and none of my reports come across:
    Sample warning I receive:
    Importing portlet preferences ..........
    Importing category/perspective info ..........
    ---------- Before Importing Pronto Tables ----------
    *** Warning: no corresponding portlet parameter ID found on target for 295 for BUSINESS_UNIT
    So I applied patch 2617359 per Oracle Support, but no luck. Apparently this is another bug being worked on by support [Bug:2644937 Abstract: IMPORT:PAGE:PRONTO:REPORT:SECURITY:PORTLET PARAMETER ID NOT FOUND IN LOG], but in the meantime I face the prospect of having to manually recreate 40-plus reports on each instance. This may or may not be related to bug 2426089 from version 3.0.9.x [Bug 2426089 Abstract: PARAMETERS ARE NOT VISIBLE AFTER REGISTERING THE REPORTS ON PORTAL]. I am still trying to come up with some way to get my reports across. I'm going to try exporting/importing just a report with nothing else and see what happens.

  • Export/import oracle9

    Hi,
    Since I do not have good knowledge of Oracle, hopefully somebody can help me understand better how export/import works (Oracle 9).
    A colleague has run an export from a production server; now he needs to run the import on a test server, but it fails.
    Could someone be so kind as to let me know what I have to take into consideration before running both commands?
    Let me explain better:
    I have noticed that the DB_BLOCK_SIZE is indeed different on the two servers (the production server has DB_BLOCK_SIZE=4096, the test server has DB_BLOCK_SIZE=8192),
    so the first thing I have seen in the import log file is an error message saying that "tablespace block size 4096 does not match configured block sizes".
    So my first question would be: is there any chance to run export/import on two servers with different DB_BLOCK_SIZE?
    Then (due to my ignorance of the subject):
    - Is there any problem if the SID is different on the two servers, or is it not important?
    - Is it a problem that the path of the *.dbf files is different on the two servers, or is this not taken into consideration at all (since only 'logical' data are considered)?
    - Do the tablespaces currently used on the production server have to exist on the test server in order to have a successful import?
    Hopefully somebody has a few minutes to answer my questions.
    Thanks a lot in advance.

    Hi,
    You are talking about the FULL export/import method, right? Well, Oracle 9i supports multiple block sizes in the same database. So it gives you the option of using multiple block sizes, which is especially useful when you are transporting tablespaces from another database with a different block size. In this case, it will be necessary to set the initialization parameter DB_nK_CACHE_SIZE in the initialization parameter file. For example, in your case, if your standard block size is 8 KB, you will need to pre-create a tablespace with the 4 KB block size that you are currently using on your production server, and then you must set the DB_4K_CACHE_SIZE parameter on your test server. The DB_nK_CACHE_SIZE parameter is dynamic, so you can alter its value using the ALTER SYSTEM statement.
    >>- Is there any problem if the SID is different on the two servers, or is it not important?
    No problem ...
    >>- Is it a problem that the path of the *.dbf files is different on the two servers, or is this not taken into consideration at all (since only 'logical' data are considered)?
    In fact, it is important that the tablespace exists in the target database, but it doesn't matter where the datafiles are located. In this case, only logical data are taken into consideration.
    >>Do the tablespaces currently used on the production server have to exist on the test server in order to have a successful import?
    Yes. For the full import method, this is very important. So you will need to pre-create the tablespaces in your target database, but in this case they need to be created with a 4K block size, as I said before. Anyway, you will need to run some tests.
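    For example, on the 8K test database the pre-creation could look roughly like this (the tablespace name, datafile path, and sizes are illustrative only):
    -- allocate a buffer cache for the non-standard 4K block size (dynamic parameter)
    ALTER SYSTEM SET DB_4K_CACHE_SIZE = 16M;
    -- pre-create the tablespace with the 4K block size used on production
    CREATE TABLESPACE tbs_prod_4k DATAFILE '/u01/oradata/test/tbs_prod_4k01.dbf' SIZE 100M BLOCKSIZE 4K;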
    Otherwise, if you want to run a test, create a database on your test server using the 8K default block size and use the user export method instead.
    E.g.:
    If you have users A, B, and C using tablespace TBS_1 in your prod database, create the tablespace TBS_1 using the default block size (8K) in your test database and run the test below:
    PROD DATABASE
    exp a/pass file=a grants=n
    exp b/pass file=b grants=n
    exp c/pass file=c grants=n
    TEST DATABASE
    Create the users A, B, and C and perform these tasks below:
    imp a/pass file=a full=y
    imp b/pass file=b full=y
    imp c/pass file=c full=y
    In addition, please give feedback ....
    Cheers
    Legatti

  • EXPORT/IMPORT DB PROVIDER NT-- SOLARIS IN PORTAL RELEASE 2

    Hi,
    I've been assigned to export/import a database provider, with all the applications contained therein, from the NT platform to the Sun Solaris platform. Can anyone please tell me how to do this, or where to get a step-by-step guide for this task?
    Thanks.

    Hi,
    I am using Portal 9.0.4 on Windows 2003 Server. I am searching for the step-by-step import/export details. You said you have them; can you share them with me?
    My email id is [email protected]. I am in search of the export/import commands. I am able to export through the transport set, but while the import shows the importing details, nothing is available afterwards. Please help me.
    Regards,
    Sudhir

  • Exporting/importing Responsibility-level personalizations

    Hello,
    Suppose on one instance I have done some personalizations for responsibility A, and I want to apply these personalizations on another instance, but there is no responsibility A on that instance; instead its name is B.
    Does the export/import mechanism have support for this kind of requirement?
    - Yora

    Yora,
    >>Does the export/import mechanism have support for this kind of requirement?
    No, it doesn't have any functionality like this.
    For much more information on this, refer to http://apps2fusion.com/apps/14-fwk/215-move-oa-framework-personalizations-from-one-environment-to-another
    //Note Taken from Above Link -- Anil Passi
    You have done responsibility-level personalization for OAF pages, for this responsibility.
    Issue: When you extract the personalizations, the directory path of the extracted personalizations will contain the responsibility_id (for example, 1032 on the source instance).
    However, the same responsibility, when created on the TEST SYSTEM, might be allocated a responsibility_id of 1088.
    Hence, referring to ***** in the notes above, you will have to rename the directory from 1032 to 1088. This can be scripted in Unix too.
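    For example, a minimal Unix sketch of that rename, run from the root of the extracted personalizations (the exact directory layout depends on the page, so the find pattern is illustrative):
    # rename the extracted responsibility_id directory to match the target instance
    find . -type d -name '1032' -execdir mv {} 1088 \;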
    To overcome this issue: if you wish to perform responsibility-level personalizations against a custom responsibility, first create this responsibility on production and get it cloned to the other environments. Alas, such forward planning rarely happens in projects.
    Blame this on Oracle's design. They could have easily used RESPONSIBILITY_KEY/APPLICATION_ID in the path instead of RESPONSIBILITY_ID.
    Regards,
    Gyan

  • Export/Import Process in the UI for Variations Content Translation is Generating CMP Files with No XML

    We have a SharePoint 2010 publishing website that uses variations to deliver content in multiple languages. We are using a third-party translation company to translate publishing pages. The pages are exported using the export/import UI process described here: "http://blogs.technet.com/b/stefan_gossner/archive/2011/12/02/sharepoint-variations-the-complete-guide-part-16-translation-support.aspx".
    Certain sub-sites are extremely content-intensive. They may contain many items in the Pages library as well as lists and other sub-sites.
    For some sub-sites (not all), the exported CMP file contains no XML files. There should be a Manifest.XML, Requirements.XML, ExportSettings.XML, etc., but there are none. After renaming the CMP file to CAB and extracting it, the only files it contains are DAT files.
    The only difference I can see between the sub-sites that generate CMP files with no XML files and those that do not is size. For example, there is one site that is 114 MB that produces a CMP file with no XML files. Small sites do not have this problem. If size is the problem, then I would think the process would generate an error instead of creating a single CMP file that contains only DAT files. However, I do not know exactly what the Export/Import Process in the UI is doing.
    This leads to two questions:
    1. Does anyone know why some CMP files, when renamed to *.CAB and extracted, would not contain the necessary XML files?
    2. If exporting using the UI will not work, can I use PowerShell? I have tried Export-SPWeb, but the Manifest.XML does not contain translatable content. I have not found any parameters that I can use with Export-SPWeb to cause the exported CMP to be in the same format as the one produced by the Export/Import process in the UI (see the sketch just below).
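    As a reference point, this is roughly the kind of Export-SPWeb command meant above; the site URL, item URL, and file name are only placeholders, not our real values:
    Export-SPWeb -Identity "http://portal/variations/en" -ItemUrl "/variations/en/Pages" -Path "D:\Exports\en-pages.cmp" -IncludeVersions All -IncludeUserSecurity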
    As a next step, we could try developing custom code using the Publishing Service, but before doing this, I would like to understand why the Export/Import process in the UI generates a CMP that contains no XML files.
    If no one can answer this question, I would appreciate just some general help on understanding exactly what is happening with the Export/Import Process -- that is, the one that runs when you select the export or import option in the Site Manager drop down. Understanding what it is actually doing will help us troubleshoot why there are no XML files in certain export CMPs and assist with determining an alternate approach.
    Thanks in advance
    Kim Ryan, SharePoint Consultant kim.ryan@[no spam]pa-tech.com

    I wanted to bump this post to see about getting some more responses to your problem. I'm running into the same problem as well. We're running a SharePoint 2010 site and are looking at adding variations now. The two subsites with the most content take a while to generate the .cmp file (one to two minutes of the browser loading bar spinning waiting on the file). Both files are generated with a lot of .dat files but no .xml files. I was thinking like you that it must be a size issue. Not sure though. Did you ever happen to find a solution to this problem?

  • Is it possible to merge a .fdf file with the IOS application for Adobe Reader?

    I receive .fdf files via email that merge with a PDF template on my PC. I would like to do this on my iPad but can't see any way that this would be possible. Does anyone know how this can be done? If not, is Adobe looking at making this function possible in the future?

    wildcat24,
    Currently, Adobe Reader mobile products do not support import of FDF files.
    Sorry, I am not able to speculate on features in future releases.

  • Need help with database migration using export/import

    Hi,
    I am planning to do a database migration from 8.0.6 to 10.2.0.4 using export/import. Please answer some of my queries:
    1) Do I need to do a full export using the SYS/SYSTEM user?
    exp80 system/manager file=c:\full.dmp log=c:\full.txt full=y consistent=y
    2) Will there be any data corruption during the export?
    3) Is export/import the only method of migration/upgrade?
    Please help
    Thanks.

    I'll answer the specifics that I know about your questions, but please look at the post from Srini for additional info:
    >>1) Do I need to do a full export using the SYS/SYSTEM user?
    >>exp80 system/manager file=c:\full.dmp log=c:\full.txt full=y consistent=y
    A full export will move as much information (metadata/data) as possible. In order to do a full export/import, you need to do this from a privileged account. A privileged account is one with EXP_FULL_DATABASE for export and IMP_FULL_DATABASE for import.
    >>2) Will there be any data corruption during the export?
    The data in the objects that you are exporting is only being read; nothing is written to user data. I'm not sure whether this is available in 8.0.6, but if you need a consistent export, check whether consistent=y is available in 8.0.6. All this means is that the dump file created will have consistent data in it.
    >>3) Is export/import the only method of migration/upgrade?
    If you are changing hardware, the only supported way from 8.0 to 10.2.0.4 that I know of is using exp/imp.
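    For completeness, the matching import on the 10.2.0.4 side could look roughly like this (the file and log names follow the export example above and are illustrative):
    imp system/manager file=c:\full.dmp log=c:\imp_full.txt full=y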
    Hope this helps.
    Dean

  • Firefox can't read any Bookmark that was imported from my PC with file extension .url. Safari reads them fine. Is there a fix, so I can use Firefox instead of Safari? Many thanks if so. I have the latest version of Firefox

    Firefox can't read any Bookmark on my Mac that was imported from my PC with file extension .url. Safari reads them all fine. Is there a fix, so I can use Firefox instead of Safari? Many thanks if so. I have the latest version of Firefox
    == URL of affected sites ==
    http://anysite.url
    == User Agent ==
    Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; en-us) AppleWebKit/531.22.7 (KHTML, like Gecko) Version/4.0.5 Safari/531.22.7

    Hello JF.
    I don't think that extension is supported. I believe Firefox can only read .json and .html.
    You may want to read this though:
    [http://support.mozilla.com/en-US/kb/Importing+bookmarks+and+other+data+from+Safari Importing bookmarks and other data from Safari]

  • Problem with EXPORT IMPORT PROCESS in ApEx 3.1

    Hi all:
    I'm having a problem with the EXPORT IMPORT PROCESS in ApEx 3.1
    When I export an application and try to import it again, I get this error message:
    ORA-20001: GET_BLOCK Error. ORA-20001: Execution of the statement was unsuccessful. ORA-06550: line 16, column 28: PLS-00103: Encountered the symbol "牃慥整㈰㈯⼴〲㐰〠㨷㐵㈺′䵐" when expecting one of the following: ( - + case mod new not null <an identifier> <a double-quoted delimited-identifier> <a bind variable> avg count current exists max min prior sql stddev sum variance execute forall merge time timestamp in
    As a workaround, I checked the exported file and found this:
    wwv_flow_api.create_flow
    p_documentation_banner=> '牃慥整⠤㈰㈯⼴〲㠰〠㨷㠵㈺′äµ
    And when I replace it with this
    p_documentation_banner=> ' ',
    I can import the application without the error.
    Does somebody know why I have to do this?
    Thank you all.
    Nicolas.

    Hi,
    This issue seems to have been around for a while:
    Re: Error importing file
    I've had similar issues and made manual changes to the file to get it to install correctly. In my case, I got:
    ORA-20001: GET_BLOCK Error. ORA-20001: Execution of the statement was unsuccessful.
    ORA-02047: cannot join the distributed transaction in progress
    begin execute immediate 'alter session set nls_numeric_characters='''||wwv_flow_api.g_nls_numeric_chars||'''';end;
    There are several suggestions, if you follow that thread, about character sets or reviewing some of the line breaks within PL/SQL code within your processes, etc. Not sure what would work for you.

  • How to find EXPORT MODE of MEMORY ID with ref to IMPORT  MODE of MEMORY ID

    Hi friends, I am not able to find the EXPORT memory ID corresponding to an available IMPORT memory ID in a particular SD routine of a program.
    Please let me know how to find the program that EXPORTs to the memory ID used by the available IMPORT, as I need to make changes to an existing routine written by somebody else.

    As it's an SD routine, probably the simplest way is to set a breakpoint on the keyword 'EXPORT' when you first enter the transaction. You may need to skip a few, but eventually you will come across the correct one. When you do find it, I would also suggest placing a comment on the IMPORT statement to say where the value has come from, for future reference.
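    For orientation, the statement pair being traced looks roughly like this (the variable and memory ID are purely illustrative; the routine you are debugging will use its own names):
    DATA lv_value TYPE vbeln_va.
    " writing side - the EXPORT you set the breakpoint on
    EXPORT lv_value FROM lv_value TO MEMORY ID 'ZSD_ROUTINE'.
    " reading side - the IMPORT in the routine that needs changing
    IMPORT lv_value TO lv_value FROM MEMORY ID 'ZSD_ROUTINE'.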

  • Problem with the Export/Import Utility when importing a .portal file

    Hi there !
    I got the following error when trying to import a .portal file to a desktop by using the Export/Import Utility.
    <13-11-2009 12:13:26 PM CLST> <Error> <netuix> <BEA-423487> <Import: did not add navigable [2203] as it would cause an illegal dependency.>
    The error is shown several times, starting from the number [2203] until [2285].
    Because of this, I can't see some pages of my portal... it's like they don't exist.
    I hope you can help me fix it, because I have not found any information about it.
    Best regards
    P.S.: I am using BEA WebLogic version 10.0.1.
    Edited by: user12231353 on 16-nov-2009 12:38

    I upgraded to CS6 and imported all the images together; I made them 3 frames each and exported them. The problem is slightly improved but still there. I'm still getting flicker.
    Check it out: http://www.youtube.com/watch?v=g_yZjskzTLs
    Black screen version from yesterday if you want to look at it: http://www.youtube.com/watch?v=NCcAEO8YU6Y
    The problem has to be with my settings. When I upgraded, it deleted all my preferences and settings. There was no DSLR preset either, so I had to redo it all from scratch to the best of my ability. Here is a complete rundown of my settings:
    I shoot with a Canon 60D DSLR in 24 fps and upload all my stuff in 1080p. Please advise.

  • Portal Export Import with DB on different OS

    Hi All,
    I have a quick question about exporting the OracleAS Portal Metadata Repository database. Our source DB is on a different OS to our target DB.
    We are going to apply the 10.1.2.3 patch and associated portal export/import patches to both nodes (from note 263995.1).
    Our understanding was that we would use the method described in Chapter 10 of the Portal 10.1.4 Admin Guide (http://download.oracle.com/docs/cd/B14099_18/portal.1014/b19305/cg_imex.htm).
    However, according to metalink note#276620.1, because our source and target OS are different, we should use the database export/import methodology that is described in Chapter 11 of the Application Server 10.1.2 Admin Guide (http://download.oracle.com/docs/cd/B14099_18/core.1012/b13995/prodtest.htm).
    Could anyone tell me which way is better? We would prefer to use the database export/import method; has anyone tried this, and what were their experiences?
    Regards,
    Tim.

    kirwantd wrote:
    We are going to apply the 10.1.2.3 patch and associated portal export/import patches to both nodes (from note 263995.1).
    Our understanding was that we would use the method described in Chapter 10 of the Portal 10.1.4 Admin Guide (http://download.oracle.com/docs/cd/B14099_18/portal.1014/b19305/cg_imex.htm).
    However, according to metalink note #276620.1, because our source and target OS are different, we should use the database export/import methodology that is described in Chapter 11 of the Application Server 10.1.2 Admin Guide (http://download.oracle.com/docs/cd/B14099_18/core.1012/b13995/prodtest.htm).
    Could anyone tell me which way is better? We would prefer to use the database export/import method; has anyone tried this, and what were their experiences?
    Regards,
    Tim.
    As far as the Metalink note you referred to is concerned, it gives you three options to move your repository with if you are working with 10.1.4 Portal.
    It's a difference of can, should, and must:
    - It only shows the ability to use three options (can);
    - on the whole, it does not give preference to any method (should);
    - nor does it put a requirement on any single option (must).
    (The only requirement/prohibition is for option 2, which is not supported.)
    We have used the first three methods given in the note (yes, we also tried 2; this note came later). Method 1 is what we use regularly for transports between environments, and method 3 we use for environment staging (as given in chapter 11 of part 3). Both can be messy with a large repository (ours is about ~400 GB), but following the supportability guide (Metalink Note 333867.1), method 1 works too.
    Hope that helps!
    AMN
