Export/Import API workaround?

Hi
I want to write a PL/SQL procedure to run an export and import (not Data Pump) of a table!
It appears that such a thing doesn't exist, or I don't know of it. Can anybody help me find it or show me a similar workaround? Thanks and regards
hqt200475

Hello,
ORA-39139: Data Pump does not support XMLSchema objects.
Thanks for your feedback. I understand better why you keep on using the original Export/Import.
Still, as the XML objects are located in a specific schema, it's possible to exclude them from the Data Pump export as follows on the command line:
EXCLUDE=SCHEMA:"='<SCHEMA_NAME>'"
NB: It's better to use a PARFILE so as to avoid syntax errors on the command line.
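For example, a minimal PARFILE for a full export that skips the XML schema's owner could look like the sketch below. The directory object, file names, and schema name are placeholders; adjust them to your environment:

```text
# invoked as: expdp system/<password> PARFILE=full_no_xml.par
DIRECTORY=DATA_PUMP_DIR
DUMPFILE=full_no_xml.dmp
LOGFILE=full_no_xml.log
FULL=Y
EXCLUDE=SCHEMA:"='<SCHEMA_NAME>'"
```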
Or by using the DBMS_DATAPUMP.METADATA_FILTER Procedure.
Then, you are right: you'll have to use the original Export/Import to export the XML schema.
It's just a suggestion.
Hope this helps.
Best regards,
Jean-Valentin

Similar Messages

  • Programming export/import of XMP

    Hi folks,
    I'm still very new to this (just bought a book on Lua) but I want to create something that exports the XMP files (so that I can edit them on the road) and imports them back to Lightroom. I've been reading the API reference and SDK manual lately and had a feeling that this is not possible, but since I've seen some workarounds to achieve certain goals in this forum, I thought it was still worth asking.
    So, assuming that this is not possible, here are a few of my ideas:
    Create some keyboard macros to automate the XMP export/import process. It'd be nice and sweet if it were some simple application I'm dealing with, but with a complex app like Lr there are too many unpredictables, and since it's all valuable data maybe this is not a good idea.
    Create a plug-in that does nothing but generate XMP files using the post-processing part, and then import the edited XMP manually. In this case, though, I wonder if it's even possible to create a plug-in that will not export photos but only performs post-processing.
    Would anyone please shed some light?
    Thanks.
    Nick.

    Hi Rob,
    To be honest, I just Googled your name and found your web site that contains plug-ins that do things similar to what I wanted to do. This is nice to know, because I just wanted to know if there's a way to go around this -- I didn't want to waste my time writing code that wouldn't eventually give me what I wanted, thus the intention to ask the question in the first place. ;-)
    I have some development background dealing with XML version control and conflict management, so merging online vs. offline content is okay for me. Since all I want to do at the moment is edit keywords and titles (or any other description fields), maybe I'll just create some CRC from the online files, and write any changed values in the offline file back to the online file.
    How I plan to edit the XMP files is to create an XSLT stylesheet that converts XMP files into a spreadsheet so I can edit on the road. If it's too much work, I might just create a PHP script that sits on my home PC and hosts an XMP web edit interface so I can edit them from any device with a web browser -- with this method I might not even have to deal with XMP versioning, since I'll probably have the PHP script read the XMP and write changes in real time.
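    The CRC idea can be sketched quickly. This is an illustrative Python sketch (not Lua/Lightroom code), and the XMP metadata snippets are made up:

```python
import zlib

def xmp_crc(xmp_text: str) -> int:
    """Checksum of an XMP sidecar's content, used to detect offline edits."""
    return zlib.crc32(xmp_text.encode("utf-8"))

# Hypothetical before/after metadata values:
online = "<dc:title>Sunset</dc:title>"
offline = "<dc:title>Sunset over the bay</dc:title>"

# A differing CRC flags the offline copy as edited on the road,
# so its changed fields should be written back to the online file.
needs_merge = xmp_crc(online) != xmp_crc(offline)
print(needs_merge)  # → True
```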
    The mortifying part is, as you pointed out already, that using a keyboard macro isn't really elegant, but I guess this is the only way to automate this kind of stuff...
    Thanks.
    Nick.

  • Error when exporting/importing

    I am using MDM 7.1
    When I export the schema from my Dev system to import into QA I get the following Error:
    "This repository requires additional steps before transport. See the MDS log for details."
    In the log, my issue is that I am trying to export an Assignment that includes an expression that uses "look-ups".
    In my Dev system I removed the expression to confirm that this is the issue; once I no longer have expressions with look-ups, it allows me to export the schema. I then tried to import it to QA (since the expressions are not changing, I planned on excluding them from the import as a temporary workaround).
    However, I get the same error message when trying to import. It seems that I cannot export or import with a system that has an assignment with an expression that uses look-ups.
    Is there some config I am missing?

    Hi Brad,
    Assignments/validations are a general problem when it comes to schema exports/imports! What you can do, in case there are not too many assignments, is to delete the assignments and create them anew (manually) after you have imported the schema.
    Hope this helps a little.
    Regards,
    Erdal

  • Problem with EXPORT IMPORT PROCESS in ApEx 3.1

    Hi all:
    I'm having a problem with the EXPORT IMPORT PROCESS in ApEx 3.1
    When I export an application and try to import it again, I get this error message:
    ORA-20001: GET_BLOCK Error. ORA-20001: Execution of the statement was unsuccessful. ORA-06550: line 16, column 28: PLS-00103: Encountered the symbol "牃慥整㈰㈯⼴〲㐰〠㨷㐵㈺′䵐" when expecting one of the following: ( - + case mod new not null <an identifier> <a double-quoted delimited-identifier> <a bind variable> avg count current exists max min prior sql stddev sum variance execute forall merge time timestamp in
    As a workaround, I checked the exported file and found this:
    wwv_flow_api.create_flow
    p_documentation_banner=> '牃慥整⠤㈰㈯⼴〲㠰〠㨷㠵㈺′äµ
    And when I replace with this
    p_documentation_banner=> ' ',
    I can import the application without the error.
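    That manual edit can also be scripted. The following Python sketch is based only on the snippet quoted above, so the pattern may need adjusting for real export files:

```python
import re

def blank_documentation_banner(sql_text: str) -> str:
    """Replace any p_documentation_banner value with a blank string,
    mirroring the manual fix that lets the import succeed."""
    return re.sub(
        r"p_documentation_banner\s*=>\s*'[^']*'",
        "p_documentation_banner=> ' '",
        sql_text,
    )

# Hypothetical garbled line from an exported application file:
sample = "p_documentation_banner=> '牃慥整⠤㈰㈯⼴〲㠰〠㨷㠵㈺′äµ',"
print(blank_documentation_banner(sample))
# → p_documentation_banner=> ' ',
```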
    Does somebody know why I have to do this?
    Thank you all.
    Nicolas.

    Hi,
    This issue seems to have been around for a while:
    Re: Error importing file
    I've had similar issues and made manual changes to the file to get it to install correctly. In my case, I got:
    ORA-20001: GET_BLOCK Error. ORA-20001: Execution of the statement was unsuccessful.
    ORA-02047: cannot join the distributed transaction in progress
    begin execute immediate 'alter session set nls_numeric_characters='''||wwv_flow_api.g_nls_numeric_chars||'''';end;
    There are several suggestions, if you follow that thread, about character sets or reviewing some of the line breaks within PL/SQL code within your processes etc. Not sure what would work for you.

  • Export/Import Project still buggy in v2.1.3

    After seeing the update notes for v2.1.3 and noticing references to project importing, I started to get excited that maybe the bugs have been ironed out. Unfortunately my longstanding bug still remains.
    If anyone can think of any workarounds I'd be extremely grateful...
    I have a 2.16GHz Intel Core Duo MacBook Pro with 2GB RAM, running OS X 10.5.6 and now Aperture 2.1.3
    I have a project with many images, arranged into stacks, a pick in each stack with many adjustments, and a 62-page book created from these picks. There are also some smart albums and a light-table.
    Now I export the project (i.e. in the Project Inspector I right-click the project and choose Export > Project) to a file on the desktop and rename the existing project in Aperture. I now import the saved .approject file and I find that all my Picks are no longer Picks and my book is therefore no longer made up of my desired images. (I've tried this many times now and ruled out the possibility of a corrupt export file).
    As a result, when I select the Book I get the "Non-Pick Images" dialog with the dreaded "One or more placed items are no longer the pick of a stack." message. "Use Current Pick" ruins the book - I have to go through and work out all the picks again (and also re-pan all the images within the Photo Boxes as this data has also been lost). "Create new Version" appears to preserve my book's original look, but my project now contains a couple of hundred more images than it used to and my other albums and light-table still have incorrect images shown as the Picks.
    Does anybody have any ideas of what I can do to ensure the stack-picks are preserved during the export/import process?
    (By the way, the reason I'm exporting and then re-importing is because I actually want to do the export from my laptop where the project is and then the import on my main work machine where the rest of my Aperture library lives, but that fails for the same reason, so I'm doing the export+import on my laptop for now to reduce the number of variables in the problem.)

    I go with the assumption that you now know how to create transport sets. After having created a transport set containing the objects you want to be exported, go to the administer tab under the main page of portal. There you will find the Export/Import portlet. Here you need to choose your transport set from the first LOV and then click EDIT to choose the "Security Option". Now you export out the stuff. This will export all the user page customizations for you.
    Thanks.

  • Export Import to Maintain Test and Production Environments

    We have developed an application using Locally built Database Providers in Oracle Portal 9.0.2.6 which is installed to 2 schemas, has 5 different providers, over 100 Portal major components (Forms, Reports, and Calendars) and over 200 minor components (LOVs and Links). We have used export/import transport sets with some luck, but it is a struggle because the import procedures are not very robust. Many things (such as missing LOVs, corrupt components, preexisting versions, etc.) can cause an import to fail. And the cleanup necessary to finally achieve a successful import can be very time-consuming.
    Having a robust import mechanism is very important to our strategy for keeping installed (our own and clients') portal instances up-to-date with our latest release. Some of the enhancements that would make it much easier to develop and maintain Portal applications include:
    Within the Portal:
    1. Ability to copy an entire provider within the same portal (rather than one component at a time).
    2. Ability to change the schema to which a Provider is associated.
    3. When copying a component from one provider to another, the dependent items (i.e. LOVs and Links) should be copied to the second provider as well (rather than rebuilding each LOV in each provider and then editing each form to point to the new LOVs).
    Transport Sets:
    4. Should allow for changing provider names, provider schemas, and global component names, and resetting unique IDs on import (to create a copy rather than overwrite).
    5. Should allow the option to ignore errors and import all components which pass pre-check (rather than failing all components if all items do not pass pre-check).
    How are other Portal Developers dealing with installing and then rolling out new sets of Locally built Database Providers from Development environments to Production? Are there any whitepapers on the best practices for replicating/installing a portal application to a new portal instance and then keeping it updated?
    Oracle, are any of my wish-list items above on the future enhancement lists? Or have others figured out workarounds?
    Thanks,
    Trenton

    There are a couple of references which can be found on Portalstudio.oracle.com that are of some use:
    1. A FAQ for Portal 9.0.2.6 Export/Import http://portalstudio.oracle.com/pls/ops/docs/FOLDER/COMMUNITY/OTN_CONTENT/MAINPAGE/DEPLOY_PERFORM/9026_EXPORT_IMPORT_FAQ_0308.HTM
    2. Migration Instructions by Larry Boussard (BRUSARDL)
    3. Migrating Oracle Portal from Dev Systems to Production Systems by Dheeraj Kataria.
    These are all useful documents for a successful first-time Export/Import. However, the limitations and lack of robustness I listed in my first post make the process so time-consuming and error-fraught as to not be a practical development strategy.

  • OWB - issue in export/import

    hi,
    When I do an export/import of an OWB mapping containing a MERGE statement from one OWB repository to another, some of the columns are missing in the MERGE. I am using OWB 10g R2.
    E.g.: if the MERGE is based on columns c1, c2, c3, then after exporting/importing the mapping to a different environment, the imported mapping has the MERGE based on only c1.
    Thanks

    There is Bug 5705198: LOADING PROPERTIES CHANGED AFTER MDL EXPORT/IMPORT (fixed in 10.2.0.4 OWB patchset), similar problem with lost MATCH COLUMN WHEN UPDATING ROW property during MDL import.
    Maybe it is your case.
    There is no workaround for this problem, only patching to OWB 10.2.0.4.
    Regards,
    Oleg

  • Export/Import subpartition stats

    I hope someone can give me a workaround for this, because it's causing our reports to take longer than they should!
    Background:
    We have some sub-partitioned tables on a 10.2.0.3 database, partitioned daily on the date column, with the subpartitions based on a list of values.
    Overnight, various reports are run. Each report loads its data into the table, and then produces a file based on the data that's been loaded for that report. It is not practical (IMO) to analyze the tables after each report has loaded its data, due to other reports loading their data at the same time.
    As the amount of data loaded into the tables each night does not vary significantly, we export the stats from a previous partition and import them into the new partition as part of the partition housekeeping job (stats exported from the old partition, old partition gets dropped, new partition created with the same name as the old one, and stats imported). This is done using dbms_stats.export_table_stats and dbms_stats.import_table_stats.
    However, one report, which currently loads 43 million rows, is taking 4.5 hours to run. The size of the load file increases daily, but looking at the history of the report, each relatively small increase causes the report to run a disproportionate amount longer (i.e. an increase of a similar number of rows on one night can add twice as much time onto the length of the report as the increase the previous night did).
    We've just implemented some changes to improve the buffer sizes, etc, on the database, in a bid to reduce some of the waits, but this has not improved matters much - the report now runs in 4 hours.
    We know this report can run faster, because in testing, we saw the report run in 60 minutes! Subsequent investigation shows that this was after the partitions had been analyzed, whereas the slow report ran prior to the partitions being analyzed, despite the stats being there for the partition.
    I have now tested the export/import stats process and found that they do not import the stats for the subpartitions. This looks like it is a large part of why the report takes longer before the relevant partitions/subpartitions have been analyzed than it does afterwards.
    Does anyone know of any way that I can export/import the stats at subpartition level? (I tried putting a subpartition name in the partition parameter, but I just got an error about it being an unknown partition name.)
    Any help, ideas or workarounds on this will be gratefully received!

    *** Duplicate Post - Please Ignore ***

  • ProgressIndicator for migration (export/import) of db tables

    Hi
    I want to use the "progressIndicator" component of ADF Faces for my export/import DB tables utility, which will be launched by an "Export/Import" button on the UI. For the value of the progressIndicator component I am using a bean extending BoundedRangeModel, in which the getValue() and getMaximum() methods are implemented. The problem is: how can I call the export() method "in sync with" my bean's getValue() and getMaximum() methods? Or will I have to use threads, with no workaround? (Here, getValue() should get the value from some log DB table which will get updated by the export() method.)
    Thanks
    Bhavesh

    Bhavesh,
    I have done this in the past using this general approach:
    1). Class that actually does the exports is a thread. As it progresses, it updates a "percent complete" somewhere, perhaps in a session-scoped managed bean that has been injected upon creation.
    2). Web page has progress indicator and af:poll to update the indicator periodically. progress indicator uses percent complete from the session bean that is updated by the thread.
    3). Upon user clicking "export" - instantiate and run the thread.
    Don't actually have time to post the actual code here, but I know I put this in a post earlier (perhaps last year).
    John
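    John's three steps can be sketched generically. This Python sketch (not ADF/Java; it only illustrates the threading pattern) has the worker update a shared percent-complete value that the page's poller would read:

```python
import threading
import time

class ExportJob(threading.Thread):
    """Background worker: updates percent_complete as it progresses,
    like the session-scoped bean described in the steps above."""
    def __init__(self, steps: int = 10):
        super().__init__()
        self.steps = steps
        self.percent_complete = 0  # read periodically by the UI poller

    def run(self):
        for step in range(1, self.steps + 1):
            time.sleep(0.01)  # stand-in for real export work
            self.percent_complete = step * 100 // self.steps

# "Upon user clicking export": instantiate and run the thread.
job = ExportJob()
job.start()
# A real page would poll job.percent_complete while rendering the
# progress indicator; here we simply wait for completion.
job.join()
print(job.percent_complete)  # → 100
```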

  • Client copy error - Export/Import

    Hello Folks,
    With a certain business requirement, we have a small-scale implementation planned every quarter, and as per the requirement we are performing a client copy into one of our SANDBOX (SDX) clients (201) every quarter from our Production ECC system (client 600).
    *The SANDBOX is used for POC and new enhancement testing and also for training purposes.
    Client copy (export/import method)
    ============================
    PRD (600)  ==> SDX (201)
    Both Sandbox and Production are on the same SAP version, patch level, OS and database versions.
    The client copy (client export/import method) is successful when the client is new (SDX-201), but when we did the client copy the next time into the same client, the client copy finished with many inconsistencies.
    We have also tried deleting client 201 and then recreating client 201 in SDX and copying from our PRD-600. But the result is the same: the client copy finished with many inconsistencies.
    Now we are not able to figure out why this problem is occurring. The whole point of the client copy from PRD is not helping the project. As a workaround, at the moment we are copying onto a new client every quarter.
    We have to get a solution. Please suggest.
    PS: We can not do the system refresh, because there are other clients on the Sandbox systems and we do not want them to be disturbed.
    Looking forward to your suggestions.
    Thanks
    James

    Hello James,
    The client copy (client export/import method) is successful when the client is new (SDX-201), but when we did the client copy the next time into the same client, the client copy finished with many inconsistencies.
    What inconsistencies ? Can you please post the error log ?
    Regards,
    Subhash

  • Exporting/importing custom library objects

    Dear,
    When we make custom objects in our library, they are available to all environments on my laptop, but not on a different machine.
    Is there any way to export/import these?
    Kind regards,
    Frederik-Jan.

    Hello,
    These things are stored locally on your machine.
    Note: On the Library tab in LiveCycle Designer, click the right arrow to get the library menu, select the components tab you need to export, and the right arrow opens the menu containing the "Group Properties..." item. There you will find a path on your hard drive. My problem is I cannot locate the path shown there, so I have looked for a workaround.
    Workaround: Create a folder on your desktop. Create a component group in your Library in LiveCycle Designer (a new tab). Go to the properties of this new group and set the path to your new desktop folder. Go to the component tab where you have your custom components, select them, use "Move to...", and select the new component group/desktop folder. After completing this, check the .xfo files in your desktop folder. You can now send them to other machines.
    Note: In the library "right-arrow" menu there is the item "Shared Library Location". You can use this to share the components among all the machines you need.
    Regards, Otto

  • MS Office Export/Import to Excel

    Do we have a standard API to implement export/import functionality for MS Excel 2007? I am not sure if JXL/HSSF or any other APIs have come up for this new version of Excel.
    Any thoughts?

    Hi
    Please refer [this|http://java.ittoolbox.com/groups/technical-functional/java-l/java-api-to-read-excel-2007-2580267?cv=expanded] and [this|http://xlsgen.arstdesign.com/core/exceltools.html] thread for help.
    Best Regards
    Satish Kumar

  • 2 midi export-import problems...from l8 to l8!

    1. l8 always exports/imports midi regions as their own tracks. the preference for "export single regions as type 0 files" is different from how it worked in l7 (1 track with several separate regions exported/imported as 1 track). when exporting an entire sequence to midi, this preference now makes no difference whatsoever. annoying.
    2. markers with text (lyrics, for instance) don't survive the import/export process. hardly surprising, i suppose. the markers themselves are there, but not their names, or the info contained in them. instead, you get a marker that looks like html/xml? it apparently references an rtf file, and shows font/color info. the actual marker start/end time values are correct, however.
    grrrr
    can anyone confirm, or know of a workaround?
    Message was edited by: kelldammit

    Are there any free alternatives to Emailchemy to handle importing from Thunderbird to OS X Lion Mail?
    I'm also having a problem importing from Thunderbird, but in my case the folders are importing but I'm only getting one message per folder. This was reported in another forum for an older OS X version (apparently Apple never fixed the problem), and the solution there was to use a utility called Eudora Mailbox Cleaner. Unfortunately, this utility does not run under Lion.
    What I'm really trying to do is get local mail folders migrated from Zimbra Desktop to Mac Mail. There was no direct way to do this, because Mac Mail doesn't support Zimbra Desktop, and ZD can't export to an mbox file (it exports to eml format). However, Mac Mail "supports" importing directly from Thunderbird, and Thunderbird can import eml files using an add-on called ImportExportTools, so I installed Thunderbird, installed the add-on, imported the exported data from ZD, and that successfully brought my local mail folders into Thunderbird. So far so good. Unfortunately, when I tried to import into Mac Mail directly from Thunderbird, it imported all the folders, but just one message from each folder.
    I also tried using the Thunderbird ImportExportTools add-on to export the folders, thinking it would generate mbox format. I was able to import that data using the "import from mbox" option in Mac Mail, but I got the same result -- all my folders came in with a single message in each.
    Any other suggestions? This is a one-time thing, I don't want to spend money on an application if I don't have to.

  • Export/import utility from within apex

    Hi Friends,
    With my knowledge of Oracle Forms, the export/import utility can be called from within a form at runtime using a push button. But how can I achieve this using a similar button in Oracle APEX?
    Please give me a helping hand. Thanks.
    regards,
    kehinde

    Hello:
    The Oracle Data Pump utility is now the preferred tool to use to import and export data and metadata from an Oracle database. Oracle Data Pump provides capabilities that far exceed those provided by the older imp/exp programs. Further, in addition to command line invocation of the tool, Oracle Data Pump has a set of pl/sql APIs (DBMS_DATAPUMP) that will let you do import and exports from pl/sql. You can therefore easily set up an APEX page that will accept a bunch of parameters and execute the appropriate procedures within DBMS_DATAPUMP to do the export/import.
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/dp_api.htm#i1008009
    Varad

  • Nexus 1010 can not erase export-import folder

    Hi Expert,
    After we imported the backup VSB, we tried to erase the export-import folder, but the system shows "Device or resource busy", so we cannot execute further export or import commands. Please reference the image below and help us fix this issue.
    PS: The Nexus 1010 version is "4.2.1.SP1.4".

    Hello Meng,
    I was able to recreate this problem in the lab.  It appears it only occurs when importing a secondary VSB.  A 'system switchover' of the N1010s frees the NFS locks.  Not an ideal workaround, but a workaround.
    CSCtz44247 has been opened for this problem.
    Matthew
