Bug in Export/Import of Metadata

I have an image (freshly imported; a DNG converted from NEF) containing the keyword "Großbritannien", which is part of the keyword hierarchy
Welt
  Europa
   Großbritannien
The image has been developed as black and white.
To get GPS-Info into the IPTC-fields, I
1. In LR: Metadata - Save Metadata to file (just to be sure everything is written into the file; "Automatically write to XMP" is turned on)
2. Open the image in GeoSetter (the same applies if I don't use GeoSetter but e.g. apply "Recover Edges" to the DNG file) and save the GPS information into the IPTC section of the file
3. In LR: Metadata - Read Metadata from file
In LR2, I now had the B&W image with the keyword "Großbritannien", and the GPS information was filled in.
In LR3, the image is now coloured instead of B&W (the develop history says "Einstellungen zurücksetzen", i.e. "reset settings" in English) and it has the keywords "Europa, Großbritannien, Welt" as flat new entries, despite their being part of the hierarchy mentioned above.
GPS-info is OK.
WinXP, LR3 trial.

ExifTool (used in GeoSetter) is reporting errors when writing data back to LR3 processed files.
This is new to the latest version of GeoSetter, which uses a newer version of ExifTool.
According to Friedemann Schmidt from GeoSetter:
"[LR3] contains a bug that results in an incorrectly written Creator in the metadata (list type 'bag' instead of 'seq'). That's why ExifTool (>= 8.26) now gives a warning. In prior versions of ExifTool, the Creator was added again in the correct way. But then the Creator existed twice in the XMP data, and Lightroom didn't recognize nearly all metadata anymore after importing it again...
In short: You can ignore this warning ;-) Your image data still contains the wrong metadata (written by Lightroom 3), but it no longer contains the data twice after saving with ExifTool 8.26 (GeoSetter 3.3.60).
As far as I know this problem will be solved in the next update of Lightroom 3..."
Also from Friedemann:
"One more remark: If you used GeoSetter 3.1.20 together with Lightroom 3, your development settings for the images may have been reset after reimporting the metadata into LR 3. Please check this! The development settings still exist in the develop history, and you have to go back one step for each image..."
This seems to cover the problem first reported in this thread with the picture reverting back to Color.
If the metadata is wrong and we don't run GeoSetter to fix it, are there other implications?
Does anyone know if this potential metadata error has been reported to Adobe so that LR3 can be fixed?
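For anyone curious what the 'bag instead of seq' problem looks like on disk: per the XMP specification, dc:creator is an ordered array (rdf:Seq), and LR3 wrote it as an unordered rdf:Bag. Below is a minimal sketch of a repair (a hypothetical helper, not part of ExifTool or Lightroom) that just swaps the list type inside that one element:

```python
import re

def fix_creator_list_type(xmp: str) -> str:
    """Rewrite the dc:creator list from rdf:Bag to rdf:Seq.

    dc:creator is defined as an ordered array (rdf:Seq) in XMP;
    Lightroom 3 wrote it as an unordered rdf:Bag, which is what
    newer ExifTool versions warn about.
    """
    def swap(match: re.Match) -> str:
        # Swap the list type only inside the dc:creator element.
        return match.group(0).replace("rdf:Bag", "rdf:Seq")

    return re.sub(r"<dc:creator>.*?</dc:creator>", swap, xmp, flags=re.DOTALL)
```

This is illustrative only; real tools operate on the full XMP packet rather than a text fragment.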

Similar Messages

  • Unit Tester: Bug in Export/Import of Suites

    I was moving my Unit Test repository from one schema to another, so I first exported all of my suites and then imported them into a new repository.
    But, when I imported them into their new repository, all of the tests forgot what specific function/procedure name within a package they were actually calling. For example, the test that used to be for "pccdb.rotat_maintenance.parseRotatKey", now says it's going to call "pccdb.rotat" (i.e. the specific function name it's supposed to be calling within that package disappeared).
    I went looking and found that in the UT_TEST table, the "object_call" column was null for all of my tests, but because all of my tests call methods within packages, I'm thinking that column shouldn't be NULL. So, I filled in the appropriate function name for one of them, and the test started working....
    So, something in the import or export of unit suites (or their tests) isn't working quite right in populating the ut_tests.object_call column, I'm guessing - please take a look at it.
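    The manual fix described above (filling in UT_TEST.OBJECT_CALL by hand) amounts to splitting the fully qualified test target back into its parts. A small illustrative sketch (a hypothetical helper, not SQL Developer code; the table/column names come from the post):

    ```python
    def split_qualified_call(qualified: str):
        """Split "schema.package.function" into its three parts.

        For a packaged target like "pccdb.rotat_maintenance.parseRotatKey",
        the last component is the value that belongs in UT_TEST.OBJECT_CALL;
        the import bug left that column NULL.
        """
        schema, package, function = qualified.split(".", 2)
        return schema, package, function
    ```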

    This was originally reported in the "SQL Developer 2.1: Problem exporting and importing unit tests" thread. It was logged as
    Bug 9236694 - 2.1: OTN: UT_TEST.OBJECT_CALL COLUMN NOT EXPORTED/IMPORTED
    It has been fixed and will be available in the next patch release.
    Brian Jeffries
    SQL Developer Team

  • EIF Export/Import Only Metadata

    The following export statement exports the entire AW with data, and the import likewise imports everything.
    exec dbms_aw.execute('export all to eif file ''MYDIR/TEST_OLAP.eif'' api');
    exec dbms_aw.execute('import all from eif file ''MYDIR/TEST_OLAP.eif'' api');
    Can I export and import only the metadata (as we do with an XML template) using commands like those above?

    Just learned... Try below.
    jarname_ is the name of the AWM jar. e.g. awm11.2.0.2.0.jar
    user is the name of your user. e.g. TESTOLAP
    password is the password for your user
    url is the url for a node in your RAC cluster. e.g. jdbc:oracle:thin:@myserver:1521:orcl
    filename is the name of the XML file to create. e.g. testolap.xml
    java -classpath jarname_ oracle.olapi.metadata.util.ExportXML -user user -password password -url url > filename
    The java version should be "1.6.0_24".
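    If you script this, the command line above can be assembled from the placeholders the post defines (jarname_, user, password, url); the helper below is just illustrative glue:

    ```python
    def exportxml_command(jar: str, user: str, password: str, jdbc_url: str):
        """Assemble the ExportXML command line described above.

        The XML goes to stdout, so redirect it to the target file when
        running the command (the `> filename` part of the original post).
        """
        return ["java", "-classpath", jar,
                "oracle.olapi.metadata.util.ExportXML",
                "-user", user, "-password", password, "-url", jdbc_url]
    ```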

  • Export/Import integrator metadata between instances

    Hi all
    We have recently hit a situation with some customers who would like to migrate integrator-related metadata from one instance to another (integrator, components, mappings, layout, parameter list).
    They are doing this using FNDLOAD. For example
    Download phase on the source.
    FNDLOAD apps/apps 0 Y DOWNLOAD $BNE_TOP/patch/115/import/bneint.lct XXCOHR_Integrators.ldt BNE_INTEGRATORS INTEGRATOR_ASN="XXCOHR"
    FNDLOAD apps/apps 0 Y DOWNLOAD $BNE_TOP/patch/115/import/bnemap.lct XXCOHR_mapping.ldt BNE_MAPPINGS MAPPING_ASN="XXCOHR"
    FNDLOAD apps/apps 0 Y DOWNLOAD $BNE_TOP/patch/115/import/bnelay.lct XXCOHR_Layouts.ldt BNE_LAYOUTS LAYOUT_ASN="XXCOHR"
    FNDLOAD apps/apps 0 Y DOWNLOAD $BNE_TOP/patch/115/import/bnecont.lct XXCOHR_bnecont.ldt BNE_CONTENTS CONTENT_ASN="XXCOHR"
    Upload phase on the target.
    FNDLOAD apps/apps 0 Y UPLOAD $BNE_TOP/patch/115/import/bneint.lct XXCOHR_Integrators.ldt
    FNDLOAD apps/apps 0 Y UPLOAD $BNE_TOP/patch/115/import/bnelay.lct XXCOHR_Layouts.ldt
    FNDLOAD apps/apps 0 Y UPLOAD $BNE_TOP/patch/115/import/bnemap.lct XXCOHR_mapping.ldt
    FNDLOAD apps/apps 0 Y UPLOAD $BNE_TOP/patch/115/import/bnecont.lct XXCOHR_bnecont.ldt
    The problem is that after trying to create a document using the new BNE elements they are getting various errors.
    "No parameter list definition at parameter list id" or "No Mappings exist for the specified Content."
    I have had a look at some patches delivering integrators via the same LCT files, and the unified driver has, for example, the following sequence which is executed during the patch:
    exec fnd bin FNDLOAD bin &phase=last+1 checkfile:pay:patch/115/import/D:pyinfcmp.ldt &ui_apps 0 Y UPLOAD @BNE:admin/import/bnecomp.lct @PAY:patch/115/import/D/pyinfcmp.ldt - UPLOAD_MODE=NLS WARNINGS=TRUE
    exec fnd bin FNDLOAD bin &phase=last+1 checkfile:pay:patch/115/import/D:pyinfcnt.ldt &ui_apps 0 Y UPLOAD @BNE:admin/import/bnecont.lct @PAY:patch/115/import/D/pyinfcnt.ldt - UPLOAD_MODE=NLS WARNINGS=TRUE
    exec fnd bin FNDLOAD bin &phase=last+1 checkfile:pay:patch/115/import/D:pyinfint.ldt &ui_apps 0 Y UPLOAD @BNE:admin/import/bneint.lct @PAY:patch/115/import/D/pyinfint.ldt - UPLOAD_MODE=NLS WARNINGS=TRUE
    exec fnd bin FNDLOAD bin &phase=last+1 checkfile:pay:patch/115/import/D:pyinflay.ldt &ui_apps 0 Y UPLOAD @BNE:admin/import/bnelay.lct @PAY:patch/115/import/D/pyinflay.ldt - UPLOAD_MODE=NLS WARNINGS=TRUE
    exec fnd bin FNDLOAD bin &phase=last+1 checkfile:pay:patch/115/import/D:pyinfmap.ldt &ui_apps 0 Y UPLOAD @BNE:admin/import/bnemap.lct @PAY:patch/115/import/D/pyinfmap.ldt - UPLOAD_MODE=NLS WARNINGS=TRUE
    exec fnd bin FNDLOAD bin &phase=last+1 checkfile:pay:patch/115/import/D:pyinfpar.ldt &ui_apps 0 Y UPLOAD @BNE:admin/import/bneparamlist.lct @PAY:patch/115/import/D/pyinfpar.ldt - UPLOAD_MODE=NLS WARNINGS=TRUE
    I couldn't find this process documented anywhere in an official document or Note. These customers have used some external, non-Oracle references for doing so.
    Is there anywhere a document giving more information on how this export/import is supposed to work?
    Please reply to me directly as I am not part of this mailing list.
    Thank you and best regards,
    Dan

    Looks like you have not transferred your parameter list. Please use the following code to download and upload your parameter list.
    Download
    FNDLOAD apps/$FNDPWD 0 Y DOWNLOAD $BNE_TOP/admin/import/bneparamlist.lct GENERAL_201.ldt BNE_PARAM_LISTS INTEGRATOR_ASN="HX" INTEGRATOR_CODE="GENERAL_201_INTG"
    Upload
    FNDLOAD apps/$FNDPWD 0 Y UPLOAD $BNE_TOP/admin/import/bneparamlist.lct GENERAL_201.ldt
    For the mapping, there is a possibility that you might have migrated the wrong content. You need to migrate the content whose type is 'none' instead of 'text'.
    HTH
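    To avoid forgetting one of the pieces (the parameter list was the missing one here), the matching DOWNLOAD/UPLOAD pairs can be generated mechanically. A sketch, assuming the same LCT files and entity names used in this thread; note the ASN filter parameter differs per entity, and the parameter-list download may also need INTEGRATOR_CODE as shown above:

    ```python
    # Entity -> (LCT file, ASN filter parameter), as used in this thread.
    BNE_ENTITIES = {
        "BNE_INTEGRATORS": ("bneint.lct", "INTEGRATOR_ASN"),
        "BNE_MAPPINGS":    ("bnemap.lct", "MAPPING_ASN"),
        "BNE_LAYOUTS":     ("bnelay.lct", "LAYOUT_ASN"),
        "BNE_CONTENTS":    ("bnecont.lct", "CONTENT_ASN"),
        "BNE_PARAM_LISTS": ("bneparamlist.lct", "INTEGRATOR_ASN"),  # easy to miss
    }

    def fndload_commands(asn: str, login: str = "apps/apps"):
        """Build matched FNDLOAD DOWNLOAD/UPLOAD command pairs for every
        BNE entity type, so no part of the integrator is left behind."""
        downloads, uploads = [], []
        for entity, (lct, asn_param) in BNE_ENTITIES.items():
            lct_path = f"$BNE_TOP/admin/import/{lct}"
            ldt = f"{asn}_{entity}.ldt"
            downloads.append(f'FNDLOAD {login} 0 Y DOWNLOAD {lct_path} {ldt} '
                             f'{entity} {asn_param}="{asn}"')
            uploads.append(f"FNDLOAD {login} 0 Y UPLOAD {lct_path} {ldt}")
        return downloads, uploads
    ```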

  • Exporting/Importing Transformation metadata

    Hello All,
    I have a custom written package which we use as a post-mapping process.
    I compile this package in the database external to OWB and then import the definition into OWB, where it becomes a transformation.
    We are currently in Dev and I now want to migrate to Test.
    So I tried to export the metadata for this package so I can import it into the Test Design Center. This is because I want everything contained in OWB.
    The export appears to work but when I import it into Test (which also seems to work) and try to generate the code it produces an empty file.
    Is it possible to export and import the metadata of a custom transformation? If so, are there any particular options I should use?
    I'm sorry this seems to be a basic task but I can't find anything in the documentation and I wondered if anyone else had come across it.
    Thanks in advance.

    Hi,
    you cannot recreate a package with an import. You will only get the package header, not the body. You must create the custom transformation in the Design Center and then deploy it to the target database - not the other way round.
    Regards,
    Detlef

  • Export/import OWB (10.1 and PARIS) metadata problems

    I am attempting to import an OWB 10.1-exported map into an OWB 10.2/PARIS repository. This failed as follows from the Paris Design Center menu bar:
    Design > Import > Warehouse Builder Metadata, specifying a previously exported OWB 10.1 map. I receive the following message: "Metadata version is not compatible with your current repository version. Upgrading the file is required to import your metadata"
    ... Selecting "UPGRADE" yields the following ...
    Upgrade started at Jul 25, 2006 12:02:05 PM
    Preloading objects from release 9.2
    Upgrading objects from release 9.2 to 10.0
    Project "SIRR_DATA_CONVERSION"
    "Error occurred importing from file "C:\MKS\ConfigurationMgmt\OWB\Unit Testing\Build 1.13\Client Account\OWBC_ALS_CLI_CLIENT\OWBC_ALS_CLI_CLIENT.mdl".
    CNV0002-0025(ERROR): Unexpected error occurred. Upgrade terminated. See log file C:\MKS\ConfigurationMgmt\OWB\Unit Testing\Build 1.13\Client Account\OWBC_ALS_CLI_CLIENT\OWBC_ALS_CLI_CLIENT_10_0_upgrade.log for details. Please contact"
    The log contained over 16000 lines of info associated with mapping parameter/mapping parameter properties/property values, groups and finally, before failing, mappingparameterconnection --> mappingparameterconnection .. I can email the log if anybody is interested in taking a look.
    Two things about the error above: I do not have a 9.2 map (it seems the upgrade is assuming a 9.2 map) but a 10.1 map, and the error message seems to be truncated after "Please contact".
    Any advice on importing OWB 10.1 metadata (maps) into OWB PARIS? I am trying to build a business case for upgrading from OWB 10.1 to PARIS.

    There is a similar issue logged in our support database. The bug (ref number 5309802) has some workarounds for issues within an MDL file involving non-unique IDs and blank properties for certain objects.
    The log file should tell you exactly where the problem objects are located, row and column reference. If you could open a TAR with Oracle Support and send them the log file and the associated MDL file they should be able to organize a fix for you.
    If you want to try this yourself, first make a backup of the MDL file. Then open the MDL file using WinZip and extract the two files inside it. The larger of the two is in fact an XML file. Open it in a text editor that shows line and column numbers. Now refer back to the log file, find the error messages, and cross-reference the row/column numbers within the XML file to see if the error makes sense to you. If it does, correct the error, then rezip everything back into an MDL file and try importing it again.
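    The unzip/edit/rezip recipe can be scripted; here is a minimal sketch using Python's standard zipfile module (assuming, per the description above, that an MDL file is an ordinary zip archive; the hand-editing of the XML member still happens between the two calls):

    ```python
    import os
    import shutil
    import zipfile

    def unpack_mdl(mdl_path: str, work_dir: str) -> list:
        """Back up the MDL file and extract its members into work_dir
        (the larger member is the XML payload to edit by hand)."""
        shutil.copy2(mdl_path, mdl_path + ".bak")  # always keep a backup
        with zipfile.ZipFile(mdl_path) as zf:
            zf.extractall(work_dir)
            return zf.namelist()

    def repack_mdl(mdl_path: str, work_dir: str, names: list) -> None:
        """Zip the (hand-edited) members back into the MDL file."""
        with zipfile.ZipFile(mdl_path, "w", zipfile.ZIP_DEFLATED) as zf:
            for name in names:
                zf.write(os.path.join(work_dir, name), arcname=name)
    ```

    Usage: call unpack_mdl, fix the XML at the row/column the upgrade log points to, then call repack_mdl and retry the import.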
    Alternatively you may want to leave this to Oracle Support to manage for you.
    Hope this helps
    Keith

  • Metadata export/import to other BI tools?

    Hi ,
    I am using SQL DM 3.0.0.665. I need your thoughts on the following.
    As we build our logical/physical models in SQL data Modeler, we create our metadata for our attributes/columns.
    After creating our physical model, other BI and reporting tools get access to the physical tables. Each tool has a process for creating its reporting layer and can bring the tables/columns into that layer, but it shows the column names as-is, which is not business friendly.
    Do we have a mechanism to export the attribute business names as metadata of the DB tables?
    If we define the metadata as part of the physical model using modeling tools such as Oracle SQL Developer or ERWIN, will we be able to import it into BI tools as report-element metadata?
    Where can I find details?
    How do we do that?
    Thanks for helping us out.

    Hi,
    there is no problem if you are building reports on top of our reporting repository - the table DMRS_MAPPINGS holds information for pairs (entity-table, attribute-column, ...), and you can take both the implementation and business names.
    For exchange with BI tools, you need to know how the particular BI tool recognizes and links implementation and business names; then you can use a transformation script to provide the required information. There are examples on this forum of how to deal with files and database connections using a transformation script.
    Philip

  • Best practice metadata export/import for WebCenter portal

    Hi
    I'm using JDeveloper to build the WebCenter portal. When I, say, register a portlet producer with my portal, the information about it is stored in the metadata. If I want to move my portal from the integrated WLS that JDeveloper provides to another WebLogic server with a WebCenter installation, the steps I need to do are:
    1) Export the metadata from JDeveloper
    2) Import metadata to my target Weblogic server
    3) deploy the application
    Am I missing something?
    To export/import metadata from JDeveloper to another WebLogic server (equipped with WebCenter), I should use the WLST script, right?
    Thanks

    A little correction: IAS Version is Oracle9i Application Server Release 1 version 1.0.2.2.2

  • ?BUG? App Export / Import error with javascript

    Greetings -
    Anyone have an idea on how to get around this or if I'm doing something wrong?
    I have a page button to collapse a tree, with a URL Target of:
    javascript:apex.widget.tree.collapse_all($v(P2130_TREE_ID));
    Page and button work fine; I had never had a problem. Then I went to do an application import and found that the SQL file for the import was corrupted due to the above line. The SQL file has:
         p_button_redirect_url=>'javascript:apex.widget.tree.collapse_all($v(P'||to_char(2130+www_flow_api.g_id_offset)||''_TREE_ID));',
    It appears to have an extra (doubled) single quote right before _TREE_ID. I have this logic in several places in my application, and every place caused the same error.
    Application copy also fails with same error.
    Thanks,
    Scott

    Tom -
    I assume this is the prior post you were talking about: https://forums.oracle.com/thread/2401714
    However, this doesn't cover my problem. I have tried all of the following, and they all break export/import and copy functionality due to mismatched quotes:
    collapse_all($v(P2130_TREE_ID));
    collapse_all($v('P2130_TREE_ID'));
    collapse_all($v("P2130_TREE_ID"));
    collapse_all('&P2130_TREE_ID.');
    collapse_all("&P2130_TREE_ID.");
    These all allow the collapse button to work.
    NOW, if I change my ID tag to something like 'P_SCOTT_TREE', then everything works just fine (import/export/copy). It's having a number in the tag that causes the problems, because the export is casting it with to_char and somehow throwing in an extra quote.
    Thanks,
    Scott
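    For reference, the corruption is mechanical enough to test for: the exporter's `||to_char(...)||` splice must reopen the string literal with a single quote, so a doubled quote followed by an identifier character is the bug signature. A small illustrative checker (hypothetical, not APEX internals):

    ```python
    import re

    def has_doubled_quote_splice(export_line: str) -> bool:
        """Detect the corrupt pattern reported above: a doubled single
        quote immediately followed by an identifier character after a
        '||' splice (e.g. ...||''_TREE_ID), where a single quote should
        reopen the string literal."""
        return re.search(r"\|\|''\w", export_line) is not None
    ```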

  • Export/Import Project problems

    I have two machines: a PowerMac (G5) and a MacBook Pro (Intel Dual Core). Both 2GB RAM, running OS X v10.5.6 and Aperture 2.1.2. (More tech spec info available if anyone thinks it relevant).
    I have built up a project on the MacBook Pro, creating albums, creating stacks and making picks within the stacks, making adjustments to each of the picks and then creating a book.
    I now want the project on the PowerMac G5, which is my master machine and contains all my other projects.
    On the MacBook Pro I exported the project and then copied the resulting .approject file to the PowerMac. I then imported the project into Aperture from there. On a cursory scan, everything seems fine and in the Projects Inspector it shows 1376 images against the project on both machines. However, if I look more closely, I've lost all my picks. I.e. I need to go through every stack again and reselect my picks (very boring).
    It gets worse when I look at the Book within the project - it's lost the positionings (i.e. I double clicked on the image and moved it around slightly to get the best crop for the selected box shape). And then I did a Close All Stacks and it deleted every picture from the book so all 60 pages were empty again and the images were all back down in the film strip!
    At this point, my immediate thought was the .approject file got corrupted during the copy, but I've done a recursive diff (diff -qr fileA fileB) between the file on the MacBook Pro and the file on the PowerMac and it reported no differences at all.
    Has anybody else had similar problems? Is the project import/export known to be flakey? Better yet, anybody know of any workarounds?
    Thanks in advance for any help anyone can offer.
    Darren

    OK - I've tried eliminating some of the variables and it turns out the two-machines-issue is a bit of a red herring and this is a straightforward Export/Import bug in Aperture.
    On a single machine (2.16GHz Intel Core Duo MacBook Pro with 2GB RAM, running OS X 10.5.6 and Aperture 2.1.2 (build 3G13)) I have a project with many images. I create versions of these images and apply adjustments (Sharpness, Monochrome Mixer, Exposure, Enhance, etc.). I then make the adjusted version the Pick and I've created a 62 page book from these Picks.
    Now I export the project (i.e. in the Project Inspector I right-click the project and choose Export > Project). I export the project to a file on the desktop and rename the existing project in Aperture. I now import the saved .approject file and I find that all my (long and carefully chosen) Picks are no longer Picks. (I've tried this a few times and ruled out the possibility of a corrupt export file).
    As a result, when I select the Book I get the "Non-Pick Images" dialog with the dreaded "One or more placed items are no longer the pick of a stack." message.
    I have two options at this point: "Use Current Pick" ruins the book - I have to go through and work out all the picks again (and also re-pan all the images within the Photo Boxes as this data has also been lost).
    Or I can choose "Create new Version", which appears to preserve my book's original look (I haven't looked in too much detail yet), but my project now contains a couple of hundred more images than it used to (or at least extra versions which, whilst I know these are relatively 'cheap', are still unnecessary) and this still doesn't help with my other albums and light tables, which still have incorrect images shown as the Picks.
    I'm finding this quite a worry - I originally hesitated to move my photo library into Aperture due to the thought of losing the metadata one day if I moved from Aperture to something else, but in the end I decided to trust that something as simple-to-test as Export/Import would be properly implemented. Now I'm beginning to fear that it isn't, even within the same version of Aperture, never mind different versions of Aperture or (heaven forbid) different applications altogether.

  • Export/Import Project still buggy in v2.1.3

    After seeing the update notes for v2.1.3 and noticing references to project importing, I started to get excited that maybe the bugs have been ironed out. Unfortunately my longstanding bug still remains.
    If anyone can think of any workarounds I'd be extremely grateful...
    I have a 2.16GHz Intel Core Duo MacBook Pro with 2GB RAM, running OS X 10.5.6 and now Aperture 2.1.3
    I have a project with many images, arranged into stacks, a pick in each stack with many adjustments, and a 62-page book created from these picks. There are also some smart albums and a light table.
    Now I export the project (i.e. in the Project Inspector I right-click the project and choose Export > Project) to a file on the desktop and rename the existing project in Aperture. I now import the saved .approject file and I find that all my Picks are no longer Picks and my book is therefore no longer made up of my desired images. (I've tried this many times now and ruled out the possibility of a corrupt export file).
    As a result, when I select the Book I get the "Non-Pick Images" dialog with the dreaded "One or more placed items are no longer the pick of a stack." message. "Use Current Pick" ruins the book - I have to go through and work out all the picks again (and also re-pan all the images within the Photo Boxes as this data has also been lost). "Create new Version" appears to preserve my book's original look, but my project now contains a couple of hundred more images than it used to and my other albums and light-table still have incorrect images shown as the Picks.
    Does anybody have any ideas of what I can do to ensure the stack-picks are preserved during the export/import process?
    (By the way, the reason I'm exporting and then re-importing is because I actually want to do the export from my laptop where the project is and then the import on my main work machine where the rest of my Aperture library lives, but that fails for the same reason, so I'm doing the export+import on my laptop for now to reduce the number of variables in the problem.)

    I go with the assumption that you now know how to create transport sets. After having created a transport set containing the objects you want to be exported, go to the administer tab under the main page of portal. There you will find the Export/Import portlet. Here you need to choose your transport set from the first LOV and then click EDIT to choose the "Security Option". Now you export out the stuff. This will export all the user page customizations for you.
    Thanks.

  • Importing the METADATA ONLY using DBMS_DATAPUMP

    Hi DBAs,
    Using DBMS_DATAPUMP, how can I import the metadata only for a particular table? Also, I don't want to import any associated INDEXES and TRIGGERS. I have the following code, but it is trying to import everything. Also, if the table exists, it does not import the METADATA but errors out instead.
    handle1 := DBMS_DATAPUMP.OPEN('IMPORT','SCHEMA', 'QAXDB.WORLD');
    DBMS_DATAPUMP.METADATA_FILTER(handle1, 'SCHEMA_EXPR', 'IN (''HR'')');
    DBMS_DATAPUMP.SET_PARAMETER(handle1, 'INCLUDE_METADATA', 1);
    Thanks
    -Samar-

    See the below link,
    http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14258/d_datpmp.htm
    Hope this helps,
    Regards,
    http://www.oracleracexpert.com
    Click here for [Cross platform Transportable tablespace using Datapump|http://www.oracleracexpert.com/2009/08/transportable-tablespace-export-import.html]
    Click here to learn [Oracle data pump export/import with examples.|http://www.oracleracexpert.com/2009/08/oracle-data-pump-exportimport.html]

  • Migrate Database- Export/Import

    Hi,
    I need to migrate an Oracle database 9i from Sun Solaris to Linux. The final target database version would be 11g.
    Since this is a 9i database, I see that we only have the option of export and import. We have around 15 schemas.
    I have some queries related to it.
    1. If I perform an export with full=y rows=y, does it export SYS and SYSTEM schema objects also?
    2. Can we perform the export in Oracle 9i and use Data Pump import on the target 11g?
    3. What is the best approach: a) perform a schema-by-schema export, or b) perform a full database export with exp / file=xxx.dmp log=xxxx.log full=y?
    Since there is a database version difference, I don't want to touch SYS and SYSTEM schema objects.
    Appreciate your thoughts.
    Regards
    Cherrish Vaidiyan

    Hi,
    Let me try to answer some of these questions you queried for:
    1. If I perform an export with full=y rows=y, does it export SYS and SYSTEM schema objects also?
    Export won't export SYS objects. For example, there are tables in SYS, like obj$, that contain information for other metadata objects, like scott.emp, etc. These are not exported, because when scott.emp is exported, the data from obj$ is essentially exported that way. When the dumpfile is imported and scott.emp is recreated, the data in sys.obj$ will be restored through the CREATE TABLE statement. As far as the SYSTEM schema is concerned, some objects are exported and some are not. There are tables in SYSTEM that contain information about queues, jobs, etc. These would probably not make any sense on the target system, so those types of tables are excluded from the export job. Other objects make sense to export/import, so those are done. This is all figured out in the internals of export/import. There are other schemas that are not exported; some that I can think of are DMSYS, ORDSYS, etc. This is for the same reason as SYS.
    2. Can we perform the export in Oracle 9i and use Data Pump import on the target 11g?
    No, the dumpfiles are formatted differently. If you use exp, then you must use imp. If you use expdp, then you must use impdp. You can do exp on 9i and imp on 11g with the dumpfile that was created on 9i.
    3. What is the best approach: a) perform a schema-by-schema export, or b) perform a full database export with exp / file=xxx.dmp log=xxxx.log full=y?
    This is a case-by-case decision. It depends on what you want. If you want the complete database moved, then I would personally think a full=y export is what you want. If you just did schema exports, you would never export the tablespaces, which would mean you would have to create the tablespaces on the target system before you ran imp. There are other objects that are not exported when a schema-level export is performed but are exported when a full export is performed. This information can be seen in the Utilities guide; look at what is exported in user/schema mode vs. full/database mode.
    Since there is a database version difference, I don't want to touch SYS and SYSTEM schema objects.
    This is all handled for you by the internal workings of exp/imp.
    Dean
    Edited by: Dean Gagne on Jul 29, 2009 8:38 AM

  • Using export/import to migrate data from 8i to 9i

    We are trying to migrate all data from an 8i database to a 9i database. We plan to migrate the data using the export/import utility so that we can keep the current 8i database intact. The 8i and 9i databases will reside on the same machine. Our 8i database size is around 300GB.
    We plan to follow below steps :
    Export data from 8i
    Install 9i
    Create tablespaces
    Create schema and tables
    create user (user used for exporting data)
    Import data in 9i
    Please let me know if the par file below is correct for the export:
    BUFFER=560000
    COMPRESS=y
    CONSISTENT=y
    CONSTRAINTS=y
    DIRECT=y
    FEEDBACK=1000
    FILE=dat1.dmp, dat2.dmp, dat3.dmp (more filenames here)
    FILESIZE=2048GB
    FULL=y
    GRANTS=y
    INDEXES=y
    LOG=export.log
    OBJECT_CONSISTENT=y
    PARFILE=exp.par
    ROWS=y
    STATISTICS=ESTIMATE
    TRIGGERS=y
    TTS_FULL_CHECK=TRUE
    Thanks,
    Vinod Bhansali

    I recommend you change some parameters and remove others:
    BUFFER=560000
    COMPRESS=y          -- gives a better storage structure (good)
    CONSISTENT=y
    CONSTRAINTS=y
    DIRECT=n            -- if you set this parameter to yes you can have problems with some objects
    FEEDBACK=1000
    FILE=dat1.dmp, dat2.dmp, dat3.dmp (more filenames here)
    FILESIZE=2048GB
    FULL=y
    GRANTS=y            -- this value is the default (not necessary)
    INDEXES=y
    LOG=export.log
    OBJECT_CONSISTENT=y -- start the database in restricted mode and do not set this param
    PARFILE=exp.par
    ROWS=y
    STATISTICS=ESTIMATE -- this value is the default (not necessary)
    TRIGGERS=y          -- this value is the default (not necessary)
    TTS_FULL_CHECK=TRUE
    you can see what parameters are not needed if you apply
    this command:
    [oracle@ozawa oracle]$ exp help=y
    Export: Release 9.2.0.1.0 - Production on Sun Dec 28 16:37:37 2003
    Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
    You can let Export prompt you for parameters by entering the EXP
    command followed by your username/password:
    Example: EXP SCOTT/TIGER
    Or, you can control how Export runs by entering the EXP command followed
    by various arguments. To specify parameters, you use keywords:
    Format: EXP KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
    Example: EXP SCOTT/TIGER GRANTS=Y TABLES=(EMP,DEPT,MGR)
    or TABLES=(T1:P1,T1:P2), if T1 is partitioned table
    USERID must be the first parameter on the command line.
    Keyword Description (Default) Keyword Description (Default)
    USERID username/password FULL export entire file (N)
    BUFFER size of data buffer OWNER list of owner usernames
    FILE output files (EXPDAT.DMP) TABLES list of table names
    COMPRESS import into one extent (Y) RECORDLENGTH length of IO record
    GRANTS export grants (Y) INCTYPE incremental export type
    INDEXES export indexes (Y) RECORD track incr. export (Y)
    DIRECT direct path (N) TRIGGERS export triggers (Y)
    LOG log file of screen output STATISTICS analyze objects (ESTIMATE)
    ROWS export data rows (Y) PARFILE parameter filename
    CONSISTENT cross-table consistency(N) CONSTRAINTS export constraints (Y)
    OBJECT_CONSISTENT transaction set to read only during object export (N)
    FEEDBACK display progress every x rows (0)
    FILESIZE maximum size of each dump file
    FLASHBACK_SCN SCN used to set session snapshot back to
    FLASHBACK_TIME time used to get the SCN closest to the specified time
    QUERY select clause used to export a subset of a table
    RESUMABLE suspend when a space related error is encountered(N)
    RESUMABLE_NAME text string used to identify resumable statement
    RESUMABLE_TIMEOUT wait time for RESUMABLE
    TTS_FULL_CHECK perform full or partial dependency check for TTS
    VOLSIZE number of bytes to write to each tape volume
    TABLESPACES list of tablespaces to export
    TRANSPORT_TABLESPACE export transportable tablespace metadata (N)
    TEMPLATE template name which invokes iAS mode export
    Export terminated successfully without warnings.
    [oracle@ozawa oracle]$
    Joel Pérez

  • OWB - issue in export  import

    hi,
    When I do an export/import of an OWB mapping containing a MERGE statement from one OWB repository to another, some of the columns are missing in the MERGE. I am using OWB 10g R2.
    E.g.: if the MERGE is based on columns c1, c2, c3, then after the export/import of the mapping to a different environment, the imported mapping has the MERGE based on only c1.
    Thanks

    There is Bug 5705198: LOADING PROPERTIES CHANGED AFTER MDL EXPORT/IMPORT (fixed in the 10.2.0.4 OWB patchset), a similar problem with a lost MATCH COLUMN WHEN UPDATING ROW property during MDL import.
    Maybe it is your case.
    There is no workaround for this problem, only patching to OWB 10.2.0.4.
    Regards,
    Oleg
