Logical cloning using IMPDP

-- Created one database, db1, using DBCA and created one table, test, with a few records.
-- Took a full export: impdp sys/oracle@db1 directory=export dumpfile=full_exp.dmp logfile=full_exp.log full=y
-- The next day we created one more database using the CREATE DATABASE command (database name: export).
-- Executed catalog.sql, catproc.sql and pubutils.sql.
-- impdp sys/oracle@db1 directory=export dumpfile=full_imp.dmp logfile=full_imp.log full=y
But it is importing the database from full_exp.dmp into the export database, with no access; the table test that I created under db1 is not there.
NEED HELP !!!

1005255 wrote:
-- Created one database, db1, using DBCA and created one table, test, with a few records.
-- Took a full export: impdp sys/oracle@db1 directory=export dumpfile=full_exp.dmp logfile=full_exp.log full=y
-- The next day we created one more database using the CREATE DATABASE command (database name: export).
-- Executed catalog.sql, catproc.sql and pubutils.sql.
-- impdp sys/oracle@db1 directory=export dumpfile=full_imp.dmp logfile=full_imp.log full=y
In addition to the other comments, even if this syntax were valid (i.e., contained no parameters unknown to impdp), you are showing that you tried to connect to 'db1' to perform the import. That's the same database you said you did the "export" from (an export with "impdp"? I don't think so ...).
So, like SB said, nothing you state is believable. Copy and paste your actual commands and responses, not your transcription of what you think you did.
This has nothing to do with installation, which is the subject of this forum ....
But it is importing the database from full_exp.dmp into the export database, with no access; the table test that I created under db1 is not there.
NEED HELP !!!

Edited by: EdStevens on May 10, 2013 11:23 AM
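For reference, a typical Data Pump round trip for the scenario described above would look roughly like the following sketch (the directory object, accounts and connect strings are illustrative, not the poster's actual values): the export is taken with expdp on db1, and the import connects to the new database and names the dump file that was actually produced.

# on the source database db1 (a full export is taken with expdp, not impdp)
expdp system/oracle@db1 directory=export dumpfile=full_exp.dmp logfile=full_exp.log full=y
# on the new target database "export": a DIRECTORY object must exist there
# and point to the folder that holds full_exp.dmp
impdp system/oracle@export directory=export dumpfile=full_exp.dmp logfile=full_imp.log full=y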

Similar Messages

  • Logical path using in abap program

    Hey,
    I want to create a file in an application server directory.
    I want the user to select the logical path and file name on the selection screen, so that the file can be created and saved in the selected logical path (the path used must come from the instance profile, e.g. the DIR_LOGGING or GLOBALPATH directory).
    Could you please tell me how to do this using a parameter for the user to select the file path?
    Do we have a function module for retrieving the physical path from a logical path?
    Could you please help me?
    ambichan.

    Hey,
    Thanks for your reply.
    Instead of using the logical file name as input, can we allow the user to select the logical path using a parameter?
    I want the user to select the logical path or logical file name from a parameter. Is that possible?
    (I want to avoid free-text input and instead allow the user to select the path.)
    ambichan
    Message was edited by: ambi chan

  • Can we use impdp to import the data from a normal exp dump?

    Hi All,
    I have an export dump taken from a 9i database. Can I use impdp to import the data into a 10g database from that 9i exp dump?
    Please suggest.
    Thanks and regards,
    Arun

    Hi,
    I have an export dump taken from a 9i database. Can I use impdp to import the data into a 10g database from that 9i exp dump?
    The data can be brought into 10g, but not with impdp: Data Pump only reads dump files created by expdp, so a dump taken with the original exp utility has to be loaded with the original imp utility (still shipped with 10g).
    Refer:
    http://wiki.oracle.com/thread/3734722/can+a+9i+dump+file+be+imported+into+10%3F
    thanks,
    X A H E E R
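    For what it's worth, a minimal sketch of that approach (the connect string, file name and account are placeholders): the 9i dump is loaded with the original imp utility.
    imp system/manager@db10g file=exp9i.dmp log=imp9i.log full=y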

  • Getting Errors while using IMPDP command in UNIX

    Hi All,
    My work involves refreshing a DB with the latest data. For this I am using the IMPDP command to import the data. But every time during the refresh I get errors like "ORACLE generated errors: ORACLE not available".
    I don't know the reason for this error, and on top of that my DB goes down after it. I then start the DB up again and try my luck to complete the process.
    So please tell me what problems might actually cause IMPDP to fail like this.
    Following one piece of advice, I stopped my listener and did the import; interestingly, IMPDP then worked without errors. But this does not always work either.
    However, I feel that stopping and starting the listener every time is not the correct solution.
    So please help me in this regard.
    Regards
    Naveen R.

    The error I am getting is:
    UDI-03113: operation generated ORACLE error 3113
    ORA-03113: end-of-file on communication channel
    Process ID: 6208
    Session ID: 346 Serial number: 298
    UDI-03114: operation generated ORACLE error 3114
    ORA-03114: not connected to ORACLE
    [user1@node1 cpt]$
    DB version is 11.1.0.6.0
    OS is Red Hat Linux.
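    One suggestion not taken from the thread itself: since the instance apparently crashes during the import, the alert log and trace files written at the time of the ORA-03113 usually show the real cause. On 11g they can be inspected with ADRCI, for example:
    # show the last 100 lines of the alert log for the current ADR home
    adrci exec="show alert -tail 100"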

  • How can I replace existing Oracle sequences using impdp?

    I use impdp to transport table data from one database instance to another (on a regular basis).
    I do this by using a database link.
    Here is the content of the parameter file for the impdp:
    NETWORK_LINK=dbl_backoffice
    INCLUDE=TABLE:" in ('SSU_CUSTOMERS','SSU_ORDERS','SSU_ORDERLINES')"
    INCLUDE=SEQUENCE:" in ('SEQ_ORDERNR','SEQ_CUSTNR')"
    TABLE_EXISTS_ACTION=REPLACE
    Now, the mentioned tables are neatly replaced; however, the sequence objects aren't, because impdp reports:
    ORA-31684: Object type SEQUENCE:"SSU"."SEQ_ORDERNR'" already exists
    ORA-31684: Object type SEQUENCE:"SSU"."SEQ_CUSTNR'" already exists
    It is understandable, because impdp issues CREATE SEQUENCE statements and fails because they are already there (and there is no such import parameter as SEQUENCE_EXISTS_ACTION=REPLACE or the like).
    I don't want to do a complete schema import, since I only want to refresh front-office data with a part of the back-office data.
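    Since impdp has no SEQUENCE_EXISTS_ACTION, one common workaround (a sketch only; the schema is assumed to be SSU, as in the error messages) is to drop the two sequences on the target immediately before the import, so that the CREATE SEQUENCE statements coming over the database link succeed and the sequences are recreated with the source's current values:
    DROP SEQUENCE ssu.seq_ordernr;
    DROP SEQUENCE ssu.seq_custnr;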

    You need a plugin called Flip4Mac to create a WMV on a Mac.
    It plugs in to QuickTime.
    WMV is generally unavailable on Macs because it's a Windows thing (Windows Media Video),
    just like ProRes is generally not available on a Windows system.

  • Logical column using data source from 2 generations of same hierarchy

    Hi experts,
    I'm using Essbase as my data source in the CEIM physical layer,
    and I have a hierarchy called "Entity" which contains different levels of companies.
    In Generation 2 I have only one member called "group totals", and in Generation 3 there are 5 members representing 5 different industries.
    I need to use these 6 members as a slider on top of the view (a dial),
    and the measure I want to show is a rate, so I can't simply sum up those five members in Gen3 to get the Gen2 measure.
    I tried to create a logical column using Entity-default (the alias table) as the data source; it worked but was not sorted in outline order.
    I tried to sort them using calculated items in selection steps, but calculated items apparently cannot be shown as a section or slider.
    So I wonder if I can simply create a logical column sourced from different generations (in this case, my Gen2 and Gen3) of the same hierarchy.
    Is it possible to do such a thing?
    Thanks for any reply.

    You could try General Database Discussions, the main DB forum.
    What are you using to migrate your database? Why is it being mapped from varchar2(8) to varchar2(32)? It sounds like someone/something is intervening here.
    Barry

  • DB Upgrade+Migrate from 10.1 to 11.2 using IMPDP with network_link param

    Dear all,
    I would like to upgrade and migrate my 10.1.0.5 DB on the old server to 11.2.0.2 on a new server.
    Here is the background info:
    Old server:
    OS : Redhat Linux AS 2.1
    DB Version : 10.1.0.5 (32 bit)
    No RAC
    New Server:
    OS : OEL 5.5
    DB Version : 11.2.0.2
    RAC
    ASM
    What I have done so far:
    1. Created a new clustered 11gR2 DB on the new server.
    2. Pre-created tablespaces on the new DB.
    3. Migrated 10.1.0.5 to 11.2.0.2 using IMPDP:
    impdp system/******* DIRECTORY=dump_file_dir NETWORK_LINK=DWH_DBLINK LOGFILE=log_file_dir:DWH_import_20110621.log FULL=Y SERVICE_NAME=dwhdb.xxx.xxx TABLE_EXISTS_ACTION=replace cluster=N exclude=statistics
    4. After IMPDP completed, invalid objects and components were found; running utlrp.sql did not help.
    SQL> select owner, count(*) from dba_objects where status = 'INVALID' group by owner;
    OWNER COUNT(*)
    WKSYS 16
    PUBLIC 12
    OLAPSYS 7
    ODM 21
    SYS 2
    WMSYS 11
    12 rows selected.
    SQL> select comp_name, status, version from dba_registry;
    COMP_NAME STATUS VERSION
    OWB VALID 11.2.0.2.0
    Oracle Application Express VALID 3.2.1.00.12
    Oracle Enterprise Manager VALID 11.2.0.2.0
    OLAP Catalog VALID 11.2.0.2.0
    Spatial VALID 11.2.0.2.0
    Oracle Multimedia VALID 11.2.0.2.0
    Oracle XML Database VALID 11.2.0.2.0
    Oracle Text VALID 11.2.0.2.0
    Oracle Expression Filter VALID 11.2.0.2.0
    Oracle Rules Manager VALID 11.2.0.2.0
    Oracle Workspace Manager INVALID 11.2.0.2.0
    Oracle Database Catalog Views VALID 11.2.0.2.0
    Oracle Database Packages and Types INVALID 11.2.0.2.0
    JServer JAVA Virtual Machine VALID 11.2.0.2.0
    Oracle XDK VALID 11.2.0.2.0
    Oracle Database Java Packages VALID 11.2.0.2.0
    OLAP Analytic Workspace VALID 11.2.0.2.0
    Oracle OLAP API VALID 11.2.0.2.0
    Oracle Real Application Clusters VALID 11.2.0.2.0
    19 rows selected.
    5. Checked SYS's invalid objects, e.g. DBA_OUTLINE_HINTS; after tracing the cause, I found that outln.ol$hints had been replaced by the 10.1.0.5 version. I think this is due to IMPDP's "TABLE_EXISTS_ACTION=replace" parameter.
    Other invalid objects, like WMSYS.WM$ENV_VARS, were also replaced by the old version.
    What should I do now?
    Do I need to run the upgrade scripts after a DB migration using IMPDP?
    Is the migration procedure correct?
    Please advise
    Thanks in advance

    Hi,
    It looks like you've messed up the Oracle-maintained (default) users' data and/or metadata, i.e. SYS, SYSTEM, WKSYS.
    How many non-default (application) users do you have, and how big is the database? If you're using this method I'm assuming the data is not really big.
    I personally would not do a full export/import across different versions; it's better to export just the non-default application schemas, as otherwise you may end up messing up the SYS objects, etc.
    What you might do is:
    - drop the 11g database and start from the beginning again, but import only the application schemas and exclude the Oracle default users, e.g. SYS, SYSTEM, etc. (see the sketch below)
    - or try running the catupgrd.sql script (this will drop and recreate the objects); this may or may not fix your invalid components
    Cheers
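    As a rough sketch of that schema-only approach (the schema names here are placeholders; the link, directory and other settings are reused from the original command): importing only the application schemas instead of FULL=Y means SYS, SYSTEM, WKSYS and other Oracle-maintained schemas are never touched.
    impdp system/******* DIRECTORY=dump_file_dir NETWORK_LINK=DWH_DBLINK SCHEMAS=dwh_app,dwh_stage LOGFILE=log_file_dir:DWH_schema_import.log TABLE_EXISTS_ACTION=replace cluster=N exclude=statistics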

  • Logic 8 using only one CPU while bouncing

    Whenever I bounce a project, Logic 8 uses only one CPU. It also spikes this CPU completely, with the CPU meter going into the red area, which is why I am rather keen on persuading Logic to distribute the load.
    Any ideas?
    (I've given the support document with tips on how to balance multi-core performance a good read, but it doesn't seem to say anything on this particular subject, and following the advice in the document does nothing to change my CPU load.)

    Bouncing is a single-core process, just like rendering is in Final Cut Pro. This won't change until Leopard and Apple's pro apps get better at handling multiple cores.
    It's redlining because the processor core is working as fast as it can. Computer processors don't just randomly run slower; they use their resources for the task at hand. If it only used 50% of a core, the bounce would take twice as long to process.

  • ORA-02374: conversion error loading table during import using IMPDP

    HI All,
    We are trying to migrate the data from one database to another database.
    The source database has the character set:
    SQL> select value from nls_database_parameters where parameter='NLS_CHARACTERSET';
    VALUE
    US7ASCII
    The destination database has the character set:
    SQL> select value from nls_database_parameters where parameter='NLS_CHARACTERSET';
    VALUE
    AL32UTF8
    We took an export of the whole database using expdp, and when we try to import into the destination database using impdp we get the following error:
    ORA-02374: conversion error loading table <TABLE_NAME>
    ORA-12899: value too large for column <COLUMN NAME> (actual: 42, maximum: 40)
    ORA-02372: data for row:<COLUMN NAME> : 0X'4944454E5449464943414349E44E204445204C4C414D414441'
    Kindly let me know how to overcome this issue on the destination.
    Thanks & Regards,
    Vikas Krishna

    Hi,
    You can overcome this issue by increasing the column width in the target database to the maximum length required, so that all data can be imported successfully into the table.
    Regards
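    For example (the owner, table and column names are placeholders), you could either widen the affected column before re-running the import, or switch it to character-length semantics, since US7ASCII data can expand to several bytes per character in AL32UTF8:
    ALTER TABLE app_owner.customers MODIFY (customer_name VARCHAR2(60));
    -- or keep the logical length but measure it in characters instead of bytes
    ALTER TABLE app_owner.customers MODIFY (customer_name VARCHAR2(40 CHAR));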

  • Re: How to Clone Using the Restore Option of Disk Utility

    This is a revision to original post:
    How to Clone Using Restore Option of Disk Utility
    1. Open Disk Utility from the Utilities folder.
    2. Select the backup or destination volume from the left side list.
    3. Click on the Erase tab in the DU main window.
    4. Set the format type to Mac OS Extended (Journaled) and click on the Erase button. Wait for volume to remount on the Desktop.
    5. Click on the Restore tab in the DU main window.
    6. Select the startup or source volume from the left side list and drag it to the Source entry field.
    7. Select the backup or destination volume from the left side list and drag it to the Destination entry field.
    8. Select the destination drive icon on the Desktop and press COMMAND-I to open the Get Info window. At the bottom in the Ownership and Permissions section be sure the box labeled "Ignore Permissions on this Volume" is unchecked.
    9. Double-check you got it right, then click on the Restore button.

    There is a checkbox option in the Restore dialog to Erase the Destination. Be sure you uncheck the option.

  • Moving Large amount of data using IMPDP with network link

    Hi Guru,
    We have a requirement to move 2 TB of data from production to non-prod using the NETWORK_LINK parameter. What is the best way to make it fast?
    We did this previously, but it took 7 days to import the data and indexes.
    I have an idea; can you please tell me whether it is a good way to make the import faster?
    Step 1) Import only the metadata.
    Step 2) Import only the table data, using table_exists_action=append or truncate. (Here the indexes are already created in step 1, and the import will be fast as per my plan.)
    Please suggest a better way if there is one.
    Thanks & Regards,
    Venkata Poorna Prasad.S

    You might want to check these as well:
    DataPump Import (IMPDP) Over NETWORK_LINK Is Sometimes Very Slow (Doc ID 1439691.1)
    DataPump Import Via NETWORK_LINK Is Slow With CURSOR_SHARING=FORCE (Doc ID 421441.1)
    Performance Problems When Transferring LOBs Using IMPDP With NETWORK_LINK (Doc ID 1488229.1)
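    For illustration, the two-step plan described in the question could look roughly like this (the connect string, database link name and log file names are placeholders):
    # step 1: metadata only (tables, indexes, grants, etc.)
    impdp system/***** NETWORK_LINK=prod_link FULL=Y CONTENT=METADATA_ONLY LOGFILE=imp_meta.log
    # step 2: data only, loaded into the pre-created tables
    impdp system/***** NETWORK_LINK=prod_link FULL=Y CONTENT=DATA_ONLY TABLE_EXISTS_ACTION=APPEND LOGFILE=imp_data.log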

  • Problem while creating logical port using soamanager

    Hi all,
    I have created a client proxy for a web service from a 3rd-party system.
    When I create a logical port for the same consumer proxy, I get an error as follows:
    RABAX_STATE -e: UNCAUGHT_EXCEPTION
    and a dump saying:
    "The exception 'CX_SXMLP' was raised, but it was not caught anywhere along the call hierarchy. Since exceptions represent error situations and this error was not adequately responded to, the running ABAP program 'CL_SXMLP_FRAGMENT=============CP' has to be terminated."
    Please suggest what can be done or what the problem might be.
    Thanks in advance.
    Komal

    Please go to transaction SM59 and check whether the RFC destination for the web service works in the connection test.
    Then go to transaction LPCONFIG and create the logical port using that RFC destination. Please set the path suffix appropriately.
    Next, in your code, pass the logical port name in the constructor when instantiating the client proxy (if the LP is not maintained as the default).
    Please reward points if the tips are helpful.

  • Importing tablespace using impdp

    Hi,
    I am trying to transport a tablespace using expdp/impdp like this:
    Source side: expdp dumpfile=tbs1.dmp logfile=tbs1.log directory=datapump transport_tablespaces=tbs1
    Transferred to the target side.
    Target side: impdp dumpfile=tbs1.dmp directory=datapump transport_datafiles='/u01/app/oracle/sales/tbs1.dbf'
    It is giving:
    ORA-39123: Data Pump transportable tablespace job aborted
    ORA-29345: cannot plug a tablespace into a database using an incompatible character set
    When I searched on Google, some suggestions were to do the transport using RMAN.
    But when we have the option in impdp, why do I need to go to the RMAN tool?
    I want to use impdp only, but I am unable to use it properly while importing.
    Can you please advise me?
    thank you!

    Hello,
    ORA-29345: cannot plug a tablespace into a database using an incompatible character set
    It cannot work, due to the error ORA-29345 above:
    ORA-29345: cannot plug a tablespace into a database using an incompatible character set
    Cause: Oracle does not support plugging a tablespace into a database using an
    incompatible character set.
    Action: Use import/export or unload/load to move data instead.
    I want to use impdp only, but I am unable to use it properly while importing.
    So, as suggested, move the data by export/import instead. But be careful: since the databases have different character sets, you may run into character conversions.
    Hope this helps.
    Best regards,
    Jean-Valentin
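    If the character sets cannot be made compatible, one alternative (a sketch only; the schema name is a placeholder, and the target must already have a suitable tablespace or a REMAP_TABLESPACE mapping) is a conventional Data Pump export/import of the schemas stored in tbs1:
    expdp system/***** directory=datapump dumpfile=tbs1_schemas.dmp logfile=tbs1_exp.log schemas=sales_owner
    impdp system/***** directory=datapump dumpfile=tbs1_schemas.dmp logfile=tbs1_imp.log schemas=sales_owner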

  • What's the maximum RAM logic can use?

    As we all know, the Mac Pros can sport up to 16 GB of RAM. The question is: how much RAM can Logic use?
    I use piles of samples, so I'd love to stick 8 GB in the Mac Pro (that I haven't bought yet).
    Is it pointless to spend all that cash?
    If it can only use 4 GB currently (as I hear most apps do),
    has anyone heard of plans to upgrade Logic to use more?

    I'm pretty sure that the Mac I have right now (dual 1.8 GHz G5) will take up to 8 GB. I have 5.5 GB in there right now. I'll try tonight and see if I can max the usage up to 5.

  • Memory fault while using impdp

    I'm unable to import a database using impdp; as soon as I start the impdp command it errors out with a memory fault.
    I was previously able to import 1 TB out of a total of 3 TB, and then got an error related to the new MEMORY_TARGET parameter complaining about insufficient tmpfs.
    I resolved that error, but since then I can't invoke impdp without the memory fault error described below.
    $ nohup impdp "'/ as sysdba'" JOB_NAME=LANCE_FULL_01 parfile=auexpdp.dat &
    [1] 32118
    $ nohup: appending output to `nohup.out'
    [1] + Memory fault nohup impdp "'/ as sysdba'" JOB_NAME=LANCE_FULL_01 parfile=auexpdp.dat &
    $
    I also get the following error in the import log:
    ORA-39012: Client detached before the job started.
    This is what my parfile looks like:
    directory=dmpdir
    dumpfile=aexp1_%U.dmp,aexp2_%U.dmp,aexp3_%U.dmp,aexp4_%U.dmp
    parallel=8
    full=y
    exclude=SCHEMA:"='MDDATA'"
    exclude=SCHEMA:"='OLAPSYS'"
    exclude=SCHEMA:"='ORDSYS'"
    exclude=SCHEMA:"='DMSYS'"
    exclude=SCHEMA:"='OUTLN'"
    exclude=SCHEMA:"='ORDPLUGINS'"
    include=tablespace
    #transform=oid:n
    logfile=expdpapps.log
    trace=1FF0300
    Has anybody seen this type of error before?
    Thanks

    Please post details of OS and database versions.
    >
    $ nohup impdp "'/ as sysdba'" JOB_NAME=LANCE_FULL_01 parfile=auexpdp.dat &
    >
    You should not use '/ as sysdba' for expdp/impdp; use the SYSTEM account instead. See the first NOTE sections in these links:
    http://docs.oracle.com/cd/E11882_01/server.112/e22490/dp_export.htm#i1012781
    http://docs.oracle.com/cd/E11882_01/server.112/e22490/dp_import.htm#i1012504
    HTH
    Srini
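    With the SYSTEM account, the invocation would look something like this (the password is a placeholder):
    nohup impdp system/manager JOB_NAME=LANCE_FULL_01 parfile=auexpdp.dat &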

Maybe you are looking for

  • Problem with text formatting within a table in Adobe Forms

    Hi all, I have a table on a non-interactive Adobe form (which is a copy of the standard expense form PTRV_EXPENSE_FORM). The table in ABAP has a text field of type CHAR50. On the form, the text field has a width which allows approximately 35 characters

  • Activating Photoshop CS2?

    Help: I have CS2 and Photoshop is giving me an error message that I have to get a new authorization because it detected changes to my computer (running Windows 7); the options given to get authorization are no longer valid (tried two phone numbers and

  • TIme Machine verification issue (Drobo and Airport Extreme)

    Hello. I recently purchased a Drobo (1st gen, off eBay) to replace a 1TB WD MyBook I was using as a Time Machine backup and media server. The Drobo is connected to an Airport Extreme base station over USB 2.0. My Mac is connected by Ethernet to the A

  • Work center in material determination of DIP profile

    Hello, the source of our DIP profile is 0001 Actual costs - line items. We have the requirement to use the work center as a criterion in material determination. But in DP90 we get a material determination error. The issue is that in the failed combination

  • Counting specific value for a characterstics in Infoset query

    Hi Experts, I have a query built on an infoset which includes some characteristics. Now for one characteristic it is showing the following values in the query: CHARACTERSTICS#NAME: YES, NO, YES, YES, NOT ASSIGNED, N