Using Data Pump from Grid Control to import a schema

Hi,
I have two Oracle 10g databases on two different Windows servers. The databases are called source and dest.
In dest I have created a database link to source.
I want to import a schema from source to dest.
I log on to Grid Control, and on the Maintenance tab I click the Import from Database link.
I choose to import from a schema, using the database link I created earlier.
I then choose the schema name.
On the Import From Database: Options page, I expand the advanced options.
I choose Exclude Only Objects Specified Below and I add a row that looks like this:
object type: TABLE
object name expression: EXCLUDE=TABLE:"IN('TEST6', 'TEST7')"
When I submit the job, I get this error message:
Import Submit Failed
Errors: ORA-39001: invalid argument value; ORA-39071: Value for NAME_EXPR is badly formed; ORA-00920: invalid relational operator.
I have tried several ways, but I am still getting the same error.
Any suggestion will be appreciated.

Hi,
Thanks for your reply.
I have already tried the following with backslashes:
EXCLUDE=TABLE:\"IN('TEST6', 'TEST7')\"
and I get the same error message.
If you have the correct syntax could you send it to me?
Regards.
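
For what it's worth, the likely culprit is the EXCLUDE=TABLE: prefix itself. That prefix is expdp/impdp command-line syntax; the Grid Control field, like the underlying DBMS_DATAPUMP API it drives, expects only the name expression, e.g. IN ('TEST6', 'TEST7'). For comparison, a minimal sketch of the equivalent API call, with a hypothetical database link SOURCE_LINK and schema MYSCHEMA:

DECLARE
  h NUMBER;
BEGIN
  -- Network-mode schema import over a database link (names are illustrative)
  h := DBMS_DATAPUMP.OPEN(operation   => 'IMPORT',
                          job_mode    => 'SCHEMA',
                          remote_link => 'SOURCE_LINK');
  -- Limit the job to the schema being copied
  DBMS_DATAPUMP.METADATA_FILTER(h, 'SCHEMA_EXPR', 'IN (''MYSCHEMA'')');
  -- Exclude the two tables: the value is only the expression itself,
  -- with no EXCLUDE=TABLE: prefix
  DBMS_DATAPUMP.METADATA_FILTER(h, 'NAME_EXPR',
                                'NOT IN (''TEST6'', ''TEST7'')', 'TABLE');
  DBMS_DATAPUMP.START_JOB(h);
  DBMS_DATAPUMP.DETACH(h);
END;
/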

Similar Messages

  • What are the 'gotchas' for exporting using Data Pump (10.2.0.5) from HPUX to Windows

    Hello,
    I have to export a schema using Data Pump from a 10.2.0.5 database on HPUX 64-bit to a Windows 64-bit database on the same 10.2.0.5 version. What gotchas can I expect from doing this? I mean, Data Pump export is cross-platform, so this sounds straightforward. But are there issues I might face exporting with Data Pump on the HPUX platform and then importing the dump on the Windows 2008 platform, same database version? Thank you in advance.

    On the HPUX database, run this statement and look for the value for NLS_CHARACTERSET:
    SQL> select * from NLS_DATABASE_PARAMETERS;
    http://docs.oracle.com/cd/B19306_01/server.102/b14237/statviews_4218.htm#sthref2018
    When creating the database on Windows, you have two options - manually create the database or use DBCA. If you plan to create the database manually, specify the database characterset in the CREATE DATABASE statement - http://docs.oracle.com/cd/B19306_01/server.102/b14200/statements_5004.htm#SQLRF01204
    If using DBCA, see http://docs.oracle.com/cd/B19306_01/server.102/b14196/install.htm#ADMQS0021 (especially http://docs.oracle.com/cd/B19306_01/server.102/b14196/install.htm#BABJBDIF)
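    For instance, a skeletal CREATE DATABASE statement carrying the characterset clause might look like the following (the database name and characterset are placeholders; use the value returned by the query above):
    CREATE DATABASE newdb
      CHARACTER SET WE8ISO8859P1
      NATIONAL CHARACTER SET AL16UTF16;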
    HTH
    Srini

  • How can we import objects from one user to another user using Data Pump

    Hi,
    I have taken a full export with expdp (Data Pump). Now I want to import the objects of a user from one database into a different user in another database.
    Please advise.
    Thanks

    Hi,
    impdp user/password@db DIRECTORY=DATA_PUMP_DIR DUMPFILE=FILENAME.dmp LOGFILE=IMPORT.LOG REMAP_SCHEMA=source_schema:target_schema
    (DATA_PUMP_DIR is the default directory object.)
    1. Before importing, check that the importing user has READ and WRITE privileges on the directory.
    2. Always add a LOGFILE clause; it helps.
    3. Add TABLE_EXISTS_ACTION=REPLACE if you want the target schema's tables to be replaced by the source schema's tables (where both have the same table).
    Regards,
    NEerav

  • Migration from 10g to 12c using data pump

    Hi there. While I've used Data Pump at the schema level before, I'm rather new to full database imports.
    We are attempting a full database migration from 10.2.0.4 to 12c using the full-database Data Pump method over a database link.
    The DBA has advised that we avoid moving SYSTEM and SYSAUX objects, and initially, when reviewing the documentation, it appeared that these objects would not be exported from the source system given TRANSPORTABLE=NEVER. Can someone confirm this? The export/import log refers to objects that I believed would not be targeted:
    23-FEB-15 19:41:11.684:
    Estimated 3718 TABLE_DATA objects in 77 seconds
    23-FEB-15 19:41:12.450: Total estimation using BLOCKS method: 52.93 GB
    23-FEB-15 19:41:14.058: Processing object type DATABASE_EXPORT/TABLESPACE
    23-FEB-15 20:10:33.185: ORA-31684: Object type TABLESPACE:"UNDOTBS1" already exists
    23-FEB-15 20:10:33.185: ORA-31684: Object type TABLESPACE:"SYSAUX" already exists
    23-FEB-15 20:10:33.185: ORA-31684: Object type TABLESPACE:"TEMP" already exists
    23-FEB-15 20:10:33.185: ORA-31684: Object type TABLESPACE:"USERS" already exists
    23-FEB-15 20:10:33.200:
    Completed 96 TABLESPACE objects in 1759 seconds
    23-FEB-15 20:10:33.208: Processing object type DATABASE_EXPORT/PROFILE
    23-FEB-15 20:10:33.445:
    Completed 7 PROFILE objects in 1 seconds
    23-FEB-15 20:10:33.453: Processing object type DATABASE_EXPORT/SYS_USER/USER
    23-FEB-15 20:10:33.842:
    Completed 1 USER objects in 0 seconds
    23-FEB-15 20:10:33.852: Processing object type DATABASE_EXPORT/SCHEMA/USER
    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"OUTLN" already exists
    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"ANONYMOUS" already exists
    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"OLAPSYS" already exists
    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"MDDATA" already exists
    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"SCOTT" already exists
    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"LLTEST" already exists
    23-FEB-15 20:10:52.372:
    Completed 1140 USER objects in 19 seconds
    23-FEB-15 20:10:52.375: Processing object type DATABASE_EXPORT/ROLE
    23-FEB-15 20:10:55.255: ORA-31684: Object type ROLE:"SELECT_CATALOG_ROLE" already exists
    23-FEB-15 20:10:55.255: ORA-31684: Object type ROLE:"EXECUTE_CATALOG_ROLE" already exists
    23-FEB-15 20:10:55.255: ORA-31684: Object type ROLE:"DELETE_CATALOG_ROLE" already exists
    23-FEB-15 20:10:55.256: ORA-31684: Object type ROLE:"RECOVERY_CATALOG_OWNER" already exists
    Any insight is most appreciated.

    The schemas SYS, CTXSYS, MDSYS and ORDSYS are not exported by exp/expdp.
    Doc ID: Note 228482.1
    I suppose the 12c software was already installed and the database created, so when you imported you got these "already exists" errors.
    Whenever a database is created and the software installed, SYSTEM, SYS and SYSAUX are created by default.
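    The ORA-31684 messages themselves are informational: the import simply skips objects that already exist. If you want a quieter log, you can also exclude object types a freshly created database already provides. A sketch of an import parameter file for the full network import (the database link name is illustrative):
    # Full network import, skipping infrastructure a new 12c database already has
    FULL=Y
    NETWORK_LINK=source_10g
    EXCLUDE=TABLESPACE
    DIRECTORY=DATA_PUMP_DIR
    LOGFILE=full_import.log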

  • Data Guard configuration for RAC database disappeared from Grid control

    Primary Database Environment - Three node cluster
    RAC Database 10.2.0.1.0
    Linux Red Hat 4.0 2.6.9-22 64bit
    ASM 10.2.0.1.0
    Management Agent 10.2.0.2.0
    Standby Database Environment - one Node database
    Oracle Enterprise Edition 10.2.0.1.0 Single standby
    Linux Red Hat 4.0 2.6.9-22 64bit
    ASM 10.2.0.1.0
    Management Agent 10.2.0.2.0
    Grid Control 10.2.0.1.0 - Node separate from standby and cluster environments
    Oracle 10.1.0.1.0
    Grid Control 10.2.0.1.0
    Red Hat 4.0 2.6.9-22 32bit
    After adding a logical standby database through Grid Control for a RAC database, I noticed some time later that the Data Guard configuration had disappeared from Grid Control. Not sure why, but it is gone. I did notice that something went wrong with the standby creation, but I did not get much feedback from Grid Control. The last thing I did was to view the configuration; see the output below.
    Initializing
    Connected to instance qdcls0427:ELCDV3
    Starting alert log monitor...
    Updating Data Guard link on database homepage...
    Data Protection Settings:
    Protection mode : Maximum Performance
    Log Transport Mode settings:
    ELCDV.qdx.com: ARCH
    ELXDV: ARCH
    Checking standby redo log files.....OK
    Checking Data Guard status
    ELCDV.qdx.com : ORA-16809: multiple warnings detected for the database
    ELXDV : Creation status unknown
    Checking Inconsistent Properties
    Checking agent status
    ELCDV.qdx.com
    qdcls0387.qdx.com ... OK
    qdcls0388.qdx.com ... OK
    qdcls0427.qdx.com ... OK
    ELXDV ... WARNING: No credentials available for target ELXDV
    Attempting agent ping ... OK
    Switching log file 672.Done
    WARNING: Skipping check for applied log on ELXDV : disabled
    Processing completed.
    Here are the steps followed to add the standby database in Grid Control
    Maintenance tab
    Setup and Manage Data Guard
    Logged in as sys
    Add standby database
    Create a new logical standby database
    Perform a live backup of the primary database
    Specify backup directory for staging area
    Specify standby database name and Oracle home location
    Specify file location staging area on standby node
    At the end I am presented with a review of the selected options, and then the standby database is created.
    Has anybody come across a similar issue?
    Thanks,

    Any resolution on this?
    I just created a Logical Standby database and I'm getting the same warning (WARNING: No credentials available for target ...) when I do a 'Verify Configuration' from the Data Guard page.
    Everything else seems to be working fine. Logs are being applied, etc.
    I can't figure out what credentials it's looking for.

  • Selecting tables when importing using the Data Pump API

    Hi,
    Sorry for the trivial question. I exported the data using the Data Pump API in "TABLE" mode,
    so all tables were exported into one .dmp file.
    My question is: how do I then import only a few tables using the Data Pump API? How do I define the "TABLES" property as on the command-line interface?
    Should I use the DATA_FILTER procedures? If yes, how?
    Thanks in advance
    Regards,
    Kahlil

    Hi,
    You should be using the metadata_filter procedure for this.
    e.g.:
    dbms_datapump.metadata_filter
                (handle1
                ,'NAME_EXPR'
                ,'IN (''TABLE1'', ''TABLE2'')'
                );
    Regards
    Anurag
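    For a fuller picture, a minimal sketch of a table-mode import restricted to two tables (the dump file, directory, and table names are illustrative):
    DECLARE
      h  NUMBER;
      js VARCHAR2(30);
    BEGIN
      -- Attach a table-mode import to the dump file created earlier
      h := DBMS_DATAPUMP.OPEN(operation => 'IMPORT', job_mode => 'TABLE');
      DBMS_DATAPUMP.ADD_FILE(h, 'export.dmp', 'DATA_PUMP_DIR');
      -- Equivalent of TABLES=TABLE1,TABLE2 on the impdp command line
      DBMS_DATAPUMP.METADATA_FILTER(h, 'NAME_EXPR', 'IN (''TABLE1'', ''TABLE2'')');
      DBMS_DATAPUMP.START_JOB(h);
      DBMS_DATAPUMP.WAIT_FOR_JOB(h, js);
    END;
    /
    DATA_FILTER, by contrast, restricts rows within tables (for example via a subquery), so for picking whole tables METADATA_FILTER with NAME_EXPR is the right tool.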

  • Best Approach for using Data Pump

    Hi,
    I configured a new database which I set up with schemas that I imported in from another production database. Now, before this database becomes the new production database, I need to re-import the schemas so that the data is up-to-date.
    Is there a way to use Data Pump so that I don't have to drop all the schemas first? Can I just export the schemas and somehow just overwrite what's in there already?
    Thanks,
    Nora

    Hi, you can use the NETWORK_LINK parameter to import data from the remote database.
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/dp_import.htm#i1007380
    Regards.
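    Building on that, a sketch of such a refresh over a database link (the link, schema, and file names are illustrative); TABLE_EXISTS_ACTION=REPLACE drops and re-creates tables that already exist, so the schemas do not have to be dropped first:
    impdp system/password@newdb NETWORK_LINK=prod_link SCHEMAS=hr,sales TABLE_EXISTS_ACTION=REPLACE DIRECTORY=DATA_PUMP_DIR LOGFILE=refresh.log
    Note that REPLACE applies to tables only; other pre-existing objects are simply skipped with ORA-31684 warnings.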

  • Data Pump - How to avoid exporting/importing dbms_scheduler jobs?

    Hi,
    I am using Data Pump to export a user's objects. When I import them, it also imports any jobs that user has created with dbms_scheduler. How can I avoid this? I tried EXCLUDE=JOBS, but no luck.
    Thanks,
    Jon.
    Here are my export and import parameter files:
    Export:
    DIRECTORY=dpump_dir1
    DUMPFILE=reveal.dmp
    CONTENT=METADATA_ONLY
    SCHEMAS=REVEAL
    EXCLUDE=TABLE_STATISTICS
    EXCLUDE=INDEX_STATISTICS
    LOGFILE=reveal.log
    Import:
    DIRECTORY=dpump_dir1
    DUMPFILE=reveal.dmp
    CONTENT=METADATA_ONLY
    SCHEMAS=reveal
    REMAP_SCHEMA=reveal:reveal_backup
    TRANSFORM=SEGMENT_ATTRIBUTES:n
    EXCLUDE=TABLE_STATISTICS
    EXCLUDE=INDEX_STATISTICS
    LOGFILE=reveal.log

    Sorry for the reply to an old post.
    It seems that now (10.2.0.4) JOB is included in the list of SCHEMA_EXPORT_OBJECTS.
    SQL> SELECT OBJECT_PATH FROM SCHEMA_EXPORT_OBJECTS WHERE object_path LIKE '%JOB%';
    OBJECT_PATH
    JOB
    SCHEMA_EXPORT/JOB
    Unfortunately, EXCLUDE=JOB still generates an invalid-argument error on my schema imports. I also don't know whether these are old-style jobs or Scheduler jobs. I don't see anything for object_path LIKE '%SCHED%', which is my real interest anyway.
    Data Pump is so rich already that I hate to ask for more, but ... may we please have even more? scheduler_programs, scheduler_jobs, scheduler, etc.
    Thanks
    Steve
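    One workaround worth trying (verify it on your version): DBMS_SCHEDULER jobs are exported under the procedural-object path, so excluding that path in the export parameter file keeps them out of a schema-mode job:
    EXCLUDE=PROCOBJ
    Be aware that PROCOBJ is broad - it also covers programs, schedules, chains and other procedural objects, not just jobs.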

  • Can we load data in chunks using Data Pump?

    We are loading data using data pump, and I want to check my understanding.
    Please correct me if I am wrong:
    ODI will fetch all data from the source (whether it is INIT or CDC) in one go and unload it into the staging area.
    If that is true, will performance suffer with very large volumes at the source (50 million records), since ODI tries to load all the data in one go? I believe it would perform better if we loaded in chunks using data pump.
    Please confirm and correct.
    Also, I would like to know how we can configure chunked loading using data pump.
    Thanks in Advance.
    Regards,
    Dinesh.

    You may consider using LKM Oracle to Oracle (datapump):
    http://docs.oracle.com/cd/E28280_01/integrate.1111/e12644/oracle_db.htm#r15c1-t2
    In 11g, ODI reads from the source and writes to the target in parallel. This is the case where you specify a select query in the source command and an insert/update query in the target command. On the source side, ODI reads records from the source and adds them to a data queue. On the target side, a parallel thread reads data from the queue and writes it to the target. So the overall performance is bounded by the slower of the read and write processes.
    Thanks,

  • Report from Grid Control: Monthly Growth of Each Database in a Single Report

    Hi, I want to create a report in Grid Control showing the monthly growth of each database, for capacity planning.
    I can see it under Report -> Storage -> Oracle Database Tablespace Monthly Space Usage,
    but this is only for one database, and I want to include all the databases in a single report, with limited info.
    I want the format to be something like the following, just to give you an idea:
    Database | Actual Size on 1st | Actual Size on 30th | Increase in Size
    1.
    2.
    Something close to the above format will do.
    Thanx
    Gagan

    Hi, thanks for the reply.
    But which tablespace metric are you talking about? I can see 5-6 tablespace metrics, and in fact I don't want the data at the tablespace level; I want the total size of the database and its growth over the last month, as shown in 'Oracle Database Tablespace Monthly Space Usage'.
    But thanks again for the reply.
    Regards
    Gagan
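    One way to get database-level numbers across all targets is a SQL-based report against the repository views. A rough sketch follows; the metric and column names here are from memory and should be verified in your repository, and it assumes allocation only grows within the month:
    SELECT target_name,
           TO_CHAR(TRUNC(rollup_timestamp, 'MM'), 'YYYY-MM') AS month,
           MIN(day_total_gb)                     AS size_at_start_gb,
           MAX(day_total_gb)                     AS size_at_end_gb,
           MAX(day_total_gb) - MIN(day_total_gb) AS growth_gb
    FROM  (SELECT target_name, rollup_timestamp,
                  SUM(average) / 1024 AS day_total_gb
           FROM   mgmt$metric_daily
           WHERE  target_type   = 'oracle_database'
           AND    metric_name   = 'tbspAllocation'   -- verify this name
           AND    metric_column = 'spaceAllocated'   -- verify this name
           GROUP  BY target_name, rollup_timestamp)
    GROUP BY target_name, TRUNC(rollup_timestamp, 'MM')
    ORDER BY target_name, month;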

  • Using Data Pump when the database is read-only

    Hello
    I used Flashback to return my database to a past point in time, then I opened the database read-only.
    Then I wanted to use Data Pump (expdp) to export a schema, but I encountered this error:
    ORA-31626: job does not exist
    ORA-31633: unable to create master table "SYS.SYS_EXPORT_SCHEMA_05"
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPV$FT", line 863
    ORA-16000: database open for read-only access
    However, I could export that schema with exp.
    My question is: can't I use Data Pump while the database is read-only? Or do you know any resolution for the issue?
    thanks

    You need to use NETWORK_LINK, so the required tables are created in a read/write database and the data is read from the read only database using a database link:
    SYSTEM@db_rw> create database link db_r_only
      2   connect to system identified by oracle using 'db_r_only';
    $ expdp system/oracle@db_rw network_link=db_r_only directory=data_pump_dir schemas=scott dumpfile=scott.dmp
    But I tried it with 10.2.0.4 and found an error:
    Export: Release 10.2.0.4.0 - Production on Thursday, 27 November, 2008 9:26:31
    Copyright (c) 2003, 2007, Oracle.  All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    ORA-39006: internal error
    ORA-39065: unexpected master process exception in DISPATCH
    ORA-02054: transaction 1.36.340 in-doubt
    ORA-16000: database open for read-only access
    ORA-02063: preceding line from DB_R_ONLY
    ORA-39097: Data Pump job encountered unexpected error -2054
    I found in Metalink bug 7331929, which is solved in 11.2! I haven't tested this procedure with prior versions or with 11g, so I don't know whether the bug affects only 10.2.0.4 or all of 10g and 11.1.
    HTH
    Enrique
    PS. If your problem was solved, consider marking the question as answered.

  • Creating reports from Grid Control 12c

    Hi All,
    I wish to create some reports from Grid Control and push them out by mail. I am especially looking for a daily health-check report combining tablespace usage, locks, alert log errors, file systems, and archive log sync information into one report.
    Thanks in advance.

    The native reporting application, Information Publisher, does not support such a feature, but if you are using BI Publisher integrated with EM, then BIP allows you to control when to email reports, for example on completion, on failure, etc.

  • How to consolidate data files using data pump when migrating 10g to 11g?

    We have one 10.2.0.4 database to be migrated to a new box running 11.2.0.1. The 10g database has too many data files scattered across too many file systems. I'd like to consolidate the data files into one or two large chunks in one file system. Both OSs are RHEL 5. How should I do that using Data Pump export/import? I know there is a REMAP option, but it only does one-to-one mapping. How can I map multiple old data files into one new data file?

    Hi,
    Data Pump is terribly slow; make sure you have as much memory as possible allocated to Oracle, but the bottleneck can be I/O throughput.
    Use the PARALLEL option, and also set these:
    * DISK_ASYNCH_IO=TRUE
    * DB_BLOCK_CHECKING=FALSE
    * DB_BLOCK_CHECKSUM=FALSE
    set high enough to allow for maximum parallelism:
    * PROCESSES
    * SESSIONS
    * PARALLEL_MAX_SERVERS
    more:
    http://download.oracle.com/docs/cd/B28359_01/server.111/b28319/dp_perf.htm
    that's it, patience welcome ;-)
    P.S.
    For maximum throughput, do not set PARALLEL to much more than twice the number of CPUs (two workers for each CPU).
    Edited by: g777 on 2011-02-02 09:53
    P.S.2
    breaking news ;-)
    I am now playing with storage performance, and I turned the disk cache option (also called write-back cache) ON - it goes along with at least RAID 0 and RAID 5, and setting it does not lose any data on that volume - and it gave me a 1.5 to 2 times speed-up!
    Some say there's a risk of losing more data when an outage happens, but there's always such a risk; you can just lose less. Anyway, if you can afford it (and with an import it's OK, as it is not production at that moment), I recommend trying it. It takes 15 minutes, but you can gain 2.5 hours out of 10 of normal importing.
    Edited by: g777 on 2011-02-02 14:52
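    On the original consolidation question: REMAP_DATAFILE only rewrites file names one-for-one inside DDL, so the usual approach is to pre-create each tablespace on the 11g box with the few large files you want and let the import populate them; the import's CREATE TABLESPACE statements then fail harmlessly with ORA-31684. A rough sketch (names and sizes are illustrative):
    -- On the new 11g box, before the import: one large file per tablespace
    CREATE TABLESPACE users_data
      DATAFILE '/u01/oradata/NEWDB/users_data01.dbf' SIZE 30G
      AUTOEXTEND ON NEXT 1G;
    Then run the import as usual and the objects land in the pre-created tablespaces.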

  • How to export resource manager consumer groups using Data Pump?

    Hi, there,
    Is there any way to export RM Consumer Groups/Mappings/Plans as part of a Data Pump export/import? I was wondering because I don't fancy doing it manually and I don't see the object in the database_export_objects view. I can create them manually, but was wondering whether there's an easier, less involved way of doing it?
    Mark

    Hi,
    I have not tested it, but I think a full database export/import (using Data Pump or traditional exp/imp) may help here (though a full exp/imp might not be feasible for you), because full-database mode also exports/imports SYS schema objects, so there is a chance that it will also import the resource groups and resource plans.
    Salman
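    If the full export route doesn't pan out, scripting the objects with DBMS_RESOURCE_MANAGER is the fallback. A minimal sketch of recreating one consumer group and a user mapping (the group and user names are illustrative); the existing definitions can be read from DBA_RSRC_CONSUMER_GROUPS, DBA_RSRC_GROUP_MAPPINGS and DBA_RSRC_PLAN_DIRECTIVES on the source:
    BEGIN
      DBMS_RESOURCE_MANAGER.CLEAR_PENDING_AREA;
      DBMS_RESOURCE_MANAGER.CREATE_PENDING_AREA;
      DBMS_RESOURCE_MANAGER.CREATE_CONSUMER_GROUP(
        consumer_group => 'BATCH_GROUP',
        comment        => 'Recreated on the target database');
      DBMS_RESOURCE_MANAGER.SET_CONSUMER_GROUP_MAPPING(
        attribute      => DBMS_RESOURCE_MANAGER.ORACLE_USER,
        value          => 'BATCH_USER',
        consumer_group => 'BATCH_GROUP');
      DBMS_RESOURCE_MANAGER.VALIDATE_PENDING_AREA;
      DBMS_RESOURCE_MANAGER.SUBMIT_PENDING_AREA;
    END;
    /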
