Data Pump issue
Several users own tables in the tablespace I want to export. I want to pull out only that tablespace and the tables belonging to one user.
The user also wants some of the tables with all the data in them and some of the tables with no data in them.
I have tried all the options, with DATA_ONLY and METADATA_ONLY on both the import and the export, and still have issues.
The export itself seems to work: it lists all the tables and reports a size. But on import, all hell breaks loose. Junior DBA in need of help.
SQL> select owner, table_name, tablespace_name from dba_tables where tablespace_name='USER_SEG';
ORAPROBE OSP_ACCOUNTS USER_SEG
PATSY TRAPP USER_SEG
PATSY TRAPPO USER_SEG
PATSY TRAUDIT USER_SEG
PATSY TRCOURSE USER_SEG
PATSY TRCOURSEO USER_SEG
PATSY TRDESC USER_SEG
PATSY TREMPDATA USER_SEG
PATSY TRFEE USER_SEG
PATSY TRNOTES USER_SEG
PATSY TROPTION USER_SEG
PATSY TRPART USER_SEG
PATSY TRPARTO USER_SEG
PATSY TRPART_OLD USER_SEG
PATSY TRPERCENT USER_SEG
PATSY TRSCHOOL USER_SEG
PATSY TRSUPER USER_SEG
PATSY TRTRANS USER_SEG
PATSY TRUSERPW USER_SEG
PATSY TRUSRDAT USER_SEG
PATSY TRVARDAT USER_SEG
PATSY TRVERIFY USER_SEG
PATSY TRAPPO_RESET USER_SEG
PATSY TRAPP_RESET USER_SEG
PATSY TRCOURSEO_RESET USER_SEG
PATSY TRCOURSE_RESET USER_SEG
PATSY TRPARTO_RESET USER_SEG
PATSY TRPART_RESET USER_SEG
PATSY TRTRANS_RESET USER_SEG
PATSY TRVERIFY_RESET USER_SEG
MAFANY TRVERIFY USER_SEG
MAFANY TRPART USER_SEG
MAFANY TRPARTO USER_SEG
MAFANY TRAPP USER_SEG
MAFANY TRAPPO USER_SEG
MAFANY TRCOURSE USER_SEG
MAFANY TRCOURSEO USER_SEG
MAFANY TRTRANS USER_SEG
JULIE R_REPOSITORY_LOG USER_SEG
JULIE R_VERSION USER_SEG
JULIE R_DATABASE_TYPE USER_SEG
JULIE R_DATABASE_CONTYPE USER_SEG
JULIE R_NOTE USER_SEG
JULIE R_DATABASE USER_SEG
JULIE R_DATABASE_ATTRIBUTE USER_SEG
JULIE R_DIRECTORY USER_SEG
JULIE R_TRANSFORMATION USER_SEG
JULIE R_TRANS_ATTRIBUTE USER_SEG
JULIE R_DEPENDENCY USER_SEG
JULIE R_PARTITION_SCHEMA USER_SEG
JULIE R_PARTITION USER_SEG
JULIE R_TRANS_PARTITION_SCHEMA USER_SEG
JULIE R_CLUSTER USER_SEG
JULIE R_SLAVE USER_SEG
JULIE R_CLUSTER_SLAVE USER_SEG
JULIE R_TRANS_SLAVE USER_SEG
JULIE R_TRANS_CLUSTER USER_SEG
JULIE R_TRANS_HOP USER_SEG
JULIE R_TRANS_STEP_CONDITION USER_SEG
JULIE R_CONDITION USER_SEG
JULIE R_VALUE USER_SEG
JULIE R_STEP_TYPE USER_SEG
JULIE R_STEP USER_SEG
JULIE R_STEP_ATTRIBUTE USER_SEG
JULIE R_STEP_DATABASE USER_SEG
JULIE R_TRANS_NOTE USER_SEG
JULIE R_LOGLEVEL USER_SEG
JULIE R_LOG USER_SEG
JULIE R_JOB USER_SEG
JULIE R_JOBENTRY_TYPE USER_SEG
JULIE R_JOBENTRY USER_SEG
JULIE R_JOBENTRY_COPY USER_SEG
JULIE R_JOBENTRY_ATTRIBUTE USER_SEG
JULIE R_JOB_HOP USER_SEG
JULIE R_JOB_NOTE USER_SEG
JULIE R_PROFILE USER_SEG
JULIE R_USER USER_SEG
JULIE R_PERMISSION USER_SEG
JULIE R_PROFILE_PERMISSION USER_SEG
MAFANY2 TRAPP USER_SEG
MAFANY2 TRAPPO USER_SEG
MAFANY2 TRCOURSE USER_SEG
MAFANY2 TRCOURSEO USER_SEG
MAFANY2 TRPART USER_SEG
MAFANY2 TRPARTO USER_SEG
MAFANY2 TRTRANS USER_SEG
MAFANY2 TRVERIFY USER_SEG
MAFANY BIN$ZY3M1IuZyq3gQBCs+AAMzQ==$0 USER_SEG
MAFANY BIN$ZY3M1Iuhyq3gQBCs+AAMzQ==$0 USER_SEG
MAFANY MYUSERS USER_SEG
I only want the tables from PATSY, and I want to move them to another database for her to use while keeping the same tablespace name.
The tables below should have just the metadata and not the data:
PATSY TRAPP USER_SEG
PATSY TRAPPO USER_SEG
PATSY TRAUDIT USER_SEG
PATSY TRCOURSE USER_SEG
PATSY TRCOURSEO USER_SEG
PATSY TRDESC USER_SEG
PATSY TREMPDATA USER_SEG
PATSY TRFEE USER_SEG
PATSY TRNOTES USER_SEG
PATSY TROPTION USER_SEG
PATSY TRPART USER_SEG
PATSY TRPARTO USER_SEG
PATSY TRPART_OLD USER_SEG
PATSY TRPERCENT USER_SEG
PATSY TRSCHOOL USER_SEG
PATSY TRSUPER USER_SEG
PATSY TRTRANS USER_SEG
PATSY TRUSERPW USER_SEG
PATSY TRUSRDAT USER_SEG
The following will (or are supposed to) have all the data:
PATSY TRVERIFY USER_SEG
PATSY TRAPPO_RESET USER_SEG
PATSY TRAPP_RESET USER_SEG
PATSY TRCOURSEO_RESET USER_SEG
PATSY TRCOURSE_RESET USER_SEG
PATSY TRPARTO_RESET USER_SEG
PATSY TRPART_RESET USER_SEG
PATSY TRTRANS_RESET USER_SEG
PATSY TRVERIFY_RESET USER_SEG
I have tried all of the following, and later got stuck in my little effort to document as I go along:
Using Data Pump to export data
First:
Create a directory object or use one that already exists.
I created a directory object and granted access to it:
CREATE DIRECTORY "DATA_PUMP_DIR" AS '/home/oracle/my_dump_dir';
Grant read, write on directory DATA_PUMP_DIR to user;
where user is the account that will perform the export or import, such as SYSTEM or an application schema.
For example, to create a directory object named expdp_dir located at /u01/backup/exports, enter the following SQL statement:
SQL> create directory expdp_dir as '/u01/backup/exports';
Then grant read and write permissions to the users who will be performing the Data Pump export and import:
SQL> grant read, write on directory expdp_dir to system, user1, user2, user3;
http://wiki.oracle.com/page/Data+Pump+Export+(expdp)+and+Data+Pump+Import(impdp)?t=anon
To view directory objects that already exist
- use EM: under Administration tab to schema section and select Directory Objects
- DESC the views: ALL_DIRECTORIES, DBA_DIRECTORIES
select * from all_directories;
select * from dba_directories;
Export the schema using expdp:
expdp system/SCHMOE DUMPFILE=patsy_schema.dmp DIRECTORY=DATA_PUMP_DIR SCHEMAS=PATSY
expdp system/PASS DUMPFILE=METADATA_ONLY_schema.dmp DIRECTORY=DATA_PUMP_DIR TABLESPACES = USER_SEG CONTENT=METADATA_ONLY
expdp system/PASS DUMPFILE=data_only_schema.dmp DIRECTORY=DATA_PUMP_DIR TABLES=PATSY.TRVERIFY,PATSY.TRAPPO_RESET,PATSY.TRAPP_RESET,PATSY.TRCOURSEO_RESET, PATSY.TRCOURSE_RESET,PATSY.TRPARTO_RESET,PATSY.TRPART_RESET,PATSY.TRTRANS_RESET,PATSY.TRVERIFY_RESET CONTENT=DATA_ONLY
You are correct: all the PATSY tables reside in the tablespace USER_SEG in a different database, and I want to move them to a new database (same version, 10g). I have created a user named PATSY there as well. The tablespace does not exist in the target I want to move them to, and neither do the objects and indexes.
So how can I move the schema and tablespace, with all the tables, while keeping the tablespace name and the same username PATSY that I created to import into?
I tried again and got some errors. This is better than last time, because I used REMAP_TABLESPACE=USER_SEG:USERS.
But I want the tablespace in the target to be called USER_SEG too. Do I have to create it first, or ... ?
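One way to keep the original names is sketched below. This is only an outline, not a verified recipe: the datafile paths, sizes, and passwords are assumptions, and the TABLES list is abbreviated. The idea is to pre-create USER_SEG (and INDEX1_SEG, where the indexes live) in the target so no REMAP_TABLESPACE is needed, take one full schema export, then import metadata for everything and data only for the tables that should keep their rows.

```shell
# TARGET database: create the tablespaces first, so the import
# needs no remapping and index creation won't hit ORA-00959.
sqlplus / as sysdba <<'SQL'
CREATE TABLESPACE user_seg   DATAFILE '/u01/oradata/targ/user_seg01.dbf'   SIZE 500M AUTOEXTEND ON;
CREATE TABLESPACE index1_seg DATAFILE '/u01/oradata/targ/index1_seg01.dbf' SIZE 200M AUTOEXTEND ON;
SQL

# SOURCE database: one full export of the PATSY schema.
expdp system/PASS DIRECTORY=DATA_PUMP_DIR DUMPFILE=patsy_full.dmp SCHEMAS=PATSY

# TARGET database: bring over all metadata (tables, indexes, grants)...
impdp system/PASS DIRECTORY=DATA_PUMP_DIR DUMPFILE=patsy_full.dmp CONTENT=METADATA_ONLY

# ...then load rows only for the tables that should have data
# (list abbreviated here; use the full _RESET/TRVERIFY list).
impdp system/PASS DIRECTORY=DATA_PUMP_DIR DUMPFILE=patsy_full.dmp CONTENT=DATA_ONLY \
  TABLES=PATSY.TRVERIFY,PATSY.TRAPP_RESET,PATSY.TRVERIFY_RESET
```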
[oracle@server1 ~]$ impdp system/blue99 remap_schema=patsy:test REMAP_TABLESPACE=USER_SEG:USERS directory=DATA_PUMP_DIR dumpfile=patsy.dmp logfile=patsy.log
Import: Release 10.1.0.3.0 - Production on Wednesday, 08 April, 2009 11:10
Copyright (c) 2003, Oracle. All rights reserved.
Connected to: Oracle Database 10g Release 10.1.0.3.0 - Production
Master table "SYSTEM"."SYS_IMPORT_FULL_03" successfully loaded/unloaded
Starting "SYSTEM"."SYS_IMPORT_FULL_03": system/******** remap_schema=patsy:test REMAP_TABLESPACE=USER_SEG:USERS directory=DATA_PUMP_DIR dumpfile=patsy.dmp logfile=patsy.log
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/TBL_TABLE_DATA/TABLE/TABLE_DATA
. . imported "TEST"."TRTRANS" 10.37 MB 93036 rows
. . imported "TEST"."TRAPP" 2.376 MB 54124 rows
. . imported "TEST"."TRAUDIT" 1.857 MB 28153 rows
. . imported "TEST"."TRPART_OLD" 1.426 MB 7183 rows
. . imported "TEST"."TRSCHOOL" 476.8 KB 5279 rows
. . imported "TEST"."TRAPPO" 412.3 KB 9424 rows
. . imported "TEST"."TRUSERPW" 123.8 KB 3268 rows
. . imported "TEST"."TRVERIFY_RESET" 58.02 KB 183 rows
. . imported "TEST"."TRUSRDAT" 54.73 KB 661 rows
. . imported "TEST"."TRNOTES" 51.5 KB 588 rows
. . imported "TEST"."TRCOURSE" 49.85 KB 243 rows
. . imported "TEST"."TRCOURSE_RESET" 47.60 KB 225 rows
. . imported "TEST"."TRPART" 39.37 KB 63 rows
. . imported "TEST"."TRPART_RESET" 37.37 KB 53 rows
. . imported "TEST"."TRTRANS_RESET" 38.94 KB 196 rows
. . imported "TEST"."TRCOURSEO" 30.93 KB 51 rows
. . imported "TEST"."TRCOURSEO_RESET" 28.63 KB 36 rows
. . imported "TEST"."TRPERCENT" 33.72 KB 1044 rows
. . imported "TEST"."TROPTION" 30.10 KB 433 rows
. . imported "TEST"."TRPARTO" 24.78 KB 29 rows
. . imported "TEST"."TRPARTO_RESET" 24.78 KB 29 rows
. . imported "TEST"."TRVERIFY" 20.97 KB 30 rows
. . imported "TEST"."TRVARDAT" 14.13 KB 44 rows
. . imported "TEST"."TRAPP_RESET" 14.17 KB 122 rows
. . imported "TEST"."TRDESC" 9.843 KB 90 rows
. . imported "TEST"."TRAPPO_RESET" 8.921 KB 29 rows
. . imported "TEST"."TRSUPER" 6.117 KB 10 rows
. . imported "TEST"."TREMPDATA" 0 KB 0 rows
. . imported "TEST"."TRFEE" 0 KB 0 rows
Processing object type TABLE_EXPORT/TABLE/GRANT/TBL_OWNER_OBJGRANT/OBJECT_GRANT
Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
ORA-39083: Object type INDEX failed to create with error:
ORA-00959: tablespace 'INDEX1_SEG' does not exist
Failing sql is:
CREATE UNIQUE INDEX "TEST"."TRPART_11" ON "TEST"."TRPART" ("TRPCPID", "TRPSSN") PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS STORAGE(INITIAL 32768 NEXT 131072 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT) TABLESPACE "INDEX1_SEG" PARALLEL 1
ORA-39083: Object type INDEX failed to create with error:
ORA-00959: tablespace 'INDEX1_SEG' does not exist
Failing sql is:
CREATE INDEX "TEST"."TRPART_I2" ON "TEST"."TRPART" ("TRPLAST") PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS STORAGE(INITIAL 131072 NEXT 131072 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT) TABLESPACE "INDEX1_SEG" PARALLEL 1
ORA-39083: Object type INDEX failed to create with error:
ORA-00959: tablespace 'INDEX1_SEG' does not exist
Failing sql is:
CREATE INDEX "TEST"."TRPART_I3" ON "TEST"."TRPART" ("TRPEMAIL") PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS STORAGE(INITIAL 131072 NEXT 131072 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT) TABLESPACE "INDEX1_SEG" PARALLEL 1
ORA-39083: Object type INDEX failed to create with error:
ORA-00959: tablespace 'INDEX1_SEG' does not exist
Failing sql is:
CREATE INDEX "TEST"."TRPART_I4" ON "TEST"."TRPART" ("TRPPASSWORD") PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS STORAGE(INITIAL 131072 NEXT 131072 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT) TABLESPACE "INDEX1_SEG" PARALLEL 1
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "BILLZ"."TRCOURSE_I1" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"BILLZ"', '"TRCOURSE_I1"', NULL, NULL, NULL, 232, 2, 232, 1, 1, 7, 2, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "BILLZ"."TRTRANS_I1" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"BILLZ"', '"TRTRANS_I1"', NULL, NULL, NULL, 93032, 445, 93032, 1, 1, 54063, 2, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "DANQ"."TRAPP_I1" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"DANQ"', '"TRAPP_I1"', NULL, NULL, NULL, 54159, 184, 54159, 1, 1, 4597, 1, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "DANQ"."TRAPP_I2" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"DANQ"', '"TRAPP_I2"', NULL, NULL, NULL, 54159, 182, 17617, 1, 2, 48776, 1, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "JULIE"."TRAPPO_I1" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"JULIE"', '"TRAPPO_I1"', NULL, NULL, NULL, 9280, 29, 9280, 1, 1, 166, 1, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "JULIE"."TRAPPO_I2" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"JULIE"', '"TRAPPO_I2"', NULL, NULL, NULL, 9280, 28, 4062, 1, 2, 8401, 1, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "JULIE"."TRCOURSEO_I1" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"JULIE"', '"TRCOURSEO_I1"', NULL, NULL, NULL, 49, 2, 49, 1, 1, 8, 1, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "JULIE"."TREMPDATA_I1" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"JULIE"', '"TREMPDATA_I1"', NULL, NULL, NULL, 0, 0, 0, 0, 0, 0, 0, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "DANQ"."TROPTION_I1" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"DANQ"', '"TROPTION_I1"', NULL, NULL, NULL, 433, 3, 433, 1, 1, 187, 1, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"TEST"."TRPART_11" creation failed
ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"TEST"."TRPART_I2" creation failed
ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"TEST"."TRPART_I3" creation failed
ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"TEST"."TRPART_I4" creation failed
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "DANQ"."TRPART_I5" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"DANQ"', '"TRPART_I5"', NULL, NULL, NULL, 19, 1, 19, 1, 1, 10, 1, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "JULIE"."TRPARTO_I1" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"JULIE"', '"TRPARTO_I1"', NULL, NULL, NULL, 29, 1, 29, 1, 1, 1, 1, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "JULIE"."TRPARTO_I2" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"JULIE"', '"TRPARTO_I2"', NULL, NULL, NULL, 29, 14, 26, 1, 1, 1, 1, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "JULIE"."TRPART_I2" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"JULIE"', '"TRPART_I2"', NULL, NULL, NULL, 7180, 19, 4776, 1, 1, 7048, 1, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "JULIE"."TRPART_I3" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"JULIE"', '"TRPART_I3"', NULL, NULL, NULL, 2904, 15, 2884, 1, 1, 2879, 1, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "JULIE"."TRPART_I4" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"JULIE"', '"TRPART_I4"', NULL, NULL, NULL, 363, 1, 362, 1, 1, 359, 0, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "JULIE"."TRPART_I5" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"JULIE"', '"TRPART_I5"', NULL, NULL, NULL, 363, 1, 363, 1, 1, 353, 0, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "DANQ"."TRPART_11" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"DANQ"', '"TRPART_11"', NULL, NULL, NULL, 7183, 29, 7183, 1, 1, 6698, 1, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "DANQ"."TRPERCENT_I1" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"DANQ"', '"TRPERCENT_I1"', NULL, NULL, NULL, 1043, 5, 1043, 1, 1, 99, 1, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "DANQ"."TRSCHOOL_I1" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"DANQ"', '"TRSCHOOL_I1"', NULL, NULL, NULL, 5279, 27, 5279, 1, 1, 4819, 1, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "JULIE"."TRVERIFY_I2" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"JULIE"', '"TRVERIFY_I2"', NULL, NULL, NULL, 30, 7, 7, 1, 1, 1, 2, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "STU"."TRVERIFY_I1" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"STU"', '"TRVERIFY_I1"', NULL, NULL, NULL, 30, 12, 30, 1, 1, 1, 2, 2, NULL, FALSE, NULL, NULL, NULL); END;
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Job "SYSTEM"."SYS_IMPORT_FULL_03" completed with 29 error(s) at 11:17
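The ORA-00959 failures in the log come from the indexes, which live in a second tablespace, INDEX1_SEG, that the single REMAP_TABLESPACE clause did not cover. A hedged sketch of a rerun (the log file name and TABLE_EXISTS_ACTION choice are assumptions; alternatively, create INDEX1_SEG in the target beforehand):

```shell
# Same import as above, but remapping the index tablespace as well.
# TABLE_EXISTS_ACTION=REPLACE reruns over the partially imported tables.
impdp system/blue99 directory=DATA_PUMP_DIR dumpfile=patsy.dmp logfile=patsy2.log \
  remap_schema=patsy:test \
  remap_tablespace=USER_SEG:USERS remap_tablespace=INDEX1_SEG:USERS \
  table_exists_action=replace
```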
Similar Messages
-
Data Pump issue for Oracle 10g on Windows 2003
Hi Experts,
I am trying to run Data Pump on Oracle 10g on a Windows 2003 server.
I got an error:
D:\>cd D:\oracle\product\10.2.0\SALE\BIN
D:\oracle\product\10.2.0\SALE\BIN>expdp system/xxxxl@sale full=Y directory=dumpdir dumpfile=expdp_sale_20090302.dmp logfile=exp_sale_20090302.log
Export: Release 10.2.0.4.0 - Production on Tuesday, 03 March, 2009 8:05:50
Copyright (c) 2003, 2007, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
ORA-31626: job does not exist
ORA-31650: timeout waiting for master process response
However, I can run the old exp utility and it works well.
What is wrong with my Data Pump?
Thanks
JIM

Hi Anand,
I did not see any error log at that time. Actually, it did not work any more. I will test it again, based on your email, after the exp is done.
Based on new testing, I got the errors below:
ORA-39014: One or more workers have prematurely exited.
ORA-39029: worker 1 with process name "DW01" prematurely terminated
ORA-31671: Worker process DW01 had an unhandled exception.
ORA-04030: out of process memory when trying to allocate 4108 bytes (PLS non-lib hp,pdzgM60_Make)
ORA-06512: at "SYS.KUPC$QUEUE_INT", line 277
ORA-06512: at "SYS.KUPW$WORKER", line 1366
ORA-04030: out of process memory when trying to allocate 65036 bytes (callheap,KQL tmpbuf)
ORA-06508: PL/SQL: could not find program unit being called: "SYS.KUPC$_WORKERERROR"
ORA-06512: at "SYS.KUPW$WORKER", line 13360
ORA-06512: at "SYS.KUPW$WORKER", line 15039
ORA-06512: at "SYS.KUPW$WORKER", line 6372
ORA-39125: Worker unexpected fatal error in KUPW$WORKER.DISPATCH_WORK_ITEMS while calling DBMS_METADATA.FETCH_XML_CLOB [PROCOBJ:"SALE"."SQLSCRIPT_2478179"]
ORA-06512: at "SYS.KUPW$WORKER", line 7078
ORA-04030: out of process memory when trying to allocate 4108 bytes (PLS non-lib hp,pdzgM60_Make)
ORA-06500: PL/SQL: storage error
ORA-04030: out of process memory when trying to allocate 16396 bytes (koh-kghu sessi,pmucpcon: tds)
ORA-04030: out of process memory when trying to allocate 16396 bytes (koh-kghu sessi,pmucalm coll)
Job "SYSTEM"."SYS_EXPORT_FULL_01" stopped due to fatal error at 14:41:36
ORA-39014: One or more workers have prematurely exited.
The trace file shows:
*** 2009-03-03 14:20:41.500
*** ACTION NAME:() 2009-03-03 14:20:41.328
*** MODULE NAME:(oradim.exe) 2009-03-03 14:20:41.328
*** SERVICE NAME:() 2009-03-03 14:20:41.328
*** SESSION ID:(159.1) 2009-03-03 14:20:41.328
Successfully allocated 7 recovery slaves
Using 157 overflow buffers per recovery slave
Thread 1 checkpoint: logseq 12911, block 2, scn 7355467494724
cache-low rba: logseq 12911, block 251154
on-disk rba: logseq 12912, block 221351, scn 7355467496281
start recovery at logseq 12911, block 251154, scn 0
----- Redo read statistics for thread 1 -----
Read rate (ASYNC): 185319Kb in 1.73s => 104.61 Mb/sec
Total physical reads: 189333Kb
Longest record: 5Kb, moves: 0/448987 (0%)
Change moves: 1378/5737 (24%), moved: 0Mb
Longest LWN: 1032Kb, moves: 45/269 (16%), moved: 41Mb
Last redo scn: 0x06b0.9406fb58 (7355467496280)
----- Recovery Hash Table Statistics ---------
Hash table buckets = 32768
Longest hash chain = 3
Average hash chain = 35384/25746 = 1.4
Max compares per lookup = 3
Avg compares per lookup = 847056/876618 = 1.0
*** 2009-03-03 14:20:46.062
KCRA: start recovery claims for 35384 data blocks
*** 2009-03-03 14:21:02.171
KCRA: blocks processed = 35384/35384, claimed = 35384, eliminated = 0
*** 2009-03-03 14:21:02.531
Recovery of Online Redo Log: Thread 1 Group 2 Seq 12911 Reading mem 0
*** 2009-03-03 14:21:04.718
Recovery of Online Redo Log: Thread 1 Group 1 Seq 12912 Reading mem 0
*** 2009-03-03 14:21:16.296
----- Recovery Hash Table Statistics ---------
Hash table buckets = 32768
Longest hash chain = 3
Average hash chain = 35384/25746 = 1.4
Max compares per lookup = 3
Avg compares per lookup = 849220/841000 = 1.0
*** 2009-03-03 14:21:28.468
tkcrrsarc: (WARN) Failed to find ARCH for message (message:0x1)
tkcrrpa: (WARN) Failed initial attempt to send ARCH message (message:0x1)
*** 2009-03-03 14:26:25.781
kwqmnich: current time:: 14: 26: 25
kwqmnich: instance no 0 check_only flag 1
kwqmnich: initialized job cache structure
ktsmgtur(): TUR was not tuned for 360 secs
Windows Server 2003 Version V5.2 Service Pack 2
CPU : 8 - type 586, 4 Physical Cores
Process Affinity : 0x00000000
Memory (Avail/Total): Ph:7447M/8185M, Ph+PgF:6833M/9984M, VA:385M/3071M
Instance name: vmsdbsea
Redo thread mounted by this instance: 0 <none>
Oracle process number: 0
Windows thread id: 2460, image: ORACLE.EXE (SHAD)
Dynamic strand is set to TRUE
Running with 2 shared and 18 private strand(s). Zero-copy redo is FALSE
*** 2009-03-03 08:06:51.921
*** ACTION NAME:() 2009-03-03 08:06:51.905
*** MODULE NAME:(expdp.exe) 2009-03-03 08:06:51.905
*** SERVICE NAME:(xxxxxxxxxx) 2009-03-03 08:06:51.905
*** SESSION ID:(118.53238) 2009-03-03 08:06:51.905
SHDW: Failure to establish initial communication with MCP
SHDW: Deleting Data Pump job infrastructure
Is it a system memory issue for Data Pump? My exp works well.
How do I fix this issue?
JIM
Edited by: user589812 on Mar 3, 2009 5:07 PM
Edited by: user589812 on Mar 3, 2009 5:22 PM -
Data Pump issue with NLS date format
Hi Friends,
I have a database with NLS date format 'DD/MM/YYYY HH24:MI:SS' from which I wish to export. My target database has NLS date format 'YYYY/MM/DD HH24:MI:SS'. A few tables have CREATE statements with date fields declared as DEFAULT '01-Jan-1950', and when these CREATE TABLE statements are processed by Data Pump they fail in my target database. The tables are not created, due to this error:
Failing sql is:
CREATE TABLE "MCS_OWNER"."SECTOR_BLOCK_PEAK" ("AIRPORT_DEPART" VARCHAR2(4) NOT NULL ENABLE, "AIRPORT_ARRIVE" VARCHAR2(4) NOT NULL ENABLE, "CARRIER_CODE" VARCHAR2(3) NOT NULL ENABLE, "AC_TYPE_IATA" VARCHAR2(3) NOT NULL ENABLE, "PEAK_START" VARCHAR2(25) NOT NULL ENABLE, "PEAK_END" VARCHAR2(25), "BLOCK_TIME" VARCHAR2(25), "FLIGHT_TIME" VARCHAR2(25), "SEASON" VARC
ORA-39083: Object type TABLE failed to create with error:
ORA-01858: a non-numeric character was found where a numeric was expected
The table create SQL adds a column as VALID_FROM DATE DEFAULT '01-jan-1970', which I think is the issue. I would appreciate it if someone could suggest a way to get around this. I have tried altering the NLS settings of the source DB to match the target database; the impdp still fails.
Database is 10.2.0.1.0 on Linux X86 64 bit.
Thanks,
SSN
Edited by: SSNair on Oct 27, 2010 8:25 AM

"Appreciate if someone can suggest a way to get around with this."
Change the DDL so that the CREATE TABLE uses the TO_DATE() function.
With Oracle characters between single quote marks are STRINGS!
'This is a string, 2009-12-31, not a date'
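Applied to the failing DDL, the fix might look like this. This is a sketch only: the table name and columns are abbreviated placeholders, not the real dump contents; only the DEFAULT clause is the point.

```sql
-- An NLS-independent default: TO_DATE with an explicit format mask
-- parses identically whatever NLS_DATE_FORMAT the importing session uses.
CREATE TABLE mcs_owner.sector_block_peak_example (
  airport_depart VARCHAR2(4) NOT NULL,
  valid_from     DATE DEFAULT TO_DATE('01-01-1950', 'DD-MM-YYYY')
);
```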
When a DATE datatype is desired, then use TO_DATE() function. -
Data pump import issue in Oracle 10gR2
Hi DBAs,
Please help me identify the Data Pump issue I am facing in one of our Oracle 10gR2 databases.
Below is the error message from the import log file:
Job "SYS"."SYS_IMPORT_TABLE_06" completed with 62 error(s) at 18:20:20
ORA-39097: Data Pump job encountered unexpected error -39076
ORA-39065: unexpected master process exception in KUPV$FT_INT.DELETE_JOB
ORA-39076: cannot delete job SYS_IMPORT_TABLE_06 for user SYS
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPV$FT_INT", line 934
ORA-31635: unable to establish job resource synchronization
ORA-39097: Data Pump job encountered unexpected error -39076
ORA-39065: unexpected master process exception in MAIN
ORA-39076: cannot delete job SYS_IMPORT_TABLE_06 for user SYS
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPV$FT_INT", line 934
ORA-31635: unable to establish job resource synchronization
Please suggest me a workaround and root cause of this issue. Thanks.
Regards,
Arun

user13344783 wrote:
Hi DBAs,
Please help me identifying the data pump issue I am facing in one of our Oracle 10g R2 database
Below is error message from the import log file
Job "SYS"."SYS_IMPORT_TABLE_06" completed with 62 error(s) at 18:20:20
ORA-39097: Data Pump job encountered unexpected error -39076
ORA-39065: unexpected master process exception in KUPV$FT_INT.DELETE_JOB
ORA-39076: cannot delete job SYS_IMPORT_TABLE_06 for user SYS
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPV$FT_INT", line 934
ORA-31635: unable to establish job resource synchronization
ORA-39097: Data Pump job encountered unexpected error -39076
ORA-39065: unexpected master process exception in MAIN
ORA-39076: cannot delete job SYS_IMPORT_TABLE_06 for user SYS
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPV$FT_INT", line 934
ORA-31635: unable to establish job resource synchronization
Please suggest me a workaround and root cause of this issue. Thanks.
Regards,
Arun

Post the complete command line of the expdp that made the dump file and the impdp that threw these errors. -
Data Pump Export issue - no streams pool created and cannot automatically c
I am trying to use Data Pump on a 10.2.0.1 database that has VLM enabled, and I am getting the following error:
Export: Release 10.2.0.1.0 - Production on Tuesday, 20 April, 2010 10:52:08
Connected to: Oracle Database 10g Release 10.2.0.1.0 - Production
ORA-31626: job does not exist
ORA-31637: cannot create job SYS_EXPORT_TABLE_01 for user E_AGENT_SITE
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPV$FT_INT", line 600
ORA-39080: failed to create queues "KUPC$C_1_20100420105208" and "KUPC$S_1_20100420105208" for Data Pump job
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPC$QUE_INT", line 1555
ORA-00832: no streams pool created and cannot automatically create one
This is my script (that I currently use on other non vlm databases successfully):
expdp e_agent_site/<password>@orcl parfile=d:\DailySitePump.par
this is my parameter file :
DUMPFILE=site_pump%U.dmp
PARALLEL=1
LOGFILE=site_pump.log
STATUS=300
DIRECTORY=DATA_DUMP
QUERY=wwv_document$:"where last_updated > sysdate-18"
EXCLUDE=CONSTRAINT
EXCLUDE=INDEX
EXCLUDE=GRANT
TABLES=wwv_document$
FILESIZE=2000M
My Oracle directory is created and the user has rights.
Googling the issue suggests that the shared pool is too small or that streams_pool_size needs setting. shared_pool_size = 1200M, and when I query v$parameter it shows that streams_pool_size = 0.
I've tried alter system set streams_pool_size=1M; but I just get:
ORA-02097: parameter cannot be modified because specified value is invalid
ORA-04033: Insufficient memory to grow pool
The server is a windows enterprise box with 16GB ram and VLM enabled, pfile memory parameters listed below:
# resource
processes = 1250
job_queue_processes = 10
open_cursors = 1000 # no overhead if set too high
# sga
shared_pool_size = 1200M
large_pool_size = 150M
java_pool_size = 50M
# pga
pga_aggregate_target = 850M # custom
# System Managed Undo and Rollback Segments
undo_management=AUTO
undo_tablespace=UNDOTBS1
# vlm support
USE_INDIRECT_DATA_BUFFERS = TRUE
DB_BLOCK_BUFFERS = 1500000
Any ideas why I cannot run Data Pump? I am assuming that I just need to set streams_pool_size, but I don't understand why I cannot increase its size on this DB. It is set to 0 on other databases that work fine, and there I can set it, which is why I suspect the issue is linked to VLM.
thanks
Robert

SGA_MAX_SIZE?
SQL> ALTER SYSTEM SET streams_pool_size=32M SCOPE=BOTH;
ALTER SYSTEM SET streams_pool_size=32M SCOPE=BOTH
ERROR at line 1:
ORA-02097: parameter cannot be modified because specified value is invalid
ORA-04033: Insufficient memory to grow pool
SQL> show parameter sga_max
NAME TYPE VALUE
sga_max_size big integer 480M
SQL> show parameter cache
NAME TYPE VALUE
db_16k_cache_size big integer 0
db_2k_cache_size big integer 0
db_32k_cache_size big integer 0
db_4k_cache_size big integer 0
db_8k_cache_size big integer 0
db_cache_advice string ON
db_cache_size big integer 256M
db_keep_cache_size big integer 0
db_recycle_cache_size big integer 0
object_cache_max_size_percent integer 10
object_cache_optimal_size integer 102400
session_cached_cursors integer 20
SQL> ALTER SYSTEM SET db_cache_size=224M SCOPE=both;
System altered.
SQL> ALTER SYSTEM SET streams_pool_size=32M SCOPE=both;
System altered.

Lukasz -
Data Pump .xlsx into a SQL Server Table and the whole 32-Bit, 64-Bit discussion
First of all...I have a headache!
Found LOTS of Google hits when trying to data pump a .xlsx File into a SQL Server Table. And the whole discussion of the Microsoft ACE 64-Bit Driver or the Microsoft Jet 32-Bit Driver.
Specifically receiving this error...
An OLE DB record is available. Source: "Microsoft Office Access Database Engine" Hresult: 0x80004005 Description: "External table is not in the expected format.".
Error: 0xC020801C at Data Flow Task to Load Alere Coaching Enrolled, Excel Source [56]: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "Excel Connection Manager"
failed with error code 0xC0202009.
Strangely enough, if I simply data pump ONE .xlsx File into a SQL Server Table utilizing my SSIS Package, it seems to work fine. If instead I am trying to be pro-active and allowing for multiple .xlsx Files by using a Foreach Loop Container and a variable
@[User::FileName], it's erroring out...but not really because it is indeed storing the rows onto the SQL Server Table. I did check all my Delay
Why does this have to be sooooooo difficult???
Can anyone help me out here in trying to set-up a SSIS Package in a rather constrictive environment to pump a .xlsx File into a SQL Server Table? What in God's name am I doing wrong? Or is all this a misnomer? But if it's working how do I disable the error
so that it stops erroring out?

Hi ITBobbyP,
According to your description, when you import data from a .xlsx file into a SQL Server database, you get this error message.
The error can be caused by the following reasons:
The Excel file is locked by another process. Please resave the file under a different name to see if the issue is fixed.
The ACE (Access Database Engine) is not up to date, as Vaibhav mentioned. Please download the latest ACE and install it from the link:
https://www.microsoft.com/en-us/download/details.aspx?id=13255.
The Office bitness and the server bitness are not the same. To solve the problem, please refer to the following document:
http://hrvoje.piasevoli.com/2010/09/01/importing-data-from-64-bit-excel-in-ssis/
If you have any more questions, please feel free to ask.
Thanks,
Wendy Fu
TechNet Community Support -
How to find tables with column types not supported by Data Pump NETWORK_LINK
Hi Experts,
We are trying to import a database into a new DB via Data Pump NETWORK_LINK.
As the Oracle documentation states, tables with columns that are object types are not supported in a network export. An ORA-22804 error will be generated and the export will move on to the next table. To work around this restriction, you can manually create the dependent object types within the database from which the export is being run.
My question: how do I find the tables with column types that are not supported in a network export?
We have LOB objects and the Oracle Spatial SDO_GEOMETRY object type. Our database is about 300 GB, and a normal exp takes about 30 hours.
We are trying to use Data Pump with NETWORK_LINK to speed up the export process.
How do we deal with the Oracle Spatial SDO_GEOMETRY type during Data Pump?
Our system is 32-bit Windows 2003 with a 10gR2 database.
Thanks
Jim
Edited by: user589812 on Nov 3, 2009 12:59 PM
Hi,
I remember there being issues with SDO_GEOMETRY and Data Pump. You may want to contact Oracle Support about this issue.
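To answer the "how to find them" part: a hedged sketch of a dictionary query that lists columns whose data type is a user-defined object type, i.e. the candidates for ORA-22804 in a network export (the schemas excluded in the filter are an assumption; widen or narrow the list for your site):

```sql
-- Hedged sketch: application tables with object-type columns.
-- SDO_GEOMETRY columns appear here because their DATA_TYPE_OWNER is MDSYS.
SELECT c.owner, c.table_name, c.column_name,
       c.data_type_owner, c.data_type
FROM   dba_tab_columns c
JOIN   dba_types t
       ON  t.owner     = c.data_type_owner
       AND t.type_name = c.data_type
WHERE  t.typecode = 'OBJECT'
AND    c.owner NOT IN ('SYS', 'SYSTEM', 'MDSYS', 'XDB')
ORDER  BY c.owner, c.table_name, c.column_name;
```

Run it on the source database before the network export to know which tables will be skipped.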
Dean -
Using Data Pump when database is read-only
Hello
I used Flashback to return my database to a past point in time, then opened the database read only.
Then I wanted to use Data Pump (expdp) to export a schema, but I encountered this error:
ORA-31626: job does not exist
ORA-31633: unable to create master table "SYS.SYS_EXPORT_SCHEMA_05"
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPV$FT", line 863
ORA-16000: database open for read-only access
But I could export that schema with exp.
My question is: can't I use Data Pump while the database is read only? Or do you know of a resolution for the issue?
Thanks
You need to use NETWORK_LINK, so the required tables are created in a read/write database and the data is read from the read-only database over a database link:
SYSTEM@db_rw> create database link db_r_only
2 connect to system identified by oracle using 'db_r_only';
$ expdp system/oracle@db_rw network_link=db_r_only directory=data_pump_dir schemas=scott dumpfile=scott.dmp
But I tried it with 10.2.0.4 and found an error:
Export: Release 10.2.0.4.0 - Production on Thursday, 27 November, 2008 9:26:31
Copyright (c) 2003, 2007, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
ORA-39006: internal error
ORA-39065: unexpected master process exception in DISPATCH
ORA-02054: transaction 1.36.340 in-doubt
ORA-16000: database open for read-only access
ORA-02063: preceding line from DB_R_ONLY
ORA-39097: Data Pump job encountered unexpected error -2054
I found bug 7331929 on Metalink, which is fixed in 11.2! I haven't tested this procedure with earlier versions or with 11g, so I don't know whether the bug affects only 10.2.0.4 or also 10.x and 11.1.x.
HTH
Enrique
PS. If your problem was solved, consider marking the question as answered. -
What are the 'gotchas' when exporting with Data Pump (10.2.0.5) from HP-UX to Windows
Hello,
I have to export a schema using Data Pump from 10.2.0.5 on 64-bit HP-UX to a Windows 64-bit database of the same 10.2.0.5 version. What gotchas can I expect from doing this? I mean, export Data Pump is cross-platform, so this sounds straightforward. But are there issues I might face exporting with Data Pump on HP-UX and then importing the dump on Windows 2008 with the same database version? Thank you in advance.
On the HP-UX database, run this statement and look for the value of NLS_CHARACTERSET:
SQL> select * from NLS_DATABASE_PARAMETERS;
http://docs.oracle.com/cd/B19306_01/server.102/b14237/statviews_4218.htm#sthref2018
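If you only need the one value, a hedged equivalent that returns just the character set:

```sql
-- Hedged sketch: fetch only the database character set
SELECT value
FROM   nls_database_parameters
WHERE  parameter = 'NLS_CHARACTERSET';
```

Make sure the Windows database is created with the same (or a superset) character set so no data is lost on import.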
When creating the database on Windows, you have two options - manually create the database or use DBCA. If you plan to create the database manually, specify the database characterset in the CREATE DATABASE statement - http://docs.oracle.com/cd/B19306_01/server.102/b14200/statements_5004.htm#SQLRF01204
If using DBCA, see http://docs.oracle.com/cd/B19306_01/server.102/b14196/install.htm#ADMQS0021 (especially http://docs.oracle.com/cd/B19306_01/server.102/b14196/install.htm#BABJBDIF)
HTH
Srini -
File name substitution with Data pump
Hi,
I'm experimenting with Oracle data pump export, 10.2 on Windows 2003 Server.
In my current export scripts, I am able to create the dump file name dynamically.
The name includes the database name, date, and time, such as the
following: exp_testdb_01192005_1105.dmp.
When I try to do the same thing with Data Pump, it doesn't work. Has anyone
had success with this? Thanks.
ed lewis
Hi Ed
This is an example for your issue:
[oracle@dbservertest backups]$ expdp gsmtest/gsm directory=dpdir dumpfile=exp_testdb_01192005_1105.dmp tables=ban_banco
Export: Release 10.2.0.1.0 - Production on Thursday, 19 January, 2006 12:23:55
Copyright (c) 2003, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
Starting "GSMTEST"."SYS_EXPORT_TABLE_01": gsmtest/******** directory=dpdir dumpfile=exp_testdb_01192005_1105.dmp tables=ban_banco
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 64 KB
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type TABLE_EXPORT/TABLE/COMMENT
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
. . exported "GSMTEST"."BAN_BANCO" 7.718 KB 9 rows
Master table "GSMTEST"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
Dump file set for GSMTEST.SYS_EXPORT_TABLE_01 is:
/megadata/clona/exp_testdb_01192005_1105.dmp
Job "GSMTEST"."SYS_EXPORT_TABLE_01" successfully completed at 12:24:18
This works OK.
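Note that the run above hard-codes the file name; expdp itself has no date substitution, so the dynamic part has to happen in the calling shell (on Windows 2003, a batch file using %DATE%/%TIME% plays the same role). A hedged sketch on Unix, reusing the credentials and directory from the example above:

```shell
# Hedged sketch: build the MMDDYYYY_HHMM-style dump file name in the shell,
# then hand it to expdp. The echo just shows the generated command; drop it
# to run expdp directly.
DMPFILE="exp_testdb_$(date +%m%d%Y_%H%M).dmp"
echo expdp gsmtest/gsm directory=dpdir dumpfile="$DMPFILE" tables=ban_banco
```

The same variable can also feed the logfile parameter so both files share the timestamp.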
Regards,
Wilson -
Data pump import error with nested tables
So the problem is somewhat long :)
Actually there are two problems - how Oracle treats the OO concept, and why Data Pump doesn't work.
So, the scenario for the first one:
1) there is an object type h1 and a table of h1
2) there is an unrelated object type row_text and a table of row_text
3) there is an object type h2 under h1 with an attribute that is a table of row_text
4) there is a table tab1 with column b whose data type is the table of h1. Of course, column b is stored as a nested table.
So how many nested tables do you think Oracle will create? The correct answer is 2: one explicitly defined, and one hidden, with a system-generated name, for the type h2 under h1. So the question is WHY? Why, if I create storage for the supertype, does Oracle prepare storage for the subtype as well? Even more: if I create another subtype h3 under h1, another hidden nested table appears.
This was the first part.
The second part: if I do a schema export and try to import it into another schema, I get an error saying that Oracle failed to create the storage table for
nested table column b. So the second question is: given that Oracle has created such a mess of hidden nested tables, how do I export/import to another
schema?
Ok and here is test case to demonstrate problems above:
-- creating type h1 and table of it
SQL> create or replace type h1 as object (a number)
2 not final;
3 /
Type created.
SQL> create or replace type tbl_h1 as table of h1;
2 /
Type created.
-- creating type row_text and table of it
SQL> create or replace type row_text as object (
2 txt varchar2(100))
3 not final;
4 /
Type created.
SQL> create or replace type tbl_row_text as table of row_text;
2 /
Type created.
-- creating type h2 as subtype of h1
SQL> create or replace type h2 under h1 (some_texts tbl_row_text);
2 /
Type created.
SQL> create table tab1 (a number, b tbl_h1)
2 nested table b
3 store as tab1_nested;
Table created.
-- so we have 2 nested tables now
SQL> select table_name, parent_table_name, parent_table_column
2 from user_nested_tables;
TABLE_NAME PARENT_TABLE_NAME
PARENT_TABLE_COLUMN
SYSNTfsl/+pzu3+jgQAB/AQB27g== TAB1_NESTED
TREAT(SYS_NC_ROWINFO$ AS "GINTS"."H2")."SOME_TEXTS"
TAB1_NESTED TAB1
B
-- another subtype of t1
SQL> create or replace type h3 under h1 (some_texts tbl_row_text);
2 /
Type created.
-- plus another nested table
SQL> select table_name, parent_table_name, parent_table_column
2 from user_nested_tables;
TABLE_NAME PARENT_TABLE_NAME
PARENT_TABLE_COLUMN
SYSNTfsl/+pzu3+jgQAB/AQB27g== TAB1_NESTED
TREAT(SYS_NC_ROWINFO$ AS "GINTS"."H2")."SOME_TEXTS"
SYSNTfsl/+pz03+jgQAB/AQB27g== TAB1_NESTED
TREAT(SYS_NC_ROWINFO$ AS "GINTS"."H3")."SOME_TEXTS"
TAB1_NESTED TAB1
B
SQL> desc "SYSNTfsl/+pzu3+jgQAB/AQB27g=="
Name Null? Type
TXT VARCHAR2(100)
OK, let it be. Now I'm trying to export and import into another schema:
[oracle@xxx]$ expdp gints/xxx@xxx directory=xxx dumpfile=gints.dmp logfile=gints.log
Export: Release 11.2.0.1.0 - Production on Thu Feb 4 22:32:48 2010
<irrelevant rows skipped>
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
. . exported "GINTS"."TAB1" 0 KB 0 rows
. . exported "GINTS"."SYSNTfsl/+pz03+jgQAB/AQB27g==" 0 KB 0 rows
. . exported "GINTS"."TAB1_NESTED" 0 KB 0 rows
. . exported "GINTS"."SYSNTfsl/+pzu3+jgQAB/AQB27g==" 0 KB 0 rows
Master table "GINTS"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
******************************************************************************
And now the import. So that the types can be created in the target schema, transform=OID:n is applied, along with remap_schema.
It fails to create the table, though.
[oracle@xxx]$ impdp gints1/xxx@xxx directory=xxx dumpfile=gints.dmp logfile=gints_imp.log remap_schema=gints:gints1 transform=OID:n
Import: Release 11.2.0.1.0 - Production on Thu Feb 4 22:41:48 2010
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Release 11.2.0.1.0 - Production
Master table "GINTS1"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
Starting "GINTS1"."SYS_IMPORT_FULL_01": gints1/********@xxx directory=xxx dumpfile=gints.dmp logfile=gints_imp.log remap_schema=gints:gints1 transform=OID:n
Processing object type SCHEMA_EXPORT/USER
ORA-31684: Object type USER:"GINTS1" already exists
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TYPE/TYPE_SPEC
Processing object type SCHEMA_EXPORT/TABLE/TABLE
ORA-39083: Object type TABLE:"GINTS1"."TAB1" failed to create with error:
ORA-02320: failure in creating storage table for nested table column B
ORA-00904: : invalid identifier
Failing sql is:
CREATE TABLE "GINTS1"."TAB1" ("A" NUMBER, "B" "GINTS1"."TBL_H1" ) SEGMENT CREATION IMMEDIATE PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT CELL_
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-01403: no data found
ORA-01403: no data found
Failing sql is:
DECLARE I_N VARCHAR2(60); I_O VARCHAR2(60); c DBMS_METADATA.T_VAR_COLL; df varchar2(21) := 'YYYY-MM-DD:HH24:MI:SS'; BEGIN DELETE FROM "SYS"."IMPDP_STATS"; c(1) := DBMS_METADATA.GET_STAT_COLNAME('GINTS1','TAB1_NESTED',NULL,'TREAT(SYS_NC_ROWINFO$ AS "GINTS"."H2")."SOME_TEXTS"',1); DBMS_METADATA.GET_STAT_INDNAME('GINTS1','TAB1_NESTED',c,1,i_o,i_n); INSERT INTO "
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-01403: no data found
ORA-01403: no data found
Failing sql is:
DECLARE I_N VARCHAR2(60); I_O VARCHAR2(60); c DBMS_METADATA.T_VAR_COLL; df varchar2(21) := 'YYYY-MM-DD:HH24:MI:SS'; BEGIN DELETE FROM "SYS"."IMPDP_STATS"; c(1) := DBMS_METADATA.GET_STAT_COLNAME('GINTS1','TAB1_NESTED',NULL,'TREAT(SYS_NC_ROWINFO$ AS "GINTS"."H3")."SOME_TEXTS"',1); DBMS_METADATA.GET_STAT_INDNAME('GINTS1','TAB1_NESTED',c,1,i_o,i_n); INSERT INTO "
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Job "GINTS1"."SYS_IMPORT_FULL_01" completed with 4 error(s) at 22:41:52
So, any idea how to export/import such tables?
TIA
Gints
Tom Kyte has said it repeatedly; I will repeat it here for you.
The fact that Oracle allows you to build object tables is not an indication that you should.
Store your data relationally and build object views on top of it.
http://www.morganslibrary.org/reference/object_views.html
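As a hedged illustration of that advice (the table and view names here are made up, reusing the h1 type from the test case): keep the rows in an ordinary relational table and expose them through an object view:

```sql
-- Hedged sketch: relational storage plus an object view of type h1.
CREATE TABLE h1_rel (
  a NUMBER PRIMARY KEY
);

CREATE OR REPLACE VIEW h1_ov OF h1
  WITH OBJECT IDENTIFIER (a) AS
  SELECT a FROM h1_rel;
```

Exports of h1_rel are then plain table exports; no hidden SYSNT% storage tables are involved, so Data Pump has nothing to trip over.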
If you model properly, and store properly, you don't have any issues. -
Schema export via Oracle data pump with Database Vault enabled question
Hi,
I have installed and configured Database Vault on an Oracle 11g-r2-11.2.0.3 to protect a specific schema (SCHEMA_NAME) via a realm. I have followed the following doc:
http://www.oracle.com/technetwork/database/security/twp-databasevault-dba-bestpractices-199882.pdf
to ensure that the SYS and SYSTEM users have sufficient rights to complete a scheduled Oracle Data Pump export operation.
I.e. I have granted to sys and system the following:
execute dvsys.dbms_macadm.authorize_scheduler_user('sys','SCHEMA_NAME');
execute dvsys.dbms_macadm.authorize_scheduler_user('system','SCHEMA_NAME');
execute dvsys.dbms_macadm.authorize_datapump_user('sys','SCHEMA_NAME');
execute dvsys.dbms_macadm.authorize_datapump_user('system','SCHEMA_NAME');
I have also created a second realm on the same schema (SCHEMA_NAME) to allow SYS and SYSTEM to maintain indexes for realm-protected tables. This separate realm was created for all their index types - Index, Index Partition, and Indextype - and SYS and SYSTEM have been authorized as OWNER in this realm.
However, when I try to complete an Oracle Data Pump export of the schema, I get two errors directly after the following line in the export log:
Processing object type SCHEMA_EXPORT/TABLE/INDEX/DOMAIN_INDEX/INDEX:
ORA-39127: unexpected error from call to export_string :=SYS.DBMS_TRANSFORM_EXIMP.INSTANCE_INFO_EXP('AQ$_MGMT_NOTIFY_QTABLE_S','SYSMAN',1,1,'11.02.00.00.00',newblock)
ORA-01031: insufficient privileges
ORA-06512: at "SYS.DBMS_TRANSFORM_EXIMP", line 197
ORA-06512: at line 1
ORA-06512: at "SYS.DBMS_METADATA", line 9081
ORA-39127: unexpected error from call to export_string :=SYS.DBMS_TRANSFORM_EXIMP.INSTANCE_INFO_EXP('AQ$_MGMT_LOADER_QTABLE_S','SYSMAN',1,1,'11.02.00.00.00',newblock)
ORA-01031: insufficient privileges
ORA-06512: at "SYS.DBMS_TRANSFORM_EXIMP", line 197
ORA-06512: at line 1
ORA-06512: at "SYS.DBMS_METADATA", line 9081
The export completes, but with these errors.
Any help, suggestions, pointers, etc actually anything will be very welcome at this stage.
Thank you
Hi Srini,
Thank you very much for your help. Unfortunately, after following the instructions in the doc, I am still getting the same errors.
Nonetheless, thank you for your input.
I was also wondering if someone could tell me how to move this thread to the Database Security area of the forum; I feel I may have posted in the wrong place, as this appears to be a Database Vault issue and not an imp/exp problem.
Edited by: zooid on May 20, 2012 10:33 PM
Edited by: zooid on May 20, 2012 10:36 PM -
Just started looking at Data Pump yesterday and it seems like a great tool.
I've had no issues getting data transferred between databases and mapped to the original tables. However, I have refactored a couple of parts of my DB and hoped I could get a pointer in the right direction.
http://docs.oracle.com/cd/B12037_01/server.101/b10825/dp_import.htm
I'm hoping to import some of my data from table BAD_DESIGN in my old schema and move a section of it into table GOOD_DESIGN in my new schema - I'm also hoping this is possible on the import.
eg.
OLD.BAD_DESIGN table
name1 VARCHAR2
number1 INTEGER
name2 VARCHAR2
number2 INTEGER
name3 VARCHAR2
number3 INTEGER
I want to take only name1 and number1 and insert them into my new table
NEW.GOOD_DESIGN table
name1 VARCHAR2
number1 INTEGER
The command I've pasted below is not working, which is obvious, as I'm not defining which column the data should be inserted into in the new schema. I also don't think I need a function to remap the data here...
impdp NEW/NEW_PASSWORD@NEW_DB:PORT/SERVICE dumpfile=BAD_DESIGN.dmp TABLES=NEW.GOOD_DESIGN CONTENT=DATA_ONLY remap_data=OLD.BAD_DESIGN.name1:remapConfig.remapString \
Is this even possible with Data Pump? If so, a pointer to what I'm missing in the command or the documentation would be appreciated.
Edited by: 910652 on Jan 27, 2012 6:25 AM
Thanks for the reply
Database versions are 10.2.0.4.0 (old) and 11.2.0.3.0 both sitting on a Linux server.
Are you asking if Data Pump can be used to move data from a table that contains 6 columns into a table that contains 2 columns?
Kind of... not really specifically to do with the number of columns; more whether I can get a subset of data from a dump and import it into a different table in a refactored DB.
To clarify, I'm asking whether the data contained in two columns of an old table can be 'extracted' from the .dmp file and inserted into a different table in a different schema (column types remain the same).
To clarify with a better example - I have a table
OLD.FAMILY
dad_name VARCHAR2
dad_age INTEGER
mum_name VARCHAR2
mum_age INTEGER
child1_name VARCHAR2
......etc.
I export this table using datapump
Now I only want to insert the data from dad_name and dad_age into my new table
NEW.DAD
name VARCHAR2
age INTEGER
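For what it's worth, Data Pump cannot project a subset of columns into a differently shaped table on import (REMAP_DATA rewrites values, not columns). A hedged workaround sketch, assuming the dump is first loaded into a staging schema with impdp (the staging and target names here are made up):

```sql
-- Hedged sketch: after impdp loads OLD.FAMILY into a STAGING schema,
-- copy just the two wanted columns across with ordinary SQL.
INSERT INTO new_schema.dad (name, age)
SELECT dad_name, dad_age
FROM   staging.family;
COMMIT;
```

The staging table can then be dropped; the import itself stays a plain table-mode impdp with REMAP_SCHEMA.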
Edited by: 910652 on Jan 27, 2012 7:03 AM -
Error connecting as DBA using Data Pump Within Transportable Modules
Hi all
I am using OWB 10g R2 and trying to set up a Transportable Module to test the Oracle Data Pump utility. I have followed the user manual regarding the grants and permissions needed for this functionality, have successfully connected to both my source and target databases, and have created my transportable module, which will extract six tables from a source database on 10.2.0.1.0 to my target schema in my warehouse, also on 10.2.0.1.0. When I come to deploy/execute the transportable module, it fails with the following error:
RPE-01023: Failed to establish connection to target database as DBA
Now we have even gone as far as granting the DBA role to the user within our target, but we still get the same error, so we assume it is something to do with the connection of the Transportable Target Module Location and that it needs to connect as DBA somehow in the connect string. Has anyone experienced this issue, and is there a way of creating the location connection that is not documented?
There is no mention of this anywhere in the manual, and I have even followed the example from http://www.rittman.net/archives/2006_04.html; my target user has the privileges detailed in the manual, as below:
User must not be SYS. Must have ALTER TABLESPACE privilege and IMP_FULL_DATABASE role. Must have CREATE MATERIALIZED VIEW privilege with ADMIN option. Must be a Warehouse Builder repository database user.
Any help would be appreciated before I raise a request with Oracle.
Did you ever find a resolution? We are experiencing the same issue...
thanks
OBX -
Exporting whole database (10GB) using Data Pump export utility
Hi,
I have a requirement to export the whole database (10 GB) using the Data Pump export utility, because it is not possible to send a 10 GB dump on a single CD/DVD to the system vendor of our application (to analyze a few issues we have).
Now, when I checked online, full export is available, but I'm not able to understand how it works, as we have never used this Data Pump utility; we use the normal export method. Also, will Data Pump reduce the size of the dump file so it can fit on a DVD, or can we use a parallel full DB export to split the files and put them on DVDs - is that possible?
Please correct me if I am wrong, and kindly help.
Thanks for your help in advance.
You need to create a directory object.
sqlplus user/password
create directory foo as '/path_here';
grant all on directory foo to public;
exit;
then run you expdp command.
Data Pump can compress the dumpfile if you are on 11.1 and have the appropriate options. The point of FILESIZE is to limit the size of each dumpfile. If you have 10 GB, are not compressing, and the total dump is 10 GB, then specifying 600 MB gives you 10 GB / 600 MB = 17 dumpfiles of 600 MB each. You will have to send them 17 CDs (probably a few more, since dumpfiles don't fill up 100% when running in parallel).
Data Pump dumpfiles are written by the server, not the client, so the dumpfiles are not created in the directory where the job is run.
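Putting the two answers together, a hedged sketch of the full export split into CD-sized pieces (the credentials and file names are placeholders; %U is the substitution variable expdp uses to number the dumpfiles automatically):

```shell
# Hedged sketch: full export split into 600 MB pieces via the "foo"
# directory object created above. %U expands to 01, 02, ... per file.
expdp system/password full=y directory=foo \
      dumpfile=fulldb_%U.dmp filesize=600M logfile=fulldb_exp.log
```

On import, the vendor passes the same dumpfile=fulldb_%U.dmp pattern to impdp and Data Pump finds all the pieces itself.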
Dean