Can data pump handle export/import of a sequence explicitly?
Is it possible to export/import just a database sequence explicitly using Data Pump?
I know we can export/import a sequence by exporting/importing the user in which it is defined.
However, I am looking for a way to export/import individual sequences without having to export/import the entire user.
Thanks
Hi, you can use the INCLUDE parameter of the expdp command; with this option you can specify which objects to include in the export and import process. You can review the following documentation for more information about this topic.
INCLUDE
Regards.
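As a sketch of what that looks like (the schema user scott, the directory object DATA_PUMP_DIR and the sequence name EMP_SEQ below are assumptions; adjust them to your environment):
% expdp scott/tiger DIRECTORY=DATA_PUMP_DIR DUMPFILE=seq_only.dmp INCLUDE=SEQUENCE:\"IN \('EMP_SEQ'\)\"
% impdp scott/tiger DIRECTORY=DATA_PUMP_DIR DUMPFILE=seq_only.dmp
Note that most shells require the filter expression to be quoted/escaped as above; putting the INCLUDE clause in a parameter file (PARFILE) avoids the escaping entirely.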
Similar Messages
-
Can Data Pump export/import only tables and views?
Hi
Can anybody tell me how to export or import only tables and views through Data Pump?
You can use the INCLUDE parameter:
expdp userid=scott/tiger include=table include=view directory=DATAPUMP2 dumpfile=Only_Table_View.dmp
impdp userid=scott/tiger dumpfile=only_table_view.dmp include=view include=table directory=datapump2 -
Can we use Data Pump to export data, using a SQL query, doing a join
Folks,
I have a quick question.
Using Oracle 10g R2 on Solaris 10.
Can Data Pump be used to export data, using a SQL query which is doing a join between 3 tables ?
Thanks,
Ashish
Hello,
No. This is from expdp help=Y:
QUERY - Predicate clause used to export a subset of a table.
Regards -
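One common workaround (a sketch, not from the thread, with hypothetical table and column names): since QUERY only filters a single table, materialize the join into a staging table first and export that table instead.
SQL> CREATE TABLE join_result AS
  2  SELECT a.id, b.name, c.amount
  3  FROM t1 a JOIN t2 b ON a.id = b.id JOIN t3 c ON b.id = c.id;
% expdp scott/tiger DIRECTORY=DATA_PUMP_DIR DUMPFILE=join_result.dmp TABLES=join_result
Drop the staging table afterwards if it is no longer needed.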
Can I do a client export/import between systems with different patch levels?
We want to do a client export/import between two systems. As I understand it, the patch levels should be the same on both for this to happen.
But right now I am in a situation where time is a big constraint. The Basis patch level is on par, but the rest are not. Can I do a client import successfully at this stage?
The only thing that may stop it working could be additional functionality between the systems.
I work on XI and there is a huge difference between SP12 and SP15. We were able to do the transports between systems, but as Dev was patched before QA, and the transport relied on standard settings not being available, it still went through, but we had to transport it again after applying SP15 to QA.
Again, it probably depends on what System you have running on Netweaver... -
Data Pump : Table export from 9i to 10g
We are creating a dump file for some tables with CLOB columns. These tables are in Oracle 9i and have to be moved to 10g. Our DBAs are suggesting that we cannot import those dump files directly into a 10g database, otherwise there can be NLS and text issues; it has to be imported into 9i first and then upgraded to 10g. Is this a correct assessment?
Thanks for your help.
- D
Welcome to the forums!
Please post the full OS version, along with the exact versions of "9i" and "10g".
>
... otherwise there can be NLS and text issues ...
>
Can you please clarify what the NLS and text issues are?
HTH
Srini -
Data Pump import to a sql file error :ORA-31655 no data or metadata objects
Hello,
I'm using Data Pump to export/import data; one requirement is to import data to a SQL file. The OS is Windows.
I made the follow export :
expdp system/password directory=dpump_dir dumpfile=tablesdump.dmp content=DATA_ONLY tables=user.tablename
and it works, I can see the file TABLESDUMP.DMP in the directory path.
then when I tried to import it to a sql file:
impdp system/password directory=dpump_dir dumpfile=tablesdump.dmp sqlfile=tables_export.sql
the log show :
ORA-31655 no data or metadata objects selected for job
and the sql file is created empty in the directory path.
I'm not a DBA, I'm a Java developer. Can you help me?
Thks
Hi, I added the command line:
expdp system/system directory=dpump_dir dumpfile=tablesdump.dmp content=DATA_ONLY schemas=ko1 tables=KO1QT01 logfile=capture.log
The log on the console screen is below (it was in Spanish); no log file was created in the directory path.
Export: Release 10.2.0.1.0 - Production on Tuesday, 26 January, 2010 12:59:14
Copyright (c) 2003, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Express Edition Release 10.2.0.1.0 - Production
UDE-00010: multiple job modes requested, schema and tables.
This is why I used tables=user.tablename instead; is this right?
Thks -
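A likely cause of the ORA-31655 above (a side note, not stated in the thread): the dump was taken with CONTENT=DATA_ONLY, so it contains no metadata, while SQLFILE asks impdp to write out DDL only. With no metadata in the dump there is nothing to select. Re-exporting without CONTENT=DATA_ONLY (the default is CONTENT=ALL) should let the SQLFILE import produce DDL:
% expdp system/password DIRECTORY=dpump_dir DUMPFILE=tablesdump.dmp TABLES=ko1.KO1QT01
% impdp system/password DIRECTORY=dpump_dir DUMPFILE=tablesdump.dmp SQLFILE=tables_export.sql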
Oracle 10g - Data Pump: Export / Import of Sequences ?
Hello,
I'm new to this forum and also to Oracle (version 10g). Since I could not find an answer to my question, I am opening this post hoping to get some help from the experienced users.
My question concerns the Data Pump Utility and what happens to sequences which were defined in the source database:
I have exported a schema with the following command:
"expdp <user>/<pass> DIRECTORY=DATA_PUMP_DIR DUMPFILE=dumpfile.dmp LOGFILE=logfile.log"
This worked fine and also the import seemed to work fine with the command:
"impdp <user>/<pass> DIRECTORY=DATA_PUMP_DIR DUMPFILE=dumpfile.dmp"
It loaded the exported objects directly into the schema of the target database.
BUT:
Something has happened to my sequences. :-(
When I want to use them, all sequences start again at value "1". Since I have already loaded data with higher values into my tables, I get into trouble with the PKs of these tables, because I sometimes used sequences as primary keys.
My questions are:
1. Did I do something wrong with the Data Pump utility?
2. What is the correct way to export and import sequences so that they keep their current values?
3. If the behaviour described here is correct, how can I reset the sequences so they continue from the last value used in the source database?
Thanks a lot in advance for any help concerning this topic!
Best regards
FireFighter
P.S.
English is not my native language, so sorry if it does not sound perfect! I hope someone can understand it nevertheless. ;-)
My questions are:
1. Did I do something wrong with the Data Pump utility?
I do not think so. But maybe with the existing schema. :-(
2. What is the correct way to export and import sequences so that they keep their current values?
If the sequences exist in the target before the import, Oracle does not drop and recreate them. So you need to ensure that the sequences do not already exist in the target, or that the existing ones are dropped before the import.
3. If the behaviour described here is correct, how can I reset the sequences so they continue from the last value used in the source database?
You can either refresh with the import after the above correction, or drop and manually recreate the sequences to START WITH the next value of the source sequences.
The easier way is to generate a script from the source if you know how to do it -
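Generating such a script from the source can be sketched like this (assuming the sequences belong to the connected user; note that with cached sequences, LAST_NUMBER may be slightly ahead of the last value actually used, which is safe for this purpose):
SQL> SELECT 'CREATE SEQUENCE '||sequence_name||' START WITH '||(last_number + 1)||';'
  2  FROM user_sequences;
Spool the output to a file and run it in the target database after dropping the old sequences there.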
Data Pump - How to avoid exporting/importing dbms_scheduler jobs?
Hi,
I am using Data Pump to export a user's objects. When I import them, it also imports any jobs that user has created with dbms_scheduler. How can I avoid this? I tried EXCLUDE=JOBS but no luck.
Thanks,
Jon.
Here are my export and import parameter files:
DIRECTORY=dpump_dir1
DUMPFILE=reveal.dmp
CONTENT=METADATA_ONLY
SCHEMAS=REVEAL
EXCLUDE=TABLE_STATISTICS
EXCLUDE=INDEX_STATISTICS
LOGFILE=reveal.log
DIRECTORY=dpump_dir1
DUMPFILE=reveal.dmp
CONTENT=METADATA_ONLY
SCHEMAS=reveal
REMAP_SCHEMA=reveal:reveal_backup
TRANSFORM=SEGMENT_ATTRIBUTES:n
EXCLUDE=TABLE_STATISTICS
EXCLUDE=INDEX_STATISTICS
LOGFILE=reveal.log
Sorry for the reply to an old post.
It seems that now (10.2.0.4) JOB is included in the list of SCHEMA_EXPORT_OBJECTS.
SQL> SELECT OBJECT_PATH FROM SCHEMA_EXPORT_OBJECTS WHERE object_path LIKE '%JOB%';
OBJECT_PATH
JOB
SCHEMA_EXPORT/JOB
Unfortunately, EXCLUDE=JOB still generates "invalid argument" on my schema imports. I also don't know whether these are old-style jobs or scheduler jobs. I don't see anything for object_path LIKE '%SCHED%', which is my real interest anyway.
Data Pump is so rich already, I hate to ask for more, but... may we please have even more? scheduler_programs, scheduler_jobs, scheduler, etc.
Thanks
Steve -
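A hedged suggestion (not confirmed in the thread): in 10g, DBMS_SCHEDULER jobs are exported as procedural objects, so EXCLUDE=PROCOBJ is commonly used to keep them out of a schema export; be aware it also excludes other procedural objects (programs, schedules, chains, etc.):
% expdp system/password SCHEMAS=reveal DIRECTORY=dpump_dir1 DUMPFILE=reveal.dmp EXCLUDE=PROCOBJ
You can check which paths your version supports by querying the SCHEMA_EXPORT_OBJECTS view, as shown above.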
Export / Import of Apex Data
Hello,
We are planning to move our Oracle databases from Windows servers to Linux ones. In a test lab, we have successfully exported all data from an Oracle DB instance on Windows and imported that data into an Oracle DB instance on Linux (with the old export/import Oracle tools).
But in fact, it was all data except the Apex data. The complete Apex environment is available on our new Linux server, but there is no trace of our workspaces and applications. The only available workspace is INTERNAL.
We do not want to export/import all workspaces, applications, images and files one by one (by hand in the Apex environment).
- Is there a particular method to follow to include the Apex data in the export / import process ?
OR
- Is there a trick or a utility program that can automate the export / import of Apex data ?
Thank you for your help.
- Patrice
Patrice,
There is an Apex utility (called ApexExport) that can be set up to run workspace, application and page exports. It is a Java-based utility and can sometimes be a little maddening to configure, but it will do the job once you get it running. The utility is on the database machine in the Apex installation directory. There is pretty good documentation available from Oracle, and John Scott blogged about it here: http://jes.blogs.shellprompt.net/2006/12/12/backing-up-your-applications/
Earl -
Several users belong to the tablespace I want to export
I want to extract only the tablespace and the tables belonging to one user.
The user also wants some of the tables with all the data in them, and some of the tables with no data in them.
I tried all the options with DATA_ONLY and METADATA_ONLY on both the import and the export, and have issues.
I have tried exporting, and it gives me all the tables and a size to indicate it works, but on import - kazoom, hell breaks loose. Junior DBA, need help.
SQL> select owner, table_name, tablespace_name from dba_tables where tablespace_name='USER_SEG';
ORAPROBE OSP_ACCOUNTS USER_SEG
PATSY TRAPP USER_SEG
PATSY TRAPPO USER_SEG
PATSY TRAUDIT USER_SEG
PATSY TRCOURSE USER_SEG
PATSY TRCOURSEO USER_SEG
PATSY TRDESC USER_SEG
PATSY TREMPDATA USER_SEG
PATSY TRFEE USER_SEG
PATSY TRNOTES USER_SEG
PATSY TROPTION USER_SEG
PATSY TRPART USER_SEG
PATSY TRPARTO USER_SEG
PATSY TRPART_OLD USER_SEG
PATSY TRPERCENT USER_SEG
PATSY TRSCHOOL USER_SEG
PATSY TRSUPER USER_SEG
PATSY TRTRANS USER_SEG
PATSY TRUSERPW USER_SEG
PATSY TRUSRDAT USER_SEG
PATSY TRVARDAT USER_SEG
PATSY TRVERIFY USER_SEG
PATSY TRAPPO_RESET USER_SEG
PATSY TRAPP_RESET USER_SEG
PATSY TRCOURSEO_RESET USER_SEG
PATSY TRCOURSE_RESET USER_SEG
PATSY TRPARTO_RESET USER_SEG
PATSY TRPART_RESET USER_SEG
PATSY TRTRANS_RESET USER_SEG
PATSY TRVERIFY_RESET USER_SEG
MAFANY TRVERIFY USER_SEG
MAFANY TRPART USER_SEG
MAFANY TRPARTO USER_SEG
MAFANY TRAPP USER_SEG
MAFANY TRAPPO USER_SEG
MAFANY TRCOURSE USER_SEG
MAFANY TRCOURSEO USER_SEG
MAFANY TRTRANS USER_SEG
JULIE R_REPOSITORY_LOG USER_SEG
JULIE R_VERSION USER_SEG
JULIE R_DATABASE_TYPE USER_SEG
JULIE R_DATABASE_CONTYPE USER_SEG
JULIE R_NOTE USER_SEG
JULIE R_DATABASE USER_SEG
JULIE R_DATABASE_ATTRIBUTE USER_SEG
JULIE R_DIRECTORY USER_SEG
JULIE R_TRANSFORMATION USER_SEG
JULIE R_TRANS_ATTRIBUTE USER_SEG
JULIE R_DEPENDENCY USER_SEG
JULIE R_PARTITION_SCHEMA USER_SEG
JULIE R_PARTITION USER_SEG
JULIE R_TRANS_PARTITION_SCHEMA USER_SEG
JULIE R_CLUSTER USER_SEG
JULIE R_SLAVE USER_SEG
JULIE R_CLUSTER_SLAVE USER_SEG
JULIE R_TRANS_SLAVE USER_SEG
JULIE R_TRANS_CLUSTER USER_SEG
JULIE R_TRANS_HOP USER_SEG
JULIE R_TRANS_STEP_CONDITION USER_SEG
JULIE R_CONDITION USER_SEG
JULIE R_VALUE USER_SEG
JULIE R_STEP_TYPE USER_SEG
JULIE R_STEP USER_SEG
JULIE R_STEP_ATTRIBUTE USER_SEG
JULIE R_STEP_DATABASE USER_SEG
JULIE R_TRANS_NOTE USER_SEG
JULIE R_LOGLEVEL USER_SEG
JULIE R_LOG USER_SEG
JULIE R_JOB USER_SEG
JULIE R_JOBENTRY_TYPE USER_SEG
JULIE R_JOBENTRY USER_SEG
JULIE R_JOBENTRY_COPY USER_SEG
JULIE R_JOBENTRY_ATTRIBUTE USER_SEG
JULIE R_JOB_HOP USER_SEG
JULIE R_JOB_NOTE USER_SEG
JULIE R_PROFILE USER_SEG
JULIE R_USER USER_SEG
JULIE R_PERMISSION USER_SEG
JULIE R_PROFILE_PERMISSION USER_SEG
MAFANY2 TRAPP USER_SEG
MAFANY2 TRAPPO USER_SEG
MAFANY2 TRCOURSE USER_SEG
MAFANY2 TRCOURSEO USER_SEG
MAFANY2 TRPART USER_SEG
MAFANY2 TRPARTO USER_SEG
MAFANY2 TRTRANS USER_SEG
MAFANY2 TRVERIFY USER_SEG
MAFANY BIN$ZY3M1IuZyq3gQBCs+AAMzQ==$0 USER_SEG
MAFANY BIN$ZY3M1Iuhyq3gQBCs+AAMzQ==$0 USER_SEG
MAFANY MYUSERS USER_SEG
I only want the tables from PATSY and want to move them to another database for her to use, keeping the same tablespace name.
The tables below should have just the metadata and not the data:
PATSY TRAPP USER_SEG
PATSY TRAPPO USER_SEG
PATSY TRAUDIT USER_SEG
PATSY TRCOURSE USER_SEG
PATSY TRCOURSEO USER_SEG
PATSY TRDESC USER_SEG
PATSY TREMPDATA USER_SEG
PATSY TRFEE USER_SEG
PATSY TRNOTES USER_SEG
PATSY TROPTION USER_SEG
PATSY TRPART USER_SEG
PATSY TRPARTO USER_SEG
PATSY TRPART_OLD USER_SEG
PATSY TRPERCENT USER_SEG
PATSY TRSCHOOL USER_SEG
PATSY TRSUPER USER_SEG
PATSY TRTRANS USER_SEG
PATSY TRUSERPW USER_SEG
PATSY TRUSRDAT USER_SEG
The following are supposed to have all the data:
PATSY TRVERIFY USER_SEG
PATSY TRAPPO_RESET USER_SEG
PATSY TRAPP_RESET USER_SEG
PATSY TRCOURSEO_RESET USER_SEG
PATSY TRCOURSE_RESET USER_SEG
PATSY TRPARTO_RESET USER_SEG
PATSY TRPART_RESET USER_SEG
PATSY TRTRANS_RESET USER_SEG
PATSY TRVERIFY_RESET USER_SEG
I have tried all the following, and later got stuck in my little effort to document as I go along:
Using Data Pump to export data
First: create a directory object, or use one that already exists.
I created a directory object and granted access:
CREATE DIRECTORY "DATA_PUMP_DIR" AS '/home/oracle/my_dump_dir';
GRANT read, write ON DIRECTORY DATA_PUMP_DIR TO user;
where user is the user who will run the export, such as system.
For example, to create a directory object named expdp_dir located at /u01/backup/exports, enter the following SQL statement:
SQL> create directory expdp_dir as '/u01/backup/exports';
then grant read and write permissions to the users who will be performing the Data Pump export and import:
SQL> grant read,write on directory expdp_dir to system, user1, user2, user3;
http://wiki.oracle.com/page/Data+Pump+Export+(expdp)+and+Data+Pump+Import(impdp)?t=anon
To view directory objects that already exist
- use EM: under Administration tab to schema section and select Directory Objects
- DESC the views: ALL_DIRECTORIES, DBA_DIRECTORIES
select * from all_directories;
select * from dba_directories;
Export the schema using expdp:
expdp system/SCHMOE DUMPFILE=patsy_schema.dmp
DIRECTORY=DATA_PUMP_DIR SCHEMAS = PATSY
expdp system/PASS DUMPFILE=METADATA_ONLY_schema.dmp DIRECTORY=DATA_PUMP_DIR TABLESPACES = USER_SEG CONTENT=METADATA_ONLY
expdp system/PASS DUMPFILE=data_only_schema.dmp DIRECTORY=DATA_PUMP_DIR TABLES=PATSY.TRVERIFY,PATSY.TRAPPO_RESET,PATSY.TRAPP_RESET,PATSY.TRCOURSEO_RESET, PATSY.TRCOURSE_RESET,PATSY.TRPARTO_RESET,PATSY.TRPART_RESET,PATSY.TRTRANS_RESET,PATSY.TRVERIFY_RESET CONTENT=DATA_ONLY
You are correct, all the PATSY tables reside in the tablespace USER_SEG in a different database, and I want to move them to a new database of the same version, 10g. I have created a user named patsy there also. The tablespace does not exist in the target I want to move them to; same thing with the objects and indexes - they don't exist in the target.
So how can I move the schema and tablespace with all the tables, keeping the tablespace name and the same username patsy that I created to import into?
I tried again and get some errors. This is better than last time, and that is because I used REMAP_TABLESPACE=USER_SEG:USERS.
But I want the tablespace in the target to be called USER_SEG too. Do I have to create it, or?
[oracle@server1 ~]$ impdp system/blue99 remap_schema=patsy:test REMAP_TABLESPACE=USER_SEG:USERS directory=DATA_PUMP_DIR dumpfile=patsy.dmp logfile=patsy.log
Import: Release 10.1.0.3.0 - Production on Wednesday, 08 April, 2009 11:10
Copyright (c) 2003, Oracle. All rights reserved.
Connected to: Oracle Database 10g Release 10.1.0.3.0 - Production
Master table "SYSTEM"."SYS_IMPORT_FULL_03" successfully loaded/unloaded
Starting "SYSTEM"."SYS_IMPORT_FULL_03": system/******** remap_schema=patsy:test REMAP_TABLESPACE=USER_SEG:USERS directory=DATA_PUMP_DIR dumpfile=patsy.dmp logfile=patsy.log
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/TBL_TABLE_DATA/TABLE/TABLE_DATA
. . imported "TEST"."TRTRANS" 10.37 MB 93036 rows
. . imported "TEST"."TRAPP" 2.376 MB 54124 rows
. . imported "TEST"."TRAUDIT" 1.857 MB 28153 rows
. . imported "TEST"."TRPART_OLD" 1.426 MB 7183 rows
. . imported "TEST"."TRSCHOOL" 476.8 KB 5279 rows
. . imported "TEST"."TRAPPO" 412.3 KB 9424 rows
. . imported "TEST"."TRUSERPW" 123.8 KB 3268 rows
. . imported "TEST"."TRVERIFY_RESET" 58.02 KB 183 rows
. . imported "TEST"."TRUSRDAT" 54.73 KB 661 rows
. . imported "TEST"."TRNOTES" 51.5 KB 588 rows
. . imported "TEST"."TRCOURSE" 49.85 KB 243 rows
. . imported "TEST"."TRCOURSE_RESET" 47.60 KB 225 rows
. . imported "TEST"."TRPART" 39.37 KB 63 rows
. . imported "TEST"."TRPART_RESET" 37.37 KB 53 rows
. . imported "TEST"."TRTRANS_RESET" 38.94 KB 196 rows
. . imported "TEST"."TRCOURSEO" 30.93 KB 51 rows
. . imported "TEST"."TRCOURSEO_RESET" 28.63 KB 36 rows
. . imported "TEST"."TRPERCENT" 33.72 KB 1044 rows
. . imported "TEST"."TROPTION" 30.10 KB 433 rows
. . imported "TEST"."TRPARTO" 24.78 KB 29 rows
. . imported "TEST"."TRPARTO_RESET" 24.78 KB 29 rows
. . imported "TEST"."TRVERIFY" 20.97 KB 30 rows
. . imported "TEST"."TRVARDAT" 14.13 KB 44 rows
. . imported "TEST"."TRAPP_RESET" 14.17 KB 122 rows
. . imported "TEST"."TRDESC" 9.843 KB 90 rows
. . imported "TEST"."TRAPPO_RESET" 8.921 KB 29 rows
. . imported "TEST"."TRSUPER" 6.117 KB 10 rows
. . imported "TEST"."TREMPDATA" 0 KB 0 rows
. . imported "TEST"."TRFEE" 0 KB 0 rows
Processing object type TABLE_EXPORT/TABLE/GRANT/TBL_OWNER_OBJGRANT/OBJECT_GRANT
Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
ORA-39083: Object type INDEX failed to create with error:
ORA-00959: tablespace 'INDEX1_SEG' does not exist
Failing sql is:
CREATE UNIQUE INDEX "TEST"."TRPART_11" ON "TEST"."TRPART" ("TRPCPID", "TRPSSN") PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS STORAGE(INITIAL 32768 NEXT 131072 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT) TABLESPACE "INDEX1_SEG" PARALLEL 1
ORA-39083: Object type INDEX failed to create with error:
ORA-00959: tablespace 'INDEX1_SEG' does not exist
Failing sql is:
CREATE INDEX "TEST"."TRPART_I2" ON "TEST"."TRPART" ("TRPLAST") PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS STORAGE(INITIAL 131072 NEXT 131072 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT) TABLESPACE "INDEX1_SEG" PARALLEL 1
ORA-39083: Object type INDEX failed to create with error:
ORA-00959: tablespace 'INDEX1_SEG' does not exist
Failing sql is:
CREATE INDEX "TEST"."TRPART_I3" ON "TEST"."TRPART" ("TRPEMAIL") PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS STORAGE(INITIAL 131072 NEXT 131072 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT) TABLESPACE "INDEX1_SEG" PARALLEL 1
ORA-39083: Object type INDEX failed to create with error:
ORA-00959: tablespace 'INDEX1_SEG' does not exist
Failing sql is:
CREATE INDEX "TEST"."TRPART_I4" ON "TEST"."TRPART" ("TRPPASSWORD") PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS STORAGE(INITIAL 131072 NEXT 131072 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT) TABLESPACE "INDEX1_SEG" PARALLEL 1
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "BILLZ"."TRCOURSE_I1" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"BILLZ"', '"TRCOURSE_I1"', NULL, NULL, NULL, 232, 2, 232, 1, 1, 7, 2, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "BILLZ"."TRTRANS_I1" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"BILLZ"', '"TRTRANS_I1"', NULL, NULL, NULL, 93032, 445, 93032, 1, 1, 54063, 2, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "DANQ"."TRAPP_I1" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"DANQ"', '"TRAPP_I1"', NULL, NULL, NULL, 54159, 184, 54159, 1, 1, 4597, 1, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "DANQ"."TRAPP_I2" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"DANQ"', '"TRAPP_I2"', NULL, NULL, NULL, 54159, 182, 17617, 1, 2, 48776, 1, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "JULIE"."TRAPPO_I1" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"JULIE"', '"TRAPPO_I1"', NULL, NULL, NULL, 9280, 29, 9280, 1, 1, 166, 1, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "JULIE"."TRAPPO_I2" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"JULIE"', '"TRAPPO_I2"', NULL, NULL, NULL, 9280, 28, 4062, 1, 2, 8401, 1, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "JULIE"."TRCOURSEO_I1" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"JULIE"', '"TRCOURSEO_I1"', NULL, NULL, NULL, 49, 2, 49, 1, 1, 8, 1, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "JULIE"."TREMPDATA_I1" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"JULIE"', '"TREMPDATA_I1"', NULL, NULL, NULL, 0, 0, 0, 0, 0, 0, 0, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "DANQ"."TROPTION_I1" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"DANQ"', '"TROPTION_I1"', NULL, NULL, NULL, 433, 3, 433, 1, 1, 187, 1, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"TEST"."TRPART_11" creation failed
ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"TEST"."TRPART_I2" creation failed
ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"TEST"."TRPART_I3" creation failed
ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"TEST"."TRPART_I4" creation failed
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "DANQ"."TRPART_I5" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"DANQ"', '"TRPART_I5"', NULL, NULL, NULL, 19, 1, 19, 1, 1, 10, 1, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "JULIE"."TRPARTO_I1" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"JULIE"', '"TRPARTO_I1"', NULL, NULL, NULL, 29, 1, 29, 1, 1, 1, 1, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "JULIE"."TRPARTO_I2" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"JULIE"', '"TRPARTO_I2"', NULL, NULL, NULL, 29, 14, 26, 1, 1, 1, 1, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "JULIE"."TRPART_I2" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"JULIE"', '"TRPART_I2"', NULL, NULL, NULL, 7180, 19, 4776, 1, 1, 7048, 1, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "JULIE"."TRPART_I3" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"JULIE"', '"TRPART_I3"', NULL, NULL, NULL, 2904, 15, 2884, 1, 1, 2879, 1, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "JULIE"."TRPART_I4" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"JULIE"', '"TRPART_I4"', NULL, NULL, NULL, 363, 1, 362, 1, 1, 359, 0, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "JULIE"."TRPART_I5" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"JULIE"', '"TRPART_I5"', NULL, NULL, NULL, 363, 1, 363, 1, 1, 353, 0, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "DANQ"."TRPART_11" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"DANQ"', '"TRPART_11"', NULL, NULL, NULL, 7183, 29, 7183, 1, 1, 6698, 1, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "DANQ"."TRPERCENT_I1" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"DANQ"', '"TRPERCENT_I1"', NULL, NULL, NULL, 1043, 5, 1043, 1, 1, 99, 1, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "DANQ"."TRSCHOOL_I1" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"DANQ"', '"TRSCHOOL_I1"', NULL, NULL, NULL, 5279, 27, 5279, 1, 1, 4819, 1, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "JULIE"."TRVERIFY_I2" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"JULIE"', '"TRVERIFY_I2"', NULL, NULL, NULL, 30, 7, 7, 1, 1, 1, 2, 2, NULL, FALSE, NULL, NULL, NULL); END;
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "STU"."TRVERIFY_I1" does not exist or insufficient privileges
Failing sql is:
BEGIN DBMS_STATS.SET_INDEX_STATS('"STU"', '"TRVERIFY_I1"', NULL, NULL, NULL, 30, 12, 30, 1, 1, 1, 2, 2, NULL, FALSE, NULL, NULL, NULL); END;
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Job "SYSTEM"."SYS_IMPORT_FULL_03" completed with 29 error(s) at 11:17 -
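To keep the original tablespace names, and to avoid the ORA-00959 index failures above (which happened because INDEX1_SEG did not exist and only USER_SEG was remapped), one approach is to pre-create both tablespaces in the target and then import without any REMAP_TABLESPACE. A sketch - the datafile paths and sizes below are hypothetical:
SQL> CREATE TABLESPACE USER_SEG DATAFILE '/u01/oradata/target/user_seg01.dbf' SIZE 500M AUTOEXTEND ON;
SQL> CREATE TABLESPACE INDEX1_SEG DATAFILE '/u01/oradata/target/index1_seg01.dbf' SIZE 200M AUTOEXTEND ON;
$ impdp system/password remap_schema=patsy:patsy directory=DATA_PUMP_DIR dumpfile=patsy.dmp logfile=patsy.log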
Export/import to/from database
hi,
I do not know what this is for. I read the help but still have no idea.
I know how to use EXPORT/IMPORT to/from a memory ID, and also SET/GET, but not EXPORT/IMPORT to/from the database.
1) The help says it stores a data cluster - what is a data cluster?
2) Can I have an example of EXPORT/IMPORT to/from the database?
3) What is the difference between EXPORT/IMPORT to/from a memory ID and the database?
Thanks
Hi,
1) A data cluster is a set of data objects grouped together for storage in a storage medium (such as the database, local memory or shared memory), which can only be edited using ABAP statements like EXPORT, IMPORT and DELETE.
2) Suppose you want to export the data in an internal table to the database. Here TABLE_ID is the identifier of your data cluster. You are exporting your data to the SAP table INDX, giving it an ID (TABLE_ID in this case) and an area named XY where your data will be stored:
EXPORT tab = itab
TO DATABASE indx(xy)
CLIENT '000'
ID 'TABLE_ID'.
You can get this data as follows.
IMPORT tab = itab
FROM DATABASE indx(xy)
CLIENT '000'
ID 'TABLE_ID'.
3) The difference is simple: when you use MEMORY ID, the exported data is stored in the application server memory, whereas with DATABASE it is stored in the database.
Regards,
Sesh -
Is it possible to export to an asm directory?
I tried this:
CREATE DIRECTORY "DGT01_EXP" AS '+DGT01/EXP';
(yes, this directory does exist in ASM)
But when I tried using this directory for expdp, I got this error.
$ expdp datapump/&pw schemas=test directory=DGT01_EXP dumpfile=test.dmp
Export: Release 10.2.0.1.0 - Production on Thursday, 27 December, 2007 14:44:53
Copyright (c) 2003, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Release 10.2.0.1.0 - Production
With the Real Application Clusters option
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-29283: invalid file operation
ORA-06512: at "SYS.UTL_FILE", line 475
ORA-29283: invalid file operation
Please let me know if anyone has been able to do this.
Thanks.
Hi,
Sorry, I misread your first post. You can use Data Pump to export data to disk, but not to ASM.
If you are trying to move data from one database to another, you may use NETWORK_LINK parameter with datapump to directly read from the source database and copy it to destination database.
Regards -
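A side note (an assumption based on the error stack, not confirmed in the thread): the ORA-39070/ORA-29283 pair usually points at the log file rather than the dump file, because expdp cannot write its text log into ASM. A commonly used workaround is to keep the dump in the ASM-backed directory and redirect the log to a filesystem directory (FS_LOG_DIR below is a hypothetical directory object on a regular filesystem), or suppress the log entirely:
$ expdp datapump/&pw schemas=test directory=DGT01_EXP dumpfile=test.dmp logfile=FS_LOG_DIR:test.log
$ expdp datapump/&pw schemas=test directory=DGT01_EXP dumpfile=test.dmp nologfile=y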
Problem with export / import
Hi everybody, I am having a problem with an import. I am doing an export with a value in a user exit:
EXPORT V_NLQNR = V_NLQNR TO MEMORY ID 'QUANTL'
And after I call import in another program:
IMPORT V_NLQNR = V_NLQNR FROM MEMORY ID 'QUANTL'.
But when I check sy-subrc I am getting 4, so I don't get anything.
Does anybody know why? Is there a problem if I call the EXPORT from a user exit?
Thanks in advance.
Hello,
I think you have the right idea.
As a suggestion I would name my variables to make it clear which data is being
exported/imported. I would also use different names on the left and right side of the = sign.
Here is a working example from programs that I use:
In the first program
EXPORT intercodata FROM g_data_exp TO MEMORY ID 'INTERCOWOS'.
where g_data_exp is declared as a global variable
In the second program
IMPORT intercodata TO g_data_imp FROM MEMORY ID 'INTERCOWOS'.
where g_data_imp is declared as a global variable
The syntax that you use (p1 = dobj1) should work as well; just make sure that the variable v_nlqnr to the right of the equals sign has a value before the EXPORT.
Regards
Greg Kern -
Exporting/Importing Reports
Hi,
We have different environments for our application....Dev, Test, and Production.
We want to create Discoverer reports on Dev and then port them to Test and Production. Is there any way I can do this through exporting/importing?
I know one way of doing this is through Discoverer Desktop. Is there any other way to do this via Discoverer Plus?
Please let me know.
Thanks and Regards,
SM
Hey Jayu,
Thanks for the reply. It works.
We have Discoverer Admin, but I didn't know that it can be used to import/export workbooks too.
Thanks Again.
Cheers,
SM -
Using EXPORT, IMPORT and SQL*Loader with tape (via a pipe)
Product: ORACLE SERVER
Date written: 2002-04-11
Using EXPORT, IMPORT and SQL*Loader with tape (via a pipe)
================================================
Purpose
When backing up or processing large volumes of data, a tape device is sometimes used. This note summarizes, case by case, how to use a tape with EXPORT, IMPORT and SQL*Loader.
Explanation
1. Exporting to a tape device
% exp userid=system/manager full=y file=/dev/rmt/0m volsize=245M
FILE is the name of the tape device and VOLSIZE is the amount of data to write to each tape. When the first tape reaches 245M, a prompt asks you to insert the next tape.
(Note) VOLSIZE should be < tape capacity
2. Importing from a tape device
% imp userid=system/manager full=y file=/dev/rmt/0m volsize=245M
When the first tape reaches 245M, a prompt asks for the next tape.
3. Exporting to tape using a pipe and dd
% mknod /tmp/exp_pipe p # Make the pipe
% dd if=/tmp/exp_pipe of=<tape device> & # Write from pipe to tape
% exp file=/tmp/exp_pipe <other options> # Export to the pipe
4. Importing from tape using a pipe and dd
% mknod /tmp/imp_pipe p # Make the pipe
% dd if=<tape device> of=/tmp/imp_pipe & # Write from tape to pipe
% imp file=/tmp/imp_pipe <other options> # Import from the pipe
5. Exporting to a tape device on a remote server using a pipe and dd
% mknod /tmp/exp_pipe p
% dd if=/tmp/exp_pipe | rsh <hostname> dd of=<file or device> &
% exp file=/tmp/exp_pipe <other options>
6. Importing from a tape device on a remote server using a pipe and dd
% mknod /tmp/imp_pipe p
% rsh <hostname> dd if=<file or device> | dd of=/tmp/imp_pipe &
% imp file=/tmp/imp_pipe <other options>
7. Loading a data file on tape with SQL*Loader
% mknod /tmp/load_pipe p
% dd if=<tape_device> of=/tmp/load_pipe &
(Note) If the tape is in EBCDIC, convert it to ASCII with the following command:
% dd if=<tape_device> conv=ascii of=/tmp/load_pipe &
% sqlldr userid=user/pass control=control.ctl log=loader.log
infile='/tmp/load_pipe'
* A pipe is a virtual in-memory file that speeds up the interaction of I/O operations. The pipe buffer is 5K on Sun Solaris, 8K on HP, and 10K on SGI. It follows FIFO order, and the command to create one is:
% mknod filename p
* dd is a command that raw-copies data from one device to another.
Re 1)
You need to explicitly state the length of the varchar field, otherwise there is a default limit (255 characters).
So your .ctl file should look like this:
INTO TABLE SCHEMA.TESTABSTRACT
TRUNCATE
FIELDS TERMINATED BY '|'
TRAILING NULLCOLS
(
PROP_ID,
ABSTRACT VARCHAR(4000)
)