Data Pump: Importing partitioned objects into target DB w/o partitioning
Hello community,
I have the following issue when importing partitioned tables from a source DB to a target DB without the partitioning option.
In this scenario, the tables and indexes on the target are already present, just without partitions. The release is 10gR2.
- First try: impdp user/pw directory=dump_dir dumpfile=thedump tables=my_part_table TABLE_EXISTS_ACTION=truncate
This stopped with "ORA-00439: feature not enabled: Partitioning", although the table already existed! It seems as if partitioning is checked before the TABLE_EXISTS_ACTION is applied.
- Second try: impdp user/pw directory=dump_dir dumpfile=thedump tables=my_part_table TABLE_EXISTS_ACTION=append
The table was truncated manually beforehand. Result: same as above.
- Third try: impdp user/pw directory=dump_dir dumpfile=thedump tables=my_part_table CONTENT=DATA_ONLY
The table was truncated manually beforehand. Result: success.
BUT: this requires manual action on those objects, and further means that you should manually disable and re-enable constraints and indexes on the table, at least when there's lots of data to load.
So option #3 is not that good of an option, as there's much more fiddling involved than there was with the legacy "imp ... IGNORE=Y".
Does anyone have a better idea how to handle this?
I'd love to see a "use_partitions=N" option in impdp, but haven't found it so far. ;-)
Regards,
Uwe
P.S.: Yes I know, the Index/Constraint part of it was more or less the same in the "imp age". Still, there's much more manual action needed than I am happy with.
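For anyone hitting the same wall: the try #3 workaround can at least be scripted. A minimal sketch, assuming the my_part_table from above (the constraint and index names here are hypothetical placeholders, not from any real dump):

```sql
-- Before the data-only load: disable FKs and mark indexes unusable
-- (names below are invented examples).
ALTER TABLE my_part_table DISABLE CONSTRAINT my_part_table_fk1;
ALTER INDEX my_part_table_ix1 UNUSABLE;
TRUNCATE TABLE my_part_table;

-- Then load rows only, so no partition DDL is attempted:
--   impdp user/pw directory=dump_dir dumpfile=thedump
--         tables=my_part_table content=DATA_ONLY skip_unusable_indexes=y

-- Afterwards, rebuild and re-enable:
ALTER INDEX my_part_table_ix1 REBUILD;
ALTER TABLE my_part_table ENABLE CONSTRAINT my_part_table_fk1;
```

Still manual, but at least repeatable per table.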
Similar Messages
-
I am having trouble using the data pump import tool. I am working with Oracle 10.2.4.0 on Windows Server 2008. I am trying to import a lot of data (several schemas, thousands of tables, hundreds of GB of data) from the same version of Oracle on Linux. Since it's so much data and will take so long to import, I am trying to make sure I can get it to work for one table or one schema before I attempt everything. So I'm trying to import the TEST_USER.TABLE1 table using the following command:
impdp.exe dumpfile=big_file.dmp logfile=big_file_import.log tables=test_user.table1 remap_datafile=\'/oradata1/data1.dbf\':\'D:\oracle\product\10.2.0\oradata\orcl\data1.dbf\'
I supply sys as sysdba for the login when it asks (not sure how you get that info into the initial command). However, I am getting the following error:
user 'TEST_USER' does not exist
My understanding was that the data pump utility would create all the requisite schemas for me. Is that not the case? The target database is a new install so the source schemas don't exist there.
Even if I create the test_user schema by hand, then I get an error stating that the tablespace doesn't exist:
ORA-00959: tablespace 'TS_1' does not exist
Then even doing that, which I don't want to have to do to begin with, doesn't work. Then it gives me something about how the user doesn't have the proper privileges on the tablespace.
Isn't the data pump utility supposed to make this stuff automatic? Do I really have to create all the schemas and tablespaces by hand? That's going to take a long time. Am I doing something wrong here?
Thanks,
Dave
user5482172 wrote:
Hemant K Chitale wrote:
tables=test_user.table1
The "TABLES" mode does NOT create the database accounts.
The FULL mode creates tablespaces and database accounts before importing data.
The SCHEMAS mode creates database accounts before importing data-- but expects Tablespaces to pre-exist so that the tables and indexes can be created in the correct tablespaces.
Hemant K Chitale
http://hemantoracledba.blogspot.com
That is consistent with what I'm seeing. So basically there is no way to import just a few tables or schemas and have it create all the necessary users and tablespaces for you. That seems a little arbitrary, but whatever. Why make things easy on your users when you can be all Oracley about it. Anyway, I'll give it a try and let you know what happens.
Can I ask where you got this information? I'm sure you're right, but I can't find anything stating this.
Thanks,
Dave
Edited by: user5482172 on Feb 18, 2010 10:51 PM
After some testing, this appears to be correct. The system will not create tablespaces under the SCHEMAS mode and it will not create schemas under the TABLES mode; only under the FULL mode will everything be created for you. I find that unintuitive and extremely unfriendly, but that's Oracle for you. Have you ever tried doing this in SQL Server? You open the GUI, tell it where the dump file is, the name of the database, where to put the data files and you're done. Oracle has a real knack for making things way more complicated than they need to be. End Oracle rant. -
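To make the implication concrete: before a TABLES-mode import into a fresh database, the tablespace and the account must be pre-created. A hedged sketch for the objects named in this thread (the password, datafile path and size are invented):

```sql
-- Run as a DBA on the target before the TABLES-mode impdp.
CREATE TABLESPACE ts_1
  DATAFILE 'D:\oracle\product\10.2.0\oradata\orcl\ts_1_01.dbf' SIZE 500M;

CREATE USER test_user IDENTIFIED BY changeme
  DEFAULT TABLESPACE ts_1
  QUOTA UNLIMITED ON ts_1;

GRANT CREATE SESSION, CREATE TABLE TO test_user;
```

With those in place, the original tables=test_user.table1 import should no longer fail on the missing user or tablespace.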
Exclude DBMS_SCHEDULER jobs in DATA PUMP import
Hello,
I need to exclude all DBMS_SCHEDULER jobs during DATA PUMP import.
I tried to do this with: EXCLUDE=JOB but this only works on DBMS_JOB.
Is there a way to exclude all DBMS_SCHEDULER jobs during import?
Kind Regards
There is PROCOBJ that can be excluded (procedural objects in the selected schemas), but I'm afraid it excludes not only DBMS_SCHEDULER jobs.
Any ideas? -
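For what it's worth, the blunt version of the PROCOBJ idea looks like this (note that, as said above, it excludes all procedural objects in the imported schemas, not just DBMS_SCHEDULER jobs; command shown as a sketch):

```
impdp user/pw directory=dump_dir dumpfile=thedump EXCLUDE=PROCOBJ
```

If only the scheduler jobs must go, it may be simpler to import everything and drop the unwanted jobs afterwards with DBMS_SCHEDULER.DROP_JOB.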
Can we import 3D objects into AE?
I have v6.5 and in the manual it says AE can import certain 3D files. Yet everyone in this forum always says to render stuff in whatever 3D app and import the *video* into AE. Even 3D plugins like Zaxwerks seem to export only video into AE, not 3D models (or so I'm told).
So what is it, can AE read 3D models or not? If not, is there any reason why it hasn't been improved to do so? Shouldn't this be a high priority thing and easy to do, since it's already almost there.
If yes, why isn't anyone doing it? At least I haven't read about how to import 3D objects into AE.
This area of functionality has changed little between After Effects 6.5 and CS3, so the "3D files from other applications" section of After Effects CS3 Help on the Web is still relevant for you.
Here's an excerpt:
"After Effects can import 3D-image files saved in Softimage PIC, RLA, RPF, and Electric Image EI format. These 3D-image files contain red, green, blue, and alpha (RGBA) channels, as well as auxiliary channels with optional information, such as z depth, object IDs, texture coordinates....
After Effects treats each composited 3D file from another application as a single 2D layer. That layer, as a whole, can be given 3D attributes and treated like any After Effects 3D layer, but the objects contained within that 3D file cannot be manipulated individually in 3D space. To access the 3D depth information and other auxiliary channel information in 3D image files, use the 3D Channel effects."
Also, you can import camera data and other data from 3D applications.
So, yes, you can import files from 3D applications; but, no, you can't import 3D models per se and do new camera moves on them, et cetera.
Yes, this is a common feature request. Yes, it is being very seriously considered.
P.S. Mylenium has a lot of information about working with After Effects and data from 3D applications on his website: http://mylenium.de/creation/motiondesign/ae_vault/ae_vault_main.html -
Data Pump import to a sql file error :ORA-31655 no data or metadata objects
Hello,
I'm using Data Pump to export/import data, one requirement is to import data to a sql file. The OS is window.
I made the following export:
expdp system/password directory=dpump_dir dumpfile=tablesdump.dmp content=DATA_ONLY tables=user.tablename
and it works, I can see the file TABLESDUMP.DMP in the directory path.
then when I tried to import it to a sql file:
impdp system/password directory=dpump_dir dumpfile=tablesdump.dmp sqlfile=tables_export.sql
the log show :
ORA-31655 no data or metadata objects selected for job
and the sql file is created empty in the directory path.
I'm not a DBA, I'm a Java developer. Can you help me?
Thks
Hi, I added the command line:
expdp system/system directory=dpump_dir dumpfile=tablesdump.dmp content=DATA_ONLY schemas=ko1 tables=KO1QT01 logfile=capture.log
the log on the console screen is below (it was in Spanish; translated here), and no log file was created in the directory path.
Export: Release 10.2.0.1.0 - Production on Tuesday, 26 January, 2010 12:59:14
Copyright (c) 2003, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Express Edition Release 10.2.0.1.0 - Production
UDE-00010: multiple job modes requested, schema and tables.
This is why I used tables=user.tablename instead, is this right ?
Thks -
10g to 11gR2 Upgrade using Data Pump Import
Hi,
I am intending to move a 10g database from one windows server to another. However there is also a requirement to upgrade this database to 11gR2. Therefore I was going to combine the 2 in one movement by -
1. take a full data pump export of the source 10g database
2. create a new empty 11g database on the target environment
3. import the dump file into the target database
However I have a couple of queries running over in my mind about this approach -
Q1. What happens with the SYSTEM and SYS objects from SYSTEM and SYSAUX during the import? Given that I will have in effect already created a new dictionary on the empty target database, will any import of SYSTEM or SYS simply produce error messages which should be ignored?
Q2. should I use EXCLUDE on SYS and SYSTEM ( is this EXCLUDE better on the export or import side ) ?
Q3. what happens if there are things such as scheduled jobs etc on the source system - since these would be stored in SYSTEM owned tables, how would I bring these across to the new target 11g database ?
thanks,
Jim
Jim Thompson wrote:
Hi,
I am intending to move a 10g database from one windows server to another. However there is also a requirement to upgrade this database to 11gR2. Therefore I was going to combine the 2 in one movement by -
1. take a full data pump export of the source 10g database
2. create a new empty 11g database on the target environment
3. import the dump file into the target database
However I have a couple of queries running over in my mind about this approach -
Q1. What happens with the SYSTEM and SYS objects from SYSTEM and SYSAUX during the import? Given that I will have in effect already created a new dictionary on the empty target database, will any import of SYSTEM or SYS simply produce error messages which should be ignored?
You won't get errors related to the SYSTEM or SYSAUX tablespaces because these won't be exported at all. Schemas like SYS, CTXSYS, MDSYS and ORDSYS are never exported using Data Pump. That's why Oracle recommends not creating any objects under the SYS or SYSTEM schemas.
Q2. should I use EXCLUDE on SYS and SYSTEM ( is this EXCLUDE better on the export or import side ) ?
Not required; as the data dictionary schemas won't be exported, specifying EXCLUDE won't do anything.
Q3. what happens if there are things such as scheduled jobs etc on the source system - since these would be stored in SYSTEM owned tables, how would I bring these across to the new target 11g database ?
The DDL will get exported and imported into the new database. For example, if you have schema A and you had defined a job whose owner is A, then this information also gets imported. For user-defined jobs there are views like USER_SCHEDULER_JOBS etc., so don't worry about the user jobs; these will be created.
Also see
MOS note - Schema's CTXSYS, MDSYS and ORDSYS are Not Exported [ID 228482.1]
Export system or sys schema
I also ran the following test in my DB, which shows I cannot export the objects under the SYS schema:
SQL> show user
USER is "SYS"
SQL> create table pump (id number);
Table created.
SQL> insert into pump values (1);
1 row created.
SQL> insert into pump values (2);
1 row created.
SQL> exit
Disconnected from Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
C:\Documents and Settings\rnagi\My Documents\Ranjit Doc\Performance\SQL>expdp tables=sys.pump logfile=test.log
Export: Release 11.2.0.1.0 - Production on Mon Feb 27 18:11:29 2012
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Username: / as sysdba
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "SYS"."SYS_EXPORT_TABLE_01": /******** AS SYSDBA tables=sys.pump logfile=test.log
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 0 KB
ORA-39166: Object SYS.PUMP was not found.
ORA-31655: no data or metadata objects selected for job
Job "SYS"."SYS_EXPORT_TABLE_01" completed with 2 error(s) at 18:11:57 -
Hi All,
I am getting the errors below when trying to use Data Pump to import one table from a dump file (from a pre-prod DB) into another database (prod_db) of the same version. I used expdp to generate the dump file.
Gettings errors -
ORA-39083
ORA-00959
ORA-39112
Any suggestions and/or advice would be appreciated.
Thank you
Import: Release 10.2.0.1.0 - 64bit Production
Copyright (c) 2003, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bit Production
With the Partitioning, OLAP and Data Mining options
Master table "SYSTEM"."SYS_IMPORT_TABLE_01" successfully loaded/unloaded
Starting "SYSTEM"."SYS_IMPORT_TABLE_01": system/******** DIRECTORY=DATA_PUMP_DIR DUMPFILE=EXP_Mxxxx_1203.DMP LOGFILE=Mxxxx_CLAIMS.LOG TABLES=CLAIMS
Processing object type SCHEMA_EXPORT/TABLE/TABLE
ORA-39083: Object type TABLE failed to create with error:
ORA-00959: tablespace 'OXFORD_DATA_01' does not exist
Failing sql is:
CREATE TABLE "Mxxxx"."CLAIMS" ("CLAIM_NUM" VARCHAR2(25) NOT NULL ENABLE, "LINE_NUM" NUMBER(3,0) NOT NULL ENABLE, "PAID_FLAG" CHAR(1), "ADJ_FLAG" CHAR(1), "CLAIM_TYPE" VARCHAR2(20), "MEM_ID" VARCHAR2(20), "MEM_SUF" VARCHAR2(20), "MEM_BEG_DATE" DATE, "MEM_REL" CHAR(2), "MEM_NAME" VARCHAR2(40), "MEM_DOB" DATE, "MEM_SEX" CHAR(1), "REC_DATE" DATE, "PAID_DATE" DATE, "FROM_DATE" DATE, "
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
ORA-39112: Dependent object type INDEX:"Mxxxx"."CLAIMS_IDX1" skipped, base object type TABLE:"Mxxxx"."CLAIMS" creation failed
ORA-39112: Dependent object type INDEX:"Mxxxx"."CLAIMS_PK" skipped, base object type TABLE:"Mxxxx"."CLAIMS" creation failed
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"Mxxxx"."CLAIMS_IDX1" creation failed
ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"Mxxxx"."CLAIMS_PK" creation failed
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
ORA-39112: Dependent object type TABLE_STATISTICS skipped, base object type TABLE:"Mxxxx"."CLAIMS" creation failed
Job "SYSTEM"."SYS_IMPORT_TABLE_01" completed with 6 error(s) at 13:49:33
impdp username/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=EXP_Mxxxx_1203.DMP LOGFILE=Mxxxx_CLAIMS.LOG TABLES=CLAIMS
Ok,
Thank you for the advice-
So I did use the REMAP_TABLESPACE option and it did import and load the file, but should I be concerned about the 4 errors? Looks like I have to rebuild the indexes, which is fine.
Master table "Mxxxx"."SYS_IMPORT_TABLE_01" successfully loaded/unloaded
Starting "Mxxxx"."SYS_IMPORT_TABLE_01": Mxxxx/******** DIRECTORY=DATA_PUMP_DIR DUMPFILE=EXP_Mxxxx_1203.DMP LOGFILE=Mxxxx_CLAIMS.LOG TABLES=CLAIMS REMAP_TABLESPACE=OXFORD_DATA_01:Mxxxx_DATA_01
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
. . imported "Mxxxx"."CLAIMS" 605.3 MB 1715554 rows
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
ORA-39083: Object type INDEX failed to create with error:
ORA-00959: tablespace 'USER_INDEX_01' does not exist
Failing sql is:
CREATE INDEX "Mxxxx"."CLAIMS_IDX1" ON "Mxxxx"."CLAIMS" ("MEM_ID") PCTFREE 10 INITRANS 2 MAXTRANS 255 NOLOGGING STORAGE(INITIAL 8388608 NEXT 8388608 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT) TABLESPACE "USER_INDEX_01" PARALLEL 1
ORA-39083: Object type INDEX failed to create with error:
ORA-00959: tablespace 'USER_INDEX_01' does not exist
Failing sql is:
CREATE UNIQUE INDEX "Mxxxx"."CLAIMS_PK" ON "Mxxxx"."CLAIMS" ("CLAIM_NUM", "LINE_NUM") PCTFREE 10 INITRANS 2 MAXTRANS 255 NOLOGGING STORAGE(INITIAL 8388608 NEXT 8388608 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT) TABLESPACE "USER_INDEX_01" PARALLEL 1
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"Mxxxx"."CLAIMS_IDX1" creation failed
ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"Mxxxx"."CLAIMS_PK" creation failed
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Job "Mxxxx"."SYS_IMPORT_TABLE_01" completed with 4 error(s) at 16:57:12 -
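For the record, the remaining index errors could presumably have been avoided by remapping the index tablespace in the same run. A sketch (Mxxxx_INDEX_01 is a hypothetical target tablespace name; command wrapped here for readability):

```
impdp username/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=EXP_Mxxxx_1203.DMP
      LOGFILE=Mxxxx_CLAIMS.LOG TABLES=CLAIMS
      REMAP_TABLESPACE=OXFORD_DATA_01:Mxxxx_DATA_01
      REMAP_TABLESPACE=USER_INDEX_01:Mxxxx_INDEX_01
```

REMAP_TABLESPACE can be repeated, so every source tablespace referenced in the dump can be redirected in one job.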
Error while importing CIM file into target SLD
Hi
We are in the process of moving our XI objects to QA.
As a first step , we are preparing the QA SLD ( we have a separate SLD instance for XI DEV and QA environments ).
We have our standard objects loaded into the QA SLD. Now for the custom objects :
1.I export the product and the corresponding SWCV from the source SLD .
2. I import the zip file into the target SLD in that order. While importing the product into the target, I get a warning -
The target namespace for the special import already contains data for one or more export lines. Continuing this import may corrupt the state of your data.
I am not sure what this means - should I go ahead and import my custom product and its SWCV (the next step) despite this warning? Or am I missing a step? I checked the weblog on the SLD preparation and did not see any additional steps other than export/import from the source/target SLDs.
Thank you for your time in advance.
Hi Karthik,
Check out this link
==>http://help.sap.com/saphelp_nw04s/helpdata/en/b2/0aae42e5adcd6ae10000000a155106/frameset.htm
Hope this explains all the steps in importing... :)
Regards,
Sundararamaprasad. -
Data pump import from 11g to 10g
I have 2 database: first is 11.2.0.2.0 and second is 10.2.0.1.0
In 10g i created database link on 11g
CREATE DATABASE LINK "TEST.LINK"
CONNECT TO "monservice" IDENTIFIED BY "monservice"
USING '(DESCRIPTION = (ADDRESS_LIST = (ADDRESS = (PROTOCOL = TCP)(HOST = host)(PORT = port))) (CONNECT_DATA = (SID = sid)))';
And execute this query for test dbLink which work fine:
select * from v$version@"TEST.LINK";
After it i try to call open function:
declare
h number;
begin
h := dbms_datapump.open('IMPORT', 'TABLE', 'TEST.LINK', null, '10.2');
end;
and get exception: 39001. 00000 - "invalid argument value"
if i remove 'TEST.LINK' from the arguments it works fine
Edited by: 990594 on 26.02.2013 23:41
Hi Richard Harrison,
result for import from 11g to 10g:
impdp user/pass@dburl schemas=SCHEMANAME network_link=TEST_LINK version=10.2
Connected to: Oracle Database 10g Express Edition Release 10.2.0.1.0 - Production
ORA-39001: invalid argument value
ORA-39169: Local version of 10.2.0.1.0 cannot work with remote version of 11.2.0.2.0
result for export from 11g to 10g:
expdp user/pass@dburl schemas=SCHEMANAME network_link=TEST_LINK version=10.2
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64 bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing option
ORA-39006: internal error
ORA-39065: unexpected master process exception in DISPATCH
ORA-04052: error occurred when looking up remote object SYS.KUPM$MCP@TEST_LINK
ORA-00604: error occurred at recursive SQL level 3
ORA-06544: PL/SQL: internal, error, arguments: [55916], [], [], [], [], [], [], []
ORA-06553: PLS-801: internal error [55916]
ORA-02063: preceding 2 lines from TEST_LINK
ORA-39097: Data Pump job encountered unexpected error -4052 -
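A common way around this version mismatch is to avoid the network link entirely: export on the 11.2 side with VERSION=10.2 into a dump file, transfer the file, and do a plain file-based import on the 10.2 side. A hedged sketch (the dump file name is invented; the DATA_PUMP_DIR directory object is assumed to exist on both servers):

```
rem On the 11.2 database: write a 10.2-compatible dump file
expdp user/pass@dburl schemas=SCHEMANAME version=10.2 directory=DATA_PUMP_DIR dumpfile=schemaname_v102.dmp

rem Copy schemaname_v102.dmp to the 10g server's directory path, then:
impdp user/pass schemas=SCHEMANAME directory=DATA_PUMP_DIR dumpfile=schemaname_v102.dmp
```

The ORA-39169 message in the first attempt states the underlying rule: the local 10.2.0.1.0 database cannot drive a network-mode job against a remote 11.2.0.2.0 database.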
Data pump import error with nested tables
So the problem is somewhat long :)
Actually the problems are two - why and how oracle are treating OO concept and why data pump doesn't work?
So scenario for the 1st one:
1) there is object type h1 and table of h1
2) there is unrelated object type row_text and table of row_text
3) there is object type h2 under h1 with attribute as table of row_text
4) there is table tab1 with column b with data type as table of h1. Of course column b is stored as nested table.
So how do you think - how many nested tables Oracle will create? The correct answer is 2. One explicitly defined and one hidden with system
generated name for the type h2 which is under type h1. So the question is WHY? Why if I create an instance of supertype Oracle tries to adapt
it for the subtype as well? Even more if I create another subtype h3 under h1 another hidden nested table appears.
This was the first part.
The second part is - if I do schema export and try to import it in another schema I got error saying that oracle failed to create storage table for
nested table column b. So the second question is - if Oracle has created such a mess with hidden nested tables how to import/export to another
schema?
Ok and here is test case to demonstrate problems above:
-- creating type h1 and table of it
SQL> create or replace type h1 as object (a number)
2 not final;
3 /
Type created.
SQL> create or replace type tbl_h1 as table of h1;
2 /
Type created.
-- creating type row_text and table of it
SQL> create or replace type row_text as object (
2 txt varchar2(100))
3 not final;
4 /
Type created.
SQL> create or replace type tbl_row_text as table of row_text;
2 /
Type created.
-- creating type h2 as subtype of h1
SQL> create or replace type h2 under h1 (some_texts tbl_row_text);
2 /
Type created.
SQL> create table tab1 (a number, b tbl_h1)
2 nested table b
3 store as tab1_nested;
Table created.
-- so we have 2 nested tables now
SQL> select table_name, parent_table_name, parent_table_column
2 from user_nested_tables;
TABLE_NAME PARENT_TABLE_NAME
PARENT_TABLE_COLUMN
SYSNTfsl/+pzu3+jgQAB/AQB27g== TAB1_NESTED
TREAT(SYS_NC_ROWINFO$ AS "GINTS"."H2")."SOME_TEXTS"
TAB1_NESTED TAB1
B
-- another subtype of t1
SQL> create or replace type h3 under h1 (some_texts tbl_row_text);
2 /
Type created.
-- plus another nested table
SQL> select table_name, parent_table_name, parent_table_column
2 from user_nested_tables;
TABLE_NAME PARENT_TABLE_NAME
PARENT_TABLE_COLUMN
SYSNTfsl/+pzu3+jgQAB/AQB27g== TAB1_NESTED
TREAT(SYS_NC_ROWINFO$ AS "GINTS"."H2")."SOME_TEXTS"
SYSNTfsl/+pz03+jgQAB/AQB27g== TAB1_NESTED
TREAT(SYS_NC_ROWINFO$ AS "GINTS"."H3")."SOME_TEXTS"
TAB1_NESTED TAB1
B
SQL> desc "SYSNTfsl/+pzu3+jgQAB/AQB27g=="
Name Null? Type
TXT VARCHAR2(100)
OK, let it be, and now I'm trying to export and import in another schema:
[oracle@xxx]$ expdp gints/xxx@xxx directory=xxx dumpfile=gints.dmp logfile=gints.log
Export: Release 11.2.0.1.0 - Production on Thu Feb 4 22:32:48 2010
<irrelevant rows skipped>
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
. . exported "GINTS"."TAB1" 0 KB 0 rows
. . exported "GINTS"."SYSNTfsl/+pz03+jgQAB/AQB27g==" 0 KB 0 rows
. . exported "GINTS"."TAB1_NESTED" 0 KB 0 rows
. . exported "GINTS"."SYSNTfsl/+pzu3+jgQAB/AQB27g==" 0 KB 0 rows
Master table "GINTS"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
******************************************************************************
And now the import. In order to create the types, a transformation of OIDs is applied (transform=OID:n) and also remap_schema.
Although it fails to create the table.
[oracle@xxx]$ impdp gints1/xxx@xxx directory=xxx dumpfile=gints.dmp logfile=gints_imp.log remap_schema=gints:gints1 transform=OID:n
Import: Release 11.2.0.1.0 - Production on Thu Feb 4 22:41:48 2010
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Release 11.2.0.1.0 - Production
Master table "GINTS1"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
Starting "GINTS1"."SYS_IMPORT_FULL_01": gints1/********@xxx directory=xxx dumpfile=gints.dmp logfile=gints_imp.log remap_schema=gints:gints1 transform=OID:n
Processing object type SCHEMA_EXPORT/USER
ORA-31684: Object type USER:"GINTS1" already exists
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TYPE/TYPE_SPEC
Processing object type SCHEMA_EXPORT/TABLE/TABLE
ORA-39083: Object type TABLE:"GINTS1"."TAB1" failed to create with error:
ORA-02320: failure in creating storage table for nested table column B
ORA-00904: : invalid identifier
Failing sql is:
CREATE TABLE "GINTS1"."TAB1" ("A" NUMBER, "B" "GINTS1"."TBL_H1" ) SEGMENT CREATION IMMEDIATE PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT CELL_
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-01403: no data found
ORA-01403: no data found
Failing sql is:
DECLARE I_N VARCHAR2(60); I_O VARCHAR2(60); c DBMS_METADATA.T_VAR_COLL; df varchar2(21) := 'YYYY-MM-DD:HH24:MI:SS'; BEGIN DELETE FROM "SYS"."IMPDP_STATS"; c(1) := DBMS_METADATA.GET_STAT_COLNAME('GINTS1','TAB1_NESTED',NULL,'TREAT(SYS_NC_ROWINFO$ AS "GINTS"."H2")."SOME_TEXTS"',1); DBMS_METADATA.GET_STAT_INDNAME('GINTS1','TAB1_NESTED',c,1,i_o,i_n); INSERT INTO "
ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-01403: no data found
ORA-01403: no data found
Failing sql is:
DECLARE I_N VARCHAR2(60); I_O VARCHAR2(60); c DBMS_METADATA.T_VAR_COLL; df varchar2(21) := 'YYYY-MM-DD:HH24:MI:SS'; BEGIN DELETE FROM "SYS"."IMPDP_STATS"; c(1) := DBMS_METADATA.GET_STAT_COLNAME('GINTS1','TAB1_NESTED',NULL,'TREAT(SYS_NC_ROWINFO$ AS "GINTS"."H3")."SOME_TEXTS"',1); DBMS_METADATA.GET_STAT_INDNAME('GINTS1','TAB1_NESTED',c,1,i_o,i_n); INSERT INTO "
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Job "GINTS1"."SYS_IMPORT_FULL_01" completed with 4 error(s) at 22:41:52
So, any idea how to make export/import of such tables work?
TIA
Gints
Tom Kyte has said it repeatedly ... I will repeat it here for you.
The fact that Oracle allows you to build object tables is not an indication that you should.
Store your data relationally and build object_views on top of them.
http://www.morganslibrary.org/reference/object_views.html
If you model properly, and store properly, you don't have any issues. -
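To make the object-view suggestion concrete for the types in this thread, a minimal sketch (the relational table name h1_rel is invented; this illustrates the idea, not the full nested-table model):

```sql
-- Store the data relationally...
CREATE TABLE h1_rel (a NUMBER PRIMARY KEY);

-- ...and expose it as objects of type h1 through an object view.
CREATE OR REPLACE VIEW h1_ov OF h1
  WITH OBJECT IDENTIFIER (a)
  AS SELECT a FROM h1_rel;
```

The underlying relational table exports and imports cleanly with Data Pump, and the object view is just DDL that is re-created on the target.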
Re: 10g to 11gR2 Upgrade using Data Pump Import
Hi Srini,
I read the below documented in export/import.
"Grants on objects owned by the SYS schema are never exported."
If this is the case, how do we import the grants in target database as we see many invalids due to grants.
For example, I'm using expdp to export 11gr1 database from solaris and importing in 11gr2 database in RHEL.
In the target I see many invalids, and when I try to compile them manually it throws an "insufficient privileges" error, which is due to missing grants on the CTXSYS schema.
How do we go about it?
I branched this for you.
The answer is: not with Data Pump. Data Pump (expdp) and the old export utility (exp) do not support exporting grants on objects owned by the Oracle schemas. You would have to find these grants yourself and then reapply them on the target database.
Probably not the answer you wanted, but with Data Pump, anything that is currently owned by the Oracle schemas is not exportable.
Dean -
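One hedged way to "find these grants yourself" is to generate GRANT statements from the data dictionary on the source and replay the spooled output on the target. A sketch using CTXSYS as the example owner from this thread (adjust owners and grantees as needed):

```sql
-- Run on the source database; spool, then execute the output on the target.
SELECT 'GRANT ' || privilege || ' ON "' || owner || '"."' || table_name ||
       '" TO ' || grantee || ';'
FROM   dba_tab_privs
WHERE  owner = 'CTXSYS';
```

DBA_SYS_PRIVS and DBA_ROLE_PRIVS can be mined the same way if system privileges or roles are also missing.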
Error while Importing SYSTEM objects into 9.2.0.6
HI,
I got the following error while importing data into 9.2.0.6 DB .
It is related to a SYSTEM object.
. importing SYSTEM's objects into SYSTEM
IMP-00017: following statement failed with ORACLE error 2270:
"ALTER TABLE "DEF$_CALLDEST" ADD CONSTRAINT "DEF$_CALL_DESTINATION" FOREIGN "
"KEY ("DBLINK") REFERENCES "DEF$_DESTINATION" ("DBLINK") ENABLE NOVALIDATE"
IMP-00003: ORACLE error 2270 encountered
ORA-02270: no matching unique or primary key for this column-list
A constraint exists on a composite primary key on the parent table.
Can anybody help me out on this.
Regards,
Sumit Singh Chadha
Hi,
without going into why you are importing SYSTEM objects: judging from the message, the error you have is due to an export done without the clause CONSISTENT=Y (which preserves all parent/child relations) while replication was active (not quiesced).
Repeat the export operation using CONSISTENT=Y and stop replication activity.
Hope this helps
Max -
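The suggested fix as a command line, using the classic exp utility this 9.2-era import came from (credentials and file name are placeholders):

```
exp system/password FILE=full_consistent.dmp FULL=Y CONSISTENT=Y
```

CONSISTENT=Y makes the whole export read-consistent to a single point in time, so parent and child rows such as DEF$_DESTINATION and DEF$_CALLDEST stay in sync in the dump.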
I have exported the HR schema from one machine using the following code; the export succeeded, but when I tried to import the exported file on another machine it gave an error. One more thing: how do I use a package to import the file, since we exported by using the Data Pump package? Please help.
C:\Documents and Settings\remote>impdp hr/hr DIRECTORY=DATA_PUMP_DIR DUMPFILE=YAHOO.DMP full=y
With the Partitioning, OLAP, Data Mining and Real Application Testing options
ORA-31626: job does not exist
ORA-31637: cannot create job SYS_IMPORT_FULL_01 for user HR
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPV$FT_INT", line 663
ORA-39080: failed to create queues "KUPC$C_1_20090320121353" and "KUPC$S_1_20090320121353" for Data Pump job
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPC$QUE_INT", line 1665
ORA-01658: unable to create INITIAL extent for segment in tablespace SYSTEM
DECLARE
  handle NUMBER;
BEGIN
  handle := dbms_datapump.open(
    operation => 'EXPORT', job_mode => 'SCHEMA');
  dbms_datapump.add_file(
    handle    => handle,
    filename  => 'YAHOO.dmp',
    directory => 'DATA_PUMP_DIR',
    filetype  => 1);
  dbms_datapump.metadata_filter(
    handle => handle,
    name   => 'SCHEMA_EXPR',
    value  => 'IN(''HR'')');
  dbms_datapump.start_job(handle => handle);
  dbms_datapump.detach(handle => handle);
END;
<Moderator edit - deleted contents of MOS Doc 752374.1 - pl do not post such contents - it is a violation of your Support agreement - locking this thread>
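On the ORA-01658 at the bottom of that log: the Data Pump master table is created in the importing user's default tablespace, and here that is SYSTEM with no free space. Two usual remedies, sketched with an invented datafile path and sizes:

```sql
-- Either let the SYSTEM datafile grow...
ALTER DATABASE DATAFILE 'C:\ORACLE\ORADATA\ORCL\SYSTEM01.DBF'
  AUTOEXTEND ON NEXT 10M MAXSIZE 2G;

-- ...or (better) move HR's default tablespace off SYSTEM:
ALTER USER hr DEFAULT TABLESPACE users QUOTA UNLIMITED ON users;
```

After either change, the queue creation (ORA-39080) should succeed because the job's master table can be allocated.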
-
Error during data pump import with SQL developer
Hello,
I am trying to transfer data from one database to another through Data Pump via SQL Developer (the data volume is quite large), exporting several tables.
The table export works fine, but I encounter the following error when I import the file (I tried data only and data + DDL).
"Exception: ORA-39001: argument value invalid dbms_datapump.get_status(64...=
ORA-39001: argument value invalid
ORA-39000: ....
ORA-31619: ...
The file is in the right place, the data pump folder of the new database. The user is the same on both databases, and the database versions are the same.
Do you have any idea of the problem ?
Thanks
With the query SELECT * FROM v$version;
Environment source
Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
PL/SQL Release 11.2.0.1.0 - Production
"CORE 11.2.0.1.0 Production"
TNS for Linux: Version 11.2.0.1.0 - Production
NLSRTL Version 11.2.0.1.0 - Production
Environment target
Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
PL/SQL Release 11.2.0.1.0 - Production
"CORE 11.2.0.1.0 Production"
TNS for Linux: Version 11.2.0.1.0 - Production
NLSRTL Version 11.2.0.1.0 - Production -
Import design object into integration repository
Hi.
We are going to import "XI3_0_SAP_BASIS_6.40_SP09_01.tpz" into the Integration Repository. So we moved this file into "G:\usr\sap\DEV\SYS\global\xi\repository_server\import" and started the Integration Repository.
Then I clicked Tools -> Import design objects, but the system can't find any object.
How can I fix this ?
Regards, Arnold.
Hi,
Generally, the import of this component is done during installation itself; just check whether the installation has been completed. Check out Michal's FAQ blog for simple steps to verify that the installation completed properly. You can probably copy the tpz version of the SAP BASIS component again from the installation DVD and try to import it.
Btw, what kind of platform is XI running on - a Unix platform? Whatever it is, are you able to see the tpz file at its location? Check whether the folder the tpz file is in has all the permissions needed to pick up the file.
Regards,
Sushumna