Full database impdp
Hi,
When I try to import data using Data Pump, I get the error below.
OS : AIX 5
Oracle Version : Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - 64bit
Command used for EXPORT:
expdp admin/XXXXs DIRECTORY=DATA_PUMP_DIR dumpfile=XXXX_24_08_2010.DMP LOGFILE=XXX_24_08_2010.LOG schemas=XXXXX CONTENT=ALL
Command used for IMPORT:
impdp admin/XXXXs DIRECTORY=DATA_PUMP_DIR dumpfile=XXXX_24_08_2010.DMP LOGFILE=XXX_24_08_2010.LOG schemas=XXXXX
Processing object type SCHEMA_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
ORA-39083: Object type INDEX failed to create with error:
ORA-01452: cannot CREATE UNIQUE INDEX; duplicate keys found
Failing sql is:
CREATE UNIQUE INDEX "XXXX"."UNI_MODULE_DEF__ID" ON "XXXX"."T_MODULE_DEF" ("COMPONENT_ID", "MODULE_NAME") PCTFREE 10 INITRANS 2 MAXTRANS 255 STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT) TABLESPACE "USERS" PARALLEL 1
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
ORA-39083: Object type CONSTRAINT failed to create with error:
ORA-02299: cannot validate (XXXX.UNI_MODULE_DEF__ID) - duplicate keys found
Failing sql is:
ALTER TABLE "XXX"."T_MODULE_DEF" ADD CONSTRAINT "UNI_MODULE_DEF__ID" UNIQUE ("COMPONENT_ID", "MODULE_NAME") USING INDEX PCTFREE 10 INITRANS 2 MAXTRANS 255 STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT) TABLESPACE "USERS" ENABLE
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"XXXX"."UNI_MODULE_DEF__ID" creation failed
Please help me to resolve the issue.
Thanks
Rangaraj
Dear Rangaraj,
ORA-01452: cannot CREATE UNIQUE INDEX; duplicate keys found
Cause: A CREATE UNIQUE INDEX statement specified one or more columns that currently contain duplicate values. All values in the indexed columns must be unique by row to create a UNIQUE INDEX.
Action: If the entries need not be unique, remove the keyword UNIQUE from the CREATE INDEX statement and re-execute it. If the entries must be unique, as in a primary key, remove the duplicate values before creating the UNIQUE index.
There are duplicate values in the target table that the import accessed while creating the unique index. In any case, I don't understand what you are trying to achieve. Are you trying to import the export into the same schema?
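As a hedged illustration (using the masked schema and column names from the failing SQL above), the duplicates can be located and removed before re-running the import:

```sql
-- Find the key values that occur more than once
SELECT component_id, module_name, COUNT(*)
  FROM xxxx.t_module_def
 GROUP BY component_id, module_name
HAVING COUNT(*) > 1;

-- Keep one row per key and delete the rest
-- (review the duplicates first; this permanently removes data)
DELETE FROM xxxx.t_module_def t
 WHERE t.rowid NOT IN (SELECT MIN(rowid)
                         FROM xxxx.t_module_def
                        GROUP BY component_id, module_name);
```

After cleaning up, the index creation from the dump can be retried, e.g. with impdp and INCLUDE=INDEX against the same dump file.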
Regards.
Ogan
Similar Messages
-
Full database exp/imp between RAC and single database
Hi Experts,
We have a 4-node Oracle 10gR2 RAC database on Linux. I am trying to duplicate the RAC database into a single-instance Windows database.
Both databases are the same version. During the import, I had to create 4 undo tablespaces to keep imp running.
How can I keep just one undo tablespace in the single-instance database?
Does anyone have experience with exp/imp of a RAC database into a single-instance database to share?
Thanks
Jim
Edited by: user589812 on Nov 13, 2009 10:35 AM
Jim,
I also want to know: can we add exclude=tablespace to the impdp command for a full database exp/imp?
You can't use exclude=tablespace with exp/imp. It is for Data Pump expdp/impdp only.
I am very interested in your recommendation.
But for a full database impdp, how do I exclude a table during the full import? May I have an example for this case?
I used expdp for a full database export, but I got an error in the expdp log:
ORA-31679: Table data object "SALE"."TOAD_PLAN_TABLE" has long columns, and longs can not be loaded/unloaded using a network link
Having LONG columns in a table means that it can't be exported/imported over a network link. To exclude this table, you can use the exclude expression:
expdp user/password exclude=TABLE:"= 'SALES'" ...
This will exclude all tables named SALES. If you have that table in schema SCOTT and also in schema BLAKE, it will exclude both of them. The error you are getting is not fatal, but that table will not be exported/imported.
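For the earlier question about excluding one specific table during a full import, a minimal sketch (directory, dump file, and table name are illustrative, and the filter usually needs shell escaping or a parameter file):

```
impdp user/password FULL=y DIRECTORY=DATA_PUMP_DIR DUMPFILE=full.dmp \
      LOGFILE=imp_full.log EXCLUDE=TABLE:\"IN ('TOAD_PLAN_TABLE')\"
```

Putting the EXCLUDE clause in a parameter file (PARFILE=...) avoids most operating-system quoting problems.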
The final messages were:
Master table "SYSTEM"."SYS_EXPORT_FULL_01" successfully loaded/unloaded
Dump file set for SYSTEM.SYS_EXPORT_FULL_01 is:
F:\ORACLEBACKUP\SALEFULL091113.DMP
Job "SYSTEM"."SYS_EXPORT_FULL_01" completed with 1 error(s) at 16:50:26
Yes, the fact that it did not export one table does not make the job fail; it continues exporting all other objects.
I dropped the database that generated the expdp dump file,
then recreated a blank database and ran impdp again.
But I got lots of errors, such as:
ORA-39151: Table "SYSMAN"."MGMT_ARU_OUI_COMPONENTS" exists. All dependent metadata and data will be skipped due to table_exists_action of skip
ORA-39151: Table "SYSMAN"."MGMT_BUG_ADVISORY" exists. All dependent metadata and data will be skipped due to table_exists_action of skip
......ORA-31684: Object type TYPE_BODY:"SYSMAN"."MGMT_THRESHOLD" already exists
ORA-39111: Dependent object type TRIGGER:"SYSMAN"."SEV_ANNOTATION_INSERT_TR" skipped, base object type VIEW:"SYSMAN"."MGMT_SEVERITY_ANNOTATION" >already exists
and the last line was:
Job "SYSTEM"."SYS_IMPORT_FULL_01" completed with 2581 error(s) at 11:54:57
Yes, even though you think you have an empty database, if you have installed any apps or anything, they may create tables that also exist in your dump file. If you know that you want the tables from the dump file and not the existing ones in the database, then you can use this on the impdp command:
impdp user/password table_exists_action=replace ...
If a table being imported already exists, Data Pump will detect this, drop the table, and then re-create it. All of its dependent objects will then be created. If you don't, the table and all of its dependent objects will be skipped (which is the default).
There are four options for table_exists_action:
replace - described above
skip - the default; skips the table and dependent objects such as indexes, index statistics, table statistics, etc.
append - keeps the existing table and appends the data to it, but skips dependent objects
truncate - truncates the existing table and adds the data from the dump file, but skips dependent objects
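The four options can be sketched as impdp invocations (credentials and dump file name are placeholders):

```
impdp user/password DUMPFILE=full.dmp TABLE_EXISTS_ACTION=skip      # default
impdp user/password DUMPFILE=full.dmp TABLE_EXISTS_ACTION=append    # keep table, add data
impdp user/password DUMPFILE=full.dmp TABLE_EXISTS_ACTION=truncate  # empty table, then load
impdp user/password DUMPFILE=full.dmp TABLE_EXISTS_ACTION=replace   # drop and re-create
```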
Hope this helps.
Dean -
Full database import(impdp) error
Hi all,
Please help.
My full database import is failing due to:
ORA-39126: Worker unexpected fatal error in KUPW$WORKER.LOAD_METADATA [TABLE_DATA:"SYS"."SYS_IMPORT_FULL_04"]
SELECT process_order, flags, xml_clob, NVL(dump_fileid, :1), NVL(dump_position, :2), dump_length, dump_allocation, grantor, object_row, object_schema, object_long_name, processing_status, processing_state, base_object_type, base_object_schema, base_object_name, property, size_estimate, in_progress FROM "SYS"."SYS_IMPORT_FULL_04" WHERE process_order between :3 AND :4 AND processing_state <> :5 AND duplicate = 0 ORDER BY process_order
ORA-25153: Temporary Tablespace is Empty
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 105
ORA-06512: at "SYS.KUPW$WORKER", line 6272
----- PL/SQL Call Stack -----
object line object
handle number name
c000000267348288 14916 package body SYS.KUPW$WORKER
c000000267348288 6293 package body SYS.KUPW$WORKER
c000000267348288 3511 package body SYS.KUPW$WORKER
c000000267348288 6882 package body SYS.KUPW$WORKER
c000000267348288 1259 package body SYS.KUPW$WORKER
c00000026fb2de68 2 anonymous block
Job "SYS"."SYS_IMPORT_FULL_04" stopped due to fatal error at 14:56:10
Regards
Hi,
Yes, the temporary tablespace is present and contains 2 GB of free space, with 2 datafiles.
Are you sure it's a datafile? It must be a tempfile. :)
Secondly, have you checked the default temporary tablespace of the database? Is it the one you mentioned?
SELECT PROPERTY_NAME,PROPERTY_VALUE FROM DATABASE_PROPERTIES where PROPERTY_NAME='DEFAULT_TEMP_TABLESPACE';
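If the default temporary tablespace turns out to have no tempfile (which is literally what ORA-25153 reports), one can be added; a hedged sketch with an example path and sizes:

```sql
-- Add a tempfile to an empty temporary tablespace (path/sizes are examples)
ALTER TABLESPACE temp
  ADD TEMPFILE '/u01/oradata/mydb/temp01.dbf'
  SIZE 2G AUTOEXTEND ON NEXT 256M MAXSIZE 8G;

-- Verify: tempfiles are listed in DBA_TEMP_FILES, not DBA_DATA_FILES
SELECT tablespace_name, file_name, bytes/1024/1024 AS mb
  FROM dba_temp_files;
```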
Anand -
Full database impdp - is the export user required?
Hi All,
I have taken an expdp backup of the full database using the "indiamart" user, which has the SYS privilege.
Now on the other machine I have created a database using DBCA.
Now I want to import that backup here; do I need to create the "indiamart" user here as well?
Please guide me on the steps I need to follow.
Thanks
Nimish Garg
You need a user with the DATAPUMP_IMP_FULL_DATABASE role granted. Neither the same user name nor the DBA (or even SYSDBA) privilege is required.
http://download.oracle.com/docs/cd/E11882_01/server.112/e10701/dp_import.htm#i1011935
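A minimal sketch of preparing such a user on the target database (user name and password are examples; on older releases the role may be named IMP_FULL_DATABASE instead):

```sql
CREATE USER imp_admin IDENTIFIED BY some_password;
GRANT CREATE SESSION TO imp_admin;
GRANT DATAPUMP_IMP_FULL_DATABASE TO imp_admin;
-- The importing user also needs access to the directory object
GRANT READ, WRITE ON DIRECTORY data_pump_dir TO imp_admin;
```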
Werner -
Hi
I wanted to refresh my dev environment quickly.
As we also wanted to migrate from exp/imp to Data Pump, I tried the following.
The only thing that worked for me was impdp in network mode using remap_schema.
1) First I wanted to implement standard expdp and impdp procedures (full database export/import).
My expdp failed with the following error on a full export.
Please see below:
C:\Documents and Settings\oracle2>expdp system/<p.w> full=Y directory=DATA_PUMP_DIR dumpfile=OAPFULL.dmp logfile=OAPFULL.log
Export: Release 11.1.0.7.0 - Production on Thursday, 26 April, 2012 16:12:49
Copyright (c) 2003, 2007, Oracle. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "SYSTEM"."SYS_EXPORT_FULL_02": system/******** full=Y directory=DATA_PUMP_DIR dumpfile=OAPFULL.dmp logfile=OAPFULL.log
Estimate in progress using BLOCKS method...
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 4.351 GB
Processing object type DATABASE_EXPORT/TABLESPACE
Processing object type DATABASE_EXPORT/PASSWORD_VERIFY_FUNCTION
Processing object type DATABASE_EXPORT/PROFILE
Processing object type DATABASE_EXPORT/SYS_USER/USER
Processing object type DATABASE_EXPORT/SCHEMA/USER
Processing object type DATABASE_EXPORT/ROLE
Processing object type DATABASE_EXPORT/GRANT/SYSTEM_GRANT/PROC_SYSTEM_GRANT
Processing object type DATABASE_EXPORT/SCHEMA/GRANT/SYSTEM_GRANT
Processing object type DATABASE_EXPORT/SCHEMA/ROLE_GRANT
Processing object type DATABASE_EXPORT/SCHEMA/DEFAULT_ROLE
Processing object type DATABASE_EXPORT/SCHEMA/TABLESPACE_QUOTA
Processing object type DATABASE_EXPORT/RESOURCE_COST
Processing object type DATABASE_EXPORT/SCHEMA/DB_LINK
Processing object type DATABASE_EXPORT/TRUSTED_DB_LINK
Processing object type DATABASE_EXPORT/SCHEMA/SEQUENCE/SEQUENCE
Processing object type DATABASE_EXPORT/SCHEMA/SEQUENCE/GRANT/OWNER_GRANT/OBJECT_GRANT
Processing object type DATABASE_EXPORT/DIRECTORY/DIRECTORY
Processing object type DATABASE_EXPORT/DIRECTORY/GRANT/OWNER_GRANT/OBJECT_GRANT
Processing object type DATABASE_EXPORT/CONTEXT
Processing object type DATABASE_EXPORT/SCHEMA/PUBLIC_SYNONYM/SYNONYM
Processing object type DATABASE_EXPORT/SCHEMA/SYNONYM
ORA-39014: One or more workers have prematurely exited.
ORA-39029: worker 1 with process name "DW01" prematurely terminated
ORA-31672: Worker process DW01 died unexpectedly.
Job "SYSTEM"."SYS_EXPORT_FULL_02" stopped due to fatal error at 16:14:56
2) As the above failed, I tried impdp with a network_link import and full=y, and it's a different issue!
impdp system/<p.w> NETWORK_LINK=OAPLIVE.WORLD full=y logfile=OAPDRfull2.log table_exists_action=REPLACE
Import: Release 11.1.0.7.0 - Production on Thursday, 26 April, 2012 13:36:01
Copyright (c) 2003, 2007, Oracle. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "SYSTEM"."SYS_IMPORT_FULL_01": system/******** NETWORK_LINK=OAPLIVE.WORLD full=y logfile=OAPDRfull2.log table_exists_action=REPLACE
Estimate in progress using BLOCKS method...
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 4.670 GB
Processing object type DATABASE_EXPORT/TABLESPACE
ORA-31684: Object type TABLESPACE:"SYSAUX" already exists
(... many more "object already exists" errors here; omitted due to space limits ...)
ORA-31684: Object type SYNONYM:"GEOPROD"."EXT_POSTAIM_PRESORT_61" already exists
ORA-39126: Worker unexpected fatal error in KUPW$WORKER.DISPATCH_WORK_ITEMS [SYNONYM:"GEOPROD"."EXT_POSTAIM_PRESORT_61"]
ORA-31600: invalid input value 200001 for parameter HANDLE in function CLOSE
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 105
ORA-06512: at "SYS.DBMS_METADATA", line 569
ORA-06512: at "SYS.DBMS_METADATA", line 4731
ORA-06512: at "SYS.DBMS_METADATA", line 792
ORA-06512: at "SYS.DBMS_METADATA", line 4732
ORA-06512: at "SYS.KUPW$WORKER", line 2718
ORA-03113: end-of-file on communication channel
ORA-02055: distributed update operation failed; rollback required
ORA-02063: preceding lines from OAPLIVE.WORLD
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPW$WORKER", line 7858
----- PL/SQL Call Stack -----
object line object
handle number name
242F2F2C 18256 package body SYS.KUPW$WORKER
242F2F2C 7885 package body SYS.KUPW$WORKER
242F2F2C 8657 package body SYS.KUPW$WORKER
242F2F2C 1545 package body SYS.KUPW$WORKER
241DDF3C 2 anonymous block
ORA-39126: Worker unexpected fatal error in KUPW$WORKER.UNLOAD_METADATA []
ORA-31642: the following SQL statement fails:
SELECT unique ku$.seq# from sys.metanametrans$ ku$ WHERE ku$.htype='DATABASE_EXPORT' AND ku$.model='ORACLE' AND NOT ( ku$.seq#>=(select a.seq# from sys.metanametrans$ a where
a.model='ORACLE' and a.htype='DATABASE_EXPORT' and a.name ='DATABASE_EXPORT/SCHEMA/SYNONYM')) order by ku$.seq#
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 86
ORA-06512: at "SYS.DBMS_METADATA_INT", line 5002
ORA-01427: single-row subquery returns more than one row
ORA-06512: at "SYS.DBMS_
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 86
ORA-06512: at "SYS.KUPW$WORKER", line 7853
----- PL/SQL Call Stack -----
object line object
handle number name
242F2F2C 18256 package body SYS.KUPW$WORKER
242F2F2C 7885 package body SYS.KUPW$WORKER
242F2F2C 2744 package body SYS.KUPW$WORKER
242F2F2C 8523 package body SYS.KUPW$WORKER
242F2F2C 1545 package body SYS.KUPW$WORKER
241DDF3C 2 anonymous block
Job "SYSTEM"."SYS_IMPORT_FULL_01" stopped due to fatal error at 13:39:48
So basically it is these two issues I want to know how to fix:
1 - expdp error: cause and fix
2 - impdp error: cause and fix. Also, how do I avoid the "object already exists" errors?
Also, for example, a package with the same name: I want the new package from LIVE, so if a package, view, etc. with the same name is already there, would it not get updated?
Is there any way to overcome this?
I need it exactly the same as LIVE (with a few exceptions small enough that I can handle them after impdp finishes).
Please help!!
Thanks & Regards
Hi,
Thanks for the links. I applied the tips from each of them, but it didn't work.
Also, my database is 11g, so it is not true that this happens only on 10g.
Things tried:
1) I tried different values for the parallel parameter, but got the same error.
2) I applied the following:
alter system set open_cursors=1024 scope=spfile;
alter system set "_optimizer_cost_based_transformation"=off;
commit;
The 3rd link was a bit better.
I tried to find out exactly where the error was caused using:
expdp attach=SYS_EXPORT_FULL_03
But I can't figure out what the object PUBLIC oracle/context/isearch/GetPage is.
Does this need to be excluded from the export? If so, how?
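If excluding it is the route taken, the valid filter paths can first be checked on the source database; the query below uses the DATABASE_EXPORT_OBJECTS view, while the exact EXCLUDE path to use is an assumption that should be verified:

```sql
-- List the EXCLUDE/INCLUDE object paths that mention synonyms
SELECT object_path, comments
  FROM database_export_objects
 WHERE object_path LIKE '%SYNONYM%';
```

With a confirmed path, something like EXCLUDE=PUBLIC_SYNONYM/SYNONYM could then be passed to expdp, though whether skipping the object actually avoids the worker crash is untested here.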
Can someone help me fix the error now?
Processing object type DATABASE_EXPORT/SCHEMA/SYNONYM
ORA-39014: One or more workers have prematurely exited.
ORA-39029: worker 1 with process name "DW01" prematurely terminated
ORA-31672: Worker process DW01 died unexpectedly.
Job "SYSTEM"."SYS_EXPORT_FULL_03" stopped due to fatal error at 11:29:32
C:\Documents and Settings\ora2>expdp attach=SYS_EXPORT_FULL_03
Export: Release 11.1.0.7.0 - Production on Tuesday, 01 May, 2012 11:35:38
Copyright (c) 2003, 2007, Oracle. All rights reserved.
Username: system
Password:
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Job: SYS_EXPORT_FULL_03
Owner: SYSTEM
Operation: EXPORT
Creator Privs: TRUE
GUID: 8499C802F52A414A8BCACE552DDF6F11
Start Time: Tuesday, 01 May, 2012 11:37:56
Mode: FULL
Instance: geooap
Max Parallelism: 1
EXPORT Job Parameters:
Parameter Name Parameter Value:
CLIENT_COMMAND system/******** parfile=h:\datapump\oapfull.par
State: IDLING
Bytes Processed: 0
Current Parallelism: 1
Job Error Count: 0
Dump File: H:\datapump\oapfull.dmp
bytes written: 4,096
Worker 1 Status:
Process Name: DW01
State: UNDEFINED
Object Schema: PUBLIC
Object Name: oracle/context/isearch/GetPage
Object Type: DATABASE_EXPORT/SCHEMA/PUBLIC_SYNONYM/SYNONYM
Completed Objects: 766
Total Objects: 766
Worker Parallelism: 1 -
Export/Import full database dump fails to recreate the workspaces
Hi,
We are studying the possibility of using Oracle Workspace Manager to maintain multiple versions of our business data. I recently ran some tests with the import/export of a dump that contains multiple workspaces. I'm not able to import the dump successfully because it doesn't recreate the workspaces on the target database instance, which is a freshly created, empty Oracle database. Before launching the import with the Oracle Data Pump utility, the "LIVE" workspace exists in the WMSYS.WM$WORKSPACES_TABLE table. After the import is done, there is nothing in that table...
The versions of the Oracle database and Oracle Workspace Manager are the same on source and target database:
Database version (from the V$VERSION view):
Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bi
PL/SQL Release 10.2.0.4.0 - Production
CORE 10.2.0.4.0 Production
TNS for 64-bit Windows: Version 10.2.0.4.0 - Production
NLSRTL Version 10.2.0.4.0 - Production
Workspace Manager version (from the WM_INSTALLATION view):
OWM_VERSION: 10.2.0.4.3
In order to recreate the tablespaces successfully during the full import of the dump, the directory structure for the tablespaces is kept the same on the target database's computer. I used the instructions given in this document, section "1.6 Import and Export Considerations", page 1-19:
http://www.wyswert.com/documentation/oracle/database/11.2/appdev.112/e11826.pdf
Considering that the Oracle database release used is 10.2.0.4, I use the following command to import the dump, since it doesn't contain the WMSYS schema:
impdp system/<password>@<database> DIRECTORY=data_pump_dir DUMPFILE=expfull.dmp FULL=y TABLE_EXISTS_ACTION=append LOGFILE=impfull.log
The only hints I have in the import log file are the following blocks of lines:
1st one:
==============================
Traitement du type d'objet DATABASE_EXPORT/SYSTEM_PROCOBJACT/PRE_SYSTEM_ACTIONS/PROCACT_SYSTEM
ORA-39083: Echec de la création du type d'objet PROCACT_SYSTEM avec erreur :
ORA-01843: ce n'est pas un mois valide
SQL en échec :
BEGIN
if (system.wm$_check_install) then
return ;
end if ;
begin execute immediate 'insert into wmsys.wm$workspaces_table
values(''LIVE'',
''194'',
''-1'',
2nd one at the end of the import process:
==============================
Traitement du type d'objet DATABASE_EXPORT/SCHEMA/POST_SCHEMA/PROCACT_SCHEMA
ORA-39083: Echec de la création du type d'objet PROCACT_SCHEMA avec erreur :
ORA-20000: Workspace Manager Not Properly Installed
SQL en échec :
BEGIN
declare
compile_exception EXCEPTION;
PRAGMA EXCEPTION_INIT(compile_exception, -24344);
begin
if (system.wm$_check_install) then
return ;
end if ;
execute immediate 'alter package wmsys.ltadm compile';
execute immediate 'alter packag
The target operating system is in French; here is a rough translation:
1st one:
==============================
Treatment of object type DATABASE_EXPORT/SYSTEM_PROCOBJACT/PRE_SYSTEM_ACTIONS/PROCACT_SYSTEM
ORA-39083: Object type creation failed PROCACT_SYSTEM with error :
ORA-01843: invalid month
failed SQL :
BEGIN
if (system.wm$_check_install) then
return ;
end if ;
begin execute immediate 'insert into wmsys.wm$workspaces_table
values(''LIVE'',
''194'',
''-1'',
2nd one at the end of the import process:
==============================
Treatment of object type DATABASE_EXPORT/SCHEMA/POST_SCHEMA/PROCACT_SCHEMA
ORA-39083: Object type creation failed PROCACT_SCHEMA with error :
ORA-20000: Workspace Manager Not Properly Installed
failed SQL :
BEGIN
declare
compile_exception EXCEPTION;
PRAGMA EXCEPTION_INIT(compile_exception, -24344);
begin
if (system.wm$_check_install) then
return ;
end if ;
execute immediate 'alter package wmsys.ltadm compile';
execute immediate 'alter packag
By the way, the source database's computer is Vista 64 in English and the destination computer is Vista 64 in French; do you think this has anything to do with these error messages? The parameters of the NLS_SESSION_PARAMETERS view are the same in the source and target databases...
Thanks in advance for your help!
Joel Autotte
Lead Developer
Edited by: 871007 on Jul 13, 2011 7:31 AM
I tried to import the full database dump with the "imp" command, and I got the same error message with more detail:
. Importing SYSTEM's objects into SYSTEM
IMP-00017: following statement failed with ORACLE error 1843:
"BEGIN "
"if (system.wm$_check_install) then"
" return ;"
" end if ;"
" begin execute immediate 'insert into wmsys.wm$workspaces_ta"
"ble"
" values(''LIVE'',"
" ''194'',"
" ''-1'',"
" ''SYS'',"
" to_date(''09-DEC-2009 00:00:00'', ''DD-MON-YYYY HH"
"24:MI:SS''),"
" ''0'',"
" ''UNLOCKED'',"
" ''0'',"
" ''0'',"
" ''37'',"
" ''CRS_ALLNONCR'',"
" to_date(''06-APR-2011 14:24:57'', ''DD-MON-YYYY HH"
"24:MI:SS''),"
" ''0'',"
" '''')'; end;"
"COMMIT; END;"
IMP-00003: ORACLE error 1843 encountered
ORA-01843: not a valid month
ORA-06512: at line 5
The call to the TO_DATE function uses a date-format string that is incompatible with the date format and language set in the NLS parameters of my database. Since the "imp" and "exp" commands work client-side (unlike "impdp" and "expdp"), I did a full export of the source database from the computer on which I am trying to import the dump, so the same language was exported into the dump, and I was able to import it successfully.
But I would like to use the Data Pump tools instead of the old import and export tools. How can I set the NLS parameters that "impdp" must use before launching the import of the dump?
The NLS parameters are set to the "AMERICAN" settings in both the source and destination databases when I check this in the "NLS_DATABASE_PARAMETERS" view. The dump I made with "expdp" on the source database exported the data with the "AMERICAN" settings as well...
On the destination database, the computer language is French, and I guess the Data Pump import tool takes this into account, since it writes the log in the language of the computer and attempts a TO_DATE with the wrong date format because it works with the "CANADIAN FRENCH" settings...
Any suggestions? I'll continue to read on my side and re-post if I find the solution.
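One commonly suggested direction (a sketch to verify, not a confirmed fix) is to pin the client NLS environment to AMERICAN before launching impdp, since the client environment influences the session from which the job is started:

```
REM Windows example; the character set in NLS_LANG is a placeholder
set NLS_LANG=AMERICAN_AMERICA.WE8MSWIN1252
set NLS_DATE_FORMAT=DD-MON-YYYY HH24:MI:SS
impdp system/<password>@<database> DIRECTORY=data_pump_dir DUMPFILE=expfull.dmp FULL=y TABLE_EXISTS_ACTION=append LOGFILE=impfull.log
```

Whether the server-side Data Pump workers honor this for the Workspace Manager TO_DATE calls would need to be tested.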
Thanks!
Edited by: 871007 on Jul 19, 2011 8:42 AM
Edited by: 871007 on Jul 19, 2011 8:43 AM -
[Q] ORA-30510 error while doing full database import
We have Oracle 9.2.0.6 on Red Hat Linux AS 3.4. While doing a full database import, ORA-30510 comes up. I searched Metalink and Google but did not find much information. The error messages on import are:
. importing OUTLN's objects into OUTLN
. importing SYSTEM's objects into SYSTEM
. importing TKOWNER's objects into TKOWNER
IMP-00017: following statement failed with ORACLE error 30510:
"CREATE TRIGGER "TKOWNER".TRIG_DONOTANALYZE before analyze on schema decla"
"re begin if sys.dictionary_obj_name = 'MYWTKEMPLOYEE' then raise_ap"
"plication_error(-20101, 'Do Not ANALYZE MYWTKEMPLOYEE'); end if; end; "
IMP-00003: ORACLE error 30510 encountered
ORA-30510: system triggers cannot be defined on the schema of SYS user
. importing SYS's objects into SYS
. importing TKOWNER's objects into TKOWNER
About to enable constraints...
Import terminated successfully with warnings.
877410 wrote:
Hi, I get the following errors while importing a schema from prod to the dev database... can anyone help? Thanks!
impdp system DIRECTORY=DATA_PUMP_DIR DUMPFILE=abcdprod.DMP LOGFILE=abcdprod.log REMAP_SCHEMA=abcdprod:abcddev
ORA-39002: invalid operation
ORA-31694: master table "SYSTEM"."SYS_IMPORT_FULL_01" failed to load/unload
ORA-31644: unable to position to block number 170452 in dump file "/ots/oracle/echo/wxyz/datapump/abcdprod.DMP
Post the complete command line for the expdp that made the abcdprod.DMP file -
Hi,
I am trying to import a full database using the impdp method.
impdp system/oracle dumpfile=datap:fulldb.dmp full=y logfile=datap:full.log
I am getting a fatal error message while importing.
===========================
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPW$WORKER", line 6273
----- PL/SQL Call Stack -----
object line object
handle number name
5e6414d4 14916 package body SYS.KUPW$WORKER
5e6414d4 6300 package body SYS.KUPW$WORKER
5e6414d4 12689 package body SYS.KUPW$WORKER
5e6414d4 11968 package body SYS.KUPW$WORKER
5e6414d4 3279 package body SYS.KUPW$WORKER
5e6414d4 6889 package body SYS.KUPW$WORKER
5e6414d4 1262 package body SYS.KUPW$WORKER
6408cfd0 2 anonymous block
Job "SYSTEM"."SYS_IMPORT_FULL_03" stopped due to fatal error at 13:19:32:
==================================
My question is: should we create all the tablespaces (other than the system tablespaces) that were available in the primary database manually on the target machine before executing the import command?
What are the prerequisites for importing a full export into a database? How do we make the target database ready to accept the import?
My Oracle version is 10.2.0 on Solaris 10.
Thanks in advance....
Edited by: user13364377 on 24-Jul-2010 14:33
Hi,
My DB version is 10.2.0.0 and the OS is Solaris 10.
I am pasting the contents of the alert log file.
==================================
kupprdp: master process DM00 started with pid=19, OS id=1328
to execute - SYS.KUPM$MCP.MAIN('SYS_IMPORT_FULL_01', 'SYSTEM', 'KUPC$C_1_20100724100422', 'KUPC$S_1_20100724100422', 0);
kupprdp: worker process DW01 started with worker id=1, pid=20, OS id=1330
to execute - SYS.KUPW$WORKER.MAIN('SYS_IMPORT_FULL_01', 'SYSTEM');
Sat Jul 24 10:04:28 2010
CREATE UNDO TABLESPACE "UNDOTBS1" DATAFILE '/oracle/product/dbn/undotbs01.dbf' SIZE 26214400 AUTOEXTEND ON NEXT 5242880 MAXSIZE 32767M BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE
ORA-1543 signalled during: CREATE UNDO TABLESPACE "UNDOTBS1" DATAFILE '/oracle/product/dbn/undotbs01.dbf' SIZE 26214400 AUTOEXTEND ON NEXT 5242880 MAXSIZE 32767M BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE...
Sat Jul 24 10:04:28 2010
CREATE TABLESPACE "SYSAUX" DATAFILE '/oracle/product/dbn/sysaux01.dbf' SIZE 125829120 AUTOEXTEND ON NEXT 10485760 MAXSIZE 32767M LOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO
ORA-1543 signalled during: CREATE TABLESPACE "SYSAUX" DATAFILE '/oracle/product/dbn/sysaux01.dbf' SIZE 125829120 AUTOEXTEND ON NEXT 10485760 MAXSIZE 32767M LOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO...
Sat Jul 24 10:04:28 2010
CREATE TEMPORARY TABLESPACE "TEMP" TEMPFILE '/oracle/product/dbn/temp01.dbf' SIZE 31457280 AUTOEXTEND ON NEXT 655360 MAXSIZE 32767M EXTENT MANAGEMENT LOCAL UNIFORM SIZE 1048576
ORA-1543 signalled during: CREATE TEMPORARY TABLESPACE "TEMP" TEMPFILE '/oracle/product/dbn/temp01.dbf' SIZE 31457280 AUTOEXTEND ON NEXT 655360 MAXSIZE 32767M EXTENT MANAGEMENT LOCAL UNIFORM SIZE 1048576
Sat Jul 24 10:04:28 2010
CREATE TABLESPACE "USERTB" DATAFILE '/oracle/product/dbn/users01.dbf' SIZE 5242880 AUTOEXTEND ON NEXT 1310720 MAXSIZE 32767M LOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO
ORA-1119 signalled during: CREATE TABLESPACE "USERTB" DATAFILE '/oracle/product/dbn/users01.dbf' SIZE 5242880 AUTOEXTEND ON NEXT 1310720 MAXSIZE 32767M LOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO...
Sat Jul 24 10:04:28 2010
CREATE TABLESPACE "EXAMPLE" DATAFILE '/oracle/product/dbn/example01.dbf' SIZE 104857600 AUTOEXTEND ON NEXT 655360 MAXSIZE 32767M NOLOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO
ORA-1119 signalled during: CREATE TABLESPACE "EXAMPLE" DATAFILE '/oracle/product/dbn/example01.dbf' SIZE 104857600 AUTOEXTEND ON NEXT 655360 MAXSIZE 32767M NOLOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO
Sat Jul 24 10:04:28 2010
CREATE TABLESPACE "NEWTABLESPACE" DATAFILE '/oracle/product/newtablespace.dbf' SIZE 10485760 LOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL UNIFORM SIZE 1048576 SEGMENT SPACE MANAGEMENT AUTO
ORA-1119 signalled during: CREATE TABLESPACE "NEWTABLESPACE" DATAFILE '/oracle/product/newtablespace.dbf' SIZE 10485760 LOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL UNIFORM SIZE 1048576 SEGMENT SPACE MANAGEMENT AUTO
Sat Jul 24 10:04:28 2010
CREATE TABLESPACE "T16KTAB" DATAFILE '/oracle/16k.dbf' SIZE 3145728 LOGGING ONLINE PERMANENT BLOCKSIZE 16384 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO
ORA-29339 signalled during: CREATE TABLESPACE "T16KTAB" DATAFILE '/oracle/16k.dbf' SIZE 3145728 LOGGING ONLINE PERMANENT BLOCKSIZE 16384 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO
Sat Jul 24 10:04:28 2010
CREATE TABLESPACE "INDX_TS" DATAFILE '/oracle/indxts.dbf' SIZE 26214400,'/oracle/secondf.dbf' SIZE 26214400 LOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO
ORA-1119 signalled during: CREATE TABLESPACE "INDX_TS" DATAFILE '/oracle/indxts.dbf' SIZE 26214400,'/oracle/secondf.dbf' SIZE 26214400 LOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO
Sat Jul 24 10:04:43 2010
Replication pre-import: trying to disable constraint REPCAT$_TEMPLATE_OBJECTS_FK1 for "SYSTEM"."REPCAT$_TEMPLATE_OBJECTS"
Replication pre-import: constraint REPCAT$_TEMPLATE_OBJECTS_FK1 for "SYSTEM"."REPCAT$_TEMPLATE_OBJECTS" is disabled successfully
Replication pre-import: trying to disable constraint REPCAT$_TEMPLATE_PARMS_FK1 for "SYSTEM"."REPCAT$_TEMPLATE_PARMS"
Replication pre-import: constraint REPCAT$_TEMPLATE_PARMS_FK1 for "SYSTEM"."REPCAT$_TEMPLATE_PARMS" is disabled successfully
Replication pre-import: trying to disable constraint REPCAT$_OBJECT_PARMS_FK1 for "SYSTEM"."REPCAT$_OBJECT_PARMS"
Replication pre-import: constraint REPCAT$_OBJECT_PARMS_FK1 for "SYSTEM"."REPCAT$_OBJECT_PARMS" is disabled successfully
Replication pre-import: trying to disable constraint REPCAT$_OBJECT_PARMS_FK2 for "SYSTEM"."REPCAT$_OBJECT_PARMS"
Replication pre-import: constraint REPCAT$_OBJECT_PARMS_FK2 for "SYSTEM"."REPCAT$_OBJECT_PARMS" is disabled successfully
Replication pre-import: trying to disable constraint REPCAT$_USER_AUTHORIZATION_FK2 for "SYSTEM"."REPCAT$_USER_AUTHORIZATIONS"
Replication pre-import: constraint REPCAT$_USER_AUTHORIZATION_FK2 for "SYSTEM"."REPCAT$_USER_AUTHORIZATIONS" is disabled successfully
Replication pre-import: trying to disable constraint REPCAT$_USER_PARM_VALUES_FK1 for "SYSTEM"."REPCAT$_USER_PARM_VALUES"
Replication pre-import: constraint REPCAT$_USER_PARM_VALUES_FK1 for "SYSTEM"."REPCAT$_USER_PARM_VALUES" is disabled successfully
Replication pre-import: trying to disable constraint REPCAT$_FLAVOR_OBJECTS_FK2 for "SYSTEM"."REPCAT$_FLAVOR_OBJECTS"
Replication pre-import: constraint REPCAT$_FLAVOR_OBJECTS_FK2 for "SYSTEM"."REPCAT$_FLAVOR_OBJECTS" is disabled successfully
Replication pre-import: trying to disable constraint REPCAT$_FLAVOR_OBJECTS_FK1 for "SYSTEM"."REPCAT$_FLAVOR_OBJECTS"
Replication pre-import: constraint REPCAT$_FLAVOR_OBJECTS_FK1 for "SYSTEM"."REPCAT$_FLAVOR_OBJECTS" is disabled successfully
Replication pre-import: trying to disable constraint REPCAT$_FLAVORS_FK1 for "SYSTEM"."REPCAT$_FLAVORS"
Replication pre-import: constraint REPCAT$_FLAVORS_FK1 for "SYSTEM"."REPCAT$_FLAVORS" is disabled successfully
Replication pre-import: trying to disable constraint REPCAT$_REPSCHEMA_DEST for "SYSTEM"."REPCAT$_REPSCHEMA"
Replication pre-import: constraint REPCAT$_REPSCHEMA_DEST for "SYSTEM"."REPCAT$_REPSCHEMA" is disabled successfully
Replication pre-import: trying to disable constraint REPCAT$_REPOBJECT_PRNT for "SYSTEM"."REPCAT$_REPOBJECT"
Replication pre-import: constraint REPCAT$_REPOBJECT_PRNT for "SYSTEM"."REPCAT$_REPOBJECT" is disabled successfully
Replication pre-import: trying to disable constraint REPCAT$_REPGEN_PRNT2 for "SYSTEM"."REPCAT$_GENERATED"
Replication pre-import: constraint REPCAT$_REPGEN_PRNT2 for "SYSTEM"."REPCAT$_GENERATED" is disabled successfully
Replication pre-import: trying to disable constraint REPCAT$_REPGEN_PRNT for "SYSTEM"."REPCAT$_GENERATED"
Replication pre-import: constraint REPCAT$_REPGEN_PRNT for "SYSTEM"."REPCAT$_GENERATED" is disabled successfully
Replication pre-import: trying to disable constraint REPCAT$_PARAMETER_COLUMN_F1 for "SYSTEM"."REPCAT$_PARAMETER_COLUMN"
Replication pre-import: constraint REPCAT$_PARAMETER_COLUMN_F1 for "SYSTEM"."REPCAT$_PARAMETER_COLUMN" is disabled successfully
Replication pre-import: trying to disable constraint REPCAT$_PRIORITY_F1 for "SYSTEM"."REPCAT$_PRIORITY"
Replication pre-import: constraint REPCAT$_PRIORITY_F1 for "SYSTEM"."REPCAT$_PRIORITY" is disabled successfully
Replication pre-import: trying to disable constraint REPCAT$_REPSCHEMA_PRNT for "SYSTEM"."REPCAT$_REPSCHEMA"
Replication pre-import: constraint REPCAT$_REPSCHEMA_PRNT for "SYSTEM"."REPCAT$_REPSCHEMA" is disabled successfully
Replication pre-import: trying to disable constraint REPCAT$_REPCOLUMN_FK for "SYSTEM"."REPCAT$_REPCOLUMN"
Replication pre-import: constraint REPCAT$_REPCOLUMN_FK for "SYSTEM"."REPCAT$_REPCOLUMN" is disabled successfully
Replication pre-import: trying to disable constraint REPCAT$_KEY_COLUMNS_PRNT for "SYSTEM"."REPCAT$_KEY_COLUMNS"
Replication pre-import: constraint REPCAT$_KEY_COLUMNS_PRNT for "SYSTEM"."REPCAT$_KEY_COLUMNS" is disabled successfully
Replication pre-import: trying to disable constraint REPCAT$_REPPROP_PRNT for "SYSTEM"."REPCAT$_REPPROP"
Replication pre-import: constraint REPCAT$_REPPROP_PRNT for "SYSTEM"."REPCAT$_REPPROP" is disabled successfully
Replication pre-import: trying to disable constraint REPCAT$_DDL_PRNT for "SYSTEM"."REPCAT$_DDL"
Replication pre-import: constraint REPCAT$_DDL_PRNT for "SYSTEM"."REPCAT$_DDL" is disabled successfully
Replication pre-import: trying to disable constraint REPCAT$_GROUPED_COLUMN_F1 for "SYSTEM"."REPCAT$_GROUPED_COLUMN"
Replication pre-import: constraint REPCAT$_GROUPED_COLUMN_F1 for "SYSTEM"."REPCAT$_GROUPED_COLUMN" is disabled successfully
Replication pre-import: trying to disable constraint REPCAT$_RESOLUTION_F1 for "SYSTEM"."REPCAT$_RESOLUTION"
Replication pre-import: constraint REPCAT$_RESOLUTION_F1 for "SYSTEM"."REPCAT$_RESOLUTION" is disabled successfully
Replication pre-import: trying to disable constraint REPCAT$_RESOLUTION_F3 for "SYSTEM"."REPCAT$_RESOLUTION"
Replication pre-import: constraint REPCAT$_RESOLUTION_F3 for "SYSTEM"."REPCAT$_RESOLUTION" is disabled successfully
Replication pre-import: trying to disable constraint REPCAT$_AUDIT_COLUMN_F2 for "SYSTEM"."REPCAT$_AUDIT_COLUMN"
Replication pre-import: constraint REPCAT$_AUDIT_COLUMN_F2 for "SYSTEM"."REPCAT$_AUDIT_COLUMN" is disabled successfully
Replication pre-import: trying to disable constraint REPCAT$_AUDIT_COLUMN_F1 for "SYSTEM"."REPCAT$_AUDIT_COLUMN"
Replication pre-import: constraint REPCAT$_AUDIT_COLUMN_F1 for "SYSTEM"."REPCAT$_AUDIT_COLUMN" is disabled successfully
Replication pre-import: trying to disable constraint REPCAT$_REPGROUP_PRIVS_FK for "SYSTEM"."REPCAT$_REPGROUP_PRIVS"
Replication pre-import: constraint REPCAT$_REPGROUP_PRIVS_FK for "SYSTEM"."REPCAT$_REPGROUP_PRIVS" is disabled successfully
Replication pre-import: trying to disable constraint REPCAT$_PARAMETER_COLUMN_PK for "SYSTEM"."REPCAT$_PARAMETER_COLUMN"
Replication pre-import: constraint REPCAT$_PARAMETER_COLUMN_PK for "SYSTEM"."REPCAT$_PARAMETER_COLUMN" is disabled successfully
Replication pre-import: trying to disable constraint REPCAT$_GROUPED_COLUMN_PK for "SYSTEM"."REPCAT$_GROUPED_COLUMN"
Replication pre-import: constraint REPCAT$_GROUPED_COLUMN_PK for "SYSTEM"."REPCAT$_GROUPED_COLUMN" is disabled successfully
Replication pre-import: trying to disable constraint REPCAT$_SITES_NEW_FK1 for "SYSTEM"."REPCAT$_SITES_NEW"
Replication pre-import: constraint REPCAT$_SITES_NEW_FK1 for "SYSTEM"."REPCAT$_SITES_NEW" is disabled successfully
Replication pre-import: trying to disable constraint REPCAT$_SITES_NEW_FK2 for "SYSTEM"."REPCAT$_SITES_NEW"
Replication pre-import: constraint REPCAT$_SITES_NEW_FK2 for "SYSTEM"."REPCAT$_SITES_NEW" is disabled successfully
Sat Jul 24 10:14:19 2010
The value (30) of MAXTRANS parameter ignored.
kupprdp: master process DM00 started with pid=20, OS id=1354
to execute - SYS.KUPM$MCP.MAIN('SYS_IMPORT_TABLESPACE_01', 'SYSTEM', 'KUPC$C_1_20100724101420', 'KUPC$S_1_20100724101420', 0);
kupprdp: worker process DW01 started with worker id=1, pid=23, OS id=1356
to execute - SYS.KUPW$WORKER.MAIN('SYS_IMPORT_TABLESPACE_01', 'SYSTEM');
Sat Jul 24 10:15:27 2010
create tablespace indx_ts datafile '/oracle/ts.dbf' size 10m
Sat Jul 24 10:15:29 2010
Completed: create tablespace indx_ts datafile '/oracle/ts.dbf' size 10m
The value (30) of MAXTRANS parameter ignored.
kupprdp: master process DM00 started with pid=23, OS id=1368
to execute - SYS.KUPM$MCP.MAIN('SYS_IMPORT_TABLESPACE_01', 'SYSTEM', 'KUPC$C_1_20100724101548', 'KUPC$S_1_20100724101548', 0);
kupprdp: worker process DW01 started with worker id=1, pid=24, OS id=1370
to execute - SYS.KUPW$WORKER.MAIN('SYS_IMPORT_TABLESPACE_01', 'SYSTEM');
statement in resumable session 'SYSTEM.SYS_IMPORT_TABLESPACE_01.1' was suspended due to
ORA-01652: unable to extend temp segment by 128 in tablespace INDX_TS
Sat Jul 24 10:46:04 2010
The value (30) of MAXTRANS parameter ignored.
kupprdp: master process DM02 started with pid=17, OS id=1426
to execute - SYS.KUPM$MCP.MAIN('SYS_EXPORT_SCHEMA_01', 'SYSTEM', 'KUPC$C_1_20100724104604', 'KUPC$S_1_20100724104604', 0);
Sat Jul 24 10:46:46 2010
The value (30) of MAXTRANS parameter ignored.
kupprdp: master process DM02 started with pid=17, OS id=1432
to execute - SYS.KUPM$MCP.MAIN('SYS_EXPORT_SCHEMA_01', 'SYSTEM', 'KUPC$C_1_20100724104647', 'KUPC$S_1_20100724104647', 0);
Sat Jul 24 10:48:16 2010
The value (30) of MAXTRANS parameter ignored.
kupprdp: master process DM02 started with pid=17, OS id=1463
to execute - SYS.KUPM$MCP.MAIN('SYS_SQL_FILE_FULL_01', 'SYSTEM', 'KUPC$C_1_20100724104816', 'KUPC$S_1_20100724104816', 0);
kupprdp: worker process DW03 started with worker id=1, pid=18, OS id=1465
to execute - SYS.KUPW$WORKER.MAIN('SYS_SQL_FILE_FULL_01', 'SYSTEM');
Sat Jul 24 12:00:40 2010
Thread 1 advanced to log sequence 2
Current log# 2 seq# 2 mem# 0: /rmandb/rmandata/redo02.log
Sat Jul 24 12:15:52 2010
statement in resumable session 'SYSTEM.SYS_IMPORT_TABLESPACE_01.1' was timed out
Sat Jul 24 13:18:59 2010
The value (30) of MAXTRANS parameter ignored.
kupprdp: master process DM00 started with pid=18, OS id=1841
to execute - SYS.KUPM$MCP.MAIN('SYS_IMPORT_FULL_03', 'SYSTEM', 'KUPC$C_1_20100724131859', 'KUPC$S_1_20100724131859', 0);
kupprdp: worker process DW01 started with worker id=1, pid=19, OS id=1843
to execute - SYS.KUPW$WORKER.MAIN('SYS_IMPORT_FULL_03', 'SYSTEM');
Sat Jul 24 13:19:10 2010
CREATE UNDO TABLESPACE "UNDOTBS1" DATAFILE '/oracle/product/dbn/undotbs01.dbf' SIZE 26214400 AUTOEXTEND ON NEXT 5242880 MAXSIZE 32767M BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE
ORA-1543 signalled during: CREATE UNDO TABLESPACE "UNDOTBS1" DATAFILE '/oracle/product/dbn/undotbs01.dbf' SIZE 26214400 AUTOEXTEND ON NEXT 5242880 MAXSIZE 32767M BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE...
Sat Jul 24 13:19:10 2010
CREATE TABLESPACE "SYSAUX" DATAFILE '/oracle/product/dbn/sysaux01.dbf' SIZE 125829120 AUTOEXTEND ON NEXT 10485760 MAXSIZE 32767M LOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO
ORA-1543 signalled during: CREATE TABLESPACE "SYSAUX" DATAFILE '/oracle/product/dbn/sysaux01.dbf' SIZE 125829120 AUTOEXTEND ON NEXT 10485760 MAXSIZE 32767M LOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO...
Sat Jul 24 13:19:10 2010
CREATE TEMPORARY TABLESPACE "TEMP" TEMPFILE '/oracle/product/dbn/temp01.dbf' SIZE 31457280 AUTOEXTEND ON NEXT 655360 MAXSIZE 32767M EXTENT MANAGEMENT LOCAL UNIFORM SIZE 1048576
ORA-1543 signalled during: CREATE TEMPORARY TABLESPACE "TEMP" TEMPFILE '/oracle/product/dbn/temp01.dbf' SIZE 31457280 AUTOEXTEND ON NEXT 655360 MAXSIZE 32767M EXTENT MANAGEMENT LOCAL UNIFORM SIZE 1048576
Sat Jul 24 13:19:10 2010
CREATE TABLESPACE "USERTB" DATAFILE '/oracle/product/dbn/users01.dbf' SIZE 5242880 AUTOEXTEND ON NEXT 1310720 MAXSIZE 32767M LOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO
ORA-1119 signalled during: CREATE TABLESPACE "USERTB" DATAFILE '/oracle/product/dbn/users01.dbf' SIZE 5242880 AUTOEXTEND ON NEXT 1310720 MAXSIZE 32767M LOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO...
Sat Jul 24 13:19:10 2010
CREATE TABLESPACE "EXAMPLE" DATAFILE '/oracle/product/dbn/example01.dbf' SIZE 104857600 AUTOEXTEND ON NEXT 655360 MAXSIZE 32767M NOLOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO
ORA-1119 signalled during: CREATE TABLESPACE "EXAMPLE" DATAFILE '/oracle/product/dbn/example01.dbf' SIZE 104857600 AUTOEXTEND ON NEXT 655360 MAXSIZE 32767M NOLOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO
Sat Jul 24 13:19:10 2010
CREATE TABLESPACE "NEWTABLESPACE" DATAFILE '/oracle/product/newtablespace.dbf' SIZE 10485760 LOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL UNIFORM SIZE 1048576 SEGMENT SPACE MANAGEMENT AUTO
ORA-1119 signalled during: CREATE TABLESPACE "NEWTABLESPACE" DATAFILE '/oracle/product/newtablespace.dbf' SIZE 10485760 LOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL UNIFORM SIZE 1048576 SEGMENT SPACE MANAGEMENT AUTO
Sat Jul 24 13:19:10 2010
CREATE TABLESPACE "T16KTAB" DATAFILE '/oracle/16k.dbf' SIZE 3145728 LOGGING ONLINE PERMANENT BLOCKSIZE 16384 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO
ORA-29339 signalled during: CREATE TABLESPACE "T16KTAB" DATAFILE '/oracle/16k.dbf' SIZE 3145728 LOGGING ONLINE PERMANENT BLOCKSIZE 16384 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO
Sat Jul 24 13:19:10 2010
CREATE TABLESPACE "INDX_TS" DATAFILE '/oracle/indxts.dbf' SIZE 26214400,'/oracle/secondf.dbf' SIZE 26214400 LOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO
ORA-1543 signalled during: CREATE TABLESPACE "INDX_TS" DATAFILE '/oracle/indxts.dbf' SIZE 26214400,'/oracle/secondf.dbf' SIZE 26214400 LOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO
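One note on the ORA-01652 suspension recorded above: the resumable import was waiting on space in INDX_TS, and the usual fix is to add space before the resumable timeout expires (the session at 12:15:52 timed out instead). A minimal sketch, using the datafile path from the log; the second path is illustrative:

```sql
-- Add a datafile to the full tablespace (second path is illustrative);
-- the suspended Data Pump statement resumes automatically once space
-- becomes available.
ALTER TABLESPACE indx_ts ADD DATAFILE '/oracle/ts2.dbf' SIZE 100M;

-- Or let the existing datafile grow on demand:
ALTER DATABASE DATAFILE '/oracle/ts.dbf' AUTOEXTEND ON NEXT 10M MAXSIZE 1G;
```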
... -
Import all users and their objects without doing full database import
Hi Guys,
I have a task that involves importing all existing users and their objects from production into the test database, without doing a full database import. How do I do this?
I need your help urgently.
SQL> select * from v$version;
BANNER
Oracle Database 10g Express Edition Release 10.2.0.1.0 - Product
PL/SQL Release 10.2.0.1.0 - Production
CORE 10.2.0.1.0 Production
TNS for Linux: Version 10.2.0.1.0 - Production
NLSRTL Version 10.2.0.1.0 - Production
I tried to import objects and data for one user from a FULL dump file. The file was created with the following command:
The server is: SQL*Plus: Release 10.2.0.1.0 - Production on Wed May 26 15:34:05 2010
exp full=y FILE="full.dmp" log=full.log
Now I imported:
imp file=full.dmp log=full.log INCTYPE=SYSTEM
imp fromuser=user1 file=full.dmp
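A side note on the two commands above: INCTYPE=SYSTEM belongs to the old incremental-export feature and is not needed here, and a schema-level pull from a full dump is conventionally run under a DBA account with fromuser/touser. A sketch (credentials and log name illustrative):

```shell
# Import only USER1's objects from the full export, as a privileged user.
imp system/manager file=full.dmp log=imp_user1.log \
    fromuser=user1 touser=user1 ignore=y
```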
Results: not all the user procedures have been imported:
SQL> select count(*) from user_procedures;
The original:
COUNT(*)
134
The current:
select count(*) from user_procedures;
COUNT(*)
18
I also tried these alternatives:
exp tablespaces="user1_data" FILE="user1.dmp" log=user1.log
exp LOG=user1.log TABLESPACES=user1_data FILE=user1_data.dmp
exp LOG=user1owner.log owner=user1 FILE=user1owner.dmp
expdp DIRECTORY=dpump_dir1 dumpfile=servdata_pump version=compatible SCHEMAS=user1
impdp directory=data_pump_dir dumpfile=servdata_pump.dmp :
ORA-39213: Metadata processing is not available
SQL> execute dbms_metadata_util.load_stylesheets
BEGIN dbms_metadata_util.load_stylesheets; END;
ERROR at line 1:
ORA-31609: error loading file "kualter.xsl" from file system directory
"/usr/lib/oracle/xe/app/oracle/product/10.2.0/server/rdbms/xml/xsl"
ORA-06512: at "SYS.DBMS_METADATA_UTIL", line 1793
ORA-06512: at line 1
file kualter.xsl does not exist in XE !!
imp owner=user1 rows=n ignore=y
imp full=y file=user1_data.dmp
imp full=y file=full.dmp tablespaces=user1_data,user1_index rows=n ignore=y
So, I do not understand why user1 objects are not imported:
see this part of the first import log:
Export file created by EXPORT:V10.02.01 via conventional path
import done in US7ASCII character set and AL16UTF16 NCHAR character set
import server uses WE8ISO8859P1 character set (possible charset conversion)
. importing SYS's objects into SYS
. importing USER1's objects into USER1
. . importing table .........................
Why only 18?
If you have any suggestions, you are welcome, as I do not have any other ideas...
ren
Edited by: ronpetitpatapon on May 26, 2010 12:38 PM
Edited by: ronpetitpatapon on May 26, 2010 12:41 PM
Edited by: ronpetitpatapon on May 26, 2010 1:03 PM -
Hello Gurus,
I need to do a full database refresh from one database to another. Can someone let me know the steps for how to do it?
select * from V$VERSION
BANNER
Oracle Database 10g Enterprise Edition Release 10.2.0.5.0 - 64bit
PL/SQL Release 10.2.0.5.0 - Production
CORE 10.2.0.5.0 Production
TNS for HPUX: Version 10.2.0.5.0 - Production
NLSRTL Version 10.2.0.5.0 - Production
CKPT wrote:
jgarry wrote:
I'm with Ed and Brendan on this. 3 main reasons:
1. Legacy has some special requirements.
2. Definitely want different schema names, this has saved my bacon sooo many times. Even so, people still get it wrong (including me, and I'm totally paranoid about it to begin with).
3. There are many more copies of production than actual production, for various reasons (test, several dev, special projects, future app versions...) . We would rather have one db with many schemata than many db's, right?
Now if we could just rename user...
Agreed to some extent. I can say that to refresh an entire database, the preferred method is an RMAN restore, which is very easy to manage.
How?
1) To import again, we need to create a new database.
2) We need to create all the tablespaces with the same names, or we can use the REMAP option.
3) We need to monitor the entire import process.
4) We need to take care of other things, like temp utilization.
5) We need to perform the export and copy the dump files; we should have enough space on the mount points of both source and destination. ;-)
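The steps above can be sketched end to end as follows (directory object, credentials and file names illustrative):

```shell
# 5) On the source: full export into a directory object created beforehand.
expdp system/*** directory=dp_dir full=y dumpfile=refresh.dmp logfile=exp_refresh.log

# Copy refresh.dmp to the destination server, then (1-4) after creating
# the new database and tablespaces, run the import and monitor its log:
impdp system/*** directory=dp_dir full=y dumpfile=refresh.dmp logfile=imp_refresh.log
```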
I would just caution there that you shouldn't think that because you are taking rman backups you have no need to also regularly export the db. A good backup policy includes both physical backup (rman) and logical backup of the data (export).
If you are talking about a couple of schemas, the preferred method is EXPDP/IMPDP, totally agreed.
If the database size is huge, then it is better to go with RMAN to refresh the database, either DUPLICATE or a manual restore.
As Ed said, we also get a chance to test the RMAN backups. Multiple advantages.
Let me know if I'm wrong. :) -
How to determine who performed the last full database dump before cumulative dump
Hello,
ASE 15.7 SP100 allows cumulative backups, and if cumulative dump is tried before a full dump, ASE shows this error:
You cannot execute an incremental dump before a full database dump.
From my experiments, it seems that it does not matter how the full dump is performed (by native isql or by a 3rd-party API). As long as a full dump has been done on a database, ASE seems to keep track of all the changed pages since that full dump. This means that you can perform a full dump using a 3rd-party API and then, in isql, run a cumulative dump command, and the cumulative dump succeeds based on the last full dump taken by another library!
So, my question is: is there any way to programmatically determine how (by native isql or 3rd party) the last full backup was performed? I believe $SYBASE_HOME/ASE-15_0/dumphist contains this info, but it requires 'enable dump history' to be set first, and I am looking for a solution which does not involve checking a disk file.
Thanks,
Ali
Dear Mr Razib,
I have not explored the feature, but ASE auditing might give you the possibility of accessing information on past database dumps via SQL.
Apart from that, I am not aware of an SQL interface to the dumphist file (it would be nice to have, I agree).
Enabling dump history is definitely highly recommended (in my opinion).
There is yet another feature which might help you prevent a DBA from dumping databases and transactions to various locations.
When you create a DUMP CONFIGURATION and additionally set the parameter
enforce dump configuration
ASE will prevent normal (free-style) DUMP commands and enforce the use of an existing dump configuration. The mechanism is not foolproof (nothing prevents creating yet another dump configuration on the fly), but it is at least something.
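As a sketch of that setup (syntax from memory for ASE 15.7, so the parameter names and the directory are illustrative and worth checking against the docs):

```sql
-- Create a named dump configuration, then force all dumps through one.
sp_config_dump @config_name = 'nightly_full', @stripe_dir = '/dumps/full'
go
sp_configure 'enforce dump configuration', 1
go
-- Free-style DUMP DATABASE is now rejected; DBAs must use something like:
dump database mydb using config = 'nightly_full'
go
```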
With kind regards
Tilman Model-Bosch -
Problem with full database backup.
This is what I got after executing the full database backup statement:
"RMAN-03009: failure of backup command on ORA_DISK_1 channel at 03/03/2009 14:15:15
ORA-19502: write error on file "/u01/app/oracle/backup/ORCL/ora_df680537660_s2_s1", blockno 25985 (blocksize=8192)
ORA-27072: File I/O error
Linux Error: 2: No such file or directory
Additional information: 4
Additional information: 25985
Additional information: 483328"
I have much the same problem with CREATE TABLESPACE: I can't create a 512m tablespace, but I can create the same tablespace at only 100m.
I think it must be something with storage???
Starting backup at 03-MAR-09
using channel ORA_DISK_1
input datafile fno=00015 name=/u01/app/oracle/oradata/o2_mf_system_48fprop3_.dbf
input datafile fno=00003 name=/u01/app/oracle/oradata/ORCL/datafile/o1_mf_sysaux_48fpropg_.dbf
input datafile fno=00014 name=/u01/app/oracle/oradata/o2_mf_sysaux_48fpropg_.dbf
input datafile fno=00005 name=/u01/app/oracle/oradata/ORCL/datafile/o1_mf_example_48fpw04c_.dbf
input datafile fno=00017 name=/u01/app/oracle/oradata/ORCL/datafile/rcvcat01.dbf
input datafile fno=00002 name=/u01/app/oracle/oradata/ORCL/datafile/o1_mf_undotbs1_48fprovo_.dbf
input datafile fno=00016 name=/u01/app/oracle/oradata/inventory03.dbf
input datafile fno=00011 name=/u01/app/oracle/oradata/ORCL/datafile/inventory01.dbf
input datafile fno=00012 name=/u01/app/oracle/oradata/ORCL/datafile/inventory02.dbf
input datafile fno=00006 name=/u01/app/oracle/oradata/ORCL/datafile/ts01.dbf
input datafile fno=00008 name=/u01/app/oracle/oradata/ORCL/datafile/ts02.dbf
input datafile fno=00001 name=/u01/app/oracle/oradata/ORCL/datafile/o1_mf_system_48fprop3_.dbf
channel ORA_DISK_1: specifying datafile(s) in backupset
channel ORA_DISK_1: starting full datafile backupset
input datafile fno=00009 name=/u01/app/oracle/oradata/ORCL/datafile/undo01.dbf
channel ORA_DISK_1: starting piece 1 at 03-MAR-09
input datafile fno=00013 name=/u01/app/oracle/oradata/ORCL/datafile/val01.dbf
input datafile fno=00007 name=/u01/app/oracle/oradata/ORCL/datafile/test_reorg0
input datafile fno=00004 name=/u01/app/oracle/oradata/ORCL/datafile/o1_mf_users_48fprowb_.dbf
input datafile fno=00010 name=/u01/app/oracle/oradata/ORCL/datafile/ts01b.dbf
RMAN-00571: ===========================================================
RMAN-00569: =============== ERROR MESSAGE STACK FOLLOWS ===============
RMAN-00571: ===========================================================
RMAN-03009: failure of backup command on ORA_DISK_1 channel at 03/03/2009 14:15:15
ORA-19502: write error on file "/u01/app/oracle/backup/ORCL/ora_df680537660_s2_s1", blockno 25985 (blocksize=8192)
ORA-27072: File I/O error
Linux Error: 2: No such file or directory
Additional information: 4
Additional information: 483328
Additional information: 25985
The directory exists, because I can back up a single tablespace into the same directory; I just can't back up the whole database.
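ORA-27072 with Linux error 2 on a write often points at the destination filesystem filling up (which would also explain why a 512m tablespace fails where a 100m one succeeds). A quick check worth running before the backup:

```shell
# How much space is free where RMAN writes its backup pieces?
df -h /u01/app/oracle/backup
```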
Edited by: val75 on Mar 3, 2009 1:57 PM -
Getting error in full database export
hi,
I am exporting the full database and I am getting the error shown below:
Operating system: Windows 2003 Server
Oracle: Oracle 10g
C:\Documents and Settings\Administrator.SRCSD>exp system/systemsms file=d:\210410.dmp log=d:\210410.log full=y
Export: Release 10.1.0.2.0 - Production on Wed Apr 21 11:15:42 2010
Copyright (c) 1982, 2004, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Production
With the Partitioning, OLAP and Data Mining options
Export done in AR8MSWIN1256 character set and AL16UTF16 NCHAR character set
server uses AR8ISO8859P6 character set (possible charset conversion)
About to export the entire database ...
. exporting tablespace definitions
. exporting profiles
. exporting user definitions
. exporting roles
. exporting resource costs
. exporting rollback segment definitions
. exporting database links
. exporting sequence numbers
. exporting directory aliases
. exporting context namespaces
. exporting foreign function library names
. exporting PUBLIC type synonyms
. exporting private type synonyms
. exporting object type definitions
. exporting system procedural objects and actions
. exporting pre-schema procedural objects and actions
. exporting cluster definitions
EXP-00056: ORACLE error 4021 encountered
ORA-04021: timeout occurred while waiting to lock object
ORA-06512: at "SYS.DBMS_METADATA", line 1511
ORA-06512: at "SYS.DBMS_METADATA", line 1548
ORA-06512: at "SYS.DBMS_METADATA", line 1864
ORA-06512: at "SYS.DBMS_METADATA", line 3707
ORA-06512: at "SYS.DBMS_METADATA", line 3689
ORA-06512: at line 1
EXP-00056: ORACLE error 4021 encountered
ORA-04021: timeout occurred while waiting to lock object
ORA-06512: at "SYS.DBMS_METADATA", line 1511
ORA-06512: at "SYS.DBMS_METADATA", line 1548
ORA-06512: at "SYS.DBMS_METADATA", line 1864
ORA-06512: at "SYS.DBMS_METADATA", line 3707
ORA-06512: at "SYS.DBMS_METADATA", line 3689
ORA-06512: at line 1
EXP-00000: Export terminated unsuccessfully
C:\Documents and Settings\Administrator.SRCSD>
Please reply with an appropriate solution.
Thanks.
Possibly bug 3386590 (ORA-4021: TIMEOUT OCCURRED WHILE WAITING TO LOCK OBJECT); upgrade to 10gR2, or even better 11gR2.
Werner -
Hi
We have a 10g database running on Solaris. We are getting a new server and I need to move the data. This is what I am planning to do:
Take a full database export from the original server. (This includes tablespaces, all schemas and data.)
Import into the new server, from the SYS schema, once Oracle is installed, so all the schemas and objects are imported.
Can you let me know if this approach is right, and whether any precautions need to be taken?
Thanks
user625850 wrote:
Hi
We are having a database 10g running on solaris OS. We are getting a new server and I need to move the data. This is what I am planning to do.
Take full database export from original server. (This includes tablespaces, all schemas and data)
Import in new server, from SYS schema once Oracle is installed. so all the schemas and objects are imported.
Can you let me know this approach is right any precautions to be taken?
Thanks
RMAN DUPLICATE -
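The one-line answer above, RMAN DUPLICATE, can be sketched as follows (connection strings and database name illustrative; the auxiliary instance must be started NOMOUNT with a minimal parameter file first, and the source backups must be reachable from the new server):

```shell
rman target sys/***@srcdb auxiliary sys/***@newdb

# Then, at the RMAN prompt:
# DUPLICATE TARGET DATABASE TO newdb NOFILENAMECHECK;
```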
Full database export hangs (Oracle9i rel2)
I recently migrated successfully from Oracle 8i (8.0) to Oracle 9i Release 2, but now I am not able to do a full database export. Has anyone experienced this, and what is the remedy?
the text i get is listed below.
Microsoft Windows [Version 5.2.3790]
(C) Copyright 1985-2003 Microsoft Corp.
C:\Documents and Settings\Administrator>exp system/manager@ORCL file=C:\orclbk\p
oll.dmp full=y
Export: Release 9.2.0.1.0 - Production on Tue Aug 9 18:36:13 2005
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Connected to: Oracle9i Enterprise Edition Release 9.2.0.1.0 - Production
With the Partitioning, OLAP and Oracle Data Mining options
JServer Release 9.2.0.1.0 - Production
Export done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
About to export the entire database ...
. exporting tablespace definitions
. exporting profiles
. exporting user definitions
. exporting roles
. exporting resource costs
. exporting rollback segment definitions
. exporting database links
. exporting sequence numbers
. exporting directory aliases
. exporting context namespaces
. exporting foreign function library names
. exporting PUBLIC type synonyms
. exporting private type synonyms
. exporting object type definitions
EXP-00090: cannot pin type "OE"."CUSTOMER_TYP"
EXP-00056: ORACLE error 22303 encountered
OCI-22303: type "OE"."CUSTOMER_TYP" not found
EXP-00056: ORACLE error 24323 encountered
ORA-24323: value not allowed
EXP-00000: Export terminated unsuccessfully
C:\Documents and Settings\Administrator>
You have to run the export with the SYS user. This error may arise on any platform ranging from 9.0.1 to 10.2.0.
C:\>exp "'sys/oracle as sysdba'" full=y