Importing tablespace using impdp
Hi,
I am trying to transport a tablespace using expdp/impdp like this:
source side: expdp dumpfile=tbs1.dmp logfile=tbs1.log directory=datapump transport_tablespaces=tbs1
then transferred the dump file and datafile to the target side
target side: impdp dumpfile=tbs1.dmp directory=datapump transport_datafiles='/u01/app/oracle/sales/tbs1.dbf'
it is giving:
ORA-39123: Data Pump transportable tablespace job aborted
ORA-29345: cannot plug a tablespace into a database using an incompatible character set
When I searched on Google, some suggestions said to do the transport using RMAN.
But when we have the option in impdp, why would I need to go to the RMAN tool?
I want to use impdp only, but I am unable to use it properly while importing.
Can you please advise me?
Thank you!
Hello,
It cannot work, due to the error ORA-29345 above:
ORA-29345: cannot plug a tablespace into a database using an incompatible character set
Cause: Oracle does not support plugging a tablespace into a database using an
incompatible character set.
Action: Use import/export or unload/load to move data instead.
i want to use impdp only. but i am unable to use it properly while importing.
So, as suggested, move the data with a conventional export/import instead. But be careful: since you have different character sets, you may run into character conversions.
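As a sketch of that suggestion (the directory object and schema names below are placeholders, not taken from the thread): a conventional, non-transportable export/import converts the rows to the target character set as they are loaded, so ORA-29345 does not arise.

```shell
# On the source: conventional schema-mode export of the schemas in tbs1.
expdp system schemas=app_owner directory=datapump dumpfile=app.dmp logfile=app_exp.log

# On the target: conventional import; data is converted to the target
# character set as it is inserted.
impdp system directory=datapump dumpfile=app.dmp logfile=app_imp.log
```

The trade-off is speed: a conventional import reloads every row, whereas transportable tablespaces only move metadata.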
Hope this helps.
Best regards,
Jean-Valentin
Similar Messages
-
Errors importing TTS using impdp
Hello All,
I am seeing the following errors when importing a TTS using impdp. Please let me know if you know of any workaround.
Thanks,
Sunny boy
Corp_DEODWPRD$ impdp parfile=impdp_tts.par
Import: Release 11.2.0.3.0 - Production on Mon Jul 30 13:23:50 2012
Copyright (c) 1982, 2011, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning, Automatic Storage Management, OLAP, Data Mining
and Real Application Testing options
Master table "SYS"."SYS_IMPORT_TRANSPORTABLE_02" successfully loaded/unloaded
Starting "SYS"."SYS_IMPORT_TRANSPORTABLE_02": /******** AS SYSDBA parfile=impdp_tts.par
Processing object type TRANSPORTABLE_EXPORT/PLUGTS_BLK
Processing object type TRANSPORTABLE_EXPORT/TABLE
Processing object type TRANSPORTABLE_EXPORT/GRANT/OWNER_GRANT/OBJECT_GRANT
ORA-39126: Worker unexpected fatal error in KUPW$WORKER.LOAD_METADATA [SELECT process_order, flags, xml_clob, NVL(dump_fileid, :1), NVL(dump_position, :2), dump_length, dump_allocation, NVL(value_n, 0), grantor, object_row, object_schema, object_long_name, partition_name, subpartition_name, processing_status, processing_state, base_object_type, base_object_schema, base_object_name, base_process_order, property, size_estimate, in_progress, original_object_schema, original_object_name, creation_level, object_int_oid FROM "SYS"."SYS_IMPORT_TRANSPORTABLE_02" WHERE process_order between :3 AND :4 AND duplicate = 0 AND processing_state NOT IN (:5, :6, :7) ORDER BY process_order]
ORA-39183: internal error -19 ocurred during decompression phase 2
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPW$WORKER", line 9001
----- PL/SQL Call Stack -----
object line object
handle number name
0x5d9562370 20462 package body SYS.KUPW$WORKER
0x5d9562370 9028 package body SYS.KUPW$WORKER
0x5d9562370 4185 package body SYS.KUPW$WORKER
0x5d9562370 9725 package body SYS.KUPW$WORKER
0x5d9562370 1775 package body SYS.KUPW$WORKER
0x59c4fbb90 2 anonymous block
Job "SYS"."SYS_IMPORT_TRANSPORTABLE_02" stopped due to fatal error at 13:52:57
PAR file:
USERID='/ as sysdba'
directory=ODW_MIG_DIR
dumpfile=exp_full_ODW_tts_20120719_01.dmp
logfile=imp_FULL_ODW_TTS_20120730.log
exclude=PROCACT_INSTANCE
TRANSPORT_DATAFILES=(
'+DEODW_DATA_GRP/deodwprd/data/ts_appsds_d_01.dbf',
'+DEODW_DATA_GRP/deodwprd/data/ts_appsds_i_01.dbf',
'+DEODW_DATA_GRP/deodwprd/data/deodw_aud_d_01.dbf',
'+DEODW_DATA_GRP/deodwprd/data/deodw_capone_d01_02.dbf',
'+DEODW_DATA_GRP/deodwprd/data/deodw_capone_d01_03.dbf',
'+DEODW_DATA_GRP/deodwprd/data/deodw_capone_d01_01.dbf',
'+DEODW_DATA_GRP/deodwprd/data/deodw_capone_d_p2012_01.dbf',
'+DEODW_DATA_GRP/deodwprd/data/deodw_capone_i01_01.dbf',
'+DEODW_DATA_GRP/deodwprd/data/deodw_ts_ccr_d01_3.dbf',
'+DEODW_DATA_GRP/deodwprd/data/deodw_ts_ccr_d01_2.dbf',
'+DEODW_DATA_GRP/deodwprd/data/deodw_ts_ccr_d01_1.dbf',
'+DEODW_DATA_GRP/deodwprd/data/deodw_cst_stage_d_01.dbf',
'+DEODW_DATA_GRP/deodwprd/data/deodw_cst_stage_i_01.dbf',
'+DEODW_DATA_GRP/deodwprd/data/deodw_dsdw_doc_stg_d01_01.dbf',
'+DEODW_DATA_GRP/deodwprd/data/deodw_x_etl_d01_02.dbf',
'+DEODW_DATA_GRP/deodwprd/data/deodw_x_report_work_d01_01.dbf',
'+DEODW_DATA_GRP/deodwprd/data/deodw_x_sas_d01_01.dbf')
Still the same error with the SYSTEM account. Working with Oracle Support about this issue.
Corp_DEODWPRD$ impdp parfile=impdp_tts.par
Import: Release 11.2.0.3.0 - Production on Tue Jul 31 11:47:32 2012
Copyright (c) 1982, 2011, Oracle and/or its affiliates. All rights reserved.
UDI-28002: operation generated ORACLE error 28002
ORA-28002: the password will expire within 6 days
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning, Automatic Storage Management, OLAP, Data Mining
and Real Application Testing options
Master table "SYSTEM"."SYS_IMPORT_TRANSPORTABLE_01" successfully loaded/unloaded
Starting "SYSTEM"."SYS_IMPORT_TRANSPORTABLE_01": system/******** parfile=impdp_tts.par
Processing object type TRANSPORTABLE_EXPORT/PLUGTS_BLK
Processing object type TRANSPORTABLE_EXPORT/TABLE
Processing object type TRANSPORTABLE_EXPORT/GRANT/OWNER_GRANT/OBJECT_GRANT
ORA-39126: Worker unexpected fatal error in KUPW$WORKER.LOAD_METADATA [SELECT process_order, flags, xml_clob, NVL(dump_fileid, :1), NVL(dump_position, :2), dump_length, dump_allocation, NVL(value_n, 0), grantor, object_row, object_schema, object_long_name, partition_name, subpartition_name, processing_status, processing_state, base_object_type, base_object_schema, base_object_name, base_process_order, property, size_estimate, in_progress, original_object_schema, original_object_name, creation_level, object_int_oid FROM "SYSTEM"."SYS_IMPORT_TRANSPORTABLE_01" WHERE process_order between :3 AND :4 AND duplicate = 0 AND processing_state NOT IN (:5, :6, :7) ORDER BY process_order]
ORA-39183: internal error -19 ocurred during decompression phase 2
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPW$WORKER", line 9001
----- PL/SQL Call Stack -----
object line object
handle number name
0x5dac10b98 20462 package body SYS.KUPW$WORKER
0x5dac10b98 9028 package body SYS.KUPW$WORKER
0x5dac10b98 4185 package body SYS.KUPW$WORKER
0x5dac10b98 9725 package body SYS.KUPW$WORKER
0x5dac10b98 1775 package body SYS.KUPW$WORKER
0x59c554178 2 anonymous block
Job "SYSTEM"."SYS_IMPORT_TRANSPORTABLE_01" stopped due to fatal error at 12:13:04
-
Hello,
I want to import a database using impdp. Can I import while I'm logged in as system/<password>? On which database should I be logged in? I have a single database, 'orcl'.
Thanks
I wanna import a database using impdp.
Can i import while i'm logged with system/<password>?
On which database should i be logged? I have a single database, 'orcl'
You don't have to log in to any database to perform the import using Data Pump import.
You just have to set the Oracle database environment where you want to import into.
Suppose you already have an export dump generated using Data Pump export (expdp) and you want to import into the orcl database:
set the Oracle environment to orcl.
You have to have a directory object created, with read/write access granted to the user performing the export/import (expdp or impdp).
Use impdp help=y at the command line to see the list of available options.
http://download.oracle.com/docs/cd/B13789_01/server.101/b10825/dp_import.htm
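A minimal sketch of those steps on a Unix host (the SID, directory path, and user below are placeholders):

```shell
# Point the environment at the target database, then prepare a directory
# object with read/write access for the importing user.
export ORACLE_SID=orcl
sqlplus / as sysdba <<'EOF'
CREATE OR REPLACE DIRECTORY dpump_dir AS '/u01/dumps';
GRANT READ, WRITE ON DIRECTORY dpump_dir TO scott;
EOF

# Run the import against the dump file sitting in that directory.
impdp scott/tiger directory=dpump_dir dumpfile=orcl_full.dmp logfile=orcl_imp.log full=y
```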
-Anantha -
Unable to import database using impdp via network_link
Dear All,
I am trying to import database from remote location's dump to my local database after creating dblink from local db to remote db.
At Remote Server:
- Directory created and granted the privileges. Dir Name: PREMLIVE_EXPDP
At Local server:
Username: expuser with dba, imp and exp full database privileges.
DB Link name: to_premiatest14
I get a "directory is invalid" error when I run the command below on the local server, to import a schema from the remote database.
C:\>impdp expuser/expuser network_link=to_premiatest14 directory=PREMLIVE_EXPDP remap_schema=PREMIATEST:PREMIATEST15 logfile=2013-09-02_premiatest15.log
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.5.0 - 64bit
Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-39087: directory name PREMLIVE_EXPDP is invalid
Any help or advice would be highly appreciated.
Regards,
Syed
Some more details follow.
Server 1: Remote Server
BANNER
Oracle Database 10g Enterprise Edition Release 10.2.0.5.0 - 64bi
PL/SQL Release 10.2.0.5.0 - Production
CORE 10.2.0.5.0 Production
TNS for 64-bit Windows: Version 10.2.0.5.0 - Production
NLSRTL Version 10.2.0.5.0 - Production
TNS-Entry:
PREMIATEST14=
(DESCRIPTION =
(ADDRESS = (PROTOCOL = TCP)(HOST = XX.XX.XX.14)(PORT = 1521))
(CONNECT_DATA =
(SERVER = DEDICATED)
(SERVICE_NAME = malath)
DIRECTORY:
PREMLIVE_EXPDP AS
'C:\Dump\PREMLIVE_EXPDP'; -- this location/directory holds the dump file
Server 2: Local Server
BANNER
Oracle Database 10g Enterprise Edition Release 10.2.0.5.0 - 64bi
PL/SQL Release 10.2.0.5.0 - Production
CORE 10.2.0.5.0 Production
TNS for 64-bit Windows: Version 10.2.0.5.0 - Production
NLSRTL Version 10.2.0.5.0 - Production
TNS-Entry:
PREMIATEST15=
(DESCRIPTION=
(ADDRESS=
(PROTOCOL=TCP)
(HOST=XX.XX.XX.15)
(PORT=1521)
(CONNECT_DATA=
(SERVICE_NAME=PREMIA15)
PREMIATEST14 =
(DESCRIPTION =
(ADDRESS = (PROTOCOL = TCP)(HOST = XX.XX.XX.14)(PORT = 1521))
(CONNECT_DATA =
(SERVER = DEDICATED)
(SERVICE_NAME = malath)
Network/DBLink:
CREATE PUBLIC DATABASE LINK TO_PREMIATEST14 CONNECT TO EXPUSER IDENTIFIED BY <PWD> USING 'PREMIATEST14';
DBLink tested :
SQL> select count(*) from premiatest.brkdivion@to_premiatest14;
COUNT(*)
94
Error on importing:
C:\>impdp expuser/expuser network_link=to_premiatest14 directory=PREMLIVE_EXPDP
remap_schema=PREMIATEST:PREMIATEST15 logfile=2013-09-02_premiatest15.log
Import: Release 10.2.0.5.0 - 64bit Production on Tuesday, 03 September, 2013 1:23:10
Copyright (c) 2003, 2007, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.5.0 - 64bit
Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-39087: directory name PREMLIVE_EXPDP is invalid
I don't know where I am going wrong. Valuable advice/assistance will be highly appreciated.
Thanks & Regards,
Syed -
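The thread has no answer, but one likely cause worth checking (an assumption on my part): with NETWORK_LINK, impdp resolves the DIRECTORY and LOGFILE on the importing (local) database, not on the remote one, so PREMLIVE_EXPDP must also exist as a directory object on the local server. A sketch:

```sql
-- Run on the LOCAL (importing) database as a privileged user;
-- the OS path must exist on the local host.
CREATE OR REPLACE DIRECTORY premlive_expdp AS 'C:\Dump\PREMLIVE_EXPDP';
GRANT READ, WRITE ON DIRECTORY premlive_expdp TO expuser;
```

Note that with NETWORK_LINK no dump file is read at all; the directory is only needed for the log file, so dropping LOGFILE from the command would also sidestep ORA-39070.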
Can we use impdp to import the data from an normal exp dump?
Hi All,
I have a export dump taken from a 9i database. Can i use impdp to import the data into 10g database from that 9i exp dump?
Please suggest
thanks and Regards
Arun
Hi,
I have a export dump taken from a 9i database. Can i use impdp to import the data into 10g database from that 9i exp dump?
No - impdp can only read dump files written by expdp. A dump taken with the original exp utility has to be imported with the original imp utility, which 10g still ships and which can read 9i dump files.
Refer:
http://wiki.oracle.com/thread/3734722/can+a+9i+dump+file+be+imported+into+10%3F
thanks,
X A H E E R -
ORA-02374: conversion error loading table during import using IMPDP
HI All,
We are trying to migrate the data from one database to an other database.
The source database is having character set
SQL> select value from nls_database_parameters where parameter='NLS_CHARACTERSET';
VALUE
US7ASCII
The destination database is having character set
SQL> select value from nls_database_parameters where parameter='NLS_CHARACTERSET';
VALUE
AL32UTF8
We took an export of the whole database using expdp and when we try to import to the destination database using impdp. We are getting the following error.
ORA-02374: conversion error loading table <TABLE_NAME>
ORA-12899: value too large for column <COLUMN NAME> (actual: 42, maximum: 40)
ORA-02372: data for row:<COLUMN NAME> : 0X'4944454E5449464943414349E44E204445204C4C414D414441'
Kindly let me know how to overcome this issue in destination.
Thanks & Regards,
Vikas Krishna
Hi,
You can overcome this by increasing the column width in the target database to the maximum byte length required, so that all rows import successfully into the table.
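To see why a value that fits in 40 characters can exceed 40 bytes, count the bytes of an accented string outside the database. The string below is an assumed decoding of the ORA-02372 hex dump above (the 0xE4 byte rendered as an accented vowel; the exact character depends on the client character set):

```shell
# 25 characters, but 26 bytes once encoded in UTF-8, because the
# accented letter takes two bytes. A 25-byte column would reject it.
s='IDENTIFICACIÓN DE LLAMADA'
printf '%s' "$s" | wc -c    # byte count in UTF-8: 26
```

An alternative to widening every affected column is declaring them with character length semantics on the target (e.g. VARCHAR2(40 CHAR)) before the import, so the limit is counted in characters rather than bytes.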
Regards -
Only to import JAVA CLASS and JAVA SOURCE from .DMP file using IMPDP
hi,
I have a schema X. In that schema, some of the "JAVA CLASS" and "JAVA SOURCE" objects are missing.
The procedures, functions, etc. were updated in schema X.
I have 1 dmp file of schema Y, containing the missing "JAVA CLASS" and "JAVA SOURCE" objects. Can I import them into schema X using IMPDP?
I tried using the INCLUDE clause but it is not working.
eg:
impdp john1/john1@me11g22 directory=datadump dumpfile=DEVF2WAR.DMP remap_schema=devf2war:john INCLUDE="JAVA CLASS","JAVA CLASS"
but it errors:
ORA-39001: invalid argument value
ORA-39041: Filter "INCLUDE" either identifies all object types or no object types.
regards,
jp
Double post: IMPDP to import JAVA CLASS and JAVA SOURCE from .DMP file
Already replied.
Regards
Gokhan -
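For what it's worth, the ORA-39041 above looks like a symptom of the filter syntax rather than of the dump file: Data Pump INCLUDE takes object-path names (listable from the SCHEMA_EXPORT_OBJECTS view), one filter per object type, not a quoted space-separated list. A hedged sketch, assuming JAVA_SOURCE and JAVA_CLASS are valid path names on this version:

```shell
# One INCLUDE per object type; underscores, not spaces, in the path names.
impdp john1/john1@me11g22 directory=datadump dumpfile=DEVF2WAR.DMP \
      remap_schema=devf2war:john INCLUDE=JAVA_SOURCE INCLUDE=JAVA_CLASS
```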
Database full compile while doing schema import using impdp
hi,
oracle 10g
Database full compile while doing schema import using impdp..
what is happening here..
regards,
Deepak
My scenario:
I need to import a particular schema from a full export dump, which was taken using expdp. While importing I am using remap_schema for a single schema.
But it tries to import the full dump and compiles all the schema objects.
regards,
Deepak -
Need syntax for remap tablespaces for Import datapump using API
Good day all,
I am not able to find an example of remap_tablespace for Data Pump import using the API. I am not able to perform the import, as it complains that the user doesn't have a grant on the exported tables' tablespace. I created a new user, with new tablespaces associated with it, to import the tables and other objects.
I have remap_schema in my API procedure but not remap_tablespace. Could you please tell me how to accomplish it?
Thanks a lot.
Maggie
Hi Maggie,
The other parameter is 'OBJECT_TYPE',
so you could have:
DBMS_DATAPUMP.METADATA_REMAP(handle => k1, name => 'REMAP_TABLESPACE', old_value => 'ORIGINAL_TABLESPACE_NAME', value => 'NEW_TABLESPACE_NAME', object_type => 'TABLE');
The full list of options from the Oracle docs is supposedly this:
TABLE, INDEX, ROLLBACK_SEGMENT, MATERIALIZED_VIEW, MATERIALIZED_VIEW_LOG,TABLE_SPACE
ref http://docs.oracle.com/cd/B28359_01/appdev.111/b28419/d_datpmp.htm#i1007115
As for an example with a table - here is one I did which also includes remap_data; you can ignore that part, the rest of the code shows you how to do a table-level export based on a list of tables returned from a SQL query.
http://dbaharrison.blogspot.de/2013/04/dbmsdatapump-with-remapdata.html
Cheers,
Harry -
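Putting the pieces above together, a self-contained sketch of an API-driven import with both remaps (the handle variable, dump file, directory, schema and tablespace names are all placeholders):

```sql
-- Hedged sketch of a DBMS_DATAPUMP schema-mode import with remaps.
DECLARE
  h          NUMBER;
  job_state  VARCHAR2(30);
BEGIN
  h := DBMS_DATAPUMP.OPEN(operation => 'IMPORT', job_mode => 'SCHEMA');
  DBMS_DATAPUMP.ADD_FILE(handle => h, filename => 'maggie.dmp',
                         directory => 'DPUMP_DIR');
  DBMS_DATAPUMP.METADATA_REMAP(handle => h, name => 'REMAP_SCHEMA',
                               old_value => 'OLD_USER', value => 'NEW_USER');
  -- Omitting object_type applies the tablespace remap to every object type:
  DBMS_DATAPUMP.METADATA_REMAP(handle => h, name => 'REMAP_TABLESPACE',
                               old_value => 'OLD_TBS', value => 'NEW_TBS');
  DBMS_DATAPUMP.START_JOB(h);
  DBMS_DATAPUMP.WAIT_FOR_JOB(handle => h, job_state => job_state);
  DBMS_OUTPUT.PUT_LINE('Job finished: ' || job_state);
END;
/
```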
Export / import tablespace with all objects (datas, users, roles)
Hi, i have a problem or question to the topic export / import tablespace.
On the one hand, i have a database 10g (A) and on the other hand, a database 11g (B).
On A there is a tablespace called PRO.
Furthermore 3 Users:
PRO_Main - contains the data - Tablespace PRO
PRO_Users1 with a role PRO_UROLE - Tablespace PRO
PRO_Users2 with a role PRO_UROLE - Tablespace PRO
Now, I want to transfer the whole tablespace PRO (including the users PRO_MAIN, PRO_USER1, PRO_USER2 and the role PRO_UROLE) from A to B.
On B, I 've created the user PRO_Main and the tablespace PRO.
On A , i execute following statement:
expdp PRO_Main/XXX TABLESPACES=PRO DIRECTORY=backup_datapump DUMPFILE=TSpro.dmp LOGFILE=TSpro.log
On B:
impdp PRO_Main/XXX TABLESPACES=PRO DIRECTORY=backup_datapump DUMPFILE=TSpro.dmp LOGFILE=TSpro.log
Result:
The user PRO_Main was imported with all its data.
But I'm missing PRO_USER1, PRO_USER2 and the role PRO_UROLE...
I assume I've used wrong parameters in my expdp and/or impdp.
It would be nice, if anybody can give me a hint.
Thanks in advance.
Best Regards,
Frank
When you do a TABLESPACE mode export by specifying just the tablespaces, then all that gets exported are the tables and their dependent objects. The users, roles, and the tablespace definitions themselves don't get exported.
When you do a SCHEMA mode export by specifying the schemas, you will get the schema definitions (if the schema running the export is privied) and all of the objects that the schema owns. The schema does not own roles or tablespace definitions.
In your case, you want to move:
1. schemas - you have already created one on your target database
2. roles
3. everything in the tablespaces, owned by multiple schemas.
There is no single export/import command that will do all of this. This is how I would do it:
1 - move the schema definitions
a. you can either create these manually or
b1. expdp schemas=<your list of schemas> include=user
b2 impdp the results from b1.
2. move the roles
expdp full=y include=role ...
remember, this will include all roles. If you want to limit what gets exported, then use:
include=role:"IN ('ROLE1', 'ROLE2', etc.)"
impdp the roles just exported
3. move the user information
a. If you want to move all of the schema's objects like functions, packages, etc, then you need to use a schema mode
export
expdp user/password schemas=a,b,c ...
b. If you want to move only the objects in those tablespaces, then use the tablespace export
expdp user/password tablespaces=tbs1, tbs2, ...
c. import the dumpfile generated in step 3
impdp user/password ...
Hope this helps.
Dean -
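The three steps above can be sketched end to end with the names from the question (the directory object is assumed to exist on both sides; dump file names are placeholders):

```shell
# 1. User definitions only:
expdp system schemas=PRO_MAIN,PRO_USER1,PRO_USER2 include=USER \
      directory=backup_datapump dumpfile=users.dmp
# 2. Roles; put the filter in a parfile (roles.par) to sidestep shell quoting:
#      full=y
#      include=ROLE:"IN ('PRO_UROLE')"
expdp system parfile=roles.par directory=backup_datapump dumpfile=roles.dmp
# 3. The schemas' objects (tables, code, grants):
expdp system schemas=PRO_MAIN,PRO_USER1,PRO_USER2 \
      directory=backup_datapump dumpfile=pro_objects.dmp
# Then run the matching impdp for each dump file on B, in the same order.
```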
DB Upgrade+Migrate from 10.1 to 11.2 using IMPDP with network_link param
Dear all,
I would like to upgrade and migrate my 10.1.0.5 DB on old server to 11.2.0.2 on new server.
Here is the background info:
Old server:
OS : Redhat Linux AS 2.1
DB Version : 10.1.0.5 (32 bit)
No RAC
New Server:
OS : OEL 5.5
DB Version : 11.2.0.2
RAC
ASM
What I have done so far:
1. Create new clustered 11Gr2 DB on new server.
2. Pre-create tablespaces on new DB.
3. Migrate 10.1.0.5 to 11.2.0.2 using IMPDP.
impdp system/******* DIRECTORY=dump_file_dir NETWORK_LINK=DWH_DBLINK LOGFILE=log_file_dir:DWH_import_20110621.log FULL=Y SERVICE_NAME=dwhdb.xxx.xxx TABLE_EXISTS_ACTION=replace cluster=N exclude=statistics
4. After IMPDP complete, invalid objects and components are found, run utlrp.sql no help
SQL> select owner, count(*) from dba_objects where status = 'INVALID' group by owner;
OWNER COUNT(*)
WKSYS 16
PUBLIC 12
OLAPSYS 7
ODM 21
SYS 2
WMSYS 11
12 rows selected.
SQL> select comp_name, status, version from dba_registry;
COMP_NAME STATUS VERSION
OWB VALID 11.2.0.2.0
Oracle Application Express VALID 3.2.1.00.12
Oracle Enterprise Manager VALID 11.2.0.2.0
OLAP Catalog VALID 11.2.0.2.0
Spatial VALID 11.2.0.2.0
Oracle Multimedia VALID 11.2.0.2.0
Oracle XML Database VALID 11.2.0.2.0
Oracle Text VALID 11.2.0.2.0
Oracle Expression Filter VALID 11.2.0.2.0
Oracle Rules Manager VALID 11.2.0.2.0
Oracle Workspace Manager INVALID 11.2.0.2.0
Oracle Database Catalog Views VALID 11.2.0.2.0
Oracle Database Packages and Types INVALID 11.2.0.2.0
JServer JAVA Virtual Machine VALID 11.2.0.2.0
Oracle XDK VALID 11.2.0.2.0
Oracle Database Java Packages VALID 11.2.0.2.0
OLAP Analytic Workspace VALID 11.2.0.2.0
Oracle OLAP API VALID 11.2.0.2.0
Oracle Real Application Clusters VALID 11.2.0.2.0
19 rows selected.
5. Check SYS's invalid objects, e.g. DBA_OUTLINE_HINTS, after tracing the reason, find outln.ol$hints is replaced by 10.1.0.5 version. I think it is due to the IMPDP's "TABLE_EXISTS_ACTION=replace" parameter.
Others invalid objects like WMSYS.WM$ENV_VARS, also replaced by old version.
What should I do now?
Do I need to run upgrade script after DB migration using IMPDP?
Is the migration procedure correct?
Please advise
Thanks in advance
Hi,
It looks like you've messed up the Oracle-supplied (default) users' data and/or metadata, i.e. SYS, SYSTEM, WKSYS.
How many non-default (application) users do you have? And how big is the database? If you're using this method, I'm assuming the data is not really big.
I personally would not do a full export/import across different versions; it's better to export just the non-default (application) schemas, as you might end up messing up the SYS objects, etc.
What you might do is
- drop the 11g database and start from the beginning again, but exclude the Oracle default users, e.g. sys, system, etc.
- or try running the catupgrd.sql script (this will drop and recreate the objects); this may or may not fix your invalid components
Cheers -
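As a sketch of the first suggestion (the parameter names are real; the schema list is purely illustrative): import only the application schemas over the link instead of FULL=Y, so the dictionary-owned objects on the 11.2 side are never touched.

```shell
# imp_app.par - check dba_users on the source for the real application list:
#   directory=dump_file_dir
#   network_link=DWH_DBLINK
#   schemas=DWH_APP,DWH_STAGE
#   logfile=log_file_dir:app_imp.log
impdp system parfile=imp_app.par
```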
Unable to restore TABLESPACE using RMAN backups
Hi,
I am not able to restore tablespace using RMAN (TSPITR).
I have full backup of database,
While try to restore it's failed.
RMAN> recover tablespace TEST1 until logseq 706 auxiliary destination '/tmp';
Starting recover at 16-OCT-13
using channel ORA_DISK_1
RMAN-05026: WARNING: presuming following set of tablespaces applies to specified point-in-time
List of tablespaces expected to have UNDO segments
Tablespace SYSTEM
Tablespace UNDOTBS2
Creating automatic instance, with SID='CsFz'
initialization parameters used for automatic instance:
db_name=TRAINEE
db_unique_name=CsFz_tspitr_TRAINEE
compatible=11.2.0.0.0
db_block_size=8192
db_files=200
sga_target=280M
processes=50
db_create_file_dest=/tmp
log_archive_dest_1='location=/tmp'
#No auxiliary parameter file used
starting up automatic instance TRAINEE
Oracle instance started
Total System Global Area 292933632 bytes
Fixed Size 1336092 bytes
Variable Size 100666596 bytes
Database Buffers 184549376 bytes
Redo Buffers 6381568 bytes
Automatic instance created
List of tablespaces that have been dropped from the target database:
Tablespace TEST1
contents of Memory Script:
# set requested point in time
set until logseq 706 thread 1;
# restore the controlfile
restore clone controlfile;
# mount the controlfile
sql clone 'alter database mount clone database';
# archive current online log
sql 'alter system archive log current';
# avoid unnecessary autobackups for structural changes during TSPITR
sql 'begin dbms_backup_restore.AutoBackupFlag(FALSE); end;';
executing Memory Script
executing command: SET until clause
Starting restore at 16-OCT-13
allocated channel: ORA_AUX_DISK_1
channel ORA_AUX_DISK_1: SID=81 device type=DISK
channel ORA_AUX_DISK_1: starting datafile backup set restore
channel ORA_AUX_DISK_1: restoring control file
channel ORA_AUX_DISK_1: reading from backup piece /oracle/product/11.2.0/dbhome_1/dbs/c-332232391-20131016-09
channel ORA_AUX_DISK_1: piece handle=/oracle/product/11.2.0/dbhome_1/dbs/c-332232391-20131016-09 tag=TAG20131016T144951
channel ORA_AUX_DISK_1: restored backup piece 1
channel ORA_AUX_DISK_1: restore complete, elapsed time: 00:00:01
output file name=/tmp/TRAINEE/controlfile/o1_mf_95wbkpvj_.ctl
Finished restore at 16-OCT-13
sql statement: alter database mount clone database
sql statement: alter system archive log current
sql statement: begin dbms_backup_restore.AutoBackupFlag(FALSE); end;
contents of Memory Script:
# set requested point in time
set until logseq 706 thread 1;
# set destinations for recovery set and auxiliary set datafiles
set newname for clone datafile 1 to new;
set newname for clone datafile 7 to new;
set newname for clone datafile 2 to new;
set newname for clone tempfile 1 to new;
set newname for datafile 6 to
"/oracle/oradata/TRAINEE/datafile/o1_mf_test1_95w9fln9_.dbf";
# switch all tempfiles
switch clone tempfile all;
# restore the tablespaces in the recovery set and the auxiliary set
restore clone datafile 1, 7, 2, 6;
switch clone datafile all;
executing Memory Script
executing command: SET until clause
executing command: SET NEWNAME
executing command: SET NEWNAME
executing command: SET NEWNAME
executing command: SET NEWNAME
executing command: SET NEWNAME
renamed tempfile 1 to /tmp/TRAINEE/datafile/o1_mf_temp_%u_.tmp in control file
Starting restore at 16-OCT-13
using channel ORA_AUX_DISK_1
channel ORA_AUX_DISK_1: starting datafile backup set restore
channel ORA_AUX_DISK_1: specifying datafile(s) to restore from backup set
channel ORA_AUX_DISK_1: restoring datafile 00001 to /tmp/TRAINEE/datafile/o1_mf_system_%u_.dbf
channel ORA_AUX_DISK_1: restoring datafile 00007 to /tmp/TRAINEE/datafile/o1_mf_undotbs2_%u_.dbf
channel ORA_AUX_DISK_1: restoring datafile 00002 to /tmp/TRAINEE/datafile/o1_mf_sysaux_%u_.dbf
channel ORA_AUX_DISK_1: restoring datafile 00006 to /oracle/oradata/TRAINEE/datafile/o1_mf_test1_95w9fln9_.dbf
channel ORA_AUX_DISK_1: reading from backup piece /tmp/1iomi9rv_1_1
channel ORA_AUX_DISK_1: piece handle=/tmp/1iomi9rv_1_1 tag=TAG20131016T144935
channel ORA_AUX_DISK_1: restored backup piece 1
channel ORA_AUX_DISK_1: restore complete, elapsed time: 00:00:15
Finished restore at 16-OCT-13
datafile 1 switched to datafile copy
input datafile copy RECID=11 STAMP=828975325 file name=/tmp/TRAINEE/datafile/o1_mf_system_95wbkybb_.dbf
datafile 7 switched to datafile copy
input datafile copy RECID=12 STAMP=828975325 file name=/tmp/TRAINEE/datafile/o1_mf_undotbs2_95wbkycy_.dbf
datafile 2 switched to datafile copy
input datafile copy RECID=13 STAMP=828975325 file name=/tmp/TRAINEE/datafile/o1_mf_sysaux_95wbkybz_.dbf
contents of Memory Script:
# set requested point in time
set until logseq 706 thread 1;
# online the datafiles restored or switched
sql clone "alter database datafile 1 online";
sql clone "alter database datafile 7 online";
sql clone "alter database datafile 2 online";
sql clone "alter database datafile 6 online";
# recover and open resetlogs
recover clone database tablespace "TEST1", "SYSTEM", "UNDOTBS2", "SYSAUX" delete archivelog;
alter clone database open resetlogs;
executing Memory Script
executing command: SET until clause
sql statement: alter database datafile 1 online
sql statement: alter database datafile 7 online
sql statement: alter database datafile 2 online
sql statement: alter database datafile 6 online
Starting recover at 16-OCT-13
using channel ORA_AUX_DISK_1
starting media recovery
archived log for thread 1 with sequence 702 is already on disk as file /oracle/product/11.2.0/dbhome_1/dbs/arch1_702_807275402.dbf
archived log for thread 1 with sequence 703 is already on disk as file /oracle/product/11.2.0/dbhome_1/dbs/arch1_703_807275402.dbf
archived log for thread 1 with sequence 704 is already on disk as file /oracle/product/11.2.0/dbhome_1/dbs/arch1_704_807275402.dbf
archived log for thread 1 with sequence 705 is already on disk as file /oracle/product/11.2.0/dbhome_1/dbs/arch1_705_807275402.dbf
archived log file name=/oracle/product/11.2.0/dbhome_1/dbs/arch1_702_807275402.dbf thread=1 sequence=702
archived log file name=/oracle/product/11.2.0/dbhome_1/dbs/arch1_703_807275402.dbf thread=1 sequence=703
archived log file name=/oracle/product/11.2.0/dbhome_1/dbs/arch1_704_807275402.dbf thread=1 sequence=704
archived log file name=/oracle/product/11.2.0/dbhome_1/dbs/arch1_705_807275402.dbf thread=1 sequence=705
media recovery complete, elapsed time: 00:00:01
Finished recover at 16-OCT-13
database opened
contents of Memory Script:
# make read only the tablespace that will be exported
sql clone 'alter tablespace TEST1 read only';
# create directory for datapump import
sql "create or replace directory TSPITR_DIROBJ_DPDIR as ''
/tmp''";
# create directory for datapump export
sql clone "create or replace directory TSPITR_DIROBJ_DPDIR as ''
/tmp''";
executing Memory Script
sql statement: alter tablespace TEST1 read only
sql statement: create or replace directory TSPITR_DIROBJ_DPDIR as ''/tmp''
sql statement: create or replace directory TSPITR_DIROBJ_DPDIR as ''/tmp''
Performing export of metadata...
EXPDP> Starting "SYS"."TSPITR_EXP_CsFz":
EXPDP> Processing object type TRANSPORTABLE_EXPORT/PLUGTS_BLK
EXPDP> Processing object type TRANSPORTABLE_EXPORT/TABLE
EXPDP> Processing object type TRANSPORTABLE_EXPORT/POST_INSTANCE/PLUGTS_BLK
EXPDP> Master table "SYS"."TSPITR_EXP_CsFz" successfully loaded/unloaded
EXPDP> ******************************************************************************
EXPDP> Dump file set for SYS.TSPITR_EXP_CsFz is:
EXPDP> /tmp/tspitr_CsFz_17454.dmp
EXPDP> ******************************************************************************
EXPDP> Datafiles required for transportable tablespace TEST1:
EXPDP> /tmp/TRAINEE/datafile/o1_mf_test1_95wbkyck_.dbf
EXPDP> Job "SYS"."TSPITR_EXP_CsFz" successfully completed at 14:56:02
Export completed
contents of Memory Script:
# shutdown clone before import
shutdown clone immediate
executing Memory Script
database closed
database dismounted
Oracle instance shut down
Performing import of metadata...
IMPDP> Master table "SYS"."TSPITR_IMP_CsFz" successfully loaded/unloaded
IMPDP> Starting "SYS"."TSPITR_IMP_CsFz":
IMPDP> Processing object type TRANSPORTABLE_EXPORT/PLUGTS_BLK
Removing automatic instance
Automatic instance removed
auxiliary instance file /tmp/TRAINEE/datafile/o1_mf_temp_95wblk08_.tmp deleted
auxiliary instance file /tmp/TRAINEE/onlinelog/o1_mf_3_95wblj14_.log deleted
auxiliary instance file /tmp/TRAINEE/onlinelog/o1_mf_2_95wblhn8_.log deleted
auxiliary instance file /tmp/TRAINEE/onlinelog/o1_mf_1_95wblh8q_.log deleted
auxiliary instance file /tmp/TRAINEE/datafile/o1_mf_sysaux_95wbkybz_.dbf deleted
auxiliary instance file /tmp/TRAINEE/datafile/o1_mf_undotbs2_95wbkycy_.dbf deleted
auxiliary instance file /tmp/TRAINEE/datafile/o1_mf_system_95wbkybb_.dbf deleted
auxiliary instance file /tmp/TRAINEE/controlfile/o1_mf_95wbkpvj_.ctl deleted
RMAN-00571: ===========================================================
RMAN-00569: =============== ERROR MESSAGE STACK FOLLOWS ===============
RMAN-00571: ===========================================================
RMAN-03002: failure of recover command at 10/16/2013 14:56:16
RMAN-06963: Error received during import of metadata
RMAN-06961: IMPDP> ORA-39123: Data Pump transportable tablespace job aborted
ORA-01565: error in identifying file '/oracle/oradata/TRAINEE/datafile/o1_mf_test1_95w9fln9_.dbf'
ORA-27037: unable to obtain file status
Linux Error: 2: No such file or directory
Additional information: 3
Hi,
Could you please check the link below:
ORACLE Cookies: TSPITR to recover a dropped tablespace
Thank you -
Memory fault while using impdp
I'm unable to import a database using impdp; as soon as I start the impdp command it errors out with a memory fault.
I was able to previously import 1 TB out of a total of 3 TB, and then got an error related to the new MEMORY_TARGET parameter, complaining about insufficient tmpfs.
I resolved that error, but since then I can't invoke impdp without the memory fault error described below.
$ nohup impdp "'/ as sysdba'" JOB_NAME=LANCE_FULL_01 parfile=auexpdp.dat &
[1] 32118
$ nohup: appending output to `nohup.out'
[1] + Memory fault nohup impdp "'/ as sysdba'" JOB_NAME=LANCE_FULL_01 parfile=auexpdp.dat &
$
I also get the following error in the import log
ORA-39012: Client detached before the job started.
This is what my parfile looks like
directory=dmpdir
dumpfile=aexp1_%U.dmp,aexp2_%U.dmp,aexp3_%U.dmp,aexp4_%U.dmp
parallel=8
full=y
exclude=SCHEMA:"='MDDATA'"
exclude=SCHEMA:"='OLAPSYS'"
exclude=SCHEMA:"='ORDSYS'"
exclude=SCHEMA:"='DMSYS'"
exclude=SCHEMA:"='OUTLN'"
exclude=SCHEMA:"='ORDPLUGINS'"
include=tablespace
#transform=oid:n
logfile=expdpapps.log
trace=1FF0300
Has anybody seen this type of error before?
Thanks
Please post details of OS and database versions.
>
$ nohup impdp "'/ as sysdba'" JOB_NAME=LANCE_FULL_01 parfile=auexpdp.dat &
>
You should not use '/ as sysdba' for expdp/impdp - use SYSTEM account instead - see the first NOTE sections in these links
http://docs.oracle.com/cd/E11882_01/server.112/e22490/dp_export.htm#i1012781
http://docs.oracle.com/cd/E11882_01/server.112/e22490/dp_import.htm#i1012504
HTH
Srini -
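The same invocation with the suggested change applied (the password is a placeholder):

```shell
nohup impdp system/manager JOB_NAME=LANCE_FULL_01 parfile=auexpdp.dat &
```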
Import tablespace failed due to different release version for 10g
I tried to export tablespaces from 10.2.0.2 (Enterprise Edition, UNIX) to our test server at 10.2.0.1 (Express Edition, Linux) with transportable tablespaces. When importing the tablespaces, I get the following error:
Export file created by EXPORT:V10.02.01 via conventional path
About to import transportable tablespace(s) metadata...
import done in AL32UTF8 character set and AL16UTF16 NCHAR character set
import server uses WE8MSWIN1252 character set (possible charset conversion)
export client uses US7ASCII character set (possible charset conversion)
export server uses UTF8 NCHAR character set (possible ncharset conversion)
. importing SYS's objects into SYS
. importing SYS's objects into SYS
IMP-00017: following statement failed with ORACLE error 721:
"BEGIN sys.dbms_plugts.checkCompType('COMPATSG','10.2.0.2.0'); END;"
IMP-00003: ORACLE error 721 encountered
ORA-00721: changes by release 10.2.0.2.0 cannot be used by release 10.2.0.1.0
ORA-06512: at "SYS.DBMS_PLUGTS", line 2004
ORA-06512: at line 1
IMP-00000: Import terminated unsuccessfully
I have done transportable tablespace transfers between 2 test servers before, both on 10.2.0.1, and they completed successfully on the same target server (Linux, XE edition). So this is the first time I have run into this problem. Does anyone know whether Express Edition can be upgraded or patched so that it is on 10.2.0.2? Or is there anything I can try to import the tablespace from 10.2.0.2 into 10.2.0.1?
Thanks,
Hongmei
Since there is no patchset for XE edition, I tried importing the tablespace into another Linux server with EE edition, which is on 10.2.0.2, and I thought I should be OK this time. But I get the same error again. I have applied the patch set for 10.2.0.2 and followed all the post-install steps. Am I missing something? Why does it still mention 10.2.0.1? Thanks for your help. I attached the output here:
Import: Release 10.2.0.2.0 - Production on Mon Jul 21 09:20:57 2008
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.2.0 - Production
With the Partitioning, OLAP and Data Mining options
IMP-00002: failed to open /tmp/3tbsp_exp.dmp for read
Import file: expdat.dmp > /tmp/3tbsps_exp.dmp
Export file created by EXPORT:V10.02.01 via conventional path
About to import transportable tablespace(s) metadata...
import done in US7ASCII character set and AL16UTF16 NCHAR character set
import server uses WE8ISO8859P1 character set (possible charset conversion)
export server uses UTF8 NCHAR character set (possible ncharset conversion)
. importing SYS's objects into SYS
. importing SYS's objects into SYS
IMP-00017: following statement failed with ORACLE error 721:
"BEGIN sys.dbms_plugts.checkCompType('COMPATSG','10.2.0.2.0'); END;"
IMP-00003: ORACLE error 721 encountered
ORA-00721: changes by release 10.2.0.2.0 cannot be used by release 10.2.0.1.0
ORA-06512: at "SYS.DBMS_PLUGTS", line 2004
ORA-06512: at line 1
IMP-00000: Import terminated unsuccessfully
Here is the imp -help output (top lines); it looks like the 10.2.0.2 version:
$ imp -help
Import: Release 10.2.0.2.0 - Production on Mon Jul 21 09:58:56 2008
Copyright (c) 1982, 2005, Oracle. All rights reserved. -
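Since ORA-00721 is about the compatibility setting of the databases rather than the installed binaries, one first check, assuming SQL*Plus DBA access on both sides (the credentials and connect string below are placeholders), is to compare COMPATIBLE on source and target:

```shell
# Placeholder credentials; run on BOTH the source and the target database
# and compare the COMPATIBLE values reported.
sqlplus -s system/manager@orcl <<'EOF'
SHOW PARAMETER compatible
SELECT banner FROM v$version WHERE ROWNUM = 1;
EOF
```

If the target still reports COMPATIBLE=10.2.0.1.0 despite the 10.2.0.2 binaries, that would explain why the plug-in check in DBMS_PLUGTS keeps complaining about release 10.2.0.1.0.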
Getting Errors while using IMPDP command in UNIX
Hi All,
My work involves refreshing a DB with the latest data, and for this I use the IMPDP command to import the data. But every time during the refresh I get errors like "ORACLE Generated Errors: ORACLE Not available".
I don't know the reason for this error, and on top of that my DB goes down after it. I then start the DB up again and try my luck at completing the process.
So please tell me what problems might actually cause IMPDP to fail.
Following one piece of advice, I stopped my listener and ran the import; interestingly, this time IMPDP worked without errors. But even this does not always work.
But I feel that stopping and starting the listener every time is not the correct solution.
so please help me in this regard.
Regards
Naveen R.
The error I am getting is:
UDI-03113: operation generated ORACLE error 3113
ORA-03113: end-of-file on communication channel
Process ID: 6208
Session ID: 346 Serial number: 298
UDI-03114: operation generated ORACLE error 3114
ORA-03114: not connected to ORACLE
[user1@node1 cpt]$
DB version is 11.1.0.6.0
OS is Red Hat Linux
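ORA-03113/ORA-03114 only tell you that the server process died mid-job; the real cause is usually recorded in the alert log rather than in the impdp output. A sketch of where to look on 11g (the paths shown are typical ADR defaults and depend on diagnostic_dest, the DB name, and the instance name, so they are placeholders, not values from this thread):

```shell
# The 11g alert log lives under the ADR home; adjust the path for your
# diagnostic_dest, database name, and instance name.
tail -100 /u01/app/oracle/diag/rdbms/orcl/orcl/trace/alert_orcl.log

# adrci can find and show the alert log without guessing the path:
adrci exec="show alert -tail 100"
```

Whatever error appears there just before the ORA-03113 (an ORA-00600/ORA-07445, an out-of-memory condition, etc.) is the thing to troubleshoot; the listener being up or down is most likely coincidental.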