Import a table using impdp
Hi,
I have an export of a table named "abc". I need to import the table "abc" into another database, but with a different name, like "abc_temp". Is that possible?
I cannot import it as "abc" and then rename it to "abc_temp".
Thanks
Hi,
I have an export of a table named "abc" in the "xyz" schema. I need to import "abc" into another database, but the "xyz" schema in that database already has a table named "abc". I need to import "abc" as "abc_temp" so that the original "abc" is not disturbed.
Thanks
Well, Oracle 11g has the REMAP_TABLE parameter, which lets you do exactly what you need.
Unfortunately, Oracle 10g does not support it. Bummer.
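On 11g, a minimal sketch of that approach (the dump file name, directory object, and credentials here are hypothetical):

```shell
# Import table ABC from the dump, but create it as ABC_TEMP in the same schema.
# DP_DIR must be an existing directory object the importing user can read from.
impdp xyz/xyz_password@targetdb \
  DIRECTORY=DP_DIR \
  DUMPFILE=abc.dmp \
  TABLES=xyz.abc \
  REMAP_TABLE=xyz.abc:abc_temp
```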
Similar Messages
-
What are the important standard table use in FI/CO module?
What are the important standard table use in FI/CO module?
Moderator Message: Please avoid asking such queries.
Edited by: kishan P on Jan 24, 2012 12:37 PM
Hi Sanj,
Please go through the available information first, and then ask if you still have any queries.
[Google|http://www.google.co.in/search?sourceid=chrome&ie=UTF-8&q=importanttablesinficoin+sap]
Regards,
Madhu. -
How do I rename the table using Impdp?
Using the Impdp/Expdp utility of Oracle, I am trying to import a table. However the target database contains a table with the same name. How do I rename the table using Import?
Regards,
Sakthi
SAKTHIVEL wrote:
Using the Impdp/Expdp utility of Oracle, I am trying to import a table. However the target database contains a table with the same name. How do I rename the table using Import?
Hmm, well, why can't you just rename the existing table on the target database with the RENAME ... TO command and then do the import? That would be easy, right?
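A sketch of that rename-first workaround (object names are hypothetical); this keeps the original data while letting the import create the table fresh:

```sql
-- Move the existing table aside before the import.
ALTER TABLE abc RENAME TO abc_keep;
-- ... run the import here; it recreates ABC from the dump ...
-- Then swap names so the imported copy becomes ABC_TEMP
-- and the original gets its name back.
ALTER TABLE abc RENAME TO abc_temp;
ALTER TABLE abc_keep RENAME TO abc;
```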
Aman.... -
Import global temporary table using impdp
Hi all,
I have encountered the following situation:
I have exported (using expdp) a schema that had two global temporary tables (GTTs): one stored in a temp tablespace I created, and one that showed no tablespace at all when viewed from PL/SQL Developer. When I imported the dump into another schema, I used REMAP_TABLESPACE to remap my custom temp tablespace to the target's custom temp tablespace. Now, the strange (for me) part: the GTT created in my custom temp tablespace was not imported, but the other one was.
Has someone else encountered this situation and knows why this is happening and what could be the solution?
Thank you!
Mircea
The Oracle version is 11g Release 2 (11.2.0.2.0).
The export and import commands are in scripts:
expdp ' || :schema_owner || '/' || :schema_owner_password || '@' || :db_name || ' DIRECTORY="'|| :dump_dir ||'" dumpfile="' ||:dump_file||'" LOGFILE=' ||:dump_log ;
impdp ''' || :schema_owner || '/' || :schema_owner_password || '@' || :db_name || ''' DIRECTORY='|| :dump_dir ||' DUMPFILE=' ||:dump_file ||' LOGFILE=' ||:dump_log||' REMAP_SCHEMA=' || :from_user || ':' || :schema_owner || ' REMAP_TABLESPACE=' || :from_user || '_MTD:' || :schema_owner || '_MTD,' || :from_user || '_DTS:' || :schema_owner || '_DTS,' || :from_user || '_IND:' || :schema_owner || '_IND,' || :from_user || '_TMP:' || :schema_owner || '_TMP TABLE_EXISTS_ACTION=APPEND TRANSFORM=oid:n %content%
where :from_user is the source schema and :schema_owner is the target schema.
No error was returned.
Thank you! -
Issue while importing the dump using impdp
**command:**
When I try to import the dump with the command below, I run into the following issue:
impdp pdol2/pdol2@DOLPHDB directory=NEW_IMPORT_DIR table_exists_action=TRUNCATE EXCLUDE=statistics dumpfile=exportTDOLPHIN2_30062011_prebatch_%u.dmp remap_schema=dolphin2:pdol2 logfile=exportTDOLPHIN2_30062011_pdol2.log parallel=1 transform=oid:n
Import: Release 11.2.0.2.0 - Production on Wed Dec 14 18:29:37 2011
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit
Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
ORA-31626: job does not exist
ORA-31637: cannot create job SYS_IMPORT_FULL_01 for user PDOL2
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPV$FT_INT", line 798
ORA-39080: failed to create queues "KUPC$C_1_20111214182937" and "" for Data Pump job
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPC$QUE_INT", line 1530
ORA-04063: package body "SYS.DBMS_AQADM_SYS" has errors
ORA-06508: PL/SQL: could not find program unit being called: "SYS.DBMS_AQADM_SYS"
Thanks in Advance
Sivakarthick
Please post details of your OS version.
Please see if these MOS Docs can help:
563701.1 - Data Pump Job Fails With ORA-31626 ORA-31637 ORA-39080 ORA-04063 And ORA-06508
345198.1 - Impdp or Expdp Fails With ORA-31626 and ORA-31637
727804.1 - ORA-39080 Failed to Create Queues During Expdp or Impdp
372961.1 - Ora-00832 No Streams Pool Created while running datapump
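Beyond the notes above, a common first check (my suggestion, not from the MOS documents) is to look for invalid SYS objects and recompile them with the standard utlrp.sql script:

```sql
-- Run as SYSDBA. List invalid SYS objects such as DBMS_AQADM_SYS ...
SELECT object_name, object_type
  FROM dba_objects
 WHERE owner = 'SYS' AND status = 'INVALID';

-- ... then recompile everything; @? expands to ORACLE_HOME in SQL*Plus.
@?/rdbms/admin/utlrp.sql
```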
HTH
Srini -
When Importing XML table using "merge content" I get content but none of the formatting
I'm just wondering, is this by design?
I generated this by exporting a table.
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<Root>
<Story>
<Table tblName="tsBasicTable" xmlns:aid="http://ns.adobe.com/AdobeInDesign/4.0/" xmlns:aid5="http://ns.adobe.com/AdobeInDesign/5.0/" aid5:tablestyle="tsBasicBody" aid:table="table" aid:trows="7" aid:tcols="9">
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="24.0"/>
<Cell aid:table="cell" aid:crows="1" aid:ccols="2">Alcohol</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="2">Tobacco</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="2">Marijuana</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="2">Prescription drugs</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="24.0">Grade</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">Town 2013</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">State 2013</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">Town 2013</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">State 2013</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">Town 2013</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">State 2013</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">Town 2013</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">State 2013</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="24.0">6</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">9.0</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">10.2</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">4.2</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">2.4</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">9.6</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">7.2</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">8.4</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">11.4</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="24.0">8</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">12.0</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">15.2</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">15.2</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">12.8</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">13.6</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">15.2</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">4.8</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">10.4</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="24.0">10</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">9.0</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">17.0</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">46.0</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">44.0</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">5.0</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">2.0</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">38.0</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">50.0</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="24.0">12</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">12.0</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">19.2</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">14.4</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">14.4</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">16.8</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">32.4</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">19.2</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">22.8</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="24.0">All</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">10.5</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">15.4</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">20.0</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">18.4</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">11.3</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">14.2</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">17.6</Cell>
<Cell aid:table="cell" aid:crows="1" aid:ccols="1" aid:ccolwidth="64.0">23.7</Cell>
</Table>
</Story>
</Root>
Say I take that XML, change all instances of aid:ccolwidth="24.0" to aid:ccolwidth="26.0" and replace
<Cell aid:table="cell" aid:crows="1" aid:ccols="2">Alcohol</Cell>
with
<Cell aid:table="cell" aid:crows="1" aid:ccols="2">Alcoholism</Cell>
I save that XML.
In InDesign: Import XML..., selecting "Merge Content", "only import elements that match existing structure" and "import text elements into tables if tags match".
The cell with "Alcohol" updates, but the column widths don't change.
Is this by design?
When I select "Append Content" I get a (new) table with the formatting (column widths) as specified in the updated aid:ccolwidth and of course, the updated content. When merging content are you giving up all rights to updating formatting at the same time?
Thanks in advance for any insights.
Hello,
The existence check on the Salutation descriptor is likely "check cache", as that is the default. Because of this, when you merge the Salutation with id=2, it checks the cache, does not find it, and so registers it as a new object instead. You have two options. The first is to read the object instead of creating it new. The second is to change your existence-checking option so that it will either go to the database or assume existence, whichever is more appropriate for how you intend to use these types of objects in your mappings. For instance, if you never plan to create new ones, assume-existence might be more appropriate.
I suspect though that for most applications, reading the object first is the best option performance wise.
Best Regards,
Chris -
Hello,
I want to import a database using impdp. Can I import while logged in as system/<password>? On which database should I be logged in? I have a single database, 'orcl'.
Thanks
I wanna import a database using impdp.
Can i import while i'm logged with system/<password> ?
On which database should i be logged? I have a single database, 'orcl'
You don't have to log in to any database to perform the import with Data Pump import.
You just have to set the Oracle environment to point at the database you want to import into.
Suppose you already have an export dump generated with Data Pump export (expdp), and you want to import it into the orcl database:
Set the Oracle environment to orcl.
You must have a directory object created, with read/write access granted to the user performing the export or import (expdp or impdp).
Use impdp help=y at the command line to see the list of available options.
http://download.oracle.com/docs/cd/B13789_01/server.101/b10825/dp_import.htm
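The directory-object setup mentioned above can be sketched as follows (the path, directory name, and user are hypothetical):

```sql
-- Run as a DBA on the target (orcl) database.
CREATE OR REPLACE DIRECTORY dp_dir AS '/u01/app/oracle/dumps';
GRANT READ, WRITE ON DIRECTORY dp_dir TO scott;
-- The importing user can then run, from the OS shell:
--   impdp scott/tiger DIRECTORY=dp_dir DUMPFILE=mydump.dmp LOGFILE=imp.log
```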
-Anantha -
Memory fault while using impdp
I'm unable to import a database using impdp; as soon as I start the impdp cmd it errors out with a memory fault
I was able to previously import 1 TB out of a total of 3 TB, and then got an error related to the new MEMORY_TARGET parameter, complaining about insufficient tmpfs.
I resolved that error, but now I can't invoke impdp at all without the memory fault described below.
$ nohup impdp "'/ as sysdba'" JOB_NAME=LANCE_FULL_01 parfile=auexpdp.dat &
[1] 32118
$ nohup: appending output to `nohup.out'
[1] + Memory fault nohup impdp "'/ as sysdba'" JOB_NAME=LANCE_FULL_01 parfile=auexpdp.dat &
$
I also get the following error in the import log
ORA-39012: Client detached before the job started.
This is what my parfile looks like
directory=dmpdir
dumpfile=aexp1_%U.dmp,aexp2_%U.dmp,aexp3_%U.dmp,aexp4_%U.dmp
parallel=8
full=y
exclude=SCHEMA:"='MDDATA'"
exclude=SCHEMA:"='OLAPSYS'"
exclude=SCHEMA:"='ORDSYS'"
exclude=SCHEMA:"='DMSYS'"
exclude=SCHEMA:"='OUTLN'"
exclude=SCHEMA:"='ORDPLUGINS'"
include=tablespace
#transform=oid:n
logfile=expdpapps.log
trace=1FF0300
Has anybody seen this type of error before?
Thanks
Please post details of your OS and database versions.
>
$ nohup impdp "'/ as sysdba'" JOB_NAME=LANCE_FULL_01 parfile=auexpdp.dat &
>
You should not use '/ as sysdba' for expdp/impdp - use the SYSTEM account instead - see the first NOTE sections in these links:
http://docs.oracle.com/cd/E11882_01/server.112/e22490/dp_export.htm#i1012781
http://docs.oracle.com/cd/E11882_01/server.112/e22490/dp_import.htm#i1012504
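For example, the same invocation authenticating as SYSTEM instead (the password is a placeholder):

```shell
# Same Data Pump job, run as SYSTEM rather than "/ as sysdba".
nohup impdp system/<password> JOB_NAME=LANCE_FULL_01 parfile=auexpdp.dat &
```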
HTH
Srini -
Datapump API: Import all tables in schema
Hi,
how can I import all tables using a wildcard in the datapump-api?
Thanks in advance,
tensai_
tensai_ wrote:
Thanks for the links, but I already know them...
My problem is that I couldn't find an example which shows how to perform an import via the API which imports all tables, but nothing else.
Can someone please help me with a code example?
I'm not sure what you mean by "imports all tables, but nothing else". It could mean that you only want to import the tables, but not the data, and/or not the statistics, etc.
Using the samples provided in the manuals:
DECLARE
ind NUMBER; -- Loop index
h1 NUMBER; -- Data Pump job handle
percent_done NUMBER; -- Percentage of job complete
job_state VARCHAR2(30); -- To keep track of job state
le ku$_LogEntry; -- For WIP and error messages
js ku$_JobStatus; -- The job status from get_status
jd ku$_JobDesc; -- The job description from get_status
sts ku$_Status; -- The status object returned by get_status
spos NUMBER; -- String starting position
slen NUMBER; -- String length for output
BEGIN
-- Create a (user-named) Data Pump job to do a "schema" import
h1 := DBMS_DATAPUMP.OPEN('IMPORT','SCHEMA',NULL,'EXAMPLE8');
-- Specify the single dump file for the job (using the handle just returned)
-- and directory object, which must already be defined and accessible
-- to the user running this procedure. This is the dump file created by
-- the export operation in the first example.
DBMS_DATAPUMP.ADD_FILE(h1,'example1.dmp','DATA_PUMP_DIR');
-- A metadata remap will map all schema objects from one schema to another.
DBMS_DATAPUMP.METADATA_REMAP(h1,'REMAP_SCHEMA','RANDOLF','RANDOLF2');
-- Include and exclude
dbms_datapump.metadata_filter(h1,'INCLUDE_PATH_LIST','''TABLE''');
dbms_datapump.metadata_filter(h1,'EXCLUDE_PATH_EXPR','LIKE ''TABLE/C%''');
dbms_datapump.metadata_filter(h1,'EXCLUDE_PATH_EXPR','LIKE ''TABLE/F%''');
dbms_datapump.metadata_filter(h1,'EXCLUDE_PATH_EXPR','LIKE ''TABLE/G%''');
dbms_datapump.metadata_filter(h1,'EXCLUDE_PATH_EXPR','LIKE ''TABLE/I%''');
dbms_datapump.metadata_filter(h1,'EXCLUDE_PATH_EXPR','LIKE ''TABLE/M%''');
dbms_datapump.metadata_filter(h1,'EXCLUDE_PATH_EXPR','LIKE ''TABLE/P%''');
dbms_datapump.metadata_filter(h1,'EXCLUDE_PATH_EXPR','LIKE ''TABLE/R%''');
dbms_datapump.metadata_filter(h1,'EXCLUDE_PATH_EXPR','LIKE ''TABLE/TR%''');
dbms_datapump.metadata_filter(h1,'EXCLUDE_PATH_EXPR','LIKE ''TABLE/STAT%''');
-- no data please
DBMS_DATAPUMP.DATA_FILTER(h1, 'INCLUDE_ROWS', 0);
-- If a table already exists in the destination schema, skip it (leave
-- the preexisting table alone). This is the default, but it does not hurt
-- to specify it explicitly.
DBMS_DATAPUMP.SET_PARAMETER(h1,'TABLE_EXISTS_ACTION','SKIP');
-- Start the job. An exception is returned if something is not set up properly.
DBMS_DATAPUMP.START_JOB(h1);
-- The import job should now be running. In the following loop, the job is
-- monitored until it completes. In the meantime, progress information is
-- displayed. Note: this is identical to the export example.
percent_done := 0;
job_state := 'UNDEFINED';
while (job_state != 'COMPLETED') and (job_state != 'STOPPED') loop
dbms_datapump.get_status(h1,
dbms_datapump.ku$_status_job_error +
dbms_datapump.ku$_status_job_status +
dbms_datapump.ku$_status_wip,-1,job_state,sts);
js := sts.job_status;
-- If the percentage done changed, display the new value.
if js.percent_done != percent_done
then
dbms_output.put_line('*** Job percent done = ' ||
to_char(js.percent_done));
percent_done := js.percent_done;
end if;
-- If any work-in-progress (WIP) or Error messages were received for the job,
-- display them.
if (bitand(sts.mask,dbms_datapump.ku$_status_wip) != 0)
then
le := sts.wip;
else
if (bitand(sts.mask,dbms_datapump.ku$_status_job_error) != 0)
then
le := sts.error;
else
le := null;
end if;
end if;
if le is not null
then
ind := le.FIRST;
while ind is not null loop
dbms_output.put_line(le(ind).LogText);
ind := le.NEXT(ind);
end loop;
end if;
end loop;
-- Indicate that the job finished and gracefully detach from it.
dbms_output.put_line('Job has completed');
dbms_output.put_line('Final job state = ' || job_state);
dbms_datapump.detach(h1);
exception
when others then
dbms_output.put_line('Exception in Data Pump job');
dbms_datapump.get_status(h1,dbms_datapump.ku$_status_job_error,0,
job_state,sts);
if (bitand(sts.mask,dbms_datapump.ku$_status_job_error) != 0)
then
le := sts.error;
if le is not null
then
ind := le.FIRST;
while ind is not null loop
spos := 1;
slen := length(le(ind).LogText);
if slen > 255
then
slen := 255;
end if;
while slen > 0 loop
dbms_output.put_line(substr(le(ind).LogText,spos,slen));
spos := spos + 255;
slen := length(le(ind).LogText) + 1 - spos;
end loop;
ind := le.NEXT(ind);
end loop;
end if;
end if;
-- dbms_datapump.stop_job(h1);
dbms_datapump.detach(h1);
END;
/
This should import nothing but the tables (excluding the data and the table statistics) from a schema export (including the remapping shown here); you can play around with the EXCLUDE_PATH_EXPR expressions. Check the serveroutput generated for possible values to use in EXCLUDE_PATH_EXPR.
Use the DBMS_DATAPUMP.DATA_FILTER procedure if you want to exclude the data.
For more samples, refer to the documentation:
http://download.oracle.com/docs/cd/B28359_01/server.111/b28319/dp_api.htm#i1006925
Regards,
Randolf
Oracle related stuff blog:
http://oracle-randolf.blogspot.com/
SQLTools++ for Oracle (Open source Oracle GUI for Windows):
http://www.sqltools-plusplus.org:7676/
http://sourceforge.net/projects/sqlt-pp/ -
ORA-02374: conversion error loading table during import using IMPDP
HI All,
We are trying to migrate the data from one database to an other database.
The source database is having character set
SQL> select value from nls_database_parameters where parameter='NLS_CHARACTERSET';
VALUE
US7ASCII
The destination database is having character set
SQL> select value from nls_database_parameters where parameter='NLS_CHARACTERSET';
VALUE
AL32UTF8
We took an export of the whole database using expdp, and when we try to import into the destination database using impdp, we get the following error:
ORA-02374: conversion error loading table <TABLE_NAME>
ORA-12899: value too large for column <COLUMN NAME> (actual: 42, maximum: 40)
ORA-02372: data for row:<COLUMN NAME> : 0X'4944454E5449464943414349E44E204445204C4C414D414441'
Kindly let me know how to overcome this issue in the destination.
Thanks & Regards,
Vikas Krishna
Hi,
You can overcome this issue by increasing the column width in the target database to the maximum size required, so that all data imports successfully into the table.
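A sketch of that fix (table and column names are hypothetical). Since the root cause is single-byte US7ASCII data expanding when converted to multi-byte AL32UTF8, declaring the column with character length semantics is often the cleaner choice:

```sql
-- Widen the column in the target database before re-running the import.
-- 40 CHAR allows 40 characters regardless of bytes per character in AL32UTF8.
ALTER TABLE some_table MODIFY (some_column VARCHAR2(40 CHAR));
-- Alternatively, simply widen the byte limit:
-- ALTER TABLE some_table MODIFY (some_column VARCHAR2(60));
```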
Regards -
Errors importing TTS using impdp
Hello All,
I am seeing the following errors when importing TTS using impdp. Please let me know if you know of any workaround.
Thanks,
Sunny boy
Corp_DEODWPRD$ impdp parfile=impdp_tts.par
Import: Release 11.2.0.3.0 - Production on Mon Jul 30 13:23:50 2012
Copyright (c) 1982, 2011, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning, Automatic Storage Management, OLAP, Data Mining
and Real Application Testing options
Master table "SYS"."SYS_IMPORT_TRANSPORTABLE_02" successfully loaded/unloaded
Starting "SYS"."SYS_IMPORT_TRANSPORTABLE_02": /******** AS SYSDBA parfile=impdp_tts.par
Processing object type TRANSPORTABLE_EXPORT/PLUGTS_BLK
Processing object type TRANSPORTABLE_EXPORT/TABLE
Processing object type TRANSPORTABLE_EXPORT/GRANT/OWNER_GRANT/OBJECT_GRANT
ORA-39126: Worker unexpected fatal error in KUPW$WORKER.LOAD_METADATA [SELECT process_order, flags, xml_clob, NVL(dump_fileid, :1), NVL(dump_position, :2), dump_length, dump_allocation, NVL(value_n, 0), grantor, object_row, object_schema, object_long_name, partition_name, subpartition_name, processing_status, processing_state, base_object_type, base_object_schema, base_object_name, base_process_order, property, size_estimate, in_progress, original_object_schema, original_object_name, creation_level, object_int_oid FROM "SYS"."SYS_IMPORT_TRANSPORTABLE_02" WHERE process_order between :3 AND :4 AND duplicate = 0 AND processing_state NOT IN (:5, :6, :7) ORDER BY process_order]
ORA-39183: internal error -19 ocurred during decompression phase 2
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPW$WORKER", line 9001
----- PL/SQL Call Stack -----
object line object
handle number name
0x5d9562370 20462 package body SYS.KUPW$WORKER
0x5d9562370 9028 package body SYS.KUPW$WORKER
0x5d9562370 4185 package body SYS.KUPW$WORKER
0x5d9562370 9725 package body SYS.KUPW$WORKER
0x5d9562370 1775 package body SYS.KUPW$WORKER
0x59c4fbb90 2 anonymous block
Job "SYS"."SYS_IMPORT_TRANSPORTABLE_02" stopped due to fatal error at 13:52:57
PAR file:
USERID='/ as sysdba'
directory=ODW_MIG_DIR
dumpfile=exp_full_ODW_tts_20120719_01.dmp
logfile=imp_FULL_ODW_TTS_20120730.log
exclude=PROCACT_INSTANCE
TRANSPORT_DATAFILES=(
'+DEODW_DATA_GRP/deodwprd/data/ts_appsds_d_01.dbf',
'+DEODW_DATA_GRP/deodwprd/data/ts_appsds_i_01.dbf',
'+DEODW_DATA_GRP/deodwprd/data/deodw_aud_d_01.dbf',
'+DEODW_DATA_GRP/deodwprd/data/deodw_capone_d01_02.dbf',
'+DEODW_DATA_GRP/deodwprd/data/deodw_capone_d01_03.dbf',
'+DEODW_DATA_GRP/deodwprd/data/deodw_capone_d01_01.dbf',
'+DEODW_DATA_GRP/deodwprd/data/deodw_capone_d_p2012_01.dbf',
'+DEODW_DATA_GRP/deodwprd/data/deodw_capone_i01_01.dbf',
'+DEODW_DATA_GRP/deodwprd/data/deodw_ts_ccr_d01_3.dbf',
'+DEODW_DATA_GRP/deodwprd/data/deodw_ts_ccr_d01_2.dbf',
'+DEODW_DATA_GRP/deodwprd/data/deodw_ts_ccr_d01_1.dbf',
'+DEODW_DATA_GRP/deodwprd/data/deodw_cst_stage_d_01.dbf',
'+DEODW_DATA_GRP/deodwprd/data/deodw_cst_stage_i_01.dbf',
'+DEODW_DATA_GRP/deodwprd/data/deodw_dsdw_doc_stg_d01_01.dbf',
'+DEODW_DATA_GRP/deodwprd/data/deodw_x_etl_d01_02.dbf',
'+DEODW_DATA_GRP/deodwprd/data/deodw_x_report_work_d01_01.dbf',
'+DEODW_DATA_GRP/deodwprd/data/deodw_x_sas_d01_01.dbf')
Still the same error with the SYSTEM account. Working with Oracle on this issue.
Corp_DEODWPRD$ impdp parfile=impdp_tts.par
Import: Release 11.2.0.3.0 - Production on Tue Jul 31 11:47:32 2012
Copyright (c) 1982, 2011, Oracle and/or its affiliates. All rights reserved.
UDI-28002: operation generated ORACLE error 28002
ORA-28002: the password will expire within 6 days
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning, Automatic Storage Management, OLAP, Data Mining
and Real Application Testing options
Master table "SYSTEM"."SYS_IMPORT_TRANSPORTABLE_01" successfully loaded/unloaded
Starting "SYSTEM"."SYS_IMPORT_TRANSPORTABLE_01": system/******** parfile=impdp_tts.par
Processing object type TRANSPORTABLE_EXPORT/PLUGTS_BLK
Processing object type TRANSPORTABLE_EXPORT/TABLE
Processing object type TRANSPORTABLE_EXPORT/GRANT/OWNER_GRANT/OBJECT_GRANT
ORA-39126: Worker unexpected fatal error in KUPW$WORKER.LOAD_METADATA [SELECT process_order, flags, xml_clob, NVL(dump_fileid, :1), NVL(dump_position, :2), dump_length, dump_allocation, NVL(value_n, 0), grantor, object_row, object_schema, object_long_name, partition_name, subpartition_name, processing_status, processing_state, base_object_type, base_object_schema, base_object_name, base_process_order, property, size_estimate, in_progress, original_object_schema, original_object_name, creation_level, object_int_oid FROM "SYSTEM"."SYS_IMPORT_TRANSPORTABLE_01" WHERE process_order between :3 AND :4 AND duplicate = 0 AND processing_state NOT IN (:5, :6, :7) ORDER BY process_order]
ORA-39183: internal error -19 ocurred during decompression phase 2
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPW$WORKER", line 9001
----- PL/SQL Call Stack -----
object line object
handle number name
0x5dac10b98 20462 package body SYS.KUPW$WORKER
0x5dac10b98 9028 package body SYS.KUPW$WORKER
0x5dac10b98 4185 package body SYS.KUPW$WORKER
0x5dac10b98 9725 package body SYS.KUPW$WORKER
0x5dac10b98 1775 package body SYS.KUPW$WORKER
0x59c554178 2 anonymous block
Job "SYSTEM"."SYS_IMPORT_TRANSPORTABLE_01" stopped due to fatal error at 12:13:04
Edited by: Sunny boy on Jul 31, 2012 12:16 PM -
Hi,
Being an experienced Excel user before Power BI, I am just starting to explore the M and Power Query capabilities, and need help already (ain't easy to google this use case somehow):
I need to import a table that sits in an Excel file with its header row in row 17 of the sheet, and some metadata header in the preceding rows of columns A and B.
01: Report name, Quick Report
02: Report Date, 1/1/2014
17: Employee Name, Manager, etc...
18: John Doe, Matt Beaver, etc.
Both (a) a direct attempt to load the Excel file and (b) the indirect way through [From Folder] with a formula in a custom column lead to the same error: "[DataFormat.Error] External table is not in the expected format."
Specifically, I tried to use the [Power Query -> From File -> From Folder] functionality, select an Excel file and add a custom column to access the binary content: [Add Custom Column] with formula "=Excel.Workbook([Content])".
It looks like Power Query expects a rectangular range with headers full-width followed by a contiguous table range to import anything, and refuses to load if that is not the case...
QUESTION: Is there any way to load the data from Excel first, however it is formatted, and then manipulate the overall imported range (like referring to rows starting from the 17th using "Table.SelectRows", etc.) to read the actual data? Reading and using the metadata from the header would be a bonus, but that comes second... The main issue is to get something out of a non-regular Excel file to work with later using M formulae...
Thanks!
SAM
Finally found the answer to this one in ():
You Cannot Open a Password-Protected Workbook
If the Excel workbook is protected by a password, you cannot open it for data access, even by supplying the correct password with your connection settings, unless the workbook file is already open in the Microsoft Excel application. If you try, you receive the following error message:
Could not decrypt file.
ANSWER: So, I will have to either weave in work with temporary unprotected files, or require opening them before updating the data source (although this almost defeats the purpose of automation...).
ANSWER to ORIGINAL QUESTION: password was preventing Power Query from reading the Excel file. For solution see above.
Thanks anyway for participation and inspiration, Imke! -
Exporting and importing table using R3trans program between 2 clients
Hi,
How do I export and import a table between two clients in the same system using the R3trans program?
I need to copy a table from client 020 to client 040 of the same system using R3trans. I need to know the procedure.
Can any one advice
Regards,
Suresh
This is how you do an export and import of table entries.
Export:
Open Notepad and type the following,
export
client = 020
file = 'clone.export.<sid>.<client no>.data'
select * from <client_dependent_tablename1>
select * from <client_dependent_tablename2>
select * from <client_dependent_tablenamen>
Save the file as export.ctl
Run R3trans export.ctl
and the data from these tables will be stored in a file called clone.export.<sid>.<client no>.data in the directory from which you called R3trans
Import:
Open Notepad,
import
client = 040
file = clone.export.<sid>.<client no>.data
buffersync = yes
Save the file as import.ctl
Run R3trans import.ctl
Cheers!
Bidwan
Message was edited by:
Bidwan Baruah -
Exporting and Importing a table in Lookout 6.0.2 using Office 2007
I am exporting a table from Lookout 6.0.2 to Excel 2007 so I can make changes and then import the table, with changes, back into Lookout. I have done this in the past with Excel 2003 with no problems. When I export the table, the file created is an Excel 97-2003 workbook, so after making changes in Excel 2007 I save it as an Excel 97-2003 workbook, because that is what it was initially and that is the only file type Lookout sees when importing. The problem: when I go to import this file, I get an error saying 'cannot read file format'.
Any help would be greatly appreciated.
Thanks!
Jason
Jason Phillips
The Lookout Datatable object can only read worksheet files, but unfortunately Excel 2007 cannot save as a worksheet any more.
I have not found a way to use the Datatable object with Excel 2007 yet.
One workaround is to use ODBC. Refer to this article.
http://digital.ni.com/public.nsf/allkb/C8137C6BDA276982862566BA005C5D1F?OpenDocument
Ryan Shi
National Instruments -
Can we use impdp to import the data from an normal exp dump?
Hi All,
I have an export dump taken from a 9i database. Can I use impdp to import the data into a 10g database from that 9i exp dump?
Please suggest
thanks and Regards
Arun
Hi,
I have a export dump taken from a 9i database. Can i use impdp to import the data into 10g database from that 9i exp dump?
Yes, it can be.
Refer:
http://wiki.oracle.com/thread/3734722/can+a+9i+dump+file+be+imported+into+10%3F
thanks,
X A H E E R