Exclude-tables list option in exp/imp?
Hi,
Is there any option to exclude tables while exp/imp, if so from which version?
Thanks.
Actually, in 10.1 and later, the Data Pump versions of export and import allow you to specify an EXCLUDE parameter to exclude one or more tables.
Justin
Distributed Database Consulting, Inc.
http://www.ddbcinc.com/askDDBC
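For illustration, a Data Pump export that skips specific tables might look like the following sketch. The username, directory object, and table names here are assumptions, not from the thread:

```shell
# Hypothetical Data Pump export of the SCOTT schema that excludes two
# tables by name. DATA_PUMP_DIR must be an existing directory object.
expdp scott/tiger SCHEMAS=scott DIRECTORY=DATA_PUMP_DIR \
      DUMPFILE=scott_partial.dmp LOGFILE=scott_partial.log \
      EXCLUDE=TABLE:\"IN \(\'AUDIT_LOG\', \'TEMP_LOAD\'\)\"
```

The backslashes keep the shell from consuming the quotes; putting EXCLUDE in a parameter file avoids the escaping entirely.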
Similar Messages
-
How to exp/imp tables between DB1 and DB2 when they have different character sets?
On Solaris 2.7, the Oracle 8i DB1 has the NLS_CHARACTERSET
ZHS16CGB231280 and NLS_NCHAR_CHARACTERSET ZHS16CGB231280
character sets.
On another Linux 7.2 system, the Oracle 8i DB2 is installed with the
NLS_NCHAR_CHARACTERSET US7ASCII and NLS_CHARACTERSET US7ASCII
character sets.
The tables in DB1 contain some Chinese characters. I want to exp/imp
tables from DB1 into DB2, but the Chinese does not display correctly
in the SQL Worksheet tool. How should the exp/imp operation be done? Please help me. Thanks.
The supported way to store GB231280-encoded characters is using a ZHS16CGB231280 database, or a database created using a superset of GB231280 such as UTF8. Can you not upgrade your target database from US7ASCII to ZHS16CGB231280?
With US7ASCII and NLS_LANG set to US7ASCII, you are using the garbage in, garbage out (GIGO) approach. This may seem to work, but there are many hidden problems:
1. Invalid SQL string function behaviour - LENGTH(), SUBSTR(), INSTR()
2. Data can be corrupted when it is loaded into another database, e.g. EXP/IMP, dblinks
3. Communication with other clients will generate incorrect results, e.g. other Oracle products - Oracle Text, Forms, Java, HTML, etc.
4. Linguistic sorts are not available
5. Queries using the standard WHERE clause may return incorrect results
6. Extra coding overhead in handling character conversions manually.
I recommend you check out the FAQ and the DB character set migration guide on the Globalization Support forum on OTN.
Nat. -
Exp/imp procedures, functions and packages question
Hi
I've a 9i R2 version Oracle database. I would like to export procedures, functions and packages from a schema. How do I do that?
Is there any script or command lines can provide?
Thanks
Hello user12259190.
You can do an export of the user itself, excluding table data, as in:
H:\>exp
Export: Release 10.2.0.1.0 - Production on Tue Dec 22 11:22:52 2009
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Username: db_user@db_sid
Password:
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
With the Partitioning, Data Mining and Real Application Testing options
Enter array fetch buffer size: 4096 >
Export file: EXPDAT.DMP >
(2)U(sers), or (3)T(ables): (2)U > 2
Export grants (yes/no): yes > no
Export table data (yes/no): yes > no
Compress extents (yes/no): yes > no
Export done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
server uses UTF8 character set (possible charset conversion)
Note: table data (rows) will not be exported
Note: grants on tables/views/sequences/roles will not be exported
. exporting pre-schema procedural objects and actions
. exporting foreign function library names for user DB_USER
. exporting PUBLIC type synonyms
. exporting private type synonyms
. exporting object type definitions for user DB_USER
About to export DB_USER's objects ...
. exporting database links
. exporting sequence numbers
. exporting cluster definitions
. about to export DB_USER's tables via Conventional Path ...
. . exporting table TABLE_NAMEs
EXP-00091: Exporting questionable statistics.
. exporting synonyms
. exporting views
. exporting stored procedures
. exporting operators
. exporting referential integrity constraints
. exporting triggers
. exporting indextypes
. exporting bitmap, functional and extensible indexes
. exporting posttables actions
. exporting materialized views
. exporting snapshot logs
. exporting job queues
. exporting refresh groups and children
. exporting dimensions
. exporting post-schema procedural objects and actions
. exporting statistics
Export terminated successfully with warnings.
Unfortunately, you can't export just the objects you want unless they are tables.
Using import (imp) you can list the content of your packages, procedures, functions, views, etc. and perhaps that will give you what you need.
Another choice would be to use
SELECT * FROM user_source ORDER BY 2, 1, 3;
to list the code.
Hope this helps,
Luke -
Full database exp/imp between RAC and single database
Hi Experts,
we have a RAC database, Oracle 10gR2 with 4 nodes, on Linux. I am trying to duplicate the RAC database into a single-instance Windows database.
Both databases are the same version. During the import, I needed to create 4 undo tablespaces to keep the imp processing.
How to keep one undo tablespace in single instance database?
any experience of exp/imp RAC database into single instance database to share with me?
Thanks
Jim
Edited by: user589812 on Nov 13, 2009 10:35 AM
Jim,
I also want to know: can we add exclude=tablespace on the impdp command for a full database exp/imp?
You can't use exclude=tablespace on exp/imp. It is for Data Pump expdp/impdp only.
I am very interested in your recommendation.
But for a full database impdp, how do I exclude a table during the full database imp? May I have an example for this case?
I used expdp for a full database export, but I got an error in the expdp log: ORA-31679: Table data object "SALE"."TOAD_PLAN_TABLE" has long columns, and longs can not be loaded/unloaded using a network link
Having long columns in a table means that it can't be exported/imported over a network link. To exclude this table, you can use the exclude expression:
expdp user/password exclude=TABLE:"= 'SALES'" ...
This will exclude all tables named sales. If you have that table in schema scott and then in schema blake, it will exclude both of them. The error that you are getting is not a fatal error, but that table will not be exported/imported.
the final message as
Master table "SYSTEM"."SYS_EXPORT_FULL_01" successfully loaded/unloaded
Dump file set for SYSTEM.SYS_EXPORT_FULL_01 is:
F:\ORACLEBACKUP\SALEFULL091113.DMP
Job "SYSTEM"."SYS_EXPORT_FULL_01" completed with 1 error(s) at 16:50:26Yes, the fact that it did not export one table does not make the job fail, it will continue on exporting all other objects.
I dropped the database that generated the expdp dump file,
recreated a blank database, and then ran impdp again.
But I got lots of errors such as:
ORA-39151: Table "SYSMAN"."MGMT_ARU_OUI_COMPONENTS" exists. All dependent metadata and data will be skipped due to table_exists_action of skip
ORA-39151: Table "SYSMAN"."MGMT_BUG_ADVISORY" exists. All dependent metadata and data will be skipped due to table_exists_action of skip
......ORA-31684: Object type TYPE_BODY:"SYSMAN"."MGMT_THRESHOLD" already exists
ORA-39111: Dependent object type TRIGGER:"SYSMAN"."SEV_ANNOTATION_INSERT_TR" skipped, base object type VIEW:"SYSMAN"."MGMT_SEVERITY_ANNOTATION" already exists
and last line as
Job "SYSTEM"."SYS_IMPORT_FULL_01" completed with 2581 error(s) at 11:54:57Yes, even though you think you have an empty database, if you have installed any apps or anything, it may create tables that could exist in your dumpfile. If you know that you want the tables from the dumpfile and not the existing ones in the database, then you can use this on the impdp command:
impdp user/password table_exists_action=replace ...
If a table that is being imported exists, Data Pump will detect this, drop the table, then create it. Then all of the dependent objects will be created. If you don't, then the table and all of its dependent objects will be skipped (which is the default).
There are 4 options with table_exists_action:
replace - described above
skip - the default; skips the table and dependent objects like indexes, index statistics, table statistics, etc.
append - keeps the existing table and appends the data to it, but skips dependent objects
truncate - truncates the existing table and adds the data from the dumpfile, but skips dependent objects.
Hope this helps.
Dean -
Migrating from 9i to 10g through exp/imp
Hi,
We need to migrate a database from 9i on HP-UX to 10g on IBM AIX. Can you please tell me whether I can export the data in 9i with the 9i export utility and import it into 10g using the 10g Data Pump utility for a faster import?
We don't have the 10g software installed on the HP-UX platform.
And what other alternatives do we have to reduce the amount of time for the migration process, as it's a critical production database?
Thanks,
Jayanta
I believe that Justin is correct: if you use exp to unload the data, you will need to use imp, not impdp, to reload it.
To speed up the cross-platform exp/imp process, I suggest you consider running multiple concurrent exp/imp jobs. You can generate a tables= list into a spool file using SQL. You can give each large table its own exp job and bunch the small tables together.
Depending on how your database objects are organized, you might also be able to export by owner, or a combination of owner and tables= exports.
You can use full=y with the rows=n option to grab the public synonyms, non-owning users, packages, etc. not brought in by the prior exp/imp.
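One way to build that per-table list, as a sketch (the owner, credentials, and file names are hypothetical): spool one exp command per table from the data dictionary, then run the generated script.

```shell
# Sketch: generate one "exp" command per table owned by the connected
# user, spool them into a script, then run it. Names are placeholders.
sqlplus -s scott/tiger <<'EOF'
SET PAGESIZE 0 FEEDBACK OFF HEADING OFF TRIMSPOOL ON
SPOOL exp_tables.sh
SELECT 'exp scott/tiger tables=' || table_name ||
       ' file=' || table_name || '.dmp log=' || table_name || '.log'
FROM   user_tables
ORDER  BY table_name;
SPOOL OFF
EOF
sh exp_tables.sh   # runs serially; split the file to run jobs concurrently
```

Splitting exp_tables.sh into a few scripts and launching them in parallel is what gives the concurrency described above.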
HTH -- Mark D Powell -- -
I exported tables in 10g using the exp (normal export) command.
Can I use impdp (Data Pump) while doing the import?
There is no exclude or include option in the traditional export/import (exp/imp) utility. These options exist only in Data Pump (expdp/impdp) to exclude or include database objects while exporting or importing.
All you can do as a workaround is create the structure of the table in the target database/schema and then do the import; if the table is found in the target schema during the import, that table will not be imported. So don't use ignore=y; if you do, the rows will still be imported even though the table already exists. Hope you got what I mean.
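A sketch of that workaround (all names here are hypothetical): pre-create the table you want to skip in the target schema, then run imp without ignore=y; classic import reports the table as already existing and skips its rows.

```shell
# Hypothetical names throughout. Step 1: pre-create the table to skip.
sqlplus -s target_user/pw <<'EOF'
CREATE TABLE big_history (id NUMBER);  -- structure only, no rows
EOF
# Step 2: import WITHOUT ignore=y; imp skips tables that already exist,
# so BIG_HISTORY's rows are not loaded while all other tables come in.
imp target_user/pw file=full.dmp fromuser=src_user touser=target_user
```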
Regards,
Sabdar Syed. -
How to exclude tables while doing import in Traditional import/export
Hi OTN,
Please suggest: with Oracle Data Pump we can exclude tables as I mention below. If I use traditional export/import, how can I exclude tables?
impdp tech_test10/[email protected] directory=EAMS04_DP_DIR dumpfile=EXP_DEV6_05092012.dmp logfile=IMP_TECH_TEST10_29MAY.log REMAP_SCHEMA=DEV6:TECH_TEST10 remap_tablespace=DEV6_DATA:DATA_TECH_TEST10 remap_tablespace=DEV6_INDX:INDEX_TECH_TEST10 EXCLUDE=TABLE:\"IN \(\'R5WSMESSAGES_ARCHIVE\',\'R5WSMESSAGES\',\'R5AUDVALUES\'\)\"
Your suggestions will help me find the way.
You cannot exclude tables with exp/imp. But you can use the tables parameter to list all the tables that need to be exported:
exp scott/tiger file=emp.dmp tables=(emp,dept)
similarly for import
Edited by: Kiran on Jul 18, 2012 2:20 AM -
Hi Friends,
I need to exclude two tables in impdp.. help me in this...
impdp system directory=exportdump tables=SYNC.cin_document query=SYNC.CIN_DOCUMENT:'"where rownum<10"' dumpfile=satexpdpdump.dmp logfile=1.log CONTENT=DATA_ONLY EXCLUDE=TABLE:"SYNC.CIN_DETAIL,SYNC.CIN_DOCUMENT_GTIN"
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
ORA-39001: invalid argument value
ORA-39071: Value for EXCLUDE is badly formed.
ORA-00920: invalid relational operator
Thanks friends, I got the solution.
Solution:
Instead of excluding, I mentioned only one table name in the tables=SYNC.cin_document field. I just removed the EXCLUDE option, and it is working fine.
The following query i have used.
impdp system directory=exportdump tables=SYNC.cin_document query=SYNC.CIN_DOCUMENT:'"where rownum<2570000"' dumpfile=satexpdpdump.dmp logfile=2570000d000gih.log CONTENT=DATA_ONLY
For your reference.:
The following scenario explains: 3 tables are exported under scott, but we need to import only two of them. Output follows.
step1:
=============
create directory data_pump_dir as '/u01/oracle/my_dump_dir';
grant read,write on directory data_pump_dir to system;
expdp system/sys DIRECTORY=data_pump_dir DUMPFILE=tables.dmp logfile=table.log TABLES=SCOTT.DEPT,SCOTT.EMP,SCOTT.BONUS
output:
[oracle@linux1 ~]$ expdp system/sys DIRECTORY=data_pump_dir DUMPFILE=tables.dmp logfile=table.log TABLES=SCOTT.DEPT,SCOTT.EMP,SCOTT.BONUS
Export: Release 10.2.0.1.0 - Production on Tuesday, 10 May, 2011 19:40:00
Copyright (c) 2003, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
Starting "SYSTEM"."SYS_EXPORT_TABLE_01": system/******** DIRECTORY=data_pump_dir DUMPFILE=tables.dmp logfile=table.log TABLES=SCOTT.DEPT,SCOTT.EMP,SCOTT.BONUS
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 128 KB
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
. . exported "SCOTT"."DEPT" 5.656 KB 4 rows
. . exported "SCOTT"."EMP" 7.820 KB 14 rows
. . exported "SCOTT"."BONUS" 0 KB 0 rows
Master table "SYSTEM"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
Dump file set for SYSTEM.SYS_EXPORT_TABLE_01 is:
/u01/app/oracle/product/10.2.0/db_1/rdbms/log/tables.dmp
Job "SYSTEM"."SYS_EXPORT_TABLE_01" successfully completed at 19:40:22
[oracle@linux1 ~]$
Step 2:
SQL> alter table DEPT rename to dept1;
Table altered.
SQL> alter table EMP rename to emp1;
Table altered.
SQL> select * from tab;
TNAME TABTYPE CLUSTERID
BONUS TABLE
SALGRADE TABLE
EMP1 TABLE
DEPT1 TABLE
==============================================================================
[oracle@linux1 ~]$ impdp system/sys DIRECTORY=data_pump_dir DUMPFILE=tables.dmp TABLES=SCOTT.DEPT,SCOTT.EMP
output:
Import: Release 10.2.0.1.0 - Production on Tuesday, 10 May, 2011 19:48:51
Copyright (c) 2003, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
Master table "SYSTEM"."SYS_IMPORT_TABLE_01" successfully loaded/unloaded
Starting "SYSTEM"."SYS_IMPORT_TABLE_01": system/******** DIRECTORY=data_pump_dir DUMPFILE=tables.dmp TABLES=SCOTT.DEPT,SCOTT.EMP
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
. . imported "SCOTT"."DEPT" 5.656 KB 4 rows
. . imported "SCOTT"."EMP" 7.820 KB 14 rows
Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
ORA-31684: Object type INDEX:"SCOTT"."PK_DEPT" already exists
ORA-31684: Object type INDEX:"SCOTT"."PK_EMP" already exists
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
ORA-31684: Object type CONSTRAINT:"SCOTT"."PK_DEPT" already exists
ORA-31684: Object type CONSTRAINT:"SCOTT"."PK_EMP" already exists
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
ORA-39083: Object type REF_CONSTRAINT failed to create with error:
ORA-02270: no matching unique or primary key for this column-list
Failing sql is:
ALTER TABLE "SCOTT"."EMP" ADD CONSTRAINT "FK_DEPTNO" FOREIGN KEY ("DEPTNO") REFERENCES "SCOTT"."DEPT" ("DEPTNO") ENABLE
Job "SYSTEM"."SYS_IMPORT_TABLE_01" completed with 5 error(s) at 19:48:54
[oracle@linux1 ~]$
SQL> select * from tab;
TNAME TABTYPE CLUSTERID
BONUS TABLE
SALGRADE TABLE
EMP1 TABLE
DEPT1 TABLE
DEPT TABLE
EMP TABLE
6 rows selected.
SQL> -
Migrate Oracle 9207 DB, 8 TB in size, from Solaris to AIX? Don't want exp/imp
Hi guys, please help.
I want to migrate an 8TB Oracle database from Solaris 8 to AIX 5.
In my last post on the same topic I was told to refer Metalink notes
291024.1,Note:77523.1,Note:277650.1
According to these notes, 'EXPORT/IMPORT IS THE ONLY OPTION TO MIGRATE FROM SOLARIS TO AIX'.
I was not convinced, as in http://dba.ipbhost.com/index.php?showtopic=9523
I read
"As both solaris and AIX are UNIX O/S, cloning the DB is also possible from source box to target box"
Also,
what is the role of "If the endianness is the same"?
Can you guys please comment on this again, as I'm really confused, because exp/imp of 8TB
is going to take half of my life :)
Please tell me if there is any other option in 9.2.0.7.
Thanx
You can do
SELECT PLATFORM_NAME, ENDIAN_FORMAT
FROM V$TRANSPORTABLE_PLATFORM
to check the endianness of different platforms.
-- Note: the view exists only on 10g and above.
Datafiles in a different endian format can't be directly copied over and used. On 10g you have the option to use RMAN to convert endianness.
With that said, since Solaris and AIX are both big-endian, you can directly copy the datafiles over and clone the database without using exp/imp.
This article describes how to clone a database:
http://www.dba-oracle.com/oracle_tips_db_copy.htm -
Summary: Problem with export (maybe...terminated successfully with warnings) and import (IMP-00022: failed to process parameters)
I used PL/SQL developer to make a parameter file (initially I used it to export, then I just stole the file to try again). It contains 100 tables from a single schema.
Export from prod DB using exp parfile=test.par: Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
Export done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
server uses ZHS16GBK character set (possible charset conversion)
About to export specified tables via Conventional Path ...
. . exporting table CONTRACT_INFO 12 rows exported
EXP-00091: Exporting questionable statistics.
'Export terminated successfully with warnings.'
I want to import into the cs2_user schema (same privileges as on the production DB), trying:
impdp 'sys as sysdba'@danieldb schemas=cs2_user dumpfile=test01tables3.dmp
That gets the error: UDI-00014: invalid value for parameter, 'attach'
I then thought maybe you have to use imp/exp or impdp/expdp instead of a combination (exp + impdp), so I tried:
imp 'sys/admin as sysdba'@danieldb touser=cs2_party_owner file=test01tables4.dmp
but then I just get: IMP-00022: failed to process parameters
Anyone see why it's all failing? :s
Mike
You must put the SID name in single quotes, like:
C:\Tools\Oracle\product\10.2.0\db\NETWORK\ADMIN>expdp 'sys/manager@stb as sysdba'
Export: Release 11.2.0.1.0 - Production on Thu Feb 9 13:46:16 2012
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
UDE-06550: operation generated ORACLE error 6550
ORA-06550: line 1, column 11:
PLS-00201: identifier 'SYS.DBMS_UTILITY' must be declared
ORA-06550: line 1, column 11:
PL/SQL: Statement ignored -
Any way to use a filter to exclude tables in the navigation pane?
Is there any way to use a filter to exclude a set of tables from the table list in the navigation pane? I have a number of tables (15+) starting with the same prefix, e.g. AB123, that I would like to eliminate from the list. They sort right to the top, and I always have to scroll down and go through the Show More dialog to see the entire list.
I am sure I'm missing something, but not sure what. Help Center has nothing to offer.
Thanks
GlennThis has been mentioned on the forum before - basically the need for more elaborate ways to filter (multiple conditions as well as 'not like'). It is on our list for future consideration, meaning post-production.
-- Sharon
Message was edited by:
sbkenned -
Hi all,
I have one problem in Oracle 9.2.0.8 on RHEL4. I want to use exp/imp to update one table from one server to another.
Suppose I have one table abc on server A with around 1 million records, and server B has the same number of records; some records have been changed on server A, and I want to import those same records to server B. Is there any parameter in IMP for this?
Please suggest.
Hi,
Kamran Agayev A. wrote:
user00726 wrote:
but how would I know which table has been updated or modified
Using the MERGE function, you'll merge the two tables. This command will update changed rows and insert missing rows into the second table, making it the same as the first table.
SQL> create table azar(pid number,sales number,status varchar2(20));
Table created.
SQL> create table azar01(pid number,sales number,status varchar2(20));
Table created.
SQL> insert into azar01 values(1,12,'CURR');
1 row created.
SQL> insert into azar01 values(2,13,'NEW');
1 row created.
SQL> insert into azar01 values(3,15,'CURR');
1 row created.
SQL> insert into azar values(2,24,'CURR');
1 row created.
SQL> insert into azar values(3,0,'OBS');
1 row created.
SQL> insert into azar values(4,42,'CURR');
1 row created.
SQL> commit;
Commit complete.
SQL> select * from azar01;
PID SALES STATUS
1 12 CURR
2 13 NEW
3 15 CURR
SQL> select * from azar;
PID SALES STATUS
2 24 CURR
3 0 OBS
4 42 CURR
SQL> merge into azar01 a using azar b on (a.pid=b.pid) when matched
2 then update set a.sales=a.sales + b.sales, a.status=b.status
3 delete where a.status='OBS'
4 when not matched
5 then insert values(b.pid,b.sales,'NEW');
3 rows merged.
SQL> select * from azar01;
PID SALES STATUS
1 12 CURR
2 37 CURR
4 42 NEW
Hello Sir, is this correct?
Regards
S.Azar
DBA -
Exp/Imp alternatives for large amounts of data (30GB)
Hi,
I've come into a new role where various test databases are to be 'refreshed' each night with cleansed copies of production data. They have been using the imp/exp utilities with 10g R2. The export process is OK, but what's killing us is the time it takes to transfer, unzip, and import 32GB .dmp files. I'm looking for suggestions on what we can do to reduce these times. Currently the import takes 4 to 5 hours.
I haven't used datapump, but I've heard it doesn't offer much benefit when it comes to saving time over the old imp/exp utilities. Are 'Transportable Tablespaces' the next logical solution? I've been reading up on them and could start prototyping/testing the process next week. What else is in Oracle's toolbox I should be considering?
Thanks
Brian
Hi,
"I haven't used datapump, but I've heard it doesn't offer much benefit when it comes to saving time over the old imp/exp utilities"
Data Pump will be faster for a couple of reasons. It uses direct path to unload the data. Data Pump also supports parallel processes, so while one process is exporting metadata, the other processes can be exporting data. In 11g, you can also compress the dump files as you export. (Both data and metadata compression are available in 11g; I think metadata compression is available in 10.2.) This will remove your zip step.
As far as transportable tablespaces, yes, this is an option. There are some requirements, but if it works for you, all you will be exporting is the metadata and no data. The data is copied from the source to the target by way of datafiles. One of the biggest requirements is that the tablespaces need to be read-only while the export job is running. This is true for both exp/imp and expdp/impdp.
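The Data Pump speed-ups mentioned above (parallel workers and, on 11g, dump-file compression) can be sketched in one command. The credentials, directory object, and file names are assumptions, and COMPRESSION=ALL requires 11g:

```shell
# Parallel full export writing multiple dump files (%U generates one
# file per worker); compression on 11g avoids a separate zip step.
expdp system/manager FULL=y DIRECTORY=DATA_PUMP_DIR PARALLEL=4 \
      DUMPFILE=full_%U.dmp LOGFILE=full_exp.log COMPRESSION=ALL
```

On 10.2, dropping COMPRESSION=ALL (or using COMPRESSION=METADATA_ONLY) keeps the rest of the command valid.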
Exp/Imp In Oracle 10g client
Hi All,
I want to take an export of one schema from the Oracle 10g client. I am new to Oracle 10g.
In Oracle 9i, we can use the exp/imp commands for export and import from the client itself.
I heard that in Oracle 10g, we can use expdp/impdp on the server only. Is there any possibility to run the export/import from the client too?
Pls help me...
Cheers,
Moorthy.GS
To add on, for expdp from an Oracle client there is:
NETWORK_LINK
Default: none
Purpose
Enables an export from a (source) database identified by a valid database link. The data from the source database instance is written to a dump file set on the connected database instance.
Syntax and Description
NETWORK_LINK=source_database_link
The NETWORK_LINK parameter initiates an export using a database link. This means that the system to which the expdp client is connected contacts the source database referenced by the source_database_link, retrieves data from it, and writes the data to a dump file set back on the connected system.
The source_database_link provided must be the name of a database link to an available database. If the database on that instance does not already have a database link, you or your DBA must create one. For more information about the CREATE DATABASE LINK statement, see Oracle Database SQL Reference.
If the source database is read-only, then the user on the source database must have a locally managed tablespace assigned as the default temporary tablespace. Otherwise, the job will fail. For further details about this, see the information about creating locally managed temporary tablespaces in the Oracle Database Administrator's Guide.
Restrictions
When the NETWORK_LINK parameter is used in conjunction with the TABLES parameter, only whole tables can be exported (not partitions of tables).
The only types of database links supported by Data Pump Export are: public, fixed-user, and connected-user. Current-user database links are not supported.
Example
The following is an example of using the NETWORK_LINK parameter. The source_database_link would be replaced with the name of a valid database link that must already exist.
expdp hr/hr DIRECTORY=dpump_dir1 NETWORK_LINK=source_database_link DUMPFILE=network_export.dmp LOGFILE=network_export.log -
Database upgrade from 8i to 10g using exp/imp
Dear Friends,
Please provide the steps to upgrade from 8i to 10g using exp/imp.
Thanks,
Rathinavel
Hi;
Please also see cold backup option
How to migrate from 8i to 10g to new server using cold backup [ID 742108.1]
Also see:
Upgrading from 8i to 10g with import utility
http://searchoracle.techtarget.com/answer/Upgrading-from-8i-to-10g-with-import-utility
Regard
Helios