Exp/Imp 10.2.0.1 Database

Can I export data from a database with an 8K block size and import it into a target database with a 16K block size?

It doesn't matter what block size your target database has configured.
imp generates simple INSERT statements for each row; you do not need a db_8k_cache_size.
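As a hedged illustration (connect strings, schema, and file names are hypothetical), note that the block size never appears in the commands:
exp scott/tiger@src_8k owner=scott file=scott.dmp log=scott_exp.log
imp scott/tiger@tgt_16k fromuser=scott touser=scott file=scott.dmp log=scott_imp.log
The rows are simply re-inserted on the target, which stores them in its own 16K blocks.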

Similar Messages

  • Full database exp/imp between RAC and single database

    Hi Experts,
    we have a RAC database, Oracle 10gR2 with 4 nodes, on Linux. I am trying to duplicate the RAC database into a single-instance Windows database.
    Both databases are the same version. During the import, I needed to create 4 undo tablespaces to keep imp processing.
    How can I keep just one undo tablespace in the single-instance database?
    Does anyone have experience of exp/imp from a RAC database into a single-instance database to share with me?
    Thanks
    Jim

    Jim,
    I also want to know: can we add exclude=tablespace on the impdp command for a full database exp/imp?
    You can't use exclude=tablespace with exp/imp; it is for Data Pump expdp/impdp only.
    I am very interested in your recommendation.
    But for a full database impdp, how do I exclude a table during a full database import? May I have an example for this case?
    I used expdp for a full database export, but I got an error in the expdp log: ORA-31679: Table data object "SALE"."TOAD_PLAN_TABLE" has long columns, and longs can not be loaded/unloaded using a network link
    Having long columns in a table means that the table can't be exported/imported over a network link. To exclude it, you can use an exclude expression:
    expdp user/password exclude=TABLE:"= 'SALES'" ...
    This will exclude all tables named SALES. If you have that table in schema scott and then in schema blake, it will exclude both of them. The error that you are getting is not a fatal error, but that table will not be exported/imported.
    The final messages were:
    Master table "SYSTEM"."SYS_EXPORT_FULL_01" successfully loaded/unloaded
    Dump file set for SYSTEM.SYS_EXPORT_FULL_01 is:
    F:\ORACLEBACKUP\SALEFULL091113.DMP
    Job "SYSTEM"."SYS_EXPORT_FULL_01" completed with 1 error(s) at 16:50:26Yes, the fact that it did not export one table does not make the job fail, it will continue on exporting all other objects.
    I dropped the database that generated the expdp dump file,
    recreated a blank database, and then ran impdp again.
    But I got lots of errors such as:
    ORA-39151: Table "SYSMAN"."MGMT_ARU_OUI_COMPONENTS" exists. All dependent metadata and data will be skipped due to table_exists_action of skip
    ORA-39151: Table "SYSMAN"."MGMT_BUG_ADVISORY" exists. All dependent metadata and data will be skipped due to table_exists_action of skip
    ...ORA-31684: Object type TYPE_BODY:"SYSMAN"."MGMT_THRESHOLD" already exists
    ORA-39111: Dependent object type TRIGGER:"SYSMAN"."SEV_ANNOTATION_INSERT_TR" skipped, base object type VIEW:"SYSMAN"."MGMT_SEVERITY_ANNOTATION" already exists
    and the last line was:
    Job "SYSTEM"."SYS_IMPORT_FULL_01" completed with 2581 error(s) at 11:54:57
    Yes, even though you think you have an empty database, if you have installed any apps or anything, they may create tables that also exist in your dumpfile. If you know that you want the tables from the dumpfile and not the existing ones in the database, then you can use this on the impdp command:
    impdp user/password table_exists_action=replace ...
    If a table that is being imported already exists, Data Pump will detect this, drop the table, then re-create it. Then all of the dependent objects will be created. If you don't, the table and all of its dependent objects will be skipped (which is the default).
    There are 4 options for table_exists_action, illustrated below:
    replace - described above
    skip - the default; skips the table and dependent objects like indexes, index statistics, table statistics, etc.
    append - keeps the existing table and appends the data to it, but skips dependent objects
    truncate - truncates the existing table and adds the data from the dumpfile, but skips dependent objects
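    A hedged example of the truncate option for a data-only refresh of one table (connect string and object names are illustrative):
    impdp user/password tables=SALES.ORDERS content=data_only table_exists_action=truncate ...
    This keeps the existing table definition and indexes and just swaps in the rows from the dumpfile.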
    Hope this helps.
    Dean

  • Exp/imp of tablespace

    hi all
    Could we use exp/imp of a tablespace on the same database?
    Say we have a list of objects in a tablespace and we want to shrink the tablespace size (as there are not many transactions). Can we take an export of this tablespace only, drop the tablespace, create a new tablespace with the same name but a smaller size, and import the objects back into it from the export dump file?
    thanks
    kedar

    So basically you want to shrink the tablespace but you've got objects scattered around in there and it won't coalesce.
    There are several ways to do this, not least the one you've mentioned.
    An alternative would be to rebuild the objects into a new (smaller) tablespace. If you haven't got the disk space to accommodate both a new tablespace and the existing one then, yes, export/import will do the job.
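    A minimal command sketch of the export/drop/recreate/import route (tablespace name, file names, and size are hypothetical; test on a copy first):
    exp system/password tablespaces=APP_DATA file=app_data.dmp log=app_data_exp.log
    Then, in SQL*Plus:
    DROP TABLESPACE app_data INCLUDING CONTENTS;
    CREATE TABLESPACE app_data DATAFILE '/u01/app_data01.dbf' SIZE 500M;
    And finally:
    imp system/password file=app_data.dmp full=y log=app_data_imp.log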

  • Full Database Exp & Imp

    Hi,
    I am trying to Exp & Imp a full database. I am working on Oracle 9i & 10g on Solaris 9. I am not using Data Pump. Can anyone please help me with the following:
    1. I am performing the full export using SYSTEM user.
    1.a Does a Full export include (or back up) the DATA DICTIONARY of the database?
    1.b Does a Full export include the backup of SYS and SYSTEM objects?
    I am using the following command to export
    exp system/system@testdb file=$HOME/testdbfullexp.dmp full=y statistics=none
    I have tried importing the FULL export into another database, and I did see that SYS and SYSTEM objects were also being imported (I got some errors regarding constraints and inconsistencies).
    I would like to ask what the ideal steps are to copy a database from DB1 to DB2 using exp and imp.
    Any information will be of a great help
    Thanks,
    Harris.

    1 a) No; the data dictionary will be automagically recreated by implicit SQL.
    This means any non-dictionary objects under SYS will be lost.
    1 b) As above. SYSTEM, however, is a normal user.
    Any %SYS user will NOT be exported (CTXSYS, MDSYS, etc.).
    On import of SYSTEM there will always be errors, as SYSTEM is non-empty after initial database creation.
    Sybrand Bakker
    Senior Oracle DBA
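    As for the ideal steps to copy DB1 to DB2 with exp/imp, a hedged outline (not from the reply above; the dump file name is reused from the question, other names are hypothetical):
    1. Create DB2 with the same character set as DB1, and pre-create the tablespaces with the same names.
    2. Run the full import, ignoring errors on objects that already exist:
    imp system/system@db2 full=y ignore=y file=testdbfullexp.dmp log=fullimp.log
    3. Review the log; as noted above, errors against SYS/SYSTEM objects are expected and can generally be ignored.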

  • Using exp imp on 11i apps database

    Hi,
    Does anyone have experience using the exp and imp utilities on an 11i apps database? Is there any special method to achieve this?
    I want to do a full export and do an import.
    Do we get any performance benefits if we do this?
    Regards,
    SA

    Hi Sa;
    Does anyone have experience in using exp and imp utility on 11i apps database? Is there any special method to achieve this?
    For which purpose do you want to take the exp/imp?
    I want to do a full export and do an import.
    Follow:
    10g Export/Import Process for Oracle Applications Release 11i [ID 331221.1]
    Export/import process for R12 using 11gR1 [ID 741818.1]
    Do we get any performance benefits if we do this?
    It depends on many factors, but ultimately you are running an extra process on the server, and that has a cost; in short, you may see some performance impact while the exp/imp is running.
    Regard
    Helios

  • Exp/imp for upgrading database

    Hi, is it a good idea to use exp/imp to upgrade a database from 9i to 11g? To make use of parallelism, I plan to set all tables' degree to 8 manually and then start the export. Which is better: a conventional upgrade, or exp/imp?

    HI,
    Please check Oracle Support note: How to Perform a Full Database Export Import during Upgrade, Migrate, Copy, or Move of a Database (Doc ID 286775.1)
    Thank you
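    For the "set all tables' degree to 8" step mentioned in the question, a hedged SQL*Plus sketch that generates the statements (the schema name is hypothetical; review the spool file before running it):
    set heading off feedback off pages 0 lines 200
    spool set_degree8.sql
    select 'alter table "'||owner||'"."'||table_name||'" parallel (degree 8);'
    from dba_tables where owner = 'APPUSER';
    spool off
    @set_degree8.sql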

  • Refresh database through exp/imp

    Version 10.2.0.3 on Windows (old DB 8.1.7.0 on Solaris).
    I have to refresh the database data; the number of users/schemas is 400+. The fastest way to do that would be a full exp/imp,
    but first drop the current users cascade (is there any command to drop all users in one go?),
    and then validate that all tables/schemas are the same and up to date (any suggestion for validating this efficiently? I am thinking of checking the full exp logs on the old and new DBs, but that could take forever, manually going through thousands of tables).

    First, make sure that statistics are properly gathered in both the old and the new DB.
    Next, use the num_rows column in dba_tables. Since statistics are typically gathered by sampling, they will not necessarily match; they should be quite close, though.
    After you are sure that all objects were transferred, you can issue a query to find all tables with an "invalid" number of records.
    The query can look something like this:
    select d.owner, d.table_name, d.num_rows
    from dba_tables d
    where d.owner not in ('SYS','SYSTEM', ...)
    and not exists
               (select * from dba_tables@old_db s
                where d.owner = s.owner
                and d.table_name = s.table_name
                and s.num_rows between 0.9*d.num_rows and 1.1*d.num_rows
               )
    You need to take care of some special cases, such as num_rows being NULL, partitioned tables, etc.
    Iordan Iotzov
    http://iiotzov.wordpress.com/
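    For the "drop all users with one command" part of the question above: there is no single such command, but a common sketch (not from the reply above) is to generate the statements from dba_users; the exclusion list below is illustrative and must be extended for your installation:
    select 'drop user "'||username||'" cascade;'
    from dba_users
    where username not in ('SYS','SYSTEM','SYSMAN','DBSNMP','OUTLN');
    Spool the output, review it, and then run it.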

  • Using exp/imp to clone a database from Solaris to NT

    Hi All,
    We have an 8.1.6 Oracle database in production under Sun Solaris.
    We actually use this database to run Siebel CRM system.
    We have a Windows NT server which we used as a QA system
    for this application.
    We have a daily full export of our production database on the
    Sun machine. Now my manager wants to import this production
    dmp file into this Win NT database on a weekly basis.
    The goal here is to keep this NT database as functional and as close to the production one as possible.
    I would appreciate your help on this.
    I have some other general questions:
    In which cases can the Oracle imp utility be run in full=y mode? Does this make any sense at all?
    Since the Oracle import utility does not delete any existing objects before it imports, these objects should be deleted manually, and that's what I do with all the Siebel users' schemas. Of course, I do not do this with SYS, SYSTEM and the other special Oracle users. Question: are there any objects in these special Oracle schemas that have to be carried over from the source database to the target database to make the rest of the schemas functional?
    Somehow I failed to find Oracle documentation on how to use exp/imp to create a complete clone of an Oracle database.
    Can direct path be used with full=y export/import mode?
    Thank you in advance,
    Yuriy.

    As an alternative to using daily/weekly exports and imports, what would probably be more effective here is replication. The databases could sync on a nightly basis, and updated tables and rows would be transferred from your OLTP system to your read-only system. More information on replication can be found in "Oracle8i DBA Handbook" by Kevin Loney and Marlene Theriault.
    LM
    PS. IGNORE=Y

  • Step by Step EXP / IMP database from Oracle 9i to Oracle 11g

    Please help on the captioned where I did the below
    1. Used exp to export the database from an Oracle9i database (O9_DB) on OLD_Windows2003_SERVER.
    2. Installed Oracle 11g (default service name ORCL) on a new Windows 2003 server and used imp to import the O9_DB; however, I needed to create the same physical path as on OLD_Windows2003_SERVER (e.g. E:\Oralce_9i\Ordata\) in order to complete the import.
    3. I can launch SQL Developer and obtain the record count of every table being imported.
    However, is there any best/proper way to do the exp/imp? The ideal outcome would be:
    1. Create a new database (say OLDDB) with the same name as the Oracle9i database, so that maybe I don't need to modify much on the application side to point to the new database?
    2. Import the old database into that new OLDDB.
    Thanks in advance

    Hello,
    If you create the new database on a new server, you can keep the same name for the database.
    To switch from the old database to the new one, you'll have to change the hostname in the "tnsnames.ora" or in whatever configuration file defines the database connection.
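    For example, a hypothetical "tnsnames.ora" entry where only the HOST needs to change (host and service names are illustrative):
    OLDDB =
      (DESCRIPTION =
        (ADDRESS = (PROTOCOL = TCP)(HOST = new_win2003_server)(PORT = 1521))
        (CONNECT_DATA = (SERVICE_NAME = OLDDB))
      )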
    About the best way to do the Export/Import, you may have some details on the link below:
    http://download.oracle.com/docs/cd/E11882_01/server.112/e10819/expimp.htm#i262317
    Else, why do you intend to stay on Windows 2003 Server? Why not move to Windows 2008?
    NB: Oracle *11.1* is certified on Windows 2008 and Oracle *11.2* is certified on Windows 2008 R2.
    Hope this helps.
    Best regards,
    Jean-Valentin

  • Database upgrade from 8i to 10g using exp/imp

    Dear Friends,
    Please provide the steps to upgrade from 8i to 10g using exp/imp.
    Thanks,
    Rathinavel

    Hi;
    Please also see cold backup option
    How to migrate from 8i to 10g to new server using cold backup [ID 742108.1]
    Also see:
    Upgrading from 8i to 10g with import utility
    http://searchoracle.techtarget.com/answer/Upgrading-from-8i-to-10g-with-import-utility
    Regard
    Helios

  • Exp/Import database schema vs Ch 10 Exp/Imp Content

    Hi, I'm using Portal 10.1.2.0.2. I'm a DBA tasked with migrating a Dev portal to Test, then Production.
    There seem to be a number of ways to achieve this, and I was hoping to clarify which is best for me.
    I've run through note 330391.1 (copying the Portal schema), which details running a perl script to export and import onto a newly installed server. This process worked well.
    Chapter 10 of the Portal guide details a fairly complex process of creating transport sets, migrating these over, and importing.
    My question is: if I want the entire Portal copied, is there any difference between these processes? i.e. do you end up with the same result?
    thanks in advance of any advice :-)

    The cloning model is not rerunnable, it's not granular, and conditional migration is not possible.
    You can use cloning only if you want to take a copy of the entire setup and rewire it to a new midtier.
    The Portal exp/imp model gives you a granular, conditional and rerunnable method of moving portal objects.
    Apart from that, it comes with readily available prechecks to indicate what's going wrong during the process.

  • Exp/Imp alternatives for large amounts of data (30GB)

    Hi,
    I've come into a new role where various test databases are to be 'refreshed' each night with cleansed copies of production data. They have been using the imp/exp utilities with 10g R2. The export process is OK, but what's killing us is the time it takes to transfer, unzip, and import 32GB .dmp files. I'm looking for suggestions on what we can do to reduce these times. Currently the import takes 4 to 5 hours.
    I haven't used Data Pump, but I've heard it doesn't offer much benefit when it comes to saving time over the old imp/exp utilities. Are 'Transportable Tablespaces' the next logical solution? I've been reading up on them and could start prototyping/testing the process next week. What else is in Oracle's toolbox that I should be considering?
    Thanks
    brian

    Hi,
    I haven't used datapump, but I've heard it doesn't offer much benefit when it comes to saving time over the old imp/exp utilities
    Data Pump will be faster for a couple of reasons. It uses direct path to unload the data. Data Pump also supports parallel processes, so while one process is exporting metadata, the other processes can be exporting the data. In 11, you can also compress the dumpfiles as you are exporting. (Both data and metadata compression are available in 11; I think metadata compression is available in 10.2.) This will remove your zip step.
    As far as transportable tablespace, yes, this is an option. There are some requirements, but if it works for you, all you will be exporting will be the metadata and no data. The data is copied from the source to the target by way of datafiles. One of the biggest requirements is that the tablespaces need to be read only while the export job is running. This is true for both exp/imp and expdp/impdp.
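    As a hedged illustration of the parallel and compression points (the directory object and file names are hypothetical; COMPRESSION=ALL requires 11g and the Advanced Compression option):
    expdp system/password full=y directory=dump_dir dumpfile=full_%U.dmp parallel=4 compression=all logfile=full_exp.log
    The %U substitution lets each parallel worker write to its own dump file, so PARALLEL=4 can actually run four workers.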

  • Exp/Imp In Oracle 10g client

    Hi All,
    I want to take an export of one schema from the Oracle 10g client. I am new to Oracle 10g.
    In Oracle 9i, we could use the exp/imp commands for export and import from the client itself.
    I heard that in Oracle 10g, we can run expdp/impdp on the server only. Is there any possibility to run exp/imp from the client as well?
    Please help me...
    Cheers,
    Moorthy.GS

    To add to that, regarding expdp from the Oracle client:
    NETWORK_LINK
    Default: none
    Purpose
    Enables an export from a (source) database identified by a valid database link. The data from the source database instance is written to a dump file set on the connected database instance.
    Syntax and Description
    NETWORK_LINK=source_database_link
    The NETWORK_LINK parameter initiates an export using a database link. This means that the system to which the expdp client is connected contacts the source database referenced by the source_database_link, retrieves data from it, and writes the data to a dump file set back on the connected system.
    The source_database_link provided must be the name of a database link to an available database. If the database on that instance does not already have a database link, you or your DBA must create one. For more information about the CREATE DATABASE LINK statement, see Oracle Database SQL Reference.
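    A minimal sketch of such a link (the user, password, and TNS alias are hypothetical):
    CREATE DATABASE LINK source_db CONNECT TO hr IDENTIFIED BY hr USING 'SRC_TNS';
    The expdp command would then reference it as NETWORK_LINK=source_db.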
    If the source database is read-only, then the user on the source database must have a locally managed tablespace assigned as the default temporary tablespace. Otherwise, the job will fail. For further details about this, see the information about creating locally managed temporary tablespaces in the Oracle Database Administrator's Guide.
    Restrictions
    When the NETWORK_LINK parameter is used in conjunction with the TABLES parameter, only whole tables can be exported (not partitions of tables).
    The only types of database links supported by Data Pump Export are: public, fixed-user, and connected-user. Current-user database links are not supported.
    Example
    The following is an example of using the NETWORK_LINK parameter. The source_database_link would be replaced with the name of a valid database link that must already exist.
    expdp hr/hr DIRECTORY=dpump_dir1 NETWORK_LINK=source_database_link DUMPFILE=network_export.dmp LOGFILE=network_export.log

  • Running OMBPlus and EXP/IMP in mixed version environment

    OWB Mixed Environment Gurus,
    Current environment:
    OWB Client: 10.1.0.2.0 on Windows XP Professional
    OWB Server side: 10.1.0.2.0 on UNIX (AIX 5.2)
    Repository: Oracle 9.2.0.4 on UNIX (AIX 5.2)
    UNIX Listener: 9.2.0.4 on UNIX (AIX 5.2)
    Runtime Repository: Oracle 9.2.0.4 on UNIX (AIX 5.2)
    I call this a mixed environment since my OWB stuff is 10g and my database stuff is 9.2.
    Issues:
    1- I can't get the command-line exp.sh script to connect to the repository; it returns the famous 'ORA-12154, TNS:listener does not currently know of service requested in connect descriptor'. It looks like the 'owbsetenv.sh' script is changing the value of $ORACLE_HOME to point to the 10g areas. Could that then be causing the system to look for a 10g listener, which doesn't exist since all my databases are 9.2.0.4?
    2- I have the same issue trying to run OMBPlus.sh.
    I am ultimately trying to set up a promotion process using the UNIX command line programs (exp/imp and OMBPlus) to get objects from the TEST environment into the PRODUCTION environment which is a separate repository and target schema on a different machine.
    Any advice on how to successfully operate in this 'mixed' environment is most welcomed.
    Many thanks!
    Gary

    Well it looks like I did it again!
    Total brain fart.
    The problem turned out to be that I wasn't specifying the entire SERVICE_NAME for the repository database; I had been leaving off the domain information. It must be a habit from not having to use it in the TNSNAMES.ORA files.
    I was able to complete my test export and connect to OMBPlus, and will now try my test import.
    Sorry to clutter the forum but if it helps anyone else with the same affliction I seem to have frequently, I guess that's a small reward.
    Until next time.
    Gary

  • Best way of using exp/imp

    Dear all,
    I want to migrate a database from 8i to 11g (8.1.5 to 11.1.0). I am going with the exp/imp method. What is the best way of doing this task: a full export and import, or schema-wise export and import? Is there any chance of missing objects or rows while doing this? If yes, how do I avoid it? Please help me make the best decision; I don't want any problems after the migration.
    The approach is: take an exp of 8.1.5 and imp into 9.2.0, then exp from 9.2.0 and imp into 11.1.0.6.
    OS is HP Unix
    Nishant Santhan

    Have you not yet completed this task? We already answered your question a couple of days back.
    Take a look at the similar duplicate thread created by you:
    Re: Migrating from 8i to 11g
    Regards,
    Sabdar Syed.
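    For reference, the two-hop approach described in the question would look roughly like this (connect strings and file names are hypothetical; run each exp with the client of the source version and each imp with the client of the target version):
    exp system/password@db815 full=y file=full815.dmp log=exp815.log
    imp system/password@db920 full=y ignore=y file=full815.dmp log=imp920.log
    exp system/password@db920 full=y file=full920.dmp log=exp920.log
    imp system/password@db11g full=y ignore=y file=full920.dmp log=imp11g.log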
