Is it sometimes recommended to Exp/Imp a production database?

Hi there;
Related to [my post from yesterday|http://forums.oracle.com/forums/message.jspa?messageID=6498780#6498780] about AWR reports, which I have been researching on Google:
Is it recommended to export and import a production schema?
In other databases (not comparing them to Oracle), this operation is sometimes recommended to realign indexes and rebuild tables.
So maybe, for a non-expert, it "may be" a solution, or it may just be a non-expert solution!
Thanks;
Marcos Ortega

The similarity between Oracle and other RDBMS products (actually between any two RDBMS products) ends at "SELECT * FROM EMP;". The internals are vastly different. What is considered best management practice for one product could be worst management practice for another.
In the case of Oracle, it generally does NOT benefit from constant re-organization. There have been many, many papers published on this. See especially Richard Foote's work on Oracle indexing.
If you have an observed performance issue, you need to track down the specific cause and address that.
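If the goal is to follow up on the AWR reports mentioned in the linked post rather than reorganize objects, the standard report script can be run from SQL*Plus on the database server (this assumes a 10g or later database with the Diagnostics Pack licensed; it only helps locate the specific cause, it is not a fix in itself):

    SQL> -- generate an AWR report for a chosen pair of snapshots
    SQL> @?/rdbms/admin/awrrpt.sql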

Similar Messages

  • Exp/imp for upgrading database

    Hi,   Is it a good idea to use exp/imp to upgrade a database from 9i to 11g? To get some parallelism, I plan to set all tables' degree to 8 manually and then start the export. Which is better: a conventional upgrade or exp/imp?

    Hi,
    Please check Oracle Support note: How to Perform a Full Database Export Import during Upgrade, Migrate, Copy, or Move of a Database (Doc ID 286775.1).
    Thank you
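    Purely as an illustration (the commands below are not taken from the note; passwords, file names and connect strings are placeholders), a conventional full export on the 9i side followed by an import into the 11g database might look like this:
        exp system/<password> full=y file=full9i.dmp log=full9i_exp.log statistics=none
        imp system/<password>@db11g full=y file=full9i.dmp log=full9i_imp.log ignore=y
    Doc ID 286775.1 covers the exact steps and prerequisites; treat the above only as a sketch.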

  • Full Database Exp & Imp

    Hi,
    I am trying to Exp & Imp a full database. I am working on Oracle 9i & 10g on Solaris 9. I am not using Data Pump. Can anyone please help me with the following:
    1. I am performing the full export using SYSTEM user.
    1.a Does a Full export include (or back up) the DATA DICTIONARY of the database?
    1.b Does a Full export include a backup of SYS and SYSTEM objects?
    I am using the following command to export
    exp system/system@testdb file=$HOME/testdbfullexp.dmp full=y statistics=none
    I have tried importing the FULL export into another database and I did see that SYS and SYSTEM objects were also being imported (I got some errors regarding constraints and inconsistencies).
    I would like to ask what the ideal steps are to copy a database from DB1 to DB2 using EXP and IMP.
    Any information will be of a great help
    Thanks,
    Harris.

    1 a) No, as the data dictionary will be automagically recreated by implicit SQL.
    This means any non-dictionary objects under SYS will be lost.
    1 b) As above. SYSTEM, however, is a normal user.
    Any %SYS user will NOT be exported (CTXSYS, MDSYS, etc.).
    On import there will always be errors for SYSTEM, as SYSTEM is non-empty after initial database creation.
    Sybrand Bakker
    Senior Oracle DBA
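    As an aside, if you want to see exactly what a full dump contains before importing it, imp can list the contents without loading anything (a sketch only; the password and file name are placeholders based on the command above):
        imp system/<password> full=y file=testdbfullexp.dmp show=y log=dump_contents.log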

  • How to exp/imp both diff character set tables between in DB1 and DB2?

    On Solaris 2.7, the Oracle 8i database DB1 has NLS_CHARACTERSET ZHS16CGB231280 and NLS_NCHAR_CHARACTERSET ZHS16CGB231280.
    On another system (Linux 7.2), the Oracle 8i database DB2 was installed with NLS_NCHAR_CHARACTERSET US7ASCII and NLS_CHARACTERSET US7ASCII.
    The table contents of DB1 include some Chinese. I want to exp/imp tables of DB1 into DB2, but the Chinese does not display correctly in the SQLWheet tool. How should the Exp/Imp operation be done? Please help me. Thanks.

    The supported way to store GB231280-encoded characters is to use a ZHS16CGB231280 database, or a database created with a superset of GB231280 such as UTF8. Can you not upgrade your target database from US7ASCII to ZHS16CGB231280?
    With US7ASCII and NLS_LANG set to US7ASCII, you are using the garbage in, garbage out (GIGO) approach. This may seem to work, but there are many hidden problems:
    1. Invalid SQL string function behaviour - LENGTH(), SUBSTR(), INSTR()
    2. Data can be corrupted when it is loaded into another database, e.g. EXP/IMP, DB links
    3. Communication with other clients will generate incorrect results, e.g. other Oracle products - Oracle Text, Forms, Java, HTML etc.
    4. Linguistic sorts are not available
    5. Queries using the standard WHERE clause may return incorrect results
    6. Extra coding overhead in handling character conversions manually
    I recommend you check out the FAQ and the DB Character Set migration guide on the Globalization Support forum on OTN.
    Nat.
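    A quick way to confirm what character sets the two databases actually use (not part of the original reply, just a sanity check before deciding on a migration):
        SELECT parameter, value
          FROM nls_database_parameters
         WHERE parameter IN ('NLS_CHARACTERSET', 'NLS_NCHAR_CHARACTERSET');
    On the client, NLS_LANG should then describe the data being handled, for example (hypothetical setting) export NLS_LANG="SIMPLIFIED CHINESE_CHINA.ZHS16CGB231280" before running exp/imp.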

  • Exp/Imp alternatives for large amounts of data (30GB)

    Hi,
    I've come into a new role where various test databases are to be 'refreshed' each night with cleansed copies of production data. They have been using the Imp/Exp utilities with 10g R2. The export process is OK, but what's killing us is the time it takes to transfer, unzip, and import 32GB .dmp files. I'm looking for suggestions on what we can do to reduce these times. Currently the import takes 4 to 5 hours.
    I haven't used datapump, but I've heard it doesn't offer much benefit when it comes to saving time over the old imp/exp utilities. Are 'Transportable Tablespaces' the next logical solution? I've been reading up on them and could start prototyping/testing the process next week. What else is in Oracle's toolbox I should be considering?
    Thanks
    brian

    Hi,
    brian wrote:
    I haven't used datapump, but I've heard it doesn't offer much benefit when it comes to saving time over the old imp/exp utilities.
    Data Pump will be faster for a couple of reasons. It uses direct path to unload the data. Data Pump also supports parallel processes, so while one process is exporting metadata, the other processes can be exporting the data. In 11g, you can also compress the dump files as you are exporting. (Both data and metadata compression are available in 11g; I think metadata compression is available in 10.2.) This will remove your zip step.
    As far as transportable tablespaces go, yes, this is an option. There are some requirements, but if it works for you, all you will be exporting is the metadata and no data. The data is copied from the source to the target by way of the datafiles. One of the biggest requirements is that the tablespaces need to be read only while the export job is running. This is true for both exp/imp and expdp/impdp.
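    For example (a sketch only, not a tested command; the directory, dump file names and parallel degree are placeholders), an 11g Data Pump export using parallelism and compression might look like:
        expdp system/<password> full=y directory=DATA_PUMP_DIR dumpfile=full_%U.dmp parallel=4 compression=all logfile=full_expdp.log
    The %U in the dump file name lets each parallel worker write its own file, and COMPRESSION=ALL (licensing permitting) is the 11g option that would replace the separate zip step.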

  • Export / import  exp / imp commands Oracle 10gXE on Ubuntu

    I have Oracle 10gXE installed on Linux 2.6.32-28-generic #55-Ubuntu, and I need some help on how to export / import the database with the exp / imp commands. The commands seem to be installed in the */usr/lib/oracle/xe/app/oracle/product/10.2.0/server/bin* directory, but I cannot execute them.
    The error message I got =
    No command 'exp' found, did you mean:
    Command 'xep' from package 'pvm-examples' (universe)
    Command 'ex' from package 'vim' (main)
    Command 'ex' from package 'nvi' (universe)
    Command 'ex' from package 'vim-nox' (universe)
    Command 'ex' from package 'vim-gnome' (main)
    Command 'ex' from package 'vim-tiny' (main)
    Command 'ex' from package 'vim-gtk' (universe)
    Command 'axp' from package 'axp' (universe)
    Command 'expr' from package 'coreutils' (main)
    Command 'expn' from package 'sendmail-base' (universe)
    Command 'epp' from package 'e16' (universe)
    exp: command not found
    Is there something I have to do ?

    Hi,
    You have not set environment variables correctly.
    http://download.oracle.com/docs/cd/B25329_01/doc/install.102/b25144/toc.htm#BABDGCHH
    And of course that script has a small hiccup, so see
    http://ubuntuforums.org/showpost.php?p=7838671&postcount=4
    Regards,
    Jari
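    For reference, setting the variables by hand in the shell (the path is taken from the question; ORACLE_SID=XE is the usual default for Express Edition) would look something like:
        export ORACLE_HOME=/usr/lib/oracle/xe/app/oracle/product/10.2.0/server
        export PATH=$ORACLE_HOME/bin:$PATH
        export ORACLE_SID=XE
        exp help=y    # should now resolve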

  • Running OMBPlus and EXP/IMP in mixed version environment

    OWB Mixed Environment Guru's
    Current environment:
    OWB Client: 10.1.0.2.0 on Windows XP Professional
    OWB Server side: 10.1.0.2.0 on UNIX (AIX 5.2)
    Repository: Oracle 9.2.0.4 on UNIX (AIX 5.2)
    UNIX Listener: 9.2.0.4 on UNIX (AIX 5.2)
    Runtime Repository: Oracle 9.2.0.4 on UNIX (AIX 5.2)
    I call this a mixed environment since my OWB stuff is 10g and my database stuff is 9.2.
    Issues:
    1- I can't get the command line exp.sh script to connect to the repository; it returns the famous 'ORA-12154, TNS:listener does not currently know of service requested in connect descriptor'. It looks like the 'owbsetenv.sh' script is changing the value of $ORACLE_HOME to point to the 10g areas. Could that then be causing the system to look for a 10g LISTENER, which doesn't exist since all my databases are 9.2.0.4?
    2- I have the same issue trying to run OMBPlus.sh.
    I am ultimately trying to set up a promotion process using the UNIX command line programs (exp/imp and OMBPlus) to get objects from the TEST environment into the PRODUCTION environment which is a separate repository and target schema on a different machine.
    Any advice on how to successfully operate in this 'mixed' environment is most welcomed.
    Many thanks!
    Gary

    Well it looks like I did it again!
    Total brain fart.
    The problem turned out to be that I wasn't specifying the entire SERVICE_NAME for the repository database. I had been leaving off the domain information. Must be a habit from not having to use it in the TNSNAMES.ORA files.
    I was able to complete my test export and connect to OMBPlus and will now try my test import.
    Sorry to clutter the forum but if it helps anyone else with the same affliction I seem to have frequently, I guess that's a small reward.
    Until next time.
    Gary
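    For anyone hitting the same thing: the fix amounts to using the fully qualified service name. A tnsnames.ora entry along these lines (host, port and domain are hypothetical) is what the connection has to resolve to:
        OWBREP =
          (DESCRIPTION =
            (ADDRESS = (PROTOCOL = TCP)(HOST = aixhost)(PORT = 1521))
            (CONNECT_DATA = (SERVICE_NAME = owbrep.mydomain.com))
          )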

  • Need help for full db exp-imp

    Hi,
    My database has undo segment corruption. I have considered and tried a lot of things to get out of the situation, but didn't get any positive result. So I have decided to take a full database export, rename the database, create a new database on the same system, and import the full database into the new database. Will you please tell me the steps to do this (full db import)? Please include sensible syntax for the full db exp/imp.
    Thanks a lot

    If your only identified problem was sudden undo corruption to an online instance and you followed the steps referenced on the Burleson site (switch to manual undo management, create a new undo tablespace, drop the old undo tablespace, and switch back to automatic undo management) and then the very next day encountered undo corruption again, I would recommend two simultaneous courses of action:
    1) Get the hardware analyzed, this could be a signal of hard drive or controller problems.
    2) Scour metalink for any bug references that may be related (or just go ahead and open an iTar).
    If you persist in importing to create the database back on this same system and if you must use the same storage, I would at least try to get the file system or volume that the corruption was on taken offline and recreated.
    Just an alternate view of things ... good luck!
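    For completeness, the undo-tablespace swap referenced above usually runs along these lines (a sketch only; tablespace and datafile names are placeholders, and the instance has to be restarted for the static parameter changes to take effect):
        ALTER SYSTEM SET undo_management = MANUAL SCOPE=SPFILE;
        -- restart the instance so it no longer depends on the corrupt undo segments
        CREATE UNDO TABLESPACE undotbs2 DATAFILE '/u01/oradata/db/undotbs2_01.dbf' SIZE 500M;
        DROP TABLESPACE undotbs1 INCLUDING CONTENTS AND DATAFILES;
        ALTER SYSTEM SET undo_tablespace = UNDOTBS2 SCOPE=SPFILE;
        ALTER SYSTEM SET undo_management = AUTO SCOPE=SPFILE;
        -- restart once more to return to automatic undo management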

  • Migrating from 9i to 10g through exp/imp

    Hi,
    We need to migrate a database on 9i on HP-UNIX to 10g on IBM AIX. Can you please tell me whether I can export data in 9i with the 9i export utility and import it into 10g using the 10g Data Pump utility for a faster import?
    We don't have the 10g software installed in the HP-UNIX platform.
    And what other alternatives we have to reduce the amount of time for the migration process as it's a critical Production database ?
    Thanks,
    Jayanta

    I believe that Justin is correct that if you use exp to unload the data you will need to use imp and not impdp to reload the data.
    To speed up the cross-platform exp/imp process I suggest you consider running multiple concurrent exp/imp jobs. You can generate a tables= list into a spool file using SQL. You can give each large table its own exp job and bunch the small tables together.
    Depending on how your database objects are organized, you might also be able to export by owner, or a combination of owner= and tables= exports.
    You can use a full=y export with the rows=n option to grab the public synonyms, non-owning users, packages, etc. not brought in by the prior exp/imp.
    HTH -- Mark D Powell --
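    As a rough illustration of that approach (owner and table names are hypothetical, passwords are placeholders), the concurrent jobs could be launched like this on the 9i host:
        exp system/<password> tables=APP.BIG_TABLE1 file=big_table1.dmp log=big_table1.log &
        exp system/<password> tables=APP.SMALL_TAB1,APP.SMALL_TAB2 file=small_tabs.dmp log=small_tabs.log &
        exp system/<password> full=y rows=n file=full_norows.dmp log=full_norows.log &
        wait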

  • Exp/Import database schema vs Ch 10 Exp/Imp Content

    Hi I'm using Portal 10.1.2.02. I'm a DBA tasked with migrating a Dev portal into Test, then Production.
    There seems to be a number of ways to achieve this and I was hoping to clarify which is best for me.
    I've run through note 330391.1 (copying the Portal schema), which details running a perl script to export and import onto a newly installed server. This process worked well.
    Chapter 10 of the Portal guide details a fairly complex process of creating transport sets, migrating these over and importing them.
    My question is: if I want the entire Portal copied, is there any difference between these processes? i.e. do you end up with the same result?
    thanks in advance of any advice :-)

    The cloning model is not a rerunnable model; it's not granular, and conditional migration is not possible.
    You can do the cloning only if you want to take a copy of the entire setup and rewire it to a new midtier.
    The Portal exp/imp model helps you achieve a granular, conditional and rerunnable method of moving Portal objects.
    Apart from that, it comes with readily available prechecks to indicate what's going wrong during the process.

  • Exp/imp full or transportable tablespaces?

    Hi Experts,
    DB version: 10.2.0.4 64 bit enterprise edition
    OS version: Windows 2003 R2 64 bit
    The database is moving from 10.2.0.4 Enterprise Edition to Standard Edition.
    I have gone through Metalink and found
    *10G : Step by Step Procedure to Migrate from Enterprise Edition to Standard Edition [ID 465189.1]*
    *Converting An Enterprise Edition Database To Standard Edition [ID 139642.1]*
    So I took a full export of the database; before that I had created a DBLINK and some tables.
    I imported it into Standard Edition, but the DBLINKs are missing. Will grants be imported?
    Thanks.

    hi guru,
    My expectation is that the entire database should move to the new server with Standard Edition. I have read articles from Metalink; according to them, traditional export/import is the better option.
    I tested that process:
    1) Production database:
    a) created some tables and one database link
    b) exp system/*** full=y file=exp.dmp log=exp.log
    2) On the other side, I created a dummy database using DBCA, and created the same tablespaces as exist on the export side.
    a) imp system/*** full=y file=exp.dmp log=imp.log
    After the successful import, I checked the objects but the DBLINK from production is missing.
    1) My question is: do any other parameters need to be passed to the import to perform a full export/import?
    2) Will all the schemas be imported, or do they need to be created manually?
    3) Will grants be imported?
    4) What about constraints?
    What about ignore=y rows=n? I want to export the rows as well (the entire database).
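    One way to check what actually came across after the full import (a quick sanity check, not from the thread; APP is a hypothetical schema name):
        SELECT owner, db_link, host FROM dba_db_links;
        SELECT grantee, privilege FROM dba_sys_privs WHERE grantee = 'APP';
        SELECT grantee, table_name, privilege FROM dba_tab_privs WHERE grantee = 'APP';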

  • Refresh using exp/imp

    Hi All,
    I need to refresh one of my development databases using the exp/imp utility. I have done refreshes using copied datafiles, but I'm new to this method. I have gone through this forum and did some searching on Google too, but the part I'm not getting is this: suppose my development db is already created with the same tablespace names and sizes, contains every schema of the source, and has data in it from an earlier refresh. Do I need to wipe out everything and do it right from scratch (create db, tablespaces, users), or just wipe out the tables and import with ignore=y? Any feedback would be appreciated.
    Thanks

    Hi,
    >> do i need to wipe out everything and do it right from scratch (create db, tablespace, users)
    Well, you have choices, and you can do whatever you want, but there are some "best practices". I'm assuming that you are copying your PROD database into DEV (it's a great practice to have a full-sized copy of production in TEST and DEV):
    - You can restore PROD into TEST using RMAN.
    - You can use the Oracle Universal Installer to clone a database: http://download-east.oracle.com/docs/html/B16227_02/oui7_cloning.htm
    - You can "clone the whole database", moving the current datafiles and recreating the instance.
    - You can export from PROD, truncate the tables in DEV (after a backup) and import with IGNORE=Y. This can be very slow, especially for a large database. However, there are some tips to speed up imports: http://asktom.oracle.com/pls/ask/f?p=4950:8:::::F4950_P8_DISPLAYID:1240595435323
    Check the Oracle docs (http://search.oracle.com/). Also, Dave Moore has a book on Oracle Utilities that you might like: http://www.amazon.com/Oracle-Utilities-Programs-Oradebug-Dbverify/dp/0972751351
    Hope this helps . . . .
    Don Burleson
    www.dba-oracle.com
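    A bare-bones version of the export/truncate/import option above, for orientation only (schema names, connect strings and file names are placeholders; this is a sketch, not the exact procedure from the links):
        # on production
        exp system/<password>@prod owner=APP file=app.dmp log=app_exp.log statistics=none
        # on development, after a backup and after truncating the APP tables
        imp system/<password>@dev fromuser=APP touser=APP file=app.dmp ignore=y log=app_imp.log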

  • Does data pump really replace exp/imp?

    hi guys,
    I've read some people saying we should be using Data Pump instead of exp/imp. But as far as I can see, if I have a database behind a firewall at some other place, and cannot connect to that database directly and need to get some data across, then Data Pump is useless for me and I can only exp and imp the data.

    OracleGuy777 wrote:
    ...and I guess this means that data pump does not replace exp and imp.
    Well, depending on your database version, it does.
    "Original Export is desupported for general use as of Oracle Database 11g. The only supported use of original Export in 11g is backward migration of XMLType data to a database version 10g release 2 (10.2) or earlier. Therefore, Oracle recommends that you use the new Data Pump Export and Import utilities, except in the following situations which require original Export and Import:
    * You want to import files that were created using the original Export utility (exp).
    * You want to export files that will be imported using the original Import utility (imp). An example of this would be if you wanted to export data from Oracle Database 10g and then import it into an earlier database release."
    http://download.oracle.com/docs/cd/E11882_01/server.112/e10701/original_export.htm#SUTIL3634
    Coming back on your problem, as already suggested, you have the NETWORK_LINK parameter, you're able to export data from source and import directly into your target db without the need of any intermediate file.
    Nicolas.
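    For the firewall scenario, the NETWORK_LINK route looks roughly like this (SOURCE_LINK is a hypothetical database link created on the target database, pointing at the source; it only works if the target can reach the source's listener over SQL*Net). First, on the target database:
        CREATE DATABASE LINK source_link CONNECT TO system IDENTIFIED BY <password> USING 'sourcedb';
    then, from the target host:
        impdp system/<password> network_link=SOURCE_LINK full=y logfile=net_imp.log
    If there is no SQL*Net connectivity at all between the two sites, dump-file based expdp/impdp (or old exp/imp) remains the fallback.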

  • Using exp/imp to clone a database from Solaris to NT

    Hi All,
    We have an 8.1.6 Oracle database in production under Sun Solaris.
    We actually use this database to run Siebel CRM system.
    We have a Windows NT server which we used as a QA system
    for this application.
    We have a daily full export of our production database on the
    Sun machine. Now my manager wants to import this production
    dmp file into this Win NT database on a weekly basis.
    The goal here is to keep this NT database as functional and as close to the production one as possible.
    I would appreciate your help on this.
    I have some other general questions:
    In which case can the Oracle imp utility be run in full=y mode? Does this make any sense at all?
    Since the Oracle import utility does not delete any existing objects before it imports, these objects should be deleted manually, and that's what I do with all the Siebel users' schemas; of course I do not do this with SYS, SYSTEM and the other special Oracle users. Question: are there any objects in these special Oracle schemas that have to be carried over from the source database to the target database to make the rest of the schemas functional?
    Somehow I failed to find Oracle documentation on how to use exp/imp to create a complete clone of an Oracle database.
    Can direct path be used with full=y export/import mode?
    Thank you in advance,
    Yuriy.

    As an alternative to using daily/weekly exports and imports, what would probably be more effective here would be replication. The databases could sync on a nightly basis, and updated tables and rows would be transferred from your OLTP system to your read-only system. More information on replication can be found in "Oracle8i DBA Handbook" by Kevin Loney and Marlene Theriault.
    LM
    PS. IGNORE=Y
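    On the direct path question: exp does accept direct=y together with full=y, so the weekly refresh could be scripted roughly like this (a sketch only; passwords, connect strings and file names are placeholders):
        # on the Solaris production server
        exp system/<password> full=y direct=y file=prod_full.dmp log=prod_full_exp.log
        # on the NT box, after dropping and recreating the Siebel schemas
        imp system/<password> full=y ignore=y file=prod_full.dmp log=prod_full_imp.log
    Direct path only affects the export side; imp always loads with conventional inserts.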

  • Exp/imp related problem

    Hi all,
    I have a problem with Oracle 9.2.0.8 on RHEL4. I want to use exp/imp to update one table from one server to another.
    Suppose I have a table abc on server A with around 1 million records, and server B has the same number of records; some of the records have been changed on server A and I want to import those same records to server B. Is there any parameter in IMP for this?
    Please suggest.

    Hi,
    Kamran Agayev A. wrote:
    user00726 wrote:
    but how would I know which table has been updated or modified
    Using the MERGE function, you'll merge two tables. This command will update changed rows and insert missing rows into the second table, making it the same as the first table.
    SQL> create table azar(pid number,sales number,status varchar2(20));
    Table created.
    SQL> create table azar01(pid number,sales number,status varchar2(20));
    Table created.
    SQL> insert into azar01 values(1,12,'CURR');
    1 row created.
    SQL> insert into azar01 values(2,13,'NEW');
    1 row created.
    SQL> insert into azar01 values(3,15,'CURR');
    1 row created.
    SQL> insert into azar values(2,24,'CURR');
    1 row created.
    SQL> insert into azar values(3,0,'OBS');
    1 row created.
    SQL> insert into azar values(4,42,'CURR');
    1 row created.
    SQL> commit;
    Commit complete.
    SQL> select * from azar01;
           PID      SALES STATUS
    ---------- ---------- --------------------
             1         12 CURR
             2         13 NEW
             3         15 CURR
    SQL> select * from azar;
           PID      SALES STATUS
    ---------- ---------- --------------------
             2         24 CURR
             3          0 OBS
             4         42 CURR
    SQL> merge into azar01 a using azar b on (a.pid=b.pid) when matched
    2 then update set a.sales=a.sales + b.sales, a.status=b.status
    3 delete where a.status='OBS'
    4 when not matched
    5 then insert values(b.pid,b.sales,'NEW');
    3 rows merged.
    SQL> select * from azar01;
           PID      SALES STATUS
    ---------- ---------- --------------------
             1         12 CURR
             2         37 CURR
             4         42 NEW
    Hello Sir, is this correct?
    Regards
    S.Azar
    DBA
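    To apply the same idea across two servers rather than two local tables, the source table can be referenced over a database link (a sketch only; source_link and the id/col1 columns of abc are hypothetical):
        -- run on server B; source_link points at server A
        MERGE INTO abc b
        USING abc@source_link a
        ON (b.id = a.id)
        WHEN MATCHED THEN
          UPDATE SET b.col1 = a.col1
        WHEN NOT MATCHED THEN
          INSERT (id, col1) VALUES (a.id, a.col1);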
