Cloning Oracle 8 db using export

Hi,
I'm going to clone a database from an export dump file. I haven't tried this yet. I found instructions at http://oradbapro.blogspot.com/2009/04/clone-using-export-dump.html, but I'm not sure how accurate they are.
So my question is: are the steps outlined in the link accurate?
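For orientation, the general flow of an export-based clone usually looks something like the rough sketch below (account and file names are placeholders; the blog post's exact steps may differ):
1. On the source: exp system/password full=y file=D:\fullbkup.dmp log=D:\fullexp.log
2. Create an empty target database (same character set) with the required tablespaces and users.
3. On the target: imp system/password full=y ignore=y file=D:\fullbkup.dmp log=D:\fullimp.log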

imp username/password file=D:\fullbkup.dmp log=D:\imp.log full=y show=y
This will just create the log file with the relevant SQL statements; it does not import anything into the database.
Type imp help=y for the full list of options and their descriptions:
Keyword  Description (Default)       Keyword      Description (Default)
USERID   username/password           FULL         import entire file (N)
BUFFER   size of data buffer         FROMUSER     list of owner usernames
FILE     input files (EXPDAT.DMP)    TOUSER       list of usernames
SHOW     just list file contents (N) TABLES       list of table names
IGNORE   ignore create errors (N)    RECORDLENGTH length of IO record
GRANTS   import grants (Y)           INCTYPE      incremental import type
INDEXES  import indexes (Y)          COMMIT       commit array insert (N)
ROWS     import data rows (Y)        PARFILE      parameter filename
LOG      log file of screen output   CONSTRAINTS  import constraints (Y)
DESTROY                overwrite tablespace data file (N)
INDEXFILE              write table/index info to specified file
SKIP_UNUSABLE_INDEXES  skip maintenance of unusable indexes (N)
FEEDBACK               display progress every x rows(0)
TOID_NOVALIDATE        skip validation of specified type ids
FILESIZE               maximum size of each dump file
STATISTICS             import precomputed statistics (always)
RESUMABLE              suspend when a space related error is encountered(N)
RESUMABLE_NAME         text string used to identify resumable statement
RESUMABLE_TIMEOUT      wait time for RESUMABLE
COMPILE                compile procedures, packages, and functions (Y)
STREAMS_CONFIGURATION  import streams general metadata (Y)
STREAMS_INSTANTIATION  import streams instantiation metadata (N)
VOLSIZE                number of bytes in file on each volume of a file on tape
My second question is: which account should I use to run imp to create the log file? I am just at step 2.
To execute the import, the account you use should have the IMP_FULL_DATABASE role granted to it.
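For illustration, a minimal preview run might look like this (the SYSTEM account and file names below are only placeholders; any account granted IMP_FULL_DATABASE will do). From SQL*Plus, as a DBA, confirm the role grant:
SELECT grantee FROM dba_role_privs WHERE granted_role = 'IMP_FULL_DATABASE';
Then run the preview from the OS prompt:
imp system/password file=D:\fullbkup.dmp log=D:\imp.log full=y show=y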
If you have access to Metalink, check Note 273140.1 - Clone A Db Independent Of Platform; it describes the procedure in detail.

Similar Messages

  • Problem using Export in Oracle 8.1.6.

    Hi,
I have a problem using the export utility in Oracle 8i.
I get these messages:
    EXP-00056: ORACLE error 942 encountered
    ORA-00942: table or view does not exist
    EXP-00000: Export terminated unsuccessfully
I tried running catproc.sql and catexp.sql, but that did not correct the problem.
    Please help me.
    Thanks

You may want to try the error lookup tool available at http://otn.oracle.com/pls/tahiti/tahiti.homepage?remark=tahiti. Even though the lookup tool is for version 8.1.7, you may still find the answer there. Otherwise, you can search the database discussion forums to see whether this issue has already been discussed.
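If you are on Unix, the oerr utility gives the same cause and action text locally; for example (facility and error number as reported above):
$ oerr ora 942
$ oerr exp 56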

  • Cloning Oracle database 8.1.7.0

    Hi,
We upgraded our database from Oracle 8.1.7.0 to Oracle 8.1.7.4 and applied the terminal patch set (8.1.7.4), as we have to upgrade it further to 10g Release 2.
After upgrading to 8.1.7.4, we were required to apply database fixes for the bugs mentioned.
One of the serious server issues noted after the upgrade is:
Changing the Database Character Set can corrupt LOBs (Note 118242.1)
We have gone through the note mentioned above; one of its points regarding cloning reads as follows:
If the database version is 8.1.7.3 or 8.1.7.4:
- install non-patched Oracle 8.1.7.0.0 from the original CD-Pack into a separate ORACLE_HOME
- take a full backup of the database
- create a clone database from this backup using the 8.1.7.0.0 version of the RDBMS software -- contact Oracle Support if you need help; the clone should open with the older non-patched software without any downgrade steps
- export all non-system tables with non-NULL CLOB and/or NCLOB columns from the clone; do not forget to set the NLS_LANG environment variable to the database character set for the export session; you can use the 8.1.7.0.0 version of the Export utility (see the sketch after these steps)
- truncate the exported tables, but in the original database
- reimport the exported tables into the original database; do not forget to set the NLS_LANG environment variable to the database character set for the import session; use the Import utility from the original ORACLE_HOME
- drop the clone database -- do not use the clone for other purposes, as it is not supported to run a patched database with the non-patched software.
How do I clone the Oracle database (8.1.7.0) on Windows?
Where can I find the cloning steps needed to accomplish the above on Windows?
    With Regards

    Hi,
Please refer to Steve Rea's pages for database cloning examples.
Cloning using datafile copy:
http://uaex.edu/srea/Cloning_a_Database_using_Datafile_Copy.htm
Cloning using export/import:
http://uaex.edu/srea/Cloning_a_Database_using_Export_Import.htm
Cheers,
    Francisco Munoz Alvarez
    www.oraclenz.com

  • Using export file in EM

I am using XP Pro and Oracle 10g.
In EM, under export files, I am trying to export to
c:\oracle\product\10.1.0\oradata\ord\example01
When it asked me to specify the full path and name for the export file on the database server machine, I entered
c:\oracle\product\10.1.0\oradata\ord\example01
and got the error "directory does not exist".
I then created that path as a directory object in SQL and granted read and write on it.
How do I get export files to work?

Try it from a command prompt and see what happens.
Run exp help=y for a listing of the parameters.
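For example, a minimal command-line run against that path might look like this (the username/password and file names are placeholders):
C:\> exp scott/tiger owner=scott file=c:\oracle\product\10.1.0\oradata\ord\example01.dmp log=c:\oracle\product\10.1.0\oradata\ord\example01.log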

  • Using export/import to migrate data from 8i to 9i

    We are trying to migrate all data from 8i database to 9i database. We plan to migrate the data using export/import utility so that we can have the current 8i database intact. And also the 8i and 9i database will reside on the same machine. Our 8i database size is around 300GB.
We plan to follow these steps:
    Export data from 8i
    Install 9i
    Create tablespaces
    Create schema and tables
    create user (user used for exporting data)
    Import data in 9i
Please let me know whether the parameter file below is correct for the export:
    BUFFER=560000
    COMPRESS=y
    CONSISTENT=y
    CONSTRAINTS=y
    DIRECT=y
    FEEDBACK=1000
    FILE=dat1.dmp, dat2.dmp, dat3.dmp (more filenames here)
    FILESIZE=2048GB
    FULL=y
    GRANTS=y
    INDEXES=y
    LOG=export.log
    OBJECT_CONSISTENT=y
    PARFILE=exp.par
    ROWS=y
    STATISTICS=ESTIMATE
    TRIGGERS=y
    TTS_FULL_CHECK=TRUE
    Thanks,
    Vinod Bhansali

I recommend you change some parameters and remove others:
BUFFER=560000
COMPRESS=y          -- this gives a better storage structure (it is good)
CONSISTENT=y
CONSTRAINTS=y
DIRECT=n            -- if you set this parameter to yes you can have problems with some objects
FEEDBACK=1000
FILE=dat1.dmp, dat2.dmp, dat3.dmp (more filenames here)
FILESIZE=2048GB
FULL=y
GRANTS=y            -- this value is the default (not necessary)
INDEXES=y
LOG=export.log
OBJECT_CONSISTENT=y -- start the database in restricted mode and do not set this parameter
PARFILE=exp.par
ROWS=y
STATISTICS=ESTIMATE -- this value is the default (not necessary)
TRIGGERS=y          -- this value is the default (not necessary)
TTS_FULL_CHECK=TRUE
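Putting those suggestions together, a trimmed parameter file might look like the sketch below (file names are simply the ones from the original post; adjust to your environment), invoked with exp system/password parfile=exp.par:
BUFFER=560000
COMPRESS=y
CONSISTENT=y
CONSTRAINTS=y
DIRECT=n
FEEDBACK=1000
FILE=dat1.dmp, dat2.dmp, dat3.dmp
FILESIZE=2048GB
FULL=y
INDEXES=y
LOG=export.log
ROWS=y
TTS_FULL_CHECK=TRUE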
You can see which parameters are not needed by running this command:
    [oracle@ozawa oracle]$ exp help=y
    Export: Release 9.2.0.1.0 - Production on Sun Dec 28 16:37:37 2003
    Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
    You can let Export prompt you for parameters by entering the EXP
    command followed by your username/password:
    Example: EXP SCOTT/TIGER
    Or, you can control how Export runs by entering the EXP command followed
    by various arguments. To specify parameters, you use keywords:
    Format: EXP KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
    Example: EXP SCOTT/TIGER GRANTS=Y TABLES=(EMP,DEPT,MGR)
    or TABLES=(T1:P1,T1:P2), if T1 is partitioned table
    USERID must be the first parameter on the command line.
Keyword     Description (Default)         Keyword      Description (Default)
USERID      username/password             FULL         export entire file (N)
BUFFER      size of data buffer           OWNER        list of owner usernames
FILE        output files (EXPDAT.DMP)     TABLES       list of table names
COMPRESS    import into one extent (Y)    RECORDLENGTH length of IO record
GRANTS      export grants (Y)             INCTYPE      incremental export type
INDEXES     export indexes (Y)            RECORD       track incr. export (Y)
DIRECT      direct path (N)               TRIGGERS     export triggers (Y)
LOG         log file of screen output     STATISTICS   analyze objects (ESTIMATE)
ROWS        export data rows (Y)          PARFILE      parameter filename
CONSISTENT  cross-table consistency(N)    CONSTRAINTS  export constraints (Y)
OBJECT_CONSISTENT    transaction set to read only during object export (N)
FEEDBACK             display progress every x rows (0)
FILESIZE             maximum size of each dump file
FLASHBACK_SCN        SCN used to set session snapshot back to
FLASHBACK_TIME       time used to get the SCN closest to the specified time
QUERY                select clause used to export a subset of a table
RESUMABLE            suspend when a space related error is encountered(N)
RESUMABLE_NAME       text string used to identify resumable statement
RESUMABLE_TIMEOUT    wait time for RESUMABLE
TTS_FULL_CHECK       perform full or partial dependency check for TTS
VOLSIZE              number of bytes to write to each tape volume
TABLESPACES          list of tablespaces to export
TRANSPORT_TABLESPACE export transportable tablespace metadata (N)
TEMPLATE             template name which invokes iAS mode export
    Export terminated successfully without warnings.
    [oracle@ozawa oracle]$
Joel Pérez

  • BLOB insertion in Oracle 10g database using ojdbc14 (10g drivers)

    Hello!
I have a situation where I am trying to insert BLOB data into an Oracle 10g database using the Oracle thin 10g driver, ojdbc14.jar, in WebLogic 8.1 SP2. I have the following error happening very intermittently:
java.sql.SQLException: OALL8 is in an inconsistent state.
This leads to the "No more data to read from socket" error when I try to insert the BLOB into the database. I have gone through the SP2 bug list and realised that the following issue is fixed in SP3:
CR124933
An Oracle BLOB sometimes used a pooled connection after the connection pool determined that the connection was available for reassignment. Code was added to ensure the BLOB is completely processed before closing the pool connection or ending the transaction.
I believe the problem arises when we try to insert the BLOB into the database using a refreshed connection from the pool. We have upgraded WebLogic 8.1 from SP2 to SP4 in order to get past the above problem, but it still behaves intermittently.
We put ojdbc14.jar in our classpath, and the WebLogic startup classpath looks like the following:
    We put ojdbc14.jar in our classpath and Weblogic startup classpath looks like the following :-
    WLS_CLASSPATH=${WLS_DOMAIN_DIR}/appslib/server.jar:$PRE_CLASSPATH:${WLS_WEBLOGIC_HOME}/server/lib/weblogic.jar:
    ${WLS_WEBLOGIC_HOME}/server/lib/ojdbc14.jar:${WLS_WEBLOGIC_HOME}/server/lib:${WLS_JAVA_HOME}/lib/tools.jar:
    ${WLS_JAVA_HOME}/jre/lib/rt.jar:${WLS_WEBLOGIC_HOME}/server/lib/webservices.jar:${WLS_CONFIG_DIR}:
    ${WLS_CUSTLIB_DIR}:${WLS_BIN_DIR}:$POST_CLASSPATH
    export WLS_CLASSPATH
    CLASSPATH=${WLS_CLASSPATH}:${APP_CLASSPATH}
    export CLASSPATH
After the upgrade to SP4, new ojdbc14_g.jar (debug jar) and orai18n.jar files were added in the ${WLS_WEBLOGIC_HOME}/server/ext/jdbc/oracle/10g directory.
Please let me know if I need to update the classpath with the new 10g jars in the ext/lib directory; any suggestions for inserting BLOBs using the ojdbc14 10g driver in a WebLogic 8.1 environment would be appreciated.
Following is the stack trace of the errors that I receive:
    <Oct 6, 2005 1:29:36 PM EDT> <Error> <JDBC> <BEA-001112> <Test "select count(*) from DUAL" set up for pool
    "MHUBPoolStage" failed with exception: "java.sql.SQLException: OALL8 is in an inconsistent state".>
    <Oct 6, 2005 1:29:36 PM EDT> <Info> <JDBC> <BEA-001128> <Connection for pool "MHUBPoolStage" closed.>
    <Oct 6, 2005 1:29:36 PM EDT> <Info> <JDBC> <BEA-001067> <Connection for pool "MHUBPoolStage" refreshed.>
    <Oct 6, 2005 1:29:36 PM EDT> <Info> <EJB> <BEA-010051>
    java.rmi.RemoteException: TransactionRequestManager.requestTransaction():
    Caught PersistnceException com.mortgagehub.busobj.PersistenceException: -5258: No more data to read from socket
    Please let me know if there is anything that I am missing.
    Thanks
    Pradeep G

Hi. This is something we'd like to diagnose. How is your application getting, using, and closing pool connections? The initial symptom seems to be an internal Oracle problem... Are you using standard JDBC or Oracle-specific calls?
Joe

  • Homogeneous system copy using export/import on WIn/Ora

    Dear experts,
I need to know the homogeneous system copy procedure using the export/import method for the following scenario, as we are in the process of migrating our SAP systems from one data center to another.
Source system:
ECC6/BI7/APO (ABAP stack)
OS: Windows 2003
DB: Oracle 10g
Target system: new hardware installed in a virtual environment (VMware)
No change of SID
ECC6/BI/APO
OS: Windows 2008
DB: Oracle 11g
It would be great if anyone could share documents/links (not the SAP homogeneous system copy guide) that you have prepared for a homogeneous system copy using export/import on Windows/Oracle, as so far I have only done system copies using backup/restore for the ABAP stack and export/import for the Java stack.
I am therefore not sure about the procedure/prerequisites/checklist I need to prepare; I have gone through the SAP guide, but it is not a step-by-step document.
    Cheers

There are two approaches for your project:
1) Use Oracle 11g on the target database, or
2) Continue with Oracle 10.2 on the target database.
If you want to use Oracle 11.2 on your target database, you have to check whether the installers are supported for an 11.2 database. Only if the installers are available can you install your target database with Oracle 11.2; otherwise use Oracle 10.2.0.4, which is supported on Windows 2008 R2. You can use that version of the Oracle database to build your target database. It works fine, absolutely no problem.
Later on you can upgrade Oracle 10.2 to 11.2.
If the installers are available you can also use Oracle 11.2; there is absolutely no problem. As far as I know the installers are available and supported with Oracle 11.2. Also check the PAM before arriving at any decision.

  • RAC to NON-RAC Cloning in R 12 using Rapid Clone

    Hi,
    We are using EBS 12.0.4 with DB version Oracle Database 10g Enterprise Edition Release 10.2.0.4.0.
Initially, RAC to non-RAC cloning was not supported using Rapid Clone, but as of Oct 19 Oracle has certified it.
Can anybody point me to the document that contains the step-by-step process for RAC to non-RAC cloning using Rapid Clone? I can't find it.
    Regards,
    Susmit

    Hi;
Please check the previous threads below:
    Cloning RAC to non RAC
    RAC -> NON RAC ebiz clone
    Re: Clone Oracle Apps 11.5.10.2 RacDB to Non-RAC DB
Regards,
    Helios

  • Cloning Oracle Applications Release 11i

    Hi,
for cloning Oracle Applications Release 11i, is it necessary to copy the jre, oui, and oraInventory directories from the source database tier file system to the target?
    Many thanks before.

Applications / EBS is discussed in this category: http://forums.oracle.com/forums/category.jspa?categoryID=11 (and there are more, if you look at the forums home page).
But in general, the oraInventory must exist to be able to patch or otherwise maintain the installation. I wouldn't think copying directories is a supported cloning method... Again, check with the proper forum, and use the Search function!

  • 9i to 11g migration using export and import

    Hi,
I have a requirement to migrate a 9i database to 11g using export and import.
Please let me know which parameters are required in 11g so that the database will run with better performance.
    thanks,

    You can start with the current parameters, except for COMPATIBLE (which must be 10.0.0 or higher in the 11g environment). However, you will have to remove parameters deprecated in 11g.
    See http://download.oracle.com/docs/cd/E11882_01/server.112/e17222/changes.htm#BABHACIE
    Then you can test performance and identify what you need to change.
    Hemant K Chitale
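For instance, once the 11g instance exists you might verify and adjust COMPATIBLE roughly like this (the value shown is only an example; it must be 10.0.0 or higher, and the instance must be restarted for the change to take effect):
SQL> SHOW PARAMETER compatible
SQL> ALTER SYSTEM SET COMPATIBLE='11.2.0' SCOPE=SPFILE;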

  • Oracle Outside In Image Export SDK API - DAInit() crashes

    Hi,
We use the DelayLoad concept to load the Oracle Outside In Image Export SDK DLLs only when they are required. If the necessary DLLs are not present, the DAInit() function call crashes. Is there any way it could throw an exception instead, so that we can catch and handle it? Or is there any other method to detect that the necessary DLLs are not present? Kindly help us in this regard.
    Thanks
    Revathi

Thanks for the update.
But I know this link, and I searched there for Outside In Technology. No match.
I did try the obvious things, else I wouldn't ask here.
And what about the broken download? Does nobody from Oracle read this? Or does nobody care? Does anyone know where I can report the broken link?
    Regards
    Kai

Are there performance benefits to reorganizing a database using export/import?

I have a production database on Oracle 9.2.0.5 that has been running for the last 3 years, since it was upgraded from 8.1.7. At that time we did a full export of the 8.1.7 database, created the 9.2 instance, and then imported all the application schemas.
The load on our database has been increasing and there is constant pressure from management to improve performance. We have looked at indexes many times, have plenty of memory for the SGA, and have tuned various init.ora parameters. Being a third-party package, the queries cannot be rewritten.
The application is a mix of OLTP and reporting; it is definitely more read than write.
Are there any benefits to reorganizing the database using export/import? That is, we would do a full export of the existing database, delete all objects from the application schemas, and then do schema imports. We would run dbms_stats again to recompute statistics. Of course, we would test all of this in a test environment before making changes in production.
I have heard different views on reorganization. Some people say it is useless; others say it can improve performance because the data will be packed more densely into fewer blocks.
Appreciate your feedback.

Hi,
Oracle gave us reorg utilities (dbms_redefinition) because Oracle does not do real-time reorganization, for performance reasons.
In some applications reorgs are critical to high performance, while in others they may make no difference.
Remember, a reorg simply puts the indexes and tables into their "optimal" pristine state.
The most striking benefit of table reorgs is when a "sparse" table experiences lots of full scans; after the reorg, response time can be cut in half.
Also, in cases where related rows are queried together, a reorg with row resequencing (like 10g sorted hash clusters) makes a big difference:
http://www.dba-oracle.com/t_table_row_resequencing.htm
But like I said, it depends on many factors . . .
Hope this helps . . .
    Donald K. Burleson
    Oracle Press author
    Author of "Oracle Tuning: The Definitive Reference"
    http://www.rampant-books.com/book_2005_1_awr_proactive_tuning.htm
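For reference, a minimal online-reorg sketch with DBMS_REDEFINITION might look like this (the APPUSER schema and ORDERS/ORDERS_INTERIM tables are hypothetical; indexes, constraints, and grants must be recreated on the interim table before finishing):
-- check that the table can be redefined online
EXEC DBMS_REDEFINITION.CAN_REDEF_TABLE('APPUSER', 'ORDERS');
-- create APPUSER.ORDERS_INTERIM with the desired storage layout, then:
EXEC DBMS_REDEFINITION.START_REDEF_TABLE('APPUSER', 'ORDERS', 'ORDERS_INTERIM');
EXEC DBMS_REDEFINITION.SYNC_INTERIM_TABLE('APPUSER', 'ORDERS', 'ORDERS_INTERIM');
EXEC DBMS_REDEFINITION.FINISH_REDEF_TABLE('APPUSER', 'ORDERS', 'ORDERS_INTERIM');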

  • Question about using export as a "hot" backup

    Hello.
I have been asked to do the following task (release 9.2.0.7): a kind of backup needs to be made every 12 hours of a 24*7 database, to a remote machine. I have no experience with this kind of task, and I have been talking to several workmates who made some suggestions.
The initial idea would be to make a consistent export (of the only user involved), for several reasons:
1. It is the simplest solution (I have no experience with backup and recovery).
2. There is only one file involved, which can be compressed and sent to the remote machine.
3. When a full export of that user is run early in the morning (say 4 a.m.), when no transactions are running, it takes only about 20 minutes.
4. It is not important how long the import would take.
But I have some doubts:
1. An export at 4 p.m. will run while transactions are being executed. I do not know whether the export can cause locking problems for those transactions.
2. I have been told there is an "incremental export" option, which I have never used, that could be useful in this case.
I would appreciate any advice an experienced DBA could give me.

Thanks for the answer.
I do not understand what "export as of 9.x version is obsolete" actually means; the database is 9.2.0.7 and I cannot change that.
There is one thing that is worrying me. I have just read the following at http://download.oracle.com/docs/cd/B10501_01/server.920/a96572/osbackups.htm#13346
Caution:
If you use Export to perform a logical backup, then you must export all data in a logically consistent way so that the backup reflects a single point in time. No one should make changes to the database while the Export takes place. Ideally, you should run the database in ALTER SYSTEM QUIESCE RESTRICTED mode while you export the data, so no regular users can access the data. Alternatively, you can quiesce the database before you export the data, and unquiesce the database afterward.
So maybe using an export is a bad idea.
What do you DBAs think?
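For reference, the quiesce-and-export approach described in that caution might look roughly like this (the schema and file names are placeholders, and QUIESCE requires the Resource Manager to be enabled):
SQL> ALTER SYSTEM QUIESCE RESTRICTED;
$ exp system/password owner=APPUSER consistent=y file=/backup/appuser.dmp log=/backup/appuser_exp.log
SQL> ALTER SYSTEM UNQUIESCE;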

  • How to reduce the size of cloned Oracle Applications Instance

How can we reduce the size of a cloned Oracle Applications instance, so as to save storage space for the cloned systems?
How can we remove unimplemented modules in the instance so that we can reclaim the space occupied by them?
Can anyone please advise on this?

mumbai,
I would recommend leaving it as it is.
If you can add some inexpensive disks to your test system, then do so.
The Apps DB storage reduction process is quite painful and isn't straightforward.
You can't just delete unused schemas and tablespaces (if they are not used, they are not consuming significant space anyway). The modules in Apps use each other's procedures and objects, so the effect of deleting one module (schema) is highly unpredictable.
If you do something like that, you will not be able to patch your cloned system afterwards.
Some things can be done, however, to reduce the size of a test Apps system:
- Decrease the UNDO and TEMP tablespaces; normally you do not need tablespaces as big as in production on a test system.
- If your REDO LOG files are quite big and you have a lot of redo groups, you can recreate smaller REDO log files (see the sketch after these suggestions).
- On the Apps tier you can delete output and log files.
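As an illustration of the redo log suggestion, recreating a smaller group might go roughly like this (the group numbers, path, and size are placeholders; a group can only be dropped once it is INACTIVE, hence the log switch and checkpoint first):
SQL> ALTER DATABASE ADD LOGFILE GROUP 4 ('/u01/oradata/TEST/redo04.log') SIZE 50M;
SQL> ALTER SYSTEM SWITCH LOGFILE;
SQL> ALTER SYSTEM CHECKPOINT;
SQL> ALTER DATABASE DROP LOGFILE GROUP 1;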
If you would still like to decrease the data volume in your test system, you need to look at tools that provide data-subsetting (extraction) capabilities for the Apps DB. This process has to be quite intelligent: the tool has to know the data structures in the APPS DB and extract data in a way that does not break the logical relationships between records. After such a process you will need to run several full exp/imp cycles to reduce the physical space consumed by the database. Besides the fact that these tools are quite expensive, you will need to spend a lot of your time implementing them.
BTW: what is the size of your production DB? Have you analyzed which module takes most of the space? Maybe you can identify the top 10 objects and try to archive data from those objects, preventing further growth of the production database.
    Just my 0.02£
    Yury
    Check this out:
    A.
    http://www.freelists.org/archives/ora-apps-dba/05-2006/msg00000.html
    B.
    - Users can subscribe to your list by sending email to
    ora-apps-dba-request_at_freelists.org with 'subscribe' in the Subject field
    C.
    http://www.freelists.org/archives/ora-apps-dba/05-2006/threads.html

  • Clone Oracle Applications 11i using Oracle Application Manager

    Hi All,
I am working from the following Metalink document:
Note 398619.1 - Clone Oracle Applications 11i using Oracle Application Manager (OAM Clone): https://metalink2.oracle.com/metalink/plsql/ml2_documents.showDocument?p_database_id=NOT&p_id=398619.1
I am a bit confused: if I am cloning with OAM and providing all the information about the target system, how does it work?
Is it that cloning with OAM will create the required templates, and I will just have to take a backup of my database and application, restore it on the target machine, and the cloning is done?
    Please suggest,
    Anchorage

Is it that cloning with OAM will create the required templates, and I will just have to take a backup of my database and application, restore it on the target machine, and the cloning is done?
This is not enough. You still need to complete the following steps on the target node:
    Task 5: Configure the Target System
    Task 6: Initiate Data Purge
    Task 7: Review Checklist

    Hi, We recently installed AWS Direct Connect which was successful but now we are looking at the best way to  automatically fail over if our Direct Connect fails to route via our back VPN. The setup Cisco 6500 distributes routes via OSPF internally to