Performance of a full database export.

Hi,
Is it possible to estimate or measure how much time is needed to export the whole database?
Should I use the parameter "direct=y" to speed up the export process?
Thanks very much.
Frank
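A rough way to estimate export time is data volume divided by sustained throughput. The numbers below are illustrative assumptions, not measurements; in practice you would time an export of one representative schema first and plug in your own figures:

```shell
# Back-of-envelope estimate: export time ~= data volume / sustained throughput.
# Both inputs are assumed values; replace them with numbers measured from a
# small test export on your own system.
db_gb=100        # size of data to export, in GB (assumed)
mb_per_s=10      # sustained exp throughput, in MB/s (assumed)
seconds=$(( db_gb * 1024 / mb_per_s ))
echo "estimated export time: ~$(( seconds / 60 )) minutes"
# -> estimated export time: ~170 minutes
```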

Hi,
I tried the export on the database server, but the error below was encountered.
Is it because the user (system) is different from the owner (ds_ap_dwh)?
Can anyone advise me? Thanks a lot.
Frank.
C:\>exp system/manager owner=ds_ap_dwh direct=y file=g:\oracle\export\ds_ap_dwh.dmp log=g:\oracle\export\ds_ap_dwh.log
Export: Release 9.2.0.1.0 - Production on Tue May 29 23:48:36 2007
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Connected to: Oracle9i Release 9.2.0.1.0 - Production
JServer Release 9.2.0.1.0 - Production
Export done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
About to export specified users ...
. exporting pre-schema procedural objects and actions
. exporting foreign function library names for user DS_AP_DWH
. exporting PUBLIC type synonyms
. exporting private type synonyms
. exporting object type definitions for user DS_AP_DWH
About to export DS_AP_DWH's objects ...
. exporting database links
. exporting sequence numbers
. exporting cluster definitions
EXP-00056: ORACLE error 24324 encountered
ORA-24324: service handle not initialized
EXP-00056: ORACLE error 24324 encountered
ORA-24324: service handle not initialized
EXP-00000: Export terminated unsuccessfully

Similar Messages

  • In general how is T series performance for data warehousing databases ~1 TB

    Hi,
    I am planning to use a Sun T3-2 for a data warehouse app: Oracle 11g, around 1 TB of data, batch jobs, and ~200 concurrent users.
    We are using an HP-UX Superdome server now (a node partition of 8 physical 750 MHz CPUs).
    I know that the T series servers are good for highly multithreaded applications. Can I consider the 11g database a highly multithreaded app?
    How about the batch jobs which run long SQL jobs? I don't think those can be considered multithreaded.
    Is anyone using these T series servers for databases around 1 TB in size?
    Please suggest.
    Thanks ..

    The T3-2 can do much more work than the old Superdome, but the T3-2 won't run single-threaded queries significantly faster than the old Superdome; it will just be able to run about 100x more of them.
    If you care most about price/performance, look at the X4800 or X4450 servers.
    If you care most about availability and performance, look at the M4000/M5000 servers.
    My experience with CoolThreads servers for Oracle DB: be 100% sure that what you're doing isn't single-thread dependent before deploying.

  • Poor performance for full screen desktop

    Hi,
    Full-screen desktop (displayed as kiosk) performance on Linux with GNOME (I believe it's the same for all window managers) is poor; even with a command such as ls you see the delays.
    It happens on the local network. Connection to the application server is SSH.
    SGD server - Solaris 10 , Sun Fire 280. Application server is regular modern PC.
    Regular windows performance is very good.
    Any suggestions ?
    Thanks

    I think you will find the poor performance is only with GTK applications.
    For example, if you go into a large directory of files, and do an ls -aRl, you will notice it is a lot slower with a gnome-terminal than it is with a plain xterm.
    I think 4.3 will resolve this performance issue.

  • Missing grant with full export/full import.

    We have the following problem which is unexplained:
    Using Oracle 9.0.1.4 we full export a database and we import it in full mode in another database which is also running Oracle 9.0.1.4. The database has 1 application schema which owns a lot of PL/SQL object packages and triggers.
    In the source database, all objects are valid and schema owner has SELECT privileges on V$SESSION and execute permission on DBMS_ALERT and DBMS_PIPE.
    I cannot tell you if these privileges have been granted directly or with a role.
    In the target database, package bodies and triggers using V$SESSION and DBMS_ALERT/DBMS_PIPE are invalid because the schema owner does not have the corresponding privileges.
    The schema owner already existed in the target database. Is it normal that the corresponding privileges have not been imported?
    I cannot give you the exp/imp command lines but I'm sure full export/full import have been run (I've checked the import log).
    Thanks.
    Message was edited by:
    Pierre Forstmann

    See MetaLink Note:97902.1
    When performing a full export of a database, neither objects belonging to SYS nor object privileges on SYS's objects (e.g. admin views such as DBA_SEGMENTS, etc.) are exported.
    To generate the missing privileges you can use the following script.
    Connect to the original database (the one full export was done from) as SYSTEM using sqlplus and run this script:
    set hea off
    set pagesize 0
    set feedback off
    spool objs_upg.sql
    select 'grant '||privilege||' on '||owner||'.'||
           table_name||' to '||grantee||' '||
           decode(grantable,'YES','WITH GRANT OPTION')||';'
    from dba_tab_privs
    where owner = 'SYS';
    select 'grant '||privilege||' ('||column_name||') '||
           ' on '||owner||'.'||table_name||' to '||grantee||' '||
           decode(grantable,'YES','WITH GRANT OPTION')||';'
    from dba_col_privs
    where owner = 'SYS';
    spool off
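    For illustration, the kind of line the script above spools can be mocked up outside the database. Everything here (the privilege, table, and the APP_OWNER grantee) is a made-up sample row, not data from the post; the printf simply mirrors the SQL concatenation:

    ```shell
    # One sample row pushed through the same concatenation as the SQL above.
    privilege="EXECUTE"; owner="SYS"; table_name="DBMS_ALERT"
    grantee="APP_OWNER"; grantable="NO"
    opt=""
    if [ "$grantable" = "YES" ]; then opt=" WITH GRANT OPTION"; fi
    printf 'grant %s on %s.%s to %s%s;\n' "$privilege" "$owner" "$table_name" "$grantee" "$opt"
    # -> grant EXECUTE on SYS.DBMS_ALERT to APP_OWNER;
    ```

    Running the spooled objs_upg.sql against the target database as SYSTEM then replays those grants.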

  • 9i full export for a large database

    Hi Dear All,
    My organization is going to make one database read only, so I was requested to take a full export backup of it; the database has grown to about 600 GB. I want to take it in splits of 2 GB. What parameters shall I use?
    Prepared the parfile like:
    FILE=/exp/prod/exp01.dmp,/exp/prod/exp02.dmp,/exp/prod/exp03.dmp
    FILESIZE=2G
    Please suggest whether I can proceed with the above parameters, and what the parameter file for importing the same should be.
    Thanks in Advance
    Phani Kumar

    Hello,
    can I proceed with above parameters and what should be the parameter file for importing the same.
    With the original Export/Import you may use the FILE and FILESIZE parameters to split the dump into several dump files, and also to import from several dump files:
    http://download.oracle.com/docs/cd/B10501_01/server.920/a96652/ch02.htm#1005411
    For instance, you may use something like this:
    FILE=<dumpfile1>.dmp,<dumpfile2>.dmp,<dumpfile3>.dmp
    FILESIZE=2G
    In this example the size of each dump file is limited to 2 GB.
    NB: Use the same setting for FILE and FILESIZE for the Export and the Import.
    how can I use mknod option in aix box with oracle9i for taking 600gb of database
    About mknod, you may read the following note on My Oracle Support:
    Large File Issues (2Gb+) when Using Export (EXP-2 EXP-15), Import (IMP-2 IMP-21), or SQL*Loader [ID 30528.1]. See chapter 3, Export and Import with a compressed dump file, and chapter 4, Export and Import to multiple files.
    Hope this helps.
    Best regards,
    Jean-Valentin
    Edited by: Lubiez Jean-Valentin on Jun 2, 2011 10:06 AM
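    One sanity check worth doing before running the export: at FILESIZE=2G, a 600 GB database needs at least 300 entries in the FILE list, not the three shown in the parfile above. A sketch (the /exp/prod paths follow the original parfile; the loop only prints the first three entries):

    ```shell
    # How many 2 GB pieces does a 600 GB export need? (ceiling division)
    total_gb=600
    piece_gb=2
    n=$(( (total_gb + piece_gb - 1) / piece_gb ))
    echo "pieces needed: $n"          # -> pieces needed: 300
    # Build the first few FILE= entries, comma separated:
    files=""
    for i in 1 2 3; do
      files="${files}${files:+,}/exp/prod/exp0${i}.dmp"
    done
    echo "FILE=$files"
    ```

    exp stops with an error if it fills the last listed file and still has data left, so it is safer to list more pieces than the arithmetic strictly requires.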

  • Data pump export full RAC database  in window single DB by network_link

    Hi Experts,
    I have a window 32 bit 10.2 database.
    I am trying to export a full RAC database (350 GB, same version as the Windows DB) into a Windows single-instance database via database link.
    The expdp syntax is:
    expdp salemanager/********@sale FULL=y DIRECTORY=dataload NETWORK_LINK=sale.net DUMPFILE=sale20100203.dmp LOGFILE=salelog20100203.log
    I created a dblink with fixed instance 3. It was working for two days and then displayed this message:
    ORA-31693: Table data object "SALE_AUDIT"."AU_ITEM_IN" failed to load/unload and is being skipped due to error:
    ORA-29913: error in executing ODCIEXTTABLEPOPULATE callout
    ORA-01555: snapshot too old: rollback segment number with name "" too small
    ORA-02063: preceding line from sale.net
    I stopped the export and checked the Windows target's alert log.
    I saw messages such as:
    kupprdp: master process DM00 started with pid=16, OS id=4444
    to execute - SYS.KUPM$MCP.MAIN('SYS_EXPORT_FULL_02', 'SYSTEM', 'KUPC$C_1_20100202235235', 'KUPC$S_1_20100202235235', 0);
    Tue Feb 02 23:56:12 2010
    The value (30) of MAXTRANS parameter ignored.
    kupprdp: master process DM00 started with pid=17, OS id=4024
    to execute - SYS.KUPM$MCP.MAIN('SYS_EXPORT_FULL_01', 'SALE', 'KUPC$C_1_20100202235612', 'KUPC$S_1_20100202235612', 0);
    kupprdp: worker process DW01 started with worker id=1, pid=18, OS id=2188
    to execute - SYS.KUPW$WORKER.MAIN('SYS_EXPORT_FULL_01', 'SALE');
    In RAC instance alert.log. I saw message as
    SELECT /*+ NO_PARALLEL ("KU$") */ "ID","RAW_DATA","TRANSM_ID","RECEIVED_UTC_DATE","RECEIVED_FROM","ACTION","ORAUSER","ORADATE" FROM RELATIONAL("SALE_AUDIT"."AU_ITEM_IN") "KU$"
    How do I fix this error?
    Should I add more undo tablespace space in RAC instance 3, or in the Windows database?
    Thanks
    Jim
    Edited by: user589812 on Feb 4, 2010 10:15 AM

    I usually increase undo space. Is your undo retention set smaller than the time it takes to run the job? If so, I would think you need to raise it; if not, then it would be the space. You were in the process of exporting data when the job failed, which is what I would have expected. Basically, Data Pump wants to export each table consistent to itself. Let's say one of your tables is partitioned, with one large partition and one smaller partition. Data Pump attempts to export the larger partitions first and remembers the SCN for each. When the smaller partitions are exported, it uses that SCN to get the data as it would have looked when the first partition was exported. If you don't have partitioned tables, do you know if some of the tables in the export job (I know it's full, so that includes just about all of them) are having data added to them or removed from them? I can't think of anything else that would need undo while exporting data.
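    Dean's first question, whether undo retention is smaller than the job's run time, is simple arithmetic. The figures below are illustrative assumptions (a two-day job against the 900-second default undo_retention), not values from the post:

    ```shell
    # If the job runs longer than undo_retention, consistent reads taken early
    # in the export can age out of undo and raise ORA-01555 before it finishes.
    job_hours=48                   # assumed elapsed time of the export job
    undo_retention_s=900           # assumed undo_retention setting (the default)
    job_s=$(( job_hours * 3600 ))
    if [ "$job_s" -gt "$undo_retention_s" ]; then
      echo "undo_retention too small: job needs ${job_s}s, retention is ${undo_retention_s}s"
    fi
    ```

    If the check fires, raising undo_retention (and sizing the undo tablespace to match) on the source RAC instance is the side to fix, since that is where the consistent reads run.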
    Dean

  • How can recover database using only Full Export ?

    Hi,
    I have one query: I have only a full export of the database. After the full export completed successfully,
    the database crashed by mistake. How can I recover it? I have only the export; no backup (RMAN, cold, or hot) was taken and no backup strategy was in use. How can I recover it?

    You need to recreate the database from scratch as imp requires a working database.
    You could already do this by transforming the results from
    'alter database backup controlfile to trace', a trace file containing a 'create controlfile' statement, into a 'create database' script.
    After that step you apply your import. Everything after that import is lost forever.
    One would ask for a sanity check of the management and personnel of a company which has no backup strategy in place.
    Sybrand Bakker
    Senior Oracle DBA

  • Database Full Export

    Hi,
    I want to take a full export of an Oracle9i DB running in NOARCHIVELOG mode.
    What is the best approach to get a full export?
    Should I take a hot backup or a cold backup? If I take a cold backup, should I shut the DB down or only mount it? Do I need to put the tablespaces into backup mode?
    Thanks for replying

    Hi,
    I would strongly recommend not implementing such a strategy. You asked whether you would be able to fully recover your database: I would say no. If you export your full database at 10 in the morning, you only have a copy of the database as of 10. What if the database receives several DML and DDL changes and then crashes at 5 in the evening? You lose all the data between 10 and 5. For example, suppose your employee table has 1000 records, and between 10 and 5 the DB receives and commits 500 more; what would you do if your seniors want to generate a report based on those 500 records? So, although you can export your DB, that is only a good way to keep a static version of the database, so that if a user accidentally drops one table you can get it back without touching your database backups.
    Anyway, I would always say:
    1. Use RMAN, because it does all the book-keeping, takes less space, and is smart enough to know your backup locations and disks.
    2. Always run your DB in archivelog mode (otherwise you are going to lose data).
    thanks
    Alok Kumar.
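    As a minimal sketch of points 1 and 2 together (assumes the instance can be restarted and an archive destination or flash recovery area is already configured; this is not from the post):

    ```text
    -- Enable archivelog mode once (from SQL*Plus as SYSDBA):
    SHUTDOWN IMMEDIATE;
    STARTUP MOUNT;
    ALTER DATABASE ARCHIVELOG;
    ALTER DATABASE OPEN;

    -- Then a basic RMAN backup:
    RMAN> CONNECT TARGET /
    RMAN> BACKUP DATABASE PLUS ARCHIVELOG;
    ```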

  • Metadata for full database

    Hi,
    We have a full (Data Pump) export of an Oracle 10g database that is 2 months old, and I need to drop the existing database and import the 2-month-old dump.
    So before I drop the database I need to capture the full metadata of the existing database. Is there any command to take full metadata, so that after dropping the database I can recreate it and do the import?
    The help is appreciated.
    Thanks

    user12266475 wrote:
    Hi,
    We have full export (datapump) for oracle 10g and it is 2 month older and i need to drop the existing database and need to import the 2 month older dump.
    so before i drop the databsae i need to take full metadata of the exisitng database. is there any command to take full metadata so that after droping the database i can create the database and do import.
    appreciated the help
    Thanks
    1- I understand you have taken a full export (data + metadata) 2 months ago, right?
    2- Why do you need to drop your database? What is your obligation? Dropping the actual database means you have to create a new one, and a database import will NOT create the new database for you.
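    For the metadata question itself, Data Pump can dump structure without rows via CONTENT=METADATA_ONLY. A parfile sketch (the directory object and file names below are placeholders, not values from the post):

    ```text
    FULL=Y
    CONTENT=METADATA_ONLY
    DIRECTORY=DATA_PUMP_DIR
    DUMPFILE=meta_full.dmp
    LOGFILE=meta_full.log
    ```

    Run it with something like expdp system/<password> parfile=meta.par before dropping the database.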

  • How do you perform a full system restore for the hp officejet 6500

    Basically, after having this printer for 1 year, the wireless printing started to not work correctly. After buying a laptop and trying to configure the printer with it, the printer started to have problems with wireless printing. So how do you perform a FULL factory reset? I already tried hitting #9 while the printer was turning on, but it only seemed to partially reset the system. It did not restore it to the state of the day I bought the printer.

    OK, so it turns out it was the 'only show synced content' tick box.
    Wow, how embarrassed do I feel now? I never even paid any attention to that tick box all these years.
    Interestingly, over the last couple of days I had my ATV at v1.1 again, and happily deleted all my content (except for 1 album which I deliberately left on). A few days went past, and I checked daily - and there was only ever the 1 album.
    I then upgraded to ATV v3.0.1 again today and that's when my problems came back - all my content magically reappeared on the Apple TV (or so I thought).
    I'm thinking maybe that the 'only show synced content' is a recent 3.0 feature? And if so, maybe that's why I never noticed it before and that's why it was confusing me? Or maybe it's been there all these years (in which case I'll carry on being embarrassed).
    Anyway, a huge thanks to everyone who responded.
    Brad

  • For Full performance You need a Higher voltage AC Adapter Message

    Hello,
    I have had an HP G70-460US laptop since October 10, 2009. About three days ago, the laptop started showing me a system message about the AC adapter. The message says: "For full performance you need a higher voltage AC adapter." I checked the AC adapter and the temperature was normal, and the battery was charging OK. I had the laptop connected to a power strip, so I connected the laptop directly to the wall outlet. The next day, my wife was trying to use the laptop and the battery had no charge. I unplugged and plugged in the AC adapter again and it started charging, but the system keeps posting the same error message as before. My laptop is usually connected to an electrical outlet.
    What can be the problem?

    We are also getting this same message with a new G60-506US laptop just placed in service with the factory HP A/C adapter. 
    Looks like someone from HP needs to get into this!

  • Regarding Database performance for report creation

    Hi,
    Currently I have one database with two schemas: one for table data and a second for report creation. But it is taking more time to display data because there is more load on the first schema.
    I want to create a second database for the report schema, but I have to access table data from the first database.
    1) One option to fetch data from the first database is through a DB link, but I think that also takes more time.
    Is there any way I can access data from the first database and get better performance by creating a new database?
    Kindly give me a suggestion. What should I do to improve report performance?

    user647572 wrote:
    Hi,
    Currently i have one database in that two schema one for table data and second for reports creation.But it is taking more time to display data because there is more load on first schema.
    I want to create second database for report schema .But i have to access table data from first database .
    1) There is one option to fetch data from first database is through DB Link . But i think it also takes more time to fetch data.
    Is there any way i can access data from first database and i get more performance with creating new database.
    Kindly give me suggestion . What should i do to improve reports performance?
    You have two more options:
    1. Use Oracle Streams and replicate the tables between databases. For reporting, you'll refer to the second database.
    2. Create a standby database: a clone of your database that you keep up to date by applying archived redo log files from the primary database.

  • SQL Developer 2.1.0.63 - Export Data menu missing for SQL Server databases

    Hi
    The Export Data menu only appears for Oracle databases in this version, it does not appear for SQL server databases.
    To demonstrate:
    Press F9 to run query
    Right click on Query Result data
    For Oracle an Export Data sub menu appears
    For SQL server no Export Data sub menu appears
    This worked in previous versions.
    Is this functionality going to reappear?
    Thanks
    Dave

    Hi,
    Looks like this is exactly the same problem for third-party connections, as I have the exact same issue with MySQL.
    I had to apply version 1.5.5 to fix it.
    Hope we're going to have this solved!
    Thanks,
    JP

  • Issue Migrating Character Data Using a Full Export and Import

    Hi There;
    I have a database on my local machine that doesn't support Turkish characters. Its NLS_CHARACTERSET is WE8ISO8859P1; it must be changed to WE8ISO8859P9, since that supports the full set of Turkish characters. I would like to migrate the character data using a full export and import, and my strategy is as follows:
    1- create a full export to a location on the network,
    2- create a new database on the local machine whose NLS_CHARACTERSET is WE8ISO8859P9 (I would like to change NLS_LANGUAGE and NLS_TERRITORY by the way),
    3- and perform a full import into the newly created database.
    I have implemented the first step, but I couldn't implement the second step. I created the second database using the Toad editor by clicking Create -> New Database, but I cannot connect to the new database. I must connect to the new database in order to perform the full import. How can I do this?
    Thanks in advance.
    Technical Details
    NLS_LANGUAGE.....................AMERICAN
    NLS_TERRITORY.....................AMERICA
    NLS_CURRENCY.....................$
    NLS_ISO_CURRENCY..............AMERICA
    NLS_NUMERIC_CHARACTERS    .,
    NLS_CHARACTERSET.............WE8ISO8859P1
    NLS_CALENDAR.....................GREGORIAN
    NLS_DATE_FORMAT................DD-MON-RR
    NLS_DATE_LANGUAGE...........AMERICAN
    NLS_SORT...............................BINARY
    NLS_TIME_FORMAT.................HH.MI.SSXFF AM
    NLS_TIMESTAMP_FORMAT......DD-MON-RR HH.MI.SSXFF AM
    NLS_TIME_TZ_FORMAT............HH.MI.SSXFF AM TZR
    NLS_TIMESTAMP_TZ_FORMAT..DD-MON-RR HH.MI.SSXFF AM TZR
    NLS_DUAL_CURRENCY............ $
    NLS_COMP...............................BINARY
    NLS_LENGTH_SEMANTICS.......BYTE
    NLS_NCHAR_CONV_EXCP........FALSE
    NLS_NCHAR_CHARACTERSET...AL16UTF16
    NLS_RDBMS_VERSION............10.2.0.1.0

    First, if your applications run on Windows, do not use WE8ISO8859P9. Use TR8MSWIN1254.
    Second, if you create a new database, the database is not necessarily immediately accessible to outer world. I do not know Toad and I have no idea if it performs all necessary steps required for the new database to be visible.  For example, in the Toad itself, I assume you should create a new connection that references the new SID of the newly created database and use this new connection to connect. However, connections without a connection string use the ORACLE_SID setting in Registry to tell connecting applications which instance (database) to use.  To change the database accessed with an empty connection string you need to modify Registry (unless Toad has an option to do this for you). If you want to connect without changing Registry, you need a connect string. This requires setting up Oracle Listener to serve the new database (unless default configuration is used and the database registers itself with the default listener). It also requires changing tnsnames.ora file to create an alias for the new database. Net Manager and/or Net Configuration Assistant can help you with this.
    I wonder if Database Configuration Assistant would not be a better tool to create new Oracle databases.
    Thanks,
    Sergiusz
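    For the connect-string route Sergiusz describes, a tnsnames.ora alias for the new database would look roughly like this (the alias, host, port, and service name are placeholders to adapt):

    ```text
    NEWDB =
      (DESCRIPTION =
        (ADDRESS = (PROTOCOL = TCP)(HOST = localhost)(PORT = 1521))
        (CONNECT_DATA = (SERVICE_NAME = newdb))
      )
    ```

    With the alias in place and the listener serving the database, you connect with e.g. sqlplus system@NEWDB and can then run the full import against it.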

  • Does EXPDP has any performance effect on the database ?

    We are planning to run an export dump with expdp (11gR2) on a Standard Edition database every hour, with FULL=Y. It might run for about 15 minutes.
    Does it have any performance effect?
    Will it slow down database operations?
    Will users who are using the database see any performance issues?

    Hi!
    1. To decrease the influence on the DB, try excluding index statistics: exclude=index_statistics.
    2.Starting with release 10g (10.1.0), Oracle introduced the new Oracle Data Pump technology, which enables very high-speed movement of data and metadata from one database to another. This technology is the basis for Oracle's new data movement utilities, Data Pump Export and Data Pump Import.
    Under certain circumstances, a performance problem may be seen when unloading or loading data with the Data Pump clients. This document will provide details about setup and configuration settings that may have an impact on the performance of the Data Pump clients; will provide details how to check what Data Pump is doing at a specific moment; and will discuss some known defects that have an impact on the performance.
    Checklist for Slow Performance of Export Data Pump (expdp) and Import DataPump (impdp) [ID 453895.1]
    3.Data Pump Performance Tuning
    http://ksadba.wordpress.com/2008/09/16/data-pump-performance-tuning-of-export-import/
    4.speeding up expdp
    http://www.mentby.com/Group/oracle-list/speeding-up-expdp.html
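    Point 1 above as a parfile sketch (the directory object and file names are placeholders; the EXCLUDE path mirrors the one suggested in the reply):

    ```text
    FULL=Y
    DIRECTORY=DATA_PUMP_DIR
    DUMPFILE=hourly_full.dmp
    LOGFILE=hourly_full.log
    EXCLUDE=INDEX_STATISTICS
    ```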
