Compress=Y in Exp and Expdp

I have been using Oracle export/import as well as Data Pump export/import on a regular basis when building our database environments, and I noticed a difference between the conventional export and the Data Pump export.
Export (exp) has the COMPRESS=Y/N option, whereas I didn't see that option in Data Pump.
I was looking at the documentation and found this:
http://download.oracle.com/docs/cd/B14117_01/server.101/b10825/dp_export.htm#i1005864
"A parameter comparable to COMPRESS is not needed." -- This is what the documentation says.
Does Data Pump export do automatic compression?
I tried exporting a table (about 14 GB) from our database with the ROWS=N option using conventional export, and with a Data Pump export using CONTENT=METADATA_ONLY, then generated the SQL for the metadata definitions.
The script from the conventional export shows an initial extent size of 2 GB, whereas the SQLFILE from the Data Pump import shows 260 MB.
Now, which is the best approach for creating objects when moving data across databases?
Please clarify.
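For reference, a minimal sketch of the two approaches being compared; the schema, table, directory object (DPDIR) and file names are placeholders:
# Conventional export, structure only (COMPRESS=Y is the exp default and is what
# folds the whole segment size into the INITIAL extent of the generated DDL)
exp system/password tables=SCOTT.BIG_TABLE rows=n compress=y file=big_table.dmp log=big_table.log
# Data Pump: metadata only, then generate the DDL into a SQL file
expdp system/password tables=SCOTT.BIG_TABLE content=metadata_only directory=DPDIR dumpfile=big_table_md.dmp logfile=big_table_md.log
impdp system/password directory=DPDIR dumpfile=big_table_md.dmp sqlfile=big_table_ddl.sql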

Using COMPRESS=Y in export means:
exp determines the total size of the segment, uses that as the size of the new initial extent, and sets the next extent to 10 percent of the initial extent.
It is a remnant from the V6 days, when there were only dictionary-managed tablespaces and many DBAs (including yours truly) were micromanaging table size and free-space size.
That said, even Oracle admits it was not their brightest moment, and you shouldn't use it, ever.
Obviously, introducing expdp in a locally managed tablespace (LMT) world was the moment to get rid of it.
Sybrand Bakker
Senior Oracle DBA
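A quick way to see the point about COMPRESS=Y is to compare the segment's real allocation with the INITIAL value that ended up in the generated DDL; this is only a sketch and SCOTT.BIG_TABLE is a placeholder:
-- Compare these figures with the INITIAL/NEXT values in the DDL produced
-- from the exp COMPRESS=Y dump.
SELECT initial_extent, next_extent, extents, ROUND(bytes/1024/1024) AS size_mb
FROM   dba_segments
WHERE  owner = 'SCOTT'
AND    segment_name = 'BIG_TABLE';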

Similar Messages

  • Conversions between character sets when using 'exp' and 'expdp' utilities

    When I export with the exp utility and the NLS_LANG environment variable is not set, the export is done in the US7ASCII character set; when the server uses some other character set, e.g. EE8ISO8859P2, some national characters can be lost in that conversion.
    But when I use the expdp utility without NLS_LANG set, the log file does not mention that the export is done in a character set other than the server's. Does that mean it uses the server's character set?

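    For what it's worth, Data Pump export writes the data in the database character set regardless of NLS_LANG (for expdp, NLS_LANG mainly affects client-side messages), which is why the expdp log does not warn about a conversion. The database character set can be checked with a query like the one below; nothing in it is specific to this environment.
    SELECT value
    FROM   nls_database_parameters
    WHERE  parameter = 'NLS_CHARACTERSET';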

  • Using expdp and impdp instead of exp and imp

    Hi All,
    I am trying to use expdp and impdp instead of exp and imp.
    I am facing a few issues while using expdp. I have a job which exports data from one DB server; the dump is then imported into another DB server. The two DB servers run on separate machines. The job runs on various client machines, not on either DB server.
    To use expdp we have to create a DIRECTORY object, and as I understand it the directory has to exist on the DB server. The problem is that the job cannot access the DB server or the files on it. Also, the dump file created is moved by the job to other machines based on requirements (usually it goes to multiple DB servers).
    I need a way to create the dump files on the machine where the job runs.
    If I am not using expdp correctly, please guide me. I am new to expdp/impdp and imp/exp.
    Regards,

    Thanks for the quick reply.
    The job executing expdp/impdp runs on Red Hat Enterprise Linux Server release 5.6 (Tikanga).
    The Oracle server is release 11.2.0.2.0.
    The job cannot access the Oracle server machine as it does not have privileges (in fact there is no user/password to access the Oracle server machines). Creating the dump on the Oracle server and then moving it is not an option for this job; it has to keep the dump with itself.
    Regards,
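    One possibility worth checking, since expdp can only write to a directory object on the database server: impdp can pull a schema straight across a database link with NETWORK_LINK, so no dump file is created on either machine (a directory object is still needed on the target, but only for the log file). A rough sketch with made-up link, schema and directory names:
    -- On the target database: a link that points back at the source
    CREATE DATABASE LINK source_link
      CONNECT TO app_owner IDENTIFIED BY app_password
      USING 'SOURCE_TNS_ALIAS';
    Then, from any client that can reach the target database:
    impdp system/password@TARGET network_link=source_link schemas=app_owner directory=dp_log_dir logfile=net_import.log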

  • Hoping for a quick response : EXP and Archived REDO log files

    I apologize in advance if this question has been asked and answered 100 times. I admit I didn't search, I don't have time. I'm leaving on vacation tomorrow, and I need to know if I'm correct about something to do with backup / restore.
    We have 10g R2 running as a single instance on a single server. The application vendor has "embedded" Oracle with their application. The vendor's backup is a batch file using EXP, thus:
    exp system/xpwdxx@db full=y file=D:\Orant\admin\db\EXP\db_full.dmp log=D:\Orant\admin\db\EXP\db_full.txt direct=y compress=y
    This command is executed nightly at midnight. The files are then backed up by our nightly backup to offsite storage media.
    The database is running in archivelog (autoarchive) mode. The problem is, the archived redo files filled the drive they were being stored on, and it is the drive the database is on. I used OS commands to move 136 GB of archived redo logs onto other storage media to free the drive.
    My question: since the EXP runs at midnight, when there is likely no activity, do I need to run in autoarchive mode? From what I have read, you cannot even apply archived redo log files to this type of backup strategy (IMP). Is that true? We are OK with losing changes since our last EXP. I have read a lot about restoring consistent vs. inconsistent backups, and just need to know: if my disk fails and I have to start with a clean install of Oracle and nothing else, can I IMP this EXP and get back up and running as of the last EXP? Or do I need the archived redo log files back to July 2009 (136 GB of them)?
    Hoping for a quick response
    Best Regards, and thanks in advance
    Bruce Davis

    Bruce Davis wrote:
    Amardeep Sidhu
    Thank you for your quick reply. I am reading in the other responses that since I am using EXP without consistent=y, I might not even have a backup. The application vendor said that with this dmp file they can restore us to the most recent backup. I don't really care for this strategy as it is untested. I asked them to verify that they could restore us and they said they tested the dmp file and it was OK.
    Thank you for taking the time to reply.
    Best Regards
    Bruce
    The dump file is probably OK in the sense that it is not corrupted and can be used in an imp operation. That doesn't mean the data in it is transactionally consistent. And to use it at all, you have to have a database up and running. If the database is physically corrupted, you'll have to rebuild a new database from scratch before you can even think about using your dmp file.
    Vendors never understand databases. I once had a vendor tell me that Oracle's performance would be intolerable if there were more than 5 concurrent connections. Well, maybe in HIS product ..... Discussions terminated quickly after he made that statement.
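    For illustration only (and not as an endorsement of exp as the sole backup), this is the vendor's command from above with CONSISTENT=Y added so the dump is read-consistent as of the start of the export, and COMPRESS=N per the discussion at the top of this page; whether DIRECT=Y should be kept alongside CONSISTENT=Y is left to testing:
    rem hypothetical variant of the vendor's nightly batch line
    exp system/xpwdxx@db full=y consistent=y compress=n file=D:\Orant\admin\db\EXP\db_full.dmp log=D:\Orant\admin\db\EXP\db_full.txt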

  • Scripts for exp and imp in linux

    Please give me some links for a periodic batch script to exp and imp tables on Oracle 10g R2 over RHEL 4.5.

    user13653962 wrote:
    Please give me some links for a periodic batch script to exp and imp tables on Oracle 10g R2 over RHEL 4.5.
    Try this, and change the script for your environment:
    $ crontab -l
    00 22 * * * sh /u01/db/scripts/testing_user1.sh
    $ cat /u01/db/scripts/testing_user1.sh
    . /home/oracle/TEST1.env
    export ORACLE_SID=TEST1
    export ORA_USER=dbdump
    export ORA_PASSWORD=dbdump
    export TNS_ALIAS=TEST1
    expdp $ORA_USER/$ORA_PASSWORD@$TNS_ALIAS tables=YOUR_TABLE_NAMES dumpfile=TEST_`date +'%Y-%m-%d'`.dmp directory=YOUR_DUMPDIR logfile=TEST_`date +'%Y-%m-%d'`.log
    If you need the original export instead, change the expdp line to an exp command:
    exp username/password tables=table_name1,table_name2,etc... file=your_file_name log=your_log_file_name

  • Error while converting class file to exp and jca file

    error while converting *.class file to *.exp and *.jca file
    =====================================================================================================================
    linux-y60u:/home/admin/java_card_kit-2_2_1/samples/src # converter -exportpath "/home/admin/java_card_kit-2_2_1/lib/" com/sun/javacard/samples/HelloWorld 0x00:0x01:0x02:0x03:0x04:0x05:0x06:0x07:0x0b 1.0 -v -applet 0x00:0x01:0x02:0x03:0x04:0x05:0x06:0x07:0x0b:0x01 Identity
    Java Card 2.2.1 Class File Converter, Version 1.3
    Copyright 2003 Sun Microsystems, Inc. All rights reserved. Use is subject to license terms.
    parsing /home/admin/java_card_kit-2_2_1/samples/src/com/sun/javacard/samples/HelloWorld/HelloWorld.class
    parsing /home/admin/java_card_kit-2_2_1/samples/src/com/sun/javacard/samples/HelloWorld/Identity.class
    error: com.sun.javacard.samples.HelloWorld.HelloWorld: unsupported class file format of version 50.0.
    error: com.sun.javacard.samples.HelloWorld.Identity: unsupported class file format of version 50.0.
    conversion completed with 2 errors and 0 warnings.
    =====================================================================================================================

    I compile a Java Card file using this command:
    ===
    javac -source 1.3 -target 1.1 -g -classpath ./classes:../lib/api.jar:../lib/installer.jar src/com/sun/javacard/samples/Identity/Identity.java
    ===
    and try to convert the class using this command:
    ===
    /home/xnuxerx/admin/java_card_kit-2_2_1/bin/converter -exportpath "/home/xnuxerx/admin/java_card_kit-2_2_1/lib/" com/sun/javacard/samples/Identity 0x00:0x01:0x02:0x03:0x04:0x05:0x06:0x07:0x0b 1.0 -v -applet 0x00:0x01:0x02:0x03:0x04:0x05:0x06:0x07:0x0b:0x01 Identity
    ===
    conversion result:
    ===
    Java Card 2.2.1 Class File Converter, Version 1.3
    Copyright 2003 Sun Microsystems, Inc. All rights reserved. Use is subject to license terms.
    parsing /home/xnuxerx/admin/java_card_kit-2_2_1/samples/classes/com/sun/javacard/samples/Identity/Identity.class
    converting com.sun.javacard.samples.Identity.Identity
    error: export file framework.exp of package javacard.framework not found.
    conversion completed with 1 errors and 0 warnings.
    ===
    Why?
    Please comment on this problem.
    Thanks to all.
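    For what it's worth, both errors have common causes, and the sketch below is only a guess based on the output above. "Unsupported class file format of version 50.0" means those classes were compiled by a Java 6 javac without the old -source/-target flags (the 2.2.1 converter only accepts very old class file versions), and "export file framework.exp ... not found" usually means -exportpath does not include the kit's API export files. The paths and the api_export_files directory name are assumptions that may differ in your installation.
    # Recompile the sample for an old class file target (needs a JDK that still accepts these flags)
    javac -source 1.3 -target 1.1 -g \
          -classpath ./classes:../lib/api.jar \
          src/com/sun/javacard/samples/HelloWorld/*.java
    # Point -exportpath at the kit's API export files (framework.exp should live under
    # javacard/framework/javacard/ inside that tree), then convert as before,
    # running from the samples/src directory as in the original command:
    /home/admin/java_card_kit-2_2_1/bin/converter \
        -exportpath "/home/admin/java_card_kit-2_2_1/api_export_files" \
        com/sun/javacard/samples/HelloWorld \
        0x00:0x01:0x02:0x03:0x04:0x05:0x06:0x07:0x0b 1.0 \
        -v -applet 0x00:0x01:0x02:0x03:0x04:0x05:0x06:0x07:0x0b:0x01 HelloWorld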

  • How do I stop beats audio compressing the sound up and down

    How do I stop beats audio compressing the sound up and down.

    Hi Strung,
    I see that you are having some issues with compression and Beats Audio. I have done some research into your issue and will be happy to help. Here are a couple of links to HP Support Forum threads where they talk about the settings in the Beats Audio.
    http://h30434.www3.hp.com/t5/Notebook-PC-Sound-and-Audio/DV7-w-Beats-sound-compression/td-p/396705/p...
    http://h30434.www3.hp.com/t5/Notebook-PC-Sound-and-Audio/Beats-Audio-EQ-Setting-General-Sound-Advice...
    Take a look and fiddle with your settings and let me know how it goes.
    Thank you,
    BHK6
    I work on behalf of HP

  • Difference between "COMPRESS FOR ALL OPERATIONS" and "COMPRESS FOR OLTP"?

    I was looking through Oracle's OLTP Table Compression (11g onwards) documentation as well as online resources to find the syntax and came across two different versions:
    COMPRESS FOR ALL OPERATIONS
    and
    COMPRESS FOR OLTP
    The documentation I looked through didn't mention any alternative syntax, so I was wondering if anyone here might know the difference.
    Thank you!

    The table compression enhancements in Oracle Database 11g Release 1 are as follows:
    The compression clause can be specified at the tablespace, table or partition level with the following options:
    •NOCOMPRESS - The table or partition is not compressed. This is the default action when no compression clause is specified.
    •COMPRESS - This option is considered suitable for data warehouse systems. Compression is enabled on the table or partition during direct-path inserts only.
    •COMPRESS FOR DIRECT_LOAD OPERATIONS - This option has the same effect as the simple COMPRESS keyword.
    •COMPRESS FOR ALL OPERATIONS - This option is considered suitable for OLTP systems. As the name implies, this option enables compression for all operations, including regular DML statements. This option requires the COMPATIBLE initialization parameter to be set to 11.1.0 or higher.

  • How to find out whether a dump file was taken using normal exp or expdp

    Hi All,
    How to find out whether a dump file is taken using the conventional exp or using the expdp utility?
    OS: HPUX
    DB: 10.2.0.4

    Hi ,
    I go with Helios's reply. We cannot just tell whether it was taken by expdp or exp.
    Since your DB version is 10, both are possible.
    The simplest way would be: just run imp; if it throws an error, then the dump was created through expdp,
    because a dump from expdp cannot be used with imp and vice versa. So that could help you find out.
    Otherwise, try to get the syntax with which the dump was created.
    If you have any doubts, wait for the experts to reply.
    HTH
    Kk
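    Another option, rather than a trial imp run: a conventional exp dump carries a readable "EXPORT:Vnn.nn.nn" banner in its header, while a Data Pump dump file does not, so looking at the first few strings of the file is often enough; on 10.2 there is also the DBMS_DATAPUMP.GET_DUMPFILE_INFO procedure, which reports the dump file type. A rough OS-level sketch (the file name is a placeholder):
    # A conventional exp dump shows a banner such as "EXPORT:V10.02.01" near the top;
    # a Data Pump dump file will not.
    strings mydump.dmp | head -5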

  • Confusing impdp and expdp stuff

    Hello, I am kinda new to Oracle in general.
    Building a test environment.
    I want to move a production schema to a test RAC DB I have built.
    I read up and decided that expdp would be the best method (correct me if I am wrong, please).
    I am on Oracle Linux.
    Now I have expdp'ed a schema successfully, but when running impdp on the other system I am getting a total of 735 errors.
    I have created the exact same user and am using the following command.
    Please let me know whether I am missing a step here.
    $>impdp system/<password> directory=impdp_dir dumpfile=UNIXDATA.DMP logfile=impdpUNIXdata.log fromuser='unixdata' touser='unixdata' commit=Y ignore=Y
    would the log file help?
    one of them
    Processing object type SCHEMA_EXPORT/JOB
    ORA-39083: Object type JOB failed to create with error:
    ORA-00001: unique constraint (SYS.I_JOB_JOB) violated
    also this
    ORA-39083: Object type MATERIALIZED_VIEW failed to create with error:
    ORA-31625: Schema UNIXDATA is needed to import this object, but is unaccessible
    ORA-01435: user does not exist
    THANKS ALOT FOR YOUR HELP!
    Ali
    Edited by: user10270464 on Apr 16, 2010 5:40 AM

    Hi,
    You don't say what version of Oracle you are using so trying to help without that info is more difficult. It also looks like you are using some old exp parameters and some new expdp parameters. Your mixed command is:
    impdp system/<password> directory=impdp_dir dumpfile=UNIXDATA.DMP logfile=impdpUNIXdata.log fromuser='unixdata' touser='unixdata' commit=Y ignore=Y
    You said that you did a schema mode export, so I would change the impdp command to be:
    impdp system/<password> directory=impdp_dir dumpfile=UNIXDATA.DMP logfile=impdpUNIXdata.log table_exists_action=append
    The parameters I changed were:
    fromuser='unixdata' -- if the only thing in the dumpfile is what you want, then you don't need fromuser
    touser='unixdata' -- since you are importing into the same schema, you don't need touser
    commit=y -- no such equivalent in datapump
    ignore=y -- the same as table_exists_action=append
    Now for your errors.
    The first one is not familiar to me; I have seen the second error, but can't remember the details. Having the version that you are running may help jog my memory.
    Dean

  • Database merge with the exp and imp...

    Hi All,
    I am very new to these DBA activities. I have two databases with the same schema and some data in each of them. Now I would like to merge both databases and come up with one database. In this process I don't want to lose any data.
    Does Oracle exp/imp help in this scenario? If not, are there any other tools that help with this?
    What are the best practices to follow when doing this kind of work?
    What kinds of verifications do we need to do pre/post merge?
    Any help would really be appreciated... Thank you in advance...
    K.

    What do you mean by "merging" the data?
    How many objects are in each schema?
    Do both schemas need to be merged into each other, or only one way?
    Do you have the option to use the MERGE command for each object in the schema?
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14200/statements_9016.htm#SQLRF01606
    P.S. If the version of the database is greater than 9i, then use Data Pump (expdp/impdp) instead of the deprecated exp/imp utilities.
    Kamran Agayev A.
    Oracle ACE
    My Oracle Video Tutorials - http://kamranagayev.wordpress.com/oracle-video-tutorials/
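    To make the MERGE suggestion above concrete, here is a per-table sketch that pulls rows from the second database over a database link; the table, columns, key and link name are all made up, and the statement has to be repeated (in foreign-key order) for every table being merged:
    MERGE INTO customers tgt
    USING (SELECT customer_id, name, email FROM customers@other_db) src
    ON (tgt.customer_id = src.customer_id)
    WHEN MATCHED THEN
      UPDATE SET tgt.name = src.name, tgt.email = src.email
    WHEN NOT MATCHED THEN
      INSERT (customer_id, name, email)
      VALUES (src.customer_id, src.name, src.email);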

  • DUMP by EXP or EXPDP

    1 - How can I find out whether a DMP file that I got from somebody else was created using exp or expdp? (I just have a <....>.DMP.)
    2 - What will be the syntax for imp or impdp if there is more than one dmp file? (like <....1>.dmp and <....2>.dmp) (I mean the person who gave me the backups created 2 dmp files, and there is also no log file.)

    Hi,
    It can't simply be judged by looking at the .dmp file. If you have the log file, check whether an expdp command is shown in it; if so, the backup was taken using expdp, otherwise it was exp.
    2. If you have more than one dmp file, then it was taken using the parallel option (you may consider that your backup was taken using expdp).
    ... to import use the command
    impdp user/password directory=directoryname dumpfile=dumpfile%U.dmp schemas=schemaname logfile=logfilename
    HTH
    KSG

  • I am trying to use my IPAD to video students in my conducting class, then email them the video for self evaluation.  However, many of the video clips are too long to email.  Is there anyway to compress the video clips and still email them so they can view

    I am trying to use my iPad to video students in my conducting class, then email them the video for self evaluation. However, many of the clips are too long to send. Is there a way I can compress the clips and still send them via email so they can open and view them using Quicken?
    Muzakmn

    It depends on the clips' content, their current format, and how much you would need to compress them, but in most cases and with most email systems, it's difficult to impossible to compress a clip enough to be able to get it through the attachment size limits of most email providers and still have the video be comprehensible. You'll probably need to find a web site or other method where you could post the videos for download by the students.
    You can try compression and trimming, though, and see if you can get the video small enough to email. An attachment often has to be 3MB or less to go through, though it depends entirely on the email systems on both ends. If you look to the right under "more like this" you'll find similar threads on the subject.
    Regards.

  • Conversions between character sets when using exp and imp utilities

    I use the EE8ISO8859P2 character set on my server. When exporting the database with NLS_LANG not set, a conversion should be done between the EE8ISO8859P2 and US7ASCII charsets, so some characters not present in US7ASCII should not be converted successfully.
    But when I import such a dump, all the characters not present in the US7ASCII charset are imported into the database.
    I thought that some characters should be lost when doing such a conversion; can someone tell me why this is not so?

    Not exactly. If the import is done into a database with the same character set, it does not matter how it was exported. Conversion (corruption) may happen if the destination DB has a different character set. See this example:
    [ora102 work db102]$ echo $NLS_LANG
    AMERICAN_AMERICA.WE8ISO8859P15
    [ora102 work db102]$ sqlplus test/test
    SQL*Plus: Release 10.2.0.1.0 - Production on Tue Jul 25 14:47:01 2006
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Connected to:
    Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    TEST@db102 SQL> create table test(col1 varchar2(1));
    Table created.
    TEST@db102 SQL> insert into test values(chr(166));
    1 row created.
    TEST@db102 SQL> select * from test;
    C
    ¦
    TEST@db102 SQL> exit
    Disconnected from Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    [ora102 work db102]$ export NLS_LANG=AMERICAN_AMERICA.EE8ISO8859P2
    [ora102 work db102]$ sqlplus test/test
    SQL*Plus: Release 10.2.0.1.0 - Production on Tue Jul 25 14:47:55 2006
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Connected to:
    Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    TEST@db102 SQL> select col1, dump(col1) from test;
    C
    DUMP(COL1)
    ©
    Typ=1 Len=1: 166
    TEST@db102 SQL> exit
    Disconnected from Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    [ora102 work db102]$ echo $NLS_LANG
    AMERICAN_AMERICA.EE8ISO8859P2
    [ora102 work db102]$ exp test/test file=test.dmp tables=test
    Export: Release 10.2.0.1.0 - Production on Tue Jul 25 14:48:47 2006
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    Export done in EE8ISO8859P2 character set and AL16UTF16 NCHAR character set
    server uses WE8ISO8859P15 character set (possible charset conversion)
    About to export specified tables via Conventional Path ...
    . . exporting table                           TEST          1 rows exported
    Export terminated successfully without warnings.
    [ora102 work db102]$ sqlplus test/test
    SQL*Plus: Release 10.2.0.1.0 - Production on Tue Jul 25 14:48:56 2006
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Connected to:
    Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    TEST@db102 SQL> drop table test purge;
    Table dropped.
    TEST@db102 SQL> exit
    Disconnected from Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    [ora102 work db102]$ imp test/test file=test.dmp
    Import: Release 10.2.0.1.0 - Production on Tue Jul 25 14:49:15 2006
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    Export file created by EXPORT:V10.02.01 via conventional path
    import done in EE8ISO8859P2 character set and AL16UTF16 NCHAR character set
    import server uses WE8ISO8859P15 character set (possible charset conversion)
    . importing TEST's objects into TEST
    . importing TEST's objects into TEST
    . . importing table                         "TEST"          1 rows imported
    Import terminated successfully without warnings.
    [ora102 work db102]$ export NLS_LANG=AMERICAN_AMERICA.WE8ISO8859P15
    [ora102 work db102]$ sqlplus test/test
    SQL*Plus: Release 10.2.0.1.0 - Production on Tue Jul 25 14:49:34 2006
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Connected to:
    Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    TEST@db102 SQL> select col1, dump(col1) from test;
    C
    DUMP(COL1)
    ¦
    Typ=1 Len=1: 166
    TEST@db102 SQL>

  • Exporting using exp and TRIGGERS=N doesn't work

    Exporting using exp (Export: Release 8.1.6.0.0) with the option TRIGGERS=N still exports the triggers. Is it a bug?
    Any suggestions?

    This is actually a known behaviour.
    For a table-level export, all the dependent objects (indexes, constraints, triggers, ...) are exported, and that is expected; even setting the other restricting parameters to N will not disable their export.
    For a schema-level export, the whole schema is exported, and here we can impose restrictions such as no constraints, no rows, no indexes, no triggers, etc., which will work because we have control.
    For a table-level export we don't have control over individual objects. 10g solves your problem.
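    To illustrate the distinction above with the old exp utility (placeholder names; per this thread, on 8.1.6 the table-level form brings the triggers along regardless, while the owner-level form honours the setting):
    # Table-level export: dependent objects such as triggers are exported anyway
    exp scott/tiger tables=emp triggers=n file=emp_tab.dmp log=emp_tab.log
    # Schema (owner) level export: TRIGGERS=N is honoured
    exp scott/tiger owner=scott triggers=n file=scott_schema.dmp log=scott_schema.log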
