Estimate size of export with datapump

Hi,
When I run expdp with estimate_only=TRUE, the estimate gives me 383M.
When I then run the actual expdp, the sum of the generated files is not 383M but 143.64M.
Why?
Regards

Fahd Mirza wrote:
Hi,
Estimate size of expdp is calculated using statistics for each table.
That's wrong, or at least incomplete: by default the estimate works on BLOCKS, not on STATISTICS.
Find out more in the MOS note "Understanding the ESTIMATE and ESTIMATE_ONLY Parameters in Export DataPump" (Doc ID 786165.1).
And the blocks method may give a less accurate size estimate when tables have been created with large extents or have undergone mass deletes.
Nicolas.
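
For reference, a minimal sketch of running the estimate with each method; the connect string, schema name and directory object below are placeholders, not taken from the original post:

expdp scott/tiger@db SCHEMAS=scott DIRECTORY=DATA_PUMP_DIR ESTIMATE_ONLY=Y ESTIMATE=BLOCKS
expdp scott/tiger@db SCHEMAS=scott DIRECTORY=DATA_PUMP_DIR ESTIMATE_ONLY=Y ESTIMATE=STATISTICS

With ESTIMATE=STATISTICS the figure is only as good as the optimizer statistics, so gathering fresh statistics first usually brings it closer to the real dump size. Note also that the estimate covers table row data only, not metadata, and the BLOCKS figure reflects allocated space rather than actual row volume, which is why the final dump can come out smaller.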

Similar Messages

  • Long estimate times when exporting with EXPDP

    Hello Everyone,
    I have a few questions about using EXPDP. I'm running Oracle Database 10g Enterprise Edition Release 10.2.0.1.0.
    I'm trying to understand whether or not the export times I'm seeing are reasonable. First, I am performing a SCHEMA export, excluding users, grants, roles, statistics, and several tables. When using block estimation, the following operations take 10 minutes to complete for a single schema:
    6/7/2010 12:39:39 PM: Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    6/7/2010 12:49:12 PM: Total estimation using BLOCKS method: 512 KB
    6/7/2010 12:49:13 PM: Processing object type SCHEMA_EXPORT/TABLESPACE_QUOTA
    6/7/2010 12:49:14 PM: Processing object type SCHEMA_EXPORT/TABLE/TABLE
    10 minutes seems completely unreasonable for 6 tables comprising so little data. The 10-minute time appears to be constant regardless of the data size. For example, another schema (with about 30 tables) produces the following output:
    6/7/2010 12:10:52 PM: Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    6/7/2010 12:20:33 PM: Total estimation using BLOCKS method: 16.08 GB
    6/7/2010 12:20:35 PM: Processing object type SCHEMA_EXPORT/TABLE/TABLE
    6/7/2010 12:20:36 PM: Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    Also in about 10 minutes! I wouldn't expect these times to be so close for about 100000x more data! Why does the estimate take so much time? I would expect the block estimate to be a simple calculation over data the DB server is already keeping around.
    When using estimate=statistics, the behavior is even stranger. In this case, the "Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA" operations all take ~17 minutes (regardless of the volume of data in the schema), and the actual export lines in the output (". . exported "test_schema"."table_a" 5.056 MB 5782 rows") complete very quickly (< 2 secs).
    Is this kind of behavior expected?
    To collect these times, I wrote a small application that kicks off expdp and reads the output along with a timestamp one line at a time. If the tool is buffering its output, that might explain some of this, though for the block estimate, I see the following output during the table exports, which seems reasonable:
    6/7/2010 12:20:47 PM: . . exported "test_schema"."table_a" 2.332 MB 915 rows
    6/7/2010 12:22:58 PM: . . exported "test_schema"."table_b" 954.8 MB 32348 rows
    6/7/2010 12:23:59 PM: . . exported "test_schema"."table_c" 975.6 MB 60573 rows
    6/7/2010 12:24:38 PM: . . exported "test_schema"."table_d" 553.8 MB 973 rows
    6/7/2010 12:24:56 PM: . . exported "test_schema"."table_e" 159.6 MB 2562 rows

    The command:
    expdp 'schema/********@server' estimate=blocks directory=DATA_PUMP_DIR dumpfile=8be7d007-e6c1-4e10-8164-db27d9fce103.dmp logfile=8be7d007-e6c1-4e10-8164-db27d9fce103.log SCHEMAS='schema' EXCLUDE=SCHEMA_EXPORT/USER,SCHEMA_EXPORT/SYSTEM_GRANT,SCHEMA_EXPORT/ROLE_GRANT,SCHEMA_EXPORT/DEFAULT_ROLE,SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA,SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS,SCHEMA_EXPORT/TABLE/COMMENT,SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    The new output:
    6/7/2010 4:22:19 PM: Estimate in progress using BLOCKS method...
    6/7/2010 4:22:19 PM: Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    6/7/2010 4:29:48 PM: . estimated "schema"."table1" 11.89 GB
    6/7/2010 4:29:48 PM: . estimated "schema"."table2" 1.298 GB
    6/7/2010 4:29:48 PM: . estimated "schema"."table3" 1.041 GB
    Judging by the output, I'm guessing that expdp is buffering the output and my tool is reading it all at once, such that each "estimated" line has nearly the same time. Is there a better way to get timing information from the estimation?
    Thanks for the help!

  • How do I perform an export with DataPump on the command line?

    According to page
    http://www.oracle.com/technology/obe/10gr2_db_vmware/manage/datapump/datapump.htm#t1tt
    I tried in SQL*Plus to export all the data from table CTEST to a local file
    D:\mydata\tabledata.txt by entering the following command:
    expdp "D:\mydata" TABLES=CTEST DUMPFILE=tabledata.txt
    but I got as answer:
    SP2-0734: unknown command beginning "expdp D:\m..."
    Why?
    How can I dump a table into a text file otherwise?

    Perhaps your (10g) oracle binaries location is not in your PATH?
    Perhaps you are actually executing against a 9i or earlier database release?
    Perhaps you have quotes where they do not belong -- why "D:\mydata"? It also appears your login credentials are not being supplied correctly.
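
    The SP2-0734 error itself means SQL*Plus did not recognize the command: expdp is an operating-system utility, not a SQL*Plus command, so it has to be run from the OS shell. A hedged sketch, with made-up credentials and the default DATA_PUMP_DIR directory object; note the dump file is a binary Data Pump format, not a plain text file (for a text extract you would spool a SELECT from SQL*Plus instead):

    expdp ctest_owner/password@db TABLES=CTEST DIRECTORY=DATA_PUMP_DIR DUMPFILE=ctest.dmp LOGFILE=ctest_exp.log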

  • Question about reducing PDF file size on export

    I have a large file that generates a PDF of about 75MB when exported with the "[High Quality Print]" preset. However, I can reduce the size to under 5MB in Acrobat if I use the Tools > Flattener Preview > [Medium Resolution] setting (image attached). Is there a way to generate this smaller file directly from InDesign? I couldn't figure out how to do this with InDesign's Flattener Preview or Export options, but it is very likely that I missed something.

    Eugene Tyson wrote:
    When you choose Smallest File size, it is still using the PDF 1.6 setting.
    Change this to Acrobat 4 (PDF 1.3); this will automagically flatten the PDF - which means you shouldn't have to do it in Acrobat.
    That should get you a nice small file size, I believe.
    Flattening transparency in InDesign will not necessarily yield a smaller PDF file size. Transparency flattening will not automatically convert placed vector artwork to raster images unless that vector artwork is actually involved with transparency. And even then, whether you end up with a smaller file depends on a number of factors.
    If the reason for the very large file size is indeed very complex vector artwork and you are willing to sacrifice quality, conversion of such vector artwork to raster might yield a significantly smaller exported PDF file. To accomplish that conversion, I would personally suggest converting the most offensive (in terms of file size and complexity) of such files in Illustrator (assuming that they were .AI files) and exporting them as .TIF files (to avoid the potential imaging artifacts of JPEG compression).
              - Dov

  • Maximum file size for export into MP4?

    Hello,
    I am not able to export a 2-hour HD video into a standard MP4 file. It seems that on reaching 100% the export algorithm gets stuck in a loop. I waited for hours and the progress stayed at exactly 100%, with the final file on the hard disk being 0 bytes. I am using CS5 on Mac OS X. I had to split my timeline into 2 parts and export them separately (which is embarrassing). Is there something like a maximum file size for export? I guess that a 2h video would be about 25-35GB.
    Thank you
    jiri

    You are right.
    So I am running AP Pro 5.0.4, Adobe Media Encoder 5.0.1.0 (64bit). Operating system Mac OS X ver 10.7.3. All applications are up to date.
    MacBook Pro Intel i5 2.53GHz, 8GB RAM, nVidia GT 330M 256 MB, 500GB HDD
    Video is 1920x1080 (AVCHD) 25fps in a .MTS container  (major part of timeline), 1280x720 30fps in .MOV container (2mins), Still images 4000x3000 in .JPG
    No error message is generated during export - everything finishes without any problem...just file created has 0 byte size (as described above).
    This is my largest video project (1h 54min); I don't have any problems with other projects.
    I don't run any other special software; at the moment of export all the usual applications are closed so that the MacBook's "power" can go to Media Encoder. No extra codecs installed; I use VLC Player or QuickTime.
    Attached please find a screenshot of the Export settings (AP Pro). While writing this post I tried to export only the first 4 minutes of the timeline, where all kinds of media are used, and it was OK.
    As a next step I will try to export (same settings) 1h 30min, as I still believe the problem comes from the length of the exported video.
    Let me know your opinion

  • Clips exported with Media Encoder from AE are choppy (CS6)

    Hi!
    I'm using Media Encoder (6.0.3.1) to export clips from AE (11.0.4.2) so that I can work with them in PPro (6.0.5).
    In Media Encoder I go to add an AE composition, and I've tried the YouTube HD 1080p 25 and the Broadcast HD 1080p 25 presets (the latter creates a much bigger file, but I don't think the file size is the issue - for example, the biggest clip of approx. 1 minute is 276MB).
    The curious thing is that most of the clips that are exported are fine, i.e. they play back fine - the problem is that some of the clips go slow or choppy towards the end of the clip. So the last couple of seconds will be choppy. It basically looks like the frame rate isn't quite right.
    Once exported from Media Encoder, I import them into PPro to do some more work - the problem with the clips is apparent when I play the original file in VLC (for example) or in PPro. So it seems like the problem occurs during the export from AE. When I export the video containing the AE clips from PPro the problem is obviously still there (wishful thinking on my part). I use AE just for keying and making some basic titles, nothing massively complex.
    I had some problems with RAM preview in AE before, in other words, it wasn't playing the clips back in realtime and the exported clips with the problem look exactly like that (i.e. looks like the frame rate isn't right). I deactivated Enable Disk Cache in AE (Multiprocessing was already deactivated) and now I can play back clips ok in AE using RAM preview - even the ones that go choppy towards the end when they are exported with Media Encoder play back fine in AE.
    Could Media Encoder be going back to some previous render files saved somewhere on my system and could that cause some of the exported clips to go choppy?
    I'm new to Adobe so I'm just guessing here, and I don't want to go and delete stuff that might not have anything to do with this problem. As I've adopted the computer I'm working on from my predecessor, and it's had several users since, the system is a bit of a mess and there are files all over the place. Emptying the Disk Cache doesn't actually delete anything, for example.
    I'm really stuck here because I can't finish the projects I started working on with AE.... I would appreciate someone's input!
    Many thanks in advance!
    My system specs:
    Intel Core i7-3930K, 3.20GHz, 16GB RAM, 64-bit operating system, GeForce GTX 680, Windows 7, Creative Suite CS6
    Before using Media Encoder to export clips from AE I was just relying on Dynamic Link between AE and PPro by the way and that made all of the clips choppy for the whole duration (this was before I solved the RAM preview problem and I haven't checked if a working RAM preview would make a difference to dynamic link). So Media Encoder nearly solved the problem. I posted this original problem on creativecow (http://forums.creativecow.net/thread/2/1045810) but no one has gotten back to me about the new issue unfortunately.

    Thanks for your reply!
    I tried using Quicktime with PNG and it created a big file (1.18GB for a 1 minute long clip) that doesn't play back nicely in PPro unfortunately. But I then tried rendering the clip using a lossy format, H.264, in AE and it's playing back ok in PPro! I will just do that for the rest of the clips and hope that this is finally resolved. Many thanks for the tip!
    Out of curiosity, do you know what might be causing the problem when I use Media Encoder?

  • Why is the version file smaller than the original file, although I didn't make changes in the file? And why are the keywords not exported with the original files?

    Hi! Why is the version file smaller than the original file, although I didn't make changes in the file? And why are the keywords not exported with the original files?

    Wild guess: you're using the wrong export settings. You'll need to tell us more before we can help you - like the export settings you're using, the size and format of the originals, etc.

  • I want to export with alpha channel using a codec that carries less data than gopro cineform

    Hi -- I want to export some intermediate files with an alpha channel. I'm using GoPro CineForm, and these files end up being so large that I have to sit through hour-long renders. Is there a codec I can use that has an alpha channel but creates a smaller file size?
    thanks

    Thanks for your help Patrick. I tell ya, I can't believe what's going on. As I said, this project until today has not caused me any issues whatsoever. The other six sessions are totally fine. I also created a quick project with the same settings and added text, then exported using the same settings I've used before, and it exports with the proper alpha channel.
    Yet, I rebooted the Mac, made sure to delete old versions in the trash can and everything. Still, no alpha.
    To keep moving forward, I literally had to use a color key in FCP to cut out the background color.
    Crazy!

  • Export with consistent=y raise snapshot too old error.

    Hi,
    Oracle version:9204
    It raises
    EXP-00056: ORACLE error 1555 encountered
    ORA-01555: snapshot too old: rollback segment number 2 with name
    "_SYSSMU2$" too small
    when I do an export with consistent=y option.
    And I find below information in alert_orcl.log
    Wed Apr 20 07:50:01 2005
    SELECT /*NESTED_TABLE_GET_REFS*/ "XXX"."TABLENAME".* FROM
    "XXX"."TABLENAME"
    ORA-01555 caused by SQL statement below (Query Duration=1140060307
    sec, SCN: 0x0000.00442609):
    The undo parameters:
    undo_retention=10800(default value)
    undo_retention is larger than the time the export runs (only 1800
    seconds), so I think the default value is enough.
    undo_management=auto(default value)
    Maybe the undo tablespace is too small (about 300M)? But I think Oracle should increase the size of the datafile in this mode. Is that right?
    undo_tablespace=undotbs1
    undo_suppress_errors=false
    I think I must miss something.
    Any suggestions will be very appreciated.
    Thanks.
    wy

    UNDO_RETENTION is a request, not a mandate. If your UNDO tablespace is too small, Oracle may have to discard UNDO segments before UNDO_RETENTION is reached.
    How much UNDO is your database generating every second?
    SELECT stat.undoblks * param.value / 1024 / 1024 / 10 / 60 undo_mb_per_sec
      FROM v$undostat  stat,
           v$parameter param
     WHERE param.name = 'db_block_size'
    Justin
    Distributed Database Consulting, Inc.
    http://www.ddbcinc.com/askDDBC

  • Huge size scheme export in standard edition

    Hi,
    Source:-
    Oracle 10.2.0.3 standard edition on Linux.
    Destination:-
    Oracle 11gR2 - enterprise edition.
    I have to export a schema of 250+ GB; it is taking a long time to export as we do not have parallelism in Standard Edition.
    Is there any way I can perform the export and import faster?
    The constraint is that expdp of the schema takes 30+ hours. If I use transportable tablespaces, is there any compatibility problem between the source and destination versions and editions?
    And what is the procedure?
    Thanks.

    Hemant K Chitale wrote:
    Can I use 11gR2 binaries to perform TTS of a 10g Standard Edition database?
    You could concurrently run multiple export sessions with table lists --- but you wouldn't get data consistency if the tables are being updated.
    Thanks for your information.
    This question is now beyond TTS; I am asking about expdp/impdp in general.
    I had posted this question in the Export/Import section of the Database forum, but got no quick responses, so I moved it to Database - General.
    Solomon Yakobson mentioned that we can use 11gR2 binaries to perform a schema export of a 10g database, in the link below.
    Huge size scheme export in standard edition
    Hope this will work. Any more suggestions on this?
    Edited by: Autoconfig on Oct 17, 2011 6:32 AM
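
    For what it's worth, a rough sketch of the "multiple concurrent export sessions" idea mentioned above; the schema, table names and SCN are made-up placeholders, and giving every job the same FLASHBACK_SCN keeps the pieces mutually consistent as long as enough undo is retained:

    expdp system/*** TABLES=app.big_tab1,app.big_tab2 DIRECTORY=DATA_PUMP_DIR DUMPFILE=part1.dmp LOGFILE=part1.log FLASHBACK_SCN=1234567 &
    expdp system/*** TABLES=app.big_tab3,app.big_tab4 DIRECTORY=DATA_PUMP_DIR DUMPFILE=part2.dmp LOGFILE=part2.log FLASHBACK_SCN=1234567 &

    The current SCN can be read beforehand from v$database (current_scn). Splitting the biggest tables across jobs is what buys the time back, since each session still runs serially in Standard Edition.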

  • Any way to export with multiple presets at once?

    Hi! Is there any way to export with multiple presets at once? I do a lot of product photography; usually clients send me around 5 to 10 pieces to photograph, and each photo needs to be exported to about 7 different formats and sizes for different uses. I have a preset for each export, but this requires multiple clicks per export: select the preset, the destination folder, etc. In the end I spend about an hour just exporting files, whereas if I could select a folder of presets and the destination folder just once, it would take no time to get all of those exports.
    Thanks!

    Some of those other photographers have assistants they can task with doing the drudgery.
    I would describe what you're wanting as scripted multiple exporting, not multiple export presets, because multiple exports don't necessarily rely on presets; they just involve initiating an Export and clicking on various things for each one.
    I assume you have a preset for each of the 7 format-size-use variations, and that the most time-consuming part is choosing the destination folder that gets mirrored to Dropbox? You can copy/paste most of the path into the folder address area after clicking Choose.
    Without knowing what your master-photo folders and dropbox-mirror folder names and organization are it's hard to know if you've thought of all the shortcuts you might use or if things are organized in the most efficient manner.
    If you're on Windows, maybe something like AutoHotKey macros would help with what you're doing.

  • ld: cycle in dylib re-exports with /usr/X11/lib/libGL.dylib linker error

    Hello,
    I found out that some open source applications, when I try to install them from source or through Fink or MacPorts, simply fail at linking with the error:
    ld: cycle in dylib re-exports with /usr/X11/lib/libGL.dylib
    However, when I force the linker to use /System/Library/Frameworks/OpenGL.framework/Versions/A/Libraries/libGL.dylib instead it works.
    I am wondering what the problem is, since /usr/X11/lib/libGL.dylib has to be re-exported from /System/Library/Frameworks/OpenGL.framework/Versions/A/Libraries/libGL.dylib.
    It is very painful to force configure to use the right version of libGL.dylib. Since the two libraries are of
    different sizes, they should be different. Simply replacing /usr/X11/lib/libGL.dylib with /System/Library/Frameworks/OpenGL.framework/Versions/A/Libraries/libGL.dylib would work, but what are the consequences for the other applications?
    I would be very happy if someone from OS X development would reply to this and address this problem in the future.
    Regards,
    Slobodan
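
    As a hedged sketch of the workaround described above (how each package's configure script honours LDFLAGS varies, so this may need adjusting), passing the framework copy of the library explicitly at link time is one way to force the linker's choice:

    ./configure LDFLAGS="/System/Library/Frameworks/OpenGL.framework/Versions/A/Libraries/libGL.dylib"

    Replacing or symlinking /usr/X11/lib/libGL.dylib itself is riskier, since other X11 applications expect the X11 build of the library.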

    Post to the Unix forum under OS X Technologies and you should get a knowledgeable answer.

  • Size & Quality - Exporting Pix from iPhoto for Display and Printing?

    I would like to export pictures from iPhoto for use in PowerPoint.
    If the pictures are in 4x3 format, exporting them using the "Custom" size option and specifying 1024 as the maximum pixels results in nice 1024 x 768 photos that fill the screen on most projectors. This is good.
    The exported photos are also at 72 pixels/inch. This is OK for onscreen display, but not so good for printing.
    Oddly, the results are the SAME (1024 x 768, 72 pixels/inch) whether one specifies Low, Med, High or Max quality in the iPhoto export panel -- though the file sizes ARE different (higher quality = larger sizes). (I used Photoshop Elements to identify the sizes of the exported photos).
    What gives? I would have thought higher quality files would have been associated with higher resolution (more pixels/inch) but this doesn't seem to be the case, and I don't understand why the file sizes are larger.
    Can someone recommend a way to export a batch of photos from iPhoto at a specific size (1024 x 768) and perhaps 300 pixels/inch) that I might use in PowerPoint for both display and printing? And perhaps explain the bloated file sizes with no apparent quality increase?
    Many thanks!

    Luxesto
    Welcome to the Apple user to user assistance forums
    If the pictures are in 4x3 format, exporting them using the "Custom" size option and specifying 1024 as the maximum pixels results in nice 1024 x 768 photos that fill the screen on most projectors. This is good.
    The exported photos are also at 72 pixels/inch. This is OK for onscreen display, but not so good for printing.
    DPI is meaningless for a digital photo until you display it - dots per inch on its own is just a number - the displayed DPI is the size of the photo in dots (pixels) divided by the size of the display measured in inches - the DPI setting stored in the photo means nothing - see The Myth of DPI for a longer discussion on this.
    Oddly, the results are the SAME (1024 x 768, 72 pixels/inch) whether one specifies Low, Med, High or Max quality in the iPhoto export panel -- though the file sizes ARE different (higher quality = larger sizes). (I used Photoshop Elements to identify the sizes of the exported photos).
    What's odd? You said export with the largest dimension 1024 - that occurred just as you specified - what would be odd would be if the export were different from your specification.
    What gives? I would have thought higher quality files would have been associated with higher resolution (more pixels/inch) but this doesn't seem to be the case, and I don't understand why the file sizes are larger.
    Higher quality means less compression (and therefore larger file size) - it does not mean different pixel dimensions - JPEG has compression levels from 1 to 12 available - iPhoto does 4 steps: low, medium, high and maximum. I've not seen the compression levels documented, but I think of them as roughly 3, 6, 9 and 12.
    Can someone recommend a way to export a batch of photos from iPhoto at a specific size (1024 x 768) and perhaps 300 pixels/inch) that I might use in PowerPoint for both display and printing? And perhaps explain the bloated file sizes with no apparent quality increase?
    Forget the DPI - just be sure the pixel sizes are correct - for printing you should have at least 150 dpi and 300 dpi is better - so at 150 dpi your 1024x768 photo can be printed at about 6.8" x 5.1" -- at 300 dpi it would be only about 3.4" x 2.6".
    LN

  • Transaction Status during Database Export through DataPump

    Hi,
    I have a question and need clarification. We have a 120GB database (version 11gR1) which has a single application schema in addition to the standard schemas (like SYSTEM, SYSAUX, EXAMPLE, etc.). The objects in the application schema total roughly 110-115 GB, and the rest is the other standard schemas' objects. Application users make transactions through a front-end J2EE application that hits the application schema's objects.
    We usually expdp that schema. Now my question is: when we export the schema while users are performing transactions, what will be the status of the transactions made during the export process? Will they be reflected in the dump or not? What is Oracle's actual mechanism for such transactions in the case of a Data Pump export?
    Regards,
    Kamran

    Let's say you have 3 tables
    table a
    table b
    table c with partition 1
    table c with partition 2
    Now let's say all of these tables are being exported and while the export is happening, users are adding/dropping rows to these tables/partitions. This is what Data Pump will do if you don't specify flashback_time or flashback_scn;
    Since table a and table b are not partitioned, they will be exported with the scn that is current when the unload of that table is scheduled. So,
    table a gets scheduled to be unloaded when the scn is 12345. The data in the dumpfile for this table will be consistent with 12345. If rows were modified before 12345, the data in the dumpfile will have those rows. If rows were added after 12345, they would not be in the dumpfiles.
    Table b gets scheduled to be unloaded when the scn is 23456. The data in the dumpfile for this table will be consistent with 23456. If rows were modified before 23456, the data in the dumpfile will have those rows. If rows were added after 23456, they would not be in the dumpfiles. If you had ref constraints from table a to table b, you could hit a problem.
    Table c partition 1 gets scheduled at 34567. All of the data in the dumpfile will be consistent with 34567. Just like above
    Here is the exception to the rule!
    Table c partition 2 gets scheduled at 45678. The difference is that partition 1 of this table got scheduled at 34567, so the data in this partition will also be
    consistent with 34567. All partitions of a table will use the same SCN as the first partition that is scheduled.
    If you want a consistent dumpfile, use either
    flashback_time='some date' (you can use flashback_time=sysdate if you want to put it in a script and not have to update the date in the script)
    or
    flashback_scn='some_scn'
    Hope this explains what you wanted to know.
    Dean
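
    A minimal sketch of such a consistent schema export along those lines; the schema name, timestamp and file names are placeholders, and the FLASHBACK_TIME expression is put in a parameter file to avoid shell quoting issues (on 11.2 and later FLASHBACK_TIME=SYSTIMESTAMP can also be given directly):

    Contents of exp_consistent.par:
    schemas=app_schema
    directory=DATA_PUMP_DIR
    dumpfile=app_schema.dmp
    logfile=app_schema_exp.log
    flashback_time="TO_TIMESTAMP('2012-01-15 22:00:00','YYYY-MM-DD HH24:MI:SS')"

    Then:
    expdp system/*** parfile=exp_consistent.par

    Every table (and every partition) in the dump is then consistent with the single point in time given, at the cost of needing enough undo to cover the run.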

  • File size on export

    I need to know the file size that will be generated when exporting jpegs for web use, as certain sites have restrictions. This is a fairly basic function and is available in Elements.
    I currently have to guess a quality setting, export the jpeg, check the file size in Explorer, re-export from LR (reducing quality/pixel limits if the file size is too big) and continue the iterations until I achieve an acceptable size.
    This facility should be available for all export file-type options.

    I agree, but I think this has to go beyond the file size.
    The problem is that if you are exporting more than one jpeg, even with the same number of pixels and quality, the file sizes will be different due to compression.
    To make this work, it'd be nice to be able to set a Maximum File Size to Export. Then perhaps a Radio Button to choose between adjusting the size in pixels or adjusting the quality to fit as needed.
    Paul Wasserman
