Dump size decreasing after each export

Hi All,
I am working with Oracle 10g with the UTF8 character set. Whenever I export the data, the dump size is smaller than the previous export's.
What could be the problem?
Thanking you,
Regards,
Gaurav Sontakke

What kind of size decrease are you seeing? If it is minimal it could just be the order in which information is stored in the dumpfile.
Dean

Similar Messages

  • Today EXP dump size decreased

    Hi,
    DB: 10.2.0.4
    OS: AIX 5.3 L
    It is 2 node RAC on ASM with single standby DB.
    I have been taking an EXP backup of my primary DB daily. Unfortunately, today my backup size has decreased by 1 GB. My total exp dump file size comes to 105 GB. It has happened today only.
    Here my question is:
    Is there any chance that today's EXP dump file size could be smaller than the previous dump size? If so, please let me know what could cause it.
    Thanks,
    Sunand

    Hi G777 & Sybrand,
    Thanks for your reply.
    We (the users) did not delete any data, and on the DBA side there were no index rebuilds or table analyses. We do have updates and inserts (obviously) on tables, but that DML should not cause the dump size to decrease.
    Sybrand, can you explain your reply in more detail?
    Thanks,
    Sunand

  • My database size is 10 GB; what will my export dump size be?

    My database size is 10 GB; what will my export dump size be?
    Why is there a difference?

    Hi,
    an Oracle block consists not only of the row data itself,
    but also of a block header.
    This is made up of different structures which use about:
    - 60 bytes of fixed administrative information for the block
    - 24 bytes per transaction slot (depending on the value of INITRANS there will be a number of preallocated slots for transaction pointers in every block header; the default is 1 for tables and 2 for indexes). This so-called ITL consists of three parts:
      - the undo segment number the transaction is using for the before images
      - the undo slot number in the undo segment
      - a sequence number (this is incremented at commit time)
    - 2 bytes for every row which is inserted into the block.
    The rows also each have a header (containing, for example, the number of columns belonging to the row), and every column in a row has a header as well (containing information about the datatype and the length of the value for the column).
    You can see that there is a lot more information in a block than just the row data.
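    As a rough, purely illustrative calculation, the per-block overhead described above can be added up for an 8 KB block with the default INITRANS of 1 (the 100-rows-per-block figure is an assumption for the sake of the example):

```shell
# Hypothetical example: 8 KB block, INITRANS=1 (one ITL slot), 100 rows.
BLOCK=8192   # block size in bytes
FIXED=60     # fixed administrative information
ITL=24       # one preallocated transaction slot
ROWS=100     # assumed number of rows stored in the block
ROWDIR=2     # row directory entry per row

OVERHEAD=$((FIXED + ITL + ROWS * ROWDIR))
echo "block header overhead: $OVERHEAD bytes (~$((OVERHEAD * 100 / BLOCK))% of the block)"
```

    And on top of that, every row and every column carries its own small header, so the real difference between database size and dump size is larger still.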

  • Export minimize dump size

    hi all;
    I have 10.2.0.2 running on Red Hat. I have to take a backup of a schema and import it into a test DB.
    This schema contains 20 tables and some of them have millions of records. I tried to export this schema using:
    exp system/password file=exp.dmp log=exp.log owner=schemaowner
    but the dump size is nearly 1 GB and this is too big for me. As I mentioned, this dump will be used for a test and I don't need all the records; for example, the last 7 days of records is enough for me.
    I can't take this export without rows entirely; at least some of the records are needed.
    How can I do this export and import?
    thanks...

    Hello,
    I prefer to use Parameter Files like the following ones:
    For the Original Export:
    TABLES=(
    <schema>.<table_1>,
    <schema>.<table_2>,
    <schema>.<table_n>
    )
    FILE=<path>\<dump_file_name>.dmp
    LOG=<path>\<log_file_name>.log
    COMPRESS=N
    DIRECT=Y
    For the Original Import:
    FROMUSER=<schema>
    TOUSER=<schema>
    TABLES=(<table_1>,<table_2>,...,<table_n>)
    FILE=<path>\<dump_file_name>.dmp
    LOG=<path>\<log_file_name>.log
    IGNORE=Y
    FEEDBACK=10000
    Then you just have to execute the statements below:
    exp <user>/<password> PARFILE=<export_parameter_file>
    imp <user>/<password> PARFILE=<import_parameter_file>
    Hope this helps.
    Best regards,
    Jean-Valentin
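    Since the poster only needs the last 7 days of records, the original exp QUERY parameter (listed in exp help=y) is another option worth mentioning. A sketch of a parameter file, assuming a hypothetical table ORDERS with a date column CREATED_DATE; your table and column names will differ:

```text
TABLES=(schemaowner.orders)
QUERY="WHERE created_date > SYSDATE - 7"
FILE=exp_last7.dmp
LOG=exp_last7.log
```

    Note that QUERY requires a conventional-path export (it cannot be combined with DIRECT=Y), and the same WHERE clause is applied to every table listed.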

  • Best settings for quality/size trade-off when exporting to QT?

    Hi All,
    I've been using Keynote '09 (5.3) to record narrated slide shows (slides with me narrating by voice) for my online class "lecture" podcast series for a while - now that the beloved, but quirky and unsupported, ProfCast has fallen by the wayside. This works well - as we all know - producing clear audio and visuals that upload well to YouTube, where I am now hosting the files.
    The problem I'm running into now is that my lectures often last around 50 minutes, and this produces enormous file sizes (like the original Keynote file comes in around 800MB with the slideshow saved). I've been exporting to Quicktime with full size video and mixed audio set to AAC, and this produces a QT mov of around 130MB or so. Still quite large, but at least manageable for uploading, even with DSL at home.
    Ideally, though, I'd have even smaller mov files, but without much loss in quality. In my earlier experimentation, I realized that it was the audio that was eating up loads of space. However, switching to half-size video cuts the same file to around 80MB as well. Once I've opened the can of worms of custom audio and video settings, I feel like there are many, many possible combinations of sizes and qualities... and it's difficult to steadily compare each and every permutation (it takes a while for each export to run!).
    So, the question I have is: are any of the other AAC settings in the menu likely to make a significant dent in mov file size? Or, are these really just small differences on the margin? Even if I stay with AAC overall, would tweaking the kHz Rate make much of a difference?
    Looking forward to any wisdom from experience you all might have to pass along!

    I was looking for command-line PDF conversion tools and eventually found out this:
    https://www.pdf-tools.com/pdf/pdf-optimization-reduce-size-1.aspx
    To me, it is way too pricey (US$570), but maybe it's a viable solution for you. It has watched-folder workflows, therefore it can be handy for your purposes. You can download a demo and even test it online (the resulting PDF is watermarked).
    I submitted a file to their online testing page and was impressed with the results: a 1.5 MB original PDF resulted in a 925 KB file (while an Acrobat "optimized" file of the same sample PDF turned out to be _bigger_ than the original). What makes it even more impressive: in the online test I deselected ALL the image downsampling settings, to force the optimized PDF to keep the original image quality and resolution. Then, in Acrobat, I exported the images from both the original PDF and the optimized one, and compared them in Photoshop. They were not identical, but the difference is practically imperceptible. I guess that applying downsampling to the test would result in even smaller PDFs, but I didn't test this.
    Anyway... I don't have any affiliation with those guys, and I have neither the need nor the money to buy this product. I'm only sharing these thoughts because they may point you to other ways to find a solution. The above-mentioned application seems to be very efficient. If you can afford it, then I think it deserves a try. If not, you can search the Web for plugins or other tools that fit your needs and budget. Acrobat/InDesign approaches are not the only way to go.

  • Emailed photos are small no matter what size I choose in the export dialog box

    E

    LR is working on the uncompressed data inside of itself so the relative size between originals and exported size is determined entirely by what you do to the JPGs as you process and the export settings.
    If you’re reexporting your JPGs without resizing them, then the Quality setting has the most effect on increasing or decreasing the size relative to what was imported. 
    Also, making them smoother (using noise-reduction) will also help them compress more, and making them less smooth (adding sharpening) will prevent them from being compressed as much for a given quality setting.
    JPGs lose quality when you compress them so the trick is to compress them enough to make them a manageable size without degrading them too much.   Export your images as uncompressed 8-bit TIFs to see what the JPG compression is accomplishing.
    Increasing the quality setting higher than 70 will make them larger so you can experiment with multiple quality settings to gauge what quality the originals were created using. 
    For your experiments to be most accurate you’d want to export the JPGs without any other processing having been applied: no toning or noise-reduction or sharpening or resizing.  Once you determine what quality setting gives a similar size to what was imported you can then decide if you want to make them smaller to keep your e-mail size manageable, comparing the quality of the files, visually, instead of just relying on size to know if they have similar quality.
    LR may use slightly different JPG compression methods than what produced the JPGs in the first place, so don’t be too obsessed with them being exactly the same size, just roughly the same magnitude.

  • Question about importing an Oracle dump of size 20 GB

    I got an Oracle dmp file of size 20 GB from a client,
    exported from Linux AS4 (Oracle database version 9iR2).
    Is it possible to import it into Windows Server 2003 running Oracle 10gR2?
    Please post your views on this.

    Why not? exp and imp are widely used in cross-platform environments. Go ahead with your imp, and if you come across something that needs to be sorted out, update the thread with your problems.
    hare krishna
    Alok

  • Exporting Quark to Acrobat. Acrobat enlarges page size, how do I export to Adobe Acrobat so that it keeps the page size the same?

    Exporting Quark to Acrobat. Acrobat enlarges page size, how do I export to Adobe Acrobat so that it keeps the page size the same?

    Thank you. I used to be able to Export from Quark to Acrobat without any problems when I was using Acrobat 7. Just changed to Acrobat 8 and have this problem. My 8.5x11 pages become 33x44 inches when exported to Acrobat 8. 400% increase in size. I don't know why. I tried Printing my Quark file to Adobe PDF and that worked. Thank you for your help.
    Margo

  • Original size Raw to JPG export not exporting correct size

    I'm having a problem with the export of RAW to JPG at original size. I have raw images of 4941 × 3295 (125 MB); when I export selecting "JPEG - original size" I get resulting JPGs of 800 × 533 (95 KB).
    If I select "JPEG - 50% of original size" I get correctly exported images of 2471 × 1648 (589 KB).
    Why would the original size export not work?

    Have you checked the "JPEG original size" preset? Select the preset on export, then switch to "Edit" and see what is selected. Sometimes this preset is redefined to a smaller size.
    When you edit the preset, the "Size To" pop-up should be set to "Original size". If it is not, set it back.
    Regards
    Léonie

  • Error 0164 memory size decreased

    Hello, I recently was completing RAM testing with some of our ThinkCentre M Series computers.  Once I completed the testing, I put the same amount of RAM back into the machines.  It functions fine and shows up in BIOS and Windows, but the message "error 0164 memory size decreased" simply will not go away.  I've tried removing the CMOS battery and various other things (there are previous threads with similar issues, none of the resolutions worked for me though).  Any ideas?
    Thanks

    Try adding more memory than the original amount and see what happens? Just for troubleshooting purposes...
    Disclaimer: While I do work for Lenovo Partner, all my contributions are my personal, non-official and not that of Lenovo or my employer.

  • Is there a file size limit of each pic /or total of all pictures for iBook

    I am putting together an iPhoto book using pages of several pictures each, which I have composed in Photoshop. I then import these pages as a JPEG or PSD to iPhoto. I am finding the JPEGs often do not import (with a message explaining why, etc.) but that the PSD documents do import. These are rather large documents because of the several pictures used on each page. I am planning to do 30 double-sided full-bleed pages, and I am concerned the finished book will be refused because the completed book file is too large. Should I be? Another way to put this could be: is there a size limit for each page, or for the total size of the book, that iPhoto will accept for an individual book?

    Your best bet would be to make duplicate of your iPhoto Library folder and then open the duplicate and delete all the photos that do not belong to the book. Keep the book in the Source pane however. If you think some photos might be added later then keep them also.
    Now the iPhoto Library folder will be small enough to burn to disk (just drag the folder onto a mounted, blank disk and burn via the Finder). That folder can then be copied from the CD to the desktop of the other computer and opened with iPhoto to continue with the fine tuning. I'd rename the library folder to something like "Book Library" before burning to disk. This method is also a great way of preserving the book after you've ordered it in case you want to order a second copy at a later date. I create a special library for all my books for that exact reason. Once ordered and burned, I can reorder the same book later on even if I have a catastrophic drive crash in the meantime.
    Do you Twango?

  • Export dump size precheck

    Hi
    Before doing an export of Oracle users or a database, is there any option to check what the export dump file size will be?

    Check what options you have by using these commands:
    $ exp help=y
    $ expdp help=y
    For example, exp help=y gives:
    Export: Release 11.1.0.7.0 - Production on Wed May 4 02:57:46 2011
    Copyright (c) 1982, 2007, Oracle. All rights reserved.
    You can let Export prompt you for parameters by entering the EXP
    command followed by your username/password:
    Example: EXP SCOTT/TIGER
    Or, you can control how Export runs by entering the EXP command followed
    by various arguments. To specify parameters, you use keywords:
    Format: EXP KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
    Example: EXP SCOTT/TIGER GRANTS=Y TABLES=(EMP,DEPT,MGR)
    or TABLES=(T1:P1,T1:P2), if T1 is partitioned table
    USERID must be the first parameter on the command line.
    Keyword Description (Default) Keyword Description (Default)
    USERID username/password FULL export entire file (N)
    BUFFER size of data buffer OWNER list of owner usernames
    FILE output files (EXPDAT.DMP) TABLES list of table names
    COMPRESS import into one extent (Y) RECORDLENGTH length of IO record
    GRANTS export grants (Y) INCTYPE incremental export type
    INDEXES export indexes (Y) RECORD track incr. export (Y)
    DIRECT direct path (N) TRIGGERS export triggers (Y)
    LOG log file of screen output STATISTICS analyze objects (ESTIMATE)
    ROWS export data rows (Y) PARFILE parameter filename
    CONSISTENT cross-table consistency(N) CONSTRAINTS export constraints (Y)
    OBJECT_CONSISTENT transaction set to read only during object export (N)
    FEEDBACK display progress every x rows (0)
    FILESIZE maximum size of each dump file
    FLASHBACK_SCN SCN used to set session snapshot back to
    FLASHBACK_TIME time used to get the SCN closest to the specified time
    QUERY select clause used to export a subset of a table
    RESUMABLE suspend when a space related error is encountered(N)
    RESUMABLE_NAME text string used to identify resumable statement
    RESUMABLE_TIMEOUT wait time for RESUMABLE
    TTS_FULL_CHECK perform full or partial dependency check for TTS
    VOLSIZE number of bytes to write to each tape volume
    TABLESPACES list of tablespaces to export
    TRANSPORT_TABLESPACE export transportable tablespace metadata (N)
    TEMPLATE template name which invokes iAS mode export
    Export terminated successfully without warnings.

  • Export Dump Size

    Oracle Version 10.2.0.4
    My database total size(allocated) is 120 GB
    Used is 79 GB
    When I take an export, can I assume the database export dump file size would be around 70 GB?

    Dean Gagne wrote:
    > If you use Data Pump to export data, you can estimate the size of the dump file using the following parameter:
    > ESTIMATE_ONLY=Y
    This is not really true. All that is being estimated is the size of the data; it does not include the size of the metadata.
    I'm not sure if there is a good way to estimate how large the dump file from Data Pump will be.
    Dean

    Agreed!
    But the estimated size may at least give some idea.
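    For what it's worth, the estimate being discussed can be produced with a one-liner like the following (a sketch; it assumes a user with export privileges, and no dump file is written):

```shell
# Data-only size estimate; metadata (views, grants, PL/SQL, ...) is excluded.
# ESTIMATE=BLOCKS is the default; ESTIMATE=STATISTICS uses optimizer statistics.
expdp system/password FULL=Y ESTIMATE_ONLY=Y ESTIMATE=BLOCKS
```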

  • Dramatic decrease in AB Export Archive size?

    I do a backup (manually) of several key folders. Between 10.6 and 10.6.1 update, doing an Export>address book archive I am seeing a dramatic decrease in file size between the two backups.
    Under 10.6.1 the file size is 9.4 MB, whereas under 10.6 it is 37.8 MB. Nothing else changes with the AB.
    I have deleted today's 9.4 MB archive several times and re-saved it. The result is consistent.
    Ideas as to the huge difference???
    Cletus/
    MBP 2.16 GHz C2D, 2GB RAM, 160HD

    Well, a couple of things:
    1) The default compilation is to include debug code. So be
    sure you are selecting Export Release Build when you build in Flex
    Builder, or disabling debug on the command line. That should reduce
    it a little.
    2) Use RSLs (runtime shared libraries). Specifically, RSLs
    are designed to reduce the app size, especially if the user has
    already downloaded the signed framework RSL. More info:
    http://livedocs.adobe.com/flex/3/html/help.html?content=rsl_09.html
    3) Every app has a lot of built in framework code -- even the
    simplest app has a lot of framework classes that it must include
    (such as managers and layouts). If you add components, though, the
    increase in file size will be minimal. So an app by itself might be
    150K, but if you add 10 buttons and 10 labels, the app might only
    increase a few K, if that much.
    There's some doc that mentions additional techniques on
    reducing SWF file size here:
    http://livedocs.adobe.com/flex/3/html/performance_06.html#208825
    hth,
    matt horn
    flex docs

  • Export dump size: how to predict

    We had a client database of 45 GB, but the dump was only 1.5 GB. There were no views, stored procedures, synonyms, primary keys, etc., except tables and indexes, but it was performing wonderfully well.
    Now we have a DB of 12 GB with more than 6 GB of dump. But here we have 20,000 views and hundreds of synonyms.
    How does Oracle generate the dump size?

    Maran,
    > I have created pipes and compressed in Unix. How to do it in Windows?
    A solution could be to place the dump file under a compressed directory.
    Or, make some scripts; examples here.
    Nicolas.
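    For reference, the Unix pipe trick mentioned above usually looks something like this (a sketch, assuming the classic exp client; the paths and schema name are placeholders):

```shell
# Write the export through a named pipe and compress it on the fly,
# so the uncompressed dump never touches disk.
mkfifo /tmp/exp_pipe
gzip < /tmp/exp_pipe > /backup/exp_schema.dmp.gz &
exp system/password OWNER=schemaowner FILE=/tmp/exp_pipe
wait                 # let gzip finish draining the pipe
rm /tmp/exp_pipe
```

    Windows has no mkfifo, which is why the compressed-directory or script-based suggestions above are the usual workaround there.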
