Exp/imp with no data, just structure, uses a ton of disk space

I am attempting to migrate a schema from one database instance to another and have run into a problem with disk space. All I want to migrate is the structure, so I am exporting without the data. Yet the tablespace datafiles grow to a size as if they actually held data. I don't have the disk space for this and would like to know if there is a way around the problem. Also, can anyone explain why the space is consumed even though no data is being imported?
I am not a DBA, but our DBA suggested I include a compress=n parameter. However, this doesn't appear to solve the issue.
Thanks,
Jason

Go Google for "Databee" and then search their home page for the DDL Wizard
OK, forget that: just go visit http://www.ddlwizard.com/
(It is tricky to see on their home page).
Feed it your row-less export dump files and turn them into a bunch of 'create xxxx' statements. Edit those statements so they aren't asking for stupid INITIAL extent sizes. Run those statements. Then do a standard import using ignore=y. That will get your tables created empty and small and the subsequent import will get all your other schema objects back.
The DBA that suggested compress=n is on the right track: compress=y will mean import will seek to create empty tables as big as the fully-populated table currently is, pre-allocating all the space it thinks the table will eventually need given the amount of data that might, one day, be inserted into it.
Compress=n will seek to create the table with the smallest requested extent size and no more than that: row inserts will make it grow big later on, but that growth is left to happen when it needs to happen.
The only other potential fly in the ointment is, as someone else said, the fact that if you're importing into a tablespace that has been created 'extent management local uniform size 100M', then the mere fact of creating a completely empty table will cause 100M of space to be consumed. You would be much better off making sure your tablespaces are created 'extent management local autoallocate': then you start off with small space allocations and only get bigger when you really need to.
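For reference, the structure-only round trip described above might look like this. It is only a sketch: the user names, passwords, file names, and datafile path are placeholders, and the tablespace clause is the autoallocate variant recommended above.

```shell
# 1. Structure-only export: rows=n skips the data, compress=n keeps the
#    original (small) INITIAL extent sizes in the generated DDL.
exp system/manager OWNER=jason FILE=structure.dmp ROWS=N COMPRESS=N LOG=exp_structure.log

# 2. On the target, create the tablespace with autoallocate so an empty
#    table grabs only a small first extent, not a uniform 100M one.
sqlplus / as sysdba <<'EOF'
CREATE TABLESPACE app_data
  DATAFILE '/u02/oradata/app_data01.dbf' SIZE 100M AUTOEXTEND ON
  EXTENT MANAGEMENT LOCAL AUTOALLOCATE;
EOF

# 3. Import the structure; ignore=y skips "already exists" errors for any
#    objects you pre-created from the DDL Wizard output.
imp system/manager FROMUSER=jason TOUSER=jason FILE=structure.dmp IGNORE=Y LOG=imp_structure.log
```

These commands need a running Oracle instance on both ends, so treat them as a template rather than a copy-paste script.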

Similar Messages

  • When i import movies from my iPhoto app to my iMovie app does it make a copy (thereby using double the hard disk space) or does it link to my iPhoto app? The reason i ask is i want to archive all my videos from iPhoto/iMovie. HELP!

    When i import movies from my iPhoto app to my iMovie app does it make a copy (thereby using double the hard disk space) or does it link to my iPhoto app? The reason i ask is i want to archive all my videos from iPhoto/iMovie. Do i need to archive the movies in iPhoto AND iMovie? HELP!

    Hi - I'm having the same problem with freeing up space on the HD. If I can't free space if I'm using any part of the clip, how can I split up the clips so that I can discard the part I won't use? This is crazy! I have only 4 GB of free space left!
    Also, I've tried to transfer the project file to my external HD but I keep getting an error saying I can't transfer it. I've been told this is because I don't have enough space on the HD for the transfer. I tried a similar transfer with another computer with plenty of HD space but the same error occurred. Is there a way to break up the project file into small pieces for the transfer (I'm thinking the whole movie file is just too big)?
    Thanks for any info!

  • HT3275 2 problems: All files are locked - how to globally unlock on the backup drive? I am backing up large video files 500GB on a 2 TB machine. Time Machine repeatedly copies everything using up all of disk space. I only need 1 copy not, twenty. What to

2 problems: All files are locked - how can I globally unlock them on the backup drive? I am backing up large video files (500GB) on a 2 TB machine. Time Machine repeatedly copies everything, using up all of the disk space. I only need one copy, not twenty. What to do?


  • Migrate exp/imp into data pump

    Hi Experts,
we have been using exp/imp to export about 150G of data to support Streams.
As I understand it, Data Pump will speed up the export.
How do I migrate my exp/imp syntax to Data Pump?
    my imp/exp as
    exp USERID=SYSTEM/tiger@test OWNER=tiger FILE=D:\Oraclebackup\CLS\exports\test.dmp LOG=D:\Oraclebackup\test\exports\logs\exportTables.log OBJECT_CONSISTENT=Y STATISTICS=NONE
    imp USERID=SYSTEM/tiger FROMUSER=tiger TOUSER=tiger CONSTRAINTS=Y FILE=test.dmp IGNORE=Y COMMIT=Y LOG=importTables.log STREAMS_INSTANTIATION=Y
    Thanks
    Jim

You are right - expdp is faster and more capable than the classic exp utility.
A few points to note about EXPDP:
- it can only write to a directory that is local to the database server
- you must create a directory object in the database and grant read,write privileges on it to the user
For example:
create directory dump as 'd:\export\hr';
grant read,write on directory dump to hr;
Then you can run the export:
expdp hr/hr DIRECTORY=dump DUMPFILE=test.dmp LOGFILE=exportTables.log
After the export you will see the two files in the directory 'd:\export\hr'.
For the other features, see expdp help=y and the Oracle documentation.
    Edited by: Vladandr on 15.02.2009 22:07
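For comparison, the original exp/imp commands from the question translate roughly as below. This is a sketch under stated assumptions: dp_dir is a placeholder directory object, FLASHBACK_TIME stands in for OBJECT_CONSISTENT=Y (on older releases you may need FLASHBACK_TIME="TO_TIMESTAMP(...)" instead of SYSTIMESTAMP), and the old STREAMS_INSTANTIATION=Y flag has no one-to-one Data Pump equivalent.

```shell
# One-time setup on each database (run as a DBA in SQL*Plus):
#   create directory dp_dir as 'D:\Oraclebackup\CLS\exports';
#   grant read, write on directory dp_dir to system;

# Export: SCHEMAS replaces OWNER, EXCLUDE=STATISTICS replaces STATISTICS=NONE,
# and FLASHBACK_TIME=SYSTIMESTAMP gives a consistent export.
expdp system/tiger@test SCHEMAS=tiger DIRECTORY=dp_dir DUMPFILE=test.dmp LOGFILE=exportTables.log EXCLUDE=STATISTICS FLASHBACK_TIME=SYSTIMESTAMP

# Import: REMAP_SCHEMA replaces FROMUSER/TOUSER (tiger maps to itself here),
# and TABLE_EXISTS_ACTION=APPEND is the closest analogue of IGNORE=Y COMMIT=Y.
impdp system/tiger DIRECTORY=dp_dir DUMPFILE=test.dmp LOGFILE=importTables.log REMAP_SCHEMA=tiger:tiger TABLE_EXISTS_ACTION=APPEND
```

Both tools need access to the server-side directory, so these are templates to adapt rather than commands to run verbatim.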

  • Does Spotlight use a lot of disk space?

    I'm new to Macs — in fact, my first one hasn't arrived yet — and I'm wondering if Spotlight will tie up a lot of disk space? I'm getting a MacBook Pro with a 100GB drive (modest by desktop standards), and I've noticed that Windows search products (e.g., Google Desktop, MSN Desktop) can create 3-5 GB of data after indexing even a modest 30GB hard drive. If Spotlight is going to "waste" gigs of space on my limited laptop drive, I might look into disabling it. Thanks.

Do not worry about it. On my MacBook's disk (100GB with 60GB occupied) Spotlight uses the following space:
    big:~ mtsouk$ ls -l /.Spotlight-V100/
    total 278972
    -rw------- 1 root admin 219443200 Apr 15 07:23 ContentIndex.db
    -rw------- 1 root admin 238 Feb 28 18:56 _IndexPolicy.plist
    -rw------- 1 root admin 304 Apr 3 23:42 _exclusions.plist
    -rw------- 1 root admin 378 May 28 2005 _rules.plist
    -rw------- 1 root admin 66211840 Apr 15 07:23 store.db
    big:~ mtsouk$
    Mihalis.
    Dual G5 @ 2GHz   Mac OS X (10.4.6)  

  • Does a gmail account use a lot of disk space?

    I have recently moved my domain from a pop account to a gmail hosted domain, and my employer has us using gmail. I have made IMAP accounts to get my gmail content in Apple Mail. After a couple of months, my available disk space has gone from 30G to 5, without noteworthy additions of applications or files. It's all mail. Is a gmail account a disk hog? If so, are there ways of containing it?

    I think I've got my arms around this problem. I've actually recovered about 65G of disk space in the process.
    Google's version of IMAP combined with Apple Mail is a chatty, noisy, wasteful system. Google creates many folders that replicate a message (or are pointers to the same message). When Apple Mail syncs with Gmail, those folders are created on your hard drive and have actual copies (not pointers) of the messages. Caches are also spawned and the whole bloated mess is constantly syncing and spawning more temp, cache, and envelope files.
A solution is to go into Gmail (on the web), look at Settings, and then under the Labels tab turn off IMAP syncing for all those extra folders (like All Mail). Just turn them off.
There are two benefits:
It won't expand out of control on your hard drive (I have 0.5GB on Gmail which had bloated to 65GB on my hard drive).
The constant syncing and passing back and forth of files for these various folders will be choked off, and you won't risk getting cut off for excessive bandwidth usage.

  • How to use VM server's disk space to store VM?

    Hi!
    I've installed Oracle VM Server on my Sun Server. The server has 4 hard drives, which are combined in RAID 10.
    The final disk space is approximately 600 gigabytes. This is df command's output:
Filesystem  Size  Used Avail Use% Mounted on
/dev/sda2    50G  1.2G   46G   3% /
tmpfs       666M     0  666M   0% /dev/shm
/dev/sda1   477M   47M  401M  11% /boot
none        666M   40K  666M   1% /var/lib/xenstored
The server itself uses about 50 gigabytes of disk space.
    Oracle VM server Installation and Upgrade Guide says:
    Partitioning is performed automatically by the installer, and you do not have
    the option to define how partitions should be created on the drive during the
    installation process. Partitions are created in such a way that only the maximum
    amount of disk space required to run Oracle VM Server is attributed to the
    partition where it runs. A separate partition is created for any remaining disk
    space to be used as a discoverable local disk that can be used to host a
    repository or assigned to any virtual machine that runs on the Oracle VM Server.
So where is this separate partition that I can use for the server pool and storage repository in OVMM?
I want to use the free disk space to share it using NFS. Later I will use these NFS shares to create a server pool and storage repository.
    Thanks!
    Konstantin

    Hi Konstantin,
you should not use an OVS to export NFS shares to create a server pool or storage repo on it. Such shares should come from a dedicated NFS server, especially if you plan on ever configuring a server pool with multiple OVS. The point is, if you use an OVS to also act as the NFS server for your server pool and storage repo, you make it impossible to restart that one OVS, because doing so would blow up your OVM cluster.
If you only want to run a single server pool, then nothing prevents you from using fdisk on the OVS itself and claiming the remaining drive space by making it available as /dev/sda3.
    Cheers,
    budy
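A rough sketch of claiming the leftover space as a local partition, as suggested above. The device names come from the df output in the question; the filesystem type and NFS export options are assumptions, and repartitioning a live disk is destructive, so verify everything against your own system first.

```shell
# Create a new primary partition in the free space after sda2 (interactive):
#   n = new, p = primary, 3 = partition number,
#   accept the defaults to use all remaining space, w = write.
fdisk /dev/sda

# Re-read the partition table without rebooting, then make a filesystem:
partprobe /dev/sda
mkfs.ext3 /dev/sda3

# Mount it and, if exporting over NFS, add it to /etc/exports:
mkdir -p /OVS/local
mount /dev/sda3 /OVS/local
echo '/OVS/local *(rw,sync,no_root_squash)' >> /etc/exports
service nfs restart
```

Remember the caveat above: only do this on a standalone OVS, never on a member of a multi-server pool.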

  • Compressor 4 is using all my hard disk space?!

    I have a Mac Pro, 3.7 Ghz Quad-Core Intel, 12 GB Memory.
I'm trying to compress a video from FCP X (10.1.3), but Compressor (4.1.3) is using all of my hard disk space. The video I am trying to compress has 4 one-minute HD interview segments and about 100 different photos. It's a tribute video that combines both forms of media into a presentation. The total time for the video is 11:23.
    The project library itself works off an external hard drive that has 1.45 TB available. The only thing that goes locally to my hard disk are the FCP X backup folders.
    I received a message saying that I needed more Virtual Memory, but am having a hard time understanding virtual memory and how to free up more of it.
    Any advice?

    This is the message from my last attempt.

  • Get Meta Data for Structures used in RFCs

    On Monday I asked in another thread, how to retrieve table meta data using RFCs (Get Table Meta Data with RFC)
Having a closer look at the information I got, I realized that this wasn't the exact information I was looking for, although the answers from Ferry were very good.
What I am actually looking for isn't meta information about tables, but information about the "Associated Type" that is associated with every import, export, or table parameter of a function module.
    If I double-click on one of those types in transaction se37 after choosing a function module, I get a wonderful overview about the structure including 'component', 'component type', 'data type', 'length', etc.
    Is there an RFC I can call that gives me exactly this meta information by giving an associated type?
    Best,
    Stefan

    Thanks Glen
I had a look at the proposed function module.
The good thing is that it gives me information about the parameter naming, for example the table name and field name.
But it won't give me more detailed information about the table and the field.
Nevertheless, RFC1::RFC_GET_NAMETAB seems to give me what I need, so your call for RFC1 and RFC2 was a good one.
    Thanks again.
    Stefan

  • Date and Name Edits Consume Lots of Disk Space

    I am adjusting clip dates on my videos. Also changing the name of the clip in the Events Library. This seems to consume a huge amount of additional storage. I assume iMovie is duplicating the clip? Is there a way to either stop this from occurring or to delete duplicated files? Thanks.

    As you can see, I didn't get any results from the "show parameter archive_log_dest" command.
    I'm also showing the output of "show parameter log_archive" and "archive log list" since these are other commands I've stumbled upon to try to find a solution, in hopes that it helps you.
    Just let me know if there's a typo in that command, and I'll re-run it.
    Thanks
    $ sqlplus user/pass@oracle as sysdba
    SQL*Plus: Release 11.1.0.6.0 - Production on Mon Nov 2 21:37:48 2009
    Copyright (c) 1982, 2007, Oracle. All rights reserved.
    Connected to:
    Oracle Database 10g Release 10.2.0.1.0 - 64bit Production
    SQL> SHOW PARAMETER ARCHIVE_LOG_DEST
    SQL>
    SQL> show parameter log_archive
    NAME TYPE VALUE
    log_archive_config string
    log_archive_dest string
    log_archive_dest_1 string
    log_archive_dest_10 string
    log_archive_dest_2 string
    log_archive_dest_3 string
    log_archive_dest_4 string
    log_archive_dest_5 string
    log_archive_dest_6 string
    log_archive_dest_7 string
    log_archive_dest_8 string
    NAME TYPE VALUE
    log_archive_dest_9 string
    log_archive_dest_state_1 string enable
    log_archive_dest_state_10 string enable
    log_archive_dest_state_2 string enable
    log_archive_dest_state_3 string enable
    log_archive_dest_state_4 string enable
    log_archive_dest_state_5 string enable
    log_archive_dest_state_6 string enable
    log_archive_dest_state_7 string enable
    log_archive_dest_state_8 string enable
    log_archive_dest_state_9 string enable
    NAME TYPE VALUE
    log_archive_duplex_dest string
    log_archive_format string ARC%S_%R.%T
    log_archive_local_first boolean TRUE
    log_archive_max_processes integer 2
    log_archive_min_succeed_dest integer 1
    log_archive_start boolean FALSE
    log_archive_trace integer 0
    SQL> archive log list
    Database log mode Archive Mode
    Automatic archival Enabled
    Archive destination USE_DB_RECOVERY_FILE_DEST
    Oldest online log sequence 6730
    Next log sequence to archive 6730
    Current log sequence 6732
    SQL>

  • Why does iPhoto use so much freaking disk space?!

    When you import photos it makes TWO copies of them into the iPhoto library package, one in "originals" and one in "modified" regardless of whether you edit them or not. Is there any way to get iphoto to leave your files alone and just index them?

1 - No, iPhoto never makes a modified version if the photo had no modifications. You probably have auto-rotate set on your camera; when the photo is rotated according to your instructions, a modified version is created. Turn off auto-rotate and no modified version will be made until you edit the photo.
2 - You cannot change this. iPhoto always maintains your original so you can always revert to it, and it uses the unmodified original as the starting point for all iPhoto edits, so you are never more than one version away from the original - non-destructive editing.
3 - If you do not want a relational-database Digital Asset Manager with non-destructive editing, there are many alternatives; VersionTracker or MacUpdate will show you many.
    LN

  • How can I see what's using all my hard disk space

The other day I noticed my HD has only 3GB of free space left, but my Home folder is only using 67GB. How can I figure out what is hogging all my space and get rid of it? I do use BitTorrent, but I highly doubt that would account for the loss of free space.
    Message was edited by: Jeremy Sample

    Start with the freeware Grandperspective. The interface is a bit unusual, but it should help you figure out what's taking up all your space.
    And of course make sure you're emptying the Trash.
    Regards.
    Message was edited by: Dave Sawyer

  • Thunderbird is using 12.5GB of disk space. How can this be reduced?

    I took a look at this thread: https://support.mozilla.org/en-US/questions/999525?esab=a&as=aaq
    However, I don't see any nstmp files, so the solution doesn't transfer.
    I use Thunderbird to manage six email addresses. At present, Thunderbird is using 12.5GB in my AppData folder, which is roughly an eighth of my hard drive (SSD). I either need to reduce that or force Thunderbird to store these on a different hard drive. How can I do either?

    first and foremost... Compact your folders. That is the only time space is freed. Deleted mail is marked as such but not purged until the compact occurs. So, File menu (Alt+F) > compact folders
    Second move your profile,
    See for locating your profile https://support.mozilla.org/en-US/kb/profiles-tb
    See for moving your profile using the profile manager. http://kb.mozillazine.org/Moving_your_profile_folder#Firefox_Thunderbird_and_SeaMonkey_2

  • System Update Repository - Using a lot of disk space

    Hi
    Looking at the Repository - it seems to be using a lot of space with multiple iterations of similar files
    It also appears to be keeping large Setup files as well as the Applications.  
    Are all these necessary?
Is there any way to remove them? Can I delete old files/setup files? (Do you need the old version if a new application is present and functioning?)
    Many thanks

What version of System Update are you running? The bloating of the repository was corrected with SU v5, which also redesigned the internals of SU. A side effect is that when v5 is installed, all installation history (from the old SU v4) is lost. Uninstall your current SU, and reply YES when the uninstaller asks whether to delete the repository folder. Then install the latest SU 5.006.16.
The folder it deletes is:
C:\ProgramData\Lenovo\SystemUpdate

  • Refresh using exp/imp

    Hi All,
I need to refresh one of my development databases using the exp/imp utility. I have done refreshes before by copying datafiles, but I am new to this method. I have gone through this forum and searched Google, but the part I am not getting is this: my development DB is already created with the same tablespace names and sizes as the source, contains every schema the source has, and holds data from an earlier refresh. Do I need to wipe out everything and start from scratch (create DB, tablespaces, users), or can I just wipe out the tables and import with IGNORE=Y? Any feedback would be appreciated.
    Thanks

Hi,
>> do i need to wipe out everything and do it right from scratch (create db, tablespace, users)
Well, you have choices, and you can do whatever you want, but there are some "best practices". I'm assuming that you are copying your PROD database into DEV (it is a great practice to have a full-sized copy of production in TEST and DEV):
- You can restore PROD into TEST using RMAN.
- You can use the Oracle Universal Installer to clone a database: http://download-east.oracle.com/docs/html/B16227_02/oui7_cloning.htm
- You can "clone the whole database", moving the current datafiles and recreating the instance.
- You can export from PROD, truncate the tables in DEV (after a backup) and import with IGNORE=Y. This can be very slow, especially for a large database. However, there are some tips to speed up imports: http://asktom.oracle.com/pls/ask/f?p=4950:8:::::F4950_P8_DISPLAYID:1240595435323
Check the Oracle docs (http://search.oracle.com/). Also, Dave Moore has a book on Oracle Utilities that you might like: http://www.amazon.com/Oracle-Utilities-Programs-Oradebug-Dbverify/dp/0972751351
Hope this helps . . . .
Don Burleson
www.dba-oracle.com
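The truncate-then-import option above can be sketched as follows. The schema name, password, and dump file name are placeholders, and note that TRUNCATE fails on a parent table whose foreign keys are enabled, so you may need to disable constraints first.

```shell
# On DEV, generate a TRUNCATE statement for every table in the schema,
# review the spooled script, then run it:
sqlplus -s scott/tiger <<'EOF'
set heading off feedback off pagesize 0
spool truncate_all.sql
select 'truncate table ' || table_name || ';' from user_tables;
spool off
@truncate_all.sql
EOF

# Then import the PROD export over the now-empty tables;
# IGNORE=Y suppresses the "table already exists" errors.
imp system/manager FROMUSER=scott TOUSER=scott FILE=prod_full.dmp IGNORE=Y LOG=refresh.log
```

Always review the spooled script before running it; a generated TRUNCATE list is unforgiving.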
