Export large dump file to a small drive

Hello Everybody,
I want to export a 100 GB database (the dump file will be approximately 50 to 60 GB) to a drive that has only 40 GB of free space. Is this possible?
Thanks in Advance
Regards
Hamid

No version, no platform... Why? Too difficult? Strain on your fingers?
The answer is platform and version dependent!
Anyway, on 9i and before:
Windows: set the compression attribute on the directory or drive you plan to export to, and make sure the attribute is inherited.
Unix: export to a named pipe and compress what comes out of that pipe (a minimal sketch follows below).
Scripts are floating around on this forum, on Metalink, and all over the Internet; anyone should be able to find them with little effort.
On 10g: expdp has a COMPRESSION option (metadata only in 10g; compressing the table data as well requires 11g).
Sybrand Bakker
Senior Oracle DBA
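
For the Unix route, here is a minimal sketch of the named-pipe technique, assuming the classic exp utility and gzip; the login, paths, and file names are placeholders to adapt:

    #!/bin/sh
    # Export through a named pipe so the dump is gzipped on the fly;
    # the uncompressed 50-60 GB dump never has to fit on the 40 GB drive.
    PIPE=/tmp/exp_pipe.dmp
    DUMP=/backup/full_exp.dmp.gz        # target on the small drive (placeholder path)

    mkfifo "$PIPE"                                             # create the named pipe
    gzip -c < "$PIPE" > "$DUMP" &                              # reader: compress whatever exp writes
    exp system/manager FULL=Y FILE="$PIPE" LOG=full_exp.log    # writer: the export itself
    wait                                                       # let gzip flush its last blocks
    rm -f "$PIPE"

Import is the mirror image: start gunzip -c "$DUMP" > "$PIPE" & in the background and point imp at the same pipe.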

Similar Messages

  • Best way to copy large VM files from one drive to another?

    Hi all, I am wondering what the best and fastest way is to copy a large VM file from one drive to another (external drive to internal drive). Should the .pvm folder (in my case with Parallels) be zipped first? The files I need to copy range from 50 to 70 GB. Thanks for any insight! P.S. Even Bombich admits it's difficult for CCC to do, though it's not clear why, but it does slow a full clone down to a crawl. I also have ChronoSync at my disposal, but suspect the same is true there as well. Cheers!
    coocoo

    I don't think it matters much which way you do the job. I use Parallels to compress my drives on a regular basis. If you want to copy a dynamic VM, which can grow to just about any size as required, to another drive, you will spend about the same amount of time zipping the VM, transferring it, and unzipping it as you would just moving the VM.
    I've moved my VM files to a second partition that does not get cloned to my bootable backup that I create with SuperDuper. I've excluded my entire VM folder from Time Machine and just make zipped backups for each system after any large Service Pack Updates from Microsoft, or any intricate program installations. These get copied to a second partition on my Time Machine drive manually.
    My VMs' sizes are kept under control by using common Documents-Movies-Music folders for the Mac and the VMs. My VMs do not hold the data files the programs use; those live on my Mac and get backed up every hour with Time Machine.
    Reinstalling a VM just means deleting the broken copy, unzipping my latest backup, and putting it into the folder that Parallels knows to use. The data files are always accessible, even if they were created after my latest VM backup.

  • Export large xml file in PL/SQL

    How do I export a large XML document (XMLTYPE) to the file system?
    I have tried:
    1. UTL_FILE.PUT -- it has a limit of 32767 characters per line
    2. DBMS_XSLPROCESSOR.CLOB2FILE -- handles a bit more than UTL_FILE, but still not files in the MB range
    I'm looking for options to export an XML file that is larger than 5 MB.
    Any suggestions?

    user12020576 wrote:
    I have tried:
    1. UTL_FILE.PUT -- it has a limit of 32767 characters per line
    It does have that limit, but it is not a problem if you programmatically break the CLOB up into pieces.
    Have a look at https://forums.oracle.com/message/9748097#9748097
    The method demonstrates how to dump/write a CSV in binary format to disk.
    I used it to dump/write files way bigger than 10 MB in size...
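
    Along those lines, here is a rough sketch (not the exact code from that thread) of writing a CLOB to disk in chunks with UTL_FILE.PUT_RAW in binary mode, which sidesteps the 32767-character line limit; the DATA_DIR directory object and the chunk size are placeholders:

    CREATE OR REPLACE PROCEDURE clob_to_file (p_clob IN CLOB, p_filename IN VARCHAR2) IS
      -- DATA_DIR is a hypothetical Oracle DIRECTORY object; create it and grant access first
      l_file   UTL_FILE.FILE_TYPE := UTL_FILE.FOPEN('DATA_DIR', p_filename, 'wb', 32767);
      l_len    PLS_INTEGER := DBMS_LOB.GETLENGTH(p_clob);
      l_chunk  PLS_INTEGER := 8000;   -- characters per slice; keeps each raw write under 32767 bytes
      l_pos    PLS_INTEGER := 1;
    BEGIN
      WHILE l_pos <= l_len LOOP
        -- read a slice of the CLOB and write it as raw bytes (no line-length limit in 'wb' mode)
        UTL_FILE.PUT_RAW(l_file, UTL_RAW.CAST_TO_RAW(DBMS_LOB.SUBSTR(p_clob, l_chunk, l_pos)), TRUE);
        l_pos := l_pos + l_chunk;
      END LOOP;
      UTL_FILE.FCLOSE(l_file);
    END clob_to_file;
    /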

  • Error message exporting large picture file from Aperture 3.5

    Recently I took a large panoramic comprising 165 pictures using a Gigapan Epic Pro and stitched them together using the software provided with Gigapan. I exported the stitched file into Aperture to perform final edits. When I go to export the final version I keep getting an error message. The file size will be approximately 2.0 GB. I've been able to export a 0.7 GB file without problems (it did take several minutes). I have a late 2009 27" iMac with a 3.3 GHz Intel Core 2 Duo processor and 16 GB of memory. Any suggestions? Not sure if this is an Aperture problem or if my system is not up to the task. Below is the error message I receive. Thanks for your assistance.

    Testing: exporting a large TIFF - the original is 24512 × 23016 (564.2 MP), 1.05 GB.
    The exported file that I am seeing on my Desktop after 10 minutes is showing up in the Preview.
    But after exporting there was no message at all - neither an error message nor the usual alert when finished ("Export 1 version completed - Reveal ...").
    So I am not yet convinced that the export went well, but Aperture can export gigantic TIFFs.
    What is the file format you have been using? My test was done with TIFF, 16 bit.
    Aperture 3.5.1

  • Cannot export a dump file

    Dear all experts,
    I need your help, please. We have Oracle 9i Release 2 Enterprise Edition and Oracle Fail Safe installed on the server, running in a cluster.
    All of a sudden, I ran into a problem creating a dump file for backup.
    At the command prompt I type:
    c:\> exp
    and then enter the username and password when prompted.
    After entering the correct username and password, it quickly responds with the error messages below:
    EXP-00056: ORACLE error 12560 encountered
    ORA-12560: TNS:protocol adapter error
    EXP-00000: Export terminated unsuccessfully.
    Please note that I use the Oracle 9i Release 2 client on the client machine.
    Even when I log on to the server where the Oracle database server is installed, I get the same error message.
    We are very worried, since we cannot back up the database by exporting a dump file.
    We are at a critical moment. Any help is greatly appreciated.
    Thanks a million.

    Even when I log on to the server where the Oracle database server is installed, I get the same error message.
    For server side:
    Did you set ORACLE_SID environment variable ?
    Are you sure Windows service OracleService<instance> is running ?
    Are you sure database instance is up and running ?
    Can you connect with SQL*Plus to the database instance from the server? (A quick check is sketched below.)
    For client side:
    Did you set ORACLE_SID environment variable ?
    Are you sure listener is up and running on server ?
    What is the output of tnsping on the client ?
    Please note also:
    While useful, database exports are not a substitute for whole database backups. They cannot provide the same complete recovery advantages of physical-level backups. For example, you cannot apply archived logs to logical backups in order to update lost changes.
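
    For example, a quick server-side check along these lines (the ORCL instance name and the password are placeholders) usually narrows an ORA-12560 down to a missing ORACLE_SID or a stopped service:

    C:\> set ORACLE_SID=ORCL
    C:\> net start | find "OracleService"
    C:\> sqlplus system/<password>
    SQL> select status from v$instance;
    SQL> exit
    C:\> exp system/<password> FILE=full.dmp LOG=full.log FULL=Y

    If SQL*Plus connects locally but exp still fails through a TNS alias, the listener and the client's tnsnames.ora are the next places to look.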

  • Problems exporting large PDF files

    I am trying to export a 52-page InDesign file to PDF. One or two random pages will not export properly: elements from other pages appear, and other graphics that were there in InDesign have disappeared in the exported PDF file. The most annoying thing is that when I export the offending pages separately, one at a time, they export without any problems. Is it possible for InDesign to experience an "information overload" when exporting larger documents? How can I fix this?
    thanks,
    Geordie

    Well, I didn't want to go into the weeds and be too much of a distraction, but since you asked, I guess it's puzzling because:
    I work with long complex documents, and it doesn't happen to me.
    We don't see many reports of this problem on the forum [I can't remember any, honestly, but I have not dug hard].
    There are some similar problems, mostly relating to master page items, that have been fixed in some recent updates.
    I don't see how touching many or all page items or pages has much to do with "address space"; sure, it will make InDesign use more virtual memory, but that's a core function of the operating system, and if it's flaking out, there should be many more serious problems. Therefore I am probably misunderstanding your use of "working address space."
    If in fact there is some allocation limit that is being hit, or a leak or whatnot, it ought to be caught by refcounting of the resource, and by the various checks in the InDesign debug build, the InDesign Server 30-day uptime tests, etc., etc.
    The implication is that you have reproducible testcases for this problem and have been unable to get InDesign Engineering to resolve the problem, which is kind of an unfortunate situation to be in, so I slap the label of "puzzling" on it.
    An alternative interpretation is that you think the problem isn't easily solvable and haven't really tried to do so, which I also find puzzling.
    If the 64 bit InDesign fixes this, it's just papering over the problem. But since we have a 64-bit InDesign Server for Windows (right? I don't use ID Server for much), it should be easily testable today.
    So, that's basically some rough insight into the things floating around in my head that lead me to be "puzzled."
    I don't mean to discount your problems, or your workaround; I thought it useful to offer the original poster the standard workarounds and suggestions we tend to offer for the broader category of problem, in case yours did not work (hence "Beyond that").

  • How do you export large audio files?

    I made a mix which is about 20 minutes long and seems to be around 600 MB. There are 13 songs and I put each song on a separate track. I've tried bouncing, exporting and sharing, and anything I try results in Logic Pro X (version 10.0.1) starting the process and then quitting. Does anyone have any suggestions on how to export this file? I'll be honest, I don't really have a clue what I am doing. I just recently started using Logic and I am still learning! I wanted to put this mix up on SoundCloud, so if anyone could give some suggestions that would be very helpful! Thanks!

    If the output file is much larger than 600 MB, then...
    Bounce out as a PCM .caf file, which allows for much larger maximum file sizes... up to 13.5 hours at 16-bit/44.1 kHz, for example.
    Also, if it isn't the size of the output file that is causing the issue, bounce out in realtime mode rather than offline mode to a WAV or MP3 file, in case a plugin is causing the bounce to quit, and then upload the resulting MP3 or WAV file to SoundCloud manually.
    Finally, make sure you have set the start and end times in the bounce dialog box to reflect the correct start and end of your project.

  • Splitting large video files into small chunks and combining them

    Hi All,
    Thank you for viewing my thread. I have searched for hours but cannot find any information on this topic.
    I need to upload very large video files. The files can be something like 10 GB in size, currently in .mpeg format, but they may be .mp4 format later on too.
    Since the files will be uploaded remotely over the internet from the client's mobile device (3G or 4G in the UK), I need to split the files into smaller chunks. This way, if there is a connection issue (very likely as the files are large), I can resume the upload without redoing the whole 10 GB again.
    My question is: what can I use to split a large file like 10 GB into smaller chunks of around 200 MB, upload the files, and then combine them again? Is there a jar file that I can use? I can't install software like ffmpeg or JMF on the clients' laptops.
    Thank you for your time.
    Kind regards,
    Imran

    There is a Unix command called split that does just that: it splits a file into chunks of a size you specify. When you want to put the pieces back together, you use the command cat.
    If you are comfortable in the terminal you can look at the man page for split for more information.
    Both commands are data-blind and work on binary as well as text files, so I would think this should work for video, but of course check that the restored file still works as video.
    regards
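
    A minimal sketch of that approach (the file names and the 200 MB chunk size are just placeholders):

    # on the sending side: cut the file into ~200 MB pieces named chunk_aa, chunk_ab, ...
    split -b 200m big_video.mpeg chunk_
    # ...upload the chunk_* files one by one, then on the receiving side reassemble them in name order:
    cat chunk_* > big_video.mpeg

    It's worth comparing a checksum (md5 or similar) of the original and the reassembled file before deleting anything.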

  • How to cut up a large video file into smaller files?

    I have a 30-minute video file I recorded on my camera. I need to break it into 6 smaller videos so I can work on them separately instead of importing such a large file into Final Cut. Is there a way I can do this with Final Cut Pro?
    Thanks in advanced!

    No. FCP is a non-destructive editor. Once you ingest a clip it stays there on your drive, unchanged, until you delete it. FCP is really a database with instructions for playing back clips in order, applying filters, audio adjustments, etc. If you want separate clips you have two options: either log and capture in chunks, or set markers on the clip and export smaller pieces, then re-import them.
    But I guess the question in my mind is, if it's already captured, why bother? you'll do your editing of the segments and export the finished pieces anyway...

  • Splitting a large video file into smaller segments/clips using FCE

    Is there a way to split a large FCE / iMovie captured event into smaller segments/clips?
    I am attempting to convert my home videos (analog 8mm) into iMovie clips.
    I used an ADS Pyro that converts my composite signal (RCA: yellow + red/white) to a digital signal over FireWire, captured with iMovie. However, this creates a single 2-hour file that takes approx 26 GB because it is now DV.
    I would like to (a) break this into smaller segments/clips and then (b) change the dates on them.

    afernandes wrote:
    Thanks Michel,
    I will try this out.
    Do you know if this will create a new file(s) ?
    http://discussions.apple.com/thread.jspa?threadID=2460925&tstart=0
    What I want to do is break up my 2-hour video into smaller chunks, then burn the good chunks as raw footage (AVI/MOV) onto backup data DVDs, then export all the chunks into compressed files (MPEG-4?) and save these on another data DVD.
    Avoid compressing. Save as a QuickTime movie.
    Michel Boissonneault

  • Problem exporting large QT files in Premiere

    I have Adobe Premiere (a recent edition - not sure if it's Pro, but close) and I am trying to export a movie from my timeline into a QuickTime file. On smaller files (say, under 3 minutes) it works fine, no errors, and the QT files play fine.
    But anything bigger than that and I get the "Error 2048" message. This error pops up right as Premiere finishes exporting the file.
    I have seen the "Error 2048" issue addressed here and on a few other forums, but since the smaller QT files export and play fine, I do not believe this is a problem with my QuickTime program.
    I normally export to AVI and have never had problems with that, but for this project I'm dealing with Mac people who apparently have issues with AVIs coming from a PC.

    Could be worth posting in the Premiere forum... but I'd say it's to do with the final file size. Some codecs only allow a maximum file size of, say, 1 GB. You could either try compressing your video, or cut it into small chunks, export each of these, and then put it back together.
    What media are you putting the final video on?
    For CD and DVD playback, MPEG-2 is a good-quality video format with a reasonable file size.

  • Break large TDMS file into small files

    hello all
    My TDMS file is around 3 GB and needs to be split into files of around 10 MB.
    I ran McRorie's splitFiles.vi (15 KB) from this page and set the number of samples per file to 5000000; however, I cannot get the results I need. Every small file is only 1 KB with no data inside. What could the problem be?
    Also, I tried to write a VI based on the sample VI (Read TDMS File) by adding one "Write To Measurement File.vi". However, when I set the small file size to 10 MB inside "Write To Measurement File.vi", the first file comes out around 20 MB, the next few files may be correct at 10 MB, and then it just stops splitting, ending with a file even larger than the original. I uploaded my VI here; maybe someone can help find the mistake for me.
    Thanks very much!
    Wuwei Mao
    Attachments:
    Read TDMS File.vi (54 KB)

    Hi Wuwei,
    After giving the correct data type to the TDMS Read node in splitFiles.vi, it works as expected (see the two attached VIs: createFile.vi and the modified splitFiles.vi).
    Because I don't know how you created your TDMS file, I wrote a new 3 GB TDMS file, which has one group and one channel. The data type of the samples is unsigned 16-bit integer, and the total number of samples is 1610612736. Then I set the number of samples per file to 5000000, as you did, so after splitting, each file's size is 5000000*(16/8) bytes (around 10 MB).
    Please make sure the following steps have been done before you run splitFiles.vi:
    1. The TDMS file to be split has been put at the proper path;
    2. The correct group and channel names have been given;
    3. The correct data type has been given to the TDMS Read node.
    Your second option, using "Write To Measurement File.vi" to split the TDMS file, will lose some information, such as group and channel names, so I suggest using the method from splitFiles.vi to accomplish your goal.
    Jie Zheng
    NI R&D

  • Exporting larger .mov files

    I've read that with standard (4:3) or widescreen (16:9) projects, iDVD can only handle 15 minutes or 7.5 minutes of video respectively in drop zones. I know that this is probably a dumb question, but how else can I get a larger video onto a disc?
    Thanks

    Thank you. That's very helpful. I have a couple of follow-up questions, if I may. A 2-hour video would obviously be a very large file. Would this have to be exported from FCE in a compressed form to burn a DVD? If so, which format? Also, when I export from FCE using "QuickTime movie", fully self-contained, I end up with two .mov files: one with the QuickTime logo and another with a clip from the film on the thumbnail. The latter is what I have been using. Thanks again!

  • Export large XML-file

    Hi!
    I want to export XML files, but when I select 50 rows of my object view I only get 43 of them in my utl_file output.
    Here is the code:
    CREATE OR REPLACE PROCEDURE Export_XML_to_file_with_Schema IS
    qryCtx DBMS_XMLGEN.ctxHandle;
    result XMLTYPE;
    result_1 Clob;
    Abfrage VARCHAR2(4000);
    lob_length integer;
    read_amount integer;
    read_offset integer;
    buffer varchar2(100);
    loc varchar2(100) := 'usr_dir';
    f_hand utl_file.file_type;
    BEGIN
    -- Setting up offset and no. of chars to be read in
    -- in one go from clob datatype.
    read_offset := 1;
    read_amount := 75;
    --Opening file
    f_hand := Utl_File.Fopen(location =>'d:\oracle\utl_file\',
    filename =>'test.xml',
    open_mode =>'w',
    max_linesize => 32767);
    Abfrage :='Select * from I_GESAMT_V where rownum < 50';
    qryctx := dbms_xmlgen.newContext(Abfrage);
    -- dbms_xmlgen.newContext returns a new context.
    -- PARAMETERS: queryString (IN) - the query string whose result needs to be converted to XML.
    -- RETURNS: context handle.
    -- Call this function first to obtain a handle that you can use in getXML()
    -- and other functions to get the XML back from the result.
    dbms_xmlgen.setRowSetTag(qryCtx,NULL);
    -- dbms_xmlgen.setRowSetTag: rowsetTag (IN) - the name of the document element.
    -- NULL indicates that you do not want the ROWSET element to be present.
    -- Call this to set the name of the document root element,
    -- if you do not want the default "ROWSET" name in the output.
    -- You can also set this to NULL to suppress the printing of this element.
    -- However, it is an error if both the row and the rowset are NULL and there
    -- is more than one column or row in the output.
    dbms_xmlgen.setRowTag(qryCtx,'INSP_PDA');
    -- dbms_xmlgen.setRowTag: rowTag (IN) - the name of the ROW element.
    -- NULL indicates that you do not want the ROW element to be present.
    -- Call this function to set the name of the ROW element,
    -- if you do not want the default "ROW" name to show up.
    -- You can also set this to NULL to suppress the ROW element itself.
    -- It is an error if both the row and the rowset are NULL and there
    -- is more than one column or row in the output.
    -- now get the result
    result := DBMS_XMLGEN.getXMLType(qryCtx);
    select xmlelement("INSOBJ",
    xmlattributes('http://www.w3.org/2001/XMLSchema-instance' AS "xmlns:xsi",'http://sv6:8080/sys/schemas/SCOTT/sv6:8080/public/mydocs/inspection_pda_schema.xsd' AS "xsi:noNamespaceSchemaLocation"), result) into result from dual;
    result_1 :=result.getClobval();
    -- Getting the length of the data stored in Clob
    lob_length := dbms_lob.getlength(result_1);
    utl_file.put(f_hand,'<?xml version="1.0" encoding="windows-1252"?>');
    Utl_File.New_Line (f_hand,1);
    -- Reading data from clob variable and writng into file.
    while (lob_length > 0) loop
    dbms_lob.read(result_1,read_amount,read_offset,buffer);
    utl_file.put(f_hand,buffer);
    read_offset := read_offset+read_amount;
    lob_length := lob_length-read_amount;
    if lob_length < read_amount then
    read_amount := lob_length;
    end if;
    end loop;
    utl_file.fclose(f_hand);
    END Export_XML_to_file_with_Schema ;
    Is there something wrong with my code?
    Thanks for help.
    With best regards
    Nicole

    HI!
    I've found that the utl_file max_linesize is the problem. Is there an alternative other than 'spool'?
    Thanks for help.
    With best regards
    Nicole

  • Exporting large excel file exception

    Hello-
    I am using JasperReports within a J2EE application to generate an Excel file, and this file is written to the response OutputStream. It is deployed as a web application on Oracle Application Server 10g.
    When the generated XLS file is large, like 2 MB or more, Oracle Application Server throws a ServletException and sometimes a NullPointerException.
    The same application runs smoothly on the IBM WebSphere test server embedded in WebSphere Application Developer.
    I tried to increase the response buffer size for AS 10g, but the problem is still there.
    Could anyone help in this?
    Thanks

    Hi Jason,
    Unfortunately, Xcelsius is not designed to handle the large volume of embedded data that you have described.  There are basically 2 approaches to resolve your issue:
    1) Try to aggregate the data and simplify the lookups, to reduce the total number of cells until Xcelsius is able to process it;
    2) Store the large volume of data in an external data source and retrieve only the necessary subset of data via one of Xcelsius's external connectivity options including web services or XML etc.
    Also, with 15 tabs, depending on the number of components you have within each tab panel, you may also be pushing the limit of how many total components Xcelsius can handle. However, based on your comments, I would say the real issue is the 8 MB Excel file, a size which Xcelsius simply cannot handle.
    Regards,
    Mustafa.
