ZipOutputStream triggers OutOfMemoryError on large files

I am trying to zip a large directory structure, e.g., /usr. I traverse the directory structure and zip as I go. As long as the directory structure totals about 1 GB or less there is no problem; otherwise ZipOutputStream craps out. There doesn't seem to be a way to alter existing zip files and insert new entries once you have closed the stream. Is there some way to create a zip file that can approach sizes in the thousands-of-gigabytes range? Is ZipOutputStream limited to certain sizes? Are Zip files limited? How can I create a single file that contains a giant directory structure that can then be used to reconstruct the tree later on?

A brilliant programmer named Phil Katz came up with ZIP compression when the popular compression program of the day was hijacked by an algorithm's corporate "owner" that had the brilliant idea that if it made a penny every time its program was run it would make millions. The result was PKZip (yes, those are the author's initials), which immediately replaced the proprietary solution. The "owner" never recouped its legal costs, if you enjoy irony.
Java's zip compression is licensed from Phil Katz's PKWare company. Most of the shareware and freeware zip programs are also built on software licensed from PKWare. So you should probably hike over to www.PKWare.com and see what they have for you. If you need to do this from Java, fake it by simply driving an existing compression program from your code.
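A minimal sketch of that approach, assuming an Info-ZIP style "zip" command is installed and on the PATH (the command name, flags, and paths below are placeholders, not a recommendation of any particular tool):

    import java.io.IOException;

    public class ExternalZip {
        public static void main(String[] args) throws IOException, InterruptedException {
            // Assumes an Info-ZIP style "zip" command is available; adjust for your platform.
            ProcessBuilder pb = new ProcessBuilder("zip", "-r", "/tmp/usr-backup.zip", "/usr");
            pb.inheritIO();                      // show the tool's own progress output
            int exitCode = pb.start().waitFor(); // block until the archiver finishes
            if (exitCode != 0) {
                throw new IOException("zip exited with code " + exitCode);
            }
        }
    }

Runtime.exec works the same way on older JDKs; ProcessBuilder.inheritIO needs Java 7 or later.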
If that won't work for you, here's another place to start. The Zip format is basically a serial format and is not suitable for random-access use. Zip a directory? OK. Insert and delete individual files from a large Zip? Not OK. So the solution is to redesign for random-access use and then use the compression as part of your solution. Any contemporary OS design solves this problem in its disk I/O mechanism.
Question about your design: do you really want a single file that size?
By the way, congratulations on figuring out the basics of java.util.zip. Several people, myself included, have reported the entire API doc as a bug. It's close to incomprehensible.
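For what it's worth, ZipOutputStream itself streams entry data to the underlying file as it is written, and from Java 7 onward java.util.zip writes ZIP64 records when entries or archives get large, so the classic 4 GB limits of the original format should not be the hard ceiling there. A rough sketch of zipping a tree with a small fixed buffer (paths and buffer size below are placeholders) looks like this:

    import java.io.*;
    import java.util.zip.*;

    public class DirZipper {
        // Recursively add 'dir' to 'out', storing entry names relative to 'root'.
        static void zipTree(File root, File dir, ZipOutputStream out, byte[] buf) throws IOException {
            File[] children = dir.listFiles();
            if (children == null) return;                 // unreadable directory
            for (File f : children) {
                if (f.isDirectory()) {
                    zipTree(root, f, out, buf);
                } else {
                    String name = root.toURI().relativize(f.toURI()).getPath();
                    out.putNextEntry(new ZipEntry(name));
                    InputStream in = new BufferedInputStream(new FileInputStream(f));
                    try {
                        int n;
                        while ((n = in.read(buf)) != -1) {
                            out.write(buf, 0, n);         // never more than one buffer on the heap
                        }
                    } finally {
                        in.close();
                    }
                    out.closeEntry();
                }
            }
        }

        public static void main(String[] args) throws IOException {
            File root = new File(args[0]);                // e.g. /usr
            ZipOutputStream out = new ZipOutputStream(
                    new BufferedOutputStream(new FileOutputStream(args[1])));
            try {
                zipTree(root, root, out, new byte[64 * 1024]);
            } finally {
                out.close();
            }
        }
    }

If something shaped like this still dies around 1 GB, the limit is more likely a pre-Java-7 zip implementation or the old format boundaries than heap usage in your own code.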

Similar Messages

  • OOM happens inside Weblogic 10.3.6 when application uploads large files

Oracle Fusion BI Apps application is uploading large files (100+ MB) onto Oracle Cloud Storage. This application works properly when run outside the WebLogic server. When deployed on Fusion Middleware WebLogic 10.3.6, during upload of large files we get this OOM error:
    java.lang.OutOfMemoryError: allocLargeObjectOrArray: [B, size 268435472
        at jrockit/vm/Allocator.allocLargeObjectOrArray(JIZ)Ljava/lang/Object;(Native Method)
        at jrockit/vm/Allocator.allocObjectOrArray(Allocator.java:349)[optimized]
        at weblogic/utils/io/UnsyncByteArrayOutputStream.resizeBuffer(UnsyncByteArrayOutputStream.java:59)[inlined]
        at weblogic/utils/io/UnsyncByteArrayOutputStream.write(UnsyncByteArrayOutputStream.java:89)[optimized]
        at com/sun/jersey/api/client/CommittingOutputStream.write(CommittingOutputStream.java:90)
        at com/sun/jersey/core/util/ReaderWriter.writeTo(ReaderWriter.java:115)
        at com/sun/jersey/core/provider/AbstractMessageReaderWriterProvider.writeTo(AbstractMessageReaderWriterProvider.java:76)
        at com/sun/jersey/core/impl/provider/entity/InputStreamProvider.writeTo(InputStreamProvider.java:98)
        at com/sun/jersey/core/impl/provider/entity/InputStreamProvider.writeTo(InputStreamProvider.java:59)
        at com/sun/jersey/api/client/RequestWriter.writeRequestEntity(RequestWriter.java:300)
        at com/sun/jersey/client/urlconnection/URLConnectionClientHandler._invoke(URLConnectionClientHandler.java:213)
        at com/sun/jersey/client/urlconnection/URLConnectionClientHandler.handle(URLConnectionClientHandler.java:149)
    Looks like WebLogic is using its default WebLogic HTTP handler; switching to the Sun HTTP handler via the startup JVM/Java option "-DUseSunHttpHandler=true" solves the OOM issue.
    It seems that instead of streaming the file content with a fixed-size byte array, the whole file is being held in memory during upload.
    Is it possible to solve this OOM by changing any setting of the WebLogic HTTP handler, without switching to the Sun HTTP handler, as there are many other applications deployed on this WebLogic instance?
    We are concerned whether there will be any impact on performance or any other issue.
    Please advise; your response is highly appreciated.
    Thanks!
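
    One thing that may be worth trying, assuming the client stack really is Jersey 1.x over HttpURLConnection as the stack trace suggests (this is an assumption, not something verified on this exact setup), is to enable chunked encoding on the Jersey client so the request entity is streamed rather than buffered; the URL, path, and chunk size below are placeholders:

    import com.sun.jersey.api.client.Client;
    import com.sun.jersey.api.client.ClientResponse;
    import com.sun.jersey.api.client.WebResource;

    import javax.ws.rs.core.MediaType;
    import java.io.BufferedInputStream;
    import java.io.FileInputStream;
    import java.io.InputStream;

    public class ChunkedUpload {
        public static void main(String[] args) throws Exception {
            Client client = Client.create();
            // With a chunk size set, Jersey streams the entity instead of
            // buffering the whole body in memory before sending.
            client.setChunkedEncodingSize(64 * 1024);

            WebResource resource = client.resource("https://storage.example.com/upload"); // placeholder URL
            InputStream body = new BufferedInputStream(new FileInputStream(args[0]));
            try {
                ClientResponse response = resource
                        .type(MediaType.APPLICATION_OCTET_STREAM_TYPE)
                        .put(ClientResponse.class, body);
                System.out.println("HTTP " + response.getStatus());
            } finally {
                body.close();
            }
        }
    }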

    Hi,
    If you have a backup, then restore the file below and then try to start WebLogic:
    \Oracle\Middleware\user_projects\domains\<domain_name>\config\config.lok
    Thanks,
    Sharmela

  • Exception while Read very large file 300 MB

    Hi
    I want to read a file which is more than 300 MB in size.
    While executing my code, I am getting
    Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
    Below is my code
    1. FileChannel fc = new FileInputStream(fFile).getChannel();
    2. CharBuffer chrBuff = Charset.forName("8859_1").newDecoder().decode(fc.map(FileChannel.MapMode.READ_ONLY, 0, fc.size()));
    3. fc.close();
    I'm getting the exception at line 2.
    Even though I increased the heap space up to 900 MB, I am still getting this error.
    (FYI)
    I executed in command prompt like below,
    java -Xms128m -Xmx1024m readLargeFile
    Kindly give a solution for this. Is there any other better way to read a large text file?
    I am waiting for your reply.
    Thanks in advance

    Thanks for your reply.
    My task is to open a large file in read/write mode, and I need to search for a portion of text in that file by finding its start and end points.
    Then I need to write that searched area of text to a new file and delete that portion from the original file.
    I will repeat the above process several times.
    So I thought that for this, it would be easy to load the file into memory with a CharBuffer and search it with the Matcher class.
    That is my task; please suggest some efficient ways.
    Note that my file will be large, and I have to do this using Java only.
    Thanks in Advance...
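
    One streaming alternative, sketched here with assumed start/end markers and file names, is to read the file line by line and split it into an "extracted" file and a "remainder" file in a single pass, so only one line is ever held on the heap:

    import java.io.*;

    public class ExtractRegion {
        public static void main(String[] args) throws IOException {
            String startMarker = "<<START>>";   // assumed markers; replace with the real ones
            String endMarker   = "<<END>>";

            BufferedReader in = new BufferedReader(
                    new InputStreamReader(new FileInputStream("big.txt"), "ISO-8859-1"));
            PrintWriter extracted = new PrintWriter(new FileWriter("extracted.txt"));
            PrintWriter remainder = new PrintWriter(new FileWriter("remainder.txt"));
            try {
                boolean inside = false;
                String line;
                while ((line = in.readLine()) != null) {     // one line in memory at a time
                    if (line.contains(startMarker)) inside = true;
                    (inside ? extracted : remainder).println(line);
                    if (line.contains(endMarker)) inside = false;
                }
            } finally {
                in.close();
                extracted.close();
                remainder.close();
            }
        }
    }

    Renaming remainder.txt over the original then gives the "delete that portion" effect without ever decoding 300 MB into a CharBuffer.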

  • How to display a large file in JTextArea

    I am displaying a large (multi-MB) file in a JTextArea and it is throwing java.lang.OutOfMemoryError.
    Are there any solutions for this problem, or is there a better way other than JTextArea?

    Thanks for replying, but that is all about tables.
    I am new to Java.
    I asked how to display a large text file in a JTextArea. Thanks.
    --santosh
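
    If showing only a bounded preview of the file is acceptable (an assumption on my part; the method name and limit below are purely illustrative), something like this keeps the JTextArea from trying to hold the whole file:

    import java.io.*;
    import javax.swing.*;

    public class PreviewTextArea {
        // Load at most 'maxChars' characters into the text area instead of the whole file.
        static JTextArea preview(File file, int maxChars) throws IOException {
            StringBuilder sb = new StringBuilder();
            Reader in = new BufferedReader(new FileReader(file));
            try {
                char[] buf = new char[8192];
                int n;
                while (sb.length() < maxChars && (n = in.read(buf)) != -1) {
                    sb.append(buf, 0, Math.min(n, maxChars - sb.length()));
                }
            } finally {
                in.close();
            }
            JTextArea area = new JTextArea(sb.toString());
            area.setEditable(false);
            return area;
        }
    }

    Wrap the returned component in a JScrollPane as usual; for genuinely huge files a pager-style viewer that loads one window of text at a time is usually the better answer.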

  • Handling large files in scope of WSRP portlets

    Hi there,
    just wanted to ask if there are any best practices in respect to handling large file upload/download when using WSRP portlets (apart from bypassing WebCenter altogether for these use cases, that is). We continue to get OutOfMemoryErrors and TimeoutExceptions as soon as the file being transferred becomes larger than a few hundred megabytes. The portlet is happily streaming the file as part of its javax.portlet.ResourceServingPortlet.serveResource(ResourceRequest, ResourceResponse) implementation, so the problem must somehow lie within WebCenter itself.
    Thanks in advance,
    Chris
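
    For comparison, a serveResource that really does stream with a fixed buffer looks roughly like the sketch below (the file path and buffer size are placeholders); if a portlet shaped like this still hits OutOfMemoryError, the buffering is presumably happening in the WSRP/WebCenter layers rather than in the portlet code:

    import java.io.*;
    import javax.portlet.GenericPortlet;
    import javax.portlet.PortletException;
    import javax.portlet.ResourceRequest;
    import javax.portlet.ResourceResponse;

    public class FileServingPortlet extends GenericPortlet {
        @Override
        public void serveResource(ResourceRequest request, ResourceResponse response)
                throws PortletException, IOException {
            File file = new File("/data/export/huge-file.bin");   // placeholder path
            response.setContentType("application/octet-stream");

            InputStream in = new BufferedInputStream(new FileInputStream(file));
            OutputStream out = response.getPortletOutputStream();
            try {
                byte[] buf = new byte[8192];
                int n;
                while ((n = in.read(buf)) != -1) {
                    out.write(buf, 0, n);   // never more than one buffer in memory
                }
                out.flush();
            } finally {
                in.close();
            }
        }
    }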

    Hi Yash,
    Check these blogs for the structure you are mentioning:
    /people/shabarish.vijayakumar/blog/2006/02/27/content-conversion-the-key-field-problem
    /people/shabarish.vijayakumar/blog/2005/08/17/nab-the-tab-file-adapter
    Regards,
    ---Satish

  • Java.lang.IndexOutOfBoundsException when zipping larger files

    i get the error:
    java.lang.IndexOutOfBoundsException
    at java.util.zip.ZipOutputStream.write(ZipOutputStream.java:261)
    at compress.zipEntry(compress.java:126)
    at compress.Create_Zip_File(compress.java:84)
    at Test.backup(Test.java:101)
    at Test.<init>(Test.java:37)
    at Test.main(Test.java:149)
    when zipping a file which is 349 MB in size. I think this might be because it is trying to store the file in memory and the file is too large. What can I do to avoid storing it in memory, or if I'm wrong can someone point me in the right direction?
    Thanks.
    FileOutputStream zipFile = null;
    ZipOutputStream zipOut = null;

    private void listContents(File Zip_File, File dir, ZipOutputStream out)
            throws Exception {
        // Assume that dir is a directory.  List
        // its contents, including the contents of
        // subdirectories at all levels.
        System.out.println("Directory \"" + dir.getName() + "\"");
        String[] files;  // The names of the files in the directory.
        files = dir.list();
        for (int i = 0; i < files.length; i++) {
            File f;  // One of the files in the directory.
            f = new File(dir, files[i]);
            if (f.isDirectory()) {
                // Call listContents() recursively to
                // list the contents of the directory, f.
                listContents(Zip_File, f, out);
            } else {
                // For a regular file, add it as a zip entry.
                zipEntry(Zip_File, f, out);
            }
        }
    } // end listContents()

    public void Create_Zip_File(File Zip_File, File[] To_Be_Zipped_Files,
            boolean Skip_Dirs) {
        try {
            // Open archive file
            FileOutputStream stream = new FileOutputStream(Zip_File);
            ZipOutputStream out = new ZipOutputStream(stream);
            for (int i = 0; i < To_Be_Zipped_Files.length; i++) {
                //if (To_Be_Zipped_Files[i]==null
                // || !To_Be_Zipped_Files[i].exists()
                // || (Skip_Dirs ))
                // continue;
                System.out.println("Adding " + To_Be_Zipped_Files[i].getName());
                zipEntry(Zip_File, To_Be_Zipped_Files[i], out);
            }
            out.close();
            stream.close();
            System.out.println("Finished zipping : "
                    + Zip_File.getAbsolutePath());
        } catch (Exception e) {
            e.printStackTrace();
            System.out.println("Error: " + e.getMessage());
            return;
        }
    }

    private void zipEntry(File Zip_File, File file, ZipOutputStream out)
            throws Exception {
        if (file.isDirectory()) {
            listContents(Zip_File, file, out);
            return;
        }
        int BUFFER_SIZE = 10240000; // 10 M
        byte buffer[] = new byte[BUFFER_SIZE];
        // Add archive entry
        ZipEntry zipAdd = new ZipEntry(file.getAbsolutePath());
        //zipAdd.setTime(file.lastModified());
        out.putNextEntry(zipAdd);
        // Read input & write to output
        FileInputStream in =
                new FileInputStream(file.getAbsolutePath());
        int count;
        int nRead = in.available();
        while ((count = in.read(buffer, 0, BUFFER_SIZE)) != -1) {
            out.write(buffer, 0, nRead);
            // System.out.println("loop ");
        }
        in.close();
        out.closeEntry();
    }

    of course!! Thanks, problem solved :D

    I don't see how - it is very very wrong!
        private void zipEntry(File Zip_File, File file, ZipOutputStream out)
                throws Exception {
            if (file.isDirectory()) {
                listContents(Zip_File, file, out);
                return;
            }
            int BUFFER_SIZE = 4094;
            byte buffer[] = new byte[BUFFER_SIZE];
            // Add archive entry
            ZipEntry zipAdd = new ZipEntry(file.getAbsolutePath());
            //zipAdd.setTime(file.lastModified());
            out.putNextEntry(zipAdd);
            // Read input & write to output
            InputStream in = new BufferedInputStream(new FileInputStream(file.getAbsolutePath()), BUFFER_SIZE);
            for (int count = 0; (count = in.read(buffer)) != -1;) {
                out.write(buffer, 0, count);
            }
            in.close();
            out.closeEntry();
        }

    Message was edited by:
    sabre150

  • Uploading Very Large Files via HTTP

    I am developing some classes that must upload files to a web server via HTTP and multipart/form-data. I am using Apache's Tomcat FileUpload library contained within the commons-fileupload-1.0.jar file on the server side. My code fails on large files or large quantities of small files because of the memory restriction of the VM. For example when uploading a 429 MB file I get this exception:
    java.lang.OutOfMemoryError
    Exception in thread "main"
    I have never been successful in uploading, regardless of the server-side component, more than ~30 MB.
    In a production environment I cannot alter the client's VM memory settings, so I must code my client classes to handle such cases.
    How can this be done in Java? This is the method that reads in a selected file and immediately writes it to the output stream to the web resource referenced by bufferedOutputStream:
    private void write(File file) throws IOException {
      byte[] buffer = new byte[bufferSize];
      BufferedInputStream fileInputStream = new BufferedInputStream(new FileInputStream(file));
      // read in the file
      if (file.isFile()) {
        System.out.print("----- " + file.getName() + " -----");
        while (fileInputStream.available() > 0) {
          if (fileInputStream.available() >= 0 &&
              fileInputStream.available() < bufferSize) {
            buffer = new byte[fileInputStream.available()];
          }
          fileInputStream.read(buffer, 0, buffer.length);
          bufferedOutputStream.write(buffer);
          bufferedOutputStream.flush();
        }
        // close the file's input stream
        try {
          fileInputStream.close();
        } catch (IOException ignored) {
          fileInputStream = null;
        }
      } else {
        // do nothing for now
      }
    }

    The problem is, the entire file, and any subsequent files being read in, are all being packed onto the output stream and don't begin actually moving until close() is called. Eventually the VM gives way.
    I require my client code to behave no differently than the typical web browser when uploading or downloading a file via HTTP. I know of several commercial applets that can do this, why can't I? Can someone please educate me or at least point me to a useful resource?
    Thank you,
    Henryiv

    Are you guys suggesting that the failures I'm experiencing in my client code are a direct result of the web resource's (servlet) caching of my request (files)? Because the exception that I am catching is on the client machine and is not generated by the web server.
    trumpetinc, your last statement intrigues me. It sounds as if you are suggesting having the client code and the servlet code open sockets and talk directly with one another. I don't think our customers would like that too much.

    Answering your first question:
    Your original post made it sound like the server is running out of memory. Is the out of memory error happening in your client code???
    If so, then the code you provided is a bit confusing - you don't tell us where you are getting the bufferedOutputStream - I guess I'll just assume that it is a properly configured member variable.
    OK - so now, on to what is actually causing your problem:
    You are sending the stream in a very odd way. I highly suspect that your call to
    buffer = new byte[fileInputStream.available()];
    is resulting in a massive buffer (fileInputStream.available() probably just returns the size of the file).
    This is what is causing your out of memory problem.
    The proper way to send a stream is as follows:
         static public void sendStream(InputStream is, OutputStream os, int bufsize)
                     throws IOException {
              byte[] buf = new byte[bufsize];
              int n;
              while ((n = is.read(buf)) > 0) {
                   os.write(buf, 0, n);
              }
         }

         static public void sendStream(InputStream is, OutputStream os)
                     throws IOException {
              sendStream(is, os, 2048);
         }

    The simple implementation with the hard-coded 2048 buffer size is fine for almost any situation.
    Note that in your code, you are allocating a new buffer every time through your loop. The purpose of a buffer is to have a block of memory allocated that you then move data into and out of.
    Answering your second question:
    No - actually, I'm suggesting that you use an HTTPUrlConnection to connect to your servlet directly - no need for special sockets or ports, or even custom protocols.
    Just emulate what your browser does, but do it in the applet instead.
    There's nothing that says that you can't send a large payload to an http servlet without multi-part mime encoding it. Multi-part mime is just what browsers do when uploading a file using a standard HTML form tag.
    I can't see that a customer would have anything to say on the matter at all - you are using standard ports and standard communication protocols... Unless you are not in control of the server side implementation, and they've already dictated that you will mime-encode the upload. If that is the case, and they are really supporting uploads of huge files like this, then their architect should be encouraged to think of a more efficient upload mechanism (like the one I describe) that does NOT mime encode the file contents.
    - K
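
    To make the HttpURLConnection suggestion concrete, here is a rough sketch (the target URL and content type are placeholders) that uses chunked streaming so neither the JRE nor the client code buffers the whole payload:

    import java.io.*;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class StreamingUpload {
        public static void upload(File file, String target) throws IOException {
            HttpURLConnection conn = (HttpURLConnection) new URL(target).openConnection();
            conn.setDoOutput(true);
            conn.setRequestMethod("POST");
            conn.setRequestProperty("Content-Type", "application/octet-stream");
            // Chunked streaming: the JRE sends data as it is written instead of
            // buffering the whole request body to compute Content-Length.
            conn.setChunkedStreamingMode(8192);

            InputStream in = new BufferedInputStream(new FileInputStream(file));
            OutputStream out = conn.getOutputStream();
            try {
                byte[] buf = new byte[8192];
                int n;
                while ((n = in.read(buf)) != -1) {
                    out.write(buf, 0, n);
                }
            } finally {
                out.close();
                in.close();
            }
            System.out.println("Server responded " + conn.getResponseCode());
        }
    }

    setFixedLengthStreamingMode is the equivalent choice when the exact length is known up front and the server insists on a Content-Length header.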

  • OutOfMemory while uploading/downloading a large file

    Hello,
    We have recently been experiencing OutOfMemory errors in production. We find that whenever there is reasonable load and some user tries to download a 20MB zip file served from WebLogic, the server throws OutOfMemoryError.
    How does Weblogic handle large file upload / download request? Is it possible to crash the weblogic server by initiating a huge file upload (in the order of 200MB)?
    Will this be kept in memory?
    like wise for downloads, what is the maximum size of the file that can be served by weblogic? Is there any memory implication? (other than the response buffer size) Will the entire file be cached in memory by weblogic when a download request gets served?
    Please help!
    Thanks!
    Dheepak.

    Hi Dheepak, are you using a special servlet to download the zip? The default servlet only ever buffers buffer_size bytes and flushes as soon as that is exceeded. The default buffer size is around 8k, so you should see multiple flushes of 8k each.
    The file is not kept in memory or cached in any way by the server itself, but I can imagine a custom servlet/filter doing that.
    For uploads, WebLogic just creates an InputStream over the underlying socket InputStream and returns it to the user. So again there is no caching of data other than the internal buffers, which again aren't more than 8k.
    Hope that helps
    Nagesh
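
    For anyone writing such a custom download servlet, a version that preserves that flush-as-it-goes behaviour might look like the sketch below (the file location and headers are placeholders); reading the whole zip into a byte array before writing it is the pattern that tends to blow the heap under load:

    import java.io.*;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    public class ZipDownloadServlet extends HttpServlet {
        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            File zip = new File("/data/downloads/report.zip");   // placeholder location
            resp.setContentType("application/zip");
            resp.setHeader("Content-Length", String.valueOf(zip.length()));
            resp.setHeader("Content-Disposition", "attachment; filename=\"report.zip\"");

            InputStream in = new BufferedInputStream(new FileInputStream(zip));
            OutputStream out = resp.getOutputStream();
            try {
                byte[] buf = new byte[8192];
                int n;
                while ((n = in.read(buf)) != -1) {
                    out.write(buf, 0, n);   // the container's ~8k response buffer flushes as it fills
                }
            } finally {
                in.close();
            }
        }
    }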

  • BT Cloud - large file ( ~95MB) uploads failing

    I am consistently getting upload failures for any files over approximately 95MB in size.  This happens with both the Web interface, and the PC client.  
    With the Web interface the file upload gets to a percentage that would be around the 95MB amount, then fails showing a red icon with a exclamation mark.  
    With the PC client the file gets to the same percentage equating to approximately 95MB, then resets to 0%, and repeats this continuously.  I left my PC running 24/7 for 5 days, and this resulted in around 60GB of upload bandwidth being used just trying to upload a single 100MB file.
    I've verified this on two PCs (Win XP, SP3), one laptop (Win 7, 64 bit), and also my work PC (Win 7, 64 bit).  I've also verified it with multiple different types and sizes of files.  Everything from 1KB to ~95MB upload perfectly, but anything above this size ( I've tried 100MB, 120MB, 180MB, 250MB, 400MB) fails every time.
    I've completely uninstalled the PC Client, done a Windows "roll-back", reinstalled, but this has had no effect.  I also tried completely wiping the cloud account (deleting all files and disconnecting all devices), and starting from scratch a couple of times, but no improvement.
    I phoned technical support yesterday and had a BT support rep remote control my PC, but he was completely unfamiliar with the application and after fumbling around for over two hours, he had no suggestion other than trying to wait for longer to see if the failure would clear itself !!!!!
    Basically I suspect my Cloud account is just corrupted in some way and needs to be deleted and recreated from scratch by BT.  However I'm not sure how to get them to do this as calling technical support was futile.
    Any suggestions?
    Thanks,
    Elinor.
    Solved!
    Go to Solution.

    Hi,
    I too have been having problems uploading a large file (362Mb) for many weeks now and as this topic is marked as SOLVED I wanted to let BT know that it isn't solved for me.
    All I want to do is share a video with a friend and thought that BT cloud would be perfect!  Oh, if only that were the case :-(
    I first tried web upload (as I didn't want to use the PC client's Backup facility) - it failed.
    I then tried the PC client Backup.... after about 4 hrs of "progress" it reached 100% and an icon appeared.  I selected it and tried to Share it by email, only to have the share fail and no link.   Cloud backup thinks it's there but there are no files in my Cloud storage!
    I too spent a long time on the phone to Cloud support during which the tech took over my PC.  When he began trying to do completely inappropriate and irrelevant  things such as cleaning up my temporary internet files and cookies I stopped him.
    We did together successfully upload a small file and sharing that was successful - trouble is, it's not that file I want to share!
    Finally he said he would escalate the problem to next level of support.
    After a couple of weeks of hearing nothing, I called again and went through the same farce again with a different tech.  After which he assured me it was already escalated.  I demanded that someone give me some kind of update on the problem and he assured me I would hear from BT within a week.  I did - they rang to ask if the problem was fixed!  Needless to say it isn't.
    A couple of weeks later now and I've still heard nothing and it still doesn't work.
    Why can't Cloud support at least send me an email to let me know they exist and are working on this problem.
    I despair of ever being able to share this file with BT Cloud.
    C'mon BT Cloud surely you can do it - many other organisations can!

  • TC and WD500 External drive - Locks up when copying large files

    I have a TC with a Western Digital 500GB My Book connected to the USB port on the TC that I use for network storage. Everything works fine until I try to copy large files or directories from the Finder-mounted share. After the copy freezes, the external drive can no longer be seen by AirPort Utility. If I look at disks it sees the disk but no partition. I have to power cycle the external drive to have the TC see the drive again. Then I disconnect the drive, plug it directly into the Mac, and run Disk Utility, and this repairs the journal. I have to do the repair every time it locks up. Should the external drive be formatted differently, or does it have to be connected to a USB hub? Or is this just a bug with the TC and AFP?

    I have this very same problem.
    I tried to copy a 4GB+ folder from the Macbook to the 500GB MyBook attached to my TC and it froze.
    Now the files on the MyBook are invisible when attached to the TC ("0 Items") but are available when the drive is connected directly to the Macbook. I have run Disk Utility but it hasn't helped. The disk is FAT32 formatted and the files on the MyBook are still invisible over the network.
    When the drive attached to the Macbook directly I can see a 'phantom' file with the same title as the 4GB+ folder I'd tried to copy. This file is "ZERO KB" and it cannot be deleted (error code -43).
    Anyone have a clue what is happening here?

  • Unable to transfer large files from MB to External HDs

    Hi,
    This may be an unusual one. My 1st edition Macbook won't let me transfer large files to my external hard-drives, either by USB, Ethernet or wirelessly through my home wi-fi network.
    I've started video editing so I'm importing DVcam tapes through firewire from the camera. A full DV tape translates to about 8-12gb I think.
    Using iMovie and saving the import to my Freecom 3.5-inch network hard drive via USB or ethernet, it fails saying 'Unexpected error, error code 1309'.
    When I try quicktime pro instead to import the tape to the external HD, I get 'Operation could not be completed, An attempt to add a resource to the file failed'. This happens too with my WD 2.5inch external drive.
    However, there is no such problem when I import to the MB's internal hard drive. When I then try and shift the large file off the laptop to an external drive, via USB, ethernet or wirelessly, for editing and safe keeping, it fails halfway through.
    I have the same problem when I try back-up my 17gb virtual windows machine I use with VMware fusion off the laptop to an external drive.
    I am currently burning an 18gb iMovie project from the MB's internal HD, over 5 DVDs using Toast's disc-spanning and will reimport them to the external, which is really time consuming.
    Also to note, Final Cut Express doesn't have these problems as it seems to break up what would be an 18gb movie file into several smaller files, which my MB is quite happy to let go onto an external drive. However, the problem re-emerges in Final Cut Express when I try and import an HDV tape. Possibly FCE isn't breaking them up into small enough pieces for my MB to handle?
    Any ideas why this is happening and what I can do to fix it? Or does this mean a new laptop?

    Irish Apple Fan wrote:
    Hi,
    This may be an unusual one. My 1st edition Macbook won't let me transfer large files to my external hard-drives, either by USB, Ethernet or wirelessly through my home wi-fi network.
    I've started video editing so I'm importing DVcam tapes through firewire from the camera. A full DV tape translates to about 8-12gb I think.
    Using iMovie and saving the import to my Freecom 3.5-inch network hard drive via USB or ethernet, it fails saying 'Unexpected error, error code 1309'.
    When I try quicktime pro instead to import the tape to the external HD, I get 'Operation could not be completed, An attempt to add a resource to the file failed'. This happens too with my WD 2.5inch external drive.
    I'm a bit late to the party, but specifically there's a 4GB file size limit in the FAT32 format that is standard on many external hard drives. I've run across this problem before, and usually it copies over most of the file and then quits when it reaches the limit.

  • SharePoint Foundation 2013 Optimization For Large File Transfer?

    We are considering upgrading from  WSS 3.0 to SharePoint Foundation 2013.
    One of the improvements we want to see after the upgrade is a better user experience when downloading large files.  It can be done now, but it is not reliable.
    Our document library consists of mostly average sized Office documents, but it also includes some audio and video files and software installer package zip files ranging from 100MB to 2GB in size.
    I know we can change the settings to "allow" larger than default file downloads but how do we optimize the server setup to make these large file transfers work as seamlessly as possible? More RAM on the SharePoint Foundation server? Other Windows,
    SharePoint or IIS optimizations?  The files will often be downloaded from the Internet, so we will not have control over the download speed.

    SharePoint is capable of sending large files; it is a stateless HTTP system like any other website in that regard. Given that your server is sized appropriately for the amount of concurrent traffic you expect, I don't see any special optimizations required.
    Trevor Seward
    Follow or contact me at...
    This post is my own opinion and does not necessarily reflect the opinion or view of Microsoft, its employees, or other MVPs.
    I see information like this posted warning against doing it as if large files are going to cause your SharePoint server and SQL to crash. 
    http://blogs.technet.com/b/praveenh/archive/2012/11/16/issues-with-uploading-large-documents-on-document-library-wss-3-0-amp-moss-2007.aspx
    "Though SharePoint is meant to handle files that are up to 2 gigs in size, it is not practically feasible and not recommended as well."
    "Not practically feasible" sounds like a pretty dire warning to stay away from large files.
    I had seen some other links warning that large files in the SharePoint database cause problems with fragmentation and large amounts of wasted space that doesn't go away when files are removed, or that the server may run out of memory because downloaded files are held in RAM.

  • In making a "highlights" movie, using clips from different imported iMovie events, can I delete the larger iMovie event file from the Events browser and still work w the smaller clips in the Projects browser w/o having the larger files still loaded?

    I have successfully imported 150 Sony digital 8mm movies (each one hour in length) into iMovie as 150 iMovie events. I have since successfully converted them from their original 13 GB (.dv) files to exported smaller 1.3 GB (.4mv "large file") movies that I am happy with, using the iMovie Projects browser. So now I have 150 ".4mv" movies on my internal HD as well as about half of my original raw data ".dv" movies on my internal hard drive.
    Due to their large size (over 2 Tb), I do not have all the larger raw data (.dv) files on my 2Tb internal drive, just about half of them. What I want to do now is to create a new project in the Projects browser for each of my kids, and reload, starting with Tape #1, each of these larger files and do a highlights movie for each of my kids, wherein I pick out smaller clips from each 1 hour .dv iMovie event and paste them into the appropriate kid's Highlights project in the Projects browser.
    Here's my question: If I load the first 5 large files back onto my internal HD, and paste in various shorter clips into each of my kids' Highlights project, and then if I delete those first 5 large files (they are backed up on 2 other 3Tb external HDs), can I keep doing this (reloading the next 5 large .dv files to work with), and ultimately take each of my kid's Highlights project and export as a .4mv movie EVEN THOUGH the earlier large .dv files are no longer on my internal HD, OR does my iMac need to have all these larger files loaded on my internal HD for me to eventually export each of my kid's Highlights project to a .4mv movie?
    I have a 2011 era 27" iMac desktop w 2Tb HD internal and 250 Gb flash drive, and Lion OSX and iMovie 11.

    Thanks. I tested it out and you were correct. I loaded 2 .dv movies from my external HD back onto my internal HD, and got them re-imported into iMovie, took a few short clips from each of the 2 iMovie events and pasted them into a new project in the Project Browser. Then I deleted these 2  "source clips" from my internal HD, closed iMovie and then re-opened it and found that iMovie would NOT export the smaller clips for a "highlights" .4mv movie without the "source clips" being available.
    I read your link on Quicktime. It talks about mostly trimming, which is what I did with each iMovie event before I took each one as a project to export as a smaller .4mv file. But if I use Quicktime (do I need QT Pro or basic), what advantage is it to me to use QT over iMovie (I must admit I am a novice at iMovie and have never used QT or QT Pro as a tool)? Will it then convert any edits I make to a .m4v movie or do I need iMovie to do that?
    Does QT allow trimming multiple segments out of a movie during one edit session, or can you only do one at a time? By that I mean that, for example, when I use Sony's Picture Motion Browser for my .mts movies, you can only set one start and one end point for each edit/trim you do: it does not allow you to set multiple start/stop points like iMovie allows in its Event or Project browser. You can only do one "trim" at a time, save it and then reopen to do another trim. Not very useful.

  • If you delete a large file from the mac hard drive does it also remove it from time machine on external hard drive

    if you delete a large file from the mac hard drive does it also delete it from time machine on the external hard drive ?

    As the others say, no, the backup copy won't be deleted immediately, but it will eventually.
    If you want to delete all backups of a selected item, such as for space or security reasons, see #12 in Time Machine - Frequently Asked Questions.

  • Q: HD players and carving up large files

    Greetings,
    First of all, this is what I'm using:
    Intel(R) Core(TM)2 Duo CPU
    8 gigs of RAM
    External 2TB drive, internal 1TB drive (for media only)
    Windows Vista, 64-bit
    Adobe Premiere Pro CS4
    First question, what is a good media player to play back MTS and M2TS files? I know there are several of them that come recommended on various sites, but I'm interested in knowing which ones to trust, which ones have the best options, and which ones some of you have had good luck with.
    But my biggest problem is that a recent client gave me some very large files with which to work, and it's been like trying to shove an elephant through a drinking straw. They gave me an external hard drive with a ton of AVCHD footage on it (1440 x 1080, 29.97). No problem...I thought. They wanted to use the footage for a time lapse sequence of a big construction job of a unique dome they're building. The way they had described it to me, they were going to record the site over a long period of time, programming the camera to capture a few seconds every few hours.
    Well, someone screwed the pooch. The camera was turned on almost every day, but they let the thing run all day! So, some of these files are over 30 gigs, and it's been murder trying to import them and get them to play. As I expected, it was pointless to try to use the external hard drive as a source since it's not even firewire but USB, and getting the files to play back in PP was not a good experience; I could practically hear the drive choking to death. But even when I transferred a couple of large files to my internal terabyte, they still played back terribly in PP, in both the viewer and on the timeline. When I moved the sequence marker, it would hang up the system for long periods of time, and then playback was choppy at best.
    I imagine it could be processor speed - or lack thereof. I know a quad would be better, but I've played back smaller MTS files with no problem, and right now getting more processor speed isn't an option - if that's even the main problem. The thing that adds insult to injury is that this project is going to mix SD (DVCAM, I think) with this AVCHD, so I don't even think shooting the timelapse stuff in HD was even necessary (they had a small Sony camera they wanted to use). I know I could convert the m2ts files to AVIs or something like that, but that would take forever since I'm going to need to use a little chunk of each file (to edit the time lapse).
    So, my second question is, is it possible to carve up these large files somehow, so that I can import smaller files into PP and use the (source) viewer in PP to scan the footage more easily? Like I said, it's been difficult to scan a 7-hour, 35 gig clip without missing something and/or hanging up the system. I don't have that problem with smaller source files. Unfortunately, I have a feeling carving up these m2ts isn't possible without converting the files to another format....?
    I don't know, maybe it isn't the files and it has something else to do with my computer. But I've been editing SD without much trouble at all, and also some HD (with smaller files), so something tells me these files are just too big. My drive is 7200 rpm, and I don't have anything else on this computer except some other Adobe stuff (PowerPoint, Reader); the "garbage" is minimal. Anybody have any ideas? Thanks in advance.
    Paul

    I would be declining to take on this project on the basis of what you have told us about what you have been supplied with... unless they are prepared to acknowledge the scale of the task (problem) and pay on an hourly basis.
    It's going to take a lot of time!
