Saving Big Files

Without using my computer's hard drive or an external hard drive, what is the best method for storing/saving files bigger than 5 GB without a dual-layer DVD burner?

Depending on size, you can use utilities like StuffIt (http://www.stuffit.com/mac/index.html), or check VersionTracker (http://www.versiontracker.com/macosx/) for other data compression utilities, to reduce files to a size you can back up on DVD.
Also, depending on what you're trying to compress, Toast (http://www.roxio.com/en/products/toast/index.jhtml) will compress movies larger than 4.7 GB to fit on a single DVD. In addition, Toast will span data across multiple DVDs.

Similar Messages

  • I've doubled my RAM but still can't save big files in photoshop...

    I have just upgraded my RAM from the shipped 4GB to 8GB, but I still can't save big images in Photoshop. It seems to have made no difference to the speed of general tasks and saving, and I can't save big files in Photoshop at all. I already moved massive amounts of files off my computer and onto my external hard drive, so there is much less on my computer now, and twice the RAM, but it is making no noticeable difference. When I click Memory under 'About This Mac', it shows that the RAM is installed and the machine now has twice the memory. Could this be something to do with Photoshop? I'm running CS6.

    Also, I just converted 220 cm to inches and this is roughly 86.6 inches, over 7 feet in length.
    With an image that large, you may want to consider downsampling to a lower DPI; it will make images of this size much easier to work on and process, and easier for your print house to process as well.
    You might want to consider working with these rather large images at 225 DPI instead of 300 if resolution at close viewing distances is still a concern.
    Or, what you could try is working with the images at 300 DPI, but then saving/exporting them as a JPEG at the highest image quality setting.
    I do a lot of projects where I use a high-resolution JPEG to save on image processing overhead.
    The final printed images still come out pretty clear, clean and crisp without having to process those large files at a print house or having to deal with using that full resolution image in a page layout or illustration program.
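    To put rough numbers on the DPI suggestion above, here is a back-of-the-envelope calculation (a sketch only; the 86.6 in dimension comes from the conversion above, and a square canvas is assumed purely for illustration):

    // Approximate pixel load at 300 DPI vs 225 DPI for an ~86.6 in dimension.
    public class DpiEstimate {
        public static void main(String[] args) {
            double inches = 86.6;                     // 220 cm converted to inches
            long px300 = Math.round(inches * 300);    // ~25,980 px per side
            long px225 = Math.round(inches * 225);    // ~19,485 px per side
            double ratio = (225.0 * 225.0) / (300.0 * 300.0);
            System.out.printf("300 DPI: %d px per side, 225 DPI: %d px per side%n", px300, px225);
            System.out.printf("Pixel count (and roughly memory) shrinks to about %.0f%%%n", ratio * 100);
        }
    }

    Dropping from 300 to 225 DPI cuts the pixel count to roughly 56% of the original, which is why the file becomes so much easier to save and process.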

  • Saving a file - Save as type

    I would like to have a file-save dialog with the standard drop-down list with pre-defined file types as seen in most programs today. However, in LabVIEW I cannot find a simple way of doing this. Any suggestions?

    LabVIEW-guru,
    Thanks for your suggestions.
    Before I posted this question I was actually contemplating on writing my own VI for doing this. I started the effort but soon I decided to not continue. Why? Well, why did I want this functionality in the first place?
    My goal is to have a LabVIEW application that does not advertise "I am written in LabVIEW". With LabVIEW 6i it is almost possible to do this. Almost. I am a die-hard supporter of LabVIEW, and I am more than happy with the power LabVIEW 6i gives me in creating programs for controlling my measurements.
    OK, so what options did I have for this particular application? The obvious solution is to first have a pop-up dialog that asks for the desired format. The program then uses this information when calling the standard dialog for saving a file. However, doing this leaves behind the sense that it is a home-brew solution. The suggestion you had of writing my own VI seems the best way to do it. But! The listbox that one needs to list the directory, with its symbols, does not look exactly like the Windows equivalent. I have not figured out how to get the symbols in color. I also faced a problem with the dialog ring used at the top for selecting higher-level directories: it seems that I cannot use symbols in it. Those two problems defeat the purpose of writing my own VI. If I can't get it to look like a standard save dialog, I might as well go for the first solution, which is by far the easiest: using an extra dialog asking for the file format.
    But who knows... in LabVIEW 7i (?) I might be able to just wire an array of strings to the file dialog to be used for the Save-as-type selection.... If NI is listening...this would be a neat feature!!
    Thanks all for being part of DE! It is a big help for us all! /Mikael Garcia
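    For comparison, the behavior being asked for (a native save dialog with a "Save as type" drop-down of pre-defined file types) is something most text-based toolkits expose directly. A minimal Java Swing sketch of that idea is below; it is an illustration of the concept only, not a LabVIEW solution:

    import java.io.File;
    import javax.swing.JFileChooser;
    import javax.swing.filechooser.FileNameExtensionFilter;

    public class SaveAsTypeDemo {
        public static void main(String[] args) {
            JFileChooser chooser = new JFileChooser();
            chooser.setDialogTitle("Save measurement data");
            // Pre-defined file types shown in the "Save as type" drop-down.
            chooser.addChoosableFileFilter(new FileNameExtensionFilter("Text files (*.txt)", "txt"));
            chooser.addChoosableFileFilter(new FileNameExtensionFilter("Comma-separated values (*.csv)", "csv"));
            chooser.setAcceptAllFileFilterUsed(false);
            if (chooser.showSaveDialog(null) == JFileChooser.APPROVE_OPTION) {
                File target = chooser.getSelectedFile();
                System.out.println("Would save to: " + target.getAbsolutePath());
            }
        }
    }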

  • How to read big files

    Hi all,
    I have a big text file (about 140MB) containing data I need to read and save (after analysis) to a new file.
    The text file contains 4 columns of data (so each row has 4 values to read).
    When I try to read the whole file at once I get a "Memory full" error message.
    I tried to read only a certain number of lines each time and then write it to the new file. This is done using the loop in the attached picture (this is just a portion of the code). The loop is repeated as many times as needed.
    The problem is that for such big files this method is very slow, and if I try to increase the number of lines read each time, I still see the PC's free memory descending slowly in the performance window...
    Does anybody have a better idea how to implement this kind of task?
    Thanks,
    Mentos.
    Attachments:
    Read a file portion.png ‏13 KB

    Hi Mark & Yamaeda,
    I made some tests and came up with 2 different approaches - see the VIs & example data file attached.
    The Read lines approach.vi reads a chunk with a specified number of lines, parses it and then saves the chunk to a new file.
    This worked more or less OK, depending on the delay. However, in reality I'll need to write the first 2 columns to the file and only after that the 3rd and 4th columns. So I think I'll need to read the file twice - the 1st time take the first 2 columns and save them to the file, then repeat the loop and take the other 2 columns and save them...
    Regarding the free memory: I see it drops a bit during the run and goes up again once I run the VI another time.
    The Read bytes approach reads a specified number of bytes in each chunk until it finishes reading the whole file. Only then does it save the chunks to the new file. No parsing is done here (just for the example), only reading & writing, to see if the free memory stays the same.
    I used 2 methods for saving - with the String Subset function and the Replace Substring function.
    When using Replace Substring (the disabled part) the free memory was 100% stable, but it worked very slowly.
    When using the String Subset function the data was saved VERY fast, but some free memory was consumed.
    The reading part also consumed some free memory, at a rate that depended on the delay I put in.
    Which method looks better?
    What do you recommend changing?
    Attachments:
    Read lines approach.vi ‏17 KB
    Read bytes aproach.vi ‏17 KB
    Test file.txt ‏1 KB
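    For what it's worth, the line-by-line chunking idea above, expressed in a text-based language, would look roughly like the Java sketch below. The file names and the tab delimiter are assumptions for illustration; the actual implementation is in the attached LabVIEW VIs:

    import java.io.*;

    public class ChunkedColumnCopy {
        public static void main(String[] args) throws IOException {
            // Hypothetical file names; the real data lives in the attached VIs.
            try (BufferedReader in = new BufferedReader(new FileReader("Test file.txt"));
                 BufferedWriter out = new BufferedWriter(new FileWriter("first_two_columns.txt"))) {
                String line;
                // Stream one line at a time so only the current line is held in memory,
                // instead of loading the whole 140 MB file at once.
                while ((line = in.readLine()) != null) {
                    String[] cols = line.split("\t");          // 4 tab-separated values per row
                    if (cols.length >= 2) {
                        out.write(cols[0] + "\t" + cols[1]);   // keep only columns 1 and 2
                        out.newLine();
                    }
                }
            }
            // A second pass with cols[2] and cols[3] would produce the remaining columns.
        }
    }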

  • Illustrator hangs when saving .ai files with linked or embedded images

    Hi,
    Currently I have a huge problem with Adobe Illustrator. When I place an image file inside an Illustrator document, the program freezes for about 45 seconds when saving. In the beginning I thought it was just because my image file was too big, but that wasn't the case.
    To test what's going on I created a new document, drew a few lines and saved. No problem, Illustrator saved immediately. Then I placed a bigger image and suddenly Illustrator hung when saving. I repeated this with a very small GIF of just a few kB and again, Illustrator hung for nearly a minute when saving. I tried embedding the image instead of linking it, but I still got the same results.
    Then I turned off the visibility of the image object by clicking on the little eye symbol in front of it, and suddenly Illustrator was able to save quickly again. The next thing I wanted to try was turning off compression and PDF compatibility, but then I noticed that Illustrator is actually hanging BEFORE writing the file to disk. When I hit "Save As", Illustrator asked me where to save the new file, and after clicking "Save" it hung (showing "(Not Responding)" in the title bar). About 45 seconds later the save options dialog came up, and when I clicked "OK" it saved the file immediately.
    Does anybody know what the problem might be? I'm using Illustrator CS5 on Windows 7 x64. The computer is a notebook with 8GB RAM and switchable graphics. I tried this on both the integrated Intel graphics unit and on the much more powerful AMD chip. The drive I'm saving to is the local hard drive (no network drive etc.).
    This is a more or less fresh installation (I haven't used Illustrator before on this notebook) and I can't remember that I had the same problem on my workstation where I used it inside a virtual machine (which doesn't exist anymore).

    Yeah, I know that changing the mode results in different colors. I just switched back and forth to check out what's going on.
    For CMYK I'm using ECI profiles which are common in Europe and for RGB the normal Adobe RGB 1998 profile. The profiles are synced between all Creative Suite products. I tried sRGB and some of the default profiles which shipped with CS5 but it produces the same result. I also deleted my whole Illustrator profile but without success.
    My graphics card doesn't have any updates. But it would be very unlikely anyway that both drivers for both cards had the same issue. If it has something to do with my graphics cards it's more likely the switchable graphics in general.

  • How do i open a VERY big file?

    I hope someone can help.
    I did some testing using a LeCroy LT342 in segment mode. Using the LabVIEW driver I downloaded the data over GPIB and saved it to a spreadsheet file. Unfortunately it created very big files (ranging from 200MB to 600MB). I now need to process them but LabVIEW doesn't like them. I would be very happy to split the files into an individual file for each row (I can do this quite easily) but LabVIEW just sits there when I try to open the file.
    I don't know enough about computers and memory (my spec is 1.8GHz Pentium 4, 384MB RAM) to figure out whether, if I just leave it for long enough, it will do the job or not.
    Has anyone any experience or help they could offer?
    Thanks,
    Phil

    When you open (and read) a file you usually move it from your hard disk (permanent storage) to RAM. This lets you manipulate it at high speed using fast RAM; if you don't have enough RAM to read the whole file, you will be forced to use virtual memory (which uses swap space on the HD as "virtual" RAM) and that is very slow. Since you only have 384 MB of RAM and want to process huge files (200MB-600MB), you could easily and inexpensively upgrade to 1GB of RAM and see large speed increases. A better option is to load the file in chunks, looking at some number of lines at a time, processing that amount of data, and repeating until the file is complete. This is more programming, but it will let you use much less RAM at any instant.
    Paul
    Paul Falkenstein
    Coleman Technologies Inc.
    CLA, CPI, AIA-Vision
    Labview 4.0- 2013, RT, Vision, FPGA
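    Phil's plan of splitting the spreadsheet file into one file per row fits the chunked approach Paul describes. A rough Java sketch of the idea is below (hypothetical file names; the real processing would of course be done in LabVIEW):

    import java.io.*;

    public class SplitRowsIntoFiles {
        public static void main(String[] args) throws IOException {
            // Hypothetical name; the real capture came from a LeCroy scope via GPIB.
            File source = new File("capture.txt");
            int row = 0;
            try (BufferedReader in = new BufferedReader(new FileReader(source))) {
                String line;
                // One pass, one line in memory at a time, so a 600 MB file
                // never has to fit into 384 MB of RAM.
                while ((line = in.readLine()) != null) {
                    row++;
                    try (BufferedWriter out = new BufferedWriter(
                            new FileWriter("row_" + row + ".txt"))) {
                        out.write(line);
                        out.newLine();
                    }
                }
            }
            System.out.println("Wrote " + row + " row files.");
        }
    }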

  • VERY big files (!!) created by QuarkXPress 7

    Hi there!
    I have a "problem" with QuarkXPress 7.3 and I don't know if this is the right forum to ask...
    Anyway, I have created a document, about 750 pages, with 1000 pictures placed in it. I have divided it into 3 layouts.
    I'm saving the file and the file created is 1.2 GB!!!
    Isn't that a very big file for QuarkXPress??
    In that project there are 3 layouts. I tried making a copy of the file and deleting 2 of the 3 layouts, and the project's file size is still the same!!
    (Last year, I had created (almost) the same document and as I checked that document now, its size is about 280 MB!!)
    The problem is that I have "autosave" on (every 5 or 10 minutes) and it takes some time to save it !
    Can anyone help me with that??
    Why does Quark has made SO big file???
    Thank you all for your time!

    This is really a Quark issue and better asked in their forum areas. However, have you tried to do a Save As and see how big the resultant document is?

  • Saving large files quickly (less slowly) in PS CS3

    I have some very large files (Document Size box in PS says 7GB, file size as a .psb about 3GB). I guess I should be happy that I can open and save these documents at all. But, on a fast Macintosh, save times are about 20 minutes, and open times about 10 minutes. I know that if I could save these as uncompressed TIFFs, the save and open times should be much faster, but I can't because the files are over the 4GB limit for a tiff. Is there anything I can do to speed up open and save times?
    A

    Peter and Tom
    Thanks for your replies. Have you done any timing on this? I have done some, but didn't keep the results. But my impression is that RAM size and disk speed are relatively less important for save times, and that multiple processors (cores) are not helpful at all.
    What I find is that my save times for a 3GB file are about 20 minutes, which is a write speed of about 2.5 MB/sec, which is well within the capacity of 100Base-T ethernet, and certainly nowhere near the limit of even a slow hard drive. So I don't think write time should have anything to do with it.
    Similarly, while I have 14 GB of RAM, I don't think that makes much of a difference. The time it would take to read data out of a disk cache, put it into memory, and write it back out should be a short fraction of the save time. But, I must say, I have not bothered to pull RAM chips, and see if my save times vary.
    I do find that maximize compatibility does affect things. However, I always leave this on for big files, as I like to be able to use the Finder to remind me which file is which, without firing up Photoshop (and waiting 10 minutes for the file to open). I find it always saves me time in the long run to "Maximize compatibility."
    Now, not being a Photoshop engineer, what I think is going on is that Photoshop is trying to compress the file when it saves it, and the compression is very slow. It also has no parallel processing (as noted from processor utilization rates in Activity Monitor). I suspect that the save code is very old, and that Adobe is loath to change it because introducing bugs into saving makes for very unhappy customers. (I know, I was bitten by a save bug once in ?CS2?, can't remember.)
    Honestly, for my money, the best thing Adobe could do for me would be to put a switch or a slider for time/space tradeoffs in saving. Or maybe somebody could come up with a TIFF replacement for files bigger than 4GB, and I could then save uncompressed in that new "large-tiff" format.
    Anyway, it looks like my original question is answered: there isn't anything I can do to significantly speed up save times, unless someone has figures on what a 10,000 RPM disk does to change open/save times, which I would love to see.
    Cheers
    A

  • Saving wave files in STP editor

    I have this problem which is very annoying. I have sampled some sounds from my hardware synth, with Logic playing the synth from every key and recording it. Then I edit the big sound file in Soundtrack Pro's audio editor and save every single sample as a separate .wav file. I select each sample in the big file with the mouse, open it as a new audio file, then cut it, fade in and out, and save.
    The problem is that when I save this new audio file from the STP editor, no matter what save function I use, I always have to choose the quality when saving. By default, STP wants to save the file as some kind of audio project file (.stap) and I have to change that to "wave" and then change the bit depth from 32 to 24. And I have to do this _every time_ I save a new sample recorded from a different key!
    Am I doing something wrong, or why doesn't STP remember my settings when I save? This behaviour makes my work very slow. When saving, I would like to only type the name of the file and press Save, because I use the same quality settings every time.
    In fact STP remembers the quality settings if you open a wave file, do some editing and then save it, but if you copy a section of the file to a new file and try to save it, STP defaults to a .stap file.
    And another thing... every time I save something from the editor, STP asks whether I want to flatten the project (all the edits) and I always press "yes". I wish there were a check box so that STP could remember my decision, because of course I want to apply all the edits I made.
    I have not noticed this kind of behaviour with any other sample editor, like soundforge, wavelab etc.
    I use STP version 1.0.3. Does the UB 1.1 version fix these or is there any solution?
    G5 Dual 1.8GHz   Mac OS X (10.4.7)   2GB, 2x250GB, 2x Powercore PCI mk2, UAD1, MOTU 828 mk2

    Hi Aarav,
    Check whether the internal table used to store the text has a line length of 100. Change that to 200 or more and you will see the difference. If the problem still persists, enter each line in the text editor and conclude each line with a carriage return (ENTER) key press.
    Hope this will help you.
    Regards,
    Smart Varghese

  • Working with big files

    I would appreciate any suggestion as to how best one can work with big files. I have eight one-hour tapes containing information about various places I visited. Some contain only one place while others contain two or more places. As a preference I like to create up to one-hour DVDs that contain an identifiable segment, e.g. either one place only or more, but I do try not to break up a place and record it on two DVDs. However, the problem I have faced in the past is that by the time I get to the end of an hour, Premiere Elements 7 (PE7) is getting slow and keeps telling me that it is running out of memory. That is despite the fact that I have 3 GB of RAM and over 200 GB of dedicated separate hard drive. I tried saving segments under 'Share', 'DV AVI' and selecting 'DV PAL Standard' (the system I use), but the result, as far as quality was concerned, was very disappointing. There is no option to change or set quality as was the case in previous versions of PE. Nor can I combine two .prel files into one while I am editing.
    Is there any other way I can work in segments, say 15 minutes, and combine them at the end without losing on the quality? 
    Any suggestion would be most appreciated.
    Tom 

    Hi Steve
    It is a Sony DCR-HC40E PAL, and they were captured as AVI clips.
    The project settings are:
    General – not highlighted but it shows DV PAL and 25.00 frames/second
    Video – 720 horizontal x 576 vertical (4:3)
    Display format - 25 fps Timecode
    Capture – DV Capture
    Tom

  • Problem reading big file. No, bigger than that. Bigger.

    I am trying to read a file roughly 340 GB in size. Yes, that's "Three hundred forty". Yes, gigabytes. (I've been doing searches on "big file java reading" and I keep finding things like "I have this huge file, it's 600 megabytes!". )
    "Why don't you split it, you moron?" you ask. Well, I'm trying to.
    Specifically, I need a slice "x" rows in. It's nicely delimited, so, in theory:
    (pseudocode)
    BufferedReader fr = new BufferedReader(new FileReader(new File(myhugefile)));
    long startLine = 70000000;
    long linesRead = 0;
    String line;
    // skip ahead; we don't care about these lines
    while ((line = fr.readLine()) != null && linesRead < startLine)
        linesRead++;
    // ok, we're where we want to be, start caring
    int linesWeWant = 100;
    linesRead = 0;
    while ((line = fr.readLine()) != null && linesRead < linesWeWant) {
        doSomethingWith(line);
        linesRead++;
    }
    (Please assume the real code is better written and has been proven to work with hundreds of "small" files (under a gigabyte or two). I'm happy with my file read/file slice logic, overall.)
    Here's the problem. No matter how I try reading the file, whether I start with a specific line or not, whether I am saving out a line to a string or not, it always dies with an OOM at around row 793,000,000. The OOM is thrown from BufferedReader.readLine(). Please note I'm not trying to read the whole file into a buffer, just one line at a time. Further, the file dies at the same point no matter how high or low (within reason) I set my heap size, and watching the memory allocation shows it's not coming close to filling memory. I suspect the problem is occurring when I've read more than an int's worth of bytes from the file.
    Now -- the problem is that it's not just this one file -- the program needs to handle a general class of comma- or tab- delimited files which may have any number of characters per row and any number of rows, and it needs to do so in a moderately sane timeframe. So this isn't a one-off where we can hand-tweak an algorithm because we know the file structure. I am trying it now using RandomAccessFile.readLine(), since that's not buffered (I think...), but, my god, is it slow... my old code read 79 million lines and crashed in under about three minutes, the RandomAccessFile() code has taken about 45 minutes and has only read 2 million lines.
    Likewise, we might start at line 1 and want a million lines, or start at line 50 million and want 2 lines. Nothing can be assumed about where we start caring about data or how much we care about, the only assumption is that it's a delimited (tab or comma, might be any other delimiter, actually) file with one record per line.
    And if I'm missing something brain-dead obvious...well, fine, I'm a moron. I'm a moron who needs to get files of this size read and sliced on a regular basis, so I'm happy to be told I'm a moron if I'm also told the answer. Thank you.

    LizardSF wrote:
    FWIW, here's the exact error message. I tried this one with RandomAccessFile instead of BufferedReader because, hey, maybe the problem was the buffering. So it took about 14 hours and crashed at the same point anyway.
    Exception in thread "AWT-EventQueue-0" java.lang.OutOfMemoryError: Java heap space
         at java.util.Arrays.copyOf(Unknown Source)
         at java.lang.AbstractStringBuilder.expandCapacity(Unknown Source)
         at java.lang.AbstractStringBuilder.append(Unknown Source)
         at java.lang.StringBuffer.append(Unknown Source)
         at java.io.RandomAccessFile.readLine(Unknown Source)
         at utility.FileSlicer.slice(FileSlicer.java:65)
    Still haven't tried the other suggestions, wanted to let this run.
    Rule 1: When you're testing, especially when you don't know what the problem is, change ONE thing at a time.
    Now that you've introduced RandomAccessFile into the equation, you still have no idea what's causing the problem, and neither do we (unless there's someone here who's been through this before).
    Unless you can see any better posts (and there may well be; some of these guys are Gods to me too), try what I suggested with your original class (or at least a modified copy). If it fails, chances are that there IS some absolute limit that you can't cross; in which case, try Kayaman's suggestion of a FileChannel.
    But at least give yourself the chance of KNOWING what or where the problem is happening.
    Winston
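    In the spirit of the byte-level suggestions above, here is a rough sketch that skips rows by counting newline bytes, so nothing is accumulated into a String until the slice you actually want. It is an illustration only, not the poster's actual FileSlicer, and it assumes '\n'-terminated lines:

    import java.io.*;
    import java.nio.charset.StandardCharsets;

    public class FileSlicer {
        // Print linesWeWant lines starting at line startLine (0-based).
        public static void slice(File f, long startLine, int linesWeWant) throws IOException {
            try (InputStream in = new BufferedInputStream(new FileInputStream(f), 1 << 20)) {
                // Skip the first startLine lines by counting '\n' bytes,
                // so no String is ever built for the skipped region.
                long newlines = 0;
                int b;
                while (newlines < startLine && (b = in.read()) != -1) {
                    if (b == '\n') newlines++;
                }
                // Hand the rest of the stream to a reader and take just the slice.
                BufferedReader r = new BufferedReader(new InputStreamReader(in, StandardCharsets.UTF_8));
                String line;
                int read = 0;
                while (read < linesWeWant && (line = r.readLine()) != null) {
                    System.out.println(line);   // stand-in for doSomethingWith(line)
                    read++;
                }
            }
        }

        public static void main(String[] args) throws IOException {
            slice(new File(args[0]), 70_000_000L, 100);
        }
    }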

  • Loosing RAM by saving to file

    Hello,
    I wrote an application running under Win98 which saves data every 100 ms to a file for each day (40 MB per day). While the application is running, the free RAM drops by about 2.5 MB roughly every 100 minutes. The system doesn't free the RAM if the application starts a new file the next day, or if the data are written to a new file every hour so that the application doesn't have to work with such big files.
    The problem doesn't occur if I run the same application without the Write to File VI.
    Does anyone know why the RAM keeps shrinking and how to solve this problem?
    Thanks, Niko

    You should not be using the 'Write to File' VI, as it opens and closes the file each time the data is saved. I have included a picture of a VI showing how to stream data to disk, so that the file is opened at the start of acquisition and closed at the end. This is a much more efficient method.
    If you try this example but are still having problems, then add a 'Flush File' VI after the 'Write' inside the loop (found in Functions > File I/O > Advanced File Functions > Flush File). This will write all the buffers to disk and update the directory entry. This may help you.
    Kim
    Attachments:
    Disk_streaming.jpg ‏56 KB
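    The same open-once / write-in-a-loop / close-once pattern, written out in Java purely as an illustration of Kim's disk-streaming advice (the file name and data values are made up; the actual fix is the LabVIEW VI in the attached picture):

    import java.io.*;

    public class StreamToDisk {
        public static void main(String[] args) throws IOException, InterruptedException {
            // Open the file once, before the acquisition loop, instead of
            // opening and closing it on every write (the 'Write to File VI' behaviour).
            try (BufferedWriter out = new BufferedWriter(new FileWriter("daily_log.txt", true))) {
                for (int sample = 0; sample < 1000; sample++) {   // stand-in for the 100 ms acquisition loop
                    out.write(System.currentTimeMillis() + "\t" + Math.random());
                    out.newLine();
                    if (sample % 100 == 0) {
                        out.flush();   // analogue of the 'Flush File' VI: push buffers to disk
                    }
                    Thread.sleep(100);
                }
            }   // the file is closed once, after acquisition ends
        }
    }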

  • While saving a file, I have converted all my files to PDF.  The only way to restore the previous state is to delete Adobe.  Does anyone have a solution?

    While saving a file, I converted all my files to PDF.  This made everything inaccessible, even Explorer.  I had to delete Adobe to get back to some of the previous files.  I tried to restore the system to a previous point, but I kept getting a pop-up message saying I did not have enough shadow storage space.  Does anyone have a solution?  I'd greatly appreciate your help.

    Application, file icons change to Acrobat/Reader icon

  • When saving photoshop file in PDF, my document changes, how do I solve this?

    Hello everybody,
    I'm trying to save my CS6 files as PDF documents. When I save my files as PDF, some text or images disappear or become deformed.
    Is there anything I'm doing wrong?
    Is there a setting I need to change when saving my files so they don't change?
    Thank you for your help!

    Hi,
    Which operating system and version of Photoshop CS6 are you using?
    You can find out the exact Photoshop CS6 version by going to Help > System Info from within Photoshop and looking at the very first line, which says Photoshop Version.

  • The niFPui.mxx plug-in caused an exception in the CmxAggregateItemUI::InvokeCommand function in the NIMax process when saving a *.iak file in MAX 4.6

    Hi There,
    The subject header just about says it all. This is the first action I took with MAX - it is a fresh install. The file I wanted to save was still written and the FP seems to be working ok. However, I need to know what happened.
    I can't post the whole log file due to the amount of characters allowed on this post. I can cut and paste sections if there is a specific part of the file you need. Below is the first section and last section.
     Context where exception was caught:
    Func:
    CmxAggregateItemUI::InvokeCommand Args: plugin=niFPui.mxx Item=0107EAB1
    cmdID.cmdId={4A36174B-EC0C-4D73-A23D-F15D164542DE} cmdID.index=0
    Application   : C:\Program Files\National Instruments\MAX\NIMax.exe
    User Name     : slaney
    OS Version    : 5.1.2600 (Service Pack 3)
    Exception Code: C000001E
    Exception Addr: 457BC448
    Return Address: 457BC448
    Function Name : nNIFPServer::tFpLinearScaleRange::`vftable'
    Module Name   : FieldPoint71
    Parameters    : F001008E 7800FDDD C5100DFC EC0107EA
    Source File   : (not available)
    Return Address: 481000C3
    Function Name : (not available)
    Module Name   : (not available)
    Parameters    : 00000000 00000000 00000000 00000000
    Source File   : (not available) 

    Hi,
    I did some research on your error message and it seems this problem was introduced with MAX 4.6. This version switched to a new error reporting mechanism and reports even errors which are not critical to your task.
    These errors typically show up as "unexpected"; if your error falls into this category, have a look at this KB for further assistance.
    If it doesn't fall into this category, you could try going back to MAX 4.5 or 4.4. Of course you would need to reinstall some components and might not be able to use newer drivers at all.
    Let me know.
    DirkW
