FCP - Out of Memory Error in 10.6.6

I'm running a MacBook Pro 17" (A1261), and while editing I got an "Out of Memory" error and an "Invalid Operation" error. I can't reopen the sequence on my MacBook, but it opens with no problem on a Mac Pro.
Gonna re-install the OS and see if that helps.

I just had something very similar happen. A sequence that I've been editing for over a month no longer opens (since installing the 10.6.6 update). I'm getting the same "out of memory" and I/O error codes.
I'm at a loss as to what to do. Other editing projects open fine, but so far I haven't been able to open any of the project files I've vaulted from the last six weeks.
Does anyone have any idea what changes 10.6.6 might have made to cause this and how it could be rectified? I really need to get back into my project file.

Similar Messages

  • Final Cut Pro 6 - 'Out of Memory/Error Code'

    Hey All,
    I'm in the final stages of editing a documentary and I keep running into an 'Out of Memory' error in Final Cut Pro 6.0 when trying to export a reference file. The finished film is 81 minutes long and contains 19 sequences. I have all 19 sequences nested in a main sequence of a separate FCP project file, so I'm not wasting any memory on clips or additional backups of sequences. This nested sequence is the only sequence that exists in this 'other' project file. All of the photos used are under 4K resolution and are RGB. Even the lower thirds are animated QuickTime movies with alphas. Any thoughts would be greatly appreciated.
    Nick
    System Specs:
    Quad-Core 2 GHz MacBook Pro
    4 GB DDR3 Memory
    Mac OS X 10.7.2 (Lion)
    FCP v6.0.6

    First troubleshooting step
    https://discussions.apple.com/docs/DOC-2491
    If that doesn't solve the problem, you may have a corrupt sequence.
    Try creating a new sequence with the same settings and copy the contents of the problematic sequence into it.
    If that doesn't work, mark an in and an out for the first half of the sequence and do an export. If that works, mark an in and an out for the second half and export. Keep narrowing down the sections until you find the problematic clip.
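
    Just to make the "keep narrowing it down" step concrete, here is a rough Python sketch of the same divide-and-conquer idea. The export_succeeds() check is hypothetical; in FCP you'd be doing that part by hand with an in/out point and a test export.

        # Minimal sketch, assuming exactly one bad clip and a hypothetical
        # export_succeeds(clips) test that stands in for a manual test export.
        def find_bad_clip(clips, export_succeeds):
            lo, hi = 0, len(clips)
            while hi - lo > 1:
                mid = (lo + hi) // 2
                if export_succeeds(clips[lo:mid]):
                    lo = mid   # first half exports fine, so the problem is in the second half
                else:
                    hi = mid   # the problem is somewhere in the first half
            return clips[lo]

        # Example with made-up clip names; "clip_07" plays the corrupt clip.
        clips = ["clip_%02d" % i for i in range(1, 13)]
        print(find_bad_clip(clips, lambda chunk: "clip_07" not in chunk))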

  • Constant out of memory errors

    I am constantly receiving out of memory errors in FCP 5.1. When rendering I now have to split the work into sections so I don't get the error (e.g., render 5 seconds of footage at a time).
    My rendered files folder contains 203 files, and I was wondering whether I could just delete all the files and then re-render the project in one go, or will that just screw up the project?

    If I delete all the rendered files, when I re-render will it make just the one render file? If so, does this mean every time I make a change in the project it will take longer to render, as Final Cut will have to re-render the whole file?

  • Another version of 'out of memory' error

    Hi all
    I have a colleague who is getting the following error message.
    As discussed... when I attempt to render a clip on the timeline, once it gets to the end of the rendering process and attempts to play the clip, an 'out of memory' error message box appears on the screen.
    Then when I click 'OK' to close this box, the canvas window turns red and the following message displays in the canvas window: 'Out of memory. Free up memory and window will redraw.'
    He is using HDV footage captured with the "HDV" codec [not the intermediate one] and editing it into HDV 1080i50 sequences.
    He has a G5 DP 2 GHz machine running Tiger 10.4.2 with 2 GB of RAM.
    He has approx. 80 GB of free space on the Mac HD and approx. 400 GB on an external LaCie HD. He runs only FCP HD 5 at any one time.
    I have found some previous posts that mention corrupt graphics, clips and sequences, and trashing FCP prefs.
    Does anyone have any other suggestions for him?
    [He is quite new to macs and FCP especially].
    I am going to send him an email to subscribe and create an account for this forum, his name is AGebhardt.

    Hello,
    I had the same issue last night: a render (only 15 min., so not THAT big) completed and would not play, and the canvas turned red and said I was out of memory.
    This is different from the general "Out of Memory" error warning, which I have also seen happen. Some of the answers in linked threads seem to be pointing to that other situation.
    I have plenty of space and plenty of RAM and was only running iTunes. I quit iTunes and it worked, almost to my disappointment, because in the past I have had many apps running at a time with no problems.
    I would be pretty bummed if I could only have FCP open all of a sudden.
    I will try going through my clips to check for corruption as suggested, just to be on the safe side, but I have a question:
    What good is it to throw out your render files if you have already checked that you have enough storage space? I can see the point if a file is corrupt, but a new render folder is created with every project, and unless there is a limit on these folders that I'm unaware of, I can't see the sense in trashing the folder.
    Am I missing something?
    Thanks,
    Jesse
    733 G4    

  • Out of memory error - large project

    I'm consulting on a feature doc edit, and the primary editor (Avid guy) is having serious problems accessing anything from the original project.
    It's an hour and 15 minute show, with probably close to 100 hours of footage.
    The box is a D2.3 G5 with 1.5 GB of RAM, and the media is on two G-Tech drives: a G-RAID and a G-Drive. Plenty of headroom on both (now), and the system drive is brand new, having been replaced after the original died; there's nothing loaded on it but FC Studio. The FCP version is 5.1.4. The project file is well over 100 MB.
    We started getting Out of Memory errors with this large project, and I checked all of the usual suspects: CMYK graphics, hard drive space, sufficient RAM... all checked out okay, except possibly the less-than-ideal amount of RAM.
    I copied the important sequences and a couple of select bins to a new project, and everything seems workable for now. The project is still 90 MB, and I've suggested breaking it up into separate projects and working on it as reels, but we can edit and trim efficiently at the moment. However, the other editor has now gotten to the point where he can't even open bins in the old, big project. He keeps getting the OOM error whenever he tries to do anything.
    I have no similar problems opening the same project on my G5, which is essentially identical except that I have 2.5 GB of RAM (1 GB extra). Can this difference in RAM really make this big a deal? Is there something else I'm missing? Why can't this editor access even bins from the other project?
    G4   Mac OS X (10.2.x)  

    Shane's spot on.
    What I often do with large projects is pare down, just as you have done. But 90 out of 100 is not a big pare-down by any stretch. In the new copy, throw away EVERYTHING that's outdated: old sequences are the big culprit. Also toss any render files and re-render.
    Remember that, to be effective, FCP keeps EVERYTHING in RAM so that it can instantly access anything in your project. The more there is to keep track of, the slower you get.

  • HELP out of memory error message

    I'm using FCP 7 with 10.6 on a MacBook Pro.
    I just finished editing a 16-minute photo montage with about 100 stills. The pictures all have movement.
    I used some of the pictures in Photoshop and AE.
    When I try to render I'm getting an "out of memory" error. All of the footage is on my 1 TB external hard drive, which has 450 GB unused. I've done more heavily layered montages before with no issues.
    I tried rendering individual clips and several clips at a time. It worked fine for a while. The render files are only about 7 GB, so it's not much. I also tried to output the whole sequence instead of rendering and I'm still getting the "out of memory" error. I need to output by tomorrow.
    Thanks,
    Michael

    I have done some Google searching, and FCP 7 is a 32-bit application. So my guess is that whatever you are currently doing has FCP 7 wanting to use more than 4 gigabytes of virtual address space, and it cannot grow any further because it is a 32-bit application.
    This is my best guess as to why you are getting "out of memory" errors.
    I am not an FCP user, so I cannot suggest any tricks to get this to work. The only other suggestion is to get a newer version of FCP that is 64-bit capable, but that might also require upgrading to a newer version of Mac OS X, so going down the upgrade road can get involved.
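
    For what it's worth, the 4 GB figure is just the ceiling of a 32-bit address space (2^32 bytes). Here is a quick back-of-the-envelope check in Python; note it reports the pointer size of the process that runs it (i.e. Python itself), and the FCP path in the comment is only an example:

        import struct

        # A 32-bit process can address at most 2**32 bytes, no matter how much
        # physical RAM is installed.
        print(2 ** 32 / (1024 ** 3), "GiB addressable by a 32-bit process")  # 4.0

        # Pointer size of the current process: 8 bytes = 64-bit, 4 bytes = 32-bit.
        # To check an app binary itself, run `file` on it in Terminal, e.g.
        #   file "/Applications/Final Cut Pro.app/Contents/MacOS/Final Cut Pro"
        # (path shown only as an example).
        print(struct.calcsize("P") * 8, "-bit process")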

  • "out of memory error" will more memory help?

    I seem to be reading mixed reviews. I have an employee doing a really simple editing task with FCP 4.5 on an iMac we have that is 1.8 GHz with 512 MB of RAM. She keeps getting "Out of Memory" errors in FCP 4.5, so we're buying 2 GB of RAM. Now I'm reading in other forums that the error may still continue. Any thoughts?

    Aside from the lack of RAM - 512 MB is really the minimum for the OS - out of memory messages can also come about when using CMYK color-space graphics in FCP. All images should be in the RGB color space.
    This usually doesn't appear as an issue unless you are using images that were intended for offset printing or someone inadvertently changed the color space from RGB.
    Good luck.
    x
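
    If you have a lot of stills to check, one quick way to spot CMYK files without opening each one in Photoshop is a small Python script using the Pillow imaging library (which you'd need to install separately; the folder path below is just a placeholder):

        from pathlib import Path
        from PIL import Image  # Pillow: pip install Pillow

        # Placeholder path -- point this at the folder holding your graphics.
        graphics_dir = Path("/path/to/your/graphics")

        for img_path in sorted(graphics_dir.glob("*")):
            if img_path.suffix.lower() not in (".tif", ".tiff", ".jpg", ".jpeg", ".png"):
                continue
            with Image.open(img_path) as img:
                if img.mode == "CMYK":
                    print(img_path.name, "is CMYK -- convert it to RGB before using it in FCP")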

  • Rendering Out of Memory Error

    Why, when I render large parts of sequences, do I get "Out of Memory" errors? This bombs out my render.

    Not really, that's just the error message that FCP pukes up for you. FCP has no intelligent diagnostics, and the errors are totally misleading and inaccurate.
    The other boys are trying to help you using verbal shorthand that not all newbies understand. That's one of the many ways in which we figure out they're new.
    bogiesan

  • Out of Memory error, though nothing has changed?

    I have two large projects, 65 MB each, which I have had open in FCP 7 with no problems for months.
    Then I left them closed for a month and worked on other projects.
    Now I can't open either of those two earlier projects without receiving the Out of Memory error (I'm trying to open each one individually). There are many things to look for when this error appears, as found all over this forum and others. So here is my checklist of facts:
    Neither project contains any PSD files or any sequences with text files in them.
    One of those projects contains no sequences whatsoever, only clips.
    All media for these projects is Apple AVC-Intra 100M 1280x720.
    The RAID drive containing the media for these projects has not been used since the last time the projects were open and working fine.
    The edit system has not changed in any way:
    - FCP 7.0.3
    - Mac Pro
    - OS 10.6.6
    - 2x2.66 Quad Core
    - 6 GB RAM
    - LaCie 8TB RAID, with 1.7TB free
    All RAM is functioning fine.
    I've trashed FCP preferences and received the same results anyway.
    Again, both projects were working fine on this same computer, same version of FCP, even with BOTH projects open at once.
    Also, I've been working with other projects on this same system with no problems, some nearly as large as these two.
    So... none of the obvious culprits are at work here to cause this error, unless I'm missing something.
    Anyone have any ideas? This is a bit scary, as these are documentary projects which are only going to get bigger, and I've done projects bigger than this several times without problems. Hopefully I'm missing something simple...
    Thanks.

    I think I found the culprit, and it's an obvious one:
    My original big project, 65 MB, had a bin that was created by an assistant before I began work on the project, and I never saw it. That bin has sequence strings for every tape/card loaded into the project, so it had about 65 rather long sequences. Once I found it, I moved that bin to its own project and deleted it from the original project.
    Now the original project opens, and I can open multiple projects at once, no Out of Memory error.
    So it's the bin full of sequences that did it. If I had seen that bin before I wouldn't have had to run this up the flagpole here. But at least it reinforces that word of advice that has appeared in many previous posts about this error: Don't overload a project with long sequences. Break up the sequences into smaller projects and archive them.
    And my own addition to that advice: If you're an assistant setting up a project for an editor, make sure your editor knows what sequences you've tucked away in bins hidden inside bins inside bins, etc.
    All good to go! (Until the next error... stand by...)

  • Out of Memory Error on simple operations

    Hellooo Apple forums! I have a slightly large project referencing lots of media. It's a 25+ MB project file (and growing) containing a folder of clips from hundreds of master tapes, another folder containing hundreds of separately recorded audio files, and another folder containing sequences syncing the two together (also hundreds of these).
    So nothing too complicated here, with my sync sequences being the most complicated thing (simply audio laid under video).
    While opening new sequences, placing footage into my timeline, or opening clips in the viewer, I occasionally get an 'Out of Memory' error. I think I've only encountered the error during these three operations. There doesn't seem to be any consistency to its appearance. Sometimes I can have 15 sequences open before it pops up when I try to open a 16th. Other times I'll only have one sequence open and it'll appear when I try to put a clip into my timeline.
    My computer is a newly purchased dual 2.0 GHz PPC G5 with 512 MB of RAM (<-- the problem?) and a new install of FCP 5.1.2.
    In case this tidbit helps: one time when I encountered the error, I increased my still cache memory from 10% to 15% (changed to 28 MB), and I was able to resume syncing before it came up again.
    Sometimes I close my other sequences so that only one sequence is open; other times I just wait a bit and I can resume syncing. Any advice on how to make this error go away?

    Another question just popped into my mind: is there a technical explanation of why the project becomes slow to handle after a certain point, e.g. a ~100 MB threshold?
    I super vaguely recall reading something a long time ago that it was due to the way FCP saves the project files, relating to the XML language it uses. Any technical info on the slowness would be awesome to know.
    Also, Shane can you make a random post here so I can mark you with a solved star?

  • "out of memory error" message when not out of memory

    I've gotten this message before (as have others) when I'm clearly not out of memory. It's been about a year since this happened last, and I can't remember how I solved it. It seems many others have had this problem and that there has been no simple solution. It occurred when I was trying to revise some titles over an imported .mov image that was downloaded from the internet. I revised the titles, but they wouldn't render. Then I got the "out of memory" error message. So far I have:
    1) Logged out and logged back in
    2) Rebooted
    3) Pasted my sequence into a new sequence
    4) Saved the project as a new project
    None of these things have worked. Of course this all happens when I'm at the very end of the project with a deadline of tomorrow.

    I'm having the same problem. I searched "out of memory" and found this solution on another thread:
    http://discussions.apple.com/thread.jspa?messageID=1162802&#1162802
    However, it didn't help me and I am still looking for an answer. I am running FCP HD 4.5 and Avid Express Pro. I edited together an 18-minute video with music, no effects, and exported it as a reference movie. I imported it to FCP to add chapter markers, and that's when I got the "General Error Out of Memory". I tried the make-clip-offline thing, but that didn't work. I tried exporting a different, 3-minute video from Avid, and that worked (with chapter markers).
    Anyway, I feel your pain. If you find a solution, please mark this thread "question answered".
    Thanks
    Quicksilver 867   Mac OS X (10.3.9)   1.12 GB SDRAM, 80 gig + 160 gig

  • Acrobat XI Pro "Out of Memory" Error.

    We just received a new Dell T7600 workstation (Win7, 64-bit, 64 GB RAM, 8 TB disk storage, and more processors than you can shake a stick at). We installed the Adobe CS6 Master Collection, which included Acrobat X Pro. Each time we open a PDF larger than roughly 4 MB, the program returns an "out of memory" error. After running updates and uninstalling and reinstalling (several times), I bought a copy of Acrobat XI Pro hoping this would solve the problem. The same problem still exists upon opening the larger PDFs. Our business depends on opening very large PDF files and we've paid for the Master Collection, so I'd rather not use a freeware PDF reader. Any help, thoughts, and/or suggestions are greatly appreciated.
    Regards,
    Chris

    As mentioned, the TEMP folder is typically the problem. MS limits the size of this folder and you have 2 choices: 1. empty it or 2. increase the size limit. I am not positive this is the issue, but it does crop up at times. It does not matter how big your hard drive is; it is a matter of the amount of space that MS has allocated for virtual memory. I am surprised that there is an issue with 64 GB of RAM, but MS is real good at letting you know you can't have it all for use, because you might want to open up something else. That is why a lot of big packages turn off some of the limits of Windows or use Linux.
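
    If you want to see how much is actually sitting in the TEMP folder before emptying it, a small Python sketch like the one below will total it up (it only reads file sizes; it doesn't delete anything):

        import os
        import tempfile

        # tempfile.gettempdir() resolves the TEMP/TMP environment variables on Windows.
        temp_dir = tempfile.gettempdir()

        total = 0
        for root, dirs, files in os.walk(temp_dir):
            for name in files:
                try:
                    total += os.path.getsize(os.path.join(root, name))
                except OSError:
                    pass  # file vanished or is locked by another process -- skip it

        print("%s: %.0f MB in temp files" % (temp_dir, total / (1024.0 ** 2)))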

  • Acrobat XI Pro "Out of Memory" error after Office 2010 install

    Good Afternoon,
    We recently pushed Office 2010 to our users and are now getting reports of previous installs of Adobe Acrobat XI Pro no longer working but throwing "Out of Memory" errors.
    We are in a Windows XP environment. All machines are HP 8440p/6930p/6910 with the same Service pack level (3) and all up to date on security patches.
    All machines are running Office 2010 SP1.
    All machines have 2 GB or 4 GB of RAM (only 3.25 GB recognized, as we are a 32-bit OS environment).
    All machines have adequate free space (ranging from 50 GB to 200 GB of free space).
    All machines are set to a 4096 MB initial page file size with an 8192 MB maximum page file size.
    All machines with Acrobat XI Pro *DO NOT* have Reader XI installed alongside. If Reader is installed, it is Reader 10.1 or higher.
    The following troubleshooting steps have been taken:
    Verify page file size (4096 MB - 8192 MB).
    Deleted local user and Windows temp files (%temp% and c:\WINDOWS\Temp both emptied).
    Repair on Adobe Acrobat XI Pro install. No change.
    Uninstall Acrobat Pro XI, reboot, re-install. No change.
    Uninstall Acrobat XI Pro along with *ALL* other Adobe applications presently installed (Flash Player, AIR), delete all Adobe folders and files found in a full search of the C drive, delete all orphaned Registry entries for all Adobe products, re-empty all temp folders, reboot.
    Re-install Adobe Acrobat XI Pro. No change.
    Disable enhanced security in Acrobat XI Pro. No change.
    Renamed Acrobat XI's plug_ins folder to plug_ins.old.
    You *can* get Acrobat to open once this is done but when you attempt to edit a file or enter data into a form, you get the message, "The "Updater" plug-in has been removed. Please re-install Acrobat to continue viewing the current file."
    A repair on the Office 2010 install and re-installing Office 2010 also had no effect.
    At this point, short of re-imaging the machines (which is *not* an option), we are stumped.
    We have not yet tried rolling back a user to Office 2007 as the upgrade initiative is enterprise-wide and rolling back would not be considered a solution.
    Anyone have any ideas beyond what has been tried so far?

  • RoboHelp 9 gives an out of memory error and crashes when I try to import or link a Frame 10 file or

    I have Tech Suite 3. If I start a new RoboHelp project and try to import or link Frame files, RoboHelp tries for a while, then I get an Out of Memory error and the program crashes.
    I opened one of the sample projects and was able to link to one of my frame files without any problem, so it seems to be an issue with creating something new.
    Any suggestions?

    It happens when I create a new project and then try to import or link frame docs to make up the content. It starts scanning, then crashes. I did get it to the conversion setting page once, but no further.
    It does not happen if I open one of the supplied example projects and link a file. But then it doesn't let me choose any style mapping during import. And I can't delete the sample project folder.
    Twice now it has told me when I tried to import (not link, but import) that my .fm file could not be opened, and told me to verify that Frame is installed (it is) and that the file is a valid frame file (it is).
    The docs and project are in separate folders on my C: drive.

  • Out of Memory Error While deploying as EAR file

    Hi,
    I was trying to deploy an EAR file of size 63 MB which in turn contains about 60 EJB JARs. There are no WARs. application.xml has all the entries for the JARs. While I am deploying, it gives an Out of Memory error. Is there any way to work around this problem? I am using my own hand-written Java application, which uses the SunONE deployment APIs for deployment. Can you please tell me how to tackle this problem? I am running my application through a batch file which uses JDK 1.4.
    Please help me regarding this issue.

    You can set the initial heap size and maximum heap size for the JVM, either in the app-server admin console or in one of your scripts. You can look up the syntax.
    I had this error yesterday. I too had run out of memory (150 MB). You simply need to allocate more to the app-server.
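
    For reference, the usual flags are -Xms (initial heap) and -Xmx (maximum heap) on the java command line, which you could add to the batch file that launches your deployment app (or set the equivalent values in the app-server admin console for the server side). Below is a rough sketch, written in Python purely to make the flag placement concrete; the jar name, main class, EAR name and heap sizes are placeholders only:

        import subprocess

        # Placeholder classpath, main class and heap sizes -- adjust to your setup.
        cmd = [
            "java",
            "-Xms128m",                 # initial JVM heap: 128 MB
            "-Xmx512m",                 # maximum JVM heap: 512 MB
            "-cp", "deploytool.jar",    # hypothetical jar holding your deployment app
            "com.example.MyDeployer",   # hypothetical main class
            "myapp.ear",
        ]
        subprocess.run(cmd, check=True)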
