Final Cut Pro: "Error: out of memory" when rendering with high-def clips

I keep getting this notice when rendering: "Error: out of memory." Both my external drive and internal drive have plenty of memory; 1.8 TB and 101 GB respectively. It makes no difference if I am rendering at full resolution or draft. I solved it for a while by clearing out some memory on my internal hard drive,
but now it is back. What is going on? This didn't happen when I was working with standard-definition clips, but now I am working with high-def. What to do?

Drives don't have 'memory' in this context; they are storage containers.
Check my answer to your other thread and give us more info about your rig... hardware/software (models/versions), how everything's connected, exact specifications of the media you're working with, and your sequence settings.
K

Similar Messages

  • "Error: Out of Memory" when trying to Render

    Hello all,
    I'm editing a project in FCP 5 that is using, in part, .mpeg-4 videos downloaded from the internet. After several weeks on the project with things going fine, the downloaded videos can no longer render. I can see them in the source window, but when laid into the sequence these downloaded videos produce an "Error: Out of Memory" message. Other videos of this type that are already rendered and playing in the existing sequence cannot be re-rendered or adjusted without producing the same Error message.
    A few things to remember...
    1. Other types of rendering, such as filters and fit to fill effects, render just fine.
    2. There is over 30 GB of free space on the external drive this project is running on.
    3. I've tried resetting scratch disks, transferring the project to another drive, and opening it on a new, super-fast computer, and nothing is different.
    Please help, I'm really stuck in the mud here and am dying to get past this issue,
    Peace,
    Sam

    Did either of you find an answer? I'm running into the SAME issues. I feel strongly that it has to do with the QuickTime update, as editing MP4s worked FINE in January.
    There is a lengthy protocol for trying to circumvent this issue:
    http://discussions.apple.com/thread.jspa?threadID=1413028&tstart=0
    Didn't work for me, but I think it worked for the original poster. And VLC (http://www.videolan.org/) ended up being the only program that can even properly VIEW MP4s for me now.
    So if you can use a viewer, that works.

  • General Error - Out of memory when viewing clips

    Hi All. I have a problem.
    Right, I have 720p clips, about 5-10 minutes long each, 100-150 MB in size. I have roughly 180 of these clips. Now some of the clips can be viewed; others (most) cannot. Every time I click to view them, it brings up the "error: out of memory" message. All clips and the project are being kept on an external HD (1 TB).
    I've attempted to start a new project on the Mac HD, same issue (with the clips still on the external HD), and I've also tried putting only one clip in, with the same error message.
    How can I resolve this issue?
    Cheers
    Tony

    Don't convert to DivX; that's not an editing codec either.
    Read my MPEG Streamclip tutorial: http://www.secondchairvideo.com/?p=694

  • Error out of memory when opening an edited sequence

    Don't know if this is any use, but I was attempting to open an hour-long sequence in FCP
    (I'm only on 6.0.5, but anyway...) and I kept getting an "error: out of memory". I did the obvious things: opened a new project and put the sequence in it; that didn't work. Eventually I found that if you open a new sequence and drop the one that won't open into it (this will nest it), then double-click on the nested sequence, it will open the original sequence without a memory issue....
    Sloppy bit of code there somewhere!!!
    Justin

    After updating to FCP 7.0.3 and OS X 10.6.4, a lot of memory issues have been solved.

  • "File Error" and "Out of Memory" when rendering FCE

    Dear all,
    I imported a 2-hour holiday video into FCE.
    When rendering the video, a message appears saying the rendering process will take about 15 minutes to complete.
    However, frequently I am warned by the following message:
    "File Error: The specified file is open and in use by this or another application"
    When activating Rendering again, the render time is now increased to about 45 minutes.
    I now either receive a couple of times the message: "File Error: The specified file is open and in use by this or another application" >>or even worse: "Out of memory"<<.
    Today I purchased an additional 2 GB of memory to increase my memory from 750 MB to 2.5 GB!!!
    Can anyone please tell me what could be the cause of these messages and how to solve them?
    BTW, no other programs are running while I use FCE.
    Thanks in advance,
    Hans E.
    PowerMac G5-Dual 1.8GHz, Memory 2.5GB, iMac G3-600MHz, Airport, Airport Express, Mac OS X (10.3.9)

    Is it happening when you're rendering, or exporting?
    The error message means FCE is trying to rewrite a file that it is currently using. If you're only rendering, it could be mistakenly trying to save a render file; if you're trying to re-export over an existing file, you'll also get that message.
    Try dumping all your render files, restarting FCE and trying again.
    The Out of Memory error almost always points toward a corrupt file. Could be a source file, could be a render file. Again, dump your render files and try again.
    If these don't work, you need to close FCE, take all your source media offline, then open FCE. Now, reconnect each clip one by one until you find the corrupt clip. Keep going, there may be more than one. Re-capture the corrupt clips and you should be good to go.

  • Out of memory when rendering...mystery! urgent help!

    OK, this is mysterious. And stressful, because of course it happens when I have to output this and get on a plane.
    I created a stereo sequence which is high-def, 3840x1080, Photo JPEG. It has stills and embedded HDV sequence content.
    I get an out of memory error when exporting a QuickTime movie, either through FCP or Compressor.
    So I have gone to the individual sequences (left and right), which are normal HDV sequences with video and stills, and output a QuickTime of each. I'll import the QuickTimes and build the stereo sequence with these two QuickTime files.
    The right sequence rendered and output as a QuickTime movie just fine.
    The left sequence failed with an out of memory error.
    So I started rendering the sequence a bit at a time to find where the problem is. The problem is in two pieces of HDV footage and a few frames of a photo.
    HERE'S THE STRANGE THING:
    Except for the few frames of still photo, which are in a transition with a text overlay, everything is identical to the right sequence, which rendered and output JUST FINE! The only difference with the text is that its position is slightly different... but 97% of it rendered just fine. I found one still which was grayscale, which I've changed to RGB, but still no difference in being able to render. And it rendered in grayscale on the right sequence.
    I've changed my scratch disk to a drive with over a TB of space, and I have 5 GB of RAM with only FCP running, on a dual-core 2.67 GHz Mac Pro.
    help!

    Deleting FCP Preferences:
    Open a Finder window. Set the view mode to columns:
    Click the house icon that has your name next to it.
    Click the folder "Library" in the next column.
    Find and click the folder "Preferences" in the next column.
    Find and click the folder "Final Cut Pro User Data" in the next column.
    Move the files "Final Cut Pro 6.0 Prefs", "Final Cut Pro Obj Cache" and "Final Cut Pro Prof Cache" to the Trash and empty it.
    When you have a stable set of preference files, download [Preference Manager|http://www.digitalrebellion.com/pref_man.htm] to back them up. If you have further problems, the backup can be recalled and you can continue working.

  • Multiclip Nightmare: General Error/Out of Memory when trying to match frame

    I'm an assistant editor on a reality show and am fairly experienced with Final Cut Pro. But I've never seen something like this.
    At our post house, we have over 30 Macs running FCP7, streaming media off an Xserve/Promise RAID SAN via Fibre Channel. We create multiclips for everything (it's a reality show) and have run into some weird bugs.
    What's happening is that multiclips will all of a sudden be unable to match back to the original multiclip. It gives a General Error, then an Out of Memory error. Sometimes, after those errors, FCP immediately quits. When I double-click the clip in the timeline, I get the same general error. I can, however, match back to the original media clip. But matching back to the multiclip causes the error.
    I can't pinpoint a cause or common element. It's happening to different episodes with different editors and different project files.
    Also, randomly, things will go offline for no reason. And not everything in a timeline, just some things (which makes no sense). We can reconnect them quickly since the media drive never went offline in the first place. I mention this because I'm wondering if both of these problems are common symptoms of corrupt multiclips and if there's a way to fix them. The only element that I can find linking media randomly going offline is that these multiclips had subclips embedded in them. But even that doesn't seem to be a common thread.
    These timelines have been passed around from story editors to editors to AEs then back to editors, so we're thinking these multiclips might be getting corrupted through that process. I don't understand why because everything's looking at the server and all the Macs are pretty much identical.
    What I began to do was recut each clip in on the editor's timeline with the original multiclip. It fixes the problem, but it's beyond time-consuming, to say the least. Now that these errors have come up three times in different places, I'm worried we have a widespread problem and am wondering if there's a better way to fix it.
    Details:
    We transcoded all the DVCPRO tapes with AJA Kona Cards to NTSC ProRes (LT) with plans to online the show once we lock
    Everyone's running the latest FCP7 and Snow Leopard release

    Nate Orloff wrote:
    It seems that FCP is making HUGE multiclips. I'm guessing that's the source of the memory error. Some multiclips (the ones it'll let me open) are 9 hours long.
    Interesting... Could be the source of your problem indeed
    And for no reason.
    Are you sure there's no reason? When making multiclips, isn't there one clip whose timecode passes 00:00:00:00, for example? Or a small clip that's also selected with a completely different timecode? Is it black for all 9 hours? etc.
    There must be something that is causing these long multiclips.
    I agree with Tom (down in this thread) that sending Apple feedback with as much detail as possible might be helpful.
    [FCP feedback|http://www.apple.com/feedback/finalcutpro.html]
    I would also be interested in an XML of one of the culprit clips. Not that I know answers, but looking at it might give some clue.
    Please select one of the wrong multiclips (in the browser). Select File > Export > XML (version 5, include metadata; please also make a version 4 XML).
    Send both XMLs to my email address (in my profile; click on my name in the left column of this post). No promises, but you never know...
    Rienk

  • Out of memory when I render HD with many layers

    I get an error message (out of memory) when I render full HD 1920 x 1080 in Final Cut Pro 5 with many layers (6 to 9 layers). I have 5.5 GB of memory, and when it crashes... 1.5 GB of memory is free in my Activity Monitor. If I buy more RAM, will my problem be resolved? Please help me if anybody knows about this problem.
    Thanks
    Yan

    I am having the same problem...(out of memory on renders)
    I've been watching my activity monitor and cancelling the render before I run out of memory... but I'm not getting much done and I can't do a big render at night.
    I've done all the usual things... no CMYK files, trashed preferences... etc.
    Two questions:
    I am using Magic Bullet filters and they seem to stack up the inactive RAM. (When the inactive RAM fills up, that's when I get the "out of memory".) You said you saw a post about that a few days ago... Was there an answer besides changing out the filters? Or can you point me towards the posting?
    I'm also trying to figure out how to dump or clear the inactive RAM once I'm done with a render. I found that if I do a full restart I can clear my RAM and get a bunch of renders done before I fill up the inactive RAM again. This is going to be a long project if I have to do that.
    thanks
    Mark

  • Rendering Error: Out of memory

    Hi
    I'm having problems with this error message: "Out of memory" when I'm trying to render a sequence in FCP 7.0.2.
    I've got a new iMac 27" i7 with 8 GB RAM; system settings are 100% on application memory and still cache.
    It's a 50-minute sequence in ProRes 1080 format, with material in ProRes, XDCAM EX, and some JPEG pictures. Lots of space on disk.

    Hi Guys,
    Thanks for your info about the RGBs & 4K JPEGs, which I followed, but the problem still annoyed me all last week. Then it made me check my 9 GB of RAM & erase most of the ProRes footage on my HDD.
    Finally, I had 2 solutions. First try: export the graphics into a ProRes clip, then key (superimpose) it onto my timeline.
    Second try: export to Motion & make the animation there. (This gives the best result, as Motion has the alpha channel for the keying.)
    What I guess is that the calculation of rendering a complex graphic takes all the memory from our system... so... OUT OF MEMORY. In my graphics, I put a short & speedy movement moving around in my timeline.
    Hope this helps all of us not spend our time checking the hardware.
    I suggest the Apple FCP support team should change the warning message to something like "check your media", or give more explanation.

  • I'm getting an error Out of memory message when I try to render - There's lots of memory

    I'm getting an error Out of memory message when I try to render - there's lots of memory and I've never gotten this before - any ideas?

    Thanks - I did go through and change all the profiles to Apple RGB (there are several to choose from), but I'm still having problems. I think it has to do with corrupt images. I'm backing up and starting over. I've worked with images large and small for years and never had this problem - thanks

  • "AE Effects Error: out of memory rendering '55mmv7_MonoTint.plugin"

    I am trying to export (using QuickTime Movie) a nested sequence; when about 35% through the export, it stops and gives me this error message:
    "AE Effects Error: out of memory rendering '55mmv7_MonoTint.plugin"
    The editor (I am only the assistant) is using a filter package from Digital Film Tools, and they aren't in business any more. Could it be that FCP 6 doesn't support these plugins?
    Some of the clips in the sequence aren't rendered. Do they need to be rendered before exporting to Quicktime?
    Some of the clips have a "Faux Film" filter on them but that's not the error that pops up.
    Not sure what else to say.
    Thanks for your help.

    Turn off the effects that you suspect and try the render again. That would be a good place to start.

  • My Final Cut Studio keeps giving me this "error: out of memory" message whenever I try to render a piece of footage, and it's closing unexpectedly. Does anybody know the reason?

    My Final Cut Studio keeps giving me this "error: out of memory" message whenever I try to render a piece of footage, and it's closing unexpectedly. Does anybody know the reason?

    I had this problem about 2 months ago and, as others have suggested in this thread, eventually traced it back to some stills I had scanned and brought into the project. Accidentally I had changed the scanner settings, with the result that a couple of stills were well beyond 1980 x 1020 pixels. I resampled the stills and everything was fine again. Hope this might help.

  • Server goes out of memory when annotating TIFF File. Help with Tiled Images

    I am new to JAI and have a problem with the system going out of memory
    Objective:
    1) Load up a TIFF file (each approx. 5-8 MB when compressed with CCITT.6 compression)
    2) Annotate the image (consider it a simple drawString with the Graphics2D object of the RenderedImage)
    3) Send it to the servlet OutputStream
    Problem:
    Server goes out of memory when 5 threads try to access it concurrently
    Runtime conditions:
    VM param set to -Xmx1024m
    Observation
    Writing the files takes a lot of time when compared to reading the files
    Some more information
    1) I need to do the annotating at pre-defined specific positions on the images (e.g. in the first quadrant, or maybe in the second quadrant).
    2) I know that using the TiledImage class it's possible to load up a portion of the image and process it.
    Things I need help with:
    I do not know how to send the whole file back to the servlet output stream after annotating a tile of the image.
    If I write the tiled image back to a file, or to the output stream, it gives me only the portion of the tile I read in and watermarked, not the whole image file.
    I have attached the code I use when I load up the whole image
    Could somebody please help with the TiledImage solution?
    Thx
    // Assumed imports for this snippet (JAI plus JAI Image I/O Tools on the classpath;
    // the exact TIFF helper packages may differ by version). Annotater, AnnotationParameter
    // and setTIFFCompression are the poster's own helpers and are not shown in the post.
    import java.io.File;
    import java.io.OutputStream;
    import java.util.Iterator;
    import javax.imageio.IIOImage;
    import javax.imageio.ImageIO;
    import javax.imageio.ImageReader;
    import javax.imageio.ImageWriteParam;
    import javax.imageio.ImageWriter;
    import javax.imageio.metadata.IIOMetadata;
    import javax.imageio.stream.ImageOutputStream;
    import javax.media.jai.JAI;
    import javax.media.jai.TiledImage;
    import com.sun.media.imageio.plugins.tiff.BaselineTIFFTagSet;
    import com.sun.media.imageio.plugins.tiff.TIFFImageWriteParam;
    import com.sun.media.jai.codec.TIFFDirectory;

    public void annotateFile(File file, String wText, OutputStream out, AnnotationParameter param) throws Throwable {
        ImageReader imgReader = null;
        ImageWriter imgWriter = null;
        TiledImage in_image = null, out_image = null;
        IIOMetadata metadata = null;
        ImageOutputStream ios = null;
        try {
            Iterator readIter = ImageIO.getImageReadersBySuffix("tif");
            imgReader = (ImageReader) readIter.next();
            imgReader.setInput(ImageIO.createImageInputStream(file));
            metadata = imgReader.getImageMetadata(0);
            // Wrap the lazily decoded source in a writable TiledImage.
            in_image = new TiledImage(JAI.create("fileload", file.getPath()), true);
            System.out.println("Image Read!");
            Annotater annotater = new Annotater(in_image);
            out_image = annotater.annotate(wText, param);
            Iterator writeIter = ImageIO.getImageWritersBySuffix("tif");
            if (writeIter.hasNext()) {
                imgWriter = (ImageWriter) writeIter.next();
                ios = ImageIO.createImageOutputStream(out);
                imgWriter.setOutput(ios);
                ImageWriteParam iwparam = imgWriter.getDefaultWriteParam();
                if (iwparam instanceof TIFFImageWriteParam) {
                    // Re-use the compression of the source TIFF (e.g. CCITT Group 4).
                    iwparam.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
                    TIFFDirectory dir = (TIFFDirectory) out_image.getProperty("tiff_directory");
                    double compressionParam = dir.getFieldAsDouble(BaselineTIFFTagSet.TAG_COMPRESSION);
                    setTIFFCompression(iwparam, (int) compressionParam);
                } else {
                    iwparam.setCompressionMode(ImageWriteParam.MODE_COPY_FROM_METADATA);
                }
                System.out.println("Trying to write Image ....");
                imgWriter.write(null, new IIOImage(out_image, null, metadata), iwparam);
                System.out.println("Image written....");
            }
        } finally {
            if (imgWriter != null)
                imgWriter.dispose();
            if (imgReader != null)
                imgReader.dispose();
            if (ios != null) {
                ios.flush();
                ios.close();
            }
        }
    }
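
    One possible way to handle the TiledImage question, sketched below: draw the annotation directly onto a TiledImage wrapped around the lazily loaded source, then hand that whole image (not a single tile) to the writer, so the complete TIFF reaches the servlet stream. This is only a sketch under assumptions that are not in the original post: the class and method names, the text position, and the plain ImageIO.write call are illustrative; it requires a registered TIFF ImageWriter (JAI Image I/O Tools provides one); and whether it stays inside -Xmx1024m with 5 concurrent threads depends on the writer pulling strips/tiles rather than buffering the full raster.

    import java.awt.Color;
    import java.awt.Graphics2D;
    import java.awt.image.RenderedImage;
    import java.io.File;
    import java.io.OutputStream;
    import javax.imageio.ImageIO;
    import javax.media.jai.JAI;
    import javax.media.jai.TiledImage;

    public class TiffAnnotationSketch {

        // Draws a text annotation at (x, y) and streams the complete TIFF to 'out'.
        public static void annotateAndStream(File tiff, String text, int x, int y,
                                             OutputStream out) throws Exception {
            // "fileload" decodes tiles lazily, so the compressed TIFF is not
            // expanded into memory all at once.
            RenderedImage source = JAI.create("fileload", tiff.getPath());

            // Writable wrapper over the source; only the tiles actually touched
            // by the Graphics2D calls are materialised.
            TiledImage canvas = new TiledImage(source, true);
            Graphics2D g = canvas.createGraphics();
            try {
                g.setColor(Color.BLACK);
                g.drawString(text, x, y); // e.g. a position in the first quadrant
            } finally {
                g.dispose();
            }

            // TiledImage is a RenderedImage, so the writer pulls every tile of
            // the full image: the whole file reaches the stream, not just one tile.
            if (!ImageIO.write(canvas, "tif", out)) {
                throw new IllegalStateException("no TIFF ImageWriter registered");
            }
            out.flush();
        }
    }

    Note that the plain ImageIO.write call above ignores the source compression; to keep CCITT output you would still set up an ImageWriteParam as in the code further up.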

    user8684061 wrote:
    > You are right, the SGA is too large for my server.
    > I guess Oracle set the SGA automatically when I chose the default installation, but why would the SGA be so big? Is Oracle not smart enough?
    The default database configuration reserves 40% of physical memory for the SGA for an instance, which you as a user can always change. I don't see anything wrong with that, so I wouldn't say Oracle is not smart.
    > If I don't decrease the SGA, but increase max-shm-memory, would it work?
    This needs support from the CPU architecture (32-bit or 64-bit) and the kernel as well. Read more about huge pages.

  • When I import the PPT, why do I always get an "out of memory" import error? Thank you!

    When I import the PPT, why do I always get an "out of memory" import error? Thank you!

    > We are getting an error (the system says "out of memory"), and when we referred to the messages in the SAP BPC forum, a few have provided a solution asking to shrink the log file size.
    > They have also provided the SQL syntax to reduce the log file size.
    > " dbcc shrinkfile (Appset_Log, 1) " -
    This is for Microsoft SQL Server, not for Oracle.
    If you get "out of memory" it's not a database logfile problem but a memory problem.
    > Exact Error message - out of memory (Pop-up screen - no other information available)
    When is this happening?
    And do not post the same question in different forums:
    SAP BPC Netweaver and error message (SQL)
    Markus

  • NIGHTMARE with ERROR OUT OF MEMORY

    I was about to finish an 80-hour editing project and suddenly this morning my FCP says ERROR OUT OF MEMORY! So NOW I can't open the editing sequence anymore.
    HELP! I am desperate!
    Any idea??

    Thanks a lot! I read another post more or less like this, and somebody mentioned a program named FCP Rescue, and it works really well!
    I erased all render files, then with this program (I still don't understand well how it works) deleted the FCP preferences (with FCP closed -- no instructions available for the use of this little program), and then took the sequence offline.
    YES! I think the problem was a CMYK graphic I had. So then I opened the sequence - everything was offline (as it was supposed to be) - then opened a new project with a different name (as a precaution), and copied and pasted all the edits of the original editing sequence into the new project, but I erased the CMYK GRAPHIC -- now it's working fine in the new project.
    Have to say that the older one -- even though I erased the CMYK graphic -- still has troubles, so...
    it's better to work in a new project.
    Thanks a lot for your responses, VERY VERY USEFUL!!
    Ah, one more thing: FCP (Apple) should warn users that FCP is not compatible with CMYK graphics and that they will cause conflicts.
