"File Error" and "Out of Memory" when rendering FCE

Dear all,
I imported a 2 hour holiday video into FCE.
When rendering the video, a message appears estimating that the rendering process will take about 15 minutes to complete.
However, frequently I am warned by the following message:
"File Error: The specified file is open and in use by this or another application"
When I start rendering again, the estimated render time has increased to about 45 minutes.
I then receive, sometimes several times, the message: "File Error: The specified file is open and in use by this or another application", or, even worse: "Out of memory".
Today I purchased an additional 2 GB of memory to increase my memory from 750 MB to 2.5 GB!
Can anyone please tell me what could be the cause of these messages and how to solve them?
BTW, no other programs are running while I use FCE.
Thanks in advance,
Hans E.
PowerMac G5-Dual 1.8GHz, Memory 2.5GB, iMac G3-600MHz, Airport, Airport Express Mac OS X (10.3.9)

Is it happening when you're rendering, or exporting?
The error message means FCE is trying to rewrite a file that it is currently using. If you're only rendering, it could be mistakenly trying to save a render file; if you're trying to re-export a file, you'll get that message too.
Try dumping all your render files, restarting FCE and trying again.
The Out of Memory error almost always points toward a corrupt file. Could be a source file, could be a render file. Again, dump your render files and try again.
If these don't work, you need to close FCE, take all your source media offline, then open FCE. Now, reconnect each clip one by one until you find the corrupt clip. Keep going, there may be more than one. Re-capture the corrupt clips and you should be good to go.
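The one-by-one reconnection described above is, in effect, a linear scan for the bad clip. As a rough illustration only (this is not an FCE feature, and a clip can be corrupt at the codec level while still being byte-readable, so a scan like this only catches files the OS itself cannot read):

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class CorruptScan {
    // Attempt a full sequential read of each file; any I/O failure flags it.
    // This mirrors the "reconnect clips one by one" hunt, at the file level.
    static List<Path> findUnreadable(List<Path> clips) {
        List<Path> bad = new ArrayList<>();
        byte[] buf = new byte[64 * 1024];
        for (Path clip : clips) {
            try (InputStream in = Files.newInputStream(clip)) {
                while (in.read(buf) != -1) { /* just drain the stream */ }
            } catch (IOException e) {
                bad.add(clip);  // unreadable: a re-capture candidate
            }
        }
        return bad;
    }
}
```

As the reply notes, keep scanning after the first hit: there may be more than one bad clip.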

Similar Messages

  • "General Error" and "Out of Memory" for only certain files?

    I have Final Cut Express 4 on Mac OS X Leopard and it has been working fine, up until now.
    For some reason, when I try to view two clips in the tab to the left of the screen to find sections of them to put into my project, I get a message that says, "General Error" and when I click okay it says, "Error: Out of Memory". They are both mp4 files, and are 242.9 MB and 294.2 MB, and I have viewed them both on Quicktime. I don't understand why it is saying that it is out of memory when I still have 7 GB left on my computer and it still lets me view and add other files into my sequence.
Can someone help me out and tell me how to fix this? I'd really appreciate advice!

MPEG-4 is not a format that works in FCE. You'll have to convert it to one of FCE's formats. Without knowing details about the original format, it's impossible to say what you should convert it to.

  • Final_cut_pro-"Error: out of memory" when rendering with hidef clips

I keep getting this notice when rendering: "error: out of memory." Both my external drive and internal drive have plenty of memory; 1.8 TB and 101 GB respectively. It makes no difference if I am rendering at full resolution or draft. I solved it for a while by clearing out some memory on my internal hard drive,
but now it is back. What is going on? This didn't happen when I was working with regular definition clips, but now I am working with high def. What to do?

    Drives don't have 'memory' in this context, they are storage containers.
    Check my answer to your other thread and give us more info about your rig...Hardware/Software (models/versions), how everything's connected, exact specifications of the media you're working with and sequence settings.
    K

  • Out of memory when rendering...mystery! urgent help!

OK, this is mysterious. And stressful, because of course it happens when I have to output this and get on a plane.
I created a stereo sequence which is high-def, 3840x1080, Photo JPEG. It has stills and embedded HDV sequence content.
I get an out of memory error when exporting a QuickTime movie, either through FCP or Compressor.
So I have gone to the individual sequences (left and right), which are normal HDV sequences with video and stills, and output a QuickTime of each. I'll import the QuickTimes and build the stereo sequence with these two QuickTime files.
The right sequence rendered and output as a QuickTime movie just fine.
The left sequence failed with an out of memory error.
So I started rendering the sequence a bit at a time to find where the problem is. The problem is on two pieces of HDV footage and a few frames of a photo.
HERE'S THE STRANGE THING:
Except for the few frames of still photo, which are in a transition of text overlay, everything is identical to the right sequence, which rendered and output JUST FINE! The only difference with the text is that its position is slightly different... but 97% of it rendered just fine. I found one still which was grayscale, which I've changed to RGB, but still no difference in being able to render. And it rendered in grayscale on the right sequence.
I've changed my scratch disk to a drive with over a TB of space, and I have 5 GB of RAM with only FCP running, on a dual-core 2.67 Mac Pro.
    help!

    Deleting FCP Preferences:
    Open a Finder window. Set the view mode to columns:
    Click the house icon that has your name next to it.
    Click the folder "Library" in the next column.
    Find and click the folder "Preferences" in the next column.
    Find and click the folder "Final Cut Pro User Data" in the next column.
    Move the files "Final Cut Pro 6.0 Prefs", "Final Cut Pro Obj Cache" and "Final Cut Pro Prof Cache" to the Trash and empty it.
    When you have a stable set of preference files, download [Preference Manager|http://www.digitalrebellion.com/pref_man.htm] to back them up. If you have further problems, the backup can be recalled and you can continue working.
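For the record, the three manual file moves described in the steps above boil down to this (a sketch only; the folder layout and file names are the ones from the steps, and moving the files to a backup folder is safer than the Trash if you want to restore them later):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class TrashFcpPrefs {
    // Move the three FCP preference files out of the user-data folder so
    // FCP rebuilds fresh ones on next launch. Missing files are skipped.
    static int movePrefs(Path userData, Path backupDir) throws IOException {
        String[] prefs = {
            "Final Cut Pro 6.0 Prefs",
            "Final Cut Pro Obj Cache",
            "Final Cut Pro Prof Cache",
        };
        Files.createDirectories(backupDir);
        int moved = 0;
        for (String name : prefs) {
            Path src = userData.resolve(name);
            if (Files.exists(src)) {
                Files.move(src, backupDir.resolve(name),
                        StandardCopyOption.REPLACE_EXISTING);
                moved++;
            }
        }
        return moved;
    }
}
```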

  • General Error and Out of Memory Error - affecting multiple projects

    I am running Final Cut Pro 6 on my 1.25 GHz PowerPC G4 iMac, with 1.25 GB RAM. OS X 10.4.11.
    I have had this setup for more than a year and used it without any problems.
    I have two projects stored on an external LaCie firewire drive with more than 140GB disk space remaining. As of this morning, I cannot work on either of them: both are throwing the "General Error" message, and one of them is also telling me it's "Out of Memory". On project #1, the "Out of Memory" occurs whenever I try to open a clip in the viewer, following a "General Error" message. On project #2, the "General Error" message occurs when I try to open the project; it gets halfway through the process and then throws the error, leaving me unable to open the timeline.
    Both projects are short, less than 3 minutes long, and neither project contains any CMYK or grayscale graphics. Project #2 does have a short Motion sequence.
    Things I have tried:
    ~ restarting and hard-rebooting
    ~ trashing preferences
    ~ rebuilding permissions
    ~ trashing render files
    ~ creating a new project
    ~ searching this forum and Google for other answers
    Help?

    Thanks for the support, Jim. I've had terrible experiences with Firewire drives in the past, too; regrettably, there doesn't seem to be an affordable alternative at this point.
    I just looked up resetting the PMU, as that's not something I've done before. I really hope it's that simple, as the thought of recreating all these clips makes my head hurt. But I'll definitely try your suggestion of reconnecting individual media files first. I've been through that before.

  • FCE 3.5 "general error" and "out of memory" issues

    Hey folks -
    I'm sure you've covered this before, but couldn't find a good thread....
I've imported some Flip footage for my kid's project (due in two days, of course!) and now we are encountering a "General Error" message followed directly by an "out of memory" message when we try to play a clip. I think there is plenty of memory to get started; not sure how to fix this. Tried trashing and repairing preferences.... Hmmm...
So very grateful for any assistance....!
Running a 2.4 GHz Intel Core Duo iMac, 84 GB remaining on the internal hard drive, and using a G-DRIVE 1 TB external drive with 340 GB available as the scratch disk.

All set. Was going crazy, because I thought I had it, then lost it....! Your settings were perfect. Ended up downloading the MPEG Streamclip program. Used the "Apple Intermediate Codec" codec, 29.97 frame rate, uncompressed 48k audio, and the setup that seemed happy was the last Easy Setup setting in FCE 3.5: HDV-Apple Intermediate Codec 720p30. I just needed to make sure I used a new sequence to import the newly transferred clips into.
    Whew.
    Really, many thanks for your assistance!
    All the best,
    DJ

  • General error and out of memory with Final Cut Pro 6

    I just upgraded to a Mac mini i7. What setting should I use to avoid these errors?

    So you were successful in getting FCS 2 apps installed in Mountain Lion?
    Are these messages you're getting now or ones that you have gotten in the past using FCP?
Out of memory warnings are often caused by still images that are not RGB, or by trying to use unsupported media.
    Russ
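Since non-RGB stills (grayscale or CMYK) come up repeatedly in these threads as an out-of-memory trigger, here is a quick way to check a still before importing it. This is a sketch using the standard Java image API, not anything FCP ships; it simply reports whether the decoded image uses an RGB color space:

```java
import java.awt.color.ColorSpace;
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import javax.imageio.ImageIO;

public class RgbCheck {
    // Returns true if the image decodes to an RGB color space.
    // Grayscale and CMYK stills report a different ColorSpace type.
    static boolean isRgb(File imageFile) throws IOException {
        BufferedImage img = ImageIO.read(imageFile);
        if (img == null) throw new IOException("Unreadable image: " + imageFile);
        ColorSpace cs = img.getColorModel().getColorSpace();
        return cs.getType() == ColorSpace.TYPE_RGB;
    }
}
```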

  • Server goes out of memory when annotating TIFF File. Help with Tiled Images

I am new to JAI and have a problem with the system going out of memory.
    Objective:
    1)Load up a TIFF file (each approx 5- 8 MB when compressed with CCITT.6 compression)
    2)Annotate image (consider it as a simple drawString with the Graphics2D object of the RenderedImage)
    3)Send it to the servlet outputStream
    Problem:
    Server goes out of memory when 5 threads try to access it concurrently
    Runtime conditions:
    VM param set to -Xmx1024m
    Observation
    Writing the files takes a lot of time when compared to reading the files
    Some more information
    1)I need to do the annotating at a pre-defined specific positions on the images(ex: in the first quadrant, or may be in the second quadrant).
    2)I know that using the TiledImage class its possible to load up a portion of the image and process it.
    Things I need help with:
I do not know how to send the whole file back to the servlet output stream after annotating a tile of the image.
If I write the tiled image back to a file, or to the output stream, it gives me only the portion of the tile I read in and watermarked, not the whole image file.
    I have attached the code I use when I load up the whole image
    Could somebody please help with the TiledImage solution?
    Thx
    // Requires imports: javax.imageio.*, javax.imageio.metadata.IIOMetadata,
    // javax.imageio.stream.ImageOutputStream, javax.media.jai.JAI, javax.media.jai.TiledImage,
    // and the TIFF plugin classes (TIFFImageWriteParam, TIFFDirectory, BaselineTIFFTagSet).
    public void annotateFile(File file, String wText, OutputStream out, AnnotationParameter param) throws Throwable {
        ImageReader imgReader = null;
        ImageWriter imgWriter = null;
        TiledImage in_image = null, out_image = null;
        IIOMetadata metadata = null;
        ImageOutputStream ios = null;
        try {
            // Read the TIFF and its metadata
            Iterator readIter = ImageIO.getImageReadersBySuffix("tif");
            imgReader = (ImageReader) readIter.next();
            imgReader.setInput(ImageIO.createImageInputStream(file));
            metadata = imgReader.getImageMetadata(0);
            in_image = new TiledImage(JAI.create("fileload", file.getPath()), true);
            System.out.println("Image Read!");
            // Annotate
            Annotater annotater = new Annotater(in_image);
            out_image = annotater.annotate(wText, param);
            // Write the annotated image to the servlet output stream
            Iterator writeIter = ImageIO.getImageWritersBySuffix("tif");
            if (writeIter.hasNext()) {
                imgWriter = (ImageWriter) writeIter.next();
                ios = ImageIO.createImageOutputStream(out);
                imgWriter.setOutput(ios);
                ImageWriteParam iwparam = imgWriter.getDefaultWriteParam();
                if (iwparam instanceof TIFFImageWriteParam) {
                    // Re-use the compression recorded in the source TIFF directory
                    iwparam.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
                    TIFFDirectory dir = (TIFFDirectory) out_image.getProperty("tiff_directory");
                    double compressionParam = dir.getFieldAsDouble(BaselineTIFFTagSet.TAG_COMPRESSION);
                    setTIFFCompression(iwparam, (int) compressionParam);
                } else {
                    iwparam.setCompressionMode(ImageWriteParam.MODE_COPY_FROM_METADATA);
                }
                System.out.println("Trying to write Image ....");
                imgWriter.write(null, new IIOImage(out_image, null, metadata), iwparam);
                System.out.println("Image written....");
            }
        } finally {
            if (imgWriter != null)
                imgWriter.dispose();
            if (imgReader != null)
                imgReader.dispose();
            if (ios != null) {
                ios.flush();
                ios.close();
            }
        }
    }
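On the "only the tile comes out" symptom: whatever image object is handed to the writer is exactly what gets encoded, so the fix is to draw into a tile but still pass the whole image to `ImageIO.write` or `ImageWriter.write`. A minimal sketch of that pattern with the standard `BufferedImage` API (JAI's `TiledImage` likewise exposes `createGraphics()` and is itself a `RenderedImage`, so the same pattern should apply there):

```java
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import javax.imageio.ImageIO;

public class AnnotateWhole {
    // Draw an annotation into one region, then encode the ENTIRE image.
    static byte[] annotateAndEncode(BufferedImage img, String text, int x, int y)
            throws IOException {
        Graphics2D g = img.createGraphics();
        g.drawString(text, x, y);        // touches only one quadrant...
        g.dispose();
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        ImageIO.write(img, "png", out);  // ...but the whole image is written
        return out.toByteArray();
    }
}
```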

    user8684061 wrote:
    You are right, the SGA is too large for my server. I guess Oracle set the SGA automatically when I chose the default installation, but why would the SGA be so big? Is Oracle not smart enough?
    The default database configuration reserves 40% of physical memory for the SGA of an instance, which you as a user can always change. I don't see anything wrong with that, so I wouldn't say Oracle is not smart.
    If I don't decrease the SGA, but increase max-shm-memory, would it work?
    That needs support from the CPU architecture (32-bit or 64-bit) and the kernel as well. Read more about huge pages.

  • Out of memory when i render HD with many layers

I get an error message (out of memory) when I render full HD 1920x1080 in Final Cut Pro 5 with many layers (6 to 9). I have 5.5 GB of memory, and when it crashes, 1.5 GB of memory is still free in my Activity Monitor. If I buy more RAM, will my problem be resolved? Please help me if anybody knows about this problem.
    thank
    Yan

    I am having the same problem...(out of memory on renders)
    I've been watching my activity monitor and cancelling the render before I run out of memory... but I'm not getting much done and I can't do a big render at night.
I've done all the usual things... no CMYK files, trashed preferences... etc.
    Two questions:
I am using Magic Bullet filters and they seem to stack up the inactive RAM. (When the inactive RAM fills up, that's when I get the "out of memory".) You said you saw a post about that a few days ago... Was there an answer besides changing out the filters? Or can you point me towards the posting?
I'm also trying to figure out how to dump or clear the inactive RAM once I'm done with a render. I found that if I do a full restart I can clear my RAM and get a bunch of renders done before I fill up the inactive RAM again. This is going to be a long project if I have to do that.
    thanks
    Mark

  • C-runtime error occured "Out of Memory"

    Cannot display the selected records when an end user clicks the (>*) button next to the record count. Instead of displaying all the records based on the prompted selection criteria, the record count (28,338) is shown by itself without the data after a few minutes of waiting. Above the record count and selected prompts, "!error: A C-runtime error occurred (Out of Memory)" is shown twice. No data.
    So long as the end user does not click the (>*) button next to the record count, everything works okay. End user can download the entire set of selected records to a tab-delimited .csv file without any problem. End user can also scroll thru the 28,338 records, one page at a time. Using more restrictive selection criteria, I can view several thousand records at a time. No problem until the (>*) button is clicked when 28,338 records are requested.
    We recently doubled the memory on our desktops from 1GB to 2GB. No help attempting to display all records.
    Problem reproduced at will in both OBIEE 10.1.3.3.2 and 10.1.3.4.1 environments using either IE 8.0 or Firefox 3.6 on Windows XP Pro.
    How can we reconfigure our environment to display the 28,338 records on a desktop without running Windows XP out of memory?

    The C:\OracleBIData\tmp directory on the desktop appears empty before, during and after the query.
    In addition in Windows Task Manager, the Networking tab for the Local Area Connection shows a steady, but less than 0.5%, activity stream for the duration. CPU use and paging is also minimal.
    Edited by: bobatx on Dec 23, 2010 10:04 AM

  • How do I fix Error Message = Out Of Memory Could Not Install or Delete Data"

I have looked all over the internet and there is a lot of discussion about WHY this happens and a lot of discussion about what to do in the future to avoid this from happening.
    I need help with specific steps on how to delete the static image that is stuck on my desktop. The image is large and covers most of my screen. I have shut down, unplugged, let the machine rest....started up several times. Not working.
    The error reads "OUT OF MEMORY - COULD NOT INSERT OR DELETE DATA"
    Thanks!

Probably you are running out of disk space. Can you interact with your Mac at all, or is the error message preventing you from using Finder windows?
I'd try to boot into Safe Mode: this will only load a rudimentary version of Mac OS X, needing less space, and will clean some caches. The startup will take longer, however.
When in Safe Mode, try to empty the Trash and to free as much disk space as you can by deleting files.
How much free disk space do you have right now?
See:
http://www.wikihow.com/Start-Your-Mac-in-Safe-Mode
If you cannot access the Finder to free disk space, even in Safe Mode, post back.
    Regards
    Léonie

  • After Effects Error: Keylight out of memory (4)

After Effects CS4 keeps crapping out on me, and won't allow me to render my projects. I keep getting the message, "After Effects Error: Keylight out of memory (4)".
I'm running a brand new iMac 3.4 GHz Intel Core i7 (which I understand gives me virtually 8 cores), and I have 8 GB of RAM.
I've tried with multi-frame processing enabled and disabled, with the same result. When I tried with multi-frame enabled, I selected "allow 4 CPUs free" and "allow 6 CPUs free", and still the same results. Any help would be greatly appreciated.
    Thanks!

Since you run AE 9, it's probably your footage. Okay, it's the most likely cause, at least. Here's a copy 'n paste response I used to use a LOT on the Creative COW:
    Dave's Stock Answer #1:
    If the footage you imported into AE is  any  kind of the following -- footage in an HDV acquisition codec, MPEG1, MPEG2, AVCHD, mp4, mts, m2t, H.261 or H.264 -- you need to convert it to a different codec.
    These kinds of footage use temporal, or interframe compression. They have keyframes at regular intervals, containing complete frame information.  However, the frames in between do NOT have complete information.  Interframe codecs toss out duplicated information.
    In order to maintain peak rendering efficiency, AE needs complete information for each and every frame. But because these kinds of footage contain only partial information, AE freaks out, resulting in a wide variety of problems.
    I'm a Mac guy, so I like to convert to Quicktime movies in the Animation or PNG codecs; both are lossless.  I'll
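The keyframe-plus-deltas behavior described above can be modeled in a few lines. This is a toy sketch (integers standing in for frames, not a real codec), but it shows why decoding an arbitrary frame of interframe footage forces the decoder to walk back to the last keyframe and apply every delta since:

```java
import java.util.List;

public class InterframeToy {
    // A GOP modeled as one full keyframe plus a delta per following frame.
    // Decoding frame n means applying every delta since the keyframe:
    // the in-between frames carry only partial information.
    static int decodeFrame(int keyframe, List<Integer> deltas, int n) {
        int frame = keyframe;            // complete information
        for (int i = 0; i < n; i++) {
            frame += deltas.get(i);      // partial information only
        }
        return frame;
    }
}
```

Intraframe codecs like Animation or PNG, by contrast, store every frame whole, which is why converting is the usual fix.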

  • "Assembly Error. Out Of Memory"

    I am burning a 5 Disc set for someone, and when I burn disc 3, I get this message during the burning process: "Assembly Error. Out of Memory". Then a little while later the program turns off on its own.
    This is only happening when I burn the 3rd disc. When I burn discs 4 and 5 it works just fine, and they are all approximately the same size...
    Any idea what can be causing this? Thanks!

    OK, let's look at both the Audio and the still images.
For the JPEGs, what are the dimensions (pixel x pixel) of these? Large stills can cause a myriad of problems and, if much larger than the Frame Size of the Project, will suffer a quality loss. Bigger is NOT better here.
    This ARTICLE will give you some background and an automated way to prepare them for use in PrPro.
    For the WMA, are these perhaps multi-channel files, like 5.1 SS Audio?
    First, I would look at doing a Render/Replace with these, and test. This will convert them to PCM/WAV 48KHz 16-bit Audio. If that does not allow you to build your DVD, then I would use Audition, or Soundbooth (or even the free audio-editing program, Audacity) to convert to PCM/WAV 48KHz 16-bit. Depending on which version of PrPro you have, you should have Edit in Soundbooth, or Edit in Audition, when you Rt-click on one of the WMA files. That would be my first choice. Test.
    Good luck,
    Hunt
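The PCM conversion step suggested above can also be done programmatically. A sketch with the standard `javax.sound.sampled` API (note: the stock JDK converts encodings, signs, bit depths, and endianness reliably, but sample-rate conversion may require an extra service provider, so this keeps the source rate rather than forcing 48 kHz):

```java
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;

public class ToPcm16 {
    // Convert a PCM-convertible stream to 16-bit signed little-endian PCM
    // at its original sample rate and channel count.
    static AudioInputStream toSigned16(AudioInputStream in) {
        AudioFormat src = in.getFormat();
        AudioFormat dst = new AudioFormat(
                AudioFormat.Encoding.PCM_SIGNED,
                src.getSampleRate(),
                16,                      // bits per sample
                src.getChannels(),
                2 * src.getChannels(),   // frame size in bytes
                src.getSampleRate(),     // frame rate
                false);                  // little-endian
        return AudioSystem.getAudioInputStream(dst, in);
    }
}
```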

  • RE: Big Log Files resulting in Out Of Memory of serverpartition

To clean a log on NT, you can open it with Notepad, select all, delete, add a space, and Save As... with the same file name.
On Unix, you can simply truncate the file by redirecting empty output to it, e.g.:
# > forte_ex_2390.log
(This should work on NT too, but I never tried it.)
Hope that will help
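Both tricks above (paste-over-and-save on NT, `> file` on Unix) do the same thing: they truncate the log to zero bytes. The same effect from code, portably (a sketch; the Forte log name is just the one from the post):

```java
import java.io.FileOutputStream;
import java.io.IOException;

public class TruncateLog {
    // Opening a FileOutputStream without the append flag truncates the
    // file to zero length, exactly like `> file` in a Unix shell.
    static void truncate(String path) throws IOException {
        new FileOutputStream(path).close();
    }
}
```

Usage would be `TruncateLog.truncate("forte_ex_2390.log");` while no process holds the file exclusively.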
From: Vincent R Figari
Date: Monday, March 30, 1998, 21:42
To: [email protected]
Subject: Big Log Files resulting in Out Of Memory of server partition
    Hi Forte users,
Using the Component/View Log from EConsole on a server partition triggers an Out of Memory of the server partition when the log file is too big (a few MB).
Does anyone know how to change the log file name, or clean the log file of a server partition running interpreted with Forte 2.0H16?
    Any help welcome,
    Thanks,
    Vincent Figari

    So try treating your development box like a production box for a day and see if the problem manifests itself.
    Do a load test and simulate massive numbers of changes on your development box.
    Are there any OS differences between production and development?
    How long does it take to exhaust the memory?
    Does it just add new jsp files, or can it also replace old ones?

  • Encountering with the error Client out of memory in Bex. . .

    Hello All,
In the Bex report, when I run it for the 2000 sales organization selection, the initial display comes up fine. When I try to drill down to the next levels in the result, I encounter the error 'CLIENT OUT OF MEMORY'.
The report displays sales-document-level data. I need to drill down on almost 6 fields, like cost element, profit center, auth group, cost center, customer number... up to 7 attributes.
I am able to drill down up to 5 attributes; when I try to drill down on the 6th one, the report ends with this error.
Note: I am able to run the report for the remaining sales orgs and can drill down on all the fields. It might be because of the huge data volume that I cannot drill down on 2000; I am not sure.
    Any one could help me on this.
    Thanks in Advance,
    Lakshmi.

I checked at the InfoProvider level.
For the same selection, we have around 450,000 cells.
    Thank you,
    Lakshmi.
