General Error (34) and Out of Memory Msgs

I've been editing footage for a video for quite some time (several weeks). I have several folders in my media browser representing different scenes. However, there is only one folder whose media clips yield the messages: General Error (34), followed by the Out of Memory message.
However, when I go to other clips and click on them, I do not get those messages. Everything's fine... I can see them in the Viewer, no problem, and I can edit as appropriate. It only seems to happen with that one set of media clips... So (and I'm afraid to ask) am I staring at a corrupt set of media clips in this folder? What's a way to remedy this?
Thanks,
Henri
Powerbook G4, 1.67GHz Superdrive, 2GB RAM   Mac OS X (10.4.7)   2 LaCie FireWire Ext HDs - 200 and 250 GB (the 1st FW HD is the scratch disk)

Yes... yet there is so much information around this, and so many different iterations of the solutions, that the right solution for my particular case is not easily discernible. Some suggest removing old preferences; others suggest memory and/or corrupt-file issues. I've already run DiskWarrior to check whether partition/drive problems exist, which was suggested in another discussion thread...
I'm not sure whether my situation, in which only one Browser media folder has this problem, can be directly addressed.
I simply want some 2nd opinions regarding it.

Similar Messages

  • I'm getting a general error message then out of memory trying to add a new sequence

    Anyone know how to fix this?? I am getting a General Error message and then Out of Memory when adding a new sequence?? Help!

    The General Error warning is usually associated with incompatible files.
    Out of Memory can be caused by corrupted clips, incompatible files and still images that use a color mode other than RGB.
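    If you want to sanity-check stills outside the app, here is a minimal Java sketch (the file name is a placeholder); note that stock ImageIO often refuses CMYK JPEGs outright, and that refusal is itself a hint the still is not plain RGB:

        import java.awt.color.ColorSpace;
        import java.awt.image.BufferedImage;
        import java.io.File;
        import javax.imageio.ImageIO;

        public class CheckColorMode {
          public static void main(String[] args) {
            BufferedImage img = null;
            try {
              img = ImageIO.read(new File("still.tif")); // placeholder path
            } catch (Exception e) {
              // Stock ImageIO often cannot decode CMYK JPEGs at all
              System.out.println("Could not decode: " + e.getMessage());
              return;
            }
            if (img == null) {
              System.out.println("No reader for this file type");
              return;
            }
            // TYPE_RGB is what the editing apps expect; GRAY or CMYK
            // stills are frequent Out of Memory culprits
            int type = img.getColorModel().getColorSpace().getType();
            System.out.println(type == ColorSpace.TYPE_RGB
                ? "RGB" : "Not RGB (ColorSpace type " + type + ")");
          }
        }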

  • General Error in Final Cut, then Out of Memory

    I'm editing a short sequence where the footage is all H.264. When I import the footage to the bins and try to open a clip in the Canvas, I get the message General Error, and Out of Memory follows.
    Why is this?
    Below you can find the clips' info. The bottom four are the ones that give the error when opened.
    R

    Your question belongs in the FCP forum:
    https://discussions.apple.com/community/professional_applications/final_cut_pro_x

  • General Error and Out of Memory Error - affecting multiple projects

    I am running Final Cut Pro 6 on my 1.25 GHz PowerPC G4 iMac, with 1.25 GB RAM. OS X 10.4.11.
    I have had this setup for more than a year and used it without any problems.
    I have two projects stored on an external LaCie firewire drive with more than 140GB disk space remaining. As of this morning, I cannot work on either of them: both are throwing the "General Error" message, and one of them is also telling me it's "Out of Memory". On project #1, the "Out of Memory" occurs whenever I try to open a clip in the viewer, following a "General Error" message. On project #2, the "General Error" message occurs when I try to open the project; it gets halfway through the process and then throws the error, leaving me unable to open the timeline.
    Both projects are short, less than 3 minutes long, and neither project contains any CMYK or grayscale graphics. Project #2 does have a short Motion sequence.
    Things I have tried:
    ~ restarting and hard-rebooting
    ~ trashing preferences
    ~ repairing permissions
    ~ trashing render files
    ~ creating a new project
    ~ searching this forum and Google for other answers
    Help?

    Thanks for the support, Jim. I've had terrible experiences with Firewire drives in the past, too; regrettably, there doesn't seem to be an affordable alternative at this point.
    I just looked up resetting the PMU, as that's not something I've done before. I really hope it's that simple, as the thought of recreating all these clips makes my head hurt. But I'll definitely try your suggestion of reconnecting individual media files first. I've been through that before.

  • Printing error due to out of memory condition...

    Hi, I've got this problem of "printing error due to out of memory condition", and I cannot save it as a PDF file either. How do I solve this problem? I'm using CS3 and Acrobat 9... please help me... I've got a project due and cannot afford to waste too much time...

    I've also seen this error message for the first time today: "printing error due to out of memory condition". I have 2GB of RAM, 300GB+ free on my hard drive, and have closed all other programs that were using memory; however, I have never had problems printing to PDF before in InDesign... is there a solution to this problem?

  • Error due to out of memory condition

    Hi,
    system: Windows XP, 2 GB RAM, InDesign CS3
    I placed two files in InDesign: the first is .ai (1.1 MB, probably exported from a CAD program), the second is .psd (53 MB). When I try to print to a .ps file, InDesign displays the message: "Export error: error due to out of memory condition". If I: 1. export to PDF, everything is OK; 2. rasterize the .ai in Photoshop, then import the resulting .psd into InDesign, printing to a .ps file is OK.
    Does somebody have an answer to my problem?

    My guess is "mistakes" referred to output simply not matching what the designer intended. I had output issues when going directly to PDF and had to continue the old way of doing things: outputting to .ps, then distilling to PDF. The "mistakes" that occurred for me went nearly unnoticed. The publication was a 128+4 page product catalog with a header at the top of each page. I never had any problems outputting in the past. At the time, I had just updated to CS3 (XP Pro SP3, dual core, 4 gigs of RAM) and was advised that outputting directly to PDF had been drastically improved. I output my files without "mistakes" or errors, or so I thought. After proofing, I didn't realize that the drop shadows for the header headline were missing, and not from all headers: they disappeared beginning at page 60 or so, halfway through. After getting the finished publication back, I didn't notice it for a few months. After noticing it, I investigated. Several troubleshooting hours later (after forum posts and other expert help), I/we concluded that it was simply a program deficiency. During my troubleshooting I looked at several areas, mainly focusing on the transparency settings, and output the document several times. I finally exported each page separately and found that this was the only way I could get every page past page 60 or so to include the drop shadow in its header.
    I have been past that issue for a while now. Now I am on Win 7, quad core, 6 gigs of RAM, and don't have the outputting problem that I used to. Now I occasionally get this "Error due to out of memory condition", which is BS because my hardware specs are beyond reasonable for what I typically create in my workflow. Most of the time my memory is only at about half of its capacity when I get this error, and not when I have Photoshop open with a large multipage file that I am working on. My out-of-memory error happens during file output, or just when performing ordinary layout functions in InDesign.
    When on my personal computer (Aluminum iMac 2.4 Core2 duo, 4 gigs of ram, 512 video, CS4 & CS5) I have neither of the above issues.

  • ERROR [B3108]: Unrecoverable out of memory error during a cluster operation

    We are using Sun Java(tm) System Message Queue Version 3.5 SP1 (Build 48-G), with two JMS servers as a cluster.
    But we frequently get this out-of-memory issue during cluster operations.
    Messages also get queued up in the Topics. Even though the listeners are able to reconnect with the server after the broker restarts, we usually have to restart the consumer instances to get things working again.
    Here is the detailed log:
    Jan 5 13:45:40 polar1-18.eastern.com imqbrokerd_cns-jms-18[8980]: [ID 478930 daemon.error] ERROR [B3108]: Unrecoverable out of memory error during a cluster operation. Shutting down the broker.
    Jan 5 13:45:57 polar1-18.eastern18.chntva1-dc1.cscehub.com imqbrokerd: [ID 702911 daemon.notice] Message Queue broker terminated abnormally -- restarting.
    Expecting your attention on this.
    Thanks

    Hi,
    If you do not use any special command-line options, how do you configure your servers/brokers to a 1 GB or 2 GB JVM heap?
    Regarding your question on why the consumers appear to be connecting to just one of the brokers:
    How are the connection factories that the consumers use configured? Is the connection factory configured using the imqAddressList and imqAddressListBehavior attributes? Documentation for this is at:
    http://docs.sun.com/source/819-2571/ref_adminobj_props.html#wp62463
    imqAddressList should contain a list of the brokers in the cluster (i.e. 2 for you), e.g.
    mq://server1:7676/jms,mq://server2:7676/jms
    imqAddressListBehavior defines how the 2 brokers in the above list are picked. The default is the order of the list, so mq://server1:7676/jms will always be picked by default. If you want random behavior (which will hopefully even out the load), set imqAddressListBehavior to RANDOM; a small code sketch follows after this post.
    regards,
    -i
    http://www.sun.com/software/products/message_queue/index.xml
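    For reference, a minimal sketch of setting those attributes programmatically with the Sun MQ client (assuming imq.jar on the classpath); the broker host names are placeholders:

        import javax.jms.Connection;
        import javax.jms.JMSException;
        import com.sun.messaging.ConnectionConfiguration;
        import com.sun.messaging.ConnectionFactory;

        public class ClusterFactorySketch {
          public static void main(String[] args) throws JMSException {
            ConnectionFactory cf = new ConnectionFactory();
            // List both brokers of the cluster, comma-separated
            cf.setProperty(ConnectionConfiguration.imqAddressList,
                "mq://server1:7676/jms,mq://server2:7676/jms");
            // RANDOM spreads clients across the list; the default always
            // tries the first address in the list first
            cf.setProperty(ConnectionConfiguration.imqAddressListBehavior, "RANDOM");
            // Let the client runtime reconnect after a broker restart
            cf.setProperty(ConnectionConfiguration.imqReconnectEnabled, "true");
            Connection con = cf.createConnection();
            con.close();
          }
        }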

  • Transitioned from an older G5 Pro to a new iMac. My Final Cut files are on a 2T WD external HD. When I try to boot I get a "General Error 41" and files do not load. Took the WD to the Apple Store and tried it on one of their iMacs...it works OK. Ideas?

    Transitioning from an older G5 Pro to a new iMac. My Final Cut Express files are on a 2TB WD external HD. When I try to load a project on the iMac I get a "General Error 41" and the files do not load. Took the WD to the Apple Store and tried it on one of their iMacs... it works fine. The problem seems to be in my new iMac; other WDs with different files load OK. Any ideas?

    Hi
    Can it be that you are trying to boot a (PowerPC) G5 Mac from an external hard disk, and then trying to boot an Intel Mac from the same disk?
    THAT WILL NOT WORK. It's not just the Mac OS Extended format that plays a part here; the disk has to be partitioned differently to be bootable on each system.
    I don't think there is any solution to this AT ALL.
    When you format a hard disk in Disk Utility:
    • Select Partition
    • Under the graphic representation to the left there is an Options button
    Here you select whether it's supposed to be bootable on a PowerPC or an Intel CPU (Apple Partition Map vs. GUID partition table).
    No other possibilities.
    Yours Bengt W

  • FCE 3.5 "general error" and "out of memory" issues

    Hey folks -
    I'm sure you've covered this before, but couldn't find a good thread....
    I've imported some Flip camera footage for my kid's project (due in two days, of course!) and now we are encountering a "General Error" message followed directly by an "Out of Memory" message when we try to play a clip. I think there is plenty of memory to get started, but I'm not sure how to fix this. I've tried trashing and repairing preferences.... Hmmm...
    So very grateful for any assistance....!
    Running a 2.4 GHz Intel Core Duo iMac, 84G remaining on the internal Hard Drive, and using a G-drive 1TB external drive with 340GB available as the scratch disc.

    All set. Was going crazy, because I thought I had it, then lost it....! Your settings were perfect. I ended up downloading the MPEG Streamclip program. Used the "Apple Intermediate Codec" codec, 29.97 frame rate, uncompressed 48k audio, and the Easy Setup that seemed happy was the last one in FCE 3.5: HDV-Apple Intermediate Codec 720p30. I just needed to make sure I used a new sequence to import the newly converted clips into.
    Whew.
    Really, many thanks for your assistance!
    All the best,
    DJ

  • "General Error" and "Out of Memory" for only certain files?

    I have Final Cut Express 4 on Mac OS X Leopard and it has been working fine, up until now.
    For some reason, when I try to view two clips in the tab to the left of the screen to find sections of them to put into my project, I get a message that says "General Error", and when I click OK it says "Error: Out of Memory". They are both MP4 files, 242.9 MB and 294.2 MB, and I have viewed them both in QuickTime. I don't understand why it says it is out of memory when I still have 7 GB left on my computer and it still lets me view and add other files to my sequence.
    Can someone help me out and tell me how to fix this? I'd really appreciate advice!

    MPEG-4 is not a format that works in FCE. You'll have to convert it to one of FCE's formats. Without knowing details about the original format, it's impossible to say what you should convert it to.

  • General error and out of memory with Final Cut Pro 6

    I just upgraded to a Mac mini i7. What setting should I use to avoid these errors?

    So you were successful in getting FCS 2 apps installed in Mountain Lion?
    Are these messages you're getting now or ones that you have gotten in the past using FCP?
    Out of Memory warnings are often caused by still images that are not RGB, or by trying to use unsupported media.
    Russ

  • "File Error" and "Out of Memory" when rendering FCE

    Dear all,
    I imported a 2 hour holiday video into FCE.
    When rendering the video, a message appears saying the rendering process will take about 15 minutes to complete.
    However, I am frequently interrupted by the following message:
    "File Error: The specified file is open and in use by this or another application"
    When I start rendering again, the estimated render time has increased to about 45 minutes.
    I then receive, several times, either the message "File Error: The specified file is open and in use by this or another application" or, even worse, "Out of Memory".
    Today I purchased an additional 2GB of memory to increase my memory from 750 MB to 2.5GB!!!
    Can anyone please tell me what could be the cause of these messages and how to solve them?
    BTW, no other programs are running while I use FCE.
    Thanks in advance,
    Hans E.
    PowerMac G5-Dual 1.8GHz, Memory 2.5GB, iMac G3-600MHz, Airport, Airport Express   Mac OS X (10.3.9)

    Is it happening when you're rendering, or exporting?
    The error message means FCE is trying to rewrite a file that it is currently using. It could be mistakenly trying to save a render file, if you're only rendering, or if you're trying to re-export a file, you'll get that message.
    Try dumping all your render files, restarting FCE and trying again.
    The Out of Memory error almost always points toward a corrupt file. Could be a source file, could be a render file. Again, dump your render files and try again.
    If these don't work, you need to close FCE, take all your source media offline, then open FCE. Now, reconnect each clip one by one until you find the corrupt clip. Keep going, there may be more than one. Re-capture the corrupt clips and you should be good to go.

  • Error creating window handle and out of memory

    Hello everyone !
    I have a .NET Windows application (VB.NET) that creates a lot of reports (the same report document for different clients and dates) and exports them to PDF.
    While the first 75 reports are produced fine, the 76th report causes an error while creating the window handle, and I get an out-of-memory error as well.
    Searching this forum and the internet, I have tried a lot of things to solve this problem:
    - closing and disposing every report, as well as setting it to Nothing after exporting it to disk (also doing this for the dataset and datatable used)
    - doing each report operation in its own class that is set to Nothing immediately after use
    - using the garbage collector (methods "Collect" and "WaitForPendingFinalizers")
    - defining the report, dataset and datatable within "Using .. End Using" blocks
    - using a single report object for all exports (implemented as a singleton)
    - using a background thread to handle the report and operations on it
    - changing the registry entry "PrintJobLimit" to a higher value, as well as to -1
    But none of my attempts has helped so far; it's always the 76th report that collapses.
    Interesting thing: exporting to PDF is not needed to cause the error (I switched it off for all reports to check for any difference).
    My system:
    - Visual Studio 2008 and .Net 3.5 Service Pack 1
    - Crystal Reports Basic for Visual Studio 2008 Service Pack 1
    Any help is appreciated.
    Thank You very much,
    M.Deister

    Hello,
    Saving the report as RPT doesn't save the data, which is why it's failing to log on; you have to export it to RPT format, then use that in your test.
    Also, I assume you are using the Report Engine. Try using InProc RAS to load and run the report.
    Just a few lines of code change is all it needs:
    using System;
    using System.Windows.Forms;
    using CrystalDecisions.CrystalReports.Engine;
    using CrystalDecisions.Shared;
    using CrystalDecisions.ReportAppServer.ClientDoc;
    using CrystalDecisions.ReportAppServer.Controllers;
    using CrystalDecisions.ReportAppServer.ReportDefModel;
    using CrystalDecisions.ReportAppServer.CommonControls;
    using CrystalDecisions.ReportAppServer.CommLayer;
    using CrystalDecisions.ReportAppServer.CommonObjectModel;
    using CrystalDecisions.ReportAppServer.ObjectFactory;
    using CrystalDecisions.ReportAppServer.DataSetConversion;
    using CrystalDecisions.ReportAppServer.DataDefModel;
    using CrystalDecisions.ReportSource;

    public class frmMain : System.Windows.Forms.Form
    {
        // Engine-level report object plus the InProc RAS client document
        CrystalDecisions.CrystalReports.Engine.ReportDocument rpt =
            new CrystalDecisions.CrystalReports.Engine.ReportDocument();
        CrystalDecisions.ReportAppServer.ClientDoc.ISCDReportClientDocument rptClientDoc;

        // Normally created by the forms designer
        private OpenFileDialog openFileDialog = new OpenFileDialog();
        private Button btnReportName = new Button();

        private void btnOpenReport_Click(object sender, System.EventArgs e)
        {
            openFileDialog.Filter =
                "Crystal Reports (*.rpt)|*.rpt|Crystal Reports Secure (*.rptr)|*.rptr";
            openFileDialog.FilterIndex = 1;
            if (openFileDialog.ShowDialog() == DialogResult.OK)
            {
                object rptName = openFileDialog.FileName;
                try
                {
                    // Open by temp copy so the original .rpt file is not locked
                    rpt.Load(rptName.ToString(), OpenReportMethod.OpenReportByTempCopy);
                    // Hand the loaded report to the InProc RAS client document
                    rptClientDoc = rpt.ReportClientDocument;
                    btnReportName.Text = rptName.ToString();
                }
                catch (Exception ex)
                {
                    MessageBox.Show("ERROR: " + ex.Message);
                    return;
                }
            }
        }
    }
    Don

  • Scaling images and Out of memory error

    Hi all,
    Does anyone know why this code throws an out-of-memory error?
    It loads an image (2048x1166 pixels) and stores it as bufi1 (a BufferedImage). After that:
    1. Rescale bufi1 to bufi13A (x3).
    2. Rescale bufi1 to bufi12 (x2).
    3. Rescale bufi1 to bufi13B (x3).
    At step 3, the code throws an OOME. Why?
    Thanks in advance!
    import java.io.*;
    import javax.imageio.*;
    import java.awt.geom.*;
    import java.awt.image.*;

    public class TestScalePercent {
      public static void main(String[] args) {
        TestScalePercent tsp = new TestScalePercent();
        BufferedImage bufi1 = null;
        try {
          bufi1 = ImageIO.read(new File("foo.png")); // 2048x1166 pixels
        } catch (Exception e) {
          e.printStackTrace();
        }
        BufferedImage bufi13A = tsp.scale(bufi1, 3, 3); // --> OK
        BufferedImage bufi12  = tsp.scale(bufi1, 2, 2); // --> OK
        BufferedImage bufi13B = tsp.scale(bufi1, 3, 3); // --> OOM error!
      }

      public BufferedImage scale(BufferedImage bufiSource, double scaleX, double scaleY) {
        AffineTransform tx = new AffineTransform();
        tx.scale(scaleX, scaleY);
        AffineTransformOp op = new AffineTransformOp(tx, AffineTransformOp.TYPE_NEAREST_NEIGHBOR);
        BufferedImage bufiop = op.filter(bufiSource, null); // --> creates the OOM error...
        // Note: filtering a second time rescales the already-scaled image,
        // so each call really returns a copy scaled by the factor squared
        return op.filter(bufiop, null);
      }
    }

    How much memory does your machine have? Those images are quite large. If my math is correct, and assuming 32-bit color, the original image takes up about 9.5 MB of memory by itself and a true 3x copy about 86 MB; and because scale() applies the transform twice, each call actually produces a copy scaled by the square of the factor, so a nominal 3x call builds an 18432x10494 image of roughly 774 MB. Then you are trying to create three such versions. It isn't lying to you; it is indeed probably running out of memory to hold the images.

    OK. Now I'm wondering whether it is possible to free memory between bufi13A and bufi12, and between bufi12 and bufi13B? I've tried to invoke the garbage collector, but nothing happens...
    Thanks!
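    A minimal sketch of the usual fix, assuming the goal is one scaled copy at a time: filter once per call, write each result out, and drop the reference before building the next copy (a larger -Xmx heap may still be needed for the 3x images):

        import java.awt.geom.AffineTransform;
        import java.awt.image.AffineTransformOp;
        import java.awt.image.BufferedImage;
        import java.io.File;
        import javax.imageio.ImageIO;

        public class ScaleOneAtATime {
          static BufferedImage scale(BufferedImage src, double sx, double sy) {
            AffineTransformOp op = new AffineTransformOp(
                AffineTransform.getScaleInstance(sx, sy),
                AffineTransformOp.TYPE_NEAREST_NEIGHBOR);
            return op.filter(src, null); // filter once: a 3x call yields a 3x image
          }

          public static void main(String[] args) throws Exception {
            BufferedImage src = ImageIO.read(new File("foo.png")); // 2048x1166
            BufferedImage big = scale(src, 3, 3);
            ImageIO.write(big, "png", new File("foo_3x.png"));
            big = null; // no live reference left, so the GC can reclaim this copy
            big = scale(src, 2, 2);
            ImageIO.write(big, "png", new File("foo_2x.png"));
            big = null;
            big = scale(src, 3, 3); // only src plus one copy are alive at a time
            ImageIO.write(big, "png", new File("foo_3x_b.png"));
          }
        }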

  • JNI and Out of Memory errors

    Hi,
    At my job, we seem to have a memory leak related to JNI. We know we
    have a memory leak because we keep getting Out of Memory errors even
    after increasing the maximum heap size to more than 256 megs. And we
    know that this is the application that is eating up all the system
    memory.
    We are running under Windows 2000, with both JDK 1.3.0 and JDK 1.4.1.
    We tried looking at the problem under JProbe, but it shows a stable
    Java heap (no problems, except that the Windows task manager shows it
    growing and growing...)
    I tried a stripped-down version, where I set the max heap size to 1 Meg, and printed out the total memory, memory used, and maximum memory used at a 5-second interval.
    Memory used = Runtime.getRuntime().totalMemory() - Runtime.getRuntime().freeMemory().
    Well, I let that stripped-down version run for about a day. The maximum memory used has stabilized at about 1.1 Meg and has not increased. The current memory used increases until it reaches some threshold, then decreases again. However, the Windows task manager shows the memory increasing -- and currently it is at 245 Megs!
    In the lab, the behavior we see with the Windows task manager is as
    follows:
    1. Total memory used in the system increases until some threshold.
    2. Then it goes back down by about 100 Megs.
    3. This cycle continues, but at each cycle the memory goes back down
    less and less, until the app crashes.
    Now, our theory is that JNI is consuming all this memory (maybe we are
    using JNI wrong). That's the only explanation we can come up with to
    explain what we have seen (Java showing an OK heap, but the task
    manager showing it growing, until crashing).
    Does that make sense? Can the new operator throw an Out of Memory error if the system does not have enough memory to give it, even if there is still free heap space as far as the Runtime object is concerned? Does the Runtime object figure objects allocated through JNI methods into the heap-used space?
    Note that I know the task manager is not a reliable indicator.
    However, I don't think a Java app is supposed to run away with system memory -- the problem is not simply that the Java app is consuming too much memory, but that it seems to always want more. Besides, we do get the Out of Memory error.
    Thanks for your help,
    Nicolas Rivera

    Hi,
    there are two sources of memory leakage in JNI:
    - regular leaks in C/C++ code;
    - local/global references to Java objects that are never released in JNI.
    I think the first issue is not a problem for you. The second is more complex. In your JNI code you should check:
    - how many live local references you keep in your code, as their number is restricted (about 16, though it can be enlarged). Good style is not to store local references, but to keep only global ones.
    - local references received from a Java call into JNI are released by the JVM, but you should yourself release local references you created in JNI, as well as global references.
    - do not use a large number of Java object references; each new reference takes new memory in the JVM. Try to reuse references.
    I guess that is your problem.
    Vitally
