Out of memory error - large project

I'm consulting on a feature documentary edit, and the primary editor (an Avid guy) is having serious problems accessing anything from the original project.
It's an hour-and-15-minute show, with probably close to 100 hours of footage.
The box is a dual 2.3 GHz G5 with 1.5 GB of RAM, and the media is on two G-Tech drives: a G-RAID and a G-Drive. Plenty of headroom on both (now), and the system drive is brand new, having been replaced after the original died; there's nothing loaded on it but FC Studio. The FCP version is 5.1.4. The project file is well over 100 MB.
We started getting Out of Memory errors with this large project, and I checked all of the usual suspects: CMYK graphics, hard drive space, sufficient RAM... all checked out okay, except possibly the less-than-ideal amount of RAM.
I copied the important sequences and a couple of select bins to a new project, and everything seems workable for now. The project is still 90 MB, and I've suggested breaking it up into separate projects and working on it as reels, but editing and trimming work efficiently at the moment. However, the other editor has gotten to the point where he can't even open bins in the old, big project. He gets the OOM error whenever he tries to do anything.
I have no similar problems opening the same project on my G5, which is essentially identical except that I have 2.5 GB of RAM (1 GB extra). Can this difference in RAM really make this big a deal? Is there something else I'm missing? Why can't this editor access even the bins from the other project?

Shane's spot on.
What I often do with large projects is pare them down, just as you have done. But 90 MB out of 100 MB is not much of a pare-down by any stretch. In the new copy, throw away EVERYTHING that's outdated; old sequences are the big culprit. Also toss any render files and re-render.
Remember that, to be efficient, FCP keeps EVERYTHING in RAM so that it can instantly access anything in your project. The more there is to keep track of, the slower you get.

Similar Messages

  • Out of memory error - WLW Schema Project build

    Hi,
    Wasn't sure which group (workshop vs. xmlbeans), but here goes.
    I'm doing a personal eval of the WLS 8.1.1 platform.
    My current focus is on XML Schema support, XML Beans, transformations, etc., and in particular how it all copes with large schemas.
    I'm using the Justice XML Data Dictionary (3.0.0.0) as an example of a large schema (http://it.ojp.gov/jxdd/).
    I am getting an out of memory error when I build this within WLW.
    The build appears to be in the javac phase (the failure occurs when memory use goes over 256M), so I think it's the ANT task.
    WLW itself doesn't seem to be running out of memory, and the build machine is not running out of resources.
    Where do I configure this? The <xmlbean> tag? Is there any doco?
    regards,
    Jim Nicolson
    =============== Build trace below ====================
    Build project JXDDSchema started.
    BUILD STARTED
    build:
    check-uptodate:
    build-sub:
    Updating property file: C:\data\projects\bea\integration\wlitest\.workshop\.ide\JXDDSchema\build.properties
    Deleting directory C:\DOCUME~1\JIMNIC~1\LOCALS~1\Temp\.wlw-temp\wlw_compile38940\JXDDSchema
    Created dir: C:\DOCUME~1\JIMNIC~1\LOCALS~1\Temp\.wlw-temp\wlw_compile38940\JXDDSchema
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\jxdds_3.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\iso_639-2t_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\ut_offender-tracking-misc_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\ncic_2000-gun_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\ncic_2000-other-transactions_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\ncic_2000-uniform-offense_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\mn_offense_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\dod_jcs-pub2.0-misc_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\cap_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\nibrs_misc_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\fips_6-4_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\iso_3166_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\dod_misc_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\ncic_2000-misc_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\ncic_2000-boat_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\ncic_2000-securities_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\ncic_2000-article_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\unece_rec20-misc_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\ncic_2000-state-country_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\iso_4217_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\fips_10-4_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\ansi_d20_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\iso_639-2b_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\fips_5-2_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\ncic_2000-vehicle_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\dod_exec-12958_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\ncic_2000-personal-descriptors_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\usps_states_1.0.0.0_full-doc.xsd
    Time to build schema type system: 6.75 seconds
    The system is out of resources.
    Consult the following stack trace for details.
    java.lang.OutOfMemoryError
    BUILD FAILED

    hi Kevin,
    Thanks, I'll take a look there.
    I think it's a good move to make XMLBeans open source.
    (I've been using Castor for the same purpose for the last two years.)
    I was mainly focused on XMLBeans as they are used in the WLW context, and my testing suggested that I would probably want to know more about the ANT build/tasks etc., just to be able to deal with unexpected issues.
    It's not urgent - I'm just doing a personal eval to come up to speed with the WLS 8.1 platform.
    thanks,
    Jim Nicolson
    "Kevin Krouse" <[email protected]> wrote in message
    news:[email protected]..
    Sorry, the first URL should have been:
    http://cvs.apache.org/viewcvs.cgi/*checkout*/xml-xmlbeans/v1/xkit/anttask.html?rev=1.2
    >
    --k
    On Tue, 30 Sep 2003 22:35:58 -0700, Kevin Krouse wrote:
    Hi Jim,
    Well, the XMLBean docs are in a process of moving into the Apache
    XMLBeans
    project (http://xml.apache.org/xmlbeans), so I can point you to where
    they
    are in the cvs repository. You can also look at the source if you'dlike!
    >>
    Here's an html doc for the xmlbean task:
    http://cvs.apache.org/viewcvs.cgi/*checkout*/xml-xmlbeans/v1/xkit/anttask.html?rev=1.2http://cvs.apache.org/viewcvs.cgi/*checkout*/xml-xmlbeans/v1/xkit/anttask.html?rev=1.2
    >>
    Here's the code for the xmlbean task:
    http://cvs.apache.org/viewcvs.cgi/xml-xmlbeans/v1/src/xmlcomp/org/apache/xmlbeans/impl/tool/XMLBean.java?rev=HEAD&content-type=text/vnd.viewcvs-markup
    >>
    >>
    As to your second question about WLW becoming unresponsive, after the
    Schemas.jar is built, the source for the entire project is re-scanned
    which could take several minutes. The performance impact of rescanning
    the project is improved in SP2 and will be addressed even more in future
    releases. Stay tuned.
    --k
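
    In case it helps anyone hitting the same wall: per the anttask.html linked above, the standalone xmlbean task can fork the schema compile and give that JVM its own heap. The attribute names below are taken from that doc, but whether WLW's internal build honors them is an assumption to verify - treat this as a sketch, not a confirmed fix.

    <!-- Hedged sketch: fork="true" runs the compile in a separate JVM,
         so the memory attributes apply to it rather than to WLW itself. -->
    <taskdef name="xmlbean"
             classname="org.apache.xmlbeans.impl.tool.XMLBean"
             classpath="xbean.jar"/>

    <xmlbean schema="jxdds_3.0.0.0_full-doc.xsd"
             destfile="JXDDSchema.jar"
             fork="true"
             memoryInitialSize="128m"
             memoryMaximumSize="512m"/>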

  • Uploading large files from applet to servlet throws out of memory error

    I have a Java applet that needs to upload files from a client machine to a web server using a servlet. The problem I am having is that, in the current scheme, files larger than 17-20 MB throw an out of memory error. Is there any way we can get around this problem? I will post the client and server side code for reference.
    Client Side Code:
    import java.io.*;
    import java.net.*;

    // This class is a client that enables transfer of files from a client
    // machine to the server. It connects to a servlet running on the server
    // and transmits the file.
    public class fileTransferClient {

        private static final String FILENAME_HEADER = "fileName";
        private static final String FILELASTMOD_HEADER = "fileLastMod";

        // This method transfers the prescribed file to the server.
        // If the destination directory is "", it transfers the file to "d:\\".
        // 11-21-02 changes: this method now has a new parameter that
        // references the item being transferred in the import list.
        public static String transferFile(String srcFileName, String destFileName,
                                          String destDir, int itemID) {
            if (destDir.equals(""))
                destDir = "E:\\FTP\\incoming\\";
            // Get the fully qualified filename and the mere filename.
            String fqfn = srcFileName;
            String fname = fqfn.substring(fqfn.lastIndexOf(File.separator) + 1);
            try {
                // Create the file to be uploaded and a connection to the servlet.
                File fileToUpload = new File(fqfn);
                long fileSize = fileToUpload.length();
                // The last mod of this file is sent to the servlet as a header.
                long lastMod = fileToUpload.lastModified();
                String strLastMod = String.valueOf(lastMod);
                URL serverURL = new URL(webadminApplet.strServletURL);
                URLConnection serverCon = serverURL.openConnection();
                // A bunch of connection setup related things.
                serverCon.setDoInput(true);
                serverCon.setDoOutput(true);
                // Don't use a cached version of the URL connection.
                serverCon.setUseCaches(false);
                serverCon.setDefaultUseCaches(false);
                // Set headers and their values.
                serverCon.setRequestProperty("Content-Type", "application/octet-stream");
                serverCon.setRequestProperty("Content-Length", Long.toString(fileToUpload.length()));
                serverCon.setRequestProperty(FILENAME_HEADER, destDir + destFileName);
                serverCon.setRequestProperty(FILELASTMOD_HEADER, strLastMod);
                if (webadminApplet.DEBUG) System.out.println("Connection with FTP server established");
                // Create a file stream to read the file and a write stream to send it.
                FileInputStream fis = new FileInputStream(fileToUpload);
                OutputStream os = serverCon.getOutputStream();
                try {
                    // Transfer the file in 4K chunks.
                    byte[] buffer = new byte[4096];
                    long byteCnt = 0;
                    int newPercent = 0;
                    int oldPercent = 0;
                    while (true) {
                        int bytes = fis.read(buffer);
                        if (bytes < 0) break;
                        byteCnt += bytes;
                        // 11-21-02: if itemID is greater than -1 this is an import
                        // file transfer, otherwise a header graphic file transfer.
                        if (itemID > -1) {
                            newPercent = (int) ((double) byteCnt / (double) fileSize * 100.0);
                            int diff = newPercent - oldPercent;
                            if (newPercent == 0 || diff >= 20) {
                                oldPercent = newPercent;
                                jbInit.getImportTable().displayFileTransferStatus(itemID, newPercent);
                            }
                        }
                        os.write(buffer, 0, bytes);
                        os.flush();
                        if (webadminApplet.DEBUG) System.out.println("No of bytes sent: " + byteCnt);
                    }
                } finally {
                    // Close related streams.
                    os.close();
                    fis.close();
                }
                if (webadminApplet.DEBUG) System.out.println("File transmission complete");
                // Find out what the servlet has got to say in response.
                BufferedReader reader = new BufferedReader(
                        new InputStreamReader(serverCon.getInputStream()));
                try {
                    String line;
                    while ((line = reader.readLine()) != null)
                        if (webadminApplet.DEBUG) System.out.println(line);
                } finally {
                    // Close the reader stream from the servlet.
                    reader.close();
                }
            } // end of the big try block.
            catch (Exception e) {
                System.out.println("Exception during file transfer:\n" + e);
                e.printStackTrace();
                return ("FTP failed. See Java Console for Errors.");
            } // end of catch block.
            return ("File: " + fname + " successfully transferred.");
        } // end of method transferFile().
    } // end of class fileTransferClient
    Server side code:
    import java.io.*;
    import javax.servlet.*;
    import javax.servlet.http.*;
    import java.util.*;
    import java.net.*;

    // This servlet class acts as an FTP server to enable transfer of
    // files from the client side.
    public class FtpServerServlet extends HttpServlet {

        String ftpDir = "D:\\pub\\FTP\\";
        private static final String FILENAME_HEADER = "fileName";
        private static final String FILELASTMOD_HEADER = "fileLastMod";

        public void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            doPost(req, resp);
        }

        public void doPost(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            // ### for now enable overwrite by default.
            boolean overwrite = true;
            // Get the fileName for this transmission.
            String fileName = req.getHeader(FILENAME_HEADER);
            // Also get the last mod of this file.
            String strLastMod = req.getHeader(FILELASTMOD_HEADER);
            String message = "Filename: " + fileName + " saved successfully.";
            int status = HttpServletResponse.SC_OK;
            System.out.println("fileName from client: " + fileName);
            // If the filename is not specified, complain.
            if (fileName == null) {
                message = "Filename not specified";
                status = HttpServletResponse.SC_INTERNAL_SERVER_ERROR;
            } else {
                // Open the file about to be transferred.
                File uploadedFile = new File(fileName);
                // Check if the file already exists - and overwrite if necessary.
                if (uploadedFile.exists()) {
                    if (overwrite) {
                        // Delete the file.
                        uploadedFile.delete();
                    }
                }
                // Ensure the directory is writable - and a new file may be created.
                if (!uploadedFile.createNewFile()) {
                    message = "Unable to create file on server. FTP failed.";
                    status = HttpServletResponse.SC_INTERNAL_SERVER_ERROR;
                } else {
                    // Get the necessary streams for file creation.
                    FileOutputStream fos = new FileOutputStream(uploadedFile);
                    InputStream is = req.getInputStream();
                    try {
                        // Create a buffer. 4K!
                        byte[] buffer = new byte[4096];
                        // Read from the input stream and write to the file stream.
                        int byteCnt = 0;
                        while (true) {
                            int bytes = is.read(buffer);
                            if (bytes < 0) break;
                            byteCnt += bytes;
                            fos.write(buffer, 0, bytes);
                            // Flush the stream.
                            fos.flush();
                        }
                    } // end of try block.
                    finally {
                        is.close();
                        fos.close();
                        // Set the last mod date for this file.
                        uploadedFile.setLastModified(Long.parseLong(strLastMod));
                    } // end of finally block.
                } // end - the new file may be created on server.
            } // end - we have a valid filename.
            // Set response headers.
            resp.setContentType("text/plain");
            resp.setStatus(status);
            if (status != HttpServletResponse.SC_OK)
                getServletContext().log("ERROR: " + message);
            // Get the output stream and send the message.
            PrintWriter out = resp.getWriter();
            out.println(message);
        } // end of doPost().
    } // end of class FtpServerServlet

    OK - the problem you describe is definitely what's giving you grief.
    The workaround is to use a socket connection and send your own request headers, with the content length filled in. You may have to multipart MIME encode the stream on its way out as well (I'm not sure about that...).
    You can use the following:
    http://porsche.cis.udel.edu:8080/cis479/lectures/slides-04/slide-02.html
    on your server to get a feel for the format that the request headers need to take.
    - Kevin
    I get the Out of Memory error on the client side. I was told that this might be a bug in the URLConnection class implementation: basically, it won't know the content length until all the data has been written to the output stream, so it uses an in-memory buffer to store the data, which causes the memory issues. Do you think there might be a workaround of any kind, or maybe a way that the buffer might be flushed after a certain amount of the file has been uploaded? Do you have any ideas?
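
    One workaround on newer JVMs (5.0 and up): HttpURLConnection grew streaming modes that avoid exactly this buffering, because the content length is declared up front instead of being computed from an in-memory copy of the body. A minimal sketch, assuming Java 5+ is available to the applet; the class and method names are made up for illustration:

    import java.io.*;
    import java.net.*;

    public class StreamingUpload {
        public static void upload(File file, URL servletURL) throws IOException {
            HttpURLConnection con = (HttpURLConnection) servletURL.openConnection();
            con.setDoOutput(true);
            con.setRequestMethod("POST");
            con.setRequestProperty("Content-Type", "application/octet-stream");
            // The key call: the body streams straight to the socket instead of
            // being held in an internal buffer to compute Content-Length.
            con.setFixedLengthStreamingMode((int) file.length());
            InputStream in = new FileInputStream(file);
            OutputStream out = con.getOutputStream();
            try {
                byte[] buf = new byte[4096];
                int n;
                while ((n = in.read(buf)) >= 0) {
                    out.write(buf, 0, n);
                }
            } finally {
                out.close();
                in.close();
            }
            // Read the response so the exchange completes.
            System.out.println("Server responded: " + con.getResponseCode());
        }
    }

    setFixedLengthStreamingMode needs the size up front, which the upload code already knows; setChunkedStreamingMode(4096) works when it doesn't, provided the server side understands chunked transfer encoding.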

  • Out of memory error importing a JPA Entity in WebDynpro Project

    Hi All!
    We are having problems importing JPA entities in a Web Dynpro project; this is our scenario.
    We have two entities: entity A has a ManyToOne relationship with entity B, and at the same time entity B has a OneToMany relationship with entity A. When, in the controller context, we try to create a node using a model binding to entity A, we get an Out of memory error from NetWeaver DS. Trying to figure out the problem, we identified that the model we imported contains the following:
    Entity A
        Entity B
            Entity A
               Entity B
                  and so on... without end
    What are we doing wrong? Or how can we avoid this behavior?
    Regards,

    Hi Kaylan:
    Thanks for your reply. You are right about the error that we are getting. This is our scenario.
    We have an EJB that in some of its methods uses the entity Lote, and this entity has a relationship with the entity Categoria. When we import the EJB using the model importer, NetWeaver imports the EJB methods and the entities that they use.
    So after doing this, besides the EJB's methods, we get these two entities, and they are the ones generating the error when we try to create a context node using the Categoria entity.
    @Entity
    @Table(name="TB_LOTE")
    public class Lote implements Serializable {
         @EmbeddedId
         private Lote.PK pk;

         @ManyToOne
         @JoinColumn(name="CO_CATEGORIALOTE")
         private Categoria coCategorialote;

         @Embeddable
         public static class PK implements Serializable {
              @Column(name="CO_LOTE")
              private String coLote;

              @Column(name="CO_ORDENFABRICACION")
              private String coOrdenfabricacion2;
              // (equals()/hashCode() were garbled in the original post; the
              // surviving fragment suggests hashCode() uses coOrdenfabricacion2)
         }
    }

    @Entity
    @Table(name="TB_CATEGORIA")
    public class Categoria implements Serializable {
         @Id
         @Column(name="CO_CATEGORIA")
         private String coCategoria;

         @OneToMany(mappedBy="coCategorialote")
         private Set<Lote> tbLoteCollection;
    }
    Regards,
    Jose Arango

  • Out of Memory error while builng HTML String from a Large HashMap.

    Hi,
    I am building an HTML string from a large map object that consists of about 32000 objects, using the Transformer class in Java. As this HTML string needs to be displayed in a JSP page, the response time is too high, and it also sometimes throws an out of memory error.
    Please let me know how I can build the library tree (folder structure) HTML string for the first set of, say, 1000 entries and display that in the web page, then detect an onScroll event, handle it in JavaScript functions, and come back and build the tree for the next set of entries in the map, appending that string to the previous one and displaying it accordingly.
    Please let me know:
    1. whether the suggested solution is advisable;
    2. how to build the tree (HTML string) for a set of entries in the map while iterating over it (see the sketch below);
    3. how to detect an onScroll event and handle it.
    Note: handling the events in the JavaScript functions and displaying the tree is already being done using AJAX.
    Thanks for help in advance,
    Kartheek
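
    A minimal sketch of the paging idea behind point 2, with hypothetical names (TreePager, buildPage, and the flat <li> rendering are stand-ins for the real tree markup), assuming a map with stable iteration order such as LinkedHashMap or TreeMap: iterate the map once per request, skip entries before the requested offset, and render only one page's worth, so no single response materializes all 32000 nodes.

    import java.util.*;

    public class TreePager {
        public static String buildPage(Map<String, String> tree, int offset, int pageSize) {
            StringBuilder html = new StringBuilder();
            int index = 0;
            for (Map.Entry<String, String> entry : tree.entrySet()) {
                if (index >= offset + pageSize) break; // past this page: stop early
                if (index >= offset) {                 // within this page: render
                    html.append("<li>").append(entry.getKey())
                        .append(": ").append(entry.getValue()).append("</li>\n");
                }
                index++;
            }
            return html.toString();
        }
    }

    The AJAX onScroll handler would then request the next offset (offset + pageSize) and append the returned fragment to the tree already in the page, which is exactly the append-and-display flow described above.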

    Hi,
    Sorry, I haven't seen any error in the browser, as this may be an Out of memory error which was not handled. I got the following error from the WebLogic console:
    org.apache.struts.actions.DispatchAction: Dispatch[serviceCenterHome] to method 'getUserLibraryTree' returned an exception: java.lang.reflect.InvocationTargetException
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
         at java.lang.reflect.Method.invoke(Method.java:324)
         at org.apache.struts.actions.DispatchAction.dispatchMethod(DispatchAction.java:276)
         at org.apache.struts.actions.DispatchAction.execute(DispatchAction.java:196)
         at org.apache.struts.action.RequestProcessor.processActionPerform(RequestProcessor.java:421)
         at org.apache.struts.action.RequestProcessor.process(RequestProcessor.java:226)
         at org.apache.struts.action.ActionServlet.process(ActionServlet.java:1164)
         at org.apache.struts.action.ActionServlet.doPost(ActionServlet.java:415)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:760)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
         at weblogic.servlet.internal.ServletStubImpl$ServletInvocationAction.run(ServletStubImpl.java:996)
         at weblogic.servlet.internal.ServletStubImpl.invokeServlet(ServletStubImpl.java:419)
         at weblogic.servlet.internal.ServletStubImpl.invokeServlet(ServletStubImpl.java:315)
         at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:6452)
         at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
         at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:118)
         at weblogic.servlet.internal.WebAppServletContext.invokeServlet(WebAppServletContext.java:3661)
         at weblogic.servlet.internal.ServletRequestImpl.execute(ServletRequestImpl.java:2630)
         at weblogic.kernel.ExecuteThread.execute(ExecuteThread.java:219)
         at weblogic.kernel.ExecuteThread.run(ExecuteThread.java:178)
    Caused by: java.lang.OutOfMemoryError
    </L_MSG>
    <L_MSG MN="ILHD-1109" PID="adminserver" TID="ExecuteThread: '14' for queue: 'weblogic.kernel.Default'" DT="2012/04/18 7:56:17:146" PT="WARN" AP="" DN="" SN="" SR="org.apache.struts.action.RequestProcessor">Unhandled Exception thrown: class javax.servlet.ServletException</L_MSG>
    <Apr 18, 2012 7:56:17 AM CDT> <Error> <HTTP> <BEA-101017> <[ServletContext(id=26367546,name=fcsi,context-path=/fcsi)] Root cause of ServletException.
    *java.lang.OutOfMemoryError*
    Please Advise.
    Thanks for your help in advance,
    Kartheek

  • Out of memory error when writing large file

    I have the piece of code below, which works fine for writing small files, but when it encounters much larger files (>80 MB), the JVM throws an out of memory error.
    I believe it has something to do with the Stream classes. If I replace my PrintStream reference with the System.out object (which is commented out below), it runs fine.
    Has anyone else encountered this before?
    try {
        print = new PrintStream(new FileOutputStream(new File(a_persistDir, getCacheFilename()), false));
        // print = System.out;
        for (Iterator strings = m_lookupTable.keySet().iterator(); strings.hasNext(); ) {
            StringBuffer sb = new StringBuffer();
            String string = (String) strings.next();
            String id = string;
            sb.append(string).append(KEY_VALUE_SEPARATOR);
            Collection ids = (Collection) m_lookupTable.get(id);
            for (Iterator idsIter = ids.iterator(); idsIter.hasNext(); ) {
                IBlockingResult blockingResult = (IBlockingResult) idsIter.next();
                sb.append(blockingResult.getId()).append(VALUE_SEPARATOR);
            }
            print.println(sb.toString());
            print.flush();
        }
    } catch (IOException e) {
    } finally {
        if (print != null)
            print.close();
    }

    Yes, my first code just printed the strings as I got them, but it was running out of memory then as well. I constructed a StringBuffer first because I was afraid that the PrintStream wasn't allocating the memory correctly.
    I've also tried flushing the PrintStream after every line is written, but I still run into trouble.

  • General Error and Out of Memory Error - affecting multiple projects

    I am running Final Cut Pro 6 on my 1.25 GHz PowerPC G4 iMac, with 1.25 GB RAM. OS X 10.4.11.
    I have had this setup for more than a year and used it without any problems.
    I have two projects stored on an external LaCie firewire drive with more than 140GB disk space remaining. As of this morning, I cannot work on either of them: both are throwing the "General Error" message, and one of them is also telling me it's "Out of Memory". On project #1, the "Out of Memory" occurs whenever I try to open a clip in the viewer, following a "General Error" message. On project #2, the "General Error" message occurs when I try to open the project; it gets halfway through the process and then throws the error, leaving me unable to open the timeline.
    Both projects are short, less than 3 minutes long, and neither project contains any CMYK or grayscale graphics. Project #2 does have a short Motion sequence.
    Things I have tried:
    ~ restarting and hard-rebooting
    ~ trashing preferences
    ~ repairing disk permissions
    ~ trashing render files
    ~ creating a new project
    ~ searching this forum and Google for other answers
    Help?

    Thanks for the support, Jim. I've had terrible experiences with Firewire drives in the past, too; regrettably, there doesn't seem to be an affordable alternative at this point.
    I just looked up resetting the PMU, as that's not something I've done before. I really hope it's that simple, as the thought of recreating all these clips makes my head hurt. But I'll definitely try your suggestion of reconnecting individual media files first. I've been through that before.

  • Out of Memory Error and large video files

    I created a simple page that links a few large video files (2.5 GB total size). On Preview or FTP upload, Muse crashes and gives an Out of Memory error. How should we handle very large files like this?

    Upload the files to your host using an FTP client (e.g. FileZilla) and hyperlink to them from within your Muse site.
    Muse is currently not designed to upload files this large. The upload functionality takes the simple approach of reading an entire linked file into RAM and then uploading it. Given Muse is currently a 32-bit application, it's limited to using 2 GB of RAM (or less) at any given time, regardless of how much RAM you have physically installed. We should add a check to the "Link to File..." feature so it rejects files larger than a few hundred megs and puts up an explanatory alert. (We're also hard at work on the move to being a 64-bit app, but that's not a small change.)
    In general, your site visitors will have a much better experience viewing such videos if you upload them to a service like YouTube or Vimeo rather than hosting them yourself. Video hosting services provide a huge amount of optimization of video delivery that's not present on standard hosting: automatic resizing of the video to the appropriate resolution for the visitor's device (rather than potentially downloading a huge amount of unneeded data), transcoding to the video format required by the visitor's browser (rather than either having to do so yourself or have some visitors unable to view your video), automatic distribution of a highly viewed video to multiple data centers for better performance in multiple geographies, and no doubt tons of other stuff I'm not thinking of.

  • Large Pdf using XML XSL - Out of Memory Error

    Hi Friends,
    I am trying to generate a PDF from XML, XSL, and FO in Java. It works fine if the PDF to be generated is small.
    But if the PDF to be generated is big, it throws an "Out of Memory" error. Can someone please give me some pointers about the possible reasons for this error? Thanks for your help.
    RM
    Code:
    import java.io.*;
    import javax.servlet.*;
    import javax.servlet.http.*;
    import org.xml.sax.InputSource;
    import org.xml.sax.XMLReader;
    import org.apache.fop.apps.Driver;
    import org.apache.fop.apps.Version;
    import org.apache.fop.apps.XSLTInputHandler;
    import org.apache.fop.messaging.MessageHandler;
    import org.apache.avalon.framework.logger.ConsoleLogger;
    import org.apache.avalon.framework.logger.Logger;

    public class PdfServlet extends HttpServlet {

        public static final String FO_REQUEST_PARAM = "fo";
        public static final String XML_REQUEST_PARAM = "xml";
        public static final String XSL_REQUEST_PARAM = "xsl";
        Logger log = null;
        Com_BUtil myBu = new Com_BUtil();

        public void doGet(HttpServletRequest request,
                          HttpServletResponse response) throws ServletException {
            if (log == null) {
                log = new ConsoleLogger(ConsoleLogger.LEVEL_WARN);
                MessageHandler.setScreenLogger(log);
            }
            try {
                String foParam = request.getParameter(FO_REQUEST_PARAM);
                String xmlParam = myBu.getConfigVal("filePath") + "/" + request.getParameter(XML_REQUEST_PARAM);
                String xslParam = myBu.SERVERROOT + "/jsp/servlet/" + request.getParameter(XSL_REQUEST_PARAM) + ".xsl";
                if ((xmlParam != null) && (xslParam != null)) {
                    XSLTInputHandler input = new XSLTInputHandler(new File(xmlParam), new File(xslParam));
                    renderXML(input, response);
                } else {
                    PrintWriter out = response.getWriter();
                    out.println("<html><head><title>Error</title></head>\n" +
                            "<body><h1>PdfServlet Error</h1><h3>No 'fo' " +
                            "request param given.</body></html>");
                }
            } catch (ServletException ex) {
                throw ex;
            } catch (Exception ex) {
                throw new ServletException(ex);
            }
        }

        public void renderXML(XSLTInputHandler input,
                              HttpServletResponse response) throws ServletException {
            try {
                ByteArrayOutputStream out = new ByteArrayOutputStream();
                response.setContentType("application/pdf");
                Driver driver = new Driver();
                driver.setLogger(log);
                driver.setRenderer(Driver.RENDER_PDF);
                driver.setOutputStream(out);
                driver.render(input.getParser(), input.getInputSource());
                byte[] content = out.toByteArray();
                response.setContentLength(content.length);
                response.getOutputStream().write(content);
                response.getOutputStream().flush();
            } catch (Exception ex) {
                throw new ServletException(ex);
            }
        }

        /**
         * Creates a SAX parser, using the value of org.xml.sax.parser,
         * defaulting to org.apache.xerces.parsers.SAXParser.
         * @return the created SAX parser
         */
        static XMLReader createParser() throws ServletException {
            String parserClassName = System.getProperty("org.xml.sax.parser");
            if (parserClassName == null) {
                parserClassName = "org.apache.xerces.parsers.SAXParser";
            }
            try {
                return (XMLReader) Class.forName(parserClassName).newInstance();
            } catch (Exception e) {
                throw new ServletException(e);
            }
        }
    }
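
    One memory saver worth noting for this servlet, sketched against the same FOP Driver API used above: render straight to the response stream instead of staging the whole PDF in a ByteArrayOutputStream. It won't cure FOP's own appetite for memory on very large documents (splitting the FO into multiple page-sequences is the usual advice there), but it removes a full extra copy of the PDF from the heap, at the cost of no longer being able to set Content-Length.

    // Sketch: stream the PDF directly; Content-Length is omitted because
    // the size isn't known until rendering finishes.
    public void renderXML(XSLTInputHandler input,
                          HttpServletResponse response) throws ServletException {
        try {
            response.setContentType("application/pdf");
            Driver driver = new Driver();
            driver.setLogger(log);
            driver.setRenderer(Driver.RENDER_PDF);
            driver.setOutputStream(response.getOutputStream());
            driver.render(input.getParser(), input.getInputSource());
            response.getOutputStream().flush();
        } catch (Exception ex) {
            throw new ServletException(ex);
        }
    }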

    Hi,
    I did try that initially. After executing the command I get this message:
    C:\>java -Xms128M -Xmx256M
    Usage: java [-options] class [args...]
    (to execute a class)
    or java -jar [-options] jarfile [args...]
    (to execute a jar file)
    where options include:
    -cp -classpath <directories and zip/jar files separated by ;>
    set search path for application classes and resources
    -D<name>=<value>
    set a system property
    -verbose[:class|gc|jni]
    enable verbose output
    -version print product version and exit
    -showversion print product version and continue
    -? -help print this help message
    -X print help on non-standard options
    Thanks for your help.
    RM
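
    That usage screen is the clue: -Xms/-Xmx only take effect when launching an actual class or jar, and for a servlet they belong on the JVM that runs the container, not on a bare java command. A hedged sketch (the class and jar names are made up for illustration):

    REM Standalone launch: the heap flags precede the thing being run.
    java -Xms128M -Xmx256M -cp fop.jar;myapp.jar com.example.PdfMain

    REM For a servlet container, put the same flags in its startup script
    REM instead, e.g. JAVA_OPTIONS in a WebLogic start script or
    REM CATALINA_OPTS for Tomcat.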

  • More on Out of Memory Error

    I recently purchased the CS3 Master Suite. Am running Encore 3.0.1.008. Have a dual core with 2 GB RAM, Vista.
    Built a medium-size movie with 70 clips, 14 timelines, and 14 menus. Everything linked nicely; Check Project finds no errors. Projected size of result on DVD: 3.003 GB.
    Invoke build. Transcoder begins. Fails on second clip. Out of Memory Error.
    Checked the forums. Noted the existence of these problems going back many months. Tried various suggestions. Cleared the media cache. Rebooted, reentered, restarted. Same result.
    Also came across warnings about the 2 GB memory limit. However, note that many, many programs are written with input and output sizes much larger than resident memory. This is a CS101 design issue.
    Because I was obligated to make a release to a client base with high expectations, I scaled back the project to an embarrassingly small size.
    I would call it a demo size as opposed to an acceptable project size.
    Of course, I needed to start from scratch. All attempts to prune the original project to smaller and smaller sizes failed (even with media cache clearing and rebooting). A project with 1 menu and 2 video clips did build.
    My conclusion from this experience is that Encore was designed and tested to meet alpha-release demo standards. There is a well-deserved expectation with other Adobe products (Photoshop, Dreamweaver, etc.) that production standards can be achieved.
    My forum reading suggests that Encore has been in this crippled state for months, perhaps years. My question is, when can we expect a fix? There are plenty of software engineers out there who could bring this up to Photoshop standards.
    I spent $2500 on the Master Suite with the expectation that I could do production work, not serve as unpaid QA for Adobe.

    When I try to build a DVD, this error appears:
    The instruction at 0x007fa94c referenced memory at 0x00000000. The memory could not be read.
    Click on OK to terminate the program.
    Click on CANCEL to debug the program.
    What can I do to fix this problem or to build my DVD?

  • Out of memory error... plz help

    I'm implementing an application containing 90 data fields.
    I'm doing it using JDK 1.4 and Oracle 8i.
    The problem is that every time I try to open any menu item from the menu bar, the application stops, nothing works, and it shows an out of memory error on the console.
    Is it due to the 90 fields together in the database?
    Or does it mean something else?
    Please help.
    Thanks in advance.

    Thanks for the reply.
    In the menu bar I have many menu items.
    When I click on menuitem1, say, there are about 15 fields in which I fill data and insert it into the database.
    In the next menu item, i.e. menuitem2, I have 25 fields, in which I view/read the 15 fields which were previously entered in menuitem1, and I add another 10 fields to the database.
    It happens this way in all the other menu items.
    I had to follow this approach as I am making a project for a company with many departments.
    The code is too large to post over here; if I could get your e-mail address, I can mail it to you.

  • InDesign Giving Me "Out of Memory" Error

    Hi all,
    I've been getting an "out of memory" error coming up on several projects I work on in InDesign. Most recently it was a 189-page document, mostly text, with some imported photos and PDFs. Usually I'd hit OK on the dialog box, but it pops up several times, to the point where I can't do anything but quit and restart the program.
    I've been looking around and read that it's InDesign's own memory and has nothing to do with the computer I'm using. If that's true, is this just what sometimes happens with larger files? It really messes with my workflow, so if there's anything I can do other than just sit back and let it tweak out, I'd prefer that.
    InDesign is CS5, computer is Windows 7.
    Any ideas? Thanks in advance.

    Hi,
    Maybe I'm reading your response wrong, but are you saying a simple Save As might alleviate this whole thing?
    Unfortunately I'm not getting the error right this very second, so I can't replicate it, but it's happened very recently and likely to happen again, so I'd like to know how to deal with it in the future.
    Thanks!

  • Out of Memory error, though nothing has changed?

    I have two large projects, 65 MB each, which I have had open in FCP 7 with no problems for months.
    Then I left them closed for a month and worked on other projects.
    Now I can't open either of those two earlier projects without receiving the Out of Memory error (I'm trying to open each one individually). There are many things to look for when this error appears, as found all over this forum and others. So here is my checklist of facts:
    Neither project contains any PSD files or any sequences with text files in them.
    One of those projects contains no sequences whatsoever, only clips.
    All media for these projects is Apple AVC-Intra 100M 1280x720.
    The RAID drive containing the media for these projects has not been used since the last time the projects were open and working fine.
    The edit system has not changed in any way:
    - FCP 7.0.3
    - Mac Pro
    - OS 10.6.6
    - 2x2.66 Quad Core
    - 6 GB RAM
    - LaCie 8TB RAID, with 1.7TB free
    All RAM is functioning fine.
    I've trashed FCP preferences and received the same results anyway.
    Again, both projects were working fine on this same computer, same version of FCP, even with BOTH projects open at once.
    Also, I've been working with other projects on this same system with no problems, some nearly as large as these two.
    So... none of the obvious culprits are at work here to cause this error, unless I'm missing something.
    Anyone have any ideas? This is a bit scary, as these are documentary projects which are only going to get bigger, and I've done projects bigger than this several times without problems. Hopefully I'm missing something simple...
    Thanks.

    I think I found the culprit, and it's an obvious one:
    My original big project, 65 MB, had a bin that was created by an assistant before I began work on the project, and I had never seen it. That bin has sequence strings for every tape/card loaded into the project, so it held about 65 rather long sequences. Once I found it, I moved that bin to its own project and deleted it from the original project.
    Now the original project opens, and I can open multiple projects at once, no Out of Memory error.
    So it's the bin full of sequences that did it. If I had seen that bin before I wouldn't have had to run this up the flagpole here. But at least it reinforces that word of advice that has appeared in many previous posts about this error: Don't overload a project with long sequences. Break up the sequences into smaller projects and archive them.
    And my own addition to that advice: If you're an assistant setting up a project for an editor, make sure your editor knows what sequences you've tucked away in bins hidden inside bins inside bins, etc.
    All good to go! (Until the next error... stand by...)

  • Out of Memory Error on simple operations

    Hellooo Apple forums! I have a slightly large project referencing lots of media. It's a 25+ MB project file (and growing) containing a folder of clips from hundreds of master tapes, another folder containing hundreds of separately recorded audio files, and another folder containing sequences syncing the two together (also hundreds of these).
    So nothing too complicated here, with my sync sequences being the most complicated thing (simply audio laid under video).
    While opening new sequences, placing footage into my timeline, or opening clips in the viewer, I occasionally get an 'Out of Memory' error. I think I've only encountered the error during these three operations. There doesn't seem to be any consistency to its appearance. Sometimes I can have 15 sequences open before it pops up when I try to open a 16th. Other times I'll only have one sequence open and it'll appear when I try to put a clip into my timeline.
    My computer is a newly purchased dual 2.0 GHz PPC G5 with 512 MB of RAM (<-- the problem?) and a new install of FCP 5.1.2.
    In case this tidbit helps: one time when I encountered the error, I increased my still cache memory from 10% to 15% (changed to 28 MB), and I was able to resume syncing before it came up again.
    Sometimes I close my other sequences so that only one sequence is open; other times I just wait a bit and I can resume syncing. Any advice on how to make this error go away?

    Another question just popped into my mind: is there a technical explanation of why the project becomes slow to handle after a certain point, e.g. a ~100 MB threshold?
    I super vaguely recall reading, a long time ago, that it was due to the way FCP saves the project files, relating to the XML format it uses. Any technical info on the slowness would be awesome to know.
    Also, Shane, can you make a random post here so I can mark you with a solved star?

  • Acrobat XI Pro "Out of Memory" Error.

    We just received a new Dell T7600 workstation (Win7, 64-bit, 64 GB RAM, 8 TB disk storage, and more processors than you can shake a stick at). We installed the Adobe CS6 Master Collection, which includes Acrobat X Pro. Each time we open a PDF larger than roughly 4 MB, the program returns an "out of memory" error. After running updates and uninstalling and reinstalling (several times), I bought a copy of Acrobat XI Pro hoping this would solve the problem. The same problem still exists upon opening the larger PDFs. Our business depends on opening very large PDF files, and we've paid for the Master Collection, so I'd rather not use a freeware PDF reader. Any help, thoughts, and/or suggestions are greatly appreciated.
    Regards,
    Chris

    As mentioned, the TEMP folder is typically the problem. Windows limits the size of this folder, and you have two choices: 1. empty it, or 2. increase the size limit. I am not positive this is the issue, but it does crop up at times. It does not matter how big your hard drive is; it is a matter of the amount of space that Windows has allocated for virtual memory. I am surprised that there is an issue with 64 GB of RAM, but Windows is real good at letting you know you can't have it all for use because you might want to open up something else. That is why a lot of big packages turn off some of the limits of Windows, or use Linux.
