Out of Memory during recon

I am running an initial recon against a DB (Oracle 10g) which has over 150,000 users, and the server is running out of memory. How do I solve this problem?
Thanks

Hi,
Running the job in batches is a really good idea, but you need to evaluate whether your resource/connector allows any kind of filter and, if not, what customization you want to do.
If you have plenty of physical memory and can't use the option above, then use the one below. (By the way, specifying 1 GB of heap is not sufficient given the number of records. Each recon event invokes a series of calls, so your memory grows; GC will try to reclaim it according to your collector's algorithm, but once you reach a point (<1.2 GB here) the JVM fails.)
Check the heap usage at the point your recon failed, then set your initial heap size to a higher value, roughly 1.5x that figure, and the max to about 2.1x.
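For illustration only - where you set this depends on how your app server / recon JVM is started, and the values here are hypothetical - the sizing rule above translates into startup options along these lines:

    java -Xms1536m -Xmx2048m -verbose:gc <your server start class>

-verbose:gc is optional, but it lets you watch how close the heap gets to the cap on the next run.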
-Ankit

Similar Messages

  • OUT OF MEMORY - while loading images (JPEGs)

    Hello,
    We use OHJ (version 4.1.12) inside a Java/Swing application with JDK 1.3.1.
    Our online help contains a lot of larger JPEG images. When the user navigates through the online help, an out-of-memory error occurs while loading the images.
    I tried to split the help pages into a lot of small HTML pages, but this doesn't help. It seems that OHJ does not clear the memory
    when loading the next HTML page.
    Can OHJ deal with larger images?
    Any other possibilities?
    Thanks
    Markus Pohle

    > It seems that the OHJ does not clear the memory when loading the next HTML page.
    > Can the OHJ deal with larger images?
    We have never seen such a problem with large images and OHJ. Could you send us a ZIP containing your help content by e-mail to [email protected] so that we can try to reproduce it?
    Thanks,
    -brian

  • Error: out of memory during render

    Hi,
    I am attempting to write a non-self-contained QuickTime movie from a sequence in FCP 6.0.3. The sequence was originally edited in AIC 720p, then onlined to 8-bit uncompressed via the .m2v files.
    When rendering I receive an "error: out of memory" message.
    I am wondering what might have caused this, as I have not experienced it before. I have done a search within the Mac forums, but none of the threads I found seemed to address my specific issue.
    Any thoughts??
    Thanks,
    -Tom

    Onlined to 8-bit uncompressed SD, or HD?
    You likely have a corrupted media file involved. The failure is usually reported at the same percentage in from the head of the sequence as the bad media: i.e., if the failure is reported after 50% complete, look halfway into your sequence, and re-create or re-capture that area of the sequence.
    Jerry

  • Named running out of memory during internet sharing

    From the logs on the system providing the connection;
    Nov 5 12:03:03 Macintosh named[59]: internal_send: 192.168.2.6#49197: Cannot allocate memory
    Nov 5 12:03:03 Macintosh named[59]: client 192.168.2.6#49197: error sending response: out of memory
    Nov 5 12:03:08 Macintosh natd[76]: failed to write packet back (Network is unreachable)
    Nov 5 12:03:18: --- last message repeated 1 time ---
    Nov 5 12:03:18 Macintosh named[59]: /SourceCache/bind9/bind9-24/bind9/lib/isc/unix/socket.c:1173: unexpected error:
    Nov 5 12:03:18 Macintosh named[59]: internal_send: 192.168.2.6#49197: Cannot allocate memory
    Nov 5 12:03:18 Macintosh named[59]: client 192.168.2.6#49197: error sending response: out of memory
    Nov 5 12:03:23 Macintosh natd[76]: failed to write packet back (Network is unreachable)
    This is a Leopard MacBook sharing its AirPort connection to a G5 desktop plugged in via Ethernet running 10.4.10. This arrangement worked just fine before upgrading the laptop to Leopard. All updates have been run on both systems. Needless to say, the desktop is unable to connect. No errors on the 10.4.10 side.

    Still happens after upgrading the desktop to Leopard.

  • XI running out of memory during mapping runtime

    Hi, I have a scenario where a certain field in the source can result in multiple line items in the target... I saw that when the line items increase to over 50,000 lines in the target, I get this mapping exception -
    During the application mapping com/sap/xi/tf/_MM_Map1_2_ a com.sap.aii.utilxi.misc.api.BaseRuntimeException was thrown: RuntimeException in Message-Mapping transformatio~
    When I reduce the number of potential line items that can be generated, the mapping runs fine... this mapping has a lot of queue Java functions. This leads me to believe that the issue is memory-related...
    How can I overcome this? Are there parameters that can be set to provide more system resources during mapping runtime?

    Hi Aravind,
    Your input file is too large; that's why you are getting that error.
    Ask your BASIS team to increase the Java heap memory.
    Check this link:
    Start java engine failure: how to increase space for object heap
    Regards
    Ramesh

  • System out of memory during deployment

    Hello everybody,
    I have a J2EE project and a respective EAR-project to deploy my application on the WebAS 6.40 (SP13).
    Since yesterday I have the problem that when I add a new entity bean to my J2EE project, I get the following error during deployment.
    If I remove the entity bean, there is no problem deploying the project.
    I tried a lot of things, e.g. changing the heap size of the development workspace or of the SDM, but with no results. Does someone have an idea?
    Is there perhaps a limit on the number of beans in a J2EE project?
    ===========================================================================
    Deployment started Wed Aug 17 11:58:00 CEST 2005
    ===========================================================================
    Starting Deployment of HPMisEAR
    Aborted: development component 'HPMisEAR'/'com.hp'/'localhost'/'2005.08.17.11.51.27':
    Caught exception during application deployment from SAP J2EE Engine's deploy service:
    java.rmi.RemoteException: Cannot deploy application com.hp/HPMisEAR.. Reason: Errors while compiling:
    The system is out of resources.
    Consult the following stack trace for details.
    java.lang.OutOfMemoryError
    ; nested exception is:      com.sap.engine.services.ejb.exceptions.deployment.EJBFileGenerationException: Errors while compiling:
    The system is out of resources.
    Consult the following stack trace for details.
    java.lang.OutOfMemoryError
    (message ID: com.sap.sdm.serverext.servertype.inqmy.extern.EngineApplOnlineDeployerImpl.performAction(DeploymentActionTypes).REMEXC)
    Deployment of HPMisEAR finished with Error (Duration 51252 ms)
    Thanks for help,
    Paulo
    Message was edited by: Paulo Calado

    Hi,
    thank you very much, it works now.
    Best regards,
    Paulo

  • ERROR [B3108]: Unrecoverable out of memory error during a cluster operation

    We are using Sun Java(tm) System Message Queue Version: 3.5 SP1 (Build 48-G). We are using two JMS servers as a cluster.
    But we frequently get the out-of-memory issue during cluster operations.
    Messages also get queued up in the topics. Even though the listeners have the capability to reconnect with the server after the broker restarts, we usually have to restart the consumer instances to get this working.
    Here is the detailed log:
    Jan 5 13:45:40 polar1-18.eastern.com imqbrokerd_cns-jms-18[8980]: [ID 478930 daemon.error] ERROR [B3108]: Unrecoverable out of memory error during a cluster operation. Shutting down the broker.
    Jan 5 13:45:57 polar1-18.eastern18.chntva1-dc1.cscehub.com imqbrokerd: [ID 702911 daemon.notice] Message Queue broker terminated abnormally -- restarting.
    Expecting your attention on this.
    Thanks

    Hi,
    If you do not use any special command-line options, how do you configure your servers/brokers to 1 GB or 2 GB of JVM heap?
    Regarding your question on why the consumers appear to be connecting to just one of the brokers -
    How are the connection factories that the consumers use configured?
    Is the connection factory configured using the imqAddressList and imqAddressListBehavior attributes? Documentation for this is at:
    http://docs.sun.com/source/819-2571/ref_adminobj_props.html#wp62463
    imqAddressList should contain a list of brokers (i.e. 2 for you) in the cluster
    e.g.
    mq://server1:7676/jms,mq://server2:7676/jms
    imqAddressListBehavior defines how the 2 brokers in the above list are picked.
    The default is in the order of the list - so mq://server1:7676/jms will always be
    picked by default. If you want random behavior (which will hopefully even out the
    load), set imqAddressListBehavior to RANDOM.
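    As a minimal sketch (assuming the com.sun.messaging client classes that ship with MQ 3.5; the broker host names are placeholders from the example above), setting these two attributes programmatically looks roughly like this:

        import javax.jms.JMSException;
        import com.sun.messaging.ConnectionConfiguration;
        import com.sun.messaging.ConnectionFactory;

        public class CfSetup {
            public static ConnectionFactory create() throws JMSException {
                ConnectionFactory cf = new ConnectionFactory();
                // List both brokers in the cluster so the client can fail over.
                cf.setProperty(ConnectionConfiguration.imqAddressList,
                               "mq://server1:7676/jms,mq://server2:7676/jms");
                // RANDOM spreads consumers across the two brokers instead of
                // always picking the first address in the list.
                cf.setProperty(ConnectionConfiguration.imqAddressListBehavior,
                               "RANDOM");
                return cf;
            }
        }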
    regards,
    -i
    http://www.sun.com/software/products/message_queue/index.xml

  • Possible "Out of memory" error  during XSLT ?

    Hi,
    I am working on 11gR1.
    In my project I am reading a file in batches of ten thousand messages.
    The file is getting read and archived, and I can see the expected number of instances getting created in the console.
    But nothing useful is visible inside the instance, as the link for the BPEL process is not appearing.
    (I have kept the audit level at Production, but even in this case, at least the link should appear.)
    When I checked the logs, they indicated that the transaction was rolled back due to an out-of-memory error.
    Just before this error, there is a reference to the XSL file which I am using:
    [2010-12-13T08:42:33.994-05:00] [soa_server1] [NOTIFICATION] [] [oracle.soa.bpel.engine.xml] [tid: pool-5-thread-3] [userId: xxxx] [ecid: 0000InVxneH5AhCmvCECVH1D1XvN00002J,0:6:100000005] [APP: soa-infra] [composite_name: xxxx] [component_name: xxxx] [component_instance_id: 560005] [composite_instance_id: 570005] registered the bpel uri resolver [File-based Repository]oramds:/deployed-composites/xxxx_rev1.0/ base uri xsl/ABCD.xsl
    [2010-12-13T08:46:12.900-05:00] [soa_server1] [ERROR] [] [oracle.soa.mediator.dispatch.db] [tid: oracle.integration.platform.blocks.executor.WorkManagerExecutor$1@e01a3a] [userId: <anonymous>] [ecid: 0000InVuNCt5AhCmvCECVH1D1XvN000005,0] [APP: soa-infra] DBContainerIdManager:run() failed with error.Rolling back the txn[[
    java.lang.OutOfMemoryError
    My question is: is there any limit on how much payload Oracle's XSLT parser can handle in one go?
    Is decreasing the batch size the only possible solution for this?
    Please share your valuable inputs,
    Ketan
    Is there any limit on how many elements the XSLT parser can handle?
    I am reading a file in batches of 10 thousand messages per file. (Each record has some 6-8 fields.)
    The file is getting picked up but the instance does not show anything.

    > I'm getting an out of memory error during system copy import for a dual-stack system (ABAP & JAVA).
    >
    > FJS-00003  out of memory (in script NW_Doublestack_CI|ind|ind|ind|ind, line 6293
    > 6: ???)
    Is this a 32bit instance? How much memory do you have (physically) in that machine?
    Markus

  • Out of memory error during installation

    Hi,
    I am trying to install BPEL 10.1.2.0.2, using my already present metadata DB (BPEL Process Manager for OracleAS Middle Tier) as the dehydration DB. When it is performing the "Oracle BPEL Process Manager OID configuration Assistant" step, it displays "java.lang.OutOfMemoryError" and gets stuck. The log file has the following just before the "out of memory" error:
    Subscriber "<name>" conatins multiple values for the attribute 'orclcommongroupsearchbase'
    1. cn=users, <namespace>
    2. cn=Groups, <namespace>
    Please help
    Thanks in advance


  • Problem with out of memory and reservation of memory

    Hi,
    we are running a very simple Java program on HP-UX that does some text substitution - replacing special characters with other characters.
    The files that are converted are sometimes very large, and now we have come to a point where the Java server doing the work crashes with an "Out of memory" message (no stack) when it processes one single 500MB file.
    I have encountered this error before (with smaller files) and then made the maximum heap larger, but now when I try to set it to 4000M
    I get the message:
    "Error occurred during initialization of VM
    Could not reserve enough space for old generation heap"
    When it crashes with this message, my settings are:
    -XX:NewSize=500m -XX:MaxNewSize=1000m -XX:SurvivorRatio=8 -Xms1000m -Xmx4000m
    If I run with -Xmx3000m instead, the Java program starts but I get an Out of memory error like:
    java.lang.OutOfMemoryError
    <<no stack trace available>>
    The GC log file created when it crashes looks like:
    <GC: -1 31.547669 1 218103808 32 219735744 0 419430400 0 945040 52428800 0 109051904 524288000 877008 877008 1048576 0.934021 >
    <GC: -1 62.579563 2 436207616 32 218103808 0 419430400 945040 944592 52428800 109051904 327155712 524288000 877008 877008 1048576 2.517598 >
    <GC: 1 65.097909 1 436207616 32 0 0 419430400 944592 0 52428800 327155712 219048400 524288000 877008 877008 1048576 2.061976 >
    <GC: 1 67.160178 2 436207616 32 0 0 419430400 0 0 52428800 219048400 219048400 524288000 877008 877008 1048576 0.041408 >
    <GC: -1 128.133097 3 872415232 32 0 0 419430400 0 0 52428800 655256016 655256016 960495616 877008 877008 1048576 0.029950 >
    <GC: 1 128.163584 3 872415232 32 0 0 419430400 0 0 52428800 655256016 437152208 960495616 877008 877008 1048576 3.971305 >
    <GC: 1 132.135106 4 872415232 32 0 0 419430400 0 0 52428800 437152208 437152208 960495616 877008 876656 1048576 0.064635 >
    <GC: -1 256.378152 4 1744830464 32 0 0 419430400 0 0 52428800 1309567440 1309567440 1832910848 876656 876656 1048576 0.058970 >
    <GC: 1 256.437652 5 1744830464 32 0 0 733282304 0 0 91619328 1309567440 873359824 1832910848 876656 876656 1048576 8.255321 >
    <GC: 1 264.693275 6 1744830464 32 0 0 733282304 0 0 91619328 873359824 873359824 1832910848 876656 876656 1048576 0.103764 >
    We are running:
    java version "1.3.1.02"
    Java(TM) 2 Runtime Environment, Standard Edition (build 1.3.1.02-011206-02:17)
    Java HotSpot(TM) Server VM (build 1.3.1 1.3.1.02-JPSE_1.3.1.02_20011206 PA2.0, mixed mode)
    We have 132GB of physical memory and a lot of unused swap space, so I can't imagine we have a problem with that.
    Can anyone please suggest how to proceed with troubleshooting, or which settings to change? I'm not really into Java, so I really need some help.
    Usually the Java program handles thousands of smaller files (around 500 KB - 1 MB in size).
    Thanks!

    You have a one-to-one mapping, where one character is replaced with another?
    And all you do is read the file, replace, and then write?
    Then there is no reason to have the entire file in memory.
    Other than that, you need to determine whether the VM (which is not a Sun VM) has an upper memory bound - a limit that the VM will not go beyond regardless of memory in the system.
    > We have 132GB of physical memory and a lot of not used Swap space
    One would wonder why you have swap space at all.
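    To make the "no reason to have the entire file in memory" point concrete, here is a minimal sketch of a streaming substitution (the '#' to '-' mapping is a made-up example; it uses plain java.io only, so it also compiles on an old 1.3 VM like the one in this thread). Heap use stays bounded by the buffer size no matter how large the file is:

        import java.io.*;

        public class StreamReplace {
            public static void main(String[] args) throws IOException {
                Reader in = new BufferedReader(new FileReader(args[0]));
                Writer out = new BufferedWriter(new FileWriter(args[1]));
                try {
                    char[] buf = new char[8192];
                    int n;
                    // Read a chunk, substitute in place, write it out, repeat.
                    while ((n = in.read(buf)) != -1) {
                        for (int i = 0; i < n; i++) {
                            if (buf[i] == '#') buf[i] = '-'; // hypothetical mapping
                        }
                        out.write(buf, 0, n);
                    }
                } finally {
                    out.close();
                    in.close();
                }
            }
        }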

  • RoboHelp 9 gives an out of memory error and crashes when I try to import or link a Frame 10 file or

    I have Tech Suite 3. If I start a new RoboHelp project and try to import or link Frame files, RoboHelp tries for a while, then I get an Out of Memory error and the program crashes.
    I opened one of the sample projects and was able to link to one of my frame files without any problem, so it seems to be an issue with creating something new.
    Any suggestions?

    It happens when I create a new project and then try to import or link Frame docs to make up the content. It starts scanning, then crashes. I did get it to the conversion settings page once, but no further.
    It does not happen if I open one of the supplied example projects and link a file. But then it doesn't let me choose any style mapping during import. And I can't delete the sample project folder.
    Twice now it has told me, when I tried to import (not link, but import), that my .fm file could not be opened, and told me to verify that Frame is installed (it is) and that the file is a valid Frame file (it is).
    The docs and project are in separate folders on my C: drive.

  • Windows Mail could not be started. Make sure that your disk is not full or that you are not out of memory (0x800C0155).

    I have Vista SP2. Every time I open Firefox (3.6.13), I get three consecutive error messages:
    1) Windows Mail could not be started. The application was unable to open the Windows Mail message store. Your Windows Mail mailbox data is currently being used by another program, such as a virus scanner. Close the program or wait for it to complete its operation, then open Windows Mail again (0x800C0155).
    2) Windows Mail could not be started. Make sure that your disk is not full or that you are not out of memory (0x800C0155).
    3) Windows Mail could not be started because MSOE.DLL could not be initialized.
    This series of messages continues to recur during my Firefox session and eventually freezes up my session.
    I use Gmail. I do not use Windows Mail or Outlook Express, and I deleted Windows Live Mail. There is a file named msoe.dll at the path C:/Program Files/Windows Mail. I also found C:/Program Files/Windows Live/RemoteActiveX even though I thought I had deleted all Windows Live programs.
    I have set Firefox as the default browser. Also, in Firefox, if I click on "File=>Send Link", it triggers the series of three error messages above.
    I don't understand why Gmail would reference Winmail. Any ideas how I can fix this? Can I or should I delete the Windows Mail and Windows Live folders?

    I have had many problems with permissions on Windows Vista, 7, 8 & 8.1. It is usually after an update or installation, especially one that requires a reboot to finish (some AVG, Windows Live, Windows Updates, etc.). Once file, folder and registry key permissions are screwed up, it can take days to work them out, if ever.
    After having spent several hours on a client's PC and in online forums with this problem, I transferred her user profile with a free program called Transwiz (run as Administrator) to a zip file on the HDD. I renamed her user in Manage User Accounts to "Username old". I then ran Transwiz again to restore the profile, telling it to create a new user using the original username.
    After it completed restoring the profile, I logged out of "Username old" and logged in to Username, and all was well; I only needed to supply the email password, which is not transferred. When you move profile data, all the old permissions have to be removed and new permissions added so the data and files belong to the new user. This cleared up all the permission problems.
    James Taylor

  • Uploading large files from applet to servlet throws out of memory error

    I have a Java applet that needs to upload files from a client machine to a web server using a servlet. The problem I am having is that, in the current scheme, files larger than 17-20MB throw an out of memory error. Is there any way we can get around this problem? I will post the client and server side code for reference.
    Client Side Code:
    import java.io.*;
    import java.net.*;

    // This class is a client that enables transfer of files from the client
    // to the server. It connects to a servlet running on the server and
    // transmits the file. (webadminApplet and jbInit are external classes
    // from the original project.)
    public class fileTransferClient {

        private static final String FILENAME_HEADER = "fileName";
        private static final String FILELASTMOD_HEADER = "fileLastMod";

        // This method transfers the prescribed file to the server.
        // If the destination directory is "", it transfers the file to a default directory.
        // 11-21-02 changes: this method now has a new parameter that references
        // the item that is being transferred in the import list.
        public static String transferFile(String srcFileName, String destFileName,
                                          String destDir, int itemID) {
            if (destDir.equals(""))
                destDir = "E:\\FTP\\incoming\\";

            // Get the fully qualified filename and the mere filename.
            String fqfn = srcFileName;
            String fname = fqfn.substring(fqfn.lastIndexOf(File.separator) + 1);

            try {
                // Create the file to be uploaded and a connection to the servlet.
                File fileToUpload = new File(fqfn);
                long fileSize = fileToUpload.length();

                // Get the last mod of this file; it is sent to the servlet as a header.
                long lastMod = fileToUpload.lastModified();
                String strLastMod = String.valueOf(lastMod);

                URL serverURL = new URL(webadminApplet.strServletURL);
                URLConnection serverCon = serverURL.openConnection();

                // A bunch of connection setup related things.
                serverCon.setDoInput(true);
                serverCon.setDoOutput(true);

                // Don't use a cached version of the URL connection.
                serverCon.setUseCaches(false);
                serverCon.setDefaultUseCaches(false);

                // Set headers and their values.
                serverCon.setRequestProperty("Content-Type", "application/octet-stream");
                serverCon.setRequestProperty("Content-Length",
                        Long.toString(fileToUpload.length()));
                serverCon.setRequestProperty(FILENAME_HEADER, destDir + destFileName);
                serverCon.setRequestProperty(FILELASTMOD_HEADER, strLastMod);

                if (webadminApplet.DEBUG)
                    System.out.println("Connection with FTP server established");

                // Create the file stream and the output stream for the file data.
                FileInputStream fis = new FileInputStream(fileToUpload);
                OutputStream os = serverCon.getOutputStream();
                try {
                    // Transfer the file in 4K chunks.
                    byte[] buffer = new byte[4096];
                    long byteCnt = 0;
                    int newPercent = 0;
                    int oldPercent = 0;
                    while (true) {
                        int bytes = fis.read(buffer);
                        if (bytes < 0) break; // check EOF before counting, so -1 is not added
                        byteCnt += bytes;
                        // 11-21-02: if itemID is greater than -1 this is an import file
                        // transfer, otherwise this is a header graphic file transfer.
                        if (itemID > -1) {
                            newPercent = (int) ((double) byteCnt / (double) fileSize * 100.0);
                            int diff = newPercent - oldPercent;
                            if (newPercent == 0 || diff >= 20) {
                                oldPercent = newPercent;
                                jbInit.getImportTable().displayFileTransferStatus(itemID,
                                        newPercent);
                            }
                        }
                        os.write(buffer, 0, bytes);
                        os.flush();
                        if (webadminApplet.DEBUG)
                            System.out.println("No of bytes sent: " + byteCnt);
                    }
                } finally {
                    // Close related streams.
                    os.close();
                    fis.close();
                }
                if (webadminApplet.DEBUG)
                    System.out.println("File Transmission complete");

                // Find out what the servlet has got to say in response.
                BufferedReader reader = new BufferedReader(
                        new InputStreamReader(serverCon.getInputStream()));
                try {
                    String line;
                    while ((line = reader.readLine()) != null)
                        if (webadminApplet.DEBUG) System.out.println(line);
                } finally {
                    // Close the reader stream from the servlet.
                    reader.close();
                }
            } // end of the big try block.
            catch (Exception e) {
                System.out.println("Exception during file transfer:\n" + e);
                e.printStackTrace();
                return ("FTP failed. See Java Console for Errors.");
            } // end of catch block.
            return ("File: " + fname + " successfully transferred.");
        } // end of method transferFile().
    } // end of class fileTransferClient
    Server side code:
    import java.io.*;
    import java.util.*;
    import java.net.*;
    import javax.servlet.*;
    import javax.servlet.http.*;

    // This servlet class acts as an FTP server to enable transfer of files
    // from the client side.
    public class FtpServerServlet extends HttpServlet {

        String ftpDir = "D:\\pub\\FTP\\";
        private static final String FILENAME_HEADER = "fileName";
        private static final String FILELASTMOD_HEADER = "fileLastMod";

        public void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            doPost(req, resp);
        }

        public void doPost(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            // ### for now enable overwrite by default.
            boolean overwrite = true;

            // Get the fileName for this transmission.
            String fileName = req.getHeader(FILENAME_HEADER);
            // Also get the last mod of this file.
            String strLastMod = req.getHeader(FILELASTMOD_HEADER);

            String message = "Filename: " + fileName + " saved successfully.";
            int status = HttpServletResponse.SC_OK;
            System.out.println("fileName from client: " + fileName);

            // If the filename is not specified, complain.
            if (fileName == null) {
                message = "Filename not specified";
                status = HttpServletResponse.SC_INTERNAL_SERVER_ERROR;
            } else {
                // Open the file stream for the file about to be transferred.
                File uploadedFile = new File(fileName);

                // Check if the file already exists - and overwrite if necessary.
                if (uploadedFile.exists() && overwrite) {
                    // Delete the file.
                    uploadedFile.delete();
                }

                // Ensure the directory is writable - and a new file may be created.
                if (!uploadedFile.createNewFile()) {
                    message = "Unable to create file on server. FTP failed.";
                    status = HttpServletResponse.SC_INTERNAL_SERVER_ERROR;
                } else {
                    // Get the necessary streams for file creation.
                    FileOutputStream fos = new FileOutputStream(uploadedFile);
                    InputStream is = req.getInputStream();
                    try {
                        // Create a buffer. 4K!
                        byte[] buffer = new byte[4096];
                        // Read from the input stream and write to the file stream.
                        int byteCnt = 0;
                        while (true) {
                            int bytes = is.read(buffer);
                            if (bytes < 0) break;
                            byteCnt += bytes;
                            fos.write(buffer, 0, bytes);
                            // Flush the stream.
                            fos.flush();
                        }
                    } finally {
                        is.close();
                        fos.close();
                        // Set the last mod date for this file.
                        uploadedFile.setLastModified(Long.parseLong(strLastMod));
                    } // end of finally block.
                } // end - the new file may be created on server.
            } // end - we have a valid filename.

            // Set response headers.
            resp.setContentType("text/plain");
            resp.setStatus(status);
            if (status != HttpServletResponse.SC_OK)
                getServletContext().log("ERROR: " + message);

            // Get the output stream.
            PrintWriter out = resp.getWriter();
            out.println(message);
        } // end of doPost().
    } // end of class FtpServerServlet

    OK - the problem you describe is definitely what's giving you grief.
    The workaround is to use a socket connection and send your own request headers, with the content length filled in. You may have to multipart-MIME encode the stream on its way out as well (I'm not sure about that...).
    You can use the following:
    http://porsche.cis.udel.edu:8080/cis479/lectures/slides-04/slide-02.html
    on your server to get a feel for the format that the request headers need to take.
    - Kevin
    I get the out of memory error on the client side. I was told that this might be a bug in the URLConnection class implementation: basically, it won't know the content length until all the data has been written to the output stream, so it uses an in-memory buffer to store the data, which causes the memory issues. Do you think there might be a workaround of any kind, or maybe a way that the buffer might be flushed after a certain amount of the file has been uploaded? Do you have any ideas?
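    For what it's worth on newer JDKs: since Java 5, HttpURLConnection has setFixedLengthStreamingMode, which declares the body length up front so the connection writes straight to the socket instead of buffering the whole request in memory - exactly the buffer growth described above. A minimal sketch (the code in this thread predates that API, so treat this as a later-JDK alternative rather than a drop-in fix):

        import java.io.*;
        import java.net.*;

        public class StreamingUpload {
            public static int upload(File file, URL servletURL) throws IOException {
                HttpURLConnection con = (HttpURLConnection) servletURL.openConnection();
                con.setRequestMethod("POST");
                con.setDoOutput(true);
                // Declare the exact body size so the connection streams the body
                // instead of buffering it all to compute Content-Length itself.
                con.setFixedLengthStreamingMode((int) file.length()); // long overload needs JDK 7+
                con.setRequestProperty("Content-Type", "application/octet-stream");
                InputStream fis = new FileInputStream(file);
                OutputStream os = con.getOutputStream();
                try {
                    byte[] buf = new byte[4096];
                    int n;
                    while ((n = fis.read(buf)) != -1) {
                        os.write(buf, 0, n);
                    }
                } finally {
                    os.close();
                    fis.close();
                }
                // Read the response code to complete the HTTP exchange.
                return con.getResponseCode();
            }
        }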

  • Error compiling movie: out of memory !!! Premiere CS3

    Hey every1 --- in desperate need of some guidance here. I've been all over the internet and these boards --- I've seen this come up with a lot of people, but no real solid answers that have helped.
    First off, my comp is an imac 3.06 Ghz Core 2 Duo, 4GB memory. (just bought the computer last year)
    I'm shooting with a canon 7d in HD (1920 x 1080p) and I'm editing using Premiere CS3. I've noticed that editing has become very difficult since I started using HD; playback is very choppy and the 'quality settings' in the preview monitor are a joke and do absolutely nothing whether it's on draft or automatic.
    Anyway, despite the choppiness and slow editing, I was able to get a 3 min clip completed. These are my export settings:
    Export > Movie
    H.264
    960 x 540
    23.976 fps
    Square pixels,
    100% quality,
    recompress: unchecked.
    Once exported (I was able to export smaller timelines previously--- say, 50 secs of footage or so) the files look AMAZING and from there I use them for online video via vimeo -- stuff looks really awesome.
    Anyway, recently I'm getting: 'Error compiling movie. Out of Memory. Change optimization preferences to memory', etc. etc. Well, I did as it asked and that did absolutely nothing.
    So, I decided to open up my activity monitor and watch what's going on with my memory while premiere is exporting. I also closed ALL unused applications. System runs at about 2.16gb free memory before I start exporting. Memory seems stable during export UNTIL around the 50% exporting mark, and then its starts climbing like crazy and my memory just gets eaten up --- finally resulting in a crash at about 70% render.
    I really don't know what else to do. Does my computer not have the power to export 3 min of HD footage??? This seems surprising to me ... I don't know what else to try.
    Thanks for any help !!!
    -Mark

    Hi Mark,
    Hope you are having a wonderful day.
    Sorry to hear that you are having such a hard time. Although I did not specifically get the out of memory error, I can only be convinced that that is the problem.
    Although I am not a computer whiz, I have learned my share of things. I have been troubleshooting a video all this week (really, a couple of months), but I had to put it aside and come back to it.
    I tend to be talkative, so I will get back on topic or at least try to stay on topic, so bear with me. Smile....
    Here is what I have learned and maybe it will help you with yours if you have not found a solution already.
    Based on your hard drive specs and RAM, I was thinking the same thing, e.g. hard to believe that you are not able to render 3 minutes of video in HD. I primarily work in HD/1080p.
    The place where we are similar is that I eventually could get the video to render by rendering the workspace piece by piece until I was able to render the entire file. When I tried to encode the video, I kept getting error compiling messages. I tried different codecs, deleting xmpses, not rendering xmpses, etc. I could not figure out the keyframe trick, so I just put in a black video from After Effects. By the way, all of my videos intro the same way, and there were no problems with the other videos fading from black to a still photo.
    Here is the problem from what I could gather.
    Simply, out of memory. I was rendering and trying to encode a 45 minute video with After Effects compositions, and the dynamic linking kept timing out once the file reached a certain size. There were no other truly fancy effects. I do a lot on the green screen and just use After Effects to make a white background. With smaller videos, rendering has been slow, but doable. This massive video would never compile. I won't bore you with my system specs, but know that, similar to your situation, the system in theory should have been adequate.
    Work around...
    I ended up exporting the video as 3 separate 15 minute video clips. I could not get them to encode as .wmv files so I encoded them as .mp4 files. By the way, I still encoded in HD.
    Here is the kool stuff.
    After what seems like forever, I found a converter that could convert .mp4 to .wmv and rejoin the files to play as one large file with the same quality as my source file and no audio synching issues. By the way, I do not work for this company. I just found this encoder at 4 o'clock this morning after trying all of these other converters that were straight garbage or were free to try and not really free if you wanted quality. By the way, the conversion tool is not only user friendly, but free which is awesome.
    The name of the software is Media Cope. Just visit www.mediacope.com.... Be sure to send the owner an e-mail to thank him and let him know that Mark sent you. I just sent him an e-mail earlier to let him know that I was going to post on Adobe what a wonderful software he created. I just needed something free, quick, easy, and user-friendly.
    Whoops... got off topic....
    I'll do a summary at the end for my recommendation to you.
    Here are some other tidbits that I learned as well that may save you some time.
    When editing this is what I do now...
    1) I use CCleaner from www.piriform.com to quickly clean my system especially temporary files before I start editing and if I am switching from project to project.
    2) In all of my troubleshooting this week, I found a kool little software called Smart Close. (Just google to find this one.) It closes all of your non-critical background processes in order to maximize system resources. (When I am done editing, I just reboot and all of my background programs start up again. You do not have to do it that way, that's just how I do it. The nice thing is that it is a wizard interface, so you do not have to try to figure out if a process in the background is critical or non-critical.)
    3) Depending on my mood and if I did not forget, smile.... I change the priority to high for the adobe products in the task manager.
    More things that I learned...
    1) We are not able to change the temp location that Adobe encodes to. My logic was that if I could point the encoding to a hard drive that was simply empty, so Adobe could render there, then I would be good. Wrong... Adobe does not allow you to change the temp location. (Note: this is what I discovered in the different forums; so encoding to an external hard drive is not the best move. Read more to see why.)
    2) It is best to have your scratch disks on your internal disk. If you have multiple internal disks, then you should check some other forums for the best way to set up your scratch disks. When I was just starting out, I bought an external hard drive enclosure with the thought that I would work primarily off of those external drives, with the option of doing some RAID work. Bad idea... Even the simplest of tasks took forever. It was best to work off of the internal drive. By the way, I still use the enclosure hard drives for backups in order to free up some space on the primary hard drive. Rendering truly uses both the RAM and the hard drive space in the editing process.
    I'll leave you with this...
    I take back what I said earlier, you do need a new computer with more RAM. If you can get 16 GB, great. The primary reason is that if you are editing in Premiere and if you do anything with dynamic linking into your after effects, it will cause your ram to climb as the file gets larger. Of course, be sure to get the largest hard drive possible as an internal drive, the maximum on processor speed, 64 bit operating system, etc.
    I know by now you are like, Mark, what is the point of all of your gibberish... Really, it was to share with you and others some really kool workarounds.
    My recommendations for you are as follows.
    1) You have already proven that you can render and encode 50 second clips and you are very happy with those clips, so create separate smaller clips for this particular project.
    2) Once you have the final clips outputted, then use Media Cope to rejoin the files as one large file and you are done.
    (Note: The one thing that I did notice is that there is a small break right between where the files are merged. The nice thing is that as a video editor/film maker, you will be the only person likely to notice it because you know where the breaks are. The other nice thing is that the untrained eye will not notice it and if you had to submit your work to a television station especially on DVD, the video system should recognize the file as one large file and play all the way through. Note: For the last sentence, I said 'should play all the way through' as I have not had a chance to check this one yet as I just finished this project today and still have to turn in all of my videos. I still have some more videos to merge.)
    By the way, the only specs that I will share are that I work on a PC and it has 8 GB of ram.
    The summary of the summary is that sometimes an error compiling message could simply mean that your system does not have the resources to render and then encode that large of a file. At least that is what I learned this week, or at least in my specific situation. The confirmation is that I just finished shooting a documentary which was one hour and some change, and I was able to render and encode to .wmv with no problems. Of course, I was not using After Effects for this particular project.
    In either case, hope all the information helps you with your project.
    Have a wonderful day and happy holidays everyone.
    Mark

  • Error due to out of memory condition

    Hi,
    system: Windows XP, 2 GB RAM, InDesign CS3
    I placed two files in InDesign: the first is .ai (1.1 MB, probably exported from a CAD program), the second is .psd (53 MB). When I try to print to a .ps file, InDesign displays the message: "Export error: error due to out of memory condition". If I: 1. export to PDF, everything is OK; 2. rasterize the .ai in Photoshop, then import the resulting .psd into InDesign, printing to a .ps file is OK.
    Does somebody have an answer to my problem?

    My guess is "mistakes" referred to simply not outputting as intended by the designer. I had output issues when going directly to PDF and had to continue the old way of doing things by outputting to .ps and then distilling to PDF. The "mistakes" that occurred for me went nearly unnoticed. The publication was a 128+4 product catalog that had a header at the top of each page. I never had any problems outputting in the past. At the time, I had just updated to CS3 (XP Pro SP3, dual core, 4 gigs RAM). I was advised that outputting directly to PDF had been drastically improved. I output my files without "mistakes" or errors, or so I thought. After proofing, I didn't realize that all of the drop shadows for the header headline were not applied. The problem, though, wasn't that they were missing from all headers; it started at page 60 or so, halfway through. After getting the finished publication back, I didn't notice it for a few months. After noticing it, I investigated. Several troubleshooting hours later (after forum posts and other expert help), I/we concluded that it was simply a program deficiency. During my troubleshooting, I looked at several areas, mainly focusing on the transparency settings. I output the document several times. I finally exported each page separately and found that that was the only way I could get every page to include the drop shadow in each header (past page 60 or so).
    I have been past that issue for a while now. Now I am on Win 7, quad core, 6 gigs of RAM, and I don't have the outputting problem that I used to. Now I occasionally get this "Error due to out of memory condition", which is BS because my hardware specs are beyond reasonable for what I typically create in my workflow. Most of the time, my memory is only at about half of its capacity when I get this error, and not when I have a PS open with a large multipage file that I am working on. My out of memory error happens during file output or just when performing ordinary layout functions in InDesign.
    When on my personal computer (aluminum iMac 2.4 Core 2 Duo, 4 gigs of RAM, 512 MB video, CS4 & CS5), I have neither of the above issues.
