Out-of-memory errors - how to debug/fix

I've only recently noticed that many of the tests we were running on our Windows XP Oracle 10g server were failing from lack of memory. I have performed a lot of tests, added the /3GB switch to boot.ini, and tried many values for pga_aggregate_target, sga_target and sga_max_size, but I still get the error.
Google searches and parameter tweaking have not helped. We use a great many Java stored procedures in this query.
From our Java application's log:
04-12-2008 02:53:52 ERROR (ProcessLauncher.java:31) >> session @9/146 calculatePLC(null,11801,12000) stopped: ORA-04030: out of process memory when trying to allocate 4032 bytes (ioc_make_sub2,UGAClass)
04-12-2008 02:53:52 ERROR (ProcessLauncher.java:31) >>
04-12-2008 02:57:31 ERROR (ProcessLauncher.java:31) >> session @7/143 calculatePLC(null,18002,18201) stopped: ORA-04030: out of process memory when trying to allocate 8288564 bytes (joxp heap,f:Reserved3)
04-12-2008 02:57:31 ERROR (ProcessLauncher.java:31) >>
04-12-2008 03:00:20 ERROR (ProcessLauncher.java:31) >> session @8/145 calculatePLC(null,21402,21601) stopped: ORA-04030: out of process memory when trying to allocate 8388404 bytes (joxp heap,f:Reserved3)
04-12-2008 03:00:20 ERROR (ProcessLauncher.java:31) >>
04-12-2008 03:27:09 ERROR (ProcessLauncher.java:99) session @9/146 calculatePLC(null,11801,12000) stopped: ORA-04030: out of process memory when trying to allocate 4032 bytes (ioc_make_sub2,UGAClass)
session @7/143 calculatePLC(null,18002,18201) stopped: ORA-04030: out of process memory when trying to allocate 8288564 bytes (joxp heap,f:Reserved3)
session @8/145 calculatePLC(null,21402,21601) stopped: ORA-04030: out of process memory when trying to allocate 8388404 bytes (joxp heap,f:Reserved3)
From bdump/alert_orcl.log:
Tue Nov 04 10:58:25 2008
Errors in file c:\oracle\product\10.2.0\admin\orcl\udump\orcl_ora_1476.trc:
ORA-04030: out of process memory when trying to allocate 38528564 bytes (joxp heap,f:OldSpace)
Tue Nov 04 10:58:30 2008
Thread 1 advanced to log sequence 241
Current log# 3 seq# 241 mem# 0: C:\ORACLE\PRODUCT\10.2.0\ORADATA\ORCL\REDO03.LOG
Tue Nov 04 10:58:39 2008
Errors in file c:\oracle\product\10.2.0\admin\orcl\udump\orcl_ora_2880.trc:
ORA-04030: out of process memory when trying to allocate 32111412 bytes (joxp heap,f:OldSpace)
Thu Dec 04 02:53:34 2008
Errors in file c:\oracle\product\10.2.0\admin\orcl\udump\orcl_ora_1840.trc:
ORA-04030: out of process memory when trying to allocate 18403380 bytes (joxp heap,f:OldSpace)
Thread 1 cannot allocate new log, sequence 3974
Checkpoint not complete
Current log# 3 seq# 3973 mem# 0: C:\ORACLE\PRODUCT\10.2.0\ORADATA\ORCL\REDO03.LOG
Thread 1 advanced to log sequence 3974
Current log# 1 seq# 3974 mem# 0: C:\ORACLE\PRODUCT\10.2.0\ORADATA\ORCL\REDO01.LOG
Thu Dec 04 02:53:37 2008
Errors in file c:\oracle\product\10.2.0\admin\orcl\udump\orcl_ora_1840.trc:
ORA-04030: out of process memory when trying to allocate 753120 bytes (pga heap,kco buffer)
ORA-04030: out of process memory when trying to allocate 18403380 bytes (joxp heap,f:OldSpace)
Thu Dec 04 02:53:41 2008
Process startup failed, error stack:
Thu Dec 04 02:53:41 2008
Errors in file c:\oracle\product\10.2.0\admin\orcl\bdump\orcl_psp0_2260.trc:
ORA-27300: OS system dependent operation:spcdr:9261:4200 failed with status: 997
ORA-27301: OS failure message: Overlapped I/O operation is in progress.
ORA-27302: failure occurred at: skgpspawn
Thu Dec 04 02:53:42 2008
Thread 1 advanced to log sequence 3975
Current log# 2 seq# 3975 mem# 0: C:\ORACLE\PRODUCT\10.2.0\ORADATA\ORCL\REDO02.LOG
Thu Dec 04 02:53:42 2008
Process J000 died, see its trace file
Thu Dec 04 02:53:42 2008
kkjcre1p: unable to spawn jobq slave process
Thu Dec 04 05:03:46 2008
Errors in file c:\oracle\product\10.2.0\admin\orcl\udump\orcl_ora_2648.trc:
ORA-04030: out of process memory when trying to allocate 36518964 bytes (joxp heap,f:OldSpace)
Thu Dec 04 05:04:23 2008
Errors in file c:\oracle\product\10.2.0\admin\orcl\udump\orcl_ora_1612.trc:
ORA-04030: out of process memory when trying to allocate 52046388 bytes (joxp heap,f:OldSpace)
Latest pfile I've tried is:
# Cache and I/O
db_block_size=8192
db_file_multiblock_read_count=16
# Cursors and Library Cache
open_cursors=300
# Database Identification
db_domain=""
db_name=orcl
# Diagnostics and Statistics
background_dump_dest=C:\oracle\product\10.2.0/admin/orcl/bdump
core_dump_dest=C:\oracle\product\10.2.0/admin/orcl/cdump
user_dump_dest=C:\oracle\product\10.2.0/admin/orcl/udump
# File Configuration
control_files=("C:\oracle\product\10.2.0\oradata\orcl\control01.ctl", "C:\oracle\product\10.2.0\oradata\orcl\control02.ctl", "C:\oracle\product\10.2.0\oradata\orcl\control03.ctl")
db_recovery_file_dest=C:\oracle\product\10.2.0/flash_recovery_area
db_recovery_file_dest_size=2147483648
# Job Queues
job_queue_processes=10
# Miscellaneous
compatible=10.2.0.1.0
# Processes and Sessions
processes=250
# SGA Memory
sga_target=1677721600
sga_max_size=1677721600
# Security and Auditing
audit_file_dest=C:\oracle\product\10.2.0/admin/orcl/adump
remote_login_passwordfile=EXCLUSIVE
# Shared Server
dispatchers="(PROTOCOL=TCP) (SERVICE=orclXDB)"
# Sort, Hash Joins, Bitmap Indexes
pga_aggregate_target=629145600
# System Managed Undo and Rollback Segments
undo_management=AUTO
undo_tablespace=UNDOTBS1
--Charles

Alas, I do not have Metalink.
I have asked my company whether we have a CSI number I can use to get a Metalink account, but I have not gotten a reply.
I have done substantial additional testing - setting JAVA_POOL_SIZE as high as 660MB, calling oracle.aurora.vm.OracleRuntime.setMaxMemorySize with 1GB and with 640MB, and running 'orastack' on oracle and tnslsnr with stack sizes of 512KB and 700KB.
I still get the ORA-04030 joxp heap,f:OldSpace errors.
The odd thing is that it used to work 'sometimes' - even with the default Oracle configuration -
but since November 24th it has failed every single time. It is possible, I suppose, that the database has grown enough to break the application for queries of this size.
I think it might be caused by the NCOMP testing I did. Ever since I installed the companion CD and Visual Studio 2008, it seems to have been failing consistently on large queries. Is there a way to safely 'drop' the NCOMP classes without uninstalling everything?
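For reference, the setMaxMemorySize call I am making looks roughly like the sketch below - a minimal sketch only. The JavaHeapCheck class name is just for illustration, and it assumes the class is compiled and loaded into the database schema (e.g. with loadjava) so that oracle.aurora.vm.OracleRuntime is available:

import oracle.aurora.vm.OracleRuntime;

public class JavaHeapCheck {
    // Report the current per-session OracleJVM memory limit and raise it if
    // the requested value is larger. Returns the old limit so it can be logged.
    public static long raiseMaxMemory(long newLimitBytes) {
        long current = OracleRuntime.getMaxMemorySize();
        if (newLimitBytes > current) {
            OracleRuntime.setMaxMemorySize(newLimitBytes);
        }
        return current;
    }
}

Even with the limit raised this way (to 1GB and to 640MB, as above), I still get the joxp heap,f:OldSpace errors, so the per-session setting does not seem to be the bottleneck.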

Similar Messages

  • Out of memory error - from parsing a "fixed width file"

    This may be fairly simple for someone out there, but I am trying to write a simple program that can go through a "fixed width" flat txt file and parse it to be comma delimited.
    I use an XML file with data dictionary specifications to do the work. I do this because there are over 430 fields that need to be parsed from a fixed width file with close to 250,000 lines. I can read the XML file fine to get the width dimensions, but when I try to apply the parsing instructions, I get an out of memory error.
    I am hoping it is an error with the code and not the large files. If it is the latter, does anyone out there know some techniques for getting at this data?
    Here is the code:
       import java.io.*;
       import org.w3c.dom.*;
       import javax.xml.parsers.DocumentBuilderFactory;
       import javax.xml.parsers.DocumentBuilder;
       import org.xml.sax.SAXException;
       import org.xml.sax.SAXParseException;

       public class FixedWidthConverter {
          String[] fieldNameArray;
          String[] fieldTypeArray;
          String[] fieldSizeArray;

          public static void main(String args[]) {
             FixedWidthConverter fwc = new FixedWidthConverter();
             fwc.go();
             fwc.loadFixedWidthFile();
             //System.exit(0);
          }//end of main

          public void go() {
             try {
                DocumentBuilderFactory docBuilderFactory = DocumentBuilderFactory.newInstance();
                DocumentBuilder docBuilder = docBuilderFactory.newDocumentBuilder();
                Document doc = docBuilder.parse(new File("files/dic.xml"));
                // normalize text representation
                doc.getDocumentElement().normalize();
                System.out.println("Root element of the doc is " + doc.getDocumentElement().getNodeName());
                NodeList listOfFields = doc.getElementsByTagName("FIELD");
                int totalFields = listOfFields.getLength();
                System.out.println("Total no of fields : " + totalFields);
                String[] fldNameArray = new String[totalFields];
                String[] fldTypeArray = new String[totalFields];
                String[] fldSizeArray = new String[totalFields];
                for (int s = 0; s < listOfFields.getLength(); s++) {
                   Node firstFieldNode = listOfFields.item(s);
                   if (firstFieldNode.getNodeType() == Node.ELEMENT_NODE) {
                      Element firstFieldElement = (Element) firstFieldNode;
                      NodeList firstFieldNMList = firstFieldElement.getElementsByTagName("FIELD_NM");
                      Element firstFieldNMElement = (Element) firstFieldNMList.item(0);
                      NodeList textFNList = firstFieldNMElement.getChildNodes();
                      //loads values into an array
                      //fldNameArray[s] = ((Node) textFNList.item(0)).getNodeValue().trim();
                      NodeList typeList = firstFieldElement.getElementsByTagName("TYPE");
                      Element typeElement = (Element) typeList.item(0);
                      NodeList textTypList = typeElement.getChildNodes();
                      //loads values into an array
                      //fldTypeArray[s] = ((Node) textTypList.item(0)).getNodeValue().trim();
                      NodeList sizeList = firstFieldElement.getElementsByTagName("SIZE");
                      Element sizeElement = (Element) sizeList.item(0);
                      NodeList textSizeList = sizeElement.getChildNodes();
                      //loads values into an array
                      fldSizeArray[s] = ((Node) textSizeList.item(0)).getNodeValue().trim();
                   }//end of if clause
                }//end of for loop with s var
                //setFldNameArray(fldNameArray);
                //setFldTypeArray(fldTypeArray);
                setFldSizeArray(fldSizeArray);
             } catch (SAXParseException err) {
                System.out.println("** Parsing error" + ", line " + err.getLineNumber() + ", uri " + err.getSystemId());
                System.out.println(" " + err.getMessage());
             } catch (SAXException e) {
                Exception x = e.getException();
                ((x == null) ? e : x).printStackTrace();
             } catch (Throwable t) {
                t.printStackTrace();
             }
          }//end go()

          public void setFldNameArray(String[] s) {
             fieldNameArray = s;
          }//end setFldNameArray

          public void setFldTypeArray(String[] s) {
             fieldTypeArray = s;
          }//end setFldTypeArray

          public void setFldSizeArray(String[] s) {
             fieldSizeArray = s;
          }//end setFldSizeArray

          public String[] getFldNameArray() {
             return fieldNameArray;
          }//end getFldNameArray

          public String[] getFldTypeArray() {
             return fieldTypeArray;
          }//end getFldTypeArray

          public String[] getFldSizeArray() {
             return fieldSizeArray;
          }//end getFldSizeArray

          public int getNumLines() {
             int countLines = 0;
             try {
                //File must be in same directory and be the name of the string below
                BufferedReader in = new BufferedReader(new FileReader("files/FLAT.txt"));
                String str;
                while ((str = in.readLine()) != null) {
                   countLines++;
                }
                in.close();
             } catch (IOException e) {}
             return countLines;
          }//end of getNumLines

          public void loadFixedWidthFile() {
             int c = getNumLines();
             int i = 0;
             String[] lineProcessed = new String[c];
             try {
                //File must be in same directory and be the name of the string below
                BufferedReader in = new BufferedReader(new FileReader("files/FLAT.txt"));
                String str;
                while ((str = in.readLine()) != null) {
                   lineProcessed[i] = parseThatLine(str);
                   i++;
                }
                in.close();
             } catch (IOException e) {}
             //write out the lineProcessed[] array to another file
             writeThatFile(lineProcessed);
          }//end loadFixedWidthFile()

          public void writeThatFile(String[] s) {
             try {
                BufferedWriter out = new BufferedWriter(new FileWriter("files/outfilename.txt"));
                for (int i = 0; i < s.length - 1; i++) {
                   out.write(s[i]);
                }//end for loop
                out.close();
             } catch (IOException e) {}
          }//end writeThatFile

          public String parseThatLine(String s) {
             int start = 0;
             int end = 0;
             String parsedLine = "";
             int numChars = getFldSizeArray().length;
             String[] oArray = getFldSizeArray();
             for (int i = 0; i < numChars - 1; i++) {
                if (i == 0) {
                   start = 0;
                   end = end + Integer.parseInt(oArray[i]) - 1;
                } else {
                   start = end;
                   end = end + Integer.parseInt(oArray[i]);
                }
                parsedLine = parsedLine + s.substring(start, end) + "~";
             }//end for loop
             return parsedLine;
          }//End of parseThatLine
       }//end of class FixedWidthConverter
    I have tried to eliminate as many arrays as I can, thinking they were chewing up the memory, but to no avail.
    Any thoughts or ideas?

    You should not keep a String array of all the lines of the file read.
    Instead, for each line read, parse it, then write the parsed line to the other file:

       public void loadFixedWidthFile() {
          BufferedReader in = null;
          BufferedWriter out = null;
          try {
             //File must be in same directory and be the name of the string below
             in = new BufferedReader(new FileReader("files/FLAT.txt"));
             out = new BufferedWriter(new FileWriter("files/outfilename.txt"));
             String str;
             while ((str = in.readLine()) != null) {
                str = parseThatLine(str);
                //write out the parsed str to another file
                out.write(str);
             }
          } catch (IOException e) {
             e.printStackTrace(); // At least print the exception - never swallow an exception
          } finally { // Use a finally block to be sure of closing the files even when an exception occurs
             try { in.close(); } catch (Exception e) {}
             try { out.close(); } catch (Exception e) {}
          }
       }//end loadFixedWidthFile()

    Regards
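    A side note on parseThatLine, since memory is the concern here: building the output with a StringBuffer instead of repeated String concatenation avoids creating a new intermediate String for every field. The sketch below is only an illustration - it also walks the field widths cumulatively from position 0, in case the "-1" on the first field in the original loop was not intentional:

       public String parseThatLine(String s) {
          String[] widths = getFldSizeArray();
          StringBuffer parsed = new StringBuffer(s.length() + widths.length);
          int start = 0;
          for (int i = 0; i < widths.length; i++) {
             int end = start + Integer.parseInt(widths[i]);
             // clamp to the line length so short lines do not throw StringIndexOutOfBoundsException
             parsed.append(s.substring(Math.min(start, s.length()), Math.min(end, s.length()))).append('~');
             start = end;
          }
          return parsed.toString();
       }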

  • When I download a PP my keynote says out of memory. How do I fix this?

    The file isn't too big. I've opened it before but now it's not letting me. I tried restarting my computer and that's not working either.


  • Getting out of memory errors in InDesign 5.5. What can I do to fix it?


  • Out of memory error - JS Runtime: How many users can one connect?

    Not talking video here. Talking interactive apps, like chat. Ours crashes at about 500 connected users. When I report this I'm told "make sure you're not creating too many objects server-side" or "increase the JSRuntimeSize setting in your application.xml file to the max".
    I have now done both of those things but still get this out of memory error. Let's say I optimized my app and got 100% more connection capacity. That would be 1,000 connected users - still nowhere near enough.
    Are my dreams of 6,000 or 10,000 connected users enjoying all the fruits of the FMS interactivity pipe just pipe dreams? Is it not meant for sessions of that size? Where does one find documentation or advice or application assistance on this issue?
    How do large social media applications connect so many people concurrently?
    Thoughts appreciated.
    Thanks

    Yes.  I'm using the max.
    <RuntimeSize>51200</RuntimeSize>
    See:
    http://help.adobe.com/en_US/FlashMediaServer/3.5_AdminGuide/WS5b3ccc516d4fbf351e63e3d119f2926bcf-7ff0.html#WS5b3ccc516d4fbf351e63e3d119f2926bcf-7ed2
    Don't think 100MB or 200MB would be valid settings.

  • How can I solve out of memory error on excell file in PL/SQL

    Hi,
    I'm new to PL/SQL. A piece of PL/SQL code that creates an Excel report is getting an out of memory error. The first cause of the error is that an Excel sheet cannot hold more than 65,536 rows, so I changed the code to split the data across separate sheets, so that a single sheet never exceeds 65,536 rows.
    All the data are held in memory, and if many users run the report at the same time they get an out of memory error.
    So I want to change the code so that, when the out of memory exception is raised,
    the old Excel file is saved to disk and a new Excel file is created,
    and the program goes on writing to the new file without exiting.
    At the end, all the Excel files would be appended together and shown to the user as a single file.
    I know how to save the file and create a new file, but I don't know how to make the PL/SQL program go back into the loop and continue when the exception occurs.
    Can anyone help me with this issue?
    Here is my code.
    Thank you
    dworkbook:=hssfworkbook.new;
    dCurrentItem := Get_Block_Property(pCurrentBlock, FIRST_ITEM);     
    while not (name_in('system.last_record')='TRUE') loop
    /* The data would be written to the excell file column order. */
    if (dRow=0) then
              /* Create a new sheet */
    elsif (dRow <= dMaxWorksheetNum) then
         /* Data of the report are written here. The data are written in column order */
    if (dRow > dMaxWorksheetNum) then
         /* give dRow and dColumn intial value */
    /* increase worksheet number */
    end if; /* End of if (dRow=1) */
    if (isWritten) and not name_in('system.last_record')='TRUE'then
         /* if not at the end of the record and the previously read record is written to the file
         , then go to next record */
         next_record;
    end if;
    /* save excell report */
    workbookwriter.save(dworkbook,global.gethome||dFileName);
    web.show_document('/users/'||dFileName,'_BLANK');
    /* when exceptions occurs */
    EXCEPTION
    WHEN ORA_JAVA.EXCEPTION_THROWN THEN
    begin
         javaException := ORA_JAVA.LAST_EXCEPTION;
         -- Print out the Exception by using the toString()
         -- Method of the exception Object
         javaException2 := Exception_.new(javaException);
         mess(27002,Exception_.getMessage(javaException2));
         -- and clean up
         ORA_JAVA.CLEAR_EXCEPTION;
    exception
         WHEN ORA_JAVA.JAVA_ERROR THEN
    -- In this case, the error would be
    -- Argument 1 can not be null
    mess(27002,ORA_JAVA.LAST_ERROR);
    --Clean up
    ORA_JAVA.CLEAR_ERROR;
    end;
    WHEN ORA_JAVA.JAVA_ERROR THEN
    -- In this case, the error would be
    -- Argument 1 can not be null
    message(ORA_JAVA.LAST_ERROR);
    --Clean up
    ORA_JAVA.CLEAR_ERROR;

    No need to double-post... most questions are answered pretty quickly...
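    For the part of the question about carrying on after the file is saved: rather than waiting for the out of memory exception, one common pattern is to roll over to a new workbook proactively once a sheet reaches the 65,536-row limit (or some lower threshold), saving each file as it fills. Below is a minimal sketch of that pattern, written directly against the Apache POI HSSF classes (the library behind the hssfworkbook calls above) rather than through the Forms Java importer, and assuming POI 3.x on the classpath; the row counts, sheet names and file names are only placeholders. The same idea carries over to the Forms PL/SQL code by checking a row counter inside the loop instead of trapping the exception.

       import java.io.FileOutputStream;
       import org.apache.poi.hssf.usermodel.HSSFRow;
       import org.apache.poi.hssf.usermodel.HSSFSheet;
       import org.apache.poi.hssf.usermodel.HSSFWorkbook;

       public class RollingExcelWriter {

          private static final int MAX_ROWS_PER_SHEET = 65536; // hard limit of the .xls format

          public static void main(String[] args) throws Exception {
             HSSFWorkbook wb = new HSSFWorkbook();
             HSSFSheet sheet = wb.createSheet("data");
             int rowInSheet = 0;
             int fileNo = 1;

             for (int i = 0; i < 200000; i++) {          // stand-in for the record loop
                if (rowInSheet == MAX_ROWS_PER_SHEET) {  // sheet is full: save and start a new file
                   save(wb, fileNo++);
                   wb = new HSSFWorkbook();              // the old workbook can now be garbage collected
                   sheet = wb.createSheet("data");
                   rowInSheet = 0;
                }
                HSSFRow row = sheet.createRow(rowInSheet++);
                row.createCell(0).setCellValue("record " + i);
             }
             save(wb, fileNo);                           // save whatever is left in the last workbook
          }

          private static void save(HSSFWorkbook wb, int fileNo) throws Exception {
             FileOutputStream out = new FileOutputStream("report_part" + fileNo + ".xls");
             try {
                wb.write(out);
             } finally {
                out.close();
             }
          }
       }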

  • When downloading iOS 5 to my iTunes, an error message saying "the network connection has timed out" keeps popping up. How do I fix this?


    Temporarily dip ****.
    And excluding the extremely high network traffic due to the many millions of users downloading the update a day or two after it was released, this is a problem with that swiss-cheese-for-security garbage OS that is Windows only, which is the joke. An update that is now running on 1 in 3 compatible iDevices, which is in the tens of millions, is vaporware, Einstein?
    If I were your rep, I would tell you to **** off. Go get Windoze mobile devices.

  • Will 64-bit office fix out of memory error?

    I've been troubleshooting an out of memory error in Excel 2010 for some time. I've read quite a few articles on forums and on MS sites (including here). I find many hits but none seems to offer a solution that works. One idea that shows up often is that 64-bit Office versions have a LOT MORE memory to work with than 32-bit versions. I'm running a 64-bit version of Windows 7, so I'm considering giving Office 2010 64-bit a try. However, I also find a lot of caveats in those articles that concern me. On the other hand, most of those articles are 2 to 3 years old, so I'm wondering if most of those issues have been dealt with. For example: VBA issues; third-party add-ins that do not (did not) support 64-bit; ActiveX and COM issues.
    Sorry to be overly verbose. My questions pretty much come down to this:
    1. Is 64-bit likely to solve my out of memory problem?
    2. What issues are still unresolved in 64-bit Excel with Windows 7?
    TIA,
    Phil

    If you are an Excel power user working with huge amounts of data, then you would benefit from 64-bit Office being able to utilize more memory.
    MS is recommending 32-bit Office as the default installation on both 32-bit and 64-bit Windows mainly due to compatibility with existing 32-bit controls, add-ins, and VBA.
    This is not really an issue that can be resolved on the Office side; it depends on whether you are using any 32-bit controls, add-ins, or VBA. The question is whether those existing 32-bit controls, add-ins, and VBA have been updated to work with 64-bit Office.
    If all your controls, add-ins, and VBA are 64-bit, or designed to work with 64-bit Office, then you are good.
    Bhasker Timilsina (ManTechs Inc)

  • Out of Memory error on some PDF's, not all PDF's.

    Hi there,
    I have read most of the posts in the forums that relate to 'Out of Memory' issues with PDF's and I have to say that there is still no solution that I have found.
    I have tried reinstalling Adobe Reader, Flash Player and tried clearing my Temp Files. None of these fixed the issue.
    The PDF's that receive this memory error are downloaded off a CMS website that has many offices and logins. Only one office is experiencing this out of memory issue so we know that it is not an issue with generating the PDF's on the CMS website, otherwise all the offices and logins would have this issue, since they use the same system.
    I am an admin of that CMS website and even I receive the same out of memory issue when I try and view the PDF's from that office, as the customer does.
    When we open these PDF's with another PDF reading program, the PDF's open fine. So that tells me that the issue lies with Adobe Reader and not the PDF file itself.
    The PDF files that produce this error are about 1MB-4MB, so they are not large.
    Could you please tell me the solution to this out of memory error as we may lose a customer because your product is producing an error that does not seem to have been fixed after years of being reported by your customers.
    Thanks.

    As an Adobe Reader XI user, if I may put my two cents in: that "Out of Memory" problem is not with the Adobe Reader XI application installed on our computers! I think the problem is with how the PDF documents we are trying to open were generated (or regenerated).
    I remember I was able to open my old credit card statement online without any problem, and now I am not able to open the same old statement (out of memory) because it could have been regenerated! In fact, I cannot open any of my credit card statements (old or new) on this specific credit card web site, which makes me believe those statements were regenerated with some new Adobe software.
    As others mentioned, even I am able to view other PDF documents without any problem.

  • Thread: Out of memory error

    I'm running a program that reads a file of more than 20k lines. Each line of this file is processed by its own thread. Sometimes the program halts with an out of memory error.
    What can I do to solve this problem? Is it better to change the logic of the program and, instead of starting a new thread for each line, use a few threads that each process multiple lines?
    The current Java version is 1.4.2_13. Would upgrading to the most recent version solve the problem?
    I'm running it with the following configuration:
    "-Xms1024m -Xmx2048m -XX:SurvivorRatio=10"
    Thanks a lot,
    Marcos.

    Indeed this design is flawed. It is an old system that came into my hands and now I have to fix some bugs in it.
    What is the difference between spawning thousands of threads and having this backport of concurrent take care of it? I mean, isn't the JVM responsible for managing both cases?
    Using the backport of concurrent and defining a limited-size thread pool, what will happen if the number of lines in the file is greater than the size of the pool?
    Do you have an example or a link to where I can get a good example of how to use it?
    Thanks for helping!
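    Regarding the pool-size question: if the file has more lines than the pool has threads, the extra lines simply wait in the executor's internal queue until a worker is free, so only poolSize lines are ever processed at the same time. Here is a rough sketch using java.util.concurrent (Java 5+); with the 1.4 backport the same classes live under the edu.emory.mathcs.backport packages, so the code is nearly identical there. The file name and pool size are placeholders:

       import java.io.BufferedReader;
       import java.io.FileReader;
       import java.util.concurrent.ExecutorService;
       import java.util.concurrent.Executors;
       import java.util.concurrent.TimeUnit;

       public class LineProcessor {

          public static void main(String[] args) throws Exception {
             int poolSize = 10;                                  // placeholder: tune for the workload
             ExecutorService pool = Executors.newFixedThreadPool(poolSize);

             BufferedReader in = new BufferedReader(new FileReader("input.txt")); // placeholder file name
             try {
                String line;
                while ((line = in.readLine()) != null) {
                   final String current = line;
                   // Each line becomes a small task; only poolSize of them run at once,
                   // the rest sit in the executor's queue until a thread frees up.
                   pool.execute(new Runnable() {
                      public void run() {
                         process(current);
                      }
                   });
                }
             } finally {
                in.close();
             }

             pool.shutdown();                                    // accept no new tasks, let queued ones finish
             pool.awaitTermination(1, TimeUnit.HOURS);
          }

          private static void process(String line) {
             // stand-in for whatever per-line work the real program does
             System.out.println(line.length());
          }
       }

    With 20k lines, the queued tasks only hold the line strings, which is far lighter than 20k live threads, each of which needs its own stack.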

  • Out of Memory error in  JTextPane

    Hi all,
    An application contains four JTextPanes. The features supported are foreground/background colouring, editing, etc.
    When I try to load a file containing 10,000 lines I get an "out of memory error".
    What is the maximum number of characters or lines that can be represented in a JTextPane?
    I find a lot of objects related to JTextPane getting created.
    Is there any workaround to avoid this?
    Has anyone faced this problem already?
    The forum already contains the question but no replies to it.
    Thanks.

    "Has anyone faced this problem already?" - assuming you're having the same problem (sure sounds similar)...
    See my thread here:
    http://forum.java.sun.com/thread.jsp?forum=57&thread=340872
    There's a bug in JTextArea, JTextPane and JEditorPane. A link to the initial bug report (JTextArea) is in there. I duplicated the bug in all three components by creating a blank JFrame with just the component in question on it. Memory use skyrocketed with all three.
    I already cast two votes for the bug to be fixed. I don't know how Sun can expect people to adopt Java when there are debilitating problems such as these...
    Anyway, try using an AWT TextArea instead... that is, if you don't need the extra features of JTextPane. Using the AWT TextArea solved my problems.

  • More on Out of Memory Error

    I recently purchased the CS3 Master Suite. Am running Encore 3.0.1.008. Have a dual core with 2 gigs, Vista.
    Built a medium-size movie with 70 clips, 14 timelines, and 14 menus. Everything linked nicely; Check Project finds no errors. Projected size of the result on DVD: 3.003 GB.
    Invoke build. Transcoder begins. Fails on second clip. Out of Memory Error.
    Checked forums. Noted the existence of these problems going back many months. Tried various suggestions. Cleared the media cache. Rebooted, reentered, restarted. Same result.
    Also came across a warning about the 2 GB memory limit. However, note that many, many programs are written with input and output sizes much larger than resident memory. This is a CS101 design issue.
    Because I was obligated to make a release to a client base with high expectations, I scaled back the project to an embarrassingly small size.
    I would call it a demo size as opposed to an acceptable project size.
    Of course, I needed to start from scratch. All attempts to prune the original project to smaller and smaller sizes failed (even with media cache clearing, rebooting). A project with 1 menu and 2 video clips did build.
    My conclusion from this experience is that Encore was designed and tested to meet alpha-release demo standards. There is the well deserved expectation with other Adobe products: PhotoShop, Dreamweaver, etc, that production standards can be achieved.
    My forum reading suggests that Encore has been in this crippled state for months, perhaps years. My question is, when can we expect a fix? There are plenty of software engineers out there who could bring this up to Photoshop standards.
    I spent $2500 on the Master Suite with the expectation that I could do production work, not serve as unpaid QA for Adobe.

    When I try to build a DVD this error appears
    The instruction at 0x007fa94c referenced memory at 0x00000000. The memory could not be read
    Click on OK to terminate the program
    Click on CANCEL to debug the program
    What can I do to fix this problem or to build my DVD?

  • Acrobat XI Pro "Out of Memory" Error.

    We just received a new Dell T7600 workstation (Win7, 64-bit, 64GB RAM, 8TB disk storage, and more processors than you can shake a stick at). We installed the Adobe CS6 Master Collection which had Acrobat Pro X. Each time we open a PDF of size greater than roughly 4MB, the program returns an "out of memory" error. After running updates, uninstalling and reinstalling (several times), I bought a copy of Acrobat XI Pro hoping this would solve the problem. Same problem still exists upon opening the larger PDFs. Our business depends on opening very large PDF files and we've paid for the Master Collection, so I'd rather not use an freeware PDF reader. Any help, thoughts, and/or suggestions are greatly appreciated.
    Regards,
    Chris

    As mentioned, the TEMP folder is typically the problem. MS limits the size of this folder and you have 2 choices: 1. empty it or 2. increase the size limit. I am not positive this is the issue, but it does crop up at times. It does not matter how big your harddrive is, it is a matter of the amount of space that MS has allocated for virtual memory. I am surprised that there is an issue with 64GB of RAM, but MS is real good at letting you know you can't have it all for use because you might want to open up something else. That is why a lot of big packages turn off some of the limits of Windows or use Linux.

  • Acrobat XI Pro "Out of Memory" error after Office 2010 install

    Good Afternoon,
    We recently pushed Office 2010 to our users and are now getting reports of previous installs of Adobe Acrobat XI Pro no longer working but throwing "Out of Memory" errors.
    We are in a Windows XP environment. All machines are HP 8440p/6930p/6910 with the same Service pack level (3) and all up to date on security patches.
    All machines are running Office 2010 SP1.
    All machines have 2GB or 4GB of RAM (Only 3.25GB recognized as we are a 32bit OS environment).
    All machines have adequate free space (ranging from 50gb to 200gb of free space).
    All machines are set to 4096mb initial page file size with 8192mb maximum page file size.
    All machines with Acrobat XI Pro *DO NOT* have Reader XI installed alongside. If Reader is installed, it is Reader 10.1 or higher.
    The following troubleshooting steps have been taken:
    Verify page file size (4096mb - 8192mb).
    Deleted local user and Windows temp files (%temp% and c:\WINDOWS\Temp both emptied).
    Repair on Adobe Acrobat XI Pro install. No change.
    Uninstall Acrobat Pro XI, reboot, re-install. No change.
    Uninstall Acrobat Pro XI Pro along with *ALL* other Adobe applications presently installed (Flash Player, Air), delete all Adobe folders and files found in a full search of the C drive, delete all orphaned Registry entries for all Adobe products, re-empty all temp folders, reboot.
    Re-install Adobe Acrobat XI Pro. No change.
    Disable enhanced security in Acrobat XI Pro. No change.
    Renamed Acrobat XI's plug_ins folder to plug_ins.old.
    You *can* get Acrobat to open once this is done but when you attempt to edit a file or enter data into a form, you get the message, "The "Updater" plug-in has been removed. Please re-install Acrobat to continue viewing the current file."
    A repair on the Office 2010 install and re-installing Office 2010 also had no effect.
    At this point, short of re-imaging the machines (which is *not* an option), we are stumped.
    We have not yet tried rolling back a user to Office 2007 as the upgrade initiative is enterprise-wide and rolling back would not be considered a solution.
    Anyone have any ideas beyond what has been tried so far?


  • Out of Memory Error While deploying as EAR file

    Hi,
    I was trying to deploy an EAR file of about 63 MB which in turn contains about 60 EJB JARs - no WARs. application.xml has entries for all the JARs. While deploying, I get an Out of Memory error. Is there any way to work around this problem? I am using my own hand-written Java application, which uses the SunONE deployment APIs, for the deployment. Can you please tell me how to tackle this problem? I am running my application through a batch file which uses JDK 1.4.
    Please help me with this issue.

    You can set the initial heap size and maximum heap size for the JVM, either in the app-server admin console or in one of your scripts - you will need to look up the exact syntax.
    I had this error yesterday. I too had run out of memory (150MB). You simply need to allocate more to the app server.
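    For example, if the batch file launches the deployment with a plain java command, the heap options go straight on that command line (the class name, EAR name, and sizes below are only placeholders - adjust them to whatever the batch file actually runs):

       java -Xms256m -Xmx1024m com.example.DeployEar myapp.ear

    The same -Xms/-Xmx pair can also be set on the app server's own JVM options in the admin console, as mentioned above.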
