Out of memory error - from parsing a "fixed width file"

This may be fairly simple for someone out there, but I am trying to write a simple program that can go through a "fixed width" flat txt file and parse it into a comma-delimited one.
I use an xml file with data dictionary specifications to do the work. I do this because there are over 430 fields that need to be parsed from a fixed width file with close to 250,000 lines. I can read the xml file fine to get the width dimensions, but when I try to apply the parsing instructions, I get an out of memory error.
I am hoping it is an error with the code and not the large files. If it is the latter, does anyone out there know some techniques for getting at this data?
Here is the code
   import java.io.*;
   import org.w3c.dom.Document;
   import org.w3c.dom.*;
   import javax.xml.parsers.DocumentBuilderFactory;
   import javax.xml.parsers.DocumentBuilder;
   import org.xml.sax.SAXException;
   import org.xml.sax.SAXParseException;
    public class FixedWidthConverter{
      String[] fieldNameArray;
      String[] fieldTypeArray;
      String[] fieldSizeArray;      
       public static void main(String args []){
         FixedWidthConverter fwc = new FixedWidthConverter();
         fwc.go();
         fwc.loadFixedWidthFile();
        //System.exit (0);
      }//end of main
       public void go(){
         try {
            DocumentBuilderFactory docBuilderFactory = DocumentBuilderFactory.newInstance();
            DocumentBuilder docBuilder = docBuilderFactory.newDocumentBuilder();
            Document doc = docBuilder.parse (new File("files/dic.xml"));
            // normalize text representation
            doc.getDocumentElement().normalize();
            System.out.println ("Root element of the doc is " +
                 doc.getDocumentElement().getNodeName());
            NodeList listOfFields = doc.getElementsByTagName("FIELD");
            int totalFields = listOfFields.getLength();
            System.out.println("Total no of fields : " + totalFields);
            String[] fldNameArray = new String[totalFields];
            String[] fldTypeArray = new String[totalFields];
            String[] fldSizeArray = new String[totalFields];
            for(int s=0; s<listOfFields.getLength() ; s++){
               Node firstFieldNode = listOfFields.item(s);
               if(firstFieldNode.getNodeType() == Node.ELEMENT_NODE){
                  Element firstFieldElement = (Element)firstFieldNode;
                  NodeList firstFieldNMList = firstFieldElement.getElementsByTagName("FIELD_NM");
                  Element firstFieldNMElement = (Element)firstFieldNMList.item(0);
                  NodeList textFNList = firstFieldNMElement.getChildNodes();
                  //System.out.println("Field Name : " +
                           //((Node)textFNList.item(0)).getNodeValue().trim());
                  //loads values into an array
                  //fldNameArray[s] = ((Node)textFNList.item(0)).getNodeValue().trim();
                  NodeList typeList = firstFieldElement.getElementsByTagName("TYPE");
                  Element typeElement = (Element)typeList.item(0);
                  NodeList textTypList = typeElement.getChildNodes();
                  //System.out.println("Field Type : " +
                           //((Node)textTypList.item(0)).getNodeValue().trim());
                  //loads values into an array
                  //fldTypeArray[s] = ((Node)textTypList.item(0)).getNodeValue().trim(); 
                  NodeList sizeList = firstFieldElement.getElementsByTagName("SIZE");
                  Element sizeElement = (Element)sizeList.item(0);
                  NodeList textSizeList = sizeElement.getChildNodes();
                  //System.out.println("Field Size : " +
                           //((Node)textSizeList.item(0)).getNodeValue().trim());
                  //loads values into an array
                  fldSizeArray[s] = ((Node)textSizeList.item(0)).getNodeValue().trim();   
               }//end of if clause
            }//end of for loop with s var
            //setFldNameArray(fldNameArray);
            //setFldTypeArray(fldTypeArray);
            setFldSizeArray(fldSizeArray);
         }
         catch (SAXParseException err) {
            System.out.println ("** Parsing error" + ", line "
               + err.getLineNumber () + ", uri " + err.getSystemId ());
            System.out.println(" " + err.getMessage ());
         }
         catch (SAXException e) {
            Exception x = e.getException ();
            ((x == null) ? e : x).printStackTrace ();
         }
         catch (Throwable t) {
            t.printStackTrace ();
         }
      }//end go();
       public void setFldNameArray(String[] s){
         fieldNameArray = s;
      }//end setFldNameArray
       public void setFldTypeArray(String[] s){
         fieldTypeArray = s;
      }//end setFldTypeArray
       public void setFldSizeArray(String[] s){
         fieldSizeArray = s;
      }//end setFldSizeArray
       public String[] getFldNameArray(){
         return fieldNameArray;
      }//end setFldNameArray
       public String[] getFldTypeArray(){
         return fieldTypeArray;
      }//end setFldTypeArray
       public String[] getFldSizeArray(){
         return fieldSizeArray;
      }//end setFldSizeArray 
       public int getNumLines(){
         int countLines = 0;
         try {
                //File must be in same directory and be the name of the string below
            BufferedReader in = new BufferedReader(new FileReader("files/FLAT.txt"));
            String str;
            while ((str = in.readLine()) != null) {
               countLines++;
            }
            in.close();
         }
         catch (IOException e) {}
         return countLines;
      }//end of getNumLines
       public void loadFixedWidthFile(){
         int c = getNumLines();
         int i = 0;
         String[] lineProcessed = new String[c];
         String chars;
         try {
                //File must be in same directory and be the name of the string below
            BufferedReader in = new BufferedReader(new FileReader("files/FLAT.txt"));
            String str;
            while ((str = in.readLine()) != null) {
               //System.out.println(str.length());
               lineProcessed[i] = parseThatLine(str);
               i++;
            }
            in.close();
         }
         catch (IOException e) {}
         //write out the lineProcessed[] array to another file
         writeThatFile(lineProcessed);
      }//end loadFixedWidthFile()
       public void writeThatFile(String[] s){
         try {
            BufferedWriter out = new BufferedWriter(new FileWriter("files/outfilename.txt"));
            for(int i = 0; i < s.length; i++){
               out.write(s[i]);
               out.newLine(); // keep one record per line in the output
            }//end for loop
            out.close();
         }
         catch (IOException e) {}
      }//end writeThatFile
       public String parseThatLine(String s){
         int start = 0;
         int end = 0;
         String parsedLine = "";
         int numChars = getFldSizeArray().length;
         //Print number of fields for testing
         //System.out.println(numChars);
         String[] oArray = getFldSizeArray();
         //String chars = oArray[0];
         //System.out.println(chars.length());
         for(int i = 0; i < numChars; i++ ){
            if(i == 0){
               start = 0;
               end = Integer.parseInt(oArray[i]);
            }
            else {
               start = end;
               end = end + Integer.parseInt(oArray[i]);
            }
            parsedLine = parsedLine + s.substring(start, end) + "~";
         }//end for loop
         return parsedLine;
      }//End of parseThatLine
I have tried to eliminate as many arrays as I can, thinking that was what was chewing up the memory, but to no avail.
Any thoughts or ideas?
Message was edited by:
SaipanMan2005

You should not keep a String array of all the lines of the file read.
Instead, for each line read, parse it, then write the parsed line to the other file:
      public void loadFixedWidthFile() {
         BufferedReader in = null;
         BufferedWriter out = null;
         try {
            //File must be in same directory and be the name of the string below
            in = new BufferedReader(new FileReader("files/FLAT.txt"));
            out = new BufferedWriter(new FileWriter("files/outfilename.txt"));
            String str;
            while ((str = in.readLine()) != null) {
               //System.out.println(str.length());
               str = parseThatLine(str);
               //write out the parsed str to another file
               out.write(str);
            }
         }
         catch (IOException e) {
            e.printStackTrace(); // At least print the exception - never swallow an exception
         }
         finally { // Use a finally block to be sure of closing the files even when an exception occurs
            try { in.close(); }
            catch (Exception e) {}
            try { out.close(); }
            catch (Exception e) {}
         }
      }//end loadFixedWidthFile()
Regards
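For what it's worth, the same streaming idea can be taken one step further: turn the field widths into integer offsets once, instead of calling Integer.parseInt per field on every one of the 250,000 lines. A minimal, self-contained sketch of that idea (the widths and class name here are illustrative, not the real 430-field dictionary; the "~" delimiter follows the original code):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;
import java.io.StringWriter;
import java.io.Writer;

public class StreamingFixedWidth {

    // Turn the field widths into cumulative [start, end) offsets once.
    static int[] offsets(int[] widths) {
        int[] off = new int[widths.length + 1];
        for (int i = 0; i < widths.length; i++) {
            off[i + 1] = off[i] + widths[i];
        }
        return off;
    }

    // Slice one record into "~"-delimited fields using the precomputed offsets.
    static String parseLine(String line, int[] off) {
        StringBuilder sb = new StringBuilder(line.length() + off.length);
        for (int i = 0; i + 1 < off.length; i++) {
            sb.append(line, off[i], off[i + 1]).append('~');
        }
        return sb.toString();
    }

    // Stream records one at a time: memory use stays constant no matter
    // how many lines the input has.
    static void convert(BufferedReader in, Writer out, int[] widths) throws IOException {
        int[] off = offsets(widths);
        String str;
        while ((str = in.readLine()) != null) {
            out.write(parseLine(str, off));
            out.write(System.lineSeparator());
        }
    }

    public static void main(String[] args) throws IOException {
        int[] widths = {3, 5, 2}; // example widths, not the real dictionary
        StringWriter out = new StringWriter();
        convert(new BufferedReader(new StringReader("AAABBBBBCC")), out, widths);
        System.out.print(out); // AAA~BBBBB~CC~
    }
}
```

With file streams in place of the StringReader/StringWriter, this processes the flat file with a single line in memory at a time.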

Similar Messages

  • Out-of-memory errors - how to debug/fix

    I've only recently noticed that many of the tests we were running on our Windows XP Oracle 10g server were failing from lack of memory. I have performed a lot of tests, added the /3GB parameter to boot.ini, and tried many values for
    pga_aggregate_target, sga_target and sga_max_size, but I still get the error.
    Google searches and parameter tweaking have not helped. We use a great deal of Java stored procedures in this query.
    From our Java application's log:
    04-12-2008 02:53:52 ERROR (ProcessLauncher.java:31) >> session @9/146 calculatePLC(null,11801,12000) stopped: ORA-04030: out of process memory when trying to allocate 4032 bytes (ioc_make_sub2,UGAClass)
    04-12-2008 02:53:52 ERROR (ProcessLauncher.java:31) >>
    04-12-2008 02:57:31 ERROR (ProcessLauncher.java:31) >> session @7/143 calculatePLC(null,18002,18201) stopped: ORA-04030: out of process memory when trying to allocate 8288564 bytes (joxp heap,f:Reserved3)
    04-12-2008 02:57:31 ERROR (ProcessLauncher.java:31) >>
    04-12-2008 03:00:20 ERROR (ProcessLauncher.java:31) >> session @8/145 calculatePLC(null,21402,21601) stopped: ORA-04030: out of process memory when trying to allocate 8388404 bytes (joxp heap,f:Reserved3)
    04-12-2008 03:00:20 ERROR (ProcessLauncher.java:31) >>
    04-12-2008 03:27:09 ERROR (ProcessLauncher.java:99) session @9/146 calculatePLC(null,11801,12000) stopped: ORA-04030: out of process memory when trying to allocate 4032 bytes (ioc_make_sub2,UGAClass)
    session @7/143 calculatePLC(null,18002,18201) stopped: ORA-04030: out of process memory when trying to allocate 8288564 bytes (joxp heap,f:Reserved3)
    session @8/145 calculatePLC(null,21402,21601) stopped: ORA-04030: out of process memory when trying to allocate 8388404 bytes (joxp heap,f:Reserved3)
    from the bdump/alert_orcl.log
    Tue Nov 04 10:58:25 2008
    Errors in file c:\oracle\product\10.2.0\admin\orcl\udump\orcl_ora_1476.trc:
    ORA-04030: out of process memory when trying to allocate 38528564 bytes (joxp heap,f:OldSpace)
    Tue Nov 04 10:58:30 2008
    Thread 1 advanced to log sequence 241
    Current log# 3 seq# 241 mem# 0: C:\ORACLE\PRODUCT\10.2.0\ORADATA\ORCL\REDO03.LOG
    Tue Nov 04 10:58:39 2008
    Errors in file c:\oracle\product\10.2.0\admin\orcl\udump\orcl_ora_2880.trc:
    ORA-04030: out of process memory when trying to allocate 32111412 bytes (joxp heap,f:OldSpace)
    Thu Dec 04 02:53:34 2008
    Errors in file c:\oracle\product\10.2.0\admin\orcl\udump\orcl_ora_1840.trc:
    ORA-04030: out of process memory when trying to allocate 18403380 bytes (joxp heap,f:OldSpace)
    Thread 1 cannot allocate new log, sequence 3974
    Checkpoint not complete
    Current log# 3 seq# 3973 mem# 0: C:\ORACLE\PRODUCT\10.2.0\ORADATA\ORCL\REDO03.LOG
    Thread 1 advanced to log sequence 3974
    Current log# 1 seq# 3974 mem# 0: C:\ORACLE\PRODUCT\10.2.0\ORADATA\ORCL\REDO01.LOG
    Thu Dec 04 02:53:37 2008
    Errors in file c:\oracle\product\10.2.0\admin\orcl\udump\orcl_ora_1840.trc:
    ORA-04030: out of process memory when trying to allocate 753120 bytes (pga heap,kco buffer)
    ORA-04030: out of process memory when trying to allocate 18403380 bytes (joxp heap,f:OldSpace)
    Thu Dec 04 02:53:41 2008
    Process startup failed, error stack:
    Thu Dec 04 02:53:41 2008
    Errors in file c:\oracle\product\10.2.0\admin\orcl\bdump\orcl_psp0_2260.trc:
    ORA-27300: OS system dependent operation:spcdr:9261:4200 failed with status: 997
    ORA-27301: OS failure message: Overlapped I/O operation is in progress.
    ORA-27302: failure occurred at: skgpspawn
    Thu Dec 04 02:53:42 2008
    Thread 1 advanced to log sequence 3975
    Current log# 2 seq# 3975 mem# 0: C:\ORACLE\PRODUCT\10.2.0\ORADATA\ORCL\REDO02.LOG
    Thu Dec 04 02:53:42 2008
    Process J000 died, see its trace file
    Thu Dec 04 02:53:42 2008
    kkjcre1p: unable to spawn jobq slave process
    Thu Dec 04 05:03:46 2008
    Errors in file c:\oracle\product\10.2.0\admin\orcl\udump\orcl_ora_2648.trc:
    ORA-04030: out of process memory when trying to allocate 36518964 bytes (joxp heap,f:OldSpace)
    Thu Dec 04 05:04:23 2008
    Errors in file c:\oracle\product\10.2.0\admin\orcl\udump\orcl_ora_1612.trc:
    ORA-04030: out of process memory when trying to allocate 52046388 bytes (joxp heap,f:OldSpace)
    Latest pfile I've tried is:
    # Cache and I/O
    db_block_size=8192
    db_file_multiblock_read_count=16
    # Cursors and Library Cache
    open_cursors=300
    # Database Identification
    db_domain=""
    db_name=orcl
    # Diagnostics and Statistics
    background_dump_dest=C:\oracle\product\10.2.0/admin/orcl/bdump
    core_dump_dest=C:\oracle\product\10.2.0/admin/orcl/cdump
    user_dump_dest=C:\oracle\product\10.2.0/admin/orcl/udump
    # File Configuration
    control_files=("C:\oracle\product\10.2.0\oradata\orcl\control01.ctl", "C:\oracle\product\10.2.0\oradata\orcl\control02.ctl", "C:\oracle\product\10.2.0\oradata\orcl\control03.ctl")
    db_recovery_file_dest=C:\oracle\product\10.2.0/flash_recovery_area
    db_recovery_file_dest_size=2147483648
    # Job Queues
    job_queue_processes=10
    # Miscellaneous
    compatible=10.2.0.1.0
    # Processes and Sessions
    processes=250
    # SGA Memory
    sga_target=1677721600
    sga_max_size=1677721600
    # Security and Auditing
    audit_file_dest=C:\oracle\product\10.2.0/admin/orcl/adump
    remote_login_passwordfile=EXCLUSIVE
    # Shared Server
    dispatchers="(PROTOCOL=TCP) (SERVICE=orclXDB)"
    # Sort, Hash Joins, Bitmap Indexes
    pga_aggregate_target=629145600
    # System Managed Undo and Rollback Segments
    undo_management=AUTO
    undo_tablespace=UNDOTBS1
    --Charles
    Edited by: user10601251 on Dec 6, 2008 11:51 PM

    Alas, I do not have metalink.
    I have asked my company if we have a CSI number that I can use to get a metalink account, but I have not gotten a reply.
    I have done substantial additional testing - setting JAVA_POOL_SIZE up to 660MB, using oracle.aurora.vm.OracleRuntime.setMaxMemorySize of 1GB and 640MB, and I have tried 'orastack' on oracle and tnslsnr of 512KB and 700KB.
    I still get the ORA-04030 joxp heap,f:OldSpace errors.
    The odd thing is that it used to work 'sometimes' - even with the default Oracle configuration,
    but since November 24th it has failed every single time. It is possible I suppose that the database has grown enough to break the application for queries of this size.
    I think it might be caused by the NCOMP testing I did. After I installed the companion CD and Visual Studio 2008, I think it has been failing on large queries consistently. Is there a way to safely 'drop' the NCOMP classes without uninstalling everything?

  • Import Manager Out of Memory error

    I am running MDM 5.5.24.06 server on Windows XP Professional. The server has 4.0 GB RAM with 1.5 GB Virtual memory.
    I am trying to load 129 material master records from R/3 4.6 ( XML file size 8 MB), into MDM using Import Manager.
    When I click on the import icon (Import Status tab selected), MDM Import Manager right away returns an 'Out of Memory' error.
    Tried to import the file again after rebooting the machine, restarting MDM Server, but still it returned the same error.
    Has anybody tried loading the R/3 material data into MDM? What's the experience? What was the load (# of records)?
    What was MDM IM performance? What's your hardware/software config?
    Appreciate your help.
    Thanks,
    Abhay

    Hi Abhay,
    My workstation has Windows 2000, Pentium 4 CPU 3.2 GHz, 515,555 KB RAM.  The recommended specifications can be found on the SAP Service Marketplace, and we used those to procure our workstations.
    The MDM server must be more powerful than a workstation.  Try running the Import Manager from the server machine itself.  If it works there but not on your machine, it is the memory of your workstation.
    Sometimes, I think this error comes when the import map has issues - fields not mapped accurately, remote key mapped wrong, etc.
    It is probably judicious to prove out the system memory as a first step.
    Good luck!

  • Out of memory error importing a JPA Entity in WebDynpro Project

    Hi All!
    We are having problems importing JPA entities in a Web Dynpro project; this is our scenario.
    We have two entities: entity A has a ManyToOne relationship with entity B, and at the same time entity B has a OneToMany relationship with entity A. When in the controller context we try to create a node using a model binding to entity A, we get an Out of memory error from NetWeaver DS. Trying to figure out the problem, we identified that in the model that we imported we got the following:
    Entity A
        Entity B
            Entity A
               Entity B
                   and so on... without an end
    What are we doing wrong? Or how can we avoid this behavior?
    Regards,

    Hi Kaylan:
    Thanks for your reply. You are right about the error that we are getting. This is our scenario.
    We have an EJB that in some of its methods uses the entity Lote, and this entity has a relationship with the entity Categoria. When we import the EJB using the model importer, NetWeaver imports the EJB methods and the entities that they use.
    So after doing this, besides the EJB's methods we got these two entities, which are the ones that are generating the error when we try to create a context node using the Categoria entity.
    @Entity
    @Table(name="TB_LOTE")
    public class Lote implements Serializable {
         @EmbeddedId
         private Lote.PK pk;
         @ManyToOne
         @JoinColumn(name="CO_CATEGORIALOTE")
         private Categoria coCategorialote;
         @Embeddable
         public static class PK implements Serializable {
              @Column(name="CO_LOTE")
              private String coLote;
              @Column(name="CO_ORDENFABRICACION")
              private String coOrdenfabricacion2;
    @Entity
    @Table(name="TB_CATEGORIA")
    public class Categoria implements Serializable {
         @Id
         @Column(name="CO_CATEGORIA")
         private String coCategoria;
         @OneToMany(mappedBy="coCategorialote")
         private Set<Lote> tbLoteCollection;
    Regards,
    Jose Arango

  • XSOMParser throwing out of memory error

    Hello,
    Currently we are using the XSOM parser with DomAnnotationParserFactory to parse an XSD file. For small files it is working fine; however, it was throwing an out of memory error while parsing a 9MB file. We could not understand the reason behind this. Is there any way to resolve this issue?
    Code :
         XSOMParser parser = new XSOMParser();
    parser.setAnnotationParser(new DomAnnotationParserFactory());
    XSSchemaSet schemaSet = null;
    XSSchema xsSchema = null;
    parser.parse(configFilePath);
    Here we are getting the error on the parser.parse() method (using 128 MB heap memory via -Xrs -Xmx128m).
    Stack Trace :
    Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
         at oracle.xml.parser.v2.XMLDocument.xdkIncCurrentId(XMLDocument.java:3020)
         at oracle.xml.parser.v2.XMLNode.xdkInit(XMLNode.java:2758)
         at oracle.xml.parser.v2.XMLNode.<init>(XMLNode.java:423)
         at oracle.xml.parser.v2.XMLNSNode.<init>(XMLNSNode.java:144)
         at oracle.xml.parser.v2.XMLElement.<init>(XMLElement.java:373)
         at oracle.xml.parser.v2.XMLDocument.createNodeFromType(XMLDocument.java:2865)
         at oracle.xml.parser.v2.XMLDocument.createElement(XMLDocument.java:1896)
         at oracle.xml.parser.v2.DocumentBuilder.startElement(DocumentBuilder.java:224)
         at oracle.xml.parser.v2.XMLElement.reportStartElement(XMLElement.java:3188)
         at oracle.xml.parser.v2.XMLElement.reportSAXEvents(XMLElement.java:2164)
         at oracle.xml.jaxp.JXTransformer.transform(JXTransformer.java:337)
         at oracle.xml.jaxp.JXTransformerHandler.endDocument(JXTransformerHandler.java:141)
         at com.sun.xml.xsom.impl.parser.state.NGCCRuntime.endElement(NGCCRuntime.java:267)
         at org.xml.sax.helpers.XMLFilterImpl.endElement(Unknown Source)
         at oracle.xml.parser.v2.NonValidatingParser.parseElement(NonValidatingParser.java:1257)
         at oracle.xml.parser.v2.NonValidatingParser.parseRootElement(NonValidatingParser.java:314)
         at oracle.xml.parser.v2.NonValidatingParser.parseDocument(NonValidatingParser.java:281)
         at oracle.xml.parser.v2.XMLParser.parse(XMLParser.java:196)
         at org.xml.sax.helpers.XMLFilterImpl.parse(Unknown Source)
         at com.sun.xml.xsom.parser.JAXPParser.parse(JAXPParser.java:79)
         at com.sun.xml.xsom.impl.parser.NGCCRuntimeEx.parseEntity(NGCCRuntimeEx.java:298)
         at com.sun.xml.xsom.impl.parser.ParserContext.parse(ParserContext.java:87)
         at com.sun.xml.xsom.parser.XSOMParser.parse(XSOMParser.java:147)
         at com.sun.xml.xsom.parser.XSOMParser.parse(XSOMParser.java:136)
         at com.sun.xml.xsom.parser.XSOMParser.parse(XSOMParser.java:129)
         at com.sun.xml.xsom.parser.XSOMParser.parse(XSOMParser.java:122)
    Please let me know if anyone has a comment on this.
    Also let me know if there is any other parser which handles large input files efficiently.
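As a general note, a streaming (SAX) parser keeps memory flat because it reports elements as events instead of building the whole document tree the way a DOM-backed parser does. A minimal sketch of the idea using the JDK's built-in SAX API (the class name and the tiny inline document are just illustrations; a large XSD would be fed in the same way via a File or InputStream):

```java
import java.io.StringReader;
import javax.xml.parsers.SAXParser;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.InputSource;
import org.xml.sax.helpers.DefaultHandler;

public class SaxCount {

    // Count elements without ever materializing a document tree: the parser
    // calls startElement once per element and keeps nothing else around.
    static int countElements(String xml) {
        final int[] count = {0};
        try {
            SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
            parser.parse(new InputSource(new StringReader(xml)), new DefaultHandler() {
                @Override
                public void startElement(String uri, String localName,
                                         String qName, Attributes attributes) {
                    count[0]++; // handle each element as it streams past
                }
            });
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
        return count[0];
    }

    public static void main(String[] args) {
        System.out.println(countElements("<a><b/><b/></a>")); // 3
    }
}
```

Whether this helps here depends on how much of XSOM's schema model you actually need; the event-based approach only works if you can process the schema piecewise.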


  • Can anyone help with an out of memory error on CS2?

    Hello,
    I have been battling an out of memory error on a Dell laptop running 2 GB of RAM. I had no prior issues with this laptop running CS2 until last week, when I started getting the out of memory errors. I have seen several articles about this occurring on later versions of Illustrator. Just prior to this issue appearing, I had to delete and rebuild the main user profile on the computer because it was consistently logging into a temporary account.
    Once the new profile was established and all user documents and profile settings were transferred back into it, this problem started showing up. The error occurs when clicking on files listed as Illustrator files, and when opening the Illustrator suite. Once I click past the out of memory error it still continues to load Illustrator. The files are located on a server and not on the remote computer. I have tried to copy the files to the local hard drive with the same result when I try to open them. I also tried to change the extension to .pdf and got no further. I tried to open a file and recreate the work, and all I got was the black outline of the fonts; none of the other artwork appeared. Once rebuilt, when trying to save, it gives the out of memory error and does not save the file.
    Illustrator was installed on another Dell laptop, same model, etc., and works fine, with no problem accessing network files. I just attempted to use Illustrator from the admin profile on the machine in question and it works fine and lightning fast. Just not from the newly rebuilt profile. I have tried to uninstall and reinstall the software; it has not helped. Could there be stray configuration settings that we transferred over to the new profile that should not have been?
    Any insights will be helpful.
    Thank You,
    Brad Yeutter

    Here are some steps that could help you:
    http://kb2.adobe.com/cps/320/320782.html
    http://kb2.adobe.com/cps/401/kb401053.html
    I hope this helps!

  • Out of memory error, Urgent!!!

    Hi All,
    I know there have been a lot of posts on this topic, but none of them seem to address my problem, so please help me out!!! I have written a GUI application that, if you open and close it up to a certain number of times, say 10 times, gives an OOM. I have set the objects to null when closing the application and called System.gc() as well as System.runFinalization(). Note that I can't call System.exit(0) since this application is not a standalone one. I thought that in Java, if objects are not used any more, they will be garbage collected; I may not even have to set the objects to null. But obviously there are still references to these objects in memory.
    I am really frustrated. I read about SoftReference, but I don't get the error while running the application, so I don't think it's very useful for my case.
    hhh

    > You probably still have references to the objects somewhere.
    > The variables you've set to null aren't the only references.
    Is there a way that I can find out what references are still alive after I exit the application?
    > After you exit the application? After you exit the application, there are zero references,
    > and also it's impossible to get out of memory errors from Java after it exits.
    > But I think you may be using the word "application" or "exit" incorrectly here...
    > What do you mean you don't get the error when running the application?
    I meant I don't get the OOM error while running the application. I simply launch the application and then exit it. By doing so about 10 times, I then get the OOM error.
    > But that's impossible. After the JVM exits, the JVM can't throw an out of memory error.
    > It can't throw any errors. It's not running anymore.
    As I said in my original message, my application is not a standalone one. It is actually one of the tools out of the main application. That's why when I exit this tool I can't use System.exit(0) to stop the JVM.
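One common way this happens, regardless of the specific GUI toolkit: some long-lived structure still holds the "closed" windows, so nulling local variables changes nothing. A deliberately simplified illustration (not the poster's code; the registry and all names here are invented for the sketch):

```java
import java.util.ArrayList;
import java.util.List;

public class LeakDemo {

    // A long-lived static registry: anything added here stays reachable
    // (and therefore un-collectable) until it is explicitly removed.
    static final List<Object> openDialogs = new ArrayList<Object>();

    static Object open() {
        Object dialog = new Object(); // stands in for a dialog/tool window
        openDialogs.add(dialog);
        return dialog;
    }

    static void close(Object dialog) {
        // Without this remove(), setting local variables to null would not
        // help: the registry still references every "closed" dialog.
        openDialogs.remove(dialog);
    }

    public static void main(String[] args) {
        Object d = open();
        close(d);
        System.out.println(openDialogs.size()); // 0
    }
}
```

Listener lists on a long-lived parent frame behave the same way as this registry: the tool must be deregistered on close, not just dereferenced.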

  • Uploading large files from applet to servlet throws out of memory error

    I have a Java applet that needs to upload files from a client machine to a web server using a servlet. The problem I am having is that, in the current scheme, files larger than 17-20MB throw an out of memory error. Is there any way we can get around this problem? I will post the client and server side code for reference.
    Client Side Code:
    import java.io.*;
    import java.net.*;
    // this class is a client that enables transfer of files from client
    // to server. This client connects to a servlet running on the server
    // and transmits the file.
     public class fileTransferClient {
        private static final String FILENAME_HEADER = "fileName";
        private static final String FILELASTMOD_HEADER = "fileLastMod";
        // this method transfers the prescribed file to the server.
        // if the destination directory is "", it transfers the file to "d:\\".
        //11-21-02 Changes : This method now has a new parameter that references the item
        //that is being transferred in the import list.
        public static String transferFile(String srcFileName, String destFileName,
                                          String destDir, int itemID) {
           if (destDir.equals(""))
              destDir = "E:\\FTP\\incoming\\";
           // get the fully qualified filename and the mere filename.
           String fqfn = srcFileName;
           String fname = fqfn.substring(fqfn.lastIndexOf(File.separator)+1);
           try {
              //importTable importer = jbInit.getImportTable();
              // create the file to be uploaded and a connection to servlet.
              File fileToUpload = new File(fqfn);
              long fileSize = fileToUpload.length();
              // get last mod of this file.
              // The last mod is sent to the servlet as a header.
              long lastMod = fileToUpload.lastModified();
              String strLastMod = String.valueOf(lastMod);
              URL serverURL = new URL(webadminApplet.strServletURL);
              URLConnection serverCon = serverURL.openConnection();
              // a bunch of connection setup related things.
              serverCon.setDoInput(true);
              serverCon.setDoOutput(true);
              // Don't use a cached version of URL connection.
              serverCon.setUseCaches(false);
              serverCon.setDefaultUseCaches(false);
              // set headers and their values.
              serverCon.setRequestProperty("Content-Type", "application/octet-stream");
              serverCon.setRequestProperty("Content-Length", Long.toString(fileToUpload.length()));
              serverCon.setRequestProperty(FILENAME_HEADER, destDir + destFileName);
              serverCon.setRequestProperty(FILELASTMOD_HEADER, strLastMod);
              if (webadminApplet.DEBUG) System.out.println("Connection with FTP server established");
              // create file stream and write stream to write file data.
              FileInputStream fis = new FileInputStream(fileToUpload);
              OutputStream os = serverCon.getOutputStream();
              try {
                 // transfer the file in 4K chunks.
                 byte[] buffer = new byte[4096];
                 long byteCnt = 0;
                 //long percent = 0;
                 int newPercent = 0;
                 int oldPercent = 0;
                 while (true) {
                    int bytes = fis.read(buffer);
                    byteCnt += bytes;
                    //11-21-02 :
                    //If itemID is greater than -1 this is an import file transfer,
                    //otherwise this is a header graphic file transfer.
                    if (itemID > -1) {
                       newPercent = (int) ((double) byteCnt / (double) fileSize * 100.0);
                       int diff = newPercent - oldPercent;
                       if (newPercent == 0 || diff >= 20) {
                          oldPercent = newPercent;
                          jbInit.getImportTable().displayFileTransferStatus(itemID, newPercent);
                       }
                    }
                    if (bytes < 0) break;
                    os.write(buffer, 0, bytes);
                    os.flush();
                 }
                 if (webadminApplet.DEBUG) System.out.println("No of bytes sent: " + byteCnt);
              }
              finally {
                 // close related streams.
                 os.close();
                 fis.close();
              }
              if (webadminApplet.DEBUG) System.out.println("File Transmission complete");
              // find out what the servlet has got to say in response.
              BufferedReader reader = new BufferedReader(
                 new InputStreamReader(serverCon.getInputStream()));
              try {
                 String line;
                 while ((line = reader.readLine()) != null)
                    if (webadminApplet.DEBUG) System.out.println(line);
              }
              finally {
                 // close the reader stream from servlet.
                 reader.close();
              }
           } // end of the big try block.
           catch (Exception e) {
              System.out.println("Exception during file transfer:\n" + e);
              e.printStackTrace();
              return("FTP failed. See Java Console for Errors.");
           } // end of catch block.
           return("File: " + fname + " successfully transferred.");
        } // end of method transferFile().
     } // end of class fileTransferClient
    Server side code:
    import java.io.*;
    import javax.servlet.*;
    import javax.servlet.http.*;
    import java.util.*;
    import java.net.*;

    // This servlet class acts as an FTP server to enable transfer of files
    // from client side.
    public class FtpServerServlet extends HttpServlet
    {
        String ftpDir = "D:\\pub\\FTP\\";
        private static final String FILENAME_HEADER = "fileName";
        private static final String FILELASTMOD_HEADER = "fileLastMod";

        public void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException
        {
            doPost(req, resp);
        }

        public void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException
        {
            // ### for now enable overwrite by default.
            boolean overwrite = true;
            // get the fileName for this transmission.
            String fileName = req.getHeader(FILENAME_HEADER);
            // also get the last mod of this file.
            String strLastMod = req.getHeader(FILELASTMOD_HEADER);
            String message = "Filename: " + fileName + " saved successfully.";
            int status = HttpServletResponse.SC_OK;
            System.out.println("fileName from client: " + fileName);
            // if filename is not specified, complain.
            if (fileName == null)
            {
                message = "Filename not specified";
                status = HttpServletResponse.SC_INTERNAL_SERVER_ERROR;
            }
            else
            {
                // open the file stream for the file about to be transferred.
                File uploadedFile = new File(fileName);
                // check if file already exists - and overwrite if necessary.
                if (uploadedFile.exists())
                {
                    if (overwrite)
                    {
                        // delete the file.
                        uploadedFile.delete();
                    }
                }
                // ensure the directory is writable - and a new file may be created.
                if (!uploadedFile.createNewFile())
                {
                    message = "Unable to create file on server. FTP failed.";
                    status = HttpServletResponse.SC_INTERNAL_SERVER_ERROR;
                }
                else
                {
                    // get the necessary streams for file creation.
                    FileOutputStream fos = new FileOutputStream(uploadedFile);
                    InputStream is = req.getInputStream();
                    try
                    {
                        // create a buffer. 4K!
                        byte[] buffer = new byte[4096];
                        // read from input stream and write to file stream.
                        int byteCnt = 0;
                        while (true)
                        {
                            int bytes = is.read(buffer);
                            if (bytes < 0) break;
                            byteCnt += bytes;
                            // System.out.println(buffer);
                            fos.write(buffer, 0, bytes);
                            // flush the stream.
                            fos.flush();
                        }
                    } // end of try block.
                    finally
                    {
                        is.close();
                        fos.close();
                        // set last mod date for this file.
                        uploadedFile.setLastModified((new Long(strLastMod)).longValue());
                    } // end of finally block.
                } // end - the new file may be created on server.
            } // end - we have a valid filename.
            // set response headers.
            resp.setContentType("text/plain");
            resp.setStatus(status);
            if (status != HttpServletResponse.SC_OK)
                getServletContext().log("ERROR: " + message);
            // get output stream.
            PrintWriter out = resp.getWriter();
            out.println(message);
        } // end of doPost().
    } // end of class FtpServerServlet

    OK - the problem you describe is definitely what's giving you grief.
    The workaround is to use a socket connection and send your own request headers, with the content length filled in. You may also have to multipart MIME-encode the stream on its way out (I'm not sure about that...).
    You can use the following:
    http://porsche.cis.udel.edu:8080/cis479/lectures/slides-04/slide-02.html
    on your server to get a feel for the format that the request headers need to take.
    - Kevin
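    Kevin's raw-socket suggestion might be sketched roughly as below. This is an untested sketch under assumptions: the host, port, and path are placeholders, HTTP/1.0 is used to keep the exchange simple, and any extra headers the servlet expects would still need to be added.

```java
import java.io.*;
import java.net.Socket;

public class RawPost {
    // Build the HTTP request head with an explicit Content-Length, so the
    // body can be streamed over the socket without client-side buffering.
    static String requestHead(String path, String host, long contentLength) {
        return "POST " + path + " HTTP/1.0\r\n"
             + "Host: " + host + "\r\n"
             + "Content-Type: application/octet-stream\r\n"
             + "Content-Length: " + contentLength + "\r\n"
             + "\r\n";
    }

    static void postFile(String host, int port, String path, File f)
            throws IOException {
        Socket s = new Socket(host, port);
        try {
            OutputStream os = s.getOutputStream();
            os.write(requestHead(path, host, f.length()).getBytes("ISO-8859-1"));
            FileInputStream fis = new FileInputStream(f);
            try {
                // Stream the file in 4K chunks; nothing accumulates in memory.
                byte[] buf = new byte[4096];
                int n;
                while ((n = fis.read(buf)) >= 0) os.write(buf, 0, n);
            } finally {
                fis.close();
            }
            os.flush();
        } finally {
            s.close();
        }
    }
}
```

    Because the length is written in the request head before any body bytes go out, the client never has to hold the file in memory.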
    I get the out of memory error on the client side. I was told that this might be a bug in the URLConnection class implementation: basically, it won't know the content length until all the data has been written to the output stream, so it uses an in-memory buffer to store the data, which causes the memory issues. Do you think there might be a workaround of any kind, or maybe a way that the buffer might be flushed after a certain amount of the file has been uploaded? Do you have any ideas?
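    On later JDKs there is a direct workaround for exactly this buffering behavior: HttpURLConnection gained setFixedLengthStreamingMode (and setChunkedStreamingMode) in JDK 5, which tell the connection the body length up front so it streams instead of accumulating everything to compute Content-Length. A minimal sketch under assumptions (the servlet URL and headers are placeholders, not the code above):

```java
import java.io.*;
import java.net.HttpURLConnection;
import java.net.URL;

public class StreamingUpload {
    // Copy 'in' to 'out' in 4K chunks; returns the number of bytes copied.
    static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buffer = new byte[4096];
        long total = 0;
        int n;
        while ((n = in.read(buffer)) >= 0) {
            out.write(buffer, 0, n);
            total += n;
        }
        out.flush();
        return total;
    }

    // Upload a file without URLConnection buffering the whole body in memory.
    static void upload(File f, URL servletURL) throws IOException {
        HttpURLConnection con = (HttpURLConnection) servletURL.openConnection();
        con.setDoOutput(true);
        con.setRequestMethod("POST");
        // The key call: declares the length up front, enabling streaming.
        con.setFixedLengthStreamingMode((int) f.length());
        con.setRequestProperty("Content-Type", "application/octet-stream");
        InputStream fis = new FileInputStream(f);
        OutputStream os = con.getOutputStream();
        try {
            copy(fis, os);
        } finally {
            os.close();
            fis.close();
        }
        // Read (and discard) the response so the request completes.
        con.getInputStream().close();
    }
}
```

    With fixed-length streaming mode set, the 4K chunks go straight onto the wire, so a 250,000-line file costs no more client memory than a small one.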

  • Getting out of memory errors in Indesign 5.5. What can I do to fix it?

    Getting out of memory errors in Indesign 5.5. What can I do to fix it?

    Tell your dumb friend to pay you for a new phone as he damaged it. You cannot get help here for a phone that has been taken apart, as it is not user servicible. Your dumb friend also voided your warranty and, even if the warranty were expired Apple will never touch that phone.
    Time to get smarter friends.

  • Out of Memory error while building HTML String from a Large HashMap.

    Hi,
    I am building an HTML string from a large map object that consists of about 32,000 objects, using the Transformer class in Java. As this HTML string needs to be displayed in a JSP page, the response time is too high, and it also sometimes throws an out of memory error.
    Please let me know how I can build the library tree (folder structure) HTML string for the first set of, say, 1000 entries and display it in the web page, then detect an onScroll event, handle it in JavaScript functions, come back and build the tree for the next set of entries in the map, and append that string to the previous one for display.
    Please let me know:
    1. Whether the suggested solution is advisable.
    2. How to build the tree (HTML string) for a set of entries while iterating over the map.
    3. How to detect an onScroll event and handle it.
    Note: Handling the events in the JavaScript functions and displaying the tree is currently done using AJAX.
    Thanks for your help in advance,
    Kartheek
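    The windowing idea described above can be sketched on the server side as below. This is a hypothetical sketch, not the poster's actual code: `renderPage` emits only entries [offset, offset+limit) per request, a SortedMap is assumed so pages stay stable across requests, and real code would also HTML-escape keys and values.

```java
import java.util.*;

public class TreePager {
    // Render only a window of the map as HTML list items, so no single
    // request ever builds the full 32,000-entry string in memory.
    static String renderPage(SortedMap<String, String> map, int offset, int limit) {
        StringBuilder sb = new StringBuilder();
        int i = 0;
        for (Map.Entry<String, String> e : map.entrySet()) {
            if (i >= offset + limit) break;   // past the window: stop iterating
            if (i >= offset) {                // inside the window: emit an item
                sb.append("<li>").append(e.getKey())
                  .append(": ").append(e.getValue()).append("</li>\n");
            }
            i++;
        }
        return sb.toString();
    }
}
```

    Each AJAX call triggered by the onScroll handler would then pass the next offset and append the returned fragment to the tree already in the page.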

    Hi,
    Sorry, I haven't seen any error in the browser, as this may be an out of memory error that was not handled. I got the following error from the WebLogic console:
    org.apache.struts.actions.DispatchAction">Dispatch[serviceCenterHome] to method 'getUserLibraryTree' returned an exceptionjava.lang.reflect.InvocationTargetException
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
         at java.lang.reflect.Method.invoke(Method.java:324)
         at org.apache.struts.actions.DispatchAction.dispatchMethod(DispatchAction.java:276)
         at org.apache.struts.actions.DispatchAction.execute(DispatchAction.java:196)
         at org.apache.struts.action.RequestProcessor.processActionPerform(RequestProcessor.java:421)
         at org.apache.struts.action.RequestProcessor.process(RequestProcessor.java:226)
         at org.apache.struts.action.ActionServlet.process(ActionServlet.java:1164)
         at org.apache.struts.action.ActionServlet.doPost(ActionServlet.java:415)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:760)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
         at weblogic.servlet.internal.ServletStubImpl$ServletInvocationAction.run(ServletStubImpl.java:996)
         at weblogic.servlet.internal.ServletStubImpl.invokeServlet(ServletStubImpl.java:419)
         at weblogic.servlet.internal.ServletStubImpl.invokeServlet(ServletStubImpl.java:315)
         at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:6452)
         at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
         at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:118)
         at weblogic.servlet.internal.WebAppServletContext.invokeServlet(WebAppServletContext.java:3661)
         at weblogic.servlet.internal.ServletRequestImpl.execute(ServletRequestImpl.java:2630)
         at weblogic.kernel.ExecuteThread.execute(ExecuteThread.java:219)
         at weblogic.kernel.ExecuteThread.run(ExecuteThread.java:178)
    Caused by: java.lang.OutOfMemoryError
    </L_MSG>
    <L_MSG MN="ILHD-1109" PID="adminserver" TID="ExecuteThread: '14' for queue: 'weblogic.kernel.Default'" DT="2012/04/18 7:56:17:146" PT="WARN" AP="" DN="" SN="" SR="org.apache.struts.action.RequestProcessor">Unhandled Exception thrown: class javax.servlet.ServletException</L_MSG>
    <Apr 18, 2012 7:56:17 AM CDT> <Error> <HTTP> <BEA-101017> <[ServletContext(id=26367546,name=fcsi,context-path=/fcsi)] Root cause of ServletException.
    *java.lang.OutOfMemoryError*
    Please Advise.
    Thanks for your help in advance,
    Kartheek

  • Will 64-bit office fix out of memory error?

    I've been troubleshooting an out of memory error in Excel 2010 for some time. I've read quite a few articles on forums and on MS sites (including here). I find many hits but none seems to offer a solution that works. One idea that shows up often is that 64-bit Office versions have a LOT MORE memory to work with than 32-bit versions. I'm running a 64-bit version of Windows 7, so I'm considering giving Office 2010 64-bit a try. However, I also find a lot of caveats in those articles that concern me. On the other hand, most of those articles are 2 to 3 years old, so I'm wondering if most of those issues have been dealt with: for example, VBA issues; third-party add-ins that do not (or did not) support 64-bit; ActiveX and COM issues.
    Sorry to be overly verbose. My questions pretty much come down to this:
    1. Is 64-bit likely to solve my out of memory problem?
    2. What issues are still unresolved in 64-bit Excel with Windows 7?
    TIA,
    Phil

    If you are an Excel power user working with huge amounts of data, then you would benefit from 64-bit Office being able to utilize more memory.
    MS recommends 32-bit Office as the default installation on both 32-bit and 64-bit Windows, mainly due to compatibility with existing 32-bit controls, add-ins, and VBA.
    This is really not an issue that can be resolved on the Office side; it depends on whether you are using any 32-bit controls, add-ins, or VBA. Those existing 32-bit controls, add-ins, and VBA need an update to adapt to 64-bit Office.
    If all your controls, add-ins, and VBA are 64-bit, or designed to work with 64-bit Office, then you are good.
    Bhasker Timilsina (ManTechs Inc)

  • Adobe X Pro 10.1.10 Out of Memory error message

    Hello One, Hello All
    Since updating our Adobe X Pro machines to version update 10.1.10, we occasionally receive an "Out of Memory" error message, which depending on what we are doing, may force us to shutdown all Adobe windows and re-open Adobe, or simply click OK and continue working with the PDF document like nothing is wrong. It seems to have no rhyme or reason to when or why it occurs. I do not see any warnings or errors in the Windows Event Log. It happens with files we have created, and also happens with files received from internal and external sources through email. It is affecting our high end machines: Lenovo X Series and W Series laptops with Windows 8.1 3rd/4th gen i5 CPUs, 8+GB RAM, 128+GB SSD's. We have all system and Windows updates applied. We have Trend Micro and Malwarebytes real-time protection and system scans do not find any malware.
    I have found a few other recent threads on Adobe forum related to this error message but the responses are weak at best with no definitive fix. The system and user temp folders size are less than 100MB each so this cannot be the issue. When the error occurs I check task manager and system utilization, including memory, is well below 100%.
    We did not have this issue prior to version update 10.1.10.
    Really hoping Adobe can step in to help here and hoping boilerplate responses are not used.

    Hi wayne103,
    We released a new security update yesterday, v10.1.11.
    Please update to this version and check the performance.
    I have seen this error message occur while opening a third-party-created PDF with a corrupted file structure.
    Please let me know if the issue occurs for specific files or randomly.
    Also let me know the PDF producer for these PDFs.
    Regards,
    Rave

  • Large Pdf using XML XSL - Out of Memory Error

    Hi Friends.
    I am trying to generate a PDF from XML, XSL, and FO in Java. It works fine if the PDF to be generated is small.
    But if the PDF to be generated is big, it throws an "Out of Memory" error. Can someone please give me some pointers about the possible reasons for this error? Thanks for your help.
    RM
    Code:
    import java.io.*;
    import javax.servlet.*;
    import javax.servlet.http.*;
    import org.xml.sax.InputSource;
    import org.xml.sax.XMLReader;
    import org.apache.fop.apps.Driver;
    import org.apache.fop.apps.Version;
    import org.apache.fop.apps.XSLTInputHandler;
    import org.apache.fop.messaging.MessageHandler;
    import org.apache.avalon.framework.logger.ConsoleLogger;
    import org.apache.avalon.framework.logger.Logger;

    public class PdfServlet extends HttpServlet {
        public static final String FO_REQUEST_PARAM = "fo";
        public static final String XML_REQUEST_PARAM = "xml";
        public static final String XSL_REQUEST_PARAM = "xsl";
        Logger log = null;
        Com_BUtil myBu = new Com_BUtil();

        public void doGet(HttpServletRequest request,
                          HttpServletResponse response) throws ServletException {
            if (log == null) {
                log = new ConsoleLogger(ConsoleLogger.LEVEL_WARN);
                MessageHandler.setScreenLogger(log);
            }
            try {
                String foParam = request.getParameter(FO_REQUEST_PARAM);
                String xmlParam = myBu.getConfigVal("filePath") + "/" +
                    request.getParameter(XML_REQUEST_PARAM);
                String xslParam = myBu.SERVERROOT + "/jsp/servlet/" +
                    request.getParameter(XSL_REQUEST_PARAM) + ".xsl";
                if ((xmlParam != null) && (xslParam != null)) {
                    XSLTInputHandler input =
                        new XSLTInputHandler(new File(xmlParam), new File(xslParam));
                    renderXML(input, response);
                } else {
                    PrintWriter out = response.getWriter();
                    out.println("<html><head><title>Error</title></head>\n" +
                        "<body><h1>PdfServlet Error</h1><h3>No 'fo' " +
                        "request param given.</body></html>");
                }
            } catch (ServletException ex) {
                throw ex;
            } catch (Exception ex) {
                throw new ServletException(ex);
            }
        }

        public void renderXML(XSLTInputHandler input,
                              HttpServletResponse response) throws ServletException {
            try {
                ByteArrayOutputStream out = new ByteArrayOutputStream();
                response.setContentType("application/pdf");
                Driver driver = new Driver();
                driver.setLogger(log);
                driver.setRenderer(Driver.RENDER_PDF);
                driver.setOutputStream(out);
                driver.render(input.getParser(), input.getInputSource());
                byte[] content = out.toByteArray();
                response.setContentLength(content.length);
                response.getOutputStream().write(content);
                response.getOutputStream().flush();
            } catch (Exception ex) {
                throw new ServletException(ex);
            }
        }

        /**
         * creates a SAX parser, using the value of org.xml.sax.parser
         * defaulting to org.apache.xerces.parsers.SAXParser
         * @return the created SAX parser
         */
        static XMLReader createParser() throws ServletException {
            String parserClassName = System.getProperty("org.xml.sax.parser");
            if (parserClassName == null) {
                parserClassName = "org.apache.xerces.parsers.SAXParser";
            }
            try {
                return (XMLReader) Class.forName(parserClassName).newInstance();
            } catch (Exception e) {
                throw new ServletException(e);
            }
        }
    }
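    One likely contributor to the out of memory error in renderXML above is that the whole PDF is rendered into a ByteArrayOutputStream before being copied to the response. A common alternative is to hand the servlet's own output stream to the renderer. The sketch below uses a hypothetical Renderer interface as a stand-in for FOP's Driver (Content-Length is then unknown in advance, so it is simply not set):

```java
import java.io.*;

public class DirectStreamRender {
    // Hypothetical stand-in for a PDF renderer such as FOP's Driver.
    interface Renderer {
        void render(OutputStream out) throws IOException;
    }

    // Render straight to the destination stream: no intermediate byte[]
    // holding the entire document.
    static void renderTo(Renderer r, OutputStream responseStream) throws IOException {
        r.render(responseStream);
        responseStream.flush();
    }
}
```

    Applied to the servlet, that would mean passing response.getOutputStream() to driver.setOutputStream() and dropping the ByteArrayOutputStream and setContentLength call, at the cost of the browser not knowing the PDF size up front.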

    Hi,
    I did try that initially. After executing the command I get this message.
    C:\>java -Xms128M -Xmx256M
    Usage: java [-options] class [args...]
    (to execute a class)
    or java -jar [-options] jarfile [args...]
    (to execute a jar file)
    where options include:
    -cp -classpath <directories and zip/jar files separated by ;>
    set search path for application classes and resources
    -D<name>=<value>
    set a system property
    -verbose[:class|gc|jni]
    enable verbose output
    -version print product version and exit
    -showversion print product version and continue
    -? -help print this help message
    -X print help on non-standard options
    Thanks for your help.
    RM

  • Out of Memory error on some PDF's, not all PDF's.

    Hi there,
    I have read most of the posts in the forums that relate to 'Out of Memory' issues with PDF's and I have to say that there is still no solution that I have found.
    I have tried reinstalling Adobe Reader, Flash Player and tried clearing my Temp Files. None of these fixed the issue.
    The PDF's that receive this memory error are downloaded off a CMS website that has many offices and logins. Only one office is experiencing this out of memory issue so we know that it is not an issue with generating the PDF's on the CMS website, otherwise all the offices and logins would have this issue, since they use the same system.
    I am an admin of that CMS website and even I receive the same out of memory issue when I try and view the PDF's from that office, as the customer does.
    When we open these PDF's with another PDF reading program, the PDF's open fine. So that tells me that the issue lies with Adobe Reader and not the PDF file itself.
    The PDF files that receive the error are about 1MB-4MB, so they are not large.
    Could you please tell me the solution to this out of memory error? We may lose a customer because your product is producing an error that does not seem to have been fixed after years of being reported by your customers.
    Thanks.

    As an Adobe Reader XI user, if I may put my two cents in: that "Out of Memory" problem is not with the Adobe Reader XI application installed on our computers. I think the problem is with how the PDF documents we were trying to open were generated (or regenerated).
    I remember I was able to open my old credit card statement online without any problem, and now I am not able to open the same old statement (out of memory), because it could have been regenerated. In fact, I could not open any of my credit card statements (old or new) on this specific credit card web site, which makes me believe those statements were regenerated with some new Adobe software.
    As others mentioned, even I'm able to view other PDF documents without any problem.

  • Out of memory errors

    I'm having problems starting 8i. When started from /etc/rc.d/init.d/dbora during bootup, or manually by root, I get an out of memory error. If I start it from the dbadmin user (oracle), I either get the same out of memory error, or I end up with several hundred shell logins and the database still doesn't respond.
    This is RedHat Linux 6.0, kernel 2.2.5-22smp. Here is a sample of what happens when I get the out of memory error:
    Mem: 160448K av, 53488K used, 106960K free, 26264K shrd,
    6532K buff
    Swap: 656360K av, 0K used, 656360K free
    36328K cached
    Oracle Server Manager Release 3.1.5.0.0 - Production
    (c) Copyright 1997, Oracle Corporation. All Rights Reserved.
    Oracle8i Release 8.1.5.0.0 - Production
    With the Java option
    PL/SQL Release 8.1.5.0.0 - Production
    SVRMGR> Connected.
    SVRMGR> ORA-27102: out of memory
    Linux Error: 22: Invalid argument
    SVRMGR>
    Server Manager complete.
    Database "ORCL" warm started.

    It turns out that the problem I was having with a large number of shell processes being created was due to the use of oraenv in my .bashrc file (so much for following the instructions!). It was calling itself recursively until the process limit was reached. However, even with this fixed, the out of memory error still exists.
    max (guest) wrote:
    : dan....
    : check your init.ora......
    Aside from comments, it has these lines, which were created by dbassist:
    db_name = test
    instance_name = ORCL
    service_names = test
    control_files = ("/u02/oradata/test/control01.ctl", "/u02/oradata/test/control02.ctl")
    db_block_buffers = 8192
    shared_pool_size = 4194304
    log_checkpoint_interval = 10000
    log_checkpoint_timeout = 1800
    # I reduced processes to see if it would help
    processes = 10
    log_buffer = 163840
    background_dump_dest = /u01/admin/test/bdump
    core_dump_dest = /u01/admin/test/cdump
    user_dump_dest = /u01/admin/test/udump
    db_block_size = 2048
    remote_login_passwordfile = exclusive
    os_authent_prefix = ""
    compatible = "8.1.0"
    : also check ulimit
    Here's ulimit -a:
    core file size (blocks) 1000000
    data seg size (kbytes) unlimited
    file size (blocks) unlimited
    max memory size (kbytes) unlimited
    stack size (kbytes) 8192
    cpu time (seconds) unlimited
    max user processes 256
    pipe size (512 bytes) 8
    open files 1024
    virtual memory (kbytes) 2105343
    Everything looks pretty large to me.
