XSOMParser throwing out of memory error
Hello,
Currently we are using the XSOM parser with DomAnnotationParserFactory to parse an XSD file. For small files it works fine, but it throws an out of memory error while parsing a 9 MB file. We could not understand the reason behind this. Is there any way to resolve this issue?
Code :
XSOMParser parser = new XSOMParser();
parser.setAnnotationParser(new DomAnnotationParserFactory());
XSSchemaSet schemaSet = null;
XSSchema xsSchema = null;
parser.parse(configFilePath);
We get the error on the parser.parse() call (running with a 128 MB heap: -Xrs -Xmx128m).
Stack Trace :
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at oracle.xml.parser.v2.XMLDocument.xdkIncCurrentId(XMLDocument.java:3020)
at oracle.xml.parser.v2.XMLNode.xdkInit(XMLNode.java:2758)
at oracle.xml.parser.v2.XMLNode.<init>(XMLNode.java:423)
at oracle.xml.parser.v2.XMLNSNode.<init>(XMLNSNode.java:144)
at oracle.xml.parser.v2.XMLElement.<init>(XMLElement.java:373)
at oracle.xml.parser.v2.XMLDocument.createNodeFromType(XMLDocument.java:2865)
at oracle.xml.parser.v2.XMLDocument.createElement(XMLDocument.java:1896)
at oracle.xml.parser.v2.DocumentBuilder.startElement(DocumentBuilder.java:224)
at oracle.xml.parser.v2.XMLElement.reportStartElement(XMLElement.java:3188)
at oracle.xml.parser.v2.XMLElement.reportSAXEvents(XMLElement.java:2164)
at oracle.xml.jaxp.JXTransformer.transform(JXTransformer.java:337)
at oracle.xml.jaxp.JXTransformerHandler.endDocument(JXTransformerHandler.java:141)
at com.sun.xml.xsom.impl.parser.state.NGCCRuntime.endElement(NGCCRuntime.java:267)
at org.xml.sax.helpers.XMLFilterImpl.endElement(Unknown Source)
at oracle.xml.parser.v2.NonValidatingParser.parseElement(NonValidatingParser.java:1257)
at oracle.xml.parser.v2.NonValidatingParser.parseRootElement(NonValidatingParser.java:314)
at oracle.xml.parser.v2.NonValidatingParser.parseDocument(NonValidatingParser.java:281)
at oracle.xml.parser.v2.XMLParser.parse(XMLParser.java:196)
at org.xml.sax.helpers.XMLFilterImpl.parse(Unknown Source)
at com.sun.xml.xsom.parser.JAXPParser.parse(JAXPParser.java:79)
at com.sun.xml.xsom.impl.parser.NGCCRuntimeEx.parseEntity(NGCCRuntimeEx.java:298)
at com.sun.xml.xsom.impl.parser.ParserContext.parse(ParserContext.java:87)
at com.sun.xml.xsom.parser.XSOMParser.parse(XSOMParser.java:147)
at com.sun.xml.xsom.parser.XSOMParser.parse(XSOMParser.java:136)
at com.sun.xml.xsom.parser.XSOMParser.parse(XSOMParser.java:129)
at com.sun.xml.xsom.parser.XSOMParser.parse(XSOMParser.java:122)
Please let me know if anyone has a comment on this.
Also, is there any other parser that handles large input files efficiently?
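One avenue worth trying, sketched under the assumption (suggested by the oracle.xml.* frames in the trace) that the Oracle XDK is being picked up as the default JAXP/TrAX implementation: hand XSOMParser an explicit SAX parser factory and point the TransformerFactory lookup at the JDK's own implementations, so annotation handling does not route through the XDK's DOM-building TransformerHandler. The two property values below are the Sun JDK 5/6 internal class names; failing that, raising -Xmx beyond 128m is the blunt fix.

```java
import javax.xml.parsers.SAXParserFactory;

import com.sun.xml.xsom.XSSchemaSet;
import com.sun.xml.xsom.parser.XSOMParser;
import com.sun.xml.xsom.util.DomAnnotationParserFactory;

public class SchemaLoad {
    public static void main(String[] args) throws Exception {
        // Force the JDK's built-in JAXP/TrAX implementations instead of the
        // Oracle XDK ones resolved from the classpath (an assumption based
        // on the oracle.xml.* frames in the stack trace).
        System.setProperty("javax.xml.parsers.SAXParserFactory",
                "com.sun.org.apache.xerces.internal.jaxp.SAXParserFactoryImpl");
        System.setProperty("javax.xml.transform.TransformerFactory",
                "com.sun.org.apache.xalan.internal.xsltc.trax.TransformerFactoryImpl");

        SAXParserFactory spf = SAXParserFactory.newInstance();
        spf.setNamespaceAware(true);

        // XSOMParser(SAXParserFactory) lets us control which SAX parser
        // reads the schema, instead of the default JAXP lookup.
        XSOMParser parser = new XSOMParser(spf);
        parser.setAnnotationParser(new DomAnnotationParserFactory());
        parser.parse(args[0]);

        XSSchemaSet schemaSet = parser.getResult();
        System.out.println("Parsed " + schemaSet.getSchemaSize() + " schema(s)");
    }
}
```

This is a sketch, not a confirmed fix: if the XDK classes are wired in via jaxp.properties or endorsed dirs rather than the service lookup, the system properties alone may not displace them.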
Similar Messages
-
Uploading large files from applet to servlet throws out of memory error
I have a java applet that needs to upload files from a client machine
to a web server using a servlet. the problem i am having is that in
the current scheme, files larger than 17-20MB throw an out of memory
error. is there any way we can get around this problem? i will post
the client and server side code for reference.
Client Side Code:
import java.io.*;
import java.net.*;
// this class is a client that enables transfer of files from client
// to server. This client connects to a servlet running on the server
// and transmits the file.
public class fileTransferClient {
private static final String FILENAME_HEADER = "fileName";
private static final String FILELASTMOD_HEADER = "fileLastMod";
// this method transfers the prescribed file to the server.
// if the destination directory is "", it transfers the file to "d:\\".
// 11-21-02 Changes: This method now has a new parameter that references
// the item that is being transferred in the import list.
public static String transferFile(String srcFileName, String destFileName,
String destDir, int itemID) {
if (destDir.equals(""))
destDir = "E:\\FTP\\incoming\\";
// get the fully qualified filename and the mere filename.
String fqfn = srcFileName;
String fname = fqfn.substring(fqfn.lastIndexOf(File.separator) + 1);
try {
//importTable importer = jbInit.getImportTable();
// create the file to be uploaded and a connection to servlet.
File fileToUpload = new File(fqfn);
long fileSize = fileToUpload.length();
// get last mod of this file.
// The last mod is sent to the servlet as a header.
long lastMod = fileToUpload.lastModified();
String strLastMod = String.valueOf(lastMod);
URL serverURL = new URL(webadminApplet.strServletURL);
URLConnection serverCon = serverURL.openConnection();
// a bunch of connection setup related things.
serverCon.setDoInput(true);
serverCon.setDoOutput(true);
// Don't use a cached version of URL connection.
serverCon.setUseCaches (false);
serverCon.setDefaultUseCaches (false);
// set headers and their values.
serverCon.setRequestProperty("Content-Type", "application/octet-stream");
serverCon.setRequestProperty("Content-Length", Long.toString(fileToUpload.length()));
serverCon.setRequestProperty(FILENAME_HEADER, destDir + destFileName);
serverCon.setRequestProperty(FILELASTMOD_HEADER, strLastMod);
if (webadminApplet.DEBUG) System.out.println("Connection with FTP server established");
// create file stream and write stream to write file data.
FileInputStream fis = new FileInputStream(fileToUpload);
OutputStream os = serverCon.getOutputStream();
try
// transfer the file in 4K chunks.
byte[] buffer = new byte[4096];
long byteCnt = 0;
//long percent = 0;
int newPercent = 0;
int oldPercent = 0;
while (true)
int bytes = fis.read(buffer);
byteCnt += bytes;
//11-21-02 :
//If itemID is greater than -1 this is an import file transfer,
//otherwise this is a header graphic file transfer.
if (itemID > -1)
newPercent = (int) ((double) byteCnt / (double) fileSize * 100.0);
int diff = newPercent - oldPercent;
if (newPercent == 0 || diff >= 20)
oldPercent = newPercent;
jbInit.getImportTable().displayFileTransferStatus(itemID, newPercent);
if (bytes < 0) break;
os.write(buffer, 0, bytes);
os.flush();
if (webadminApplet.DEBUG) System.out.println("No of bytes sent: " + byteCnt);
finally
// close related streams.
os.close();
fis.close();
if (webadminApplet.DEBUG) System.out.println("File Transmission complete");
// find out what the servlet has got to say in response.
BufferedReader reader = new BufferedReader(new InputStreamReader(serverCon.getInputStream()));
try
String line;
while ((line = reader.readLine()) != null)
if (webadminApplet.DEBUG) System.out.println(line);
finally
// close the reader stream from servlet.
reader.close();
} // end of the big try block.
catch (Exception e) {
System.out.println("Exception during file transfer:\n" + e);
e.printStackTrace();
return("FTP failed. See Java Console for Errors.");
} // end of catch block.
return("File: " + fname + " successfully transferred.");
} // end of method transferFile().
} // end of class fileTransferClient
Server side code:
import java.io.*;
import javax.servlet.*;
import javax.servlet.http.*;
import java.util.*;
import java.net.*;
// This servlet class acts as an FTP server to enable transfer of files
// from client side.
public class FtpServerServlet extends HttpServlet {
String ftpDir = "D:\\pub\\FTP\\";
private static final String FILENAME_HEADER = "fileName";
private static final String FILELASTMOD_HEADER = "fileLastMod";
public void doGet(HttpServletRequest req, HttpServletResponse resp)
throws ServletException, IOException {
doPost(req, resp);
}
public void doPost(HttpServletRequest req, HttpServletResponse resp)
throws ServletException, IOException {
// ### for now enable overwrite by default.
boolean overwrite = true;
// get the fileName for this transmission.
String fileName = req.getHeader(FILENAME_HEADER);
// also get the last mod of this file.
String strLastMod = req.getHeader(FILELASTMOD_HEADER);
String message = "Filename: " + fileName + " saved successfully.";
int status = HttpServletResponse.SC_OK;
System.out.println("fileName from client: " + fileName);
// if filename is not specified, complain.
if (fileName == null) {
message = "Filename not specified";
status = HttpServletResponse.SC_INTERNAL_SERVER_ERROR;
} else {
// open the file stream for the file about to be transferred.
File uploadedFile = new File(fileName);
// check if file already exists - and overwrite if necessary.
if (uploadedFile.exists())
if (overwrite)
// delete the file.
uploadedFile.delete();
// ensure the directory is writable - and a new file may be created.
if (!uploadedFile.createNewFile()) {
message = "Unable to create file on server. FTP failed.";
status = HttpServletResponse.SC_INTERNAL_SERVER_ERROR;
} else {
// get the necessary streams for file creation.
FileOutputStream fos = new FileOutputStream(uploadedFile);
InputStream is = req.getInputStream();
try
// create a buffer. 4K!
byte[] buffer = new byte[4096];
// read from input stream and write to file stream.
int byteCnt = 0;
while (true)
int bytes = is.read(buffer);
if (bytes < 0) break;
byteCnt += bytes;
// System.out.println(buffer);
fos.write(buffer, 0, bytes);
// flush the stream.
fos.flush();
} // end of try block.
finally
is.close();
fos.close();
// set last mod date for this file.
uploadedFile.setLastModified((new Long(strLastMod)).longValue());
} // end of finally block.
} // end - the new file may be created on server.
} // end - we have a valid filename.
// set response headers.
resp.setContentType("text/plain");
resp.setStatus(status);
if (status != HttpServletResponse.SC_OK)
getServletContext().log("ERROR: " + message);
// get output stream.
PrintWriter out = resp.getWriter();
out.println(message);
} // end of doPost().
} // end of class FtpServerServlet
OK - the problem you describe is definitely what's giving you grief.
The workaround is to use a socket connection and send your own request headers, with the content length filled in. You may have to multipart-MIME encode the stream on its way out as well (I'm not sure about that...).
You can use the following:
http://porsche.cis.udel.edu:8080/cis479/lectures/slides-04/slide-02.html
on your server to get a feel for the format that the request headers need to take.
- Kevin
I get the out of memory error on the client side. I was told that this might be a bug in the URLConnection class implementation: basically it won't know the content length until all the data has been written to the output stream, so it uses an in-memory buffer to store the data, which causes the memory issues. Do you think there might be a workaround of any kind, or maybe a way that the buffer might be flushed after a certain amount of the file has been uploaded? Do you have any ideas? -
SAP AS JAVA installation in Solaris Zone environment throws Out of Memory error.
Hi,
We are installing SAP NW2004s 7.0 SR3 AS JAVA on Solaris 10 in a zone environment. This is a prod server build, on top of which we will install GRC 5.3.
We faced no issues in the development build.
But during the prod build, at the database create step, we get the below error
ORA-27102: out of memory
SVR4 Error: 22: Invalid argument
Disconnected
SAPinst log entries where error messages started:
04:43:58.128
Execute step createDatabase of component |NW_Onehost|ind|ind|ind|ind|0|0|NW_Onehost_System|ind|ind|ind|ind|2|0|NW_CreateDBandLoad|ind|ind|ind|ind|10|0|NW_CreateDB|ind|ind|ind|ind|0|0|NW_OraDBCheck|ind|ind|ind|ind|0|0|NW_OraDBMain|ind|ind|ind|ind|0|0|NW_OraDBStd|ind|ind|ind|ind|3|0|NW_OraDbBuild|ind|ind|ind|ind|5|0
INFO 2011-04-01 04:45:14.590
Working directory changed to /tmp/sapinst_exe.16718.1301647358.
INFO 2011-04-01 04:45:14.595
Working directory changed to /tmp/sapinst_instdir/NW04S/SYSTEM/ORA/CENTRAL/AS.
INFO 2011-04-01 04:45:14.609
Working directory changed to /tmp/sapinst_exe.16718.1301647358.
INFO 2011-04-01 04:45:14.621
Working directory changed to /tmp/sapinst_instdir/NW04S/SYSTEM/ORA/CENTRAL/AS.
INFO 2011-04-01 04:45:14.850
Account oraac5 already exists.
INFO 2011-04-01 04:45:14.852
Account dba already exists.
INFO 2011-04-01 04:45:14.852
Account oraac5 already exists.
INFO 2011-04-01 04:45:14.853
Account dba already exists.
INFO 2011-04-01 04:45:14.867
Working directory changed to /tmp/sapinst_exe.16718.1301647358.
INFO 2011-04-01 04:45:14.899
Working directory changed to /tmp/sapinst_instdir/NW04S/SYSTEM/ORA/CENTRAL/AS.
ERROR 2011-04-01 04:45:32.280
CJS-00084 SQL statement or script failed. DIAGNOSIS: Error message: ORA-27102: out of memory
SVR4 Error: 22: Invalid argument
Disconnected
. SOLUTION: See ora_sql_results.log and the Oracle documentation for details.
ERROR 2011-04-01 04:45:32.286
MUT-03025 Caught ESAPinstException in Modulecall: ORA-27102: out of memory
SVR4 Error: 22: Invalid argument
Disconnected
ERROR 2011-04-01 04:45:32.453
FCO-00011 The step createDatabase with step key |NW_Onehost|ind|ind|ind|ind|0|0|NW_Onehost_System|ind|ind|ind|ind|2|0|NW_CreateDBandLoad|ind|ind|ind|ind|10|0|NW_CreateDB|ind|ind|ind|ind|0|0|NW_OraDBCheck|ind|ind|ind|ind|0|0|NW_OraDBMain|ind|ind|ind|ind|0|0|NW_OraDBStd|ind|ind|ind|ind|3|0|NW_OraDbBuild|ind|ind|ind|ind|5|0|createDatabase was executed with status ERROR ( Last error reported by the step :Caught ESAPinstException in Modulecall: ORA-27102: out of memory
SVR4 Error: 22: Invalid argument
Disconnected
ora_sql_results.log
04:45:15 SAPINST ORACLE start logging for
SHUTDOWN IMMEDIATE;
exit;
Output of SQL executing program:
SQL*Plus: Release 10.2.0.4.0 - Production on Fri Apr 1 04:45:15 2011
Copyright (c) 1982, 2007, Oracle. All Rights Reserved.
Connected to an idle instance.
ORA-01034: ORACLE not available
ORA-27101: shared memory realm does not exist
SVR4 Error: 2: No such file or directory
Disconnected
SAPINST: End of output of SQL executing program /oracle/AC5/102_64/bin/sqlplus.
SAPINST found errors.
SAPINST The current process environment may be found in sapinst_ora_environment.log.
2011-04-01, 04:45:15 SAPINST ORACLE stop logging
================================================================================
2011-04-01, 04:45:15 SAPINST ORACLE start logging for
STARTUP NOMOUNT;
exit;
Output of SQL executing program:
SQL*Plus: Release 10.2.0.4.0 - Production on Fri Apr 1 04:45:15 2011
Copyright (c) 1982, 2007, Oracle. All Rights Reserved.
Connected to an idle instance.
ORA-27102: out of memory
SVR4 Error: 22: Invalid argument
Disconnected
SAPINST: End of output of SQL executing program /oracle/AC5/102_64/bin/sqlplus.
SAPINST found errors.
SAPINST The current process environment may be found in sapinst_ora_environment.log.
2011-04-01, 04:45:32 SAPINST ORACLE stop logging
Already viewed S-note
724713 - parameter settings for Solaris 10 - (Parameters are set as per this note)
743328 - Composite SAP note: ORA-27102 - (no much information in memory problem on zones)
Please provide your suggestions/resolution.
Thank you.
Hi,
@ Sunny: Thanks for response, the referred note was already checked and parameters are in sync as per note.
@Mohit: SAP wouldn't proceed to the create-database step if the Oracle software were not installed. Thanks for the response.
@Markus: Thanks, I agree with you, but I have a doubt in this area. Isn't project.max-shm-memory the new parameter we need to set in the local zone, rather than using shmsys:shminfo_shmmax in /etc/system? Do we still need to maintain this parameter in /etc/system in the global zone?
As per SUN doc below parameter was obsolete from Solaris 10.
The following parameters are obsolete.
■ shmsys:shminfo_shmmni
■ shmsys:shminfo_shmmax
As per your suggestion, do we need to set the below parameters in that case? Please clarify.
Parameter              | Replaced by resource control | Recommended value
semsys:seminfo_semmni  | project.max-sem-ids          | 100
semsys:seminfo_semmsl  | process.max-sem-nsems        | 256
shmsys:shminfo_shmmax  | project.max-shm-memory       | 4294967295
shmsys:shminfo_shmmni  | project.max-shm-ids          | 100
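On Solaris 10 those resource controls are attached to the Oracle owner's project rather than set in /etc/system. A sketch of setting and checking the control; the project name user.oraac5 and the 8G value are assumptions to adapt to your system:

```shell
# Attach the shared-memory cap to the project the Oracle user runs under
# (project name and size here are examples, not taken from this system).
projmod -sK "project.max-shm-memory=(privileged,8G,deny)" user.oraac5

# Verify the value currently in force for that project.
prctl -n project.max-shm-memory -i project user.oraac5
```

ORA-27102 with "SVR4 Error: 22: Invalid argument" at STARTUP typically means the SGA requested by the instance exceeds this cap, so the value must be at least as large as the SGA.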
Also findings of /etc/release
more /etc/release
Solaris 10 10/08 s10s_u6wos_07b SPARC
Regards,
Sitarama. -
Acrobat XI Pro "Out of Memory" error after Office 2010 install
Good Afternoon,
We recently pushed Office 2010 to our users and are now getting reports of previous installs of Adobe Acrobat XI Pro no longer working but throwing "Out of Memory" errors.
We are in a Windows XP environment. All machines are HP 8440p/6930p/6910 with the same Service pack level (3) and all up to date on security patches.
All machines are running Office 2010 SP1.
All machines have 2GB or 4GB of RAM (Only 3.25GB recognized as we are a 32bit OS environment).
All machines have adequate free space (ranging from 50gb to 200gb of free space).
All machines are set to 4096mb initial page file size with 8192mb maximum page file size.
All machines with Acrobat XI Pro *DO NOT* have Reader XI installed alongside. If Reader is installed, it is Reader 10.1 or higher.
The following troubleshooting steps have been taken:
Verify page file size (4096mb - 8192mb).
Deleted local user and Windows temp files (%temp% and c:\WINDOWS\Temp both emptied).
Repair on Adobe Acrobat XI Pro install. No change.
Uninstall Acrobat Pro XI, reboot, re-install. No change.
Uninstall Acrobat Pro XI Pro along with *ALL* other Adobe applications presently installed (Flash Player, Air), delete all Adobe folders and files found in a full search of the C drive, delete all orphaned Registry entries for all Adobe products, re-empty all temp folders, reboot.
Re-install Adobe Acrobat XI Pro. No change.
Disable enhanced security in Acrobat XI Pro. No change.
Renamed Acrobat XI's plug_ins folder to plug_ins.old.
You *can* get Acrobat to open once this is done but when you attempt to edit a file or enter data into a form, you get the message, "The "Updater" plug-in has been removed. Please re-install Acrobat to continue viewing the current file."
A repair on the Office 2010 install and re-installing Office 2010 also had no effect.
At this point, short of re-imaging the machines (which is *not* an option), we are stumped.
We have not yet tried rolling back a user to Office 2007 as the upgrade initiative is enterprise-wide and rolling back would not be considered a solution.
Anyone have any ideas beyond what has been tried so far?
As mentioned, the TEMP folder is typically the problem. MS limits the size of this folder and you have 2 choices: 1. empty it or 2. increase the size limit. I am not positive this is the issue, but it does crop up at times. It does not matter how big your hard drive is; it is a matter of the amount of space that MS has allocated for virtual memory. I am surprised that there is an issue with that much RAM, but MS is real good at letting you know you can't have it all for use because you might want to open up something else. That is why a lot of big packages turn off some of the limits of Windows or use Linux.
-
Large Pdf using XML XSL - Out of Memory Error
Hi Friends.
I am trying to generate a PDF from XML, XSL and FO in Java. It works fine if the PDF to be generated is small.
But if the PDF to be generated is big, it throws an "Out of Memory" error. Can someone please give me some pointers about the possible reasons for this error? Thanks for your help.
RM
Code:
import java.io.*;
import javax.servlet.*;
import javax.servlet.http.*;
import org.xml.sax.InputSource;
import org.xml.sax.XMLReader;
import org.apache.fop.apps.Driver;
import org.apache.fop.apps.Version;
import org.apache.fop.apps.XSLTInputHandler;
import org.apache.fop.messaging.MessageHandler;
import org.apache.avalon.framework.logger.ConsoleLogger;
import org.apache.avalon.framework.logger.Logger;
public class PdfServlet extends HttpServlet {
public static final String FO_REQUEST_PARAM = "fo";
public static final String XML_REQUEST_PARAM = "xml";
public static final String XSL_REQUEST_PARAM = "xsl";
Logger log = null;
Com_BUtil myBu = new Com_BUtil();
public void doGet(HttpServletRequest request,
HttpServletResponse response) throws ServletException {
if (log == null) {
log = new ConsoleLogger(ConsoleLogger.LEVEL_WARN);
MessageHandler.setScreenLogger(log);
}
try {
String foParam = request.getParameter(FO_REQUEST_PARAM);
String xmlParam = myBu.getConfigVal("filePath") +"/"+request.getParameter(XML_REQUEST_PARAM);
String xslParam = myBu.SERVERROOT + "/jsp/servlet/"+request.getParameter(XSL_REQUEST_PARAM)+".xsl";
if((xmlParam != null) && (xslParam != null)) {
XSLTInputHandler input = new XSLTInputHandler(new File(xmlParam), new File(xslParam));
renderXML(input, response);
} else {
PrintWriter out = response.getWriter();
out.println("<html><head><title>Error</title></head>\n" +
"<body><h1>PdfServlet Error</h1><h3>No 'fo' request param given.</h3></body></html>");
} catch (ServletException ex) {
throw ex;
catch (Exception ex) {
throw new ServletException(ex);
public void renderXML(XSLTInputHandler input,
HttpServletResponse response) throws ServletException {
try {
ByteArrayOutputStream out = new ByteArrayOutputStream();
response.setContentType("application/pdf");
Driver driver = new Driver();
driver.setLogger(log);
driver.setRenderer(Driver.RENDER_PDF);
driver.setOutputStream(out);
driver.render(input.getParser(), input.getInputSource());
byte[] content = out.toByteArray();
response.setContentLength(content.length);
response.getOutputStream().write(content);
response.getOutputStream().flush();
} catch (Exception ex) {
throw new ServletException(ex);
/**
* creates a SAX parser, using the value of org.xml.sax.parser
* defaulting to org.apache.xerces.parsers.SAXParser
* @return the created SAX parser
*/
static XMLReader createParser() throws ServletException {
String parserClassName = System.getProperty("org.xml.sax.parser");
if (parserClassName == null) {
parserClassName = "org.apache.xerces.parsers.SAXParser";
try {
return (XMLReader) Class.forName(parserClassName).newInstance();
} catch (Exception e) {
throw new ServletException(e);

Hi,
I did try that initially. After executing the command I get this message.
C:\>java -Xms128M -Xmx256M
Usage: java [-options] class [args...]
(to execute a class)
or java -jar [-options] jarfile [args...]
(to execute a jar file)
where options include:
-cp -classpath <directories and zip/jar files separated by ;>
set search path for application classes and resources
-D<name>=<value>
set a system property
-verbose[:class|gc|jni]
enable verbose output
-version print product version and exit
-showversion print product version and continue
-? -help print this help message
-X print help on non-standard options
Thanks for your help.
RM -
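The usage dump above just means the JVM was started with nothing to run: -Xms/-Xmx must be followed by a class or -jar. And since PdfServlet runs inside a servlet container, the flags belong on the container's JVM rather than on a bare java command. A sketch; com.example.Main is a made-up class name, and CATALINA_OPTS is Tomcat-specific, so adjust for your container:

```shell
# Standalone program: heap flags come before the class (or -jar) to run.
java -Xms128m -Xmx256m com.example.Main

# Servlet container (here Tomcat): put the flags in CATALINA_OPTS instead,
# so the container JVM that hosts the servlet gets the larger heap.
CATALINA_OPTS="-Xms128m -Xmx256m"
export CATALINA_OPTS
./catalina.sh start
```

Separately from the heap size, the servlet as posted renders the whole PDF into a ByteArrayOutputStream before writing it out; streaming the renderer's output directly to response.getOutputStream() would remove one full in-memory copy of the document.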
Out of Memory Error in Oracle 8i
When I try to fetch data from a view it throws an "Out of Memory" error; normally it should return 300,000 records as output using a join condition.
Can anybody please help me, as this is a bit urgent ....
Thanks.
user7725408 wrote:
When I try to fetch data from a view it throws an "Out of Memory" error; normally it should return 300,000 records as output using a join condition.
Can anybody please help me, as this is a bit urgent ....
Thanks.
There are several specific "out of memory" errors. How about an actual error code (ORA-nnnnn)? Is this on a 32-bit Windows system? -
Out of Memory error while building HTML String from a Large HashMap.
Hi,
I am building an HTML string from a large map object that consists of about 32,000 objects, using the Transformer class in Java. As this HTML string needs to be displayed in the JSP page, the response time is too high, and sometimes it throws an out of memory error.
Please let me know how I can build the library tree (folder structure) HTML string for the first set of, say, 1000 entries and display it in the web page, then detect an onScroll event, handle it in JavaScript functions, come back and build the tree for the next set of entries in the map, append this string to the previous one, and display it accordingly.
Please let me know:
1. whether the suggested solution is advisable,
2. how to build the tree (HTML string) for a set of entries in the map while iterating over the map,
3. how to detect an onScroll event and handle it.
Note : Handling the events in the JavaScript functions and displaying the tree is now being done using AJAX.
Thanks for help in Advance,
Kartheek
Hi
Sorry,
I haven't seen any error in the browser, as this may be an out of memory error that was not handled. I got the following error from the WebLogic console:
org.apache.struts.actions.DispatchAction">Dispatch[serviceCenterHome] to method 'getUserLibraryTree' returned an exceptionjava.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:324)
at org.apache.struts.actions.DispatchAction.dispatchMethod(DispatchAction.java:276)
at org.apache.struts.actions.DispatchAction.execute(DispatchAction.java:196)
at org.apache.struts.action.RequestProcessor.processActionPerform(RequestProcessor.java:421)
at org.apache.struts.action.RequestProcessor.process(RequestProcessor.java:226)
at org.apache.struts.action.ActionServlet.process(ActionServlet.java:1164)
at org.apache.struts.action.ActionServlet.doPost(ActionServlet.java:415)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:760)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
at weblogic.servlet.internal.ServletStubImpl$ServletInvocationAction.run(ServletStubImpl.java:996)
at weblogic.servlet.internal.ServletStubImpl.invokeServlet(ServletStubImpl.java:419)
at weblogic.servlet.internal.ServletStubImpl.invokeServlet(ServletStubImpl.java:315)
at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:6452)
at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:118)
at weblogic.servlet.internal.WebAppServletContext.invokeServlet(WebAppServletContext.java:3661)
at weblogic.servlet.internal.ServletRequestImpl.execute(ServletRequestImpl.java:2630)
at weblogic.kernel.ExecuteThread.execute(ExecuteThread.java:219)
at weblogic.kernel.ExecuteThread.run(ExecuteThread.java:178)
Caused by: java.lang.OutOfMemoryError
</L_MSG>
<L_MSG MN="ILHD-1109" PID="adminserver" TID="ExecuteThread: '14' for queue: 'weblogic.kernel.Default'" DT="2012/04/18 7:56:17:146" PT="WARN" AP="" DN="" SN="" SR="org.apache.struts.action.RequestProcessor">Unhandled Exception thrown: class javax.servlet.ServletException</L_MSG>
<Apr 18, 2012 7:56:17 AM CDT> <Error> <HTTP> <BEA-101017> <[ServletContext(id=26367546,name=fcsi,context-path=/fcsi)] Root cause of ServletException.
*java.lang.OutOfMemoryError*
Please Advise.
Thanks for your help in advance,
Kartheek -
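The "first 1000 entries, then the next set" idea from the question above can be sketched with plain JDK collections: keep the map in a stable iteration order (LinkedHashMap) and hand the AJAX endpoint one slice per call. Class and method names here are made up for illustration:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class TreePager {

    // Returns the entries for one page of the tree, so the JSP/AJAX layer
    // can render e.g. 1000 nodes per request instead of all 32,000 at once.
    public static <K, V> List<Map.Entry<K, V>> page(
            LinkedHashMap<K, V> map, int pageIndex, int pageSize) {
        List<Map.Entry<K, V>> slice = new ArrayList<>();
        int start = pageIndex * pageSize;
        int i = 0;
        for (Map.Entry<K, V> e : map.entrySet()) {
            if (i >= start + pageSize) break; // past this page: stop early
            if (i >= start) slice.add(e);     // inside this page: collect
            i++;
        }
        return slice;
    }
}
```

Each onScroll request then asks for page(map, nextPageIndex, 1000), renders only that slice to HTML, and appends the fragment to the tree already in the page, so neither the server nor the browser ever holds the full 32,000-node string at once.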
JavaScript Out of Memory Error on Portal timeout.
Hello All,
I am using JSF and inline navigation in all our portlets, and when the user leaves the browser idle until the portal timeout we have 2 problems. 1: The login portlet shows in that specific portlet. 2: We get a JavaScript alert saying out of memory at line 40, and the portlet shows the error message "Gateway was not able to access requested content. If the error persists, contact your portal Administrator."
We are using Plumtree 5.0.4 Java version.
any help is highly appreciated.
Thanks
A.J.
Both are valid behaviors unfortunately.
1) login portlet is showing up in specific portlet b/c inline navigation allows for you to create and load pages without affecting the overall portal.
This happens when you use iframes (which behave in a similar fashion).
- your only workaround is really to write some javascript function to "listen" to the portal login page getting loaded and then throwing the session into the parent browser (which is Portal). At least this is the only solution that I ever came up with when using Iframes.
2) Don't know about out of memory error actually, but getting the "gatewy was not able to access requested content" is valid b/c the session died.
- javascript errors require javascript solutions. Sorry I couldn't be more helpful than that.
Maybe someone else will have better suggestions.
The other suggestion is to use your app server to listen to the logout event and redirect appropriately to somewhere else, or have it do what you want it to do in situations as this. -
Scaling images and Out of memory error
Hi all,
Does anyone know why this code throws an out of memory error?
It loads an image (2048x1166 pixels) and saves it at bufi1 (BufferedImage). After that:
1- Rescale bufi1 to bufi13A (x3).
2. Rescale bufi1 to bufi12 (x2).
3. Rescale bufi1 to bufi13B (x3).
At 3, the code throws an oome. Why?
Thanks in advance!
import java.io.*;
import javax.imageio.*;
import java.awt.geom.*;
import java.awt.image.*;
public class TestScalePercent {
public static void main(String[] args) {
TestScalePercent tsp=new TestScalePercent();
BufferedImage bufi1=null;
try {
bufi1 = ImageIO.read(new File("foo.png"));//2048x1166 pixels
} catch (Exception e) {
e.printStackTrace();
}
BufferedImage bufi13A=tsp.scale(bufi1,3,3);//--> OK
BufferedImage bufi12=tsp.scale(bufi1,2,2);//--> OK
BufferedImage bufi13B=tsp.scale(bufi1,3,3);//-->OOM error!
public BufferedImage scale(BufferedImage bufiSource, double scaleX, double scaleY){
AffineTransform tx = new AffineTransform();
tx.scale(scaleX,scaleY);
AffineTransformOp op = new AffineTransformOp(tx, AffineTransformOp.TYPE_NEAREST_NEIGHBOR);
BufferedImage bufiop=op.filter(bufiSource, null);//--> Creates the OOM error...
return op.filter(bufiop, null);
}How much memory does your machine have?
That image is quite large. If my math is correct, and assuming the image has 32-bit color (4 bytes per pixel), the original image takes up about 9 MB of memory by itself, and each scaled copy is several times larger. Then you are trying to create three other versions of it.
It isn't lying to you; it is indeed probably running out of memory to hold the images.
OK. Now I'm wondering: is it possible to free memory between bufi13A and bufi12, and between bufi12 and bufi13B? I've tried invoking the garbage collector but nothing happens...
Thanks! -
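For the scaling question above, a rough heap estimate helps. Note that scale() in the posted code calls op.filter twice, so each "x3" request actually produces a x9 image. A minimal sketch, assuming a 4-bytes-per-pixel layout (actual BufferedImage layouts vary by image type):

```java
public class ImageMemoryEstimate {
    // Approximate heap bytes for an image backed by 4 bytes per pixel.
    static long bytesFor(int width, int height) {
        return (long) width * height * 4;
    }

    public static void main(String[] args) {
        // Original 2048x1166 image: ~9 MB.
        System.out.println(bytesFor(2048, 1166));          // 9551872
        // One x3 filter pass: ~86 MB.
        System.out.println(bytesFor(2048 * 3, 1166 * 3));  // 85966848
        // Filtering that result again (as scale() does): x9, ~774 MB.
        System.out.println(bytesFor(2048 * 9, 1166 * 9));  // 773701632
    }
}
```

Even without the double filter, holding the original plus two x3 copies and one x2 copy needs well over 200 MB, far beyond the small default heaps of older JVMs; raising -Xmx or dropping references (bufi13A = null;) before the next scale so the GC can reclaim them is the usual fix.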
Out of memory Error with jdk 1.6
Hello,
I have a Swing application launched on the client with the help of Java Web Start. The application works fine in JRE 1.4 and JRE 1.5. The heap sizes are:
initial-heap-size="5m" max-heap-size="24m"
But when I run this using JRE 1.6.0_05-b13, I get an OutOfMemoryError (Java heap space). And I see the memory usage growing rapidly, which I didn't notice in the other JRE versions (1.4 and 1.5).
Does anyone have any idea on this?
Thanks in advance,
MR.
Thanks for your response, Peter. During my continued testing I identified that the error happens on JDK 1.5 also. I have increased the min-heap-size to 24 MB and the max-heap-size to 64 MB, but I still see the out of memory error. The interesting thing is that the heap never grows beyond the initial 24 MB, and there is a lot of free memory as well.
Memory: 24,448K Free: 12,714K (52%) ... completed.
The OutOfMemoryError is triggered from the reader thread, which reads data from the InputStream. One thing is that we are continuously pushing more data onto the output stream from the other end. Is there any limitation on how much data an InputStream can hold?
Please throw some light on this. -
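On the reader-thread question above: a plain InputStream does not itself accumulate unbounded data; the heap fills up only if the reading code retains what it reads. A minimal sketch of bounded, chunked consumption (class and method names here are illustrative, not from the original application):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class ChunkedReader {
    // Process a stream in fixed-size chunks so heap use stays bounded,
    // instead of accumulating everything that was read into one buffer.
    static long drain(InputStream in) throws IOException {
        byte[] buf = new byte[8192];
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            total += n; // process buf[0..n) here rather than storing it
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[100_000];
        System.out.println(drain(new ByteArrayInputStream(data))); // 100000
    }
}
```

If the reader instead appends every chunk to a growing collection or StringBuffer while the writer keeps producing, the heap will eventually be exhausted regardless of the stream type.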
Another version of 'out of memory' error
Hi all
I have a colleague who is getting the following error message.
As discussed...when I attempt to render a clip on the timeline, once it
gets to the end of the rendering process and attempts to play the clip, an
'out of memory' error message box appears on the screen.
Then when I click on 'OK' to close this box, the canvas window turns to red
and the following message displays in the canvas window...'Out of memory.
Free up memory and window will redraw.'
He is using HDV footage captured with the "HDV" codec [not the intermediate one], and editing it into HDV1080i50 sequences.
He has a G5 DP 2 GHZ machine running Tiger 10.4.2 and 2 GB of ram.
He has approx 80 GB free space on Mac HD and approx 400 GB on external Lacie HD. He is running only FCP HD 5 at any one time.
I have sourced some previous posts which speak of corrupt graphics, clips, sequences and trashing fcp prefs.
Does anyone have any other suggestions for him?
[He is quite new to macs and FCP especially].
I am going to send him an email to subscribe and create an account for this forum; his name is AGebhardt.
Hello,
I had the same issue last night, when a render (only 15 min., so not THAT big) completed and would not play and the canvas turned red and said I was out of memory.
This is different than a general error! out of memory warning, which I have seen happen. Some of the answers in linked threads seem to be pointing to this other situation.
I have plenty of space and plenty of RAM and was only running iTunes. I quit iTunes and it worked, almost to my disappointment, because in the past I have had many apps running at a time with no problems.
I would be pretty bummed if I could only have FCP open all of a sudden.
I will try going through my clips to check for corruption as suggested, just to be on the safe side, but I have a question:
What good is it to throw out your render files if you have already checked that you have enough storage space? I can see the good if a file is corrupt, but a new render folder is created with every project, and unless there is a limit on these folders that I'm unaware of, I can't see the sense in trashing the folder.
Am I missing something?
Thanks,
Jesse
733 G4 -
Hi,
At my job, we seem to have a memory leak related to JNI. We know we
have a memory leak because we keep getting Out of Memory errors even
after increasing the maximum heap size to more than 256 megs. And we
know that this is the application that is eating up all the system
memory.
We are running under Windows 2000, with both JDK 1.3.0 and JDK 1.4.1.
We tried looking at the problem under JProbe, but it shows a stable
Java heap (no problems, except that the Windows task manager shows it
growing and growing...)
I tried a stripped-down version, where I set the max heap size to 1 MB
and printed the total memory, memory used, and maximum memory used at
5-second intervals.
Memory used = Runtime.getRuntime().totalMemory() - Runtime.getRuntime().freeMemory().
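That formula can be wrapped in a tiny monitor, e.g.:

```java
public class HeapMonitor {
    // Snapshot of JVM heap usage, matching the formula above:
    // used = totalMemory() - freeMemory(). This only covers the Java
    // heap; native memory allocated by JNI code is invisible here.
    static long usedHeap() {
        Runtime rt = Runtime.getRuntime();
        return rt.totalMemory() - rt.freeMemory();
    }

    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        System.out.println("used=" + usedHeap()
                + " total=" + rt.totalMemory()
                + " max=" + rt.maxMemory());
    }
}
```

Comparing this figure against the process size in the Windows Task Manager is exactly how a native-side (JNI) leak shows up: the Java heap stays flat while the process keeps growing.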
Well, I let that stripped-down version run for about a day. The
maximum memory used stabilized at about 1.1 MB and has not
increased. The current memory used increases until it reaches some
threshold, then it decreases again. However, the Windows Task Manager
shows the memory increasing -- and currently it is at 245 MB!
In the lab, the behavior we see with the Windows task manager is as
follows:
1. Total memory used in the system increases until some threshold.
2. Then it goes back down by about 100 Megs.
3. This cycle continues, but at each cycle the memory goes back down
less and less, until the app crashes.
Now, our theory is that JNI is consuming all this memory (maybe we are
using JNI wrong). That's the only explanation we can come up with to
explain what we have seen (Java showing an OK heap, but the task
manager showing it growing, until crashing).
Does that make sense? Can the new operator throw an Out of Memory
error if the system does not have enough memory to give it, even if
there is still free heap space as far as the Runtime object is
concerned? Does the Runtime object figure objects allocated through
JNI methods into the heap-used space?
Note that I know the task manager is not a reliable indicator.
However, I don't think a Java app is supposed to run away with system
memory -- the problem is not simply that the Java app is consuming too
much memory, but that it seems to always want more memory. Besides, we
do get the Out of Memory error.
Thanks for your help,
Nicolas Rivera
Hi,
there are two sources of memory leakage in JNI:
- regular leaks in c/c++ code;
- not released local/global references of java objects in JNI.
I think that the first issue is not a problem for you. The second is more complex. In your JNI code you should check:
- how many live local references you keep in your code, as their number is limited (about 16 by default, and it can be enlarged). Good style is not to store local references, but to keep only global ones.
- local references received from a Java call into JNI are released by the JVM, but you should release any local references you create in JNI yourself, as well as global references.
- do not use a large number of Java object references. Each new reference takes memory in the JVM. Try to reuse references.
I guess that is your problem.
Vitally -
Out of memory error - large project
I'm consulting on a feature doc edit, and the primary editor (Avid guy) is having serious problems accessing anything from the original project.
It's an hour and 15 minute show, with probably close to 100 hours of footage.
The box is a D2.3 G5 with 1.5 g of RAM, and the media is on two G-Tech drives: a G-RAID and a G-Drive. Plenty of headroom on both (now) and the system drive is brand new, having been replaced after the original died, and there's nothing loaded on it but FC Studio. The FCP version is 5.1.4. The project file is well over 100 MB.
We started getting Out of Memory errors with this large project, and I checked all of the usual suspects: CMYK graphics, hard drive space, sufficient RAM... all checked out okay, except possibly the less-than-ideal amount of RAM.
I copied the important sequences and a couple of select bins to a new project, and everything seems workable for now. The project is still 90 MB, and I've suggested breaking it up into separate projects and working on it as reels, but we can edit and trim efficiently at the moment. However, the other editor has gotten to the point where he can't even open bins in the old, big project. He keeps getting the OOM error whenever he tries to do anything.
I have no similar problems opening the same project on my G5, which is essentially identical except I have 2.5 G RAM (1 G extra). Can this difference in RAM really make this big a deal? Is there something else I'm missing? Why can't this editor access even bins from the other project?
G4 Mac OS X (10.2.x)
Shane's spot on.
What I often do with large projects is pare down, just as you have done. But 90 MB out of 100 is not a big pare-down by any stretch. In the new copy, throw away EVERYTHING that's outdated: old sequences are the big culprit. Also toss any render files and re-render.
Remember that, to be effective, FCP keeps EVERYTHING in RAM so that it can instantly access anything in your project. The more there is to keep track of, the slower you get. -
Getting HeapDump on out of memory error when executing method through JNI
I have a C++ code that executes a method inside the jvm through the JNI.
I have a memory leak in my Java code that results in an out of memory error. This exception is caught in my C++ code, and as a result the heap dump is not created on disk.
I am running the jvm with
-XX:+HeapDumpOnOutOfMemoryError
-XX:HeapDumpPath=C:\x.hprof
Any suggestions?
Thanks
I'll rephrase it then.
I have a Java class named PbsExecuter with one static method, ExecuteCommand.
I am calling this method through JNI (using CallStaticObjectMethod). Sometimes this method causes the JVM to throw an OutOfMemoryError, and I would like to get a heap dump on disk when this happens in order to locate my memory leak.
I've started the JVM with JNI_CreateJavaVM, and I've put two options inside the JavaVMInitArgs used to create the JVM: -XX:+HeapDumpOnOutOfMemoryError and -XX:HeapDumpPath=C:\x.hprof,
which are supposed to create a heap dump on disk when an OutOfMemoryError occurs.
Normally, if I executed plain Java code and didn't catch this exception, the JVM would crash and the heap dump would be created on disk.
Since I need to handle errors in my C++ code, I use ExceptionOccurred(), extract the exception message from the exception itself, and write it out.
For some reason, when I execute this method through JNI, it doesn't create the dump. -
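A possible workaround for the thread above: since the C++ caller catches the OutOfMemoryError before the JVM aborts, the automatic dump may never be written, but the embedding code can call back into a small Java helper that triggers the dump explicitly through the HotSpot diagnostic MXBean. A sketch (HotSpot-specific; the output path is illustrative):

```java
import java.lang.management.ManagementFactory;
import com.sun.management.HotSpotDiagnosticMXBean;

public class DumpOnDemand {
    // Write a heap dump via the HotSpot diagnostic MXBean.
    // live=true dumps only reachable objects.
    public static void dumpHeap(String path, boolean live) throws Exception {
        HotSpotDiagnosticMXBean bean = ManagementFactory.newPlatformMXBeanProxy(
                ManagementFactory.getPlatformMBeanServer(),
                "com.sun.management:type=HotSpotDiagnostic",
                HotSpotDiagnosticMXBean.class);
        bean.dumpHeap(path, live); // fails if the file already exists
    }

    public static void main(String[] args) throws Exception {
        // Timestamped name so an existing file never blocks the dump.
        String path = System.getProperty("java.io.tmpdir")
                + "/dump-" + System.nanoTime() + ".hprof";
        dumpHeap(path, true);
        System.out.println("heap dump written to " + path);
    }
}
```

The C++ side would invoke DumpOnDemand.dumpHeap (e.g. via CallStaticVoidMethod) after ExceptionOccurred() reports the OutOfMemoryError, before clearing the exception.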
Out of memory error when writing large file
I have the piece of code below, which works fine for writing small files, but when it encounters much larger files (>80 MB), the JVM throws an out of memory error.
I believe it has something to do with the stream classes. If I replace my PrintStream reference with the System.out object (commented out below), it runs fine.
Anyone else encountered this before?
try {
    print = new PrintStream(new FileOutputStream(new File(a_persistDir, getCacheFilename()), false));
    // print = System.out;
    for (Iterator strings = m_lookupTable.keySet().iterator(); strings.hasNext(); ) {
        StringBuffer sb = new StringBuffer();
        String string = (String) strings.next();
        String id = string;
        sb.append(string).append(KEY_VALUE_SEPARATOR);
        Collection ids = (Collection) m_lookupTable.get(id);
        for (Iterator idsIter = ids.iterator(); idsIter.hasNext(); ) {
            IBlockingResult blockingResult = (IBlockingResult) idsIter.next();
            sb.append(blockingResult.getId()).append(VALUE_SEPARATOR);
        }
        print.println(sb.toString());
        print.flush();
    }
} catch (IOException e) {
} finally {
    if (print != null)
        print.close();
}
Yes, my first version would just print the strings as I got them, but it was running out of memory then as well. I thought of constructing a StringBuffer first because I was afraid that the PrintStream wasn't allocating the memory correctly.
I've also tried flushing the PrintStream after every line is written but I still run into trouble.
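For comparison, writing each entry straight to a buffered writer keeps only one line in memory at a time; if the error persists even then, the lookup table itself is the likely culprit, not the stream. A sketch under assumed types (the Map&lt;String, List&lt;String&gt;&gt; shape and separator values are illustrative, not the original IBlockingResult types):

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.io.StringWriter;
import java.io.Writer;
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class LookupTableWriter {
    // Illustrative separators, mirroring the constants in the post.
    static final String KEY_VALUE_SEPARATOR = "=";
    static final String VALUE_SEPARATOR = ",";

    // Stream each key/value pair directly to the writer so the whole
    // table is never materialized as one large string in memory.
    static void write(Map<String, List<String>> table, Writer out) throws IOException {
        BufferedWriter w = new BufferedWriter(out);
        for (Map.Entry<String, List<String>> e : table.entrySet()) {
            w.write(e.getKey());
            w.write(KEY_VALUE_SEPARATOR);
            for (String id : e.getValue()) {
                w.write(id);
                w.write(VALUE_SEPARATOR);
            }
            w.write("\n");
        }
        w.flush();
    }

    public static void main(String[] args) throws IOException {
        Map<String, List<String>> table = new LinkedHashMap<>();
        table.put("k1", Arrays.asList("a", "b"));
        StringWriter sw = new StringWriter();
        write(table, sw);
        System.out.print(sw); // k1=a,b,
    }
}
```

Flushing after every line, as tried above, bounds the stream's buffer but does nothing for memory held by the table, which is why it didn't help.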