Confusion on Java heap memory and WLS_FORMS

Hello all,
Background first:
Oracle Forms/Reports 11.1.2 64-bit
WebLogic Server 10.3.6
JDK 1.6 update 37 64-bit
Microsoft Windows 2008 R2
Using nodemanager to start/stop managed servers
After having read all of the documentation and searched both this forum and the Internet for advice, I'm still utterly confused about the best way to make use of memory on the server (the server I'm working on now has 8 GB). The two trains of thought I have found in my search:
1). Don't change the Java heap size at all (stick with the defaults) and just create additional managed servers on the same machine.
2). Increase the Java heap size for WLS_FORMS.
Having said that, here are my questions:
A). What is the best-practices approach (#1 or #2)?
B). If it's #2, what's the approved way to increase the heap size? I have tried adding -Xms and -Xmx arguments to the WLS server start arguments in the WLS console. These are applied when the managed server is started (confirmed in the log file), but because of the way WLS_FORMS is started, more -Xms and -Xmx arguments are appended after mine, and Java uses the last occurrence when an option is duplicated.
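A minimal sketch (assuming only the standard JDK; the class name is just illustrative) that prints the heap ceiling the JVM actually ended up with, plus the full ordered list of JVM arguments, so duplicate -Xms/-Xmx entries and the one that "won" are both visible:

    import java.lang.management.ManagementFactory;

    public class ShowHeapSettings {
        public static void main(String[] args) {
            // Effective heap ceiling; reflects the last -Xmx on the command line
            System.out.println("Max heap: " + Runtime.getRuntime().maxMemory() / (1024 * 1024) + " MB");
            // Every -X/-XX/-D argument the JVM was started with, in order
            System.out.println("JVM args: " + ManagementFactory.getRuntimeMXBean().getInputArguments());
        }
    }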
First update: Question #2 seems to be answered by support note 1260074.1 (the one place I hadn't yet looked).
Thanks for any insight you can provide. If there's a document I've missed somewhere, I'm happy to be told where it is and will read and summarize findings here.
Regards,
John

John,
Let me try to comment on each of yours:
1). We had been getting some "Apache unable to contact the forms server" type errors (the users were seeing the "Failure of server APACHE bridge" error). The log files showed nothing of interest. I increased the memory allocated using setDomainEnv.cmd, and the error seems to have gone away. Yes, I know that was a shotgun approach, trying something without really having a reason to do so, but it seems to have helped.
Edit: Now that I review the OHS logs instead of the WLS_FORMS logs, I have found log messages, which lead me to Doc 1380762.1, which tells me I need a patch. DOH. And, oh crikey, Forms 11.1.2.1 is out; it came out shortly after we downloaded 11.1.2.0 to create these environments. Good news/bad news kind of thing...
<blockquote>The Apache Bridge error is fairly straightforward if you understand what it is telling you. It is an error generated by mod_wl_ohs, which is owned by OHS (Apache). This module is responsible for the connection between OHS and WLS. The Apache Bridge error means that OHS (mod_wl_ohs) was unable to get a response from the WLS managed server it was calling. Basically it was unable to cross the bridge ;) The cause could be anything from the managed server not running, to the managed server being overloaded, to a network configuration issue where the managed server simply didn't hear OHS calling.
This is all discussed in MOS note 1304095.1
As for 11.1.2.1, this can be installed fresh or as a patch over 11.1.2.0. So for machines that don't currently have anything installed, you can go directly to 11.1.2.1 without having to install 11.1.2.0 first.</blockquote>
2). As tony.g suggested, we are looking for what we should do to solve the "I have n servers with x GB of RAM; what should I do to the out-of-the-box configuration of Forms for stability" question. <blockquote>As I mentioned, there really are no Forms-specific tweaks related to how much RAM your machine has. The only (somewhat indirect) exception to this is JVM Pooling. JVM Pooling can reduce the size of each runtime process's memory footprint by moving its Java calls to the JVM pool and then sharing common requests with other running runtimes. Memory usage by OHS or the WLS managed server really has little to do directly with Forms. Specifically for the managed server, from a Forms point of view, I would not expect the memory cost of WLS_FORMS to increase much because of load. I expect it to increase as concurrent load increases, but I would not expect the increase to be significant. If I had to guess, an increase of 1 MB or less per user would not surprise me (this is just a guess - I don't know what the expected values would be). If we were to use our (Oracle) older scalability guidelines, we would typically have suggested about 100 sessions per JVM for best performance. Given that v11 uses a newer Java version and scalability is better today, I suspect you can easily scale to a few hundred sessions (e.g. 300) or so before performance drops off. Beyond that, adding more managed servers would likely be necessary.
This is discussed in MOS note 989118.1</blockquote>
3). HA is important to us, so we are implementing a cluster of Forms/Reports servers with an LBR in front of it. I have read the docs on clustering and on cloning a managed server, and, via Support, how to increase the heap memory for the WLS_FORMS server. My thought process was "if Oracle gives me instructions on how to increase heap memory and how to clone managed servers, there must be a scenario in which doing so provides benefit." I'm trying to understand the scenarios in which we would do either of those activities. <blockquote>Refer to the note I mentioned above. Generally, if you limit the number of concurrent sessions to fewer than around 300-400, I would think the default settings should be fine. If you would like to go beyond 300 or 400 per managed server, then you will likely need to increase the max heap for the managed server. Again, refer to the note I mentioned previously.
Also see MOS note 1260074.1</blockquote>
I am aware of the JVM pooling (yes we do call out to Reports) - I've yet to implement this, but it's on my to-do list.
<blockquote>This is discussed in the Forms Deployment Guide: http://docs.oracle.com/cd/E38115_01/doc.111210/e24477/jvm.htm</blockquote>
Hope that helps ;)

Similar Messages

  • Is there a way to define the ideal size of the java heap memory?

    Hello all!
    Is there a way to define the ideal size of the Java heap memory? I'm using a server with (IR, FR, WA) installed, running Windows Server 2008 R2 with 32 GB of RAM. I have another server with the same configuration running Essbase. How can I set the heap memory? I have around 250 users (not simultaneous).
    Regards,
    Rafael Melo

    For 2008 R2, which is 64-bit, you can use the following.
    For FR, in the Windows registry under
    HKEY_LOCAL_MACHINE\SOFTWARE\Hyperion Solutions\Hyperion Reports\HyS9FRReport
    Xms and Xmx can be set to 1536 each.
    For Workspace:
    Start the "Start Workspace Agent UI" service and open Configuration Management
    Console (CMC) via http://localhost:55000/cmc/index.jsp
    For the Workspace Agent / Common Services Java heap size, you can set
    Xms and Xmx to 1024 each.

  • Java heap memory errors..

    I made a spider program, but it runs out of memory after ~200 websites.
    After that it crashes, or becomes so slow it's useless.
    The problem is in this Spider class, which can't be garbage collected (but I don't see any reason why not).
    The profilers I have seen show that memory is allocated and can't be garbage collected, but not where. Can anyone give suggestions on what could be wrong with this class, or tell me how they fix memory problems?
    peter
    import java.util.*;
    import java.net.*;
    import java.net.MalformedURLException;
    import java.net.URL;
    import java.net.URLConnection;
    import java.io.File;
    import java.io.FileNotFoundException;
    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.PrintWriter;
    import org.htmlparser.Node;
    import org.htmlparser.NodeFilter;
    import org.htmlparser.Parser;
    import org.htmlparser.PrototypicalNodeFactory;
    import org.htmlparser.tags.BaseHrefTag;
    import org.htmlparser.tags.FrameTag;
    import org.htmlparser.tags.TitleTag;
    import org.htmlparser.tags.HeadingTag;
    import org.htmlparser.tags.LinkTag;
    import org.htmlparser.tags.MetaTag;
    import org.htmlparser.util.EncodingChangeException;
    import org.htmlparser.util.NodeIterator;
    import org.htmlparser.util.NodeList;
    import org.htmlparser.util.ParserException;
    import org.htmlparser.beans.StringBean;

    // Crawls one site, extracting page text, title, headings and meta keywords/description,
    // and hands the results to an ISpiderReportable callback.
    public class Spider implements Runnable {
       private URL base;
       private int siteid;
       private int companyid;
       private int MaxPaginas;          // remaining page budget for this site
       protected Collection workloadPath = new ArrayList(3);
       protected Collection workloadError = new ArrayList(3);
       protected Collection workloadWaiting = new ArrayList(3);
       protected Collection workloadProcessed = new ArrayList(3);
       protected ISpiderReportable report;
       protected Done done;
       protected Parser mParser;
       String content = "";
       String meta = "";
       String titel = "";
       String kopjes = "";
       static private int count = 0;
       private int taskNumber;

       public Spider(int DBcompanyid, int DBsiteid, URL DBbase, ISpiderReportable report) {
          base = DBbase;
          siteid = DBsiteid;
          companyid = DBcompanyid;
          MaxPaginas = 20;
          this.report = report;
          count++;
          taskNumber = count;
          mParser = new Parser();
          PrototypicalNodeFactory factory = new PrototypicalNodeFactory();
          factory.registerTag(new LocalLinkTag());
          factory.registerTag(new LocalMetaTag());
          factory.registerTag(new LocalFrameTag());
          factory.registerTag(new LocalTitleTag());
          factory.registerTag(new LocalHeadingTag());
          mParser.setNodeFactory(factory);
       }

       public void run() {
          clear();
          report.koppelDB(siteid, companyid);
          addURL(base);
          begin();
       }

       public Collection getWorkloadPath() {
          return workloadPath;
       }

       public Collection getWorkloadError() {
          return workloadError;
       }

       public Collection getWorkloadWaiting() {
          return workloadWaiting;
       }

       public Collection getWorkloadProcessed() {
          return workloadProcessed;
       }

       public void clear() {
          getWorkloadError().clear();
          getWorkloadWaiting().clear();
          getWorkloadProcessed().clear();
          getWorkloadPath().clear();
       }

       public void addURL(URL url) {
          if (getWorkloadWaiting().contains(url))
             return;
          if (getWorkloadError().contains(url))
             return;
          if (getWorkloadProcessed().contains(url))
             return;
          if (getWorkloadPath().contains(url.getPath()))
             return;
          getWorkloadPath().add(url.getPath());
          log("PROCES: " + taskNumber + "  Adding to workload: " + url);
          getWorkloadWaiting().add(url);
          MaxPaginas--;
       }

       protected void processURL(URL Furl) throws ParserException {
          NodeList Nlist;
          getWorkloadWaiting().remove(Furl);
          getWorkloadProcessed().add(Furl);
          String url = Furl.toString();
          StringExtractor se = new StringExtractor(url);
          try {
             content = se.extractStrings();
          } catch (ParserException e) {
             e.printStackTrace();
          }
          try {
             mParser.setURL(url);
             try {
                Nlist = new NodeList();
                for (NodeIterator e = mParser.elements(); e.hasMoreNodes(); )
                   Nlist.add(e.nextNode());
             } catch (EncodingChangeException ece) {
                mParser.reset();
                Nlist = new NodeList();
                for (NodeIterator e = mParser.elements(); e.hasMoreNodes(); )
                   Nlist.add(e.nextNode());
             }
          } catch (ParserException pe) {
             String message = pe.getMessage();
             if ((null != message) && (message.endsWith("does not contain text")))
                System.out.println("Not a text file...");
             else
                throw pe;
          }
          report.writeDB(siteid, (String) Furl.getPath(), content, titel, meta, kopjes);
          // Note: these declare new local variables; the instance fields above are not reset here.
          String content = "";
          String meta = "";
          String titel = "";
          String kopjes = "";
          log("Complete: " + url);
       }

       class LocalLinkTag extends LinkTag {
          public void doSemanticAction() throws ParserException {
             if (!isHTTPLikeLink())
                return;
             String link = getLink();
             int index = link.indexOf('#');
             if (index != -1)
                link = link.substring(0, index);
             if (MaxPaginas > 1)
                handleLink(base, link);
             else
                return;
          }
       }

       class LocalFrameTag extends FrameTag {
          public void doSemanticAction() throws ParserException {
             String link = getFrameLocation();
             if (MaxPaginas > 1) {
                handleLink(base, link);
             }
          }
       }

       public class StringExtractor {
          private String resource;

          public StringExtractor(String resource) {
             this.resource = resource;
          }

          public String extractStrings() throws ParserException {
             StringBean sb = new StringBean();
             sb.setLinks(false);
             sb.setURL(resource);
             return (sb.getStrings());
          }
       }

       class LocalTitleTag extends TitleTag {
          public void doSemanticAction() throws ParserException {
             titel = getTitle();
          }
       }

       class LocalHeadingTag extends HeadingTag {
          public void doSemanticAction() throws ParserException {
             kopjes = kopjes + " " + toPlainTextString();
          }
       }

       class LocalMetaTag extends MetaTag {
          public void doSemanticAction() throws ParserException {
             String metaNaam = getMetaTagName();
             if (metaNaam != null)
                if (metaNaam.equals("keywords") || metaNaam.equals("description"))
                   meta = meta + " " + getMetaContent();
          }
       }

       public void begin() {
          while (!getWorkloadWaiting().isEmpty()) {
             Object list[] = getWorkloadWaiting().toArray();
             for (int i = 0; (i < list.length); i++) {
                try {
                   processURL((URL) list[i]);
                } catch (ParserException pe) {
                   System.out.println("Parser error:" + pe);
                   MaxPaginas++;
                }
             }
          }
       }

       protected void handleLink(URL base, String str) {
          try {
             URL url = new URL(base, str);
             if (report.spiderFoundURL(base, url)) {
                addURL(url);
             }
          } catch (MalformedURLException e) {}
       }

       public void log(String entry) {
          System.out.println(entry);
       }
    }
    It must have something to do with the inner classes and processURL, since the only code I replaced was the Swing parser, swapped for htmlparser.
    I don't expect anyone to say exactly what is wrong, but some tools and suggestions on how to solve this would be very welcome.
    thanks

    The structure is like this:
    public class test {
      public static void main(String args[]) {
         test x = new test();
         x.execute();
      }
      public void execute() {
         // size, DBcompanyid, DBsiteid and base come from elsewhere in the real program
         ThreadPool pool = new ThreadPool(10);
         for (int i = 0; i < size; i++)
            pool.assign(new Spider(DBcompanyid, DBsiteid, base, this));
         pool.complete();
      }
    }
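    To see where memory is being retained (not just that it is), a heap dump is usually more telling than a profiler's allocation view. A minimal sketch, assuming a Sun/HotSpot JDK 6; the class and output file name are just illustrative, and starting the JVM with -XX:+HeapDumpOnOutOfMemoryError produces the same kind of dump automatically at the moment of failure:

    import java.lang.management.ManagementFactory;
    import javax.management.MBeanServer;
    import com.sun.management.HotSpotDiagnosticMXBean;

    public class HeapDumper {
        public static void main(String[] args) throws Exception {
            MBeanServer server = ManagementFactory.getPlatformMBeanServer();
            HotSpotDiagnosticMXBean hotspot = ManagementFactory.newPlatformMXBeanProxy(
                    server, "com.sun.management:type=HotSpotDiagnostic", HotSpotDiagnosticMXBean.class);
            // Writes an .hprof snapshot of live objects that jhat or Eclipse MAT can open
            hotspot.dumpHeap("spider-heap.hprof", true);
        }
    }

    Opening the dump and looking at the dominator tree shows which objects, and which referencing fields, are keeping the memory reachable.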

  • How to increase Memory and Java Heap Size for Content Server

    Hi,
    My content server is processing requests very slowly, and overall performance is not good. I have 2 GB of RAM; any idea what files I can modify to increase the Java heap size for the Content Server? If I increase the RAM to 4 or 6 GB, where do I need to make the changes for the Java heap size, and what are the recommended values? I just have Content Server running on the Linux box. Also, how do I assign more memory to the user that owns the Content Server install?
    Thanks

    You might find these interesting:
    http://blogs.oracle.com/fusionecm/2008/10/how_to_-javatuning.html
    http://download.oracle.com/docs/cd/E10316_01/cs/cs_doc_10/documentation/admin/performance_tuning_10en.pdf
    Do you have access to metalink? This has about everything you could want:
    https://metalink2.oracle.com/metalink/plsql/f?p=130:14:9940589543422282072::::p14_database_id,p14_docid,p14_show_header,p14_show_help,p14_black_frame,p14_font:NOT,788210.1,1,1,1,helvetica
    Or search for "788210.1" in the Metalink knowledge base if that link doesn't work, and look for the FAQ on configuring Java for Content Servers.

  • OAS Heap memory issue: An error "java.lang.OutOfMemoryError: GC overhead

    OAS - 10.1.3.4.0
    We are running out of heap memory and seeing lots of full GCs and out-of-memory events.
    Verbose GC is on.
    Users don't know what they are doing to cause this.
    We have 30-40 users per server and 1.5 GB of heap memory allocated.
    There are no other applications on the machine, only the PRD instance with 1.5 GB allocated to the JVM. We do not have any issue with memory on the server and we could increase the heap, but we don't want to go over 1.5 GB since I understood that to be the high end of what is recommended. We only have 30-40 users on each machine. There are 8 servers, and on a typical heavy-usage day we may have one or two machines that show the out-of-memory errors or continuous full GCs in the logs. When this occurs the phones light up with the people on that machine experiencing slowness.
    Below is an example of what we see in a file created in the OPMN log folder on the JAS server when this occurs. I think this is the log created when verbose GC is turned on. I can send you the full log or anything else you need. Thanks
    1194751K->1187561K(1365376K), 4.6044738 secs]
    java.lang.OutOfMemoryError: GC overhead limit exceeded
    Dumping heap to java_pid10644.hprof ...
    [Full GC 1194751K->1188321K(1365376K), 4.7488200 secs]
    Heap dump file created [1326230812 bytes in 47.602 secs]
    [Full GC 1194751K->1177641K(1365376K), 5.6128944 secs]
    [Full GC 1194751K->986239K(1365376K), 4.6376179 secs]
    [Full GC 1156991K->991906K(1365376K), 4.5989155 secs]
    [Full GC 1162658K->1008331K(1365376K), 4.1139016 secs]
    [Full GC 1179083K->970476K(1365376K), 4.9670050 secs]
    [GC 1141228K->990237K(1365376K), 0.0561096 secs]
    [GC 1160989K->1012405K(1365376K), 0.0920553 secs]
    [Full GC 1012405K->1012274K(1365376K), 4.1170216 secs]
    [Full GC 1183026K->1032000K(1365376K), 4.4166454 secs]
    [Full GC 1194739K->1061736K(1365376K), 4.4009954 secs]
    [Full GC 1194739K->1056175K(1365376K), 5.1124431 secs]
    [Full GC 1194752K->1079807K(1365376K), 4.5160851 secs]
    In addition to the 'overhead limit exceeded' errors, we also see:
    [Full GC 1194751K->1194751K(1365376K), 4.6785776 secs]
    [Full GC 1194751K->1188062K(1365376K), 5.4413659 secs]
    [Full GC 1194751K->1194751K(1365376K), 4.5800033 secs]
    [Full GC 1194751K->1194751K(1365376K), 4.4951213 secs]
    [Full GC 1194751K->1194751K(1365376K), 4.5227857 secs]
    [Full GC 1194751K->1171773K(1365376K), 5.5696274 secs]
    11/07/25 11:07:04 java.lang.OutOfMemoryError: Java heap space
    [Full GC 1194751K->1183306K(1365376K), 4.5841678 secs]
    [Full GC 1194751K->1184329K(1365376K), 4.5469164 secs]
    [Full GC 1194751K->1184831K(1365376K), 4.6415273 secs]
    [Full GC 1194751K->1174738K(1365376K), 5.3647290 secs]
    [Full GC 1194751K->1183878K(1365376K), 4.5660217 secs]
    [Full GC 1194751K->1184651K(1365376K), 4.5619460 secs]
    [Full GC 1194751K->1185795K(1365376K), 4.4341158 secs]

    There's an Oracle support note with a very similar MO :
    WebLogic Server: Getting "java.lang.OutOfMemoryError: GC overhead limit exceeded" exception with Sun JDK 1.6 [ID 1242994.1]
    If I search for "java.lang.OutOfMemoryError: GC overhead" on Oracle Support it returns at least 12 documents
    Might be bug 6065704. Search Oracle support for this bug number.
    Best Regards
    mseberg

  • Java heap out of memory error with -Xms1g -Xmx4g 64 bit VM

    We are getting a Java heap memory error for the application we are running on a 64-bit Linux machine (VM).
    The OOM came when heap usage was 1.7 GB, even though we specified min as 1 GB and max as 4 GB. If I understand correctly, it should not have been thrown, since we specified max as 4 GB. If address space were the problem, it should have thrown a swap space error.
    Also, there were no other processes running on this node.
    Below are the specifics of linux node we are using:
    linux kernel: 2.6.18-128.el5
    Linux Version: Red Hat Enterprise Linux Server release 5.3 (Tikanga) 64 Bit
    Ulimits
    [ppoker@aquariusvir11 ~]$ ulimit -a
    core file size (blocks, -c) unlimited
    data seg size (kbytes, -d) unlimited
    scheduling priority (-e) 0
    file size (blocks, -f) unlimited
    pending signals (-i) 139264
    max locked memory (kbytes, -l) unlimited
    max memory size (kbytes, -m) unlimited
    open files (-n) 100000
    pipe size (512 bytes, -p) 8
    POSIX message queues (bytes, -q) 819200
    real-time priority (-r) 0
    stack size (kbytes, -s) 10240
    cpu time (seconds, -t) unlimited
    max user processes (-u) 139264
    virtual memory (kbytes, -v) unlimited
    file locks (-x) unlimited
    Java Version
    [ppoker@aquariusvir11 ~]$ java -version
    java version "1.6.0_21"
    Java(TM) SE Runtime Environment (build 1.6.0_21-b06)
    Java HotSpot(TM) 64-Bit Server VM (build 17.0-b16, mixed mode)
    Kernel Semaphores
    [ppoker@aquariusvir11 ~]$ ipcs -l
    ------ Shared Memory Limits --------
    max number of segments = 4096
    max seg size (kbytes) = 67108864
    max total shared memory (kbytes) = 17179869184
    min seg size (bytes) = 1
    ------ Semaphore Limits --------
    max number of arrays = 128
    max semaphores per array = 250
    max semaphores system wide = 32000
    max ops per semop call = 32
    semaphore max value = 32767
    ------ Messages: Limits --------
    max queues system wide = 16
    max size of message (bytes) = 65536
    default max size of queue (bytes) = 65536
    Please suggest what could be the reason for this error.
    Thanks,
    Ashish

    javaguy4u wrote:
    the OOM error ... wasn't coming when we had set min and max both as 4 GB.
    You deviously withheld that information.
    When the JVM needs to grow the heap it asks the OS for a bigger memory block than the one it has.
    The OS may refuse this and the JVM will throw an OOME.
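    In other words, -Xmx only sets the ceiling; the heap is committed in steps as it grows, and per the reply above that step-up can fail if the OS does not supply the memory. A minimal sketch (standard JDK only; the class name is illustrative) that shows the distinction on whichever JVM runs it:

    import java.lang.management.ManagementFactory;
    import java.lang.management.MemoryUsage;

    public class HeapGrowth {
        public static void main(String[] args) {
            MemoryUsage heap = ManagementFactory.getMemoryMXBean().getHeapMemoryUsage();
            long mb = 1024 * 1024;
            // committed = what the JVM has actually obtained so far; max = the -Xmx ceiling
            System.out.println("init      = " + heap.getInit() / mb + " MB");
            System.out.println("used      = " + heap.getUsed() / mb + " MB");
            System.out.println("committed = " + heap.getCommitted() / mb + " MB");
            System.out.println("max       = " + heap.getMax() / mb + " MB");
        }
    }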

  • Java heap problem

    Hi.
    I have a question about objects in session scope.
    I have created an object which holds a grid (a matrix of 2046 elements of HtmlInputText). At the beginning everything works OK, but later, when several users create this matrix, populating it from the database several times, I get a Java heap error. From reading several documents I think the objects created are never released from memory, so I don't know how to approach this.
    I have changed the scope of the object to request and the behaviour is the same.
    Any comments about this?
    thanks
    PS: I use Tomcat 6.0.x

    Hi,
    thanks both for comments.
    I have no access to JVM, so I can't change it.
    About the logic: I think the problem is that the GC is not collecting. I mean, when I get the recordset from the database I do something like this:
    Grid grid = new Grid(myData);
    myData is an object populated from the recordset and I have defined the beans in request scope (not in session scope).
    So every time I do this, I lose the reference to the old Grid, but I think it remains in memory, so after the application has been running for some time it collapses and I get a Java heap memory error. I'm not sure, but I think that is the behaviour. Am I wrong?
    Anyway I'll try with jmeter.
    Thanks
    greetings
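    Whether the old Grid really stays reachable once the reference is dropped can be checked directly. A minimal sketch, assuming nothing beyond the standard JDK (the Grid stand-in and its size are made up); GC timing is not guaranteed, so System.gc() is only a hint:

    import java.lang.ref.WeakReference;

    public class GridReachability {
        // Stand-in for the real Grid: just something big enough to matter
        static class Grid { byte[] cells = new byte[2046 * 1024]; }

        public static void main(String[] args) throws InterruptedException {
            Grid grid = new Grid();
            WeakReference<Grid> watcher = new WeakReference<Grid>(grid);

            grid = new Grid();   // drop the only strong reference to the first Grid
            System.gc();         // hint only; collection is at the JVM's discretion
            Thread.sleep(100);

            // true means the old Grid was collectible once nothing referenced it
            System.out.println("old grid collected: " + (watcher.get() == null));
        }
    }

    If the equivalent check stays false in the real application even after repeated GCs, something (a session attribute, static cache, or listener) is still holding the old Grid.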

  • Java.lang.OutOfMemoryError: Java heap space error

    Hello All,
    We are on SOA 10g (10.1.3.5) and are facing a Java heap memory error intermittently. We have already beefed up the memory to 1024 MB and have implemented separate module-wise domains on the SOA server, e.g. PTP, DTD, etc.
    The error is thrown even when the data volume is very low, well under 7 MB in size.
    Following is the excerpt from the opmn.xml file.
    <process-type id="oc4j_soa" module-id="OC4J" status="enabled">
    <module-data>
    <category id="start-parameters">
    <data id="java-options" value="-server *-Xms1024m -Xmx1024m* -Djava.security.policy=$ORACLE_HOME/j2ee/oc4j_soa/config/java2.policy -Djava.awt.headless=true -Dhttp.webdir.enable=false *-XX:MaxPermSize=256M* -Doraesb.home=/u01/app/oracle/soaprd2/product/app/integration/esb -Dhttp.proxySet=false -Doc4j.userThreads=true -Doracle.mdb.fastUndeploy=60 -Doc4j.formauth.redirect=true -Djava.net.preferIPv4Stack=true -Dorabpel.home=/u01/app/oracle/soaprd2/product/app/bpel -Xbootclasspath^/p:/u01/app/oracle/soaprd2/product/app/bpel/lib/orabpel-boot.jar -Dhttp.proxySet=false"/>
    </category>
    <category id="stop-parameters">
    <data id="java-options" value="-Djava.security.policy=$ORACLE_HOME/j2ee/oc4j_soa/config/java2.policy -Djava.awt.headless=true -Dhttp.webdir.enable=false"/>
    </category>
    </module-data>
    Can anyone please point out what could be going wrong here?
    Thanks,
    Rahul.

    Thanks all. We were able to resolve this issue by tuning the query that SOA used via the DB adapter to poll and select the data set from the custom staging tables. Still, we are thinking of increasing the memory to 2 GB.
    Can anyone point out what the ideal settings would be for a high-transaction environment on SOA 10g (10.1.3.5) for the following?
    -Xms1024m
    -Xmx2048m
    -XX:MaxPermSize=512M
    Thanks,
    Rahul

  • JAVA HEAP SPACE

    dear All
    We are running EBS R12 (12.0.6) on Sun Solaris 10 with a 10.2.0.4 DB. One of our custom-made reports failed with the error mentioned below. It's a new report being used for the first time.
    Can anyone tell me how to check the current Java heap space and how to increase it?
    INFO: oracle.adf.share.config.ADFConfigFactory No META-INF/adf-config.xml found
    Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
    Regards
    Musaddaq

    Hi Musaddaq,
    Please refer to the following, as this has been discussed previously:
    OutOfMemoryError: Java heap space
    https://forums.oracle.com/thread/search.jspa?peopleEnabled=true&userID=&containerType=&container=&q=Java+heap+space
    Regards,
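    For the first part of the question (checking what heap a JVM is actually running with), a minimal sketch using only standard JDK calls; note it reports on whichever JVM executes it, so it would have to run inside the same JVM as the failing report to be meaningful:

    public class HeapCheck {
        public static void main(String[] args) {
            Runtime rt = Runtime.getRuntime();
            long mb = 1024 * 1024;
            System.out.println("max heap (-Xmx)      : " + rt.maxMemory() / mb + " MB");
            System.out.println("currently committed  : " + rt.totalMemory() / mb + " MB");
            System.out.println("free within committed: " + rt.freeMemory() / mb + " MB");
        }
    }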

  • Java heap size adminserver

    Hello,
    I would like to know the default Java heap memory size for the AdminServer.
    regards
    Jean marc

    hello,
    thanks for your answers.
    I installed Oracle BI Publisher standalone on Windows 2008 64-bit.
    The installation creates a WebLogic server with an AdminServer; BI Publisher is installed in the AdminServer.
    If I look in setDomainEnv.cmd, I have:
    set XMS_SUN_64BIT=256
    set XMS_SUN_32BIT=256
    set XMX_SUN_64BIT=512
    set XMX_SUN_32BIT=512
    set XMS_JROCKIT_64BIT=256
    set XMS_JROCKIT_32BIT=256
    set XMX_JROCKIT_64BIT=512
    set XMX_JROCKIT_32BIT=512
    So, since I use 64-bit Sun Java, the setting XMX_SUN_64BIT=512 is used.
    But if I look at the java.exe process in Task Manager, the memory allocated is around 2 GB.
    I don't find any Xmx settings in adminserver.log, located at
    I:\BI_HOME\BI11G2\user_projects\domains\bifoundation_domain\servers\AdminServer\logs
    regards
    jean marc

  • PDFBox ... Java heap space problem !!

    Hello,
    I'm using PDFBox to read two PDF files, but both are big, about 17 MB each.
    Here is an example of what I want to do:
    public void loadPdfs() {
       String pdf1FilePath = "c:\\java long pdf1.pdf";
       String pdf2FilePath = "c:\\java long pdf2.pdf";
       try {
          PDDocument pdf1File = PDDocument.load( pdf1FilePath );
          //....Do some stuff here.
          pdf1File.close();
          Runtime.getRuntime().gc();
          PDDocument pdf2File = PDDocument.load( pdf2FilePath ); //This line causes the Exception.
       } catch (IOException e) {
          e.printStackTrace();
       }
    }
    I get this exception:
    org.pdfbox.exceptions.WrappedIOException: Java heap space
    java.lang.OutOfMemoryError: Java heap space
    If I try to load only one PDF it works fine. So why does this happen?
    I do close the first PDF and run the garbage collector, so when opening the second PDF it's as if I'm opening only one file now, or am I mistaken?

    Sure, you close the PDF (whatever that does). But you're still holding a reference to it, so it can't be garbage collected. This is one of the rare occasions where setting a variable to null is a useful thing to do.
    try {
      PDDocument pdf1File = PDDocument.load( pdf1FilePath );
      //....Do some stuff here.
      pdf1File.close();
      pdf1File = null;
      PDDocument pdf2File = PDDocument.load( pdf2FilePath ); //This line causes the Exception.
    } catch (IOException e) {
      e.printStackTrace();
    }
    Don't bother to call the gc() method; if garbage collection needs to be done then it will be done.

  • Tunning Java Heap Space

    Hello;
    I am getting this error when running some of the Oracle Diagnostics through OAM:
    JSP Error:
    Request URI:/OA_HTML/jtfqaadv.jsp
    Exception:
    java.lang.OutOfMemoryError: Java heap space
    When I review the configurations on my two Windows 32-bit servers I find these results:
    On the APP/Web/Forms Server:
    In the %IAS_ORACLE_HOME%/Apache/Jserv/etc/jserv.properties file is:
    wrapper.bin.parameters=-verbose:gc -Xmx512M -Xms128M -XX:MaxPermSize=128M -XX:NewRatio=2 -XX:+PrintGCTimeStamps -XX:+UseTLAB
    In the Context file is:
    <forms_jvm_options oa_var="s_forms_jvm_options" osd="NT">-Xmx256M -Xms128M -XX:MaxPermSize=128M -XX:NewSize=60M -XX:MaxNewSize=120M -Xrs</forms_jvm_options>
    <jvm_options oa_var="s_jvm_options" osd="NT">-verbose:gc -Xmx512M -Xms128M -XX:MaxPermSize=128M -XX:NewRatio=2 -XX:+PrintGCTimeStamps -XX:+UseTLAB</jvm_options>
    On the DB/CCM/Admin Server:
    In the %IAS_ORACLE_HOME%/Apache/Jserv/etc/jserv.properties file is:
    wrapper.bin.parameters=-verbose:gc -Xmx512M -Xms128M -XX:MaxPermSize=128M -XX:NewRatio=2 -XX:+PrintGCTimeStamps -XX:+UseTLAB
    In the Context file is:
    <forms_jvm_options oa_var="s_forms_jvm_options" osd="NT">-Xmx256M -Xms128M -XX:MaxPermSize=128M -XX:NewSize=60M -XX:MaxNewSize=120M -Xrs</forms_jvm_options>
    <jvm_options oa_var="s_jvm_options" osd="NT">-verbose:gc -Xmx512M -Xms128M -XX:MaxPermSize=128M -XX:NewRatio=2 -XX:+PrintGCTimeStamps -XX:+UseTLAB</jvm_options>
    I am considering changing the value of the jvm_options oa_var Xmx from 512M to 640M. My problem is that I have not worked with the JVM at this level, so I am wondering whether this is the correct approach, and whether there are any tips/hints on this.

    Hi Hussein;
    Thanks for the reply. I have been reviewing these documents and a few others; this is how I came up with the idea to increase the Xmx value. However, I did read about a number of other options that sound good, like:
    hotspot - HotSpot is an "ergonomic" JVM. Based upon the platform configuration, it will select a compiler, Java heap configuration, and garbage collector that produce good to excellent performance for most applications. The Java HotSpot Virtual Machine is a core component of the Java SE platform. It implements the Java Virtual Machine Specification, and is delivered as a shared library in the Java Runtime Environment.
    -XX:+AggressiveOpts Turn on point performance compiler optimizations that are expected to be default in upcoming releases. (Introduced in 5.0 update 6.)
    -XX:+UseFastAccessorMethods Use optimized versions of Get<Primitive>Field.
    So I am looking for some skilled guidance as to how to tune this thing.

  • OutOfMemory: Java heap space

    Hello,
    I just downloaded and installed JDK 1.5.0 on a Windows XP box. I changed the path variable to point to 1.5.0 instead of 1.4 and I'm getting the following error:
    Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
    I don't think I should be getting this error for such a small test app. It's a hello-world test for JDK 1.5.0. Here's the code:
    public class HelloWorld {
      public static void main(String[] args) {
          System.out.println("hello world");
      }
    }
    Any help would be greatly appreciated!

    Does JAVA_HOME point to the correct JDK?
    /Kaj
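    One quick way to check, a minimal sketch that just prints standard system properties for whichever runtime executes it:

    public class WhichJava {
        public static void main(String[] args) {
            // Shows which JDK/JRE actually ran the class, regardless of PATH or JAVA_HOME
            System.out.println("java.version = " + System.getProperty("java.version"));
            System.out.println("java.home    = " + System.getProperty("java.home"));
        }
    }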

  • Differences between Java Virtual Machine and Java HotSpot ??

    I am a little bit confused about the difference between the Java Virtual Machine and Java HotSpot.
    My understanding is:
    The Java Virtual Machine is the environment that executes Java programs. I think I understand this part.
    However, the description says, "The Java HotSpot product line consists of a server-side and a client-side virtual machine that share the Java HotSpot runtime environment, but have different compilers suited to the different performance characteristics of clients and servers."
    I am confused by "server-side virtual machine". What is that? Does it mean an environment to execute Java programs remotely over the network, for example the Java plug-in? Please advise.

    Hotspot is a JVM.
    Hotspot has different configuration settings. Some of those settings are better suited for a client application. Some are better suited to server applications.
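    If it helps, a minimal sketch that shows which HotSpot variant a given installation runs by default (the -client and -server launcher flags select between them where both are installed):

    public class WhichVM {
        public static void main(String[] args) {
            // e.g. "Java HotSpot(TM) Client VM" or "Java HotSpot(TM) 64-Bit Server VM"
            System.out.println(System.getProperty("java.vm.name"));
            // e.g. "mixed mode"
            System.out.println(System.getProperty("java.vm.info"));
        }
    }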

  • Changing heap memory paramter  and Additonal Java node configuration

    Hi All,
    Kindly provide me with the procedure for changing the heap memory parameter, and its location (in Visual Admin), and for installing/configuring a 2nd Java server node in a PI system.
    Thanks in advance.

    Hi Ramesh,
    You should have posted this question in the NetWeaver Administrator forum...
    Well here are the answers.
    Login to Configtool.
    Expand Cluster-Data -> instance<xx> -> dispatcher. On the right-hand side you can change the heap size for the dispatcher.
    Follow the same procedure to increase the server node's heap memory.
    To add the server node click on instance<xx>. Then click Server->Add server.
    Cheers....
    Raghu

    Has anyone used the @MDANCESTVAL formula successfully in V 6.5? I am using it referencing three dimensions, using level references rather than generation references, but Essbase won't recognize all level zero members as level zero members, so for som