MII Performance with Large XML

Hi,
We are facing a performance issue while parsing a large XML file of around 50 MB.
Initially xMII was crashing with an "out of heap memory" error. With some changes that problem has been resolved. But
now we are facing an issue with the time the payload takes to execute: the transaction takes more than half an hour.
The solution tried so far has decreased the time to half an hour; earlier it was taking more than an hour and a half to process.
We have tried parallel processing via asynchronous calls to the transactions.
Is there any way to further reduce the processing time?
Further, is it possible to find out which parser MII uses internally for XML: SAX or DOM?
Thanks
Amit

Hi Amit,
Just some tips to improve the performance of BLS:
1. Use XPath whenever possible.
2. Remove unnecessary repeaters in the BLS transaction.
3. Check the link on optimizing BLS performance:
Optimizing BLS Performance for XML Handling in SAP MII
If you are storing the data in a database, just pass the whole XML to the query and insert the data using a bulk insert.
Thanks
Anshul

Similar Messages

  • Slow performance with javax.xml.ws.Endpoint.publish method

    I've published an endpoint on my computer with the javax.xml.ws.Endpoint.publish method. When I load test the endpoint on the local machine, with the client side in another JVM, the endpoint reacts very fast (server side (endpoint) and client side on the same computer). There's no performance problem there.
    But when I run a load test with the server-side endpoint on my local computer and the client side on another computer, the endpoint reacts slowly, very slowly compared to the local scenario. Instead of 500 requests/second I get about 3 requests/second. Why?
    When I look at the traffic between the client and the server running on different machines, it's about 4.5 kB/sec on a 100 Mbit connection, and almost no CPU activity (neither server nor client).
    When I have a web server like Tomcat or Sun Java Application Server and deploy my endpoint there, the traffic goes up to 400 kB/sec. So it works fine, with good performance, over the same network, same IP address, same port and everything.
    Why is my endpoint so slow when I publish it with javax.xml.ws.Endpoint.publish instead of on, for example, Tomcat? And why is the endpoint fast when I run client and server on the same machine?

    The timeout is most likely a client-side thing. You need to set the HTTP request timeout on the client.
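    For what it's worth, here is a minimal sketch of setting those timeouts on a JAX-WS client. The property keys are specific to the JAX-WS reference implementation (an assumption; other stacks use different keys), and MyPort plus the service coordinates are hypothetical placeholders:

        import java.net.URL;
        import javax.xml.namespace.QName;
        import javax.xml.ws.BindingProvider;
        import javax.xml.ws.Service;

        public class ClientTimeoutExample {
            public static void main(String[] args) throws Exception {
                // Hypothetical WSDL location and service name -- substitute your own.
                Service service = Service.create(
                        new URL("http://server:8080/myservice?wsdl"),
                        new QName("http://example.com/", "MyService"));
                MyPort port = service.getPort(MyPort.class); // MyPort = placeholder SEI

                // JAX-WS RI-specific timeout properties, in milliseconds.
                ((BindingProvider) port).getRequestContext()
                        .put("com.sun.xml.ws.connect.timeout", 5000);
                ((BindingProvider) port).getRequestContext()
                        .put("com.sun.xml.ws.request.timeout", 10000);
            }
        }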

  • SQL Adapter Crashes with large XML set returned by SQL stored procedure

    Hello everyone. I'm running BizTalk Server 2009 32-bit on Windows Server 2008 R2 with 8 GB of memory.
    I have a Receive Port whose Transport Type is SQL and whose Receive Pipeline is XML Receive.
    I have a Send Port which processes the XML from this Receive Port and creates a HIPAA 834 file.
    Once a large file is created (approximately 1.6 GB in XML format, 32 MB in EDI form), a second file of 1.7 GB fails to create.
    I get the following error in the Event Viewer:
    Event Type: Warning
    Event Source: BizTalk Server 2009
    Event Category: (1)
    Event ID: 5740
    Date:  10/28/2014
    Time:  7:15:31 PM
    User:  N/A
    The adapter "SQL" raised an error message. Details "HRESULT="0x80004005" Description="Unspecified error"
    Is there a way to change some BizTalk server settings to help in the processing of this large XML set without the SQL adapter crashing?
    Paul

    Could you check SQL Profiler to trace or determine whether you are facing a deadlock?
    Is your adapter running under 64 bits?
    Have you considered using the SqlBulkInsert adapter?
    http://blogs.objectsharp.com/post/2005/10/23/Processing-a-Large-Flat-File-Message-with-BizTalk-and-the-SqlBulkInsert-Adapter.aspx

  • ArrayIndexOutOfBoundsException with large XML

    Hello,
    I have some Java code that queries the DB and displays the XML in a browser. I am using the oracle.jbo.ViewObject object with the writeXML() method. Everything works well until I try to process a large XML file, at which point I get the following error:
    java.lang.ArrayIndexOutOfBoundsException: 16388
         at oracle.xml.io.XMLObjectOutput.writeUTF(XMLObjectOutput.java:215)
         at oracle.xml.parser.v2.XMLText.writeExternal(XMLText.java:354)
         at oracle.xml.parser.v2.XMLElement.writeExternal(XMLElement.java:1459)
         at oracle.xml.parser.v2.XMLElement.writeExternal(XMLElement.java:1459)
         at oracle.xml.parser.v2.XMLElement.writeExternal(XMLElement.java:1459)
         at oracle.xml.parser.v2.XMLElement.writeExternal(XMLElement.java:1459)
         at oracle.xml.parser.v2.XMLElement.writeExternal(XMLElement.java:1414)
         at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1267)
         at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1245)
         at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1052)
         at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:278)
         at com.evermind.server.ejb.EJBUtils.cloneSerialize(EJBUtils.java:409)
         at com.evermind.server.ejb.EJBUtils.cloneObject(EJBUtils.java:396)
    etc...
    I can put in the query to only allow a specific size to be displayed, but the users need to be able to access the larger XML files also. Has anyone else run into this issue?
    Oracle 10g
    Any help or pointers are greatly appreciated.
    Thank you.
    S

    No. We are not limiting the size in our code. Here is a snip of the offending code. The exception occurs on the " results = batchInterfaceView.writeXML(0, 0);" line, but only with larger files.
    try {
        // Request and response helper classes
        XMLHelper request = new XMLHelper(inputXML);
        response = new ResponseXMLHelper();
        if (request.doesValueExist(APP_ERROR_ID)) {
            // get input parameter
            strAppErrorId = request.getValue(APP_ERROR_ID);
            appErrorId = NumberConverter.toBigDecimal(strAppErrorId);
            // get Pos location view
            ViewObject batchInterfaceView =
                findViewObject(GET_ERROR_VIEW, PACKAGE_NAME);
            // get data for selected BatchInterface
            batchInterfaceView.setWhereClauseParam(0, appErrorId);
            batchInterfaceView.executeQuery();
            results = batchInterfaceView.writeXML(0, 0);
            response.addView(results);
        }
    } catch (JboException e) {
        // ... (remainder of the handler elided in the original post)
    }
    Thank you again for any help.
    S

  • PostMethod with large XML passed in setRequestEntity is truncated

    Hi,
    I use PostMethod to transfer large XML to a servlet.
    CharArrayWriter chWriter = new CharArrayWriter();
    post.marshal(chWriter);
    // The chWriter length is 120 KB
    HttpClient httpClient = new HttpClient();
    PostMethod postMethod = new PostMethod(urlStr);
    postMethod.setRequestEntity(new StringRequestEntity(chWriter.toString(), null, null));
    postMethod.setRequestHeader("Content-type", "text/xml");
    int responseCode = httpClient.executeMethod(postMethod);
    String responseBody = postMethod.getResponseBodyAsString();
    postMethod.releaseConnection();

    When I open the request in the doPost method in the servlet:

    Reader inpReader = request.getReader();
    char[] chars = MiscUtils.ReaderToChars(inpReader);
    inpReader = new CharArrayReader(chars);
    // The data is truncated (in chars[]) ???

    static public char[] ReaderToChars(Reader r) throws IOException, InterruptedException {
        // IlientConf.logger.debug("Start reading from stream");
        BufferedReader br = new BufferedReader(r);
        StringBuffer fileData = new StringBuffer();
        char[] buf = new char[1024];
        int numRead = 0;
        while ((numRead = br.read(buf)) != -1) {
            String readData = String.valueOf(buf, 0, numRead);
            fileData.append(readData);
            buf = new char[1024];
        }
        return fileData.toString().toCharArray();
    }
    Any ideas what the problem could be??
    Lior.
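    One thing worth checking (an assumption on my part, not something established in this thread): passing null for the content type and charset makes StringRequestEntity fall back to the platform default encoding, so two differently configured servers can disagree about the bytes behind the declared Content-Length. Declaring both explicitly rules that out:

        // Sketch: pin the content type and charset so both sides agree on the bytes.
        postMethod.setRequestEntity(
                new StringRequestEntity(chWriter.toString(), "text/xml", "UTF-8"));
        postMethod.setRequestHeader("Content-type", "text/xml; charset=UTF-8");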

    Hi,
    I use the same code and have two Tomcats running the same servlet, one on each machine.
    One runs on an Apache/2.0.52 (CentOS) server and the second on an Apache/2.0.52 (Windows XP) server.
    I managed to post large XML from the CentOS server to the Windows XP server successfully, but failed to post large XML
    from the Windows XP server to the CentOS server.
    I saw something called mod_isapi that might be disabling posting of large XML files.
    Can anyone help me get past that limitation?
    Thanks,
    Lior.

  • Does the parser work with large XML files?

    Is there a restriction on the XML file size that can be loaded into the parser?
    I am getting an out-of-memory exception reading in a large XML file (10 MB) using the commands
    DOMParser parser = new DOMParser();
    URL url = createURL(argv[0]);
    parser.setErrorStream(System.err);
    parser.setValidationMode(true);
    parser.showWarnings(true);
    parser.parse(url);
    Win NT 4.0 Server
    Sun JDK 1.2.2
    ===================================
    Error output
    ===================================
    Exception in thread "main" java.lang.OutOfMemoryError
         at oracle.xml.parser.v2.ElementDecl.getAttrDecls(ElementDecl.java, Compiled Code)
         at java.util.Hashtable.<init>(Unknown Source)
         at oracle.xml.parser.v2.DTDDecl.<init>(DTDDecl.java, Compiled Code)
         at oracle.xml.parser.v2.ElementDecl.getAttrDecls(ElementDecl.java, Compiled Code)
         at oracle.xml.parser.v2.ValidatingParser.checkDefaultAttributes(ValidatingParser.java, Compiled Code)
         at oracle.xml.parser.v2.NonValidatingParser.parseAttributes(NonValidatingParser.java, Compiled Code)
         at oracle.xml.parser.v2.NonValidatingParser.parseElement(NonValidatingParser.java, Compiled Code)
         at oracle.xml.parser.v2.ValidatingParser.parseRootElement(ValidatingParser.java:97)
         at oracle.xml.parser.v2.NonValidatingParser.parseDocument(NonValidatingParser.java:199)
         at oracle.xml.parser.v2.XMLParser.parse(XMLParser.java:146)
         at TestLF.main(TestLF.java:40)
    null

    We have a number of test files that are that size and it works without a problem. However using the DOMParser does require significantly more memory than your doc size.
    What is the memory configuration of the JVM that you are running with? Have you tried increasing it? Are you using our latest version 2.0.2.6?
    Oracle XML Team

  • Problems with Large XML files

    I have tried increasing the memory pool using the -mx and -ms options. It doesn't work. I am using your latest XML parser for Java v2. Please let me know if there are some specific options I should be using.
    Thanx,
    -Sameer
    (The original question and the Oracle XML Team reply from "Does the parser work with large XML files?" above were quoted here in full.)

    You might try using a different JDK/JRE, either a 1.1.6+ or 1.3 version, as 1.2 in our experience has the largest footprint. If this doesn't work, can you give us some details about your system configuration? Finally, you might try the SAX interface, as it does not need to load the entire DOM tree into memory.
    Oracle XML Team
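    To illustrate the SAX suggestion: a minimal sketch using the standard JAXP SAX API (the Oracle v2 parser ships its own SAX parser as well; JAXP is used here purely as a generic example). The handler callbacks fire as the parser streams the file, so the whole document is never held in memory:

        import java.io.File;
        import javax.xml.parsers.SAXParser;
        import javax.xml.parsers.SAXParserFactory;
        import org.xml.sax.Attributes;
        import org.xml.sax.helpers.DefaultHandler;

        public class SaxCount {
            public static void main(String[] args) throws Exception {
                SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
                final int[] count = {0};
                parser.parse(new File(args[0]), new DefaultHandler() {
                    @Override
                    public void startElement(String uri, String local, String qName, Attributes atts) {
                        count[0]++; // react to each element as it streams past
                    }
                });
                System.out.println("Elements: " + count[0]);
            }
        }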

  • Performance Issues with large XML (1-1.5MB) files

    Hi,
    I'm using XML Schema-based object-relational storage for my XML documents, which are typically 1-1.5 MB in size, and I'm having serious performance issues with XPath queries.
    When I do an XPath query against an element of SQL type VARCHAR2, I get good performance. But when I do a similar XPath query against an element of SQL type collection (VARRAY of VARCHAR2), the performance is very ordinary.
    I have also created indexes on extract() and analyzed my XMLType table and indexes, but I have no performance gain. Also, I have tried all sorts of storage options available for collections, i.e. VARRAYs, nested tables, IOTs, LOBs, inline, etc., and all of these gave me the same bad performance.
    I even tried creating XMLType views based on XPath queries, but the performance didn't improve much.
    I guess I'm running out of options, and patience as well. ;)
    I would appreciate any ideas/suggestions, please help.....
    Thanks;
    Ramakrishna Chinta

    Are you having similar symptoms as I am? http://discussions.apple.com/thread.jspa?threadID=2234792&tstart=0

  • Slow Performance with large library (PC)

    I've been reading many posts about slow performance but didn't see anything addressing this issue:
    I have some 40,000 photos in my catalog, and despite generating previews for a group of directories, LR is still very slow in scrolling through the pics in those directories.
    When I take 2,000 of these pics and import them into a new catalog, again generating previews, scrolling through the pics is much, much faster.
    So is there some upper limit of recommended catalog size for acceptable performance?
    Do I need to split my pics up by year? That seems counterproductive, but it's the only way to see the pics at an acceptable speed.

    I also have serious performance issues, and I don't even have a large catalog, only around 2,000 pictures; the DB file itself is only 75 MB. I've run optimization; it didn't help. What I encountered is that the CPU usage of LR 1.1 goes up and STAYS up around 85% for 4-5 minutes after program start, and during that time zooming in to an image can take 2-3 minutes! After 4-5 minutes, CPU usage drops to 0%, the background task (whatever LR does during that time!) has finished, and I can work very smoothly. Preview generation cannot be the problem, since this also happens when I work in a folder that already has all previews built, close LR, and start again instantly. LR loads, and AGAIN I have to wait 4-5 minutes until CPU usage has dropped so I can continue working with my images smoothly.
    This is very annoying! I will stop using LR and go back to Bridge/ACR/PS; that is MUCH, much faster. BUMMER!

  • Slow Performance with large OR query

    Hi All;
    I am new to this forum... so please tread lightly on me if I am asking some rather basic questions. This question was addressed in this forum more than a year ago (http://swforum.sun.com/jive/thread.jsp?forum=13&thread=9041). I am going to ask it again. We have a situation where we have large filters using the OR operator. The searches look like:
    (&(objectclass=something)(|(attribute=this)(attribute=that)(attribute=something) .... ))
    We are finding that the performance difference between 100 attributes and 1 attribute in a filter is significant. In order to increase performance, we have to issue the following filters in separate searches:
    (&(objectclass=something)(attribute=this))
    (&(objectclass=something)(attribute=that))
    (&(objectclass=something)(attribute=something))
    The first search takes an average of 60 seconds, while the combination of searches with the second set of filters takes an average of 4 seconds. This is a large performance improvement.
    We feel that this solution is not desirable because:
    1. When the server is under heavy load, this solution will not scale very well.
    2. We feel we should not have to modify our code to deal with a server deficiency
    3. This solution creates too much network traffic
    My questions:
    1. Is there a query optimizer in the server? If so, shouldn't the query optimizer take care of this?
    2. Why is there such a large performance difference between the two filters above?
    3. Is there a setting somewhere in the server (documented or undocumented) that would handle this issue (i.e., an average query size)?
    4. Is this a known issue?
    5. Besides breaking up the filter into pieces, is there a better way to approach this type of problem?
    Thanks in advance,
    Paul Rowe
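    For reference, a minimal JNDI sketch of how such a combined OR filter might be built and issued from Java; the host, base DN, and attribute names are hypothetical placeholders, not anything from Paul's deployment:

        import java.util.Hashtable;
        import javax.naming.Context;
        import javax.naming.NamingEnumeration;
        import javax.naming.directory.DirContext;
        import javax.naming.directory.InitialDirContext;
        import javax.naming.directory.SearchControls;
        import javax.naming.directory.SearchResult;

        public class OrFilterSearch {
            public static void main(String[] args) throws Exception {
                Hashtable<String, String> env = new Hashtable<String, String>();
                env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
                env.put(Context.PROVIDER_URL, "ldap://ldap.example.com:389"); // placeholder
                DirContext ctx = new InitialDirContext(env);

                // Build (&(objectclass=something)(|(attribute=v1)(attribute=v2)...))
                String[] values = {"this", "that", "something"};
                StringBuilder or = new StringBuilder("(|");
                for (String v : values) {
                    or.append("(attribute=").append(v).append(')');
                }
                or.append(')');
                String filter = "(&(objectclass=something)" + or + ")";

                SearchControls sc = new SearchControls();
                sc.setSearchScope(SearchControls.SUBTREE_SCOPE);
                NamingEnumeration<SearchResult> results =
                        ctx.search("dc=example,dc=com", filter, sc); // placeholder base DN
                while (results.hasMore()) {
                    System.out.println(results.next().getNameInNamespace());
                }
                ctx.close();
            }
        }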


  • TableView performance with large number of columns

    I notice that it takes a while for table views to populate when they have a large number of columns (> 100 or so, subjectively).
    Running VisualVM based on CPU Samples, I see that the largest amount of time is spent here:
    javafx.scene.control.TableView.getVisibleLeafIndex() 35.3% 8,113 ms
    next is:
    javafx.scene.Parent$1.onProposedChange() 9.5% 2,193 ms
    followed by
    javafx.scene.control.Control.loadSkinClass() 5.2% 1,193 ms
    I am using JavaFx 2.1 co-bundled with Java7u4. Is this to be expected, or are there some performance tuning hints I should know?
    Thanks,
    - Pat

    We're actually doing some TableView performance work right now; I wonder if you could file an issue with a simple reproducible test case? I haven't seen the same data you have here in our profiles (nearly all time is spent on reapplying CSS), so I would be interested in your exact test, to be able to profile it and see what is going on.
    Thanks
    Richard
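    For anyone wanting to reproduce this, a minimal self-contained test case along the lines Richard asks for might look like the sketch below. It is written against the JavaFX 2.x API with Java 7-style anonymous classes; the 150-column count is an arbitrary value above the ~100-column threshold Pat mentions:

        import javafx.application.Application;
        import javafx.beans.property.SimpleStringProperty;
        import javafx.beans.value.ObservableValue;
        import javafx.scene.Scene;
        import javafx.scene.control.TableColumn;
        import javafx.scene.control.TableColumn.CellDataFeatures;
        import javafx.scene.control.TableView;
        import javafx.stage.Stage;
        import javafx.util.Callback;

        public class ManyColumnsTest extends Application {
            @Override
            public void start(Stage stage) {
                TableView<Integer> table = new TableView<Integer>();
                for (int i = 0; i < 150; i++) { // well past the ~100-column mark
                    final int colIndex = i;
                    TableColumn<Integer, String> col =
                            new TableColumn<Integer, String>("Col " + i);
                    col.setCellValueFactory(
                            new Callback<CellDataFeatures<Integer, String>, ObservableValue<String>>() {
                        public ObservableValue<String> call(CellDataFeatures<Integer, String> cd) {
                            return new SimpleStringProperty("r" + cd.getValue() + "c" + colIndex);
                        }
                    });
                    table.getColumns().add(col);
                }
                for (int row = 0; row < 100; row++) {
                    table.getItems().add(row);
                }
                stage.setScene(new Scene(table, 800, 600));
                stage.show(); // time this against a handful of columns
            }

            public static void main(String[] args) {
                launch(args);
            }
        }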

  • TransformerHandler throws OutOfMemoryError with large xml files

    I'm using TransformerHandler to convert arbitrary content to SAX events and transform it using XSLT into an XML file.
    The problem is that for a large amount of content I get an OutOfMemoryError.
    It seems that the content is kept in memory and only flushed when I call handler.endDocument().
    I tried using auto-flushing writers as the Result, and calling the flush() method myself, but nothing helped.
    Here is the example - please help!
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.sax.SAXTransformerFactory;
    import javax.xml.transform.sax.TransformerHandler;
    import javax.xml.transform.stream.StreamResult;
    import javax.xml.transform.stream.StreamSource;
    import org.xml.sax.helpers.AttributesImpl;

    public class Test {
        /**
         * Test handler memory usage.
         * @param loops no. of loops - when large enough: OutOfMemoryError!!!
         * @param xsltFilePath xslt file
         * @param targetXmlFile output xml file
         * @throws Exception
         */
        public static void testHandlerMemUsage(int loops, String xsltFilePath, String targetXmlFile) throws Exception {
            // verify SAX support
            TransformerFactory factory = TransformerFactory.newInstance();
            if (!factory.getFeature(SAXTransformerFactory.FEATURE))
                throw new UnsupportedOperationException("SAX transformations not supported");
            TransformerHandler handler =
                ((SAXTransformerFactory) factory).newTransformerHandler(new StreamSource(xsltFilePath));
            handler.setResult(new StreamResult(targetXmlFile));
            handler.startDocument();
            handler.startElement(null, "root", "root", new AttributesImpl());
            // loop
            for (int i = 0; i < loops; i++) {
                handler.startElement(null, "el-" + i, "el-" + i, new AttributesImpl());
                handler.characters("value".toCharArray(), 0, "value".length());
                handler.endElement(null, "el-" + i, "el-" + i);
            }
            handler.endElement(null, "root", "root");
            // System.out.println("end document");
            // only after endDocument() does it start to print..
            handler.endDocument();
            // System.out.println("ended document");
        }

        public static void main(String[] args) throws Exception {
            System.out.println("--starting..");
            testHandlerMemUsage(500000, "/copy.xslt", "/testHandlerMemUsage.xml");
            System.out.println("--we are still here -- increase loops..");
        }
    }

    Did you try increasing memory when starting Java with the -Xmx parameter? Java uses only 64 MB by default, so you might need to increase it to, e.g., 256 MB for your XML to work.
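    For example, assuming the class above is compiled and on the classpath:

        java -Xmx256m Test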

  • XML query hangs with large XML response from utl_http request

    We are having a very sensitive problem in our production environment with XMLQuery.
    When we receive a small or medium-size XML document, the query shown below works. But when the XML is large or very large, it hangs and slows down the whole database.
    We are on Oracle 11gR2.
    We are using a CLOB to parse the response from the HTTP request.
    What could be the problem, or the solutions? Please help. It's urgent...
    SELECT opciones_obj (x.strindice,
                         x.nombrecompleto,
                         x.nombre,
                         x.strtipodato,
                         x.codigoopcion,
                         x.floatval,
                         x.strtipo,
                         x.strval)
      BULK COLLECT INTO t_opciones
      FROM XMLTABLE (
               xmlnamespaces (
                   'http://schemas.xmlsoap.org/soap/envelope/' AS "env",
                   'http://wsevaluarreglacondicioncomercial/' AS "ns0",
                   'http://wsevaluarreglacondicioncomercial/types/' AS "ns1",
                   'http://www.oracle.com/webservices/internal/literal' AS "ns2"),
               '/env:Envelope/env:Body/ns0:listarOpcionesAtributoEventoResponseElement/ns0:result/ns1:listaVariables/ns2:item/ns2:item'
               PASSING rsp_xml
               COLUMNS strindice      VARCHAR2 (4000) PATH 'ns2:mapEntry[ns2:key="strIndice"]/ns2:value',
                       nombrecompleto VARCHAR2 (4000) PATH 'ns2:mapEntry[ns2:key="nombreCompleto"]/ns2:value',
                       nombre         VARCHAR2 (4000) PATH 'ns2:mapEntry[ns2:key="nombre"]/ns2:value',
                       strtipodato    VARCHAR2 (4000) PATH 'ns2:mapEntry[ns2:key="strTipoDato"]/ns2:value',
                       codigoopcion   NUMBER          PATH 'ns2:mapEntry[ns2:key="codigoOpcion"]/ns2:value',
                       floatval       FLOAT           PATH 'ns2:mapEntry[ns2:key="floatVal"]/ns2:value',
                       strtipo        VARCHAR2 (4000) PATH 'ns2:mapEntry[ns2:key="strTipo"]/ns2:value',
                       strval         VARCHAR2 (4000) PATH 'ns2:mapEntry[ns2:key="strVal"]/ns2:value') x;

    1) Create an XMLType table (could be temporary) using binary XML storage:

        create table tmp_xml of xmltype
        xmltype store as securefile binary xml;

    2) In your procedure, load the XMLType containing the response (rsp_xml) into the table:

        insert into tmp_xml values (rsp_xml);

    3) Then execute the query directly from the table:

        SELECT opciones_obj ( ... )
          BULK COLLECT INTO t_opciones
          FROM tmp_xml t
             , XMLTABLE (
                   xmlnamespaces ( ... ),
                   '/env:Envelope/env:Body/...'
                   PASSING t.object_value
                   COLUMNS ... ) x;

    4) At the end of the procedure, delete (or truncate) the table, or simply let it clean itself up when the session ends (in case you created it TEMPORARY).

  • Animation performance with large images -- too large?

    I'm working with a client who created an animation using Edge Animate, and we're having some performance problems. The HTML page itself is 3840x1080, and the animation that is causing the most grief is a series of 304 PNGs, each 2592x1080 in size, showing at 10 fps.
    IE 10 is the required browser. When this animation runs, many of the frames are dropped, showing the X placeholder image instead. Additionally, there are pauses and stutters and such, presumably as the browser is trying to keep up. I note that when I go in and modify the animation() constructor to use maxFrames: 50 or so, the performance problems go away entirely. It seems like the preloading of all the frames is hitting a memory ceiling.
    Question: Is it even feasible to ask the browser to do this kind of work? I'm not a JavaScript/HTML guy, and if I had my druthers, this would be done using something like C++/DirectX.
    Any opinions here on optimizing, or approaching this problem differently?
    Many thanks,
    Dave

    Welcome to the family. You're coming into the fold at a bad time, sad to say.
    I'd be very interested to know more about your workflow, jc, scanner hardware, acquisition software, etc.
    Right now, I'd have to say Aperture is not a tool you even want to consider but I gotta give you my opinion: your catalog shouldn't point to the originals anyway. Do your conversions to a suitable format that will get you the versatility you need without overkill and then offload those huge originals (make at least two copies of your library) for permanent storage with one offsite.
    Folks around here speak highly of iView but I've never used it. Portfolio is a longtime favorite and I believe you can distribute copies of your catalog with a free Portfolio viewer (could be wrong about that). You certainly cannot do that with Aperture. Your catalog will use little JPEGs for thumbnails but you can make prints and slideshows from your more reasonably-sized assets like, say, 8 bit TIFFs at 6-14 megs.
    Hey, good luck and I hope you get better advice on a wider selection of products.
    bogiesan

  • Hyperion Financial Reporting server 9.3.1 - Performance with Large batches

    I have not been able to find any help yet, so hopefully someone here can help.
    We have several Financial Reporting servers so that we can schedule reports and keep users on separate servers so that they do not interfere with each other.
    The problem is that when bursted batch reports that select 100 to 1000+ members from the bursting dimension run, the resources max out memory, and if multiple batches with the same large number of (but different) members run at the same time, we start having failures where the services hang or, worse, the server crashes (one server crashed early this morning).
    The Windows 2003 servers are Dell 2950s, 1x Intel Core 2 Duo 3 GHz with 8 GB of RAM.
    We found that if we set the Java memory parameters anything higher than 1.5 GB the services do not start, so the 8 GB available is hardly being used by the server, since the FR services (batch scheduler, print server, reports, RMI) are the only things running.
    The batches are bursting the report for each member to a network folder for users to access and for archival purposes.
    We may need to get Oracle involved, but if anyone here has insight I'd appreciate the assistance.

    Hi Robert
    I have come across similar issues where the reports take much longer to run as part of a batch than they do when accessed directly via the Workspace. Our issue was that Financial Reporting was not dropping the connection to Essbase. We were on Windows and had to add a few DWORDs to the registry:
    1. Open the registry and navigate to Local Machine\System\CurrentControlSet\Services\TCPIP\Parameters
    2. Add new DWORD Value named TcpTimedWaitDelay, right click and select Modify. Select decimal radio button, type in 30 (this is the number of seconds that TCP/IP will hold on to a connection before it is released and made available again, the default is 120 seconds)
    3. Add new DWORD Value named MaxUserPort, right click and select Modify. Select decimal radio button, type in 65534 (this determines the highest port number TCP can assign when an application requests an available user port from the system, the default is 5000)
    4. Add new DWORD Value named MaxFreeTcbs, right click and select Modify. Select decimal radio button, type in 6250 (this determines the number of TCP control blocks (TCBs) the system creates to support active connections, the default is 2000. As each connection requires a control block, this value determines how many active connections TCP can support simultaneously. If all control blocks are used and more connection requests arrive, TCP can prematurely release connections in the TIME_WAIT state in order to free a control block for a new connection)
    I think we did this to both our essbase and application server and rebooted both afterwards, it made a dramatic improvement to batch times!!
    As a personal note, I try not to have too many batches running at once, as they can interfere with each other and lead to failures. Where I have done this before, we tend to use Windows batch (.bat) files to launch the FR batches from the command line; if time allows, I run a few batches to get a reasonable estimate of the time to complete, and in my .bat file I add a pause of around that amount of time between sending the batch requests to the scheduler. Admittedly, I've not yet done it where the number of reports in a bursting batch is as many as 1000.
    Hopefully this will help
    Stuart
