Processing large XML messages (> 100 MB) in PI 7.1

Hi All,
I have PI 7.1 and need to process and create large XML messages, with not-too-extensive mapping; the direction is from SAP to an FTP server.
I created a test scenario using a consumer proxy to check how large a message our PI server can handle. Up to 100 MB it went fine, but anything above 100 MB got stuck in the R3 server's Integration Engine. When I checked the message in SXMB_MONI on the R3 server, it initially showed "Automatic Restart" status, and after the retry limit was exhausted it gave "System Error - after automatic restart" with the following error:
<?xml version="1.0" encoding="UTF-8" standalone="yes" ?>
<!-- Call Integration Server -->
<SAP:Error xmlns:SAP="http://sap.com/xi/XI/Message/30" xmlns:SOAP="http://schemas.xmlsoap.org/soap/envelope/" SOAP:mustUnderstand="">
  <SAP:Category>XIServer</SAP:Category>
  <SAP:Code area="INTERNAL">CLIENT_RECEIVE_FAILED</SAP:Code>
  <SAP:P1>110</SAP:P1>
  <SAP:P2 />
  <SAP:P3 />
  <SAP:P4 />
  <SAP:AdditionalText />
  <SAP:ApplicationFaultMessage namespace="" />
  <SAP:Stack>Error while receiving by HTTP (error code: 110, error text: )</SAP:Stack>
  <SAP:Retry>A</SAP:Retry>
  </SAP:Error>
Can you let me know how we can further tune our SAP R3 server to process these large files? The message is getting stuck in the R3 server (at HTTP_SEND) and is not even reaching the PI server. Also, breaking the message into smaller messages cannot be implemented, as we need to send the file in ONE go.
Regards
Lalit

so till 100Mb it went fine but anything above 100Mb got stuck in R3 server Integration engine only
All the systems are configured with a default timeout parameter. It determines how long a system should keep trying to process a file; if this is exceeded, you may get the error you mentioned.
So what you can do is try increasing the timeout of your system, so that you give it more time to process the file.
Please note that increasing the timeout may come at the cost of performance.
To check how big the impact is, measure the performance before and after increasing the timeout.
If you need some help with file processing in XI:
/people/sravya.talanki2/blog/2005/11/29/night-mare-processing-huge-files-in-sap-xi
Hope it helps.
regards,
Abhishek.

Similar Messages

  • Processing a large XML file (500 MB)? Break into small parts? Load into JTree

    Hi,
    I'm doing an assignment to process a large XML file (500 MB) and load it into a JTree using Java.
    Can someone advise me on an algorithm for this?
    How can I load a 500 MB XML file into a JTree without the system hanging?
    How do I break up my file and do the loading?

    1. Is the file schema-based binary XML?
    2. The limits are dependent on the storage model and character set.
    3. For all non-XML content the current limit is 4 GB (bytes, not characters); so for character content in an AL32UTF8 database the limit is 2 GB.
    4. For XML content stored as CLOB the limit is the same as for character data (2 GB/4 GB), dependent on the database character set.
    5. For schema-based XML content stored in object-relational storage the limit is determined by the complexity and structures defined in the XML schema.

  • SXMB_MONI --> Processing Statistics - XML messages in "Being Edited" state

    Under SXMB_MONI --> Processing Statistics, I see that I have a large number of messages in the state "XML message being edited". I need to delete these messages from my system. The regular delete jobs do not work, because the messages are not in a "Completed Successfully" or error state.
    These are causing my persistence layer to be above 100%.  Any input is appreciated.  All of the online help mentions every statistic on this page except for the "XML message being edited".
    Thanks!

    delete messages in SXMB_MONI
    For a step-by-step procedure on how to archive and delete messages in XI, go through this URL:
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/402fae48-0601-0010-3088-85c46a236f50

  • Best method for encrypting/decrypting large XML files (> 100 MB)

    I am in need of encrypting XML for large part files that can get upwards of 100Mb+.
    I found some articles and code, but the only example I was successful in getting to work used XMLCipher, which takes a Document, parses it, and then encrypts it.
    Obviously, 100Mb files do not cooperate well with DOM, so I want to find a better method for encryption/decryption of these files.
    I found some articles using a CipherInputStream and CipherOutputStreams, but am not clear if this is the way to go and if this will avoid memory errors.
    import java.io.*;
    import java.security.spec.AlgorithmParameterSpec;
    import javax.crypto.*;
    import javax.crypto.spec.IvParameterSpec;

    public class DesEncrypter {
        Cipher ecipher;
        Cipher dcipher;

        // Buffer used to transport the bytes from one stream to another
        byte[] buf = new byte[1024];

        public DesEncrypter(SecretKey key) {
            // Create an 8-byte initialization vector
            byte[] iv = new byte[] {
                (byte) 0x8E, 0x12, 0x39, (byte) 0x9C,
                0x07, 0x72, 0x6F, 0x5A
            };
            AlgorithmParameterSpec paramSpec = new IvParameterSpec(iv);
            try {
                ecipher = Cipher.getInstance("DES/CBC/PKCS5Padding");
                dcipher = Cipher.getInstance("DES/CBC/PKCS5Padding");
                // CBC requires an initialization vector
                ecipher.init(Cipher.ENCRYPT_MODE, key, paramSpec);
                dcipher.init(Cipher.DECRYPT_MODE, key, paramSpec);
            } catch (java.security.InvalidAlgorithmParameterException e) {
            } catch (javax.crypto.NoSuchPaddingException e) {
            } catch (java.security.NoSuchAlgorithmException e) {
            } catch (java.security.InvalidKeyException e) {
            }
        }

        public void encrypt(InputStream in, OutputStream out) {
            try {
                // Bytes written to out will be encrypted
                out = new CipherOutputStream(out, ecipher);
                // Read in the cleartext bytes and write to out to encrypt
                int numRead = 0;
                while ((numRead = in.read(buf)) >= 0) {
                    out.write(buf, 0, numRead);
                }
                out.close();
            } catch (java.io.IOException e) {
            }
        }

        public void decrypt(InputStream in, OutputStream out) {
            try {
                // Bytes read from in will be decrypted
                in = new CipherInputStream(in, dcipher);
                // Read in the decrypted bytes and write the cleartext to out
                int numRead = 0;
                while ((numRead = in.read(buf)) >= 0) {
                    out.write(buf, 0, numRead);
                }
                out.close();
            } catch (java.io.IOException e) {
            }
        }
    }
    This looks like it might fit, but there is one more twist: I am using a persistence manager and XML encoding to accomplish that, so I am not sure how (or where) to implement this method without affecting persistence.
    Any guidance on what would work best in this situation would be appreciated.
    Regards,
    vbplayr2000
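As a sanity check of the CipherInputStream/CipherOutputStream approach above, the following self-contained sketch pumps data through the ciphers in fixed-size buffers, so memory use stays constant regardless of payload size. This is a hedged example: it uses AES rather than the DES of the snippet above (DES is obsolete), and the class name StreamCryptoDemo is made up for illustration.

```java
import java.io.*;
import javax.crypto.*;
import javax.crypto.spec.IvParameterSpec;

public class StreamCryptoDemo {
    public static void main(String[] args) throws Exception {
        // Generate a throwaway AES key and a fixed IV for the demo
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(128);
        SecretKey key = kg.generateKey();
        IvParameterSpec iv = new IvParameterSpec(new byte[16]);

        Cipher enc = Cipher.getInstance("AES/CBC/PKCS5Padding");
        enc.init(Cipher.ENCRYPT_MODE, key, iv);
        Cipher dec = Cipher.getInstance("AES/CBC/PKCS5Padding");
        dec.init(Cipher.DECRYPT_MODE, key, iv);

        byte[] clear = "<order><qty>34</qty></order>".getBytes("UTF-8");

        // Encrypt: plaintext is pumped through a CipherOutputStream in
        // fixed-size buffers, so memory use is independent of file size
        ByteArrayOutputStream encrypted = new ByteArrayOutputStream();
        try (OutputStream out = new CipherOutputStream(encrypted, enc);
             InputStream in = new ByteArrayInputStream(clear)) {
            byte[] buf = new byte[1024];
            int n;
            while ((n = in.read(buf)) >= 0) out.write(buf, 0, n);
        }

        // Decrypt the same way with a CipherInputStream
        ByteArrayOutputStream decrypted = new ByteArrayOutputStream();
        try (InputStream in = new CipherInputStream(
                 new ByteArrayInputStream(encrypted.toByteArray()), dec)) {
            byte[] buf = new byte[1024];
            int n;
            while ((n = in.read(buf)) >= 0) decrypted.write(buf, 0, n);
        }

        System.out.println(new String(decrypted.toByteArray(), "UTF-8"));
    }
}
```

Because the streams process one buffer at a time, the same pattern should work for 100 MB+ files by swapping the byte-array streams for FileInputStream/FileOutputStream.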

    I can give some general guidelines that might help, having done much similar work.
    You have two different issues, at least from my reading of your problem:
    1) How to deal with large XML docs that most parsers will not handle without memory issues
    2) Where to hide or "black box" the encrypt/decrypt routines
    #1: Check into XPP3/XmlPull. Yes, it's different from the other XML parsers you are used to, and more work is involved, but it is blazing fast and can be used to parse a stream as it is being read. You can populate beans and process as needed, since there is really not much "inversion of control" involved compared to parsers that go on to finish the entire document or load it all into memory.
    #2: Implement Serializable and write your own readObject/writeObject methods. Place the encrypt/decrypt calls in there as appropriate. That will "hide" the implementation and should be something any persistence manager can deal with.
    Regards,
    antarti
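antarti's suggestion of XPP3/XmlPull is a pull parser; the same pattern ships with the JDK as StAX (javax.xml.stream). A minimal sketch of the idea (the document and element names here are made up for illustration, not taken from the original posts):

```java
import java.io.StringReader;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;

public class PullParseDemo {
    public static void main(String[] args) throws Exception {
        // A tiny stand-in for a multi-hundred-MB document; in real use
        // pass a FileInputStream so nothing is held in memory at once
        String xml = "<persons><person><name>A</name></person>"
                   + "<person><name>B</name></person></persons>";

        XMLInputFactory f = XMLInputFactory.newInstance();
        XMLStreamReader r = f.createXMLStreamReader(new StringReader(xml));

        int persons = 0;
        while (r.hasNext()) {
            // The parser hands back one event at a time; process each
            // <person> as it streams past instead of building a DOM
            if (r.next() == XMLStreamConstants.START_ELEMENT
                    && "person".equals(r.getLocalName())) {
                persons++;
            }
        }
        r.close();
        System.out.println("persons=" + persons);
    }
}
```

In real use, pass a FileInputStream to createXMLStreamReader so the 100 MB+ document is never held in memory at once.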

  • Process Large XML files

    Hi,
    I am working on a file-to-file scenario and have XML files of around 300 MB coming in.
    My interface could successfully process smaller files, but it is not able to handle the larger ones.
    Please direct me.
    Thanks,
    Nandini

    Nandini,
    Check out this thread:
    File Adapter: Size of your processed messages
    Other than parameter and hardware settings, use Java mapping rather than message/XSLT mapping; Java mapping is much faster than any other kind of mapping.
    Nilesh

  • How to make use of XMLDB to process large XML and create spatial objects

    For someone new to XML DB, I find it hard to make sense of the enormous amount of information (and easy to get lost in it), so I come here to ask for directions.
    I have to build a procedure that fills a table of spatial objects based on XML input. Basically, the XML contains a large number of elements that describe the geometry type and contain the geometry's coordinates. The XML can get quite large (200-300 MB).
    Somehow I have to process each element and create an sdo_geometry object.
    Now let me ask a very broad question: What would be a good way to handle this?

    I have the same question. Any news on this?
    Wijnand

  • PI processing large XML payload?

    Hi experts,
    In my scenario, I call an RFC and the RFC has a response. The response payload is 10 MB, and my Java stack crashed with an out-of-memory error. Is there any way to expand the PI Java memory?
    Thanks, best regards

    Hi Raja Sekhar Reddy T,
    Thanks for your response. My scenario is ABAP proxy <-> PI <-> RFC. When I call the proxy from ABAP code, the RFC's response message payload is tremendous, so PI crashed with an out-of-memory error. The response message structure is like the following:
    <Persons>
    <Person>
    <personname></personname>
    <***></***>
    </Person>
    <Person>
    <personname></personname>
    <***></***>
    </Person>
    </Persons>
    The Person element occurs 0..unbounded, so I can split the message into small pieces. But I have some doubts:
    1. Where can I split the message? If I split the message in the mapping, the JVM will crash when the response message is loaded into JVM memory before the splitting even starts, because the original message is tremendous.
    2. If the splitting succeeds, how does the RFC that calls the ABAP proxy receive the response as multiple messages?
    Thanks very much.

  • Handling xml message of size more than 100mb in SAP PI 7.1

    Dear Experts,
    Is it possible for PI 7.1 EHP 1 to pick up and process an XML message larger than 100 MB?
    If yes, can you please let me know how to handle it?
    Thank  you.

    Hi Saravana,
    It is not best practice to process more than 100 MB, but you can increase the parameters below so that you can process as well as possible:
    • UME parameters: we may need to look into the pool size and pool max wait parameters - recommended UME parameters (e.g. poolmaxsize=50, poolmaxwait=60000)
    • Tuning parameters: we may need to look at/define the message size limit (e.g. EO_MSG_SIZE_LIMIT = 0000100) under the tuning category
    • ICM parameters: we may need to consider ICM parameters (e.g. icm/conn_timeout = 900000, icm/HTTP/max_request_size_KB = 2097152)
    Thanks and Regards,
    Naveen

  • Table Names of Processed  XML Messages

    Hi All,
    We are facing a space problem in the PRD box of one of our clients, due to an increase in the number of messages.
    Our Basis team tried archiving, but ended up with no luck.
    Could anyone tell us in which tables processed IDoc XML messages are stored?
    Any help is highly appreciated.
    Kind Regards,
    Vijay

    Hi,
    You should not delete XI messages directly from the database tables; please schedule archive or delete jobs instead. If you never defined which interfaces should be archived, it cannot be done afterwards; this is also documented clearly on help.sap.com. What you can do is delete the old messages to free table space, and configure now which interfaces should be archived, so that messages processed by XI in the future can be archived.
    An XI message is actually stored across several tables:
    SXMSPMAST, SXMSPMAST2 - XML message master tables; they contain runtime information on XML messages processed by XI. Here you can find all the information that is visible in SXMB_MONI.
    SXMSCLUP - XI message payload table
    SXMSCLUR - XI message payload table
    SMPPREL3 - contains information about interfaces; here you can find information on the sender (party, service, interface and namespace) and the receiver (party, service, interface and namespace), as well as information on mappings (their GUID, name and namespace).
    SXMSPFRAWH - performance header data, audit log
    SXMSPFRAWD - performance data, audit log
    There could be more tables used to store information about XI messages. As I mentioned, you modify database tables at your own risk: you will not get support from SAP, and it could cause a fatal error.
    regards,
    Hai

  • XML messages not processed

    Hi all,
    I have a problem with my FTP folder. I have a text file that should be processed into XML messages in XI. The problem is, it's not being processed: the text file goes straight to the archive folder without being processed into an XML file. It does not appear in SXMB_MONI or SMQ2. I'm not sure what happened; it was being processed before.
    Thanks.
    Regards,
    IX

    Check your sender file adapter status from the Communication channel monitoring on your Runtime Workbench.
    Thanks,
    Pooja Pandey

  • Best way to parse XML Message ?

    Hi,
    I have to implement in PL/SQL an interface which receives for example orders in xml format. Something like this:
    <order>
    <order_no>4711</order_no>
    <order_line>
    <skuid>10001</skuid>
    <qty>34</qty>
    </order_line>
    <order_line>
    <skuid>10002</skuid>
    <qty>35</qty>
    </order_line>
    </order>
    I have to parse this xml and store it into an order_mast and order_detl table.
    I'm uncertain which would be the best approach. I read in this document http://www.oracle.com/technology/tech/xml/xdk_sample/xdksample_040602.pdf that it can be done with an XDK SAX parser.
    According to this document, performance is better with SAX than with DOM for large XML messages.
    But how "large" is large? In my case the XML message can theoretically have several thousand lines; is this already a performance issue for DOM?
    Besides performance, are there any good reasons to use DOM instead of SAX?
    Are there any even better possibilities for doing this in Oracle 10g?
    I read about some DBMS_XML packages, but don't really know them.
    I really appreciate any hint or suggestion.
    Many Thanks!
    Peter

    I use the DBMS_XMLDOM package to parse through XML Schemas (about 2Mb in text file terms, thousands of lines (never counted them)) on my desktop, and including all of the processing I do and storing all the information in database tables, the whole lot is processed in about 30 seconds. These are fairly large schemas with lots of included schemas. I imagine it's gonna be a helluva lot faster when we put it on our server, so I don't think DBMS_XMLDOM is a slow way of doing it. Guess the best thing to do is to try different ways and do some performance testing based on your own business requirements i.e. varying size of XML messages and frequency/amount of messages being received.
    ;)
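To illustrate the SAX-versus-DOM trade-off discussed above in a language-neutral way, here is a hedged Java sketch of the SAX event model (Oracle's XDK SAX parser follows the same pattern); the class name OrderSaxDemo is illustrative, and the XML mirrors the order sample from the question:

```java
import java.io.ByteArrayInputStream;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

public class OrderSaxDemo {
    public static void main(String[] args) throws Exception {
        String xml = "<order><order_no>4711</order_no>"
                   + "<order_line><skuid>10001</skuid><qty>34</qty></order_line>"
                   + "<order_line><skuid>10002</skuid><qty>35</qty></order_line>"
                   + "</order>";

        List<String> skuids = new ArrayList<>();

        DefaultHandler handler = new DefaultHandler() {
            StringBuilder text = new StringBuilder();
            public void startElement(String uri, String local, String qName,
                                     Attributes atts) {
                text.setLength(0); // reset character buffer per element
            }
            public void characters(char[] ch, int start, int len) {
                text.append(ch, start, len);
            }
            public void endElement(String uri, String local, String qName) {
                // Each completed <skuid> is handled immediately; the rest
                // of the document is never held in memory, unlike DOM
                if ("skuid".equals(qName)) skuids.add(text.toString());
            }
        };

        SAXParserFactory.newInstance().newSAXParser()
            .parse(new ByteArrayInputStream(xml.getBytes("UTF-8")), handler);

        System.out.println(skuids);
    }
}
```

Each order_line could be flushed to order_mast/order_detl as soon as its end tag is seen, so memory use does not grow with document size, which is the property that makes SAX preferable to DOM for very large messages.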

  • Large XML and DBMS_XMLSTORE

    Oracle DB = 10.2.0.3
    I have XML messages (size range 300KB - 15MB) that are inserted into a table with an XMLType column throughout the day.
    The volume will vary...say 10 transactions per minute. A single transaction can contain any number of entities/objects.
    I have no need to preserve the XML - aside from assigning it a sequence number, date etc. and storing it for debugging (if needed).
    Currently, I have a trigger defined on the table that calls various database packages
    that shred the data via XMLTable (i.e. insert into tab1...select...xmltable..).
    The problem is that I potentially need to insert into 50+ tables out of 200+ tables per transaction. So, I'm passing through the single instance document 200+ times.
    The number of tables that are needed to represent the XML will continue to grow.
    A 300KB XML message takes approx 10 seconds to shred. Obviously, I'm concerned about the larger messages.
         So, I was wondering if doing the following would be any better:
    1. Have a XSLT transform the original message into "pseudo" Oracle canonical format.
    At this point, all of the data contained in the message would need to be shredded and all unwanted data from the original message would be filtered..i.e. lowering the message size a little.
    The message to be inserted would then look something like this:
    <Transaction>
      <Table>
        <Name>TAB1</Name>
        <ROWSET>
          <ROW>
            <COL1>...</COL1>
            <COLn>...</COLn>
          </ROW>
        </ROWSET>
      </Table>
      <Table>
        <Name>TAB2</Name>
        <ROWSET>
          <ROW>
            <COL1>...</COL1>
            <COLn>...</COLn>
          </ROW>
        </ROWSET>
      </Table>
    </Transaction>
    2. Next, define a trigger on the table like this:
    CREATE OR REPLACE TRIGGER tr_ai_my_table
    AFTER INSERT ON my_table
    FOR EACH ROW
    DECLARE
      CURSOR table_cur IS
        SELECT a.table_nm
             , a.xml_fragment
          FROM XMLTABLE ('/Transaction/Table[Name]'
                         PASSING :NEW.xml_document
                         COLUMNS
                           table_nm     VARCHAR2(30) PATH 'Name'
                         , xml_fragment XMLTYPE      PATH 'ROWSET'
                        ) a;
      v_context DBMS_XMLSTORE.ctxtype;
      v_rows    NUMBER;
      -- XML Date format: 2007-08-02
      -- XML Datetime format: 2007-08-02T11:58:28.229-04:00
      -- ALTER SESSION SET NLS_DATE_FORMAT = 'YYYY-MM-DD';
      -- ALTER SESSION SET NLS_TIMESTAMP_FORMAT = 'YYYY-MM-DD-HH24.MI.SS.FF';
      -- ALTER SESSION SET NLS_TIMESTAMP_TZ_FORMAT = 'YYYY-MM-DD"T"HH24:MI:SS.FFTZH:TZM';
    BEGIN
      FOR table_rec IN table_cur
      LOOP
        v_context := DBMS_XMLSTORE.newcontext (table_rec.table_nm);
        v_rows    := DBMS_XMLSTORE.insertxml (v_context, table_rec.xml_fragment);
        DBMS_XMLSTORE.closecontext (v_context);
      END LOOP;
    END;
    I know this will get me out of maintaining 200+ insert statements/XPaths. Also, I wouldn't be coding the XSLT myself, and the XSLT may end up running on specialized hardware for optimized performance.
    But do I gain a significant performance boost using the Oracle canonical format and DBMS_XMLSTORE, even though I'm still building the DOM tree to build the cursor?

    I saw a problem with the XmlBeans implementation in WLI 8.1:
    we had memory issues with large XML messages, as they seemed to stay in memory even though the JPD had finished.
    To solve this, we inserted code at the end of a process to assign the top-level XmlBeans document class to a tiny version of the same message type. This is valid when the document object is a top-level variable of the JPD.
    This was done to stop XmlBeans from caching the large message behind the scenes.

  • XML message email (problem)

    http://www.flickr.com/photos/25222280@N03/
    Hi, All
    We are using XI 3.0, Support Package 20.
    Let me explain the problem step by step; please help me with it. Thanks in advance.
    Step 1:
    I am trying to get XML messages from the XI system; the messages show up in SXMB_MONI with some read flags.
    Step 2:
    SAPconnect is working fine: I can send email to anyone internally through SO01, so the connection works.
    Step 3:
    I go to ALRTCATDEF and create an alert category, XDV_ALRT, then go to RWB, click Alert Configuration, select the classification and click Rules; it does show my user ID and alert category there.
    Then I received XML messages in my email with an attachment file and the subject "Alert", which I didn't define anywhere. I don't know where these messages come from.
    Step 4: I added some other users in ALRTCATDEF, such as make_m and amer_J, as well as my own user ID, but only I am getting these messages, not the others. I double-checked the user profiles and roles; they all look pretty much the same.
    My simple question is: I want to send XML messages to some users through email, that's it.
    I am sending some screenshots; I think something is missing or I am doing something wrong. Please take a look:
    http://www.flickr.com/photos/25222280@N03/
    Thanks

    Hello Aamir,
    Thanks for the very fast response. Aamir, when I go to SXMB_MONI, the first screen that comes up is "Monitor for Processed XML Messages".
    First screen:
    View = Default
    Status group is empty
    Standard selection criteria (from date / to date: I just set a date a couple of days back)
    Sender and receiver fields are all empty
    Second screen:
    When I execute, it shows XML messages with a black flag and a red dot; all red-dot entries come with an attachment email.
    This screen has some buttons like "Display", "Error Information", "Referencing Information", "Reset", "Shows",
    and the error looks like this before I click on it:
    @5C\QSystem Error - Restart Not Possible@ @5F@ 02.04.2008 02:14:39 02.04.2008 02:14:43 MOL_NQA tp://www.abc.com/MOL/OnlineStore O_msgIF_CHECK_BILLING
    dbcinqa http://www.mitel.com/MOL/OnlineStore I_msgIF_CHECK_BILLING "Current Status" @5F@ @5F@ @5F@ 6
    When I click on it, a third screen comes up with some buttons like Window 1, Window 2, ...,
    with red error messages on the right side.
    Those error messages come with my attachment.
    I don't see any scenarios; how may I check where that is? Please let me know.
    My name is also Aamir.
    Thanks

  • Split bulk XML message into many

    Hi,
    How do I split a large XML message into many XML messages in BizTalk 2006? For example, if the XML message contains more than 10,000 records, I have to split it into chunks of 10,000 records each. Can anyone help me?
    Thanks,

    Hi,
    There are four methods to get your message debatched:
    Receive Port Pipeline Debatching
    Orchestration debatching by calling a pipeline.
    Orchestration XPath Debatching 
    Orchestration Atomic Scope Node List Debatching
    1)
    Receive Port Pipeline Debatching
    http://social.technet.microsoft.com/wiki/contents/articles/26005.biztalk-server-debatch-xml-with-envelope.aspx
    2)
    Orchestration debatching by calling a pipeline. 
    http://jeremyronk.wordpress.com/2011/10/03/how-to-debatch-into-an-orchestration-with-a-pipeline/
    3)
    Orchestration XPath Debatching
    http://www.digitaldeposit.net/blog/2006/12/message-debatching-inside-biztalk.html
    4)
    Orchestration Atomic Scope Node List Debatching
    http://geekswithblogs.net/sthomas/archive/2005/03/21/26924.aspx
    You can check out the performance of each method @
    http://geekswithblogs.net/sthomas/archive/2004/12/12/17373.aspx
    Rachit
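The BizTalk links above all implement the same core idea: stream through the bulk message and emit chunks of N records. Outside BizTalk, the idea can be sketched with any streaming parser; this is a hedged Java/StAX illustration (the element names records/rec and the chunk size are made up for the example):

```java
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;

public class DebatchDemo {
    // Split <records><rec>..</rec>...</records> into batches of
    // chunkSize record payloads each, streaming and without a DOM
    static List<List<String>> split(String xml, int chunkSize) throws Exception {
        XMLStreamReader r = XMLInputFactory.newInstance()
            .createXMLStreamReader(new StringReader(xml));
        List<List<String>> chunks = new ArrayList<>();
        List<String> current = new ArrayList<>();
        while (r.hasNext()) {
            if (r.next() == XMLStreamConstants.START_ELEMENT
                    && "rec".equals(r.getLocalName())) {
                current.add(r.getElementText()); // one record at a time
                if (current.size() == chunkSize) {
                    chunks.add(current);
                    current = new ArrayList<>();
                }
            }
        }
        if (!current.isEmpty()) chunks.add(current); // trailing partial chunk
        r.close();
        return chunks;
    }

    public static void main(String[] args) throws Exception {
        String xml = "<records><rec>1</rec><rec>2</rec><rec>3</rec>"
                   + "<rec>4</rec><rec>5</rec></records>";
        System.out.println(split(xml, 2));
    }
}
```

In a real debatching scenario each chunk would be wrapped back into a valid envelope document before being sent onward.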

  • Large XML File in JMS Message

    Hi Experts,
    I have to send large XML files (20 MB+) to a message listener for processing. Would you please comment on whether this is a good approach, or suggest a tested and proven solution for this purpose? A prompt response will be highly appreciated.
    Regards
    Shahzad Mahmood

    You can use the following algorithm to send large messages:
    - use a BytesMessage to send the large message
    - break the message into chunks (the chunk size can be configurable)
    - in the header of the first message, add a property to indicate that it is the first message
    - in the header, add a property for the size of each message
    - in the header of the last message, add a property to indicate that it is the last message
    On the receiver side:
    - check the header properties of the message
    - if it is the first message, open a stream (e.g. a file stream)
    - keep writing messages to this stream until you receive the last message (identified by the header property)
    - close the stream after processing the last message.
    BTW some JMS products like FioranoMQ (www.fiorano.com) provide built-in support for large messages.
    You might want to look into such products.
    Thanks
    Bhuvan
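Bhuvan's chunking protocol can be sketched independently of any particular JMS provider. This is an illustrative sketch only: the class and method names are made up, and the list position stands in for the first/last header properties a real BytesMessage would carry:

```java
import java.io.ByteArrayOutputStream;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class ChunkingDemo {
    // Producer side: break a large payload into fixed-size chunks.
    // In JMS each chunk would be sent as a BytesMessage carrying
    // header properties such as isFirst/isLast; here the position
    // in the list stands in for those headers.
    static List<byte[]> toChunks(byte[] payload, int chunkSize) {
        List<byte[]> chunks = new ArrayList<>();
        for (int off = 0; off < payload.length; off += chunkSize) {
            int end = Math.min(off + chunkSize, payload.length);
            chunks.add(Arrays.copyOfRange(payload, off, end));
        }
        return chunks;
    }

    // Consumer side: write each chunk to an output stream as it
    // arrives, closing the stream after the "last message" marker
    static byte[] reassemble(List<byte[]> chunks) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        for (byte[] c : chunks) out.write(c, 0, c.length);
        return out.toByteArray();
    }

    public static void main(String[] args) {
        byte[] big = "a-large-xml-message".getBytes();
        List<byte[]> chunks = toChunks(big, 5);
        System.out.println(chunks.size() + " chunks, intact="
            + Arrays.equals(big, reassemble(chunks)));
    }
}
```

In a real consumer the reassembly target would be a FileOutputStream, so the full message never has to fit in memory on either side.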
