Processing large volumes of IDocs using BPM

Hi,
I have a scenario in which SAP R/3 sends a large volume of IDocs, say 30,000 DEBMAS IDocs, to XI. XI then sends the data to 3 legacy systems using the JDBC adapter.
I created a BPM process which waits for 4 hrs to collect all the IDocs. This is what my BPM does:
1. Wait for 4 hrs and collect the IDocs.
2. For every IDoc, do an IDoc-to-JDBC message transformation.
3. Append the result to a big list.
4. Loop over the big list from step 3, and inside the loop:
5. Start a counter from 0 and increment it, appending each message to a small list.
6. If the counter reaches 100, send a batch JDBC message in a send step.
7. Reset the counter after every send.
8. Process the remaining list, i.e. if there was an uneven count of, say, 5,353 IDocs, the remaining 53 IDocs are sent in another block (the batching logic is sketched below).
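In other words, steps 4-8 amount to the following chunking logic. This is only a language-neutral sketch (written as PL/SQL for concreteness); the list type and the send_batch procedure are hypothetical stand-ins for the BPM container and send step, not actual BPM code:
DECLARE
  TYPE t_list IS TABLE OF VARCHAR2(4000);   -- stand-in for the mapped JDBC messages
  big_list   t_list := t_list();            -- assume filled by steps 1-3
  small_list t_list := t_list();
  PROCEDURE send_batch(p_batch IN t_list) IS
  BEGIN
    NULL;                                   -- placeholder for the batch JDBC send step
  END;
BEGIN
  FOR i IN 1 .. big_list.COUNT LOOP
    small_list.EXTEND;
    small_list(small_list.COUNT) := big_list(i);
    IF small_list.COUNT = 100 THEN          -- counter reached 100: flush one batch
      send_batch(small_list);
      small_list := t_list();               -- reset after every send
    END IF;
  END LOOP;
  IF small_list.COUNT > 0 THEN              -- remainder, e.g. 53 of 5,353 IDocs
    send_batch(small_list);
  END IF;
END;
/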
After sending 5,000 IDocs to the above BPM, the following problems occur:
1. I cannot read the workflow log, as the system does not respond.
2. In the for-each loop which loops through the big list of, say, 5,000 IDocs, only the first pass of 100 was processed; after that the workflow item does not move ahead. It remains in status "STARTED" but I do not see any further processing.
Please tell me why certain work items are stuck. Is it because I have reached an upper limit, and is this the right approach? The main BPM process has also been hanging for the last 2 days.
I have concerns about using BPM for processing such a high volume of IDocs in production. Please advise, and thanks in advance.
Regards
Ashish

Hi Ashish,
Please read SAP's checklist for proper usage of BPMs: http://help.sap.com/saphelp_nw04/helpdata/en/43/d92e428819da2ce10000000a1550b0/content.htm
One point I'm wondering about: why do you send the IDocs out of R/3 one by one instead of using packaging there? From a performance standpoint this is much better than a BPM.
The SAP Checklist states the following:
<i>"No Replacement for Mass Interfaces
Check whether it would not be better to execute particular processing steps, for example, collecting messages, on the sender or receiver system.
If you only want to collect the messages from one business system to forward them together to a second business system, you should do so by using a mass interface and not an integration process.
If you want to split a message up into lots of individual messages, also use a mass interface instead of an integration process. A mass interface requires only a fraction of the back-end system and Integration-Server resources that an integration process would require to carry out the same task. "</i>
Also you might want to have a look at the IDoc packaging capabilities within XI (available since SP14, I believe): http://help.sap.com/saphelp_nw04/helpdata/en/7a/00143f011f4b2ee10000000a114084/content.htm
And here is Sravya's good blog about this topic: /people/sravya.talanki2/blog/2005/12/09/xiidoc-message-packages
If for whatever reason you can't or don't want to use IDoc packets from R/3 or XI, there are other points you can focus on to optimize your process:
In the section "Using the Integration Server Efficiently" there is an overview of which steps are costly in their resource consumption and which are not. Mappings are one of the steps that tend to consume a lot of resources, and unless it is a multi-mapping that cannot be executed outside a BPM, there is always the option to do the mapping in the interface determination, either before or after the BPM. So I would suggest that if your step 2 is not a multi-mapping, you try to execute it before entering the BPM and just handle the JDBC messages in the BPM.
Wait steps are also costly, so reducing the time in your wait step could lead to better performance. Or, if possible, you could omit the wait step altogether and create a process that waits for 100 messages and then processes them.
Regards
Christine

Similar Messages

  • Processing large volumes of data in PL/SQL

    I'm working on a project which requires us to process large volumes of data on a weekly/monthly/quarterly basis, and I'm not sure we are doing it right, so any tips would be greatly appreciated.
    Requirement
    Source data is in a flat file in "short-fat" format i.e. each data record (a "case") has a key and up to 2000 variable values.
    A typical weekly file would have maybe 10,000 such cases i.e. around 20 million variable values.
    But we don't know which variables are used each week until we get the file, or where they are in the file records (this is determined via a set of meta-data definitions that the user selects at runtime). This makes identifying and validating each variable value a little more interesting.
    Target is a "long-thin" table i.e. one record for each variable value (with numeric IDs as FKs to identify the parent variable and case).
    We only want to load variable values for cases which are entirely valid. This may be a merge i.e. variable values may already exist in the target table.
    There are various rules for validating the data against pre-existing data etc. These rules are specific to each variable, and have to be applied before we put the data in the target table. The users want to see the validation results - and may choose to bail out - before the data is written to the target table.
    Restrictions
    We have very limited permission to perform DDL e.g. to create new tables/indexes etc.
    We have no permission to use e.g. Oracle external tables, Oracle directories etc.
    We are working with standard Oracle tools i.e. PL/SQL and no DWH tools.
    DBAs are extremely resistant to giving us more disk space.
    We are on Oracle 9iR2, with no immediate prospect of moving to 10g.
    Current approach
    Source data is uploaded via SQL*Loader into static "short fat" tables.
    Some initial key validation is performed on these records.
    Dynamic SQL (plus BULK COLLECT etc) is used to pivot the short-fat data into an intermediate long-thin table, performing the validation on the fly via a combination of including reference values in the dynamic SQL and calling PL/SQL functions inside the dynamic SQL. This means we can pivot+validate the data in one step, and don't have to update the data with its validation status after we've pivoted it.
    This upload+pivot+validate step takes about 1 hour 15 minutes for around 15 million variable values.
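    To make the pivot+validate step concrete, a minimal sketch of one column's worth of that dynamic SQL might look like the following. All names here (intermediate_values, short_fat_1, validate_value) are hypothetical stand-ins; the real statement is assembled at runtime from the meta-data that maps COL001..COL250 to variable definitions:
    DECLARE
      l_var_id NUMBER := 101;  -- variable definition mapped to COL001 this week
      l_sql    VARCHAR2(4000);
    BEGIN
      -- Pivot one short-fat column into the long-thin intermediate table,
      -- validating on the fly by calling a PL/SQL function inside the SQL.
      l_sql := 'INSERT INTO intermediate_values
                  (case_num_id, variable_id, variable_value, status)
                SELECT s.case_num_id, :var_id, s.col001,
                       validate_value(:var_id, s.col001)
                FROM   short_fat_1 s';
      EXECUTE IMMEDIATE l_sql USING l_var_id, l_var_id;  -- binds are positional
    END;
    /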
    The subsequent "load to target table" step also has to apply substitution rules for certain "special values" or NULLs.
    We do this by BULK collecting the variable values from the intermediate long-thin table, for each valid case in turn, applying the substitution rules within the SQL, and inserting into/updating the target table as appropriate.
    Initially we did this via a SQL MERGE, but this was actually slower than doing an explicit check for existence and switching between INSERT and UPDATE accordingly (yes, that sounds fishy to me too).
    This "load" process takes around 90 minutes for the same 15 million variable values.
    Questions
    Why is it so slow? Our DBAs assure us we have lots of table-space etc, and that the server is plenty powerful enough.
    Any suggestions as to a better approach, given the restrictions we are working under?
    We've looked at Tom Kyte's stuff about creating temporary tables via CTAS, but we have had serious problems with dynamic SQL on this project, so we are very reluctant to introduce more of it unless it's absolutely necessary. In any case, we have serious problems getting permissions to create DB objects - tables, indexes etc - dynamically.
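    (For anyone unfamiliar with the CTAS technique referred to above, it boils down to a single statement like the following, with hypothetical names; note that it needs exactly the CREATE TABLE permission that is the sticking point here:)
    -- Create-table-as-select: build a work table from a query in one step.
    CREATE TABLE work_valid_values NOLOGGING AS
    SELECT case_num_id, variable_id, variable_value
    FROM   intermediate_values
    WHERE  status = 'VALID';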
    So any advice would be gratefully received!
    Thanks,
    Chris

    We have 8 "short-fat" tables to hold the source data uploaded from the source file via SQL*Loader (the SQL*Loader step is fast). The data consists of strings of characters, which we treat as VARCHAR2 for the most part.
    These tables consist essentially of a case key (composite key initially) plus up to 250 data columns. 8*250 = 2000, so we can handle up to 2000 of these variable values. The source data may have any number of variable values in each record, but each record in a given file has the same structure. Each file-load event may have a different set of variables in different locations, so we have to map the short-fat columns COL001 etc. to the corresponding variable definition (for validation etc.) at runtime.
    CASE_ID VARCHAR2(13)
    COL001 VARCHAR2(10)
    ...
    COL250 VARCHAR2(10)
    We do a bit of initial validation in the short-fat tables, setting a surrogate key for each case etc (this is fast), then we pivot+validate this short-fat data column-by-column into a "long-thin" intermediate table, as this is the target format and we need to store the validation results anyway.
    The intermediate table looks similar to this:
    CASE_NUM_ID NUMBER(10) -- surrogate key to identify the parent case more easily
    VARIABLE_ID NUMBER(10) -- PK of variable definition used for validation and in target table
    VARIABLE_VALUE VARCHAR2(10) -- from COL001 etc
    STATUS VARCHAR2(10) -- set during the pivot+validate process above
    The target table looks very similar, but holds cumulative data for many weeks etc:
    CASE_NUM_ID NUMBER(10) -- surrogate key to identify the parent case more easily
    VARIABLE_ID NUMBER(10) -- PK of variable definition used for validation and in target table
    VARIABLE_VALUE VARCHAR2(10)
    We only ever load valid data into the target table.
    Chris

  • Idoc using inbound process

    Hi,
    My issue is regarding inbound IDoc processing using the FM BAPI_IDOC_DISTRIBUTE_LIST1.
    Here the user wants the dhmt plant to be referenced instead of the mirror plant.
    Can anybody help me with this, or point me to any documentation on inbound IDocs?

    Seetha,
    I have this already..
    IDOCPACKET
      Object type     IDPKWMMBXY
      End event       MASSINPUTFINISHED
    IDOC
      Object type     IDOCWMMBXY
      Start event     INPUTERROROCCURRED
      End event       INPUTFINISHED
    Application object
      Object type     BUS2017
      Start event

  • Collection of IDOCs using BPM

    Hi Experts,
    I am trying to collect IDocs by using BPM.
    I ran into problems, and in moni the error is:
    <SAP:Code area="BPE_ADAPTER">SYSTEM_FAILURE_INTERNAL</SAP:Code>
      <SAP:P1 />
      <SAP:P2 />
      <SAP:P3 />
      <SAP:P4 />
      <SAP:AdditionalText />
      <SAP:ApplicationFaultMessage namespace="" />
      <SAP:Stack>An internal error has occurred</SAP:Stack>
    Details of the above error in the trace are:
    <Trace level="1" type="T">--start sender interface action determination</Trace>
      <Trace level="1" type="T">select interface BLAORD.BLAORD03*</Trace>
      <Trace level="1" type="T">select interface namespace urn:sap-com:document:sap:idoc:messages</Trace>
      <Trace level="1" type="T">no interface found</Trace>
      <Trace level="1" type="T">--start receiver interface action determination</Trace>
      <Trace level="1" type="T">Loop 0000000001</Trace>
      <Trace level="1" type="T">select interface UpdatedContracts_abs_idocs*</Trace>
      <Trace level="1" type="T">select interface namespace http://com/UpdatedContracts/</Trace>
      <Trace level="1" type="T">no interface found</Trace>
      <Trace level="1" type="T">--no sender or receiver interface definition found</Trace>
      <Trace level="1" type="T">Hence set action to DEL</Trace>
      <Trace level="1" type="B" name="CL_IDX_IDOC_RESOURCE-GETBLOBDATA" />
      <Trace level="1" type="B" name="CL_XMS_MAIN-PERSIST_READ_MESSAGE" />
      <Trace level="1" type="T">Note: the following trace entry is written delayed (after read from persist)</Trace>
    Here I am trying to collect IDocs for 5 minutes and I need to send them in XML format. I have mapping logic here and that logic is working perfectly.
    What I am doing is:
    1. I am sending IDocs from the R/3 system to the BPM in XI. (To push IDocs from the R/3 system to the BPM, do we need to do extra ALE settings? Without BPM, a simple IDoc-to-file scenario works fine; only with the BPM do I get the error.)
    2. How do I resolve the above problem? As per my understanding everything is fine.
    Thanks,
    Subbu

    Hi Rama Subbarao,
    Error: check your loop condition and process steps.
    Collecting IDocs using BPM is fine, but if you want to send 100 IDocs per hour with a single BPM it takes time, and performance and hardware issues arise. If you send 1,000 IDocs per hour it takes much longer.
    Alternative process: collect the IDocs by downloading them to the file system. Once the IDocs are downloaded to a file, they can no longer be changed. Then use the file adapter to send the flat-file data from the file system to XI. Performance-wise this approach is very good.
    Regards,
    Sateesh N

  • Can XI process the sold-to IDoc before processing the ship-to IDoc

    Hi Experts,
    Can XI process the sold-to IDoc before processing the ship-to IDoc? For example, if I have 10 IDocs each for sold-to and ship-to, can XI process all sold-to IDocs first, then process the ship-to IDocs right after? I know that this can be achieved by using BPM. Is there any other way?
    Thanks and regards,
    Prasad

    There exists a possibility to serialize the IDocs based on various criteria:
    http://help.sap.com/saphelp_erp2004/helpdata/en/0b/2a66c9507d11d18ee90000e8366fc2/frameset.htm
    Regards,
    Prateek

  • Split of a Large Volume Outbound Idoc

    Hi,
    Can anyone tell me how splitting of a large-volume outbound IDoc works?
    /Elvez

    Hi Elvez,
    One way to split your IDoc is to group your IDoc into different segments.
    You can create segments and group your data logically.
    Go through the following link. It will give you good tips on IDocs:
         http://www.netweaverguru.com/EDI/HTML/IDocBook.htm
      other helpful links are...
         ALE/ IDOC/ XML
         http://www.sapgenie.com/sapgenie/docs/ale_scenario_development_procedure.doc
         http://www.thespot4sap.com/Articles/SAP_XML_Business_Integration.asp
         http://help.sap.com/saphelp_srm30/helpdata/en/72/0fe1385bed2815e10000000a114084/content.htm
    Good luck. Reward me for the same.
    Thanks
    Ashok

  • How to process files sequentially in PI using bpm

    Hi Folks,
    I have a requirement where the sender file adapter has to pick up multiple files, by file name, with some time gap between them. Is that possible? Normally I get 40 files in the source directory with a time gap of 1 to 2 hours, but in some situations, for example when the system goes down or the server is stopped for refresh work, two days' worth of files arrive in the source directory. Afterwards SAP PI tries to process the files all at once, and the messages do not go in order.
    I have a BPM in this interface. I have tried processing mode By Name and By Date, with a wait step in the BPM, but no use. The way PI behaves, if there are 40 files in the directory it picks all of them in one shot and starts processing, but not in order. Even if it did, the SNC system can't process 4 at a time; it has to process files with some time gap.
    The problem is on the receiver side. The receiver is an SNC system; if older data arrives later than newer data, we get a "data obsolete" application error.
    Example: if I receive the files for the 25th and the 26th, PI first needs to process the 25th file and send it to SNC; then I need some time gap before picking the next file. Even if PI picks and processes the 26th file immediately, that is no problem, but I need some time gap before sending the 26th file to SNC.
    Please throw me your ideas, guys.
    Step 1: I configured the sender file adapter with the By Name property to sort the files, but sometimes PI picks the newer file first and the older one later. My question here is how to configure the adapter to pick files sorted by name. The file name I have given is like xml_0809008998_*.xml.
    Step 2: even after PI picks the files in order, the messages are not sent in order to the target system. I configured the BPM with a receive step, then a transformation step, then a wait step to process files with a time gap; after that a block step (mode Default), inside which I used 2 block steps.
    My question here is how to configure the BPM to process the messages in order.
    Thanks in advance!!
    Regards
    SG

    Hi,
    In the sender file communication channel use Processing Sequence = By Date. Then use Quality of Service EOIO and provide a queue name. Use the same queue name in the receiver communication channel as well. That way files will be picked up by file date and messages will be passed to the SNC system on a "first in, first out" basis.
    Regards,
    Nayan

  • Bundling of idocs using BPM

    Hi
    If you are using BPM to bundle the IDocs and send them in file format to the target system, we can think of 3 different options on which to bundle:
    1. Payload based
    2. Message based
    3. Time based
    Could you please confirm whether we can achieve all 3 of these options without BPM, in the application system? If we can achieve this in the application system itself, are there any situations in which we are forced to use BPM for this particular scenario (bundling of IDocs)?
    Thanks
    Kumar

    Hi Kumar,
    A very simple example which I have faced:
    If you bundle IDocs at the application level, i.e. by setting up an XML port to collect IDocs, you will have multiple control records, i.e. a control record for every IDoc, whereas if you bundle them using BPM you can have only one control record (which is what we need most of the time while bundling IDocs).
    Also, in BPM you can set a correlation on payload data, and you can have multiple conditions there, but at the application level you do not get that control. Basically, in BPM you get control over the payload and can play with it as you want.
    Then again, even at the application level, if you try to bundle a large number of IDocs it will give you a memory dump (in SAP).
    But bundling IDocs at the application level is a faster process than BPM.
    Nilesh

  • File to Idocs using BPM

    Hi,
    Current Interface Flow - File (xml) to Idoc (single Idoc type)
    A third party sends a file for Goods Receipt. This file may have multiple orders.
    Orders are sorted in an XSLT by ORDNUM. Each record has a delivery type (Delivery Type PO or Delivery Type STO). For each order, graphical message mapping occurs.
    In case of PO, the inbound delivery number and item number have to be retrieved from ECC via an RFC call and passed to VBELN and POSNR, while in case of STO, the source fields ORDNUM and ORDITEM are passed to VBELN and POSNR. (The RFC call happens only for POs, not for STOs.)
    In case of STO, the TCode will be MB0A and the IDoc would be WMMBXY (as is the case right now).
    In case of PO, there may be two cases -
         a) There is at least one response from ECC for all the line items inside an order --> in this case TCode will be MB0A and IDoc is  WMMBXY .
         b) If there is no inbound delivery number for any line item from ECC inside an order --> TCode MB01 will be used and IDoc used is WMMBXY
    So for all the above cases Target IDoc is WMMBXY.
    Changes to be done - Now  File (xml) to Idoc (Two different Idoc types, WMMBXY and DELIVERY03)
    Now there is an additional requirement that I will be using an additional IDoc (DELIVERY03) in above case a).
    I want to use BPM in this case
    1) RFC call in message mapping to get the inbound delivery number (for POs only) (Should be before the message enters BPM)
    2) Separating the messages based on IDoc type
    3) Interface / Message mapping for each IDoc type
    4) Send the message to ECC
    Please suggest how to proceed with the BPM.
    Thanks,
    Varun
    Edited by: Varun Reddy on Feb 4, 2011 4:47 PM

    Hi Varun,
    Follow this thread; you have the answer for this.
    This link gives design ideas for your requirement:
    /people/sudharshan.aravamudan/blog/2005/12/01/illustration-of-multi-mapping-and-message-split-using-bpm-in-sap-exchange-infrastructure
    Follow Bhavesh's discussion; it might be helpful too:
    Message Split: File to Multiple IDOC Types

  • FILE TO IDOC USING BPM

    Hi,
    I am a fresher and new to XI. I want to know the step-by-step procedure for a file-to-IDoc scenario using BPM. I require it urgently.
    Please help me out.
    Thanks and have a nice day.

    Sangeet - Check this blog:
    /people/anish.abraham2/blog/2005/12/22/file-to-multiple-idocs-xslt-mapping (File to multiple IDocs using IP in ccBPM)
    Hope this helps.
    Cheers!
    Shireesh M

  • Idoc to Idoc using BPM

    Dear Friends,
    I am sending an IDoc from one ECC server to another ECC server using XI. This part is complete, but on the receiver side it should be updated only after one hour.
    So in this case, how is this possible with BPM, or is any other method available?
    Please let me know the process.
    Regards,
    Shalini shah.

    I am sending an IDoc from one ECC server to another ECC server using XI. This part is complete, but on the receiver side it should be updated only after one hour.
    So in this case, how is this possible with BPM, or is any other method available?
    With BPM you can have a wait step in between the receive and the send, but it is not a good option to make the BPM wait for 1 hour.
    Without BPM, refer to the replies of Samiullah in these two threads:
    Re: Can I slow down Processing of each mesg in PI?
    Re: Delay in MM possible ?
    Regards,
    Abhishek.

  • Bundling of IDOCs using BPM - please help, it's very urgent

    Hi all,
    I have an IDoc-to-file scenario in which I have to bundle some number of INVOIC02 IDocs.
    For this I have an IDoc whose occurrence is 1..1.
    I have an XSD for the same IDoc whose max occurrence is 1..9999999999.
    So I have created one mapping between these two (i.e. the IDoc with max occurs 1 and the IDoc with max occurs 9999999999).
    I created one more mapping for the IDoc XSD and the target file structure.
    For this scenario, how many abstract interfaces and interface mappings do I need to create?
    Can anybody explain the BPM flow in detail in this case?
    Thanks in advance.

    Hi Rambabu Mujja,
    The following websites will solve your problem:
    A Step-by-Step Guide on IDoc-to-File Using Business Service in the XI Integration Directory
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/e07dcaa0-a92b-2a10-3a96-b3d942bd1539
    How to convert an IDoc-XML structure to a flat file and vice-versa in XI 3.0
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/46759682-0401-0010-1791-bd1972bc0b8a
    Introduction to IDoc-XI-File scenario and complete walk through for starters.
    /people/prateek.shah/blog/2005/06/08/introduction-to-idoc-xi-file-scenario-and-complete-walk-through-for-starters
    IDOCs (Multiple Types) Collection in BPM
    /people/pooja.pandey/blog/2005/07/27/idocs-multiple-types-collection-in-bpm
    cheers!
    gyanaraj
    Please reward points if you find this helpful.

  • JDBC to IDOC using BPM

    Hi Experts,
    I am doing a JDBC-to-IDoc scenario.
    Note: I need a solution without stored procedures.
    My requirement on the sender JDBC side is:
    1) We need to select data from 3 tables in the DB (I think we can do this with a JOIN query specified in the channel).
    2) After selecting the data we need to update all 3 tables (I think we can't do this, as we have only one update option in the sender JDBC channel). So I want to go for BPM.
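    For context, the sender JDBC channel takes exactly one SELECT statement and one UPDATE statement, so a sketch of what fits in a single channel might look like this (all table, column and flag names are hypothetical):
    -- One SELECT joining the 3 tables (allowed in the channel):
    SELECT a.order_id, a.order_date, b.item_no, c.qty
    FROM   orders a
    JOIN   order_items b ON b.order_id = a.order_id
    JOIN   order_qty   c ON c.order_id = a.order_id
    WHERE  a.processed = 'N';
    -- But only one UPDATE statement is allowed, so only one of the
    -- three tables can have its processed flag reset here:
    UPDATE orders
    SET    processed = 'Y'
    WHERE  processed = 'N';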
    Can we do it like this?
    2 mappings:
    - Sender JDBC output structure (MT-1) to IDoc structure (MT-2): (MM-1) -> (IM-1)
    - Sender JDBC output structure (MT-1) to JDBC update structure for the 3 tables (MT-3): (MM-2) -> (IM-2)
    BPM:
    Receive step (MT-1) -> Transformation step 1 (IM-1) -> Async send to R/3 -> Transformation step 2 (IM-2) -> Async send to JDBC (receiver)
    MT = Message Type; MI = Message Interface; IM = Interface Mapping; MM = Message Mapping
    Edited by: murali krishna on Mar 9, 2010 8:24 AM

    You do not require a BPM.
    The method that you specified is right.
    Mappings:
    - Sender JDBC output structure (MT-1) to IDoc structure (MT-2): (MM-1) -> (IM-1)
    - Sender JDBC output structure (MT-1) to JDBC update structure for the 3 tables (MT-3): (MM-2) -> (IM-2)
    Just make the receiver service in the 2nd mapping (the one that updates the tables) the same as the sender.
    Also specify both mappings in the interface determination, first the update mapping and then the IDoc one, and tick "Maintain Order at Runtime" so that the IDoc is triggered only after the update mapping is done.

  • Collecting multiple idocs using BPM

    Hi,
    I have a scenario where IDocs sent from the R/3 system are collected in XI and posted as a single message to a file. We have done all the IDoc-related configuration in both XI and R/3.
    When we trigger an IDoc in the R/3 system, we are able to see the message that the IDoc has been sent to the XI system. But we are unable to see any of the message transfers in SXMB_MONI.
    (In simple words, our interfaces are not getting triggered.)
    Could anyone help us in rectifying the problem?
    Thanks & Regards,
    Vishnu.

    When you trigger the IDoc from R/3:
    - Did you complete the configuration for sending the IDoc to XI? Have you created the tRFC port in R/3? Cross-check that the RFC destination you are using points to the right XI box. Your configuration looks fine at least for getting the IDoc into XI; beyond that I am not sure at this point.
    - Are there any errors in SM58 on R/3?
    - Is the IDoc metadata loaded in XI? Do you see any entry in IDX5?
    - Is your Integration Server configured with the trace-level parameter? If yes, set it to the value 3.
    - Are the runtime parameters LOGGING, LOGGING_PROPAGATION and LOGGING_SYNC present in SXMB_ADM?
    If you have these, then you should at least see an entry in SXMB_MONI in either a successful or a failed state.
    More importantly, is the logical system name of the R/3 in the SLD according to the standards SAP specifies?
    Edited by: Nisar Khan on Feb 28, 2008 11:00 AM

  • Using Human tasks in a BPM process

    Hi
    I managed to deploy the VacationRequest sample, which uses a human task in a BPEL process.
    I want to implement the same human task in a BPM process. Are there any tutorials or step-by-step guides for the Vacation Request sample using a BPM process?
    Also, what are the main differences between a BPEL and a BPM process? Most of the things that are done in BPEL can be done in a BPM process as well.
    Thanks

    I was able to implement the same using a BPM process.
    For those of you who may require the steps:
    1. Create a web service based on the VacationRequest.xsd, with VacationRequest as the request and VacationResponse as the callback.
    2. Create a human task and task flow as described in the VacationRequest.pdf provided along with the sample.
    3. Create a BPMN process, associate the start event with the web service interface, and map the request and response. Also associate the end event with the web service callback port.
    4. Deploy the process and task flow.
    But I have a query here: even in the given sample (using the BPEL process), how will the requestor know that the leave has been approved/rejected?
    What exactly is sent out as the response to the callback port? And where can the requestor view the response?
    Regards
