IDoc parallel processing

Hi,
We have some goods movement IDocs (MBGMCR/MBGMCR03) being posted simultaneously (standard ALE IDocs, no customization). We don't want to collect them and process them later because of timing issues. When we allow immediate processing, the material/batch combination gets locked, so one IDoc goes through and the others fail. I know we can collect them and post them with a background job running RBDAPP01 with a variant that has no parallel processing, but is there any way to keep the immediate trigger and still have the IDocs processed one by one, avoiding parallel processing?
Thanks,
Larry

Hi Larry,
Please check your partner profile (WE20) and set 'Trigger immediately' in the 'Processing by function module' area of the inbound parameters.
Regards,
Ferry Lianto
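
If the immediate trigger keeps running into the material/batch lock, the fallback Larry mentions (collect, then post serially with RBDAPP01) can at least be automated as a frequently scheduled background job. A rough sketch, assuming a variant ZSERIAL of RBDAPP01 in which the parallel-posting option is left unchecked (the variant name is made up):

DATA: gv_jobname  TYPE tbtcjob-jobname VALUE 'Z_IDOC_SERIAL_POST',
      gv_jobcount TYPE tbtcjob-jobcount.

* Open a background job, run RBDAPP01 with the serial variant, then release the job
CALL FUNCTION 'JOB_OPEN'
  EXPORTING
    jobname  = gv_jobname
  IMPORTING
    jobcount = gv_jobcount.

SUBMIT rbdapp01 USING SELECTION-SET 'ZSERIAL'
       VIA JOB gv_jobname NUMBER gv_jobcount
       AND RETURN.

CALL FUNCTION 'JOB_CLOSE'
  EXPORTING
    jobname   = gv_jobname
    jobcount  = gv_jobcount
    strtimmed = 'X'.                       "start immediately

Scheduled every few minutes, this keeps posting close to immediate while the single RBDAPP01 run works through the collected IDocs one packet at a time.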

Similar Messages

  • Parallel processing in the creation of IDocs

    Hi gurus,
    I am working on the EDI inbound process with IDocs. I receive several IDocs from legacy systems, and I am supposed to merge a couple of them based on certain conditions and then create new IDocs, which in turn create the sales orders.
    The response time of this utility is very poor, so for performance optimization we are planning to apply parallel processing to the creation of the IDocs.
    We have a function module that creates the sales order. I want to call this function module in parallel, in different LUWs, so that multiple sales orders can be created at the same time.
    Can anyone please help me with the logic to call a function module in parallel?
    thanks in advance
    regards,
    khushi.

    Yes, I have already optimized the merging logic as much as possible; now I have to create the IDocs in parallel. Can you please help me with the code for creating IDocs in parallel?
    thanks in advance.
    regards ,
    khushi
    Edited by: Khushboo Tyagi on Jan 19, 2009 4:24 PM
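
    A minimal sketch of the pattern being asked about, assuming an RFC-enabled function module Z_CREATE_SO_FROM_IDOCS (the name and its interface are invented) and an RFC server group maintained in RZ12, here called 'parallel_generators'. Every STARTING NEW TASK call runs in its own LUW, so each packet posts its sales orders independently:

    DATA: gv_task     TYPE c LENGTH 32,
          gv_index    TYPE n LENGTH 4,
          gv_sent     TYPE i,
          gv_received TYPE i.

    " Register the RFC server group once before dispatching tasks (check sy-subrc in real code)
    CALL FUNCTION 'SPBT_INITIALIZE'
      EXPORTING
        group_name = 'parallel_generators'
      EXCEPTIONS
        OTHERS     = 1.

    DO 5 TIMES.                            "e.g. one task per merged IDoc packet
      gv_index = sy-index.
      CONCATENATE 'TASK' gv_index INTO gv_task.
      " Hypothetical RFC-enabled module that creates the IDocs/sales orders for one packet
      CALL FUNCTION 'Z_CREATE_SO_FROM_IDOCS'
        STARTING NEW TASK gv_task
        DESTINATION IN GROUP 'parallel_generators'
        PERFORMING task_done ON END OF TASK
        EXPORTING
          iv_packet_no = sy-index.
      gv_sent = gv_sent + 1.
    ENDDO.

    WAIT UNTIL gv_received >= gv_sent.     "let all child tasks finish before the program ends

    FORM task_done USING p_task TYPE clike.
      gv_received = gv_received + 1.       "RECEIVE RESULTS could be added here
    ENDFORM.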

  • Parallel processing of the IDOCs through the program RBDAPP01

    Hello All,
    We are processing the inbound IDocs with a scheduled run of program RBDAPP01, but this run is taking a lot of time to process the IDocs.
    Please let me know how we can improve the performance of program RBDAPP01 by making use of
    parallel processing of IDocs. I read some documentation in this forum but did not understand it clearly.
    Please share some information on this if you have already implemented such a requirement.
    Regards
    Mahesh

    Hi,
    Thanks a lot for your quick reply. Will simply specifying the package size initiate parallel processing?
    In our partner profiles, inbound processing is set to 'Trigger by background program' only.
    How do we use the parallel processing section on the selection screen of program RBDAPP01?
    Regards
    Mahesh

  • Parallel Processing (RSEINB00)

    Hi,
    I am trying to achieve parallel processing for the scenario below:
    I am trying to create a child job for processing each file.
    Suppose I have 5 files; in that case I will have one main job and 5 child jobs. I would appreciate hearing from anyone who has come across such a scenario.
    Each file can contain two types of data: one set for transferring to the application server and another set for posting IDocs.
    LOOP AT t_files_to_proc INTO wl_files_to_proc.
    *-- This perform builds two sets of data: Data Set-1 for transferring the file onto the app server,
    *-- Data Set-2 for posting IDocs using RSEINB00
      PERFORM build_table_data.
    *-- Data Set-1: transfer the file onto the app server
      PERFORM transfer_data_to_appserver.
    *-- Data Set-2: post IDocs using RSEINB00
      PERFORM submit_rseinb00.
    ENDLOOP.

    Hi Rao,
    here is a sample; adapt it to your needs:
    [Easily implement parallel processing in online and batch processing |https://wiki.sdn.sap.com/wiki/display/Snippets/Easilyimplementparallelprocessinginonlineandbatchprocessing]
    Regards,
    Clemens
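
    Building on that idea, one child job per file can be opened and released inside the loop. A rough sketch; the RSEINB00 parameter name DOCNAM and the file-name field of t_files_to_proc are assumptions to verify in your system:

    DATA: gv_jobname  TYPE tbtcjob-jobname,
          gv_jobcount TYPE tbtcjob-jobcount,
          gv_index    TYPE n LENGTH 4.

    LOOP AT t_files_to_proc INTO wl_files_to_proc.
      gv_index = sy-tabix.
      CONCATENATE 'Z_POST_IDOC_' gv_index INTO gv_jobname.

      " One child job per file: open it, submit RSEINB00 for this file, release it
      CALL FUNCTION 'JOB_OPEN'
        EXPORTING
          jobname  = gv_jobname
        IMPORTING
          jobcount = gv_jobcount.

      SUBMIT rseinb00
             WITH docnam = wl_files_to_proc-filename
             VIA JOB gv_jobname NUMBER gv_jobcount
             AND RETURN.

      CALL FUNCTION 'JOB_CLOSE'
        EXPORTING
          jobname   = gv_jobname
          jobcount  = gv_jobcount
          strtimmed = 'X'.                 "release the child job to start immediately
    ENDLOOP.

    The main job then only loops and dispatches, and each child job posts its own file's IDocs independently of the others.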

  • Parallel processing concern.

    Hello SAPients!
    I modified some user exits. I am changing the contents of the IDoc WPUBON01: in "Before Inbound Processing" I export a value to memory, I use that value in one of the user exits to populate VBRK/VBRP, and I clear (FREE) that value in "After Inbound Processing".
    It works fine so far, but I haven't tested with many IDocs, just with one (I don't have enough data to generate a complete test). My concern is that the standard transaction that processes the IDocs is enabled for parallel processing. Does SAP create separate memory areas for every process that is triggered? What will happen if one of the processes finishes and clears (FREE) the memory? Will this create inconsistencies?
    Thanks in advance for your kind help.

    Well, I almost answered a while ago, but we don't run IDocs so I can't test this. However, I have always found "EXPORT ... TO MEMORY ID 'Z:MY_KEY'"-style logic very reliable for the type of requirement you describe: several online sessions of the same SAPGui user each get their own memory, multiple batch jobs can run at the same time without collisions, and so on. "SET PARAMETER ID 'Z12'"-style code, in contrast, is not reliable.
    Jonathan
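
    For reference, a minimal sketch of the ABAP-memory approach Jonathan describes; the memory ID and the exported field are invented for illustration. Each external session and each parallel RFC task has its own ABAP memory, so the exports of concurrent inbound processes do not collide:

    DATA gv_value TYPE i.

    " Before inbound processing: put the value into this session's ABAP memory
    gv_value = 42.
    EXPORT value = gv_value TO MEMORY ID 'Z:MY_KEY'.

    " In the user exit: read it back (sy-subrc = 4 if nothing was exported in this session)
    IMPORT value = gv_value FROM MEMORY ID 'Z:MY_KEY'.

    " After inbound processing: release it again
    FREE MEMORY ID 'Z:MY_KEY'.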

  • Parallel processing for program RBDAPP01

    Hi All,
    I am running program RBDAPP01 every 30 minutes to clear the IDocs in error (status 51, application document not posted) that carry the message “Object requested is currently locked by user ADMINJOBS”. Each run only clears a few IDocs because of that lock: while one IDoc is being posted, a second IDoc for the same order, customer, material and plant (but a different ship-to party) tries to update at the same time, finds the object locked, and cannot be posted.
    Can anyone tell me what parallel processing is and whether it will help in my case?
    Thanks

    You didn't specify which release you use, so I can only give some general suggestions:
    Note 547253 - ALE: Wait for end of parallel processing with RBDAPP01
    Note 715851 - IDoc: RBDAPP01 with parallel processing
    Markus

  • Parallel processing question

    I have written a module to read data from an external source, convert the input to IDocs, and process these IDocs.
    Because of the performance of IDoc processing, I decided to use parallel processing to spread the IDoc creation load among several processes using server groups.
    My main program sets up the parallel processing environment with a call to function module SPBT_INITIALIZE, then makes repeated calls to a function module that processes the IDocs, using the STARTING NEW TASK and DESTINATION IN GROUP xxx additions.
    What I'm seeing is the main process running as a background task and one dialog process running the parallel function module, even though there are still 8 or 9 other dialog processes available.
    What I expected to see was several dialog processes running my child function modules, not just one.
    Does anyone know why the other dialog processes are not being used?
    Thanks for any input,
    Dorian.

    Thomas:
    I'm logging any errors that occur; I'm not seeing any resource failures - in fact no errors at all, other than expected application data errors. It seems that the RFC calls are all being made in a single child process that queues up the parallel jobs and uses just one dialog process to run them all. I expected to see as many dialog tasks being used as were available.
    As far as the RFC parameters go - are you referring to the RZ10 values? I looked at all of the parameters containing "rfc" as part of their name, and nothing looked as though it was restricting the parallel task behaviour. Do you have any advice as to suitable settings?
    I'm also wondering whether what I am seeing is just the way SAP is supposed to work. Although I expected to see lots of child processes running in multiple dialog work processes when they are available, maybe by design only one remote process per server is allowed? I checked the documentation I could find on the "starting new task" keyword, and nowhere does it say that multiple processes will be started on each server in the server group; only that a child process will NOT be started if the number of unused processes falls below a defined threshold.
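
    For what it's worth, the dispatch pattern in the SAP documentation retries on RESOURCE_FAILURE and only proceeds while the group still has free dialog work processes; if the RZ12 quotas for the group leave only one process available, the calls effectively serialize, which would look exactly like what you describe. A sketch under those assumptions (Z_PROCESS_IDOC_PACKET, its parameter and the group name are made up):

    DATA: gv_task   TYPE c LENGTH 32,
          gv_index  TYPE n LENGTH 4,
          gv_packet TYPE i,
          gv_sent   TYPE i,
          gv_done   TYPE i.

    DO 10 TIMES.                           "10 packets as an example
      gv_packet = sy-index.
      gv_index  = gv_packet.
      CONCATENATE 'PKT' gv_index INTO gv_task.
      DO.
        " Dispatch one packet into the RFC group
        CALL FUNCTION 'Z_PROCESS_IDOC_PACKET'
          STARTING NEW TASK gv_task
          DESTINATION IN GROUP 'parallel_generators'
          PERFORMING packet_done ON END OF TASK
          EXPORTING
            iv_packet_no          = gv_packet
          EXCEPTIONS
            system_failure        = 1
            communication_failure = 2
            resource_failure      = 3.
        CASE sy-subrc.
          WHEN 0.
            gv_sent = gv_sent + 1.
            EXIT.                          "dispatched, move on to the next packet
          WHEN 3.
            " No free work process in the group right now: wait for callbacks, then retry
            WAIT UNTIL gv_done >= gv_sent UP TO 5 SECONDS.
          WHEN OTHERS.
            EXIT.                          "hard RFC failure, skip this packet
        ENDCASE.
      ENDDO.
    ENDDO.

    WAIT UNTIL gv_done >= gv_sent.         "collect the remaining callbacks

    FORM packet_done USING p_task TYPE clike.
      gv_done = gv_done + 1.
    ENDFORM.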

  • Parallel processing of condition records in SAP

    Hi,
    I have a particular scenario wherein XI sends 30,000 pricing condition IDocs of message type COND_A to SAP, and SAP has to process all of them within 15 minutes. Is that possible, and what kind of parallel processing techniques can be used to achieve it?
    Regards,
    Vijay
    Edited by: Vijay Iyengar on Feb 21, 2008 2:05 PM

    Hi
    We had a similar performance issue when loading conditions for sales deals.
    We did not use IDocs.
    Initially we used BDC, which loaded 19 records per second; later we developed a direct input program, which loaded close to 900 records per second.
    What we did was write a direct input program and call the function module
    CALL FUNCTION 'RV_KONDITION_SICHERN_V13A' IN UPDATE TASK
    But please note: we took approval from SAP before using it.
    Regards
    Madhan
    Edited by: Madhan Doraikannan on Oct 20, 2008 11:40 AM
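
    For illustration, the general shape of such a direct-input style loader: build packets of records and hand each packet to an update-task function module, committing once per packet. The module Z_SAVE_COND_PACKET and its interface are placeholders (it would have to be flagged as an update module); Madhan's actual call used RV_KONDITION_SICHERN_V13A, whose interface is not shown here.

    DATA: lt_conditions TYPE STANDARD TABLE OF konh,    "condition headers, just as example data
          lt_packet     TYPE STANDARD TABLE OF konh,
          ls_cond       TYPE konh,
          lv_lines      TYPE i.

    LOOP AT lt_conditions INTO ls_cond.
      APPEND ls_cond TO lt_packet.
      DESCRIBE TABLE lt_packet LINES lv_lines.
      IF lv_lines = 1000.
        " One packet = one LUW: the update module runs in the update task at COMMIT WORK
        CALL FUNCTION 'Z_SAVE_COND_PACKET' IN UPDATE TASK
          TABLES
            it_konh = lt_packet.
        COMMIT WORK AND WAIT.
        CLEAR lt_packet.
      ENDIF.
    ENDLOOP.

    IF lt_packet IS NOT INITIAL.                        "flush the last partial packet
      CALL FUNCTION 'Z_SAVE_COND_PACKET' IN UPDATE TASK
        TABLES
          it_konh = lt_packet.
      COMMIT WORK AND WAIT.
    ENDIF.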

  • Parallel Processing Bottleneck in POS Inbound WPUUMS01

    Will someone explain the objective of BD51 if most IDocs are processed by SAP one by one using INPUTTYP = 1?
    Parallel processing of WPUUMS01 IDocs seems to be a case of a movie hall with many ticket counters but only one usher! IDOC_INPUT_POS_SALES_ACCOUNT seems to me to be the bottleneck.
    Am I right?
    I read the two links on BD51 carefully:
    http://help.sap.com/saphelp_nw04/helpdata/en/0b/2a6688507d11d18ee90000e8366fc2/content.htm
    http://help.sap.com/saphelp_erp2005vp/helpdata/en/0b/2a6688507d11d18ee90000e8366fc2/content.htm
    In the SDN IDES system the 269 entries in TBD51 break down as follows:
    64    0 - Mass processing
    170   1 - Individual input
    35    2 - Individual input with IDoc lock in CALL TRANSACTION
    Observe that the vast majority are "1 - Individual input".
    After reading Notes 199840 & 118924 mentioned below, I implemented small IDocs and RBDAPP01 parallel processing, but the benefit was marginal and disappointing at ~12%.
    I checked that IDOC_INPUT_POS_SALES_ACCOUNT has INPUTTYP = 1.
    I suspect that 0 (mass processing) would have given better performance.
    INPUTTYP "1" and "2" are for function modules that process one IDoc per call, i.e. FMs that use the call-transaction method.
    INPUTTYP "0" is for function modules that process IDocs in packets (the direct-input method), which is good for performance.
    IDOC_INPUT_POS_SALES_ACCOUNT is part of function group WPUE (POS interface, upload processing), which belongs to package WPOS. Its code contains neither the string "call transaction" nor "bdc".
    I read carefully
    Note 199840 - POS IDoc inbound dispatcher with locking control, which is SAP's best-practice must-read.
    This note says "To achieve a good load distribution in doing so, you should split the large IDocs, which are expensive with regards to processing, according to a defined record length. This mainly affects message types WPUUMS (aggregated sales) .... we recommend splitting a large IDoc into several small IDocs anyway..."
    Note 118924 - Parallel processing of IDocs in the POS inbound
    "Activate the check box 'parallel posting' in the initial screen of RBDAPP01 and enter the server group defined before.
    The system automatically distributes the load to the servers of the server group, and for each Idoc packet it starts a dialog process. For each server all dialog processes except for two can be occupied, therefore always make sure there are enough online processes available."
    Looking for comments from IDoc gurus...
    I spent a huge amount of effort with little to show.
    Regards
    -jnc
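
    As a small aside, the INPUTTYP classification discussed above can be checked for any inbound function module by reading TBD51 directly (the same table BD51 maintains); the field names are quoted from memory, so treat this as a sketch:

    DATA lv_inputtyp TYPE tbd51-inputtyp.

    " 0 = mass processing (packets), 1 = individual input, 2 = individual input via CALL TRANSACTION
    SELECT SINGLE inputtyp FROM tbd51 INTO lv_inputtyp
      WHERE funcname = 'IDOC_INPUT_POS_SALES_ACCOUNT'.
    IF sy-subrc = 0.
      WRITE: / 'INPUTTYP =', lv_inputtyp.
    ENDIF.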

  • IDOC - sequential processing

    Hello,
    I hope someone can help me with this problem concerning the processing of IDocs.
    We have a warehouse management system connected as a subsystem to ERP 6.0.
    First we get a WHSCON for a return shipment (stock type R). Directly after the WHSCON, the subsystem sends a WMMBID02 to transfer the stock from R to free stock.
    The WHSCON locks the delivery, so the WMMBID02 coming in directly after it cannot be processed.
    So how can I force the system to process the second IDoc in sequence, only once the delivery is no longer locked?
    Thanks
    Christian

    Hello Christian,
    Without much effort, you can try to solve your problem with the following approach:
    Change the IDoc processing (WE20) for the second (or both) IDoc types from "Trigger immediately" to "Trigger by background program".
    Then schedule a batch job for program RBDAPP01 (without parallel processing).
    Best Regards, Dirk

  • Parallel Processing: Unable to capture return results using RECEIVE

    Hi,
    I am using parallel processing in one of my programs and it is working fine, but I am not able to collect the return results using the RECEIVE statement.
    I am using
      CALL FUNCTION <FUNCTION MODULE NAME>
             STARTING NEW TASK TASKNAME DESTINATION IN GROUP DEFAULT_GROUP
             PERFORMING RETURN_INFO ON END OF TASK
    and then in subroutine RETURN_INFO I am using the RECEIVE statement.
    My RFC function module calls another BAPI and performs an explicit commit as well.
    Any pointer will be of great help.
    Regards,
    Deepak Bhalla
    Message was edited by: Deepak Bhalla
    I used a WAIT statement after the RFC call and it worked. Additionally, I used the MESSAGE addition on the exceptions of the RECEIVE statement, because RECEIVE was returning sy-subrc 2.
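
    For anyone hitting the same thing, a minimal sketch of the callback pattern: RECEIVE RESULTS belongs in the ON END OF TASK form, and the caller has to reach a WAIT (or the end of the program) before that form can run. The function module and its parameters are invented for illustration:

    DATA: gv_open   TYPE i,
          gv_result TYPE i,
          gv_msg    TYPE c LENGTH 255.

    CALL FUNCTION 'Z_DO_SOMETHING_RFC'
      STARTING NEW TASK 'TASK1'
      DESTINATION IN GROUP DEFAULT
      PERFORMING return_info ON END OF TASK
      EXPORTING
        iv_input = 42.
    gv_open = gv_open + 1.

    " Without this WAIT the program may end before the callback ever runs
    WAIT UNTIL gv_open = 0.

    FORM return_info USING p_task TYPE clike.
      " RECEIVE must name the same function module that was started asynchronously
      RECEIVE RESULTS FROM FUNCTION 'Z_DO_SOMETHING_RFC'
        IMPORTING
          ev_result             = gv_result
        EXCEPTIONS
          system_failure        = 1 MESSAGE gv_msg
          communication_failure = 2 MESSAGE gv_msg.
      gv_open = gv_open - 1.
    ENDFORM.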

  • Parallel processing of mass data : sy-subrc value is not changed

    Hi,
    I have implemented parallel processing of mass data using "STARTING NEW TASK". In my function module I handle the exceptions and finally raise an application-specific classical exception that should be handled in my main report program. Somehow sy-subrc is not getting changed and always returns 0, even when the exception is raised.
    Can anyone help me about the same.
    Thanks & Regards,
    Nitin

    Hi Silky,
    I've built a block of code to explain this.
      DATA: ls_edgar TYPE zedgar,
            l_task(40).
      DELETE FROM zedgar.
      COMMIT WORK.
      l_task = 'task1'.
      ls_edgar-matnr = '123'.
      ls_edgar-text = 'qwe'.
      CALL FUNCTION 'Z_EDGAR_COMMIT_ROLLBACK' STARTING NEW TASK l_task PERFORMING f_go ON END OF TASK
        EXPORTING
          line = ls_edgar.
      l_task = 'task2'.
      ls_edgar-matnr = 'abc'.
      ls_edgar-text = 'def'.
      CALL FUNCTION 'Z_EDGAR_COMMIT_ROLLBACK' STARTING NEW TASK l_task PERFORMING f_go ON END OF TASK
        EXPORTING
          line = ls_edgar.
      l_task = 'task3'.
      ls_edgar-matnr = '456'.
      ls_edgar-text = 'xyz'.
      CALL FUNCTION 'Z_EDGAR_COMMIT_ROLLBACK' STARTING NEW TASK l_task PERFORMING f_go ON END OF TASK
        EXPORTING
          line = ls_edgar.
    *&      Form  f_go
    FORM f_go USING p_task TYPE clike.   "p_task receives the task name
      RECEIVE RESULTS FROM FUNCTION 'Z_EDGAR_COMMIT_ROLLBACK' EXCEPTIONS err = 2.
      IF sy-subrc = 2.
    *this won't affect the LUW of the received function
        ROLLBACK WORK.
      ELSE.
    *this won't affect the LUW of the received function
        COMMIT WORK.
      ENDIF.
    ENDFORM.                    "f_go
    and the function is:
    FUNCTION z_edgar_commit_rollback.
    *"*"Interface local:
    *"  IMPORTING
    *"     VALUE(LINE) TYPE  ZEDGAR
    *"  EXCEPTIONS
    *"      ERR
      MODIFY zedgar FROM line.
      IF line-matnr CP 'a*'.
    *comment raise or rollback/commit to test
    *    RAISE err.
        ROLLBACK WORK.
      ELSE.
        COMMIT WORK.
      ENDIF.
    ENDFUNCTION.
    OK.
    In your main program you have a logical unit of work (LUW), which consists of an application transaction and is associated with a database transaction. Once you start a new task, you are creating an independent LUW with its own database transaction.
    So if you do a commit or rollback in your function, the effect is only on the records you are processing in that function.
    There is a way to capture, in the main LUW, the moment this child LUW concludes: the PERFORMING ... ON END OF TASK addition. In there you can get the result of the function, but you cannot commit or roll back the function's LUW, since that has already happened implicitly when the function concluded. You can test this by commenting the code I've supplied accordingly.
    So if you want to roll back the LUW of the function, you had better do it inside the function.
    I don't think this matches your question exactly, but maybe it leads you onto the right track. Give me more details if it doesn't.
    Hope it helps,
    Edgar

  • Parallel Processing and Capacity Utilization

    Dear Guru's,
    We have following requirement.
    Workcenter A Capacity is 1000.   (Operations are similar)
    Workcenter B Capacity is 1500.   (Operations are similar)
    Workcenter C Capacity is 2000.   (Operations are similar)
    1) For product A the production order quantity is 4500. Can we use all three work centers in parallel through the routing?
    2) For product B the production order quantity is 2500. Can we use only work centers A and B in parallel through the routing?
    If yes, please explain how.
    Regards,
    Rashid Masood

    Maybe you can create a virtual work center VWCA = A+B+C (connected via a hierarchy with transaction CR22) and another VWCB = A+B, and route your products to the respective virtual work center.

  • Parallel processing open items (FPO4P)

    Hello,
    I have a question about transaction FPO4P (parallel processing of open items).
    When saving the parameters, the following message always appears: "Report cannot be evaluated in parallel". The detailed information says that when you use a specific parallel processing object, you also need to sort on that field.
    In my case I use the object GPART for parallel processing (see the Technical Settings tab). On the Output Control tab I selected a line layout that is sorted by business partner (GPART). Furthermore, no selection options are used.
    Does anyone know why the transaction cannot save the parameters and shows the error message above? I really don't know what is going wrong.
    Thank you in advance.
    Regards, Ramon.

    Ramon
    Apply Note 1115456.
    Maybe that note can help you.
    Regards
    Arcturus

  • How to do parallel processing with dynamic internal table

    Hi All,
    I need to implement parallel processing that involves dynamically created internal tables. I tried doing so with RFC function modules (using STARTING NEW TASK and similar techniques) but did not succeed: that approach requires RFC-enabled function modules, and RFC-enabled function modules do not allow generic data types (such as STANDARD TABLE), which are needed for passing dynamic internal tables. My exact requirement is as follows:
    1. I have a large chunk of data in two internal tables; one of them is built dynamically, so its structure is not known at the time of coding.
    2. This data has to be processed together to generate another internal table whose structure is predefined. This processing is taking a very long time, as the number of records is close to a million.
    3. I need to divide the dynamic internal table into packets of (say) 1000 records each, pass each packet to a function module, and submit it to run in another task. Many such tasks would be executed in parallel.
    4. The function module running in parallel can insert the processed data into a database table, and the main program can access it from there.
    Unfortunately, because generic data types are not allowed in RFC interfaces, I am unable to do this. Does anyone have an idea how to implement parallel processing with dynamic internal tables under these conditions?
    Any help will be highly appreciated.
    Thanks and regards,
    Ashin

    Try the code below; it reads an infotype into a dynamically typed internal table:
      DATA: w_subrc TYPE sy-subrc.
      DATA: w_infty(5) TYPE c.
      FIELD-SYMBOLS: <f1> TYPE table.
      FIELD-SYMBOLS: <f1_wa> TYPE any.
      DATA: ref_tab TYPE REF TO data.
      DATA: ref_wa  TYPE REF TO data.
    * Build the name of the infotype structure, e.g. 'P0002' for infotype 0002
      CONCATENATE 'P' infty INTO w_infty.
    * Create dynamic internal table
      CREATE DATA ref_tab TYPE STANDARD TABLE OF (w_infty).
      ASSIGN ref_tab->* TO <f1>.
    * Create dynamic work area
      CREATE DATA ref_wa TYPE (w_infty).
      ASSIGN ref_wa->* TO <f1_wa>.
    * Default the selection period if the caller did not supply one
      IF begda IS INITIAL.
        begda = '18000101'.
      ENDIF.
      IF endda IS INITIAL.
        endda = '99991231'.
      ENDIF.
      CALL FUNCTION 'HR_READ_INFOTYPE'
        EXPORTING
          pernr           = pernr
          infty           = infty
          begda           = begda
          endda           = endda
        IMPORTING
          subrc           = w_subrc
        TABLES
          infty_tab       = <f1>
        EXCEPTIONS
          infty_not_found = 1
          OTHERS          = 2.
    * Pass the result of the read back to the caller
      IF sy-subrc <> 0.
        subrc = sy-subrc.
      ELSE.
        subrc = w_subrc.
      ENDIF.
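
    The snippet above shows dynamic typing within one program, but it does not get a dynamic table across an RFC interface, which is the original question. One workaround, sketched under the assumption that both sides know the DDIC name of the line type at runtime, is to serialize the packet to asXML and pass it as a STRING together with the type name; the RFC module ZRFC_PROCESS_PACKET and its parameters are hypothetical:

    " Caller: serialize the generically typed packet and hand it to a parallel task
    DATA: lv_xml      TYPE string,
          lv_typename TYPE string.

    lv_typename = 'ZMY_LINE_TYPE'.          "DDIC name of the dynamic line type
    CALL TRANSFORMATION id SOURCE packet = <f1> RESULT XML lv_xml.

    CALL FUNCTION 'ZRFC_PROCESS_PACKET'
      STARTING NEW TASK 'PKT1'
      DESTINATION IN GROUP DEFAULT
      EXPORTING
        iv_typename = lv_typename
        iv_xml      = lv_xml.

    " Callee (inside ZRFC_PROCESS_PACKET): rebuild the table type and deserialize
    DATA lr_tab TYPE REF TO data.
    FIELD-SYMBOLS <lt_packet> TYPE STANDARD TABLE.

    CREATE DATA lr_tab TYPE STANDARD TABLE OF (iv_typename).
    ASSIGN lr_tab->* TO <lt_packet>.
    CALL TRANSFORMATION id SOURCE XML iv_xml RESULT packet = <lt_packet>.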
