50 Parallel Processing through Process Chain

Hello Experts,
Here is what I am trying to do.
I want to create a process chain that runs 50 (for example) ABAP programs in parallel. I can add them one by one by choosing the 'ABAP Program' process type and dragging each one into the chain, but that is very time-consuming and tedious.
Does anybody have a better approach? Is there a table where I can create all these parallel steps and put them into a chain, or can I copy a step and place the copies in parallel in the chain?
A quick reply will be appreciated.
Points are guaranteed for the right resolution.
Thanks in advance.

Hi "Believe in Jainism",
Do you need to execute other process chain steps after the 50 ABAP programs? If not, why not just do the following:
1) Create an ABAP program.
2) Inside that ABAP program, use the SUBMIT statement to run each of the 50 programs as a separate background job; a rough sketch follows below.
If you do need to execute steps after the 50 ABAP programs, you can try to create a custom process type that starts the 50 programs as background jobs, monitors their statuses, and terminates only once all of them have finished successfully or one of them has failed.
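Here is a minimal, untested sketch of the idea. The program names ZPROG_01, ZPROG_02, ... are placeholders, and the TBTCO status values used in the polling loop ('F' finished, 'A' cancelled) should be double-checked on your release:

REPORT zrun_50_parallel.

DATA: lt_progs    LIKE STANDARD TABLE OF sy-repid,
      lv_prog     LIKE sy-repid,
      lv_jobname  TYPE btcjob,
      lv_jobcount TYPE btcjobcnt,
      lt_jobs     TYPE STANDARD TABLE OF tbtcjob,
      ls_job      TYPE tbtcjob,
      lv_open     TYPE i.

START-OF-SELECTION.

  APPEND 'ZPROG_01' TO lt_progs.
  APPEND 'ZPROG_02' TO lt_progs.
* ... append the remaining programs, or read them from a custom table

* Schedule each report as its own background job so they run in parallel
  LOOP AT lt_progs INTO lv_prog.
    lv_jobname = lv_prog.

    CALL FUNCTION 'JOB_OPEN'
      EXPORTING
        jobname  = lv_jobname
      IMPORTING
        jobcount = lv_jobcount.

    SUBMIT (lv_prog) VIA JOB lv_jobname NUMBER lv_jobcount AND RETURN.

    CALL FUNCTION 'JOB_CLOSE'
      EXPORTING
        jobname   = lv_jobname
        jobcount  = lv_jobcount
        strtimmed = 'X'.         "release immediately

    CLEAR ls_job.
    ls_job-jobname  = lv_jobname.
    ls_job-jobcount = lv_jobcount.
    APPEND ls_job TO lt_jobs.
  ENDLOOP.

* Poll the job status table until every job is finished or cancelled
  DO.
    lv_open = 0.
    LOOP AT lt_jobs INTO ls_job.
      SELECT SINGLE status FROM tbtco INTO ls_job-status
        WHERE jobname  = ls_job-jobname
          AND jobcount = ls_job-jobcount.
      IF ls_job-status <> 'F' AND ls_job-status <> 'A'.
        lv_open = lv_open + 1.
      ENDIF.
    ENDLOOP.
    IF lv_open = 0.
      EXIT.
    ENDIF.
    WAIT UP TO 30 SECONDS.
  ENDDO.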
Hope this helps.
P.S. If you don't mind, may I ask why you need to execute 50 ABAP programs in parallel?

Similar Messages

  • How to determine N Step Approval process through Process Controlled workflow

    Hi,
    For the SC approval process, after the 4th approval level the workflow should keep adding approval steps up to the Nth level until an approver is determined; if no approver is found, it should go to the workflow exception handling.
    In PCW we have process levels such as 100, 200, 300 and so on. What would be the procedure for determining the Nth process level in the SRM process-controlled workflow settings?
    Regards
    Kharabela

    Hi,
    From SRM 7.0 PCW onwards, the requester of the SC or the creator (buyer) of the PO cannot be in the approval list; this is standard behaviour.
    Create two custom events, for example zev_po_schema_eva and zev_po_sl_approval.
    Create two custom expressions: one with a constant, zev_po_schema_def, and the other using the formula interpreter (0FB001).
    Assign zev_po_schema_eva as the schema evaluation for the PO. Make sure you link zev_po_sl_approval with zev_po_schema_def.
    Now create a schema definition entry and specify zev_po_schema_def.
    Now create the process level:
    100 Approval / Approval with Completion (your choice). Maintain zev_po_sl_approval as the evaluation ID in the process level and RR_SPENDING_LIMIT_APPROVER as the responsibility resolver.
    Maintain the formula interpreter value as (OV_SC_PREVAPPROVALIMI < 0V_PO_POTOTALVALUE) AND (OV_SC_SPNDNGLMTFRQST < 0V_PO_POTOTALVALUE).
    Saravanan

  • 'QM value transferring through process messages'

    Hi,
    What exactly does 'QM value transferring through process messages' mean?
    Regards,
    Samir

    Hi Wadajkar,
    When you use PI sheets for the process management functionality, some QM-related values can be transferred through process messages. For example, when a process order is confirmed, the inspection lot can be sent through process messages. Among the QM-related values that can be processed through process messages are:
    inspection characteristic
    inspection lot
    inspection result
    inspection short text
    Thanks,
    Rajanikanth

  • Loading through Process Chains 2 Delta Loads and 1 Full Load (ODS to Cube).

    Dear All,
    I am loading through a process chain with 2 delta loads and 1 full load from an ODS to a cube in BW 3.5. We are still in development.
    My loading process is:
    Start - 2 delta loads - 1 full load - ODS activation - delete index - further update - delete overlapping requests from InfoCube - create index.
    My question is:
    When I load for the first time I get some data; for the next load I should get zero records, since there is no new data, but I am getting the same number of records again. I guess it is picking up the data from the full upload. Please guide me.
    Krishna.

    Hi,
    The reason you are getting the same number of records is, as you said, the full load: after running the deltas you get all the changed records, but the full load step that follows picks up the whole data set all over again.
    In more detail, you are getting the same number of records because:
    1> You are running the chain for the first time.
    2> You ran the delta InfoPackages for the first time. If you initialized the deltas with "Initialization without data transfer", the first delta run picks up the whole data set, and a full load after that picks up the same number of records again.
    If the two deltas run one after another, the second only brings data if there were changes in between. Since you are loading from a single ODS to a cube, both the delta and the full load will pick up the same data "for the first time" during data marting, because they have the same DataSource (the ODS).
    Hopefully this serves your purpose.
    Thanks & Regards
    Vaibhave Sharma

  • Parallel process in process chains

    Hi All,
    I have created a process chain with two parallel jobs (for a forecast run) in APO DP. Whenever I check it, it shows the message "Too many parallel processes for chosen server".
    In the diagnosis, the message is: "On the server you have chosen, there are only 1 batch processes available. The process chain has been designed in such a way that 2 processes must be processed in parallel."
    Do I need to maintain any settings? Please let me know.
    Thank you.
    Regards,
    Raj

    Hi Raj,
    What it is complaining about is that only 1 background (batch) work process is available on the chosen server, while the chain is designed to run 2 processes in parallel; with a single background process the system simply cannot run 2 of them at once.
    You can check the number of background work processes in SM50 and have it increased in the instance (start-up) profile via RZ10 (typically the parameter rdisp/wp_no_btc). Ask your Basis team to check and increase it.
    Thanks
    Abhi

  • Incorrect data after activating the request through Process chain.

    Dear SDN chaps.
    This morning I encountered a strange issue with a DSO.
    I have a DSO which is updated from a flat file on the application server (AL11).
    Loading to the PSA shows no issues, loading to the DSO shows no issues, the data passes through the routine, and the records in the new data table are populated correctly. But after successful activation of the request through the process chain, I get wrong records in the active data table.
    I then deleted the request and reran it manually, i.e. triggered the DTP and ran the activation by hand, and surprisingly the records are accurate through the manual process.
    I am just wondering why it does not work through the process chain, why it shows incorrect records when executed via the process chain, and why it shows accurate records with the manual upload process.
    Could someone please help me get to the bottom of this? By the way, we are on SAP BI 7 SP20 and SP05 for BW 7.01.
    Thanks
    K M R
      

    Hi Pra
    Thanks for your response..
    We are deleting the PSA and then uploading the data to the PSA as well as to the DSO.
    The issue is not in the loading but in the activation of the request: if I execute the activation through the process chain it is successful but the values are incorrect, while a manual activation succeeds with correct data.
    I even tried with a new chain, but I still face the same issue.
    The surprising thing is that the data in the new data table is perfect in both cases (manual update and process chain update); only during activation do I get incorrect records in the active data table.
    Appreciate your help on this....
    Thanks
    K M R

  • How to call program through process chain

    Hi Gurus,
    I need to execute an ABAP program through a process chain, so I used the 'ABAP Program' process type (this is the first time I am using this process type). When I execute the process chain, the ABAP program does not run. Eagerly anticipating your reply.
    Regards
    Shiva

    Hi
    I managed to execute RSCRM jobs in a process chain as follows:
    1. Execute the query through transaction RSCRM_BAPI.
    2. Go to SM37 and copy the job name (the active one).
    3. Create the following program:
    *& Report /WST/RSCRM_START *
    REPORT /wst/rscrm_start.
    PARAMETERS: l_bid TYPE sysuuid_c.
    CALL METHOD cl_rscrmbw_bapi=>exec_rep_in_batch
      EXPORTING
        i_barepid = l_bid.
    4. Execute the program and fill the parameter with the job name.
    5. Save this as a program variant and use it in the process chain as a normal ABAP program.
    I hope that helps.
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/3507aa90-0201-0010-6891-d7df8c4722f7
    regards
    ashwin

  • Problem in Activating Hierarchy Attribute change run through process chain.

    Hi All,
    I have a problem with the hierarchy attribute change run executed through a process chain: the process chain completes successfully without any error,
    and the hierarchy change run itself also completes successfully, yet the hierarchy objects are not activated,
    so every day I have to activate the hierarchy objects manually.
    Also, I included 10 hierarchy objects in one attribute change run process. Is that correct, or do I have to create 10 separate attribute change run processes in the process chain? Please throw some light on this issue.
    With regards,
    Hari

    Hi venkat,
    Yes, I have already included the 'Save Hierarchies' process in the process chain.
    My main problem is that the hierarchy InfoObjects are not activated when run via the process chain, and the chain does not even give an error message. Also, in which log table is the hierarchy change run information maintained (run time, date, status etc.)?
    Are there any other settings to be made in the process chain?
    with regards,
    hari

  • Parallel Processing through ABAP program

    Hi,
    We are trying to do parallel processing in ABAP. As per the SAP documentation we are using CALL FUNCTION ... STARTING NEW TASK ... DESTINATION.
    We have one Z function module which, as required, we have made remote-enabled.
    In this FM we would like to process data that we receive in an internal table and send the processed data back (again through an internal table) to the main program that issues the CALL FUNCTION ... STARTING NEW TASK ... DESTINATION.
    Please suggest how to achieve this.
    We tried the EXPORT/IMPORT option, i.e. EXPORT of the internal table in the FM with a memory ID and IMPORT of the internal table in the main program with the same memory ID, but this is not working even though the memory ID and the name of the internal table are the same.
    Also, the SAP documentation says that we can use RECEIVE RESULTS FROM FUNCTION 'RFC_SYSTEM_INFO' IMPORTING RFCSI_EXPORT = INFO in conjunction with CALL FUNCTION ... STARTING NEW TASK ... DESTINATION, and that "RECEIVE is needed to gather IMPORTING and TABLE returns of an asynchronously executed RFC function module". But while creating the remote-enabled FM we can't seem to have the EXPORT or IMPORT parameters we need.
    Please help !
    Thanks in advance
    Santosh

    Regarding "We tried the EXPORT/IMPORT option ... but this option is not working": I think this is not working because ABAP memory (EXPORT/IMPORT ... MEMORY ID) is not shared across sessions/tasks. EXPORT TO SHARED BUFFER and IMPORT FROM SHARED BUFFER should work; I have used these in the past and they work pretty well (a rough sketch follows below).
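    A rough, untested sketch of the shared-buffer idea (the key 'ZPAR_RESULT' and the row type are made up; in the real scenario the EXPORT would sit inside the RFC-enabled function module and the IMPORT in the caller, both sides are shown in one report here only to illustrate the syntax):

    REPORT zshared_buffer_sketch.

    TYPES: BEGIN OF ty_row,
             matnr(18) TYPE c,
             menge     TYPE p DECIMALS 3,
           END OF ty_row.

    DATA: lt_out TYPE STANDARD TABLE OF ty_row,
          lt_in  TYPE STANDARD TABLE OF ty_row,
          ls_row TYPE ty_row.

    ls_row-matnr = 'MAT-001'.
    ls_row-menge = '10.000'.
    APPEND ls_row TO lt_out.

    * Writing side - this would run inside the RFC function module
    EXPORT result = lt_out TO SHARED BUFFER indx(zz) ID 'ZPAR_RESULT'.

    * Reading side - this would run in the calling program
    IMPORT result = lt_in FROM SHARED BUFFER indx(zz) ID 'ZPAR_RESULT'.

    DESCRIBE TABLE lt_in.
    WRITE: / 'Rows read back:', sy-tfill.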
    Also, here is a quick sample of the 'new task' and 'receive' functionality. You cannot specify the IMPORTING parameters when calling the FM; you specify them at the receiving end.
    report zrich_0001.
    data: session(1) type c.
    data: ccdetail type bapi0002_2.

    start-of-selection.

    * Call the function in another session...control stops in the calling
    * program at the WAIT below until the other session has responded
      call function 'BAPI_COMPANYCODE_GETDETAIL'
        starting new task 'TEST' destination 'NONE'
        performing set_session_done on end of task
        exporting
          companycodeid = '0010'.
    * The IMPORTING parameters (COMPANYCODE_DETAIL, COMPANYCODE_ADDRESS,
    * RETURN) are not specified here; they are picked up by RECEIVE below.

    * wait here till the other session is done
      wait until session = 'X'.
      write: / ccdetail.

    *       FORM set_session_done
    form set_session_done using taskname.
    * Receive the results from the function.......
    * this will also close the session
      receive results from function 'BAPI_COMPANYCODE_GETDETAIL'
        importing
          companycode_detail = ccdetail.
    * Set session as done.
      session = 'X'.
    endform.
    Hope this helps.
    Rich Heilman
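    Applied to the original question (getting an internal table back from a custom remote-enabled FM), the same pattern would look roughly like the sketch below. Z_PROCESS_DATA and its TABLES parameter IT_DATA are invented names used only for illustration; replace them with your own FM and parameter.

    REPORT zasync_table_sketch.

    TYPES: BEGIN OF ty_row,
             key(10)   TYPE c,
             value(40) TYPE c,
           END OF ty_row.

    DATA: gt_data    TYPE STANDARD TABLE OF ty_row,
          gv_done(1) TYPE c.

    START-OF-SELECTION.

    * Start the custom RFC-enabled FM in a new task and hand it the table
      CALL FUNCTION 'Z_PROCESS_DATA'
        STARTING NEW TASK 'TASK1' DESTINATION 'NONE'
        PERFORMING task_done ON END OF TASK
        TABLES
          it_data = gt_data.

    * Wait until the ON END OF TASK form has received the results
      WAIT UNTIL gv_done = 'X'.

      DESCRIBE TABLE gt_data.
      WRITE: / 'Rows returned:', sy-tfill.

    *       FORM task_done
    FORM task_done USING taskname.
    * RECEIVE picks up the TABLES (and IMPORTING) results of the aRFC call
      RECEIVE RESULTS FROM FUNCTION 'Z_PROCESS_DATA'
        TABLES
          it_data = gt_data.
      gv_done = 'X'.
    ENDFORM.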

  • Error in the Source system while data loding through Process Chain

    Hi,
    I am facing an issue with data loads for certain extractors through a process chain. The loads for 0BPM_WIHEAD, 0BP_LOGHIST, 0BPM_OBJREL, 0BPM_DEADLINES (and 2-3 more extractors) fail daily with the message 'Error occurred in the source system', and if we repeat the same load it runs successfully.
    Regards,
    Javed

    Hi..
    It means that the extraction job is failing in the source system.
    Please check the job log in the source system: copy the request number, go to SM37 in the source system, enter * and then the request number, execute, and check the job log.
    There may be multiple reasons behind this:
    1) Connection problem: go to SM59 and test the connection. If you don't have access to SM59, go to RSA1 --> Source Systems --> right-click --> Connection Parameters --> Connection Test.
    2) Or work processes are not available, so the jobs fail with a timeout.
    In both cases you can check with the Basis team to see if they can suggest something, or you can change the process chain scheduling time if possible and schedule it at a time when fewer jobs are running.
    Regards,
    Debjani..

  • Broadcasting through process chains

    How can reports be broadcast to individual mailboxes in the company through process chains? Say the current data should be broadcast to individual e-mail accounts every time a new sales order or purchase order is created, or by the end of the day the mail with the report should be triggered automatically. Can anyone help with this?

    Hi,
    Here see the blog details.
    The goal of information broadcasting is to distribute the right information, in the appropriate format, to the right people, through different channels, at the right time.
    With the BEx Broadcaster, you can precalculate queries, query views, Web templates, reports and workbooks and broadcast them by e-mail or to the portal. In addition to the precalculated documents in various formats (HTML, MHTML, ZIP and so on) that contain historical data, you can also generate online links.
    Accessing the Broadcaster
    The broadcaster can also be accessed via the Portal through the delivered BI Role.
    You can determine the Scheduling for Information Broadcaster
    Based on a data change event triggered by a process chain.
    Based on a pre-defined time point.
    Freely definable scheduling.
    Steps to Schedule Information Broadcaster based on a data change event triggered by a process chain
    Create the Query in the Query Designer.
    Execute the report in the BEx Web Analyzer.
    Click on "Send" to create the settings for broadcasting.
    The Broadcast Wizard takes you through a series of prompts where you supply the key information required to develop a broadcast. At any time you can leave the Wizard and use the standard settings dialogs which offer more settings.
    Then schedule the broadcast. If you start the Broadcaster for a query (or template or workbook) that gets data from an InfoProvider that will be selected in the process chain, you can select the InfoProvider for Scheduling.
    Create the process chain and include the process type "Trigger Event Data Change (for Broadcaster)"; it is available under "Load Process and Post-Processing".
    The Process Chain is created including the process types: (1) Start (2) Execute InfoPackage (3) Delta Data Transfer Process (4) Activate DSO (5) Trigger Event data Change.
    When you create the Variant for the Event Data Change, using checkbox we can indicate when the Broadcast should trigger.
    As soon as that InfoProvider is affected by a Process Chain, the Broadcasting is triggered.
    After successful activation you can schedule your chain: press the "Schedule" button or choose the menu "Execution -> Schedule". The chain is scheduled as a background job, which you can see in SM37 as a job named "BI_PROCESS_TRIGGER". Unfortunately every process chain is scheduled under a job with this name; the job variant tells you which process chain will be executed. During execution, the steps defined in table RSPCPROCESSCHAIN are executed one after another, and the execution of each next step is triggered by the events defined in that table. You can watch SM37 for newly executed jobs starting with "BI_" or look at the log view of the chain.
    You can monitor the Broadcaster from the SOST transaction.
    Note:
    Depending on authorizations, end-users can schedule their Broadcasting Settings.
    Only those queries for which "Execution with Data Change in the InfoProvider" is checked when they are scheduled will be triggered by the process chain event.
    You may wish to refer to SAP Note 760775 (Settings for Information Broadcasting).
    Hope this helps.
    Regards,
    Rakesh

  • Archiving Infocube through Process Chain...

    Hi All,
    I need help creating a process chain for archiving an InfoCube. I am able to archive the InfoCube manually, but not through a process chain.
    Is it possible to archive an InfoCube through a process chain? If yes, please give the steps to create the process chain for archiving.
    Thanks in advance.
    Bandana.

    Hi,
    It is possible to archive data from an InfoCube via a process chain.
    Have a Start process followed by 'Archive Data from an InfoProvider'. The trick lies in the variants used for the archiving steps of the chain.
    Create a process by dragging in the 'Archive Data...' process type and give the variant a name; use this variant for writing the archive file. Choose your archiving process (the same archiving process you created to archive data from the InfoCube). As this is the write phase, do not check the 'Continue Open Archiving Requests' checkbox, and choose option '40 Write Phase Completed Successfully' under 'Continue Process Until Target Status'. Now enter the required selection conditions. If you want to reuse this chain, give a relative value under the 'Primary Time Restriction' tab and save this variant. This is your variant 1.
    Now drag in the same archiving process again and create a variant in which you select 'Continue Open Archiving Request' and choose option '70 Deletion Phase Confirmed and Request Completed' from the dropdown list; this variant deletes the data from the InfoCube. This is your variant 2.
    So now you have a process chain: Start --> archiving process with variant 1 (to write the archive file) --> archiving process with variant 2 (to delete the data from the cube).
    That's it!
    Regards

  • PSA Deletion through Process Chain

    Hi Experts,
    I am currently working on BW 3.5. I would like to delete old PSA requests through a process chain and need some clarification; please provide your suggestions. Thanks in advance.
    1) In BW 3.5 there is only one process type, 'Deleting Requests from the PSA', available for this activity. Does this process type delete the requests from the PSA only, or will it delete both the PSA and the change log tables? In the SDN threads, some people say it deletes the PSA requests only and others say it deletes the PSA requests as well as the change log.
    2) Currently we have six process chains, each with master and transaction data loads (data loaded through the PSA). Similarly, I am planning to create six PSA deletion process chains covering master and transaction data deletion. Here I ran into a small complication in finding the object name (PSA table name); please refer to the screenshot. Is there a shortcut to find the full list of object names (PSA tables) currently used in the daily process chain loads?
    3) For the request selection I prefer "Only successfully booked/updated requests" and did not select "Only those requests with errors that are not booked in a data target". Please share your view on this selection preference.
    http://img818.imageshack.us/img818/3963/psa1.jpg
    Thanks,
    RR

    Hi Murali,
    Thanks for the response. I do understand identifying the DataSource and the PSA retention period (days); let me elaborate on my questions a little more. Thanks.
    1) In BW 3.5 there is only one process type, 'Deleting Requests from the PSA', available for this activity. Does it delete the requests from the PSA only, or from both the PSA and the change log tables? In the SDN threads, some people say PSA only and others say PSA as well as change log. If I am not mistaken, in BW 3.5 the PSA deletion also goes through the change log; am I right?
    2) Currently we have six process chains, each with master and transaction data loads (through the PSA), and similarly I am planning to create six PSA deletion process chains covering master and transaction data deletion. The complication is in finding the object names (PSA table names); please refer to the screenshot. In the daily process chains we have many stages of master and transaction loads, so I can go to the InfoPackage level, find the DataSource and from there identify the PSA table name, but is there a simpler way to find the PSA table names?
    http://img818.imageshack.us/img818/3963/psa1.jpg
    3) For the request selection I prefer "Only successfully booked/updated requests" and did not select "Only those requests with errors that are not booked in a data target". Please share your view on this selection preference: which option is best for the normal business process, and if I select neither option, will it delete all the requests in the PSA?
    Thanks.
    RR.

  • Broadcasting Workbooks through Process Chain

    Dear friends,
    I want to broadcast workbooks through process chain. I have tried two programs
    1) RSRD_BROADCAST_STARTER
    2) RSRD_BROADCAST_BATCH
    My requirement is that a log should be mailed to me stating the success or failure of the broadcast of the workbook.
    With the first program, I am able to send the log to my e-mail ID, but I am not able to send the workbook.
    With the second program, when I execute the process chain it immediately reports that the chain has executed; after some time the log mail arrives stating success, and only after some more time does the workbook actually get broadcast.
    I want the workbook to be broadcast first and then the log to come, stating the success or failure of the precalculation of the workbook on the precalculation server.
    Please let me know if somebody knows how to do it.
    Regards
    Himanshu

    Hi Sapna,
    Yes, you have to install the precalculation server for broadcasting workbooks. The software is available at
    http://service.sap.com/swdc --> Download --> Support Packages & Patches --> Entry by Application Group --> SAP NETWEAVER --> SAP NETWEAVER 04 --> BI Precalculation
    and you can check its status in transaction RSPRECADMIN, or via
    transaction SPRO --> SAP Reference IMG --> SAP NetWeaver --> Business Intelligence --> Settings for Reporting and Analysis --> Settings for Information Broadcasting --> Administrate Precalculation Server.
    The precalculation itself is set up via Open Query --> Publish --> BEx Broadcaster; there you can create new settings (three or four tabs, one of which is 'Precalculate').
    Regards,
    Nisha Jagtap.

  • Updated no of records through process chain

    We are executing an InfoPackage and want to add a message sent by e-mail from the IP stating whether it succeeded or failed. Is it also possible to include the number of records updated by the IP in the same e-mail?
    We are using BW 3.5.

    Hi Prince,
    Kindly have a look at below links,
    Process Chains - number of records loaded in various targets
    Message Alert through Process Chain
    Hope this helps.
    Regards,
    Mani
