Parallel Processing: How to Handle Resource Failure?

Hi,
I have implemented parallel processing / asynchronous RFC calls in my system because we have to process millions of records and performance is important. My program works fine in Development and Quality for a small number of records, but during SVT I am encountering the RESOURCE_FAILURE exception. So far I have tried waiting for some time and then processing again, and on failure I have also tried to fall back to sequential processing; but the second approach (executing a normal FM call on RESOURCE_FAILURE) did not work, as it ends up terminating the parallel processing.
Any pointers on how to handle this would be appreciated.
Regards,
Deepak Bhalla

Handling the RESOURCE_FAILURE exception: As each parallel processing task is dispatched, the SAP system counts down the number of resources (dialog work processes) available for processing additional tasks. This count goes up again as each parallel processing task completes and returns to your program.
Should your parallel processing tasks take a long time to complete, the parallel processing resources may temporarily run out. In this case, CALL FUNCTION returns the exception RESOURCE_FAILURE. This simply means that all dialog work processes in the RFC group that your program is using are currently in use.
Your program must now wait until resources become available and then re-issue the CALL FUNCTION that failed. In the sample program, we use a simple, reasonably failsafe wait mechanism: the program waits for parallel processing tasks to return, freeing up resources. The WAIT also specifies an initial timeout of 1 second. If the CALL FUNCTION fails again, the WAIT is repeated with a longer timeout. You can increase the timeouts if you expect your parallel tasks to take longer to complete. You should also add code to exit from the retry loop after a suitable number of iterations.
Use the WAIT statement.
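For illustration only, here is a minimal sketch of such a retry loop. The function module Z_PROCESS_PACKET, the RFC server group 'parallel_generators', and the variables gv_taskname, gt_packet, gv_snd_jobs and gv_rcv_jobs (the last of which is incremented in the RECEIVE_RESULTS callback form) are placeholders, not from this thread:
<code>
* Dispatch one packet; on RESOURCE_FAILURE wait until already running
* tasks have returned, then retry with an increasing timeout.
DATA lv_retries TYPE i VALUE 0.

DO.
  CALL FUNCTION 'Z_PROCESS_PACKET'              "placeholder RFC-enabled FM
    STARTING NEW TASK gv_taskname               "unique task name per packet
    DESTINATION IN GROUP 'parallel_generators'  "placeholder RFC server group
    PERFORMING receive_results ON END OF TASK   "callback increments gv_rcv_jobs
    TABLES
      it_packet             = gt_packet         "current packet of records
    EXCEPTIONS
      communication_failure = 1
      system_failure        = 2
      resource_failure      = 3
      OTHERS                = 4.

  CASE sy-subrc.
    WHEN 0.
      gv_snd_jobs = gv_snd_jobs + 1.            "dispatched, leave the retry loop
      EXIT.
    WHEN 3.
      lv_retries = lv_retries + 1.
      IF lv_retries > 10.                       "give up after a sensible number of attempts
        MESSAGE 'No parallel resources available' TYPE 'E'.
      ENDIF.
*     Wait until dispatched tasks have returned (or the timeout expires),
*     then re-issue the CALL FUNCTION.
      WAIT UNTIL gv_rcv_jobs >= gv_snd_jobs UP TO lv_retries SECONDS.
    WHEN OTHERS.
      MESSAGE 'RFC system or communication failure' TYPE 'E'.
  ENDCASE.
ENDDO.
</code>
The RECEIVE_RESULTS form must call RECEIVE RESULTS FROM FUNCTION 'Z_PROCESS_PACKET' and increase gv_rcv_jobs, so that the WAIT condition can become true.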
Hope this resolves your issue.
- Raj

Similar Messages

  • How to handle delta failure in COPA?

    Hi folks,
    I would appreciate it if anyone could take me through how to handle delta load failure in COPA. Also, I need to know whether we have any other method in the 2004 release to handle the delta issue in COPA.
    No doubt a step-by-step walkthrough will definitely gain full points.
    Regards
    GTR

    Hi Indu,
    I will try to implement this HowTo and see what happens. I hope it works.
    Thanks for your advice.
    Kind regards,
    Ali

  • Parallel processing - How to resolve

    Hi,
    Can anyone help to optimize this processing of two threads trying to access the same resource? Thanks.
    <code>
    create table p1 ( col integer );
    create table p2 ( col integer );
    create table shared_resource ( col integer );
    alter table shared_resource add constraint shared_res_pk primary key (col);
    -- Table used to track rows that could not be inserted
    -- (assumed by the two blocks below).
    create table track_lock ( col integer, process varchar2(10) );

    -- Initialize records for p1 and p2
    declare
      i integer;
    begin
      i := 1;
      for x in 1 .. 50000 loop
        insert into p1 values (x);
        insert into p2 values (x);
        commit;
        i := i + 1;
      end loop;
    end;
    /

    -- Thread 1: runs in parallel with Thread 2, inserting the same
    -- records into table shared_resource
    declare
      i integer;
    begin
      for x in (select * from p1) loop
        begin
          insert into shared_resource values (x.col);
          commit;
        exception
          when others then
            insert into track_lock (col, process) values (x.col, 'p1');
            commit;
        end;
      end loop;
      commit;
    end;
    /

    -- Thread 2:
    declare
      i integer;
    begin
      for x in (select * from p2) loop
        begin
          insert into shared_resource values (x.col);
          commit;
        exception
          when others then
            insert into track_lock (col, process) values (x.col, 'p2');
            commit;
        end;
      end loop;
    end;
    /
    </code>

    Hi, thanks for the reply. Sure, I can use plain SQL to do the insertion, but my purpose here is to illustrate a concurrency issue and how to resolve it optimally.
    In this case, I am trying to illustrate two independent PL/SQL data loads running in parallel, trying to load the same table.

  • How to handle resource conflicts when adding nonworking time.

    I am creating a project plan in MS Project 2013. It is a "schedule from finish date" style setup, with T-minus tasks counting back from the project finish.
    When I enter nonworking days for the first resource, everything is fine. Then, when I enter the second resource that is off during that same week, I get a conflicting-resource error message asking me to change the nonworking days or the nonworking type.
    It is Christmas, and team members will be off at the same time during Christmas week and New Year's week. How do I manage this within MS Project 2013?

    Hi,
    I deleted the duplicated thread.
    What error message do you have exactly?
    I'd suggest you create a project calendar and enter the holidays in it as exceptions, since that is much easier than entering the Christmas holidays for every resource. You shouldn't get any error message doing this. Please test it and tell us if it works.
    Hope this helps,
    Guillaume Rouyre, MBA, MVP, P-Seller

  • How to enable and monitor parallel processing in Oracle

    Hi All,
    I have 2 short questions:
    1. When we want parallel processing, we can either use a parallel hint in the query, or alter a table to be parallel. My question is: what is the difference between the following 2 syntaxes?
    a. ALTER TABLE myTable PARALLEL (DEGREE 3);
    b. ALTER TABLE myTable PARALLEL 3;
    Does the "DEGREE" keyword make any difference, or are both statements the same?
    2. When we enable parallel processing, how can we monitor Oracle processes to confirm that a certain table is actually being processed by multiple threads of a single user process?
    An early response would be highly appreciated. Thanks.

    1) The PARALLEL clause lets you change the default degree of parallelism for queries and DML on the table.
    2) PARALLEL DEGREE specifies the number of query server processes that can scan the table in parallel. Either specify a positive integer, or DEFAULT, which signifies using the initialization parameter.
    For further details, check http://mywebsys.com/oracle/syntax/view_syntax.php?id=23
    Thanks

  • How to monitor parallel processing

    Hi All,
    I have 2 short questions:
    1. When we want parallel processing, we can either use a parallel hint in the query, or alter a table to be parallel. My question is: what is the difference between the following 2 syntaxes:
    a. ALTER TABLE myTable PARALLEL (DEGREE 3);
    b. ALTER TABLE myTable PARALLEL 3;
    Does the "DEGREE" keyword make any difference, or are both statements the same?
    2. When we enable parallel processing, how can we monitor Oracle processes to confirm that a certain table is actually being processed by multiple threads of a single user process?
    An early response would be highly appreciated. Thanks.

    user566817 wrote:
    2. When we enable parallel processing, how can we monitor Oracle processes to confirm that a certain table is actually being processed by multiple threads of a single user process?
    There are a number of virtual performance views that can be used. Please refer to the Oracle® Database Reference guide for details on these.
    Had a look through my scripts and I have this one.. cannot recall if I "borrowed" it from somewhere and customised it, or how old it is.. but it should (hopefully) still be mostly correct. It uses the virtual view v$px_process to determine the list of current PQ slaves in the pool and, if they are used, maps them to the Oracle session using them.
    select  distinct
            x.server_name           as "PQ",
            x.status                as "Status",
            x.sid                   as "OraPID",
            w2.sid                  as "Parent OraPID",
            v.osuser                as "O/S User",
            v.schemaname            as "User",
            w1.event                as "Child Wait",
            w2.event                as "Parent Wait"
    from    v$px_process    x,
            v$lock          l,
            v$session       v,
            v$session_wait w1,
            v$session_wait w2
    where   x.sid != l.sid(+)
    and     to_number (substr(x.server_name,3)) = l.id2(+)
    and     x.sid = w1.sid(+)
    and     l.sid = w2.sid(+)
    and     x.sid = v.sid(+)
    and     nvl(l.type,'PS') = 'PS';
    Use at own risk - best would be to verify that this is still valid using the Reference Guide, or create similar queries using the available V$ views for the details you want to see (e.g. SQL statement executed per PQ slave, etc.).

  • Parallel processing in process chain

    Hi All,
    In 3.x I created a process chain. After the DSO we have the "Activate DataStore Object Data" step, and in that step I activated parallel processing.
    How do I remove the parallel processing (i.e. how do I disable it)?
    Thanks in advance.

    Go to t-code RSODSO_SETTINGS, enter your DSO name and choose Change. On the maintenance of runtime parameters screen, in the "Parameter for Activation" section, click Change Process Param and change the number of processes.
    If the value is 1, it should state Serial Processing.
    If the value is more than 1, it should state Parallel Processing.
    Modify the value and save, then re-enter "Change Process Param" to see the changes.

  • How to achieve parallel processing in a single request?

    Hi all,
    I have a method in a session EJB that will perform some business logic before it returns an answer to the client. The logic is to collect data from the application's database and two external systems, and then send all the data to a third external system to get a response, which is sent back to the client. Each external system is quite slow, so I would like to do all the data collection concurrently, with parallel processing. How should I handle this? I'm not allowed to create my own threads in EJBs. Can I use MDBs in some way? To the calling client this should be a synchronous call...
    Grateful for any suggestions
    Cheers
    Anders =)

    Usually, the request is received by a component located in the web container, for example via an HTTP request (including Web Services). Such a component is allowed to start threads to enable parallel processing. Now, if for some reason the request arrives directly at the EJB level and you cannot move its receiver to a web component, I think JMS is not a viable solution, because you would switch to asynchronous processing and have no way to make your EJB wait for the responses while preserving the client request (waiting implies programmatic life-cycle management, which is forbidden in the EJB container). Maybe a resource adapter (JCA) can provide a solution. A resource adapter acts like a datasource (a datasource is a specialization of a resource adapter), and thus it is a logical way to implement an adapter to an external, possibly non-J2EE, resource, as the name implies :) But I don't have enough knowledge of JCA to be sure of this.
    Hope it helps.
    Bruno Collet
    http://www.practicalsoftwarearchitect.com

  • How to handle multiple records in BPMN process

    Hi All,
    We are using Oracle BPM 11g. In my requirement, I am using the database adapter to get the data from a table, and I need to validate each record and update the status of that record from the BPM process. But I don't know how to handle it if multiple records come at a time. Can anybody please help out with this problem?
    Thanks in advance.
    Narasimha Rao.

    Can you have a look at this post: http://redstack.wordpress.com/2010/09/30/iteratingtraversing-arrays-in-bpm/
    It's solving a different problem, but the key is that it's using a multi-instance subprocess to iterate over an array of "things" that need to be acted on. In your case it's the set of results from the DB query rather than the set of tests in the example, but the principle is the same. You'd take the collection of rows from the DB and process them in a multi-instance subprocess. The text that begins with the following would be a good place to start:
    "Now let’s implement the body of our process. We will use the Subprocess object to handle the traversal of the array of tests. Drag a Subprocess from the component palette on the right into the process and drop it on the line between the Start and End nodes."
    In the loop characteristics you'd define whether you want to execute serially or in parallel.

  • Error in handling Print Params In Parallel Processing of background jobs

    Hi Friends,
    My requirement is to optimize the performance of the standard program RELEABL1, which takes a long time to complete when scheduled in the background. For that I have created a Z-program which splits the input data and runs the jobs in parallel. I am using the SUBMIT statement and the JOB_OPEN and JOB_CLOSE function modules to schedule the standard program RELEABL1 in the background with the input from my Z-program. The problem is that there is a push button "PRINT PARAMETERS" next to the Execute button on the selection screen of the standard program RELEABL1, in which the printer details have to be maintained. Whenever I schedule the job in the background, it throws an error stating "Define the Print Parameter First". I have tried all possible combinations but am not able to handle this through my Z-program; otherwise my program works fine. Can someone please guide me on how to handle these print parameters, either through SUBMIT or in whatever way is possible.
    Thanks & Regards,
    Balaji.K
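    One common way to avoid the "Define the Print Parameter First" message (a sketch only, not from this thread; the selection table, job name and job count variables are placeholders) is to determine print parameters without a dialog via GET_PRINT_PARAMETERS and pass them to SUBMIT ... TO SAP-SPOOL ... VIA JOB:
    <code>
    DATA: ls_print_params TYPE pri_params,
          lv_valid        TYPE c,
          lt_selpar       TYPE TABLE OF rsparams,      "selection values for this packet
          lv_jobname      TYPE tbtcjob-jobname,        "as used for JOB_OPEN
          lv_jobcount     TYPE tbtcjob-jobcount.       "returned by JOB_OPEN

    * Determine print parameters without showing the print dialog.
    CALL FUNCTION 'GET_PRINT_PARAMETERS'
      EXPORTING
        no_dialog      = 'X'
      IMPORTING
        out_parameters = ls_print_params
        valid          = lv_valid.

    IF lv_valid = 'X'.
    * Hand the print parameters to the background submission; the job
    * itself is still opened with JOB_OPEN and released with JOB_CLOSE.
      SUBMIT releabl1
        USING SELECTION-TABLE lt_selpar
        TO SAP-SPOOL SPOOL PARAMETERS ls_print_params
        WITHOUT SPOOL DYNPRO
        VIA JOB lv_jobname NUMBER lv_jobcount
        AND RETURN.
    ENDIF.
    </code>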

    Hi Balaji,
    We have the same performance problem. Eight processes in parallel, and it still suffers from bad performance. How many subscribers are there on the system and how many processes do you use?
    Best Regards,
    Ugur Uygan

  • Parallel Processing - Rechecking number of available resources.

    Hi  SAP Gurus,
    Does anyone have an idea how to determine the number of available resources when using a parallel processing / multithreading approach to optimize a program? I was able to determine the number of free resources by calling FM SPBT_INITIALIZE, but I wasn't able to perform another similar call to this FM (the exception PBT_ENV_ALREADY_INITIALIZED is triggered) for the purpose of rechecking the currently available resources from time to time. Any idea?
    Thanks,
    Allex

    Hi,
    Insert this after the call to SPBT_INITIALIZE:
    case sy-subrc.
      when 0. "ok
      when 3.
        call function 'SPBT_GET_CURR_RESOURCE_INFO'
          importing
    *       max_pbt_wps  =
            free_pbt_wps = gv_maxno_pbt_available
          exceptions
            internal_error              = 1
            pbt_env_not_initialized_yet = 2
            others                      = 3.
      when others.
    endcase.
    kind regards,
    hp
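    For completeness, a sketch of the full pattern (variable names and the RFC group are illustrative; verify the exact parameter names in SE37): call SPBT_INITIALIZE once, and if it reports that the environment is already initialized, re-read the currently free work processes with SPBT_GET_CURR_RESOURCE_INFO:
    <code>
    DATA: gv_max_wps  TYPE i,
          gv_free_wps TYPE i.

    CALL FUNCTION 'SPBT_INITIALIZE'
      EXPORTING
        group_name                     = 'parallel_generators'  "placeholder RFC group
      IMPORTING
        max_pbt_wps                    = gv_max_wps
        free_pbt_wps                   = gv_free_wps
      EXCEPTIONS
        invalid_group_name             = 1
        internal_error                 = 2
        pbt_env_already_initialized    = 3
        currently_no_resources_avail   = 4
        no_pbt_resources_found         = 5
        cant_init_different_pbt_groups = 6
        OTHERS                         = 7.

    IF sy-subrc = 3.
    * Environment already initialized earlier in the program:
    * just re-read the currently free dialog work processes.
      CALL FUNCTION 'SPBT_GET_CURR_RESOURCE_INFO'
        IMPORTING
          max_pbt_wps                 = gv_max_wps
          free_pbt_wps                = gv_free_wps
        EXCEPTIONS
          internal_error              = 1
          pbt_env_not_initialized_yet = 2
          OTHERS                      = 3.
    ENDIF.
    </code>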

  • How to do parallel processing with dynamic internal table

    Hi All,
    I need to implement parallel processing that involves dynamically created internal tables. I tried doing so using RFC function modules (using STARTING NEW TASK and similar methods) but didn't succeed: this requires RFC-enabled function modules, and RFC-enabled function modules do not allow generic data types (STANDARD TABLE), which are needed for passing dynamic internal tables. My exact requirement is as follows:
    1. I have a large chunk of data in two internal tables; one of them is formed dynamically, and hence its structure is not known at the time of coding.
    2. This data has to be processed together to generate another internal table, whose structure is pre-defined. But this data processing is taking a very long time, as the number of records is close to a million.
    3. I need to divide the dynamic internal table into packets of (say) 1000 records each, pass each packet to a function module, and submit it to run in another task. Many such tasks will be executed in parallel.
    4. The function module running in parallel can insert the processed data into a database table, and the main program can access it from there.
    Unfortunately, due to the limitation of not allowing generic data types in RFC, I'm unable to do this. Does anyone have any idea how to implement parallel processing using dynamic internal tables under these conditions?
    Any help will be highly appreciated.
    Thanks and regards,
    Ashin
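    One workaround sometimes used (a sketch only, not from this thread) is to serialize the dynamic table to XML on the caller side and rebuild it inside the RFC function module. It assumes the dynamic table's line type is a DDIC structure whose name can be passed along, and an RFC-enabled function module Z_PROCESS_PACKET_XML with string parameters; both are hypothetical here:
    <code>
    * Caller side: <lt_dyn> points to the dynamically created packet,
    * gv_struct holds the name of its DDIC line type.
    FIELD-SYMBOLS <lt_dyn> TYPE STANDARD TABLE.
    DATA: gv_struct TYPE string,
          gv_xml    TYPE string,
          gv_task   TYPE char8 VALUE 'TASK0001'.      "unique task name per packet

    CALL TRANSFORMATION id SOURCE tab = <lt_dyn> RESULT XML gv_xml.

    CALL FUNCTION 'Z_PROCESS_PACKET_XML'              "hypothetical RFC-enabled FM
      STARTING NEW TASK gv_task
      DESTINATION IN GROUP DEFAULT
      EXPORTING
        iv_struct = gv_struct                         "DDIC line type name
        iv_xml    = gv_xml.                           "serialized packet

    * Inside Z_PROCESS_PACKET_XML: rebuild a table of the same type,
    * deserialize the packet, then insert the results into a DB table.
    DATA lr_tab TYPE REF TO data.
    FIELD-SYMBOLS <lt_data> TYPE STANDARD TABLE.

    CREATE DATA lr_tab TYPE STANDARD TABLE OF (iv_struct).
    ASSIGN lr_tab->* TO <lt_data>.
    CALL TRANSFORMATION id SOURCE XML iv_xml RESULT tab = <lt_data>.
    </code>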

    Try the code below:
      DATA: w_subrc TYPE sy-subrc.
      DATA: w_infty(5) TYPE  c.
      data: w_string type string.
      FIELD-SYMBOLS: <f1> TYPE table.
      FIELD-SYMBOLS: <f1_wa> TYPE ANY.
      DATA: ref_tab TYPE REF TO data.
      CONCATENATE 'P' infty INTO w_infty.
      CREATE DATA ref_tab TYPE STANDARD TABLE OF (w_infty).
      ASSIGN ref_tab->* TO <f1>.
    * Create dynamic work area
      CREATE DATA ref_tab TYPE (w_infty).
      ASSIGN ref_tab->* TO <f1_wa>.
      IF begda IS INITIAL.
        begda = '18000101'.
      ENDIF.
      IF endda IS INITIAL.
        endda = '99991231'.
      ENDIF.
      CALL FUNCTION 'HR_READ_INFOTYPE'
        EXPORTING
          pernr           = pernr
          infty           = infty
          begda           = '18000101'
          endda           = '99991231'
        IMPORTING
          subrc           = w_subrc
        TABLES
          infty_tab       = <f1>
        EXCEPTIONS
          infty_not_found = 1
          OTHERS          = 2.
      IF sy-subrc <> 0.
        subrc = w_subrc.
      ELSE.
      ENDIF.

  • How to define "leading" random number in Infoset for parallel processing

    Hello,
    In Bankanalyzer we use an Infoset which consists of a selection across 4 ODS tables to gather data.
    No matter which PACKNO fields we check or uncheck in the Infoset definition screen (TA RSISET), the parallel framework always selects the same PACKNO field from one ODS table.
    Unfortunately, the table that is selected by the framework is not suitable, because our
    "leading" ODS table, which holds most of our selection criteria, is another one.
    How do we "convince" the parallel framework to select our leading table for the specification
    of the PACKNO in addition (this would be about 20 times faster due to better select options)?
    We even tried to assign "alternate characteristics" to the PACKNOs we do not like to use,
    but it seems that note 999101 only fixes this for non-system fields.
    For the random number, however, a different form routine is used in /BA1/LF3_OBJ_INDEX_READF01:
    fill_range_random instead of fill_range.
    Has anyone managed to assign the PACKNO of his choice to the infoset selection?
    How?
    Thanks in advance
    Volker

    Well, it is a bit more complicated.
    ODS one, which the parallel framework selects to be the one delivering the PACKNO,
    is about equal in size (~120 GB each) to ODS two, which has two selective fields that cut down the
    amount of data to be retrieved.
    Currently we execute the generated SQL in the best possible manner (by faking some stats).
    The problem is that I'd like to have a statement that has the PACKNO in that very same table.
    PACKNO is a generated random number specifically meant to be used for parallel processing.
    The job starts about 100 slaves.
    Each slave gets a packet to be processed from the framework, which is internally represented
    by a BETWEEN clause on this PACKNO. This is joined against ODS2, and then the selective fields
    can be compared, resulting in 90% of the already fetched rows being discarded.
    Basically it goes like:
    select ...
    from
      ods1 T_00,
      ods2 T_01,
      ods3 T_02,
      ods4 T_03
    where
    ... some key equivalence join-conditions ...
    AND  T_00.PACKNO BETWEEN '000000' and '000050' -- very selective on T_00
    AND  T_01.TYPE = '202'  -- selective Value 10% on second table
    I'm trying to change this to
    AND  T_01.PACKNO BETWEEN '000000' and '000050'
    AND  T_01.TYPE = '202'  -- selective Value 10%
    so I can use a combined index on T_01 (TYPE, PACKNO).
    This would be about 10 times more selective on the driving table, and due to the fact
    that T_00 would be joined for just the rows I need, about 20-30 times faster by my calculation.
    It really boosts performance when I do this in SQL*Plus.
    Hope this clarifies it a bit.
    The problem is that I cannot change the code, either the part doing the
    build of the packets or the one that executes the application.
    I need to change the Infoset, so that the framework decides to build
    proper SQL with T_01.PACKNO instead of T_00.PACKNO.
    Thanks a lot
    Volker

  • How to map expdp parallel processes to output files

    How do I map an expdp parallel process to its output file while it is running?
    Say I use expdp dumpfile=test_%U.dmp parallel=5 ...
    Each parallel process writes to its related output file; I want to know the mapping at run time.

    I'm not sure if this information is reported in the status command, but it's worth a shot. You can get to the status command in 2 ways:
    If you are running a Data Pump job from a terminal window, then while it is running, type Ctrl-C and you will get the Data Pump prompt, either IMPORT> or EXPORT>.
    IMPORT> status
    If you type status, it will tell you a bunch of information about the job and then about each process. It may have dumpfile information in there.
    If you run it interactively, then you need to attach to the job. To do this, you need to know the job name. If you don't know it, you can look at sys.dba_datapump_jobs if you have the privileges, or sys.user_datapump_jobs if not. You will see a job name and a schema name. Once you have that, you can:
    expdp user/password attach=schema.job_name
    This will bring you to the EXPORT>/IMPORT> prompt. Type status there.
    Like I said, I'm not sure if file name information is included, but it might be. If it is not there, then I don't know of any other way to get it.
    Dean

  • How to handle xml message in proxy inbound processing?

    Hi Experts,
    I have a scenario that is SOAP Client ===> XI ===> ECC.
    But I don't need to use XI mapping; I skip the mapping
    in XI and use the generated proxy for inbound processing.
    Here is the message structure:
    <commodityList>
      <commodity>
        <detailNo>303303</detailNo>
        <makerName>sony</makerName>
        <ChargeInfoList>
          <productId>aaaa</productId>
          <name>bbb</name>
        </ChargeInfoList>
      </commodity>
    </commodityList>
    When I sent the message without field entries for <productId>
    and <name>, I got the following response in the SOAP client:
    <commodityList>
      <commodity>
        <detailNo>303303</detailNo>
        <makerName>sony</makerName>
      </commodity>
    </commodityList>
    The field tag <ChargeInfoList> is not displayed,
    but I want it to be displayed, as below:
    <commodityList>
      <commodity>
        <detailNo>303303</detailNo>
        <makerName>sony</makerName>
        <ChargeInfoList>
      </commodity>
    </commodityList>
    In case a field entry is empty, how do I make the response
    contain the tag?
    As far as I know, there is a CONTROLLER in the proxy,
    but I don't know whether it is relevant to this
    case or how to handle it.
    Brand

    Hi Mrudula,
    As far as I know, there is no content conversion methodology for HTTPS as the receiver adapter.
    Also you can read through these links to confirm the same:
    http://help.sap.com/saphelp_nw04/helpdata/en/0d/5ab43b274a960de10000000a114084/content.htm
    http://publib.boulder.ibm.com/infocenter/wbihelp/v6rxmx/index.jsp?topic=/com.ibm.wbia_adapters.doc/doc/sap_xi/sapximst30.htm
    SAP NetWeaver - XML Communication Interface (CA-XML) [original link is broken]
    Regards,
    abhy
    note: reward the helpful.
