TIME LIMIT EXCEEDED ERROR WHILE EXECUTING DTP

Hi gurus,
I have got errors while executing a DTP.
The errors are as follows:
1. Time limit exceeded. No return of the split processes
2. Background process BCTL_DK9MC0C2QM5GWRM68I1I99HZL terminated due to missing confirmation
3. Resource error. No batch process available. Process terminated
Note: I am not executing the DTP as a background job.
As this is of high priority, a quick answer would be appreciated.
Regards
Amar.

Hi,
how would it be possible to execute a DTP in a dialog process? As far as I know, that is only possible for debugging...
In "Display Data Transfer Process" -> "Goto" -> "Settings for Batch Manager" you can edit settings such as Number of Processes or Job Class.
Additionally, take a look at table RSBATCHPARALLEL and
http://help.sap.com/saphelp_nw04s/helpdata/en/42/f29aa933321a61e10000000a422035/frameset.htm
Regards
Andreas

Similar Messages

  • Time Limit exceeded Error while updating huge number of records in MARC

    Hi experts,
    I have an interface requirement in which a third-party system sends a big file (say 3 to 4 MB) into SAP. In the proxy we
    use the BAPI BAPI_MATERIAL_SAVEDATA to save the material/plant data. Because of the huge amount of data, the SAP queues are
    getting blocked, causing the time limit exceeded issues. As the BAPI can update only a single material at a time, it is called once for each material
    we want to update.
    Below is the relevant part of the code in my proxy:
        " Call the BAPI to update the safety stock value.
        CALL FUNCTION 'BAPI_MATERIAL_SAVEDATA'
          EXPORTING
            headdata   = gs_headdata
            " clientdata and clientdatax are not passed here
            plantdata  = gs_plantdata
            plantdatax = gs_plantdatax
          IMPORTING
            return     = ls_return.
        IF ls_return-type <> 'S'.
          CALL FUNCTION 'BAPI_TRANSACTION_ROLLBACK'.
          MOVE ls_return-message TO lv_message.
          " Populate the error table and process the next record.
          CALL METHOD me->populate_error
            EXPORTING
              message = lv_message.
          CONTINUE.
        ENDIF.
    Can anyone please let me know the best possible approach for this issue?
    Thanks in Advance,
    Jitender
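    One common way to keep the queues from blocking is to keep the per-material BAPI call but close the LUW in small packages rather than holding everything open. The sketch below is illustrative only: the package size of 50, the table gt_materials and its component names are assumptions, not from the original post.

    ```abap
    " Sketch under assumptions: gt_materials (components headdata,
    " plantdata, plantdatax) and populate_error are illustrative names.
    CONSTANTS lc_package_size TYPE i VALUE 50.
    DATA: lv_counter TYPE i,
          ls_return  TYPE bapiret2.

    LOOP AT gt_materials INTO DATA(ls_material).
      CALL FUNCTION 'BAPI_MATERIAL_SAVEDATA'
        EXPORTING
          headdata   = ls_material-headdata
          plantdata  = ls_material-plantdata
          plantdatax = ls_material-plantdatax
        IMPORTING
          return     = ls_return.
      IF ls_return-type <> 'S'.
        " Roll back the current, uncommitted part of the LUW only.
        CALL FUNCTION 'BAPI_TRANSACTION_ROLLBACK'.
        me->populate_error( message = ls_return-message ).
        CONTINUE.
      ENDIF.
      lv_counter = lv_counter + 1.
      " Close the LUW after every package so no single unit of work
      " grows large enough to hit the time limit.
      IF lv_counter MOD lc_package_size = 0.
        CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
          EXPORTING
            wait = 'X'.
      ENDIF.
    ENDLOOP.

    " Commit the final partial package.
    CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
      EXPORTING
        wait = 'X'.
    ```

    Note that a rollback also discards any successful-but-uncommitted updates in the same package, so if failures are frequent it may be better to commit after every successful call instead.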

    Hi Raju,
    Use the following routine to get fiscal year/period using calday.
    *Data definition:
    DATA: l_Arg1 TYPE RSFISCPER ,
          l_Arg2 TYPE RSFO_DATE ,
          l_Arg3 TYPE T009B-PERIV .
    *Calculation:
    l_Arg2  = TRAN_STRUCTURE-POST_DATE. " This is the date that you have to supply
    l_Arg3  = 'V3'.
    CALL METHOD CL_RSAR_FUNCTION=>DATE_FISCPER(
      EXPORTING I_DATE = l_Arg2
                I_PER = l_Arg3
      IMPORTING E_FISCPER = l_Arg1  ).
    RESULT = l_Arg1 .
    Hope this solves your problem!
    Best Regards,
    SG

  • Time limit exceeded error

    Hello All,
    I am trying to execute a custom program with a variant, but I receive the time limit exceeded error [TIME_OUT].
    I am now trying to analyse why this error occurred, as I am a beginner. Any help would be greatly appreciated.
    Regards,
    Arpita.
    Moderator message: Welcome to SCN!
    Moderator message: Please Read before Posting in the Performance and Tuning Forum
    Edited by: Thomas Zloch on Oct 20, 2011 2:01 PM

    Hi Ramya,
       Your program is running in the background, so its time limit has been exceeded. Go to SM37, check the program's runtime, and if the limit is exceeded, correct the time limit.
    Regards
    Srinu

  • Time Limit exceeded error in ALV report

    I am getting the error "Time Limit Exceeded" when I execute an ALV report. Can I run the program in the background, and how do I do that? I have already optimized the query in the program, but I am still facing the same issue.

    You can process the ALV in the background by pressing F9... I guess the output would then be available as a spool in SP01.
    You may need to re-check your query... And also review the ALV field catalog and any events you are using...
    Greetings,
    Blag.

  • Error while executing DTP

    Hi gurus,
    I am trying to extract data from R/3 using a generic extractor. The data is loaded into the PSA successfully, but I am getting the following errors while executing the DTP.
    1.An error occurred while executing a transformation rule:
    The exact error message is:
    The argument 'EA' cannot be interpreted as a number
    The error was triggered at the following point in the program
    GP4D35STLXQI3SHIVNQC2FSJ7MB 791
    2.The data record was filtered out because data records with the same key
    have already been filtered out in the current step for other reasons and
    the current update is non-commutative (for example, MOVE). This means
    that data records cannot be exchanged on the basis of the semantic key.
    Please guide me accordingly.
    Regards
    Amar.

    Hi
    When mapping quantity fields, you must also add the UOM (unit of measure) to each quantity field and map it to the relevant InfoObjects.
    The semantic keys defined at the DTP can also cause issues; try adding a dummy key figure if you are using a DSO in the data flow, as the DSO uses overwrite mode.
    (Choose Semantic Groups to specify how you want to build the data packages that are read from the source (DataSource or InfoProvider). To do this, define key fields. Data records that have the same key are combined in a single data package. This setting is only relevant for DataStore objects with data fields that are overwritten. It also defines the key fields for the error stack. By defining the key for the error stack, you ensure that the data can be updated in the target in the correct order once the incorrect data records have been corrected.)
    Hope this helps and is clear.

  • Time Limit exceeded error in R & R queue

    Hi,
    We are getting the Time limit exceeded error in the R & R queue when we try to extract the data for a site.
    The error occurs with the message SALESDOCGEN_O_W. It is observed that whenever the time limit error is encountered, the usual solution is to run the job in the background. But in this case, is there any possibility of running the particular subscription for sales documents in the background?
    Any pointers on this would be of great help.
    Thanks in advance,
    Regards,
    Rasmi.

    Hi Rasmi
    I suppose that the usual answer would be to increase the timeout for the R&R queue.
    We have increased the timeout on ours to 60 mins and that takes care of just about everything.
    The other thing to check would be the volume of data that is going to each site for SALESDOCGEN_O_W. These are pretty big BDocs, and the sales force will not thank you for huge ConnTrans times.
    If you have a subscription for sales documents by business partner, then it is worth seeing if the business partner subscription could be made more intelligent to fit your needs.
    Regards
    James

  • IDoc on outbound side from XI - time limit exceeded error

    Hi,
    I have a File-to-IDoc scenario and I am creating a lot of IDocs there (20,000) in a single push. I am getting a "time limit exceeded" error on the outbound side, with a red flag in SXMB_MONI. How do I increase this limit parameter, and where: in PI or in R/3?
    Thank you,
    Olian

    Hi,
    Check the thread below...
    Re: XI timeout error
    /people/michal.krawczyk2/blog/2006/06/08/xi-timeouts-timeouts-timeouts
    Regards,
    Srini

  • Error while Executing DTP for 0ic_c03

    Hi Friends,
    While executing the 2LIS_03_BX DTP for the cube 0IC_C03, I am facing a peculiar problem.
    Executing the DTP takes a lot of time, and after a while I get the error "Status 'Processed with Errors',
    Message no. RSBK257".
    I have tried all three extraction modes in the DTP, but I get the same result. I also deleted the DTP and created it again, but I am facing the same problem.
    Please suggest me with some solutions.
    Thanks in advance

    Hi Friends
    Thanks for replying
    My colleague told me that in BI 7 it is not necessary to load the BX DataSource; I wanted to confirm whether that is true.
    Another thing: I tried to load the DataSource 2LIS_03_BF into the PSA, but I am getting the errors "Transfer structure field not contained in DataSource, Message no. R3037" and "Errors in source system, Message no. RSM340".
    Are the two errors related?
    What can be the solution for this issue?
    Thanks

  • Error while execute DTP

    Hi Experts,
    I created a new key figure on a cube, and after this I have a problem when I execute the DTP.
    Error while updating to target PS_C08T (type INFOCUBE)
    Message number: RSBK241
    I need help for this.
    Thanks a lot.

    Hi,
    I think you have added the key figure to the InfoCube. After that, you need to activate the transformation and properly map the new target key figure to the source key figure. Then activate the transformation, activate the DTP as well, and execute the DTP. Hopefully that should solve the error.
    If not, please provide some more details of the error.
    Thanks
    Mayank

  • Getting error while executing DTP

    Hello, 
    I am learning BI 7.0 and am trying to load master data from a flat file to an InfoObject. I am getting an error when I execute the DTP.
    I have 15 records in the file. Below are the messages that I see in the DTP monitor:
    - Extraction DataSource (shows a green light)
    - Filter Out New Records with the Same Key (shows a red light). The message for all 15 records is:
    'Record filtered in advance as error records with the same key exist'
    Please help me resolve this error.

    I checked my flat file; it has the following structure:
    LANG    Material Number    Mat Desc
    E       MAT001             text1
    E       MAT002             text2
    E       MAT003             text3
    I checked the transformation; it shows a 'Key' sign against both the language and material number fields. I checked the DataSource field list, and the 'Key' column is not checked there. I don't know where the key field settings need to be made. According to my understanding, the combination of language and material number forms the key, so the records cannot be duplicates: the material numbers are different even though the language field is the same.
    Please advise. Thanks.

  • Time_out - Time limit exceed - Error

    Hi,
    The following code gives the TIME_OUT runtime error.
    LOOP AT i_outtab.
      IF p_cust = 'X'.
        SELECT SINGLE kunnr FROM vbak INTO sales_cust
               WHERE kunnr = i_outtab-kunnr
                 AND erdat GE date.
        IF sy-subrc NE 0.
          DELETE i_outtab INDEX sy-tabix.
          CONTINUE.
        ENDIF.
      ENDIF.
      i_outtab1-kunnr = i_outtab-kunnr.
      i_outtab1-name1 = i_outtab-name1.
      i_outtab1-land1 = i_outtab-land1.
      i_outtab1-ernam = i_outtab-ernam.
      i_outtab1-vkorg = i_outtab-vkorg.
      i_outtab1-vtweg = i_outtab-vtweg.
      i_outtab1-spart = i_outtab-spart.
      APPEND i_outtab1.
      CLEAR i_outtab1.
    ENDLOOP.
    However, if the select query is omitted, the performance of the program is fast. Only because of this select query does it run into the runtime error.
    What is the issue?
    Thanks,
    Ezhil

    Hi Ezhil,
    Please see the code below. I have removed the select query from the loop and replaced it with a single FOR ALL ENTRIES select before the loop:
    IF p_cust = 'X'.
      DATA: temp_itab LIKE i_outtab OCCURS 0 WITH HEADER LINE.
      " One database round trip instead of one per loop pass.
      SELECT kunnr FROM vbak
             INTO CORRESPONDING FIELDS OF TABLE temp_itab
             FOR ALL ENTRIES IN i_outtab
             WHERE kunnr = i_outtab-kunnr
               AND erdat GE date.
      IF sy-subrc EQ 0.
        SORT temp_itab BY kunnr.
        LOOP AT i_outtab.
          READ TABLE temp_itab WITH KEY kunnr = i_outtab-kunnr
               BINARY SEARCH.
          IF sy-subrc EQ 0.
            i_outtab1-kunnr = i_outtab-kunnr.
            i_outtab1-name1 = i_outtab-name1.
            i_outtab1-land1 = i_outtab-land1.
            i_outtab1-ernam = i_outtab-ernam.
            i_outtab1-vkorg = i_outtab-vkorg.
            i_outtab1-vtweg = i_outtab-vtweg.
            i_outtab1-spart = i_outtab-spart.
            APPEND i_outtab1.
            CLEAR i_outtab1.
          ELSE.
            " DELETE without INDEX removes the current loop row safely.
            DELETE i_outtab.
            CONTINUE.
          ENDIF.
        ENDLOOP.
      ENDIF.
    ENDIF.
    This code should improve your performance as well.
    Regards,
    Md Ziauddin.

  • Time limit exceeded.

    Hi XI Gurus,
    We are facing the Time limit exceeded error while processing the inbound queue.
    Kindly provide a solution for this.
    Regards,
    Anguraj.

    Hi,
    check the message that caused it; the reason for the timeout will be shown inside. It is probably not the queue timeout (which can be changed in SMQR) but some other timeout; you will see it inside your XI message.
    Regards,
    Michal Krawczyk

  • IDoc Tracking: TIME LIMIT EXCEEDED

    Hello!
    We have about 15 different R/3 platforms connected to XI using the IDoc adapter. If I use the IDoc Tracking functionality in IDX5 it usually is working fine and the IDoc number in the receiving system and IDoc status are returned within seconds.
    However, for one R/3 platform it was always taking about 30 minutes until the IDoc number and IDoc status were returned. The status bar displays that the IDOC_DATE_TIME_GET function module is executed.
    Now it runs for over one hour until it stops with a TIME LIMIT EXCEEDED error message, and no IDoc numbers or statuses are returned any more. What needs to be corrected to speed up IDoc tracking for this particular system?
    Regards, Tanja

    Not sure if this would help, but this link describes how to bypass the time limit exceeded error:
    http://www.erpgenie.com/abaptips/content/view/490/62/

  • Time limit exceeded when doing MIRO

    Hi Experts,
    the time limit exceeded error occurred while executing the MIRO transaction.
    plz reply with suitable answer.
    Thanks,
    Jyosna

    >
    jyotsna shinde wrote:
    > plz reply with suitable answer.
    Moderator message - Please see Please Read before Posting in the Performance and Tuning Forum before posting - post locked
    Rob

  • CIF - Time Limit Exceeded

    Dear All,
    Working on SCM 4.1
    We are trying to CIF products across from R/3 to APO.
    Earlier the CIF for products was working fine, but today it got stuck with the inbound error in APO "Time Limit Exceeded".
    There are thousands of products, so how can I check which product is causing the issue?
    Please help me through this.
    Thanks in advance,
    Regards,
    Rajesh Patil

    Hi,
    Please check if OSS note 1254364 is applicable.
    Normally the reasons for this "Time Limit Exceeded" error can be categorized as below:
    1) Sysfails (the technical/Basis team should check the nature of the sysfail and take corrective action)
    2) liveCache performance (the technical/Basis team should check whether liveCache performance is low and then take corrective action)
    3) Master data errors, if any. From the log we can know the details.
       A quick checking rule: 1. Are products from a particular plant having more errors? 2. Can we run the products transfer job more frequently?
    Regards
    Datta
