Time_out - Time limit exceeded - Error

Hi,
The following code raises the TIME_OUT runtime error.
LOOP AT i_outtab.
  IF p_cust = 'X'.
    SELECT SINGLE kunnr FROM vbak INTO sales_cust
      WHERE kunnr = i_outtab-kunnr
        AND erdat GE date.
    IF sy-subrc NE 0.
      DELETE i_outtab INDEX sy-tabix.
      CONTINUE.
    ENDIF.
  ENDIF.
  i_outtab1-kunnr = i_outtab-kunnr.
  i_outtab1-name1 = i_outtab-name1.
  i_outtab1-land1 = i_outtab-land1.
  i_outtab1-ernam = i_outtab-ernam.
  i_outtab1-vkorg = i_outtab-vkorg.
  i_outtab1-vtweg = i_outtab-vtweg.
  i_outtab1-spart = i_outtab-spart.
  APPEND i_outtab1.
  CLEAR i_outtab1.
ENDLOOP.
However, if the SELECT is omitted, the program runs fast. It is only this SELECT inside the loop that drives the program into the runtime error.
What is the issue?
Thanks,
Ezhil

Hi Ezhilhrh,
Please see the code below. I have taken the SELECT out of the loop and replaced it with a single FOR ALL ENTRIES read.
IF p_cust = 'X'.
* Collect only the customer numbers that exist in VBAK
  DATA: BEGIN OF temp_itab OCCURS 0,
          kunnr LIKE vbak-kunnr,
        END OF temp_itab.

* Guard: FOR ALL ENTRIES with an empty driver table would select everything
  IF NOT i_outtab[] IS INITIAL.
    SELECT kunnr FROM vbak INTO TABLE temp_itab
      FOR ALL ENTRIES IN i_outtab
      WHERE kunnr = i_outtab-kunnr
        AND erdat GE date.
  ENDIF.

  SORT temp_itab BY kunnr.
  DELETE ADJACENT DUPLICATES FROM temp_itab COMPARING kunnr.

  LOOP AT i_outtab.
    READ TABLE temp_itab WITH KEY kunnr = i_outtab-kunnr
                         BINARY SEARCH.
    IF sy-subrc EQ 0.
      i_outtab1-kunnr = i_outtab-kunnr.
      i_outtab1-name1 = i_outtab-name1.
      i_outtab1-land1 = i_outtab-land1.
      i_outtab1-ernam = i_outtab-ernam.
      i_outtab1-vkorg = i_outtab-vkorg.
      i_outtab1-vtweg = i_outtab-vtweg.
      i_outtab1-spart = i_outtab-spart.
      APPEND i_outtab1.
      CLEAR i_outtab1.
    ELSE.
*     DELETE without INDEX removes the current loop line; sy-tabix is
*     no longer reliable here because READ TABLE has overwritten it
      DELETE i_outtab.
    ENDIF.
  ENDLOOP.
ENDIF.
This should improve your performance as well, since the loop no longer touches the database at all.
Regards,
Md Ziauddin.

Similar Messages

  • Time limit exceeded error

    Hello All,
    I am trying to execute a custom program with a variant, but I receive the Time limit exceeded error [TIME_OUT].
    I am now trying to analyse why this error occurred. As I am a beginner, any help would be greatly appreciated.
    Regards,
    Arpita.
    Moderator message: Welcome to SCN!
    Moderator message: Please Read before Posting in the Performance and Tuning Forum
    Edited by: Thomas Zloch on Oct 20, 2011 2:01 PM

    Hi Arpita,
       Your program's runtime has exceeded the allowed time limit. Go to SM37, check the program's runtime, and if it has been exceeded, correct it.
    Regards
    Srinu
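    If the program genuinely needs more time than the dialog limit allows (profile parameter rdisp/max_wprun_time), the usual way out is to run it as a background job, either scheduled in SM36 or created programmatically. Below is a minimal sketch, assuming a report named ZCUSTOM_REPORT with variant MYVARIANT (both hypothetical placeholders):
    DATA: lv_jobname  TYPE tbtcjob-jobname VALUE 'ZCUSTOM_JOB',
          lv_jobcount TYPE tbtcjob-jobcount.
    * Open a background job, add the report as a step, then release it
    CALL FUNCTION 'JOB_OPEN'
      EXPORTING
        jobname  = lv_jobname
      IMPORTING
        jobcount = lv_jobcount
      EXCEPTIONS
        OTHERS   = 1.
    CHECK sy-subrc = 0.
    * ZCUSTOM_REPORT and MYVARIANT stand in for the real program and variant
    SUBMIT zcustom_report USING SELECTION-SET 'MYVARIANT'
           VIA JOB lv_jobname NUMBER lv_jobcount
           AND RETURN.
    CALL FUNCTION 'JOB_CLOSE'
      EXPORTING
        jobcount  = lv_jobcount
        jobname   = lv_jobname
        strtimmed = 'X'                    " start immediately
      EXCEPTIONS
        OTHERS    = 1.
    IF sy-subrc <> 0.
      MESSAGE 'Job could not be scheduled' TYPE 'E'.
    ENDIF.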

  • Time Limit exceeded error in R & R queue

    Hi,
    We are getting Time limit exceeded error in the R & R queue when we try to extract the data for a site.
    The error occurs with the message SALESDOCGEN_O_W. It has been observed that whenever the time limit error is encountered, the usual solution is to run the job in the background. But in this case, is there any possibility of running the particular subscription for sales documents in the background?
    Any pointers on this would be of great help.
    Thanks in advance,
    Regards,
    Rasmi.

    Hi Rasmi
    I suppose that the usual answer would be to increase the timeout for the R&R queue.
    We have increased the timeout on ours to 60 mins and that takes care of just about everything.
    The other thing to check would be the volume of data that is going to each site for SALESDOCGEN_O_W. These are pretty big BDocs, and the sales force will not thank you for huge ConnTrans times.
    If you have a subscription for sales documents by business partner, then it is worth seeing if the business partner subscription could be made more intelligent to fit your needs.
    Regards
    James

  • TIME LIMIT EXCEEDED ERROR WHILE EXECUTING DTP

    Hi gurus,
    I got an error while executing a DTP.
    The errors are as follows:
    1. Time limit exceeded. No return of the split processes
    2. Background process BCTL_DK9MC0C2QM5GWRM68I1I99HZL terminated due to missing confirmation
    3. Resource error. No batch process available. Process terminated
    Note: I am not executing the DTP as a background job.
    As this is of high priority, a quick answer would be appreciated.
    Regards
    Amar.

    Hi,
    how would it be possible to execute a DTP in a dialog process? To my mind it is only possible for debugging...
    In "Display Data Transfer Process" -> "Goto" -> "Settings for Batch Manager" you can edit settings like Number of Processes or Job Class.
    Additionally, take a look at table RSBATCHPARALLEL and
    http://help.sap.com/saphelp_nw04s/helpdata/en/42/f29aa933321a61e10000000a422035/frameset.htm
    Regards
    Andreas

  • Time Limit exceeded error in ALV report

    I am getting the error "Time Limit Exceeded" when I execute an ALV report. Can I run the program in the background, and how do I do that? I have already optimized the query in the program, but I am still facing the same issue.

    You can run the ALV in the background by pressing F9 on the selection screen... I guess the output would then be available as a spool request in SP01.
    You may need to re-check your query... and also review the ALV field catalog and any events you are using...
    Greetings,
    Blag.

  • Time Limit exceeded Error while updating huge number of records in MARC

    Hi experts,
    I have an interface requirement in which a third-party system sends a big file (say 3 to 4 MB) into SAP. In the proxy we use the BAPI BAPI_MATERIAL_SAVEDATA to save the material/plant data. Now, because of the huge amount of data, the SAP queues are getting blocked, causing the time limit exceeded issues. As the BAPI can update only a single material at a time, it is called once for every material we want to update.
    Below is the relevant part of the code in my proxy:
    * Call the BAPI to update the safety stock value
        CALL FUNCTION 'BAPI_MATERIAL_SAVEDATA'
          EXPORTING
            headdata    = gs_headdata
    *       clientdata  =
    *       clientdatax =
            plantdata   = gs_plantdata
            plantdatax  = gs_plantdatax
          IMPORTING
            return      = ls_return.
        IF ls_return-type <> 'S'.
          CALL FUNCTION 'BAPI_TRANSACTION_ROLLBACK'.
          MOVE ls_return-message TO lv_message.
    *     Populate the error table and process the next record
          CALL METHOD me->populate_error
            EXPORTING
              message = lv_message.
          CONTINUE.
        ENDIF.
    Can anyone please let me know the best possible approach for this issue?
    Thanks in Advance,
    Jitender
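    One approach, sketched below under stated assumptions, is to keep the single-material BAPI but commit each successful call without WAIT, so the update task runs asynchronously instead of blocking the loop for every material; a rollback after a failed call then cannot discard earlier, already-committed materials. The table lt_materials is a hypothetical placeholder for the mapped proxy data, and whether an explicit commit is appropriate inside your server proxy depends on the queue setup:
    TYPES: BEGIN OF ty_material,
             headdata   TYPE bapimathead,
             plantdata  TYPE bapi_marc,
             plantdatax TYPE bapi_marcx,
           END OF ty_material.
    DATA: lt_materials TYPE STANDARD TABLE OF ty_material,
          ls_return    TYPE bapiret2.
    FIELD-SYMBOLS: <ls_material> TYPE ty_material.
    LOOP AT lt_materials ASSIGNING <ls_material>.
      CALL FUNCTION 'BAPI_MATERIAL_SAVEDATA'
        EXPORTING
          headdata   = <ls_material>-headdata
          plantdata  = <ls_material>-plantdata
          plantdatax = <ls_material>-plantdatax
        IMPORTING
          return     = ls_return.
      IF ls_return-type <> 'S'.
    *   Discard only this material's pending changes; earlier materials
    *   were already committed and are not affected
        CALL FUNCTION 'BAPI_TRANSACTION_ROLLBACK'.
        CONTINUE.
      ENDIF.
    * Commit without WAIT: the update task works asynchronously, so the
    * loop does not block until each material is written to the database
      CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'.
    ENDLOOP.
    Splitting the inbound file into several smaller queue messages on the sender/PI side is the other common lever against the queue blocking.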

    Hi Raju,
    Use the following routine to get fiscal year/period using calday.
    *Data definition:
    DATA: l_Arg1 TYPE RSFISCPER ,
          l_Arg2 TYPE RSFO_DATE ,
          l_Arg3 TYPE T009B-PERIV .
    *Calculation:
    l_Arg2  = TRAN_STRUCTURE-POST_DATE.  " this is the date that you have to give
    l_Arg3  = 'V3'.
    CALL METHOD CL_RSAR_FUNCTION=>DATE_FISCPER(
      EXPORTING I_DATE = l_Arg2
                I_PER = l_Arg3
      IMPORTING E_FISCPER = l_Arg1  ).
    RESULT = l_Arg1 .
    Hope it will solve your problem!
    Please assign points.
    Best Regards,
    SG

  • IDoc on outbound side from XI - time limit exceeded error

    Hi,
    I have a File-to-IDoc scenario and I am creating a lot of IDocs (20,000) in a single push. I am getting a "time limit exceeded" error on the outbound side, with a red flag in SXMB_MONI. How can I increase this limit parameter, and where is it set - in PI or in R/3?
    Thank you,
    Olian

    Hi,
    Check the threads below:
    Re: XI timeout error
    /people/michal.krawczyk2/blog/2006/06/08/xi-timeouts-timeouts-timeouts
    Regards,
    Srini

  • IDoc Tracking: TIME LIMIT EXCEEDED

    Hello!
    We have about 15 different R/3 platforms connected to XI using the IDoc adapter. If I use the IDoc Tracking functionality in IDX5, it usually works fine and the IDoc number in the receiving system and the IDoc status are returned within seconds.
    However, for one R/3 platform it always took about 30 minutes until the IDoc number and IDoc status were returned. The status bar shows that the IDOC_DATE_TIME_GET function module is being executed.
    Now it runs for over an hour until it stops with a TIME LIMIT EXCEEDED error message, and no IDoc numbers or statuses are returned any more. What needs to be corrected to speed up IDoc tracking for this particular system?
    Regards, Tanja

    Not sure if this will help, but this link discusses how to get past the time limit exceeded error:
    http://www.erpgenie.com/abaptips/content/view/490/62/

  • Time limit exceeded.

    Hi XI Gurus,
    We are facing the Time limit exceeded error while processing the inbound Queue..
    Kindly provide some solution for the same.
    Regards,
    Anguraj.

    Hi,
    check out the message that caused it - the reason for the timeout will be shown inside. It is probably not the queue timeout (which can be changed in SMQR) but some other timeout; you will see it inside your XI message.
    Regards,
    Michal Krawczyk

  • Time limit exceeded when doing MIRO

    Hi Experts,
    A time limit exceeded error occurred when running the MIRO transaction.
    plz reply with suitable answer.
    Thanks,
    Jyosna

    >
    jyotsna shinde wrote:
    > plz reply with suitable answer.
    Moderator message - Please see Please Read before Posting in the Performance and Tuning Forum before posting - post locked
    Rob

  • CIF - Time Limit Exceeded

    Dear All,
    We are working on SCM 4.1, CIFing products from R/3 to APO.
    Earlier the CIF for products was working fine, but today it got stuck with the inbound error "Time Limit Exceeded" in APO.
    There are thousands of products, so how can I check which product is causing the issue?
    Please help me through this.
    Thanks in advance,
    Regards,
    Rajesh Patil

    Hi,
    Please check if OSS note 1254364 is applicable.
    Normally the reasons for this "Time Limit Exceeded" error can be categorized as below:
    1) Sysfails (the Technical/Basis team should check the nature of the sysfail and take corrective action)
    2) liveCache performance (the Technical/Basis team should check whether liveCache performance is low and then take corrective action)
    3) Master data errors, if any. From the log we can know the details.
       Quick checks worth applying: 1. Are products from a particular plant producing more errors? 2. Can we run the product transfer job more frequently?
    Regards
    Datta

  • Error while running query "time limit exceeding"

    While running a query I am getting the error "time limit exceeded". Please help.

    Hi Devi,
    use the following links:
    queries taking long time to run
    Query taking too long
    With hopes,
    Raja Singh

  • PI 7.0: IDOCs struck in IDX5 with error "Time Limit Exceeded".

    Hi All,
    We have a File-to-IDoc scenario in PI 7.0. After mapping, the IDocs are posted from PI to the ECC system.
    On a normal day this interface works fine. Yesterday we received a huge file which resulted in the creation of about 25,000 IDocs from one single file. The mapping went fine; however, the IDocs created were not posted into the ECC system. When we monitor the IDocs using transaction IDX5 in the PI system, we find the error message "Time limit exceeded"; the user shown is PIAFUSER. To overcome this error, we increased the time limit of PIAFUSER from the default to about 1500 seconds.
    Now I want to push these IDocs from PI into the ECC system. Could you please let us know how to push these IDocs sitting in the PI system to ECC?
    We do not want to reprocess the file from the beginning. Please let us know whether it is possible to push the IDocs without reprocessing the file, and if so, how.
    Thanks in advance.
    Regards,
    Manohar Dubbaka.

    Hi,
    the help documentation is as follows:
    Check the tRFC Status  
    Use
    tRFC calls which transfer IDocs use the function module IDOC_INBOUND_ASYNCHRONOUS at reception (before release 4.0: INBOUND_IDOC_PROCESS).
    If an IDoc in the sending system has been passed to tRFC (IDoc status "03"), but has not yet been input in the receiving system, this means that the tRFC call has not yet been executed.
    Activities
    To check the status of the tRFC calls, choose Tools -> IDoc Interface/ALE -> Administration -> Monitoring -> Troubleshooting -> RFC Queue (SM58) and specify any additional selection criteria.
    The program RSARFCEX restarts unsuccessful tRFC calls.
    The option "is being executed" cannot be chosen in background processing.
    Best Regards,
    Erik Hubers
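    If you prefer to trigger the restart from your own program rather than running RSARFCEX in SE38, a minimal sketch (its selection criteria, such as destination and date, are entered on its own selection screen, which this simply brings up):
    * Restart unsuccessful tRFC calls via the standard report's own
    * selection screen - same effect as running it from SE38
    SUBMIT rsarfcex VIA SELECTION-SCREEN AND RETURN.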

  • TRFC error "time limit exceeded"

    Hi Prashant,
    There has been no reply to my thread below...
    Hi Prashant,
    We are facing this issue quite often, as I stated in my previous threads.
    I have already followed all the steps you mentioned, and I furnished the job log and tRFC details for reference long back.
    I posted this issue one month back with full details, including the workaround we follow to execute this element successfully.
    A number of times I have stated that I need to know the root cause and a permanent solution, as the log clearly states that it is due to stuck LUWs (source system).
    Even after executing the LUWs manually, the status is the same (the request is still running and its status is yellow).
    I have no idea why this is happening to this element in particular, as we have sufficient background jobs.
    Do we need to change some settings, such as increasing or decreasing the data package size, or something else, to resolve the issue permanently?
    For you I am giving the details once again:
    Data flow: Standard DataSource --> PSA --> Data Target (DSO)
    In the process monitor the request is yellow, with no clear error message. Under "Update" it shows 0 records updated and a missing message in yellow; apart from this, the status against each log is green.
    Job log: the job finished with a TRFCSSTATE=SYSFAIL message.
    tRFCs: time limit exceeded.
    Workaround we currently follow: set the request to green and update manually from PSA to the data target, after which the job completes successfully.
    Can you please tell me how to proceed in this scenario to resolve the issue? I have been waiting on this for a long time now.
    So far I have not got any clue; whatever I have investigated, I get replies up to that point and no further update beyond it.
    with regards,
    musai

    Hi,
    You have mentioned that you have already checked for LUWs, so the problem is not there now.
    In the source system, go to WE02 and check for IDocs of type RSRQST and RSINFO. If any of them are in yellow status, take them to BD87 and process them. If the processed IDoc is of type RSRQST, it will create the job in the source system for carrying out the data load; if it is of type RSINFO, it will finish the data load on the SAP BI side as well.
    If any are in red, check the reason.

  • Runtime error(Time limit exceeds)after executing select query

    Dear experts, whenever I execute the SELECT query in this Z program I get the runtime error that the time limit was exceeded. I am using an inner join and INTO TABLE, but I am still getting the error. How can I resolve it?
    SELECT LIKP~VBELN LIKP~WADAT_IST LIKP~VEHICLE_NO LIKP~TRNAME
              LIKP~VEHI_TYPE LIKP~LR_NO LIKP~ANZPK LIKP~W_BILL_NO
              LIKP~SEALNO1                                       " Seal NO1
              LIKP~SEALNO2                                       " Seal NO2
              LIPS~LFIMG
              VBRP~VBELN VBRP~VGBEL VBRP~MATNR VBRP~AUBEL VBRP~FKIMG
              VBAK~AUART
              VBRK~FKART VBRK~KNUMV VBRK~FKSTO
              FROM LIKP INNER JOIN LIPS ON LIKP~VBELN EQ LIPS~VBELN
                        INNER JOIN VBRP ON LIKP~VBELN EQ VBRP~VGBEL
                        INNER JOIN VBAK ON VBRP~AUBEL EQ VBAK~VBELN
                        INNER JOIN VBRK ON VBRP~VBELN EQ VBRK~VBELN
              INTO TABLE  I_FINAL_TEMP
              WHERE LIKP~VSTEL = '5100' AND
                 LIKP~WADAT_IST IN S_WADAT  AND
                    VBRP~AUBEL IN S_AUBEL AND
                    VBAK~AUART IN ('ZJOB','ZOR') AND
                    VBRK~FKART IN S_FKART AND
    *               VBRK~FKART IN ('ZF8','ZF2','ZS1') AND
                    VBRK~FKSTO NE 'X'.
    When I debug the SELECT query, the cursor does not move to the next step; after 15-20 minutes I get the runtime error (time limit exceeded).
    How can I resolve this?

    It looks like you are trying to fetch the whole SD document flow in a single query.
    First, check that the database statistics for these tables are up to date in the system (check with your Basis team), especially if this query was working fine earlier.
    Most of the tables involved are huge-volume tables, and here they are queried without a full primary key.
    Is there a secondary index on LIKP for VSTEL and WADAT_IST?
    My suggestion would be to split the selection into separate queries and make use of primary or existing secondary indexes to fetch the desired result, if possible (see the sketch below). For testing purposes, split the queries and find out which one takes the most time and which needs an index by taking an SQL trace in ST05.
    Also take an ST05 trace of this query in the debugger (New Debugger -> Special Tools -> Trace -> ST05/SE30).
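    As a rough illustration of that split, assuming the same tables and selection-screen fields as the original query (the reduced field lists and work areas here are only for the sketch):
    * Step 1: hit LIKP first - VSTEL and WADAT_IST carry the selective
    * conditions; this is where a secondary index would help
    DATA: BEGIN OF ls_likp,
            vbeln     LIKE likp-vbeln,
            wadat_ist LIKE likp-wadat_ist,
          END OF ls_likp,
          lt_likp LIKE TABLE OF ls_likp.
    SELECT vbeln wadat_ist FROM likp
      INTO TABLE lt_likp
      WHERE vstel = '5100'
        AND wadat_ist IN s_wadat.
    * Step 2: read the billing items only for those deliveries; check in
    * ST05 whether the access via VGBEL also needs an index
    DATA: BEGIN OF ls_vbrp,
            vbeln LIKE vbrp-vbeln,
            vgbel LIKE vbrp-vgbel,
            matnr LIKE vbrp-matnr,
            aubel LIKE vbrp-aubel,
            fkimg LIKE vbrp-fkimg,
          END OF ls_vbrp,
          lt_vbrp LIKE TABLE OF ls_vbrp.
    IF NOT lt_likp[] IS INITIAL.   " FOR ALL ENTRIES must never run empty
      SELECT vbeln vgbel matnr aubel fkimg FROM vbrp
        INTO TABLE lt_vbrp
        FOR ALL ENTRIES IN lt_likp
        WHERE vgbel = lt_likp-vbeln
          AND aubel IN s_aubel.
    ENDIF.
    * Step 3: VBAK and VBRK headers can be fetched the same way with
    * FOR ALL ENTRIES on lt_vbrp, then everything merged in memory.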
