Time limit exceeded during activation of ODS.

Hi all,
I am loading data to 0FIA_DS12 from 80FIA-DS11. While activating the ODS, it gives me these errors:
Time limit exceeded. No return of the split processes
Background process BCTL_46TLB77BCWTDNQIKN0772P16K terminated due to missing confirmation
Error during confirmation of process 000002
I don't understand this error; I tried to activate the ODS again, but it did not work.
Could anybody help me in this regard? It is a bit urgent, as we are working in production.
Thanks in advance

Hi,
I am getting this short dump:
TYPELOAD_NEW_VERSION
Short text
    A newer version of data type "/BI0/AFIA_DS1240" was found than the one required.
What happened?
    Runtime error
    The current ABAP program "GP46APBHXXAZADBZKVPTC7TA81O" had to be terminated
    because one of the statements could not be executed at runtime.
What can you do?
    Restart the program.
    If the error persists, contact your SAP administrator.
    You can use the ABAP dump analysis transaction ST22 to view and manage
    termination messages, in particular for long-term reference.
Error analysis
    The data type "/BI0/AFIA_DS1240" was loaded from the database during the
    program run.
    However, a type of a newer version than the one required was found here.

Similar Messages

  • Program terminated: Time limit exceeded, ABAP performance, max_wprun_time

    Hi,
    I am running an ABAP program, and I get the following short dump:
    Time limit exceeded. The program has exceeded the maximum permitted runtime and has therefore been terminated. After a certain time, the program terminates to free the work process for other users who are waiting. This is to stop work processes being blocked for too long by
    - endless loops (DO, WHILE, ...),
    - database accesses without an appropriate index (full table scan),
    - database accesses producing an excessively large result set.
    The maximum runtime of a program is set by the profile parameter "rdisp/max_wprun_time". The current setting is 10000 seconds. After this, the system gives the program a second chance. During the first half (>= 10000 seconds), a call that blocks the work process (such as a long-running SQL statement) can occur. While the statement is being processed, the database layer will not allow it to be interrupted. However, to stop the program terminating immediately after the statement has been successfully processed, the system gives it another 10000 seconds. Hence the maximum runtime of a program is at least twice the value of the system profile parameter "rdisp/max_wprun_time".
    Last error logged in SAP kernel
    Component............ "NI (network interface)"
    Place................ "SAP-Dispatcher ok1a11cs_P06_00 on host ok1a11e0"
    Version.............. 34
    Error code........... "-6"
    Error text........... "connection to partner broken"
    Description.......... "NiPRead"
    System call.......... "recv"
    Module............... "niuxi.c"
    Line................. 1186
    Long-running programs should be started as background jobs. If this is not possible, you can increase the value of the system profile parameter "rdisp/max_wprun_time".
    This program cannot be started as a background job, so we have identified two options to solve the problem:
    - increase the value of the system profile parameter "rdisp/max_wprun_time", or
    - improve the performance of the following SELECT statement in the program:
    SELECT ps_psp_pnr ebeln ebelp zekkn sakto FROM ekkn
      INTO CORRESPONDING FIELDS OF TABLE i_ekkn
      FOR ALL ENTRIES IN p_lt_proj
      WHERE ps_psp_pnr = p_lt_proj-pspnr
        AND ps_psp_pnr > 0.
    In EKKN we have 200 000 entries.
    Are there any other options we could try?
    Regards,
    Jarmo

    Thanks for your help, this problem seems to be quite challenging...
    In EKKN we have 200 000 entries. 199 999 entries have the value 00000000 in column ps_psp_pnr, and only one has a value which identifies a WBS element.
    I believe the problem is that there isn't any WBS element in PRPS with the value 00000000. I guess that is the reason why EKKN is read sequentially.
    I also tried this one, but it doesn't help at all. Before the SELECT statement is executed, there are 594 entries in internal table p_lt_proj_sel:
      DATA p_lt_proj_sel LIKE p_lt_proj OCCURS 0 WITH HEADER LINE.
      p_lt_proj_sel[] = p_lt_proj[].
      DELETE p_lt_proj_sel WHERE pspnr = 0.
      SORT p_lt_proj_sel by pspnr.
      SELECT ps_psp_pnr ebeln ebelp zekkn sakto FROM ekkn
      INTO CORRESPONDING FIELDS OF TABLE i_ekkn
      FOR ALL ENTRIES IN p_lt_proj_sel
      WHERE ps_psp_pnr = p_lt_proj_sel-pspnr.
    I also checked that the index P in EKKN is active.
    Can I somehow force the optimizer to use the index?
    Regards,
    Jarmo
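
    A hedged sketch of two more things to try (assumptions: you are on Oracle, and the Oracle-side name of index P is "EKKN~P" — check the real name in SE11/DB02; the hint syntax follows SAP Note 129385). First, guard against the FOR ALL ENTRIES table being empty, because an empty FAE table makes Open SQL read the whole of EKKN. Second, a database hint can push the optimizer onto the index:
      IF p_lt_proj_sel[] IS NOT INITIAL.  " empty FAE table = full selection of EKKN!
        SELECT ps_psp_pnr ebeln ebelp zekkn sakto FROM ekkn
          INTO CORRESPONDING FIELDS OF TABLE i_ekkn
          FOR ALL ENTRIES IN p_lt_proj_sel
          WHERE ps_psp_pnr = p_lt_proj_sel-pspnr
          %_HINTS ORACLE 'INDEX("EKKN" "EKKN~P")'.
      ENDIF.
    Note that the hint text is database-specific and passed through unchecked, so treat it as a last resort after verifying the index name.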

  • Error "time limit exceeded"?

    hi Experts,
    What should I do when a load fails with "time limit exceeded"?

    Hi,
    Timeouts can be due to many reasons. You will need to find out for your specific build. Some of the common ones are:
    1. You could have set a large packet size. See in the monitor whether the number of records in one packet seems inordinately large, say 100,000. Reduce the number in steps and see which size works for you.
    2. The target may have a large number of fields; even with fewer records you can then receive a timeout, as the size of the packet may still become large. Same solution as point 1.
    3. You may have built an index on the target ODS, which may impact your write speed. Remove any indexes and run with the same packet size; if it works, then you know the index is the problem.
    4. There is a Basis setting for the timeout. Check that it is set as per the SAP recommendation for your system.
    5. Check the transactional RFCs (SM58) in the source system. It may have choked due to a large number of errors or hung queues.
    Cheers...

  • Mails to one particular user are getting bounced - Command time limit exceeded

    I have about 100 mail clients. Since yesterday morning, I've had one user whose incoming emails have been getting bounced. I've run a "mailbfr -m user" without success. The server is current on updates and has been restarted. All other users are functioning correctly. Notes:
    - IMAP account
    - This user has a 4 GB Sent folder. All other folders are under 2 GB.
    - This user has recently started syncing Notes from a Blackberry.
    THE SMTP LOG:
    Oct 7 08:26:52 myserver postfix/pipe[488]: D1BCE11CE230: to=<[email protected]>, relay=cyrus, delay=1000, delays=0.01/0/0/1000, dsn=5.3.0, status=bounced (Command time limit exceeded: "/usr/bin/cyrus/bin/deliver")
    A PORTION OF THE BOUNCE EMAIL SENT TO SENDER:
    This is the mail system at host myserver.com.
    I'm sorry to have to inform you that your message could not
    be delivered to one or more recipients. It's attached below.
    For further assistance, please send mail to postmaster.
    If you do so, please include this problem report. You can
    delete your own text from the attached returned message.
    The mail system
    <[email protected]>: Command time limit exceeded:
    "/usr/bin/cyrus/bin/deliver"
    Reporting-MTA: dns; myserver.com
    X-Postfix-Queue-ID: AD79711CE1DD
    X-Postfix-Sender: rfc822; [email protected]
    Arrival-Date: Thu, 7 Oct 2010 08:07:57 -0700 (PDT)
    Final-Recipient: rfc822; [email protected]
    Original-Recipient: rfc822;[email protected]
    Action: failed
    Status: 5.3.0
    Diagnostic-Code: x-unix; internal software error
    Any help?

    Solved. This person was in the process of moving Blackberry notes to the IMAP account. There was an interruption during the transfer phase and the server's Notes folder became corrupted somehow. When mailbfr hit that particular folder, it stopped the rebuild process.
    I removed the Notes folder and ran mailbfr again. The Notes folder was recreated and the rebuild completed.
    Mail delivery has been restored to this user.

  • TRFC Time limit exceeded

    Hi guys,
    Data loading from ODS to cube gets stuck every day because of "Time limit exceeded" in the tRFC queue.
    It is a full load and it contains about 370,000 records every day.
    I set the data packet size to 10,000 (48,000 records per packet); however, it was still throwing the error.
    It usually takes 3 hours to load from ODS to cube.
    1. Why is it taking such a long time to load from ODS to cube?
    2. Is there a memory issue?
    3. Any other suggestions to avoid data packets getting stuck?
    4. How can I reduce the loading time?
    Alternatively, can I reduce the data packet size below 10,000 to avoid the stuck packets or to reduce the data load time?
    Many thanks
    Ram

    Hi Ram,
    We can increase the number of background processes; it is Basis work. Please check with the Basis people, they will do it for you.
    The number of processes depends on the RAM you have: the minimum is 2 and the maximum is 18 work processes per instance. From the same RAM, update processes are calculated with a minimum of 1 and a maximum of 3, and background processes with 1-6. The calculation comes out to roughly RAM/256 for work processes, and it differs for the other process types. If you have a UNIX environment, you can use as many processes as you want. Note that 1 work process allocates approx. 15-20 MB of RAM when the system is idle.
    So look at your RAM, analyze the predecessors, and work this out with your Basis people.
    Pls assign points if useful.
    Cheers!
    Raghavendra Rao Kolli
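
    As a quick worked example of that rule of thumb (treat the RAM/256 formula as an assumption; real sizing depends on your release and platform): an instance with 4 GB of RAM gives 4096 MB / 256 = 16 work processes, which sits inside the 2-18 range quoted above.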

  • RFC_ERROR_SYSTEM_FAILURE: Time limit exceeded. Connection Pool - JCO api

    Hi Everyone
    My connection pool parameters (JCO API):
    client=300
    user=SISGERAL_RFC
    passwd=******
    ashost=14.29.3.120
    sysnr=00
    size=10
    I have these parameters on my connection pool, and sometimes these errors appear in my application:
    1.
    2006-01-07 13:20:37,414 ERROR com.tel.webapp.framework.SAPDataSource - ##### Time limit exceeded. LOCALIZED MESSAGE = Time limit exceeded. KEY = RFC_ERROR_SYSTEM_FAILURE GROUP = 104 TOSTRING = com.sap.mw.jco.JCO$Exception: (104) RFC_ERROR_SYSTEM_FAILURE: Time limit exceeded.
    2.
    2006-01-07 14:01:31,007 ERROR com.tel.webapp.framework.SapPoolConnectionManager - Timeout
    I’d like to know why this is happening.
    Is there something wrong with my connection pool?
    What can be happening?
    Thanks

    Raghu,
    Thanks for your response.
    Yes, the pool connections are in place according to the SAP note mentioned above.
    Regards,
    Faisal

  • Short dump "Time limit exceeded" when searching for Business Transactions

    Hello Experts,
    We migrated from SAP CRM 5.2 to SAP CRM 7.0. After migration, our business transaction search (quotation, sales order, service order, contract etc) ends with the short dump "Time limit exceeded" in class CL_CRM_REPORT_ACC_DYNAMIC, method DATABASE_ACCESS. The select query is triggered from line 5 of this method.
    Number of Records:
    CRMD_ORDERADM_H: 5,115,675
    CRMD_ORDER_INDEX: 74,615,914
    We have done these so far, but the performance is still either poor or times out.
    1. The DB team checked the Oracle parameters and confirmed they are fine. They also checked the health of the indices in table CRMD_ORDER_INDEX, and the indices are healthy
    2. Created additional indices on CRMD_ORDERADM_H and CRMD_ORDER_INDEX. After the creation of these indices, some of the searches (without any criteria) work, but it takes more than a minute to fetch 1 or 2 records
    3. An ST05 trace confirmed that the selection on CRMD_ORDER_INDEX takes the most time: about 103 seconds to fetch 2 records (max hits + 1)
    4. If we specify search parameters, for example a date or a status, then again we get a short dump with the message "Time limit exceeded".
    5. Observed that results are returned (albeit slowly) only if a matching index is available for the WHERE clause. In the absence of an index, we get the dump.
    6. Searched for notes and there are no notes that could help us.
    Any idea what is causing this issue and what we can do to resolve this?
    Regards,
    Bala

    Hi Michael,
    Thanks. Yes, we considered SAP Note 1527039. None of the three scenarios mentioned in the note helped us. But we ran CRM_INDEX_REBUILD to check whether table CRMD_ORDER_INDEX had a problem. That did not help us either.
    The business users told us that they mostly search using the date fields or Object ID. We did not have any problem with search by Object ID. So we created additional indices to support search using the date fields.
    Regards,
    Bala

  • Error while running query "time limit exceeded"

    While running a query I am getting the error "time limit exceeded". Please help.

    hi devi,
    use the following links
    queries taking long time to run
    Query taking too long
    with hopes
    Raja Singh

  • TIME LIMIT EXCEEDED ERROR WHILE EXECUTING DTP

    Hi gurus,
    I have got an error while executing a DTP.
    The errors are as follows:
    1. Time limit exceeded. No return of the split processes
    2. Background process BCTL_DK9MC0C2QM5GWRM68I1I99HZL terminated due to missing confirmation
    3. Resource error. No batch process available. Process terminated
    Note: I am not executing the DTP as a background job.
    As this is of higher priority, a quick answer is appreciated.
    Regards
    Amar.

    Hi,
    how is it possible to execute a DTP in a dialog process? In my mind it is only possible for debugging...
    In "Display Data Transfer Process" -> "Goto" -> "Settings for Batch Manager" you can edit settings like the number of processes or the job class.
    Additionally, take a look at table RSBATCHPARALLEL and
    http://help.sap.com/saphelp_nw04s/helpdata/en/42/f29aa933321a61e10000000a422035/frameset.htm
    Regards
    Andreas

  • Time Limit exceeded while running in RSA3

    Hi BW Experts,
    I am trying to pull 1 lakh (100,000) records from CRM to the BI system.
    Before scheduling, I am trying to execute the extraction in RSA3, and I am getting the error message "Time Limit Exceeded".
    Please suggest why this is happening.
    Thanks in advance.
    Thanks,
    Ram.

    Hi,
    Because of the huge data volume, the extraction cannot return all the records within the stipulated time. It is better to use a selection option (for example, by document type or something else) and extract in subsets; that way you still cover all the documents. On the BW side we run this job in the background anyway, so there it is no problem. If you want to see all the records at once, discuss with your Basis people extending the time limit.
    Thanks & Regards
    sathish

  • Time limit exceeded error

    Hello All,
    I am trying to execute a custom program with a variant, but I receive the "Time limit exceeded" error [TIME_OUT].
    I am now trying to analyse why this error occurred, as I am a beginner. Any help will be greatly appreciated.
    Regards,
    Arpita.
    Moderator message: Welcome to SCN!
    Moderator message: Please Read before Posting in the Performance and Tuning Forum
    Edited by: Thomas Zloch on Oct 20, 2011 2:01 PM

    Hi Ramya,
    Your program is running in the background, so the time limit of the program was exceeded. Go to SM37 and check the program's runtime; if it is exceeded, correct the time limit.
    Regards
    Srinu

  • Time Limit exceeded error in R & R queue

    Hi,
    We are getting a "Time limit exceeded" error in the R&R queue when we try to extract the data for a site.
    The error happens with the message SALESDOCGEN_O_W. It is observed that whenever the time limit error is encountered, the usual solution is to run the job in the background. But in this case, is there any possibility to run the particular subscription for sales documents in the background?
    Any pointers on this would be of great help.
    Thanks in advance,
    Regards,
    Rasmi.

    Hi Rasmi
    I suppose that the usual answer would be to increase the timeout for the R&R queue.
    We have increased the timeout on ours to 60 mins and that takes care of just about everything.
    The other thing to check would be the volume of data that is going to each site for SALESDOCGEN_O_W. These are pretty big BDocs and the sales force will not thank you for huge ConnTrans times.
    If you have a subscription for sales documents by business partner, then it is worth seeing if the business partner subscription could be made more intelligent to fit your needs.
    Regards
    James

  • Time Limit exceeded error in ALV report

    I am getting the error "Time Limit Exceeded" when I execute an ALV report. Can I run the program in the background, and how do I do that? I have already optimized the query in the program, but even then I am facing the same issue.

    You can process the ALV in the background by pressing F9... I guess the output would then be available as a spool request in SP01.
    You may need to re-check your query... and also review the ALV field catalog and any events you are using...
    Greetings,
    Blag.
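
    If you need to schedule it programmatically rather than via F9, here is a minimal hedged sketch using the standard JOB_OPEN / SUBMIT ... VIA JOB / JOB_CLOSE pattern (the job name, report name Z_MY_ALV_REPORT, and variant MYVARIANT are hypothetical placeholders — substitute your own):
      DATA: lv_jobname  TYPE tbtcjob-jobname VALUE 'Z_ALV_BACKGROUND',
            lv_jobcount TYPE tbtcjob-jobcount.
      " Open a new background job and receive its job count.
      CALL FUNCTION 'JOB_OPEN'
        EXPORTING
          jobname  = lv_jobname
        IMPORTING
          jobcount = lv_jobcount.
      " Attach the report (with its variant) as a job step.
      SUBMIT z_my_alv_report USING SELECTION-SET 'MYVARIANT'
        VIA JOB lv_jobname NUMBER lv_jobcount
        AND RETURN.
      " Close the job and release it to start immediately.
      CALL FUNCTION 'JOB_CLOSE'
        EXPORTING
          jobname   = lv_jobname
          jobcount  = lv_jobcount
          strtimmed = 'X'.
    The list output then lands in a spool request that you can view in SP01, as mentioned above.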

  • Time Limit Exceeded while executing Proxy Program

    Hi all,
    we are frequently facing a "Time Limit Exceeded" problem in the R/3 system while executing the proxy program for large payloads (approx. 5-7 MB). Sometimes we are able to successfully restart the message, and sometimes we have to delete these messages. How can we resolve this issue?
    Thanks,
    Mayank

    hi Joerg,
    we are getting this error in the inbound queue in the R/3 system. Also, this is an async call, so there is no chance of any communication interruption between the SAP systems. From the PI system the message is successfully passed to the R/3 system, and "Time Limit Exceeded" appears in the R/3 inbound queue (SMQ2). Is it possible that the timeout happens within the R/3 system?
    Thanks,
    Mayank

  • Time Limit exceeded Error while updating huge number of records in MARC

    Hi experts,
    I have a interface requirement in which third party system will send a big file say.. 3 to 4MB file into SAP. in proxy we
    used BAPI BAPI_MATERIAL_SAVEDATA to save the material/plant data. Now, because of huge amount of data the SAP Queues are
    getting blocked and causing the time limit exceeded issues. As the BAPI can update single material at time, it will be called as many materials
    as we want to update.
    Below is the relevant part of the code in my proxy:
    * Call the BAPI to update the safety stock value.
        CALL FUNCTION 'BAPI_MATERIAL_SAVEDATA'
          EXPORTING
            headdata   = gs_headdata
    *       clientdata  =
    *       clientdatax =
            plantdata  = gs_plantdata
            plantdatax = gs_plantdatax
          IMPORTING
            return     = ls_return.
        IF ls_return-type <> 'S'.
          CALL FUNCTION 'BAPI_TRANSACTION_ROLLBACK'.
          MOVE ls_return-message TO lv_message.
    *     Populate the error table and process the next record.
          CALL METHOD me->populate_error
            EXPORTING
              message = lv_message.
          CONTINUE.
        ENDIF.
    Can anyone please let me know the best possible approach for this issue?
    Thanks in advance,
    Jitender
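
    One hedged idea (a sketch, assuming the loop structure is as posted and that per-material commits are acceptable for your process): commit each successful BAPI call, or every n materials, so that a single long-running LUW does not pile up in the queue and hit the time limit. The batching threshold would be yours to tune.
        IF ls_return-type = 'S'.
          " Commit per material so each LUW stays short; WAIT = 'X'
          " waits for the update task before the next BAPI call.
          CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
            EXPORTING
              wait = 'X'.
        ENDIF.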

    Hi Raju,
    Use the following routine to derive fiscal year/period from a calendar day:
    * Data definition:
    DATA: l_arg1 TYPE rsfiscper,
          l_arg2 TYPE rsfo_date,
          l_arg3 TYPE t009b-periv.
    * Calculation:
    l_arg2 = TRAN_STRUCTURE-post_date.  " this is the date that you have to supply
    l_arg3 = 'V3'.
    CALL METHOD cl_rsar_function=>date_fiscper(
      EXPORTING i_date    = l_arg2
                i_per     = l_arg3
      IMPORTING e_fiscper = l_arg1 ).
    RESULT = l_arg1.
    Hope it will solve your problem!
    Please assign points.
    Best Regards,
    SG
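
    As a quick worked illustration (assuming 'V3' in your system is the April-to-March fiscal year variant, which you should verify in transaction OB29): a posting date of 20230415 would come back as 2023001, i.e. period 001 of fiscal year 2023, in the 7-character RSFISCPER format YYYYPPP.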
