Huge entries in Worklist

Hi,
I have high-volume processes. In 3 weeks the processes generated over 250,000 entries in the worklist. I want to schedule a job to delete the entries of this table.
Can anyone tell me how I can schedule a job to delete the worklist items?
Thanks.

Hi Chris,
Refer to the links below; they may help you:
http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/402fae48-0601-0010-3088-85c46a236f50?quicklink=index&overridelayout=true
http://help.sap.com/saphelp_nw70/helpdata/EN/0e/80553b4d53273de10000000a114084/frameset.htm
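If these worklist entries are Business Workflow work items, one common approach is to schedule the standard deletion report RSWWWIDE (the report behind transaction SWWL) as a periodic background job, e.g. in SM36, or programmatically as in the sketch below. The job name and the omitted selection criteria are illustrative assumptions; test the selection interactively in SWWL first.

```abap
DATA: lv_jobname  TYPE tbtcjob-jobname VALUE 'Z_DELETE_WORKITEMS',
      lv_jobcount TYPE tbtcjob-jobcount.

" Open a background job ...
CALL FUNCTION 'JOB_OPEN'
  EXPORTING
    jobname  = lv_jobname
  IMPORTING
    jobcount = lv_jobcount.

" ... add the standard work item deletion report as a job step
" (add WITH ... selection criteria for work item type/status/date here)
SUBMIT rswwwide
  VIA JOB lv_jobname NUMBER lv_jobcount
  AND RETURN.

" ... and release the job to start at the scheduled date/time
CALL FUNCTION 'JOB_CLOSE'
  EXPORTING
    jobname   = lv_jobname
    jobcount  = lv_jobcount
    sdlstrtdt = sy-datum
    sdlstrttm = sy-uzeit.
```

In practice you would save a selection variant for RSWWWIDE and schedule the report with that variant, so that only completed or logically deleted work items older than your retention period are removed.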
Thanks,

Similar Messages

  • CAT2 multiple person data entry and worklist

    Hello everybody,
    For our users we have created a multiple-person data entry profile, and everything is working fine: hours for multiple people can now be entered in CAT2. However, our users now have a new request: they would like to see the worklist as well (just like with single-person hour entry).
    When trying to customize the data entry profile and enable the setting for showing the worklist, SAP shows the message that multiple-person entry and the worklist are mutually exclusive, so this is not possible.
    However, the request from our users doesn't seem that strange. Does anyone know a way to combine the worklist with multiple-person hour entry?
    Thanks in advance for your advice.
    Peter Hageman

    hi
    As per the message from SAP, this is not possible in the standard; you may consider an enhancement for this.
    Regards
    sameer

  • Conversion Completion Phase : REPOSRC entry in worklist

    Hi all,
    I am at the Conversion Completion Phase in an MDMP Unicode Conversion (UC).
    In SUMG I have an entry REPOSRC, among others, which has a few thousand entries.
    I scheduled the worker job, and it completed in a split second without any logs in the "main log".
    I found the thread
    SUMG error
    indicating it can be ignored.
    However, I am not able to view the table REPOSRC in SE16; it gives me a short dump:
    Syntax error in program "/1BCDWB/DBREPOSRC".
    The following syntax error occurred in the program /1BCDWB/DBREPOSRC:
    "The database view "REPOSRC" is write-protected, so it cannot be changed"
    Error in ABAP application program.
    I am concerned whether this is an actual problem or normal behavior for this version.
    My system is SAP R/3 Enterprise 4.7 x110 SR1 with SAP Basis 68, kernel 6.40 / patch 369. The system is Unicode now.
    I have checked an ECC system: there I am able to view the REPOSRC table entries, and a syntax check on the program "/1BCDWB/DBREPOSRC" passes.
    I have also checked an SAP R/3 4.6 system; the table and program do not exist there. So I wonder whether this is normal behavior in 4.7, as it is an in-between version, or a genuine error from the UC.
    Please advise.
    Nikki

    Hi Nikki,
    please check these threads:
    SUMG error
    Error in SUMG
    REPOSRC cannot be repaired by SUMG. It appears in SUMG because the reports listed in the table may contain language-dependent comments, or their subcomponents (e.g. function modules or includes) have no language key assigned. However, this has no consequences for the functionality of the report sources. You can therefore ignore REPOSRC in SUMG.
    Best regards,
    Nils Buerckel

  • How to change  the Timestamp In  a program

    Hi,
    We faced a problem today: some critical jobs from a source system are going to short dump. Unfortunately we couldn't find any lock waits, deadlocks, or delays in SM37. After analyzing, we found that the SAP trace indicates the failure occurred during execution of program /GLB/RGTCSR_RC_INV:
    B  *** ERROR => DISTRIBUTED_TA_FAILED:
    The timestamp for the execution of the program has been set to "20.070.801.000.000", which means it will try to retrieve all values after the date 01.08.2007 and the time 00:00:00. Checking the code, we can see logic that retrieves values from the table /GLB/RGTT_CS_RC based on the goods movement status (WBSTA), the timestamp, and the document number (VBELN_SO), if any. In this case VBELN_SO is blank and WBSTA is equal to space (not relevant). This caused the logic to retrieve 706,706 documents and items from the table. This amount of data is huge, and we would like to ask the BW team to run the program with a smaller timestamp window to avoid inconsistencies and the dumps that can occur when handling this many entries.
    Checking the code further, two similar SELECTs are written on the same table /GLB/RGTT_CS_RC: the first to select all document numbers relevant to the timestamp and WBSTA value, and the second to "get records that may have already been used", depending on the document and item numbers. But since the first statement is a SELECT *, it already retrieves VBELN_SO as well as POSNR_SO. Hence the second SELECT on the same table is redundant; it only hurts performance and, in the worst case, terminates the program given this volume of entries. The logic should therefore be combined into a single SELECT, for which the first statement is sufficient. Retrieving 706,706 rows from the same table twice can easily result in a short dump. There are also a couple of loops over the same table that reduce performance further.
    A possible solution looks like reducing the timestamp and re-running.
    Can someone tell me how to change the timestamp in a program?

    First check the connection with the source system. If it is OK, replicate the DataSource and activate it, then try running the job again.
    Timestamp errors usually happen when the timestamps on the BI and R/3 sides differ.
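    For reference, "20.070.801.000.000" is just a formatted 14-digit timestamp value (YYYYMMDDhhmmss). If the aim is only to construct a smaller timestamp to restrict the selection window, a minimal sketch (the variable names and chosen date/time are illustrative):

    ```abap
    DATA: lv_tstamp TYPE timestamp,
          lv_date   TYPE d VALUE '20070801',   " 01.08.2007, as in the trace
          lv_time   TYPE t VALUE '000000'.

    " Build the 14-digit timestamp (YYYYMMDDhhmmss) from a chosen date/time
    CONVERT DATE lv_date TIME lv_time INTO TIME STAMP lv_tstamp TIME ZONE 'UTC'.
    ```

    Where exactly the program stores this value (a control table, a variant, etc.) depends on the extractor, so check the source of /GLB/RGTCSR_RC_INV before changing anything.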

  • Best method to update database table for 3 to 4 million rows

    Hi All,
    I have 3 to 4 million rows in my Excel file, and we have to load them into a Z-table.
    The intent is to load and keep 18 months of history in this table.
    So what is the best way to get this huge volume of data from the Excel file into the Z-table?
    If it is done from a program, is the best way to use the FM 'GUI_UPLOAD', load those entries into an internal table, and then directly do:
    INSERT z_table FROM TABLE it_upload.
    I think that for this amount of data it will short dump.
    Please suggest the best possible way, or any pseudocode, to insert these entries into the Z-table.
    Thanks in advance..

    Hi,
    You get the dump because you upload that many records into the internal table from the Excel file at once.
    In this case, do the following:
    DATA: w_int  TYPE i,
          w_int1 TYPE i VALUE 1.
    DATA itab TYPE STANDARD TABLE OF alsmex_tabline WITH HEADER LINE.
    DO.
      REFRESH itab.
      w_int  = w_int1.
      w_int1 = w_int + 25000.
      CALL FUNCTION 'ALSM_EXCEL_TO_INTERNAL_TABLE'
        EXPORTING
          filename    = <filename>
          i_begin_col = 1
          i_begin_row = w_int
          i_end_col   = 10
          i_end_row   = w_int1
        TABLES
          intern      = itab
        EXCEPTIONS
          inconsistent_parameters = 1
          upload_ole              = 2
          OTHERS                  = 3.
      IF sy-subrc <> 0.
    *   MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
    *           WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
      ENDIF.
      IF itab IS NOT INITIAL.
    *   Segregate the data from itab into the main internal table and then
    *   insert the records from the main internal table into the database table.
      ELSE.
        EXIT.
      ENDIF.
    ENDDO.
    Regards,
    Siddarth
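    In addition to reading the Excel file in portions, the database insert itself can be done in packages with intermediate commits rather than one huge INSERT. A sketch (z_table and it_data are illustrative names for the target table and the main internal table):

    ```abap
    CONSTANTS lc_pack TYPE i VALUE 10000.
    DATA: lt_pack LIKE it_data,
          lv_from TYPE i VALUE 1.

    WHILE lv_from <= lines( it_data ).
      CLEAR lt_pack.
      " copy the next package of rows (a TO index past the end is tolerated)
      APPEND LINES OF it_data FROM lv_from TO lv_from + lc_pack - 1 TO lt_pack.
      INSERT z_table FROM TABLE lt_pack.
      COMMIT WORK.                          " commit once per package
      lv_from = lv_from + lc_pack.
    ENDWHILE.
    ```

    Keeping the packages to a few tens of thousands of rows limits memory use and database log growth for a multi-million-row load.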

  • How to assign characteristic to Dimension ( Advices)

    Hello Experts,
    I need a small clarification from your side.
    When assigning a characteristic to a particular dimension, what precautions do we have to take?
    How do we decide which dimension a characteristic should be assigned to?
    I have been confused about this many times and am not clear on the topic; please give me your advice.
    If anyone has related documents, please share them.
    Edited by: vadalaprakash on Mar 26, 2010 2:23 PM

    Hello!
    Related characteristics should be kept together in one dimension so that the number of combinations of their values in the dimension table stays as small as possible.
    This improves performance.
    You should never keep unrelated characteristics in one dimension, as that creates huge numbers of entries in the dimension table.
    E.g. you can keep 0MATERIAL, 0MAT_PLANT, 0MATL_TYPE, and ZPRD_HIER in one dimension.
    regards,
    khyati

  • Open SQL statment for Update flag based on Date

    Dear all,
    I am trying to write an Open SQL statement to update a flag in a table. Table ZTABLE1 has the fields SRNO, FLAG, DATEFROM, and DATETO. I would like to update the FLAG entry in the table only if today falls between DATEFROM and DATETO. I can satisfy the above requirement using the following ABAP code.
    DATA: lv_timestamp TYPE timestamp,
          lv_today LIKE adr2-valid_from,
          tz TYPE timezone.
    CONVERT DATE sy-datlo TIME sy-timlo INTO TIME STAMP lv_timestamp
      TIME ZONE tz.
    lv_today = lv_timestamp.
    update ztable1 set flag = 'X' where lv_today BETWEEN datefrom and dateto.
    But the issue is that DATEFROM and DATETO contain spaces as well as dates. DATEFROM can be space if it is the start of time (01010001), and DATETO can be space if it is the end of time (31129999). That means that if DATEFROM is space it should be treated as 01010001, and similarly if DATETO is space it should be treated as 31129999. How can I write these if/else cases within the WHERE clause?
    I know the DECODE statement from Native SQL programming, but that won't fit Open SQL in ABAP. Also, because of the huge number of entries in the database, I cannot read the entries, manipulate them, and then update.
    How can i enhance the same above Update statement to cater this need.
    Please advise.
    Thanks a lot in advance.
    Greetings, Satish

    Hi,
    First fetch the records into an internal table, then:
    RANGES r_range FOR sy-datum.
    LOOP AT itab INTO wa.
      IF wa-validfrom IS INITIAL.
        wa-validfrom = '00010101'.      " treat space as start of time
      ENDIF.
      IF wa-validto IS INITIAL.
        wa-validto = '99991231'.        " treat space as end of time
      ENDIF.
      r_range-sign   = 'I'.
      r_range-option = 'BT'.
      r_range-low    = wa-validfrom.
      r_range-high   = wa-validto.
      APPEND r_range.
    * Check whether the current date falls in the interval; if it does,
    * update the flag in the work area and modify the internal table.
      IF sy-datum IN r_range.
        wa-flag = 'X'.
        MODIFY itab FROM wa.
      ENDIF.
      REFRESH r_range.
      CLEAR wa-flag.
    ENDLOOP.
    * Finally update your Z-table
    MODIFY ztable1 FROM TABLE itab.
    Regards,
    Peranandam
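    On releases that support the newer Open SQL syntax, the read-modify-update round trip can be avoided entirely with a single UPDATE that handles the initial dates explicitly (a sketch using the table and field names from the question; an initial DATS field contains '00000000'):

    ```abap
    DATA(lv_today) = sy-datum.

    UPDATE ztable1
       SET flag = 'X'
     WHERE ( datefrom <= @lv_today OR datefrom = '00000000' )
       AND ( dateto   >= @lv_today OR dateto   = '00000000' ).
    ```

    On older releases the same WHERE condition works without the @ escapes. This pushes the whole operation to the database, which matters with a huge number of entries.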

  • Job  LIS-BW-VB_APPLICATION_03_360  Failed

    The job LIS-BW-VB_APPLICATION_03_360 is being cancelled.
    The job was terminated with a short dump.
    Please find the error analysis below:
    Short text of error message:
    Structures have changed (sy-subrc=2)
    Long text of error message:
    Technical information about the message:
    Message class....... "MCEX"
    Number.............. 194
    Variable 1.......... 2
    Variable 2.......... " "
    Variable 3.......... " "
    Variable 4.......... " "
    This concerns the inventory DataSources. The delta records are not getting loaded into BI because of the above short dump.
    Termination occurred in the ABAP program "SAPLMCEX" - in "MCEX_UPDATE_03".
    The main program was "RMCEXUP1 ".
    The error is in the code below:
    IF i_tmsp_hash = lf_tmsp_hash.
       gf_tmsp_hash_ok = true.
    ENDIF.
    CASE sy-subrc.
      WHEN 0. " Compare OK - do nothing
      WHEN 1. " Compare OK - do nothing
      WHEN 2. " Compare not OK - abort
        MESSAGE x194(mcex) WITH sy-subrc. " <-- error in this line
      WHEN OTHERS.
        MESSAGE x193(mcex) WITH sy-subrc.
    ENDCASE.
    This is a production issue; please respond as soon as possible.
    Regards,
    Kiran

    Hi Kiran,
    Yes, most probably you installed SP6 without clearing the delta and extraction queues, and this has caused the issue.
    I understand that the extraction queue must have piled up with huge numbers of entries by now. But note that the extraction queue is common to all DataSources in the same application component, and there is no way to selectively identify and process entries. Hence, to get the whole process going, you would have to sacrifice your delta and do a re-init. That means you lose records for all the DataSources in that application component.
    Now, considering your question about doing a re-init for only 2 DataSources and leaving the 3rd: since I am not aware of the kind of changes the new package has brought, I cannot confirm that will work. But my take is this: you need to refill your setup tables for the other two DataSources anyway. Try doing an init for those 2 DataSources and see whether your V3 job runs then. If it fails, you'll have to re-init the 3rd as well. In either case you will have a full (repair/init) load running for each of the DataSources (since you've lost the deltas), so the time and effort are essentially the same.
    I suggest you test this in your development/Quality system to avoid complicating your production issue.
    This link will help you:
    http://sapbibw2010.blogspot.com/2010/10/how-to-retain-deltas-when-you-change-lo.html
    Regards,
    Swati

  • In DOE while running "EXTRACT JOB"  system was shut down.

    Dear Experts,
    While running the "EXTRACT JOB", the system was shut down. Since then the extract queues are in Ready state; they do not go to Running state and stay in Ready state for a long time. If I run them manually, they run for some time and then go back to Ready state. How can we handle this? Because the queues contain huge numbers of entries, we cannot process them manually. If anyone knows, please provide solutions.
    Regards,
    Ashok Reddy.

    Hi Ashok,
    Please check whether the extract queues are in registered state. If not, please register the extract queues using QIN Scheduler (Goto --> QIN Scheduler) & then check.
    Regards,
    Ananth.

  • Performance issue fetching huge number of record with "FOR ALL ENTRIES"

    Hello,
    We need to extract a huge amount of data (about 1,000,000 records) from the VBEP table, whose overall size is about 120 million records.
    We currently use this statement:
    CHECK NOT ( it_massive_vbep[] IS INITIAL ).
    SELECT (list of fields) FROM vbep JOIN vbap
                 ON vbep~vbeln = vbap~vbeln AND
                    vbep~posnr = vbap~posnr
                 INTO CORRESPONDING FIELDS OF TABLE w_sched
                 FOR ALL ENTRIES IN it_massive_vbep
                 WHERE vbep~vbeln = it_massive_vbep-tabkey-vbeln
                   AND vbep~posnr = it_massive_vbep-tabkey-posnr
                   AND vbep~etenr = it_massive_vbep-tabkey-etenr.
    Note that the internal table it_massive_vbep always contains records with a fully specified key.
    Do you think this query could be further optimized?
    many thanks,
    -Enrico

    There are 2 options to improve performance:
    + you should work in blocks of 10,000 to 50,000
    + you should check archiving options; does it really make sense that
    > the VBEP table, which overall dimension is about 120 million records.
    Split it_massive_vbep into smaller portions (it_vbep_2):
    CHECK NOT ( it_vbep_2[] IS INITIAL ).
    GET RUN TIME FIELD start.
    SELECT (list of fields)
                  INTO CORRESPONDING FIELDS OF TABLE w_sched
                  FROM vbep JOIN vbap
                  ON vbep~vbeln = vbap~vbeln AND
                     vbep~posnr = vbap~posnr
                  FOR ALL ENTRIES IN it_vbep_2
                  WHERE vbep~vbeln = it_vbep_2-vbeln
                    AND vbep~posnr = it_vbep_2-posnr
                    AND vbep~etenr = it_vbep_2-etenr.
    GET RUN TIME FIELD stop.
    t = stop - start.
    WRITE: / t.
    Be aware that even 10,000 will take some time.
    Another question: how did you get the 1,000,000 records into it_massive_vbep? They were not typed in, but selected somehow. Change that FOR ALL ENTRIES into a JOIN and it will be much faster.
    Siegfried
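    The blocking suggestion can be sketched as follows, reusing the names from the thread (the package size is illustrative; APPENDING collects the results across packages):

    ```abap
    CONSTANTS lc_block TYPE i VALUE 10000.
    DATA: lt_block LIKE it_massive_vbep,
          lv_from  TYPE i VALUE 1.

    WHILE lv_from <= lines( it_massive_vbep ).
      CLEAR lt_block.
      " next package of driver records (a TO index past the end is tolerated)
      APPEND LINES OF it_massive_vbep FROM lv_from TO lv_from + lc_block - 1
        TO lt_block.

      SELECT (list of fields)
             FROM vbep JOIN vbap
               ON vbep~vbeln = vbap~vbeln AND
                  vbep~posnr = vbap~posnr
             APPENDING CORRESPONDING FIELDS OF TABLE w_sched
             FOR ALL ENTRIES IN lt_block
             WHERE vbep~vbeln = lt_block-tabkey-vbeln
               AND vbep~posnr = lt_block-tabkey-posnr
               AND vbep~etenr = lt_block-tabkey-etenr.

      lv_from = lv_from + lc_block.
    ENDWHILE.
    ```

    This keeps each FOR ALL ENTRIES statement at a manageable size while still processing all 1,000,000 driver records.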

  • Clearing Document - Huge amount Exch/Reval entries

    I have an issue with clearing a customer item. The customer item is cleared in foreign currency (USD), and the following document is posted:
                                                                                    USD
    001 40  6677 255021 551455  EXCHAGE VARIANT              34.40
    002 17  6677 255021         Safewell CO.             22,490.00-
    003 17  8000 255021         Safewell CO.              6,458.40-
    004 07  1300 255021         Safewell CO.             28,914.00
    005 40  1300 610705         FC REVAL GAINS-CUST           0.00
    006 50  6677 610705         FC REVAL GAINS-CUST           0.00
    007 50  8000 610705         FC REVAL GAINS-CUST           0.00
    My doubt is: why does the first item, the "exchange variant" line, come with 34.40 when the document is cleared in the same currency, USD?
    I also see additional FC Reval Gains-Cust items.
    When the above document is switched to the company code currency, i.e. INR, I get the following document entries:
    001 40  6677 551455  EXCHAGE VARIANT          46,577.40
    002 17  6677 255021  Safewell CO.          1,085,030.05-
    003 17  8000 255021  Safewell CO.            314,071.99-
    004 07  1300 255021  Safewell CO.          1,352,524.64
    005 40  1300 610705  FC REVAL GAINS-CUST  37,796,863.25
    006 50  6677 610705  FC REVAL GAINS-CUST  29,366,299.19-
    007 50  8000 610705  FC REVAL GAINS-CUST   8,430,564.06-
    Note that in USD the FC Reval Gains-Cust items are all zero, whereas when the document is displayed in INR, those huge amounts appear.
    Why is the system behaving like this?

    Hi Nikitha,
    This is about the huge amounts you see in INR:
    Is this a residual clearing posting?
    Are you clearing the customer and creating residual items each time a payment is received?
    Also check the document items in the GL view; you should find huge amounts in the customer items as well.
    If this is the case, you need to revise the way you post the residual clearing document (always post with reference to an existing debit item); otherwise you will soon see forex values overflowing, resulting in a dump for any further postings to these customers.
    Regards
    Sachin
    Edited by: Sachin Bhutani on Feb 3, 2010 9:48 PM

  • Changed version of Data Entry Profile imported results in empty worklist

    Hi,
    We've just moved, among other things, changed versions of all of the data entry profiles to our production server. Everything went well; however, all users get an empty worklist to start with.
    Do you know why this is? Is this standard behaviour? And is there any way to overcome this so that users do not start with a 'fresh' worklist?
    Your help and input is very much appreciated.
    Kind regards,
    Tim

    Hi,
    The worklist is the one available in HR transaction CAT2. Source would be HR?
    Kind regards,
    Tim

  • 8.0.2 - problem 1 - unable to scroll screen when the keyboard is present. Huge problem for emails and other cloud based providers that allow for data entry. We need a fix ASAP!

    8.0.2 - problem 1: I am unable to scroll the screen down or up when the keyboard is present. When typing text, I cannot see the text I am writing because of this problem; while the keyboard is shown, I cannot scroll down to the text - the screen automatically scrolls back up to the top. This is a huge problem for emails and other cloud-based providers that allow data entry. We need a fix ASAP!

    Have you tried resetting your iPad? You will not lose any data. Reset the iPad by holding down the sleep and home buttons at the same time for about 10-15 seconds until the Apple logo appears - ignore the red slider if it appears on the screen - then let go of the buttons and let the iPad start up.

  • Flag "Copy from Worklist Without Hours" in data entry profile

    Hi,
    Please, could anyone explain the flag "Copy from Worklist Without Hours" in the data entry profile?
    I created an enhancement to fill the fields of the worklist. When I run CAT2, I expect that, with this flag set, the worklist records will be copied into the data entry section.
    Is that right?
    Thanks
    Regards

    not answered

  • Worklist entries in MSS

    Hi Experts
    In one of our scenarios, work items reside in the MSS worklist
    even if they are not in the Business Workplace.
    Is anyone aware of the table in which the worklist entries are stored?
    Thanks in advance

    Hi Arghadip, thank you for the cooperation,
    but the table you specified doesn't have any entries.
    Actually, the history of the problem is:
    The initiator started 2 workflow instances instead of one.
    So even though the manager approved the request, it still existed in the worklist.
    We later manually completed the duplicate work item, but it still exists in the worklist.
    Please advise.
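    For the table question: Business Workflow work item headers are stored in table SWWWIHEAD, with the status in field WI_STAT. A quick check for a specific work item (the work item ID below is a placeholder):

    ```abap
    DATA lv_wi_id TYPE sww_wiid VALUE '000000123456'.

    " Read the current status of the work item from the header table
    SELECT SINGLE wi_stat FROM swwwihead
      INTO @DATA(lv_status)
      WHERE wi_id = @lv_wi_id.
    ```

    If the work item is completed in SWWWIHEAD but still appears in the MSS worklist, the discrepancy is often on the UWL side (e.g. a stale cache) rather than in the workflow tables themselves.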
