Issue in Update routine due to Data Package

We have a peculiar situation. The scenario is as follows:
We have to load data from ODS1 to ODS2.
The data package size is 9980 while transferring data from ODS1 to ODS2.
In the update rule we have some calculations, and we rank the records based on these calculations.
The ODS key for both ODS1 and ODS2 is the same, i.e. Delivery Number, Delivery Item & Source System.
For example, a Delivery Number has 10 Delivery Items, and these items end up in different data packages, say Data Package 1 and Data Package 4.
Because each package is processed separately, the routine ranks the items of the first package 1 to 5 and the items of the second package 1 to 5 again, instead of the single sequence 1 to 10 that we require.
So the ABAP routine itself works fine; the split of the records across data packages is the problem.
Can anybody suggest an alternative solution to this issue?
Thanks in advance for your assistance.

CODE FOR INTER DATA PACKAGE TREATMENT
PROGRAM UPDATE_ROUTINE.
*$*$ begin of global - insert your declaration only below this line  -
* TABLES: ...
* DATA:   ...
DATA: v_packet_nbr TYPE i VALUE 1.
DATA:
  g_requnr  TYPE rsrequnr.
DATA:
  l_is        TYPE string VALUE 'G_S_IS-RECNO',
  l_requnr    TYPE string VALUE 'G_S_MINFO-REQUNR'.
FIELD-SYMBOLS: <g_f1> TYPE ANY,
               <g_requnr> TYPE ANY.
TYPES:
  BEGIN OF global_data_package.
        INCLUDE STRUCTURE /bic/cs8ydbim001.
TYPES: recno   LIKE sy-tabix,
  END OF global_data_package.
DATA lt_data_package_collect TYPE STANDARD TABLE OF global_data_package.
DATA ls_datapack TYPE global_data_package.
* Data package enhancement declaration
TYPES: BEGIN OF datapak.
        INCLUDE STRUCTURE /bic/cs8ydbim001.
TYPES: END OF datapak.
DATA: datapak1 TYPE STANDARD TABLE OF datapak,
      wa_datapak1 LIKE LINE OF datapak1.
* Declarations for business rules implementation
TYPES : BEGIN OF ty_ydbsdppx.
        INCLUDE STRUCTURE /bic/aydbsdppx00.
TYPES: END OF ty_ydbsdppx.
DATA : it_ydbsdppx TYPE STANDARD TABLE OF ty_ydbsdppx WITH HEADER LINE,
       wa_ydbsdppx TYPE ty_ydbsdppx,
       temp TYPE /bic/aydbim00100-price,
       lv_tabix TYPE sy-tabix.
*$*$ end of global - insert your declaration only before this line   -
* The following definition is new in BW 3.x.
TYPES:
  BEGIN OF DATA_PACKAGE_STRUCTURE.
     INCLUDE STRUCTURE /BIC/CS8YDBIM001.
TYPES:
     RECNO   LIKE sy-tabix,
  END OF DATA_PACKAGE_STRUCTURE.
DATA:
  DATA_PACKAGE TYPE STANDARD TABLE OF DATA_PACKAGE_STRUCTURE
       WITH HEADER LINE
       WITH NON-UNIQUE DEFAULT KEY INITIAL SIZE 0.
FORM startup
  TABLES   MONITOR STRUCTURE RSMONITOR "user defined monitoring
           MONITOR_RECNO STRUCTURE RSMONITORS " monitoring with record n
           DATA_PACKAGE STRUCTURE DATA_PACKAGE
  USING    RECORD_ALL LIKE SY-TABIX
           SOURCE_SYSTEM LIKE RSUPDSIMULH-LOGSYS
  CHANGING ABORT LIKE SY-SUBRC. "set ABORT <> 0 to cancel update
*$*$ begin of routine - insert your code only below this line        -
* fill the internal tables "MONITOR" and/or "MONITOR_RECNO",
* to make monitor entries
TABLES: rsmonfact.
  TYPES:
    BEGIN OF ls_rsmonfact,
      dp_nr TYPE rsmonfact-dp_nr,
    END OF ls_rsmonfact.
  DATA: k TYPE i,
        v_lines_1 TYPE i,
        v_lines_2 TYPE i,
        v_packet_max TYPE i.
* Declaration of internal tables
  DATA: it_rsmonfact TYPE STANDARD TABLE OF ls_rsmonfact.
**************** INTER-PACKAGE COLLECTION TREATMENT *******************
  ASSIGN (l_requnr) TO <g_requnr>.
  SELECT dp_nr FROM rsmonfact
    INTO TABLE it_rsmonfact
    WHERE rnr = <g_requnr>.
  DESCRIBE TABLE it_rsmonfact LINES v_packet_max.
  IF v_packet_nbr < v_packet_max.
    APPEND LINES OF DATA_PACKAGE[] TO lt_data_package_collect[].
    CLEAR: DATA_PACKAGE.
    REFRESH DATA_PACKAGE.
    v_packet_nbr = v_packet_nbr + 1.
    CLEAR: MONITOR[], MONITOR.
    MONITOR-msgid = '00'.
    MONITOR-msgty = 'I'.
    MONITOR-msgno = '398'.
    MONITOR-msgv1 = 'All data_packages have been gathered in one. '.
    MONITOR-msgv2 = 'The last DATA_PACKAGE contains all records.'.
    APPEND MONITOR.
  ELSE.
* Last data package => perform the business rules.
    IF v_packet_max > 1.
      APPEND LINES OF DATA_PACKAGE[] TO lt_data_package_collect[].
      CLEAR: DATA_PACKAGE[], DATA_PACKAGE.
      k = 1.
* Put all collected packages back into DATA_PACKAGE, renumbering recno.
      LOOP AT lt_data_package_collect INTO ls_datapack.
        ls_datapack-recno = k.
        APPEND ls_datapack TO DATA_PACKAGE.
        k = k + 1.
      ENDLOOP.
      CLEAR : lt_data_package_collect.
      REFRESH : lt_data_package_collect.
    ENDIF.
* Sort the global data package and keep only the first occurrence of
* each record. Note: calyear must be part of the sort key so that the
* duplicates compared below are actually adjacent.
  SORT DATA_PACKAGE BY material plant calyear calmonth.
  DELETE ADJACENT DUPLICATES FROM DATA_PACKAGE
        COMPARING material plant calyear.
  SELECT * FROM /bic/aydbsdppx00
      INTO TABLE it_ydbsdppx
      FOR ALL ENTRIES IN DATA_PACKAGE
        WHERE material = DATA_PACKAGE-material
          AND plant    = DATA_PACKAGE-plant
          AND calyear  = DATA_PACKAGE-calyear.
* Enhance DATA_PACKAGE with the target's additional fields.
  LOOP AT DATA_PACKAGE.
    CLEAR : wa_datapak1, wa_ydbsdppx.
    MOVE-CORRESPONDING DATA_PACKAGE TO wa_datapak1.
    READ TABLE it_ydbsdppx INTO wa_ydbsdppx
      WITH KEY material = DATA_PACKAGE-material
                  plant = DATA_PACKAGE-plant
                calyear = DATA_PACKAGE-calyear.
    IF sy-subrc NE 0.       "new product price
      APPEND wa_datapak1 TO datapak1.
    ELSE.                   " a product price already exists
      IF wa_ydbsdppx-calmonth GE DATA_PACKAGE-calmonth.
* Keep the oldest one (for each year), or overwrite the price if same month.
        APPEND wa_datapak1 TO datapak1.
      ENDIF.
    ENDIF.
  ENDLOOP.
  ENDIF.
* If ABORT is not equal to zero, the update process will be canceled.
  ABORT = 0.
*$*$ end of routine - insert your code only before this line         -
ENDFORM.
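Once the last package holds all collected records, the ranking itself can run over the complete table inside the same ELSE branch. The following is only a minimal sketch of such a rank calculation: the field names DELIV_NUMB and DELIV_ITEM and the rank InfoObject /BIC/ZRANK are hypothetical placeholders, and the real sort criteria would come from the calculations mentioned in the question.
* Sketch: assign ranks 1..n per delivery across the collected package.
* DELIV_NUMB, DELIV_ITEM and /BIC/ZRANK are placeholder field names.
DATA: lv_rank       TYPE i,
      lv_prev_deliv LIKE DATA_PACKAGE-deliv_numb.
SORT DATA_PACKAGE BY deliv_numb deliv_item.
LOOP AT DATA_PACKAGE.
* Restart the rank counter whenever a new delivery number starts.
  IF DATA_PACKAGE-deliv_numb NE lv_prev_deliv.
    lv_rank = 0.
    lv_prev_deliv = DATA_PACKAGE-deliv_numb.
  ENDIF.
  lv_rank = lv_rank + 1.
  DATA_PACKAGE-/bic/zrank = lv_rank.
  MODIFY DATA_PACKAGE.
ENDLOOP.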

Similar Messages

  • Update routine bringing wrong data in Report

    Hello Experts,
    I have a strange situation with Inventory data. We have update rules from data source 2LIS_03_BF to our inventory cube. There is a characteristic in the cube which was not doing anything in the past (I mean, it was not mapped) and which I am currently filling with an update routine. The cube data looks absolutely fine for that field; it contains exactly what my routine is supposed to deliver. But when I run the report on this cube, the field gets its numbers from somewhere else, and they are totally wrong.
    I checked whether there is a BAdI written on this field (virtual characteristic), but there seems to be none for this characteristic, only some for key figures based on other characteristics.
    I am confused as to why the report brings wrong numbers when it is supposed to get the correct values from the cube.
    Please advise,
    Regards,
    JB

    Hi,
    I hope you have checked for non-cumulative key figures, as aggregation works differently in that case.
    Also, if the inventory is modelled on cumulative key figures and you are sure there is no virtual key figure code, it could be exception aggregation in the reporting settings or in the key figure's general settings.
    Also check whether you are considering the historical data from the cube and its effect on the report output.
    Just do a check and let us know.
    regards
    Ajeet

  • In an update rule, run two data packages package by package (sequentially, not in parallel)

    Hi, my data source is an ODS and I want to fill another ODS.
    If I have two data packages, with 7000 and 5000 records,
    I want to run the two data packages sequentially, not in parallel.
    But the delta init doesn't give me the option to run package by package.
    The option it shows me is "initialization with data transfer" (parallel); I want package by package.
    How can I do that?
    Points will be rewarded.


  • Unable to update profile due to required data for "state" field

    Anyone else having a problem updating their profile? Once I log in, the site sends me a page to update my profile. When I try to click update, it comes back with an error message requesting that the "state" field be completed. But there is no "state" field provided, or it is not visible.
    Whoever created the profile page needs to check their work.

    Thank you all for responding.
    @ Sushant - We definitely have data in the PSA table linked to the data source (8ZECPCO22, where ZECPCO22 is the direct update DSO), and this data is what we want to delete through process chains. As mentioned earlier, we are able to delete these PSA requests manually via the manage screen of the data source.
    A few more details below:
    ZECPCO22 is the direct update DSO, and as per the data flow the data is loaded onward from this DSO (data source 8ZECPCO22) into a master data InfoObject or other targets via an InfoPackage with processing type "PSA and then into Data Target in series".
    So looking at the InfoPackage one can definitely confirm that the PSA table is used, and since the DSO is a direct update DSO we cannot have a change log table.
    However, we are unable to delete these PSA requests through process chains using either the "Delete PSA Request" or the "Delete Change Log Request" process type.
    Any suggestion is appreciated.
    Warm Regards
    Vikram S Reddy

  • Issue in update of PA & PD data using same form scenario

    Hi All,
    We're in the process of implementing HCM P&F. As per the requirement, the form should be able to change the PA, PSA, EG, ESG and Job of a position on both the PD and PA (of the corresponding pernr) side. The process is started as a PA form (selection by pernr).
    I'm able to change the PA, PSA, EG and ESG attributes of the position, but whenever I change the Job it gets updated on the PD side and not the PA side (it still holds the previous job prior to the effective date).
    In my form scenario I have one PA and one PD service.
         Any help will be appreciated and rewarded in resolving this issue.
    Thanks,
    Mani.

    Hi,
    Please check your ISR scenario:
    Go to -> Process Form Scenario for Personnel Admin. Infotypes (Service SAP_PA):
    Define Operations:
    0001 -> M change
    Define Assignment of Fields:
    Field Name | Number | Infotype | Subtype | Screen Structure     | Field Name | Data Index
    Job        | 1      | 0001     | 1       | HCMT_BSP_PA_XX_R0001 | Job        | 1
    Then do "Synchronize Form Fields".
    Then run "Check Consistency of Form Scenarios" in the IMG activity.
    If you get a green light, the configuration is correct; otherwise the problem is on the configuration side.
    Thanks

  • Issue with Update Routine

    Hi Experts,
    I tried the code below to calculate the number of days between the document created date and the system date. There is also a condition: if a record was created yesterday and its creation time is after 17:00 hrs, the number of days should be zero. But this code isn't working that way.
    For any record created yesterday after 17:00 hrs, the code still produces 1 day.
    FYI - the function module always adds one day to the difference, hence the line
      RESULT = OUT - 1.
    Any suggestions are highly appreciated.
    Thanks,
    DV
    The code is
    PROGRAM UPDATE_ROUTINE.
    *$*$ begin of global - insert your declaration only below this line  -
    * TABLES: ...
    * DATA:   ...
    TABLES: /BIC/PZCRM_ITM.
    *DATA: IT_ZCRMITM LIKE /BIC/PZCRM_ITM OCCURS 0 WITH HEADER LINE.
    DATA: BEGIN OF IT_ZCRMITM OCCURS 0,
              /BIC/ZCRM_ITM LIKE /BIC/PZCRM_ITM-/BIC/ZCRM_ITM,
              CREA_TIME LIKE /BIC/PZCRM_ITM-CREA_TIME,
          END    OF IT_ZCRMITM.
    SELECT /BIC/ZCRM_ITM CREA_TIME FROM
    /BIC/PZCRM_ITM INTO TABLE IT_ZCRMITM.
    DATA: CUR_DATE LIKE SY-DATUM,
          PRE_DATE LIKE SY-DATUM,
          DOC_DATE LIKE SY-DATUM,
          TIME_CREA LIKE SY-UZEIT,
          OUT LIKE SY-TABIX.
    *$*$ end of global - insert your declaration only before this line   -
    FORM compute_key_field
      TABLES   MONITOR STRUCTURE RSMONITOR "user defined monitoring
      USING    COMM_STRUCTURE LIKE /BIC/CS80CRM_PRI
               RECORD_NO LIKE SY-TABIX
               RECORD_ALL LIKE SY-TABIX
               SOURCE_SYSTEM LIKE RSUPDSIMULH-LOGSYS
      CHANGING RESULT LIKE /BIC/VZCRM_PRI1T-/BIC/ZOPENDAYS
               RETURNCODE LIKE SY-SUBRC
               ABORT LIKE SY-SUBRC. "set ABORT <> 0 to cancel update
    *$*$ begin of routine - insert your code only below this line        -
    * fill the internal table "MONITOR", to make monitor entries
      CLEAR: CUR_DATE.
      CLEAR: OUT.
      CLEAR: DOC_DATE.
      CLEAR: TIME_CREA.
    * result value of the routine
    ***Get Time Created***
    READ TABLE IT_ZCRMITM WITH KEY
    /BIC/ZCRM_ITM = COMM_STRUCTURE-/BIC/ZCRM_ITM.
    IF SY-SUBRC = 0.
    TIME_CREA  = IT_ZCRMITM-CREA_TIME.
    ENDIF.
    ***Get Time Created***
    ***Get No of Days***
      CUR_DATE = SY-DATUM.
      DOC_DATE = COMM_STRUCTURE-calday.
      CALL FUNCTION 'Z_SELECT_FACTDAYS_FOR_PERIOD'
        EXPORTING
          I_DATAB  = DOC_DATE
          I_DATBI  = CUR_DATE
          I_FACTID = 'AU'
        IMPORTING
          OUT_DAYS = OUT.
    ***Get No of Days***
    ***Calculate appropriate No of Days for Yesterday's Records***
    * If the created time is after 17:00 hrs, the number of days for
    * records created yesterday is OUT minus 2; otherwise OUT minus 1.
      PRE_DATE = SY-DATUM - 1.
      IF DOC_DATE = PRE_DATE AND
        IT_ZCRMITM-CREA_TIME GT '170000'.
        RESULT = OUT - 2.
      ELSE.
        RESULT = OUT - 1.
      ENDIF.
    ***Calculate appropriate No of Days for Yesterday's Records***
    * If the return code is not equal to zero, the result will not be updated.
      RETURNCODE = 0.
    * If ABORT is not equal to zero, the update process will be canceled.
      ABORT = 0.
    *$*$ end of routine - insert your code only before this line         -
    ENDFORM.

    Hi Olivier,
    Thanks for your response.
    1. I will add the clause as you have suggested. But may I know the scenario in which we may find inactive master data? We load and activate the master data daily.
    2. The function module code is below. This FM is similar to one in R/3; I think you can find it by searching for SELECT_FACTDAYS.
    FUNCTION Z_SELECT_FACTDAYS_FOR_PERIOD.
    ""Local Interface:
    *"  IMPORTING
    *"     REFERENCE(I_DATAB) LIKE  SY-DATUM
    *"     REFERENCE(I_DATBI) LIKE  SY-DATUM
    *"     REFERENCE(I_FACTID) LIKE  TFACT-IDENT
    *"  EXPORTING
    *"     REFERENCE(OUT_DAYS) LIKE  SY-TABIX
      DATA: L_V_AKTDAT LIKE SCAL-DATE.
      DATA: L_V_INDICATOR LIKE SCAL-INDICATOR.
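    * Note: ETH_DATS is presumably a TABLES parameter of this function
    * module; its declaration is not visible in the pasted interface.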
    CLEAR  : ETH_DATS.
    REFRESH: ETH_DATS.
      L_V_AKTDAT = I_DATAB.
    * Do it for all days in the interval.
      WHILE L_V_AKTDAT <= I_DATBI.
        CALL FUNCTION 'DATE_CONVERT_TO_FACTORYDATE'
             EXPORTING
                  DATE                         = L_V_AKTDAT
                  FACTORY_CALENDAR_ID          = I_FACTID
             IMPORTING
                  WORKINGDAY_INDICATOR         = L_V_INDICATOR
             EXCEPTIONS
                  CALENDAR_BUFFER_NOT_LOADABLE = 1
                  CORRECT_OPTION_INVALID       = 2
                  DATE_AFTER_RANGE             = 3
                  DATE_BEFORE_RANGE            = 4
                  DATE_INVALID                 = 5
                  FACTORY_CALENDAR_NOT_FOUND   = 6
                  OTHERS                       = 7.
       IF SY-SUBRC NE 0.
         MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
                 WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4
                 RAISING DATE_CONVERSION_ERROR.
       ENDIF.
    * The indicator is space if the actual day is a working day;
    * if the indicator is not space, the actual day isn't a working day.
        IF L_V_INDICATOR EQ SPACE.
         CLEAR ETH_DATS.
         ETH_DATS-PERIODAT = L_V_AKTDAT.
         APPEND ETH_DATS.
          out_days = out_days + 1.
        ENDIF.
        L_V_AKTDAT = L_V_AKTDAT + 1.
      ENDWHILE.
    ENDFUNCTION.
    3. There will always be a master data record, because we load them daily, and the load is a full update so no records get deleted.
    4. The load will be a daily load. The business wants to capture the snapshot for that day. So apart from deciding the number of days based on the created time, the rest of the logic works fine.
    So my suspicion is that the part which decides based on the created time is not actually working the way we expected.
    DV

  • Reason for Error in Manual Update of Data Package

    Hey,
    Sometimes while loading a huge amount of data, one of the data packages doesn't get uploaded. For those data packages we do a manual update, and that works well.
    My question is: why do we get this error?
    Is it to do with system contention or an IDoc issue?
    Please help me find the root cause, so that we can avoid the manual update of the data packages.

    We usually get this issue when a lot of processing is done in the update rules: the data package processing gets timed out.
    If it happens very often, you need to sit with your Basis person and modify the timeout settings for your system.

  • Update routine infinite loop

    Hello Experts,
    For loading ODS2 we are making a lookup on ODS1 for 0MATERIAL based on
    purchasing document number and item line item.
    Is there any mistake in the start routine or update routine?
    The load goes into an infinite loop; I think the update routine should be changed.
    Any suggestions are appreciated
    Start routine:
    data: begin of itab occurs 0,
            pur_doc like /BIC/AZODS100-OI_EBELN,
            item like /BIC/AZODS100-OI_EBELP,
            material like /BIC/AZODS100-MAT_PLANT,
          end of itab.
    clear itab.
    select OI_EBELN OI_EBELP MAT_PLANT from /BIC/AZODS100
             into table itab.
    Update routine for 0MATERIAL:
    loop at itab where pur_doc = COMM_STRUCTURE-OI_EBELN
                       and item = COMM_STRUCTURE-OI_EBELP.
      RESULT = itab-material.
    endloop.

    Hi,
    this takes a long time because, for each record of your data package, the loop scans every row of the internal table. Use the following instead.
    Start routine:
    types: begin of t_itab,
             pur_doc like /BIC/AZODS100-OI_EBELN,
             item like /BIC/AZODS100-OI_EBELP,
             material like /BIC/AZODS100-MAT_PLANT,
           end of t_itab.
    data: itab type hashed table of t_itab with unique key pur_doc item.
    select OI_EBELN OI_EBELP MAT_PLANT from /BIC/AZODS100
      into table itab order by oi_ebeln oi_ebelp mat_plant.
    I hope these fields are the key of the ODS object.
    Update routine for 0MATERIAL:
    data: wa_itab type t_itab.
    read table itab into wa_itab with table key
         pur_doc = COMM_STRUCTURE-OI_EBELN
         item = COMM_STRUCTURE-OI_EBELP.
    if sy-subrc = 0.
      RESULT = wa_itab-material.
    else.
      clear RESULT.
    endif.
    Hope this helps
    regards
    Siggi

  • Can routine replace "master data attribute of" update rule for performance?

    Hi all,
    We are working on CRM-BW data modeling. We have to look up agent master data (agent level and position) for each transaction record, and right now we use the "master data attribute of" update rule. Can we use a routine instead of "master data attribute of", and will that improve loading performance? We have to load about 100,000 (1 lakh) transaction records, while the agent master data holds about 20,000 agents. My understanding is that for each record in the data package the system goes to the master data table and fetches the agent details to store in the cube. Say one agent created 10 transactions: the "master data attribute of" option will read the agent master data 10 times, even though we pull the same details for all 10 transactions. If we use a routine, we can pull the agent details into an internal table in the start routine, removing duplicates, and read that internal table in the update routine.
    Will this improve performance?
    Let me know if you need further info.
    Thanks in advance.
    Arun Thangaraj

    Hi,
    Your thinking is absolutely right!
    I don't recommend using the standard attribute derivation, since it performs a SELECT on the database for EACH record.
    Better to fill a sorted internal table in your start routine with SELECT <fields> FROM <master_data_table> FOR ALL ENTRIES IN DATA_PACKAGE WHERE OBJVERS = 'A' etc.
    In your update routine, perform a READ TABLE itab ... BINARY SEARCH. I believe that you won't be able to go faster...
    hope this helps...
    Olivier.
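    A minimal sketch of that pattern in 3.x style; /BIC/PZAGENT and the ZAGENT/ZLEVEL objects are placeholders, since the actual master data objects are not named in this thread:
    * Global part: lookup buffer with header line (3.x style).
    DATA: BEGIN OF gt_agent OCCURS 0,
            /bic/zagent LIKE /bic/pzagent-/bic/zagent,
            /bic/zlevel LIKE /bic/pzagent-/bic/zlevel,
          END OF gt_agent.
    * Start routine: one SELECT per package instead of one per record.
    * Guard FOR ALL ENTRIES against an empty data package.
    IF NOT DATA_PACKAGE[] IS INITIAL.
      SELECT /bic/zagent /bic/zlevel FROM /bic/pzagent
        INTO TABLE gt_agent
        FOR ALL ENTRIES IN DATA_PACKAGE
        WHERE /bic/zagent = DATA_PACKAGE-/bic/zagent
          AND objvers = 'A'.
      SORT gt_agent BY /bic/zagent.
    ENDIF.
    * Update routine: binary search against the buffered table.
    READ TABLE gt_agent WITH KEY /bic/zagent = COMM_STRUCTURE-/bic/zagent
         BINARY SEARCH.
    IF sy-subrc = 0.
      RESULT = gt_agent-/bic/zlevel.
    ELSE.
      CLEAR RESULT.
    ENDIF.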

  • Creation of data packages due to large amount of datasets leads to problems

    Hi Experts,
    We have built our own generic extractor.
    When data packages are created (due to the large number of records), different problems occur.
    For example:
    Records are doubled and appear twice, once in package one and a second time in package two. Since those records are not identical, information is lost while uploading them to an ODS or cube.
    What can I do? SAP will not help, because it is a generic DataSource.
    Any suggestion?
    BR,
    Thorsten

    Hi All,
    Thanks a million for your help.
    My conclusions from your answers are the following:
    a) Since the ODS is standard, no records are deleted within the transformation; they are aggregated.
    b) Uploading a huge number of records is possible in two ways:
       b1) with selection criteria in the InfoPackage and several uploads
       b2) without selection criteria in the InfoPackage, and therefore an automatic split of the records into data packages
    c) Both ways should give the same result within the ODS.
    OK. Thanks for that.
    So far I have only checked the data within the PSA. In the PSA the number of records is not equal for variants b1 and b2.
    I guess this is normal technical behaviour of BI.
    I am fine as long as the results in the ODS are the same for b1 and b2.
    Have a nice day.
    BR,
    Thorsten

  • Identify the last data package in start routine

    Hi Everyone
    We have a start routine in a transformation. We need to do some special processing in the start routine only when the last data package is being processed. How can we determine in the start routine whether the current package is the last one? Any pointers in this direction are appreciated.

    Hi,
    You can get the packet id from the DATAPACKID parameter in the start routine and end routine. But I'm not so sure how to identify the last packet id; alternatively, you could store this packet id somewhere else and read the value in the end routine, if your logic permits doing the processing in the end routine instead of the start routine.
    METHODS
          start_routine
            IMPORTING
              request                  type rsrequest
              datapackid               type rsdatapid
            EXPORTING
              monitor                  type rstr_ty_t_monitors
            CHANGING
              SOURCE_PACKAGE              type tyt_SC_1
            RAISING
              cx_rsrout_abort.
    hope it helps...
    regards.
    Raju
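    For what it's worth, the opening post of this thread uses a 3.x trick that points in the same direction: counting in table rsmonfact how many packages the monitor knows for the request. Whether rsmonfact is already filled when your DTP start routine runs is an assumption you would have to verify, so treat this as a rough sketch only:
    * Compare the current package id with the number of packages the
    * monitor has registered for this request (assumes serial
    * processing and that rsmonfact is already filled for this load).
    DATA: lv_packets TYPE i.
    SELECT COUNT( DISTINCT dp_nr ) FROM rsmonfact INTO lv_packets
      WHERE rnr = request.
    IF datapackid = lv_packets.
    * ... special processing for the last data package ...
    ENDIF.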

  • Update Routine to populate 0VENDOR from either of the 2 data source fields

    Hi,
    I have a requirement to write an update routine for 0VENDOR based on the logic below:
    Create routines to populate the BW InfoObject "Vendor" (0VENDOR) based on the following logic:
    IF the field "Vendor" (ITM_VENDOR_ID) is populated from data source 0BBP_SC_TD_1, THEN populate 0VENDOR with that value;
    ELSE IF "Preferred Vendor" (ITM_PROPVEN_ID) is populated from data source 0BBP_SC_TD_1, THEN populate 0VENDOR with that value;
    ELSE IF neither "Vendor" (ITM_VENDOR_ID) nor "Preferred Vendor" (ITM_PROPVEN_ID) is populated from data source 0BBP_SC_TD_1, THEN 0VENDOR = NULL.
    Can anyone help me convert this logic into an ABAP routine?
    Thanks,
    Suchitra

    Hi Suchitra,
    In the transfer rules you will be mapping each field; on the mapped field, click the button with the triangle and you can choose the rule type you want.
    Select "routine" and select the data source fields (don't forget to select both fields, ITM_VENDOR_ID and ITM_PROPVEN_ID).
    Then give the routine a name,
    and in the code just use TRANSFER_STRUCTURE instead of COMM_STRUCTURE.
    Then you can get this done.
    Regards,
    Ravi Kanth
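    A minimal sketch of the routine body under those settings, assuming both fields were selected into the transfer structure as described (field names as given in the question):
    * Prefer Vendor, then Preferred Vendor, else leave 0VENDOR empty.
    IF NOT TRANSFER_STRUCTURE-itm_vendor_id IS INITIAL.
      RESULT = TRANSFER_STRUCTURE-itm_vendor_id.
    ELSEIF NOT TRANSFER_STRUCTURE-itm_propven_id IS INITIAL.
      RESULT = TRANSFER_STRUCTURE-itm_propven_id.
    ELSE.
      CLEAR RESULT.
    ENDIF.
    RETURNCODE = 0.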

  • Collecting the data of all data packages in one internal table in a BW update rule

    Hello Gurus,
    I have a requirement in BW which demands an update routine in which I need to find, by some business rule, the vendor for each material that doesn't have one. The data is divided into different data packages, so the records of one material are split over different data packages.
    One material can have n records with different plants.
    My only question is: can we collect the data of all data packages in one internal table?
    E.g. if there are 5 data packages in total, can we collect the records of all of them in the update rule?
    Please suggest your opinion.
    Thanks in advance,
    Snehal.

    hi,
    I think your problem is the need for a work area.
    When using "FOR ALL ENTRIES IN lt_zhrp" you must first check that lt_zhrp is not initial.
    DATA: lt_zhrp TYPE STANDARD TABLE OF zhrp,
          lt_zhrt TYPE STANDARD TABLE OF zhrt.
    * new: the work area is typed with the line type, not the table type
    DATA: wa_zhrp TYPE zhrp.
    SELECT tabnr
      FROM zhrp
      INTO CORRESPONDING FIELDS OF TABLE lt_zhrp
      WHERE objid = lv_posid.
    LOOP AT lt_zhrp INTO wa_zhrp.
    * APPENDING, so each loop pass adds to lt_zhrt instead of overwriting it
      SELECT low
        FROM zhrt
        APPENDING CORRESPONDING FIELDS OF TABLE lt_zhrt
        WHERE tabnr = wa_zhrp-tabnr
        AND attrib = 'NET_VALUE'.
    ENDLOOP.
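    The SELECT inside the loop can also be collapsed into the single FOR ALL ENTRIES SELECT mentioned above, with the emptiness check the reply calls out (a sketch over the same tables):
    * One SELECT for all tabnr values. The initial check is mandatory:
    * FOR ALL ENTRIES with an empty driver table would select all rows.
    IF NOT lt_zhrp IS INITIAL.
      SELECT low
        FROM zhrt
        INTO CORRESPONDING FIELDS OF TABLE lt_zhrt
        FOR ALL ENTRIES IN lt_zhrp
        WHERE tabnr = lt_zhrp-tabnr
          AND attrib = 'NET_VALUE'.
    ENDIF.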

  • How do I get at the data in the data package of the start routine

    I have some very simple code in my start routine, but the syntax check says DATA_PACKAGE is not a database table, even though it is SAP code that defines it this way. Here is my code:
    SELECT DOC_NUM S_ORD_ITM CH_ON CREATEDON INTO TABLE T_ORD_BKG_DATA
    FROM DATA_PACKAGE.

    Hi,
    The data package is not a DB table but an internal table, so you cannot SELECT from it.
    You have to use a LOOP statement to get at the data:
        LOOP AT SOURCE_PACKAGE INTO w_SOURCE_PACKAGE.
          IF w_SOURCE_PACKAGE-myField = '98'.
    *       something to do
          ENDIF.
        ENDLOOP. " SOURCE_PACKAGE
    Hope that helps.
    Dieu
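    Applied to the original question's 3.x start routine, the fields can be copied package-wise without a SELECT; this sketch assumes T_ORD_BKG_DATA is declared with a header line and shares the four field names:
    * Copy the needed fields from the data package into the buffer table.
    LOOP AT DATA_PACKAGE.
      MOVE-CORRESPONDING DATA_PACKAGE TO T_ORD_BKG_DATA.
      APPEND T_ORD_BKG_DATA.
    ENDLOOP.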

  • Data Package Issue in DTP

    Hi gurus,
    My data flow is: DataSource -> InfoSource -> write-optimized DSO with semantic key.
    In the source I have 10 records, of which 7 are duplicates.
    I reduced the DTP data package size from 50000 to 5.
    When I executed the DTP, I got 2 data packages: the first held all 7 records for the same set of keys, and the second held the remaining records.
    My doubt is: if I defined the data package size as 5, how can the first data package hold 7 records instead of 5?
    Thanks in advance !

    Hi ,
    It is because of the semantic key setting that you have maintained: data records that have the same key are combined in a single data package. This setting is only relevant for DataStore objects with data fields that are overwritten.
    Semantic groups determine how the data packages that are read from the source (DataSource or InfoProvider) are built.
    This setting also defines the key fields for the error stack. By defining the key for the error stack, you ensure that the data can be updated in the target in the correct order once the incorrect records have been corrected.
    Hope it helps .
    Thanks
    Kamal Mehta
