End routine data extraction

Hi Gurus,
I am trying to extract a custom key figure from 0EMPLOYEE master data into the 0PA_C01 cube.
Global declarations include:

data: begin of i_s_employee,
        l_zfte1 type /bic/mzemployee-/bic/zfte,
      end of i_s_employee.
data: i_t_employee like table of i_s_employee.
Start of end routine includes:

data: e_s_result type tys_TG_1.
data: e_t_result type tyt_TG_1.

read table i_t_employee index 1 transporting no fields.
if sy-subrc = 4.
  select /bic/zfte
    from /bic/mzemployee
    into corresponding fields of table i_t_employee
    where objvers = 'A'
      and dateto <= sy-datum
      and /bic/zemployee <> ''.
  if sy-subrc = 4.
    raise exception type CX_RSROUT_ABORT.
  endif.
endif.

loop at RESULT_PACKAGE into e_s_result.
  read table i_t_employee into i_s_employee
    with key l_zfte1 = e_s_result-/bic/zfte.
endloop.

loop at i_t_employee into i_s_employee
    where l_zfte1 = e_s_result-/bic/zfte.
  append e_s_result to e_t_result.
endloop.

*refresh RESULT_PACKAGE.
move e_t_result[] to RESULT_PACKAGE[].
Nothing gets updated to the cube and there are no syntax errors. Can anyone please tell me where I am making mistakes or what I am overlooking?
Any help is greatly appreciated.
Best Regards,
Reddy.

Hi Reddy,
did you solve this problem?
I have the same problem now and am trying to solve it.
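For reference, a minimal hedged sketch of how such a master-data lookup end routine is often structured. All names below (the EMPLOYEE field of the target structure, the ZEMPLOYEE/ZFTE attributes and the DATEFROM/DATETO columns of the M table) are assumptions and would need to be adapted to the actual model. Two things in the posted code are worth checking: the SELECT uses INTO CORRESPONDING FIELDS, but the internal table's only component is named L_ZFTE1, so /BIC/ZFTE has nothing to correspond to; and the final loop rebuilds RESULT_PACKAGE from E_T_RESULT, which drops every row that found no match. The sketch keeps the employee key in the lookup table and updates RESULT_PACKAGE in place instead:

* Hedged sketch only - names are illustrative, not taken from the actual objects.
* The declarations would go into the global part so the table is filled only once.
TYPES: BEGIN OF ty_emp,
         employee TYPE /bic/mzemployee-/bic/zemployee,
         zfte     TYPE /bic/mzemployee-/bic/zfte,
       END OF ty_emp.
DATA: lt_emp TYPE STANDARD TABLE OF ty_emp,
      ls_emp TYPE ty_emp.
FIELD-SYMBOLS: <fs_result> TYPE tys_TG_1.

IF lt_emp IS INITIAL.
  SELECT /bic/zemployee /bic/zfte
    FROM /bic/mzemployee
    INTO TABLE lt_emp
    WHERE objvers  = 'A'
      AND datefrom <= sy-datum    "pick the interval valid today (assumed columns)
      AND dateto   >= sy-datum.
  SORT lt_emp BY employee.
ENDIF.

* Update the package in place; no separate result table or REFRESH is needed.
LOOP AT RESULT_PACKAGE ASSIGNING <fs_result>.
  READ TABLE lt_emp INTO ls_emp
    WITH KEY employee = <fs_result>-employee   "employee field name in the target is assumed
    BINARY SEARCH.
  IF sy-subrc = 0.
    <fs_result>-/bic/zfte = ls_emp-zfte.
  ENDIF.
ENDLOOP.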

Similar Messages

  • End Routine Data updation Issue

    Hi all,
    We are trying to do some lookups in the end routine to populate some fields.
    We can see the selected data while debugging, but in the actual DSO the fields are not populated.
    We suspect that the modify statement
    MODIFY RESULT_PACKAGE from e_s_result.
    might be having some issues.
    If anybody has any suggestions, please help us out.
    Appreciate it.
    Thanks,
    HM

    No, it is not an issue with the MODIFY statement.
    For whatever fields you are updating with the MODIFY statement, you need to have the rule type assigned as constant with no value.
    In BI 7.0 the individual rule needs to be defined for the field to get populated; otherwise the system does not consider that the field needs to be updated.

  • End Routine Issue - It does not move data from E_T_RESULT to RESULT_PACKAGE

    Hi,
    I am facing an issue with an end routine. I have gone through previous posts on how to write end routines, and I wrote the end routine accordingly.
    Here is my scenario,
    I have 0CUST_SALES master data, which has the Sales Org, Distribution Channel, Division, Sold to Party, Sales Grp and Sales Dist.
    I am getting Sold to Party and Distribution Channel from a field routine.
    Using Sold to Party, Dist Channel and Division = '01' (which I populated with a field routine), I am trying to get the Sales Org, Sales Grp and Sales Dist in the end routine.
    It looks like all the code I wrote is correct, but it does not populate any values into RESULT_PACKAGE.
    Here is the code I wrote in the end routine. I am not sure what's wrong with it. I used this link to write the routine:
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/203eb778-461d-2c10-60b3-8a94ee91cbfc&overridelayout=true
    Global Declaration:
      DATA : BEGIN OF IT_CUST_SALES,
        DIV TYPE /bi0/pcust_sales-DIVISION,
        DIST_CH TYPE /bi0/pcust_sales-DISTR_CHAN,
        SALES_ORG TYPE /bi0/pcust_sales-SALESORG,
        CUST_SAL TYPE /bi0/pcust_sales-CUST_SALES,
        SALESDIST TYPE /bi0/pcust_sales-SALES_DIST,
        SALESGRP TYPE /bi0/pcust_sales-SALES_GRP,
        END OF IT_CUST_SALES.
    DATA: T_CUST_SALES LIKE TABLE OF IT_CUST_SALES.
    Start of End Routine
       SELECT DIVISION DISTR_CHAN SALESORG CUST_SALES SALES_DIST SALES_GRP
        from
        /bi0/pcust_sales INTO TABLE T_CUST_SALES for all entries in
        RESULT_PACKAGE
        where CUST_SALES = RESULT_PACKAGE-SOLD_TO
         AND DISTR_CHAN = RESULT_PACKAGE-DISTR_CHAN
         AND DIVISION = '01'.
        LOOP AT RESULT_PACKAGE INTO e_s_result.
          READ TABLE T_CUST_SALES INTO IT_CUST_SALES
              WITH KEY CUST_SAL = e_s_result-SOLD_TO
                 DIST_CH = e_s_result-DISTR_CHAN
                 DIV = '01'.
          IF SY-SUBRC EQ 0 .
            MOVE IT_CUST_SALES-SALES_ORG TO E_S_RESULT-SALESORG.
            MOVE IT_CUST_SALES-SALESDIST TO E_S_RESULT-SALES_DIST.
            MOVE IT_CUST_SALES-SALESGRP TO E_S_RESULT-SALES_GRP.
            APPEND E_S_RESULT  TO  E_T_RESULT .
          ENDIF.
        ENDLOOP.
        REFRESH RESULT_PACKAGE.
        MOVE E_T_RESULT[] TO RESULT_PACKAGE[] .
    End End Routine
    Data comes into E_T_RESULT but it does not move to RESULT_PACKAGE. Any inputs will be helpful.
    Regards,
    Kumar

    Hi Hegde,
    The declaration is the same; it's like this:
       data: e_s_result type tys_TG_1.
       data: e_t_result type tyt_TG_1.
    I don't know why; when I first inserted this code into the post it looked OK, but after posting I also saw that it is not that readable.
    FYI, I am putting the code in again; let's see if it works.
      SELECT DIVISION DISTR_CHAN SALESORG CUST_SALES SALES_DIST SALES_GRP
        from    /bi0/pcust_sales INTO TABLE T_CUST_SALES for all entries in
        RESULT_PACKAGE   where CUST_SALES = RESULT_PACKAGE-SOLD_TO
         AND DISTR_CHAN = RESULT_PACKAGE-DISTR_CHAN
         AND DIVISION = '01'.
        LOOP AT RESULT_PACKAGE INTO e_s_result.
          READ TABLE T_CUST_SALES INTO IT_CUST_SALES
              WITH KEY CUST_SAL = e_s_result-SOLD_TO
                 DIST_CH = e_s_result-DISTR_CHAN
                 DIV = '01'.
          IF SY-SUBRC EQ 0 .
            MOVE IT_CUST_SALES-SALES_ORG TO E_S_RESULT-SALESORG.
            MOVE IT_CUST_SALES-SALESDIST TO E_S_RESULT-SALES_DIST.
            MOVE IT_CUST_SALES-SALESGRP TO E_S_RESULT-SALES_GRP.
            APPEND E_S_RESULT  TO  E_T_RESULT .
          ENDIF.
        ENDLOOP.
        REFRESH RESULT_PACKAGE.
        MOVE E_T_RESULT[] TO RESULT_PACKAGE[] .
    Regards,
    Kumar
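    A hedged alternative sketch, using the names from the post where they are known: instead of rebuilding the package with REFRESH and MOVE (which silently drops every row that found no match in T_CUST_SALES), the rows can be changed in place through a field symbol, so unmatched rows keep their original values. The SORT plus BINARY SEARCH is optional but speeds up the lookup.
      * Sketch only - assumes T_CUST_SALES has been filled by the SELECT shown above.
        DATA: ls_cust_sales LIKE LINE OF T_CUST_SALES.
        FIELD-SYMBOLS: <fs_res> TYPE tys_TG_1.

        SORT T_CUST_SALES BY CUST_SAL DIST_CH DIV.
        LOOP AT RESULT_PACKAGE ASSIGNING <fs_res>.
          READ TABLE T_CUST_SALES INTO ls_cust_sales
            WITH KEY CUST_SAL = <fs_res>-SOLD_TO
                     DIST_CH  = <fs_res>-DISTR_CHAN
                     DIV      = '01'
            BINARY SEARCH.
          IF sy-subrc = 0.
            <fs_res>-SALESORG   = ls_cust_sales-SALES_ORG.
            <fs_res>-SALES_DIST = ls_cust_sales-SALESDIST.
            <fs_res>-SALES_GRP  = ls_cust_sales-SALESGRP.
          ENDIF.
        ENDLOOP.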

  • Performance: reading huge amount of master data in end routine

    In our 7.0 system, a full load runs each day from DSO X to DSO Y, in which master data from six characteristics of DSO X is read into about 15 fields of DSO Y. DSO X contains about 2 million records, which are all transferred each day. The master data tables each contain between 2 and 4 million records. Before this load starts, DSO Y is emptied. DSO Y is write-optimized.
    At first we designed this with the standard "master data reads", but this resulted in load times of 4 hours, because all master data is read with single lookups. We redesigned it and now fill all master data attributes in the end routine, after filling internal tables with the master data values corresponding to the data package:
    *   Read 0UCPREMISE into temp table
        SELECT ucpremise ucpremisty ucdele_ind
          FROM /BI0/PUCPREMISE
          INTO CORRESPONDING FIELDS OF TABLE lt_0ucpremise
          FOR ALL ENTRIES IN RESULT_PACKAGE
          WHERE ucpremise EQ RESULT_PACKAGE-ucpremise.
    And when we loop over the data package, we write something like:
        LOOP AT RESULT_PACKAGE ASSIGNING <fs_rp>.
          READ TABLE lt_0ucpremise INTO ls_0ucpremise
            WITH KEY ucpremise = <fs_rp>-ucpremise
            BINARY SEARCH.
          IF sy-subrc EQ 0.
            <fs_rp>-ucpremisty = ls_0ucpremise-ucpremisty.
            <fs_rp>-ucdele_ind = ls_0ucpremise-ucdele_ind.
          ENDIF.
    *all other MD reads
    ENDLOOP.
    So the above statement is repeated for all the master data we need to read. This method is quite a bit faster (1.5 hr), but we want to make it faster still. We noticed that reading the master data into the internal tables still takes a long time, and this has to be repeated for each data package. We want to change this. We have now tried a similar method, but we load all master data into the internal tables without filtering on the data package, and we do this only once.
    *   Read 0UCPREMISE into temp table
        SELECT ucpremise ucpremisty ucdele_ind
          FROM /BI0/PUCPREMISE
          INTO CORRESPONDING FIELDS OF TABLE lt_0ucpremise.
    So when the first data package starts, it fills all master data values, about 95% of which we would need anyway. So that the following data packages can use the same tables and don't need to fill them again, we placed the definitions of the internal tables in the global part of the end routine. In the global part we also write:
    DATA: lv_data_loaded TYPE C LENGTH 1.
    And in the method we write:
    IF lv_data_loaded IS INITIAL.
      lv_data_loaded = 'X'.     "mark: loading in progress
    * load all internal tables
      lv_data_loaded = 'Y'.     "mark: loading finished
    ENDIF.
    WHILE lv_data_loaded NE 'Y'.
      CALL FUNCTION 'ENQUEUE_SLEEP'
        EXPORTING
          seconds = 1.
    ENDWHILE.
    LOOP AT RESULT_PACKAGE ASSIGNING <fs_rp>.
    * assign all data
    ENDLOOP.
    This makes sure that another data package that already started, "sleeps" until the first data package is done with filling the internal tables.
    Well, this all seems to work: it now takes 10 minutes to load everything to DSO Y. But I'm wondering if I'm missing anything. The system seems to work fine loading all these records into internal tables, but any improvements or critical remarks are very welcome.

    This is a great question, and you've clearly done a good job of investigating this, but there are some additional things you should look at and perhaps a few things you have missed.
    Zephania Wilder wrote:
    At first, we designed this with the standard "master data reads", but this resulted in load times of 4 hours, because all master data is read with single lookups.
    This is not accurate. After SP14, BW does a prefetch and buffers the master data values used in the lookup. Note [1092539|https://service.sap.com/sap/support/notes/1092539] discusses this in detail. The important thing, and most likely the reason you are probably seeing individual master data lookups on the DB, is that you must manually maintain the MD_LOOKUP_MAX_BUFFER_SIZE parameter to be larger than the number of lines of master data (from all characteristics used in lookups) that will be read. If you are seeing one select statement per line, then something is going wrong.
    You might want to go back and test with master data lookups using this setting and see how fast it goes. If memory serves, the BW master data lookup uses an approach very similar to your second example (1,5 hrs), though I think that it first loops through the source package and extracts the lists of required master data keys, which is probably faster than your statement "FOR ALL ENTRIES IN RESULT_PACKAGE" if RESULT_PACKAGE contains very many duplicate keys.
    I'm guessing you'll get down to at least the 1,5 hrs that you saw in your second example, but it is possible that it will get down quite a bit further.
    Zephania Wilder wrote:
    This makes sure that another data package that already started, "sleeps" until the first data package is done with filling the internal tables.
    This sleeping approach is not necessary, as only one data package will be running at a time in any given process. I believe that the "global" internal table is not shared between parallel processes, so if your DTP is running with three parallel processes, then this table will just get filled three times. Within a process, all data packages are processed serially, so all you need to do is check whether or not it has already been filled. Or are you doing something additional to export the filled lookup table into a shared memory location?
    Actually, you have your global data defined with the statement "DATA: lv_data_loaded TYPE C LENGTH 1.". I'm not completely sure, but I don't think that this data will persist from one data package to the next. Data defined in the global section using "DATA" is global to the package start, end, and field routines, but I believe it is discarded between packages. I think you need to use "CLASS-DATA: lv_data_loaded TYPE C LENGTH 1." to get the variables to persist between packages. Have you checked in the debugger that you are really only filling the table once per request and not once per package in your current setup? << This is incorrect - see next posting for correction.
    Otherwise the third approach is fine as long as you are comfortable managing your process memory allocations and you know the maximum size that your master data tables can have. On the other hand, if your master data tables grow regularly, then you are eventually going to run out of memory and start seeing dumps.
    Hopefully that helps out a little bit. This was a great question. If I'm off-base with my assumptions above and you can provide more information, I would be really interested in looking at it further.
    Edited by: Ethan Jewett on Feb 13, 2011 1:47 PM
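    A hedged sketch of the once-per-process buffering pattern discussed above. The 0UCPREMISE table and field names are taken from the thread; the type definition, the CLASS-DATA flag and the OBJVERS filter are assumptions, and whether plain DATA in the global part already persists between packages in your release (as the correction mentioned above suggests) should be verified in the debugger.
      * Sketch only. Global part of the end routine:
        TYPES: BEGIN OF ty_ucpremise,
                 ucpremise  TYPE /bi0/pucpremise-ucpremise,
                 ucpremisty TYPE /bi0/pucpremise-ucpremisty,
                 ucdele_ind TYPE /bi0/pucpremise-ucdele_ind,
               END OF ty_ucpremise.
        CLASS-DATA: gt_ucpremise TYPE SORTED TABLE OF ty_ucpremise
                                 WITH UNIQUE KEY ucpremise,
                    gv_md_loaded TYPE c LENGTH 1.

      * In the end routine method: fill the buffer once per process, then read from it.
        DATA: ls_ucpremise TYPE ty_ucpremise.
        FIELD-SYMBOLS: <fs_rp> TYPE tys_TG_1.

        IF gv_md_loaded IS INITIAL.
          SELECT ucpremise ucpremisty ucdele_ind
            FROM /bi0/pucpremise
            INTO TABLE gt_ucpremise
            WHERE objvers = 'A'.
          gv_md_loaded = 'X'.
        ENDIF.

        LOOP AT RESULT_PACKAGE ASSIGNING <fs_rp>.
          READ TABLE gt_ucpremise INTO ls_ucpremise
            WITH TABLE KEY ucpremise = <fs_rp>-ucpremise.
          IF sy-subrc = 0.
            <fs_rp>-ucpremisty = ls_ucpremise-ucpremisty.
            <fs_rp>-ucdele_ind = ls_ucpremise-ucdele_ind.
          ENDIF.
        ENDLOOP.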

  • Filling Data fields of a DSO in End Routine

    Hi Everyone,
    The data fields of a DSO contains 2 key figures and a characteristic.
    In the End routine of the transformation, i have assigned constant values for the infoobjects in the data field.
    After executing the DTP, if I check the new table of the DSO, these constant values are present. But when I activate the DSO, the values for the key figures get initialised and the value for the characteristic becomes empty (NULL).
    Is it not possible to assign values to the InfoObjects in the data fields? If so, why does this limitation exist?
    Thanks in advance,
    Uma

    Uma,
    To populate any field in the end routine, you have to assign some constant to it in the transformation first and then re-populate it using the end routine.
    If you don't assign a constant in the transformation, the values remain initial, and even after you write code for that field it is not populated by the end routine.
    All you have to do is assign constant 0 to the key figures you are populating in the end routine and run the DTP again.
    Thanks
    Sachin

  • No data in Active table of DSO for fields populated by End Routine

    Hi,
    I have a standard DSO where we are populating a few fields using an end routine.
    Last week we added 5 more fields to the DSO and wrote logic in the end routine to populate them. These new fields don't have any mapping; they are populated by the end routine only.
    When I loaded the data from the DataSource to the DSO, the data was loaded correctly into the new data table of the DSO for all the fields. I could see correct data as per the logic in the new table, including the old and new fields.
    However, when I activate the DSO, I cannot find the data for the new fields which I added last week. The remaining fields get data as per the logic; only these five fields have no data.
    Can you please let me know if anyone has had a similar issue. I was under the impression that all the data in the new table goes to the active table when we activate the DSO.
    Your inputs are highly appreciated.
    Thanks
    Krishna

    What version of BW are you using?  When editing your end-routine, a pop-up should display saying which fields you want populated/transferred from the end routine.  This pop-up will not display if you are using a lower version of BW 7.x.  To get around this, make sure that your newly added fields have a transformation rule type set to constant.  This will make sure that the fields get populated when transferring from new to active tables.

  • End Routine - populating Target Field based on Master Data

    Hi,
    I have an issue with my end routine in BI 7.0. The scenario is as follows.
    The target fields ZSALES_OFFICE, 0SALES_CHANNEL etc. are mapped 1:1 from their respective source fields. In addition to these target fields I have a target field 0SALESORG which I need to populate based on the values of 0COMP_CODE, which is an attribute of ZSALES_OFFICE. The values of 0COMP_CODE are 9000, 9001, 9002 and 9003 respectively. The end routine condition needs to be implemented as follows.
    For every 0COMP_CODE with value 9000, 0SALESORG should be populated with the value "EAST". Similarly, for 0COMP_CODE 9001, 0SALESORG should be "WEST"; for 9002, "NORTH"; and for 9003, "SOUTH". I tried the following code but it doesn't seem to work. Could you please help!
    Thanks,
    SD
    DATA: it_tab4 TYPE TABLE OF /BIC/PZF31SALOFF,
              wa_tab4 TYPE /BIC/PZF31SALOFF.
        SELECT *
          FROM /BIC/PZF31SALOFF
        INTO CORRESPONDING FIELDS OF TABLE it_tab4.
        sort it_tab4 by /BIC/ZF31SALOFF.
        LOOP AT RESULT_PACKAGE
          INTO <result_fields>.
          read table it_tab4
          with key /BIC/ZF31SALOFF = <result_fields>-/BIC/ZF31SALOFF
          into wa_tab4
          binary search.
          if sy-subrc eq 0.
            CASE wa_tab4-comp_code.
              WHEN '9000'.
                <result_fields>-salesorg = 'EAST'.
              WHEN '9100'.
                <result_fields>-salesorg = 'WEST'.
              WHEN '9200'.
                <result_fields>-salesorg = 'NORTH'.
              WHEN '9300'.
                <result_fields>-salesorg = 'SOUTH'.
              MODIFY it_tab4 FROM wa_tab4.
            ENDCASE.
          endif.
        ENDLOOP.

    Replace your select statement:
    SELECT *
    FROM /BIC/PZF31SALOFF
    INTO CORRESPONDING FIELDS OF TABLE it_tab4.
    Instead of selecting all the fields, pick only the fields that are required (a good performance improvement):
    SELECT /BIC/ZF31SALOFF comp_code
    FROM /BIC/PZF31SALOFF
    INTO CORRESPONDING FIELDS OF TABLE it_tab4.
    Also remove the line below; it is not required:
    MODIFY it_tab4 FROM wa_tab4.
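    Putting the suggestions together, a hedged sketch of the whole lookup. The company-code values follow the 9000-9003 requirement stated in the question (the posted code checks 9100/9200/9300, which may itself be the problem), the OBJVERS filter is an assumption, and ASSIGNING is used instead of INTO so that the change is actually written back to RESULT_PACKAGE:
      * Sketch only - assumes <result_fields> TYPE tys_TG_1, as used in the posted routine.
        DATA: it_tab4 TYPE STANDARD TABLE OF /BIC/PZF31SALOFF,
              wa_tab4 TYPE /BIC/PZF31SALOFF.

        SELECT /BIC/ZF31SALOFF comp_code
          FROM /BIC/PZF31SALOFF
          INTO CORRESPONDING FIELDS OF TABLE it_tab4
          WHERE objvers = 'A'.
        SORT it_tab4 BY /BIC/ZF31SALOFF.

        LOOP AT RESULT_PACKAGE ASSIGNING <result_fields>.
          READ TABLE it_tab4 INTO wa_tab4
            WITH KEY /BIC/ZF31SALOFF = <result_fields>-/BIC/ZF31SALOFF
            BINARY SEARCH.
          IF sy-subrc = 0.
            CASE wa_tab4-comp_code.
              WHEN '9000'. <result_fields>-salesorg = 'EAST'.
              WHEN '9001'. <result_fields>-salesorg = 'WEST'.
              WHEN '9002'. <result_fields>-salesorg = 'NORTH'.
              WHEN '9003'. <result_fields>-salesorg = 'SOUTH'.
            ENDCASE.
          ENDIF.
        ENDLOOP.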

  • End Routine is NOT modifying the DSO with new data after load into that DSO

    Hi all,
    I am creating an end routine for a DSO to populate a field ZFCMP_FLG (to store 'Y') with a lookup from another DSO, ZMDS_D01. This new field shows blank instead of 'Y' after activating the DSO. While debugging the end routine, the RESULT_PACKAGE records are populated with 'Y' for ZFCMP_FLG, so why is it not writing the modified records into the DSO? ZFCMP_FLG is a characteristic InfoObject with length 1 to store 'Y'. The following is part of the code:
    DATA: wa_fcmp_flg   TYPE c VALUE 'Y'.
    LOOP AT RESULT_PACKAGE ASSIGNING <RESULT_FIELDS>.
        READ TABLE it_zmds_d01 INTO wa_zmds_d01 WITH KEY
                    /BIC/ZAUFNR    = <RESULT_FIELDS>-CS_ORDER
                    NOTIFICATN     = <RESULT_FIELDS>-NOTIFICATN  BINARY SEARCH.
         IF sy-subrc = 0.
           <RESULT_FIELDS>-/BIC/ZFCMP_FLG = wa_fcmp_flg.
        ENDIF.
    ENDLOOP.
    Thanks,
    Venkat.

    Hi,
    since you are using a field symbol to loop over the internal table, there is no need to use a MODIFY statement in the loop, so your code is correct as it is.
    But you should check the status of the READ TABLE command in debug mode;
    it may be failing, and that is why RESULT_PACKAGE is not getting modified.
    Please check it.
    Note: you may need to SORT the internal table since you are using BINARY SEARCH; check below.
    DATA: wa_fcmp_flg   TYPE c VALUE 'Y'.
    SORT it_zmds_d01 BY /BIC/ZAUFNR NOTIFICATN.
    LOOP AT RESULT_PACKAGE ASSIGNING <RESULT_FIELDS>.
        READ TABLE it_zmds_d01 INTO wa_zmds_d01 WITH KEY
                    /BIC/ZAUFNR    = <RESULT_FIELDS>-CS_ORDER
                    NOTIFICATN     = <RESULT_FIELDS>-NOTIFICATN  BINARY SEARCH.
         IF sy-subrc = 0.
           <RESULT_FIELDS>-/BIC/ZFCMP_FLG = wa_fcmp_flg.
        ENDIF.
    ENDLOOP.

  • To populate data using end routine

    Hello,
    In the end routine I need to populate the org unit with data from the 0HRPOSITION object, which gets populated from the source field assignment.
    Could anyone help me write the code for a BW 7 end routine?
    InfoObject: 0ORGUNIT Organizational Unit.
            ORGUNIT           TYPE /BI0/OIORGUNIT,
    InfoObject: 0HRPOSITION Position.
            HRPOSITION           TYPE /BI0/OIHRPOSITION,
    Thank you
    Anima

    Hi Anima,
    check here:
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/e73bfc19-0e01-0010-23bc-ef0ad53f2fab
    Regards,
    Vijay.
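    A hedged sketch of how such a lookup is often written, assuming 0ORGUNIT is maintained as a time-independent attribute of 0HRPOSITION (if it is time-dependent, the Q table /BI0/QHRPOSITION plus a date restriction would be needed instead) and that the target structure contains HRPOSITION and ORGUNIT:
      * Sketch only - attribute modelling and target field names are assumptions.
        TYPES: BEGIN OF ty_pos,
                 hrposition TYPE /bi0/phrposition-hrposition,
                 orgunit    TYPE /bi0/phrposition-orgunit,
               END OF ty_pos.
        DATA: lt_pos TYPE STANDARD TABLE OF ty_pos,
              ls_pos TYPE ty_pos.
        FIELD-SYMBOLS: <fs_result> TYPE tys_TG_1.

        IF RESULT_PACKAGE IS NOT INITIAL.
          SELECT hrposition orgunit
            FROM /bi0/phrposition
            INTO TABLE lt_pos
            FOR ALL ENTRIES IN RESULT_PACKAGE
            WHERE hrposition = RESULT_PACKAGE-hrposition
              AND objvers    = 'A'.
          SORT lt_pos BY hrposition.
        ENDIF.

        LOOP AT RESULT_PACKAGE ASSIGNING <fs_result>.
          READ TABLE lt_pos INTO ls_pos
            WITH KEY hrposition = <fs_result>-hrposition
            BINARY SEARCH.
          IF sy-subrc = 0.
            <fs_result>-orgunit = ls_pos-orgunit.
          ENDIF.
        ENDLOOP.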

  • END ROUTINE to calculate time differnce for two corresponding dates

    Hi All,
    I have to write an END ROUTINE in which I have 2 DATE FIELDS (e.g. DATE_1, DATE_2) and corresponding TIME fields (e.g. TIME_1, TIME_2).
    I have to calculate difference between the TIME fields for the two dates and populate a TIME_DIFF field.
    Can someone please help me by posting sample code, as I am new to BI.
    Thanks in ADVANCE!!!

    End Routine:
    http://help.sap.com/saphelp_nw70/helpdata/EN/e3/732c42be6fde2ce10000000a1550b0/frameset.htm
    End Routine and Expert Routine
    Thanks..
    Shambhu
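    A hedged sketch of the calculation itself. The field names DATE_1, TIME_1, DATE_2, TIME_2 and TIME_DIFF follow the post; TIME_DIFF is assumed here to be a numeric key figure holding seconds, so adjust the unit if needed. In ABAP, subtracting two date fields (type D) gives days and subtracting two time fields (type T) gives seconds, so the difference is (DATE_2 - DATE_1) * 86400 + (TIME_2 - TIME_1):
      * Sketch only - assumes the listed fields exist in the target structure tys_TG_1.
        DATA: lv_diff_seconds TYPE i.
        FIELD-SYMBOLS: <fs_result> TYPE tys_TG_1.

        LOOP AT RESULT_PACKAGE ASSIGNING <fs_result>.
          " days between the dates times 86400 seconds, plus the time-of-day difference
          lv_diff_seconds = ( <fs_result>-date_2 - <fs_result>-date_1 ) * 86400
                          + ( <fs_result>-time_2 - <fs_result>-time_1 ).
          <fs_result>-time_diff = lv_diff_seconds.
        ENDLOOP.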

  • Updating Cube in the End Routine based on the incoming Valid From Date

    Hello Experts
    Here is my scenario.
    We are using BW 7.x. The promotions information goes into a DSO and from the DSO into the cube. These data targets have Material (Article), Plant (Site), Promotion Number, Valid From and Valid To dates. The Valid To date always comes in as 12/31/9999.
    If I receive a new record from ECC with the same Material, Plant and Promotion combination again, I need to take the Valid From date of the new record and update the existing record's Valid To date in the DSO/cube to the new record's Valid From minus 1.
    So: Valid To date of the existing record = new record's Valid From date minus 1.
    I would like to do the update in "End Routine" of the Cube.
    Would you please suggest if this can be done and if so would you please provide the sample code.
    THANK YOU in Advance.
    Nag.

    Yes, you can do this.
    Just use RESULT_PACKAGE in the end routine:
    using the keys in RESULT_PACKAGE, fetch the existing records from the DSO, make your desired changes, and append them to RESULT_PACKAGE.
    Just try to create the code using this logic; if you can't, I will provide the code.
    Regards,
    Ashish
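    A hedged sketch of that idea. Every object and field name below (the active table /BIC/AZPROMO00 and the fields MATERIAL, PLANT, /BIC/ZPROMO, DATEFROM, DATETO) is a placeholder, not taken from the thread, and the corrected rows are collected separately so the loop does not process its own output. Note also that an InfoCube is additive, so appended rows add to rather than replace the existing ones; this pattern fits the overwrite behaviour of the DSO more naturally.
      * Sketch only - all names are placeholders; for larger packages a FOR ALL ENTRIES
      * pre-read would be preferable to the SELECT SINGLE inside the loop.
        DATA: ls_old    TYPE /bic/azpromo00,
              ls_target TYPE tys_TG_1,
              lt_add    TYPE tyt_TG_1.
        FIELD-SYMBOLS: <fs_result> TYPE tys_TG_1.

        LOOP AT RESULT_PACKAGE ASSIGNING <fs_result>.
          " find the currently open record (Valid To = 12/31/9999) for the same keys
          SELECT SINGLE * FROM /bic/azpromo00 INTO ls_old
            WHERE material    = <fs_result>-material
              AND plant       = <fs_result>-plant
              AND /bic/zpromo = <fs_result>-/bic/zpromo
              AND dateto      = '99991231'.
          IF sy-subrc = 0.
            MOVE-CORRESPONDING ls_old TO ls_target.
            ls_target-dateto = <fs_result>-datefrom - 1.   "close the old interval
            APPEND ls_target TO lt_add.
          ENDIF.
        ENDLOOP.

        APPEND LINES OF lt_add TO RESULT_PACKAGE.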

  • No data updated with END ROUTINE

    Dear Team,
    I have written the following end routine in the transformation to calculate Value = Qty * Unit Price * Exchange Rate.
    Qty, Unit Price and Exchange Rate come from the source.
    Value gets updated as far as the new data table, but when I activate the request the value becomes zero.
    Please guide me in fixing this.
    Thanks & Regards,
    Sarasu

    Dear All,
    Following is the code written
    data:wa_result_package type tys_TG_1.
        loop at RESULT_PACKAGE into wa_RESULT_PACKAGE.
          wa_result_package-/BIC/K_DELAMT =  wa_result_package-EXCHG_RATE *
          wa_result_package-/BIC/K_DELVAL *
          wa_result_package-DLV_QTY.
          MODIFY RESULT_PACKAGE FROM wa_RESULT_PACKAGE TRANSPORTING
          /BIC/K_DELAMT.
        ENDLOOP.
    Thanks & Regards,
    Sarasu

  • How to delete data from a DSO using start or end routine

    Is it possible to delete records from a DSO in the start or end routine?
    My example:  I have order 123, item 10, sched line 1.  This gets loaded into the DSO.
    Next day I have order 123, item 10, sched line 2 coming in.
    I want to delete the first one (with sched line 1) from the DSO and load the second one (with sched line 2).
    Can code be put in the start or end routine to do this?

    Unless I can do it within the start routine, I don't want to have to create new records.
    In the start routine I am comparing the incoming records with what is already in the DSO. If there is a match on order and item and the schedule line is different, then I want to delete the record from the DSO and load the new one. Is there code that I can put in a start routine to accomplish this?

  • End Routine ABAP to read from Internal table and do calculation.

    Hi All...
    I have completed some coding in a start routine to extract some fields from a DSO containing master data (Stock Age) into an internal table (defined in the global declarations area), which is then read at load time in the end routine and used in a calculation as described below.
    I.E
    GLOBAL DATA DECLARATION
    Data: ITAB1 TYPE TABLE OF /BIC/DSOTAB.
    (DSOTAB has 3 fields: PLANT, STYLE and 1STDATE; 1STDATE is a date field.)
    The start routine has the following code:
    IF ITAB1 IS INITIAL.
    SELECT /BIC/PLANT /BIC/STYLE /BIC/1STDATE
                    FROM /BIC/DSOTAB
                    INTO CORRESPONDING FIELDS OF TABLE ITAB1.
    This is working fine when run under simulation i.e ITAB1 is filled no problem.
    I then need to do a calculation in the end routine.
    1. First I have to find the record in the internal table using the key of PLANT AND STYLE from the RESULT_PACKAGE.
    The code i am using now is as follows....
        READ TABLE ITAB1 TRANSPORTING NO FIELDS WITH KEY
        /BIC/PLANT = <result_fields>-/BIC/PLANT /BIC/STYLE =
        <result_fields>-/BIC/STYLE.
    Once this record has been read I then have to perform the following calculation using the following additional fields
    <result_fields>-/BIC/DYS1ST is a NUMC field in the <result_fields> that will be filled with the result of the calculation described below.
    <result_fields>-CALDAY is a date field which is already populated in the <result_fields> and which is used in the calculation below.
    The Calculation required is a difference in days between two dates
    DYS1ST = CALDAY - 1STRED.
    The code i am using is
    If sy-subrc = 0.
         <result_fields>-/BIC/DYS1ST = <result_fields>-CALDAY -
         i_t_1stred_dso-/BIC/1STRED.
    So the whole section of code inside the LOOP at RESULT PACKAGE looks like this in the end routine
           READ TABLE ITAB1 TRANSPORTING NO FIELDS WITH KEY
        /BIC/PLANT = <result_fields>-/BIC/PLANT /BIC/STYLE =
        <result_fields>-/BIC/STYLE.
    IF sy-subrc = 0.
         <result_fields>-/BIC/DYS1ST = <result_fields>-CALDAY -
         i_t_1stred_dso-/BIC/1STRED.
    I'm getting the error:
    "ITAB1" is a table without a header line and therefore has no component called "/BIC/1STRED"
    Please can someone advise what I need to do to get this fixed.
    Thanks in advance
    Stevo:)

    Hi,
    You will have to do few changes in your code as below,
    GLOBAL DATA DECLARATION
    Data: ITAB1 TYPE standard TABLE OF /BIC/DSOTAB.
    After that declare a workarea to read the values.
    DATA: i_wa_itab1 type /bic/dsotab.
    (DSOTAB has 3 fields: PLANT, STYLE and 1STDATE; 1STDATE is a date field.)
    The start routine has the following code:
    IF ITAB1 IS INITIAL.
    SELECT /BIC/PLANT /BIC/STYLE /BIC/1STDATE
    FROM /BIC/DSOTAB
    INTO CORRESPONDING FIELDS OF TABLE ITAB1.
    This is working fine when run under simulation i.e ITAB1 is filled no problem.
    I then need to do a calculation in the end routine.
    1. First I have to find the record in the internal table using the key of PLANT AND STYLE from the RESULT_PACKAGE.
    The code i am using now is as follows....
    READ TABLE ITAB1 TRANSPORTING NO FIELDS WITH KEY
    /BIC/PLANT = <result_fields>-/BIC/PLANT /BIC/STYLE =
    <result_fields>-/BIC/STYLE.
    Once this record has been read I then have to perform the following calculation using the following additional fields
    <result_fields>-/BIC/DYS1ST is a NUMC field in the <result_fields> that will be filled with the result of the calculation described below.
    <result_fields>-CALDAY is a date field which is already populated in the <result_fields> and which is used in the calculation below.
    The Calculation required is a difference in days between two dates
    DYS1ST = CALDAY - 1STRED.
    The code i am using is
    If sy-subrc = 0.
    <result_fields>-/BIC/DYS1ST = <result_fields>-CALDAY -
    i_t_1stred_dso-/BIC/1STRED.
    So the whole section of code inside the LOOP at RESULT PACKAGE looks like this in the end routine
    READ TABLE ITAB1 into i_wa_itab1 WITH KEY
    /BIC/PLANT = <result_fields>-/BIC/PLANT /BIC/STYLE =
    <result_fields>-/BIC/STYLE.
    IF sy-subrc = 0.
    <result_fields>-/BIC/DYS1ST = <result_fields>-CALDAY -
    i_wa_itab1-/BIC/1STRED.
    Once you make these changes, your code will work fine.
    Regards,
    Durgesh.

  • End routine to populate Info-cube.

    Hi,
    Is it possible to load fields of an InfoCube using end routines in the following scenarios?
    1. Loading fields of an InfoCube by referencing/using a master data table in the end routine.
    2. Loading fields of an InfoCube by referencing/using DSO fields in the end routine.
    3. Loading fields of an InfoCube by referencing/using fields of another InfoCube in the end routine.
    Please advise.

    Hi Stalin,
    Before answering your question you need to understand something about "End routine" and "Expert routine".
    End Routine:
    - Result_fields and Result_package are available
    - End routine contains only those fields available in Data target.
    Start Routine:
    - Source_fields and Source_package are available
    - Start routine contains only those fields coming from source.
    Expert Routine:
    -  Source_fields, Source_package, Result_fields and Result_package are available
    So now, if you want to write code that looks up into some other cube and the lookup needs to test conditions using source fields, then the Expert Routine is the only option.
    For example:
    my data target contains fields X, Y and Z (these become result fields);
    the source contains field A (it becomes a source field).
    Now if I want to write lookup code like "select X, Y and Z from the other cube where my field A equals the other cube's field A", I am accessing both source fields and result fields, so the only option is the Expert Routine.
    If you can write the code using only the result fields, then an End Routine is enough.
    Thanks,
    Gowd
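    For the cases that need only result fields, a hedged skeleton of an end-routine lookup against a DSO's active table. Every name below (the active table /BIC/AZMYDSO00 and the fields DOC_NUMBER and /BIC/ZSTATUS) is a hypothetical placeholder; the same pattern works against a master data P table or another provider's table:
      * Sketch only - all object and field names are placeholders.
        TYPES: BEGIN OF ty_dso,
                 doc_number TYPE /bic/azmydso00-doc_number,
                 zstatus    TYPE /bic/azmydso00-/bic/zstatus,
               END OF ty_dso.
        DATA: lt_dso TYPE STANDARD TABLE OF ty_dso,
              ls_dso TYPE ty_dso.
        FIELD-SYMBOLS: <fs_result> TYPE tys_TG_1.

        IF RESULT_PACKAGE IS NOT INITIAL.
          SELECT doc_number /bic/zstatus
            FROM /bic/azmydso00
            INTO TABLE lt_dso
            FOR ALL ENTRIES IN RESULT_PACKAGE
            WHERE doc_number = RESULT_PACKAGE-doc_number.
          SORT lt_dso BY doc_number.
        ENDIF.

        LOOP AT RESULT_PACKAGE ASSIGNING <fs_result>.
          READ TABLE lt_dso INTO ls_dso
            WITH KEY doc_number = <fs_result>-doc_number
            BINARY SEARCH.
          IF sy-subrc = 0.
            <fs_result>-/bic/zstatus = ls_dso-zstatus.
          ENDIF.
        ENDLOOP.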
