To populate data using end routine

Hello,
In the end routine I need to populate the org unit with data from the 0HRPOSITION object, which itself gets populated from the source field assignment.
Could anyone help me write the code for a BW 7 end routine?
InfoObject: 0ORGUNIT Organizational Unit.
        ORGUNIT           TYPE /BI0/OIORGUNIT,
InfoObject: 0HRPOSITION Position.
        HRPOSITION           TYPE /BI0/OIHRPOSITION,
Thank you
Anima

Hi Anima,
Check here:
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/e73bfc19-0e01-0010-23bc-ef0ad53f2fab
Regards,
Vijay.
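
For reference, a minimal sketch of such an end routine is shown below. It assumes that 0ORGUNIT is an attribute of 0HRPOSITION (so the master data table /BI0/PHRPOSITION carries an ORGUNIT column), that both HRPOSITION and ORGUNIT are fields of the target structure, and it uses the generated <RESULT_FIELDS> field symbol as in the other examples on this page; adapt the names to your transformation.
    TYPES: BEGIN OF ty_pos,
             hrposition TYPE /bi0/oihrposition,
             orgunit    TYPE /bi0/oiorgunit,
           END OF ty_pos.
    DATA: lt_pos TYPE SORTED TABLE OF ty_pos WITH UNIQUE KEY hrposition,
          ls_pos TYPE ty_pos.
*   Read the ORGUNIT attribute for all positions in this package
    IF RESULT_PACKAGE IS NOT INITIAL.
      SELECT hrposition orgunit
        FROM /bi0/phrposition
        INTO TABLE lt_pos
        FOR ALL ENTRIES IN RESULT_PACKAGE
        WHERE hrposition = RESULT_PACKAGE-hrposition
          AND objvers    = 'A'.
    ENDIF.
*   Fill 0ORGUNIT from the lookup table
    LOOP AT RESULT_PACKAGE ASSIGNING <RESULT_FIELDS>.
      READ TABLE lt_pos INTO ls_pos
        WITH TABLE KEY hrposition = <RESULT_FIELDS>-hrposition.
      IF sy-subrc = 0.
        <RESULT_FIELDS>-orgunit = ls_pos-orgunit.
      ENDIF.
    ENDLOOP.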

Similar Messages

  • Performance: reading huge amount of master data in end routine

    In our 7.0 system, a full load runs each day from DSO X to DSO Y, in which master data from six characteristics of DSO X is read into about 15 fields in DSO Y; about 2 million records are transferred each day. The master data tables each contain between 2 and 4 million records. Before this load starts, DSO Y is emptied. DSO Y is write-optimized.
    At first, we designed this with the standard "master data reads", but this resulted in load times of 4 hours, because all master data is read with single lookups. We redesigned it to fill all master data attributes in the end routine, after filling internal tables with the master data values corresponding to the data package:
    *   Read 0UCPREMISE into temp table
        SELECT ucpremise ucpremisty ucdele_ind
          FROM /BI0/PUCPREMISE
          INTO CORRESPONDING FIELDS OF TABLE lt_0ucpremise
          FOR ALL ENTRIES IN RESULT_PACKAGE
          WHERE ucpremise EQ RESULT_PACKAGE-ucpremise.
    And when we loop over the data package, we write something like:
        LOOP AT RESULT_PACKAGE ASSIGNING <fs_rp>.
          READ TABLE lt_0ucpremise INTO ls_0ucpremise
            WITH KEY ucpremise = <fs_rp>-ucpremise
            BINARY SEARCH.
          IF sy-subrc EQ 0.
            <fs_rp>-ucpremisty = ls_0ucpremise-ucpremisty.
            <fs_rp>-ucdele_ind = ls_0ucpremise-ucdele_ind.
          ENDIF.
    *all other MD reads
    ENDLOOP.
    So the above pattern is repeated for all the master data we need to read. This method is quite a bit faster (1,5 hrs), but we want to make it faster still. We noticed that reading the master data into the internal tables still takes a long time, and this has to be repeated for each data package. We want to change this. We have now tried a similar method that loads all master data into internal tables, without filtering on the data package, and does so only once.
    *   Read 0UCPREMISE into temp table
        SELECT ucpremise ucpremisty ucdele_ind
          FROM /BI0/PUCPREMISE
          INTO CORRESPONDING FIELDS OF TABLE lt_0ucpremise.
    So when the first data package starts, it fills all master data values, 95% of which we would need anyway. So that the following data packages can use the same tables and don't need to fill them again, we placed the definition of the internal tables in the global part of the end routine. In the global part we also write:
    DATA: lv_data_loaded TYPE C LENGTH 1.
    And in the method we write:
    IF lv_data_loaded IS INITIAL.
      lv_data_loaded = 'X'.          "this process starts filling the tables
    * load all internal tables here
      lv_data_loaded = 'Y'.          "the tables are filled
    ENDIF.
    WHILE lv_data_loaded NE 'Y'.     "wait until the tables are filled
      CALL FUNCTION 'ENQUEUE_SLEEP'
        EXPORTING
          seconds = 1.
    ENDWHILE.
    LOOP AT RESULT_PACKAGE ASSIGNING <fs_rp>.
    * assign all master data attributes here
    ENDLOOP.
    This makes sure that another data package that has already started "sleeps" until the first data package is done filling the internal tables.
    Well, this all seems to work: it now takes 10 minutes to load everything to DSO Y. But I'm wondering if I'm missing anything. The system seems to handle loading all these records into internal tables just fine. Any improvements or critical remarks are very welcome.

    This is a great question, and you've clearly done a good job of investigating this, but there are some additional things you should look at and perhaps a few things you have missed.
    Zephania Wilder wrote:
    At first, we designed this with the standard "master data reads", but this resulted in load times of 4 hours, because all master data is read with single lookups.
    This is not accurate. After SP14, BW does a prefetch and buffers the master data values used in the lookup. Note [1092539|https://service.sap.com/sap/support/notes/1092539] discusses this in detail. The important thing, and most likely the reason you are seeing individual master data lookups on the DB, is that you must manually maintain the MD_LOOKUP_MAX_BUFFER_SIZE parameter to be larger than the number of lines of master data (from all characteristics used in lookups) that will be read. If you are seeing one select statement per line, then something is going wrong.
    You might want to go back and test with master data lookups using this setting and see how fast it goes. If memory serves, the BW master data lookup uses an approach very similar to your second example (1,5 hrs), though I think that it first loops through the source package and extracts the lists of required master data keys, which is probably faster than your statement "FOR ALL ENTRIES IN RESULT_PACKAGE" if RESULT_PACKAGE contains very many duplicate keys.
    I'm guessing you'll get down to at least the 1,5 hrs that you saw in your second example, but it is possible that it will get down quite a bit further.
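    As an illustration of that key-extraction idea, a sketch using the 0UCPREMISE names from the first example could look like this (lt_keys is a local helper table added here, not part of the original code):
    *   Collect the distinct premise keys of this package first ...
        TYPES: BEGIN OF ty_key,
                 ucpremise TYPE /bi0/oiucpremise,
               END OF ty_key.
        DATA: lt_keys TYPE SORTED TABLE OF ty_key WITH UNIQUE KEY ucpremise,
              ls_key  TYPE ty_key.
        LOOP AT RESULT_PACKAGE ASSIGNING <fs_rp>.
          ls_key-ucpremise = <fs_rp>-ucpremise.
          INSERT ls_key INTO TABLE lt_keys.   "duplicate keys are simply skipped
        ENDLOOP.
    *   ... then read the master data only for those distinct keys
        IF lt_keys IS NOT INITIAL.
          SELECT ucpremise ucpremisty ucdele_ind
            FROM /BI0/PUCPREMISE
            INTO CORRESPONDING FIELDS OF TABLE lt_0ucpremise
            FOR ALL ENTRIES IN lt_keys
            WHERE ucpremise EQ lt_keys-ucpremise.
        ENDIF.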
    Zephania Wilder wrote:
    This makes sure that another data package that has already started "sleeps" until the first data package is done filling the internal tables.
    This sleeping approach is not necessary, as only one data package will be running at a time in any given process. I believe that the "global" internal table is not shared between parallel processes, so if your DTP is running with three parallel processes, then this table will just get filled three times. Within a process, all data packages are processed serially, so all you need to do is check whether or not it has already been filled. Or are you doing something additional to export the filled lookup table into a shared memory location?
    Actually, you have your global data defined with the statement "DATA: lv_data_loaded TYPE C LENGTH 1.". I'm not completely sure, but I don't think that this data will persist from one data package to the next. Data defined in the global section using "DATA" is global to the package start, end, and field routines, but I believe it is discarded between packages. I think you need to use "CLASS-DATA: lv_data_loaded TYPE C LENGTH 1." to get the variables to persist between packages. Have you checked in the debugger that you are really only filling the table once per request and not once per package in your current setup? << This is incorrect - see next posting for correction.
    Otherwise the third approach is fine as long as you are comfortable managing your process memory allocations and you know the maximum size that your master data tables can have. On the other hand, if your master data tables grow regularly, then you are eventually going to run out of memory and start seeing dumps.
    Hopefully that helps out a little bit. This was a great question. If I'm off-base with my assumptions above and you can provide more information, I would be really interested in looking at it further.
    Edited by: Ethan Jewett on Feb 13, 2011 1:47 PM

  • Hi loading last 2 months data using abap routine

    We are planning to write a routine at the InfoPackage level, using option 6 (ABAP routine), to select the last 2 months of data.
    We have the time characteristic 0CALMONTH. At present we manage the load by manually changing the selection to the last 2 months every time.
    It would be a great help if anybody could show sample code to select the last 2 months of data.
    Thanks,

    Hi,
    data: l_idx like sy-tabix.
    DATA: lv_calmonth LIKE /BI0/SCALMONTH-CALMONTH.
    DATA: lv_day TYPE DATS.
    "previous month
    lv_day = SY-DATUM.
    lv_day+6(2) = '01'.
    lv_day = lv_day - 1.
    lv_calmonth = lv_day(6).
    "update the existing CALMONTH selection row with the previous month
    READ TABLE l_t_range WITH KEY fieldname = 'CALMONTH'.
    l_idx = sy-tabix.
    MOVE lv_calmonth TO l_t_range-low.
    MODIFY l_t_range INDEX l_idx.
    "previous month - 1
    lv_day+6(2) = '01'.
    lv_day = lv_day - 1.
    lv_calmonth = lv_day(6).
    "append a second selection row (it keeps FIELDNAME/SIGN/OPTION from the READ above)
    MOVE lv_calmonth TO l_t_range-low.
    APPEND l_t_range.
    p_subrc = 0.
    Let me know if this works... I'm not sure about the last APPEND.
    Olivier.

  • No data in Active table of DSO for fields populated by End Routine

    Hi,
    I have a standard DSO where we populate a few fields using an end routine.
    Last week we added 5 more fields to the DSO and wrote logic in the end routine to populate them. These new fields don't have any mapping and are populated by the end routine only.
    When I loaded the data from the DataSource to the DSO, the data was loaded correctly into the new data table of the DSO for all fields. I could see correct data as per the logic in the new table, including the old and new fields.
    However, when I activate the DSO, I cannot find any data for the new fields I added last week. The remaining fields get data as per the logic; only these five fields have no data.
    Can you please let me know if anyone has had a similar issue? I was under the impression that all the data in the new table goes to the active table when the DSO is activated.
    Your inputs are highly appreciated.
    Thanks
    Krishna

    What version of BW are you using? When editing your end routine, a pop-up should be displayed asking which fields you want populated/transferred by the end routine. This pop-up will not be displayed if you are on a lower version of BW 7.x. To get around this, make sure that your newly added fields have a transformation rule type set to constant. This will make sure that the fields get populated when transferring from the new table to the active table.

  • End routine to populate Info-cube.

    Hi ,
    Is it possible to load fields of an InfoCube using end routines in the following scenarios?
    1. Loading fields of an InfoCube by referencing/using a master data table in the end routine.
    2. Loading fields of an InfoCube by referencing/using DSO fields in the end routine.
    3. Loading fields of an InfoCube by referencing/using fields of another InfoCube in the end routine.
    Please advise.

    Hi Stalin,
    Before answering your question, you need to understand something about the end routine and the expert routine.
    End routine:
    - RESULT_FIELDS and RESULT_PACKAGE are available.
    - The end routine contains only those fields available in the data target.
    Start routine:
    - SOURCE_FIELDS and SOURCE_PACKAGE are available.
    - The start routine contains only those fields coming from the source.
    Expert routine:
    - SOURCE_FIELDS, SOURCE_PACKAGE, RESULT_FIELDS and RESULT_PACKAGE are available.
    So if you want to write code to look up into some other cube, and the lookup needs to test a condition using source fields, then the expert routine is the only option.
    For example:
    My data target contains fields X, Y and Z (they become the result fields).
    The source contains field A (it becomes the source field).
    Now if I want to write lookup code like "select X, Y and Z from the other cube where my A field value = the other cube's A field value", I am accessing both the source fields and the result fields, so the only option is the expert routine.
    If you only need to write code with the result fields, then the end routine is enough.
    Thanks,
    Gowd

  • BI end routine at transformation to populate info object by vlookup attribu

    Hi ,
    I am an APO consultant working on BI routines, and I have the following situation; I need some guidance on the ABAP code (routine).
    We receive sales info from the markets as flat files and load them into cubes. One of the fields in the file is External Sales Group, ZEXSGRP. This is an attribute of the Sales Group InfoObject ZSLSGRP.
    The external sales group is populated in the file when we upload it into the cube. I want to use an end routine to look up the master data of InfoObject ZSLSGRP (all the external sales groups) and use the matching value to write the Sales Group (ZSLSGRP). Example: if ZSLSGRP is NAM and its ZEXSGRP attribute has the value N0000032, and the file contains N0000032, then the end routine should look at the ZEXSGRP attribute of all ZSLSGRP values, match the value, and populate NAM.
    Hope I am clear - can anyone help with this?
    Thanks,
    Varma

    Replace your select statement:
    SELECT *
    FROM /BIC/PZF31SALOFF
    INTO CORRESPONDING FIELDS OF TABLE it_tab4.
    Instead of selecting all the fields, pick only the fields that are required (a good performance improvement):
    SELECT /BIC/ZF31SALOFF comp_code
    FROM /BIC/PZF31SALOFF
    INTO CORRESPONDING FIELDS OF TABLE it_tab4.
    Remove the line below; it is not required:
    MODIFY it_tab4 FROM wa_tab4.
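    For the lookup the question actually asks about (deriving ZSLSGRP from its ZEXSGRP attribute), a sketch of the end-routine logic might look like the following. The master data table name /BIC/PZSLSGRP and the field names are guesses based on the usual naming conventions and need to be adapted:
        TYPES: BEGIN OF ty_sls,
                 exsgrp TYPE /bic/oizexsgrp,
                 slsgrp TYPE /bic/oizslsgrp,
               END OF ty_sls.
        DATA: lt_sls TYPE SORTED TABLE OF ty_sls WITH NON-UNIQUE KEY exsgrp,
              ls_sls TYPE ty_sls.
    *   Build a reverse lookup: external sales group -> sales group
        SELECT /bic/zexsgrp /bic/zslsgrp
          FROM /bic/pzslsgrp
          INTO TABLE lt_sls
          WHERE objvers = 'A'.
        LOOP AT RESULT_PACKAGE ASSIGNING <RESULT_FIELDS>.
          READ TABLE lt_sls INTO ls_sls
            WITH KEY exsgrp = <RESULT_FIELDS>-/bic/zexsgrp.
          IF sy-subrc = 0.
            <RESULT_FIELDS>-/bic/zslsgrp = ls_sls-slsgrp.
          ENDIF.
        ENDLOOP.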

  • What is the use of end routine in bi 7.0

    hi friends,
    What is the use of the end routine in BI 7.0? In what scenario do we use an end routine?
    Thank you,
    suneel.

    hi Suneel,
    check
    http://help.sap.com/saphelp_nw70/helpdata/en/e3/732c42be6fde2ce10000000a1550b0/frameset.htm
    End Routine
    An end routine is a routine with a table in the target structure format as input and output parameters. You can use an end routine to postprocess data after transformation on a package-by-package basis. For example, you can delete records that are not to be updated, or perform data checks.
    If the target of the transformation is a DataStore object, key figures are updated by default with the aggregation behavior Overwrite (MOVE). You have to use a dummy rule to override this.
    hope this helps.
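    As a concrete illustration of the "delete records" use case mentioned above, an end routine can drop rows from the package with a single statement (a sketch; the field name and value are made up):
    *   Drop records that should not be updated, e.g. everything except company code 1000
        DELETE RESULT_PACKAGE WHERE comp_code <> '1000'.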

  • Start & End Routines in BI 7  Transformations

    Hi,
    In the transformation from DSO1 --> DSO2:
    In the start routine, for all entries in the source package, I read some fields from DSO3 and filled an internal table.
    In the end routine I read the internal table and filled the result package fields.
    In the mapping I haven't mapped anything to the fields which I intended to fill using the routines.
    When I executed the data load, those fields were not populated with any value.
    But if I debug the transformation, the results are updated in all fields of the result package.
    Do I need to make any setting or mapping for the fields which I want to update using the end routine?
    Thanks

    Hi,
    For Support Package 16 and above you get one more button next to "End Routine" (once the end routine is created).
    This button controls the update behaviour of fields in end routines. You get two options once you select this button, and you have to pick one of them, as the selection is mandatory.
    The default setting is that only the fields with active rules are updated in the transformation. With this selection, fields populated in the end routine won't be updated in the data target if no active rule exists for them in the transformation.
    Alternatively, you can define that all the fields should always be updated by selecting the second radio button. As a result, fields filled in the end routine are not lost if there is no other active rule.
    So in your case, if you are on SP 15 or lower, you will have to map the fields.
    Go through this article it gives the above explanation along with screenshots.
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/30d35342-1fe3-2c10-70ba-ad0da83d23bd
    Hope this helps.
    Thanks,
    Rahul

  • Using the end routine to populate the Cubes

    Hi BI Gurus,
    I am having following requirement:
    DSO ZODS1 --> gets all the raw data from the source system.
    DSO ZODS4 --> updates ZCUBE4.
    DSO ZODS5 --> updates ZCUBE5.
    There is no data flow between ZODS1 and ZODS4, nor between ZODS1 and ZODS5.
    I have added the same new fields to ZODS1, ZCUBE4 and ZCUBE5.
    So I want to populate these fields of ZCUBE4 and ZCUBE5. What is the best possible way to do that without changing the ZODS4/ZODS5 structures?
    I am thinking of writing an end routine. Is that possible?
    Also, if possible, could somebody provide sample code?
    Please help.

    Hi,
    You can populate your new fields easily through an end routine in the transformation from ZODS4 to ZCUBE4,
    provided ZODS4 has a sufficient key, and the same key exists in ZODS1, to look up the information for the newly added fields in the cube.
    Please give your fields and some sample data, then it will be clearer,
    but it can be achieved if you have the necessary key fields in both DSOs.
    Regards
    vamsi
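    As an illustration of the approach described above, an end routine in the ZODS4 --> ZCUBE4 transformation could read the new fields from the active table of ZODS1 (by naming convention /BIC/AZODS100). The shared key field and the new field below are invented placeholders and must be replaced by the real names:
        TYPES: BEGIN OF ty_ods1,
                 doc_number TYPE /bi0/oidoc_number,   "assumed shared key field
                 znewfld    TYPE char10,              "placeholder for a newly added field
               END OF ty_ods1.
        DATA: lt_ods1 TYPE SORTED TABLE OF ty_ods1 WITH NON-UNIQUE KEY doc_number,
              ls_ods1 TYPE ty_ods1.
        IF RESULT_PACKAGE IS NOT INITIAL.
          SELECT doc_number /bic/znewfld
            FROM /bic/azods100
            INTO TABLE lt_ods1
            FOR ALL ENTRIES IN RESULT_PACKAGE
            WHERE doc_number = RESULT_PACKAGE-doc_number.
        ENDIF.
        LOOP AT RESULT_PACKAGE ASSIGNING <RESULT_FIELDS>.
          READ TABLE lt_ods1 INTO ls_ods1
            WITH KEY doc_number = <RESULT_FIELDS>-doc_number.
          IF sy-subrc = 0.
            <RESULT_FIELDS>-/bic/znewfld = ls_ods1-znewfld.
          ENDIF.
        ENDLOOP.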

  • How to delete data from a DSO using start or end routine

    Is it possible to delete records from a DSO in the start or end routine?
    My example:  I have order 123, item 10, sched line 1.  This gets loaded into the DSO.
    Next day I have order 123, item 10, sched line 2 coming in.
    I want to delete the first one (with sched line 1) from the DSO and load the second one (with sched line 2).
    Can code be put in the start or end routine to do this?

    Unless I can do it within the start routine, I don't want to have to create new records.
    In the start routine I am comparing incoming records with what is already in the DSO. If there is a match on order and item and the sched line is different, then I want to delete the record from the DSO and load the new one. Is there code that I can put in a start routine to accomplish this?

  • Filling Data fields of a DSO in End Routine

    Hi Everyone,
    The data fields of a DSO contains 2 key figures and a characteristic.
    In the end routine of the transformation, I have assigned constant values to the InfoObjects in the data fields.
    After executing the DTP, if I check the new table of the DSO, these constant values are present. But when I activate the DSO, the values for the key figures get initialised and the value for the characteristic becomes empty (NULL).
    Is it not possible to assign values to the InfoObjects in the data fields? If so, why this limitation?
    Thanks in advance,
    Uma

    Uma,
    To populate any field in the end routine, you have to assign some constant to it in the transformation first and then re-populate it using the end routine.
    Sometimes, if you don't assign any constant in the transformation, the values remain initial, and even after you write code for that field it is not populated by the end routine.
    All you have to do is assign constant 0 to the key figures you are populating in the end routine and run the DTP again.
    Thanks
    Sachin

  • End Routine is NOT modifying the DSO with new data after load into that DSO

    Hi all,
      I am creating an end routine for a DSO to populate a field ZFCMP_FLG (to store 'Y') with a lookup from another DSO, ZMDS_D01. This new field shows up blank instead of 'Y' after activating the DSO. While debugging the end routine I can see the RESULT_PACKAGE records populated with 'Y' for ZFCMP_FLG, so why is it NOT writing the modified records into the DSO? It is a characteristic InfoObject with length 1 that stores 'Y'. The following is part of the code:
    DATA: wa_fcmp_flag   TYPE c VALUE 'Y'.
    LOOP AT RESULT_PACKAGE ASSIGNING <RESULT_FIELDS>.
        READ TABLE it_zmds_d01 INTO wa_zmds_d01 WITH KEY
                    /BIC/ZAUFNR    = <RESULT_FIELDS>-CS_ORDER
                    NOTIFICATN     = <RESULT_FIELDS>-NOTIFICATN  BINARY SEARCH.
         IF sy-subrc = 0.
        <RESULT_FIELDS>-/BIC/ZFCMP_FLG = wa_fcmp_flag.
        ENDIF.
    ENDLOOP.
    Thanks,
    Venkat.

    Hi,
    Since you are using a field symbol to loop over the internal table, there is no need to use a MODIFY statement in the loop,
    so your code is correct as such.
    But you have to check the status of the READ TABLE command in debug mode;
    it may be failing, and that is why RESULT_PACKAGE is not getting modified.
    Please check it.
    Note: you may need to SORT the internal table, since you are using BINARY SEARCH. See below.
    DATA: wa_fcmp_flag   TYPE c VALUE 'Y'.
    SORT it_zmds_d01 BY /BIC/ZAUFNR NOTIFICATN.
    LOOP AT RESULT_PACKAGE ASSIGNING <RESULT_FIELDS>.
        READ TABLE it_zmds_d01 INTO wa_zmds_d01 WITH KEY
                    /BIC/ZAUFNR    = <RESULT_FIELDS>-CS_ORDER
                    NOTIFICATN     = <RESULT_FIELDS>-NOTIFICATN  BINARY SEARCH.
         IF sy-subrc = 0.
        <RESULT_FIELDS>-/BIC/ZFCMP_FLG = wa_fcmp_flag.
        ENDIF.
    ENDLOOP.

  • End Routine Issue - It does not move data from E_T_RESULT to RESULT_PACKAGE

    Hi,
    I am facing an issue with an end routine. I have gone through previous posts on how to write end routines, and I wrote the end routine accordingly.
    Here is my scenario:
    I have 0CUST_SALES master data, which has all the Sales Org, Distribution Channel, Division, Sold-to Party, Sales Grp and Sales Dist values.
    I am getting Sold-to Party and Distribution Channel from a field routine.
    I am using Sold-to Party, Distribution Channel and Division = '01' (whatever I populated using the field routine) and trying to get the Sales Org, Sales Grp and Sales Dist in the end routine.
    All the code I wrote seems correct, but it does not populate any values into RESULT_PACKAGE.
    Here is the code I wrote in the end routine. I am not sure what's wrong with it. I used this link to write the routine:
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/203eb778-461d-2c10-60b3-8a94ee91cbfc&overridelayout=true
    Global Declaration----
      DATA : BEGIN OF IT_CUST_SALES,
        DIV TYPE /bi0/pcust_sales-DIVISION,
        DIST_CH TYPE /bi0/pcust_sales-DISTR_CHAN,
        SALES_ORG TYPE /bi0/pcust_sales-SALESORG,
        CUST_SAL TYPE /bi0/pcust_sales-CUST_SALES,
        SALESDIST TYPE /bi0/pcust_sales-SALES_DIST,
        SALESGRP TYPE /bi0/pcust_sales-SALES_GRP,
        END OF IT_CUST_SALES.
    DATA: T_CUST_SALES LIKE TABLE OF IT_CUST_SALES.
    Start of End Routine
       SELECT DIVISION DISTR_CHAN SALESORG CUST_SALES SALES_DIST SALES_GRP
        from
        /bi0/pcust_sales INTO TABLE T_CUST_SALES for all entries in
        RESULT_PACKAGE
        where CUST_SALES = RESULT_PACKAGE-SOLD_TO
         AND DISTR_CHAN = RESULT_PACKAGE-DISTR_CHAN
         AND DIVISION = '01'.
        LOOP AT RESULT_PACKAGE INTO e_s_result.
          READ TABLE T_CUST_SALES INTO IT_CUST_SALES
              WITH KEY CUST_SAL = e_s_result-SOLD_TO
                 DIST_CH = e_s_result-DISTR_CHAN
                 DIV = '01'.
          IF SY-SUBRC EQ 0 .
            MOVE IT_CUST_SALES-SALES_ORG TO E_S_RESULT-SALESORG.
            MOVE IT_CUST_SALES-SALESDIST TO E_S_RESULT-SALES_DIST.
            MOVE IT_CUST_SALES-SALESGRP TO E_S_RESULT-SALES_GRP.
            APPEND E_S_RESULT  TO  E_T_RESULT .
          ENDIF.
        ENDLOOP.
        REFRESH RESULT_PACKAGE.
        MOVE E_T_RESULT[] TO RESULT_PACKAGE[] .
    End End Routine
    Data comes into E_T_RESULT but it does not move to RESULT_PACKAGE. Any inputs will be helpful.
    Regards,
    Kumar

    Hi Hegde,
    The declaration is the same; it's like this:
       DATA: e_s_result TYPE tys_TG_1.
       DATA: e_t_result TYPE tyt_TG_1.
    I don't know - when I inserted this code in the post it initially looked OK, but once posted I saw that it's not that readable.
    FYI, I am putting the code in again; let's see if it works better.
      SELECT DIVISION DISTR_CHAN SALESORG CUST_SALES SALES_DIST SALES_GRP
        from    /bi0/pcust_sales INTO TABLE T_CUST_SALES for all entries in
        RESULT_PACKAGE   where CUST_SALES = RESULT_PACKAGE-SOLD_TO
         AND DISTR_CHAN = RESULT_PACKAGE-DISTR_CHAN
         AND DIVISION = '01'.
        LOOP AT RESULT_PACKAGE INTO e_s_result.
          READ TABLE T_CUST_SALES INTO IT_CUST_SALES
              WITH KEY CUST_SAL = e_s_result-SOLD_TO
                 DIST_CH = e_s_result-DISTR_CHAN
                 DIV = '01'.
          IF SY-SUBRC EQ 0 .
            MOVE IT_CUST_SALES-SALES_ORG TO E_S_RESULT-SALESORG.
            MOVE IT_CUST_SALES-SALESDIST TO E_S_RESULT-SALES_DIST.
            MOVE IT_CUST_SALES-SALESGRP TO E_S_RESULT-SALES_GRP.
            APPEND E_S_RESULT  TO  E_T_RESULT .
          ENDIF.
        ENDLOOP.
        REFRESH RESULT_PACKAGE.
        MOVE E_T_RESULT[] TO RESULT_PACKAGE[] .
    Regards,
    Kumar
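    (For comparison: the same lookup can also modify RESULT_PACKAGE in place with the generated field symbol, which avoids copying rows to E_T_RESULT and back and keeps the rows that have no match in the lookup table. A sketch using the same declarations as above:)
        LOOP AT RESULT_PACKAGE ASSIGNING <RESULT_FIELDS>.
          READ TABLE T_CUST_SALES INTO IT_CUST_SALES
              WITH KEY CUST_SAL = <RESULT_FIELDS>-SOLD_TO
                       DIST_CH  = <RESULT_FIELDS>-DISTR_CHAN
                       DIV      = '01'.
          IF sy-subrc = 0.
            <RESULT_FIELDS>-SALESORG   = IT_CUST_SALES-SALES_ORG.
            <RESULT_FIELDS>-SALES_DIST = IT_CUST_SALES-SALESDIST.
            <RESULT_FIELDS>-SALES_GRP  = IT_CUST_SALES-SALESGRP.
          ENDIF.
        ENDLOOP.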

  • End Routine to populate custom AFS field for Billing Item extractor

    I have the following scenario for the billing DataSource 2LIS_13_VDITM:
    Doc_No   Item  Material    Item_Categ   AFS_field
    2000        11     XYZ          123
    2000        12     XYZ          123                  US1
    3000        11     PQR          456                 
    3000        12     PQR          456                  CA1
    I need to populate AFS_field in the first row as well, for all document numbers with certain item categories (e.g. 123 and 456). I need to do this in the first transformation, from PSA to DSO. I was planning to write a SELECT statement in the start routine with the PSA table as the reference, as seen below, and then write an end routine using this reference table. But the PSA table name is different in Dev, QA and Prod, so I cannot use this.
    TYPES: BEGIN OF S_AFS,
    VBELN TYPE C LENGTH 10,
    PSTYV TYPE C LENGTH 4,
    MATNR TYPE C LENGTH 18,
    J_4KRCAT TYPE C LENGTH 16,
    J_4KSCAT TYPE C LENGTH 16,
    END OF S_AFS.
    DATA: LT_AFS TYPE STANDARD TABLE OF S_AFS,
          LS_AFS TYPE S_AFS.
    DATA: LT_DATA TYPE tyt_SC_1.
    LT_DATA[] = SOURCE_PACKAGE[].
    SORT LT_DATA DESCENDING BY VBELN MATNR J_4KRCAT.
    DELETE ADJACENT DUPLICATES FROM LT_DATA COMPARING VBELN MATNR.
    SELECT VBELN MATNR J_4KRCAT J_4KSCAT
    INTO CORRESPONDING FIELDS OF TABLE LT_AFS
    FROM /BIC/B0000777001
    FOR ALL ENTRIES IN LT_DATA WHERE VBELN = LT_DATA-VBELN AND MATNR =
    LT_DATA-MATNR.
    SORT LT_AFS BY VBELN MATNR.
    I am not an expert in writing ABAP code. Any suggestions are greatly appreciated and points will be assigned. Thank you.

    Hi,
    Why do you want to select the data from the PSA table? You are already loading the data from the PSA to the DSO.
    You want to populate the AFS field value in the DSO, right?
    1. First, add the AFS field to your DSO.
    2. Write the start routine for your scenario (select the data based on your scenario and put it into an internal table).
    3. Write an AFS field-level transfer routine (read the corresponding record based on document number and item number) and pass the value into RESULT.
    Regards,
    GR
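    A sketch of step 3 (the field-level routine for the AFS field) might look like the following. It assumes the start routine filled a global table gt_afs with the S_AFS structure from the question, sorted by VBELN and MATNR (the question's own lookup keys), and that the value to return is J_4KRCAT; all of these names are assumptions:
    *   gt_afs and S_AFS are assumed to be declared in the global part of the routine
        DATA: ls_afs TYPE s_afs.
        READ TABLE gt_afs INTO ls_afs
          WITH KEY vbeln = SOURCE_FIELDS-vbeln
                   matnr = SOURCE_FIELDS-matnr
          BINARY SEARCH.
        IF sy-subrc = 0.
          RESULT = ls_afs-j_4krcat.   "pass the looked-up AFS value to the target field
        ENDIF.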

  • End Routine - populating Target Field based on Master Data

    Hi,
    I have an issue with my End Routine in BI 7.0. The scenario is as follows....
    The target fields ZSALES_OFFICE, 0SALES_CHANNEL etc. are mapped 1:1 from their respective source fields. In addition to these target fields I have a target field 0SALESORG, which I need to populate based on the values of 0COMP_CODE, an attribute of ZSALES_OFFICE. The values of 0COMP_CODE are 9000, 9001, 9002 and 9003. The end routine condition needs to be implemented as follows:
    For every 0COMP_CODE with value 9000, 0SALESORG should be populated with the value "EAST". Similarly, for every 0COMP_CODE with value 9001, 0SALESORG should be populated with "WEST", for value 9002 with "NORTH", and for value 9003 with "SOUTH". I tried the following code but it doesn't seem to work. Could you please help?
    Thanks,
    SD
    DATA: it_tab4 TYPE TABLE OF /BIC/PZF31SALOFF,
              wa_tab4 TYPE /BIC/PZF31SALOFF.
        SELECT *
          FROM /BIC/PZF31SALOFF
        INTO CORRESPONDING FIELDS OF TABLE it_tab4.
        sort it_tab4 by /BIC/ZF31SALOFF.
        LOOP AT RESULT_PACKAGE
          ASSIGNING <result_fields>.
          read table it_tab4
          with key /BIC/ZF31SALOFF = <result_fields>-/BIC/ZF31SALOFF
          into wa_tab4
          binary search.
          if sy-subrc eq 0.
            CASE wa_tab4-comp_code.
              WHEN '9000'.
                <result_fields>-salesorg = 'EAST'.
              WHEN '9100'.
                <result_fields>-salesorg = 'WEST'.
              WHEN '9200'.
                <result_fields>-salesorg = 'NORTH'.
              WHEN '9300'.
                <result_fields>-salesorg = 'SOUTH'.
              MODIFY it_tab4 FROM wa_tab4.
            ENDCASE.
          endif.
        ENDLOOP.

    Replace your select statement:
    SELECT *
    FROM /BIC/PZF31SALOFF
    INTO CORRESPONDING FIELDS OF TABLE it_tab4.
    Instead of selecting all the fields, pick only the fields that are required (a good performance improvement):
    SELECT /BIC/ZF31SALOFF comp_code
    FROM /BIC/PZF31SALOFF
    INTO CORRESPONDING FIELDS OF TABLE it_tab4.
    Remove the line below; it is not required:
    MODIFY it_tab4 FROM wa_tab4.
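    Putting the two suggestions together, the adjusted routine could look roughly like this (the comp_code values 9000-9003 are taken from the question's description; the original code checked 9100/9200/9300, so the literals need to match whatever the attribute really contains):
        DATA: it_tab4 TYPE TABLE OF /BIC/PZF31SALOFF,
              wa_tab4 TYPE /BIC/PZF31SALOFF.
    *   pick only the fields that are needed
        SELECT /bic/zf31saloff comp_code
          FROM /bic/pzf31saloff
          INTO CORRESPONDING FIELDS OF TABLE it_tab4.
        SORT it_tab4 BY /bic/zf31saloff.
        LOOP AT RESULT_PACKAGE ASSIGNING <result_fields>.
          READ TABLE it_tab4 INTO wa_tab4
            WITH KEY /bic/zf31saloff = <result_fields>-/bic/zf31saloff
            BINARY SEARCH.
          IF sy-subrc = 0.
            CASE wa_tab4-comp_code.
              WHEN '9000'. <result_fields>-salesorg = 'EAST'.
              WHEN '9001'. <result_fields>-salesorg = 'WEST'.
              WHEN '9002'. <result_fields>-salesorg = 'NORTH'.
              WHEN '9003'. <result_fields>-salesorg = 'SOUTH'.
            ENDCASE.
    *       the MODIFY it_tab4 FROM wa_tab4 line is not needed and has been dropped
          ENDIF.
        ENDLOOP.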
