Looking up master data in an update routine

Hi all,
I am loading from a flat file into an InfoCube. My transaction data comes in the format Subcategory, Area, Date, Sales Qty, UOM, but my InfoCube should be filled with Category, Subcategory, Region, Area, Date, Sales Qty, UOM. My business requirements want me to model Category, Subcategory, Region, and Area as characteristics of a dimension, i.e. I cannot model Category as a navigational attribute of Subcategory or Region as a navigational attribute of Area.
But I can get the master data file for Subcategory in Subcategory, Category format and for Area in Area, Region format. So when I load the transaction data, I can do a table lookup on these master data tables and fill the required fields.
The problem is that I do not have much experience with ABAP routines.
Can anyone help me with a sample start/update routine showing how to do this?
Any help will be appreciated.
Regards,
Amith

Amith,
Actually, you can do everything in the start routine of the update rules (URs), including the lookup itself.
There is no need to declare an internal table in the global part of the routine and then read the values in the individual routines.
The code may look like the following. (I don't have a system at hand, so some syntax may require modification.)
TABLES: <Your Master Data Table>.
DATA: wa_temp TYPE DATA_PACKAGE_STRUCTURE OCCURS 0 WITH HEADER LINE,
      wa_md   TYPE <Your Master Data Table> OCCURS 0 WITH HEADER LINE.
* buffer the whole master data table once per data package
SELECT * FROM <Your Master Data Table> INTO TABLE wa_md.
SORT wa_md ASCENDING BY <Basic Char Key>.
LOOP AT DATA_PACKAGE INTO wa_temp.
* BINARY SEARCH is valid because the table was sorted above
  READ TABLE wa_md WITH KEY <Basic Char Key> = wa_temp-<Basic Char Key>
       BINARY SEARCH.
  IF sy-subrc = 0.
    wa_temp-<YourLookupIO> = wa_md-<YourLookupAttribute>.
    MODIFY DATA_PACKAGE FROM wa_temp.
  ENDIF.
ENDLOOP.
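Applied to your concrete case, a filled-in version might look like this. (Again just a sketch: the generated master data tables /BIC/PSUBCAT and /BIC/PAREA and the field names are assumptions; check the actual /BIC/P* table and field names in your system.)
DATA: it_subcat TYPE /BIC/PSUBCAT OCCURS 0 WITH HEADER LINE,
      it_area   TYPE /BIC/PAREA   OCCURS 0 WITH HEADER LINE,
      wa_dp     TYPE DATA_PACKAGE_STRUCTURE.
* buffer both master data tables once (active version only)
SELECT * FROM /BIC/PSUBCAT INTO TABLE it_subcat WHERE objvers = 'A'.
SELECT * FROM /BIC/PAREA   INTO TABLE it_area   WHERE objvers = 'A'.
SORT: it_subcat BY /bic/subcat, it_area BY /bic/area.
LOOP AT DATA_PACKAGE INTO wa_dp.
* derive Category from Subcategory
  READ TABLE it_subcat WITH KEY /bic/subcat = wa_dp-/bic/subcat
       BINARY SEARCH.
  IF sy-subrc = 0.
    wa_dp-/bic/categ = it_subcat-/bic/categ.
  ENDIF.
* derive Region from Area
  READ TABLE it_area WITH KEY /bic/area = wa_dp-/bic/area
       BINARY SEARCH.
  IF sy-subrc = 0.
    wa_dp-/bic/region = it_area-/bic/region.
  ENDIF.
  MODIFY DATA_PACKAGE FROM wa_dp.
ENDLOOP.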
Best regards,
Eugene

Similar Messages

  • How to look up master data in update rule

    Hi Friends,
    I want to fill a characteristic of an InfoCube using an attribute value from master data in an update rule.
    E.g., I want to fill a field in cube 0PM_C01 that comes from the master data of 0PM_ORDER. I have to do this in an update rule. Can anyone help me?
    Joe

    Eugene, Atlaj,
    your answers were very useful.
    Eugene, as you said, I read the link you mentioned, and I can use code like this:
    DATA: mat LIKE /BI0/PMATERIAL-MATERIAL.
    SELECT SINGLE MATERIAL FROM /BI0/PMATERIAL INTO mat
      WHERE EANUPC = COMM_STRUCTURE-EANUPC AND OBJVERS = 'A'.
    IF SY-SUBRC = 0.
      RESULT = mat.
      RETURNCODE = 0.
    ELSE.
      RETURNCODE = 8.
    ENDIF.
    But for each record of the communication structure the SELECT query will hit the database. As Atlaj said, can I fill an internal table in the start routine and look it up in the update rules? I think the performance would be improved. Could you give me a code sample or a link?
    Thanks in advance,
    Joe
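    A minimal sketch of the pattern being asked for, following the same EANUPC/MATERIAL example (the global table name g_t_mat is illustrative):
    * global part of the update rules
    DATA: g_t_mat TYPE TABLE OF /bi0/pmaterial,
          g_s_mat TYPE /bi0/pmaterial.
    * start routine: buffer only the EANs of this data package
    IF NOT DATA_PACKAGE[] IS INITIAL.
      SELECT material eanupc FROM /bi0/pmaterial
        INTO CORRESPONDING FIELDS OF TABLE g_t_mat
        FOR ALL ENTRIES IN DATA_PACKAGE
        WHERE eanupc = DATA_PACKAGE-eanupc
          AND objvers = 'A'.
      SORT g_t_mat BY eanupc.
    ENDIF.
    * characteristic routine: in-memory lookup instead of SELECT SINGLE
    READ TABLE g_t_mat INTO g_s_mat
         WITH KEY eanupc = COMM_STRUCTURE-eanupc BINARY SEARCH.
    IF sy-subrc = 0.
      RESULT = g_s_mat-material.
      RETURNCODE = 0.
    ELSE.
      RETURNCODE = 8.
    ENDIF.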

  • Access master data in update rules

    Hi,
    I am trying to calculate a weight in my update rules. For that I need the product weight and the number of pieces, and I try to read the product weight from master data. The problem is that the result is always calculated as zero. Is it possible to read master data in update rules?
    If I try to calculate the weight in the transfer rules instead, I get a message in the monitor that the transfer rules don't finish. Do you have an idea what to do?
    Regards,
    Gabi

    Hi Robert,
    here is my code in the update rule:
    * fill the internal table "MONITOR", to make monitor entries
      DATA: l_s_errorlog TYPE RSMONITOR.
      DATA: temp_prod_weight TYPE /BIC/PZL2_P_ID.
    * general values
      RESULT = COMM_STRUCTURE-/BIC/ZL2_POTMW.   "weight
      UNIT = COMM_STRUCTURE-UNIT_OF_WT.
      RETURNCODE = 0.
      ABORT = 0.
    * calculate weight from master data, if it is zero
      IF COMM_STRUCTURE-/BIC/ZL2_POTMW IS INITIAL.   "covers 0 as well
        SELECT SINGLE /BIC/ZL2_NETW UNIT_OF_WT
          FROM /BIC/PZL2_P_ID
          INTO CORRESPONDING FIELDS OF temp_prod_weight
          WHERE /BIC/ZL2_P_ID = COMM_STRUCTURE-/BIC/ZL2_P_ID
            AND OBJVERS = 'A'.
        IF SY-SUBRC <> 0.
          l_s_errorlog-MSGTY = 'I'.
          APPEND l_s_errorlog TO MONITOR.
          RESULT = 0.
    *     abort, if calculation is not possible
          ABORT = 1.
        ELSE.
    *     weight = net weight per piece * quantity
          RESULT = temp_prod_weight-/BIC/ZL2_NETW *
                   COMM_STRUCTURE-/BIC/ZL2_POTMQ.
          UNIT = temp_prod_weight-UNIT_OF_WT.
        ENDIF.
      ENDIF.
    * convert unit (UNIT holds the unit RESULT is currently in)
      CALL FUNCTION 'UNIT_CONVERSION_SIMPLE'
        EXPORTING
          INPUT                = RESULT
          UNIT_IN              = UNIT
          UNIT_OUT             = 'KG'
        IMPORTING
          OUTPUT               = RESULT
        EXCEPTIONS
          CONVERSION_NOT_FOUND = 1
          DIVISION_BY_ZERO     = 2
          INPUT_INVALID        = 3
          OUTPUT_INVALID       = 4
          OVERFLOW             = 5
          TYPE_INVALID         = 6
          UNITS_MISSING        = 7
          UNIT_IN_NOT_FOUND    = 8
          UNIT_OUT_NOT_FOUND   = 9
          OTHERS               = 10.
      IF SY-SUBRC <> 0.
        l_s_errorlog-MSGTY = 'I'.
        APPEND l_s_errorlog TO MONITOR.
        UNIT = COMM_STRUCTURE-UNIT_OF_WT.
        RESULT = COMM_STRUCTURE-/BIC/ZL2_POTMW.
      ELSE.
        UNIT = 'KG'.
      ENDIF.
    I'm loading via PSA, but I have problems debugging the code. If I set a breakpoint in my code (BREAK-POINT statement), the debugger doesn't stop at it.
    Thanks,
    Gabi

  • FM for master data creation/update

    Hello all,
    I would like to ask you for an FM (function module) for master data creation/update.
    Thanks all for your help.

    The answer is API_SEMBPS_CHA_VALUES_UPDATE.

  • Performance: reading huge amount of master data in end routine

    In our 7.0 system, a full load runs each day from DSO X to DSO Y, in which master data for six characteristics of DSO X is read into about 15 fields of DSO Y. DSO Y contains about 2 mln. records, which are all transferred each day. The master data tables each contain between 2 mln. and 4 mln. records. Before this load starts, DSO Y is emptied. DSO Y is write-optimized.
    At first we designed this with the standard "master data reads", but this resulted in load times of 4 hours, because all master data is read with single lookups. We redesigned it to fill all master data attributes in the end routine, after filling internal tables with the master data values corresponding to the data package:
    *   Read 0UCPREMISE into temp table
        SELECT ucpremise ucpremisty ucdele_ind
          FROM /BI0/PUCPREMISE
          INTO CORRESPONDING FIELDS OF TABLE lt_0ucpremise
          FOR ALL ENTRIES IN RESULT_PACKAGE
          WHERE ucpremise EQ RESULT_PACKAGE-ucpremise.
    *   the sort is required for the BINARY SEARCH used below
        SORT lt_0ucpremise BY ucpremise.
    And when we loop over the data package, we write something like:
        LOOP AT RESULT_PACKAGE ASSIGNING <fs_rp>.
          READ TABLE lt_0ucpremise INTO ls_0ucpremise
            WITH KEY ucpremise = <fs_rp>-ucpremise
            BINARY SEARCH.
          IF sy-subrc EQ 0.
            <fs_rp>-ucpremisty = ls_0ucpremise-ucpremisty.
            <fs_rp>-ucdele_ind = ls_0ucpremise-ucdele_ind.
          ENDIF.
    *     all other MD reads
        ENDLOOP.
    The above pattern is repeated for all the master data we need to read. This method is quite a bit faster (1,5 hrs), but we want to make it faster still. We noticed that reading the master data into the internal tables still takes a long time, and this has to be repeated for each data package. We wanted to change this, so we have now tried a similar method that loads all master data into the internal tables without filtering on the data package, and does so only once:
    *   Read 0UCPREMISE into temp table
        SELECT ucpremise ucpremisty ucdele_ind
          FROM /BI0/PUCPREMISE
          INTO CORRESPONDING FIELDS OF TABLE lt_0ucpremise.
    So when the first data package starts, it fills in all the master data values, about 95% of which we would need anyway. So that the following data packages can use the same tables and don't need to fill them again, we placed the definition of the internal tables in the global part of the end routine. In the global part we also write:
    DATA: lv_data_loaded TYPE C LENGTH 1.
    And in the method we write:
    IF lv_data_loaded IS INITIAL.
      lv_data_loaded = 'X'.   "mark as "loading"
    * load all internal tables
      lv_data_loaded = 'Y'.   "mark as "loaded"
    ENDIF.
    * wait until the first data package has filled the tables
    WHILE lv_data_loaded NE 'Y'.
      CALL FUNCTION 'ENQUEUE_SLEEP'
        EXPORTING
          seconds = 1.
    ENDWHILE.
    LOOP AT RESULT_PACKAGE ASSIGNING <fs_rp>.
    * assign all data
    ENDLOOP.
    This makes sure that another data package that has already started "sleeps" until the first data package is done filling the internal tables.
    Well, this all seems to work: it now takes 10 minutes to load everything to DSO Y. But I'm wondering if I'm missing anything. The system seems to handle loading all these records into internal tables just fine, but any improvements or critical remarks are very welcome.

    This is a great question, and you've clearly done a good job of investigating this, but there are some additional things you should look at and perhaps a few things you have missed.
    Zephania Wilder wrote:
    At first, we designed this with the standard "master data reads", but this resulted in load times of 4 hours, because all master data is read with single lookups.
    This is not accurate. After SP14, BW does a prefetch and buffers the master data values used in the lookup. Note [1092539|https://service.sap.com/sap/support/notes/1092539] discusses this in detail. The important thing, and most likely the reason you are seeing individual master data lookups on the DB, is that you must manually maintain the MD_LOOKUP_MAX_BUFFER_SIZE parameter to be larger than the number of lines of master data (from all characteristics used in lookups) that will be read. If you are seeing one select statement per line, then something is going wrong.
    You might want to go back and test with master data lookups using this setting and see how fast it goes. If memory serves, the BW master data lookup uses an approach very similar to your second example (1,5 hrs), though I think that it first loops through the source package and extracts the lists of required master data keys, which is probably faster than your statement "FOR ALL ENTRIES IN RESULT_PACKAGE" if RESULT_PACKAGE contains very many duplicate keys.
    I'm guessing you'll get down to at least the 1,5 hrs that you saw in your second example, but it is possible that it will get down quite a bit further.
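    For illustration, extracting the key list first might look something like the following sketch, reusing the 0UCPREMISE example from above (whether it actually beats FOR ALL ENTRIES directly on RESULT_PACKAGE depends on how many duplicate keys the package contains):
        DATA: lt_keys TYPE TABLE OF /bi0/pucpremise-ucpremise.
        LOOP AT RESULT_PACKAGE ASSIGNING <fs_rp>.
          APPEND <fs_rp>-ucpremise TO lt_keys.
        ENDLOOP.
        SORT lt_keys.
        DELETE ADJACENT DUPLICATES FROM lt_keys.
    *   guard against an empty key table: FOR ALL ENTRIES on an empty
    *   table would select everything
        IF NOT lt_keys[] IS INITIAL.
          SELECT ucpremise ucpremisty ucdele_ind
            FROM /bi0/pucpremise
            INTO CORRESPONDING FIELDS OF TABLE lt_0ucpremise
            FOR ALL ENTRIES IN lt_keys
            WHERE ucpremise EQ lt_keys-table_line.
        ENDIF.
        SORT lt_0ucpremise BY ucpremise.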
    Zephania Wilder wrote:
    This makes sure that another data package that already started, "sleeps" until the first data package is done with filling the internal tables.
    This sleeping approach is not necessary, as only one data package will be running at a time in any given process. I believe that the "global" internal table is not shared between parallel processes, so if your DTP is running with three parallel processes, this table will just get filled three times. Within a process, all data packages are processed serially, so all you need to do is check whether or not it has already been filled. Or are you doing something additional to export the filled lookup table into a shared memory location?
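    For example, instead of the sleep loop, a simple check at the top of the end routine should do. (A sketch, assuming the global table does persist within a process; the name gt_0ucpremise is illustrative.)
        IF gt_0ucpremise[] IS INITIAL.
    *     fill all global lookup tables; runs once per process
          SELECT ucpremise ucpremisty ucdele_ind
            FROM /bi0/pucpremise
            INTO CORRESPONDING FIELDS OF TABLE gt_0ucpremise.
          SORT gt_0ucpremise BY ucpremise.
        ENDIF.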
    Actually, you have your global data defined with the statement "DATA: lv_data_loaded TYPE C LENGTH 1.". I'm not completely sure, but I don't think that this data will persist from one data package to the next. Data defined in the global section using "DATA" is global to the package start, end, and field routines, but I believe it is discarded between packages. I think you need to use "CLASS-DATA: lv_data_loaded TYPE C LENGTH 1." to get the variables to persist between packages. Have you checked in the debugger that you are really only filling the table once per request and not once per package in your current setup? << This is incorrect - see next posting for correction.
    Otherwise the third approach is fine as long as you are comfortable managing your process memory allocations and you know the maximum size that your master data tables can have. On the other hand, if your master data tables grow regularly, then you are eventually going to run out of memory and start seeing dumps.
    Hopefully that helps out a little bit. This was a great question. If I'm off-base with my assumptions above and you can provide more information, I would be really interested in looking at it further.
    Edited by: Ethan Jewett on Feb 13, 2011 1:47 PM

  • Master data delta update to data source not possible

    Hi,
    The 0MATERIAL_ATTR DataSource delta update failed for a few days in the process chain, and when I tried to run the delta InfoPackage manually, I got this message:
    Last delta update is not yet completed
    Therefore, no new delta update is possible.
    You can start the request again
    if the last delta request is red or green in the monitor (QM activity)
    I did "manage" for this data source but none of the failed delta run showed in there. I also looked for cancelled jobs in sm37 but nothing came up either. So I ran a full upload without deleting the data as my user is waiting for this fix to be shown in the report. What can I do now to enable the delta run again without deleting the data?
    sharon

    Hi Sharon,
    Check in RSMO whether there are any red/yellow requests for the 0MATERIAL_ATTR master data. If so, change the QM status to red, go to the InfoPackage and trigger the load again; it will say that the earlier delta has failed and ask whether you want to repeat the last delta. Confirm with OK and execute the load again.
    In some cases the DataSource will not support repeat deltas; in that case do a re-init for the master data load (no need to delete any data).
    Hope this helps. If you have any queries please ask.
    Thanks!
    Dharini

  • Master Data - Flexible update

    We have master data characteristic C1 with different attributes under it. The source for this characteristic comes from different InfoPackages, and also from transaction data. We are loading the master data from R/3 into C1 using flexible update mode.
    When looking at the master data table using SE11, we can view all the data correctly. But when viewing the contents of the C1 master data from RSA1 (Maintain master data), we observed that one of the attributes is not loaded.
    Any idea what the reason could be?
    Is there any way to identify how this data was loaded, via the master data load or via the transaction data load?
    Also, how can we correct this data?

    Look into the transfer rules of the master data InfoObject and check where the attribute is getting loaded; also check in the transfer rules of the transaction data InfoSource whether it is marked to be loaded from there.
    Proceed accordingly. If you don't find it in either set of rules, it needs to be configured again, taking the data from an available DataSource.

  • Master Data not updating

    We have found that our master data texts aren't updating in APO from BW. I can see that the data is in the APO PSA, but it is not reflected in the master data object. We tried deleting the master data, but the application advises that you cannot delete master data that is being referenced.
    Are these problems related?
    Why do the master data descriptions not update?
    Please suggest a solution.

    We are pulling the description texts from BW into APO via a master data text load. I am loading the data into the PSA and then into the master data object. The PSA shows the correct description, but when I view the master data, the description is empty. We are updating 0LANGU, QSMATNR, and 0TXTLG. When I look at the InfoObject text table /BIC/TQSMATNR, there are no table entries.

  • BP Master Data gets updated when (AR/AP) Invoice is raised

    Hi All,
    I am running SAP B1 8.8 and I have discovered that whenever an AR or AP invoice is raised, it updates the business partner master data. On the master data of the particular BP for which the invoice was raised, I went in and clicked on the change log, and the date of the latest update is the same as the date the invoice was raised. I was wondering why SAP B1 8.8 does that. I am running the latest PL, which is PL 14 hotfix, but I don't think it's PL 14, as I can see from the history of the change log that entries were created even before I updated my B1 to the PL 14 hotfix.
    I have done multiple invoices with the same BP, and it seems the update only happened once, the first time an invoice was raised. I'm just curious as to why the BP master data needs to be updated.
    Many thanks.
    Cheers.
    Kevin

    Hi Rahul,
    I have checked; there is no FMS. But do you get the same pattern? When an AR/AP invoice is raised for a BP, go to that BP's master data and click on its change log under Tools. Do you see that the date of the latest change is the same as your invoice date?
    Thank you.
    Regards,
    Kevin

  • Master Data Full Update

    Hi All,
    If I load master data every day (full update), does the data overwrite the previous day's data, or does it add the new data to the existing data?

    Hi S.D,
    How are you? Yes, master data always overwrites the records for which the keys match, so you need not worry about any duplicates. In fact, even when we do delta loads for our master data, we do a full load once a week to ensure that we did not lose any data with the deltas.

  • Administration tab for WBS master data not updating automatically

    Hi PS Consultants,
    We are receiving WBS master data from the source system into our system. When I receive an IDoc related to WBS master data, the administration data is not automatically generated.
    For some records it was generated, namely when the administration data fields were present in the IDoc structure from the source system. Since these fields are system-generated, why are they not updated automatically?
    Regards,
    Raghavendra.M
    SAP-Practice.

    Hi Martina,
    Thanks a lot for your valuable inputs once again.
    About the second part, please let me know whether it is possible or not.
    Actually, for data upload our client has existing upload Z programs. Now, for example, while entering division data from the upload file, the client wants the user to avoid mass human mistakes. Hence this data should be a numerical code like 001, 002 for the different divisions like Bus, Truck, etc. When this division data gets uploaded, it should appear in the form of text, i.e. Bus, Truck.
    Can this be done? If yes, how can it be done?
    Regards
    Tushar

  • Revoke the material master data updates in production

    Hi Experts,
    I have updated the product hierarchy field for a list of materials using BAPI_MATERIAL_SAVEDATA.
    I forgot to set the SALES_VIEW flag in the BAPIMATHEAD structure, so instead it created all the views for the material. Is there any way to revert the material master data?
    Thanks & Regards,
    Anand

    ALL_CAPITAL should resolve your problem. Check this also:
    RSDMD 194: 0MATERIAL Datarecord is invalid
    /thread/307925 [original link is broken]
    Absolutley Stuck...can't load data as characteristic
    Thanks..
    Shambhu
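    For reference, the SALES_VIEW flag Anand mentions sits on the HEADDATA structure of the BAPI call; a minimal sketch (the material number is a placeholder):
    DATA: ls_head   TYPE bapimathead,
          ls_return TYPE bapiret2.
    ls_head-material   = 'MAT001'.   "placeholder material number
    ls_head-sales_view = 'X'.        "process only the sales view
    CALL FUNCTION 'BAPI_MATERIAL_SAVEDATA'
      EXPORTING
        headdata = ls_head
      IMPORTING
        return   = ls_return.
    * changes must be committed explicitly
    CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'.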

  • What tasks need to be done after master data is updated on R/3 side.

    Hi all,
    One of the end users didn't enter descriptions for a few of the 0COSTCENTER values on the R/3 side. Now that he would like to see the descriptions, he is updating the information on the R/3 side. What tasks should a BW back-end person perform in order to update the descriptions for the cost centers?
    I think they have to run the delta load for the 0COSTCENTER master data and do an ACR (attribute change run).

    Hello,
    If it's just the descriptions, then a full text load for 0COSTCENTER should be good enough to show them in the BW reports.
    Thanks

  • Loading time-dependent master data using update rules/transformations

    Hi
    I am trying to load time-dependent master data into an InfoObject. It seems that I get an error message about duplicate records if I use a transformation or update rule. Does this only work with direct update?

    In the DTP you have the option to ignore duplicate records.
    Just select that and then load the data.

  • Master Data "Simulate Update" and REAL update differences

    Hi there everyone.
    My issue is this: when I load my 0EMPLOYEE master data, the new records are not written to the InfoObject, and I don't understand why.
    When I go to the load monitor, "Details" tab > data package, then right-click and choose "Simulate Update", everything simulates perfectly, but afterwards the data in the real InfoObject is unchanged!
    I activated the data, so there is no problem there...
    Any ideas?
    thanx

    That's a little strange: if your data goes to the PSA and the simulation seems to write it correctly, then something is preventing your data from being written.
    Nevertheless, did you check your update rules (transformation) for something that does not allow the data to be written?
    I'm running out of ideas...
    Try creating another InfoPackage and running different selections for 0EMPLOYEE. Check the time intervals (time-dependent) while executing the InfoPackage (whether they are OK to bring the data).
    Diogo.
