ABAP in Transfer Structure to Look Up Master Data

Hello Friends,
I need help with a piece of ABAP code...
I have an InfoObject 'Employee' which is mapped to source system 'A' in the transfer structure. But now I want it to look into my employee master data and get populated from there.
Please suggest the ABAP code for this requirement.
Thanks a lot!!!

Hello SAP_newbee
Here is the code for the master data lookup:
DATA: L_TGTFIELD       LIKE RESULT,
      L_TGTFIELD_CHVAL LIKE COMM_STRUCTURE-XXXXX.

L_TGTFIELD_CHVAL = COMM_STRUCTURE-XXXXX.         "XXXXX is the name of the master data IO
CALL FUNCTION 'RSAU_READ_MASTER_DATA'
  EXPORTING
    I_IOBJNM                = 'XXXXX'            " master data IO
    I_CHAVL                 = L_TGTFIELD_CHVAL   " characteristic value whose attribute you want to read
    I_ATTRNM                = 'YYYYY'            " attribute name (InfoObject technical name)
  IMPORTING
    E_ATTRVAL               = L_TGTFIELD
  EXCEPTIONS
    READ_ERROR              = 1
    NO_SUCH_ATTRIBUTE       = 2
    WRONG_IMPORT_PARAMETERS = 3
    CHAVL_NOT_FOUND         = 4
    OTHERS                  = 5.

CASE SY-SUBRC.
  WHEN 1 OR 2 OR 3 OR 5.
*   Error during read --> skip the whole package
    ABORT = 4.
*   Copy the error message into the error log
    MOVE-CORRESPONDING SY TO MONITOR.
    APPEND MONITOR.
  WHEN 0 OR 4.
*   Attribute found successfully, or value not found
    RESULT = L_TGTFIELD.
    ABORT  = 0.
ENDCASE.
Thanks
Tripple k

Similar Messages

  • LookUp for Master Data

    Hello All,
    I have an InfoObject, say 0MATERIAL, which has two navigational attributes, say MATERIAL1 and MATERIAL2.
    These are part of the master data. My objective is to have MATERIAL1 and MATERIAL2 in my cube so that I can report on them directly without having to drill down.
    I have 0MATERIAL, MATERIAL1 and MATERIAL2 in my cube, but the twist is that 0MATERIAL is mapped to 0PLANT (say). Is it possible to avoid the lookup, or is it mandatory? I have tried the "Master Data Attribute of" option in the update rules; this doesn't seem to work somehow.
    If a lookup is mandatory, can someone help me with the ABAP part of the scenario?
    I assure full points.

    Just to be sure...
    1.  0MATERIAL has no Compounded attributes in the infoobject definition
    2.  MATERIAL1 and MATERIAL2 have no Compounded attributes either
    3.  MATERIAL1 and MATERIAL2 are attributes of 0MATERIAL
    4.  0MATERIAL, MATERIAL1, and MATERIAL2 are all characteristics of your cube
    Yes?
    If this is so, then in your update rules for MATERIAL1 (and MATERIAL2), you should be able to map it as "Master data Attribute of" 0MATERIAL.
    0MAT_PLANT should not factor into the equation just because 0MATERIAL is an attribute.
    Does this help?
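    If the "Master data Attribute of" mapping really cannot be used, a routine in the update rules is the usual fallback. Below is a minimal sketch for MATERIAL1 (MATERIAL2 works the same way), assuming 0MATERIAL arrives in COMM_STRUCTURE-MATERIAL and that the attribute is stored as column /BIC/MATERIAL1 of the attribute table /BI0/PMATERIAL; adapt the names to your system.
    DATA: L_MATERIAL1 TYPE /BI0/PMATERIAL-/BIC/MATERIAL1.
    " Read the active attribute value for the incoming material
    SELECT SINGLE /BIC/MATERIAL1 FROM /BI0/PMATERIAL
           INTO L_MATERIAL1
           WHERE MATERIAL = COMM_STRUCTURE-MATERIAL
             AND OBJVERS  = 'A'.
    IF SY-SUBRC = 0.
      RESULT = L_MATERIAL1.
    ENDIF.
    RETURNCODE = 0.   " keep the record
    ABORT      = 0.   " do not abort the package
    A SELECT on the P table per record is fine for small loads; for large loads, buffer the table in the start routine instead.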

  • ABAP code not working - deleting based on master data table information

    Hi,
    I wrote a piece of code earlier which works, but during testing we found that it will be hard for the support team to maintain because it is hard-coded, and there is a possibility that users will add more code numbers in the future.
    Sample code:
    DELETE it_source WHERE /M/SOURCE  EQ 'USA' AND
                           /M/CODENUM NE '0999' AND
                           /M/CODENUM NE '0888'.
    Now I created a new InfoObject (master data) so that the support people can maintain the source and code number manually.
    Master data table (the code number is the key):
    XCODENUM    XSOURCE
    0999               IND01
    0888               IND01
    Now I wrote this routine, but all the data gets deleted:
    tables /M/PGICTABLE.
    Data tab like /M/PGICTABLE occurs 0 with header line.
    Select * from /M/PGICTABLE into table tab where objvers = 'A'.
    if sy-subrc = 0.
    LOOP at tab.
    DELETE it_source WHERE /M/SOURCE EQ tab-XSOURCE AND /M/CODENUM NE tab-XCODENUM.
    ENDLOOP.
    Endif.
    But when I change the operator to EQ, I get the opposite set of rows, not what I require.
    DELETE it_source WHERE /M/SOURCE EQ tab-XSOURCE AND /M/CODENUM EQ tab-XCODENUM.
    Cube table that I want to extract from
    /M/SOURCE                             /M/CODENUM
    IND01                                       0999
    IND01                                       0888
    IND01                                       0555
    IND01                                       0444
    FRF01                                      0111
    I want to keep only the rows where /M/CODENUM = 0999 or 0888, and I also need the FRF01 row;
    the remaining rows (IND01 with 0555 and 0444) should be deleted.
    thanks

    It's obvious why it deletes all the records - debug it and you will see (I won't spoon-feed).
    Anyway, to achieve your requirement, try this code:
    DATA:
      r_srce TYPE RANGE OF /M/PGICTABLE-XSOURCE,  "Range table for Source
      s_srce LIKE LINE OF r_srce,
      r_code TYPE RANGE OF /M/PGICTABLE-XCODENUM, "Range table for Code
      s_code LIKE LINE OF r_code,
      itab   TYPE STANDARD TABLE OF /M/PGICTABLE,
      wa     TYPE /M/PGICTABLE.
    s_srce-sign = s_code-sign = 'I'.
    s_srce-option = s_code-option = 'EQ'.
    " Populate the range tables from the master data table /M/PGICTABLE
    SELECT * FROM /M/PGICTABLE INTO TABLE itab WHERE objvers = 'A'.
    LOOP AT itab INTO wa.
      s_code-low = wa-XCODENUM.
      s_srce-low = wa-XSOURCE.
      APPEND: s_code TO r_code,
              s_srce TO r_srce.
    ENDLOOP.
    SORT: r_code, r_srce.
    DELETE ADJACENT DUPLICATES FROM:
      r_code COMPARING ALL FIELDS,
      r_srce COMPARING ALL FIELDS.
    " Delete from the cube data: drop every row whose source is maintained
    " but whose code is not in the allowed list (keeps 0999/0888 for IND01
    " and leaves other sources such as FRF01 untouched)
    DELETE it_source WHERE /M/SOURCE IN r_srce AND NOT /M/CODENUM IN r_code.

  • How to transfer changes in my Vendor Master Data ? (currency)

    Hello everybody,
    I replicated my vendors with bbpgetvd and everything was fine. Now my vendor master data has changed; some vendors were modified in their currency field.
    If I run the transaction "bbpgetvd" again, the program doesn't identify these changes =(
    What can I do to update this data?
    Thanks and regards,
    Diego

    Hi guys,
    Thanks so much for your help! I tried bbpupdvd and it worked fine!
    Thank you!!!
    Regards from Mexico,
    Diego

  • ABAP logic not fetching multiple rows from master data table

    Hi
    I just noticed that my logic is fetching only 1 row from master data table.
    ProdHier table
    PRODHIERARCHY                 Level
    1000                          1
    1000011000                    2
    10000110003333                3
    10000110004444                3
    10000110005555                3
    The logic only fetches one row of level 3; I would like to fetch all level-3 rows.
    DATA: ITAB type table of /BI0/PPROD_HIER,
          wa like line of ITAB.
    Select * from /BI0/PPROD_HIER INTO wa where /BIC/ZPRODHTAS = 3.
    IF wa-PROD_HIER(10) = SOURCE_FIELDS-PRODH2.
         RESULT = wa-PROD_HIER.
         ELSEIF wa-PROD_HIER(5) = SOURCE_FIELDS-PRODH1.
         RESULT = wa-PROD_HIER.
    ENDIF.
    ENDSELECT.
    thanks

    Hi,
    I have implemented the logic in the end routine and it still reads only the first row.
    I am loading only PRODH1 and PRODH2, but now I want to get all values of PRODH3 from the master data table.
    The first 5 characters belong to PRODH1 and the first 10 characters belong to PRODH2.
    Whenever PRODH2 = 1000011000 in source I should get the following values
    10000110001110
    10000110001120
    10000110001130
    I have multiple rows of 1000011000 so my result should be
      1000011000               10000110001110
      1000011000               10000110001120
      1000011000               10000110001130
    DATA: ITAB type table of /BI0/PPROD_HIER,
    wa like line of ITAB.
    data rp type _ty_s_TG_1.
    Select  * from /BI0/PPROD_HIER INTO table itab where /BIC/ZPRODHTAS = 3.
    LOOP AT RESULT_PACKAGE INTO rp.
    read table itab into wa with key PROD_HIER(5) = rp-PRODH1.
    IF sy-subrc EQ 0.
         rp-PRODH3 = wa-PROD_HIER.
         ELSE.
    read table itab into wa with key PROD_HIER(10) = rp-PRODH2.
    IF sy-subrc EQ 0.
         rp-PRODH3 = wa-PROD_HIER.
    ENDIF.
    ENDIF.
    MODIFY RESULT_PACKAGE FROM rp.
    ENDLOOP.
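    One way to get all of the matching level-3 rows, rather than just the first hit that READ TABLE returns, is to collect the matches in a nested LOOP and build a new result table. A minimal sketch, assuming the generated transformation types (_ty_s_TG_1, RESULT_PACKAGE) and the field names used in the post above; this is untested and only meant to illustrate the idea.
    DATA: lt_hier  TYPE STANDARD TABLE OF /BI0/PPROD_HIER,
          ls_hier  TYPE /BI0/PPROD_HIER,
          ls_res   TYPE _ty_s_TG_1,
          lt_new   TYPE STANDARD TABLE OF _ty_s_TG_1,
          lv_found TYPE c LENGTH 1.
    " Read all level-3 nodes once for the whole package
    SELECT * FROM /BI0/PPROD_HIER INTO TABLE lt_hier
           WHERE /BIC/ZPRODHTAS = 3.
    LOOP AT RESULT_PACKAGE INTO ls_res.
      CLEAR lv_found.
      " Collect every level-3 node whose first 10 characters match PRODH2
      LOOP AT lt_hier INTO ls_hier.
        IF ls_hier-PROD_HIER(10) = ls_res-PRODH2.
          ls_res-PRODH3 = ls_hier-PROD_HIER.
          APPEND ls_res TO lt_new.          " one output row per match
          lv_found = 'X'.
        ENDIF.
      ENDLOOP.
      IF lv_found IS INITIAL.
        APPEND ls_res TO lt_new.            " keep the row unchanged if nothing matched
      ENDIF.
    ENDLOOP.
    " Replace the package with the expanded rows; renumber the RECORD
    " field while appending if it has to stay unique in your scenario
    RESULT_PACKAGE[] = lt_new[].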

  • How to do lookup to master data in self transformation

    Hi Experts,
    Could anyone please help me with this?
    I have created a self-transformation for 0TCTUSERNM. In the transformation I need to write an end routine that looks up the characteristic ABC in order to populate its attributes into 0TCTUSERNM.
    Thanks in advance,
    Nitya

    Hi Experts,
    Could anyone please help with this?
    There is a characteristic ABC which holds all the master data; its only key field is the user.
    From ABC we are trying to load data into 0TCTUSERNM, which has two key fields, SISID and TCTUSER.
    We have mapped ABC's key field 'user' to TCTUSER of 0TCTUSERNM.
    The requirement given is:
    1. build a transformation from ABC to 0TCTUSERNM
    2. build a self-transformation for 0TCTUSERNM where it has to look up ABC to get all of its attributes into 0TCTUSERNM
    Note: ABC has all the attributes filled for every user, whereas 0TCTUSERNM could have several records for the same user coming from different source systems.
    For example, in ABC we have a user called TOM; in 0TCTUSERNM we could have 5 records for TOM, like
    TOM - source system 1
    TOM - source system 2
    TOM - source system 3, etc.
    So when we initially run the transformation from ABC to 0TCTUSERNM, we can only get one record updated. But what about the attributes of the other 4 records of TOM?
    For that we want to do a self-transformation for 0TCTUSERNM where we look up ABC to fill all records.
    Please tell me if I am not clear.
    Please help.
    Thanks in advance,
    Nitya
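    For the self-transformation, one option is an end routine that buffers ABC's attribute table once and then fills every 0TCTUSERNM record, no matter how many source systems the user appears in. A minimal sketch under the assumption that ABC is a customer-defined InfoObject whose attribute table is /BIC/PABC with key column /BIC/ABC, that /BIC/ZATTR1 is a hypothetical attribute column, and that the generated target structure has a field TCTUSERNM; all of these names have to be adapted.
    DATA: lt_abc TYPE STANDARD TABLE OF /BIC/PABC,
          ls_abc TYPE /BIC/PABC,
          ls_res TYPE _ty_s_TG_1.
    " Buffer the complete master data of ABC once per package
    SELECT * FROM /BIC/PABC INTO TABLE lt_abc
           WHERE OBJVERS = 'A'.
    SORT lt_abc BY /BIC/ABC.
    LOOP AT RESULT_PACKAGE INTO ls_res.
      " The same user may occur once per source system; every record
      " receives the attributes of its user from ABC
      READ TABLE lt_abc INTO ls_abc
           WITH KEY /BIC/ABC = ls_res-TCTUSERNM BINARY SEARCH.
      IF SY-SUBRC = 0.
        ls_res-/BIC/ZATTR1 = ls_abc-/BIC/ZATTR1.   " repeat per attribute
        MODIFY RESULT_PACKAGE FROM ls_res.
      ENDIF.
    ENDLOOP.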

  • Organizational structure determination on BP master data

    Hello,
    I need to assign the ERP sales organizational data to a CRM customer (BP) when it is created in CRM. This is the scenario:
    The user is assigned to a position. The position is also assigned to an org. element with sales organization, distribution channel, division and sales office.
    When the user creates a new customer in CRM, the sales area and sales office have to be the ones assigned to the user as explained above.
    Basically I need something similar to what happens in transactions.
    Can I do this through customizing? Do I need some development?
    Regards
    Roberto

    Hi,
    You should write code that pulls the required data from T77OMATTR into the sales office and sales group fields. The code should be written so that the database records are filtered on the basis of the current user. If a user has been assigned to several org units, then all the records retrieved could appear in the input help, from which one could be chosen.

  • Master data error while loading data

    Dear Masters,
    I'm getting the error below while loading a delta update for master data (attributes) through an InfoPackage to PSA only. I'm on BI 7.0.
    When I check the error message, I get 2 errors:
    1) Update mode R is not supported by the extraction API
    2) Error in source system.
    I checked that the source system is fine; I'm also getting data in RSA3.
    Error message from the source system
    Diagnosis
    An error occurred in the source system.
    System Response
    Caller 09 contains an error message.
    Further analysis:
    The error occurred in Service API .
    Refer to the error message.
    Procedure
    How you remove the error depends on the error message.
    Note
    If the source system is a client workstation, then it is possible that the file that you wanted to load was being edited at the time of the data request. Make sure that the file is in the specified directory, that it is not being processed at the moment, and restart the request.
    Could you please help me to resolve this issue?

    Process chains might fail if the data source is not in sync between the source and the BW system. In such a case, go to the 'Source Systems' in transaction RSA1, find the datasource, right click and select 'Replicate Datasources'. Once this is done, the transfer structure needs to be activated as well. To do this in production, run the report RS_TRANSTRU_ACTIVATE_ALL with the source system name and the infosource name. This will activate the transfer structure.
    NOTE: For master data using direct upload the infosource name is the name of the info object itself.

  • Characteristics - Master data Tab- infosource with direct update

    Dear Experts,
    Please explain this to me in layman's language, as I am new to BI (but I am an ABAP consultant).
    On the InfoObject's Master Data tab there is 'InfoSource with direct update' together with an application component.
    1. Now my doubt is: when I specify the application component, this InfoSource would appear under that application component in Modeling -> InfoSources.
    Then on this InfoSource I would right-click and create the InfoPackage and then the data transfer process.
    Is that correct?
    2. Suppose for this characteristic I don't specify the InfoSource with direct update; then it won't appear among the InfoSources, so how can I load data in this case?
    Regards
    BI Learner

    Hi,
    Using an InfoSource with direct updating, master data (characteristics with attributes, texts, or hierarchies) of an InfoObject can be updated directly (without update rules, only using transfer rules) into the master data table. To do this you must assign it an application component. The system displays the characteristic in the InfoSource tree in the Data Warehousing Workbench. You can assign DataSources and source systems to the characteristic from there. You can then also load master data, texts, and hierarchies for the characteristic.
    You cannot use an InfoObject as an InfoSource with direct updating if:
    i. The characteristic you want to modify is characteristic 0SOURSYSTEM (source system ID).
    ii.  The characteristic has neither master data nor texts nor hierarchies. It is therefore impossible to load data for the characteristic.
    iii.  The characteristic that you want to modify turns out not to be a characteristic, but a unit or a key figure.
    To generate an export DataSource for a characteristic, the characteristic must also be an InfoSource with direct updating.
    1. Now my doubt is, when I specify the application component, this InfoSource would appear under the application component in Modeling -> InfoSources. Then on this InfoSource I would right-click and create the InfoPackage and then the data transfer process.
    The characteristic will be available in the InfoSource tree; there you can assign a DataSource -> assign a source system -> create an InfoPackage -> load data.
    2. Suppose for this characteristic I don't specify the InfoSource with direct update, then it won't appear among the InfoSources, so how can I load data in this case?
    If you don't specify 'InfoSource with Direct Updating' for a characteristic, it won't appear in the InfoSource tree.

  • What are the master Data in SAP SD

    Hi All,
    I have a question. Even though it's a naive question, please bear with me.
    What are the master data in SAP SD?
    These are the ones I know:
    1)   Customer Master Data
    2)   Material  Master Data
    But what I want to know is: do the following also count as master data?
    1) Price master data
    2) Transporters master data (we are using transportation management)
    3) Route master data
    4) Customer material info records
    5) Credit master data (I am using credit management)
    What will be the master data for the Transportation Management module, CIN (Country India Version), credit management and rebate processing?
    Thanks for your valuable time & knowledge.
    Regards,
    Pavan

    Hi Pavan,
    Just to add some more valuable information.
    Organisational master data: Every company is structured in a certain way. In order to work with the SAP system, your company structure has to be represented in the system. This is done with the help of various organizational structures.
    Condition master data: In sales and distribution, products are sold or sent to business partners, or services are performed for them. Data about the products and services, as well as about the business partners, is the basis for sales processing. Sales processing with the SAP R/3 System requires that the master data has been stored in the system.
    Material master data: In addition to sales and distribution, other departments of the company such as accounting or materials management access the master data. The material master data is stored in a specific structure in order to allow access from these different views.
    Document master data: The processing of business transactions in sales and distribution is based on the master data. In the SAP R/3 System, business transactions are stored in the form of documents. These sales and distribution documents are structured according to certain criteria so that all necessary information in the document is stored in a systematic way.
    Customer master data: For the various business partners we also maintain data, called customer master data or vendor master data accordingly.
    Thanks & Regards
    Sadhu Kishore

  • Changes to Master Data

    Hi All
    What needs to be considered when choosing between the following two options for transferring master data changes from R/3 to APO?
    1. Transfer of Data Changes Using ALE Change Pointers
    2. Online Transfer Using Business Transfer Events (BTEs)
    In other words, what are the pros and cons of the two methods?
    Regards
    Pravin

    Hi Pravin,
    1) When you use change pointers for master data transfer, a frequency has to be defined for the integration model generation jobs. You need to ensure that the change pointers are transferred and applied to the target system for all relevant planning elements, either manually or by setting up background jobs in the system (C5 transaction). If the change pointers are not processed properly, system performance is affected. For the elements where no change transfer is required, the corresponding change pointer entries have to be deleted.
    2) With BTEs, you create areas with the same naming conventions. These guarantee that the same names are used for master data, and also their synchronization in distributed system landscapes.
    Here we determine the assignment of the APO system and the SAP R/3 system to be connected to a business system group.
    Regards
    R. Senthil Mareeswaran.

  • Transformation of transactional and master data (in BW/BI or PI/XI?)

    Dear all,
    I have the following requirement:
    I frequently transfer transactional as well as master data between various systems. From what I have heard, SAP XI/PI is a platform for handling the transformation and distribution of such transactional and master data. Unfortunately, the data quality is often not sufficient.
    As a consequence, some records need to be manually modified or completed. Let's assume order data needs to be exchanged and an order is characterized by a customer. When the order record contains a faulty CustomerID that does not exist, the CustomerID in the order has to be adjusted manually by a person, and this would require some user interface.
    Could I implement something like this in SAP XI/PI? Or would I require another SAP solution?
    Some of my relevant applications are in SAP BI/BW. There, I could implement something like this using an ODS for the "problematic" records and some maintenance views. Would you rather suggest such a solution in BI/BW? And what would I do with transformations between non-SAP systems?
    Best regards, Daniel

    Hi Daniel,
    I prefer Java mapping.
    There is a lot you can do with message mapping (the graphical mapping editor):
    http://help.sap.com/saphelp_nw04/helpdata/en/49/1ebc6111ea2f45a9946c702b685299/frameset.htm
    with the predefined standard functions:
    http://help.sap.com/saphelp_nw04/helpdata/en/43/c4cdfc334824478090739c04c4a249/frameset.htm
    If this is not enough you can write your own so called user defined functions in JAVA:
    http://help.sap.com/saphelp_nw04/helpdata/en/22/e127f28b572243b4324879c6bf05a0/frameset.htm
    Also helpful:
    Message Mapping Simplified - Part I & II
    /people/sravya.talanki2/blog/2005/08/16/message-mapping-simplified--part-i
    /people/sravya.talanki2/blog/2005/12/08/message-mapping-simplified-150-part-ii
    Mapping Functionality in XI:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/9202d890-0201-0010-1588-adb5e89a6638
    Regards
    Patrick

  • Master Data Attr

    Hi All,
    We need to have an attribute of a master data object in a DSO. For that we can either do a lookup on the master data in an end routine, or use the 'Read Master Data' rule type in the transformation mapping.
    Please suggest which one is the better option in terms of performance.
    Thanks in advance,
    Tony

    Hi,
    'Read Master Data' is a field-level mapping, so it has to read the master data record by record.
    In the end routine, by contrast, you can read all the records into an internal table in one shot and then update the DSO.
    So the end routine is the better option for load performance, but make sure there is not too much logic (IF/ELSE or WHILE conditions) in the end routine, as such conditions can hurt performance; a WHILE loop in particular can take a lot of time.
    Keep the end routine simple; if it is simple, it will be the better choice in terms of performance.
    'Read Master Data', on the other hand, is the user-friendly option that is easy to set up.
    Hope this helps.
    Regards,
    Ravi Kanth
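    For illustration, a sketch of the end-routine variant described above: one database access for the whole package, then a buffered read per record. The object names are placeholders (a target with a field CUSTOMER and an attribute REGION read from /BI0/PCUSTOMER) and the generated type _ty_s_TG_1 is assumed; adapt them to your own objects.
    DATA: lt_cust TYPE STANDARD TABLE OF /BI0/PCUSTOMER,
          ls_cust TYPE /BI0/PCUSTOMER,
          ls_res  TYPE _ty_s_TG_1.
    IF RESULT_PACKAGE IS NOT INITIAL.
      " One SELECT for the whole package instead of one read per record
      SELECT * FROM /BI0/PCUSTOMER INTO TABLE lt_cust
             FOR ALL ENTRIES IN RESULT_PACKAGE
             WHERE CUSTOMER = RESULT_PACKAGE-CUSTOMER
               AND OBJVERS  = 'A'.
      SORT lt_cust BY CUSTOMER.
    ENDIF.
    LOOP AT RESULT_PACKAGE INTO ls_res.
      READ TABLE lt_cust INTO ls_cust
           WITH KEY CUSTOMER = ls_res-CUSTOMER BINARY SEARCH.
      IF SY-SUBRC = 0.
        ls_res-REGION = ls_cust-REGION.
        MODIFY RESULT_PACKAGE FROM ls_res.
      ENDIF.
    ENDLOOP.
    Compared with the field-level 'Read Master Data' rule, this trades one database access per record for a single buffered lookup, which is what usually makes the end routine faster on large packages.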

  • How to configure CVC automation process using SNP master data and BI customer master data

    Hi Gurus,
    How do we technically configure CVC automation for the MPOS structure, using SNP master data (product and location details) in the CVC creation,
    and in parallel extract data from ECC into BI customer master data in DP? The APO BI cube should contain only new combinations, which are to be validated against ECC and against the existing combinations in the MPOS structure before CVCs are created in the different regions of the MPOS.
    Secondly, the automation should also validate that product and location are part of the SNP-ECC master data and that these are new combinations.
    Could someone guide us through this process?
    Thanks
    Kumar

    Praveen,
    The short answer is to move the data into InfoProviders or flat files for all of the 'characteristic-type' source data, and then copy the CVCs from the InfoProviders into the planning area, probably using one or more process chains (program /SAPAPO/TS_PLOB_MAINTAIN).
    The long answer is that this is not a trivial undertaking; this could end up being a pretty involved solution. If you do not have enough BW/DP expertise available locally to create this solution, then I recommend you consider engaging external resources to assist you. I personally wouldn't even consider starting work on such a solution without first knowing a lot more about the detailed business requirements and about any existing solutions that may already be in place. An SCN forum is not really suitable for such an answer. In my opinion, the BBP document alone would be 20+ pages, assuming no enhancements.
    Best Regards,
    DB49

  • Master Data Sequence load

    Hi guys, I've loaded data into an InfoCube and I have a lot of blank records. I've read that this happens because the master data was not loaded, or was loaded after the transactional data, but my InfoCube has a lot of characteristics that have master data. Is it necessary to load every one of them before loading the transactional data, or is there a way to load all the master data in one step instead of one by one?
    I hope that somebody can help me with this.
    Regards...

    Hi Oscar,
    It is recommended that you load the master data before the transactional data so that the data can be activated, i.e. SIDs are created for the characteristic values and are ready to use at the time of the transactional data load. Are you doing a lot of master data lookups in your update or transfer rules, or loading master data attributes into the cube? If not, then you really should not have "blank" values.
    You can go ahead and load the master data after the transactional data too.
    You do not have to load all the characteristics each time. You can set up deltas for the master data where possible. For the others, you can judge whether they have to be loaded every day, or maybe once a week or once a month.
    Hope this helps...
