Order of loading Master Data - Fact or Fiction

I understand that for loading Master Data for InfoCube 0FIAA_C01 (or any other) you should load starting from the lowest level.
That means for every characteristic in the cube you have to check and see if any of the InfoObjects have Master Data attributes, and if any of those attributes have attributes, and so on. This quickly becomes a multi-level structure.
Part of the tree structure for 0FIAA_C01 would look like:
0FIAA_C01                                     
..........0COMP_CODE                              
....................0CHRT_ACCTS                           
....................0C_CTR_AREA                             
..........0ASSET_AFAB                                
..........0ASSET                              
....................0ACTTYPE                        
....................0BUS_AREA 
<snip>
So does that mean that 0BUS_AREA should be loaded before 0ASSET?
Is this fact or fiction?
If it's a fact, I am wondering what tools SAP has for determining the order of loading Master Data.
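To make the question concrete, here is a minimal sketch (not an SAP tool, just an illustration over a hand-maintained dependency map excerpted from the tree above) of how such a load order could be derived with a topological sort:

```python
# Minimal sketch (not an SAP tool): derive a master data load order with a
# topological sort over a hand-maintained attribute dependency map.
# The map below is a hypothetical excerpt of the 0FIAA_C01 tree shown above.
from graphlib import TopologicalSorter

# Each key "depends on" (i.e. has as master data attributes) the objects in its value set.
dependencies = {
    "0COMP_CODE": {"0CHRT_ACCTS", "0C_CTR_AREA"},
    "0ASSET": {"0ACTTYPE", "0BUS_AREA"},
    "0ASSET_AFAB": set(),
}

# static_order() lists dependencies before the objects that use them.
load_order = list(TopologicalSorter(dependencies).static_order())
print(load_order)
# e.g. ['0CHRT_ACCTS', '0C_CTR_AREA', '0ACTTYPE', '0BUS_AREA',
#       '0ASSET_AFAB', '0COMP_CODE', '0ASSET']
```

If the "lowest level first" rule is a fact, an ordering like this (attributes before the objects that carry them) is what such a tool would have to produce.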
Discussion points and tools for facts awarded!
Mike
Edited by: Michael Hill on Feb 12, 2008 4:52 PM

Hi,
My master data loads are largely in the area of HR.
The only order I follow while loading master data is for a particular InfoObject with regard to texts, attributes and hierarchies - the order being texts >> attributes >> hierarchies. Frankly, I have not checked doing it otherwise.
Across different master data InfoObjects I see no need for any order, at least in HR. Generally speaking, a master data object has data with an independent existence as extracted from R/3 or other sources, and is not derived from any other master data object in BW.
Master data, as its name implies, should not have referential integrity checks against other master data.
It would be good to know if someone has real experience to the contrary.
Mathew.

Similar Messages

  • Issue while loading Master Data through Process Chain in Production

    Hi All,
    We are getting an error in Process chain while loading Master Data
    Non-updated Idocs found in Source System
    Diagnosis
    IDocs were found in the ALE inbox for Source System that are not updated.
    Processing is overdue.
    Error correction:
    Attempt to process the IDocs manually. You can process the IDocs manually using the Wizard or by selecting the IDocs with incorrect status and processing them manually.
    I have also checked the PSA but could not find any record, and the strange thing is that the job itself is not getting scheduled. Can anyone help me resolve this issue?
    Regards
    Bhanumathi

    Hi
    This problem is not related to the process chain.
    You can try this:
    In RSMO, select the particular load you want to monitor.
    In the menu bar, Environment >>> Transact. RFC >>> Select whichever is required, BW or Source System.
    In the next screen you can select the Execute button and the IDOCS will be displayed.
    Check Note 561880 - Requests hang because IDocs are not processed.
    OR
    Transact. RFC - status stays yellow (running) for a long time (Transact. RFC is available on the Status tab in RSMO).
    Step 1: Go to Details > Status, get the IDoc number, then go to BD87 in R/3. Place the cursor on the RED IDoc entries in the tRFC
    queue under outbound processing and click 'Display IDoc' on the menu bar.
    Step 2: In the next screen click on 'Display tRFC calls' (this will take you to the particular tRFC call in SM58),
    place the cursor on the particular Transaction ID and go to EDIT in the menu bar --> press 'Execute LUW'
    (Display tRFC calls (will take you to the particular tRFC call in SM58) ---> select the Transaction ID ---> EDIT ---> Execute LUW)
    Rather than going to SM58 and executing the LUW directly, it is safer to go through BD87 giving the IDoc number, as it will take you
    to the particular tRFC request for that IDoc.
    OR
    Go into the Job Overview of the load; there you should be able to find the Data Package ID.
    (In the RSMO screen > Environment > there is an option for Job Overview.)
    This Data Package TID is the Transaction ID in SM58.
    OR
    SM58 > enter * / the user name or the background user name (ALEREMOTE) and execute. It will show you all the pending tRFCs with their
    Transaction IDs.
    In the Status Text column you can see two statuses:
    'Transaction Recorded' and 'Transaction Executing'.
    Don't disturb it if the status is the second one (Transaction Executing). If the status is the first one (Transaction Recorded), manually
    execute the LUWs ('Execute LUWs').
    OR
    Go directly to SM58 > enter * / the user name or the background user name (ALEREMOTE) and execute. It will show the tRFCs to be executed
    for that user. Find the particular tRFC (SM37 > request name > TID from the data packet with sysfail), select the Transaction ID (SM58) --->
    EDIT ---> Execute LUW
    (from JituK)
    Hope it helps
    Darshan

  • Loading master data compounded infoobject

    Hello experts
    I want to load master data into an InfoObject from a flat file, so I want to define this InfoObject as an InfoProvider.
    In order to do that I use the "Insert characteristic as InfoProvider" option from the InfoArea, but I get a message:
    InfoObject PA_PAPRCT is not a basis characteristic; not used as data target
    This InfoObject has a superior InfoObject defined in the compounding tab; is that the problem?
    Thanks for your help and happy new year
    Thibault

    Dear Thibault
    Yes, PA_PAPRCT is not the basic characteristic.
    Instead of this, use the superior object for this object.
    Then you can load the master data for all these objects.
    Hope you get the idea now.
    Regards
    Saravanan.ar

  • Loading Master data using DTP's

    Hi,
    I have a problem with loading master data. We use DTPs to load it, but the requests were failing due to prior incorrect requests for that master data InfoObject, despite the fact that master data always overwrites and the DTPs are full-update enabled.
    To load successfully we need to go and delete the previous requests every time.
    I think there must be some way of solving this problem; help from you all will be really appreciated.
    Regards,
    G.Monica Roja Flora.

    Yes, you would need to delete the incorrect requests before you can load again. Even though master data is overwritten, the request status can never become 'A' (active) while an incorrect request exists.

  • Load Master Data in BW

    Hi all,
    I have a doubt about loading master data in BW.
    Let's say I am loading 0CUSTOMER in BW.
    It has customer attributes and customer texts.
    Now my question is: which data do we have to load first? Do we need to load attributes first and then texts, or vice versa?
    Please specify the reason.
    Please suggest.
    Regards,
    Macwan James.

    Hi,
    As per my understanding, you can do it in either order, since they are loaded into different tables; usually attributes are given preference over texts. But master data is loaded before transaction data, since this improves performance: the SIDs are generated during the master data load.
    Hope it helps.
    Regards,
    Rathy
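    To illustrate the SID point above (a conceptual sketch only, not how BW implements its SID tables internally): loading master data first pre-populates the surrogate keys, so the later transaction load mostly looks them up instead of creating them.

    ```python
    # Conceptual sketch only (not BW's actual implementation): a SID table as a
    # dict mapping characteristic values to surrogate integer keys.
    from itertools import count

    sid_table: dict[str, int] = {}   # stands in for an S table such as /BI0/SCUSTOMER
    _next_sid = count(1)

    def get_or_create_sid(value: str) -> int:
        """Return the SID for a value, creating a new one if it does not exist yet."""
        if value not in sid_table:
            sid_table[value] = next(_next_sid)
        return sid_table[value]

    # Master data load: SIDs are generated up front ...
    for customer in ["C001", "C002", "C003"]:
        get_or_create_sid(customer)

    # ... so the transaction data load is (mostly) lookups instead of inserts.
    fact_rows = [("C001", 100.0), ("C003", 250.0)]
    facts = [(get_or_create_sid(cust), amount) for cust, amount in fact_rows]
    print(facts)   # [(1, 100.0), (3, 250.0)]
    ```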

  • Why we load Master data first before loading Transaction data

    Hi Experts,
    Why do we load master data before loading transaction data? Please specify the reasons for that. Is it mandatory to load master data first?
    I will allocate points to those who help me in detail. My advance thanks who respond to my query.
    Edited by: Nagireddy Pothireddy on Mar 10, 2008 8:17 AM

    Hi Nagireddy,
    I hope this helps....
    The bottom line for building cubes is to view facts against dimensions. When I say facts, these are the key figures, i.e. sales volume, sales VAT etc., viewed against characteristics like sales area, cost center, plant.
    Basically, characteristics are those against which key figures are measured, like cost center, plant, material etc.
         Dimensions are a grouping of related characteristics. So basically a cube has a central fact table with dimensions associated to it in a relational schema. Imagine you want to view the key figure sales volume against the dimension plant. When you consider plant, it has a distribution channel, purchasing organisation, company code, sales area, region etc. associated with it, which form the attributes of plant and also carry descriptions (texts) and hierarchies. First we load the master data, and then the transaction data follows.

  • What are the tables will update while loading Master data ?

    Hello Experts,
    Which tables are updated while loading master data? I would also appreciate more information about master data loading and its related settings when first creating InfoObjects.

    It depends upon the type of master data you are loading.
    In all master data loads, for every new value of master data an SID will be created in the SID table /BI*/S<INFOOBJECT NAME>, irrespective of the type of master data.
    The other tables that get updated depend on the type of master data:
    If it is time-independent master data, the /BI*/P<INFOOBJECT NAME> table gets updated with the loaded data.
    If it is time-dependent master data, the /BI*/Q<INFOOBJECT NAME> table gets updated with the loaded data.
    If the master data has time-independent navigational attributes, then for every data load the SID table will get updated first and then the /BI*/X<INFOOBJECT NAME> table gets updated with the SIDs created in the SID table (NOT WITH THE MASTER DATA).
    If the master data has time-dependent navigational attributes, then for every data load the SID table will get updated first and then the /BI*/Y<INFOOBJECT NAME> table gets updated with the SIDs created in the SID table (NOT WITH THE MASTER DATA).
    NOTE: As said above, for all the data in the P, Q, T, X, Y tables the SIDs will be created in the S table /BI*/S<INFOOBJECT NAME>.
    NOTE: Irrespective of time dependency or independency, the VIEW /BI*/M<INFOOBJECT NAME>, defined on top of the /BI*/P<INFOOBJECT NAME> and /BI*/Q<INFOOBJECT NAME> tables, gives a view of the entire master data.
    NOTE: It is just a view, not a table, so it has no physical storage of data.
    All the above tables are for ATTRIBUTES.
    When it comes to TEXTS, irrespective of time dependency or independency, the /BI*/T<INFOOBJECT NAME> table gets updated (and of course the S table as well).
    Naming convention: /BIC/*<InfoObject Name> or /BI0/*<InfoObject Name>
    C = customer-defined characteristic
    0 = standard or SAP-defined characteristic
    * = P, Q, T, X, Y, S (depending on the conditions above)
    (A small sketch illustrating this naming convention follows this reply.)
    Thanks & regards
    Sasidhar
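    As a small illustration of the naming convention described above (a sketch only; the helper master_data_tables is hypothetical, not an SAP API, and for SAP-delivered objects the leading '0' is dropped from the table name):

    ```python
    # Sketch: derive the master data table names described above from an
    # InfoObject name, following the /BIC/ vs /BI0/ naming convention.
    def master_data_tables(infoobject: str) -> dict[str, str]:
        """Return the S/P/Q/X/Y/T table names (and the M view) for an InfoObject."""
        name = infoobject.upper()
        if name.startswith("0"):
            prefix, base = "/BI0/", name[1:]      # SAP-delivered characteristic
        else:
            prefix, base = "/BIC/", name          # customer-defined characteristic
        kinds = {
            "S": "SID table",
            "P": "time-independent attributes",
            "Q": "time-dependent attributes",
            "X": "SIDs of time-independent navigational attributes",
            "Y": "SIDs of time-dependent navigational attributes",
            "T": "texts",
            "M": "view over P and Q (no physical storage)",
        }
        return {f"{prefix}{kind}{base}": desc for kind, desc in kinds.items()}

    print(master_data_tables("0COMP_CODE"))   # /BI0/SCOMP_CODE, /BI0/PCOMP_CODE, ...
    print(master_data_tables("ZMAJORGU"))     # /BIC/SZMAJORGU, /BIC/PZMAJORGU, ...
    ```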

  • Loading Master Data from a Flat File

    Hi Gurus, I have flat files of different SAP tables (transactional data tables and master data), and I want to upload these flat files to a DSO. My question is: in the case of master data files, for example the SAP table USR03 (User Address Data) in flat file format, is it better to create a master data texts-and-attributes DataSource, or to create it as a transactional data DataSource, to upload it to a DSO?

    Hi
    A master data DataSource should be used to load master data such as customer details. Customer data is master data that does not change frequently. The customer InfoObject may have attributes like customer name, customer contact number, customer address, etc.; we have to maintain these attributes and the text data.
    Transaction data is something that changes very frequently, like the price of a material, sales, etc. For loading such data we go for transaction DataSources.
    As mentioned above, changes to an address are very rare, therefore you should not use a transactional DataSource.
    Regards, Rahul
    Edited by: Rahul Pant on Feb 7, 2011 12:24 PM

  • Error while loading master data to attribute ZMAJORGU(Major organisational unit)

    Hi Experts,
    While loading master data to ZMAJORGU (MOU) (InfoObject texts) from DataSource ZOS_HRP1000, I am getting an error in the DTP.
    below is the error...
    Diagnosis
        Data record 349 & with the key 'Ship &' is invalid in value 'Ship &' of
        the attribute/characteristic ZMAJORGU &.
    System Response
        The system has recognized that the value mentioned above is invalid, and
        has processed this general error message. A subsequent message may give
        you more information on the error. This message refers to the same
        value, even though it does not state this explicitly.
    Procedure
        If this message appears during a data load, maintain the attribute in
        the PSA maintenance screens. If this message appears in the master data
        maintenance screens, leave the transaction and call it again. This
        allows you to maintain your master data.
    Please find the below screen shots
    I have done the following regarding the issue. Looking at the error, I went through the transformation and I didn't find any routines or formulas.
    I checked the PSA, downloaded the data to a spreadsheet, and put a filter on SHORT and STEXT, assuming I would find the value
    'Ship &' as mentioned in the Diagnosis description. However, I was not able to find the relevant data starting with 'Ship' (I went to SE16, gave the request ID from the PSA, and displayed the entire data). The closest value I could find in both fields was 'AVP Shipment'.
    I have the following questions:
    1) Screenshot 2 says record number 349. Is it right to go to the PSA table and check against record number 349? (I tried that, but I was not able to find the error against that record.)
    2) Why am I not finding the value 'Ship' in the PSA, even though this is the value which is creating a problem in updating to the target? Does it mean the record value is 'Ship', or that it starts with 'Ship', or is it part of some word?
    Please find the attached PSA data file (you have to convert it to Excel).
    I request people to help me analyse the issue and fix it.
    Regards
    Kiran

    Ram,
    I checked other records as well; they contain lots of lowercase characters, uppercase characters, numerics and special characters, but nothing else was stopped. When the system is accepting all the other characters, lowercase, uppercase or numeric, why is this record creating an issue?
    Please have a look at the error stack.
    And I have one more question: when I look at the PSA there are close to 300,000 records,
    but when I do an SE16 with the request ID I see only close to 100,000 records (I came to this conclusion by scrolling down; the last record number is in the screenshot below).
    Can you tell me what my mistake is? The record counts do not tally between SE16 and the PSA.

  • Lock NOT set for: Loading master data attributes error

    Hi experts,
    We were encountering this error before when trying to load master data.  When we checked the system we could not find any locks at the time, and activation or kicking off the attribute change run failed again.  We finally solved the problem running FM RSDDS_AGGR_MOD_CLOSE which sets the close flag to 'X' in table RSDDAGGRMODSTATE.  I have read that it is possible this lock error happens when two change runs happen at the same time.
    My question are:
    1. is it possible to find out what process exactly "caused" the lock? the table RSDDAGGRMODSTATE does not have a reference to any object or job. I am curious as we are trying to find ways to avoid this in the future...
    2. in our case, when we could not find any locks, is running this fm the only work around? is this a best practice?
    mark
    Message was edited by:
            Mark Siongco

    Hello Catherine
    I have had this problem in the past (3.0B) --> the reason was that our system was too slow and could not crunch the data fast enough, therefore the packets were locking each other.
    The fix: load the data into the PSA only, and then send it in the background from the PSA to the InfoObject. By doing this, only a background process runs, so locks cannot happen.
    Fix #2: buy a faster server (by faster, I mean more CPU power).
    Now, maybe you have another issue with NW2004s; this was only my quick two cents.
    Good luck!
    Ioan

  • Error while loading Master Data - BPC 10.0 NW

    Hello gurus,
    I am trying to load master data from a BW InfoObject to a dimension in BPC, but am getting the error as shown below:
    The transformation and conversion files are configured as shown below:
    I would greatly appreciate if you can help me troubleshoot the issue with the java script command.  Thanks in advance for your help.
    Best regards,
    Vijay

    Hello Vijay,
    Which patch are you using for the EPM add-in?
    It seems that js:%external%.replace("#","_") does not work with some patches,
    so in that case you have to use js:%external%.toString().replace("#","_") in your conversion file.
    Deepesh

  • Error while loading master data.

    I am facing an error while loading master data into an InfoObject.
    The error comes when I run the DTP for loading data from the PSA to the InfoObject.
    The error reads: "MCK_PRDT1 : Data record 1 ('Bleach '): Version 'Bleach ' is not valid"
    Please help me understand this version error and suggest a solution.

    For SAP methodology on MDM & BW integration you can look at:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/b0f1f135-0015-2b10-5e83-acca118e17aa
    You can use the matching strategy in MDM Data Manager to delete duplicate records,
    and then use MDM Syndicator to export the data to a file and load that file into BW.
    If you use an OHD, you cannot load 10 records from BW into the MDM repository and unload only 5, because
    in BW every record has a unique key.
    You can also use the MDM ABAP or Java API to delete duplicate records.
    Edited by: Kanstantsin Chernichenka on May 8, 2009 1:02 PM

  • Error 194 while loading Master Data.

    Hi,
    I have a custom InfoObject which has 0MATERIAL as an attribute. While trying to load master data to this custom InfoObject using DB Connect, I'm getting the error:
    RSDMD 194
    0MATERIAL: Data record 1('0001003') : Version '1009841' is not valid.
    I think it's a problem with conversion exit. But I'm not able to solve this problem. It would be of great help if you could guide me on how to overcome this.
    Thanks in advance.
    Regards
    Hari

    Hari,
    I'm thinking that for DB Connect DataSources we have a Propose tab page and a Fields tab page. On the Propose tab page you can see the data type and the format (internal, external or check). Select external and put the data type of your material InfoObject there.
    All the best.
    Note: I'm thinking DB Connect DataSources have a Propose tab page. Please correct me if I'm wrong.
    Regards,
    Nagesh Ganisetti.
    *Assign Points if it helps

  • BI 7.0 - Duplicate Record Error while loading master data

    I am working on BI 7.0 and I am trying to load master data to an info object.
    I created an Infopackage and loaded into PSA.
    I created transformation and DTP and I get an error after I execute the DTP about duplicate records.
    I have read all previous threads about duplicate record errors while loading master data, and most of them suggested checking the 'Ignore duplicate records' option in the InfoPackage. But in 7.0, I can only load to the PSA with the InfoPackage, and it doesn't have any option for me to ignore duplicate records.
    My data is getting loaded to PSA fine and I get this error while loading to info object using DTP.
    I would appreciate your help to resolve this issue.
    Regards,
    Ram.

    Hi,
    Refer:
    http://help.sap.com/saphelp_nw2004s/helpdata/en/45/2a8430131e03c3e10000000a1553f6/frameset.htm
    http://help.sap.com/saphelp_nw2004s/helpdata/en/d0/538f3b294a7f2de10000000a11402f/frameset.htm
    With rgds,
    Anil Kumar Sharma .P

  • Error while loading master data from R/3

    Hi Friends,
    I'm trying to load master data for the InfoObject 0BUS_AREA. The data got into the PSA, but it did not get into the characteristic; I got an error stating:
    Record 1: No source system ID exists for source system (logical system) RD1CLNT100
    In the transfer structure there are two objects not being populated, 0LOGSYS & OSOURCESYSTEM, which might be causing the problem.
    Awaiting your replies.
    Thanx in advance
    Balaji K.

    Hi Aswin,
               Thanks for your reply. There is one column there, "source system ID", but it is empty. How do I generate that two-digit source system ID? There are also three kinds of options:
    "Release IDs that are in use", "Suggest Source System IDs" & "Maintain Logical Systems".
             Which one should I use, and how do I follow it step by step to run the loads once again with "OLOGSYS" & "OSOURCESYSTEM" populated?
    Please clarify this problem.
    Regards.,
    Balaji K.
