BIA Dummy Cube to load master data

Hi everyone,
We've initiated a project to implement BIA and the racks will arrive in the next few weeks. Something I've heard discussed, but not found documented, is that some companies built a "dummy cube" consisting of all the master data involved in the cubes to be loaded to BIA. Apparently, this is to avoid the potential for locking the master data indexes as multiple cubes are indexed in parallel. Having the master data indexed in advance of indexing several cubes in parallel is apparently much faster, too.
See "Competing Processes During Indexing"
[Activating and Filling SAP NetWeaver BI Accelerator Indexes|http://help.sap.com/saphelp_nw2004s/helpdata/en/43/5391420f87a970e10000000a155106/content.htm]
My questions are: Is this master data "dummy cube" approach documented somewhere? Is this only for the initial build, or is this used for ongoing index rebuilds such that new master data objects are consistently added to the dummy cube? Is this the right approach to avoid master data index locking job delays/restarts, or is there a better/standard approach to index all master data prior to indexing the cubes?
Thanks for any insight!
Doug Maltby

Hi Doug - I'm not aware of this approach being documented anywhere. Personally, I'm not sure a "dummy" cube buys you much, because it would only be used for the initial indexing. The time to construct this cube, its process chain(s), etc. would be close to the time it takes to do the indexing itself. The time for the initial build of the indexes depends on data volumes; from what I've seen in the field it varies on average from 4-8 hours.
Locking is a possibility; however, I don't believe it is very prevalent. One of the most important pieces of scheduling the initial builds is timing: you don't want to be loading data to cubes or executing change runs while the indexing takes place. In the event locking does occur, the affected index build can simply be restarted. A lock does not mean all of your indexes will fail; it may cause a single index build to fail. Reviewing the job logs in SM37 or the status of the InfoCube index in RSDDV will show the current state. Simply restart any builds that have failed.
Hope this helps.
Josh
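
As a purely illustrative aside (plain Python, not SAP code; the cube and InfoObject names are invented), the scheduling idea behind the master data "dummy cube" can be sketched like this: index every shared master data object once, serially, so the parallel cube indexing jobs no longer touch the same master data indexes at all.

from collections import Counter

# Master-data objects each cube indexing job would have to index as a side effect.
# Names are purely illustrative.
cube_master_data = {
    "CUBE_A": {"0MATERIAL", "0CUSTOMER"},
    "CUBE_B": {"0MATERIAL", "0PLANT"},
    "CUBE_C": {"0CUSTOMER", "0PLANT"},
}

# Without a pre-build, shared objects are touched by more than one parallel job --
# exactly the situation in which one build can lock out another and must be restarted.
hits = Counter(iobj for deps in cube_master_data.values() for iobj in deps)
print("objects hit by more than one job:", {i for i, n in hits.items() if n > 1})

# With a pre-build (the "dummy cube" pass), the union of master data is indexed once,
# serially, and the parallel phase has no shared master-data work left.
prebuild = set.union(*cube_master_data.values())
remaining = {cube: deps - prebuild for cube, deps in cube_master_data.items()}
print("serial pre-build:", sorted(prebuild))
print("shared master-data work left for the parallel jobs:", remaining)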

Similar Messages

  • What if I load transaction data without loading master data

    Hello experts,
    What are the consequences if I load transaction data without loading master data? Are there any other factors besides load performance (due to SID generation etc.) and inconsistencies?
    What kind of potential inconsistencies will occur?
    The problem here is: when the transaction load starts, new master data such as employee (x) may have been created in R/3 that does not yet exist in BW, and hence the transaction load fails.
    Thanks and Regards
    Uma Srinivasa rao

    Hi Rao,
    In case you load the master data after loading the transaction data, and there is any lookup against master data in the update rules, you can delete and reconstruct the requests in the ODS/cube so that the latest master data is pulled into the data target.
    Make sure you apply the hierarchy/attribute change run before doing the delete and reconstruct.
    Bye
    Dinesh
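
    To illustrate why the delete-and-reconstruct step helps, here is a minimal sketch in plain Python (hypothetical field names, not SAP code): an update-rule-style lookup derives an attribute from master data, so requests loaded before the master data exists carry blank values and only pick up the correct values when they are rebuilt.

    employee_master = {}  # attribute table: employee -> department; empty before the MD load

    def enrich(transaction_rows):
        """Update-rule-style lookup: derive DEPARTMENT from the employee master data."""
        return [
            {**row, "DEPARTMENT": employee_master.get(row["EMPLOYEE"], "")}
            for row in transaction_rows
        ]

    request_1 = enrich([{"EMPLOYEE": "X", "AMOUNT": 100}])
    print(request_1)                 # DEPARTMENT is blank: employee X is not in BW yet

    employee_master["X"] = "SALES"   # master data arrives later (change run applied)

    # The already-loaded request still carries the blank value, so it is deleted and
    # reconstructed to re-run the lookup against the current master data.
    request_1 = enrich([{"EMPLOYEE": "X", "AMOUNT": 100}])
    print(request_1)                 # DEPARTMENT is now "SALES"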

  • Loading master data from SD to BW

    Hi Friends
    does anyone have the steps for loading master data from SD to BW? I have loaded the transaction data and now need to load the customer and material master data.
    Let me explain in more detail: I extracted the data from a table in R/3 and it doesn't have the material and customer descriptions, so the cube won't have them either. The material description comes from the R/3 table MAKT. How do I go about getting that material description into the cube? I hope this is clear enough; if you have any queries, post them and I will reply.
    I am working on LO extraction. As one of the contributors suggested, I need to do an enhancement; is that in the form of an ABAP routine, or can it be done by adding a field to the cube? Please suggest.
    Regards
    RP

    Hi,
    The first thing to keep in mind is that master data should be loaded before the transactional data, to reduce load time.
    The steps are:
    1. Replicate the DataSource to BW by going to the source system and choosing "Replicate DataSources".
    2. Activate the DataSources.
    3. Create a transformation between the DataSource and the target, then create an InfoPackage and schedule it.
    Please assign points if helpful.

  • Order of loading Master Data - Fact or Fiction

    I understand that for loading Master Data for InfoCube 0FIAA_C01 (or any other) you should load starting from the lowest level.
    That means for every characteristic in the cube you have to check and see if any of the InfoObjects have Master Data attributes, and if any of those attributes have attributes, and so on. This quickly becomes a multi-level structure.
    Part of the tree structure for 0FIAA_C01 would look like:
    0FIAA_C01
        0COMP_CODE
            0CHRT_ACCTS
            0C_CTR_AREA
        0ASSET_AFAB
        0ASSET
            0ACTTYPE
            0BUS_AREA
    <snip>
    So does that mean that 0BUS_AREA should be loaded before 0ASSET?
    Is this fact or fiction?
    If it's a fact, I am wondering what tools SAP has for determining the order of loading master data.
    Discussion points and tools for facts awarded!
    Mike

    Hi,
    My master data loads are largely in the area of HR.
    The only order I follow while loading master data is for a particular InfoObject with regard to texts, attributes, and hierarchies - the order being texts >> attributes >> hierarchies. Frankly, I have not checked doing it otherwise.
    Across different master data InfoObjects I see no need for any order, at least in HR. Generally speaking, a master data object has data with an independent existence as extracted from R/3 or other sources, not derived from any other master data object in BW.
    Master data, as its name implies, should not have referential integrity checks against other master data.
    It would be good to know if someone has real experience to the contrary.
    Mathew.
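
    Whether or not such an order is required in practice, here is a purely illustrative sketch (plain Python, not an SAP tool; object names taken from the tree in the question) of deriving a bottom-up load sequence from the attribute dependencies with a topological sort:

    from graphlib import TopologicalSorter  # Python 3.9+

    # object -> the attributes it depends on; attributes must be loaded first
    depends_on = {
        "0COMP_CODE": {"0CHRT_ACCTS", "0C_CTR_AREA"},
        "0ASSET": {"0ACTTYPE", "0BUS_AREA"},
        "0ASSET_AFAB": set(),
        "0CHRT_ACCTS": set(),
        "0C_CTR_AREA": set(),
        "0ACTTYPE": set(),
        "0BUS_AREA": set(),
    }

    load_order = list(TopologicalSorter(depends_on).static_order())
    print(load_order)
    # every attribute (e.g. 0BUS_AREA) appears before the object that uses it (0ASSET)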

  • Load master data from BI 7.0 to BPC NW 7.0

    In BPC 7.0 NW, SAP provides a standard package to load transaction data from a BI cube to a BPC cube; however, there is no standard package to load master data from BI master data into a BPC dimension.
    Does this mean that we have to load the master data from BI to BPC through flat files, or can we create a custom package (process chain) to load it?

    Hi JW,
    You can automate the master data load from BW if that is what you are trying to accomplish. Have you seen this guide: https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/00380440-010b-2c10-70a1-e0b431255827 ?
    Additionally, you can develop a custom process type and use it in a custom process chain along with a Data Manager package to load master data if you don't want to use any flat files at all.
    Regards
    Pravin

  • How to load master data in sap bpc 7.0

    Hi,
    How do I load master data in SAP BPC 7.0? Please give me the steps as well.
    Thank you

    Hello Devi,
    There are three ways you can load master data in BPC:
    1) Copy and paste: download the master data to a flat file, copy it, paste it into the dimension members, and then process the dimension.
    2) Using an SSIS package: load a flat file into the BPC cube.
    3) Using SQL commands.
    Thanks.
    With regards,
    Anand Kumar

  • BPC75NW: Error Loading Master data for BPC Dimensions from BW InfoObject

    Hello Gurus,
    I'm trying to load master data for BPC Dimensions from BW Infoobjects.
    The ID that's used in BW is 32 characters long. When I run the load, the ID is truncated after 20 characters and the records are reported as duplicates.
    Due to this the load is failing.
    I cannot use any other ID as the texts will not be loaded in that case.
    Is there any work around to handle this?
    I cannot load the transaction data either.
    I looked at some posts and blogs in sdn, but nothing really helped.
    Cube - 0RPM_c05 (Financial Planning cube in SAP PPM)
    Version: BPC 7.5 NW SP5
    Thanks,
    Vasu

    Thanks Everyone.
    " you can write a transformation file and give a new name to those IDs who have values more than 20 characters."
    Poonam - Could you explain more?
    I tried using an additional dimension property to hold the 32-character ID, but I cannot access it in the transformation.
    Is there a way to refer to a dimension property in the transformation file or in the UJD_ROUTINE BADI implementation?
    Thanks,
    Vasu
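
    As a purely hypothetical illustration (plain Python, invented GUIDs, not a BPC transformation file), this shows why plain truncation to 20 characters produces the "duplicate" records, and one way a renaming rule along the lines Poonam suggests could keep the shortened IDs unique:

    import hashlib

    source_ids = [
        "005056A70AB81EE2A8E14D9D7F3A1111",   # 32-character BW IDs (invented)
        "005056A70AB81EE2A8E14D9D7F3A2222",
    ]

    # Plain truncation: both IDs collapse to the same 20-character member -> "duplicates".
    print({i[:20] for i in source_ids})       # only one distinct value survives

    def shorten(bw_id, length=20, suffix_len=6):
        """Keep a recognisable prefix and append a short hash so the result stays unique."""
        digest = hashlib.md5(bw_id.encode()).hexdigest()[:suffix_len].upper()
        return bw_id[: length - suffix_len - 1] + "_" + digest

    print([shorten(i) for i in source_ids])   # two distinct 20-character IDs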

  • Loading Master data into BPS through layouts

    Hi SAP Gurus,
    Is it possible to load master data into BI-BPS through layouts (in BI 7.0)?
    Thanks in advance.
    Regards
    Pradeep

    Hi Pradeep,
    If it is a combination of characteristics and key figures, then you need to load the master data into the InfoObjects (characteristics); the key figure values you can enter directly in the layout, and once you save it, these values are written back to the cube.
    Hope this helps
    Regards
    Imran

  • Loading Master Data to an InfoObject

    All,
    When loading an InfoProvider like an ODS or a cube, there is a 'Manage' option where you can see the requests that have loaded and those that have failed. Where is the equivalent feature for loads to an InfoObject? Is there one?
    Thanks in advance...

    Hi Nat,
    Go to the Administration Workbench (RSA1), go to InfoObjects, and in the context menu you will find "Maintain master data"; click on that to see the master data of the InfoObject.
    Because master data is important, it cannot be deleted by request as in an ODS or cube, and a developer usually won't have the authorization anyway. We can correct the master data by reloading it.
    Once you have loaded master data using an InfoPackage, you can see the status of the data load in the monitor (RSMO): green indicates it loaded successfully, red indicates the data did not load properly.
    Hope this clears your doubt; let me know.
    Assigning points is the way of saying thanks in SDN.
    Cheers,
    Ravi

  • Why we load Master data first before loading Transaction data

    Hi Experts,
    Why do we load master data before loading transaction data? Please give the reasons. Is it mandatory to load master data first?
    I will allocate points to those who help me in detail. My thanks in advance to everyone who responds to my query.

    Hi Nagireddy,
    I hope this helps....
    The bottom line for building cubes is to view facts against dimensions. When I say facts, these are the key figures, i.e. sales volume, sales VAT etc., measured against characteristics like sales area, cost center, and plant.
    Basically, characteristics are those against which key figures are measured, such as cost center, plant, and material.
    Dimensions are groupings of related characteristics, so a cube has a central fact table with dimensions associated with it in a relational schema. Imagine you want to view the key figure sales volume against the plant dimension: plant has a distribution channel, purchasing organisation, company code, sales area, region etc. associated with it, which form the attributes of plant, and it also has descriptions (texts) and possibly a hierarchy. Loading the master data first means the SIDs, attributes, and texts for those characteristic values already exist when the transaction data arrives, so the fact load only has to look up SIDs instead of generating them, and any master data lookups find the attribute values. That is why we load the master data first and the transaction data follows.
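
    As a simplified sketch of that SID argument (hypothetical data, plain Python, not SAP code): with the plant master data loaded first, the transaction load only looks up existing SIDs; without it, SIDs have to be generated during the fact load and the attributes stay empty until the master data arrives.

    plant_sid_table = {}      # /BI*/S-style table: plant -> SID
    plant_attributes = {}     # /BI*/P-style table: plant -> attributes
    next_sid = [1]

    def load_plant_master_data(records):
        for plant, attrs in records.items():
            if plant not in plant_sid_table:
                plant_sid_table[plant] = next_sid[0]
                next_sid[0] += 1
            plant_attributes[plant] = attrs

    def load_transactions(rows):
        facts = []
        for row in rows:
            plant = row["PLANT"]
            if plant not in plant_sid_table:          # master data missing:
                plant_sid_table[plant] = next_sid[0]  # SID generated during the fact load
                next_sid[0] += 1                      # (slower, and no attributes available)
            facts.append({"PLANT_SID": plant_sid_table[plant],
                          "SALES_VOLUME": row["SALES_VOLUME"]})
        return facts

    load_plant_master_data({"P100": {"COMP_CODE": "1000", "REGION": "EMEA"}})
    print(load_transactions([{"PLANT": "P100", "SALES_VOLUME": 250},
                             {"PLANT": "P200", "SALES_VOLUME": 80}]))
    print(plant_attributes)   # P200 got a SID but no attributes until its master data is loaded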

  • What are the tables that get updated while loading master data?

    Hello Experts,
    Which tables get updated while loading master data? I would also request more information about master data loading and the related settings made when creating InfoObjects.

    It depends on the type of master data you are loading.
    In all master data loads, for every new master data value an SID is created in the SID table /BI*/S<INFOOBJECT NAME>, irrespective of the type of master data.
    The additional tables that get updated depend on the type of master data:
    If it is time-independent master data, the /BI*/P<INFOOBJECT NAME> table is updated with the loaded data.
    If it is time-dependent master data, the /BI*/Q<INFOOBJECT NAME> table is updated with the loaded data.
    If the master data has time-independent navigational attributes, then for every data load the SID table is updated first, and then the /BI*/X<INFOOBJECT NAME> table is updated with the SIDs created in the SID table (NOT with the master data itself).
    If the master data has time-dependent navigational attributes, then for every data load the SID table is updated first, and then the /BI*/Y<INFOOBJECT NAME> table is updated with the SIDs created in the SID table (NOT with the master data itself).
    NOTE: As said above, for all the data in the P, Q, T, X, and Y tables, SIDs are created in the S table /BI*/S<INFOOBJECT NAME>.
    NOTE: Irrespective of time dependency, the view /BI*/M<INFOOBJECT NAME>, defined on top of the /BI*/P<INFOOBJECT NAME> and /BI*/Q<INFOOBJECT NAME> tables, gives a view of the entire master data.
    NOTE: It is just a view, not a table, so it has no physical storage of data.
    All the above tables are for ATTRIBUTES
    But when it comes to TEXTS, irrespective of the Time dependency or Independency, the /BI*/T<INFOOBJECT NAME> table gets updated (and of course the S table also).
    Naming Convention: /BIC/*<InfoObject Name> or /BI0/*<InfoObject Name>
    C = Customer Defined Characteristic
    0 = Standard or SAP defined Characteristic
    * = P, Q, T, X,Y, S (depending on the above said conditions)
    Thanks & regards
    Sasidhar
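
    As a small illustrative helper (plain Python, not an SAP API), the naming convention above can be written down like this, assuming the usual rule that the leading 0 of an SAP-delivered InfoObject is replaced by the /BI0/ prefix:

    TABLE_SUFFIXES = {
        "S": "SID table",
        "P": "time-independent attributes",
        "Q": "time-dependent attributes",
        "X": "SIDs of time-independent navigational attributes",
        "Y": "SIDs of time-dependent navigational attributes",
        "T": "texts",
        "M": "view over P and Q (no physical storage)",
    }

    def master_data_table(infoobject, kind):
        """Return the technical table/view name for an InfoObject, e.g. ('0MATERIAL', 'P')."""
        if kind not in TABLE_SUFFIXES:
            raise ValueError(f"unknown table type {kind!r}")
        if infoobject.startswith("0"):               # SAP-delivered InfoObject
            return f"/BI0/{kind}{infoobject[1:]}"
        return f"/BIC/{kind}{infoobject}"            # customer-defined InfoObject

    print(master_data_table("0MATERIAL", "P"))   # /BI0/PMATERIAL
    print(master_data_table("ZMAJORGU", "T"))    # /BIC/TZMAJORGU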

  • Loading Master Data from a Flat File

    Hi gurus, I have flat files for different SAP tables (transactional data tables and master data), and I want to upload these flat files to a DSO. My question is: in the case of master data files, for example SAP table USR03 (User Address Data) in flat file format, is it better to create a master data text and attributes DataSource, or to create a transactional data DataSource to upload it to a DSO?

    Hi
    A master data DataSource should be used to load master data such as customer details. Customer data is master data that does not change frequently; the customer InfoObject may have attributes like customer name, customer contact number, customer address, etc., and we maintain these attributes and the text data.
    Transaction data is something that changes very frequently, like the price of a material, sales, etc.; for loading such data we use transaction DataSources.
    As mentioned above, changes to address data are rare, so you should not use a transactional DataSource.
    Regards, Rahul

  • Error while loading master data to attribute ZMAJORGU(Major organisational unit)

    Hi Experts,
    While loading master data (texts) to the InfoObject ZMAJORGU (MOU) from DataSource ZOS_HRP1000, I am getting an error in the DTP.
    Below is the error:
    Diagnosis
        Data record 349 & with the key 'Ship &' is invalid in value 'Ship &' of
        the attribute/characteristic ZMAJORGU &.
    System Response
        The system has recognized that the value mentioned above is invalid, and
        has processed this general error message. A subsequent message may give
        you more information on the error. This message refers to the same
        value, even though it does not state this explicitly.
    Procedure
        If this message appears during a data load, maintain the attribute in
        the PSA maintenance screens. If this message appears in the master data
        maintenance screens, leave the transaction and call it again. This
        allows you to maintain your master data.
    Please find the screenshots below.
    I have done the following regarding the issue: looking at the error, I went through the transformation and did not find any routines or formulas.
    I checked the PSA, downloaded the data to a spreadsheet, and put a filter on SHORT and STEXT, assuming I would find the value 'Ship &' mentioned in the diagnosis. However, I was not able to find any relevant data starting with 'Ship' (I went to SE16, entered the request ID from the PSA, and displayed the entire data set). The closest value I could find in either field was 'AVP Shipment'.
    I have the following questions:
    1) In screenshot 2 it says record number 349; is it right to go to the PSA table and check against record number 349? (I tried that, but I could not find the error against that record.)
    2) Why am I not finding the value 'Ship' in the PSA, even though this is the value that is creating a problem when updating to the target? Does it mean the record value is 'Ship', that it starts with 'Ship', or that it is part of some word?
    Please find the attached PSA data file (you have to convert it to Excel).
    I would appreciate help analysing and fixing this issue.
    Regards
    Kiran

    Ram,
    I checked other records as well; there were a lot of lowercase characters, uppercase characters, numbers, and special characters, but nothing stopped the load. When the system accepts all those other characters, lowercase, caps or numeric, why is this one record creating an issue?
    Please have a look at the error stack.
    I have one more question: when I look at the PSA there are close to 300,000 records, but when I do an SE16 with the request ID I see only close to 100,000 records (I came to this conclusion by dragging down; the last record number is in the screenshot below).
    Can you tell me what my mistake is? The number of records does not tally between SE16 and the PSA.

  • Lock NOT set for: Loading master data attributes error

    Hi experts,
    We were encountering this error when trying to load master data. When we checked the system we could not find any locks at the time, and activation or kicking off the attribute change run failed again. We finally solved the problem by running FM RSDDS_AGGR_MOD_CLOSE, which sets the close flag to 'X' in table RSDDAGGRMODSTATE. I have read that this lock error can happen when two change runs happen at the same time.
    My questions are:
    1. Is it possible to find out which process exactly "caused" the lock? The table RSDDAGGRMODSTATE does not have a reference to any object or job. I am curious, as we are trying to find ways to avoid this in the future.
    2. In our case, where we could not find any locks, is running this FM the only workaround? Is it a best practice?
    mark

    Hello Catherine
    I have had this problem in the past (on 3.0B); the reason was that our system was too slow and could not crunch the data fast enough, so packets were locking each other.
    The fix: load the data into the PSA only, and then send it in the background from the PSA to the InfoObject. That way only a single background process runs, so locks cannot happen.
    Fix #2: buy a faster server (by faster, I mean more CPU power).
    Now, maybe you have another issue with NW2004s; this is only my two cents' quick thought.
    Good luck!
    Ioan

  • Error while loading Master Data - BPC 10.0 NW

    Hello gurus,
    I am trying to load master data from a BW InfoObject to a dimension in BPC, but am getting the error as shown below:
    The transformation and conversion files are configured as shown below:
    I would greatly appreciate it if you could help me troubleshoot the issue with the JavaScript command. Thanks in advance for your help.
    Best regards,
    Vijay

    Hello Vijay,
    Which patch of the EPM add-in are you using?
    It seems that js:%external%.replace("#","_") does not work with some patches, so in that case you have to use js:%external%.toString().replace("#","_") in your conversion file.
    Deepesh
