Data from ODS and an Inventory cube

Hi Guys,
We have an Inventory cube, zic_c03, and an ODS that holds more detailed information, such as batch-level data.
There is a MultiProvider on top of these two.
My question: can we build a cube on top of the ODS and feed that into the MultiProvider instead?
Would that improve query performance?
What are the drawbacks or repercussions I need to understand first?
Thanks a lot in advance for your help.

Hi Murali,
Yes, loading from your ODS into a cube can improve performance.  I typically load only the data that can be summarized into the cube and cut out document numbers, line items, etc.  If you need the detail, just jump to a query on the ODS object's MultiProvider.
The drawback is that you no longer get all of your detailed information at the cube level.
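For example, the summarizing can be done in a start routine of the update rules from the ODS into the new cube. The sketch below is only an illustration: DOC_NUMBER and S_ORD_ITEM are placeholders for whatever document-level fields your ODS carries, and you can get the same effect by simply leaving those characteristics out of the cube so the update aggregates the key figures automatically.

* Minimal sketch of a start routine that summarizes the data package
* before it reaches the cube (field names are placeholders only)
DATA: LT_SUMMARIZED LIKE DATA_PACKAGE OCCURS 0 WITH HEADER LINE.

LOOP AT DATA_PACKAGE.
* Drop the document-level detail so identical keys collapse
  CLEAR: DATA_PACKAGE-DOC_NUMBER,
         DATA_PACKAGE-S_ORD_ITEM.
* COLLECT sums the numeric fields of records with identical characteristics
  COLLECT DATA_PACKAGE INTO LT_SUMMARIZED.
ENDLOOP.

DATA_PACKAGE[] = LT_SUMMARIZED[].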
Thanks,
Brian

Similar Messages

  • Reading data from ODS and updating the current layout?

    Hi,
    when I execute a planning folder, a list of materials is displayed in the current layout.
    Now, for each material, I need to bring its quantity, which is stored in an ODS, into the current layout. Initially, Quantity is blank in the InfoCube, so I need to populate the Quantity field from the ODS into the current layout.
    The InfoCube Quantity field and the ODS Quantity field are different fields but of the same type.
    I'm wondering if it can be done using a user exit. Can we read the ODS data from a function module and send it back to the current layout?
    I'd appreciate your ideas.

    Hi Hari,
    you could use a routine in the update rules, for example:
* Declaration area (global part of the routine)
TABLES: /BIC/AODS00.
DATA: T_MATERIAL TYPE HASHED TABLE OF /BIC/AODS00
      WITH UNIQUE KEY /BI0/0MATERIAL WITH HEADER LINE.
* Load the lookup table from the ODS active table (only once per data package)
  DATA: N TYPE I.
  DESCRIBE TABLE T_MATERIAL LINES N.
  IF N = 0.
    SELECT * FROM /BIC/AODS00 INTO TABLE T_MATERIAL.
  ENDIF.
* Look up the material of the current record
  READ TABLE T_MATERIAL WITH TABLE KEY
    /BI0/0MATERIAL = COMM_STRUCTURE-/BI0/0MATERIAL.
  IF SY-SUBRC = 0.
* Additional necessary values
* Values from the communication structure
          RESULT_TABLE-Object1 = COMM_STRUCTURE-Object1.
          RESULT_TABLE-Object2 = COMM_STRUCTURE-Object2.
          RESULT_TABLE-VTYPE = COMM_STRUCTURE-VTYPE.
          RESULT_TABLE-VERSION = COMM_STRUCTURE-VERSION.
          RESULT_TABLE-FISCYEAR = COMM_STRUCTURE-FISCYEAR.
          RESULT_TABLE-FISCVARNT = COMM_STRUCTURE-FISCVARNT.
          RESULT_TABLE-G_CWWCON = COMM_STRUCTURE-G_CWWCON.
          RESULT_TABLE-/BIC/C7_ERGBER = COMM_STRUCTURE-/BIC/C7_ERGBER.
          RESULT_TABLE-CO_AREA = COMM_STRUCTURE-CO_AREA.
          RESULT_TABLE-SOURSYSTEM = COMM_STRUCTURE-SOURSYSTEM.
          RESULT_TABLE-CURTYPE = COMM_STRUCTURE-CURTYPE.
          RESULT_TABLE-CURRENCY = COMM_STRUCTURE-CURRENCY.
          RESULT_TABLE-COUNTRY = COMM_STRUCTURE-COUNTRY.
          RESULT_TABLE-G_CWWTEC = COMM_STRUCTURE-G_CWWTEC.
          RESULT_TABLE-SALES_DIST = COMM_STRUCTURE-SALES_DIST.
* Values from the Hyperion mapping ODS
* (the READ TABLE T_HYP_ACCT lookup and its IF check are not shown in
*  this posted excerpt; the first ELSE below presumably belongs to it)
          RESULT_TABLE-G_CWWPT2 = T_HYP_ACCT-G_CWWPT2.
          RESULT_TABLE-G_CWWPD3 = T_HYP_ACCT-G_CWWPD3.
          RESULT_TABLE-G_CWWPD6 = T_HYP_ACCT-G_CWWPD6.
          RESULT_TABLE-G_CWWNEW = T_HYP_ACCT-G_CWWNEW.
          RESULT_TABLE-CUSTOMER = T_HYP_ACCT-/BIC/C7_DUMMY3.
          RESULT_TABLE-SHIP_TO = T_HYP_ACCT-/BIC/C7_DUMMY3.
* Not assigned values
          RESULT_TABLE-SALES_GRP = ''.
          RESULT_TABLE-PROFIT_CTR = ''.
          RESULT_TABLE-SALESORG = ''.
          RESULT_TABLE-G_CWWPD1 = ''.
          RESULT_TABLE-G_CWWPD4 = ''.
          RESULT_TABLE-G_CWWPD5 = ''.
          RESULT_TABLE-DISTR_CHAN = ''.
          RESULT_TABLE-DIVISION = ''.
          RESULT_TABLE-BUS_AREA = ''.
          RESULT_TABLE-CUST_GROUP = ''.
          RESULT_TABLE-/BIC/C7_HIE01 = ''.
          RESULT_TABLE-SALESEMPLY = ''.
          RESULT_TABLE-VALUATION = ''.
          RESULT_TABLE-REC_TYPE = ''.
          RESULT_TABLE-VAL_TYPE = ''.
          RESULT_TABLE-G_CWWMRK = ''.
* Derive one record per fiscal period (001-012) from the monthly value fields
          RESULT_TABLE-G_AVVDAA = COMM_STRUCTURE-/BIC/C7_VALUE1.
          CONCATENATE COMM_STRUCTURE-FISCYEAR '001' into
          RESULT_TABLE-FISCPER.
          APPEND RESULT_TABLE.
          RESULT_TABLE-G_AVVDAA = COMM_STRUCTURE-/BIC/C7_VALUE2.
          CONCATENATE COMM_STRUCTURE-FISCYEAR '002' into
          RESULT_TABLE-FISCPER.
          APPEND RESULT_TABLE.
          RESULT_TABLE-G_AVVDAA = COMM_STRUCTURE-/BIC/C7_VALUE3.
          CONCATENATE COMM_STRUCTURE-FISCYEAR '003' into
          RESULT_TABLE-FISCPER.
          APPEND RESULT_TABLE.
          RESULT_TABLE-G_AVVDAA = COMM_STRUCTURE-/BIC/C7_VALUE4.
          CONCATENATE COMM_STRUCTURE-FISCYEAR '004' into
          RESULT_TABLE-FISCPER.
          APPEND RESULT_TABLE.
          RESULT_TABLE-G_AVVDAA = COMM_STRUCTURE-/BIC/C7_VALUE5.
          CONCATENATE COMM_STRUCTURE-FISCYEAR '005' into
          RESULT_TABLE-FISCPER.
          APPEND RESULT_TABLE.
          RESULT_TABLE-G_AVVDAA = COMM_STRUCTURE-/BIC/C7_VALUE6.
          CONCATENATE COMM_STRUCTURE-FISCYEAR '006' into
          RESULT_TABLE-FISCPER.
          APPEND RESULT_TABLE.
          RESULT_TABLE-G_AVVDAA = COMM_STRUCTURE-/BIC/C7_VALUE7.
          CONCATENATE COMM_STRUCTURE-FISCYEAR '007' into
          RESULT_TABLE-FISCPER.
          APPEND RESULT_TABLE.
          RESULT_TABLE-G_AVVDAA = COMM_STRUCTURE-/BIC/C7_VALUE8.
          CONCATENATE COMM_STRUCTURE-FISCYEAR '008' into
          RESULT_TABLE-FISCPER.
          APPEND RESULT_TABLE.
          RESULT_TABLE-G_AVVDAA = COMM_STRUCTURE-/BIC/C7_VALUE9.
          CONCATENATE COMM_STRUCTURE-FISCYEAR '009' into
          RESULT_TABLE-FISCPER.
          APPEND RESULT_TABLE.
          RESULT_TABLE-G_AVVDAA = COMM_STRUCTURE-/BIC/C7_VALU10.
          CONCATENATE COMM_STRUCTURE-FISCYEAR '010' into
          RESULT_TABLE-FISCPER.
          APPEND RESULT_TABLE.
          RESULT_TABLE-G_AVVDAA = COMM_STRUCTURE-/BIC/C7_VALU11.
          CONCATENATE COMM_STRUCTURE-FISCYEAR '011' into
          RESULT_TABLE-FISCPER.
          APPEND RESULT_TABLE.
          RESULT_TABLE-G_AVVDAA = COMM_STRUCTURE-/BIC/C7_VALU12.
          CONCATENATE COMM_STRUCTURE-FISCYEAR '012' into
          RESULT_TABLE-FISCPER.
          APPEND RESULT_TABLE.
          RETURNCODE = 0.
          ABORT = 0.
        ELSE.
* RETURNCODE <> 0 skips this record without aborting the whole package
          RETURNCODE = 1.
          ABORT = 0.
        ENDIF.
      ELSE.
        RETURNCODE = 1.
        ABORT = 0.
      ENDIF.

  • Error while uploading data from ODS to Cube

    Hi All,
    Could you please help with this issue? As it is a high-priority issue, please reply if you know the answer.
    I am facing an error while loading the data from the ODS to the cube. The error is:
    1. Name is not in the namespace for generated BW Metaobjects
    2. Error 18 in the update
    And this error occurs only in the quality server.
    Thanks,
    Ram.

    Hi Ram Siva,
    Make sure that the DataSource is replicated and the transfer rules are active.
    Also check whether all the transport requests were imported properly.
    Hope it helps,
    Bhaskar

  • Error when trying to load data from ODS to CUBE

    hi,
    I am getting a short dump when trying to load data from the ODS to the cube. The runtime error is 'TYPELOAD_NEW_VERSION' and the short text is 'A newer version of data type "/BIC/AZODS_CA00" was found than the one required'. Please help me out.

    Hi,
    Check this thread - Ajeet Singh has given a good solution here:
    Re: Error With Data Load-Getting Canceled Right Away
    Also check SAP Note 382480 for your reference:
    Symptom
    A DART extraction job terminates with runtime error TYPELOAD_NEW_VERSION and error message:
    Data type "TXW_INDEX" was found in a newer version than required.
    The termination occurs in the ABAP/4 program "SAPLTXW2 " in "TXW_SEGMENT_RECORD_EXPORT".
    Additional key words
    RTXWCF01, LTXW2U01, TXW_INDEX
    Cause and prerequisites
    This problem seems to happen when several DART extraction jobs are running in parallel, and both jobs access table TXW_INDEX.
    Solution
    If possible, avoid running DART extractions in parallel.
    If you do plan to run such jobs in parallel, please consider the following points:
    In the DART Extract configuration, increase the value of the parameter "Maximum memory allocation for index (MB)" if possible. You can estimate reasonable values with the "File size worksheet" utility.
    Run parallel DART jobs on different application servers.
    As an alternative, please apply note 400195.
    It may help you.
    Regards,
    Debjani

  • Load data from ODS 3.5 to Info Cube 7.0  ?

    Hi BI Experts
    I am new to BI 7.0. My requirement is that I want to load data from a 3.5 ODS into a 7.0 InfoCube.
    Is it possible to load from the old version to the new version?
    If possible, please let me know the steps.
    Thanks in advance

    Hi there,
    An ODS in 3.5 is just like a DSO in 7.0, and an InfoCube in 7.0 is just like an InfoCube in 3.5.
    So you can still use the old-fashioned 3.5 flow with an InfoSource, update rules, transfer rules and an InfoPackage to load the InfoCube, or you can now use a transformation with a DTP.
    Diogo.

  • Record increased when sending data from ODS to cube

    Hello experts,
    When I send data from the ODS to the cube with a full update, I send 20 records, but in my cube I
    can see 200 records, including the old records. Every time I load the data I get the same problem: it fetches all the old records. Please suggest what to do in order to resolve this problem.
    Thanks alot,
    Komal.

    Hi,
    To delete the PSA table records, follow these steps:
    Right-click your ODS/cube and choose "Show data flow diagram"; on the right-side panel switch "Technical Settings" on and you can get your PSA table name. Enter this table name in SE11 and delete your records.
    OR
    In RSA1 --> PSA, choose your InfoSource, then choose the PSA, right-click and choose "Delete PSA data".
    Hope it helps
    Regards,
    Arun.M.D

  • How to ensure data from ODS to cube is correct

    hi all,
    I am just trying to get data from the ODS into a cube. How do I make sure that the records I fetched from the ODS into the respective cube are correct and that the number of records is the same?
    Thanks in advance
    Regards
    hari

    Dear Hari,
    There are two ways of confirming correct loads:
    1. Level by Level Analysis
    2. Comparison with a Virtual InfoProvider
    The first method is to analyse the data at each level: first at the database table (if loading from R/3) or in Excel (if loading from a flat file), then at extraction level (see the data in RSA3; not possible with a flat file). After this, load the data to BW through the PSA and check the content of the PSA. After that, load it to the InfoProvider and verify there.
    The second method is to create a virtual InfoCube on the DataSource (extracting the exact data through a function module) and create a MultiProvider combining this virtual cube and the original cube. Then create a query with calculated key figures as the difference of each key figure from the two cubes. This way you can see which values are not matching.
    I suggest the first method; the second one would be the standard way for highly critical loads where you cannot match huge amounts of data manually.
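    As a quick in-between check, a small ABAP report along the following lines can compare a key figure total in the ODS active table against the cube's fact tables. All object names here are placeholders (/BIC/AZODS_XX00 for the ODS active table, /BIC/FZIC_C03 and /BIC/EZIC_C03 for the fact tables, QUANTITY for the key figure); substitute your own, and note that the totals only match if the update rules do not filter or derive records.

REPORT Z_CHECK_ODS_CUBE_TOTALS.

* Sketch only: compare a key figure total between the ODS active table
* and the cube's uncompressed (F) and compressed (E) fact tables.
DATA: LV_ODS_TOTAL  TYPE P DECIMALS 3,
      LV_F_TOTAL    TYPE P DECIMALS 3,
      LV_E_TOTAL    TYPE P DECIMALS 3,
      LV_CUBE_TOTAL TYPE P DECIMALS 3.

SELECT SUM( QUANTITY ) FROM /BIC/AZODS_XX00 INTO LV_ODS_TOTAL.
SELECT SUM( QUANTITY ) FROM /BIC/FZIC_C03   INTO LV_F_TOTAL.
SELECT SUM( QUANTITY ) FROM /BIC/EZIC_C03   INTO LV_E_TOTAL.

LV_CUBE_TOTAL = LV_F_TOTAL + LV_E_TOTAL.

WRITE: / 'ODS total: ', LV_ODS_TOTAL,
       / 'Cube total:', LV_CUBE_TOTAL.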
    Hope it helps...
    Regards,

  • Updating data from ODS to cube

    hi all,
    I have 8 activated requests in the ODS. Now I want to load data into a cube
    from this ODS.
    Is it possible to update the requests one by one so that I have the same number of requests
    in the cube as in the ODS?
    Thanks in adv.

    Hi
    If you want to do it like that, you have to create an InfoPackage which updates data from the ODS to the cube. Go to the data mart ODS in your InfoSources, create an InfoPackage, and you can give the request ID in the selection and load one request at a time.
    Assign points if useful
    Thanks
    N Ganesh

  • Error while loading data from ODS to CUBE.

    Hi friends,
    When I am loading data from the ODS to the cube with the help of the data mart, I am getting an error in the QA system; in the DM system everything went well. If I look at the Details tab in the monitor, under Processing,
    it is showing this:
    Transfer rules: missing message.
    Update PSA: missing message.
    Processing end: missing message.
    I have checked the coding in the update rules and everything is OK.
    Please, any inputs?
    hari

    It might mean that the IDoc flow is not defined properly in your QA system for the Myself source system.
    Regards,
    Vitaliy

  • Error while loading data from ODS to InfoCube

    Hi all,
    I am trying to load data from the ODS to the cube, but it's giving an error.
    I am getting the following error:
    An error occurred while executing a transformation rule:
    The exact error message is:
    The argument 'Result dependent booking' cannot be interpreted as a
    number
    The error was triggered at the following point in the program:
    GPD5CD2GMGXQYJ77HKMS0C0CQVE 2510
    System Response
         Processing the data record has been terminated.
    Procedure
          The following additional information is included in the higher-level
         node of the monitor:
         o   Transformation ID
         o   Data record number of the source record
         o   Number and name of the rule which produced the error
    Procedure for System Administration.
    How do I resolve this error? It would be a great help if anyone could post the steps.

    Thanks Ananda for your suggestion.
    Can you please tell me how to identify which rule has the problem? Is there any way to tell which rule and which record is causing the problem?

  • Index used or not for selecting data from ODS in a start routine

    Dear friends,
    In the start routine of the update rules to a cube, I am reading some data from an ODS into an internal table.
    The ODS is indexed, but I am not sure whether the index is used at all by the SELECT statement (the one that gets the data from the ODS into the internal table in the start routine) while loading data to the cube.
    Any help is highly appreciated.
    regards,
    atlaj

    Hi Atlaj
    You can find this out by displaying the execution plan for the SQL statement in DB02.
    Go to DB02, and under Diagnostics you will find Explain. Select that and enter your query. Make sure that everything here is in capitals. Below is a sample query which I have entered.
    SELECT "CRM_SALORG" "SALESORG" FROM "/BI0/QORGUNIT"
    WHERE "SALESORG" = ? AND "OBJVERS" = ? AND "DATETO" >= ?
    AND "DATEFROM" <= ?
    The selected fields should be inside quotes and in caps, and so should the FROM table. Once you enter your query in this format, click on Explain. It will show the index scan if an index is used. The output for my query looks something like this:
    0 SELECT STATEMENT ( Estimated Costs =  1,348E+01 [timerons] )
            1 (COOR) RETURN
                2 (    0) TQ
                    3 (    0) FETCH /BI0/QORGUNIT
                        4 (    0) IXSCAN /BI0/QORGUNIT~Z1
    The last line (4) shows an index scan, and the name of the index being read is Z1.
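    On the start routine side, a SELECT written roughly as below gives the optimizer a chance to use the index: restrict the WHERE clause to the indexed fields and read only the columns you need. The object and field names (/BIC/AZODS_XX00, MATERIAL, QUANTITY) are placeholders for your own ODS and index fields.

* Sketch of an index-friendly lookup in a 3.x start routine
* (names are placeholders; adapt them to your ODS and its index)
TYPES: BEGIN OF TY_LOOKUP,
         MATERIAL TYPE /BIC/AZODS_XX00-MATERIAL,
         QUANTITY TYPE /BIC/AZODS_XX00-QUANTITY,
       END OF TY_LOOKUP.
DATA: LT_LOOKUP TYPE STANDARD TABLE OF TY_LOOKUP.

* FOR ALL ENTRIES with an empty driver table would read the whole ODS,
* so guard against an empty data package first
IF NOT DATA_PACKAGE[] IS INITIAL.
  SELECT MATERIAL QUANTITY
    FROM /BIC/AZODS_XX00
    INTO TABLE LT_LOOKUP
    FOR ALL ENTRIES IN DATA_PACKAGE
    WHERE MATERIAL = DATA_PACKAGE-MATERIAL.
  SORT LT_LOOKUP BY MATERIAL.
ENDIF.

    Reading the table afterwards with READ TABLE LT_LOOKUP ... BINARY SEARCH keeps the per-record lookups cheap as well.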
    Hope this helps.
    Please let me know if you have any problems entering the query in the specified format or if you get any error.
    Regards
    Sriram

  • Short dump problem when loading data from ODS to InfoCube

    Hi,
    I'm trying to load the data from the ODS to the InfoCube, but I got the following error:
    Short dump in the Warehouse
    Diagnosis
    The data update was not completed. A short dump has probably been logged in BW providing information about the error.
    System response
    "Caller 70" is missing.
    Further analysis:
    Search in the BW short dump overview for the short dump belonging to the request. Pay attention to the correct time and date on the selection screen.
    You get a short dump list using the Wizard or via the menu path "Environment -> Short dump -> In the Data Warehouse".
    Error correction:
    Follow the instructions in the short dump.
    I looked for the short dump, but it says that there is no short dump for that particular date selection.
    Please tell me what I have to do.
    I'll assign the points.
    Bye
    rizwan

    Hi Rizwan,
    Why does the error occurs ?
    • This error normally occurs whenever BW encounters error and is not able to classify them. There could be multiple reasons for the same
    o Whenever we are loading the Master Data for the first time, it creates SID’s. If system is unable to create SID’s for the records in the Data packet, we can get this error message.
    o If the Indexes of the cube are not deleted, then it may happen that the system may give the caller 70 error.
    o Whenever we are trying to load the Transactional data which has master data as one of the Characteristics and the value does not exist in Master Data table we get this error. System can have difficultly in creating SIDs for the Master Data and also load the transactional data.
    o If an ODS activation is taking place and at the same time another ODS activation is running in parallel, the system may classify the error as caller 70, as there were no processes free for that ODS activation.
    o It also occurs whenever there is a Read/Write occurring in the Active Data Table of ODS. For example if activation is happening for an ODS and at the same time the data loading is also taking place to the same ODS, then system may classify the error as caller 70.
    o It is a system error which can be seen under the “Status” tab in the Job Over View.
    What happens when this error occurs ?
    • The exact error message is “System response "Caller 70" is missing”.
    • It may happen that it may also log a short dump in the system. It can be checked at "Environment -> Short dump -> In the Data Warehouse".
    What can be the possible actions to be carried out ?
    • If the Master Data is getting loaded for the first time then in that case we can reduce the Data Package size and load the Info Package. Processing sometimes is based on the size of Data Package. Hence we can reduce the data package size and then reload the data again. We can also try to split the data load into different data loads
    • If the error occurs in the cube load then we can try to delete the indexes of the cube and then reload the data again (see the sketch after this list).
    • If we are trying to load the Transactional and Master Data together and this error occurs, then we can reduce the size of the Data Package and try reloading, as the system may be finding it difficult to create SIDs and load data at the same time. Or we can load the Master Data first and then load the Transactional Data.
    • If the error is happening while ODS activation cause of no processes free, or available for processing the ODS activation, then we can define processes in the T Code RSCUSTA2.
    • If error is occurring due to Read/Write in ODS then we need to make changes in the schedule time of the data loading.
    • Once we are sure that the data has not been extracted completely, we can then go ahead and delete the red request from the manage tab in the InfoProvider. Re-trigger the InfoPackage again.
    • Monitor the load for successful completion, and complete the further loads if any in the Process Chain.
    (From Re: caller 70 missing).
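    If you script the index handling around the load yourself, it usually looks something like the sketch below. The function module name RSDU_INFOCUBE_INDEXES_DROP and its I_INFOCUBE parameter are quoted from memory here, so verify them in SE37 on your release; ZIC_C03 is the inventory cube from this thread. The indexes can be rebuilt afterwards from the cube's Manage screen (Performance tab) or by the corresponding process chain step.

* Sketch only: drop the cube's secondary indexes before a large load.
* Verify the function module and its parameters in SE37 before use.
CALL FUNCTION 'RSDU_INFOCUBE_INDEXES_DROP'
  EXPORTING
    I_INFOCUBE = 'ZIC_C03'.

* ... run the load, then rebuild the indexes (Manage -> Performance tab
*     or the corresponding process chain step) ...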
    Also check links:
    Caller 70 is missing
    Re: Deadlock - error
    "Caller 70 Missing" Error
    Caller 70 missing.
    Bye
    Dinesh

  • How to extract data from ODS to non-SAP system

    Hi,
    Can anybody tell me, step by step, how to extract data from ODS to a non-SAP system?
    Is it possible to do it without programming effort? And are there volume limits for this kind of extraction?
    The non-SAP system is a Unix system.
    Thanks in advance
    Ella

    Ella,
    You can look at it from the angle of a BAdI / InfoSpoke.
    Extract the data from the ODS to a flat file / database table using an InfoSpoke. I am not sure exactly how the InfoSpoke loads data into an RDBMS (I did it a very long time ago), but you can push it into an RDBMS and I am sure it will be system-neutral.
    Hope this helps...
    Arun
    Assign points if it helps

  • Error while loading the data from ODS to InfoCube

    Hi,
    I'm trying to load the data from the ODS to the InfoCube for a particular year,
    but it says that there is a source system problem.
    Why is that?
    Please tell me.
    I'll assign the points.
    rizwan

    Hi Rizwan,
    you didn't mention the error message in detail. There are a few places to check:
    - check if the BW "Myself" source system is active and intact, and reactivate it if necessary
    - check if the update rules are active and reactivate them if necessary
    - check if the ODS is active and reactivate it if necessary
    Regards,
    Lilly

  • Error While Loading data from ODS to Infocube

    I am trying to load data from the ODS to the InfoCube through "Update ODS data into data target". My requirement was to take a small subset of fields from the ODS, design the InfoCube and load the data.
    My load fails at the extraction step: I get 0 records received out of the total number of records sent in each package. Please let me know if you need more information.
    Please advise.
    Thanks,
    RR

    In the Details tab of the monitor, in the extraction step:
    Extraction (messages): errors occurred
    Green Light for Data Request Received
    Green Light for Data Selection Scheduled
    Yellow for 25000 Records sent(0 records received)
    Yellow for 25000 Records sent(0 records received)
    Yellow for 15000 Records sent(0 records received)
    Green Light for Data Selection Ended
    Please let me know, If you need more information.
    Thanks,
    R R
