Loading relational cube

Hi,
Several questions regarding relational cubes (built using the OWB wizard).
1. I would like to have a sample mapping for loading a relational cube.
I am able to load all dimensions correctly, but for some reason when I load the cube, nothing is loaded.
2. Is it possible, on a relational cube, to aggregate the information over only some of the dimensions? I have created a relational cube based on 5 dimensions, including time. The cube has two cumulative measures: amount and storage.
For the "amount" and "storage" measures, I would like to sum over all dimensions except time; for time I want to take the value at the end_date of the period.
3. Is it possible to add third and fourth measures, avg_amount and avg_storage, and use an "average" aggregation on those measures (while "amount" and "storage" use "sum")?
Hope the above is clear.
Thanks
Galit

Please ignore - I put this under the wrong discussion group.

Similar Messages

  • XRPM Business content: Loading of cube 0RPM_C02

    Hi,
    We are implementing the xRPM BW business content and find one scenario very interesting, related to the cube 0RPM_C02 (Staffing Resource Assignment).
    When we look into the BW content documentation on help.sap.com, we find that this InfoCube can be updated from the following data source/info source combinations, the first 3 being transaction data.
    Data source      Info source
    0RPM_BUPA_AVL    0RPM_SRAR_03
    0RPM_ROLE_D      0RPM_SRAR_02
    0RPM_RELATE_D    0RPM_SRAR_01
    0RPM_TEAM_H      0RPM_TGUID
    0RPM_RELATE_H    0RPM_RELGUI
    0RPM_ROLE_H      0RPM_ROLGUI
    0RPM_PM_01       0RPM_PROJ/0RPM_PROJ_GUI
    0RPM_RESOURCE    0RPM_RESGUI
    The first three combinations are already working, and data has loaded successfully into the cube. What I am not able to understand is:
    1> Why should the cube also get updated from the last 5 data source/info source combinations, which are basically for master data and are loaded before we load the cube?
    2> There is no business content (update rules etc.) that connects these info sources to the cube 0RPM_C02. Do we need to create the update rules for them manually?
    3> Why is the cube getting loaded from the master data, which are also characteristics of its own dimensions?
    Please answer point by point.
    Thanks
    Arun

    Can you please post the answer here again? I'm in the same scenario now.
    Regards

  • .eif Import and relational cube

    Hi,
    Once I import an .eif into an empty AW, I need to refresh the AW from the relational cube on an ongoing basis.
    How do I do that? Is there any way to ASSOCIATE an already existing AW with a relational cube?
    Kindly help.
    Rishi

    Once you have migrated your EIF file to the 10g AW Standard Form (see the OLAP documentation for more details), you should be able to view the dimensions and cubes within your AW using Analytic Workspace Manager (assuming you are using either the 10.1.0.4 or 10.2 database version). The Mapping node on each dimension and cube tree within the display will allow you to map the attributes and measures to relational source tables. Analytic Workspace Manager can then be used to create a scheduled job that refreshes the AW at a given time, or to create a load-and-solve PL/SQL package that can be incorporated into an existing process flow.
    Hope this helps,
    Keith
    Oracle Business Intelligence Product Management
    BI on Oracle: http://www.oracle.com/bi/
    BI on OTN: http://www.oracle.com/technology/products/bi/
    BI Beans http://www.oracle.com/technology/products/bib/index.html
    Discoverer: http://www.oracle.com/technology/products/discoverer/
    BI Software: http://www.oracle.com/technology/software/products/ias/devuse.html
    Documentation: http://www.oracle.com/technology/documentation/appserver1012.html
    BI Samples: http://www.oracle.com/technology/products/bi/samples/
    Blog: http://oraclebi.blogspot.com/

  • Data not loading to cube

    I am loading a cube from a cube. There is data in the source cube which meets my info package selection criteria, but I get 0 records transferred and added to the cube. There is no delete in the start routines.
    Just before loading, I selectively deleted all records from the target cube so as not to disturb the existing delta mechanism.
    The load I am making is a full repair.
    Can someone assist?
    Thanks

    Hi Akreddy
    I am not sure about "repair full to cube"!
    Still, the following is my opinion on your issue:
    It seems there is some mapping mismatch. One quick check: in the DTP monitor, at which step are you getting 0 records? Is it at the data package level, the transformation level, or the routine level?
    Identify the step where the transferred records drop to zero, then dig through that step, or post the warning or message in this forum so it is easier to get it answered.
    Hope this makes it a little clearer!
    Thanks
    K M R
    "Impossible Means I 'M Possible"
    Winners Don't Do Different things,They Do things Differently...!.

  • Error Message When Loading a Cube

    Hello - I received the following error message when I attempted to load a cube:
    No SID found for value '0.000' of characteristic 0CURRENCY
    Is anyone familiar with this error message? Also, I tested this load in DEV before I moved the cube into QA, but I did not receive the error message in DEV. Any help would be greatly appreciated, thanks.

    Hi,
    I had a similar error just a few days ago. I could solve it by replicating the DataSources again and activating the transfer structures via the report RS_TRANSTRU_ACTIVATE_ALL. After that the load went fine.
    regards
    Siggi
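    P.S. In case it helps, the report can also be triggered from a small ABAP wrapper. A minimal sketch; the report's selection parameters vary by release, so it is started via its own selection screen rather than hard-coding parameter names:
    REPORT z_activate_transtru.
    * Run the standard repair report that reactivates the transfer
    * structures, letting the user fill in its selection screen.
    SUBMIT rs_transtru_activate_all
      VIA SELECTION-SCREEN
      AND RETURN.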

  • All records loaded to cube successfully but the request is in red status

    All the records are loaded from R/3 to the target, but the status of the request is still RED. What is the reason, and how do I resolve it?

    Hi,
    If you find that the records are loaded to the cube but the request is still red, follow any one of the following options.
    1. Set the request to green manually.
    2. Check the extraction monitor settings in SPRO -> SAP Customizing Implementation Guide -> Business Intelligence -> Automated Processes -> Extraction Monitor Settings. Check how the settings are done, if you are authorized.
    3. Go to Manage on the target -> Environment -> Automatic Request Processing -> check the option "Set Quality Status to OK".
    Hope this helps,
    Regards,
    Rama Murthy.

  • Function module to load transactional cube directly.

    Hi,
    Is there any function module to load a transactional cube directly?
    Thanks.

    Hi.
    A transactional cube behaves like a regular cube, with the additional possibility of being written to directly from planning applications.
    So you can load this cube with the regular BW tools (data source, info source, info package, update rules).
    But, as mentioned above, before loading you should switch the cube to load mode, and afterwards switch it back to planning mode, either with a function module or manually.
    Regards.
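    P.S. A minimal sketch of the switch in ABAP. The function module names and the I_INFOCUBE parameter are assumptions based on what is commonly posted for transactional cubes - verify them in SE37 before use:
    REPORT z_switch_trans_cube.
    * Sketch only: FM names and the parameter are assumptions, check SE37.
    DATA l_cube(30) TYPE c VALUE 'ZTCUBE'.     " hypothetical cube name
    * 1. Switch the transactional cube to load mode.
    CALL FUNCTION 'RSAPO_SWITCH_TRANS_TO_BATCH'
      EXPORTING
        i_infocube = l_cube.
    * 2. Run the regular load here (info package, update rules).
    * 3. Switch the cube back to planning mode afterwards.
    CALL FUNCTION 'RSAPO_SWITCH_BATCH_TO_TRANS'
      EXPORTING
        i_infocube = l_cube.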

  • Can data be loaded from Cube to DSO

    Can data be loaded from Cube to DSO

    Hello,
    Yes, it can be loaded. All you need to do is follow the same source-and-target procedure as in the normal flow; there is nothing different about a cube as a data source. You can even schedule a delta from the cube to the DSO.
    Thanks
    Ajeet

  • How to make data loaded into cube NOT ready for reporting

    Hi Gurus: Is there a way by which data loaded into a cube can be made NOT available for reporting?
    Please suggest.
    Thanks

    See, by default a request that has been loaded to a cube will be available for reporting. Now, if you have an aggregate, the system needs this new request to be rolled up into the aggregate as well before it becomes available for reporting. The reason: we write queries against the cube, not against the aggregate, so you only know whether a query will hit a particular aggregate at its runtime. This means that whether a query gets its data from the aggregate or from the cube, it should ultimately get the same data in both cases. Now, if a request is added to the cube but not to the aggregate, these two objects would contain different data. The system takes the safer route of not making the un-rolled-up data visible at all, rather than risking inconsistent data.
    Hope this helps...

  • Delta Load to cube

    Hi All,
    I have an issue with the delta load to a cube. From Nov 11 onwards, the delta load is not picking up any data; it shows 0 from 0 records in the monitor screen. Could anyone please help me with this?
    Thanks in advance,
    Drisya

    hi Drisya,
    1. Activate the transfer rules & update rules once, and then load the data.
    You said it is showing 0 from 0 records - is the status yellow or green? If it is yellow, force it to red, activate the transfer structure & update rules, and load again.
    2. Are you loading data from an ODS to the cube, or directly to the cube? If you are loading through an ODS, generate the export data source at the ODS level via right-click.
    Thanks,
    kiran

  • DTP load to cube failed with no errors

    Hi,
    I executed a DTP load from cube to cube with a huge data volume, approx. 25 million records. It is a full load. The DTP load failed without any errors in any of the packages. The load ran for 1 day and 33 mins, and the request just turned red without any error messages. I have checked ST22 and the SM37 job logs. No clue.
    I restarted the load and again it failed at approximately the same time, without any error messages. Please throw some light on this.
    Thanks

    Hi,
    Thanks for the response. There are no abnormal messages in the SM37 job logs. This process is running fine in production; we are only making a small enhancement.
    08/22/2010 15:36:01 Overlapping check with archived data areas for InfoProvider Cube   RSDA 230 S
    08/22/2010 15:36:42 Processing of data package 000569 started                          RSBK 244 S
    08/22/2010 15:36:47 All records forwarded                                              RSM2 730 S
    08/22/2010 15:48:58 Overlapping check with archived data areas for InfoProvider Cube   RSDA 230 S
    08/22/2010 15:49:48 Processing of data package 000573 started                          RSBK 244 S
    08/22/2010 15:49:53 All records forwarded                                              RSM2 730 S
    08/22/2010 16:01:25 Overlapping check with archived data areas for InfoProvider Cube   RSDA 230 S
    08/22/2010 16:02:13 Processing of data package 000577 started                          RSBK 244 S
    08/22/2010 16:02:18 All records forwarded                                              RSM2 730 S
    08/22/2010 16:10:46 Overlapping check with archived data areas for InfoProvider Cube   RSDA 230 S
    08/22/2010 16:11:12 Job finished                                                       00   517 S
    As shown here, the process ended with package 577, and package 578 did not start. I could work around this by executing multiple DTPs, but this process is running fine in production. There are no error messages or short dumps to start troubleshooting from.
    Thanks,
    Mathi

  • BI Loading to Cube Manually without creating Indexes.

    BW 3.5
    I have a process chain scheduled overnight which loads data to the InfoCubes from the ODS, after loading to the staging and transformation layer.
    The load to the InfoCube is scheduled in the process chain as:
    delete index --> delete contents of the cube --> load data to the cube --> create index.
    The above process chain load to the cube normally takes 5-6 hrs.
    The only concern I have is that at times the process chain fails at the staging or transformation layer, and then I have to rectify that manually.
    After rectifying the error, I still have to load the data to the cube.
    That leaves me only a couple of hours, say 2-3 hrs, to complete the cube load before business hours.
    Kindly let me know, in the above case where I am short of time to load data to the cube via the process chain:
    Can I manually delete the contents of the cube and load the data to the cube? Here I would not delete the existing index(es) and recreate them after loading, because index creation normally takes a long time, which I want to avoid when I am short of time.
    Can I do the above at times, and what are the impacts?
    If the load to the InfoCube is then scheduled via the process chain on the other normal working days, is it going to fail or will it go through?
    Also, does deleting the contents of the cube delete the indexes?
    Thanks
    Note: As far as I understand, indexes are created to improve performance at the loading and query level.
    Your input will be appreciated.

    Hi Pawan,
    Please find my views inline below, marked with >>.
    BW 3.5
    I have a process chain scheduled overnight which loads data to the InfoCubes from the ODS, after loading to the staging and transformation layer. The load to the InfoCube is scheduled in the process chain as: delete index --> delete contents of the cube --> load data to the cube --> create index.
    >> I assume you are deleting the entire contents of the cube. If this is the normal pattern of loads to this cube, and if there are no other loads to it, you may consider the InfoCube setting "Delete InfoCube indexes before each data load and then refresh". You will find this setting on the Performance tab, in the create-index batch option. Read the F1 help of the checkbox; it provides more info.
    The above process chain load to the cube normally takes 5-6 hrs. The only concern I have is that at times the process chain fails at the staging or transformation layer, and then I have to rectify that manually. After rectifying the error, I still have to load the data to the cube. That leaves me only a couple of hours, say 2-3 hrs, to complete the cube load before business hours. Kindly let me know, in the above case where I am short of time to load data to the cube via the process chain: can I manually delete the contents of the cube and load the data to the cube?
    >> Yes, you can.
    Here I would not delete the existing index(es) and recreate them after loading, because index creation normally takes a long time, which I want to avoid when I am short of time.
    Can I do the above at times, and what are the impacts?
    >> Impacts: lower query and loading performance, as you mentioned.
    If the load to the InfoCube is then scheduled via the process chain on the other normal working days, is it going to fail or will it go through?
    >> I don't fully understand the question, but I assume you mean: if you did a manual load, will there be a failure the next day? There wouldn't be.
    Also, does deleting the contents of the cube delete the indexes?
    >> Yes, it does.
    >> Pawan - you can skip creating indexes, but you will have slower query performance. However, if you have no further loads to this cube, you could also create your indexes during business hours. I think building indexes demands a lock on the cube, and since you are not loading anything else, you should be able to manage it. Lastly, is there no way you can remodel this cube and flow - do you really need full data loads?
    Note: As far as I understand, indexes are created to improve performance at the loading and query level.
    >> True.
    Hope it helps,
    Regards,
    Sunmit.
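    P.S. If you do end up handling the indexes manually, the drop/rebuild can also be scripted. A minimal sketch, assuming the function module names and the I_INFOCUBE parameter often quoted on SDN - verify them in SE37 before relying on this:
    REPORT z_cube_index_handling.
    * Sketch only: FM names and the parameter are assumptions, check SE37.
    PARAMETERS p_cube(30) TYPE c DEFAULT 'ZSALES'. " hypothetical cube name
    * Drop the cube's secondary indexes before the large load ...
    CALL FUNCTION 'RSDU_INFOCUBE_INDEXES_DROP'
      EXPORTING
        i_infocube = p_cube.
    * ... run the load here, then rebuild the indexes afterwards
    * (this is the slow step you are trying to move out of the window).
    CALL FUNCTION 'RSDU_INFOCUBE_INDEXES_REPAIR'
      EXPORTING
        i_infocube = p_cube.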

  • Tools for Analysing DB requests when loading a cube

    Hello everyone!
    I have seen in the documentation a screen in BW where you can see, while a cube is loading, useful information about the SQL statements (request code, elapsed time...).
    (It may be accessible from SM50 while the processes are running.)
    Does anyone have an idea about this monitoring tool (transaction name)?
    Thanks
    Best Regards
    Fred

    It's much like ST05!
    When I load a cube, I have for instance an update routine for the InfoObject XXX.
    Is it possible to get, for the whole data package, the elapsed time for this particular SQL statement?
    Thanks anyway,
    Best Regards
    Fred

  • Can I load the cube through a customised ABAP program?

    Hi all,
    I have loaded the ODS using a customised ABAP program. Is it possible to load the CUBES through an ABAP program?
    If not, what are the reasons for not loading them that way?
    Thanks
    Pooja

    Hi Pooja,
    For me, I'm afraid to upload directly into the InfoCube tables by program, because I don't know which tables would need to be taken into account for the upload.
    But if the requirement is to upload via an ABAP program, I do it like this (see the sketch after this list):
    1. I create an ABAP program that creates .csv files, so all the data to be uploaded is written into .csv files.
    2. I create an info package that uploads data from file, and I define the path to refer to the corresponding .csv files.
    3. Besides writing the .csv files, the ABAP program also triggers the info package.
    With this technique, I am able to upload data via an ABAP program.
    Hopefully it can help you a lot.
    Regards,
    Niel.
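    P.S. A minimal sketch of steps 1-3 above, assuming a hypothetical server path and info package ID; BAPI_IPAK_START is the standard BAPI for starting an info package, but check its interface in your release first:
    REPORT z_load_cube_via_csv.
    * Sketch of the technique above: write the data to a .csv file on
    * the application server, then trigger the info package that loads
    * it. File path and info package ID are hypothetical placeholders.
    DATA: l_line      TYPE string,
          l_reqid(30) TYPE c.
    * Step 1: write the records as comma-separated lines.
    OPEN DATASET '/usr/sap/trans/data/zcube_load.csv'
      FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
    l_line = 'DOC_NUMBER,AMOUNT,CURRENCY'.      " header line shown only
    TRANSFER l_line TO '/usr/sap/trans/data/zcube_load.csv'.
    CLOSE DATASET '/usr/sap/trans/data/zcube_load.csv'.
    * Step 3: trigger the info package that reads this file.
    CALL FUNCTION 'BAPI_IPAK_START'
      EXPORTING
        infopackage = 'ZPAK_CSV_LOAD'           " hypothetical ID
      IMPORTING
        requestid   = l_reqid.
    WRITE: / 'Load request started:', l_reqid.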

  • Re: Issue with multiple records loading into cube

    Hello Gurus,
    A report is run on a monthly basis for billing commissions. The InfoCube has all the data related to the billing commissions.
    It has a custom field for total commissions, for which the data gets populated from the DSO based on the billing document conditions. A lookup is done for the total commissions field. Most of the documents show the right values, but a few billing documents show the wrong values.
    I made an attempt by reloading the affected billing documents into the InfoCube; now the code works and the total commissions show the right values. I tried selective deletion and loaded the data for the whole month; the billing documents again show the wrong values. The same billing documents show the right values when they are loaded individually. I checked whether the code was wrong, but it works when the affected documents are loaded individually. Please let me know your suggestions on this. Thanks in advance.

    Nanda,
    Please find the start routine below
    Z9SD_O21 = Billing document header
    DATA: L_ZUARC LIKE /BIC/AZ9SD_O2300-INV_QTY,
          L_ZUART LIKE /BIC/AZ9SD_O2300-KPRICE,
          L_ZCGU  LIKE /BIC/AZ9SD_O2300-KNVAL,
          L_ZCTU  LIKE /BIC/AZ9SD_O2300-KNVAL,
          L_ZCNU  LIKE /BIC/AZ9SD_O2300-KNVAL,
          L_ZCQU  LIKE /BIC/AZ9SD_O2300-KNVAL.
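    * Buffered lookup: re-read the posting status /BIC/ZPOST_ST from the
    * billing header DSO only when the billing document number changes.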
    FORM SELECT_POST_ST
        USING COMM_BILL_NUM LIKE /BIC/AZ9SD_O2100-DOC_NUMBER.
    *         COMM_BILL_DATE LIKE /BIC/AZ9SD_O2100-BILL_DATE.
       IF COMM_BILL_NUM <> /BIC/AZ9SD_O2100-DOC_NUMBER.
    *  OR COMM_BILL_DATE <> /BIC/AZ9SD_O2100-BILL_DATE.
         CLEAR /BIC/AZ9SD_O2100.
         SELECT SINGLE DOC_NUMBER /BIC/ZPOST_ST
         INTO (/BIC/AZ9SD_O2100-DOC_NUMBER,
    *          /BIC/AZ9SD_O2100-BILL_DATE,
               /BIC/AZ9SD_O2100-/BIC/ZPOST_ST )
         FROM /BIC/AZ9SD_O2100 WHERE
          DOC_NUMBER = COMM_BILL_NUM.
       ENDIF.
    ENDFORM. "SELECT_POST_ST
    *$*$ end of global - insert your declaration only before this line   *-*
    * The following definition is new in BW 3.x
    TYPES:
       BEGIN OF DATA_PACKAGE_STRUCTURE.
          INCLUDE STRUCTURE /BIC/CS8Z9SD_O24.
    TYPES:
          RECNO   LIKE sy-tabix,
       END OF DATA_PACKAGE_STRUCTURE.
    DATA:
       DATA_PACKAGE TYPE STANDARD TABLE OF DATA_PACKAGE_STRUCTURE
            WITH HEADER LINE
            WITH NON-UNIQUE DEFAULT KEY INITIAL SIZE 0.
    FORM startup
       TABLES   MONITOR STRUCTURE RSMONITOR "user defined monitoring
                 MONITOR_RECNO STRUCTURE RSMONITORS " monitoring with record number
                DATA_PACKAGE STRUCTURE DATA_PACKAGE
       USING    RECORD_ALL LIKE SY-TABIX
                SOURCE_SYSTEM LIKE RSUPDSIMULH-LOGSYS
       CHANGING ABORT LIKE SY-SUBRC. "set ABORT <> 0 to cancel update
    *$*$ begin of routine - insert your code only below this line        *-*
    * fill the internal tables "MONITOR" and/or "MONITOR_RECNO",
    * to make monitor entries
    * if abort is not equal zero, the update process will be canceled
    * DELETE DATA_PACKAGE WHERE STORNO = 'X'.
    * DELETE DATA_PACKAGE WHERE /BIC/ZPSTAS = 'A'.
    * CG: 07/02/07 Assign Summarization group for each partner function
    * based on Master Data.
       DATA: LV_LINES TYPE I.
       DATA: LV_RETURN(1).
       DESCRIBE TABLE DATA_PACKAGE LINES LV_LINES.
       IF LV_LINES > 0.
         CLEAR: LV_RETURN, SY-SUBRC.
         CALL FUNCTION 'Z_FIND_SUM_GROUP_2'
           IMPORTING
             MY_RETURN                      = LV_RETURN
           TABLES
             IT_DATAPACKAGE                 = DATA_PACKAGE
           EXCEPTIONS
             GENERAL_ERROR                  = 1.
       ENDIF.
       ABORT = 0.
    *$*$ end of routine - insert your code only before this line         *-*
    ENDFORM.
