Aggregated data

Hi, good people
Can someone please break down for me the difference between aggregated data and document-level data,
with good examples of each?
I ask because I understand that when dealing with aggregated data I should use an InfoCube,
and when dealing with document-level data I should use a DSO.
So I need to know how to tell whether a given set of data is aggregated information or document-level information.
Regards
Tony

Hi Tonnieyeyo,
Let's say you and I live in the same apartment building (call it MyApt).
We go shopping in a mall close by (MALL1).
In MALL1 you purchase milk, sugar, salt and cornflakes.
When leaving the mall you pay for these items and receive a receipt.
The receipt lists every item you bought.
This itemized receipt is stored in MALL1's database.
The items in that database represent the lowest level of detail of a transaction between the mall and a particular customer.
MALL1's database tables store each item bought by each customer, so any report that looks at the item-level details is a detailed report.
If I buy some articles, say bread, cheese, butter and Coke, I also get a receipt for my transaction with MALL1.
This is again stored in MALL1's database as the transaction done with me, at the detailed item level.
Now suppose this data is pulled into BW for reporting purposes.
At the first layer it is stored in a DSO, with the item information intact. So the data in the DSO is the same as in the ERP system's database at MALL1.
If this data is then loaded into a cube whose model contains no item or order information, the data is stored at customer/apartment level (i.e. only customer and apartment characteristics are included in the cube).
MALL1 can then report at an aggregated level, that is, the amount of sales done per customer or per apartment.
Since the data in the cube is at an aggregated level, a report on the total sales volume for the apartment or the customer has to scan far fewer records than it would if the cube held item-level information.
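If it helps to see it in data terms, here is a rough SQL sketch (the table and column names are invented for this example): the first query is the document level a DSO would hold, the second is the aggregated level a cube would hold.
-- document level: one row per receipt item, as the DSO stores it
SELECT receipt_id, customer_id, apartment, item, quantity, amount
FROM mall1_sales_items;
-- aggregated level: one row per customer and apartment, as the cube stores it
SELECT customer_id, apartment,
       SUM(quantity) AS total_quantity,
       SUM(amount)   AS total_amount
FROM mall1_sales_items
GROUP BY customer_id, apartment;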
Hope it helps,
Best regards,
Sunmit.

Similar Messages

  • Getting aggregated data based on Date in Webi.

    Hi,
    I have a requirement to display the last 6 months' receivables, month-wise, in Webi. I am getting the data based on the clearing date from a BEx query, and the data from the BEx query looks like this:
    Date            Measure
    01.01.2014      12
    06.01.2014      15
    15.01.2014      16
    02.02.2014       8
    05.02.2014      10
    12.03.2014       4
    14.03.2014       6
    18.03.2014      10
    21.03.2014      20
    30.03.2014      30
    03.04.2014      30
    And I want the data aggregated per month, like this:
    Month   Measure
    Jan     43
    Feb     18
    Mar     70
    Apr     30

    Hi Satish,
    Have you tried the following?
    Create a Month1 variable as =Month([Month]) and a Total variable as =Sum([Measure]).
    Then drag both the Month1 and Total objects into the report block.
    Regards,
    Yuvraj

  • Reading aggregated data from a cube/multiprovider

    Hi BI people
    My project is currently looking for a function module that reads aggregated data from a cube/MultiProvider.
    I already have a function module that reads data from a cube and returns it in a flat format. I have debugged it, but have not found any flags that enable the OLAP functionality needed to perform the aggregation. The function module is "RSDRI_INFOPROV_READ_RFC".
    The situation is that I need to apply the aggregation logic of a profit center hierarchy to the data I read from RSDRI_INFOPROV_READ_RFC. That would mean manually replicating the OLAP engine functionality (key figure aggregation exceptions, etc.), which is not an option with the available time/budget.
    Please have a look at the example below:
    Say that I have a profit center hierarchy as displayed below (with postable nodes).
    PC1 - $10
         |---- PC2  - $30
         |---- PC3  - $20
    The data I'm getting back from the function module RSDRI_INFOPROV_READ_RFC looks like this:
    PC1 $10
    PC2 $30
    PC3 $20
    But I need the data aggregated. An aggregation utilizing the hierarchy above will make the data look like this:
    PC1 $60
    PC2 $30
    PC3 $20
    Instead of building an aggregation program, it would be useful if the data could be extracted already aggregated.
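    Just to illustrate the roll-up logic I mean (the table and column names below are invented; this is not meant to be the BW solution itself, and the syntax is standard SQL, which varies by database):
    -- pc_hierarchy(pc, parent_pc) holds the profit center hierarchy,
    -- pc_values(pc, amount) holds the values returned by RSDRI_INFOPROV_READ_RFC
    WITH RECURSIVE descendants (root_pc, pc) AS (
      SELECT pc, pc FROM pc_hierarchy               -- every node is its own descendant
      UNION ALL
      SELECT d.root_pc, h.pc                        -- walk down to the children
      FROM descendants d
      JOIN pc_hierarchy h ON h.parent_pc = d.pc
    )
    SELECT d.root_pc     AS profit_center,
           SUM(v.amount) AS aggregated_amount       -- PC1 = 10 + 30 + 20 = 60
    FROM descendants d
    JOIN pc_values v ON v.pc = d.pc
    GROUP BY d.root_pc;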
    Any comments appreciated.
    Regards
    Martin

    Thx Olivier,
    The problem is that I need a function module that can apply the OLAP hierarchy aggregation to the data output from RSDRI_INFOPROV_READ_RFC.
    ... or the best alternative would be if there were a fm/class that could provide me with the hierarchy aggregation of the data.
    /Martin

  • Problem in POS DM Aggregated Data Posting

    Hello Everyone,
    We are working with SAP Triversity POS - SAP XI - POS DM+BIW - SAP ISR Scenario. Master data flows from ISR to POS via XI; Transactional data comes from POS to XI and gets posted to POS DM, from where the aggregated data goes to SAP ISR and non-aggregated data goes to BIW. We are experiencing a particular problem while posting totals data from Triversity. The sales totals data is posted correctly from Triversity to XI and it is transformed correctly; but when this data reaches POS DM, the sales total value is always showing zero. Whatever be the input from Triversity, the total record at POS DM always reads zero. Because of this, the BIW cubes are not getting populated correctly and reports are showing wrong figures. Has anyone faced such a problem before and has any clue why this is happening? We are using standard XI Content provided by SAP and also SAP directed extraction methodology for extracting data from Triversity, but this weird error still persists. Any help will be much appreciated.
    Thanks and Sincere Regards,
    Amitabha

    Hi Amitabha,
    I strongly recommend opening a customer message on the POS DM queue.
    The question is: is the TLOG data correctly booked in POS DM?
    Use /POSDW/SEARCH_TLOG to check whether the TLOG entry has correct data.
    If it is not correct for that transaction number, set a breakpoint at the inbound interface /POSDW/BAPI_POSTR_CREATE or /POSDW/IDOC_INPUT_POSTR and check whether the input data is correct.
    Regards
    Björn

  • Display of aggregated data in a planning book

    Hello,
    I am trying to display aggregated data for a characteristic value in the planning book even when I am on a detailed level.
    Example :
    I have 3 products belonging to 1 material group.
    I have 2 customers.
    For customer A, i want to see the quantity of all detailed products.
    For customer B, I want to see the quantity of the materialgroup (on aggregate level) even if I only select 1 product.
    Help is more than welcome
    Frederik

    Hello Srinivas,
    Here a more detailed explanation of what I want to achieve with the macro builder.
    Three materials (FV_1, FV_2 and FV_3) belong to material group 001 in this example.
    My intention is to display the quantity of the whole material group for the value POOL, and not only the quantity for FV_1. Even when I only display FV_1, I want to see the total POOL quantity for FV_1, FV_2 and FV_3.
    Best regards,
    Frederik

  • Aggregated data from oracle to excel report

    Hi all,
    I have a requirement where I need to fill in an Excel report using aggregated data from Oracle tables.
    Example: I need to find the total number of employees in each department and the total salaries of each department. My Excel report looks like this:
    DEPT|total employees| total salaries
    dept 10|15|75000
    dept 20|0|0
    dept 30|20|100000
    dept 40|5|25000
    TOTAL|40|200000
    declare
    begin
    execute immediate 'insert into abc select count(empid),sum(sal), dept_id from emp group by dept_id';
    end;
    The above query gives me the aggregated output, and then I need to manually insert the data into the Excel report. Instead, I need an Oracle procedure that will automatically insert the data from the abc table into the Excel report.
    I know that UTL_FILE can be used to export the data, but I have never used it.
    Any help will be really appreciated.
    Thanks
    S

    Hi,
    Output an Oracle report into HTML:
    sqlplus user/pass@inst
    SQL> set feed off markup html on spool on
    SQL> spool C:\MyReport.html
    SQL> select * from SomeTable;
    SQL> spool off
    SQL> set markup html off spool off
    - Open the obtained HTML report with a browser.
    Output an Oracle report into Excel:
    sqlplus user/pass@inst
    SQL> set feed off markup html on spool on
    SQL> spool C:\MyReport.xls
    SQL> select * from SomeTable;
    SQL> spool off
    SQL> set markup html off spool off
    - Open the obtained xls file with Excel.
    - In case of the warning message "The file you're trying to open ... is in a different format ... Do you want to open the file now?", press "Yes".
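    And since you asked about UTL_FILE: here is a minimal sketch of writing the aggregated rows straight to a CSV file on the database server, which Excel can open directly. It assumes an Oracle directory object (here called EXPORT_DIR, an invented name) already exists and is writable by your user:
    DECLARE
      -- EXPORT_DIR is an assumed directory object; create it and grant write access first
      v_file UTL_FILE.FILE_TYPE;
    BEGIN
      v_file := UTL_FILE.FOPEN('EXPORT_DIR', 'dept_report.csv', 'W');
      UTL_FILE.PUT_LINE(v_file, 'DEPT,TOTAL EMPLOYEES,TOTAL SALARIES');
      FOR r IN (SELECT dept_id, COUNT(empid) AS cnt, SUM(sal) AS total_sal
                FROM emp
                GROUP BY dept_id)
      LOOP
        UTL_FILE.PUT_LINE(v_file, r.dept_id || ',' || r.cnt || ',' || r.total_sal);
      END LOOP;
      UTL_FILE.FCLOSE(v_file);
    END;
    /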
    Good luck!
    http://dba-star.blogspot.com/

  • Aggregated data types - what package(s) do they reside in?

    I'm looking for the definition of the aggregated data type MatlERPRplctnReqMsgMatl, but I cannot find it in the PI ESB. Do these data types reside in a specific package that needs to be imported, or do I use the WSDL for the interface that specifies it in the ES Workplace?

    Hi Theo,
    we've looked at the data you sent me and found that we already ran into this issue even before the first productive release.
    The good news is that the problem should be solved already with the first release SAP NetWeaver CE 7.1 EhP1 SP1 Patch 0.
    The bad news is that if you initially created the process and modeled the mapping with the Beta release (which contained this bug), incorrect model information was written into the project files, so the problem is not solved by re-compiling the project with a newer NWDS.
    Can you please check whether this model was created initially with the Beta version? If this is the case, I'm afraid you'll have to re-model the process with the productive release from scratch.
    If you are sure that you modeled this from scratch with the official (SP1) release, please open an OSS ticket for this.
    But I can put your mind at ease on one point: your use-case does not fall under a product limitation. This should work with the productive release, and if it doesn't, it's a bug that needs to be fixed.
    Best regards,
    Oliver

  • Aggregating Data from multiple sources with ESB

    Hi
    I want to aggregate data from multiple data sources with an ESB service and after that call a BPEL process with a request built from this data:
    1. read data from datasource A (dbadapter-select-call)
    2. read data from data-source B (dbadapter-select-call)
    3. put the data together in xsl-transofrmation
    4. call bpel
    Is this possible? How can I get the data from the first call together with data from the second call for transformation? If I get data from the second call, the data from the first call seems to be lost.
    Any ideas?
    Gregor

    Gregor,
    It seems this aggregation of data is not possible in the ESB. It can be done only in BPEL, and there using assigns rather than transformations. I have tried using transformations by passing a third argument to the ora:processXSLT function, but could not achieve the desired result.
    For more info on passing a second variable(of another schema) as a parameter to xslt pls refer to the post
    http://blogs.oracle.com/rammenon/2007/05/
    and the section "Passing BPEL Variable contents into XSLT as Parameters".
    Hope this helps you.
    Thanks, Venkat.

  • Aggregating data loaded into different hierarchy levels

    I have some problems when I try to aggregate a variable called PRUEBA2_IMPORTE, dimensioned by a time dimension (parent-child type).
    I read the help in the DML Reference of the OLAP Worksheet, and it says the following:
    When data is loaded into dimension values that are at different levels of a hierarchy, then you need to be careful in how you set status in the PRECOMPUTE clause in a RELATION statement in your aggregation specification. Suppose that a time dimension has a hierarchy with three levels: months aggregate into quarters, and quarters aggregate into years. Some data is loaded into month dimension values, while other data is loaded into quarter dimension values. For example, Q1 is the parent of January, February, and March. Data for March is loaded into the March dimension value. But the sum of data for January and February is loaded directly into the Q1 dimension value. In fact, the January and February dimension values contain NA values instead of data. Your goal is to add the data in March to the data in Q1. When you attempt to aggregate January, February, and March into Q1, the data in March will simply replace the data in Q1. When this happens, Q1 will only contain the March data instead of the sum of January, February, and March.
    To aggregate data that is loaded into different levels of a hierarchy, create a valueset for only those dimension values that contain data:
    DEFINE all_but_q4 VALUESET time
    LIMIT all_but_q4 TO ALL
    LIMIT all_but_q4 REMOVE 'Q4'
    Within the aggregation specification, use that valueset to specify that the detail-level data should be added to the data that already exists in its parent, Q1, as shown in the following statement:
    RELATION time.r PRECOMPUTE (all_but_q4)
    How do I do this for more than one dimension?
    Below is my case:
    DEFINE T_TIME DIMENSION TEXT
    T_TIME
    200401
    200402
    200403
    200404
    200405
    200406
    200407
    200408
    200409
    200410
    200411
    2004
    200412
    200501
    200502
    200503
    200504
    200505
    200506
    200507
    200508
    200509
    200510
    200511
    2005
    200512
    DEFINE T_TIME_PARENTREL RELATION T_TIME <T_TIME T_TIME_HIERLIST>
    -----------T_TIME_HIERLIST-------------
    T_TIME H_TIME
    200401 2004
    200402 2004
    200403 2004
    200404 2004
    200405 2004
    200406 2004
    200407 2004
    200408 2004
    200409 2004
    200410 2004
    200411 2004
    2004 NA
    200412 2004
    200501 2005
    200502 2005
    200503 2005
    200504 2005
    200505 2005
    200506 2005
    200507 2005
    200508 2005
    200509 2005
    200510 2005
    200511 2005
    2005     NA
    200512 2005
    DEFINE PRUEBA2_IMPORTE FORMULA DECIMAL <T_TIME>
    EQ -
    aggregate(this_aw!PRUEBA2_IMPORTE_STORED using this_aw!OBJ262568349 -
    COUNTVAR this_aw!PRUEBA2_IMPORTE_COUNTVAR)
    T_TIME PRUEBA2_IMPORTE
    200401 NA
    200402 NA
    200403 2,00
    200404 2,00
    200405 NA
    200406 NA
    200407 NA
    200408 NA
    200409 NA
    200410 NA
    200411 NA
    2004 4,00 ---> here it's right!! but...
    200412 NA
    200501 5,00
    200502 15,00
    200503 NA
    200504 NA
    200505 NA
    200506 NA
    200507 NA
    200508 NA
    200509 NA
    200510 NA
    200511 NA
    2005 10,00 ---> here must be 30,00 not 10,00
    200512 NA
    DEFINE PRUEBA2_IMPORTE_STORED VARIABLE DECIMAL <T_TIME>
    T_TIME PRUEBA2_IMPORTE_STORED
    200401 NA
    200402 NA
    200403 NA
    200404 NA
    200405 NA
    200406 NA
    200407 NA
    200408 NA
    200409 NA
    200410 NA
    200411 NA
    2004 NA
    200412 NA
    200501 5,00
    200502 15,00
    200503 NA
    200504 NA
    200505 NA
    200506 NA
    200507 NA
    200508 NA
    200509 NA
    200510 NA
    200511 NA
    2005 10,00
    200512 NA
    DEFINE OBJ262568349 AGGMAP
    AGGMAP
    RELATION this_aw!T_TIME_PARENTREL(this_aw!T_TIME_AGGRHIER_VSET1) PRECOMPUTE(this_aw!T_TIME_AGGRDIM_VSET1) OPERATOR SUM -
    args DIVIDEBYZERO YES DECIMALOVERFLOW YES NASKIP YES
    AGGINDEX NO
    CACHE NONE
    END
    DEFINE T_TIME_AGGRHIER_VSET1 VALUESET T_TIME_HIERLIST
    T_TIME_AGGRHIER_VSET1 = (H_TIME)
    DEFINE T_TIME_AGGRDIM_VSET1 VALUESET T_TIME
    T_TIME_AGGRDIM_VSET1 = (2005)
    Regards,
    Mel.

    Mel,
    There are several different types of "data loaded into different hierarchy levels", and the approach to solving the issue is different depending on the needs of the application.
    1. Data is loaded symmetrically at uniform mixed levels. Examples would include loading data at "quarter" in historical years but at "month" in the current year; it does /not/ include data loaded at both quarter and month within the same calendar period.
    = solved by the setting of status, or in 10.2 or later with the load_status clause of the aggmap.
    2. Data is loaded at both a detail level and its ancestor, as in your example case.
    = the aggregate command overwrites aggregate values based on the values of the children; this is the only repeatable thing it can do. The recommended way to solve this problem is to create 'self' nodes in the hierarchy representing the data loaded at the aggregate level, each of which is then added as one of the children of the aggregate node. This enables repeatable calculation as well as auditability of the resultant value.
    Also note the difference in behavior between the aggregate command and the aggregate function. In your example the aggregate function looks at '2005', finds a value and returns it for a result of 10, the aggregate command would recalculate based on january and february for a result of 20.
    To solve your usage case I would suggest a hierarchy that looks more like this:
    DEFINE T_TIME_PARENTREL RELATION T_TIME <T_TIME T_TIME_HIERLIST>
    -----------T_TIME_HIERLIST-------------
    T_TIME H_TIME
    200401 2004
    200402 2004
    200403 2004
    200404 2004
    200405 2004
    200406 2004
    200407 2004
    200408 2004
    200409 2004
    200410 2004
    200411 2004
    200412 2004
    2004_SELF 2004
    2004 NA
    200501 2005
    200502 2005
    200503 2005
    200504 2005
    200505 2005
    200506 2005
    200507 2005
    200508 2005
    200509 2005
    200510 2005
    200511 2005
    200512 2005
    2005_SELF 2005
    2005 NA
    Resulting in the following cube:
    T_TIME PRUEBA2_IMPORTE
    200401 NA
    200402 NA
    200403 2,00
    200404 2,00
    200405 NA
    200406 NA
    200407 NA
    200408 NA
    200409 NA
    200410 NA
    200411 NA
    200412 NA
    2004_SELF NA
    2004 4,00
    200501 5,00
    200502 15,00
    200503 NA
    200504 NA
    200505 NA
    200506 NA
    200507 NA
    200508 NA
    200509 NA
    200510 NA
    200511 NA
    200512 NA
    2005_SELF 10,00
    2005 30,00
    3. Data is loaded at a level based upon another dimension; for example product being loaded at 'UPC' in EMEA, but at 'BRAND' in APAC.
    = this can currently only be solved by issuing multiple aggregate commands to aggregate the different regions with different input status, which unfortunately means that it is not compatible with compressed composites. We will likely add better support for this case in future releases.
    4. Data is loaded at both an aggregate level and a detail level, but the calculation is more complicated than a simple SUM operator.
    = often requires the use of ALLOCATE in order to push the data to the leaves in order to correctly calculate the aggregate values during aggregation.

  • Unable to post POS aggregated data to R/3

    Hi,
    I am trying to aggregate the POS transaction data in POS DM and post it to Retail through WPUUMS and WPUTAB IDocs. I have followed these steps:
    1. One-step processing, i.e. Task > One step processing > Task area --> Generate IDoc WPUUMS
    Code for aggregation method: 002 -- aggregation of materials with conditions
    Parameter group: 0014 (generate IDoc WPUUMS)
    2. Also tried the two-step process:
    Aggregation task: assigned 002 (material/stock with taxes and discounts) to the profile.
    Outbound task: assigned the sales data IDoc WPUUMS to the profile.
    3. Set the task for generating the WPUUMS IDoc to collective processing.
    When I run the aggregation through /POSDW/PDIS for the particular task, there is no error.
    When I check through the POS Workbench under POS aggregates: no aggregates found.
    In WE05 on the R/3 server, the IDoc shows failed status 51.
    Prasad

    Resolved. Pass the item as ARTN/EANN, and for the WPUUMS task use aggregation by item identifier.

  • Aggregated data for quarter and year

    Hi,
    I have created an input schedule using the any-by-any template, with the account dimension on rows and the time dimension on columns. I input some test data for each month for some accounts; it works OK and I send the data successfully.
    I created another report using EVDRE with the same row and column structure and set both member sets to "SELF AND DEP". However, when I run the report I notice that the quarter data and the year data are not correct. For example, Q1 should be the sum of Jan, Feb and Mar, but instead it shows the same data as Mar, while the year shows the same data as Q4.
    Does anybody know how to fix this?
    Thanks

    I did another test as follows:
    I copied an application set from the SAP standard AppShell and generated some test data for each month using an input schedule, then built an EVDRE report. It worked perfectly: if I choose PERIODIC, the quarter and year data summarize the months or quarters, and if I choose YTD, I get the year-to-date data.
    Then I copied my master data in to replace the original master data and generated some test data. This time the report only shows the YTD data no matter whether I choose PERIODIC or YTD for the measure in the CV. This is very strange.

  • Showing aggregated data at Manager's geography level in a parent child dimension

    I have a DimEmployee dimension which has a parent-child relationship (between ManagerID and EmployeeID):
    EmployeeID | ManagerID | GeographyID
    1          | 1         | G1
    2          | 1         | G2
    3          | 2         | G3
    4          | 4         | G4
    5          | 4         | G5
    (other columns omitted)
    This dimension has been connected to FactSales using “EmployeeID”.
    EmployeeID | Sales
    1          | 100
    2          | 150
    3          | 80
    4          | 50
    5          | 60
    There is one more dimension called DimGeography which has reference relationship with “FactSales” through “GeographyID” column of DimEmployee
    GeographyID | GeoName
    G1          | Abc
    G2          | Xyz
    Now I have to roll up all subordinates' data and show it under the manager's geography, as shown below:
    ABC (G1)    330 (100+150+80)
    Xyz (G2)    110 (50+60)
    I am able to show data at the ManagerID level but unable to do so at the manager's geography level.
    How do I solve this problem using MDX?

    Any help? Ideally we would have to make other attributes (in this case geography) recursive along with the attributes involved in the parent-child hierarchy.

  • Is there any documentation that throws light on how data aggregation happens during data warehouse grooming? What algorithm exactly does it follow for the different aggregation types (raw, hourly, daily)?

    Is there any documentation that throws light on how data aggregation happens during data warehouse grooming? What algorithm exactly does it follow for the different aggregation types (raw, hourly, daily)?
    How exactly does it pick a specific data value during hourly and daily aggregations? That is, how is the value chosen? Does it average the values out, or simply pick the value at the start or end of the hour/day?

    I'll try one more time. :)
    Views in the operations console are derived from data in the operational database. This is always raw data, and typically does not go back more than 7 days.
    Reports get data from the data warehouse. Unless you create a custom report that uses raw data, you will never see raw data in a report - Microsoft and probably all 3rd party vendors do not develop reports that fetch raw data.
    Reports use aggregated data - hourly and daily. The data is aggregated by min, max, and avg sample for that particular aggregation. If it's hourly data, then you will see the min, max, and avg for that entire hour. Same goes for daily - you will see the min, max, and avg data sample for that entire day.
    And to try clarifying even more, the values you see plotted on the report are avg samples. If you drill into the performance detail report, then you can see the min, max, and avg samples, as well as standard deviation (which is calculated based on these three values).
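    As a rough illustration only (Oracle-style SQL with invented table and column names, not the actual data warehouse schema), an hourly aggregation of raw samples looks conceptually like this:
    -- one row per counter per hour, keeping the min, max and avg of the raw samples
    SELECT counter_name,
           TRUNC(sample_time, 'HH24') AS sample_hour,
           MIN(sample_value) AS min_value,
           MAX(sample_value) AS max_value,
           AVG(sample_value) AS avg_value
    FROM raw_perf_samples
    GROUP BY counter_name, TRUNC(sample_time, 'HH24');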
    Jonathan Almquist | SCOMskills, LLC (http://scomskills.com)

  • BPM performance- data aggregation persistance

    Hi,
    I have a situation of large volumes of records to be evaluated, aggregated and split into different scenarios.
    Q1.
    What is the best way to persist this aggregated data?
    Q2.
    Has anyone found, or could anyone suggest, the best way to make this run optimally?
    Regards
    Ian

    Hi Ian,
    I have implemented some services both on XI 2.0, using ABAP proxies on an XI application system / cluster databases, and the same services on XI 3.0 with BPM. The proxy solution performed much better, but of course the BPM solution has better monitoring and the advantage of being standard. For a proxy solution you have to copy an XI client and configure it as an "Application System"; it will serve as a message allocator. The ABAP code (or Java) is executed in the inbound proxies, where you can call outbound proxies or implement database operations.
    Regards Udo

  • Issue when uploading Sales data from DSO to Cube.

    Dear All,
    I have an issue when uploading sales data from a DSO to a cube. I am using BI 7.0 and have uploaded all sales document-level data to my DSO. Then I use a transformation rule to calculate the sales value when doing the DTP to the cube. The cube has customer-wise aggregated data.
    In the DSO I have NetPrice (KF) and Delivered_QTY (KF). I do a simple multiplication routine in the transformation from the DSO to the cube:
    RESULT =   SOURCE_FIELDS-NET_PRICE * SOURCE_FIELDS-DLV_QTY .
    At the moment I use the active table (without archive) of the DSO to get the data, since this is my first load.
    The issue is that the figure (sales value) in the cube is incorrect. I am getting very large values, which is impossible.
    Can someone please help me.
    Shanka

    Hi,
    are you sure that the cube has customer-wise aggregated data? It will always aggregate the values of the key figures for the same set of characteristics.
    Did you check the values of the other key figures as well? Are they also inflated, or is the problem with this key figure only?
    During the data load, the records may be aggregated first for the same characteristic values, and only then does the multiplication happen. If that is the case, you may have to multiply the values before they are aggregated, i.e. while they are still in the data package, and then let them aggregate; this can be achieved through a start routine.
    But first verify whether other key figures also show the same issue.
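    To illustrate why the order matters (invented numbers and table names, not your actual data): if two document rows for the same customer are (price 10, qty 2) and (price 20, qty 3), then multiplying per document and summing gives 10*2 + 20*3 = 80, while summing first and multiplying afterwards gives (10+20) * (2+3) = 150, an inflated figure. In SQL terms:
    -- correct: multiply at document level, then aggregate per customer
    SELECT customer_id, SUM(net_price * dlv_qty) AS sales_value
    FROM sales_documents
    GROUP BY customer_id;
    -- wrong: aggregate first, then multiply the already-aggregated key figures
    SELECT customer_id, SUM(net_price) * SUM(dlv_qty) AS sales_value
    FROM sales_documents
    GROUP BY customer_id;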
    Thanks
    Ajeet
