Conditional aggregation

The following is a daily report showing the % change in sales, where percent diff in sales = ((curr_year_sales - last_year_sales) / last_year_sales) * 100. But district sales for D15-Payne show only 4.76%, even though current-year sales are far above last year's. Store 5601 in that district has unreported sales and should NOT be included in the district percentage calculation, although the other columns should appear as shown (special reporting rule).
District sales should be 83.33%, meaning daily sales for last year would effectively be 12000. How do I prevent this? I think I'm making this harder than it is.
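The expected district figure can be sanity-checked with the numbers from the report below: excluding store 5601's 9000 of last-year sales from the denominator gives the 83.33%.

```python
# District D15-Payne, excluding store 5601 (unreported current-year sales).
dly_sales = 22000            # current-year district total
dly_sales_ly = 21000 - 9000  # last-year total minus store 5601's 9000
perc = round((dly_sales - dly_sales_ly) / dly_sales_ly * 100, 2)
print(perc)  # 83.33
```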
DISTRICT                      STORENBR                      DLY_SALES                     DLY_SALES_LY                  DLY_SALES_DIFF                DLY_SALES_PERC                REGION                        STORENAME                     CATEGORYTYPE                
D13-Bowser                    1275                          20000                         11000                         9000                          81.82                         R1-Michaels                   La Vista Ave                  clsStoreRow                 
D13-Bowser                    D13-Bowser                    20000                         11000                         9000                          81.82                         R1-Michaels                   D13-Bowser                    clsDisRow                   
D15-Payne                     2385                          10000                         8000                          2000                          25                            R1-Michaels                   WellGood                      clsStoreRow                 
D15-Payne                     4290                          6000                          3000                          3000                          100                           R1-Michaels                   West Poplar                   clsStoreRow                 
D15-Payne                     5094                          6000                          1000                          5000                          500                           R1-Michaels                   Gtown                         clsStoreRow                 
D15-Payne                     5601                          0                             9000                          -9000                         0                             R1-Michaels                   County Gate                   clsStoreRow                 
D15-Payne                     D15-Payne                     22000                         21000                         1000                          4.76                          R1-Michaels                   D15-Payne                     clsDisRow                   
D29-Bruce                     6705                          20000                         18000                         2000                          11.11                         R1-Michaels                   Levi St                       clsStoreRow                 
D29-Bruce                     D29-Bruce                     20000                         18000                         2000                          11.11                         R1-Michaels                   D29-Bruce                     clsDisRow                   
R1-Michaels                   R1-Michaels                   62000                         50000                         12000                         24                            R1-Michaels                   R1-Michaels                   clsRgnRow                   
Vanco                         Vanco                         62000                         50000                         12000                         24                            Vanco                         Vanco                         clsVancoRow

SQL to create data:
with  sales_data_curr_year as
(
      select 22  storeid ,   153 day_nbr,   10000  dly_sales  from dual union all
      select 23  storeid ,   153 day_nbr,   20000  dly_sales  from dual union all
      select 24  storeid ,   153 day_nbr,    6000  dly_sales  from dual union all
      select 25  storeid ,   153 day_nbr,    6000  dly_sales  from dual union all
      select 27  storeid ,   153 day_nbr,       0  dly_sales  from dual union all
      select 524 storeid ,   153 day_nbr,   20000  dly_sales  from dual
)
,sales_data_last_year as
(
      select 22  storeid ,   153 day_nbr,    8000  dly_sales   from dual union all
      select 23  storeid ,   153 day_nbr,   11000  dly_sales   from dual union all
      select 24  storeid ,   153 day_nbr,    3000  dly_sales   from dual union all
      select 25  storeid ,   153 day_nbr,    1000  dly_sales   from dual union all
      select 27  storeid ,   153 day_nbr,    9000  dly_sales   from dual union all
      select 524 storeid ,   153 day_nbr,   18000  dly_sales   from dual
)
, store_data as
(
      select 27   storeid,  'County Gate' storename ,    '5601' storenbr ,     'R1-Michaels' rgn ,   'D15-Payne' dis , 'BHI' cmpy  from dual union all
      select 25   storeid,  'Gtown' storename ,          '5094' storenbr ,     'R1-Michaels' rgn ,   'D15-Payne' dis , 'BHI' cmpy  from dual union all
      select 22   storeid,  'WellGood' storename ,       '2385' storenbr ,     'R1-Michaels' rgn ,   'D15-Payne' dis , 'BHI' cmpy  from dual union all
      select 23   storeid,  'La Vista Ave' storename ,   '1275' storenbr ,     'R1-Michaels' rgn ,   'D13-Bowser' dis , 'BHI' cmpy  from dual union all
      select 24   storeid,  'West Poplar' storename ,    '4290' storenbr ,     'R1-Michaels' rgn ,   'D15-Payne' dis , 'BHI' cmpy  from dual union all
      select 524  storeid,  'Levi St' storename ,        '6705' storenbr ,     'R1-Michaels' rgn ,   'D29-Bruce' dis , 'BHI' cmpy  from dual
)
--== Decode prevents a field from being null
, rolled_up as
(
     select    decode(grouping(rgn),       0, rgn,           'Vanco')     as  rgn
,              decode(grouping(dis),       0, dis,            decode(grouping(rgn),  0,  rgn,        'Vanco'))       as  dis
,              decode(grouping(storenbr),  0, storenbr,       decode(grouping(dis),  0,  dis,         decode(grouping(rgn),   0,   rgn,      'Vanco')))       as storenbr
,              decode(grouping(storenbr),  0, max(storename), decode(grouping(dis),  0, max(dis),    decode(grouping(rgn),   0,   max(rgn), 'Vanco')))       as storename
--== Use a category type for HTML formatting
,              decode(grouping(storenbr),  0, 'clsStoreRow',  decode(grouping(dis),  0, 'clsDisRow', decode(grouping(rgn),   0,  'clsRgnRow', 'clsVancoRow'))) as categoryType
,              sum(curr.dly_sales)                     as dly_sales
,              sum(prev.dly_sales)                     as dly_sales_ly
,              sum(curr.dly_sales - prev.dly_sales)    as dly_sales_diff
      from     sales_data_curr_year            curr
      left  outer join sales_data_last_year    prev     on prev.storeid  = curr.storeid
                                                       and prev.day_nbr =  curr.day_nbr
      right outer join store_data              stores   on stores.storeid    = curr.storeid
      group by   rollup(rgn,dis,storenbr)
      order by  rgn, dis, storenbr
)
select   dis                                             as district
,        storenbr                                        as storenbr
,        dly_sales                                       as dly_sales
,        dly_sales_ly                                    as dly_sales_ly
,        dly_sales_diff                                  as dly_sales_diff
,        case
            when  dly_sales    = 0  then 0
            when  dly_sales_ly = 0  then 0
            else  round((dly_sales_diff / dly_sales_ly) * 100 , 2)
         end                                                  as dly_sales_perc
,        rgn                                             as region
,        storename                                       as storename
,        categoryType                                    as categoryType
from rolled_up;

Thanks. I missed the other fields in the output because they were too far right. Now it makes sense.
Is this the expected result set?
SQL> with  sales_data_curr_year as
  2  (
  3        select 22  storeid ,   153 day_nbr,   10000  dly_sales  from dual union all
  4        select 23  storeid ,   153 day_nbr,   20000  dly_sales  from dual union all
  5        select 24  storeid ,   153 day_nbr,    6000  dly_sales  from dual union all
  6        select 25  storeid ,   153 day_nbr,    6000  dly_sales  from dual union all
  7        select 27  storeid ,   153 day_nbr,    null  dly_sales  from dual union all
  8        select 524 storeid ,   153 day_nbr,   20000  dly_sales  from dual
  9  )
10  ,sales_data_last_year as
11  (
12        select 22  storeid ,   153 day_nbr,    8000  dly_sales   from dual union all
13        select 23  storeid ,   153 day_nbr,   11000  dly_sales   from dual union all
14        select 24  storeid ,   153 day_nbr,    3000  dly_sales   from dual union all
15        select 25  storeid ,   153 day_nbr,    1000  dly_sales   from dual union all
16        select 27  storeid ,   153 day_nbr,    9000  dly_sales   from dual union all
17        select 524 storeid ,   153 day_nbr,   18000  dly_sales   from dual
18  )
19  , store_data as
20  (
21        select 27   storeid,  'County Gate' storename ,    '5601' storenbr ,     'R1-Michaels' rgn ,   'D15-Payne' dis , 'BHI' cmpy from dual union all
22        select 25   storeid,  'Gtown' storename ,          '5094' storenbr ,     'R1-Michaels' rgn ,   'D15-Payne' dis , 'BHI' cmpy from dual union all
23        select 22   storeid,  'WellGood' storename ,       '2385' storenbr ,     'R1-Michaels' rgn ,   'D15-Payne' dis , 'BHI' cmpy from dual union all
24        select 23   storeid,  'La Vista Ave' storename ,   '1275' storenbr ,     'R1-Michaels' rgn ,   'D13-Bowser' dis , 'BHI' cmpy from dual union all
25        select 24   storeid,  'West Poplar' storename ,    '4290' storenbr ,     'R1-Michaels' rgn ,   'D15-Payne' dis , 'BHI' cmpy from dual union all
26        select 524  storeid,  'Levi St' storename ,        '6705' storenbr ,     'R1-Michaels' rgn ,   'D29-Bruce' dis , 'BHI' cmpy from dual
27  )
28  --== Decode prevents a field from being null
29  , rolled_up as
30  (
31       select    decode(grouping(rgn),       0, rgn,           'Vanco')     as  rgn
32  ,              decode(grouping(dis),       0, dis,            decode(grouping(rgn),  0,  rgn,        'Vanco'))       as  dis
33  ,              decode(grouping(storenbr),  0, storenbr,       decode(grouping(dis),  0,  dis,         decode(grouping(rgn),   0, rgn,      'Vanco')))       as storenbr
34  ,              decode(grouping(storenbr),  0, max(storename), decode(grouping(dis),  0, max(dis),    decode(grouping(rgn),   0, max(rgn), 'Vanco')))       as storename
35  --== Use a category type for HTML formatting
36  ,              decode(grouping(storenbr),  0, 'clsStoreRow',  decode(grouping(dis),  0, 'clsDisRow', decode(grouping(rgn),   0,  'clsRgnRow', 'clsVancoRow'))) as categoryType
37  ,              sum(curr.dly_sales)                           as dly_sales
38  ,              sum(nvl2(curr.dly_sales,prev.dly_sales,null)) as dly_sales_ly
39  ,              sum(curr.dly_sales - prev.dly_sales)          as dly_sales_diff
40        from     sales_data_curr_year            curr
41        left  outer join sales_data_last_year    prev     on prev.storeid  = curr.storeid
42                                                         and prev.day_nbr =  curr.day_nbr
43        right outer join store_data              stores   on stores.storeid    = curr.storeid
44        group by   rollup(rgn,dis,storenbr)
45        order by  rgn, dis, storenbr
46  )
47  select   dis                                             as district
48  ,        storenbr                                        as storenbr
49  ,        dly_sales                                       as dly_sales
50  ,        dly_sales_ly                                    as dly_sales_ly
51  ,        dly_sales_diff                                  as dly_sales_diff
52  ,        case
53              when  dly_sales    = 0  then 0
54              when  dly_sales_ly = 0  then 0
55              else  round((dly_sales_diff / dly_sales_ly) * 100 , 2)
56           end                                                  as dly_sales_perc
57  ,        rgn                                             as region
58  ,        storename                                       as storename
59  ,        categoryType                                    as categoryType
60  from rolled_up
61  /
DISTRICT    STORENBR     DLY_SALES DLY_SALES_LY DLY_SALES_DIFF DLY_SALES_PERC REGION      STORENAME    CATEGORYTYP
D13-Bowser  1275             20000        11000           9000          81.82 R1-Michaels La Vista Ave clsStoreRow
D13-Bowser  D13-Bowser       20000        11000           9000          81.82 R1-Michaels D13-Bowser   clsDisRow
D15-Payne   2385             10000         8000           2000             25 R1-Michaels WellGood     clsStoreRow
D15-Payne   4290              6000         3000           3000            100 R1-Michaels West Poplar  clsStoreRow
D15-Payne   5094              6000         1000           5000            500 R1-Michaels Gtown        clsStoreRow
D15-Payne   5601                                                              R1-Michaels County Gate  clsStoreRow
D15-Payne   D15-Payne        22000        12000          10000          83.33 R1-Michaels D15-Payne    clsDisRow
D29-Bruce   6705             20000        18000           2000          11.11 R1-Michaels Levi St      clsStoreRow
D29-Bruce   D29-Bruce        20000        18000           2000          11.11 R1-Michaels D29-Bruce    clsDisRow
R1-Michaels R1-Michaels      62000        41000          21000          51.22 R1-Michaels R1-Michaels  clsRgnRow
Vanco       Vanco            62000        41000          21000          51.22 Vanco       Vanco        clsVancoRow
11 rows selected.

I made two adjustments. You said the sales were unreported, but you entered the value 0. Unreported in Oracle terminology is NULL, so at line 7 I entered NULL. The second adjustment was to make last year's sales evaluate to NULL if this year's sales are NULL. You can see that happening at line 38.
Regards,
Rob.
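Rob's nvl2 trick can be mimicked outside Oracle. Below is a minimal sketch using Python's sqlite3 (an assumption for illustration: NVL2 is emulated with CASE, since SQLite lacks it), showing how a NULL current-year figure knocks the matching last-year figure out of the district sum:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
create table curr (storeid int, dly_sales int);
create table prev (storeid int, dly_sales int);
-- the four D15-Payne stores; store 27 (5601) has unreported current sales
insert into curr values (22,10000),(24,6000),(25,6000),(27,null);
insert into prev values (22,8000),(24,3000),(25,1000),(27,9000);
""")
row = conn.execute("""
select sum(c.dly_sales)                         as dly_sales,
       sum(case when c.dly_sales is null
                then null else p.dly_sales end) as dly_sales_ly
from curr c join prev p on p.storeid = c.storeid
""").fetchone()
print(row)  # (22000, 12000) -> the 83.33% denominator
```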

Similar Messages

  • Forecast release split Material based on Sales Organ and Dist Channel

    Hi All,
    Similar to transaction /SAPAPO/MC7A, which splits the forecast of a material between two locations based on the proportion percentages supplied, our business has identified that it would like to do a similar thing based on sales organisation / distribution channel rather than location. So for sales organisation SO01, distribution channel 65, we want the forecasts to be split between distribution center locations SO0A and SO0B, 40% and 60% respectively.
    Does anyone know if this can be achieved or has anyone found a solution to a similar requirement? Does anyone have any suggestions on tackling this?
    Thanks
    Mark

    Hi Mark,
    I think this is possible.
    Please check BAdI /SAPAPO/SDP_PA_COPY, method CHANGE_DATA_AFTER_READ (Copy/Version Management).
    In this method, the selection and grouping conditions (aggregated level) can be changed after the data are loaded from a planning area. If they are changed in the tables CT_SELECTION_FR and CT_GROUP_BY_FR, the combination values in the table CT_PLOBV must be updated by calling the function module '/SAPAPO/TS_DM_MASTERDATA_GET'. The planning object IDs and key figure values should be updated in the tables CT_LINES_FR and CT_TAB_FR.
    Best Regards,
    Ivo Stoyanov

  • Advance Aggregation based on multiple conditions

    Hi members,
    I have a situation where I need to aggregate data based on multiple conditions. Relevant details of the problem is as follows.
    There is a table (let's call X). It has following columns:
    Transaction_Time (date)
    Transaction_direction (Possible values IN or OUT)
    Column_1
    Column_2
    Based on the columns Transaction_direction, Column_1, and Column_2, the type of the transaction is derived. For example, if transaction_direction = 'IN' then the transaction type is IN, and if 'OUT' then it is OUT. Similarly, if Column_1 = Column_2 then the transaction type is Txn_3, otherwise Txn_4.
    Based on date and transaction type, the aggregation will happen. The sample output would be:
    Time, Transaction type (IN, OUT, Txn_3, Txn_4), Sum of transactions
    10-June-2013 00:00  IN Transactions  2500
    10-June-2013 00:00  Txn_3 Transactions 3590
    and so.
    IN and OUT transactions are easy to derive using the decode() function. However, avoiding multiple UNION ALLs and writing a single SQL statement covering all four conditions is the tricky part.
    Hope I clarified.
    Neeraj

    What version of Oracle are you using?
    If you're on 11.x you can use the UNPIVOT feature as follows:
    with t (Transaction_Time, Transaction_direction, Column_1, Column_2) as (
    select date '2013-06-10', 'IN', 1, 1 from dual union all
    select date '2013-06-10', 'IN', 2, 2 from dual union all
    select date '2013-06-10', 'IN', 1, 2 from dual union all
    select date '2013-06-10', 'IN', 3, 4 from dual union all
    select date '2013-06-10', 'OUT', 3, 3 from dual union all
    select date '2013-06-10', 'OUT', 3, 4 from dual
    )
    select * from (
    select
      transaction_time
    , sum(case when transaction_direction = 'IN' then 1 end)  as IN_count
    , sum(case when transaction_direction = 'OUT' then 1 end) as OUT_count
    , sum(case when Column_1 = Column_2 then 1 end)           as Txn_3_count
    , sum(case when Column_1 != Column_2 then 1 end)          as Txn_4_count
    from t
    group by transaction_time
    )
    unpivot (
      txn_count for transaction_type in (
         IN_count as 'IN'
      ,  OUT_count as 'OUT'
      ,  Txn_3_count as 'Txn_3'
      ,  Txn_4_count as 'Txn_4'
      )
    )
    order by transaction_time, transaction_type
    TRANSACTION_TIME TRANSACTION_TYPE TXN_COUNT
    2013-06-10       IN               4      
    2013-06-10       OUT              2     
    2013-06-10       Txn_3            3      
    2013-06-10       Txn_4            3      
    If you're okay with getting one row per date with the 4 counts you can just use the inner select above, i.e.
    select
      transaction_time
    , sum(case when transaction_direction = 'IN' then 1 end)  as IN_count
    , sum(case when transaction_direction = 'OUT' then 1 end) as OUT_count
    , sum(case when Column_1 = Column_2 then 1 end)           as Txn_3_count
    , sum(case when Column_1 != Column_2 then 1 end)          as Txn_4_count
    from t
    group by transaction_time
    order by transaction_time
    TRANSACTION_TIME IN_COUNT OUT_COUNT  TXN_3_COUNT  TXN_4_COUNT
    2013-06-10       4        2          3            3      
    If you want to sum transaction amounts then use the same logic, except in the case statements replace 1 with the column you want to sum.
    Regards,
    Bob
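Bob's conditional counts translate directly to other databases. As a quick sketch (using Python's sqlite3 on the same sample data, not Oracle), the one-row-per-date figures can be reproduced like this:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table t (dir text, c1 int, c2 int)")
conn.executemany("insert into t values (?,?,?)",
    [("IN",1,1),("IN",2,2),("IN",1,2),("IN",3,4),("OUT",3,3),("OUT",3,4)])
# One pass over the table; each CASE contributes 1 only when its condition holds
row = conn.execute("""
select sum(case when dir = 'IN'  then 1 end),
       sum(case when dir = 'OUT' then 1 end),
       sum(case when c1 =  c2    then 1 end),
       sum(case when c1 <> c2    then 1 end)
from t
""").fetchone()
print(row)  # (4, 2, 3, 3)
```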

  • How to display top Position Level using conditions / Exception Aggregation

    Hello,
    I have the following problem where I need to list out the row with the highest POSITION LEVEL (with reference to BUSINESS PARTNER). I have tried all possibilities, including Conditions and Exception Aggregation (MAX), but nothing seems to work. Conditions work to an extent, i.e. they show the top position level, but unfortunately for the entire query result: only one row is displayed (the first row to satisfy the condition) instead of one row per Business Partner. POSITION LEVEL is a KEY FIGURE value. It also exists as a CHARACTERISTIC under a different name in the cube. How can I solve this? Please help.
    This is how the report looks at present...
    First Name     Last Name     Business Partner Nr.     Date of Birth     Postalcode     City     Position Level
    Testfallini     Enzweiler     3000000020     21.10.1990     63674     New York     1
    Tamara     Dimmer     3000179274     10.09.1987     54689     Chicago     2
    Tamara     Dimmer     3000179274     10.09.1987     54689     Chicago     1
    Tamara     Dimmer     3000179274     10.09.1987     54689     Chicago     1
    Thu-Ha     Tran     3000069951     25.12.1988     93047     Atlanta     2
    Thu-Ha     Tran     3000069951     25.12.1988     93047     Atlanta     1
    This is how the report looks when I use conditions...
    First Name     Last Name     Business Partner Nr.     Date of Birth     Postalcode     City     Position Level
    Tamara     Dimmer     3000179274     10.09.1987     54689     Chicago     2
    This is how the report should look...
    First Name     Last Name     Business Partner Nr.     Date of Birth     Postalcode     City     Position Level
    Testfallini     Enzweiler     3000000020     21.10.1990     63674     New York     1
    Tamara     Dimmer     3000179274     10.09.1987     54689     Chicago     2
    Thu-Ha     Tran     3000069951     25.12.1988     93047     Atlanta     2
    Thanks in advance,
    SD

    Hi Sebastian,
    I thought you needed to display all the records with the highest position level at the top, but now I realize your requirement:
    it should display only the maximum position level for each Business Partner.
    You can try using a Condition on the position level key figure, with the Top N operator and value = '1'.
    Then go to the characteristic assignment tab in the condition and select Most Detailed Characteristics along Rows / Individual Chars. and Combinations for the required combination of characteristics.
    [List Condition for All Characteristics in Drilldown Independent|http://help.sap.com/saphelp_nw04/helpdata/en/86/dfc405ab60524ea0d3e89db15fb2e7/content.htm]
    [Defining Conditions |http://help.sap.com/saphelp_nw04/helpdata/en/73/702e39074dc93de10000000a114084/frameset.htm]
    Hope this helps....
    Rgs,
    Ravikanth.

  • Help using Rules for aggregation conditions

    I am trying to utilize the application server rules engine to enforce validation on the XML payload. What I am trying to do is use it for a timesheet application, where after a user enters their hours, they submit and it gets some simple validation using the rules engine and gets returned if not valid according to the ruleset. The application supports multiple rows of hours entered, with each row specifying a unique task/activity worked that period. I have it working as I want using XML facts and RL Functions to determine the rules, but I want to take away the RL functions that need to iterate through each row and replace them with aggregate functions in the rule conditions themselves. Here are the rules I want to enforce:
    <ul>
    <li>An employee must not enter more than x hours of time worked per day, where x is a variable defined in the rules engine. As a constraint, the number obviously cannot exceed 24. This rule must span all row entries submitted, summing the hours entered for each day.</li>
    <li>An employee must enter at least y hours worked total each week, where y is a variable defined in the rules engine. Typically this will be defined as 40 hours. It must aggregate all hours entered in all rows for the week to determine how many have been submitted.</li>
    </ul>
    Currently I have 2 RL functions that each return true/false and check the above conditions, but each needs to iterate through all rows entered and sum up the hours. This is probably satisfactory, except that my purpose is to show the power of the rules engine to a client, and if I have to show them that I had to develop custom RL code to support this, I fear they may shy away from using it.
    Here is the general structure of my XML payload:
    <Timesheet>
      <EmployeeDetails>
        <FirstName>John</FirstName>
        <LastName>Doe</LastName>
        <EmployeeID>12345</EmployeeID>
      </EmployeeDetails>
      <WeeklyHoursEntries>
        <WeeklyHoursEntryRow>
          <TaskDetails>
            <Activity>TaskActivity1</Activity>
            <Description>Some description here</Description>
            <Category>REG</Category>
          </TaskDetails>
          <HoursEntries>
            <Day1>0</Day1>
            <Day2>8</Day2>
            <Day3>8</Day3>
            <Day4>9</Day4>
            <Day5>5</Day5>
            <Day6>10</Day6>
            <Day7>0</Day7>
          </HoursEntries>
         </WeeklyHoursEntryRow>
         <WeeklyHoursEntryRow>
           <TaskDetails>
             <Activity>TaskActivity1</Activity>
             <Description>Some description here</Description>
             <Category>OVT</Category>
           </TaskDetails>
           <HoursEntries>
             <Day1>0</Day1>
             <Day2>2</Day2>
             <Day3>8</Day3>
             <Day4>0</Day4>
             <Day5>0</Day5>
             <Day6>10</Day6>
             <Day7>0</Day7>
           </HoursEntries>
         </WeeklyHoursEntryRow>
       </WeeklyHoursEntries>
    </Timesheet>
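Both aggregate rules reduce to per-day and per-week sums across all rows. A plain-Python sketch of the intended semantics (the limits x=10 and y=40 are assumed values, and the sample hours mirror the payload above):

```python
import xml.etree.ElementTree as ET

XML = """<WeeklyHoursEntries>
  <WeeklyHoursEntryRow><HoursEntries>
    <Day1>0</Day1><Day2>8</Day2><Day3>8</Day3><Day4>9</Day4>
    <Day5>5</Day5><Day6>10</Day6><Day7>0</Day7>
  </HoursEntries></WeeklyHoursEntryRow>
  <WeeklyHoursEntryRow><HoursEntries>
    <Day1>0</Day1><Day2>2</Day2><Day3>8</Day3><Day4>0</Day4>
    <Day5>0</Day5><Day6>10</Day6><Day7>0</Day7>
  </HoursEntries></WeeklyHoursEntryRow>
</WeeklyHoursEntries>"""

MAX_PER_DAY, MIN_PER_WEEK = 10, 40   # the x and y variables (assumed)

rows = ET.fromstring(XML).findall(".//HoursEntries")
# Rule 1: hours per day, summed across all rows, must not exceed x
per_day = [sum(int(r.find(f"Day{d}").text) for r in rows) for d in range(1, 8)]
# Rule 2: total hours for the week must reach at least y
week_total = sum(per_day)
ok_daily = all(h <= MAX_PER_DAY for h in per_day)
ok_weekly = week_total >= MIN_PER_WEEK
print(per_day, week_total, ok_daily, ok_weekly)
# [0, 10, 16, 9, 5, 20, 0] 60 False True
```

With this sample data the weekly minimum passes but days 3 and 6 exceed the daily limit, which is exactly what an aggregate condition in the ruleset would need to flag.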

    This is an administrators' subforum. You might have more luck in one of the developer subforums here on OTN.

  • Invoice Aggregation for condition Items

    Dear Experts,
    The PO contains 25 items. I have put conditions in the header of the PO, then I did a GR against all items of this PO.
    The problem is that when I execute a MIRO against this PO, the system splits the conditions across all the items, so the MIRO document contains 50 items.
    I need the MIRO to contain the 25 items, with the condition items cumulative and not split for every item.
    Please advise.
    Best Regards

    hi Vikrant,
    in my opinion there is no way to reduce the number of lines in the invoice.
    In the delivery note you have single lines for the batches, and you need these for the goods movement.
    In the invoice, you can't combine lots of delivery lines into one single invoice line.
    The foreign trade data has to be correct in the invoice.
    You can alter the printing program, and there you can combine the 100 lines into one line at print time.
    hans

  • Data in the Cube not getting aggregated

    Hi Friends
    We have Cube 1 and Cube 2.
    The data flow is represented below:
    R/3 DataSource > Cube1 > Cube2
    In Cube1, data is stored according to the Calendar Day.
    Cube2 has Calweek.
    In the Transformations of Cube 1 and Cube 2, Calday of Cube 1 is mapped to Calweek of Cube 2.
    In Cube2, when I upload data from Cube1, the key figure values are not getting summed.
    EXAMPLE: Data in Cube 1
    MatNo CustNo qty calday
    10001  xyz     100  01.01.2010
    10001  xyz      100  02.01.2010
    10001  xyz      100   03.01.2010
    10001  xyz     100  04.01.2010
    10001  xyz      100  05.01.2010
    10001  xyz      100   06.01.2010
    10001  xyz      100   07.01.2010
    Data in Cube 2:
    MatNo CustNo qty calweek
    10001  xyz     100  01.2010
    10001  xyz      100  01.2010
    10001  xyz      100   01.2010
    10001  xyz     100   01.2010
    10001  xyz      100   01.2010
    10001  xyz      100   01.2010
    10001  xyz      100   01.2010
    But Expected Output Should be:
    MatNo CustNo qty calweek
    10001  xyz     700  01.2010
    How do I achieve this?
    I checked in the transformations; all key figures are maintained with aggregation Summation.
    regards
    Preetam
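The expected compression (seven daily records collapsing into one weekly record) is ordinary key-figure summation over the shared key. A quick sketch of the grouping (calweek derivation hard-coded, since all sample days fall in week 01.2010):

```python
from collections import defaultdict

# (matno, custno, calday, qty) as loaded into Cube1
cube1 = [("10001", "xyz", f"0{d}.01.2010", 100) for d in range(1, 8)]

def calweek(calday):
    # all sample days 01.01.2010-07.01.2010 map to the same week
    return "01.2010"

# Summation aggregation: records sharing (matno, custno, calweek) merge
cube2 = defaultdict(int)
for matno, custno, day, qty in cube1:
    cube2[(matno, custno, calweek(day))] += qty

print(dict(cube2))  # {('10001', 'xyz', '01.2010'): 700}
```

If the seven rows stay separate in Cube2, the request key still contains a day-level characteristic (e.g. the data packages were not compressed over it), which is what the consistency check below hints at.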

    Just now I performed a consistency check for the cube.
    I am getting the following warnings:
    Time characteristic 0CALWEEK value 200915 does not fit with time char 0CALMONTH val 0
    Consistency of time dimension of InfoCube &1
    Description
    This test checks whether or not the time characteristics of the InfoCube used in the time dimension are consistent. The consistency of time characteristics is extremely important for non-cumulative Cubes and partitioned InfoCubes.
    Values that do not fit together in the time dimension of an InfoCube result in incorrect results for non-cumulative cubes and InfoCubes that are partitioned according to time characteristics.
    For InfoCubes that have been partitioned according to time characteristics, conditions for the partitioning characteristic are derived from restrictions for the time characteristic.
    Errors
    When an error arises, the InfoCube is marked as a cube with a non-consistent time dimension. This has the following consequences:
    The derivation of conditions for partitioning criteria is deactivated on account of the non-fitting time characteristics. This usually has a negative effect on performance.
    When the InfoCube contains non-cumulatives, the system generates a warning for each query indicating that the displayed data may be incorrect.
    Repair Options
    Caution
    No action is required if the InfoCube does not contain non-cumulatives or is not partitioned.
    If the Infocube is partitioned, an action is only required if the read performance has gotten worse.
    You cannot automatically repair the entries of the time dimension table. However, you are able to delete entries that are no longer in use from the time dimension table.
    The system displays whether the incorrect dimension entries are still being used in the fact table.
    If these entries are no longer being used, you can carry out an automatic repair. In this case, all time dimension entries not being used in the fact table are removed.
    After the repair, the system checks whether or not the dimension is correct. If the time dimension is correct again, the InfoCube is marked as an InfoCube with a correct time dimension once again.
    If the entries are still being used, use transaction Listcube to check which data packages are affected.  You may be able to delete the data packages and then use the repair to remove the time dimension entries no longer being used. You can then reload the deleted data packages. Otherwise the InfoCube has to be built again.

  • Cannot input data when the value for the selection condition is removed

    Dear Experts,
    We have met a very strange issue with IP.
    We created an aggregation level and a related query for users to key in data.
    We have a filter in the aggregation level.
    It sets values for A, B, C, D.
    When the user opens the report, the system requires the user to key in values for A, B, C, D.
    Now we have found that if we key in a value for B, the cell is input-ready.
    If we remove the value for B in the selection condition (I mean the value of B is empty, which means all values of B will display in the report), we cannot key in data.
    Could you kindly let me know the reason?
    Thanks and best regards
    Alex yang

    Dear Experts,
    Many thanks for your information.
    I know the principle for the IP.
    But I think you may have misunderstood this issue due to my incorrect explanation.
    First, we think the aggregation level is OK, because for B in my example we set it as a column value in the query.
    This means that each record in the IP query has only one B value.
    But the strange thing is that if we set a fixed value for B, IP input is OK.
    If we remove the fixed value for B, the IP function errors.
    Now we will test whether IP input works if we key in multiple values for B.
    Any update, I will inform you.
    Thanks and best regards
    Alex yang

  • Issue in using presentation variable as filter condition in the reports

    Hi,
    I have an issue in using a presentation variable as a filter condition in my reports; the details are as follows:
    Details :
    We want to implement the Max and Min variables through presentation variables only; we do not want to implement them through session variables in this case.
    We have two variables, MIN and MAX, to be used as presentation variables for a column of the report (a quantity), so that the user can view the data for this column within a particular range, i.e. between the Min and the Max. This part has been implemented well. The issue is when the user wants to see the full data: in that case we do not pass any values to these two presentation variables (in other words, we are not restricting the report data), and this is when the report throws an error. We want to leave these variables blank in that case, but doing so gives an error.
    Please suggest how I can overcome this issue.
    Thanks in Advance.
    Regards,
    Praveen

    I think you have to use guided navigation for this. Create two reports: the first is the one you have currently, and the second is the same report with the presentation variable removed from the column formula, i.e. with no restriction applied.
    Now create a dummy report and make it return a value only when the presentation variable values are not set. Guide the navigation between the first and second report based on the result of the dummy report.

  • Query Designer not working with Aggregation Levels on BW 7.30

    Hi,
    Every time I try to create a query on top of an Aggregation Level that I created with the new ABAP-based RSPLAN transaction, Query Designer fails with an unhandled exception.
    If somebody can help me out I'd appreciate it.
    By the way, we don't have the Java stack, but we are currently using BO4's JVM as we're on ramp-up.
    This is the error that I get when trying to create the query:
    QD Revision 667
    ERROR ID: E-ONGUIUNHANDLEDEXCEPTION
    And this is the error log:
    1:42:13 p.m..319: Info: Query Designer Start. Revision: 667
    QDbCommandBase::Execute  - Standard View
    QDbCommandBase::Execute  - Table View
    QDbCommandBase::Execute  - Rows/Columns
    QDbCommandBase::Execute  - Cells
    QDbCommandBase::Execute  - Conditions
    QDbCommandBase::Execute  - Exceptions
    QDbCommandBase::Execute  - InfoProvider
    QDbCommandBase::Execute  - Filter
    QDbCommandBase::Execute  - Documents
    QDbCommandBase::Execute  - Where-Used List
    QDbCommandBase::Execute  - Properties
    QDbCommandBase::Execute  - Properties
    QDbCommandBase::Execute  - Messages
    QDbCommandManager::ItemClickedHandler - Bar clicked: NewQuery
    QDbCommandBase::Execute  - New...
    QDbCommandBase::Execute  - Table View
    QDbCommandBase::Execute  - Rows/Columns
    QDbCommandBase::Execute  - Cells
    QDbCommandBase::Execute  - Conditions
    QDbCommandBase::Execute  - Exceptions
    QDbCommandBase::Execute  - InfoProvider
    QDbCommandBase::Execute  - Filter
    QDbCommandBase::Execute  - Documents
    QDbCommandBase::Execute  - Where-Used List
    QDbCommandBase::Execute  - Messages
    QDbCommandBase::Execute  - Table View
    QDbCommandBase::Execute  - Cells
    QDbCommandBase::Execute  - Conditions
    QDbCommandBase::Execute  - Exceptions
    QDbCommandBase::Execute  - Rows/Columns
    QDbCommandBase::Execute  - Filter
    QDbCommandBase::Execute  - InfoProvider
    QDbCommandBase::Execute  - Properties
    -EXCEPTION-START- 1:42:52 p.m..416: TRACE EXCEPTION  ---
    Exception Name: TargetInvocationException
    Exception Message: Exception has been thrown by the target of an invocation.
    Exception    at System.RuntimeMethodHandle._InvokeMethodFast(Object target, Object[] arguments, SignatureStruct& sig, MethodAttributes methodAttributes, RuntimeTypeHandle typeOwner)
       at System.RuntimeMethodHandle.InvokeMethodFast(Object target, Object[] arguments, Signature sig, MethodAttributes methodAttributes, RuntimeTypeHandle typeOwner)
       at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture, Boolean skipVisibilityChecks)
       at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture)
       at System.RuntimeType.InvokeMember(String name, BindingFlags bindingFlags, Binder binder, Object target, Object[] providedArgs, ParameterModifier[] modifiers, CultureInfo culture, String[] namedParams)
       at System.Type.InvokeMember(String name, BindingFlags invokeAttr, Binder binder, Object target, Object[] args)
       at com.sap.bi.et.QueryDesigner.QDpPropertyBase.PropertyGet(Object iPropertySource, String iPropertyName)
       at com.sap.bi.et.QueryDesigner.QDpPropertyBase.ValueGetDefault(Object iPropertySource)
       at com.sap.bi.et.QueryDesigner.QDpPropertyBase.pValueDefault()
       at com.sap.bi.et.QueryDesigner.QDpPropertyBase.ValueDefault()
       at com.sap.bi.et.QueryDesigner.QDpPropertyBase.ValueResolved()
       at com.sap.bi.et.QueryDesigner.QDpPropertyMulti.SetFirstProperty(QDpPropertyBase iProperty)
       at com.sap.bi.et.QueryDesigner.QDpPropertyMulti.Add(QDpPropertyBase iProperty)
       at com.sap.bi.et.QueryDesigner.QDuPropPageQuery.ToDialog()
       at com.sap.bi.et.QueryDesigner.QDuPropPageBase.ToDialogCore()
       at com.sap.bi.et.QueryDesigner.QDuPropPageBase.set_CommandContext(QDbCommandContext Value)
       at com.sap.bi.et.QueryDesigner.QDuPropPages.EnablePage()
       at com.sap.bi.et.QueryDesigner.QDuPropPages.ContextChangedHandler(Object iSender, QDbCommandContext iCommandContext)
       at com.sap.bi.et.QueryDesigner.QDbCommandManager.ContextChangedEventHandler.Invoke(Object iSender, QDbCommandContext iCommandContext)
       at com.sap.bi.et.QueryDesigner.QDbCommandManager.OnContextChanged(QDbCommandContext iCommandContext)
       at com.sap.bi.et.QueryDesigner.QDbCommandManager.CalculateContext(QDcView iViews, QDeAreaType iAreaType)
       at com.sap.bi.et.QueryDesigner.QDbCommandManager.CalculateContext(QDbElement iElement)
       at com.sap.bi.et.QueryDesigner.QDbCommandPropertiesQuery.ExecuteCommand()
       at com.sap.bi.et.QueryDesigner.QDbCommandBase.Execute()
       at com.sap.bi.et.QueryDesigner.QDbCommandManager.CommandExecute(QDbCommandBase iCommand)
       at com.sap.bi.et.QueryDesigner.QDbCommandManager.InitialCommandExecute(QDbCommandBase iCommand)
       at com.sap.bi.et.QueryDesigner.QDbCommandManager.DoExecuteCommandInternal()
       at com.sap.bi.et.QueryDesigner.QDbCommandManager.DoExecuteCommand(QDbCommandBase iCommand)
       at com.sap.bi.et.QueryDesigner.QDbCommandManager.ItemClickedHandler(Object sender, BarItemClickedEventArgs args)
       at com.sap.bi.et.QueryDesigner.QDdEventDispatcher.MenuItemClickedHandler(Object iSender, BarItemClickedEventArgs iE)
       at Syncfusion.Windows.Forms.Tools.XPMenus.BarManager.OnItemClicked(BarItemClickedEventArgs args)
       at Syncfusion.Windows.Forms.Tools.XPMenus.BarItem.OnItemClicked(EventArgs args)
       at com.sap.bi.et.QueryDesigner.QDiBarItem.OnItemClicked(EventArgs args)
       at Syncfusion.Windows.Forms.Tools.XPMenus.BarItem.PerformClick()
       at Syncfusion.Windows.Forms.Tools.XPMenus.BarRenderer.OnMouseUp(MouseEventArgs e)
       at Syncfusion.Windows.Forms.Tools.XPMenus.BarControlInternal.OnMouseUp(MouseEventArgs e)
       at System.Windows.Forms.Control.WmMouseUp(Message& m, MouseButtons button, Int32 clicks)
       at System.Windows.Forms.Control.WndProc(Message& m)
       at Syncfusion.Windows.Forms.Tools.XPMenus.BarControlInternal.WndProc(Message& m)
       at System.Windows.Forms.Control.ControlNativeWindow.OnMessage(Message& m)
       at System.Windows.Forms.Control.ControlNativeWindow.WndProc(Message& m)
       at System.Windows.Forms.NativeWindow.Callback(IntPtr hWnd, Int32 msg, IntPtr wparam, IntPtr lparam)
    Full Stack:   at com.sap.bi.et.common.appl.Log.Debug.WriteTraceToFile(Level lLevel, String lString, Exception ex)
       at com.sap.bi.et.common.appl.Log.Trace.Exception(Exception ex, String iAdditionalInformation)
       at com.sap.bi.et.QueryDesigner.QDbApplicationData.OnGuiUnhandledException(Object iSender, ThreadExceptionEventArgs iEventArgs)
       at System.Windows.Forms.Application.ThreadContext.OnThreadException(Exception t)
       at System.Windows.Forms.Control.WndProcException(Exception e)
       at System.Windows.Forms.Control.ControlNativeWindow.OnThreadException(Exception e)
       at System.Windows.Forms.NativeWindow.Callback(IntPtr hWnd, Int32 msg, IntPtr wparam, IntPtr lparam)
       at System.Windows.Forms.UnsafeNativeMethods.DispatchMessageW(MSG& msg)
       at System.Windows.Forms.Application.ComponentManager.System.Windows.Forms.UnsafeNativeMethods.IMsoComponentManager.FPushMessageLoop(Int32 dwComponentID, Int32 reason, Int32 pvLoopData)
       at System.Windows.Forms.Application.ThreadContext.RunMessageLoopInner(Int32 reason, ApplicationContext context)
       at System.Windows.Forms.Application.ThreadContext.RunMessageLoop(Int32 reason, ApplicationContext context)
       at System.Windows.Forms.Application.Run(Form mainForm)
       at com.sap.bi.et.QueryDesigner.QDbApplicationData.Run(Boolean iAsApplication)
       at com.sap.bi.et.QueryDesigner.QDbQueryDesigner.Run(Boolean iAsApplication)
       at com.sap.bi.et.QueryDesigner.QDStarter.QDStartup.Main()
    -EXCEPTION-END----
    -CALLING-FROM- 1:42:52 p.m..416: TRACE EXCEPTION  ---
       at com.sap.bi.et.common.appl.Log.Debug.WriteTraceToFile(Level lLevel, String lString, Exception ex)
       at com.sap.bi.et.common.appl.Log.Trace.Exception(Exception ex, String iAdditionalInformation)
       at com.sap.bi.et.QueryDesigner.QDbApplicationData.OnGuiUnhandledException(Object iSender, ThreadExceptionEventArgs iEventArgs)
       at System.Windows.Forms.Application.ThreadContext.OnThreadException(Exception t)
       at System.Windows.Forms.Control.WndProcException(Exception e)
       at System.Windows.Forms.Control.ControlNativeWindow.OnThreadException(Exception e)
       at System.Windows.Forms.NativeWindow.Callback(IntPtr hWnd, Int32 msg, IntPtr wparam, IntPtr lparam)
       at System.Windows.Forms.UnsafeNativeMethods.DispatchMessageW(MSG& msg)
       at System.Windows.Forms.Application.ComponentManager.System.Windows.Forms.UnsafeNativeMethods.IMsoComponentManager.FPushMessageLoop(Int32 dwComponentID, Int32 reason, Int32 pvLoopData)
       at System.Windows.Forms.Application.ThreadContext.RunMessageLoopInner(Int32 reason, ApplicationContext context)
       at System.Windows.Forms.Application.ThreadContext.RunMessageLoop(Int32 reason, ApplicationContext context)
       at System.Windows.Forms.Application.Run(Form mainForm)
       at com.sap.bi.et.QueryDesigner.QDbApplicationData.Run(Boolean iAsApplication)
       at com.sap.bi.et.QueryDesigner.QDbQueryDesigner.Run(Boolean iAsApplication)
       at com.sap.bi.et.QueryDesigner.QDStarter.QDStartup.Main()

    Hi,
    It seems that you are already using the most recent Query Designer version. Is your SAP GUI also the most recent version? If yes, open an OSS ticket. If not, reinstall the SAP GUI and the BEx tools. I am also working with this Query Designer version and I don't have this kind of problem.
    Regards,
    Gregor

  • How can I insert an aggregated column name as a string in the target table?

    I have a large source table, with almost 70 million records.  I need to pull the sum of four of the columns into another target table, but instead of having the same four target columns I just want to have two.
    So, let's take SUM(col1), SUM(col2), SUM(col3), and SUM(col4) from the source DB and insert them into the target like this:
    SOURCE_COLUMN | SUM_AMOUNT
    col1          | SUM(col1)
    col2          | SUM(col2)
    col3          | SUM(col3)
    col4          | SUM(col4)
    I know how to do this in four separate Data Flows using the source, an Aggregation Transformation, a Derived Column (to hard-code the SOURCE_COLUMN label), and a destination... but with this many records, it takes over 3 hours to run because it has to loop through these records four separate times instead of once. Isn't there a way to do this with one Data Flow? I'm thinking maybe Conditional Split?
    Any help is appreciated, thanks!

    Hi,
    This could be achieved using the UNPIVOT transformation. The sample uses the following source query:
    SELECT 1 AS COL1, 2 AS COL2, 3 AS COL3, 4 AS COL4
    [screenshot: UNPIVOT transformation setup]
    [screenshot: UNPIVOT transformation output]
    Hope this helps.
    Best Regards, Sorna
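    Outside SSIS, the single-pass aggregate-then-unpivot shape the thread asks for can be sketched in plain Python (hypothetical data and column names, not the SSIS API):

```python
# Single pass over the source rows: accumulate the four sums,
# then "unpivot" the totals into (SOURCE_COLUMN, SUM_AMOUNT) rows.
rows = [
    {"col1": 1, "col2": 2, "col3": 3, "col4": 4},
    {"col1": 10, "col2": 20, "col3": 30, "col4": 40},
]

columns = ["col1", "col2", "col3", "col4"]
totals = {c: 0 for c in columns}

for row in rows:          # one loop over the data, not four
    for c in columns:
        totals[c] += row[c]

# Unpivot: one output row per source column.
target = [{"SOURCE_COLUMN": c, "SUM_AMOUNT": totals[c]} for c in columns]
# target[0] == {"SOURCE_COLUMN": "col1", "SUM_AMOUNT": 11}
```

    The point is the same as the UNPIVOT answer: the 70 million rows are read once to build the sums, and only the four totals are reshaped.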

  • If conditions in a query

    Hi All
    If anyone has used IF conditions (Boolean operators) in Query Designer, please assist me. Following is my requirement:
    I have a characteristic CHAR-A which is EQ 'AB'; based on this I should calculate the formula as in
    IF CHAR-A = 'AB'.
      KEYFIGURE4 = (KEYFIGURE1 - KEYFIGURE2 + KEYFIGURE3) * -1.
    ENDIF.
    Hope I am clear.
    Thanks in advance
    Regards

    Solved it myself: I used a formula with an aggregation reference characteristic.
    Regards
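    In BEx formulas, Boolean comparisons evaluate to 1 or 0, so the IF above can be expressed as a multiplication. A minimal sketch of that arithmetic in Python (hypothetical values, not BEx syntax):

```python
# BEx-style conditional: a comparison yields 1 (true) or 0 (false),
# so multiplying by it applies the formula only when the condition holds.
def conditional_kf(char_a_matches: bool, kf1: float, kf2: float, kf3: float) -> float:
    cond = 1 if char_a_matches else 0      # stands in for (CHAR-A == 'AB')
    return cond * ((kf1 - kf2 + kf3) * -1)

conditional_kf(True, 10, 4, 2)   # condition true: formula applies, -> -8
conditional_kf(False, 10, 4, 2)  # condition false: result is 0
```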

  • Setting aggregation level in POSDM

    Hi experts
    I would like to consult you regarding the SAP standard function for the aggregation level to be set in POSDM.
    The requirement I have is to aggregate sales at the level of day/store/article/promotion ID.
    Is there any solution to this in terms of configuration?
    Really appreciate your help.
    BR
    Dominic

    Hi,
    The aggregation method plays a key role in one-step processing tasks. For example, if you want to generate the WPUTAB IDoc, there is an aggregation method code specifically available to aggregate based on the means of payment.
    Similarly, you can have an aggregation method based on material, by sales item and qualifier, and also with the material & conditions combination.
    You can see the difference when you choose aggregation method code 0000 for task code 0013 - Generate IDoc WPUTAB and process it. Later, change the aggregation method code to 0005 and you can see the difference in the POS Workbench as well.
    NOTE: You can also create a custom aggregation method - for example, aggregation based on transaction type/sales item type, etc.
    Thanks and Regards,
    Ramesh D

  • Is there a way to make closed captioning persist in an Aggregator project?

    Hello,
    I'm running Captivate 5.0. My project consists of 12 movies, all of which have audio and closed captioning. (In these 12 movies, the default is to have closed captioning turned off.) I then use Aggregator so the 12 movies can be viewed consecutively from the Aggregator SWF/HTM file. When the Aggregator project starts playing, closed captioning defaults to off, which is what I want. The viewer has the option to turn on closed captioning through the playback bar. If turned on, the closed captioning displays fine until the next movie in the Aggregator project starts playing, at which point the closed captioning turns off. Is there any way to keep closed captioning turned on? Otherwise, if viewers need closed captioning, they will need to turn it on 12 times as they view the project...
    Thank you for your help!

    Think you'll need the Save and Load Data widget from Michael Lund (cpguru). You could then use a user variable that is set to 1 if CC is toggled on in a movie, and to 0 if it is not. This variable has to be saved and loaded in each movie, hence the widget (which is not free). On entering each file, you could create a conditional advanced action that checks the value of that variable and, depending on its value, toggles the system variable cpCmndCC.
    Lilybiri

  • How to skip aggregated Datastrings before the Intersection Report launched.

    Hi there,
    We are facing an issue during our monthly upload process that forces us to constantly switch the "Enable Intersection Validation" trigger.
    Example:
    We have two different import DataStrings that are imported (different accounts). The aggregated value of both strings is equal to zero.
    Account     ICP     Amount
    ABC     A     -40
    XYZ     A     40
    Actually both accounts are mapped to the same target account (for example Asset_Cash), and the ICP information is mapped to the same target ICP (T_A) as well.
    In the end, FDM tries to upload the aggregated value to
    Account          ICP     Amount
    Asset_Cash          T_A     0
    Now, Asset_Cash does not allow intercompany, and even though the amount is zero, the validation report is not successful; and if I disable it, I get the error during the export, as the system tells me that the target intersection is not valid.
    Deleting the zero lines during the event script that creates the DAT file is not the solution I would choose, as I need the intersection validation report.
    So that means I need to skip them after the mapping and aggregation, but before the intersection validation is done.
    How can I solve this?
    P.S.
    I can't do it during the import either, as the finance user is not used to changing any import scripts.
    Thanks a lot to all of you

    Ciao SH,
    Thank you for the quick support.
    Well, I wished it could be that simple. :)
    In my example, the "A" ICP is a correct ICP member and needs to be mapped to T_A, as it can also be used for ICP accounts.
    I can't use conditional mapping either, as the combination of ICP, custom dimensions and so on would be too complicated.
    In the end, I am looking for a way to change the system so that it skips the aggregated zero DataString before the intersection validation report is run.
    Cheers and thanks again
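    Generically, the skip the poster describes — aggregate after mapping, then drop zero intersections before validation sees them — can be sketched in plain Python (hypothetical data, not the FDM scripting API):

```python
# After mapping, aggregate by (Account, ICP) and drop rows whose
# aggregated amount is zero, before any validation step runs.
from collections import defaultdict

mapped_rows = [
    {"Account": "Asset_Cash", "ICP": "T_A", "Amount": -40},
    {"Account": "Asset_Cash", "ICP": "T_A", "Amount": 40},
]

totals = defaultdict(float)
for r in mapped_rows:
    totals[(r["Account"], r["ICP"])] += r["Amount"]

# Keep only non-zero intersections for the validation report.
to_validate = [
    {"Account": a, "ICP": i, "Amount": amt}
    for (a, i), amt in totals.items()
    if amt != 0
]
# Here the -40/+40 pair nets to zero, so to_validate is empty and the
# invalid Asset_Cash/T_A intersection never reaches validation.
```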
