GROUP BY ROLLUP with daily stats

I am fairly new to the GROUP BY ROLLUP clause, and I want to roll up daily stats into monthly subtotals. When I try to roll up this data by month, it rolls it up day by day, then gives me a subtotal of all months together, and then a grand total at the bottom for all interface_ids. Is it possible to take daily stats and roll them up by month? I'm confused.
set linesize 300
col company_info for a30
col site_name for a25
col Month for a20
select c.company_info,
s.site_name,
i.interface_name,
to_char(to_date(a.day_month_year,'MM/DD/YY'), 'Month DD YYYY') month,
nvl(sum(adm_total_hits),0) adm_hits,
nvl(sum(end_total_hits),0) end_hits,
nvl(sum(bandwidth_bytes),0) bandwidth
from siteinfo.companies c,
siteinfo.sites s,
siteinfo.interfaces i,
apache_daily_stats a
where c.company_id = s.company_id
and s.site_id = i.site_id
and i.interface_id = a.interface_id(+)
and i.interface_id = 4511
group by rollup (c.company_info, s.site_name, i.interface_name, a.day_month_year)
set linesize 80
Thanks in advance,
ReedK

Yes, I would like to sum by month. This is opposed to building in-line views to sum up all of the daily stats into a month subtotal. To get 12 months of data using daily stats, I would have to build 12 in-line views. YUK! I would much rather use this cool group by rollup command to sum daily stats by month.
Any clarity on this function would be appreciated.
Thanks
ReedK
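
For what it's worth, here is a sketch of one way to do it (my addition, not tested against the original schema): group on the month truncated from the date instead of the raw day, and ROLLUP then produces one row per month plus the higher-level subtotals. This assumes day_month_year really holds 'MM/DD/YY' strings, as the original query implies.

```sql
-- Sketch: roll daily stats up to month level (plus interface/site/company
-- subtotals and a grand total). Assumes day_month_year is 'MM/DD/YY' text.
select c.company_info,
       s.site_name,
       i.interface_name,
       to_char(trunc(to_date(a.day_month_year, 'MM/DD/YY'), 'MM'),
               'Month YYYY') month,
       nvl(sum(a.adm_total_hits), 0)  adm_hits,
       nvl(sum(a.end_total_hits), 0)  end_hits,
       nvl(sum(a.bandwidth_bytes), 0) bandwidth
  from siteinfo.companies  c,
       siteinfo.sites      s,
       siteinfo.interfaces i,
       apache_daily_stats  a
 where c.company_id   = s.company_id
   and s.site_id      = i.site_id
   and i.interface_id = a.interface_id(+)
   and i.interface_id = 4511
 group by rollup (c.company_info, s.site_name, i.interface_name,
                  trunc(to_date(a.day_month_year, 'MM/DD/YY'), 'MM'))
```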

Similar Messages

  • GROUP BY ROLLUP with Grouped Category Subtotals

    Hi,
    I need the categories to have group totals like the example below. Notice that California has its own category subtotals and then a grand total.
    I could do this with unions, a row-type column, and sorting, but that would mean reading the entire table for each total and subtotal.
    Any suggestions?
    Thank You in Advance for Your Help,
    Lou
    My Data
    STATE       CITY           CATEGORY   NBR
    California  Los Angeles    AA           1
    California  Los Angeles    BB           2
    California  Los Angeles    CC           3
    California  San Diego      AA           4
    California  San Diego      BB           5
    California  San Diego      CC           6
    California  San Francisco  AA           7
    California  San Francisco  BB           8
    California  San Francisco  CC           9
    Desired Result
    STATE       CITY           CATEGORY   NBR
    California  Los Angeles    AA           1
    California  Los Angeles    BB           2
    California  Los Angeles    CC           3
    California  Los Angeles                 6
    California  San Diego      AA           4
    California  San Diego      BB           5
    California  San Diego      CC           6
    California  San Diego                  15
    California  San Francisco  AA           7
    California  San Francisco  BB           8
    California  San Francisco  CC           9
    California  San Francisco              24
    California                 AA          12
    California                 BB          15
    California                 CC          18
    Grand Total                            45

    My bad - that would be a CUBE then:
    SQL> with t as (select 'California' state, 'Los Angeles' city, 'AA' categ, 1 nbr from dual
      2  union all select 'California' state, 'Los Angeles' city, 'BB' categ, 2 nbr from dual
      3  union all select 'California' state, 'Los Angeles' city, 'CC' categ, 3 nbr from dual
      4  union all select 'California' state, 'San Diego' city, 'AA' categ, 4 nbr from dual
      5  union all select 'California' state, 'San Diego' city, 'BB' categ, 5 nbr from dual
      6  union all select 'California' state, 'San Diego' city, 'CC' categ, 6 nbr from dual
      7  union all select 'California' state, 'San Francisco' city, 'AA' categ, 7 nbr from dual
      8  union all select 'California' state, 'San Francisco' city, 'BB' categ, 8 nbr from dual
      9  union all select 'California' state, 'San Francisco' city, 'CC' categ, 9 nbr from dual)
    10  select state,
    11         city,
    12         categ,
    13         sum(nbr) total
    14   from t
    15   group by cube(state, city, categ)
    16  having (grouping(city) = 1 and grouping(state) = 0)
    17  or (grouping(categ) = 0 and grouping(city) = 0 and grouping(state) = 0)
    18  order by state, city, categ
    19  /
    STATE      CITY          CA      TOTAL
    California Los Angeles   AA          1
    California Los Angeles   BB          2
    California Los Angeles   CC          3
    California San Diego     AA          4
    California San Diego     BB          5
    California San Diego     CC          6
    California San Francisco AA          7
    California San Francisco BB          8
    California San Francisco CC          9
    California               AA         12
    California               BB         15
    California               CC         18
    California                          45
    13 rows selected.
    You can play around with the HAVING clause and the groupings and select the rows as needed.
    Regards,
    Francisco.
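
    As an aside (my addition, reusing the same WITH t test data as the transcript above): GROUPING SETS can name exactly the aggregation levels the original poster asked for, including the per-city subtotals, without filtering a CUBE through HAVING.

    ```sql
    -- Sketch: request only the levels wanted; no HAVING filter needed.
    select state, city, categ, sum(nbr) total
      from t
     group by grouping sets ((state, city, categ),  -- detail rows
                             (state, city),         -- city subtotals
                             (state, categ),        -- state-by-category subtotals
                             ())                    -- grand total
     order by state, city, categ
    ```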

  • GROUP BY ROLLUP Question

    I am curious if anyone has had this kind of problem. I'm posting a very generic example because the actual
    query is very complex and is about 120 lines long. I have the following: An outer query with 2 inline
    views as the source for the outer query.
    Select a.*, b.*
    From
    (Select statement from table_a where field3 = '1234' using GROUP BY ROLLUP) a,
    (Select statement from table_b where field3 = '1234' using GROUP BY ROLLUP) b
    Where
    a.field1 = b.field1 (+) and
    a.field2 = b.field2 (+)
    The expected results are a one to none, one to one or one to many between a & b.
    The results are very inconsistent. For some criteria (eg: field3 = '1234'), I get NO matches
    from the second inline view. Others, I get some but not all matches.
    If I take each inline view and use them to create tables, then run the outer query by joining the two tables, I get all of the matches.
    I'm solving it right now by running each inline view inside of a union query, and it performs acceptably, but I don't understand why I couldn't get the join to work properly.
    Has anyone else experienced this?
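
    A likely explanation (my reading, not confirmed in the thread): ROLLUP emits subtotal rows whose grouping columns are NULL, and NULL = NULL is never true, so the subtotal rows from the two inline views can never find each other through an equijoin. One sketch of a workaround, using the placeholder names from the post plus a hypothetical amount column, is to carry GROUPING_ID out of each view and join on it together with NVL-wrapped keys:

    ```sql
    -- Sketch: gid identifies the rollup level; NVL maps NULL keys to a
    -- sentinel ('#' here assumes the keys are strings) so subtotal rows join.
    Select a.field1, a.field2, a.total_a, b.total_b
    From  (Select field1, field2,
                  grouping_id(field1, field2) gid,
                  Sum(amount) total_a
             From table_a
            Where field3 = '1234'
            Group By Rollup (field1, field2)) a
    Left Join
          (Select field1, field2,
                  grouping_id(field1, field2) gid,
                  Sum(amount) total_b
             From table_b
            Where field3 = '1234'
            Group By Rollup (field1, field2)) b
      On  a.gid = b.gid
     And  nvl(a.field1, '#') = nvl(b.field1, '#')
     And  nvl(a.field2, '#') = nvl(b.field2, '#')
    ```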

    a quick one,
    SQL>
    SQL> With t As
      2  (
      3  Select 1990 c1, 10 c2, 1000 c3 From dual Union All
      4  Select 1991   , 10  ,2000    From dual Union All
      5  Select 1990   , 20  ,1500    From dual Union All
      6  Select 1991   , 20  ,2500    From dual
      7  )Select Case when Grouping(c1) = 1 Then 'Dept Total' Else  to_char(c1)  End  c1,
      8          Case When grouping(c2) = 1 Then 'Grand Total' Else to_char(c2)  End c2,
      9          Sum(c3)
    10      From  t
    11     Group By Rollup(c2,c1)
    12  /
    C1                                       C2                                          SUM(C3)
    1990                                     10                                             1000
    1991                                     10                                             2000
    Dept Total                               10                                             3000
    1990                                     20                                             1500
    1991                                     20                                             2500
    Dept Total                               20                                             4000
    Dept Total                               Grand Total                                    7000
    7 rows selected
    SQL>

  • Hierarchical Query with Rollup Sum (CONNECT BY with GROUP BY ROLLUP)

    Hi all,
    Imagine the following scenario: I have an ACCOUNTS table which holds accounts and their hierarchy (currently 5 levels), and a BALANCES table which holds balance records for the accounts. Only CHILD accounts (level 5) have records in the BALANCES table. Simple example:
    CREATE TABLE accounts (account_code VARCHAR2(30), parent_account VARCHAR2(30), account_desc VARCHAR2(400));
    CREATE TABLE balances (account_code VARCHAR2(30), balance_amount NUMBER(18,2));
    INSERT INTO ACCOUNTS VALUES ('TOT',NULL,'Total');
    INSERT INTO ACCOUNTS VALUES ('ANA1','TOT','General Expenses');
    INSERT INTO ACCOUNTS VALUES ('4801001','ANA1','Small Expenses');
    INSERT INTO ACCOUNTS VALUES ('4801002','ANA1','Transportation');
    INSERT INTO ACCOUNTS VALUES ('ANA2','TOT','Health Expenses');
    INSERT INTO ACCOUNTS VALUES ('4802001','ANA2','Healthcare');
    INSERT INTO ACCOUNTS VALUES ('4802002','ANA2','Facilities');
    INSERT INTO BALANCES VALUES ('4801001', 2000);
    INSERT INTO BALANCES VALUES ('4801002', 1000);
    INSERT INTO BALANCES VALUES ('4802001', 3000);
    INSERT INTO BALANCES VALUES ('4802002', 4000);
    What I need in this scenario is to run a hierarchical query where, for each node, I compute the sum of all its children (in LEAF nodes, which are the child accounts, this sum is the value from BALANCES itself). The final result would be:
    TOT -> 10000
      ANA1 -> 3000
        4801001 -> 2000
        4801002 -> 1000
      ANA2 -> 7000
        4802001 -> 3000
        4802002 -> 4000
    I have tried various ways, and found a workaround which works for a fixed number of levels: basically it builds the hierarchy and computes SYS_CONNECT_BY_PATH, then splits this with a regular expression and uses GROUP BY ROLLUP to compute the higher levels. Then I assemble it again, now with the computed values. Below is the example query:
    select level
        , NVL (vfinal.child_acct,'TOTAL') ||' - '||
                            ( SELECT account_desc
                                FROM accounts
                               WHERE account_code = vfinal.child_acct ) account_name
         , to_char(sum_bal, 'fm999g999g999g990') as rolled_up_balance
      from (
    select coalesce( princ.lvl3, princ.lvl2, princ.lvl1 ) child_acct
         , DECODE ( princ.lvl2 , NULL
                                      , NULL
                                      , DECODE ( princ.lvl3, NULL
                                      , princ.lvl1, princ.lvl2 ) ) parent_acct
         , sum(princ.balance_amount) sum_bal
    from (
    select hier.lvl1
         , hier.lvl2
         , hier.lvl3
         , hier.parent_account
         , hier.account_code child_acc
         , bal.balance_amount
      from ( select level 
                  , sys_connect_by_path( account_code, '/' ) hierarchy_acct
                  , REGEXP_SUBSTR(sys_connect_by_path( account_code, '/' ),'[^/]+',1,3) lvl3
                  , REGEXP_SUBSTR(sys_connect_by_path( account_code, '/' ),'[^/]+',1,2) lvl2
                  , REGEXP_SUBSTR(sys_connect_by_path( account_code, '/' ),'[^/]+',1,1) lvl1
                  , account_code
                  , parent_account 
               from accounts acc
               where level <= 3
               start with parent_account is null
           connect by nocycle prior account_code = parent_account
               order siblings by parent_account
               ) hier
          , balances  bal
      where bal.account_code = hier.account_code
    ) princ
    where princ.lvl1 is not null
    group by rollup ( princ.lvl1
                    , princ.lvl2
                    , princ.lvl3 )
    order by princ.lvl1
           , princ.lvl2
           , princ.lvl3
    ) vfinal
    where child_acct is not null
    start with parent_acct is null
    connect by nocycle prior child_acct = parent_acct
    All said and done, what I need is to do the same thing for an arbitrary number of levels, because this query has 3 fixed levels. Do you know how I can structure a new query where, independently of the number of levels, the parent sums are all rolled up like this?
    Thanks a lot in advance! Best Regards!
    Thiago

    Hi,
    Thiago wrote:
    Hi all,
    Imagine the following scenario: i have an ACCOUNT table which holds accounts and their hierarchy (currently 5 levels), and a BALANCE table which holds balance records for the accounts. Only CHILD accounts (level 5) have records in the BALANCE table. Simple example:
    CREATE TABLE accounts (account_code VARCHAR2(30), parent_account VARCHAR2(30), account_desc VARCHAR2(400));
    CREATE TABLE balances (account_code VARCHAR2(30), balance_amount NUMBER(18,2));
    INSERT INTO ACCOUNTS ('TOT',NULL,'Total');
    INSERT INTO ACCOUNTS ('ANA1','TOT','General Expenses');
    INSERT INTO ACCOUNTS ('4801001','ANA1','Small Expenses');
    INSERT INTO ACCOUNTS ('4801002','ANA1','Transportation');
    INSERT INTO ACCOUNTS ('ANA2','TOT','Health Expenses');
    INSERT INTO ACCOUNTS ('4802001','ANA2','Healthcare');
    INSERT INTO ACCOUNTS ('4802002','ANA2','Facilities');
    INSERT INTO BALANCES ('4801001', 2000);
    INSERT INTO BALANCES ('4801001', 1000);
    INSERT INTO BALANCES ('4802001', 3000);
    INSERT INTO BALANCES ('4802001', 4000);
    Thanks for posting the CREATE TABLE and INSERT statements. Remember why you do it: so that the people who want to help you can re-create the problem and test their ideas. If the statements don't work, then they are not so useful. None of the INSERT statements you posted work: they all need a VALUES keyword. Please test those statements before you post them.
    Also, make sure that the results you post correspond to the sample data you post. In your sample data, there are no rows in balances for account_codes '4801002' or '4802002'.
    I think you want something like this:
    WITH connect_by_results AS
    (
         SELECT  CONNECT_BY_ROOT account_code AS root_account_code
         ,       account_code
         FROM    accounts
                 -- NOTE: No START WITH clause
         CONNECT BY parent_account = PRIOR account_code
    )
    SELECT  c.root_account_code || ' -> '
            || TO_CHAR (SUM (b.balance_amount)) AS txt
    FROM    connect_by_results c
    LEFT OUTER JOIN balances b ON c.account_code = b.account_code
    GROUP BY c.root_account_code
    ;

  • Materialized views with GROUP by ROLLUP

    I am trying to create a materialized view (snapshot) using the GROUP BY ROLLUP functionality.
    The idea is to use a snapshot with all subtotals already calculated in place,
    for example an average grouped by ROLLUP over the columns of the time dimension,
    and to rely on Oracle's "query rewrite" functionality against that snapshot.
    Does anybody have experience with this method?
    Thank you
    Michael

    Query rewrite is an internal Oracle function. Normally you have nothing to do other than verify that the query OBIEE performs is actually rewritten.
    Alternatively, you can declare in OBIEE that you have created an aggregate, and OBIEE will (re-)write the query itself.
    http://www.rittmanmead.com/2007/10/26/using-the-obiee-aggregate-persistence-wizard/
    Regards
    Nico
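
    For reference, a minimal sketch of such a materialized view (the table, column, and MV names here are illustrative, not from the thread; query rewrite additionally needs QUERY_REWRITE_ENABLED and the QUERY REWRITE privilege). Storing SUM and COUNT side by side is what lets the optimizer rewrite AVG queries against it:

    ```sql
    -- Sketch: precompute ROLLUP subtotals; the optimizer can then rewrite
    -- matching GROUP BY queries to read this snapshot instead of the detail.
    CREATE MATERIALIZED VIEW sales_time_rollup_mv
      BUILD IMMEDIATE
      REFRESH COMPLETE ON DEMAND
      ENABLE QUERY REWRITE
    AS
    SELECT t.year,
           t.quarter,
           t.month,
           SUM(f.amount)   sum_amount,
           COUNT(f.amount) cnt_amount,  -- SUM + COUNT allow AVG rewrite
           COUNT(*)        cnt_rows,
           GROUPING_ID(t.year, t.quarter, t.month) gid
      FROM sales f, time_dim t
     WHERE f.time_id = t.time_id
     GROUP BY ROLLUP (t.year, t.quarter, t.month);
    ```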

  • Query with GROUP BY ROLLUP

    I have a query :
    SELECT e.deptno,d.dname,e.job,SUM(e.sal)
    FROM emp e,dept d
    WHERE e.deptno = d.deptno
    GROUP BY ROLLUP(e.deptno,d.dname,e.job)
    which gives the output with Deptno and Dept Name repeated for every row; I want them to appear only once for each Dept Num.
    Can you folks help me out on this?
    Thanks and Regards,
    Gaurav Srivastava

    scott@ORA92> BREAK  ON deptno ON dname
    scott@ORA92> SELECT e.deptno, d.dname, e.job, SUM(e.sal)
      2  FROM   emp e, dept d
      3  WHERE  e.deptno = d.deptno
      4  GROUP  BY ROLLUP (e.deptno, d.dname, e.job)
      5  ORDER  BY e.deptno, d.dname, e.job
      6  /
        DEPTNO DNAME          JOB       SUM(E.SAL)
            10 ACCOUNTING     CLERK           1300
                              MANAGER         2450
                              PRESIDENT       5000
                                              8750
                                              8750
            20 RESEARCH       ANALYST         6000
                              CLERK           1900
                              MANAGER         2975
                                             10875
                                             10875
            30 SALES          CLERK            950
                              MANAGER         2850
                              SALESMAN        5600
                                              9400
                                              9400
                                             29025
    16 rows selected.
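
    A side note on that output (my observation): each department subtotal appears twice (8750, 10875, 9400) because dname is functionally dependent on deptno, so the (deptno, dname) and (deptno) rollup levels aggregate identical sets of rows. Treating the two columns as one composite rollup column collapses the duplicate level; a sketch against the same tables:

    ```sql
    -- Sketch: (deptno, dname) rolled up as a single composite column gives
    -- one subtotal per department instead of two identical ones.
    SELECT e.deptno, d.dname, e.job, SUM(e.sal)
      FROM emp e, dept d
     WHERE e.deptno = d.deptno
     GROUP BY ROLLUP ((e.deptno, d.dname), e.job)
     ORDER BY e.deptno, d.dname, e.job;
    ```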

  • Problem with group by rollup

    Hi
    I am using group by rollup ((....)) in order to list my data and then provide a summary at the end.
    Please take this example:
    with mydata as (
    select 'ABC' Part_no, sysdate-10 as Trans_Date, 100 Amount from dual union all
    select 'ABC' Part_no, sysdate-10 as Trans_Date, -100 Amount from dual union all
    select 'ABD' Part_no, sysdate-10 as Trans_Date, 100 Amount from dual union all
    select 'ABD' Part_no, sysdate-10 as Trans_Date, 100 Amount from dual union all
    select 'ABE' Part_no, sysdate-10 as Trans_Date, 100 Amount from dual union all
    select 'ABE' Part_no, sysdate-10 as Trans_Date, 50 Amount from dual
    )
    select
    part_no,
    trans_date,
    sum(amount) total
    from
    mydata
    group by rollup ((part_no, trans_date))
    The trouble is, in that example, because some rows have no unique identifier, they are grouped together and summarised (by nature of GROUP BY).
    However, I want to retain the detail and also have a summary at the end, as a grand total.
    I get:
    ABC     5/13/2008 6:10:41 AM     0
    ABD     5/13/2008 6:10:41 AM     200
    ABE     5/13/2008 6:10:41 AM     150
                        350
    I want:
    ABC     5/13/2008 6:10:41 AM     100
    ABC     5/13/2008 6:10:41 AM     -100
    ABD     5/13/2008 6:10:41 AM     100
    ABD     5/13/2008 6:10:41 AM     100
    ABE     5/13/2008 6:10:41 AM     100
    ABE     5/13/2008 6:10:41 AM     50
                        350
    Can this be done?
    Thanks

    You can get the total with something like this:
    WITH mydata AS
         (SELECT 'ABC' part_no, SYSDATE - 10 AS trans_date, 100 amount
            FROM DUAL
          UNION ALL
          SELECT 'ABC' part_no, SYSDATE - 10 AS trans_date, -100 amount
            FROM DUAL
          UNION ALL
          SELECT 'ABD' part_no, SYSDATE - 10 AS trans_date, 100 amount
            FROM DUAL
          UNION ALL
          SELECT 'ABD' part_no, SYSDATE - 10 AS trans_date, 100 amount
            FROM DUAL
          UNION ALL
          SELECT 'ABE' part_no, SYSDATE - 10 AS trans_date, 100 amount
            FROM DUAL
          UNION ALL
          SELECT 'ABE' part_no, SYSDATE - 10 AS trans_date, 50 amount
            FROM DUAL)
    SELECT   part_no, trans_date, amount, sum (amount) over (partition by 1) total
        FROM mydata
    PART_NO TRANS_DATE           AMOUNT                                 TOTAL                                 
    ABC     5/13/2008 1:30:28 AM 100                                    350                                   
    ABC     5/13/2008 1:30:28 AM -100                                   350                                   
    ABE     5/13/2008 1:30:28 AM 50                                     350                                   
    ABD     5/13/2008 1:30:28 AM 100                                    350                                   
    ABE     5/13/2008 1:30:28 AM 100                                    350                                   
    ABD     5/13/2008 1:30:28 AM 100                                    350
    ---or
    SELECT   part_no, trans_date, amount, sum (amount) over (order by part_no) total
        FROM mydata
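
    Another way (my addition) to keep every detail row and still get a single grand total from ROLLUP itself: give each row a unique key with ROWNUM, then roll up one composite column, which contributes only the grand-total row.

    ```sql
    -- Sketch: each rn is unique, so the composite group keeps all detail
    -- rows; ROLLUP adds exactly one extra row, the grand total.
    SELECT part_no, trans_date, SUM(amount) amount
      FROM (SELECT m.*, ROWNUM rn FROM mydata m)
     GROUP BY ROLLUP ((rn, part_no, trans_date))
     ORDER BY part_no, trans_date;
    ```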

  • Group by rollup - inconsistent?

    Ran across this when doing some aggregates with real data, where the difference is more pronounced. But this test case shows it:
    with t as (
    select case
             when dbms_random.value(0,2) < 1 then 10
                                             else 20
           end dept,
           case
             when dbms_random.value(0,2) < 1 then 'abc'
                                             else 'def'
           end marker,
           sysdate - 20 + trunc(dbms_random.value(1,4)) new_date,
           sysdate - 20 + trunc(dbms_random.value(4,7)) old_date
    from dual
    connect by level<=20000)
    select dept,
           marker,
           diff,
           COUNT(*) / MAX(cnt) percent
    from (select t.*,
                 old_date - new_date diff,
                 COUNT(*) over (partition by dept, marker) cnt
          from t)
    group by rollup(dept,
                    marker,
                    diff)
    order by dept,
             marker,
             diff
    So this generates 20,000 random records, groups them by how far apart the two dates are, and then displays the percentage that each difference makes up of its grouping. A ROLLUP isn't really necessary - I just put it in because I wanted to verify, quickly, that it was adding up to 100% for each group.
    Now, here's a sample output:
    DEPT                   MARKER DIFF                   PERCENT               
    10                     abc    1                      0.1125854443104141535987133092078809811017
    10                     abc    2                      0.2088862082830719742661841576196220345798
    10                     abc    3                      0.3331322878970647366304784881383192601528
    10                     abc    4                      0.2316043425814234016887816646562123039807
    10                     abc    5                      0.113791716928025733815842380377965420185
    10                     abc                           1                     
    10                     def    1                      0.1072504518979714802169110263105041172926
    10                     def    2                      0.2185177746535448885318337015464952801767
    10                     def    3                      0.3426390841534444667603936533440449889536
    10                     def    4                      0.2217312713396264310102430206868849166499
    10                     def    5                      0.1098614179554127334806185981120706969271
    10                     def                           1                     
    10                                                   1.9989957822855995179754970877686282386
    20                     abc    1                      0.11625148279952550415183867141162514828
    20                     abc    2                      0.2196520363780150257018584420719652036378
    20                     abc    3                      0.3301700276789244760775009885330170027679
    20                     abc    4                      0.2178726769474100434954527481217872676947
    20                     abc    5                      0.1160537761961249505733491498616053776196
    20                     abc                           1                     
    20                     def    1                      0.1060332732010422930446983363399478853478
    20                     def    2                      0.2136700741631589496893164962918420525155
    20                     def    3                      0.3553818400481058328322309079975947083584
    20                     def    4                      0.2156744838645019041892162758067749047905
    20                     def    5                      0.1092403287231910202445379835638404489878
    20                     def                           1                     
    20                                                   1.98635824436536180308422301304863582444
                                                         3.9541320680110715697904310003954132068
    27 rows selected
    Now, with enough records I would expect some inaccuracy - this is a computer, and roundoff errors are a fact of life. The part that confuses me is that the first level of aggregation is always exactly "1", indicating a perfect 100%, so there is no roundoff error occurring there. But further rollups start to show some error.
    Any ideas on why some would show roundoff errors while the others do not? With my actual data, it's much more pronounced - only getting ~142% out of what should be 200% for one, 193% out of 200% for the second, and the overall total is only ~211% out of 400%.
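
    One observation (mine, not from the thread): this is probably not floating-point roundoff at all. percent is COUNT(*) / MAX(cnt); at the (dept, marker) subtotal level COUNT(*) and cnt are the same number, so the ratio is exactly 1, but at the (dept) level COUNT(*) counts both markers while MAX(cnt) is only the larger partition, so the ratio hovers near 2 without actually meaning "200%". Summing per-row fractions instead makes every level land on exact whole numbers (up to NUMBER division precision); a sketch against the same t data:

    ```sql
    -- Sketch: each row contributes 1/cnt of its own (dept, marker) partition,
    -- so subtotals sum to 1 per marker, 2 per dept, and 4 overall.
    select dept,
           marker,
           diff,
           sum(1 / cnt) percent
    from (select t.*,
                 old_date - new_date diff,
                 COUNT(*) over (partition by dept, marker) cnt
          from t)
    group by rollup (dept, marker, diff)
    order by dept, marker, diff
    ```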

    Figured out how to show the "very weird" results - skewed data:
    with t as (
    select case
             when dbms_random.value(0,2) < .7 then 10
                                             else 20
           end dept,
           case
             when dbms_random.value(0,2) < .5 then 'abc'
                                             else 'def'
           end marker,
           sysdate - 20 + trunc(dbms_random.value(1,4)*5) new_date,
           sysdate - 20 + trunc(dbms_random.value(4,7)*5) old_date
    from dual
    connect by level<=20000)
    select dept,
           marker,
           case
             when diff < 3 then '1-2'
             when diff < 8 then '3-7'
             when diff < 17 then '8-16'
                            else '17 or more'
           end diff_range,
           COUNT(*) / MAX(cnt) percent
    from (select t.*,
                 old_date - new_date diff,
                 COUNT(*) over (partition by dept, marker) cnt
          from t)
    group by rollup(dept,
                    marker,
                    case
             when diff < 3 then '1-2'
             when diff < 8 then '3-7'
             when diff < 17 then '8-16'
                            else '17 or more'
           end)
    order by dept,
             marker,
             length(diff_range),
             diff_range
    DEPT                   MARKER DIFF_RANGE PERCENT               
    10                     abc    1-2        0.0121472621939824331900579330966174546814
    10                     abc    3-7        0.1113810502709773874042235096243692767707
    10                     abc    8-16       0.4692580826013829190805456923939450569987
    10                     abc    17 or more 0.4072136049336572603251728648850682115492
    10                     abc               1                     
    10                     def    1-2        0.0155172413793103448275862068965517241379
    10                     def    3-7        0.1143678160919540229885057471264367816092
    10                     def    8-16       0.4614942528735632183908045977011494252874
    10                     def    17 or more 0.4086206896551724137931034482758620689655
    10                     def               1                     
    10                                       1.32517286488506821154924313212483647916
    20                     abc    1-2        0.0124575311438278595696489241223103057758
    20                     abc    3-7        0.109132091012045711932461649335941521672
    20                     abc    8-16       0.4739009574796664264387933697106970040152
    20                     abc    17 or more 0.404509420364460002059096056831051168537
    20                     abc               1                     
    20                     def    1-2        0.010325406758448060075093867334167709637
    20                     def    3-7        0.1129536921151439299123904881101376720901
    20                     def    8-16       0.4693366708385481852315394242803504380476
    20                     def    17 or more 0.4073842302878598247809762202753441802253
    20                     def               1                     
    20                                       1.32904354988160197673221455780912179553
                                             2.05909605683105116853701225162153814475
    23 rows selected

  • Group by rollup and order by question

    hi all,
    Normally, using ORDER BY would sort data based on the column. However, I am using a GROUP BY ROLLUP with an ORDER BY, and my select contains some CASE grouping... see my code below.
    SELECT
      CASE GROUPING(org) + GROUPING(sbu)
        WHEN 2 THEN 'Grand Total'
        WHEN 1 THEN NULL
        ELSE NULLIF(org, 'ZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZ')
        END org,
      CASE GROUPING(org) + GROUPING(sbu)
        WHEN 2 THEN NULL
        WHEN 1 THEN 'Total'
        ELSE ipl_rep_sylk_pkg.get_sbu_name(sbu)
        END sbu,
      SUM(ft) ft,
      SUM(pt) pt,
      NULLIF(NVL(SUM(ft),0) + NVL(SUM(pt),0),0) tot,
      NULLIF(NVL(SUM(ter),0) + NVL(SUM(ret),0),0) ter_ret,
      SUM(adds) adds,
      SUM(ins) ins,
      SUM(outs) outs,
      SUM(fft) fft,
      SUM(fpt) fpt,
      NULLIF(NVL(SUM(fft),0) + NVL(SUM(fpt),0),0) ftot,
      SUM(bft) bft,
      SUM(bpt) bpt,
      NULLIF(NVL(SUM(bft),0) + NVL(SUM(bpt),0),0) btot
    FROM xxipl_ceo_personnel_tbl
    GROUP BY ROLLUP (org, sbu)
    ORDER BY org, sbu
    Not sure why, but my data always starts with the Grand Total row and ends with all the Total rows.
    is there something wrong with my query?
    thanks
    allen

    A.Sandiego wrote:
    not sure why but my data always start with the Grand Total Row and ends with all the Total Rows.
    Allen,
    You have aliased your columns with the same names as the original columns org and sbu. Your ORDER BY now orders by your aliased expressions instead of the original columns. And the "G" of "Grand Total" is before the "T" of "Total" in the alphabet ;-)
    Regards,
    Rob.
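    Rob's point can be reproduced outside the database. Once the CASE expressions relabel the rollup rows, an ORDER BY on those output columns sorts the literal strings, and "Grand Total" sorts before any org name starting later in the alphabet. The sketch below is a minimal Python simulation of the effect with hypothetical values (ORG_A, SBU_1), not the original data; the fix it illustrates is ordering by the GROUPING flags first (ORDER BY GROUPING(org), org, GROUPING(sbu), sbu in SQL):

```python
# Each row carries the raw column values plus its GROUPING() flags,
# mirroring what the CASE expressions in the query collapse away.
rows = [
    {"org": "ORG_A", "sbu": "SBU_1", "g_org": 0, "g_sbu": 0},  # detail row
    {"org": "ORG_A", "sbu": None,    "g_org": 0, "g_sbu": 1},  # 'Total' row
    {"org": None,    "sbu": None,    "g_org": 1, "g_sbu": 1},  # 'Grand Total' row
]

def label(r):
    # Reproduce the CASE GROUPING(org) + GROUPING(sbu) relabelling.
    lvl = r["g_org"] + r["g_sbu"]
    org = "Grand Total" if lvl == 2 else r["org"]
    sbu = "Total" if lvl == 1 else r["sbu"]
    return (org, sbu)

# Ordering by the relabelled output sorts the label strings themselves,
# so 'Grand Total' floats to the top ('G' < 'O' here):
bad = sorted(rows, key=lambda r: tuple(x or "" for x in label(r)))

# Ordering by the grouping flags first keeps details before their totals:
good = sorted(rows, key=lambda r: (r["g_org"], r["org"] or "",
                                   r["g_sbu"], r["sbu"] or ""))
```

    Run as-is, `bad` starts with the Grand Total row, while `good` restores detail, subtotal, grand total order.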

  • Oracle 10g Reports: Control Break using Group By Rollup

    Oracle 10g Control-Break Reporting
    Hello. I am trying to create a report using Group By Rollup. The report should look like:
    MONTH......._WEEK_..... CODE.... TOTAL
    JULY..........WEEK 1..... K1...........2
    ............................. K1...........2
    .............................SUB:.........4
    ................WEEK 2..... K1...........2
    ............................. K1...........2
    .............................SUB:.........4
    ...............WEEK 3..... K1...........2
    ............................. K1...........2
    .............................SUB:.........4
    ...............WEEK 4..... K1...........2
    ............................. K1...........2
    .............................SUB:.........4
    ..........................MTH Tot:.....16
    AUG..........WEEK 1..... K1...........2
    ............................. K1...........2
    .............................SUB:.........4
    ................WEEK 2..... K1...........2
    ............................. K1...........2
    .............................SUB:.........4
    ...............WEEK 3..... K1...........2
    ............................. K1...........2
    .............................SUB:.........4
    ...............WEEK 4..... K1...........2
    ............................. K1...........2
    .............................SUB:.........4
    ..........................MTH Tot:.....16
    ..........................GRND TOT: 32
    Not sure how to group the codes into the correct month/week, and the labels are a problem. Here is the table/data and my poor attempt at using GROUP BY ROLLUP. I'm still working on it, but any help would be very nice.
    create table translog
    (
    ttcd          VARCHAR2(5) not null,
    stime TIMESTAMP(6) not null,
    etime TIMESTAMP(6)
    );
    insert into translog ( TTCD, STIME, ETIME)
    values ('T4', '01-JUL-12 12.00.01.131172 AM', '01-JUL-12 12.00.16.553256 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('T4', '01-JUL-12 12.00.17.023083 AM', '01-JUL-12 12.00.37.762118 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('K2', '01-JUL-12 12.00.38.262408 AM', '01-JUL-12 12.00.40.686331 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('U1', '01-JUL-12 12.00.40.769385 AM', '01-JUL-12 12.00.41.281300 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('SK4', '08-JUL-12 12.00.41.746175 AM', '08-JUL-12 12.00.51.775487 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('L1', '08-JUL-12 12.00.53.274039 AM', '08-JUL-12 12.00.53.802800 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('L1','08-JUL-12 12.00.54.340423 AM', '08-JUL-12 12.01.03.767422 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('T1', '08-JUL-12 12.01.04.699631 AM', '08-JUL-12 12.01.04.744194 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('S2', '15-JUL-12 12.01.04.796472 AM', '15-JUL-12 12.01.04.817773 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('T1', '15-JUL-12 12.01.04.865641 AM', '15-JUL-12 12.01.05.154274 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('T1', '15-JUL-12 12.01.05.200749 AM', '15-JUL-12 12.01.05.508953 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('L1', '15-JUL-12 12.01.06.876433 AM', '15-JUL-12 12.01.07.510032 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('T1', '15-JUL-12 12.01.07.653582 AM', '15-JUL-12 12.01.07.686764 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('S2', '15-JUL-12 12.01.07.736894 AM', '15-JUL-12 12.01.08.163321 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('L1', '22-JUL-12 12.01.08.297696 AM', '22-JUL-12 12.01.08.562933 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('T1', '22-JUL-12 12.01.08.583805 AM', '22-JUL-12 12.01.08.620702 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('L1', '22-JUL-12 12.01.08.744821 AM', '22-JUL-12 12.01.08.987524 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('L1', '22-JUL-12 12.01.09.096695 AM', '22-JUL-12 12.01.09.382138 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('L1', '22-JUL-12 12.01.09.530122 AM', '22-JUL-12 12.01.10.420257 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('T1', '01-AUG-12 12.01.10.550234 AM', '01-AUG-12 12.01.10.581535 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('S2', '01-AUG-12 12.01.10.628756 AM', '01-AUG-12 12.01.10.656373 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('T1', '01-AUG-12 12.01.10.740711 AM', '01-AUG-12 12.01.10.768745 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('T1', '01-AUG-12 12.01.10.819635 AM', '01-AUG-12 12.01.10.900849 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('L1', '01-AUG-12 12.01.09.530122 AM', '01-AUG-12 12.01.10.420257 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('L1', '08-AUG-12 12.01.11.231004 AM', '08-AUG-12 12.01.24.073071 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('T1', '08-AUG-12 12.01.24.202920 AM', '08-AUG-12 12.01.24.244538 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('S2', '08-AUG-12 12.01.24.292334 AM', '08-AUG-12 12.01.24.318852 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('T1', '08-AUG-12 12.01.24.362643 AM', '08-AUG-12 12.01.24.397662 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('L1','15-AUG-12 12.01.09.530122 AM', '15-AUG-12 12.01.10.420257 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('T1', '15-AUG-12 12.01.24.414572 AM', '15-AUG-12 12.01.24.444615 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('L2W', '15-AUG-12 12.01.24.478739 AM', '15-AUG-12 12.01.25.020265 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('K4', '15-AUG-12 12.01.25.206721 AM', '15-AUG-12 12.01.25.729493 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('L1', '15-AUG-12 12.01.25.784746 AM', '15-AUG-12 12.01.39.226921 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('L1','15-AUG-12 12.01.39.517953 AM', '15-AUG-12 12.01.50.775295 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('L1', '22-AUG-12 12.01.57.676446 AM', '22-AUG-12 12.01.58.252945 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('L1', '22-AUG-12 12.01.09.530122 AM', '22-AUG-12 12.01.10.420257 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('L1', '22-AUG-12 12.01.58.573242 AM', '22-AUG-12 12.02.10.651922 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('L1', '22-AUG-12 12.02.11.209305 AM', '22-AUG-12 12.02.24.140456 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('SK4','22-AUG-12 12.02.25.204035 AM', '22-AUG-12 12.02.25.580603 AM');
    insert into translog ( TTCD, STIME, ETIME)
    values ('T1','22-AUG-12 12.02.25.656474 AM', '22-AUG-12 12.02.25.689249 AM');
    select
    decode(grouping(trunc(stime)),1, 'Grand Total: ', trunc(stime)) AS "DATE"
    ,decode(grouping(ttcd),1, 'SUB TTL:', ttcd) CODE,count(*) TOTAL
    from translog
    group by rollup (trunc(stime), ttcd);
    Thank you.

    830894 wrote:
    Oracle 10g Contol-Break Reporting
    Hello. I am trying to create a report using Group By Rollup. The report should look like:
    Couple of things:
    1) Your test data setup does not match your expected output, and
    2) layout of data (like control break) should ideally be carried out using reporting tools
    Here is what you are probably looking for:
    SQL> select * from v$version ;
    BANNER
    Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Prod
    PL/SQL Release 10.2.0.1.0 - Production
    CORE     10.2.0.1.0     Production
    TNS for Linux: Version 10.2.0.1.0 - Production
    NLSRTL Version 10.2.0.1.0 - Production
    SQL> create table translog
      2  (
      3  ttcd VARCHAR2(5) not null,
      4  stime TIMESTAMP(6) not null,
      5  etime TIMESTAMP(6)
      6  );
    Table created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('T4', '01-JUL-12 12.00.01.131172 AM', '01-JUL-12 12.00.16.553256 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('T4', '01-JUL-12 12.00.17.023083 AM', '01-JUL-12 12.00.37.762118 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('K2', '01-JUL-12 12.00.38.262408 AM', '01-JUL-12 12.00.40.686331 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('U1', '01-JUL-12 12.00.40.769385 AM', '01-JUL-12 12.00.41.281300 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('SK4', '08-JUL-12 12.00.41.746175 AM', '08-JUL-12 12.00.51.775487 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('L1', '08-JUL-12 12.00.53.274039 AM', '08-JUL-12 12.00.53.802800 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('L1','08-JUL-12 12.00.54.340423 AM', '08-JUL-12 12.01.03.767422 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('T1', '08-JUL-12 12.01.04.699631 AM', '08-JUL-12 12.01.04.744194 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('S2', '15-JUL-12 12.01.04.796472 AM', '15-JUL-12 12.01.04.817773 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('T1', '15-JUL-12 12.01.04.865641 AM', '15-JUL-12 12.01.05.154274 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('T1', '15-JUL-12 12.01.05.200749 AM', '15-JUL-12 12.01.05.508953 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('L1', '15-JUL-12 12.01.06.876433 AM', '15-JUL-12 12.01.07.510032 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('T1', '15-JUL-12 12.01.07.653582 AM', '15-JUL-12 12.01.07.686764 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('S2', '15-JUL-12 12.01.07.736894 AM', '15-JUL-12 12.01.08.163321 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('L1', '22-JUL-12 12.01.08.297696 AM', '22-JUL-12 12.01.08.562933 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('T1', '22-JUL-12 12.01.08.583805 AM', '22-JUL-12 12.01.08.620702 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('L1', '22-JUL-12 12.01.08.744821 AM', '22-JUL-12 12.01.08.987524 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('L1', '22-JUL-12 12.01.09.096695 AM', '22-JUL-12 12.01.09.382138 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('L1', '22-JUL-12 12.01.09.530122 AM', '22-JUL-12 12.01.10.420257 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('T1', '01-AUG-12 12.01.10.550234 AM', '01-AUG-12 12.01.10.581535 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('S2', '01-AUG-12 12.01.10.628756 AM', '01-AUG-12 12.01.10.656373 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('T1', '01-AUG-12 12.01.10.740711 AM', '01-AUG-12 12.01.10.768745 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('T1', '01-AUG-12 12.01.10.819635 AM', '01-AUG-12 12.01.10.900849 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('L1', '01-AUG-12 12.01.09.530122 AM', '01-AUG-12 12.01.10.420257 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('L1', '08-AUG-12 12.01.11.231004 AM', '08-AUG-12 12.01.24.073071 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('T1', '08-AUG-12 12.01.24.202920 AM', '08-AUG-12 12.01.24.244538 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('S2', '08-AUG-12 12.01.24.292334 AM', '08-AUG-12 12.01.24.318852 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('T1', '08-AUG-12 12.01.24.362643 AM', '08-AUG-12 12.01.24.397662 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('L1','15-AUG-12 12.01.09.530122 AM', '15-AUG-12 12.01.10.420257 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('T1', '15-AUG-12 12.01.24.414572 AM', '15-AUG-12 12.01.24.444615 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('L2W', '15-AUG-12 12.01.24.478739 AM', '15-AUG-12 12.01.25.020265 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('K4', '15-AUG-12 12.01.25.206721 AM', '15-AUG-12 12.01.25.729493 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('L1', '15-AUG-12 12.01.25.784746 AM', '15-AUG-12 12.01.39.226921 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('L1','15-AUG-12 12.01.39.517953 AM', '15-AUG-12 12.01.50.775295 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('L1', '22-AUG-12 12.01.57.676446 AM', '22-AUG-12 12.01.58.252945 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('L1', '22-AUG-12 12.01.09.530122 AM', '22-AUG-12 12.01.10.420257 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('L1', '22-AUG-12 12.01.58.573242 AM', '22-AUG-12 12.02.10.651922 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('L1', '22-AUG-12 12.02.11.209305 AM', '22-AUG-12 12.02.24.140456 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('SK4','22-AUG-12 12.02.25.204035 AM', '22-AUG-12 12.02.25.580603 AM');
    1 row created.
    SQL> insert into translog ( TTCD, STIME, ETIME)
      2  values ('T1','22-AUG-12 12.02.25.656474 AM', '22-AUG-12 12.02.25.689249 AM');
    1 row created.
    SQL> commit ;
    Commit complete.
    SQL> select case when row_number() over (partition by mth order by mth, wk, ttcd) = 1 then mth end as "Month"
      2        ,case when row_number() over (partition by mth, wk order by mth, wk, ttcd) = 1 and wk is not null then 'WEEK '||wk end as "Week"
      3        ,case when gttcd = 1 and gwk = 0 and gmth = 0 then 'SUB:'
      4              when gttcd = 1 and gwk = 1 and gmth = 0 then 'Month Total:'
      5              when gttcd = 1 and gwk = 1 and gmth = 1 then 'Grand Total:'
      6              else ttcd
      7         end as "Code"
      8        ,cnt as "Total"
      9    from (
    10          select trunc(stime, 'MM') as mth, to_char(stime, 'W') as wk, ttcd, count(*) as cnt
    11                ,grouping(trunc(stime, 'MM')) as gmth, grouping(to_char(stime, 'W')) as gwk, grouping(ttcd) as gttcd
    12            from translog
    13           group by rollup(trunc(stime, 'MM'), to_char(stime, 'W'), ttcd)
    14           order by trunc(stime, 'MM'), to_char(stime, 'W'), ttcd
    15         ) ;
    Month     Week   Code              Total
    01-JUL-12 WEEK 1 K2                    1
                     T4                    2
                     U1                    1
                     SUB:                  4
              WEEK 2 L1                    2
                     SK4                   1
                     T1                    1
                     SUB:                  4
              WEEK 3 L1                    1
                     S2                    2
                     T1                    3
                     SUB:                  6
              WEEK 4 L1                    4
                     T1                    1
                     SUB:                  5
                     Month Total:         19
    01-AUG-12 WEEK 1 L1                    1
                     S2                    1
                     T1                    3
                     SUB:                  5
              WEEK 2 L1                    1
                     S2                    1
                     T1                    2
                     SUB:                  4
              WEEK 3 K4                    1
                     L1                    3
                     L2W                   1
                     T1                    1
                     SUB:                  6
              WEEK 4 L1                    4
                     SK4                   1
                     T1                    1
                     SUB:                  6
                     Month Total:         21
                     Grand Total:         40
    35 rows selected.
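    The shape of that rollup can also be sketched outside the database: bucket each timestamp by month and by Oracle's 'W' week-of-month (days 1-7 are week 1, days 8-14 week 2, and so on), count at the innermost level, then aggregate the same counts upward for week subtotals, month totals, and a grand total. The Python below is a conceptual sketch over a few made-up rows, not a replacement for the SQL above:

```python
from collections import Counter
from datetime import date

# A handful of (code, start_date) pairs standing in for translog rows.
rows = [
    ("T4", date(2012, 7, 1)), ("K2", date(2012, 7, 1)),
    ("L1", date(2012, 7, 8)), ("L1", date(2012, 7, 15)),
    ("T1", date(2012, 8, 1)),
]

def week_of_month(d):
    # Oracle's TO_CHAR(date, 'W'): days 1-7 -> 1, 8-14 -> 2, ...
    return (d.day - 1) // 7 + 1

# Detail counts keyed by (month, week, code): the innermost GROUP BY level.
detail = Counter(((d.year, d.month), week_of_month(d), c) for c, d in rows)

# ROLLUP then re-aggregates the same counts at each coarser level.
week_sub, month_tot = Counter(), Counter()
for (month, week, code), n in detail.items():
    week_sub[(month, week)] += n   # the 'SUB:' rows
    month_tot[month] += n          # the 'Month Total:' rows
grand = sum(detail.values())       # the 'Grand Total:' row
```

    With these sample rows, July week 1 has a subtotal of 2, July's month total is 4, and the grand total is 5.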

  • Smart Groups not working with keywords?

    I've gone through the various forum posts on Smart Groups (especially this one - http://discussions.apple.com/thread.jspa?messageID=8649078&#8649078) but still haven't found an answer to this problem.
    After importing my test files into my local iTunes install, I used the "Get Info" method to edit the metadata for those files. At this time, I put a series of keywords in the "grouping" field. I then uploaded these test files to our iTunes U site.
    When I search for those keywords on our site, I get the results that I expect to get. However, when I try to create a Smart Group based on any of those keywords, I get no results. I waited for a few days, hoping that the indexing would catch up and yield more search results - but that didn't seem to fix the problem. The documentation states that Smart Groups should work with keywords... so is there something I'm missing here?
    Any thoughts would be very helpful.

    Duncan, thanks for responding to me quickly.
    I'm confused where the keyword field is, though. I don't know of a keyword field using the "Get Info" method of editing the file's metadata on the local iTunes install, and the guide is unclear about where that field is or how to add information into it.
    On this page (http://deimos.apple.com/rsrc/doc/AppleEducation-iTunesUUsersGuide/UsingiTunesUSe arch/chapter10_section2.html), it mentions the following method for adding keywords for iTunes U:
    "To add additional keywords for iTunes U to search:
    1. Control-click the track where you want to add keywords, then choose Get Info from the shortcut menu that appears.
    2. Click the Info tab.
    3. Add keywords to the Grouping field. When searching your iTunes U site, after a track title, keywords in the Grouping field have the second-highest matching relevance.
    4. Click OK."
    This is the method that I've been using, since it seems to say that any words added to the "Grouping" field on the local install get read as keywords by iTunes U. Is there another way to add keywords to a file that I'm missing?

  • Customized heading in the Group by Rollup clause

    I have a table with the following data.
    SQL> select region, accname, secname, col1 from acc;
    REGION ACCNAME SECNAME COL1
    region1 acc1 sec1 40
    region1 acc1 sec2 60
    region1 acc1 sec3 80
    region1 acc2 sec2 50
    region1 acc2 sec5 70
    region2 acc3 sec6 120
    6 rows selected.
    I get the following output for the below query.
    SELECT region, accname, secname, SUM (col1)
    FROM acc
    GROUP BY ROLLUP (region, accname, secname);
    REGION ACCNAME SECNAME SUM(COL1)
    region1 acc1 sec1 40
    region1 acc1 sec2 60
    region1 acc1 sec3 80
    region1 acc1 180
    region1 acc2 sec2 50
    region1 acc2 sec5 70
    region1 acc2 120
    region1 300
    region2 acc3 sec6 120
    region2 acc3 120
    region2 120
    420
    12 rows selected.
    I need customized headings like 'Security Total'/'Account Total'/'Regionwise Total' against the grouped amounts. Is there any way to get this? I am using Oracle 9i. Please help me.
    Thanks.

    We can throw in the GROUPING_ID function:
    select case grouping_id(deptno, job)
    when 0
    then 'No Subtotalling'
    when 1
    then 'Job Subtotal'
    when 2
    then 'Department Total'
    when 3
    then 'Grand Total'
    end
    , case grouping(deptno)
    when 1
    then 'Department Total'
    else to_char(deptno)
    end Department
    , case grouping(job)
    when 1
    then 'Job SubTotal'
    else job
    end Job
    , sum(sal) salary_sum
    from emp
    group
    by rollup (deptno, job)
    I am not exactly sure what you are looking for. However, using GROUPING_ID you can determine what columns you are sub-totaling on (or rolling up by) in a given record.
    If you are only interested in certain subtotals, you can use this query in an inline view that you further restrict.
    Results:
    CASEGROUPING_ID( DEPARTMENT JOB SALARY_SUM
    No Subtotalling 10 CLERK 1300
    No Subtotalling 10 MANAGER 2450
    No Subtotalling 10 PRESIDENT 5000
    Job Subtotal 10 Job SubTotal 8750
    No Subtotalling 20 CLERK 1900
    No Subtotalling 20 ANALYST 6000
    No Subtotalling 20 MANAGER 2975
    Job Subtotal 20 Job SubTotal 10875
    No Subtotalling 30 CLERK 950
    No Subtotalling 30 MANAGER 2850
    No Subtotalling 30 SALESMAN 5600
    Job Subtotal 30 Job SubTotal 9400
    Grand Total Department Total Job SubTotal 29025
    Lucas
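    The GROUPING_ID value Lucas keys on is simply the individual GROUPING() flags packed into a bit mask: in GROUPING_ID(deptno, job), the first argument supplies the high bit and the last the low bit. A small Python sketch of that mapping (note that a plain ROLLUP (deptno, job) only ever produces 0, 1, and 3; the value 2 would need CUBE or GROUPING SETS):

```python
def grouping_id(*flags):
    # Pack GROUPING() flags into an integer; the first argument is the
    # most significant bit, matching Oracle's GROUPING_ID(deptno, job).
    gid = 0
    for f in flags:
        gid = (gid << 1) | f
    return gid

# The labels from the CASE expression in the query above.
labels = {0: "No Subtotalling", 1: "Job Subtotal",
          2: "Department Total", 3: "Grand Total"}

detail   = grouping_id(0, 0)  # neither column rolled up
job_sub  = grouping_id(0, 1)  # job rolled up within a department
grand    = grouping_id(1, 1)  # both rolled up
```

    Restricting on this integer in an inline view is what lets you keep only the subtotal rows you care about.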

  • How use Group By Rollup in BIP ?

    Hy everybody ,
    I create a dataset with this query :
    SELECT  cntr.country_region "Region" , cntr.country_name  "Pay ", Round(AVG(s.quantity_sold * s.amount_sold ),2) "Moyenne Des Ventes"
    FROM customers c inner join sales s
    on c.cust_id = s.cust_id
    inner join countries cntr
    on c.country_id =  cntr.country_id
    group by rollup (cntr.country_region,cntr.country_name )
    It's OK in ad-hoc mode, and it's OK when I generate sample data, but when I create my report with a "Data Table", the "Moyenne Des Ventes" column has no data.
    Someone can help me ?

    "ROLLUP",  "CUBE",  "GROUPING SETS" actually I used for ad-hoc reporting only on text based tools. Though in report development tools you can easily generate sub-totals I think you can use different queries.

  • Group by rollup

    I have below data:
    create table t_book_sales ( f_book_name VARCHAR2(200), f_sale_date DATE );
    INSERT INTO t_book_sales VALUES ( 'A', '05-JAN-11');
    INSERT INTO t_book_sales VALUES ( 'B', '06-JAN-11');
    INSERT INTO t_book_sales VALUES ( 'C', '07-JAN-11');
    INSERT INTO t_book_sales VALUES ( 'A', '01-APR-11');
    INSERT INTO t_book_sales VALUES ( 'A', '05-APR-11');
    INSERT INTO t_book_sales VALUES ( 'B', '06-APR-11');
    INSERT INTO t_book_sales VALUES ( 'A', '01-JUL-11');
    INSERT INTO t_book_sales VALUES ( 'A', '05-JUL-11');
    INSERT INTO t_book_sales VALUES ( 'A', '08-JUL-11');
    INSERT INTO t_book_sales VALUES ( 'A', '10-JUL-11');
    INSERT INTO t_book_sales VALUES ( 'B', '15-JUL-11');
    INSERT INTO t_book_sales VALUES ( 'B', '20-JUL-11');
    INSERT INTO t_book_sales VALUES ( 'C', '07-SEP-11');
    INSERT INTO t_book_sales VALUES ( 'C', '07-SEP-11');
    INSERT INTO t_book_sales VALUES ( 'C', '07-SEP-11');
    INSERT INTO t_book_sales VALUES ( 'C', '07-OCT-11');
    INSERT INTO t_book_sales VALUES ( 'C', '07-OCT-11');
    INSERT INTO t_book_sales VALUES ( 'C', '07-OCT-11');
    commit;
    I am using following query :
       SELECT   'Q' || to_char(trunc(f_sale_date, 'Q'), 'Q') QTR, f_book_name, count(1)
          FROM   t_book_sales
    --  group by  f_sale_date, f_book_name
        GROUP BY ROLLUP  ( trunc(f_sale_date, 'Q'), f_book_name);
    But here, I need to display the counts as
    Q1 = Q1, Q2 = Q1+ Q2 , Q3 = Q1+ Q2+ Q3, Q4 = Q1+ Q2 + Q3 + Q4
    Any ideas appreciated.

    Hi,
    Variation on Frank's answer just for the fun learning :
    +[edit]+
    Well, I just realized it's no variation at all actually. It's just the default windowing clause of analytics.
    If you're looking for me I'll be at the coffee machine...
    +[edit]+
    Scott@my11g SQL>!cat afiedt.buf
    with t_book_sales(f_book_name, f_sale_date) as (
    select 'A', to_date('05-JAN-11','dd-MON-YY') from dual
    union select 'B', to_date('06-JAN-11','dd-MON-YY') from dual
    union select 'C', to_date('07-JAN-11','dd-MON-YY') from dual
    union select 'A', to_date('01-APR-11','dd-MON-YY') from dual
    union select 'A', to_date('05-APR-11','dd-MON-YY') from dual
    union select 'B', to_date('06-APR-11','dd-MON-YY') from dual
    union select 'A', to_date('01-JUL-11','dd-MON-YY') from dual
    union select 'A', to_date('05-JUL-11','dd-MON-YY') from dual
    union select 'A', to_date('08-JUL-11','dd-MON-YY') from dual
    union select 'A', to_date('10-JUL-11','dd-MON-YY') from dual
    union select 'B', to_date('15-JUL-11','dd-MON-YY') from dual
    union select 'B', to_date('20-JUL-11','dd-MON-YY') from dual
    union select 'C', to_date('07-SEP-11','dd-MON-YY') from dual
    union select 'C', to_date('07-SEP-11','dd-MON-YY') from dual
    union select 'C', to_date('07-SEP-11','dd-MON-YY') from dual
    union select 'C', to_date('07-OCT-11','dd-MON-YY') from dual
    union select 'C', to_date('07-OCT-11','dd-MON-YY') from dual
    union select 'C', to_date('07-OCT-11','dd-MON-YY') from dual
    )
    ------ end of sample data ------
    SELECT
         qtr
         ,this_qtr
         ,SUM(this_qtr) OVER (PARTITION BY f_book_name ORDER BY qtr ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW) so_far
    from (
         SELECT    TO_CHAR ( TRUNC (f_sale_date, 'Q')
                     , 'YYYY "Q"Q'
                     )               AS qtr
         ,       f_book_name
         ,       COUNT (*)               AS this_qtr
         FROM      t_book_sales
         GROUP BY  TRUNC (f_sale_date, 'Q')
         ,       f_book_name
    )
    order by f_book_name, qtr
    Scott@my11g SQL>/
    QTR       THIS_QTR     SO_FAR
    2011 Q1          1          1
    2011 Q2          2          3
    2011 Q3          4          7
    2011 Q1          1          1
    2011 Q2          1          2
    2011 Q3          2          4
    2011 Q1          1          1
    2011 Q3          1          2
    2011 Q4          1          3
    9 rows selected.
    :-)
    Edited by: Nicosa on Jul 19, 2012 3:13 PM
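    The windowing clause Nicosa mentions (ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW, the default for SUM ... OVER with an ORDER BY) is just a running total within each partition. The same accumulation can be sketched with Python's itertools.accumulate, using the per-quarter THIS_QTR counts from the output above (note the WITH clause used UNION, so the identical 'C' rows were de-duplicated before counting):

```python
from itertools import accumulate

# Per-quarter counts for each book, taken from the THIS_QTR column above.
qtr_counts = {"A": [1, 2, 4], "B": [1, 1, 2], "C": [1, 1, 1]}

# SUM(this_qtr) OVER (PARTITION BY f_book_name ORDER BY qtr ROWS BETWEEN
# UNBOUNDED PRECEDING AND CURRENT ROW) is a per-partition running total:
so_far = {book: list(accumulate(counts)) for book, counts in qtr_counts.items()}
# -> {'A': [1, 3, 7], 'B': [1, 2, 4], 'C': [1, 2, 3]}, the SO_FAR column.
```

    This is exactly the Q1, Q1+Q2, Q1+Q2+Q3, ... cumulative display the question asked for.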

  • Is it possible to subscribe to Server Group Wiki Calendar with the iPhone?

    I would like to subscribe to Group Wiki Calendar with an iPhone, but I am not having much luck.
    The Group Wiki is on our on Snow Leopard Mini Server. I have successfully subscribed to the calendar with iCal on client computers and the calendar updates correctly on the LAN. The calendar also updates correctly when the client computer is on a WAN (off site).
    I would now like to subscribe to the calendar so employees iPhones automatically update changes made to the Wiki Group Calendar.
    I have taken the URL from the Info window of the subscribed calendar in iCal and used it in the setup of a Subscription Account on the iPhone, but it consistently fails to verify the user account.
    I have tried many variations of the URL based on forum readings, but still no success.
    Does anyone know if it is possible to subscribe to a Group Wiki using the iPhone and, if so, what is the correct URL statement to get the iPhone to see the Wiki calendar?
    Thank you in advance.

    I tried several versions of this:
    http://hostname.domainname.com:8008/principals/wikis/wikiname
    https://hostname.domainname.com:8443/principals/wikis/wikiname
    and after the "Wiki Server Update 1.0" by Apple...
    http://hostname.domainname.com:8008/principals/uids/wiki-wikiname
    https://hostname.domainname.com:8443/principals/uids/wiki-wikiname
    None of them work.
    I am running SLS 10.6.4 with the Wiki Server Update 1.0. Wiki calendars from the web and in iCal.
    iPhone 4 with iOS 4.0 & iPod touch with iOS 3.1.3 trying to connect over WiFi.
    1 How is everyone else connecting their iPhones to SLS services?
    2 Anyone using the "Mobile Access" or another Mac proxy server?
    3 What about VPN from iPhone to SLS for these services?
    4 What Firewall/NAT settings between the world and your SLS do you use?
