Custom YTD/TOTALS

Hi all,
I would like to customize the YTD & TOTAL inner calculations depending on the account type.
For example, for TOTAL or YTD sales I want BPC to calculate the value as the sum of the individual periods (as I understand it, this is the standard calculation).
On the other hand, for end-of-period Customer TOTALS or YTD, I don't want BPC to accumulate from Jan to Dec (that would be great for my company's results, but it's not reality); I'd like a custom calculation that takes the last period's value (for example Dec-08 when I ask for the 2008 TOTAL or the DEC YTD).
Can I customize this with a business rule, or is it better to create MDX logic? I have already written MDX logic, but it takes a long time to calculate the data.
I'd appreciate any help on this issue, or any alternative solution.
Thanks in advance for all your help.

I am not sure if I completely follow your question, but I have a suggestion. If you are using BPC 7M SP3 or higher, you may have more than 1 TIME HIERARCHY. So it would be possible to have a Calendar view and a separate view based on a different rollup of months to quarters, years, etc.
If that doesn't solve your question, then I would look into building a custom MEASURE that would be executed on all sets of data for certain applications. There is a How To Guide on SDN that has an example of building a custom measure for reporting purposes.
Hope this helps.
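Independent of BPC, the distinction described above is between flow-type accounts (YTD/TOTAL = sum of the periods) and balance-type accounts (YTD/TOTAL = the last period's value). Purely to illustrate that distinction, here is a small generic Oracle SQL sketch over a made-up FACT set; it is not BPC syntax, and in BPC a branch like this would typically live in the custom measure formula mentioned above:

-- Hypothetical data: a flow account (SALES) and a balance account (CUSTOMERS)
WITH fact AS (
    SELECT 'SALES'     AS account_type, 1 AS period_nbr, 100 AS amount FROM dual UNION ALL
    SELECT 'SALES'     AS account_type, 2 AS period_nbr, 120 AS amount FROM dual UNION ALL
    SELECT 'CUSTOMERS' AS account_type, 1 AS period_nbr, 500 AS amount FROM dual UNION ALL
    SELECT 'CUSTOMERS' AS account_type, 2 AS period_nbr, 530 AS amount FROM dual
)
SELECT  account_type
,       CASE account_type
            WHEN 'SALES' THEN SUM(amount)                                -- YTD = sum of the periods
            ELSE MAX(amount) KEEP (DENSE_RANK LAST ORDER BY period_nbr)  -- YTD = last period's value
        END AS ytd_value
FROM    fact
GROUP BY account_type;
-- SALES -> 220, CUSTOMERS -> 530

The SQL is only meant to show the intended result for each account type.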

Similar Messages

  • Customer Ageing Total differs from Balance Sheet total

    Hi All,
    The Customer Ageing report doesn't match the Debtors (Control A/c).
    The Customer Ageing total for Jan. 09 doesn't match, but the Ageing total for Feb. 09 does match the control a/c in the Balance Sheet.
    Kindly provide a solution.
    Thanks & regards,
    Harshad A.Surve.

    Hi,
    Can you check whether you are running Backdated Aging correctly?
    If you are working with the 2005 version, please refer to Note No. 800294 and follow the settings mentioned in the Note.
    If you are working with the 2007 version, run the report with the tickbox 'Display customers with zero balance' checked.
    Check if it helps.
    Regards,
    Jitin
    SAP Business One Forum Team

  • LOA Approval notification should show Leave Occurrence and YTD totals

    The LOA approval notification should show the Leave Occurrence and YTD totals for the manager and HR Specialist. Is this possible?
    This is urgent.
    Thanks

    Hi,
    Please post your question in [Human Resource Management (HRMS)|http://forums.oracle.com/forums/forum.jspa?forumID=113] forum, you would probably get a better/faster response.
    Regards,
    Hussein

  • Amend previous solution for YTD totals:  need PTD also

    Last week I received help on getting YTD totals on a 5 week period-to-date report. Data and solution are below.
    I've got a period-to-date report with the following columns:
    week1 tots, week2 tots, week3 tots, week4 tots, week5 tot, period-to-date tots, year-to-date tots
    The records for one hot dog stand look like this:
    HOT_DOG_STAND_ID       WEEK_NBR               NET_SALES2             BUNS24434              PICKELS_AW38           MUSTARD_TB56           CHICKENHEADS33         PIECES_SOLD34          SCRAPS35               PIECES_UNACCOUNTED     HEAD_AVERAGE           EFFICIENCY            
    141                    0                      647064.59              691287.4               149142.91     
    141                    1                      697227.09              694887.4               139149.31             
    141                    2                      293067.04              344887.4               159159.91             
    141                    3                      693467.09              695687.5               139149.91             
    141                    4                      644067.09              595487.4               635149.94 
    141                    5                      644067.09              595487.4               635149.94               
    141                    7                     6897467.09            12694887.6             34139169.34
    Week nbr 0 is sum of weeks from beginning of year to the end of previous period
    weeks 1 - 5 are the totals for each week in the period
    week 7 represents the year-to-date total
    (totals for #7 are only for display and are not correct)
    In the earlier thread I was not concerned with the PTD totals (the sum of weeks 1 to 5); to get them I simply UNION ALL'ed onto the first query a second SELECT summing weeks 1 - 5.
    The client program throws the error
    "unsupported case for inlining of query name in WITH clause". Googling suggests it is most likely because of the UNION ALL.
    So I do need to get the PTD totals in a record with week number 6.
      CREATE TABLE PERIOD_DATA
       (     "HOT_DOG_STAND_ID" NUMBER NOT NULL ENABLE,
         "WEEK_DATE" DATE,
         "NET_SALES2" NUMBER,
         "BUNS24434" NUMBER,
         "PICKELS_AW38" NUMBER,
         "MUSTARD_TB56" NUMBER,
         "CHICKENHEADS33" NUMBER,
         "PIECES_SOLD34" NUMBER,
         "SCRAPS35" NUMBER,
         "PIECES_UNACCOUNTED" NUMBER
    REM INSERTING into PERIOD_DATA
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('29-DEC-08','DD-MON-RR HH.MI.SSXFF AM'),14301.39,13951.26,3431.13,0,3680,2484,378,818);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('05-JAN-09','DD-MON-RR HH.MI.SSXFF AM'),14651.37,14651.37,3249.55,0,3200,2419,505,276);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('12-JAN-09','DD-MON-RR HH.MI.SSXFF AM'),14169.89,14169.89,2463.53,0,3136,2080,474,582);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('19-JAN-09','DD-MON-RR HH.MI.SSXFF AM'),15864.46,15864.46,3245.49,0,3472,2764,475,233);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('26-JAN-09','DD-MON-RR HH.MI.SSXFF AM'),15961.2,15916.23,3395.51,0,3648,2838,392,418);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('02-FEB-09','DD-MON-RR HH.MI.SSXFF AM'),19066.4,19066.4,4165.07,0,4336,3682,333,321);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('09-FEB-09','DD-MON-RR HH.MI.SSXFF AM'),18415.74,18415.74,4024.74,0,4032,3365,482,185);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('16-FEB-09','DD-MON-RR HH.MI.SSXFF AM'),18014,17849,3486.33,0,3840,3238,374,228);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('23-FEB-09','DD-MON-RR HH.MI.SSXFF AM'),18671.09,18626.12,3729.42,0,3888,2970,353,565);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('02-MAR-09','DD-MON-RR HH.MI.SSXFF AM'),17636,17636,3815,0,3424,2840,490,94);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('09-MAR-09','DD-MON-RR HH.MI.SSXFF AM'),17235.52,17145.58,3897.42,0,3504,2928,421,155);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('16-MAR-09','DD-MON-RR HH.MI.SSXFF AM'),15989.27,15989.27,3372.95,0,3728,3051,369,308);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('23-MAR-09','DD-MON-RR HH.MI.SSXFF AM'),19067.69,18960.41,4152.6,0,4048,3293,442,313);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('30-MAR-09','DD-MON-RR HH.MI.SSXFF AM'),18717.99,18717.99,3923.69,0,4408,3219,593,596);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('06-APR-09','DD-MON-RR HH.MI.SSXFF AM'),17335.16,17335.16,3769.08,0,3928,2997,514,417);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('13-APR-09','DD-MON-RR HH.MI.SSXFF AM'),18967.39,18967.39,4157.76,0,4144,2991,527,626);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('20-APR-09','DD-MON-RR HH.MI.SSXFF AM'),23090.88,23090.88,4427.96,0,5544,4493,560,491);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('27-APR-09','DD-MON-RR HH.MI.SSXFF AM'),24197.98,24132.99,4248.66,0,6680,5190,606,884);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('04-MAY-09','DD-MON-RR HH.MI.SSXFF AM'),20202.21,20137.22,3714.68,0,7052,6170,422,460);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('11-MAY-09','DD-MON-RR HH.MI.SSXFF AM'),18514.48,18514.48,3266.06,0,5508,4178,571,759);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('18-MAY-09','DD-MON-RR HH.MI.SSXFF AM'),18678.68,18678.68,3814.07,0,5824,4345,633,846);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('25-MAY-09','DD-MON-RR HH.MI.SSXFF AM'),17937.18,17937.18,3051.52,0,4844,4986,529,-671);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('01-JUN-09','DD-MON-RR HH.MI.SSXFF AM'),17445.75,17445.75,3079.91,0,5028,4810,656,-438);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('08-JUN-09','DD-MON-RR HH.MI.SSXFF AM'),17327.88,17327.88,3263.29,0,6112,4674,672,766);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('15-JUN-09','DD-MON-RR HH.MI.SSXFF AM'),17241.72,16937.33,3328.27,0,5792,4490,567,735);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('22-JUN-09','DD-MON-RR HH.MI.SSXFF AM'),16625.83,16625.83,3485.18,0,5408,4319,761,328);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('29-JUN-09','DD-MON-RR HH.MI.SSXFF AM'),17002.84,17002.84,3091.09,0,5664,4369,544,751);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('06-JUL-09','DD-MON-RR HH.MI.SSXFF AM'),16339.19,16274.2,3075.3,0,4784,3440,697,647);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('13-JUL-09','DD-MON-RR HH.MI.SSXFF AM'),17165.12,16885.14,3458.03,0,4320,3296,640,384);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('20-JUL-09','DD-MON-RR HH.MI.SSXFF AM'),17029.77,16899.79,3198.91,0,4448,3449,645,354);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('27-JUL-09','DD-MON-RR HH.MI.SSXFF AM'),16596.89,16596.89,3015.54,0,4624,3288,665,671);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('03-AUG-09','DD-MON-RR HH.MI.SSXFF AM'),16468.58,16468.58,2981.35,0,2224,3495,564,-1835);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('10-AUG-09','DD-MON-RR HH.MI.SSXFF AM'),18625.48,18550.5,3524.44,0,4856,3482,578,796);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('17-AUG-09','DD-MON-RR HH.MI.SSXFF AM'),24538.54,24323.55,5580.71,0,5260,3771,608,881);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('24-AUG-09','DD-MON-RR HH.MI.SSXFF AM'),18081.37,18081.37,3533.45,0,5980,3080,553,2347);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('31-AUG-09','DD-MON-RR HH.MI.SSXFF AM'),17183.25,17183.25,3487.12,0,2544,3262,615,-1333);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('07-SEP-09','DD-MON-RR HH.MI.SSXFF AM'),17688.41,17575.29,3424.17,0,4800,3480,591,729);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('14-SEP-09','DD-MON-RR HH.MI.SSXFF AM'),18211.29,18211.29,3806.32,0,3968,3104,527,337);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('21-SEP-09','DD-MON-RR HH.MI.SSXFF AM'),16809.21,16744.22,3014.61,0,4128,3124,710,294);
    SELECT HOT_DOG_STAND_ID
    , DECODE(TRUNC(week_date , 'iw') ,
             to_date('24-AUG-09' , 'dd-mon-rr') , 1 ,
             to_date('24-AUG-09' , 'dd-mon-rr') + 7 , 2 ,
             to_date('24-AUG-09' , 'dd-mon-rr') + 14 , 3 ,
             to_date('24-AUG-09' , 'dd-mon-rr') + 21 , 4 ,
             to_date('24-AUG-09' , 'dd-mon-rr') + 28 , 5 , 0) AS week_nbr
    , SUM(NET_SALES2) AS net_sales2
    , SUM(BUNS24434 ) BUNS24434
    , SUM(PICKELS_AW38) PICKELS_AW38
    , SUM(MUSTARD_TB56) MUSTARD_TB56
    , SUM(CHICKENHEADS33) CHICKENHEADS33
    , SUM(PIECES_SOLD34) PIECES_SOLD34
    , SUM(SCRAPS35) SCRAPS35
    , SUM(PIECES_UNACCOUNTED) * - 1 PIECES_UNACCOUNTED
       /*--== Head average  net_sales / chickenusage*/
    , CASE
          WHEN NVL( SUM(ChickenHeads33) / 8 , 0) = 0 THEN 0
          ELSE ROUND(SUM(net_sales2) / ( SUM(ChickenHeads33) / 8 ) , 2)
       END AS Head_average
       /*--=== Efficiency =   (ChickenUsage  - scrappedDiv8 - unaccountedDiv8) / ChickenUsage)  * 100*/
    , CASE
          WHEN NVL(SUM(ChickenHeads33) / 8 , 0) = 0 THEN 0
          ELSE ROUND((((SUM(ChickenHeads33) / 8 ) - ( SUM(scraps35) / 8 ) - (SUM(pieces_unaccounted) / 8 )) / (SUM(ChickenHeads33) / 8 )) * 100 , 2)
       END AS efficiency
    FROM period_data per
    WHERE week_DATE BETWEEN TRUNC(TO_DATE( '24-AUG-09' , 'DD-MON-YY') , 'IY') AND TRUNC(TO_DATE( '24-AUG-09' , 'DD-MON-YY') , 'IW') + 6 + 7 * 4
    GROUP BY hot_dog_stand_id
    , DECODE(TRUNC(week_date , 'iw') ,
          to_date('24-AUG-09' , 'dd-mon-rr') , 1 ,
          to_date('24-AUG-09' , 'dd-mon-rr') + 7 , 2 ,
          to_date('24-AUG-09' , 'dd-mon-rr') + 14 , 3 ,
          to_date('24-AUG-09' , 'dd-mon-rr') + 21 , 4 ,
          to_date('24-AUG-09' , 'dd-mon-rr') + 28 , 5 ,
          0)
    ORDER BY DECODE(TRUNC(week_date , 'iw') , to_date('24-AUG-09' , 'dd-mon-rr') , 1 , to_date('24-AUG-09' , 'dd-mon-rr') + 7 , 2 , to_date('24-AUG-09' , 'dd-mon-rr') + 14 , 3 , to_date('24-AUG-09' , 'dd-mon-rr') + 21 , 4 , to_date('24-AUG-09' , 'dd-mon-rr') + 28 , 5 , 0);
    The following was the successful solution (see: Need help getting YTD total):
    VARIABLE   start_date     VARCHAR2 (11);
    EXEC         :start_date  := '24-AUG-2009';
         SELECT  HOT_DOG_STAND_ID
         ,      NVL (CASE
                   WHEN  week_date >= TO_DATE( :start_date, 'DD-MON-YYYY')
                   AND   week_date <  TO_DATE( :start_date, 'DD-MON-YYYY') + 35
                   THEN  1 + FLOOR ( (week_date - TO_DATE( :start_date, 'DD-MON-YYYY'))
                                             / 7 )
                   ELSE  0
                      END
                 , 7
                 )          AS week_nbr
    , SUM(NET_SALES2) AS net_sales2
    , SUM(BUNS24434 ) BUNS24434
    , SUM(PICKELS_AW38) PICKELS_AW38
    , SUM(MUSTARD_TB56) MUSTARD_TB56
    , SUM(CHICKENHEADS33) CHICKENHEADS33
    , SUM(PIECES_SOLD34) PIECES_SOLD34
    , SUM(SCRAPS35) SCRAPS35
    , SUM(PIECES_UNACCOUNTED) * - 1 PIECES_UNACCOUNTED
       /*--== Head average  net_sales / chickenusage*/
    , CASE
          WHEN NVL( SUM(ChickenHeads33) / 8 , 0) = 0 THEN 0
          ELSE ROUND(SUM(net_sales2) / ( SUM(ChickenHeads33) / 8 ) , 2)
       END AS Head_average
       /*--=== Efficiency =   (ChickenUsage  - scrappedDiv8 - unaccountedDiv8) / ChickenUsage)  * 100*/
    , CASE
          WHEN NVL(SUM(ChickenHeads33) / 8 , 0) = 0 THEN 0
          ELSE ROUND((((SUM(ChickenHeads33) / 8 ) - ( SUM(scraps35) / 8 ) - (SUM(pieces_unaccounted) / 8 )) / (SUM(ChickenHeads33) / 8 )) * 100 , 2)
       END AS efficiency
    FROM period_data per
    WHERE week_DATE BETWEEN TRUNC(TO_DATE( '24-AUG-09' , 'DD-MON-YY') , 'IY') AND TRUNC(TO_DATE( '24-AUG-09' , 'DD-MON-YY') , 'IW') + 6 + 7 * 4
    GROUP BY  hot_dog_stand_id
    ,           ROLLUP (
                 CASE
                   WHEN  week_date >= TO_DATE( :start_date, 'DD-MON-YYYY')
                   AND   week_date <  TO_DATE( :start_date, 'DD-MON-YYYY') + 35
                   THEN  1 + FLOOR ( (week_date - TO_DATE( :start_date, 'DD-MON-YYYY'))
                                             / 7 )
                   ELSE  0
                 END          -- week_nbr
                )
    ORDER BY  week_nbr
    ;
    Thanks in advance.

    Frank Kulash wrote:
    Hi,
    There's probably a way to get both the year-to-date and the period-to-date totals using ROLLUP, but I don't know what that is.
    I replaced your WITH statement with
    --== ABBREVS: beginning-of-week, period-week-nbr, beginning-of-period, beginning-of-period-last-year
    with report_dates as (
    select '24-AUG-09' gamedate, '239' week_nbr, '24-AUG-09' bow, '25-AUG-08' bow_ly, 9 per_nbr, 1 per_week_nbr, '24-AUG-09' bop, '25-AUG-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                   
    select '25-AUG-09' gamedate, '240' week_nbr, '24-AUG-09' bow, '25-AUG-08' bow_ly, 9 per_nbr, 1 per_week_nbr, '24-AUG-09' bop, '25-AUG-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                   
    select '26-AUG-09' gamedate, '241' week_nbr, '24-AUG-09' bow, '25-AUG-08' bow_ly, 9 per_nbr, 1 per_week_nbr, '24-AUG-09' bop, '25-AUG-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                   
    select '27-AUG-09' gamedate, '242' week_nbr, '24-AUG-09' bow, '25-AUG-08' bow_ly, 9 per_nbr, 1 per_week_nbr, '24-AUG-09' bop, '25-AUG-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                   
    select '28-AUG-09' gamedate, '243' week_nbr, '24-AUG-09' bow, '25-AUG-08' bow_ly, 9 per_nbr, 1 per_week_nbr, '24-AUG-09' bop, '25-AUG-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                   
    select '29-AUG-09' gamedate, '244' week_nbr, '24-AUG-09' bow, '25-AUG-08' bow_ly, 9 per_nbr, 1 per_week_nbr, '24-AUG-09' bop, '25-AUG-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                   
    select '30-AUG-09' gamedate, '245' week_nbr, '24-AUG-09' bow, '25-AUG-08' bow_ly, 9 per_nbr, 1 per_week_nbr, '24-AUG-09' bop, '25-AUG-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                   
    select '31-AUG-09' gamedate, '246' week_nbr, '31-AUG-09' bow, '01-SEP-08' bow_ly, 9 per_nbr,  2 per_week_nbr, '24-AUG-09' bop, '25-AUG-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                   
    select '01-SEP-09' gamedate, '247' week_nbr, '31-AUG-09' bow, '01-SEP-08' bow_ly, 9 per_nbr, 2 per_week_nbr, '24-AUG-09' bop, '25-AUG-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                   
    select '02-SEP-09' gamedate, '248' week_nbr, '31-AUG-09' bow, '01-SEP-08' bow_ly, 9 per_nbr, 2 per_week_nbr, '24-AUG-09' bop, '25-AUG-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                   
    select '03-SEP-09' gamedate, '249' week_nbr, '31-AUG-09' bow, '01-SEP-08' bow_ly, 9 per_nbr, 2 per_week_nbr, '24-AUG-09' bop, '25-AUG-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                   
    select '04-SEP-09' gamedate, '250' week_nbr, '31-AUG-09' bow, '01-SEP-08' bow_ly, 9 per_nbr, 2 per_week_nbr, '24-AUG-09' bop, '25-AUG-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                   
    select '05-SEP-09' gamedate, '251' week_nbr, '31-AUG-09' bow, '01-SEP-08' bow_ly, 9 per_nbr, 2 per_week_nbr, '24-AUG-09' bop, '25-AUG-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                   
    select '06-SEP-09' gamedate, '252' week_nbr, '31-AUG-09' bow, '01-SEP-08' bow_ly, 9 per_nbr, 2 per_week_nbr, '24-AUG-09' bop, '25-AUG-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                   
    select '07-SEP-09' gamedate, '253' week_nbr, '07-SEP-09' bow, '08-SEP-08' bow_ly, 9 per_nbr, 3 per_week_nbr, '24-AUG-09' bop, '25-AUG-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                   
    select '08-SEP-09' gamedate, '254' week_nbr, '07-SEP-09' bow, '08-SEP-08' bow_ly, 9 per_nbr, 3 per_week_nbr, '24-AUG-09' bop, '25-AUG-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                   
    select '09-SEP-09' gamedate, '255' week_nbr, '07-SEP-09' bow, '08-SEP-08' bow_ly, 9 per_nbr, 3 per_week_nbr, '24-AUG-09' bop, '25-AUG-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                   
    select '10-SEP-09' gamedate, '256' week_nbr, '07-SEP-09' bow, '08-SEP-08' bow_ly, 9 per_nbr, 3 per_week_nbr, '24-AUG-09' bop, '25-AUG-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                   
    select '11-SEP-09' gamedate, '257' week_nbr, '07-SEP-09' bow, '08-SEP-08' bow_ly, 9 per_nbr, 3 per_week_nbr, '24-AUG-09' bop, '25-AUG-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                   
    select '12-SEP-09' gamedate, '258' week_nbr, '07-SEP-09' bow, '08-SEP-08' bow_ly, 9 per_nbr, 3 per_week_nbr, '24-AUG-09' bop, '25-AUG-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                   
    select '13-SEP-09' gamedate, '259' week_nbr, '07-SEP-09' bow, '08-SEP-08' bow_ly, 9 per_nbr, 3 per_week_nbr, '24-AUG-09' bop, '25-AUG-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                   
    select '14-SEP-09' gamedate, '260' week_nbr, '14-SEP-09' bow, '15-SEP-08' bow_ly, 9 per_nbr, 4 per_week_nbr, '24-AUG-09' bop, '25-AUG-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                   
    select '15-SEP-09' gamedate, '261' week_nbr, '14-SEP-09' bow, '15-SEP-08' bow_ly, 9 per_nbr, 4 per_week_nbr, '24-AUG-09' bop, '25-AUG-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                   
    select '16-SEP-09' gamedate, '262' week_nbr, '14-SEP-09' bow, '15-SEP-08' bow_ly, 9 per_nbr, 4 per_week_nbr, '24-AUG-09' bop, '25-AUG-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                   
    select '17-SEP-09' gamedate, '263' week_nbr, '14-SEP-09' bow, '15-SEP-08' bow_ly, 9 per_nbr, 4 per_week_nbr, '24-AUG-09' bop, '25-AUG-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                   
    select '18-SEP-09' gamedate, '264' week_nbr, '14-SEP-09' bow, '15-SEP-08' bow_ly, 9 per_nbr, 4 per_week_nbr, '24-AUG-09' bop, '25-AUG-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                   
    select '19-SEP-09' gamedate, '265' week_nbr, '14-SEP-09' bow, '15-SEP-08' bow_ly, 9 per_nbr, 4 per_week_nbr, '24-AUG-09' bop, '25-AUG-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                   
    select '20-SEP-09' gamedate, '266' week_nbr, '14-SEP-09' bow, '15-SEP-08' bow_ly, 9 per_nbr, 4 per_week_nbr, '24-AUG-09' bop, '25-AUG-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                   
    select '21-SEP-09' gamedate, '267' week_nbr, '21-SEP-09' bow, '22-SEP-08' bow_ly, 9 per_nbr, 5 per_week_nbr, '24-AUG-09' bop, '25-AUG-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                   
    select '22-SEP-09' gamedate, '268' week_nbr, '21-SEP-09' bow, '22-SEP-08' bow_ly, 9 per_nbr, 5 per_week_nbr, '24-AUG-09' bop, '25-AUG-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                   
    select '23-SEP-09' gamedate, '269' week_nbr, '21-SEP-09' bow, '22-SEP-08' bow_ly, 9 per_nbr, 5 per_week_nbr, '24-AUG-09' bop, '25-AUG-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                   
    select '24-SEP-09' gamedate, '270' week_nbr, '21-SEP-09' bow, '22-SEP-08' bow_ly, 9 per_nbr , 5 per_week_nbr, '24-AUG-09' bop, '25-AUG-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                   
    select '25-SEP-09' gamedate, '271' week_nbr, '21-SEP-09' bow, '22-SEP-08' bow_ly, 9 per_nbr, 5 per_week_nbr, '24-AUG-09' bop, '25-AUG-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                   
    select '26-SEP-09' gamedate, '272' week_nbr, '21-SEP-09' bow, '22-SEP-08' bow_ly, 9 per_nbr, 5 per_week_nbr, '24-AUG-09' bop, '25-AUG-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                   
    select '27-SEP-09' gamedate, '273' week_nbr, '21-SEP-09' bow, '22-SEP-08' bow_ly, 9 per_nbr, 5 per_week_nbr, '24-AUG-09' bop, '25-AUG-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                   
    select '28-SEP-09' gamedate, '274' week_nbr, '28-SEP-09' bow, '29-SEP-08' bow_ly, 10 per_nbr,  1 per_week_nbr, '28-SEP-09' bop, '29-SEP-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                  
    select '29-SEP-09' gamedate, '275' week_nbr, '28-SEP-09' bow, '29-SEP-08' bow_ly, 10 per_nbr,  1 per_week_nbr, '28-SEP-09' bop, '29-SEP-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                  
    select '30-SEP-09' gamedate, '276' week_nbr, '28-SEP-09' bow, '29-SEP-08' bow_ly, 10 per_nbr, 1 per_week_nbr, '28-SEP-09' bop, '29-SEP-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual union all                                                                                                                  
    select '01-OCT-09' gamedate, '277' week_nbr, '28-SEP-09' bow, '29-SEP-08' bow_ly, 10 per_nbr, 1 per_week_nbr, '28-SEP-09' bop, '29-SEP-08' bop_ly, '29-DEC-08' boy, '29-DEC-07' boy_ly from dual                                                                                                               
    )
    which allowed me to substitute (see the function DDL below):
    ,  nvl( case
                  when week_date < dates.bop then 0
                  else fnc_get_per_week_nbr(week_date)
                end
            , 7)  as week_nbr
    This is where I got stuck, but I'm sure I was on the right track:
    with report_dates as
    --=== INSERT WITH CLAUSE FROM ABOVE ===---                                                             
    SELECT  HOT_DOG_STAND_ID
             --== when week_date < beginning of period then week nbr = 0 to representing beginning of year to end of previous period
           ,  nvl( case
                  when week_date < dates.bop then 0
                  else fnc_get_per_week_nbr(week_date)
                end
            , 7)  as week_nbr
    , SUM(NET_SALES2) AS net_sales2
    , SUM(BUNS24434 ) BUNS24434
    , SUM(PICKELS_AW38) PICKELS_AW38
    , max(sum(net_sales2))  over partition by(  ??? where week_nbr in (1,2,3,4,5) net_sales2_ptd
    , max(sum(BUNS24434))  over partition by(  ??? where week_nbr in (1,2,3,4,5) BUNS24434_ptd
    , max(sum(PICKELS_AW38))  over partition by(  ??? where week_nbr in (1,2,3,4,5) PICKELS_AW38_ptd
    FROM period_data per
    inner join report_dates dates on dates.game_date = TO_DATE( '26-AUG-09' , 'DD-MON-YY')
    WHERE week_DATE BETWEEN dates.boy AND to_date(dates.bow, 'dd-mon-yy') + 6
    GROUP BY  hot_dog_stand_id
    ,    ROLLUP (
                    case
                  when week_date < dates.bop then 0
                  else fnc_get_per_week_nbr(week_date)
                 end
                )
    ORDER BY  week_nbr
    ;
    As a footnote, in my live database report_dates is a table, so I cleaned up the main SELECT with the referenced function:
    create or replace FUNCTION fnc_get_per_week_nbr(dte IN date)
    RETURN number IS
    out_week_nbr  number;
    BEGIN
    --== ABBREVS: beginning-of-week, period-week-nbr, beginning-of-period, beginning-of-period-last-year
    with report_dates as
                ---== Replace with Select statements to have sample data for func ==--                                                                                                         
    select per_week_nbr into out_week_nbr from report_dates where game_date =  dte;
         return out_week_nbr;
    END fnc_get_per_week_nbr;
    Without the correct desired results, I can't be sure if this is right.
    Your results were correct, totals and all. Thank you.
    There was a lot of unnecessary dividing by 8 going on.
    x / 8 = 0 if and only if x = 0, so there's no need to divide by 8 when testing for division by 0.
    (a/8) / (b/8) = a / b, so there's no need to divide by 8 at all when computing efficiency.
    Thanks, I had missed that.
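    For anyone landing on this thread later: one way to get both the PTD row (week 6) and the YTD row (week 7) in a single pass, without the UNION ALL, is GROUPING SETS. This is only a sketch against the PERIOD_DATA table and the :start_date bind variable shown above (showing three of the measure columns; the rest follow the same pattern), not the poster's final code:
    SELECT  hot_dog_stand_id
    ,       CASE
                WHEN GROUPING(week_nbr) = 0   THEN week_nbr   -- weeks 0 .. 5
                WHEN GROUPING(in_period) = 0  THEN 6          -- PTD (weeks 1 - 5 only)
                ELSE 7                                        -- YTD (everything)
            END                 AS week_nbr
    ,       SUM(net_sales2)     AS net_sales2
    ,       SUM(buns24434)      AS buns24434
    ,       SUM(pickels_aw38)   AS pickels_aw38
    FROM   (
            SELECT  hot_dog_stand_id, net_sales2, buns24434, pickels_aw38
            ,       CASE
                        WHEN week_date >= TO_DATE(:start_date, 'DD-MON-YYYY')
                         AND week_date <  TO_DATE(:start_date, 'DD-MON-YYYY') + 35
                        THEN 1
                        ELSE 0
                    END AS in_period
            ,       CASE
                        WHEN week_date >= TO_DATE(:start_date, 'DD-MON-YYYY')
                         AND week_date <  TO_DATE(:start_date, 'DD-MON-YYYY') + 35
                        THEN 1 + FLOOR((week_date - TO_DATE(:start_date, 'DD-MON-YYYY')) / 7)
                        ELSE 0
                    END AS week_nbr
            FROM    period_data
            WHERE   week_date BETWEEN TRUNC(TO_DATE(:start_date, 'DD-MON-YYYY'), 'IY')
                                  AND TO_DATE(:start_date, 'DD-MON-YYYY') + 34
           )
    GROUP BY hot_dog_stand_id
    ,        GROUPING SETS ( (in_period, week_nbr)   -- week detail rows 0 - 5
                           , (in_period)             -- subtotal per in_period flag
                           , ()                      -- grand total = YTD
                           )
    HAVING  NOT (GROUPING(week_nbr) = 1 AND GROUPING(in_period) = 0 AND in_period = 0)
    ORDER BY 2;   -- order by the computed week_nbr
    The GROUPING() function distinguishes the detail rows from the two super-aggregate rows, so the PTD row includes only the five period weeks while the grand-total row also includes week 0.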

  • Need help getting YTD total

    I've got a period-to-date report with the following columns:
    week1 tots, week2 tots, week3 tots, week4 tots, week5 tot, period-to-date tots, year-to-date tots
    I have a SELECT statement which totals data for the entire year and separates the current period's totals
    by grouping on week_nbr. Any date between the beginning of the year and the end of the previous period is week 0.
    The SELECT statement returns 6 rows: one for each week in the period and one with week_nbr = 0, which represents the totals from the beginning of the year
    to the end of the previous period.
    The SELECT statement returns the data correctly. I need help getting the YTD total, (weeks 1 - 5) + (week 0 totals), for each column.
    This means that I will have a 7th record containing the YTD totals. (I am not concerned with the PTD totals.)
    I tried SUM ... OVER (PARTITION BY ...), but the complex DECODE statement gave me problems.
      CREATE TABLE PERIOD_DATA
       (     "HOT_DOG_STAND_ID" NUMBER NOT NULL ENABLE,
         "WEEK_DATE" DATE,
         "NET_SALES2" NUMBER,
         "BUNS24434" NUMBER,
         "PICKELS_AW38" NUMBER,
         "MUSTARD_TB56" NUMBER,
         "CHICKENHEADS33" NUMBER,
         "PIECES_SOLD34" NUMBER,
         "SCRAPS35" NUMBER,
         "PIECES_UNACCOUNTED" NUMBER
    REM INSERTING into PERIOD_DATA
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('29-DEC-08','DD-MON-RR HH.MI.SSXFF AM'),14301.39,13951.26,3431.13,0,3680,2484,378,818);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('05-JAN-09','DD-MON-RR HH.MI.SSXFF AM'),14651.37,14651.37,3249.55,0,3200,2419,505,276);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('12-JAN-09','DD-MON-RR HH.MI.SSXFF AM'),14169.89,14169.89,2463.53,0,3136,2080,474,582);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('19-JAN-09','DD-MON-RR HH.MI.SSXFF AM'),15864.46,15864.46,3245.49,0,3472,2764,475,233);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('26-JAN-09','DD-MON-RR HH.MI.SSXFF AM'),15961.2,15916.23,3395.51,0,3648,2838,392,418);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('02-FEB-09','DD-MON-RR HH.MI.SSXFF AM'),19066.4,19066.4,4165.07,0,4336,3682,333,321);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('09-FEB-09','DD-MON-RR HH.MI.SSXFF AM'),18415.74,18415.74,4024.74,0,4032,3365,482,185);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('16-FEB-09','DD-MON-RR HH.MI.SSXFF AM'),18014,17849,3486.33,0,3840,3238,374,228);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('23-FEB-09','DD-MON-RR HH.MI.SSXFF AM'),18671.09,18626.12,3729.42,0,3888,2970,353,565);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('02-MAR-09','DD-MON-RR HH.MI.SSXFF AM'),17636,17636,3815,0,3424,2840,490,94);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('09-MAR-09','DD-MON-RR HH.MI.SSXFF AM'),17235.52,17145.58,3897.42,0,3504,2928,421,155);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('16-MAR-09','DD-MON-RR HH.MI.SSXFF AM'),15989.27,15989.27,3372.95,0,3728,3051,369,308);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('23-MAR-09','DD-MON-RR HH.MI.SSXFF AM'),19067.69,18960.41,4152.6,0,4048,3293,442,313);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('30-MAR-09','DD-MON-RR HH.MI.SSXFF AM'),18717.99,18717.99,3923.69,0,4408,3219,593,596);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('06-APR-09','DD-MON-RR HH.MI.SSXFF AM'),17335.16,17335.16,3769.08,0,3928,2997,514,417);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('13-APR-09','DD-MON-RR HH.MI.SSXFF AM'),18967.39,18967.39,4157.76,0,4144,2991,527,626);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('20-APR-09','DD-MON-RR HH.MI.SSXFF AM'),23090.88,23090.88,4427.96,0,5544,4493,560,491);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('27-APR-09','DD-MON-RR HH.MI.SSXFF AM'),24197.98,24132.99,4248.66,0,6680,5190,606,884);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('04-MAY-09','DD-MON-RR HH.MI.SSXFF AM'),20202.21,20137.22,3714.68,0,7052,6170,422,460);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('11-MAY-09','DD-MON-RR HH.MI.SSXFF AM'),18514.48,18514.48,3266.06,0,5508,4178,571,759);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('18-MAY-09','DD-MON-RR HH.MI.SSXFF AM'),18678.68,18678.68,3814.07,0,5824,4345,633,846);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('25-MAY-09','DD-MON-RR HH.MI.SSXFF AM'),17937.18,17937.18,3051.52,0,4844,4986,529,-671);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('01-JUN-09','DD-MON-RR HH.MI.SSXFF AM'),17445.75,17445.75,3079.91,0,5028,4810,656,-438);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('08-JUN-09','DD-MON-RR HH.MI.SSXFF AM'),17327.88,17327.88,3263.29,0,6112,4674,672,766);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('15-JUN-09','DD-MON-RR HH.MI.SSXFF AM'),17241.72,16937.33,3328.27,0,5792,4490,567,735);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('22-JUN-09','DD-MON-RR HH.MI.SSXFF AM'),16625.83,16625.83,3485.18,0,5408,4319,761,328);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('29-JUN-09','DD-MON-RR HH.MI.SSXFF AM'),17002.84,17002.84,3091.09,0,5664,4369,544,751);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('06-JUL-09','DD-MON-RR HH.MI.SSXFF AM'),16339.19,16274.2,3075.3,0,4784,3440,697,647);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('13-JUL-09','DD-MON-RR HH.MI.SSXFF AM'),17165.12,16885.14,3458.03,0,4320,3296,640,384);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('20-JUL-09','DD-MON-RR HH.MI.SSXFF AM'),17029.77,16899.79,3198.91,0,4448,3449,645,354);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('27-JUL-09','DD-MON-RR HH.MI.SSXFF AM'),16596.89,16596.89,3015.54,0,4624,3288,665,671);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('03-AUG-09','DD-MON-RR HH.MI.SSXFF AM'),16468.58,16468.58,2981.35,0,2224,3495,564,-1835);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('10-AUG-09','DD-MON-RR HH.MI.SSXFF AM'),18625.48,18550.5,3524.44,0,4856,3482,578,796);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('17-AUG-09','DD-MON-RR HH.MI.SSXFF AM'),24538.54,24323.55,5580.71,0,5260,3771,608,881);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('24-AUG-09','DD-MON-RR HH.MI.SSXFF AM'),18081.37,18081.37,3533.45,0,5980,3080,553,2347);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('31-AUG-09','DD-MON-RR HH.MI.SSXFF AM'),17183.25,17183.25,3487.12,0,2544,3262,615,-1333);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('07-SEP-09','DD-MON-RR HH.MI.SSXFF AM'),17688.41,17575.29,3424.17,0,4800,3480,591,729);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('14-SEP-09','DD-MON-RR HH.MI.SSXFF AM'),18211.29,18211.29,3806.32,0,3968,3104,527,337);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('21-SEP-09','DD-MON-RR HH.MI.SSXFF AM'),16809.21,16744.22,3014.61,0,4128,3124,710,294);
    SELECT HOT_DOG_STAND_ID
    , DECODE(TRUNC(week_date , 'iw') ,
             to_date('24-AUG-09' , 'dd-mon-rr') , 1 ,
             to_date('24-AUG-09' , 'dd-mon-rr') + 7 , 2 ,
             to_date('24-AUG-09' , 'dd-mon-rr') + 14 , 3 ,
             to_date('24-AUG-09' , 'dd-mon-rr') + 21 , 4 ,
             to_date('24-AUG-09' , 'dd-mon-rr') + 28 , 5 , 0) AS week_nbr
    , SUM(NET_SALES2) AS net_sales2
    , SUM(BUNS24434 ) BUNS24434
    , SUM(PICKELS_AW38) PICKELS_AW38
    , SUM(MUSTARD_TB56) MUSTARD_TB56
    , SUM(CHICKENHEADS33) CHICKENHEADS33
    , SUM(PIECES_SOLD34) PIECES_SOLD34
    , SUM(SCRAPS35) SCRAPS35
    , SUM(PIECES_UNACCOUNTED) * - 1 PIECES_UNACCOUNTED
       /*--== Head average  net_sales / chickenusage*/
    , CASE
          WHEN NVL( SUM(ChickenHeads33) / 8 , 0) = 0 THEN 0
          ELSE ROUND(SUM(net_sales2) / ( SUM(ChickenHeads33) / 8 ) , 2)
       END AS Head_average
       /*--=== Efficiency =   (ChickenUsage  - scrappedDiv8 - unaccountedDiv8) / ChickenUsage)  * 100*/
    , CASE
          WHEN NVL(SUM(ChickenHeads33) / 8 , 0) = 0 THEN 0
          ELSE ROUND((((SUM(ChickenHeads33) / 8 ) - ( SUM(scraps35) / 8 ) - (SUM(pieces_unaccounted) / 8 )) / (SUM(ChickenHeads33) / 8 )) * 100 , 2)
       END AS efficiency
    FROM period_data per
    WHERE week_DATE BETWEEN TRUNC(TO_DATE( '24-AUG-09' , 'DD-MON-YY') , 'IY') AND TRUNC(TO_DATE( '24-AUG-09' , 'DD-MON-YY') , 'IW') + 6 + 7 * 4
    GROUP BY hot_dog_stand_id
    , DECODE(TRUNC(week_date , 'iw') ,
          to_date('24-AUG-09' , 'dd-mon-rr') , 1 ,
          to_date('24-AUG-09' , 'dd-mon-rr') + 7 , 2 ,
          to_date('24-AUG-09' , 'dd-mon-rr') + 14 , 3 ,
          to_date('24-AUG-09' , 'dd-mon-rr') + 21 , 4 ,
          to_date('24-AUG-09' , 'dd-mon-rr') + 28 , 5 ,
          0)
    ORDER BY DECODE(TRUNC(week_date , 'iw') , to_date('24-AUG-09' , 'dd-mon-rr') , 1 , to_date('24-AUG-09' , 'dd-mon-rr') + 7 , 2 , to_date('24-AUG-09' , 'dd-mon-rr') + 14 , 3 , to_date('24-AUG-09' , 'dd-mon-rr') + 21 , 4 , to_date('24-AUG-09' , 'dd-mon-rr') + 28 , 5 , 0);
    The expected results will be:
    HOT_DOG_STAND_ID       WEEK_NBR               NET_SALES2             BUNS24434              PICKELS_AW38           MUSTARD_TB56           CHICKENHEADS33         PIECES_SOLD34          SCRAPS35               PIECES_UNACCOUNTED     HEAD_AVERAGE           EFFICIENCY            
    141                    7                      697067.09              694887.4               139149.91              0                      175808                 139454                 21036                  15318                  31.72                  79.32
    You can get these same results by running an endpoint-to-endpoint query:
    SELECT  HOT_DOG_STAND_ID
         , max(7) as week_nbr
         ,sum(NET_SALES2)      net_sales2
          ,sum(BUNS24434 )        BUNS24434
          ,sum(PICKELS_AW38)      PICKELS_AW38
          ,sum(MUSTARD_TB56)     MUSTARD_TB56
          ,sum(CHICKENHEADS33)   CHICKENHEADS33
          ,sum(PIECES_SOLD34)    PIECES_SOLD34
          ,sum(SCRAPS35)         SCRAPS35
          ,sum(PIECES_UNACCOUNTED)   PIECES_UNACCOUNTED
        ---===== Copied code from outer query
              --==  net_sales / chickenusage 
                      ,   CASE
                             WHEN NVL( sum(ChickenHeads33) / 8    ,0)  = 0 then 0
                             ELSE ROUND(sum(net_sales2)/   ( sum(ChickenHeads33) / 8    ) , 2)
                          END as Head_average
                        --=== Efficiency =   (ChickenUsage  - scrappedDiv8 - unaccountedDiv8) / ChickenUsage)  * 100
                        ,   CASE
                                  WHEN NVL(sum(ChickenHeads33) / 8    ,0)  = 0 then 0
                                  ELSE   ROUND((((sum(ChickenHeads33) / 8 )  - ( sum(scraps35) / 8 ) - (sum(pieces_unaccounted) / 8 )) / (sum(ChickenHeads33) / 8 )) * 100, 2)
                          END as efficiency  
    from period_data
    WHERE week_DATE BETWEEN TRUNC(TO_DATE( '24-AUG-09' ,'DD-MON-YY'), 'IY') AND TO_DATE( '27-sep-09' ,'DD-MON-YY')
    group by hot_dog_stand_id;
    Thanks in advance.

    Hi,
    Welcome to the forum!
    Thanks for posting the CREATE TABLE and INSERT statements; that's very helpful. You could teach something to some people who have been using this forum for years (except that nobody can teach them).
    user12335325 wrote:
    The expected results will be:
    HOT_DOG_STAND_ID       WEEK_NBR               NET_SALES2             BUNS24434              PICKELS_AW38           MUSTARD_TB56           CHICKENHEADS33         PIECES_SOLD34          SCRAPS35               PIECES_UNACCOUNTED     HEAD_AVERAGE           EFFICIENCY            
    141                    7                      697067.09              694887.4               139149.91              0                      175808                 139454                 21036                  15318                  31.72                  79.32                 
    Do you mean the expected results will include the row above, and that the results will be this row along with the 6 rows you're already getting? (If you wanted just that one row, I suppose you would just run your second query.)
    That sounds like a job for ROLLUP.
    VARIABLE   start_date     VARCHAR2 (11);
    EXEC         :start_date  := '24-AUG-2009';
         SELECT  HOT_DOG_STAND_ID
         ,      NVL (CASE
                   WHEN  week_date >= TO_DATE( :start_date, 'DD-MON-YYYY')
                   AND   week_date <  TO_DATE( :start_date, 'DD-MON-YYYY') + 35
                   THEN  1 + FLOOR ( (week_date - TO_DATE( :start_date, 'DD-MON-YYYY'))
                                             / 7 )
                   ELSE  0
                      END
                 , 7
                 )          AS week_nbr
    , SUM(NET_SALES2) AS net_sales2
    , SUM(BUNS24434 ) BUNS24434
    , SUM(PICKELS_AW38) PICKELS_AW38
    , SUM(MUSTARD_TB56) MUSTARD_TB56
    , SUM(CHICKENHEADS33) CHICKENHEADS33
    , SUM(PIECES_SOLD34) PIECES_SOLD34
    , SUM(SCRAPS35) SCRAPS35
    , SUM(PIECES_UNACCOUNTED) * - 1 PIECES_UNACCOUNTED
       /*--== Head average  net_sales / chickenusage*/
    , CASE
          WHEN NVL( SUM(ChickenHeads33) / 8 , 0) = 0 THEN 0
          ELSE ROUND(SUM(net_sales2) / ( SUM(ChickenHeads33) / 8 ) , 2)
       END AS Head_average
       /*--=== Efficiency =   (ChickenUsage  - scrappedDiv8 - unaccountedDiv8) / ChickenUsage)  * 100*/
    , CASE
          WHEN NVL(SUM(ChickenHeads33) / 8 , 0) = 0 THEN 0
          ELSE ROUND((((SUM(ChickenHeads33) / 8 ) - ( SUM(scraps35) / 8 ) - (SUM(pieces_unaccounted) / 8 )) / (SUM(ChickenHeads33) / 8 )) * 100 , 2)
       END AS efficiency
    FROM period_data per
    WHERE week_DATE BETWEEN TRUNC(TO_DATE( '24-AUG-09' , 'DD-MON-YY') , 'IY') AND TRUNC(TO_DATE( '24-AUG-09' , 'DD-MON-YY') , 'IW') + 6 + 7 * 4
    GROUP BY  hot_dog_stand_id
    ,           ROLLUP (
                 CASE
                   WHEN  week_date >= TO_DATE( :start_date, 'DD-MON-YYYY')
                   AND   week_date <  TO_DATE( :start_date, 'DD-MON-YYYY') + 35
                   THEN  1 + FLOOR ( (week_date - TO_DATE( :start_date, 'DD-MON-YYYY'))
                                             / 7 )
                   ELSE  0
                 END          -- week_nbr
                )
    ORDER BY  week_nbr
    ;
    Notice I simplified the computation of week_nbr.
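    In case the NVL is not obvious: ROLLUP adds one super-aggregate row with NULL in the rolled-up expression, and NVL(..., 7) relabels that NULL row as week 7. A tiny standalone illustration with made-up numbers:
    WITH t AS (
        SELECT 0 AS week_nbr, 100 AS amt FROM dual UNION ALL
        SELECT 1 AS week_nbr,  10 AS amt FROM dual UNION ALL
        SELECT 2 AS week_nbr,  20 AS amt FROM dual
    )
    SELECT  NVL(week_nbr, 7) AS week_nbr
    ,       SUM(amt)         AS amt
    FROM    t
    GROUP BY ROLLUP (week_nbr)
    ORDER BY 1;
    -- returns weeks 0, 1 and 2 plus one extra row: week_nbr = 7, amt = 130 (the total)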
    Some other people have asked questions about hot dog stands recently.
    I'm curious; is this from a course? If so, where? What is the textbook (if any)?
    Thanks.

  • Question On YTD Total Sales Query

    Hello,
    We use this query below to see YTD sales for each of our BP Customers:
    SELECT T0.CardCode 'Acct #', T0.CardName Company, T0.Address
    ' Address', T0.City ' City', T0.State1 State, T0.ZipCode
    'Billing Zip', T0.Phone1 Phone, T0.Balance ' Balance',
    T1.SlpName 'Sales Rep',
    T2.PymntGroup Terms, T3.GroupName 'Group', ((SELECT ISNULL(SUM(INV1.LINETOTAL),0)
    FROM INV1 INNER JOIN OINV ON INV1.DocEntry = OINV.DocEntry
    WHERE OINV.CardCode = T0.CardCode AND Year(INV1.DocDate) = Year(GetDate()))-(SELECT ISNULL(SUM(RIN1.LINETOTAL),0)
    FROM RIN1 INNER JOIN ORIN ON RIN1.DocEntry = ORIN.DocEntry
    WHERE ORIN.CardCode = T0.CardCode AND Year(RIN1.DocDate) = Year(GetDate()))) [YTD Sales]
    FROM OCRD T0
    LEFT JOIN OSLP T1 ON T1.SlpCode = T0.SlpCode
    LEFT JOIN OCTG T2 ON T2.GroupNum = T0.GroupNum
    LEFT JOIN OCRG T3 ON T3.GroupCode = T0.GroupCode
    WHERE T0.CardType = 'C'
    We use this query below to see daily Invoice and Credit Memo postings for a selected period:
    SELECT 'INVOICE' as "Doc Type", T0.DOCNUM as "Doc Number", T0.CARDCODE as "Customer Code", T0.CARDNAME as "Customer Name", T0.DOCDATE as "Posting Date", T0.NUMATCARD as "Customer Ref #", T0.DocDueDate, T0.DocTotal
    FROM [dbo].[OINV] T0 WHERE T0.DOCDATE BETWEEN '[%0]' And '[%1]'
    UNION ALL
    SELECT 'CREDIT MEMO', T0.DOCNUM,T0.CARDCODE, T0.CARDNAME, T0.DOCDATE, T0.NUMATCARD, T0.DocDueDate, -1*T0.DocTotal
    FROM [dbo].[ORIN] T0 WHERE T0.DOCDATE BETWEEN '[%0]' And '[%1]'
    My question is -- shouldn't the sum of the YTD column in the 1st Query be the same as the sum of the Doc Total column in the 2nd Query (given that all dates are selected in the 2nd Query)? 
    This doesn't appear to be the case and I was wondering why?
    Thanks,
    Mike

    Mike,
    The DocTotal may contain freight and handling expenses, while the first query only takes the SUM of the line totals of the items.
    That's why they may be different.
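    Mike, one way to see where the gap comes from is to break DocTotal into its parts per customer. A rough sketch (the column names TotalExpns for freight, VatSum for tax and DiscSum for the document discount are assumptions; please verify them against your OINV/ORIN tables):
    SELECT T0.CardCode,
           SUM(T0.DocTotal)                                          AS [Doc Totals],
           SUM(T0.VatSum)                                            AS [Tax],
           SUM(T0.TotalExpns)                                        AS [Freight/Handling],
           SUM(T0.DiscSum)                                           AS [Discount],
           SUM(T0.DocTotal - T0.VatSum - T0.TotalExpns + T0.DiscSum) AS [Approx. Line Totals]
    FROM OINV T0
    WHERE Year(T0.DocDate) = Year(GetDate())
    GROUP BY T0.CardCode
    Running the same breakdown over ORIN (with the sign flipped) and comparing the last column with the YTD figure from the first query should show whether freight, tax or discounts explain the difference.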
    Suda

  • Customer statement total

    Hello SAP Gurus
    I have an issue in A/R support:
    The client wants a report that shows the total of all customer statements (F.27). They run F.27 every month for a set of customers, and they need a report that shows the total of this activity.
    I thought of using FBL5N or S_ALR_87012173, but the problem is that when they run customer statements they run them for a specific period of time, e.g. 10.03.2007 to 10.04.2007, whereas in FBL5N and S_ALR_87012173 we don't have the option of giving a date range. The 'Open at Key Date' option is there, but it brings in all the open items up to that date.
    Please give me any ideas on how to resolve this issue. Is there any chance of doing it with dynamic selections in FBL5N? I am not sure, of course.
    Thanks for your help

    Hi,
    You can use FBL5N. In the dynamic selections you can choose the posting date range you want, and in the key date for clearing you can give the date for which you want the report. This will give you all items posted between those dates that were open on that key date.
    Hope you can do it now.
    regards
    Krishnan

  • Customizing grand total columns in pivot view

    Hi,
    I need to customize the grand total column names in a pivot view.
    For example, if I have two measures, order quantity and order amount, and I select 'Aggregation: After' in the column properties, both columns are labelled 'Grand Total'. I need to display them as 'Grand Total for OQ' and 'Grand Total for OA'.
    Please give suggestions
    Thanks,
    Kartheek.
    Edited by: 998231 on May 26, 2013 11:31 PM

    Hi Kartheek,
    In the pivot view, the default Grand Total cannot be renamed to give a measure-specific grand total label (the one we specify in the pivot view either by row or by column).
    But you can try having a separate measure (in the criteria) which calculates the grand total of each measure by the dimensions that are available on the report.
    In short, you can achieve this using a table view and a narrative view together (not a pivot view).
    The expression to be used for the grand total in the report criteria is Sum(Measure BY Dim1, Dim2... DimN),
    where Dim1 and Dim2 are the dimensional columns available on the report and Measure is your OQ/OA.
    Create a table view as the default view (do not add any report-level aggregation) and then add a narrative view to display the Grand Total OQ and Grand Total OA. You can add your custom labels with HTML tags; display the first row only, as the values will be repeated for the grand total.
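    For example (just a sketch; the folder and column names are placeholders for whatever your subject area exposes), the two criteria columns could use formulas along the lines of:
    SUM("Sales Facts"."Order Quantity" BY "Products"."Product Name")
    SUM("Sales Facts"."Order Amount" BY "Products"."Product Name")
    Each one repeats that measure's total for the dimension values on every row, which is why the narrative view shows the first row only and lets you attach the 'Grand Total for OQ' / 'Grand Total for OA' labels yourself.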
    If you find any other solution do let me know. As of now I can think of this as a best solution.
    Regards,
    Kashi

  • Customer wise Total order qnty,dispatched qnty & qnty for despatch

    Hi all,
    How can we check, customer-wise and for an individual material, the total sales order quantity, dispatched quantity and quantity available for dispatch in a single report?
    I tried MB52, which shows the quantity available for dispatch against each sales order, and VA05, which shows the order quantity with partial delivery (but not the quantity).
    Can we get the above in a single report?

    Hello Pawar,
    There is no standard report available, as far as I know, to check customer-wise, for an individual material, the total sales order quantity, dispatched quantity and quantity available for dispatch in a single report.
    But in transaction MC18 you can do the configuration for the field catalog.
    Steps for setting up a Field Catalog -
    1. Specify a name and a description for the field catalog you want to create in the field Field catalog.
    2. Assign the field catalog to an application.
    3. Define the catalog category.
    4. Press ENTER.
    You branch to the maintenance screen of the individual fields of the catalogs.
    5. Depending on the catalog category you have chosen, select Edit -> Characteristics or Key figures or Date.
    Two dialog boxes are displayed for selecting the fields. One dialog box lists the fields you have already selected. Choose Selection list to select further fields. If you have not yet selected any fields, the second dialog box is displayed at the same time. The second dialog box consists of two lists. The list on the right contains all source tables from which you can select fields for the field catalogs. In this case, only the source tables valid for the selected application are displayed.
    6. If you want to display the fields of a certain source table, you can select this by double-clicking on it.
    The list on the left shows the fields of the selected source table.
    7. If you want to copy a field from the list into your field catalog, position the cursor on the corresponding field name and choose Copy.
    The selected field is marked and copied into the list of the already selected fields.
    Note
    When you select the source tables and fields, the respective descriptions are displayed.
    You can use the Switch display function to display the technical names (table and field names from the Data Dictionary).
    8. Press Copy + close to branch to the list of the selected fields.
    You can now also edit this list, i.e. you can change the sequence in which the fields appear, delete fields from the list or add new fields from the source tables.
    9. The sequence of the selected characteristics can be changed as follows:
    a) Select the characteristic, or a block of characteristics that is to be moved. The selection can be made using the icon Select/Deselect or Select block, or with a double click. The icon move is displayed.
    b) Position your cursor on a different characteristic and choose the icon Move.
    The characteristic or block of characteristics you initially selected will now be inserted above the second selected characteristic.
    10. To copy the selected fields to your field catalog, choose Copy.
    11. Save the field catalog.
    12. Then define the update rules.
    Regards,
    SARTHAK.

  • Custom Running Total on Crosstab

    Because I need to do some special calculations on a crosstab, I need to use the 'Suppress' formula technique to do custom calculations on the crosstab.
    The crosstab exists in the report footer and everything works fine except when the crosstab expands onto the next page.  Basically, if the crosstab and its grand total fit on one page, the calculation is fine.  If the crosstab requires another page, the grand total on the last page will only sum up the crosstab values on the page where the grand total is displayed.
    Is there any way to make it so that the grand total does not reset if it happens to expand to multiple pages?
    Here are my formulas:
    1.  Formula on Row 1 (Suppress formula) - This will constantly add value to a variable as long as Row 1 is printed.
    whileprintingrecords;
    numbervar grandtotal;
    numbervar units;
    numbervar budget;
    grandtotal := grandtotal + (units * budget);
    false;
    2. Formula on Grand Total (Display String formula) - This basically displays the value of 'grandtotal' after all rows have been printed.
    whileprintingrecords;
    numbervar grandtotal;
    '$ ' & totext(grandtotal);

    You're using Crystal Syntax for your formulas, which I have to admit I'm not too familiar with...  But in Basic Syntax, there's a difference between Global and Shared variables, where Globals do not carry to the report footer but Shared variables do.
    Perhaps if you use Shared variables instead of Globals (if that is what the syntax indicates) your problem will go away...
    HTH,
    Carl

  • Weekly Group Total and associated YTD total

    Hello,
    I need to show current week group total and total year to date of the same group.  My columns are Model, Department, Issue.  Currently, the report is grouped on Model, Department and Issue with a filter for date.  Can this be done?   Thank you

    Create a formula field, say {@Current Week Amount} as (basic syntax):
    if datepart("ww", {datefield}) = datepart("ww", CurrentDate) then
      formula = {amount}
    else
      formula = 0
    end if
    You can then sum() this, or place it in a detail line, or whatever.  It will only have a (non-zero) value when the date field is in the current week.
    HTH,
    Carl

  • Table and Field for Open Order Quantity for a Customer and Material

    Hi
    I created two sales orders for quantities of 3 and 5, and delivered 1 qty in the second order.
    When I checked table VBBE or VA05, I am able to see 3 and 4 quantities open,
    but I want to see the total of 7 as the open order qty, i.e. 3 + 4 = 7,
    because both orders are placed by the same customer for the same material.
    Can you please tell me in which table exactly the total open order quantity for a customer and the total open order quantity for a material are stored?
    Madhu
    Edited by: madhubabu rao on Jul 1, 2008 1:49 PM

    Hi,
    You can get the information in VA05, as there are columns like confirmed quantity, order quantity and status. You can use the summation button (add up values) to get the required information.
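    There is no single field where the total is stored; it has to be aggregated. If you prefer to pull it straight from the database instead of using the summation in VA05, a rough sketch (assuming the open quantity sits in VBBE-OMENG and the sold-to party is read from VBAK; please verify the field names in your system):
    SELECT vbak.kunnr      AS customer,
           vbbe.matnr      AS material,
           SUM(vbbe.omeng) AS total_open_qty
    FROM   vbbe
    JOIN   vbak ON vbak.vbeln = vbbe.vbeln
    GROUP BY vbak.kunnr, vbbe.matnr
    In ABAP you would do the equivalent with a SELECT ... SUM ... GROUP BY over the same tables.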
    Thx,
    Pramod

  • Need help for creation of aging report of customer

    Hi Experts,
    I am going to create an aging report for customers, so I need to understand the basic concept behind creating an aging report.
    Thanks and Regards
    vijay
    Edited by: vijay dwivedi on Mar 9, 2009 1:19 AM

    Hi, here is a guide to understanding the components in an ageing report. Hope this helps.
    Customer Receivables Aging Report
    Field
    Description/Activity
    Currency - Click the icon and choose the currency in which you want to display the report results.
    The available options include:
    ●     Local currency
    ●     System currency
    ●     Business partner currency
    ●     Any additional currencies defined in SAP Business One
    Aging Date - Displays the aging date entered in the selection criteria window.
    Age By - Choose whether you calculate the age of the receivable according to the due date, posting date, or document date of the document/journal entry.
    Customer Code - Displays the code of the customer.
    Click the icon to open the corresponding Business Partner Master Data.
    Customer Name - Displays the name of the customer.
    Total - Displays the total amount of open receivables.
    Figures in brackets mean that you have already received payments from a customer that exceed the amount of the receivables for the period.
    Future Remit. - Displays all open receivables owed by the customer according to the date selected in the Age By field for which the date specified in the Aging Date field has not yet been reached.
    Example:
    The customer has an open A/R Invoice for USD100, due on May 1st.
    Due Date is selected in the Age By field.
    The Aging Date is April 28th.
    Since the invoice is due after the aging date, the invoice amount is displayed in the Future Remit column.
    Letter - Indicates whether a dunning letter was created.
    This column is displayed only when generating the report by Sales Documents and by Due Date
    Time Intervals - These four columns are constructed according to the specifications you made in the Interval field in the selection criteria window.
    SAP Business One displays the relevant open receivables in four columns for the time interval. The amount of future remit is first subtracted from the total sum of receivables.
    ·         Days – Each column represents the number of days entered in the selection criteria window, starting from the aging date and going backwards. The fourth column includes open receivables for the days prior to the first three columns.
    ·         Months – Each column represents one month, starting from the aging date month and going backwards. The last column includes open receivables for months prior to the first three months.
    ·         Periods – Each column represents one period (as defined on the Administration → System Initialization → Posting Periods tab), starting from the aging date period and going backwards. The last column includes open receivables for periods prior to the first three periods.
    If the number of periods in your company is less than 4, the non-relevant columns will be titled Not Valid.
    Doubtful Debt - Displays the doubtful debt amount, if one exists.
    Top Total Row - The top total row displays the sum of amounts listed in one column.
    Bottom Total Row - The bottom total row displays the percentage of open receivables for each time interval.
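    If you ever need to rebuild the same bucketing logic yourself (for example in a user query), here is a rough sketch with 30-day intervals, aging by due date, and a hypothetical view OpenARItems holding the open receivable lines (the view and column names are placeholders):
    DECLARE @AgingDate date = '2009-03-31'
    SELECT CardCode,
           SUM(CASE WHEN DueDate > @AgingDate THEN Balance ELSE 0 END) AS [Future Remit],
           SUM(CASE WHEN DueDate <= @AgingDate AND DATEDIFF(day, DueDate, @AgingDate) <= 30 THEN Balance ELSE 0 END) AS [0-30 Days],
           SUM(CASE WHEN DATEDIFF(day, DueDate, @AgingDate) BETWEEN 31 AND 60 THEN Balance ELSE 0 END) AS [31-60 Days],
           SUM(CASE WHEN DATEDIFF(day, DueDate, @AgingDate) BETWEEN 61 AND 90 THEN Balance ELSE 0 END) AS [61-90 Days],
           SUM(CASE WHEN DATEDIFF(day, DueDate, @AgingDate) > 90 THEN Balance ELSE 0 END) AS [Over 90 Days],
           SUM(Balance) AS [Total]
    FROM OpenARItems      -- hypothetical view of open A/R items (open invoices minus credit memos)
    GROUP BY CardCode
    The Total column corresponds to the report's top total row, and the percentages in the bottom total row are simply each bucket divided by that total.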
    Thanks
    Kevin

  • Customer Service Process

    Hi,
    Business Process:
    Customer has given the complaint. So Service notification will be created. It requires man power as well as components replacement.
    I have 2 different solutions. Please advice me which one is better than other & why.
    Solution 1:
    From notification, sales order will be created for configurable service material & the components to be replaced. I have created service product & configuration profile for service material. So in background, service order will be created with pre-defined task list. I will do the delivery & PGI based on sales order. I will bill based on sales order. In background, I will execute the service order & I will settle the labor cost to sales order. So I guess, while billing I can bill the customer on total value (material cost & labor cost)
    Solution 2:
    From notification, I will create the service order. I will execute the service order & I will do GI for the order. I will do the billing based on service order.
    The drawback here, I guess, is that I can't do the GI to the customer; instead I do it against the order.
    I have gone through this link.
    http://help.sap.com/saphelp_46c/helpdata/en/e6/4a903f9e0311d189b70000e829fbbd/frameset.htm
    In this link, it has been mentioned that components should be assigned to service order.
    So what would be the need for the business process 'Automatic creation of a service order from a sales order using a service product'?
    Please clarify.

    Hi,
    It totally depends on how you look at the scenario.
    If you want to show that goods are issued to the customer, then the first solution is OK, but that business process is a complex one for the user: they have to create a notification in CS, then create a sales order in SD, then the delivery, then again create a service order in CS, and again create billing with reference to the sales order. It is like jumping here and there.
    Whereas the second solution is straightforward,
    starting in CS and ending with SD billing, going back and forth only once, so it can be very easy for the user.
    Kapil

  • Calculating Year to Date Totals

    The developer advises they have no solution in Crystal for the following.  The report requires display of monthly expenditure totals and a YTD total that represents the cumulative total of each month.  The report displays a YTD total for the end of each month.  The problem is that when a given month has a zero value for the monthly expenditure, the report displays a zero value for the YTD total and fails to pick up the YTD from the previous month end....
    Does anyone understand the developer's issue and/or have a solution?

    You should be able to group on year and month, put the field in the detail section, then create a summary from
    that field.  Right-click the field, select Insert Summary, and select the groups or footers as needed.
    If that does not work, you may need to try running totals.
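    Another option, if the YTD figure can be pushed into the data source: a windowed sum avoids the zero-month problem entirely, as long as the source returns a row for every month (even a zero one). A sketch with hypothetical table and column names:
    SELECT fiscal_year,
           fiscal_month,
           monthly_expenditure,
           SUM(monthly_expenditure) OVER (PARTITION BY fiscal_year
                                          ORDER BY fiscal_month) AS ytd_expenditure
    FROM   monthly_expenditures    -- hypothetical source with one row per month, zeros included
    ORDER BY fiscal_year, fiscal_month
    Crystal then only has to display ytd_expenditure, so a zero month simply carries the previous month's cumulative value forward.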
