Group a week's data into a few subgroups

Hi,
I have a special requirement that I would like to solve using SQL. Suppose I have the following table:
Day    Count
1        5
2        15
3        10
4        31
5        25
6        2
7        5
------------------------
I would like to group the above table into 3 subgroups (as indicated by the dotted lines). As you may have realised, the grouping condition is: start from Day 1 and move towards Day 7 (end of week), accumulating the daily counts. If a single day's count is already more than 30, it forms a subgroup by itself; otherwise keep adding days until the cumulative count reaches 30 or more, then close the subgroup and reset the cumulative count to zero for the next day. The algorithm continues until the end of the week.
Thanks.

It is not for any frontend...
It is actually for statistical calculations. (Some folks in my dept pointed out that for a meaningful statistic the sample size should be more than 30; it's their rule.)
So, based on the table shown above, the final result is:
Set  Count   
1      30
2      31
3      32
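For reference, one way to compute those subgroups in SQL is recursive subquery factoring, which needs Oracle 11gR2 or later (a sketch; the table name week_data and columns day_no/cnt are assumptions, not from the original post):

with by_day as (
  select day_no, cnt, row_number() over (order by day_no) rn
  from   week_data
),
grps (rn, day_no, cnt, run_sum, grp) as (
  select rn, day_no, cnt, cnt, 1
  from   by_day
  where  rn = 1
  union all
  select o.rn, o.day_no, o.cnt,
         -- once the running sum has reached 30, start a new subgroup
         case when g.run_sum >= 30 then o.cnt else g.run_sum + o.cnt end,
         case when g.run_sum >= 30 then g.grp + 1 else g.grp end
  from   grps g
  join   by_day o on o.rn = g.rn + 1
)
select grp as set_no, sum(cnt) as total
from   grps
group by grp
order by grp;

Against the sample data this returns sets of 30, 31 and 32, matching the expected output above.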

Similar Messages

  • Group by week between date range

    Hi,
    I have a table for documents. The documents are received on different dates, and I have to calculate the number of documents received on a weekly basis.
    I need SQL to count the number of documents between two given dates, grouped by week.
    My week starts on Sunday and ends on Saturday.
    I've tried GROUP BY with to_char(documentdate,'IW'), but that takes Monday to Sunday as a week, and I'm not able to handle the incomplete weeks, i.e. in the output below the date range 1/28/2007 - 1/31/2007 is not a complete week (it runs Sunday to Wednesday).
    The output should look like this for the date range 12/31/2006 to 1/31/2007:
    week                    count of documents
    12/31/2006 - 1/6/2007   10
    1/7/2007 - 1/13/2007    40
    1/14/2007 - 1/20/2007   30
    1/21/2007 - 1/27/2007   20
    1/28/2007 - 1/31/2007   10
    Please help me to get this.
    (The columns in the documents table are documentid and documentdate.)

    You're very close with IW; you just need to shift the data to get it to fall within the correct range. (BTW, you don't need Julian dates; that's overkill.)
    trunc(mydate+1,'IW') will push the date into the correct week range (IW puts Sunday one week earlier than you want, so push the date ahead a day before truncating).
    Then subtract 1 day from the result to get it to display with Sunday's date instead of Monday's:
    trunc(mydate+1, 'IW')-1
    and the week end date is simply trunc(mydate+1, 'IW')+5
    select mydate, to_char(mydate,'Dy') dy,
           trunc(mydate+1,'IW')-1 wk_start,
           trunc(mydate+1,'IW')+5 wk_end
    from   (select sysdate-rownum mydate from dual connect by level < 20)
    order by 1

  • Grouping Via Week Ending Date

    Hi All,
    I have transaction line data for each day for our customer, and I would like to group by week-ending date. Our week runs from Sunday to Saturday.
    If I add the transaction date to the group expert, it allows me to group by week (perfect), but it does so by week commencing. I see you can group by a formula; does anyone know how to get the grouping by week ending?
    Thanks for any help and advice.
    mike

    Try using this formula...
    DateAdd("ww", DateDiff("ww", #1/6/1900#, {TableName.DateField}), #1/6/1900#)
    This formula will calculate the "Following Saturday" for any given date. If the date supplied is a Saturday, it will return that same date.
    So...
    11/2/2010 > 11/6/2010
    11/13/2010 > 11/13/2010
    11/14/2010 > 11/20/2010
    HTH,
    Jason

  • Dynamically group records by date

    I am attempting to create a report that will dynamically group records into a set number of date buckets. This is similar to grouping records by a date field and setting the days, weeks, months, etc. property, but instead of grouping by a set time span I want a specific number of date groups regardless of date span. So say I have records where the first date is today at 1 AM and the last record is today at 9 PM. I want the data grouped into 10 groups, with the time span for each group calculated as total time span / 10. The first group would be 1 AM to 3 AM, the second 3 AM to 5 AM, etc. The reason I am doing this is for a chart that displays record counts over time, where the overall timespan is never known until runtime. Setting the chart to hourly or weekly doesn't work, because if the user runs the report over a year the dates will be illegible.
    Thanks in advance!

    Well, this SHOULD be easy. But leave it to CR to make it not...
    You can start by finding the minimum & maximum dates within your range:
    Local DateTimeVar MinDate;
    MinDate := Minimum({Table.DateField})
    and
    Local DateTimeVar MaxDate;
    MaxDate := Maximum({Table.DateField})
    Then figure out what the interval would be if the span is broken down into 10 equal parts:
    DateDiff("n", {@MinDate}, {@MaxDate}) / 10
    From there just use a formula to segregate each record into the appropriate group:
    EvaluateAfter({@Interval});
    IF {Table.DateField} >= {@MinDate}
        AND {Table.DateField} <= DateAdd("n",{@Interval}, {@MinDate}) THEN 1 ELSE
    IF {Table.DateField} > DateAdd("n",{@Interval}, {@MinDate})
        AND {Table.DateField} <= DateAdd("n",{@Interval} * 2, {@MinDate}) THEN 2 ELSE
    IF {Table.DateField} > DateAdd("n",{@Interval} * 2, {@MinDate})
        AND {Table.DateField} <= DateAdd("n",{@Interval} * 3, {@MinDate}) THEN 3 ELSE
    IF {Table.DateField} > DateAdd("n",{@Interval} * 3, {@MinDate})
        AND {Table.DateField} <= DateAdd("n",{@Interval} * 4, {@MinDate}) THEN 4 ELSE
    IF {Table.DateField} > DateAdd("n",{@Interval} * 4, {@MinDate})
        AND {Table.DateField} <= DateAdd("n",{@Interval} * 5, {@MinDate}) THEN 5 ELSE
    IF {Table.DateField} > DateAdd("n",{@Interval} * 5, {@MinDate})
        AND {Table.DateField} <= DateAdd("n",{@Interval} * 6, {@MinDate}) THEN 6 ELSE
    IF {Table.DateField} > DateAdd("n",{@Interval} * 6, {@MinDate})
        AND {Table.DateField} <= DateAdd("n",{@Interval} * 7, {@MinDate}) THEN 7 ELSE
    IF {Table.DateField} > DateAdd("n",{@Interval} * 7, {@MinDate})
        AND {Table.DateField} <= DateAdd("n",{@Interval} * 8, {@MinDate}) THEN 8 ELSE
    IF {Table.DateField} > DateAdd("n",{@Interval} * 8, {@MinDate})
        AND {Table.DateField} <= DateAdd("n",{@Interval} * 9, {@MinDate}) THEN 9 ELSE
    IF {Table.DateField} > DateAdd("n",{@Interval} * 9, {@MinDate})
        AND {Table.DateField}  <= {@MaxDate} THEN 10
    This is where CR drops the ball... IMHO... it WON'T allow you to group by a formula field that uses an aggregate in the formula (in this case Minimum & Maximum)... It will, however, allow you to graph on it, which I assume is what you are actually trying to do. If anyone knows a way to work around the grouping issue, I'd love to know it myself.
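    If the bucketing can be pushed down to the database instead of computed in CR, Oracle's WIDTH_BUCKET can do the ten-way split in one pass (a sketch, not CR syntax; the table name t and column name datefield are hypothetical):

    select bucket, count(*) as rec_count
    from  (select width_bucket(datefield,
                               min(datefield) over (),
                               max(datefield) over () + 1/86400,  -- nudge the upper bound so the latest row lands in bucket 10
                               10) as bucket
           from t)
    group by bucket
    order by bucket;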
    HTH,
    Jason

  • Sum of weekly data

    Sum of weekly data, from daily data.
    create table test3(tno number(10), tdate date);
    begin
      for i in 1..20 loop
        insert into test3 values(i, sysdate+i);
      end loop;
    end;
    /
    SELECT * FROM test3;
    TNO                    TDATE                    
    1                      21-APR-11                
    2                      22-APR-11                
    3                      23-APR-11                
    4                      24-APR-11                
    5                      25-APR-11                
    6                      26-APR-11                
    7                      27-APR-11                
    8                      28-APR-11                
    9                      29-APR-11                
    10                     30-APR-11                
    11                     01-MAY-11                
    12                     02-MAY-11                
    13                     03-MAY-11                
    14                     04-MAY-11                
    15                     05-MAY-11                
    16                     06-MAY-11                
    17                     07-MAY-11                
    18                     08-MAY-11                
    19                     09-MAY-11                
    20                     10-MAY-11                
    20 rows selected
    Now I want the weekly sum, i.e. SUM(TNO) per week.

    Something like...
    with t as (select rownum as tno, sysdate+rownum as tdate from dual connect by rownum <= 20)
    -- end of test data
    select distinct trunc(tdate,'fmIW') as start_wk
          ,trunc(tdate,'fmIW')+6 as end_wk
          ,sum(tno) over (partition by trunc(tdate,'fmIW')) as sm
    from t
    order by 1;

    START_WK    END_WK              SM
    18-APR-2011 24-APR-2011         10
    25-APR-2011 01-MAY-2011         56
    02-MAY-2011 08-MAY-2011        105
    09-MAY-2011 15-MAY-2011         39

    or just a basic SUM rather than analytic...
    with t as (select rownum as tno, sysdate+rownum as tdate from dual connect by rownum <= 20)
    -- end of test data
    select trunc(tdate,'fmIW') as start_wk
          ,trunc(tdate,'fmIW')+6 as end_wk
          ,sum(tno) as sm
    from t
    group by trunc(tdate,'fmIW')
    order by 1;

    START_WK    END_WK              SM
    18-APR-2011 24-APR-2011         10
    25-APR-2011 01-MAY-2011         56
    02-MAY-2011 08-MAY-2011        105
    09-MAY-2011 15-MAY-2011         39

  • Capacity requirement group by week

    Inside CM01, given a work center, we can get capacity requirements grouped by week.
    Is there any function module to get the same answer?
    I've found that table KBED seems to hold the information I need, but I don't know how to group the records into the different weekly slots.
    Can anybody help?
    Regards,
    Norman.

    Hi,
    user588757 wrote:
    Hi Frank,
    Too good... it worked!!!!!!!!!!!! Thanks.

    If it's too good, I suppose you can always add errors.

    user588757 wrote:
    And can we give input for END DATE... it only has a start date.

    Yes, you could give the end date instead of the start date. You could allow either.
    Suppose you had a form that returned user_input_date (a DATE) and a string, user_date_type, returned by a radio group: 'START' if the user indicated that the date entered was a start date, or 'END' otherwise.
    WITH d AS
    (
        SELECT  user_input_date - CASE
                                      WHEN user_date_type = 'START'
                                      THEN 0
                                      ELSE 6
                                  END        AS input_startdate
        FROM    dual
    )
    , g AS
    (
        SELECT  input_startdate
        ,       FLOOR ((creation_date - input_startdate) / 7)  AS wk_num
        FROM    xxsirf_per_addresses_tl
        CROSS JOIN  d
    )
    SELECT    MAX (input_startdate + (7 * wk_num))      AS wk_start
    ,         MAX (input_startdate + (7 * wk_num) + 6)  AS wk_end
    ,         COUNT (*)
    FROM      g
    GROUP BY  wk_num
    ORDER BY  wk_num;

    Though if you were running this from a form, it would be easy to do sub-query d separately, and pass the input_startdate that it computed to the query I posted earlier.
    I would discourage allowing users to input both start and end dates, because they won't necessarily be six days apart.
    Of course, you could ignore one if both were entered, but that's exactly the kind of thing that confuses end users.

  • SQL for summing weekly data

    Hi
    I have two tables which have a date column and hold more than a month of data. I want to sum the weekly data, starting Monday, combining the two tables. Can you help me with the SQL for this?
    SQL> select * from ORCL_DATA_GROWTH;
    REPORT_DA DATAGROWTH                                                                                                             
    03-SEP-12       9.78                                                                                                             
    04-SEP-12       5.36                                                                                                             
    05-SEP-12       5.42                                                                                                             
    06-SEP-12      33.36                                                                                                             
    07-SEP-12       5.47                                                                                                             
    08-SEP-12        5.5                                                                                                             
    09-SEP-12        5.5                                                                                                             
    10-SEP-12       9.47                                                                                                             
    11-SEP-12       8.16                                                                                                             
    12-SEP-12      23.97                                                                                                             
    13-SEP-12      51.28                                                                                                             
    14-SEP-12      24.05                                                                                                             
    15-SEP-12      24.03                                                                                                             
    16-SEP-12      24.17                                                                                                             
    17-SEP-12      28.16                                                                                                             
    18-SEP-12      24.17                                                                                                             
    19-SEP-12      24.19                                                                                                             
    20-SEP-12      50.96                                                                                                             
    21-SEP-12      24.19   
    SQL> select * from ORCL_PURGING;
    REPORT_DA PURGING_SPACE FILE_SYSTEM                                                                                              
    01-OCT-12            18 /dborafiles/orac/ora_Test/oradata01                                                                     
    01-OCT-12             0 /dborafiles/orac/ora_Test/oradata05                                                                     
    02-OCT-12            55 /dborafiles/orac/ora_Test/oradata01                                                                     
    02-OCT-12             0 /dborafiles/orac/ora_Test/oradata05                                                                     
    03-OCT-12            21 /dborafiles/orac/ora_Test/oradata01                                                                     
    03-OCT-12             0 /dborafiles/orac/ora_Test/oradata05                                                                     
    04-OCT-12             0 /dborafiles/orac/ora_Test/oradata01                                                                     
    04-OCT-12             0 /dborafiles/orac/ora_Test/oradata05                                                                     
    05-OCT-12             0 /dborafiles/orac/ora_Test/oradata01                                                                     
    05-OCT-12             0 /dborafiles/orac/ora_Test/oradata05                                                                     
    06-OCT-12            70 /dborafiles/orac/ora_Test/oradata01                                                                     
    06-OCT-12             0 /dborafiles/orac/ora_Test/oradata05                                                                     
    07-OCT-12            21 /dborafiles/orac/ora_Test/oradata01                                                                     
    07-OCT-12             0 /dborafiles/orac/ora_Test/oradata05                                                                     
    08-OCT-12            21 /dborafiles/orac/ora_Test/oradata01                                                                     
    08-OCT-12             0 /dborafiles/orac/ora_Test/oradata05
    Combining the two tables gives the daily output sample below, but I want the output summed by week.
    SQL> select a.report_date, sum(a.PURGING_SPACE) PURGING_SPACE, b.DATAGROWTH
         from ORCL_PURGING a, ORCL_DATA_GROWTH b
         where a.report_date = b.report_date
           and a.report_date <= (sysdate) and a.report_date >= (sysdate-30)
         group by a.report_date, b.datagrowth
         order by a.report_date;
    REPORT_DA PURGING_SPACE DATAGROWTH
    19-SEP-12            77      24.19
    20-SEP-12             2      50.96
    21-SEP-12            47      24.19
    22-SEP-12            19      24.16
    23-SEP-12            22      24.05
    24-SEP-12            25      28.11
    25-SEP-12            43      24.08
    26-SEP-12            21      24.06
    27-SEP-12            22      50.86
    28-SEP-12            22      23.05
    29-SEP-12            22      23.27
    30-SEP-12            22      23.61
    01-OCT-12            18      28.67
    02-OCT-12            55      25.92
    03-OCT-12            21      23.38
    04-OCT-12             0      50.46
    05-OCT-12             0      23.62
    06-OCT-12            70      24.39
    07-OCT-12            21      24.53
    08-OCT-12            21      28.66
    09-OCT-12            51      24.41
    10-OCT-12            22      24.69
    11-OCT-12            23      50.72
    12-OCT-12            22      25.08
    13-OCT-12            25      25.57
    14-OCT-12            21      23.38
    15-OCT-12            22      27.77
    27 rows selected.

    Thanks in advance.

    Hi,
    user9256814 wrote:
    Hi
    This is my query
    WITH purging_week AS
    (
         SELECT TRUNC (report_date, 'IW')     AS report_week
         ,      SUM (purging_space)           AS total_purging_space -- same as in main query
         FROM   orcl_purging
         GROUP BY TRUNC (report_date, 'IW')
    )
    ,    data_growth_week AS
    (
         SELECT TRUNC (report_date, 'IW')     AS report_week
         ,      SUM (datagrowth)              AS total_datagrowth
         FROM   orcl_data_growth
         GROUP BY TRUNC (report_date, 'IW')
    )
    SELECT    COALESCE ( p.report_week
                       , d.report_week
                       )          AS report_week
    ,         p.total_purging_space
    ,         d.total_datagrowth
    FROM      purging_week p
    FULL OUTER JOIN data_growth_week d  ON d.report_week = p.report_week
    ORDER BY  report_week;

    That's still hard to read.
    You may have noticed that this site normally compresses whitespace.
    Whenever you post formatted text (including, but not limited to, code) on this site, type these 6 characters: {code} (small letters only, inside curly brackets) before and after each section of formatted text, to preserve spacing. BluShadow probably won't do it for you every time you post a message.
    This is just one of the many things mentioned in the forum FAQ {message:id=9360002} that can help you get better answers sooner.
    Are you still having a problem? Have you tried using in-line views? If so, wouldn't it make more sense to post the code you're using, rather than some code you're not using?

  • Sum of LineCount Including Groups and Detail Data On Each Page Used To Generate New Page If TotalPageLineCount > 28

    Post Author: tadj188#
    CA Forum: Formula
    Needed: Sum of LineCount Including Groups and Detail Data On Each Page Used To Generate New Page If TotalPageLineCount > 28
    Background:
    1) The report SQL is created with unions so that detail lines continue on a page until they reach the page footer or report footer, rather than using subreports. A subreport is now essentially a group1a, group1b, etc. (containing column headers and other data within the report with their respective detail lines). I had multiple subreports and each subreport became one union.
    Created and tested, already:
    1) I have calculated @TotalLineForEachOfTheSameGroup; now I need to sum the individual same-group totals to get the total line count on a page.
    Issue:
    1) I need this to create a break on a certain line before it dribbles into a pre-printed area.
    Other Ideas Appreciated:
    1) Groups/detail lines break inconveniently (dribble) into the pre-printed area; I'm looking for alternatives for the above situation.
    Thank you.
    Tadj

    To export all images of each page, try something like this:
    var myDoc = app.activeDocument;
    var myFolder = myDoc.filePath;
    var myImage = myDoc.allGraphics;
    for (var i = 0; myImage.length > i; i++) {
        app.select(myImage[i]);
        var myImageName = myImage[i].itemLink.name;  // name of the placed file
        app.jpegExportPreferences.jpegQuality = JPEGOptionsQuality.high;
        app.jpegExportPreferences.exportResolution = 300;
        app.selection[0].exportFile(ExportFormat.JPG, File(myFolder + "/" + myImageName + ".JPEG"), false);
        alert(myImage[i].itemLink.name);
    }

  • From where to get "First day of the week" data for all the locales, is it present in CLDR spec 24?

    I am trying to get the "first day of the week" data from CLDR spec 24 but cannot find where to look for it in the spec. I need this data to calculate the numeric value of the LOCAL day of the week.
    I need this data to implement the "c" and "cc" day formats that equal the numeric local day of the week.
    E.g. if the "first day of the week" value for a locale is 2 (Monday), the numeric value of the local day of the week will be 1 on a Monday, 2 on a Tuesday, and so on.

    Hi
    If you want the week to start with Sunday then use the following formula:
    TimestampAdd(SQL_TSI_DAY, 1-DAYOFWEEK(Date'@{var_Date}'), Date'@{var_Date}')
    If it's a retail week (starting from Monday) then follow the below:
    TimestampAdd(SQL_TSI_DAY, 1-DAYOFWEEK(Date'@{var_Date}'), Date'@{var_Date}')
    I'm assuming var_Date is the presentation variable for the prompt...

  • Count(*) with group by max(date)

    SQL> select xdesc,xcust,xdate from coba1 order by xdesc,xcust,xdate;
    XDESC XCUST XDATE
    RUB-A 11026 01-JAN-06
    RUB-A 11026 05-JAN-06
    RUB-A 11026 08-JAN-06
    RUB-A 11027 10-JAN-06
    RUB-B 11026 02-JAN-06
    RUB-B 11026 08-JAN-06
    RUB-B 11026 09-JAN-06
    RUB-C 11027 08-JAN-06
    I want SQL that produces this result:
    XDESC     COUNT(*)
    RUB-A     2
    RUB-B     1
    RUB-C     1
    Criteria: group by XDESC and XCUST, taking MAX(XDATE).
    Below, the rows marked *** are the ones selected in the count.
    XDESC XCUST XDATE
    RUB-A 11026 01-JAN-06
    RUB-A 11026 05-JAN-06
    RUB-A 11026 08-JAN-06 ***
    RUB-A 11027 10-JAN-06 ***
    ---------------------------------------------------------COUNT RUB-A = 2
    RUB-B 11026 02-JAN-06
    RUB-B 11026 08-JAN-06
    RUB-B 11026 09-JAN-06 ***
    ---------------------------------------------------------COUNT RUB-B = 1
    RUB-C 11027 08-JAN-06 ***
    --------------------------------------------------------COUNT RUB-C = 1
    Can anybody help?
    I tried:
    select xdesc, max(xdate), count(max(xdate)) from coba1 group by xdesc
    ERROR at line 1:
    ORA-00937: not a single-group group function
    Thanks

    This one is a duplicate; see the following link:
    Count(*) with group by max(date)
    Thanks
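    For what it's worth, the requested counts can be produced directly from the coba1 table shown above (a sketch: collapse each (xdesc, xcust) pair to its latest xdate first, then count those rows per xdesc):

    select xdesc, count(*) as cnt
    from  (select xdesc, xcust, max(xdate) as max_xdate
           from   coba1
           group by xdesc, xcust)
    group by xdesc
    order by xdesc;

    This returns RUB-A 2, RUB-B 1, RUB-C 1, matching the *** rows marked above.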

  • How to display 16 weeks data in the output of the query

    Hi experts,
    I have to display 16 weeks of data starting from the current week (Thursday to Wednesday),
               (19/07/07 - 12/07/07) (11/07/07 - 6/07/07) and so on for 16 weeks:
                             sales                       sales
    product1              200                         300
    product2              400                         500
    I have to use a text variable on the created-date characteristic, but I do not know how to implement the above scenario.
    Gurus, please help me.
    Thanks & Regards,
    James.

    Sure, James...
    Check these links for text variables:
    http://help.sap.com/saphelp_nw04s/helpdata/en/85/e0c73cccbdd45be10000000a114084/frameset.htm
    http://help.sap.com/saphelp_nw70/helpdata/en/c1/759b3c4d4d8d15e10000000a114084/frameset.htm
    Check this thread too:
    Re: Problem with the text variable
    And for replacement paths:
    http://www.sd-solutions.com/documents/SDS_BW_Replacement%20Path%20Variables.html
    Hope it helps...

  • Previous week Dates

    Hi all
    Please help in writing a query to get the previous week's data.
    If the input date is sysdate then it should return values for the previous week, Monday to Sunday.
    Whatever the input, it should get the values for the previous week, Monday to Sunday.
    Thanks
    Jo

    IW is the format element for the ISO week.
    ISO = International Organization for Standardization.
    See also: http://en.wikipedia.org/wiki/ISO_week
    The main thing is that an ISO week always starts on Monday. The week number might be different than for the WW format; that doesn't matter in your case, but you might want to do some experiments with to_char(sysdate,'IW') and to_char(sysdate,'WW').
    Trunc(<datevalue>,'IW') will cut the date back to the first day of its ISO week.
    select to_char(sysdate-365,'IW') IW, to_char(sysdate-365,'WW') WW from dual;
    IW     WW
    37     36
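    Putting that together for the original question (a sketch; :input_date stands for whatever date is supplied, e.g. sysdate):

    select trunc(:input_date,'IW') - 7  as prev_week_start  -- Monday of the previous ISO week
    ,      trunc(:input_date,'IW') - 1  as prev_week_end    -- Sunday of the previous ISO week
    from   dual;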

  • Payment Medium Workbench -  Group by Posting Date

    I have made a payment medium format with the DMEE engine, but it doesn't work as I would like.
    I want to have one file with all items grouped by posting date (subtotals by posting date, regardless of vendor).
    I have 2 levels:
       - Level 1: Repetition Factor 1          and Key Field FPAYP-FAEDT (Posting Date)
       - Level 2: Repetition Factor 999999 and Key Field FPAYP-DOC2R (Document unique key)
    However, when I execute the F110 transaction it generates several files. WHY?
    One file per vendor and posting date, but I WANT TO HAVE A UNIQUE FILE.
    Example
                      Vendor          Posting Date          Amount
    ITEM 1:       xx                 15/10/2007            100
    ITEM 2:       xx                 15/10/2007            200
    ITEM 3:       yy                 15/10/2007            300
    ITEM 4:       xx                 21/10/2007            400
    ITEM 5:       zz                 21/10/2007            500
    Output (wrong)
    FILE1:
                     15/10/2007
                     xx     100
                     xx     200
                              300
    FILE2:
                     15/10/2007
                     yy     300
                              300
    FILE3:
                     21/10/2007
                     xx     400
                              400
    FILE4:
                     21/10/2007
                     zz     500
                              500
    Output (ok)
    FILE1:
                    15/10/2007
                     xx     100
                     xx     200
                     yy     300
                              600
                     21/10/2007
                     xx     400
                     zz     500
                              900
    Thanks in advance!

    OK, the problem has been fixed.
    The repetition factor for level 1 (Posting Date) should be 9999999 instead of 1, because it refers to the input data, not the output data.
    We can have different posting dates in the input data, but we want a unique line per posting date in the output data.
    Regards!

  • Group by with date range.

    Hi,
    I am looking for effective usage of GROUP BY against date ranges.
    I have a transaction table as below.
    Date            customer_no      amount_paid
    01-Dec-13     001                  500
    02-Dec-13     001                  360
    09-Dec-13     001                  200
    02-Nov-13     001                  360
    09-Nov-13     001                  200
    02-Nov-13     001                  360
    09-Oct-13     001                  200
    02-Oct-13     001                  360
    09-Oct-13     001                  200
    02-Sep-13     001                  360
    09-Sep-13     001                  200
    ............... etc.
    I would like to see sum(amount_paid) for the past date ranges 1-30 days, 31-60 days, and 61-90 days.
    Below are expected results.
    Customer          Duration       amount_paid
    001                    1-30             980
    001                    31-60           450
    001                    61-90          1200
    002                    1-30             300
    002                    31-60           490
    002                    61-90           320
    003                    1-30             450
    ......................etc.
    I have to group by customer_no and date range (1-30, 31-60, 61-90, etc.).
    Can someone help me with a query for this?
    Thanks...
    Sreeram.

    with t
    as
    (
    select to_date('01-Dec-13', 'dd-Mon-rr') dt, '001' customer_no, 500 amount_paid from dual
      union all
    select to_date('02-Dec-13', 'dd-Mon-rr') dt, '001' customer_no, 360 amount_paid from dual
      union all
    select to_date('09-Dec-13', 'dd-Mon-rr') dt, '001' customer_no, 200 amount_paid from dual
      union all
    select to_date('02-Nov-13', 'dd-Mon-rr') dt, '001' customer_no, 360 amount_paid from dual
      union all
    select to_date('09-Nov-13', 'dd-Mon-rr') dt, '001' customer_no, 200 amount_paid from dual
      union all
    select to_date('02-Nov-13', 'dd-Mon-rr') dt, '001' customer_no, 360 amount_paid from dual
      union all
    select to_date('09-Oct-13', 'dd-Mon-rr') dt, '001' customer_no, 200 amount_paid from dual
      union all
    select to_date('02-Oct-13', 'dd-Mon-rr') dt, '001' customer_no, 360 amount_paid from dual
      union all
    select to_date('09-Oct-13', 'dd-Mon-rr') dt, '001' customer_no, 200 amount_paid from dual
      union all
    select to_date('02-Sep-13', 'dd-Mon-rr') dt, '001' customer_no, 360 amount_paid from dual
      union all
    select to_date('09-Sep-13', 'dd-Mon-rr') dt, '001' customer_no, 200 amount_paid from dual
    )
    select customer_no
         , ((grp_val - 1) * 30) + 1 start_val
         , grp_val * 30 end_val
         , sum(amount_paid) amount_paid
      from (
              select dt
                   , customer_no
                   , amount_paid
                   , ceil(sum(dt_interval) over(partition by customer_no order by dt)/30) grp_val
                from (
                        select dt
                             , customer_no
                             , amount_paid
                             , nvl(dt - lag(dt) over(partition by customer_no order by dt), 1) dt_interval
                          from t
                     )
           )
     group
        by customer_no
         , grp_val
     order
        by grp_val;

    CUS  START_VAL    END_VAL AMOUNT_PAID
    001          1         30         560
    001         31         60         760
    001         61         90         920
    001         91        120        1060

  • Grouping with 2 dates in SQL Server

    totle_count   dt          name        program_id
    1             1/3/2015    Tea         14
    1             1/5/2015    Tea         14
    1             1/6/2015    Lunch       13
    17            1/6/2015    Tea         14
    2             1/9/2015    Breakfast   1008
    39            1/10/2015   Breakfast   1008
    4             1/19/2015   Breakfast   1008
    1             1/19/2015   Dinner      1009
    3             1/21/2015   Tea         14

    totle_count   dt                         name        program_id
    1             1/6/2015 and 1/8/2015      Lunch       13
    2             1/3/2015 and 1/5/2015      Tea         14
    17            1/6/2015 and 1/9/2015      Tea         14
    3             1/21/2015 and 1/23/2015    Tea         14
    41            1/9/2015 and 1/11/2015     Breakfast   1008
    4             1/18/2015 and 1/20/2015    Breakfast   1008
    1             1/18/2015 and 1/20/2015    Dinner      1009

    Above are 2 tables: the 1st is my raw table, and I want to group by 2 dates, 3 days apart, summing the values. The 2nd table is my expected result.
    Please help me with this problem.
    Thanks
    Samir

    I am doing 3-day grouping here.
    "dt" is the name of the date column.
    If the "dt" column has '1/19/2015' then it will fall in the group "1/18/2015 and 1/20/2015"; any date of the 18th, 19th or 20th will fall in "1/18/2015 and 1/20/2015", and the counts are added up across the 3 dates.

    Sorry, your posted output is not as per the above rule.
    How did the row for "1/6/2015 and 1/9/2015" get included then? If you group by 3 days you won't get this row, as 6 to 9 is 4 days, not 3 days.
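    Setting that inconsistent row aside, fixed 3-day windows can be built by integer-dividing the day offset from an anchor date (a T-SQL sketch; the anchor date 2015-01-03 is inferred from the expected output, and the table name meal_counts is hypothetical):

    select sum(totle_count) as totle_count
    ,      convert(varchar(10), win_start, 101) + ' and '
           + convert(varchar(10), dateadd(day, 2, win_start), 101) as dt
    ,      name
    ,      program_id
    from  (select totle_count, name, program_id,
                  -- map each date to the start of its 3-day window
                  dateadd(day, (datediff(day, '20150103', dt) / 3) * 3, '20150103') as win_start
           from   meal_counts) t
    group by win_start, name, program_id
    order by win_start;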
    Visakh
