SQL for summing weekly data

Hi,
I have two tables, each with a date column and holding more than a month of data. I want to sum the data by week (weeks starting Monday) after combining the two tables. Can anyone help me with the SQL for this?
SQL> select * from ORCL_DATA_GROWTH;
REPORT_DA DATAGROWTH                                                                                                             
03-SEP-12       9.78                                                                                                             
04-SEP-12       5.36                                                                                                             
05-SEP-12       5.42                                                                                                             
06-SEP-12      33.36                                                                                                             
07-SEP-12       5.47                                                                                                             
08-SEP-12        5.5                                                                                                             
09-SEP-12        5.5                                                                                                             
10-SEP-12       9.47                                                                                                             
11-SEP-12       8.16                                                                                                             
12-SEP-12      23.97                                                                                                             
13-SEP-12      51.28                                                                                                             
14-SEP-12      24.05                                                                                                             
15-SEP-12      24.03                                                                                                             
16-SEP-12      24.17                                                                                                             
17-SEP-12      28.16                                                                                                             
18-SEP-12      24.17                                                                                                             
19-SEP-12      24.19                                                                                                             
20-SEP-12      50.96                                                                                                             
21-SEP-12      24.19   
SQL> select * from ORCL_PURGING;
REPORT_DA PURGING_SPACE FILE_SYSTEM                                                                                              
01-OCT-12            18 /dborafiles/orac/ora_Test/oradata01                                                                     
01-OCT-12             0 /dborafiles/orac/ora_Test/oradata05                                                                     
02-OCT-12            55 /dborafiles/orac/ora_Test/oradata01                                                                     
02-OCT-12             0 /dborafiles/orac/ora_Test/oradata05                                                                     
03-OCT-12            21 /dborafiles/orac/ora_Test/oradata01                                                                     
03-OCT-12             0 /dborafiles/orac/ora_Test/oradata05                                                                     
04-OCT-12             0 /dborafiles/orac/ora_Test/oradata01                                                                     
04-OCT-12             0 /dborafiles/orac/ora_Test/oradata05                                                                     
05-OCT-12             0 /dborafiles/orac/ora_Test/oradata01                                                                     
05-OCT-12             0 /dborafiles/orac/ora_Test/oradata05                                                                     
06-OCT-12            70 /dborafiles/orac/ora_Test/oradata01                                                                     
06-OCT-12             0 /dborafiles/orac/ora_Test/oradata05                                                                     
07-OCT-12            21 /dborafiles/orac/ora_Test/oradata01                                                                     
07-OCT-12             0 /dborafiles/orac/ora_Test/oradata05                                                                     
08-OCT-12            21 /dborafiles/orac/ora_Test/oradata01                                                                     
08-OCT-12             0 /dborafiles/orac/ora_Test/oradata05
Combining the two tables gives the daily output sample below, but I want the output summed by week.
SQL> select a.report_date, sum(a.PURGING_SPACE) PURGING_SPACE, b.DATAGROWTH
     from ORCL_PURGING a, ORCL_DATA_GROWTH b
     where a.report_date = b.report_date
       and a.report_date <= sysdate
       and a.report_date >= sysdate - 30
     group by a.report_date, b.datagrowth
     order by a.report_date;
REPORT_DA PURGING_SPACE DATAGROWTH
19-SEP-12            77      24.19
20-SEP-12             2      50.96
21-SEP-12            47      24.19
22-SEP-12            19      24.16
23-SEP-12            22      24.05
24-SEP-12            25      28.11
25-SEP-12            43      24.08
26-SEP-12            21      24.06
27-SEP-12            22      50.86
28-SEP-12            22      23.05
29-SEP-12            22      23.27
30-SEP-12            22      23.61
01-OCT-12            18      28.67
02-OCT-12            55      25.92
03-OCT-12            21      23.38
04-OCT-12             0      50.46
05-OCT-12             0      23.62
06-OCT-12            70      24.39
07-OCT-12            21      24.53
08-OCT-12            21      28.66
09-OCT-12            51      24.41
10-OCT-12            22      24.69
11-OCT-12            23      50.72
12-OCT-12            22      25.08
13-OCT-12            25      25.57
14-OCT-12            21      23.38
15-OCT-12            22      27.77
27 rows selected.

Thanks in advance.
Edited by: BluShadow on 18-Oct-2012 09:28
added {noformat}{noformat} tags for readability.  Please read {message:id=9360002} and learn to do this yourself in future.

Hi,
user9256814 wrote:
Hi
This is my query
WITH purging_week AS
(
     SELECT TRUNC (report_date, 'IW')     AS report_week
     ,      SUM (purging_space)           AS total_purging_space -- same as in main query
     FROM   orcl_purging
     GROUP BY TRUNC (report_date, 'IW')
)
, data_growth_week AS
(
     SELECT TRUNC (report_date, 'IW')     AS report_week
     ,      SUM (datagrowth)              AS total_datagrowth
     FROM   orcl_data_growth
     GROUP BY TRUNC (report_date, 'IW')
)
SELECT COALESCE ( p.report_week
                , d.report_week
                )               AS report_week
,      p.total_purging_space
,      d.total_datagrowth
FROM   purging_week p
FULL OUTER JOIN data_growth_week d ON d.report_week = p.report_week
ORDER BY report_week
;

That's still hard to read.
You may have noticed that this site normally compresses whitespace.
Whenever you post formatted text (including, but not limited to, code) on this site, type these 6 characters:
{code}
(small letters only, inside curly brackets) before and after each section of formatted text, to preserve spacing.  BluShadow probably won't do it for you every time you post a message.
This is just one of the many things mentioned in the forum FAQ {message:id=9360002} that can help you get better answers sooner.
Are you still having a problem?  Have you tried using in-line views?  If so, wouldn't it make more sense to post the code you're using, rather than some code you're not using?
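For the two tables in this thread, here is an untested sketch that puts the pieces together, adding the same 30-day window used in the daily query (TRUNC (report_date, 'IW') truncates to the ISO week, which starts on Monday):

WITH purging_week AS
(
    SELECT   TRUNC (report_date, 'IW')  AS report_week
    ,        SUM (purging_space)        AS total_purging_space
    FROM     orcl_purging
    WHERE    report_date BETWEEN SYSDATE - 30 AND SYSDATE
    GROUP BY TRUNC (report_date, 'IW')
)
, data_growth_week AS
(
    SELECT   TRUNC (report_date, 'IW')  AS report_week
    ,        SUM (datagrowth)           AS total_datagrowth
    FROM     orcl_data_growth
    WHERE    report_date BETWEEN SYSDATE - 30 AND SYSDATE
    GROUP BY TRUNC (report_date, 'IW')
)
SELECT  COALESCE (p.report_week, d.report_week)  AS week_starting_monday
,       p.total_purging_space
,       d.total_datagrowth
FROM            purging_week     p
FULL OUTER JOIN data_growth_week d ON d.report_week = p.report_week
ORDER BY week_starting_monday;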

Similar Messages

  • PL/SQL for sum/divide and multiply

    Hi..
    I am trying to do a small computation. I am not good at PL/SQL, so can you please help? Also please correct me if I am wrong - in the Source Used it should say "Always, replacing....." and the Source Type should be PL/SQL Expression or Function, correct?
    the function is
    ((Item A + Item B + Item C) / (Item + Item Y = Item Z)) * Item M..
    The net result is displayed in an Item K
    Thank you
    Regards,
    R

    ATD is correct that you should always do in SQL what you can ... in other words PL/SQL should never be the first choice.
    But it appears from what you've written that you may need a function so here's some help.
    Your formula is:
    ((Item A + Item B + Item C) / (Item + Item Y = Item Z)) * Item M.
    So you have 6 inputs A, B, C, M, X, and Y (Z making no sense as written) and they all seem to be numeric. So:
    CREATE OR REPLACE FUNCTION myfunc (A NUMBER, B NUMBER, C NUMBER, M NUMBER, X NUMBER, Y NUMBER) RETURN NUMBER IS
    BEGIN
      RETURN (((A + B + C) / (X + Y)) * M);
    END myfunc;
    /
    It should work just fine, although it is so simplistic I can think of many ways it could be vastly improved. If it isn't what you want, it is close enough that you can alter it for your purposes.
    What you have asked is about as simple as one can get. I would expect my students, after three hours of class, to be able to write it. Which means you really need to make taking a PL/SQL class important in your life.
    For self study try these:
    http://www.psoug.org/reference/anonymous_blocks.html
    http://www.psoug.org/reference/functions.html
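    A quick sanity check once the function is compiled (the input values are just placeholders):
    SELECT myfunc(1, 2, 3, 10, 4, 6) AS item_k FROM dual;
    -- (1 + 2 + 3) / (4 + 6) * 10 = 6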

  • ALV Grid Collect Statment for SUM the data.

    Hello Experts,
    I am new to ABAP. I have a program whose output is coming out like this:
    Prod.ID   ... .... ... Qnty in Date1 .. Qnty in  Date2... Qnty in Date3........Qnty in Date31.
    0001                       12.1               0.00               10.1
    0001                       12.1               0.00               10.1
    I need the summation of quantity against each date, and Prod.ID should not repeat. My output should be like this:
    Prod.ID   ... .... ... Qnty in Date1 .. Qnty in  Date2... Qnty in Date3........Qnty in Date31.
    0001                       24.2               0.00               20.2
    0002                       12.1               0.00               10.1
    I wrote a COLLECT statement but it's not working. Please help me with this. One more thing: I am passing the data to gt_data.
    My code is here,
    TABLES: AFRU,AUFK,VBAK,KNA1,VBAP,VBKD.
    DATA: BEGIN OF i_alv OCCURS 0,
          ARBID TYPE AFRU-ARBID,
          BUDAT TYPE AFRU-BUDAT,
          WERKS TYPE AFRU-WERKS,
          LMNGA TYPE AFRU-LMNGA,
          AUFNR TYPE AFRU-AUFNR,
          KDAUF TYPE AUFK-KDAUF,
          KDPOS TYPE AUFK-KDPOS,
          VBELN TYPE VBAK-VBELN,
          KUNNR TYPE VBAK-KUNNR,
          NAME1 TYPE KNA1-NAME1,
          MATNR TYPE VBAP-MATNR,
          KDKG1 TYPE VBKD-KDKG1,
      end of i_alv.
    TYPES: BEGIN OF ty_data,
             lmnga TYPE AFRU-LMNGA,
             BUDAT TYPE AFRU-BUDAT,
             ARBID TYPE AFRU-ARBID,
             aufnr TYPE AFRU-AUFNR,
             kdauf TYPE AUFK-KDAUF,
             name1 TYPE KNA1-NAME1,
             matnr TYPE VBAP-MATNR,
             kdkg1 TYPE VBKD-KDKG1,
           END OF ty_data,
           tt_data TYPE STANDARD TABLE OF ty_data,
           BEGIN OF ty_dyn1,                                    "#EC NEEDED
             ARBID TYPE AFRU-ARBID,
             aufnr TYPE AFRU-AUFNR,
             kdauf TYPE AUFK-KDAUF,
             name1 TYPE KNA1-NAME1,
             matnr TYPE VBAP-MATNR,
             kdkg1 TYPE VBKD-KDKG1,
           END OF ty_dyn1,
           BEGIN OF ty_dyn2,                                    "#EC NEEDED
             date  TYPE AFRU-LMNGA,
           END OF ty_dyn2,
           BEGIN OF ty_cols,
             date TYPE BUDAT,
           END OF ty_cols,
           tt_cols TYPE SORTED TABLE OF ty_cols WITH UNIQUE KEY date.
    DATA: gt_data TYPE tt_data,
          gt_data2 type tt_data,
          gt_cols TYPE tt_cols,
          gs_col  TYPE ty_cols.
    DATA: go_sdescr     TYPE REF TO cl_abap_structdescr,
          go_sdescr_new TYPE REF TO cl_abap_structdescr,
          go_tdescr     TYPE REF TO cl_abap_tabledescr,
          gdo_handle    TYPE REF TO data,
          gs_component  TYPE abap_compdescr,
          gs_comp       TYPE abap_componentdescr,
          gt_components TYPE abap_component_tab,
          gr_data       TYPE REF TO cl_salv_table,
          gr_funct      TYPE REF TO cl_salv_functions,
          gr_columns    TYPE REF TO cl_salv_columns_table,
          gr_column     TYPE REF TO cl_salv_column_table,
          g_col         TYPE lvc_fname,
          g_txt         TYPE scrtext_l.
    FIELD-SYMBOLS: <t_data> TYPE ANY TABLE,
                   <s_data> TYPE any,
                   <c> TYPE any,
                   <d> TYPE ty_data.
    DATA:  pono TYPE ztecerti-pono,
           jobno TYPE ztecerti-jobno,
           sdk TYPE string,
           insert TYPE c,
           ok_code LIKE sy-ucomm.
    CALL SCREEN 100.
    START-OF-SELECTION.
    * Populate test data
    FORM get_data.
      SELECT A~ARBID
             A~BUDAT
             A~WERKS
             A~LMNGA
             A~AUFNR
             B~KDAUF
             B~KDPOS
             C~VBELN
             C~KUNNR
             D~NAME1
             E~MATNR
             F~KDKG1
        INTO CORRESPONDING FIELDS OF TABLE  gt_data
                   FROM AFRU AS A INNER JOIN AUFK AS B ON A~AUFNR EQ B~AUFNR
                                  INNER JOIN VBAK AS C ON B~KDAUF = C~VBELN
                                  INNER JOIN KNA1 AS D ON C~KUNNR = D~KUNNR
                                  INNER JOIN VBAP AS E ON B~KDAUF = E~VBELN
                                  INNER JOIN VBKD AS F ON B~KDAUF = F~VBELN
      WHERE   A~ARBID = '10000181' AND  A~BUDAT BETWEEN  PONO AND jobno
      GROUP BY A~ARBID A~LMNGA  A~BUDAT A~WERKS  A~AUFNR B~KDAUF   F~KDKG1   E~MATNR  D~NAME1 C~KUNNR   C~VBELN  B~KDPOS
    ORDER BY B~KDPOS.
    *collect gt_data into gt_data2.
    *gt_data
    *LOOP AT gt_data ASSIGNING <d>.
    ENDFORM.
    Thank you, I really need this badly.

    Answer is fully given here: Re: alv grid Cross Tab Issue....
    m.

  • Sum of weekly data

    sum of weekly data.. from daily data.
    create table test3(tno number(10), tdate date);
    declare
    begin
    for i in 1..20 loop
    insert into test3 values(i,sysdate+i);
    end loop;
    end;
    SELECT * FROM test3;
    TNO                    TDATE                    
    1                      21-APR-11                
    2                      22-APR-11                
    3                      23-APR-11                
    4                      24-APR-11                
    5                      25-APR-11                
    6                      26-APR-11                
    7                      27-APR-11                
    8                      28-APR-11                
    9                      29-APR-11                
    10                     30-APR-11                
    11                     01-MAY-11                
    12                     02-MAY-11                
    13                     03-MAY-11                
    14                     04-MAY-11                
    15                     05-MAY-11                
    16                     06-MAY-11                
    17                     07-MAY-11                
    18                     08-MAY-11                
    19                     09-MAY-11                
    20                     10-MAY-11                
    20 rows selected
    Now I want the weekly sum of the data: SUM(TNO).

    Something like...
    SQL> ed
    Wrote file afiedt.buf
      1  with t as (select rownum as tno, sysdate+rownum as tdate from dual connect by rownum <= 20)
      2  --
      3  -- end of test data
      4  --
      5  select distinct trunc(tdate,'fmIW') as start_wk
      6        ,trunc(tdate,'fmIW')+6 as end_wk
      7        ,sum(tno) over (partition by trunc(tdate,'fmIW')) as sm
      8  from t
      9* order by 1
    SQL> /
    START_WK    END_WK              SM
    18-APR-2011 24-APR-2011         10
    25-APR-2011 01-MAY-2011         56
    02-MAY-2011 08-MAY-2011        105
    09-MAY-2011 15-MAY-2011         39
    or just a basic SUM rather than analytical...
    SQL> ed
    Wrote file afiedt.buf
      1  with t as (select rownum as tno, sysdate+rownum as tdate from dual connect by rownum <= 20)
      2  --
      3  -- end of test data
      4  --
      5  select trunc(tdate,'fmIW') as start_wk
      6        ,trunc(tdate,'fmIW')+6 as end_wk
      7        ,sum(tno) as sm
      8  from t
      9  group by trunc(tdate,'fmIW')
    10* order by 1
    SQL> /
    START_WK    END_WK              SM
    18-APR-2011 24-APR-2011         10
    25-APR-2011 01-MAY-2011         56
    02-MAY-2011 08-MAY-2011        105
    09-MAY-2011 15-MAY-2011         39
    Edited by: BluShadow on 20-Apr-2011 13:56

  • How to write a sql query to retrieve data entered in the past 2 weeks

    Hi,
    I have file names and last-accessed dates (java.sql.Date format) stored in my database table. I would like to know how I can write a query to get the names of files accessed in the past 2 weeks. I use open SQL Server at the back end.
    Thanks in advance.

    This has essentially nothing to do with JDBC. JDBC is just an API to execute the SQL language using Java and thus interact with databases.
    Your problem is related to the SQL language: you don't know how to write the SQL. I suggest you go through a SQL tutorial (there is one at w3schools.com) and read the SQL documentation that comes with the database in question. A decent database manufacturer has a website and probably also a discussion forum / mailing list as well.
    I'll give you a hint: you can use comparison operators in SQL just as you would anywhere else. For example: "WHERE date < somedate".
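    A rough sketch in Oracle syntax (the table and column names here are invented for illustration; adjust them for your schema and for your database's date functions):
    SELECT file_name
    FROM   file_access_log
    WHERE  last_accessed >= TRUNC(SYSDATE) - 14;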

  • SQL to generate last weeks date(not a stored procudure)

    I need SQL to generate last week's dates, from Monday to Sunday. I don't want to use any table data or a stored procedure. I need inline SQL. Please help me with your suggestions. Thanks in advance for the help.

    varun wrote:
    I need SQL to generate last week's dates, from Monday to Sunday. I don't want to use any table data or a stored procedure. I need inline SQL. Please help me with your suggestions. Thanks in advance for the help.
    The below should get you started:
      1* SELECT SYSDATE-LEVEL from dual CONNECT BY LEVEL < 8
    SQL> /
    SYSDATE-L
    21-APR-13
    20-APR-13
    19-APR-13
    18-APR-13
    17-APR-13
    16-APR-13
    15-APR-13
    7 rows selected.
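    If the list needs to be anchored to last week's Monday through Sunday (rather than simply the previous seven days), one untested variation, relying on TRUNC(SYSDATE, 'IW') returning the Monday of the current ISO week:
    SELECT TRUNC(SYSDATE, 'IW') - 7 + LEVEL - 1 AS week_day
    FROM dual
    CONNECT BY LEVEL <= 7;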

  • Getting the week number for a given date

    Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
    PL/SQL Release 11.1.0.7.0 - Production
    CORE     11.1.0.7.0     Production
    TNS for 64-bit Windows: Version 11.1.0.7.0 - Production
    NLSRTL Version 11.1.0.7.0 - Production
    Hello all.
    I am currently migrating our product from SQL SERVER to ORACLE and have the following issue.
    Basically I'm just trying to get the week number of the year for a given date, but I'm having trouble with Oracle as it seems to think that the weeks run from Thursday to Thursday. I presume this is something to do with the fact that the first day of the year was a Thursday?
    e.g.
    SQL SERVER:
    select DATEPART(wk, '2009-10-24') as Sat -- 43 - correct
    select DATEPART(wk, '2009-10-25') as Sun -- 44 - correct
    select DATEPART(wk, '2009-10-26') as Mon -- 44 - correct
    select DATEPART(wk, '2009-10-27') as Tue -- 44 - correct
    select DATEPART(wk, '2009-10-28') as Wed -- 44 - correct
    select DATEPART(wk, '2009-10-29') as Thu -- 44 - correct
    select DATEPART(wk, '2009-10-30') as Fri -- 44 - correct
    select DATEPART(wk, '2009-10-31') as Sat -- 44 - correct
    select DATEPART(wk, '2009-11-1') as Sun -- 45 - correct
    select DATEPART(wk, '2009-11-2') as Mon -- 45 - correct
    select DATEPART(wk, '2009-11-3') as Tue -- 45 - correct
    select DATEPART(wk, '2009-11-4') as Wed -- 45 - correct
    select DATEPART(wk, '2009-11-5') as Thu -- 45 - correct
    ORACLE:
    SELECT to_char(to_date('24-OCT-2009'), 'ww') as Sat from dual; -- 43 correct
    SELECT to_char(to_date('25-OCT-2009'), 'ww') as Sun from dual; -- 43 incorrect - should be 44
    SELECT to_char(to_date('26-OCT-2009'), 'ww') as Mon from dual; -- 43 incorrect - should be 44
    SELECT to_char(to_date('27-OCT-2009'), 'ww') as Tue from dual; -- 43 incorrect - should be 44
    SELECT to_char(to_date('28-OCT-2009'), 'ww') as Wed from dual; -- 43 incorrect - should be 44
    SELECT to_char(to_date('29-OCT-2009'), 'ww') as Thu from dual; -- 44 correct
    SELECT to_char(to_date('30-OCT-2009'), 'ww') as Fri from dual; -- 44 correct
    SELECT to_char(to_date('31-OCT-2009'), 'ww') as Sat from dual; -- 44 correct
    SELECT to_char(to_date('1-NOV-2009'), 'ww') as Sun from dual; -- 44 incorrect - should be 45
    SELECT to_char(to_date('2-NOV-2009'), 'ww') as Mon from dual; -- 44 incorrect - should be 45
    SELECT to_char(to_date('3-NOV-2009'), 'ww') as Tue from dual; -- 44 incorrect - should be 45
    SELECT to_char(to_date('4-NOV-2009'), 'ww') as Wed from dual; -- 44 incorrect - should be 45
    SELECT to_char(to_date('5-NOV-2009'), 'ww') as Thu from dual; -- 45 correct
    Now I don't want to get into a discussion with regard to locales etc.
    In my world (and it seems SQL SERVER's) the first day of the week is Sunday and the last is Saturday.
    Is there some NLS_? setting or something that I'm missing?
    thanks for any help on this.
    Andy

    This is what you need.
    SELECT ceil(( 7+(trunc(to_date('25-OCT-2009'),'d')-trunc(to_date('25-OCT-2009'),'Y')) )/7) FROM dual
    HTH!!!
    --tested all these statements.
    Works as you wish!!
    SELECT ceil(( 7+(trunc(to_date('24-OCT-2009'),'d')-trunc(to_date('24-OCT-2009'),'Y')) )/7) as Sat from dual;
    SELECT ceil(( 7+(trunc(to_date('25-OCT-2009'),'d')-trunc(to_date('25-OCT-2009'),'Y')) )/7) as Sun from dual;
    SELECT ceil(( 7+(trunc(to_date('26-OCT-2009'),'d')-trunc(to_date('26-OCT-2009'),'Y')) )/7) as Mon from dual;
    SELECT ceil(( 7+(trunc(to_date('27-OCT-2009'),'d')-trunc(to_date('27-OCT-2009'),'Y')) )/7) as Tue from dual;
    SELECT ceil(( 7+(trunc(to_date('28-OCT-2009'),'d')-trunc(to_date('28-OCT-2009'),'Y')) )/7) as Wed from dual;
    SELECT ceil(( 7+(trunc(to_date('29-OCT-2009'),'d')-trunc(to_date('29-OCT-2009'),'Y')) )/7) as Thu from dual;
    SELECT ceil(( 7+(trunc(to_date('30-OCT-2009'),'d')-trunc(to_date('30-OCT-2009'),'Y')) )/7) as Fri from dual;
    SELECT ceil(( 7+(trunc(to_date('01-NOV-2009'),'d')-trunc(to_date('01-NOV-2009'),'Y')) )/7) as Sat from dual;
    SELECT ceil(( 7+(trunc(to_date('02-NOV-2009'),'d')-trunc(to_date('02-NOV-2009'),'Y')) )/7) as Sun from dual;
    SELECT ceil(( 7+(trunc(to_date('03-NOV-2009'),'d')-trunc(to_date('03-NOV-2009'),'Y')) )/7) as Mon from dual;
    SELECT ceil(( 7+(trunc(to_date('04-NOV-2009'),'d')-trunc(to_date('04-NOV-2009'),'Y')) )/7) as Tue from dual;
    SELECT ceil(( 7+(trunc(to_date('05-NOV-2009'),'d')-trunc(to_date('05-NOV-2009'),'Y')) )/7) as Wed from dual;
    SELECT ceil(( 7+(trunc(to_date('06-NOV-2009'),'d')-trunc(to_date('06-NOV-2009'),'Y')) )/7) as Thu from dual;
    Cheers!!!
    Bhushan
    Edited by: Buga on Oct 29, 2009 10:46 AM

  • Continuation: need data grouped for every week

    SELECT * FROM aetnah_file_emp_cust_hist
    WHERE pctl_employee_seqnum = 133774
    This query returns too many rows.
    Now my requirement is that I need to get the most recent contrib_amt from this table for every week based on date column CTL_INS_DTTM.
    This column CTL_INS_DTTM stores data about when a row was inserted into this table.
    pctl_employee_seqnum CTL_INS_DTTM(MM/DD/YYYY) contrib_amt
    133774 01/01/2009 100
    133774 01/02/2009 200
    133774 01/03/2009 300
    133774 01/04/2009 400
    133774 01/05/2009 500
    133774 01/06/2009
    133774 01/07/2009 700
    133774 01/08/2009 800
    133774 01/10/2009 900
    133774 01/12/2009 1000
    133774 01/13/2009 1100
    133774 01/14/2009 1200
    I will need 52 columns (1 year = 52 weeks) totally and the most recent data for each week.
    Ex:
    Desired output:
    01/07/2009 01/14/2009 01/21/2009 01/28/2009
    700 1200 NULL 200
    SELECT pctl_employee_seqnum, CTL_INS_DTTM, contrib_amt
      FROM (
              SELECT pctl_employee_seqnum, CTL_INS_DTTM, contrib_amt, ROW_NUMBER() OVER(PARTITION BY TO_CHAR(CTL_INS_DTTM, 'WMONYYYY')ORDER BY CTL_INS_DTTM DESC) RNO
                FROM aetnah_file_emp_cust_hist
               WHERE pctl_employee_seqnum = 133774
            )
     WHERE rno = 1
    The code above does that, but I need to transpose this row-wise data into columns as shown in the desired output.
    Total # of columns:52
    Apologize for opening a new thread.....please help on this transpose issue
    Thank You All

    Hi,
    TO_CHAR (ctl_ins_dttm, 'WMONYYYY') will result in 59 or 60 groups per year, since all months (except February in common years) have (incomplete) 5th weeks.
    If you want 52 equal-sized groups, then TO_CHAR (ctl_ins_dttm, 'WWYYYY') will get you closer. (You'll still have an incomplete week 53.)
    To pivot those rows into one column, you can do something like:
    SELECT  MAX (CASE WHEN TO_CHAR (ctl_ins_dttm, 'WW') = '01' THEN contrib_amt END)   AS week_01
    ,       MAX (CASE WHEN TO_CHAR (ctl_ins_dttm, 'WW') = '02' THEN contrib_amt END)   AS week_02
    ,       MAX (CASE WHEN TO_CHAR (ctl_ins_dttm, 'WW') = '03' THEN contrib_amt END)   AS week_03
    ,       ...
    If you want dates (like "01/07/2009") as the column headers, then you'll have to use dynamic SQL.
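    An untested sketch combining the two steps (latest row per week, then pivoted; only the first three weeks are shown, the remaining columns follow the same pattern):
    SELECT  MAX (CASE WHEN wk = '01' THEN contrib_amt END)   AS week_01
    ,       MAX (CASE WHEN wk = '02' THEN contrib_amt END)   AS week_02
    ,       MAX (CASE WHEN wk = '03' THEN contrib_amt END)   AS week_03
    FROM   (
             SELECT contrib_amt
             ,      TO_CHAR (ctl_ins_dttm, 'WW')   AS wk
             ,      ROW_NUMBER () OVER (PARTITION BY TO_CHAR (ctl_ins_dttm, 'WWYYYY')
                                        ORDER BY ctl_ins_dttm DESC)   AS rno
             FROM   aetnah_file_emp_cust_hist
             WHERE  pctl_employee_seqnum = 133774
           )
    WHERE  rno = 1;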

  • Have family plan with 250 data which I almost use each month.  Going on vacation and will be on the road for two weeks.  Should I up my data for a month then change back.  Is it worth it or should I just run over and pay the extra 15 per gig?

    have family plan with 250 data which I almost use each month.  Going on vacation and will be on the road for two weeks.  Should I up my data for a month then change back.  Is it worth it or should I just run over and pay the extra 15 per gig?

    Hello mlazaretti. Vacation time is awesome. (Especially a road trip!) Since you will be going out for two weeks, you never know when having extra data may come in handy. I highly recommend switching to the next tier up so you have more data. That way it is only $10.00 more versus $15.00, and you don't have to worry about overages. Then change back at the start of the next billing cycle.
    If you need help making this change let us know! Have a safe trip!
    NicandroN_VZW
    Follow us on twitter @VZWSupport

  • When I pre ordered my iphone 6 i was told I was going to get an additional 1 gig of data per month for a year.  It even shows it on the receipt i received with the phone.  When will I see that reflected on my account.  I have had my new phone for a week a

    When I pre ordered my iphone 6 i was told I was going to get an additional 1 gig of data per month for a year.  It even shows it on the receipt i received with the phone.  When will I see that reflected on my account.  I have had my new phone for a week and used tons more data than usual and am hoping that will save me this month.

    concretedonkey, I'm glad you were able to take advantage of this offer when you ordered your new iPhone 6. I can certainly review your account to ensure this was added for you. Please reply to the direct message I have sent you.
    AndreaS_VZW
    Follow us on Twitter @VZWSupport

  • How to get data for current week and previous week using customer exit in Bex.

    Hi everyone,
    I have a scenario in which I need to display data for current week and previous week (based on "sy_datum" the program has to calculate current week and previous week) in Bex using  Customer exit. I have created one variable in Bex Query Designer and I have written code for the variable in CMOD. But it is not working fine, (I know that we can do the same by using offset value in Bex). Can some one guide me how to achieve my requirement using customer exit.
    Thanks in Advance,
    G S Ramanjaneyulu.

    Hi krishna,
    Thanks for your quick reply, can you have a look at my code,
    case i_vnam.
    WHEN 'ZPWK_CWK'.
    ranges : pre_week for sy-datum.
    data : start_date type DATS,
           end_date TYPE dats .
    ************FM TO GET FIRST DATE OF CURRENT WEEK ************************
    CALL FUNCTION 'BWSO_DATE_GET_FIRST_WEEKDAY'
      EXPORTING
        DATE_IN  = sy-datum
      IMPORTING
        DATE_OUT = start_date.   " WEEK FIRST DATE
    end_date = START_DATE + 6.   " WEEK LAST DATE
    END_DATE   = START_DATE - 1.   " PREVIOUS WEEK END DATE
    START_DATE = START_DATE - 7.   " PREVIOUS WEEK START  DATE
    **********PREVIOUS WEEK DATES IN PRE_WEEK******************
    pre_week-SIGN   = 'I'.
    pre_week-option = 'BT'.
    pre_week-LOW    = START_DATE.
    pre_week-HIGH   = END_DATE.
    APPEND  pre_week.
    CLEAR : START_DATE,END_DATE.
    endcase.
    Regards,
    G S Ramanjaneyulu.

  • How to get the date of first day of a week for a given date

    Hi gurus
    Can anyone tell me how to get the date of the first day (Sunday) of the week for a given date in a BW transformation? For example, for 02/23/2012 in the source I need to get 02/19/2012 (Sunday's date) in the result. I can get the start date of a week using the BWSO_DATE_GET_FIRST_WEEKDAY function module, but this function module returns the week's Monday (02/20/2012) as the start date. I need Sunday's date (02/19/2012) as the start date. It would be really great if anyone could send me the solution.
    Thanks
    Rav

    Hi,
    The simplest way would be to subtract 1 from the date you are already getting in the transformation routine, but instead of doing that subtraction manually, which might take a bit of effort, you can simply use another FM to subtract 1 from the given date.
    RP_CALC_DATE_IN_INTERVAL
    Regards,
    Durgesh.

  • From where to get "First day of the week" data for all the locales, is it present in CLDR spec 24?

    I am trying to get "First day of the week" data from CLDR spec24 but cannot find where to look for it in the spec. I need this data to calculate numeric value of "LOCAL day of the week".
    This data to implement "c" and "cc" day formats that equals numeric local day of the week.
    e.g if "First day of the week" data for a locale is 2 (Monday) , it means numeric value for local day of the week will be 1 if it is Monday that day, 2 if it is Tuesday that day and likewise.

    Hi
    If you want the week to start with Sunday then use the following formula:
    TimestampAdd(SQL_TSI_DAY, 1-DAYOFWEEK(Date'@{var_Date}'), Date'@{var_Date}')
    If it's a retail week (starting from Monday) then follow the below:
    TimestampAdd(SQL_TSI_DAY, 1-DAYOFWEEK(Date'@{var_Date}'), Date'@{var_Date}')
    I'm assuming var_Date is the presentation variable for the prompt...
    Edited by: Kishore Guggilla on Jan 3, 2011 4:48 PM

    HT4623 Is this the most up-to-date article on updating the iPhone - is there anything about the problems relating to the update to OS 6 - my G3 no longer works and Apple seems to have gone to ground.  No Genius Bar appointments available for a week!

    Is this the most up-to-date article on updating the iPhone OS - is there anything about the problems relating to the update to OS 6 - my G3 no longer works and Apple seems to have gone to ground.  No Genius Bar appointments available for a week!

    iOS 7 devices need iTunes 11.1 or newer to sync.  Depending upon your computer and the OS it is running, you may not be able to upgrade to iTunes 11.1 based on the requirements for iTunes 11.1.  What computer and OS are you running your iTunes on?  From the iTunes download pages I see the requirements are:
    Macintosh System Requirements
    Mac computer with an Intel Core processor
    OS X version 10.6.8 or later
    400MB of available disk space
    Broadband Internet connection to use the iTunes Store
    Windows System Requirements
    PC with a 1GHz Intel or AMD processor and 512MB of RAM
    Windows XP Service Pack 2 or later, 32-bit editions of Windows Vista, Windows 7, or Windows 8
    64-bit editions of Windows Vista, Windows 7, or Windows 8 require the iTunes 64-bit installer
    400MB of available disk space
    Broadband Internet connection to use the iTunes Store

  • Java.sql.SQLException:ORA-01801:date format is too long for internal buffer

    Hi,
    I am getting the following exception when I try to insert data into a table through a stored procedure.
    oracle.apps.fnd.framework.OAException: java.sql.SQLException: ORA-01801: date format is too long for internal buffer
    When I execute this stored procedure from an anonymous block, it gets executed successfully, but when I use an OracleCallableStatement to execute the procedure I am getting this error.
    Please let me know how to resolve this error.
    Is this error something to do with the Database Configuration ?
    Thanks & Regards
    Meenal

    I don't know if this will help, but we were getting this error in several of the standard OA framework pages and after much pain and aggravation it was determined that visiting the Sourcing Home Page was changing the timezone. For most pages this just changed the timezone that dates were displayed in, but some had this ORA-01801 error and some others had an ORA-01830 error (date format picture ends before converting entire input string). Unfortunately, if you are not using Sourcing at your site, this probably won't help, but if you are, have a look at patch # 4519817.
    Note that to get the same error, try the following query (I got this error in 9.2.0.5 and 10.1.0.3):
    select to_date('10-Mar-2006', 'DD-Mon-YYYY________________________________________________HH24:MI:SS') from dual;
    It appears that you can't have a date format that is longer than 68 characters.
