Sum by Date

Hello,
I have a table that has an end_of_week column along with sunday, monday, tuesday, wednesday, thursday, friday, and saturday columns.
I want to be able to query the data based on any date I input (not necessarily the end_of_week date) and have the sum of the days returned.
For example:
End_of_week  sunday  monday    tuesday    wednesday   thursday    friday      saturday
08/20/2011    15     5         5          1           2           3           1
08/27/2011    0      12        2          0           0           0           0

select sum(sunday + monday + tuesday + wednesday + thursday + friday + saturday)
  from week_table
 where end_of_week between '08/16/2011' and '08/27/2011'
This returns the wrong data because it does not treat the day columns as dates. It sums Sunday through Saturday for each matching week instead of only the 16th (a Tuesday) through the 27th.
Any help is appreciated.

var start_date varchar2(10);
var end_date varchar2(10);
begin
  :start_date := '08/16/2011';
  :end_date := '08/27/2011';
end;
/
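-- end_of_week holds the Saturday of each week, so end_of_week - 6 is that week's Sunday,
-- end_of_week - 5 its Monday, and so on; each day column is counted only when its derived
-- calendar date falls inside the requested range.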
select sum(case when end_of_week - 6 between to_date(:start_date,'mm/dd/yyyy') and
                                             to_date(:end_date,'mm/dd/yyyy')
                then sunday
                else 0
           end +
           case when end_of_week - 5 between to_date(:start_date,'mm/dd/yyyy') and
                                             to_date(:end_date,'mm/dd/yyyy')
                then monday
                else 0
           end +
           case when end_of_week - 4 between to_date(:start_date,'mm/dd/yyyy') and
                                            to_date(:end_date,'mm/dd/yyyy')
                then tuesday
                else 0
           end +
           case when end_of_week - 3 between to_date(:start_date,'mm/dd/yyyy') and
                                            to_date(:end_date,'mm/dd/yyyy')
                then wednesday
                else 0
           end +
           case when end_of_week - 2 between to_date(:start_date,'mm/dd/yyyy') and
                                            to_date(:end_date,'mm/dd/yyyy')
                then thursday
                else 0
           end +
           case when end_of_week - 1 between to_date(:start_date,'mm/dd/yyyy') and
                                            to_date(:end_date,'mm/dd/yyyy')
                then friday
                else 0
           end +
           saturday)
  from week_table
where end_of_week between to_date(:start_date,'mm/dd/yyyy') and to_date(:end_date,'mm/dd/yyyy');
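Note that the WHERE clause above only keeps weeks whose end_of_week (the Saturday) falls inside the requested range, so a range that ends mid-week would drop that partial week. An alternative, assuming Oracle 11g or later, is to unpivot the seven day columns into one dated row each and filter on the derived date directly; a minimal sketch against the same week_table and bind variables:
select sum(qty) as total
  from (
        -- turn each day column into its own row, keyed by its offset back from the Saturday
        select end_of_week - day_offset as day_date,
               qty
          from week_table
        unpivot (qty for day_offset in (sunday    as 6,
                                        monday    as 5,
                                        tuesday   as 4,
                                        wednesday as 3,
                                        thursday  as 2,
                                        friday    as 1,
                                        saturday  as 0))
       )
 where day_date between to_date(:start_date, 'mm/dd/yyyy')
                    and to_date(:end_date, 'mm/dd/yyyy');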

Similar Messages

  • HT201412 After iOS 7, my iPhone 4 has gone dead twice in a time frame of around two months... even though it's not happening very often, a two-month-old phone going dead just after a new update is not acceptable. Probably iOS 7 still has some bugs that need to

    After iOS 7, my iPhone 4 has gone dead twice in a time frame of around two months... even though it's not happening very often, a two-month-old phone going dead just after a new update is not acceptable. Probably iOS 7 still has some bugs that need to be fixed... Can this bug be expected to be fixed in the next update?

    Hi 1283ar.
    Unfortunately, iOS 7 is demanding for the iPhone 4, and therefore a lot of effects are turned off to try to get it to run as smoothly as possible.
    However, it becomes better and better with each update, but it's hard to do anything about the hardware of an already released phone.
    If you still have trouble or find it too slow, my tip is to restore your iPhone 4 and do a clean setup with no iCloud backup, but keep all your photos in a Photo Stream so you can access them later.

  • Sum by Date in a SQL Region

    I have a simple table where users are supposed to log the number of hours they work each day on different projects.
    ID
    USERNAME
    WORKDATE
    CHARGECODE
    CHARGEHOURS
    I want to have a region which shows the totals for each day for that user.
    As they enter time charge codes and hours, it would update with the total number of hours.
    E.g., the following line items
    USERNAME WORKDATE CHARGECODE CHARGEHOURS
    JMP------11-JUL-2008--- IT-------6.5
    JMP------11-JUL-2008----HW-------1.0
    would sum to
    USERNAME WORKDATE TOTAL
    JMP-------11-JUL-2008------7.5
    We need to be able to see the total hours for each project, so I need to keep them separate.
    My initial way of approaching this was to try and create a collection (Like the one used in the ORDERS page in the Example Application) but I did not get very far with that idea.
    Any help would be most appreciated.
    Message was edited by:
    [email protected]

    When you say "as they enter timecharge codes and hours", do you mean that you want the total displayed in real time (using JavaScript or Ajax)? After entering the hours, if the user clicks a SUBMIT button, then it is possible to store the data in a table, temp table, or collection, and use the query to display totals in a different region.
    In case you are looking for the query to display the table, here it is:
    select USERNAME, WORKDATE, sum(CHARGEHOURS) as TOTAL
    from table
    group by USERNAME, WORKDATE
    Ravi
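    If the per-project detail rows need to stay visible next to the daily total (rather than being collapsed as the GROUP BY above does), an analytic SUM is another option; a minimal sketch, assuming a hypothetical table named timesheet with the columns listed above:
    select USERNAME,
           WORKDATE,
           CHARGECODE,
           CHARGEHOURS,
           -- repeat the day's total on every project row for that user and date
           sum(CHARGEHOURS) over (partition by USERNAME, WORKDATE) as DAY_TOTAL
      from timesheet
     order by USERNAME, WORKDATE, CHARGECODE;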

  • Moving sum using date intervals - analytic functions help

    let's say you have the following set of data:
    DATE SALES
         09/02/2012     100
         09/02/2012     50
         09/02/2012     10
         09/02/2012     1000
         09/02/2012     20
         12/02/2012     1000
         12/02/2012     1100
         14/02/2012     1000
         14/02/2012     100
         15/02/2012     112500
         15/02/2012     13500
         15/02/2012     45000
         15/02/2012     1500
         19/02/2012     1500
         20/02/2012     400
         23/02/2012     2000
         27/02/2012     4320
         27/02/2012     300000
         01/03/2012     100
         04/03/2012     17280
         06/03/2012     100
         06/03/2012     100
         06/03/2012     4320
         08/03/2012     100
         13/03/2012     1000
    For each day I need to know the sum of the sales in the present and preceding 5 days (calendar days, not five rows).
    What query could I use?
    Please help!

    Hi.
    Here's one way.
    WITH data AS
    (
         SELECT TO_DATE('09/02/2012','DD/MM/YYYY') d,     100 n FROM DUAL UNION ALL
         SELECT TO_DATE('09/02/2012','DD/MM/YYYY') d,     50 n FROM DUAL UNION ALL
         SELECT TO_DATE('09/02/2012','DD/MM/YYYY') d,     10 n FROM DUAL UNION ALL
         SELECT TO_DATE('09/02/2012','DD/MM/YYYY') d,     1000 n FROM DUAL UNION ALL
         SELECT TO_DATE('09/02/2012','DD/MM/YYYY') d,     20 n FROM DUAL UNION ALL
         SELECT TO_DATE('12/02/2012','DD/MM/YYYY') d,     1000 n FROM DUAL UNION ALL
         SELECT TO_DATE('12/02/2012','DD/MM/YYYY') d,     1100 n FROM DUAL UNION ALL
         SELECT TO_DATE('14/02/2012','DD/MM/YYYY') d,     1000 n FROM DUAL UNION ALL
         SELECT TO_DATE('14/02/2012','DD/MM/YYYY') d,     100 n FROM DUAL UNION ALL
         SELECT TO_DATE('15/02/2012','DD/MM/YYYY') d,     112500 n FROM DUAL UNION ALL
         SELECT TO_DATE('15/02/2012','DD/MM/YYYY') d,     13500 n FROM DUAL UNION ALL
         SELECT TO_DATE('15/02/2012','DD/MM/YYYY') d,     45000 n FROM DUAL UNION ALL
         SELECT TO_DATE('15/02/2012','DD/MM/YYYY') d,     1500 n FROM DUAL UNION ALL
         SELECT TO_DATE('19/02/2012','DD/MM/YYYY') d,     1500 n FROM DUAL UNION ALL
         SELECT TO_DATE('20/02/2012','DD/MM/YYYY') d,     400 n FROM DUAL UNION ALL
         SELECT TO_DATE('23/02/2012','DD/MM/YYYY') d,     2000 n FROM DUAL UNION ALL
         SELECT TO_DATE('27/02/2012','DD/MM/YYYY') d,     4320 n FROM DUAL UNION ALL
         SELECT TO_DATE('27/02/2012','DD/MM/YYYY') d,     300000 n FROM DUAL UNION ALL
         SELECT TO_DATE('01/03/2012','DD/MM/YYYY') d,     100 n FROM DUAL UNION ALL
         SELECT TO_DATE('04/03/2012','DD/MM/YYYY') d,     17280 n FROM DUAL UNION ALL
         SELECT TO_DATE('06/03/2012','DD/MM/YYYY') d,     100 n FROM DUAL UNION ALL
         SELECT TO_DATE('06/03/2012','DD/MM/YYYY') d,     100 n FROM DUAL UNION ALL
         SELECT TO_DATE('06/03/2012','DD/MM/YYYY') d,     4320 n FROM DUAL UNION ALL
         SELECT TO_DATE('08/03/2012','DD/MM/YYYY') d,     100 n FROM DUAL UNION ALL
         SELECT TO_DATE('13/03/2012','DD/MM/YYYY') d,     1000 n FROM DUAL
    ),
    days AS
    (
         SELECT TO_DATE('2012-02-01','YYYY-MM-DD')+(LEVEL-1) d
         FROM DUAL
         CONNECT BY LEVEL <= 60
    ),
    totals_per_day AS
    (
         SELECT dy.d,SUM(NVL(dt.n,0)) total_day
         FROM
              data dt,
              days dy
         WHERE
              dy.d = dt.d(+)
         GROUP BY dy.d
    )
    SELECT
         d,
         SUM(total_day) OVER (
              ORDER BY d
             RANGE BETWEEN  5 PRECEDING AND CURRENT ROW
         ) AS five_day_total
    FROM totals_per_day
    ORDER BY d;
    2012-02-01 00:00:00     0
    2012-02-02 00:00:00     0
    2012-02-03 00:00:00     0
    2012-02-04 00:00:00     0
    2012-02-05 00:00:00     0
    2012-02-06 00:00:00     0
    2012-02-07 00:00:00     0
    2012-02-08 00:00:00     0
    2012-02-09 00:00:00     1180
    2012-02-10 00:00:00     1180
    2012-02-11 00:00:00     1180
    2012-02-12 00:00:00     3280
    2012-02-13 00:00:00     3280
    2012-02-14 00:00:00     4380
    2012-02-15 00:00:00     175700
    2012-02-16 00:00:00     175700
    2012-02-17 00:00:00     175700
    2012-02-18 00:00:00     173600
    2012-02-19 00:00:00     175100
    2012-02-20 00:00:00     174400
    2012-02-21 00:00:00     1900
    2012-02-22 00:00:00     1900
    2012-02-23 00:00:00     3900
    2012-02-24 00:00:00     3900
    2012-02-25 00:00:00     2400
    2012-02-26 00:00:00     2000
    2012-02-27 00:00:00     306320
    2012-02-28 00:00:00     306320
    2012-02-29 00:00:00     304320
    2012-03-01 00:00:00     304420
    2012-03-02 00:00:00     304420
    2012-03-03 00:00:00     304420
    2012-03-04 00:00:00     17380
    2012-03-05 00:00:00     17380
    2012-03-06 00:00:00     21900
    2012-03-07 00:00:00     21800
    2012-03-08 00:00:00     21900
    2012-03-09 00:00:00     21900
    2012-03-10 00:00:00     4620
    2012-03-11 00:00:00     4620
    2012-03-12 00:00:00     100
    2012-03-13 00:00:00     1100
    2012-03-14 00:00:00     1000
    2012-03-15 00:00:00     1000
    2012-03-16 00:00:00     1000
    2012-03-17 00:00:00     1000
    2012-03-18 00:00:00     1000
    2012-03-19 00:00:00     0
    2012-03-20 00:00:00     0
    2012-03-21 00:00:00     0
    2012-03-22 00:00:00     0
    2012-03-23 00:00:00     0
    2012-03-24 00:00:00     0
    2012-03-25 00:00:00     0
    2012-03-26 00:00:00     0
    2012-03-27 00:00:00     0
    2012-03-28 00:00:00     0
    2012-03-29 00:00:00     0
    2012-03-30 00:00:00     0
    2012-03-31 00:00:00     0
    Hope this helps.
    Regards.
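    If a row is only needed for dates that actually have sales, the calendar-generating "days" subquery can be dropped and a value-based window used directly over the grouped dates; a minimal sketch, assuming a hypothetical table sales(sale_date, amount):
    SELECT sale_date,
           SUM(SUM(amount)) OVER (
                ORDER BY sale_date
                RANGE BETWEEN INTERVAL '5' DAY PRECEDING AND CURRENT ROW
           ) AS five_day_total
      FROM sales
     GROUP BY sale_date
     ORDER BY sale_date;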

  • Calculate a sum by date

    I have an Excel sheet imported into Siena.
    The first column is Date (01/01/2011 for example) and the second column is a Value. For each day I have one row, and each new day I add one value (Date and Value).
    In Siena I would like to calculate a sum from the beginning of the current month until today, and I would like to show the result in a label.
    After that I would like to add a second label and show the sum from the beginning of that month one year ago until today one year ago.
    I would like to place both labels on the main screen. I tried a lot of different things but I didn't get the solution. Maybe it is very simple, but ....
    Thanks for helping me soon and greetings from
    Hermann

    Hi Hermann123, thanks for using Siena! Below is the expression that works for me for the "need the sum of all values in the current month before today".
    Let's assume that you have an Excel table named Table1. The table has 2 columns named "MyDate" and "Val", the first being Date format, the second being Number format in Excel. Let's assume you have imported the table in Siena Beta2. Given
    the above assumptions, do the following:
    1) Add a label
    2) Set this expression for the Text property of the label:
    Sum(Filter(Table1, Year(Today()) = Year(MyDate) && Month(Today()) = Month(MyDate) && Day(Today()) >= Day(MyDate)), Val)
    Allow me to explain in a bit more depth what the expression does and how I arrived at it.
    Since we want to sum over multiple rows, we need to use the Sum function for a table (http://siena.blob.core.windows.net/beta/ProjectSienaBetaFunctionReference.html#_Toc373745529).
    I usually start with summing *everything* in the Val column: Sum(Table1, Val)
    Now it's time to narrow down the rows by filtering out the rows we don't want included. Therefore instead of Table1 I need to use something like Filter(Table1, <some_condition>)
    Now let's start entering the desired condition. First we want the year to match:
    Sum(Filter(Table1, Year(Today()) = Year(MyDate)), Val)
    Notice that <some_condition> now becomes Year(Today()) = Year(MyDate). Basically the condition is true only if today's year matches the year in the current row of the Excel table. The Siena table functions take care of running the condition check
    for each row of the table.
    Next, we want the month to match:
    Sum(Filter(Table1, Year(Today()) = Year(MyDate) && Month(Today()) = Month(MyDate)), Val)
    To finish off, we need to incorporate the "give me only past days or today in the current month" requirement. This should be easy now:
    Sum(Filter(Table1, Year(Today()) = Year(MyDate) && Month(Today()) = Month(MyDate) && Day(Today()) >= Day(MyDate)), Val)
    Enjoy and thanks again for using Siena!
    Manski

  • SQL - sum of date/time format

    id time
    1 1.20
    2 2.30
    3 4.20
    4 5.30
    5 1.00
    expected result = 14:40 (or, if AM/PM is used, what would the result be?)
    How can I calculate the sum of these times?
    Please give me some quick tips for handling date/time formats in Oracle.

    942425 wrote:
    id time
    1 1.20
    2 2.30
    3 4.20
    4 5.30
    5 1.00
    result expected = 14:40 or (if AM/PM so result is ???)
    how can i calculate the sum of time ???
    please give me some fast tricks of handling date/time formate in oracle....
    Hi,
    First, you posted in the wrong forum. Please post in the PL/SQL forum.
    Before posting there, close this thread and mark it as answered.
    If your time column is a NUMBER data type, then you can try this:
    SELECT SUM(NVL(ROUND(TRUNC(DAC_OT_HOUR)+((TO_NUMBER (SUBSTR (DAC_OT_HOUR, INSTR (DAC_OT_HOUR,'.',1,1)+1)))/60)),0))
    FROM DA_CHECK
    Hope this helps
    Hamid
    Mark correct/helpful to help others to get the right answer(s).
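    If those values are NUMBERs where 1.20 means 1 hour 20 minutes, another option is to convert everything to minutes first and format the grand total at the end; a minimal sketch against the same DA_CHECK table and DAC_OT_HOUR column referenced above (the hours-before, minutes-after-the-decimal-point encoding is an assumption):
    SELECT TO_CHAR(TRUNC(total_minutes / 60)) || ':' ||
           TO_CHAR(MOD(total_minutes, 60), 'FM00') AS total_time
      FROM (
            -- whole part = hours, fractional part * 100 = minutes
            SELECT SUM(TRUNC(dac_ot_hour) * 60 + MOD(dac_ot_hour, 1) * 100) AS total_minutes
              FROM da_check
           );
    -- with the sample values above (1.20, 2.30, 4.20, 5.30, 1.00) this returns 14:40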

  • Why does the sum of data in subfolders not add to the top folder?

    I am trying to remove folders on the primary drive of my Mac Pro to free up space. I have been using the Info window for each folder to see the total amount of data in that particular folder. When I add up the data contained in the folders, they sum to a substantially lower number of gigabytes than is shown by the Info window for the top folder. If you add the data shown for each of the top folders on the drive, the sum is considerably less than is shown in the drive's Info tab. Why is there this difference between the sum of the individual folders and the gigabytes shown for the drive? Is there data that is not apparent because of hidden folders not shown in the Finder?

    Suppose you had a folder with three one-byte files in it. Get Info reports three bytes in the folder, but each file takes a whole number of allocation blocks on the drive, probably something like 12K total.
    Mac OS X also blurs exactly what is meant by a folder. For example, an application is typically presented to you as one file, but if you look inside it, you would see that it may contain a hundred files in folders.
    Some users have reported better luck using OmniDiskSweeper (run as admin) to show how much real disk space is taken up by various items -- the Get Info byte counts can be hard to deal with.

  • Sum calculated data results from three queries in another query

    Hello,
    I would like to read calculated data from three different queries and sum them in another query. Is it possible to do this, please?

    Hi there,
    We have done this in Excel with a debtors report. I have five queries, each on a different sheet in a workbook. I have their total line appearing at the top of the results area.
    Then I have an Excel worksheet with Excel formulas to bring through the totals on the other reports.
    i.e.   +Sheet1!C30      + Sheet2!C35     + Sheet3!D40    = Result
    Hope this helps,
    Regards
    Gill

  • SQL for summing weekly data

    Hi
    I have two tables which have a date column and hold more than a month of data. I want to sum the weekly data (weeks starting Monday) by combining the two tables. Can you help me with the SQL for this?
    SQL> select * from ORCL_DATA_GROWTH;
    REPORT_DA DATAGROWTH                                                                                                             
    03-SEP-12       9.78                                                                                                             
    04-SEP-12       5.36                                                                                                             
    05-SEP-12       5.42                                                                                                             
    06-SEP-12      33.36                                                                                                             
    07-SEP-12       5.47                                                                                                             
    08-SEP-12        5.5                                                                                                             
    09-SEP-12        5.5                                                                                                             
    10-SEP-12       9.47                                                                                                             
    11-SEP-12       8.16                                                                                                             
    12-SEP-12      23.97                                                                                                             
    13-SEP-12      51.28                                                                                                             
    14-SEP-12      24.05                                                                                                             
    15-SEP-12      24.03                                                                                                             
    16-SEP-12      24.17                                                                                                             
    17-SEP-12      28.16                                                                                                             
    18-SEP-12      24.17                                                                                                             
    19-SEP-12      24.19                                                                                                             
    20-SEP-12      50.96                                                                                                             
    21-SEP-12      24.19   
    SQL> select * from ORCL_PURGING;
    REPORT_DA PURGING_SPACE FILE_SYSTEM                                                                                              
    01-OCT-12            18 /dborafiles/orac/ora_Test/oradata01                                                                     
    01-OCT-12             0 /dborafiles/orac/ora_Test/oradata05                                                                     
    02-OCT-12            55 /dborafiles/orac/ora_Test/oradata01                                                                     
    02-OCT-12             0 /dborafiles/orac/ora_Test/oradata05                                                                     
    03-OCT-12            21 /dborafiles/orac/ora_Test/oradata01                                                                     
    03-OCT-12             0 /dborafiles/orac/ora_Test/oradata05                                                                     
    04-OCT-12             0 /dborafiles/orac/ora_Test/oradata01                                                                     
    04-OCT-12             0 /dborafiles/orac/ora_Test/oradata05                                                                     
    05-OCT-12             0 /dborafiles/orac/ora_Test/oradata01                                                                     
    05-OCT-12             0 /dborafiles/orac/ora_Test/oradata05                                                                     
    06-OCT-12            70 /dborafiles/orac/ora_Test/oradata01                                                                     
    06-OCT-12             0 /dborafiles/orac/ora_Test/oradata05                                                                     
    07-OCT-12            21 /dborafiles/orac/ora_Test/oradata01                                                                     
    07-OCT-12             0 /dborafiles/orac/ora_Test/oradata05                                                                     
    08-OCT-12            21 /dborafiles/orac/ora_Test/oradata01                                                                     
    08-OCT-12             0 /dborafiles/orac/ora_Test/oradata05
    Combining the two tables, a sample of the daily output is below, but I want the output summed by weekly values.
    SQL> select a.report_date, sum(a.PURGING_SPACE) PURGING_SPACE, b.DATAGROWTH
           from ORCL_PURGING a, ORCL_DATA_GROWTH b
          where a.report_date = b.report_date
            and a.report_date <= sysdate
            and a.report_date >= sysdate - 30
          group by a.report_date, b.datagrowth
          order by a.report_date;
    REPORT_DA PURGING_SPACE DATAGROWTH
    19-SEP-12            77      24.19
    20-SEP-12             2      50.96
    21-SEP-12            47      24.19
    22-SEP-12            19      24.16
    23-SEP-12            22      24.05
    24-SEP-12            25      28.11
    25-SEP-12            43      24.08
    26-SEP-12            21      24.06
    27-SEP-12            22      50.86
    28-SEP-12            22      23.05
    29-SEP-12            22      23.27
    30-SEP-12            22      23.61
    01-OCT-12            18      28.67
    02-OCT-12            55      25.92
    03-OCT-12            21      23.38
    04-OCT-12             0      50.46
    05-OCT-12             0      23.62
    06-OCT-12            70      24.39
    07-OCT-12            21      24.53
    08-OCT-12            21      28.66
    09-OCT-12            51      24.41
    10-OCT-12            22      24.69
    11-OCT-12            23      50.72
    12-OCT-12            22      25.08
    13-OCT-12            25      25.57
    14-OCT-12            21      23.38
    15-OCT-12            22      27.77
    27 rows selected.
    Thanks in advance.
    Edited by: BluShadow on 18-Oct-2012 09:28
    added {noformat}{noformat} tags for readability. Please read {message:id=9360002} and learn to do this yourself in future.

    Hi,
    user9256814 wrote:
    Hi
    This is my query
    WITH purging_week AS
    (
         SELECT TRUNC (report_date, 'IW')     AS report_week
         ,     SUM (purging_space)           AS total_purging_space -- same as in main query
         FROM      orcl_purging
         GROUP BY TRUNC (report_date, 'IW')
    )
    ,     data_growth_week     AS
    (
         SELECT TRUNC (report_date, 'IW')     AS report_week
         ,     SUM (datagrowth)           AS total_datagrowth
         FROM      orcl_data_growth
         GROUP BY TRUNC (report_date, 'IW')
    )
    SELECT     COALESCE ( p.report_week
              , d.report_week
              )               AS report_week
    ,     p.total_purging_space
    ,     d.total_datagrowth
    FROM          purging_week     p
    FULL OUTER JOIN     data_growth_week d ON d.report_week = p.report_week
    ORDER BY report_week
    ;
    That's still hard to read.
    You may have noticed that this site normally compresses whitespace.
    Whenever you post formatted text (including, but not limited to, code) on this site, type these 6 characters:
    {code} (small letters only, inside curly brackets) before and after each section of formatted text, to preserve spacing. Blushadow probably won't do it for you every time you post a message.
    This is just one of the many things mentioned in the forum FAQ {message:id=9360002} that can help you get better answers sooner.
    Are you still having a problem? Have you tried using in-line views? If so, wouldn't it make more sense to post the code you're using, rather than some code you're not using?

  • Update column based on sum of data from another table

    Hi, we have two tables.
    Table 1: matusetrans
    ITEMNUM Location Quantity transdate
    AM1324 AM1 2 12-4-12
    AM1324 AM1 2 15-5-12
    AM1324 AM1 3 10-6-12
    AM1324 AM1 4 5-1-13
    AM1324 AM1 2 13-3-13
    AM1324 AM2 3 2-5-12
    AM1324 AM2 2 12-7-12
    AM1324 AM2 1 13-2-13
    Table 2: Inventory
    ITEMNUM STORELOC lastyear currentyear
    AM1324 AM1 need sum(quantity) here need sum(quantity)
    AM1324 AM2 need sum(quantity) here need sum(quantity)
    We have to update the lastyear and currentyear columns in the Inventory table with the sum of quantities for each item from the matusetrans table, based on the transaction date, for each location.
    We have nearly 13,000 records (itemnums with different locations) in the inventory table, and we have to update all of them.
    Any help? How do we write SQL to update the lastyear and currentyear columns with the sum of quantities based on itemnum and location in the Inventory table?
    Thanks

    Try this
    SQL> select * from matusetrans;
    ITEMNU LOC   QUANTITY TRANSDATE
    AM1324 AM1          2 12-APR-12
    AM1324 AM1          2 15-MAY-12
    AM1324 AM1          3 10-JUN-12
    AM1324 AM1          4 05-JAN-13
    AM1324 AM1          2 13-MAR-13
    AM1324 AM2          3 02-MAY-12
    AM1324 AM2          2 12-JUL-12
    AM1324 AM2          1 13-FEB-13
    8 rows selected.
    SQL> select * from inventory;
    ITEMNU STO   LASTYEAR CURRENTYEAR
    AM1324 AM1          0           0
    AM1324 AM2          0           0
    SQL> merge into inventory i
      2  using (
      3            select itemnum
      4                 , location
      5                 , sum(decode(extract(year from transdate), extract(year from sysdate), quantity)) currentyear
      6                 , sum(decode(extract(year from transdate), extract(year from add_months(sysdate, -12)), quantity)) lastyear
      7              from matusetrans
      8             group
      9                by itemnum
    10                 , location
    11        ) t
    12     on (
    13           i.itemnum  = t.itemnum and
    14           i.storeloc = t.location
    15        )
    16  when matched then
    17    update set i.lastyear = t.lastyear
    18             , i.currentyear = t.currentyear
    19  /
    2 rows merged.
    SQL> select * from inventory;
    ITEMNU STO   LASTYEAR CURRENTYEAR
    AM1324 AM1          7           6
    AM1324 AM2          5           1
    SQL>

  • Preview does not sum column data in PDF forms

    Preview is way better than Acrobat for most purposes, but it has some basic limitations that just make it hard to use. For example, I have to use a form that asks you to enter $ amounts in a table, but Preview does not sum the column; it just puts a "0" in. Why doesn't Preview add? Plus, this default behaviour makes it harder to paste in my own number, as there is a zero in the way. At least it would be nice if it could just remove the 0 and leave me with an editable field. Is this just the product of a badly created PDF form or, as I suspect, a limitation of Preview?

    I'm having the exact same problem. I have a fillable PDF form that has many checkboxes. Saving the PDF (it doesn't matter whether it's 'Save', 'Save All', or 'Save As' under the File menu) will clear/reset the checkboxes in the file.
    Does anyone know of a workaround for this? Or is Apple fixing this soon? I really don't want to get Acrobat Pro 9 (well, I don't think it's compatible with Snow Leopard yet anyway).
    If you'd like a sample PDF file to try it out, let me know.
    Thanks.

  • ALV Grid COLLECT statement to sum the data

    Hello experts,
    I am new to ABAP. I have a program whose output currently looks like this:
    Prod.ID   ... .... ... Qnty in Date1 .. Qnty in Date2 ... Qnty in Date3 ........ Qnty in Date31
    0001                       12.1               0.00               10.1
    0001                       12.1               0.00               10.1
    I need the summation of quantity for each date, and Prod.ID should not repeat. My output should look like this:
    Prod.ID   ... .... ... Qnty in Date1 .. Qnty in Date2 ... Qnty in Date3 ........ Qnty in Date31
    0001                       24.2               0.00               20.2
    0002                       12.1               0.00               10.1
    I wrote a COLLECT statement but it's not working. Please help me with this. One more thing: I am passing the data to gt_data.
    My code is here:
    TABLES: AFRU,AUFK,VBAK,KNA1,VBAP,VBKD.
    DATA: BEGIN OF i_alv OCCURS 0,
          ARBID TYPE AFRU-ARBID,
          BUDAT TYPE AFRU-BUDAT,
          WERKS TYPE AFRU-WERKS,
          LMNGA TYPE AFRU-LMNGA,
          AUFNR TYPE AFRU-AUFNR,
          KDAUF TYPE AUFK-KDAUF,
          KDPOS TYPE AUFK-KDPOS,
          VBELN TYPE VBAK-VBELN,
          KUNNR TYPE VBAK-KUNNR,
          NAME1 TYPE KNA1-NAME1,
          MATNR TYPE VBAP-MATNR,
          KDKG1 TYPE VBKD-KDKG1,
      end of i_alv.
    TYPES: BEGIN OF ty_data,
             lmnga TYPE AFRU-LMNGA,
             BUDAT TYPE AFRU-BUDAT,
             ARBID TYPE AFRU-ARBID,
             aufnr TYPE AFRU-AUFNR,
             kdauf TYPE AUFK-KDAUF,
             name1 TYPE KNA1-NAME1,
             matnr TYPE VBAP-MATNR,
             kdkg1 TYPE VBKD-KDKG1,
           END OF ty_data,
           tt_data TYPE STANDARD TABLE OF ty_data,
           BEGIN OF ty_dyn1,                                    "#EC NEEDED
             ARBID TYPE AFRU-ARBID,
             aufnr TYPE AFRU-AUFNR,
             kdauf TYPE AUFK-KDAUF,
             name1 TYPE KNA1-NAME1,
             matnr TYPE VBAP-MATNR,
             kdkg1 TYPE VBKD-KDKG1,
           END OF ty_dyn1,
           BEGIN OF ty_dyn2,                                    "#EC NEEDED
             date  TYPE AFRU-LMNGA,
           END OF ty_dyn2,
           BEGIN OF ty_cols,
             date TYPE BUDAT,
           END OF ty_cols,
           tt_cols TYPE SORTED TABLE OF ty_cols WITH UNIQUE KEY date.
    DATA: gt_data TYPE tt_data,
          gt_data2 type tt_data,
          gt_cols TYPE tt_cols,
          gs_col  TYPE ty_cols.
    DATA: go_sdescr     TYPE REF TO cl_abap_structdescr,
          go_sdescr_new TYPE REF TO cl_abap_structdescr,
          go_tdescr     TYPE REF TO cl_abap_tabledescr,
          gdo_handle    TYPE REF TO data,
          gs_component  TYPE abap_compdescr,
          gs_comp       TYPE abap_componentdescr,
          gt_components TYPE abap_component_tab,
          gr_data       TYPE REF TO cl_salv_table,
          gr_funct      TYPE REF TO cl_salv_functions,
          gr_columns    TYPE REF TO cl_salv_columns_table,
          gr_column     TYPE REF TO cl_salv_column_table,
          g_col         TYPE lvc_fname,
          g_txt         TYPE scrtext_l.
    FIELD-SYMBOLS: <t_data> TYPE ANY TABLE,
                   <s_data> TYPE any,
                   <c> TYPE any,
                   <d> TYPE ty_data.
    DATA:  pono TYPE ztecerti-pono,
           jobno TYPE ztecerti-jobno,
           sdk TYPE string,
           insert TYPE c,
           ok_code LIKE sy-ucomm.
    CALL SCREEN 100.
    START-OF-SELECTION.
    * Populate test data
    FORM get_data.
      SELECT A~ARBID
             A~BUDAT
             A~WERKS
             A~LMNGA
             A~AUFNR
             B~KDAUF
             B~KDPOS
             C~VBELN
             C~KUNNR
             D~NAME1
             E~MATNR
             F~KDKG1
        INTO CORRESPONDING FIELDS OF TABLE  gt_data
                   FROM AFRU AS A INNER JOIN AUFK AS B ON A~AUFNR EQ B~AUFNR
                                  INNER JOIN VBAK AS C ON B~KDAUF = C~VBELN
                                  INNER JOIN KNA1 AS D ON C~KUNNR = D~KUNNR
                                  INNER JOIN VBAP AS E ON B~KDAUF = E~VBELN
                                  INNER JOIN VBKD AS F ON B~KDAUF = F~VBELN
      WHERE   A~ARBID = '10000181' AND  A~BUDAT BETWEEN  PONO AND jobno
      GROUP BY A~ARBID A~LMNGA  A~BUDAT A~WERKS  A~AUFNR B~KDAUF   F~KDKG1   E~MATNR  D~NAME1 C~KUNNR   C~VBELN  B~KDPOS
    ORDER BY B~KDPOS.
    *collect gt_data into gt_data2.
    *gt_data
    *LOOP AT gt_data ASSIGNING <d>.
    ENDFORM.
    Thank you, I really need this badly.

    Answer is fully given here: Re: alv grid Cross Tab Issue....
    m.

  • Using Dimension Formulas to sum data based on different criteria

    Hi all,
    I am trying to use a "Dimension Formula" to perform the following calculation:
    We have an account dimension which has 2 important properties:
    1.     CRITERIUMTYPE: This property can have 3 different values: "WERK", "INV" OR "LIQ"
    2.     ACCTYPE: This property can have 2 different values: "EXP" or "INC"
    The client wants to have a report that sums data based on these 2 properties. An example will help to clarify this:
    ACCOUNTS     CRITERIUMTYPE     ACCTYPE     VALUE
    ACCOUNT A         WERK                          EXP               100 €
    ACCOUNT B         WERK                          INC               150 €
    ACCOUNT C         WERK                          EXP               200 €
    ACCOUNT D         WERK                          INC               300 €
    ACCOUNT E         INV                           EXP               50 €
    ACCOUNT F         INV                           INC               100 €
    ACCOUNT G         INV                           EXP               200 €
    ACCOUNT H         INV                           INC               500 €
    The clients wishes to see this data in the following way:
    CRITERIUMTYPE     ACCTYPE     VALUE
       WERK                         EXP              300 €
                                    INC              450 €
       INV                          EXP              250 €
                                    INC              600 €
    In order to achieve this I have created several new accounts, one for each combination e.g.: Account WERKEXP is used to sum the data on the combination CRITERIUMTYPE=WERK and ACCTYPE=EXP. I have created a dimension formula in my account dimension but this is where I am stuck. I have created the following formula to calculate the account WERKEXP:
    IIF([BUDGETPOSITIE].CURRENTMEMBER.PROPERTIES("ACCTYPE")="EXP",IIF([BUDGETPOSITIE].CURRENTMEMBER.PROPERTIES("CRITERIUMTYPE")="WERK",[BUDGETPOSITIE].CURRENTMEMBER,0),0)
    The problem with this formula is the following:
    The formula adds all amounts that meet the 2 criteria mentioned in the formula, EXP and WERK, but as soon as it finds an account that does not match the 2 criteria it sets the account WERKEXP back to 0. I need to know if there is a way, using dimension formulas, of adding these values together without the new account being set to 0 as soon as one of the accounts it needs to check does not meet one of the 2 criteria.
    We are working on SAP BPC 7.5 for NW with SP04
    All help is very much appreciated!
    Kind regards,
    Stefano

    Hi,
    You can also use the ParentHn property to have a different grouping of accounts within the dimension.
    So in your case, rather than using the member formula, you can have four accounts and add them to the ParentHn property to group them.
    1. The solution proposed by Nilanjan is specific to a Report/IS and performance will be good.
    2. My solution is global, something similar to an MDX formula, but performance may be slightly lower than using the Excel function.
    Hope this helps,
    Regards,
    G.Vijaya Kumar

  • Sum Date Time data type

    Hello,
    In an RDLC report I need to sum a date/time data type.
    Please help.
    Thanks.

    Hi Krishnakumar_Dev,
    If we want to extract the hour from a date/time field, we can use the Hour function as below in Reporting Services.
    =sum(hour(Fields!date.Value))
    Besides, if this issue still exists, could you please post a screenshot of your dataset with sample data and the desired result?
    For more information about Date and Time Functions, please refer to the following link:
    http://technet.microsoft.com/en-us/library/aa337153(v=sql.100).aspx
    Thanks,
    Katherine Xiong
    TechNet Community Support

  • CFScript Math expression - Summing up looped over data

    If I loop over numeric data and I want to sum that data, I am looking for the correct manner to do so. Of course I want to do this in cfscript.
    I know that I could go the route of using a one-dimensional array and the arraySum function, and I could also use structures.
    I am interested in different solutions to this problem.
    My end state is that I need to sum and average looped dynamic data for reporting purposes.
    For purposes of example, static values will be helpful for better understanding.
    Thanks in advance.

    Owain,
    This is what I came up with and it outputs the looped data. Now I am trying to add it up and average the results.
    BTW Adam,
    Be delusional on a Microsoft or Oracle forum board.
    Now that would be some fun reading cutting through those flames! (In a patronising Kinda Way!).
