To calculate the accumulated value

Hello,
I would like to write a calc script for the following scenario: I have a series of data values, and I would like to calculate an accumulated value based on their sequence from the smallest to the largest. The case is like this:
The input is as follows:
        Case 1
Acct1:      10
Acct2:       5
Acct3:       8
Acct4:      15
I would like to calculate the accumulated value based on the data values; the calculated result is in column Case 2:
        Case 1  Case 2  Rank
Acct1:      10      23     3
Acct2:       5       5     1
Acct3:       8      13     2
Acct4:      15      38     4
I can easily use the rank function to rank the data values of "Case1", but after that how can I use this rank to calculate the accumulated values? The data values may change from time to time.
Any good suggestions? Thanks in advance

Hi There
Try the script below. I have done this in the SAMPLE app. The script may look crazy, but I have tested it and it works. What makes it a little more complicated is the fact that ranking can give you duplicate rank values, so I have designed it to get around that problem.
Thanks
Anthony
SET UPDATECALC OFF;
VAR AccountCount=0;
VAR CurrRank=0;
/*Create blocks in case dimension as mine is sparse*/
FIX(Local,FY13,BegBalance,Forecast,Working,"E01_101_1110",@RELATIVE("TotaltestAccount",0),"HSP_InputValue")
DATACOPY "Case1" TO "Rank";
DATACOPY "Case1" TO "Case2";
DATACOPY "Case1" TO "Sequence";
DATACOPY "Case1" TO "Case1Seq";
ENDFIX
FIX(Local,FY13,BegBalance,Forecast,Working,"E01_101_1110",@RELATIVE("TotaltestAccount",0),"HSP_InputValue")
"Case2" = #missing;
"Sequence"= #missing;
/*Create a sequence for the list of accounts so that if you by any chance have two identical values the rank will still give a unique rank value - because this becomes a problem later*/
"Sequence"= (@SHIFT("Sequence",-1,"Acc1":"Acc5") + 0.00000000001);
/*Rank case1 and add the sequence to give you the unique ranking and then * -1 to reverse the rank*/
"Case1Seq" = "Sequence" + @RANK(SKIPNONE,"Case1",@RANGE("Case1",@RELATIVE("TotaltestAccount",0))) * -1;
ENDFIX
FIX(Local,FY13,Forecast,Working,"E01_101_1110",@RELATIVE("TotaltestAccount",0),"HSP_InputValue")
/*Next rank the previous rank from above to now populate the unique ranking of case 1 in dec order*/
"Rank"(
      IF(@ISMBR("BegBalance"))
"Rank" = @RANK(SKIPNONE,"Case1Seq",@RANGE("Case1seq",@RELATIVE("TotaltestAccount",0)));
ENDIF
/*Populate the max ranking value to start the variable counter*/
AccountCount = @MAXSRANGE(SKIPMISSING,"Rank"->"BegBalance",@RELATIVE("TotalTestAccount",0));
CurrRank = AccountCount;
)
ENDFIX
FIX(Local,FY13,"Case2",Forecast,Working,"E01_101_1110",@RELATIVE("TotaltestAccount",0),"HSP_InputValue")
/*Calculate the lowest case 1 value first or highest rank value, then make curr rank equal to highest rank -1 */
"BegBalance"(
           IF(AccountCount == "Rank")
           "Begbalance" = "BegBalance"->"Case1" ;
               CurrRank = CurrRank - 1;
           ENDIF
)
/*Then loop through the process reducing the current rank variable by 1 each time*/
LOOP(100)
"BegBalance"(
         IF(CurrRank == "Rank")/* i.e rank ==4 */
      "Begbalance" = @MAXSRANGE(SKIPMISSING,"Case2"->"BegBalance",@RELATIVE("TotalTestAccount",0)) +"BegBalance"->"Case1";
          CurrRank = CurrRank - 1;/*then make rank == 3*/
           ENDIF)
ENDLOOP;
ENDFIX
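
For anyone who just wants to sanity-check the numbers outside Essbase, the same rank-then-accumulate idea can be sketched in plain SQL (this is only an illustration, not part of the calc script above; the table acct_values and its columns acct and case1 are hypothetical):
-- Rank each account by its Case1 value, then accumulate in that order.
-- With the sample data (10, 5, 8, 15) this returns Case2 = 5, 13, 23, 38.
select acct,
       case1,
       rank() over (order by case1)            as rnk,
       sum(case1) over (order by case1, acct)  as case2
from   acct_values
order  by rnk;
The running SUM in rank order is the same accumulation the LOOP above builds up member by member.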

Similar Messages

  • Select just the values between min and max of an accumulated value over day

    Hello Forum,
    a value is accumulated over the course of a day. The next day the value is reset and starts being accumulated again:
    with sampledata as (select to_date('09.09.2012 00:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
                       select to_date('09.09.2012 01:03:08', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
                       select to_date('09.09.2012 02:54:11', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
                       select to_date('09.09.2012 03:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
                       select to_date('09.09.2012 04:04:19', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
                       select to_date('09.09.2012 05:04:20', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
                       select to_date('09.09.2012 06:12:02', 'dd.mm.yyyy hh24:mi:ss') ts, 23 val from dual union all
                       select to_date('09.09.2012 07:12:03', 'dd.mm.yyyy hh24:mi:ss') ts, 29 val from dual union all
                       select to_date('09.09.2012 08:12:04', 'dd.mm.yyyy hh24:mi:ss') ts, 30 val from dual union all
                       select to_date('09.09.2012 09:12:11', 'dd.mm.yyyy hh24:mi:ss') ts, 45 val from dual union all
                       select to_date('09.09.2012 10:12:12', 'dd.mm.yyyy hh24:mi:ss') ts, 60 val from dual union all
                       select to_date('09.09.2012 11:12:13', 'dd.mm.yyyy hh24:mi:ss') ts, 75 val from dual union all
                       select to_date('09.09.2012 12:21:24', 'dd.mm.yyyy hh24:mi:ss') ts, 95 val from dual union all
                       select to_date('09.09.2012 13:21:26', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
                       select to_date('09.09.2012 14:21:27', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('09.09.2012 15:21:30', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('09.09.2012 16:21:32', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('09.09.2012 17:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('09.09.2012 21:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('09.09.2012 23:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('10.09.2012 00:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('10.09.2012 01:03:08', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('10.09.2012 02:54:11', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('10.09.2012 03:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('10.09.2012 04:04:19', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('10.09.2012 05:04:20', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('10.09.2012 06:12:02', 'dd.mm.yyyy hh24:mi:ss') ts, 14 val from dual union all
                       select to_date('10.09.2012 07:12:03', 'dd.mm.yyyy hh24:mi:ss') ts, 34 val from dual union all
                       select to_date('10.09.2012 08:12:04', 'dd.mm.yyyy hh24:mi:ss') ts, 58 val from dual union all
                       select to_date('10.09.2012 09:12:11', 'dd.mm.yyyy hh24:mi:ss') ts, 70 val from dual union all
                       select to_date('10.09.2012 10:12:12', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
                       select to_date('10.09.2012 11:12:13', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('10.09.2012 12:21:24', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
                       select to_date('10.09.2012 13:21:26', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
                       select to_date('10.09.2012 14:21:27', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
                       select to_date('10.09.2012 15:21:30', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
                       select to_date('10.09.2012 16:21:32', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
                       select to_date('10.09.2012 21:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual)
    select   ts, val
    from     sampledata
    order by ts asc;
    How should I change the select statement to skip all data sets before the first minimum and the duplicates after the maximum of each day, in order to get a result like this:
    TS     VAL
    09.09.12 06:12     23
    09.09.12 07:12     29
    09.09.12 08:12     30
    09.09.12 09:12     45
    09.09.12 10:12     60
    09.09.12 11:12     75
    09.09.12 12:21     95
    09.09.12 13:21     120
    09.09.12 14:21     142
    10.09.12 06:12     14
    10.09.12 07:12     34
    10.09.12 08:12     58
    10.09.12 09:12     70
    10.09.12 10:12     120
    10.09.12 11:12     142
    10.09.12 12:21     153
    Thank you

    This solution works perfectly when the accumulated value has its low and its high on the same day. But I found out :( that there is also data which has its low yesterday and its high today. For a better understanding of the case: there is a machine which works over 3 shifts with irregular start and end times. For example, shift 1 can start at 5:50 or at 7:15. The accumulated value of the worked time is accumulated separately for each shift. This solution works for shift 1 (approximately between 06:00-14:00) and for shift 2 (approximately between 14:00-22:00), because there the low and the high of the accumulated value are on the same day. It does not work for shift 3 (approximately between 22:00-06:00), because the high of the accumulated value is, or can be, on the next day.
    So the thread title should be: "Select just the values between min and max of an accumulated value over the same day (today) or over two successive days (yesterday and today)".
    Sampledata for shift 1 or shift 2:
    {code}
    with sampledata as (select to_date('09.09.2012 00:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
    select to_date('09.09.2012 01:03:08', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
    select to_date('09.09.2012 02:54:11', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
    select to_date('09.09.2012 03:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
    select to_date('09.09.2012 04:04:19', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
    select to_date('09.09.2012 05:04:20', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
    select to_date('09.09.2012 06:12:02', 'dd.mm.yyyy hh24:mi:ss') ts, 23 val from dual union all
    select to_date('09.09.2012 07:12:03', 'dd.mm.yyyy hh24:mi:ss') ts, 29 val from dual union all
    select to_date('09.09.2012 08:12:04', 'dd.mm.yyyy hh24:mi:ss') ts, 30 val from dual union all
    select to_date('09.09.2012 09:12:11', 'dd.mm.yyyy hh24:mi:ss') ts, 45 val from dual union all
    select to_date('09.09.2012 10:12:12', 'dd.mm.yyyy hh24:mi:ss') ts, 60 val from dual union all
    select to_date('09.09.2012 11:12:13', 'dd.mm.yyyy hh24:mi:ss') ts, 75 val from dual union all
    select to_date('09.09.2012 12:21:24', 'dd.mm.yyyy hh24:mi:ss') ts, 95 val from dual union all
    select to_date('09.09.2012 13:21:26', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
    select to_date('09.09.2012 14:21:27', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('09.09.2012 15:21:30', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('09.09.2012 16:21:32', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('09.09.2012 17:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('09.09.2012 21:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('09.09.2012 23:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('10.09.2012 00:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('10.09.2012 01:03:08', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('10.09.2012 02:54:11', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('10.09.2012 03:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('10.09.2012 04:04:19', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('10.09.2012 05:04:20', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('10.09.2012 06:12:02', 'dd.mm.yyyy hh24:mi:ss') ts, 143 val from dual union all
    select to_date('10.09.2012 07:12:03', 'dd.mm.yyyy hh24:mi:ss') ts, 144 val from dual union all
    select to_date('10.09.2012 08:12:04', 'dd.mm.yyyy hh24:mi:ss') ts, 145 val from dual union all
    select to_date('10.09.2012 09:12:11', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 10:12:12', 'dd.mm.yyyy hh24:mi:ss') ts, 147 val from dual union all
    select to_date('10.09.2012 11:12:13', 'dd.mm.yyyy hh24:mi:ss') ts, 148 val from dual union all
    select to_date('10.09.2012 12:21:24', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
    select to_date('10.09.2012 13:21:26', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
    select to_date('10.09.2012 14:21:27', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
    select to_date('10.09.2012 15:21:30', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
    select to_date('10.09.2012 16:21:32', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
    select to_date('10.09.2012 21:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual)
    , got_analytics AS
    (
         SELECT ts, val
         ,      MIN (val) OVER ( PARTITION BY TRUNC (ts)
                                 ORDER BY     ts DESC
                               )     AS min_val_after
         ,      CASE
                    WHEN ROW_NUMBER () OVER ( PARTITION BY TRUNC (ts)
                                              ORDER BY     val
                                              ,            ts
                                            ) = 1
                    THEN -1 -- Impossibly low val
                    ELSE LAG (val) OVER ( PARTITION BY TRUNC (ts)
                                          ORDER BY     ts
                                        )
                END                  AS prev_val
         ,      MIN (val) OVER (PARTITION BY TRUNC (ts))
                                     AS low_val_today
         ,      NVL ( LAST_VALUE (val) OVER ( ORDER BY ts
                                              RANGE BETWEEN UNBOUNDED PRECEDING
                                                    AND     ts - TRUNC (ts) PRECEDING
                                            )
                    , -1
                    )                AS last_val_yesterday
         FROM sampledata
    )
    SELECT   ts
    ,        val
    FROM     got_analytics
    WHERE    val <= min_val_after
    AND      val >  prev_val
    AND      (   val >  low_val_today
              OR val != last_val_yesterday
             )
    ORDER BY ts
    {code}
    with the expected results:
    {code}
    1     09.09.2012 06:12:02     23
    2     09.09.2012 07:12:03     29
    3     09.09.2012 08:12:04     30
    4     09.09.2012 09:12:11     45
    5     09.09.2012 10:12:12     60
    6     09.09.2012 11:12:13     75
    7     09.09.2012 12:21:24     95
    8     09.09.2012 13:21:26     120
    9     09.09.2012 14:21:27     142
    10     10.09.2012 06:12:02     143
    11     10.09.2012 07:12:03     144
    12     10.09.2012 08:12:04     145
    13     10.09.2012 09:12:11     146
    14     10.09.2012 10:12:12     147
    15     10.09.2012 11:12:13     148
    16     10.09.2012 12:21:24     153
    {code}
    And the sampledata for shift 3 is:
    {code}
    with sampledata as (select to_date('08.09.2012 00:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 23 val from dual union ALL
    select to_date('08.09.2012 02:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 45 val from dual union all
    select to_date('08.09.2012 05:03:08', 'dd.mm.yyyy hh24:mi:ss') ts, 78 val from dual union all
    select to_date('08.09.2012 06:54:11', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
    select to_date('08.09.2012 08:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
    select to_date('08.09.2012 10:04:19', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
    select to_date('08.09.2012 12:04:20', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
    select to_date('08.09.2012 16:12:02', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
    select to_date('08.09.2012 17:12:03', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
    select to_date('08.09.2012 19:12:04', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
    select to_date('08.09.2012 21:12:11', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
    select to_date('08.09.2012 22:00:12', 'dd.mm.yyyy hh24:mi:ss') ts, 24 val from dual union all
    select to_date('08.09.2012 22:12:13', 'dd.mm.yyyy hh24:mi:ss') ts, 40 val from dual union all
    select to_date('08.09.2012 23:21:24', 'dd.mm.yyyy hh24:mi:ss') ts, 68 val from dual union all
    select to_date('09.09.2012 01:03:08', 'dd.mm.yyyy hh24:mi:ss') ts, 79 val from dual union all
    select to_date('09.09.2012 02:54:11', 'dd.mm.yyyy hh24:mi:ss') ts, 124 val from dual union all
    select to_date('09.09.2012 03:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 125 val from dual union all
    select to_date('09.09.2012 04:04:19', 'dd.mm.yyyy hh24:mi:ss') ts, 126 val from dual union all
    select to_date('09.09.2012 05:04:20', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 06:12:02', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 07:12:03', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 08:12:04', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 09:12:11', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 10:12:12', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 11:12:13', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 12:21:24', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 13:21:26', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 14:21:27', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 15:21:30', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 16:21:32', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 17:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 21:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union ALL
    select to_date('09.09.2012 22:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 5 val from dual union ALL
    select to_date('09.09.2012 22:51:33', 'dd.mm.yyyy hh24:mi:ss') ts, 23 val from dual union all
    select to_date('09.09.2012 23:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 40 val from dual union all
    select to_date('10.09.2012 00:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 50 val from dual union all
    select to_date('10.09.2012 01:03:08', 'dd.mm.yyyy hh24:mi:ss') ts, 60 val from dual union all
    select to_date('10.09.2012 02:54:11', 'dd.mm.yyyy hh24:mi:ss') ts, 78 val from dual union all
    select to_date('10.09.2012 03:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('10.09.2012 04:04:19', 'dd.mm.yyyy hh24:mi:ss') ts, 145 val from dual union all
    select to_date('10.09.2012 05:04:20', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 06:12:02', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 07:12:03', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 08:12:04', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 09:12:11', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 10:12:12', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 11:12:13', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 12:21:24', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 13:21:26', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 14:21:27', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 15:21:30', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 16:21:32', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 21:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual)
    , got_analytics AS
    (
    SELECT ts, val
    , MIN (val) OVER ( PARTITION BY TRUNC (ts)
                       ORDER BY ts DESC
                     ) AS min_val_after
    , CASE
        WHEN ROW_NUMBER () OVER ( PARTITION BY TRUNC (ts)
                                  ORDER BY val
                                  , ts
                                ) = 1
        THEN -1 -- Impossibly low val
        ELSE LAG (val) OVER ( PARTITION BY TRUNC (ts)
                              ORDER BY ts
                            )
      END AS prev_val
    , MIN (val) OVER (PARTITION BY TRUNC (ts))
                      AS low_val_today
    , NVL ( LAST_VALUE (val) OVER ( ORDER BY ts
                                    RANGE BETWEEN UNBOUNDED PRECEDING
                                          AND ts - TRUNC (ts) PRECEDING
                                  )
          , -1
          ) AS last_val_yesterday
    FROM sampledata
    )
    SELECT ts
    , val
    FROM got_analytics
    WHERE val <= min_val_after
    AND val > prev_val
    AND ( val > low_val_today
          OR val != last_val_yesterday
        )
    ORDER BY ts
    {code}
    with the unexpected results:
    {code}
    - ts val
    1     08.09.2012 00:04:08     23
    2     08.09.2012 22:12:13     40
    3     08.09.2012 23:21:24     68
    4     09.09.2012 22:21:33     5
    5     09.09.2012 22:51:33     23
    6     09.09.2012 23:21:33     40
    7     10.09.2012 00:04:08     50
    8     10.09.2012 01:03:08     60
    9     10.09.2012 02:54:11     78
    10     10.09.2012 03:04:08     142
    11     10.09.2012 04:04:19     145
    12     10.09.2012 05:04:20     146
    {code}
    The result should be:
    {code}
    - ts val
    1     08.09.2012 00:04:08     23
    2     08.09.2012 02:04:08     45
    3     08.09.2012 05:03:08     78
    4     08.09.2012 06:54:11     90
    5     08.09.2012 22:00:12     24
    6     08.09.2012 22:12:13     40
    7     08.09.2012 23:21:24     68
    8     09.09.2012 01:03:08     79
    9     09.09.2012 02:54:11     124
    10     09.09.2012 03:04:08     125
    11     09.09.2012 04:04:19     126
    12     09.09.2012 05:04:20     127
    13     09.09.2012 22:21:33     5
    14     09.09.2012 22:51:33     23
    15     09.09.2012 23:21:33     40
    16     10.09.2012 00:04:08     50
    17     10.09.2012 01:03:08     60
    18     10.09.2012 02:54:11     78
    19     10.09.2012 03:04:08     142
    20     10.09.2012 04:04:19     145
    21     10.09.2012 05:04:20     146
    {code}
    Thank you for your help!
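
    One idea for the shift 3 case (only a sketch, not a tested solution for this thread): if shift 3 always ends before 06:00, shifting every timestamp back by 6 hours makes a 22:00-06:00 shift fall on a single "logical day", so the same per-day analytics can be reused with that derived day instead of TRUNC(ts). The 6-hour offset is an assumption about the shift boundaries:
    {code}
    with shifted as (
      select ts, val,
             trunc(ts - 6/24) as shift_day   -- 22:00 today and 05:00 tomorrow share one shift_day
      from   sampledata
    )
    select   ts, val, shift_day
    from     shifted
    order by ts;
    {code}
    In got_analytics one would then PARTITION BY shift_day instead of TRUNC(ts).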

  • Calculation of accumulated values

    Hi experts!
    I'm looking for a solution on how to accumulate a value, say 0AMOUNT, over time. The reason is a requirement to look at accumulated values although only posted amounts per fiscal period are available from the BC extractor. So the posted amounts need to be summed up over the time dimension in order to calculate an (also historical) balance for each fiscal period. How would you best implement such a solution?
    I have considered:
    1) Define a hierarchy on 0FISCPER that groups fiscal periods into an aggregated level - e.g. 2001-01 in the hierarchy would contain the actual fiscper 2001-001, 2001-02 in the hierarchy would contain both 2001-001 and 2001-002 (thereby giving an accumulated value when summed up), and so on ...
    2) Do the accumulation in an update routine with some ABAP code - I'm not quite sure how to do this - any suggestions would be helpful!
    3) Dynamically calculate the sum of 0AMOUNT in the report, although this would be the least efficient solution...
    Any help/suggestions are appreciated!
    Many thanks,
    David

    Just as you say, I'm also sure that it "can" be done; I'm just looking for the best solution, and possibly some hints on implementing it.
    It is true that I have 0AMOUNT per fiscal year/period, but the option to simply sum up 0AMOUNT in the report is not feasible since that would require too much computation to be done online (we keep history for a long time back). Hence I would need to precompute the accumulated value per fiscal period and save it in the cube as a non-cumulative balance. So when the user restricts on, say, fiscper "2003-01" in the query, the accumulated balance for 2003-01 is retrieved directly from the cube.
    Many thanks,
    David

  • Auto sum Column accumulated value in gridview

    The principle seems simple but I'm in trouble. Does anyone have an example or script to auto-sum a column in the gridview? To explain, I have a gridview with the columns:
    amount, unitary value, total value and accumulated value. I need the accumulated value column to add the previous value
    to the current value. For example:
    quantity unitary_value total_value accumulated_value
    10 10,00 100,00 100,00
    20 50,00 1.000,00 1.100,00
    5 500,00 2.500,00 3.600,00
    Basically that's it. The accumulated value column takes the value of the previous line and adds it to the current value.
    If you can help!!
    Thank you for your attention.

    No, you're right about the calculation in the first post; I got it wrong. Now check the solution below. You need to use the RowDataBound event of the gridview to do the calculation and print it in the cell:
    //This is just a method to fill the datatable with dummy data (you can ignore it in your code)
    static DataTable GetTable()
    {
        // Here we create a DataTable with four columns.
        DataTable table = new DataTable();
        table.Columns.Add("quantity", typeof(double));
        table.Columns.Add("unitary_value", typeof(double));
        table.Columns.Add("total_value", typeof(double));
        table.Columns.Add("accumulated_value", typeof(double));
        // Here we add three DataRows.
        table.Rows.Add(10, 10, 0, 0);
        table.Rows.Add(20, 50, 0, 0);
        table.Rows.Add(30, 40, 0, 0);
        return table;
    }
    protected void Page_Load(object sender, EventArgs e)
    {
        // Bind the gridview to the datatable
        GridView1.DataSource = GetTable();
        GridView1.DataBind();
    }
    // Global var to keep the accumulated value across rows
    double _accumulatedvalue = 0;
    protected void GridView1_RowDataBound(object sender, GridViewRowEventArgs e)
    {
        if (e.Row.RowType == DataControlRowType.DataRow)
        {
            // Create the total value column by multiplying cells 0 and 1 (maybe you don't need this)
            double val = Convert.ToDouble(e.Row.Cells[0].Text) * Convert.ToDouble(e.Row.Cells[1].Text);
            e.Row.Cells[2].Text = val.ToString();
            // Accumulate the value and fill the related cell (you can replace val below with Convert.ToDouble(e.Row.Cells[2].Text))
            _accumulatedvalue = _accumulatedvalue + val;
            e.Row.Cells[3].Text = _accumulatedvalue.ToString();
        }
    }
    Fouad Roumieh

  • YTD and accumulated depreciation have the same value when DPIS is 2 years ago

    Hi all
    I am using FA and I have some problems. I have just set up this module and now I want to migrate assets into the system. The customer gave me an Excel sheet with the fields that need to be in the system, such as: asset cost, accumulated depreciation, date placed in service, YTD depreciation and so on.
    The problems are:
    1) When I enter a new asset manually, the YTD depreciation and accumulated depreciation are required to be equal. OK so far; I register a new asset in the test environment, enter a date placed in service of, for example, two years ago (because that is the real date placed in service of the asset) and run depreciation. The system calculates the accumulated depreciation correctly, but it calculates the same value for YTD depreciation, which is not right, because in this case accumulated depreciation and YTD depreciation are completely different. How can I solve this and put correct values in these fields?
    2) When I enter these assets directly into the fa_mass_additions table according to the Fixed Assets user guide, I get the following errors: when I put "ON HOLD" in the queue name and open these assets in the system, the on-hold field has no value (it is empty).
    When I try to run Post Mass Additions, the report does not find the parameter (book) because the system gives an error message telling me that there is no value in this list of values.
    Please help me solve these problems, because otherwise I am not able to migrate the assets for the company.
    Thank you and best regards

    gaskins,
    You may have different Color Settings, and if you use Pantone you should be aware that the whole interpretation has changed (a number of times, most radically recently).
    If you say more, more can be said.

  • Need help to write up the asset value (land) which is not depreciated

    Hi Experts,
    My client wants to write up the asset value (land value) from 500000 to 750000, but this asset does not have any accumulated depreciation
    against which to write up the values.
    Could someone please suggest how we can write up values for assets that have no accumulated depreciation?
    I tried transaction ABZU but I am getting the error below.
    You cannot post write-ups
    Message no. AA402
    Diagnosis
    None of the areas to be posted in accordance with the transaction type entered manages one of the depreciation types entered or cumulative values.
    Procedure
    Check the transaction type.
    please could some one help on this.
    Thanks in advance.
    Regards,
    Venkat

    Hi Paul,
    Thanks for your valuable inputs.
    As per the second option, my client has not configured revaluation of assets, hence this is not applicable.
    So as per your first option we can go ahead with this.
    Could you please suggest which transaction code is most suitable for the manual posting?
    Thanks & Regards,
    Venkat

  • FI-AP Tax Code not considering the excise duty while calculating the VAT

    Sub: A/P Tax Code not considering the excise duty while calculating the VAT
    Hi Friends,
    I have a typical problem.
    Till now we have been using "W6" as a tax code, where Excise is 10%, Ed Cess 2%, SEd Cess 1% and VAT 4%.
    But as the Govt. rules changed, I created a new tax code "W8" by copying W6 and modified the VAT to 5% by changing the condition record JVRD.
    The new tax code is not calculating the VAT part correctly. It is taking into consideration the Invoice Base value, Ed Cess & SEd Cess (omitting the Excise Duty).
    I have checked my tax procedure; everything there is fine. Even my old tax code "W6" is working correctly.
    Let me know where the error lies.
    Here is the scenario
    00. Tax code                    -   W6   -   W8   - Current status of W8
    01. Invoice Base                - 355181 - 351798 - 353324 (by SAP Document Simulation View)
    02. Excise @ 10% on 01          -  35518 -  35180 -  35332
    03. Ed Cess @ 2% on 01          -    710 -    704 -    707
    04. SEd Cess @ 1% on 01         -    355 -    352 -    353
    05. VAT @ 4% on (01+02+03+04)   -  15671 -      0 -      0
    05. VAT @ 5% on (01+02+03+04)   -      0 -  19401 -      0
    05. VAT @ 5% on (01+03+04)      -      0 -      0 -  17719
    INVOICE TOTAL VALUE             - 407435 - 407435 - 407435
    Please let me know how to overcome this issue.
    By the way I am using the above tax code at FB60 transaction code.
    Regards
    Krishna
    Edited by: Gopi Krishna Gutti on Oct 17, 2011 10:18 AM

    Hi Vivek
    Actually we planned to implement the SAP FI & MM modules only, and all the configuration is done in the system, but at the final stage our management was not satisfied with the MM area, so finally we have gone live with only the FI module.
    By the way, thank you for the reply. I have maintained 100% in JMX1 for the new tax code and it is working fine.
    I will give you full points.
    Once again thanks a lot.
    Regards
    Krishna

  • Calculating Accumulative Value for a particular period

    Hi,
    I want to calculate accumulative values based on 0CALMONTH for a key figure.
    In the rows I want 0CALMONTH, and a key figure in the columns. If we set the property of the key figure to "Cumulative", it adds up the values: in the first month it shows the first month's value, in the 2nd month it shows the sum of the 1st and 2nd months. But I have an interval variable on 0CALMONTH (e.g. 03.2006 to 09.2006) and it starts cumulating from the 3rd month, so in the 4th month it shows only the values of the 3rd and 4th months. What I want is "accumulative" values from the start of that year: even though I give the period 03.2006 to 09.2006, the value for the 3rd month should be the sum of the 1st, 2nd and 3rd months, and so on up to the last month of the given period.
    Please can any one suggest me....
    Thanks and Regards
    Rajesh
    Message was edited by:
            rajesh

    Hi ,
    For my above problem I am using the code below. It has no syntax errors, but when the query is displayed in the web browser it is not getting any values.
    DATA: L_S_RANGE1 TYPE RSR_S_RANGESID.
    DATA: LOC_VAR_RANGE1 LIKE RRRANGEEXIT.
    DATA: L_VALUE LIKE RRRANGEEXIT-HIGH.
    CASE I_VNAM.
      WHEN 'ZCUM_INTERVAL'.
        IF I_STEP = 2.
          LOOP AT I_T_VAR_RANGE INTO LOC_VAR_RANGE1 WHERE VNAM = '0I_CMNTH'.
            L_VALUE = LOC_VAR_RANGE1-LOW.
            WHILE L_VALUE+4(2) < LOC_VAR_RANGE1-HIGH+4(2).
              IF SY-INDEX > 1.
                L_VALUE+4(2) = L_VALUE+4(2) + 1.
                IF STRLEN( L_VALUE+4(2) ) = 1.
                  CONCATENATE '0' L_VALUE+4(2) INTO L_VALUE+4(2).
                ENDIF.
              ENDIF.
              CLEAR L_S_RANGE1.
              L_S_RANGE1-LOW = LOC_VAR_RANGE1-LOW(4).
              L_S_RANGE1-LOW+4(2) = '01'.
              L_S_RANGE1-HIGH = L_VALUE.
              L_S_RANGE1-SIGN = 'I'.
              L_S_RANGE1-OPT = 'BT'.
              APPEND L_S_RANGE1 TO E_T_RANGE.
            ENDWHILE.
          ENDLOOP.
        ENDIF.
    ENDCASE.
    Please can any one suggest me regarding this.
    Thanks in Advance...
    TR
    Rajesh

  • Watching the accumulative credit of a vendor

    I'm searching for a report that produces the accumulated credit of a vendor.
    I mean something like FBL1N, but the sum column has to be like this:
    If FBL1N gives these 3 lines:
    vendor         sum
    50051          100
    50051          150
    50051          -30
    then I need a report that looks like this:
    vendor         sum
    50051          100
    50051          250
    50051          220

    Hi,
    Use the following drilldown report: S_ALR_87012079. This is an advanced report. You can also create your own drilldown report to suit your taste using transaction code FKI1.
    For the specific output that you have in mind, execute the report. In the output, select the column header where the total is, so that the entire value column is highlighted. Then go to the menu: Edit -> Cumulative display on/off. You will get the desired output.
    To retain this state, from the menu select: Report -> Save Layout.
    Cheers
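
    For reference, the cumulative display described above is just a running total. A minimal plain-SQL sketch of the same arithmetic, in case the line items are ever pulled into a query tool (the table vendor_items and the columns vendor, doc_date and amount are hypothetical, not SAP objects):
    -- Running total per vendor: 100, 150, -30 becomes 100, 250, 220.
    select vendor,
           amount,
           sum(amount) over (partition by vendor
                             order by doc_date
                             rows between unbounded preceding and current row) as cumulative_amount
    from   vendor_items
    order  by vendor, doc_date;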

  • FI-AA: table(-s) for the asset value

    Dear forum members,
    I need to find the table where I can get the asset value from.
    The background is that I'm writing a report where I have to get the value of an asset. If it is higher than 0.00 then it has to do this, and if it is lower than 0.00 it has to do that, and so on.
    It would be great if someone could help me with this matter, or at least give me an idea of how to find it out.
    Thanks in advance

    Hi,
    definitely table ANLC. You have to calculate the asset value yourself, as ANLC contains the acquisition cost, accumulated depreciation, depreciation of the current year, ...
    best regards, Christian

  • The accumulation at the aggregate level

    Hi,
    in the web form, I entered data in the first 3 months (January, February, March) and saved it, but the accumulation does not occur at Q1. Q1 takes just the values of March.

    Is this a Classic Planning Application? If yes, here are the steps version 11.1.1.3: http://download.oracle.com/docs/cd/E12825_01/epm.111/hp_admin/mem_prop.html

  • How to change the accumulated (uploaded) depreciation

    Hi Experts,
    At the time of data transfer from the legacy system to SAP, we also transferred assets with their gross values and accumulated depreciation up to 29.02.2008. In doing so there were errors in some 10 assets where the uploaded gross value as well as the accumulated depreciation were too low. We now have to increase the asset gross value as well as the uploaded accumulated depreciation value. I have tried AS92, but this allows me to change only the gross value, and that alone would not solve my problem; I also have to change the accumulated (uploaded) depreciation. If I use transaction ABAA, the system would also post an expense entry, which is not correct either, as the expense was already booked and uploaded separately. Thus if I do ABAA I would be posting double depreciation (expenses). We have still not gone "productive" for assets. We have also run depreciation for period 12, i.e. March 2008, and thereafter for periods 1 to 3 of 2008, for all the assets. Kindly let us know how to get this rectified in the system.

    Hello Nikki,
    this sounds a bit strange; normally the accumulated depreciation can be edited too. Check in OAYF whether "calculate accumulated depreciation" is active. If depreciation has been posted, the system carries out correction postings in the current period.
    If this does not work: did you use a standard SAP tool for the legacy data transfer? What are your fiscal year, the transfer date and the last posted period in the system?
    Best regards
      Horst

  • Unable to capture the parameter values from a PL/SQL procedure

    hi.
    I'm trying to capture the parameter values of a PL/SQL procedure by calling it inside an anonymous block, but I'm getting a "reference to uninitialized collection" error (ORA-06531).
    Please help me regarding.
    i'm using following block for calling the procedure.
    declare
    err_cd varchar2(1000);
    err_txt VARCHAR2(5000);
    no_of_recs number;
    out_sign_tab search_sign_tab_type:=search_sign_tab_type(search_sign_type(NULL,NULL,NULL,NULL,NULL));
    cntr_var number:=0;
    begin
         rt843pq('DWS','3000552485',out_sign_tab,no_of_recs,err_cd,err_txt);
         dbms_output.put_line('The error is ' ||err_cd);
         dbms_output.put_line('The error is ' ||err_txt);
         dbms_output.put_line('The cntr is ' ||cntr_var);
         for incr in 1 .. OUT_SIGN_TAB.count
         loop
         cntr_var := cntr_var + 1 ;
    Dbms_output.put_line(OUT_SIGN_TAB(incr).ref_no||','||OUT_SIGN_TAB(incr).ciref_no||','||OUT_SIGN_TAB(incr).ac_no||','||OUT_SIGN_TAB(incr).txn_type||','||OUT_SIGN_TAB(incr).objid);
    end loop;
    end;
    The error is thrown on the line "for incr in 1 .. OUT_SIGN_TAB.count".
    Following is some related information:
    The 3rd parameter of the procedure is an OUT parameter. It is of a collection type (SEARCH_SIGN_TAB_TYPE) which is available in the database as follows.
    TYPE "SEARCH_SIGN_TAB_TYPE" IS TABLE OF SEARCH_SIGN_TYPE
    TYPE "SEARCH_SIGN_TYPE" AS OBJECT
    (ref_no VARCHAR2(22),
    ciref_no VARCHAR2(352),
    ac_no VARCHAR2(22),
    txn_type VARCHAR2(301),
    objid VARCHAR2(1024))............

    We don't have your rt843pq procedure, but when commenting that line out, everything works:
    SQL> create TYPE "SEARCH_SIGN_TYPE" AS OBJECT
      2  (ref_no VARCHAR2(22),
      3  ciref_no VARCHAR2(352),
      4  ac_no VARCHAR2(22),
      5  txn_type VARCHAR2(301),
      6  objid VARCHAR2(1024))
      7  /
    Type created.
    SQL> create type "SEARCH_SIGN_TAB_TYPE" IS TABLE OF SEARCH_SIGN_TYPE
      2  /
    Type created.
    SQL> declare
      2    err_cd varchar2(1000);
      3    err_txt VARCHAR2(5000);
      4    no_of_recs number;
      5    out_sign_tab search_sign_tab_type:=search_sign_tab_type(search_sign_type(NULL,NULL,NULL,NULL,NULL));
      6    cntr_var number:=0;
      7  begin
      8    -- rt843pq('DWS','3000552485',out_sign_tab,no_of_recs,err_cd,err_txt);
      9    dbms_output.put_line('The error is ' ||err_cd);
    10    dbms_output.put_line('The error is ' ||err_txt);
    11    dbms_output.put_line('The cntr is ' ||cntr_var);
    12    for incr in 1 .. OUT_SIGN_TAB.count
    13    loop
    14      cntr_var := cntr_var + 1 ;
    15      Dbms_output.put_line(OUT_SIGN_TAB(incr).ref_no||','||OUT_SIGN_TAB(incr).ciref_no||','||OUT_SIGN_TAB(incr).ac_no||','||OUT_SIGN_TAB(incr).txn_type||','||OUT_SIGN_TAB(incr).objid);
    16    end loop;
    17  end;
    18  /
    The error is
    The error is
    The cntr is 0
    PL/SQL procedure successfully completed.
    Regards,
    Rob.
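
    If the problem is inside rt843pq itself, the usual cause of ORA-06531 is that the OUT nested table is never initialized before it is filled. A minimal sketch of what such a procedure needs to do, reusing the types above (demo_fill is a hypothetical name, not your procedure):
    create or replace procedure demo_fill (out_sign_tab OUT search_sign_tab_type)
    is
    begin
      out_sign_tab := search_sign_tab_type();   -- initialize the collection
      out_sign_tab.extend;                      -- make room for one element
      out_sign_tab(1) := search_sign_type('R1', 'CI1', 'AC1', 'TXN', 'OBJ1');
    end;
    /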

  • What is the initial value of sy-tabix & sy-index?

    Hi Folks
    I have a small doubt.
    What is the initial value of sy-tabix & sy-index?
    Can anyone please clarify this for me?
    Regards,
    Sree

    Hi Sree,
    both values are initialized to 0 before processing, and during processing the values change according to the scenario used.
    If helpful, reward some points.
    with regards,
    suresh babu aluri.

  • Not able to get the full value from request.getParameter()

    hi all,
    I am giving a text input value "Analysis and tracking" in one JSP form.
    While fetching and assigning the value to another variable using request.getParameter, I am getting only the text "Analysis"; everything after the space is ignored and not displayed.
    Help me in solving this.
    Thanks
    Balaji

    I think your code is something like this:
    <input name="xyz" type="text" value=<%=variable%>>
    Without quotes, HTML treats the first space as the end of the attribute value. Therefore you are only getting "Analysis", the first word; when your browser encounters the first space it ignores whatever comes after it.
    One way of overcoming this is to put double quotes around the text value:
    <input name="xyz" type="text" value=<% out.println("\"" + variable + "\""); %>>
    Thanks.
