Calculation of accumulated values
Hi experts!
I'm looking for a solution for how to accumulate a value, say 0AMOUNT, over time. The reason is a requirement to look at accumulated values, although only posted amounts per fiscal period are available from the BC extractor. So the posted amounts need to be summed up over the time dimension in order to calculate an (also historical) balance for each fiscal period. How would you best implement such a solution?
I have considered:
1) Define a hierarchy on 0FISCPER that groups fiscal periods into an aggregated level - e.g. node 2001-01 in the hierarchy would contain the actual fiscper 2001-001, node 2001-02 would contain both 2001-001 and 2001-002 (thereby producing an accumulated value when summed up), and so on ...
2) Do the accumulation in an update routine with some ABAP code - I'm not quite sure how to do this - any suggestions would be helpful!
3) Dynamically calculate the sum of 0AMOUNT in the report, although this would be the least efficient solution...
Any help/suggestions are appreciated!
Many thanks,
David
Just as you say, I'm also sure that it "can" be done; I'm just looking for the best solution, and possibly some hints on implementing it.
It is true that I have 0AMOUNT per fiscal year/period, but the option to simply sum up 0AMOUNT in the report is not feasible, since that would require too much computation to be done online (we keep history a long way back). Hence I would need to precompute the accumulated value per fiscal period and save it in the cube as a non-cumulative balance. So when the user restricts on, say, fiscper "2003-01" in the query, the accumulated balance for 2003-01 is retrieved directly from the cube.
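The precomputation in option 2 boils down to a running total over fiscal periods. This is not ABAP, but as a sketch of the idea (the function and field names here are my own, not BW objects):

```python
# Illustrative sketch: turn posted amounts per fiscal period into an
# accumulated (historical) balance per period.

def accumulate_balances(postings):
    """postings: list of (fiscper, amount) pairs, possibly unsorted.
    Returns one (fiscper, accumulated_balance) per period."""
    balances = []
    running = 0.0
    for fiscper, amount in sorted(postings):
        running += amount
        balances.append((fiscper, running))
    return balances

postings = [("2003001", 100.0), ("2003002", -40.0), ("2003003", 25.0)]
print(accumulate_balances(postings))  # balance carried forward each period
```

In an update routine the same loop would run per account/characteristic combination, seeded with the previous period's balance.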
Many thanks,
David
Similar Messages
-
Auto sum Column accumulated value in gridview
The principle seems simple but I'm in trouble. Does anyone have an example or script to auto-sum a column in the gridview? To explain: I have a gridview with the columns
quantity, unitary value, total value and accumulated value. I need the accumulated value column to add the previous value
to the current value. For example:
quantity unitary_value total_value accumulated_value
10 10,00 100,00 100,00
20 50,00 1.000,00 1.100,00
5 500,00 2.500,00 3.600,00
Basically it would add up: the accumulated value column takes the value of the previous line and adds it to the current value.
If you can help !!
Thank you for your attention.
No, you're right about the calculation in the first post - I got it wrong. Now check the solution below; you need to use the RowDataBound event of the gridview to do the calculation and print it in the cell:
// This is just a method to fill the DataTable with dummy data (you can ignore it in your code)
static DataTable GetTable()
{
    // Create a DataTable with four columns.
    DataTable table = new DataTable();
    table.Columns.Add("quantity", typeof(double));
    table.Columns.Add("unitary_value", typeof(double));
    table.Columns.Add("total_value", typeof(double));
    table.Columns.Add("accumulated_value", typeof(double));
    // Add three DataRows.
    table.Rows.Add(10, 10, 0, 0);
    table.Rows.Add(20, 50, 0, 0);
    table.Rows.Add(30, 40, 0, 0);
    return table;
}

protected void Page_Load(object sender, EventArgs e)
{
    // Bind the gridview to the datatable.
    GridView1.DataSource = GetTable();
    GridView1.DataBind();
}

// Field to keep the accumulated value across rows.
double _accumulatedvalue = 0;

protected void GridView1_RowDataBound(object sender, GridViewRowEventArgs e)
{
    if (e.Row.RowType == DataControlRowType.DataRow)
    {
        // Create the total value column by multiplying cells 0 and 1 (maybe you don't need this).
        double val = Convert.ToDouble(e.Row.Cells[0].Text) * Convert.ToDouble(e.Row.Cells[1].Text);
        e.Row.Cells[2].Text = val.ToString();
        // Accumulated value: add this row's total and fill the related cell
        // (you can replace val below with Convert.ToDouble(e.Row.Cells[2].Text)).
        _accumulatedvalue = _accumulatedvalue + val;
        e.Row.Cells[3].Text = _accumulatedvalue.ToString();
    }
}
Fouad Roumieh -
I would like to know how I can create a bell graph without using sub-VIs. The data I created consists of 500 readings with values from 0 to 100; I have calculated the mean value and standard deviation. I hope someone can help me.
Here's a quick example I threw together that generates a sort-of-bell-curve shaped data distribution, then performs the binning and plotting.
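For readers without the attachment, the general binning idea can be sketched outside LabVIEW as well. This Python version (the bin count and distribution parameters are my own choices, not from the VI) generates 500 roughly bell-shaped readings in the 0-100 range, computes mean and standard deviation, and tallies the histogram bins:

```python
import random
import statistics

random.seed(42)
# 500 readings in the 0..100 range, roughly bell-shaped around 50.
readings = [min(100.0, max(0.0, random.gauss(50, 15))) for _ in range(500)]

mean = statistics.mean(readings)
stdev = statistics.stdev(readings)

# Bin into 10 equal-width bins of width 10.
bin_width = 10
counts = [0] * 10
for r in readings:
    idx = min(int(r // bin_width), 9)  # clamp a reading of 100 into the last bin
    counts[idx] += 1

print(mean, stdev)
print(counts)  # histogram counts; the middle bins should dominate
```

Plotting the `counts` list against the bin centers gives the bell-curve shape.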
-Kevin P.
Message Edited by Kevin Price on 12-01-2006 02:42 PM
Attachments:
Binning example.vi 51 KB
Binning example.png 12 KB -
Octroi is not calculating on Freight value
Hello,
We have maintained all the conditions in the PO conditions tab, including the freight conditions.
Here Octroi is not being calculated on the freight value.
If the freight vendor is different, then it is OK to calculate Octroi without freight;
if the freight vendor is the same as the main vendor, then Octroi should be calculated including the freight value.
Regards
Mahesh Naik
Hello,
In the pricing schema we have given 361 as the alternative condition base value (Alt CBV) for the Octroi condition (JOCM), to calculate the Octroi percentage on the Octroi base value, that is, basic price + excise duty + sales tax.
With routine 361, the system should fetch the Octroi base value from the taxation procedure into the pricing schema.
Regards
Mahesh Naik. -
Condition Records not to be calculated if base value is negative
Dear all
I have configured a pricing procedure which is used when performing DP90 with reference to a service order. The values are populated in condition type EK02, and based on this value various other charges like service tax and supervision charges are calculated (condition records have been maintained for the condition types related to these charges). Is it possible that these charges do not get calculated if the value in EK02 is negative?
You can create a routine where you put the check, and assign it in the pricing procedure to all the condition types you don't want a negative value for.
Thanks -
Excise duty calculation on Assessable value in PO
Dear Team,
We are facing a critical issue in excise duty calculation: excise is being calculated on the basic price for our normal procurement, and we are selling the same at MRP... Now the issue is that excise duty has to be calculated on the assessable value (65% of the MRP value) instead of the basic price in the PO.
Example:
Basic Price - 100
MRP - 300
Assessable value - 195
Excise Duty - 24.10 (195*12.36%)
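The arithmetic in this example can be checked directly; a small Python sketch (the 65% abatement and 12.36% duty rate are taken from the post above, the variable names are mine):

```python
mrp = 300.0
abatement_factor = 0.65   # assessable value is 65% of MRP
duty_rate = 0.1236        # 12.36% excise duty

assessable_value = mrp * abatement_factor
excise_duty = round(assessable_value * duty_rate, 2)

print(assessable_value)   # 195.0
print(excise_duty)        # 24.1
```

So the expected duty of 24.10 comes from 195 * 12.36%, not from the basic price of 100.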
Regards,
Sai
Hi Sai,
Sorry for the late reply. I think you are close to the solution. You should not change the procedure too much.
The MRP and abatement condition types should be kept independent of calculations in the procedure. They are only used, via the routine, to change the base price already determined for BED (JMOP). If that is the case, you will get the base price (BASB) added to the excise duties, which are then used for the calculation of VAT.
For Eg:
10 BASB
30 ZMRP(Independent condition type used to store MRP value)
40 ZABA (Independent condition type used to store Abatement %)
70 JMOP From 10(Routine XXX which will change the base value for calculating JMOP with a formula MRP*Abatement %)
270 ZVAT(As usual will calculate from BASB+Excise duties)
Hope this is the way you have done it. Steps 30 and 40 should not appear in any "From - to" steps.
This is working well with one of our clients.
Regards
Binoy -
How to do calculation on time value?
I have "Duration" as a column in a source Excel file which contains data like 00:00:06, 00:01:10. When I connect this Excel file to Lumira I get the column data as [$-5400]12:00:06\AM.
Please specify how to perform calculations on the time values, like sum or avg.
I'm struggling to think of examples of this kind of data (00:00:06, 00:01:10),
so I suspect this formatting will be troublesome. If Lumira can't handle this using advanced acquisition properties, then it might be an idea to split the column before it comes into Lumira.
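If the column is split or converted before acquisition, the sum/average becomes plain arithmetic on seconds. A rough Python sketch of that idea (the parsing here is illustrative; Lumira's own column handling will differ):

```python
def to_seconds(hms):
    """Parse an 'HH:MM:SS' duration string into total seconds."""
    h, m, s = (int(part) for part in hms.split(":"))
    return h * 3600 + m * 60 + s

def to_hms(total):
    """Format total seconds back into 'HH:MM:SS'."""
    h, rem = divmod(total, 3600)
    m, s = divmod(rem, 60)
    return f"{h:02d}:{m:02d}:{s:02d}"

durations = ["00:00:06", "00:01:10"]
seconds = [to_seconds(d) for d in durations]
print(to_hms(sum(seconds)))         # total: 00:01:16
print(sum(seconds) / len(seconds))  # average in seconds: 38.0
```

The key point is to store durations as a numeric (seconds) measure, not as a formatted time-of-day value.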
regards,
H -
To calculate the accumulated value
Hello,
I would like to write a calc script for the following scenario: I have a series of data values, and would like to calculate the accumulated value based on the sequence from the smallest to the largest. The case is like this:
The input is as following
Case 1
Acct1: 10
Acct2: 5
Acct3: 8
Acct4: 15
I would like to calculate the accumulated value based on the data value; the calculated result is in column Case 2:
Case 1 Case 2 Rank
Acct1: 10 23 3
Acct2: 5 5 1
Acct3: 8 13 2
Acct4: 15 38 4
I can easily use the rank function to rank the data values of "Case1", but after that how can I use this rank to calculate the accumulated values? The data values may change from time to time, for example.
Any good suggestions? Thanks in advance
Hi There,
Try the script below. I have done this in the SAMPLE app. The script may look crazy, but I have tested it and it works. What makes it a little more complicated is the fact that ranking can give you duplicate rank values, so I have designed this to get around that problem.
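Stated plainly, the target logic is: rank the Case 1 values ascending, then accumulate them in rank order. A Python sketch of that logic (account names from the post; it ignores the duplicate-rank complication the calc script has to handle):

```python
case1 = {"Acct1": 10, "Acct2": 5, "Acct3": 8, "Acct4": 15}

# Rank accounts by their Case 1 value, smallest first.
ordered = sorted(case1, key=case1.get)
rank = {acct: i + 1 for i, acct in enumerate(ordered)}

# Case 2 = running total of Case 1 values taken in rank order.
case2 = {}
running = 0
for acct in ordered:
    running += case1[acct]
    case2[acct] = running

for acct in sorted(case1):
    print(acct, case1[acct], case2[acct], rank[acct])
```

This reproduces the question's table: Acct2 gets 5 (rank 1), Acct3 gets 13 (rank 2), Acct1 gets 23 (rank 3), Acct4 gets 38 (rank 4).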
Thanks
Anthony
SET UPDATECALC OFF;
VAR AccountCount=0;
VAR CurrRank=0;
/*Create blocks in case dimension as mine is sparse*/
FIX(Local,FY13,BegBalance,Forecast,Working,"E01_101_1110",@RELATIVE("TotaltestAccount",0),"HSP_InputValue")
DATACOPY "Case1" TO "Rank";
DATACOPY "Case1" TO "Case2";
DATACOPY "Case1" TO "Sequence";
DATACOPY "Case1" TO "Case1Seq";
ENDFIX
FIX(Local,FY13,BegBalance,Forecast,Working,"E01_101_1110",@RELATIVE("TotaltestAccount",0),"HSP_InputValue")
"Case2" = #missing;
"Sequence"= #missing;
/*Create a sequence for the list of accounts so that if you for any chance have a value that is the same the rank will give a unique rank value - because this becomes a problem later*/
"Sequence"= (@SHIFT("Sequence",-1,"Acc1":"Acc5") + 0.00000000001);
/*Rank case1 and add the sequence to give you the unique ranking and then * -1 to reverse the rank*/
"Case1Seq" = "Sequence" + @RANK(SKIPNONE,"Case1",@RANGE("Case1",@RELATIVE("TotaltestAccount",0))) * -1;
ENDFIX
FIX(Local,FY13,Forecast,Working,"E01_101_1110",@RELATIVE("TotaltestAccount",0),"HSP_InputValue")
/*Next rank the previous rank from above to now populate the unique ranking of case 1 in dec order*/
"Rank"(
IF(@ISMBR("BegBalance"))
"Rank" = @RANK(SKIPNONE,"Case1Seq",@RANGE("Case1seq",@RELATIVE("TotaltestAccount",0)));
ENDIF)
/*Populate the max ranking value to start the variable counter*/
AccountCount = @MAXSRANGE(SKIPMISSING,"Rank"->"BegBalance",@RELATIVE("TotalTestAccount",0));
CurrRank = AccountCount;
ENDFIX
FIX(Local,FY13,"Case2",Forecast,Working,"E01_101_1110",@RELATIVE("TotaltestAccount",0),"HSP_InputValue")
/*Calculate the lowest case 1 value first or highest rank value, then make curr rank equal to highest rank -1 */
"BegBalance"(
IF(AccountCount == "Rank")
"Begbalance" = "BegBalance"->"Case1" ;
CurrRank = CurrRank - 1;
ENDIF)
/*Then loop through the process reducing the current rank variable by 1 each time*/
LOOP(100)
"BegBalance"(
IF(CurrRank == "Rank")/* i.e rank ==4 */
"Begbalance" = @MAXSRANGE(SKIPMISSING,"Case2"->"BegBalance",@RELATIVE("TotalTestAccount",0)) +"BegBalance"->"Case1";
CurrRank = CurrRank - 1;/*then make rank == 3*/
ENDIF)
ENDLOOP;
ENDFIX -
Problem getting an accumulated value by a user-defined week
I have a requirement where I have to accumulate values by week, where a week is defined as Sunday to Saturday. For example:
date value acc_value
9/1/2010 2 2 Wed
9/2/2010 5 7 Thur
9/3/2010 3 10 Fri
9/4/2010 4 14 Sat
9/5/2010 8 8 Sun value is reset
9/6/2010 2 10 Mon
9/7/2010 1 11 Tue
9/8/2010 4 15 Wed
9/9/2010 7 22 Thu
9/10/2010 4 26 Fri
9/11/2010 5 31 Sat
Any help would be appreciated.
Thanks.
Try this:
with my_table as (select to_date('01/09/2010', 'dd/mm/yyyy') dt, 2 value from dual union all
select to_date('02/09/2010', 'dd/mm/yyyy') dt, 5 value from dual union all
select to_date('03/09/2010', 'dd/mm/yyyy') dt, 3 value from dual union all
select to_date('04/09/2010', 'dd/mm/yyyy') dt, 4 value from dual union all
select to_date('05/09/2010', 'dd/mm/yyyy') dt, 8 value from dual union all
select to_date('06/09/2010', 'dd/mm/yyyy') dt, 2 value from dual union all
select to_date('07/09/2010', 'dd/mm/yyyy') dt, 1 value from dual union all
select to_date('08/09/2010', 'dd/mm/yyyy') dt, 4 value from dual union all
select to_date('09/09/2010', 'dd/mm/yyyy') dt, 7 value from dual union all
select to_date('10/09/2010', 'dd/mm/yyyy') dt, 4 value from dual union all
select to_date('11/09/2010', 'dd/mm/yyyy') dt, 5 value from dual)
-- end of mimicking your data in a table called my_table
select dt,
value,
sum(value) over (partition by trunc(dt+1, 'iw') order by dt) acc_value,
to_char(dt, 'dy') dy
from my_table
order by dt;
DT VALUE ACC_VALUE DY
01/09/2010 2 2 wed
02/09/2010 5 7 thu
03/09/2010 3 10 fri
04/09/2010 4 14 sat
05/09/2010 8 8 sun
06/09/2010 2 10 mon
07/09/2010 1 11 tue
08/09/2010 4 15 wed
09/09/2010 7 22 thu
10/09/2010 4 26 fri
11/09/2010 5 31 sat -
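The trunc(dt+1, 'iw') partition is the trick here: it shifts the ISO week (which starts on Monday) so that each group starts on Sunday. The same grouping and running sum can be sketched in Python (dates and values from the post):

```python
from datetime import date, timedelta

# 9/1/2010 (Wed) through 9/11/2010 (Sat), values from the question.
rows = [(date(2010, 9, d), v)
        for d, v in zip(range(1, 12), [2, 5, 3, 4, 8, 2, 1, 4, 7, 4, 5])]

def week_start_sunday(d):
    # weekday(): Monday=0 .. Sunday=6; step back to the most recent Sunday.
    return d - timedelta(days=(d.weekday() + 1) % 7)

acc = []
running, current_week = 0, None
for d, v in rows:
    wk = week_start_sunday(d)
    if wk != current_week:          # new Sunday-Saturday week: reset
        running, current_week = 0, wk
    running += v
    acc.append(running)

print(acc)  # [2, 7, 10, 14, 8, 10, 11, 15, 22, 26, 31]
```

Note how the running total resets at 9/5/2010, the Sunday, exactly as in the expected output.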
Overwrite the system-calculated VAT condition value with the inbound VAT amount
Hello experts:
We have a following requirement in my customer for Inbound VAT Processing.
We want to overwrite the system calculated VAT - Condition Value with Inbound VAT Amount
Details :
The aggregated sales IDoc (WPUUMS01) is used here for posting the daily sales.
The inbound pricing procedure is ZPOS00 (a copy of POS000).
VAT is maintained with condition type ZMWS, using tax codes.
When the IDoc gets posted, the system calculates ZMWS, and this works fine.
However, we would like to overwrite this condition value with the inbound condition value (from segment 5 of WPUUMS).
Example :
For Article A : inbound aggregated sales Amount= 100 INR and ZMWS is 10%( VAT code B1)
Then the system will calculate 90 INR revenue and 10 INR as VAT.
However, due to rounding in the POS system, we may get a VAT amount of, for example, 9.99 from POS.
Tisak wants to overwrite the condition value of 10 INR with 9.99 INR.
I would appreciate it if anybody could suggest an approach or share experience if you have come across the same scenario (tax condition type).
Thanks and regards,
Hi,
Either you can use rounding in POS such that it rounds 9.99 up to 10,
or in the SAP pricing procedure you can use alternative calculation type 16 or 17 (17 should be used with the decimal places assigned to the currency for rounding, like 2 decimals for INR);
doing this will also round the figure from 9.99 to 10 in SAP.
Hope this solves your query.
Regards,
Ashutosh -
Select just the values between min and max of an accumulated value over day
Hello Forum,
a value is accumulated over a day, over a period of time. The next day the value is reset and starts to be accumulated again:
with sampledata as (select to_date('09.09.2012 00:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
select to_date('09.09.2012 01:03:08', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
select to_date('09.09.2012 02:54:11', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
select to_date('09.09.2012 03:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
select to_date('09.09.2012 04:04:19', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
select to_date('09.09.2012 05:04:20', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
select to_date('09.09.2012 06:12:02', 'dd.mm.yyyy hh24:mi:ss') ts, 23 val from dual union all
select to_date('09.09.2012 07:12:03', 'dd.mm.yyyy hh24:mi:ss') ts, 29 val from dual union all
select to_date('09.09.2012 08:12:04', 'dd.mm.yyyy hh24:mi:ss') ts, 30 val from dual union all
select to_date('09.09.2012 09:12:11', 'dd.mm.yyyy hh24:mi:ss') ts, 45 val from dual union all
select to_date('09.09.2012 10:12:12', 'dd.mm.yyyy hh24:mi:ss') ts, 60 val from dual union all
select to_date('09.09.2012 11:12:13', 'dd.mm.yyyy hh24:mi:ss') ts, 75 val from dual union all
select to_date('09.09.2012 12:21:24', 'dd.mm.yyyy hh24:mi:ss') ts, 95 val from dual union all
select to_date('09.09.2012 13:21:26', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
select to_date('09.09.2012 14:21:27', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('09.09.2012 15:21:30', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('09.09.2012 16:21:32', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('09.09.2012 17:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('09.09.2012 21:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('09.09.2012 23:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('10.09.2012 00:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('10.09.2012 01:03:08', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('10.09.2012 02:54:11', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('10.09.2012 03:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('10.09.2012 04:04:19', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('10.09.2012 05:04:20', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('10.09.2012 06:12:02', 'dd.mm.yyyy hh24:mi:ss') ts, 14 val from dual union all
select to_date('10.09.2012 07:12:03', 'dd.mm.yyyy hh24:mi:ss') ts, 34 val from dual union all
select to_date('10.09.2012 08:12:04', 'dd.mm.yyyy hh24:mi:ss') ts, 58 val from dual union all
select to_date('10.09.2012 09:12:11', 'dd.mm.yyyy hh24:mi:ss') ts, 70 val from dual union all
select to_date('10.09.2012 10:12:12', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
select to_date('10.09.2012 11:12:13', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('10.09.2012 12:21:24', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
select to_date('10.09.2012 13:21:26', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
select to_date('10.09.2012 14:21:27', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
select to_date('10.09.2012 15:21:30', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
select to_date('10.09.2012 16:21:32', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
select to_date('10.09.2012 21:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual)
select ts, val
from sampledata
order by ts asc;
How should I change the select statement to skip all rows before the first minimum, and the duplicates after the maximum, of each day in order to get a result like this:
TS VAL
09.09.12 06:12 23
09.09.12 07:12 29
09.09.12 08:12 30
09.09.12 09:12 45
09.09.12 10:12 60
09.09.12 11:12 75
09.09.12 12:21 95
09.09.12 13:21 120
09.09.12 14:21 142
10.09.12 06:12 14
10.09.12 07:12 34
10.09.12 08:12 58
10.09.12 09:12 70
10.09.12 10:12 120
10.09.12 11:12 142
10.09.12 12:21 153
Thank you
This solution works perfectly when the accumulated value has its low and its high on the same day. But I found out :( that there is also data which has its low yesterday and its high today. For a better understanding of the case: there is a machine which works over 3 shifts with irregular start and end times. For example, shift 1 can start at 5:50 or at 7:15. The accumulated value of the worked time is accumulated separately for each shift. The solution works for shift 1 (approximately 06:00-14:00) and for shift 2 (approximately 14:00-22:00), because there the low and the high of the accumulated value fall on the same day. It does not work for shift 3 (approximately 22:00-06:00), because the high of the accumulated value is, or can be, on the next day.
So the thread title should be: "Select just the values between min and max of an accumulated value over the same day (today) or over two successive days (yesterday and today)".
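Stated procedurally, the per-day requirement is: within each day, start at the first minimum and keep only strictly increasing values. This Python sketch (using a subset of the 09.09 rows from the thread) shows that logic; like the original SQL, it deliberately does not handle the shift-3 case that crosses midnight:

```python
def filter_ramp(rows):
    """rows: list of ((day, time), val) sorted by timestamp.
    Per day: start at the first minimum, keep strictly increasing values."""
    by_day = {}
    for ts, val in rows:
        by_day.setdefault(ts[0], []).append((ts, val))
    kept = []
    for day_rows in by_day.values():
        vals = [v for _, v in day_rows]
        start = vals.index(min(vals))       # position of the day's first minimum
        last = None
        for ts, val in day_rows[start:]:
            if last is None or val > last:  # drop the flat repeats after the max
                kept.append((ts, val))
                last = val
    return kept

rows = [(("09.09", t), v) for t, v in
        [("00:04", 120), ("05:04", 120), ("06:12", 23), ("07:12", 29),
         ("08:12", 30), ("13:21", 120), ("14:21", 142), ("17:21", 142)]]
print([v for _, v in filter_ramp(rows)])  # [23, 29, 30, 120, 142]
```

Handling the cross-midnight shift would require partitioning by shift window rather than by calendar day, which is exactly the gap described above.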
Sampledata for shift 1 or shift 2:
{code}
with sampledata as (select to_date('09.09.2012 00:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
select to_date('09.09.2012 01:03:08', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
select to_date('09.09.2012 02:54:11', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
select to_date('09.09.2012 03:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
select to_date('09.09.2012 04:04:19', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
select to_date('09.09.2012 05:04:20', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
select to_date('09.09.2012 06:12:02', 'dd.mm.yyyy hh24:mi:ss') ts, 23 val from dual union all
select to_date('09.09.2012 07:12:03', 'dd.mm.yyyy hh24:mi:ss') ts, 29 val from dual union all
select to_date('09.09.2012 08:12:04', 'dd.mm.yyyy hh24:mi:ss') ts, 30 val from dual union all
select to_date('09.09.2012 09:12:11', 'dd.mm.yyyy hh24:mi:ss') ts, 45 val from dual union all
select to_date('09.09.2012 10:12:12', 'dd.mm.yyyy hh24:mi:ss') ts, 60 val from dual union all
select to_date('09.09.2012 11:12:13', 'dd.mm.yyyy hh24:mi:ss') ts, 75 val from dual union all
select to_date('09.09.2012 12:21:24', 'dd.mm.yyyy hh24:mi:ss') ts, 95 val from dual union all
select to_date('09.09.2012 13:21:26', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
select to_date('09.09.2012 14:21:27', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('09.09.2012 15:21:30', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('09.09.2012 16:21:32', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('09.09.2012 17:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('09.09.2012 21:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('09.09.2012 23:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('10.09.2012 00:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('10.09.2012 01:03:08', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('10.09.2012 02:54:11', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('10.09.2012 03:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('10.09.2012 04:04:19', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('10.09.2012 05:04:20', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('10.09.2012 06:12:02', 'dd.mm.yyyy hh24:mi:ss') ts, 143 val from dual union all
select to_date('10.09.2012 07:12:03', 'dd.mm.yyyy hh24:mi:ss') ts, 144 val from dual union all
select to_date('10.09.2012 08:12:04', 'dd.mm.yyyy hh24:mi:ss') ts, 145 val from dual union all
select to_date('10.09.2012 09:12:11', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
select to_date('10.09.2012 10:12:12', 'dd.mm.yyyy hh24:mi:ss') ts, 147 val from dual union all
select to_date('10.09.2012 11:12:13', 'dd.mm.yyyy hh24:mi:ss') ts, 148 val from dual union all
select to_date('10.09.2012 12:21:24', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
select to_date('10.09.2012 13:21:26', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
select to_date('10.09.2012 14:21:27', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
select to_date('10.09.2012 15:21:30', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
select to_date('10.09.2012 16:21:32', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
select to_date('10.09.2012 21:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual)
, got_analytics AS
(
SELECT ts, val
, MIN (val) OVER ( PARTITION BY TRUNC (ts)
                   ORDER BY ts DESC
                 ) AS min_val_after
, CASE
      WHEN ROW_NUMBER () OVER ( PARTITION BY TRUNC (ts)
                                ORDER BY val
                                , ts
                              ) = 1
      THEN -1 -- Impossibly low val
      ELSE LAG (val) OVER ( PARTITION BY TRUNC (ts)
                            ORDER BY ts
                          )
  END AS prev_val
, MIN (val) OVER (PARTITION BY TRUNC (ts))
  AS low_val_today
, NVL ( LAST_VALUE (val) OVER ( ORDER BY ts
                                RANGE BETWEEN UNBOUNDED PRECEDING
                                          AND ts - TRUNC (ts) PRECEDING
                              )
      , -1
      ) AS last_val_yesterday
FROM sampledata
)
SELECT ts
, val
FROM got_analytics
WHERE val <= min_val_after
AND val > prev_val
AND ( val > low_val_today
   OR val != last_val_yesterday
    )
ORDER BY ts
{code}
with the expected results:
{code}
1 09.09.2012 06:12:02 23
2 09.09.2012 07:12:03 29
3 09.09.2012 08:12:04 30
4 09.09.2012 09:12:11 45
5 09.09.2012 10:12:12 60
6 09.09.2012 11:12:13 75
7 09.09.2012 12:21:24 95
8 09.09.2012 13:21:26 120
9 09.09.2012 14:21:27 142
10 10.09.2012 06:12:02 143
11 10.09.2012 07:12:03 144
12 10.09.2012 08:12:04 145
13 10.09.2012 09:12:11 146
14 10.09.2012 10:12:12 147
15 10.09.2012 11:12:13 148
16 10.09.2012 12:21:24 153
{code}
And the sampledata for shift 3 is:
{code}
with sampledata as (select to_date('08.09.2012 00:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 23 val from dual union ALL
select to_date('08.09.2012 02:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 45 val from dual union all
select to_date('08.09.2012 05:03:08', 'dd.mm.yyyy hh24:mi:ss') ts, 78 val from dual union all
select to_date('08.09.2012 06:54:11', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
select to_date('08.09.2012 08:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
select to_date('08.09.2012 10:04:19', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
select to_date('08.09.2012 12:04:20', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
select to_date('08.09.2012 16:12:02', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
select to_date('08.09.2012 17:12:03', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
select to_date('08.09.2012 19:12:04', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
select to_date('08.09.2012 21:12:11', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
select to_date('08.09.2012 22:00:12', 'dd.mm.yyyy hh24:mi:ss') ts, 24 val from dual union all
select to_date('08.09.2012 22:12:13', 'dd.mm.yyyy hh24:mi:ss') ts, 40 val from dual union all
select to_date('08.09.2012 23:21:24', 'dd.mm.yyyy hh24:mi:ss') ts, 68 val from dual union all
select to_date('09.09.2012 01:03:08', 'dd.mm.yyyy hh24:mi:ss') ts, 79 val from dual union all
select to_date('09.09.2012 02:54:11', 'dd.mm.yyyy hh24:mi:ss') ts, 124 val from dual union all
select to_date('09.09.2012 03:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 125 val from dual union all
select to_date('09.09.2012 04:04:19', 'dd.mm.yyyy hh24:mi:ss') ts, 126 val from dual union all
select to_date('09.09.2012 05:04:20', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
select to_date('09.09.2012 06:12:02', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
select to_date('09.09.2012 07:12:03', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
select to_date('09.09.2012 08:12:04', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
select to_date('09.09.2012 09:12:11', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
select to_date('09.09.2012 10:12:12', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
select to_date('09.09.2012 11:12:13', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
select to_date('09.09.2012 12:21:24', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
select to_date('09.09.2012 13:21:26', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
select to_date('09.09.2012 14:21:27', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
select to_date('09.09.2012 15:21:30', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
select to_date('09.09.2012 16:21:32', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
select to_date('09.09.2012 17:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
select to_date('09.09.2012 21:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union ALL
select to_date('09.09.2012 22:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 5 val from dual union ALL
select to_date('09.09.2012 22:51:33', 'dd.mm.yyyy hh24:mi:ss') ts, 23 val from dual union all
select to_date('09.09.2012 23:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 40 val from dual union all
select to_date('10.09.2012 00:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 50 val from dual union all
select to_date('10.09.2012 01:03:08', 'dd.mm.yyyy hh24:mi:ss') ts, 60 val from dual union all
select to_date('10.09.2012 02:54:11', 'dd.mm.yyyy hh24:mi:ss') ts, 78 val from dual union all
select to_date('10.09.2012 03:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('10.09.2012 04:04:19', 'dd.mm.yyyy hh24:mi:ss') ts, 145 val from dual union all
select to_date('10.09.2012 05:04:20', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
select to_date('10.09.2012 06:12:02', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
select to_date('10.09.2012 07:12:03', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
select to_date('10.09.2012 08:12:04', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
select to_date('10.09.2012 09:12:11', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
select to_date('10.09.2012 10:12:12', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
select to_date('10.09.2012 11:12:13', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
select to_date('10.09.2012 12:21:24', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
select to_date('10.09.2012 13:21:26', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
select to_date('10.09.2012 14:21:27', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
select to_date('10.09.2012 15:21:30', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
select to_date('10.09.2012 16:21:32', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
select to_date('10.09.2012 21:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual)
, got_analytics AS
(
SELECT ts, val
, MIN (val) OVER ( PARTITION BY TRUNC (ts)
                   ORDER BY ts DESC
                 ) AS min_val_after
, CASE
      WHEN ROW_NUMBER () OVER ( PARTITION BY TRUNC (ts)
                                ORDER BY val
                                , ts
                              ) = 1
      THEN -1 -- Impossibly low val
      ELSE LAG (val) OVER ( PARTITION BY TRUNC (ts)
                            ORDER BY ts
                          )
  END AS prev_val
, MIN (val) OVER (PARTITION BY TRUNC (ts))
  AS low_val_today
, NVL ( LAST_VALUE (val) OVER ( ORDER BY ts
                                RANGE BETWEEN UNBOUNDED PRECEDING
                                          AND ts - TRUNC (ts) PRECEDING
                              )
      , -1
      ) AS last_val_yesterday
FROM sampledata
)
SELECT ts
, val
FROM got_analytics
WHERE val <= min_val_after
AND val > prev_val
AND ( val > low_val_today
   OR val != last_val_yesterday
    )
ORDER BY ts
{code}
with the unexpected results:
{code}
- ts val
1 08.09.2012 00:04:08 23
2 08.09.2012 22:12:13 40
3 08.09.2012 23:21:24 68
4 09.09.2012 22:21:33 5
5 09.09.2012 22:51:33 23
6 09.09.2012 23:21:33 40
7 10.09.2012 00:04:08 50
8 10.09.2012 01:03:08 60
9 10.09.2012 02:54:11 78
10 10.09.2012 03:04:08 142
11 10.09.2012 04:04:19 145
12 10.09.2012 05:04:20 146
{code}
The result should be:
{code}
- ts val
1 08.09.2012 00:04:08 23
2 08.09.2012 02:04:08 45
3 08.09.2012 05:03:08 78
4 08.09.2012 06:54:11 90
5 08.09.2012 22:00:12 24
6 08.09.2012 22:12:13 40
7 08.09.2012 23:21:24 68
8 09.09.2012 01:03:08 79
9 09.09.2012 02:54:11 124
10 09.09.2012 03:04:08 125
11 09.09.2012 04:04:19 126
12 09.09.2012 05:04:20 127
13 09.09.2012 22:21:33 5
14 09.09.2012 22:51:33 23
15 09.09.2012 23:21:33 40
16 10.09.2012 00:04:08 50
17 10.09.2012 01:03:08 60
18 10.09.2012 02:54:11 78
19 10.09.2012 03:04:08 142
20 10.09.2012 04:04:19 145
21 10.09.2012 05:04:20 146
{code}
Thank you for your help! -
Calculating Accumulative Value for a particular period
Hi,
I want to calculate accumulated values based on 0CALMONTH for a key figure.
In the rows I want 0CALMONTH, and a key figure in the columns. If we set the property of the key figure to "Cumulative", it adds values up: in the first month it shows the first month's value, in the 2nd month the sum of the 1st and 2nd months. But I use an interval variable on 0CALMONTH (e.g. 03.2006 to 09.2006), and it displays cumulative values only from the 3rd month: in the 4th month it shows the value for the 3rd and 4th months. I want to see the values as "accumulative", meaning from the start of that year. Even though I entered the period 03.2006 to 09.2006, it should display the value for the 3rd month as the sum of the 1st, 2nd and 3rd months, and so on up to the last month of the given period.
Please can any one suggest me....
Thanks and Regards
Rajesh
Message was edited by:
rajesh
Hi,
For my above problem I am using the code below. It has no syntax errors, but when displaying in the web browser it does not get any values.
{code}
DATA: L_S_RANGE1 TYPE RSR_S_RANGESID.
DATA: LOC_VAR_RANGE1 LIKE RRRANGEEXIT.
DATA: L_VALUE LIKE RRRANGEEXIT-HIGH.

CASE I_VNAM.
  WHEN 'ZCUM_INTERVAL'.
    IF I_STEP = 2.
      LOOP AT I_T_VAR_RANGE INTO LOC_VAR_RANGE1 WHERE VNAM = '0I_CMNTH'.
        L_VALUE = LOC_VAR_RANGE1-LOW.
*       Use <= (not <), otherwise the last month of the interval is never
*       appended and a single-month interval produces no range at all.
        WHILE L_VALUE+4(2) <= LOC_VAR_RANGE1-HIGH+4(2).
          IF SY-INDEX > 1.
            L_VALUE+4(2) = L_VALUE+4(2) + 1.
*           Re-pad single-digit months with a leading zero.
            IF STRLEN( L_VALUE+4(2) ) = 1.
              CONCATENATE '0' L_VALUE+4(2) INTO L_VALUE+4(2).
            ENDIF.
          ENDIF.
          CLEAR L_S_RANGE1.
*         Each range runs from month 01 of the year up to L_VALUE.
          L_S_RANGE1-LOW = LOC_VAR_RANGE1-LOW(4).
          L_S_RANGE1-LOW+4(2) = '01'.
          L_S_RANGE1-HIGH = L_VALUE.
          L_S_RANGE1-SIGN = 'I'.
          L_S_RANGE1-OPT = 'BT'.
          APPEND L_S_RANGE1 TO E_T_RANGE.
        ENDWHILE.
      ENDLOOP.
    ENDIF.
ENDCASE.
{code}
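For reference, what the exit is trying to build, one BT range per month anchored at month 01 of the year, can be sketched like this (illustrative Python, same-year intervals assumed):

```python
def cumulative_ranges(low, high):
    """low/high: 'YYYYMM' strings within the same year.
    Returns one (from, to) pair per month in [low, high], each pair
    starting at month 01, i.e. the year-to-date interval per month."""
    year = low[:4]
    return [(year + '01', f'{year}{m:02d}')
            for m in range(int(low[4:]), int(high[4:]) + 1)]

print(cumulative_ranges('200603', '200605'))
# [('200601', '200603'), ('200601', '200604'), ('200601', '200605')]
```

Note the inclusive upper bound: a loop condition equivalent to `<` instead of `<=` would drop the last month, and for a single-month interval would return nothing, which matches the "no values" symptom described above.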
Can anyone please suggest what is wrong here?
Thanks in Advance...
TR
Rajesh -
Pricing Procedure calculation for Net value
Hi Gurus,
I have an issue with the pricing procedure for net price.
My pricing procedure is as follows.
{code}
Step  Cond.type   From  To   Subtotal  Req.type  Cal.type  Basetype
 10   CPLF          0    0      0         0         0
 20   CA            0    0      0         0         0
 30   CPY           0    0      0         0         0
 40   CWE           0    0      0         0         0
 50   CPY1         30   40      0       601         0
 60   PCP           0    0      0         0         0
 70   NCC          50   60      0         0         0         0
 80   Subtotal 1   70             1
 90   VC            0    0      0         0         0
110   FPA           0    0      0       600         0
150   INCR          0    0      0         0         0
160   BINC          0    0      0         0         0
170   PINC          0    0      0         0         0
180   CINC          0    0      0       602         0
190   PRIN          0    0      0         0         0
200   FINC
210   Subtotal 2  200             2       0         0
220   ADT           0    0      0         0         0
230   FTAX          0    0      0       603         0
240   Subtotal 3  210             3       0         0
{code}
My question is about pricing the line items:
1) For line item 1, the net value should be Subtotal 1.
2) For line item 2, the net value should come from a condition type calculation.
3) For line item 3, the net value should come from a calculation routine.
4) For line item 4, the net value should be Subtotal 2.
5) For line item 5, the net value should be Subtotal 3.
Please tell me how to achieve this.
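The 'From'/'To' mechanics behind a subtotal step simply sum the condition values of the referenced step range. A toy sketch of that lookup (step numbers and amounts are made up for illustration):

```python
def step_sum(values_by_step, frm, to):
    """values_by_step: dict of pricing-procedure step number -> condition value.
    Returns the sum over the inclusive step range [frm, to], i.e. what a
    'From'/'To' reference on a later step would pick up."""
    return sum(v for step, v in values_by_step.items() if frm <= step <= to)

# Example: condition at step 70 references steps 50 to 60.
values = {30: 100.0, 40: 20.0, 50: -5.0, 60: 10.0}
print(step_sum(values, 50, 60))
# 5.0
```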
regards
Srinivas

Hi,
Can you please explain what type of pricing you want for your sales order?
I am confused by your line items. Can you explain what kind of line items you are entering in the sales order? Are they proper materials for which you have maintained condition records in MM01?
For every line item the system calculates the price using the pricing procedure you have shown, so it will calculate a different price for each line item.
After pricing all the items it totals their net amounts, and that total net amount is shown in the header data of the sales order.
Regards
Raj. -
Calculation of Perk value for Housing Deposits
Our company offers security deposits for housing (mainly for Mumbai employees) as interest-free loans, recovered in 36 instalments.
The perk value per IT rules must be added to the employee's annual gross salary: notional interest at the standard rate of 14% is calculated and added to the employee's perk.
We want to map the same in our SAP Payroll system. Can anyone suggest how to go about it?
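As a back-of-envelope check, the notional 14% interest on the month-end outstanding balance of such a deposit can be sketched as follows (the figures and the monthly-rest convention are assumptions for illustration, not a statement of the IT rules):

```python
def annual_perk(principal, months=36, rate=0.14, months_in_year=12):
    """Notional interest for one year on an interest-free deposit of
    `principal`, recovered in `months` equal instalments, valued at
    `rate` p.a. on the month-end outstanding balance."""
    instalment = principal / months
    perk = 0.0
    balance = principal
    for _ in range(months_in_year):
        balance -= instalment            # recovery posted this month
        perk += balance * rate / 12      # one month's notional interest
    return round(perk, 2)

print(annual_perk(360000))
# 41300.0  (first year of a 360,000 deposit, 10,000/month recovery)
```

The resulting annual amount is what would be moved to the annual-perk wage type in payroll.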
With Rgds.,
Rakesh Kumar
[email protected]Hi,
You woudl have to create a payroll function r PCR and add it
in the schema before the tax calculations part.
here move the amount to /127 annual perk.
I think this should sole it.
Points if helpfull.
Regards,
Sandeep. -
Calculation of depreciation value
I have to calculate depreciation on an asset on a pro-rata basis in the acquisition year and then at a rate of 20% over the next five years. I have configured a depreciation key for this as follows:
1. Multi-level method - calculated from the capitalization date, and the rate maintained in the levels is 20%
2. Base method - ordinary depreciation with a stated percentage
The system correctly calculates the pro-rata depreciation in the acquisition year and the following years, except for the last year, where it does not calculate the value as required: it shows a pro-rata value, whereas in the last year it should write off the entire remaining value. The asset acquisition value is Rs 500,000 and the acquisition date is 21.11.2006.
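The expected behaviour, pro-rata in the acquisition year, full 20% charges in between, and the entire remaining net book value in the final year, can be sketched as a toy schedule (illustrative only, not AA configuration):

```python
def schedule(cost, rate, acq_month):
    """acq_month: calendar month of capitalisation (1-12); depreciation
    runs pro-rata from that month in the acquisition year, then at the
    full annual rate, and the final year absorbs the remaining NBV."""
    annual = cost * rate
    months_first = 13 - acq_month          # months depreciated in year 1
    rows = []
    nbv = cost
    dep = annual * months_first / 12.0     # pro-rata first-year charge
    while nbv > 0.01:
        dep = min(dep, nbv)                # final year: take remaining NBV
        rows.append(round(dep, 2))
        nbv -= dep
        dep = annual                       # full charge from year 2 onward
    return rows

print(schedule(500000, 0.20, 11))
# [16666.67, 100000.0, 100000.0, 100000.0, 100000.0, 83333.33]
```

With a November acquisition, only 2/12 of the annual charge falls in 2006, and the last year takes whatever net book value is left, which is the behaviour the "24 (net book value)" base value setting below is meant to produce.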
Regards
Sanil Bhandari

Sanil,
In the multi-level method, enter 24 (net book value) in the base value field, not 1.
Full points if you are OK.
Thanks,
Sujai C