How do I calculate the average of a group of percentages, excluding percentages listed as zero, in Numbers?
Hey all
This is driving me crazy!
I need to calculate the average of a group of percentages excluding cells listed as '0%'.
Please see the screen print below; the highlighted numbers need to be averaged. When the table is filled out correctly, some cells will remain '0%' while others will have a value; I want the average to be calculated using only the numbers with a value greater than '0'.
FYI I am looking for the result to display in 'X2'
Thanks :-)
You can use the AVERAGEIF() function. Here is a simple example to get you started:
In this example the AVERAGEIF() function averages the values that are non-zero:
C2=AVERAGEIF(A1:E1,"<>0",A1:E1)
This is shorthand for: select cell C2, then type (or copy and paste from here) the formula:
=AVERAGEIF(A1:E1,"<>0",A1:E1)
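Outside Numbers, the same exclude-zero average can be sketched in Python (note this filters with > 0, matching the question's "greater than 0", whereas the "<>0" condition would also admit negative values):

```python
def average_excluding_zero(values):
    """Average only the values greater than zero, like AVERAGEIF restricted to positives."""
    positive = [v for v in values if v > 0]
    if not positive:
        return 0.0  # no qualifying cells; AVERAGEIF would report a division error instead
    return sum(positive) / len(positive)

# Percentages stored as fractions, the way a spreadsheet holds them.
print(average_excluding_zero([0.25, 0.0, 0.75, 0.0, 0.50]))  # → 0.5
```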
Similar Messages
-
How to calculate the average inventory in ABAP
Dear All,
Please find the formula below. For this formula, how do I calculate the Average Inventory at value? Please let me know the ABAP base tables and the corresponding fields.
Formula
Inventory Turnover = Cost of Goods Sold (COGS) / Average Inventory at value.
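As a purely numeric illustration of the formula (the figures, and the (opening + closing) / 2 convention for average inventory, are assumptions for the example, not from SAP):

```python
def inventory_turnover(cogs, opening_inventory_value, closing_inventory_value):
    """Inventory Turnover = Cost of Goods Sold (COGS) / Average Inventory at value."""
    average_inventory = (opening_inventory_value + closing_inventory_value) / 2
    return cogs / average_inventory

# Invented figures: COGS 120,000; inventory value moves from 18,000 to 22,000.
print(inventory_turnover(120000.0, 18000.0, 22000.0))  # → 6.0
```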
Thanks
Regards,
Sai
Hi Arivazhagan,
Thanks for your quick response .
The field MBEWH from the table fulfills the average inventory at value.
For example, I want to calculate Inventory Turnover = Cost of Goods Sold (COGS) / Average Inventory at value.
So shall I take Inventory Turnover = Cost of Goods Sold (COGS) / MBEWH?
Will the above formula meet my requirement to find the average inventory turnover?
Thanks
Regards,
Sai -
help..!
I want to calculate an average interest rate in Discoverer; the problem is described below:
1. The data has 4 columns: (1) account number, (2) balance, (3) interest rate, (4) product type 1 / product type 2.
2. Average interest rate = sum(account balance * rate / sum(product balance)), where:
a. sum(product balance) is the sum of column 2 over every record of the same product type;
b. each record's balance is divided by sum(product balance) to get that record's weight within its product type;
c. the sum over all records of weight * rate is the rate I want.
3. I need to display the average interest rate in Discoverer, and each product should have one average rate as the user drills up or down the product dimension.
* The product dimension has two levels.
Please tell me how to divide every record by the group-level sum(product balance), or how else to solve this problem.
thanks!
david
Hi David,
Try using the AVG analytic function.
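The arithmetic described in steps a–c is a balance-weighted average rate per product type; as an illustration outside Discoverer (the rows are invented for the example):

```python
from collections import defaultdict

def weighted_rates(rows):
    """rows: (account, balance, rate, product). Returns product -> balance-weighted average rate,
    i.e. sum(balance * rate) / sum(balance) per product."""
    weighted = defaultdict(float)  # sum of balance * rate per product
    totals = defaultdict(float)    # sum of balance per product
    for account, balance, rate, product in rows:
        weighted[product] += balance * rate
        totals[product] += balance
    return {p: weighted[p] / totals[p] for p in totals}

rows = [("A1", 100.0, 0.05, "type1"),
        ("A2", 300.0, 0.03, "type1"),
        ("A3", 200.0, 0.04, "type2")]
print(weighted_rates(rows))  # type1: (100*0.05 + 300*0.03) / 400 = 0.035
```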
Best wishes
Michael -
How to compute the average six months of customer orders, excluding the first orders
Hi all
I am writing some basic SQL to generate some measures from a Postgres database table that are going to be inserted into a SQL Server table.
I have two tasks where I am supposed to generate the average 6-month spend and average 1-year spend using the customer data, but excluding first-time orders.
I have some sample data below:
CREATE TABLE orders (
persistent_key_str character varying,
ord_id character varying(50),
ord_submitted_date date,
item_sku_id character varying(50),
item_extended_actual_price_amt numeric(18,2)
);
INSERT INTO orders VALUES ('01120736182','ORD6266073','2010-12-08','100856-01',39.90);
INSERT INTO orders VALUES('01120736182','ORD33997609','2011-11-23','100265-01',49.99);
INSERT INTO orders VALUES('01120736182','ORD33997609','2011-11-23','200020-01',29.99);
INSERT INTO orders VALUES('01120736182','ORD33997609','2011-11-23','100817-01',44.99);
INSERT INTO orders VALUES('01120736182','ORD89267964','2012-12-05','200251-01',79.99);
INSERT INTO orders VALUES('01120736182','ORD89267964','2012-12-05','200269-01',59.99);
INSERT INTO orders VALUES('01011679971','ORD89332495','2012-12-05','200102-01',169.99);
INSERT INTO orders VALUES('01120736182','ORD89267964','2012-12-05','100907-01',89.99);
INSERT INTO orders VALUES('01120736182','ORD89267964','2012-12-05','200840-01',129.99);
INSERT INTO orders VALUES('01120736182','ORD125155068','2013-07-27','201443-01',199.99);
INSERT INTO orders VALUES('01120736182','ORD167230815','2014-06-05','200141-01',59.99);
INSERT INTO orders VALUES('01011679971','ORD174927624','2014-08-16','201395-01',89.99);
INSERT into orders values('01000217334','ORD92524479','2012-12-20','200021-01',29.99);
INSERT into orders values('01000217334','ORD95698491','2013-01-08','200021-01',19.99);
INSERT into orders values('01000217334','ORD90683621','2012-12-12','200021-01',29.990);
INSERT into orders values('01000217334','ORD92524479','2012-12-20','200560-01',29.99);
INSERT into orders values('01000217334','ORD145035525','2013-12-09','200972-01',49.99);
INSERT into orders values('01000217334','ORD145035525','2013-12-09','100436-01',39.99);
INSERT into orders values('01000217334','ORD90683374','2012-12-12','200284-01',39.99);
INSERT into orders values('01000217334','ORD139437285','2013-11-07','201794-01',134.99);
INSERT into orders values('01000827006','W02238550001','2010-06-11','HL 101077',349.000);
INSERT into orders values('01000827006','W01738200001','2009-12-10','EL 100310 BLK',119.96);
INSERT into orders values('01000954259','P00444170001','2009-12-03','PC 100455 BRN',389.99);
INSERT into orders values('01002319116','W02242430001','2010-06-12','TR 100966',35.99);
INSERT into orders values('01002319116','W02242430002','2010-06-12','EL 100985',99.99);
INSERT into orders values('01002319116','P00532470001','2010-05-04','HO 100482',49.99);
Using the data, this is what I have done:
SELECT q.ord_year, avg( item_extended_actual_price_amt )
FROM (
SELECT EXTRACT(YEAR FROM ord_submitted_date) as ord_year, persistent_key_str,
min(ord_submitted_date) as first_order_date
FROM ORDERS
GROUP BY ord_year, persistent_key_str
) q
JOIN ORDERS o
ON q.persistent_key_str = o.persistent_key_str and
q.ord_year = EXTRACT (year from o.ord_submitted_date) and
o.ord_submitted_date > q.first_order_date AND o.ord_submitted_date < q.first_order_date + INTERVAL ' 6 months'
GROUP BY q.ord_year
ORDER BY q.ord_year
Can someone look into my query and see whether I am doing it the right way?
Thanks,
Ion
This looks more like MySQL code. Please try posting the question in the MySQL forums:
http://forums.mysql.com/
This is one way of doing this in T-SQL
CREATE TABLE orders (
persistent_key_str varchar(100),
ord_id varchar(50),
ord_submitted_date date,
item_sku_id varchar(50),
item_extended_actual_price_amt numeric(18,2)
);
INSERT INTO orders VALUES ('01120736182','ORD6266073','2010-12-08','100856-01',39.90);
INSERT INTO orders VALUES('01120736182','ORD33997609','2011-11-23','100265-01',49.99);
INSERT INTO orders VALUES('01120736182','ORD33997609','2011-11-23','200020-01',29.99);
INSERT INTO orders VALUES('01120736182','ORD33997609','2011-11-23','100817-01',44.99);
INSERT INTO orders VALUES('01120736182','ORD89267964','2012-12-05','200251-01',79.99);
INSERT INTO orders VALUES('01120736182','ORD89267964','2012-12-05','200269-01',59.99);
INSERT INTO orders VALUES('01011679971','ORD89332495','2012-12-05','200102-01',169.99);
INSERT INTO orders VALUES('01120736182','ORD89267964','2012-12-05','100907-01',89.99);
INSERT INTO orders VALUES('01120736182','ORD89267964','2012-12-05','200840-01',129.99);
INSERT INTO orders VALUES('01120736182','ORD125155068','2013-07-27','201443-01',199.99);
INSERT INTO orders VALUES('01120736182','ORD167230815','2014-06-05','200141-01',59.99);
INSERT INTO orders VALUES('01011679971','ORD174927624','2014-08-16','201395-01',89.99);
INSERT into orders values('01000217334','ORD92524479','2012-12-20','200021-01',29.99);
INSERT into orders values('01000217334','ORD95698491','2013-01-08','200021-01',19.99);
INSERT into orders values('01000217334','ORD90683621','2012-12-12','200021-01',29.990);
INSERT into orders values('01000217334','ORD92524479','2012-12-20','200560-01',29.99);
INSERT into orders values('01000217334','ORD145035525','2013-12-09','200972-01',49.99);
INSERT into orders values('01000217334','ORD145035525','2013-12-09','100436-01',39.99);
INSERT into orders values('01000217334','ORD90683374','2012-12-12','200284-01',39.99);
INSERT into orders values('01000217334','ORD139437285','2013-11-07','201794-01',134.99);
INSERT into orders values('01000827006','W02238550001','2010-06-11','HL 101077',349.000);
INSERT into orders values('01000827006','W01738200001','2009-12-10','EL 100310 BLK',119.96);
INSERT into orders values('01000954259','P00444170001','2009-12-03','PC 100455 BRN',389.99);
INSERT into orders values('01002319116','W02242430001','2010-06-12','TR 100966',35.99);
INSERT into orders values('01002319116','W02242430002','2010-06-12','EL 100985',99.99);
INSERT into orders values('01002319116','P00532470001','2010-05-04','HO 100482',49.99);
/*SELECT q.ord_year, avg( item_extended_actual_price_amt )
FROM (
SELECT EXTRACT(YEAR FROM ord_submitted_date) as ord_year, persistent_key_str,
min(ord_submitted_date) as first_order_date
FROM ORDERS
GROUP BY ord_year, persistent_key_str
) q
JOIN ORDERS o
ON q.persistent_key_str = o.persistent_key_str and
q.ord_year = EXTRACT (year from o.ord_submitted_date) and
o.ord_submitted_date > q.first_order_date AND o.ord_submitted_date < q.first_order_date + INTERVAL ' 6 months'
GROUP BY q.ord_year
ORDER BY q.ord_year
*/
with cte as
(select *, row_number() OVER( partition by persistent_key_str order by ord_submitted_date) RN from orders )--where ord_submitted_date between dateadd(month,6,getdate()) and getdate())
SELECT year(cte.ord_submitted_date),avg(cte.item_extended_actual_price_amt) FROM cte
where rn<>1
group by year(cte.ord_submitted_date)
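The same exclude-the-first-order averaging can be sketched in plain Python (customers C1/C2 and amounts are invented). Note one difference from the CTE above: ROW_NUMBER() = 1 excludes only a single line item of a multi-line first order, whereas this sketch skips every line item dated on a customer's first order date:

```python
from collections import defaultdict

def avg_excluding_first_order(rows):
    """rows: (customer, iso_date, amount). Drops every line item of each
    customer's earliest order date, then averages the rest per year."""
    first_date = {}
    for cust, date, amount in rows:
        if cust not in first_date or date < first_date[cust]:
            first_date[cust] = date
    sums, counts = defaultdict(float), defaultdict(int)
    for cust, date, amount in rows:
        if date == first_date[cust]:
            continue  # skip the whole first order, not just one line item
        year = date[:4]  # ISO 'YYYY-MM-DD' strings sort and slice cleanly
        sums[year] += amount
        counts[year] += 1
    return {y: sums[y] / counts[y] for y in sums}

rows = [("C1", "2010-12-08", 40.0),   # C1's first order: excluded
        ("C1", "2011-11-23", 50.0),
        ("C1", "2011-11-23", 30.0),
        ("C2", "2012-12-05", 170.0)]  # C2's only (hence first) order: excluded
print(avg_excluding_first_order(rows))  # → {'2011': 40.0}
```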
Satheesh
My Blog |
How to ask questions in technical forum -
How do I calculate the average of ONLY the populated fields?
I'm working on an Adobe form to streamline the process of summarizing course evaluations. To protect the anonymity of the students, instructors with 10 or fewer students are given a typed summary of the evaluation results. On the first half of the evaluation, the students are asked to rate a series of statements from 1-5. Three of these are highlighted on the summary form with the average response.
I've already set up the form with 10 unique boxes for each question and an 11th box which calculates the average. Each field is named according to the question number and the student's "number" (I arbitrarily gave each student a number so that each field would be unique):
i.e. #1. Questionquestionquestion. Q1S1 Q1S2 Q1S3 Q1S4 Q1S5 Q1S6 Q1S7 Q1S8 Q1S9 Q1S10 1AVG
The problem, of course, is that no matter how many students were actually in the class, the AVG field will always be calculated based on a group of 10...which isn't accurate or fair to the instructor. I thought about just changing the form and removing the unused fields from the equation each time I make up a new summary, but the point of creating a form was to make it as quick and easy as possible to bang these things out... Plus, some of my coworkers might be using the form from time to time and I'd have to explain what they have to do and if they don't have Adobe Acrobat then they can't actually make the changes and blah blah blah...it just gets ridiculous really quickly!
So anyway, I tried reading some other posts for similar questions in an attempt to figure out a custom calculation script for myself, but I just couldn't focus on it.
I was hoping someone could explain how to write a custom calculation script that will omit any fields which are left blank... Or, even better, is anyone willing to write it for me? At least an example for the first question; I could probably figure out how to get all the other ones from there.
Thanks.
In FormCalc the Avg function will calculate the average of only the fields that have a value in them. So you would put, in the calculate event of the average field:
$ = Avg(Q1S1,Q1S2,Q1S3,Q1S4, etc) -
How do I calculate the average of runs that have a variable number of samples?
Hi
I am trying to calculate the average trajectory from, let's say, 4 runs with varying sample sizes.
That is, if the 4 runs have 78, 74, 73, and 55 samples respectively, how do I calculate the average of (78+74+73+55)/4, (78+74+73)/3, (78+74)/2 and 78/1?
Thank you
I'm not sure you gave sufficient information. Assuming that you just want to calculate a linear average of the data collected in consecutive runs, you have two simple methods:
1/ concatenate the data before calculating the mean.
Simple, but it can occupy a huge amount of space, since all the data has to be stored in memory.
2/ store only the means Mi and the corresponding number of data points ni and calculate a balanced mean : M = sum (ni x Mi) / sum (ni) )
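Method 2 can be sketched as follows (an illustration of the formula, not LabVIEW code):

```python
def balanced_mean(means, counts):
    """M = sum(n_i * M_i) / sum(n_i): combine per-run means without storing the raw data."""
    total = sum(n * m for n, m in zip(counts, means))
    return total / sum(counts)

# Two runs: 3 samples averaging 2.0, and 1 sample averaging 6.0.
print(balanced_mean([2.0, 6.0], [3, 1]))  # → 3.0
```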
The attached VI illustrates both methods.
Chilly Charly (aka CC)
E-List Master - Kudos glutton - Press the yellow button on the left...
Attachments:
Averages.vi 40 KB -
How to calculate the difference of two totals which are of the same group?
Post Author: mingli
CA Forum: Formula
Hello,
I have an existing report which has a group defined. The group generates two sets of data, and at the footer of each set the total is calculated. Now I need to calculate the difference between the totals. For example:
value
set 1
item 1 100
item 2 200
item 3 300
set 1 total 600
set 2
item 1 200
item 2 20
item 3 300
item 4 40
set 2 total 560
difference 40
My question is: how can I do this? I'm thinking about creating a formula, but the problem is that these two totals have the same field name. Could someone help me out?
Thanks.
Post Author: deejayw
CA Forum: Formula
Hi, I have a similar type of query... my sets only ever contain two entries, but there is an unknown number of sets.
value
set 1
item 1 100
item 2 200
difference (item2 - item1) = 100
set 2
item 1 200
item 2 20
difference (item2 - item1) = -180
Set 3, Set 4, etc. I need to figure out how to calculate the "difference (item2 - item1)" total above. I am really confused by this and need assistance. Many thanks. -
How to calculate the individual sums of multiple columns in a single query
Hello,
Using Oracle 11gR2 on a Windows 7 client. I have a question about calculating sum() over multiple columns and storing the results in a view. Unfortunately I could not post the problem here, as it keeps giving the error "Sorry, this content is not allowed" without saying where or what it is! So I had to post it in the Stack Overflow forum; here is the link: http://stackoverflow.com/questions/16529721/how-to-calculate-the-individual-sums-of-multiple-columns-in-a-single-query-ora
Will appreciate any help or suggestion.
Thanks
Looks like you want a simple group by.
select
yr
, mnth
, region
, sum(handled_package)
, sum(expected_missing_package)
, sum(actual_missing_package)
from test
group by
yr, mnth, region
order by
yr, mnth, region;
I wouldn't recommend storing your data for year / month in 2 columns like that unless you have a really good reason. I would store it as a date column and add a check constraint to ensure that the date is always the first of the month, then format it out as you wish to the client.
CREATE TABLE test (
year_month date,
Region VARCHAR2(50),
CITY VARCHAR2(50),
Handled_Package NUMBER,
Expected_Missing_Package NUMBER,
Actual_Missing_Package NUMBER
);
alter table test add constraint firs_of_month check (year_month = trunc(year_month, 'mm'));
ME_XE?Insert into TEST (year_month, REGION, CITY, HANDLED_PACKAGE, EXPECTED_MISSING_PACKAGE, ACTUAL_MISSING_PACKAGE)
2 Values (to_date('2012-nov-12', 'yyyy-mon-dd'), 'Western', 'San Fransisco', 200, 10, 5);
Insert into TEST (year_month, REGION, CITY, HANDLED_PACKAGE, EXPECTED_MISSING_PACKAGE, ACTUAL_MISSING_PACKAGE)
ERROR at line 1:
ORA-02290: check constraint (TUBBY.FIRS_OF_MONTH) violated
Elapsed: 00:00:00.03
ME_XE?Insert into TEST (year_month, REGION, CITY, HANDLED_PACKAGE, EXPECTED_MISSING_PACKAGE, ACTUAL_MISSING_PACKAGE)
2 Values (to_date('2012-nov-01', 'yyyy-mon-dd'), 'Western', 'San Fransisco', 200, 10, 5);
1 row created.
Elapsed: 00:00:00.01
ME_XE?select
2 to_char(year_month, 'fmYYYY') as year
3 , to_char(year_month, 'fmMonth') as month
4 , Region
5 , CITY
6 , Handled_Package
7 , Expected_Missing_Package
8 , Actual_Missing_Package
9 from test;
YEAR MONTH REGION CITY HANDLED_PACKAGE EXPECTED_MISSING_PACKAGE ACTUAL_MISSING_PACKAGE
2012 November Western San Fransisco 200 10 5
1 row selected.
Elapsed: 00:00:00.01
Then you have a nice and easy validation that ensures your data integrity.
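The suggested GROUP BY can also be tried end-to-end outside Oracle; a sketch with Python's sqlite3 (the rows are invented for illustration; the column names follow the query above):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE test (
    yr INTEGER, mnth INTEGER, region TEXT,
    handled_package INTEGER,
    expected_missing_package INTEGER,
    actual_missing_package INTEGER)""")
con.executemany("INSERT INTO test VALUES (?,?,?,?,?,?)",
                [(2012, 11, "Western", 200, 10, 5),
                 (2012, 11, "Western", 100, 4, 2),
                 (2012, 12, "Eastern", 50, 1, 0)])

# One output row per (yr, mnth, region) group, each metric summed.
rows = con.execute("""SELECT yr, mnth, region,
           SUM(handled_package), SUM(expected_missing_package), SUM(actual_missing_package)
       FROM test
       GROUP BY yr, mnth, region
       ORDER BY yr, mnth, region""").fetchall()
print(rows)  # → [(2012, 11, 'Western', 300, 14, 7), (2012, 12, 'Eastern', 50, 1, 0)]
```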
Cheers, -
How to calculate the final score in Appraisal
Hi experts,
I'm creating a new appraisal form, and I have to create a VC where I'll display the average of a group of VCs from the same evaluation form. This calculation must be automatic, and the appraisee or appraiser will not be allowed to change the value.
I've tried to do that using the available BAdI, but it turns out to never bring any value.
Does anybody know how I can use the standard calculation?
Thanks in advance for any help!
Best regards,
Thais
Hi Srikanth,
Thanks for your reply. I've tried to do what yoú told me, but when I create the new VB they system shows an warning saying that The determination of the average without elements must be restrict. I don't know what it means.
Do you know what can I do or give me more details?
Thanks in advance!
Best wishes,
Thais -
Dear Sir or madam:
I have collected a lot of electroencephalogram data, which looks like a continuous sine wave plus a noise signal. What I want to do now is fit the data with the bilinear model in the attached jpg file. The problem is that the parameters of the model change with the evolution of the signal. Would someone tell me how to calculate the parameters of the bilinear model?
thank you very much
Attachments:
Bilinear_Model.jpg 33 KB
I can point you in the right direction, but probably not answer your question really well. If you have the Pro or Full distribution of LabVIEW, you can use the curve fitting VIs to match any set of data to a theoretical curve. The Curve Fitting Express VI may do all you need (use the nonlinear option). To get the signal evolution, break your data into pieces and process each one. If you need to average a bit, you can use a sliding window of your data for each analysis, moving the window by less than the window width for each analysis.
There are a plethora of curve fitting techniques built into LabVIEW - matrix operations, linear and log linear fits, Levenberg-Marquardt methods, downhill simplex, etc. You will probably need to experiment a bit to get a stable
algorithm for your case. Check your literature for ways other people have done this. There may be an easy, stable method out there.
If you are unfamiliar with curve fitting techniques, or you do not have the Pro or Full versions of LabVIEW, I would recommend "Numerical Recipes in C" by Press et al., published by Cambridge University Press. The chapter on Modeling of Data will get you going. The rest of the book will provide any background you need.
This was a very general answer. If you need something more specific, let me know. Be aware that this type of problem usually requires some trial and error to get right. The answers should be tightly scrutinized before being believed.
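The sliding-window idea above can be sketched like this (NumPy, with a plain linear least-squares fit standing in for the bilinear model, whose parameters would come from a nonlinear solver instead; the window sizes and synthetic signal are invented for illustration):

```python
import numpy as np

def sliding_window_fit(signal, window, step):
    """Fit each window separately so the parameters can evolve with the signal.
    Returns a list of (slope, intercept) per window from a linear least-squares fit."""
    params = []
    x = np.arange(window)
    for start in range(0, len(signal) - window + 1, step):
        chunk = signal[start:start + window]
        slope, intercept = np.polyfit(x, chunk, 1)
        params.append((slope, intercept))
    return params

# Synthetic signal whose behavior changes halfway: rising then falling ramp.
signal = np.concatenate([2.0 * np.arange(50), 100.0 - 1.0 * np.arange(50)])
fits = sliding_window_fit(signal, window=50, step=25)
print(len(fits))  # → 3 windows: rising, mixed, falling
```

Overlapping windows (step smaller than the window width, as the reply suggests) smooth the parameter trajectory at the cost of more fits.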
This account is no longer active. Contact ShadesOfGray for current posts and information. -
Hi Experts,
I have a doubt: I have an internal table, and now I want to calculate the total for every group. How do I do that?
value type
5,439.01 ; ZMP0
509.60 ; ZMP0
4,749.26 ; ZMP0
9,053.95- ZPNL
732.70- ZPNL
66.30- ZPNL
18.10- ZS03
63.90 ; ZS03
According to the type, I need totals for ZMP0, ZPNL, and ZS03 separately.
how to do that?
5,439.01 ; ZMP0
509.60 ; ZMP0
4,749.26 ; ZMP0
ZMP0 total =
9,053.95- ZPNL
732.70- ZPNL
66.30- ZPNL
ZPNL total =
18.10- ZS03
63.90 ; ZS03
ZS03 total =
That is what I need; can anyone help with this?
Regards,
Mohana
Hello,
Use the COLLECT statement.
DATA:
wa LIKE LINE OF itab,
itab_sum LIKE itab.
SORT itab BY key.
LOOP AT itab INTO wa.
COLLECT wa INTO itab_sum.
ENDLOOP.
itab_sum has the same structure as itab, which holds the data to be summarized, and wa has the line structure of itab.
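What COLLECT does here (summing rows that share a key) can be sketched as follows, using the figures from the question (the trailing '-' in the listing is ABAP's sign notation for negative values):

```python
from collections import defaultdict

rows = [(5439.01, "ZMP0"), (509.60, "ZMP0"), (4749.26, "ZMP0"),
        (-9053.95, "ZPNL"), (-732.70, "ZPNL"), (-66.30, "ZPNL"),
        (-18.10, "ZS03"), (63.90, "ZS03")]

totals = defaultdict(float)  # like COLLECT into itab_sum, keyed on type
for value, typ in rows:
    totals[typ] += value

print({t: round(v, 2) for t, v in totals.items()})
# → {'ZMP0': 10697.87, 'ZPNL': -9852.95, 'ZS03': 45.8}
```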
Regards. -
The data I measured changes rapidly, so I want to get the average value of the data
Do not tell me to use Mean.vi; I already know that.
I have an idea: add the data into an array each time, then sum all the data values and divide the result by the number of elements.
But I don't know how to achieve that. Can anyone build a simple VI to show me? Thank you.
I have attached my VI, which uses Mean.vi to calculate the average value; you can delete it and do it your way. Thank you!
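The accumulate-then-divide idea described above sketches as a small running-average helper (outside LabVIEW):

```python
class RunningAverage:
    """Accumulate samples and report the average so far; keeping a running
    sum and count avoids storing the whole array."""
    def __init__(self):
        self.total = 0.0
        self.count = 0

    def add(self, sample):
        self.total += sample
        self.count += 1
        return self.total / self.count  # average of everything seen so far

avg = RunningAverage()
for sample in [4.0, 8.0, 6.0]:
    current = avg.add(sample)
print(current)  # → 6.0
```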
Solved!
Go to Solution.
Attachments:
EN-new.vi 274 KB
Hi, I have a similar issue with averaging. I used Mean.vi from the math functions, but the average is rolling when I run it. I am trying to calculate the average of the data I read into the RT FIFO (which is around 40,000 lines). I got the writing part working; however, I couldn't get the reading part working. I thought I would read the data as a 1-D array, pass it to Mean.vi, and get the result, but it seems the mean only shows the last data in the array.
Can someone help me with this??
Attachments:
FPGA-vi.png 242 KB
RT-vi.png 182 KB
RT-2mod.vi 515 KB -
How to calculate moving average price?
hi,
I need someone to explain to me how to calculate the moving average price.
Hi,
Follow the Link,
Re: moving average price -
Hi,
How to find the average of table row values? It should display in the next row, in LabWindows/CVI.
Please let me know the solution.
There isn't a built-in function to perform calculations on table cells. What you can do is retrieve the table cell values and calculate the average yourself.
To retrieve a bunch of cells in a single instruction you can use GetTableCellRangeVals: the prerequisite for this function to work correctly is that the cells are all included in a Rect structure (in short, a rectangle) and are all of the same data type. See the help for the function for some explanations and a link to an example of its usage. In the Cell Range parameter you can pass the VAL_TABLE_ROW_RANGE macro to retrieve an entire row. See here for details.
Once you have retrieved cell values in an array, you can pass it to Mean function to calculate the average.
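In outline, the flow described (read a row of cells, average them, write the result into the next row) looks like this; plain Python lists stand in for the CVI table and its functions:

```python
def append_row_average(table):
    """table: list of rows (lists of numbers). Averages the last row
    and appends the result as a new single-cell row."""
    row = table[-1]                  # like GetTableCellRangeVals on a row range
    average = sum(row) / len(row)    # like passing the retrieved array to Mean
    table.append([average])          # write the result into the next row
    return average

table = [[10.0, 20.0, 30.0]]
print(append_row_average(table))  # → 20.0
```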
Proud to use LW/CVI from 3.1 on.
My contributions to the Developer Zone Community
If I have helped you, why not giving me a kudos? -
How to calculate the 99th percentile of a number stream ...
Environment:
Oracle 11.2.0.3 EE on Solaris 10.5
I have a stream of numbers (say 1000), I need to calculate the 99th percentile of the distribution of that stream such that when a new number 'n' is introduced I can tell if the new number is above the 99th percentile of the distribution of my stream.
I don't have a good feel for the nature of the distribution if that's important.
I am NOT, I repeat NOT a statistician! :-)
I have read the docs on the different functions available, ntile, percent_rank, percentile_cont, percentile_disc, etc. I have also read many articles referenced via Google.
The examples do not do exactly what I'm attempting and I was unable to get the result I need by trial and error (mostly!).
Any suggestions are most welcome!!!
If you need additional information I'll try to supply what I know.
-gary
Here are the CREATE TABLE and INSERT statements for some sample data:
create table sales(custno number, tran_dt date, amount number);
Here are the INSERTs to load:
INSERT INTO SALES(custno, tran_dt, amount) VALUES (5, TO_DATE('23-MAR-12','DD-MON-RRRR'),3065);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (5, TO_DATE('29-FEB-12','DD-MON-RRRR'),8435);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (5, TO_DATE('29-FEB-12','DD-MON-RRRR'),9712);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (5, TO_DATE('20-MAR-12','DD-MON-RRRR'),1869);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (5, TO_DATE('03-AUG-12','DD-MON-RRRR'),647);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (5, TO_DATE('03-MAR-12','DD-MON-RRRR'),7434);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (5, TO_DATE('23-AUG-12','DD-MON-RRRR'),225);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (5, TO_DATE('01-SEP-12','DD-MON-RRRR'),28);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (5, TO_DATE('10-APR-12','DD-MON-RRRR'),9393);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('19-MAY-12','DD-MON-RRRR'),1010);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('29-FEB-12','DD-MON-RRRR'),2625);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('18-MAR-12','DD-MON-RRRR'),2511);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('15-AUG-12','DD-MON-RRRR'),1156);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('17-AUG-12','DD-MON-RRRR'),5106);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('19-AUG-12','DD-MON-RRRR'),714);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('07-APR-12','DD-MON-RRRR'),6105);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('25-MAY-12','DD-MON-RRRR'),1592);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('14-MAY-12','DD-MON-RRRR'),1880);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('29-JUL-12','DD-MON-RRRR'),8497);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('13-MAY-12','DD-MON-RRRR'),1658);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('06-APR-12','DD-MON-RRRR'),3071);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('04-SEP-12','DD-MON-RRRR'),8277);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('20-MAR-12','DD-MON-RRRR'),3929);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('04-MAR-12','DD-MON-RRRR'),9239);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('24-APR-12','DD-MON-RRRR'),4390);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('12-MAY-12','DD-MON-RRRR'),8362);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('06-APR-12','DD-MON-RRRR'),4157);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('01-SEP-12','DD-MON-RRRR'),9260);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('31-MAR-12','DD-MON-RRRR'),8017);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('20-MAY-12','DD-MON-RRRR'),2420);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('21-JUL-12','DD-MON-RRRR'),5272);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('23-AUG-12','DD-MON-RRRR'),313);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('24-MAY-12','DD-MON-RRRR'),4747);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('25-APR-12','DD-MON-RRRR'),272);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('02-MAY-12','DD-MON-RRRR'),4329);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('28-JUL-12','DD-MON-RRRR'),3149);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('18-MAR-12','DD-MON-RRRR'),1740);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('07-MAR-12','DD-MON-RRRR'),6868);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('02-JUN-12','DD-MON-RRRR'),5661);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('08-MAY-12','DD-MON-RRRR'),6136);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('23-AUG-12','DD-MON-RRRR'),512);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('16-AUG-12','DD-MON-RRRR'),8784);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('31-AUG-12','DD-MON-RRRR'),7300);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('12-MAY-12','DD-MON-RRRR'),9303);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('29-MAR-12','DD-MON-RRRR'),2626);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('14-JUL-12','DD-MON-RRRR'),6365);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('19-JUN-12','DD-MON-RRRR'),7880);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('20-JUN-12','DD-MON-RRRR'),6096);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('05-AUG-12','DD-MON-RRRR'),1980);
INSERT INTO SALES(custno, tran_dt, amount) VALUES (26, TO_DATE('27-MAY-12','DD-MON-RRRR'),3004);
I can now calculate the average sales over the past 6-month (180-day) and 1-month (30-day) periods. I don't need the 6-month average, but I do need the individual sales to get the distribution, and hence the standard deviations and hopefully the 99th-percentile number. I know it's around 3 standard deviations, but the spec I'm working with requires using the 99th percentile. Not my choice! :-)
select custno,
avg(case when tran_dt between sysdate -210 and sysdate -31 then amount else 0 end) sales6mo,
avg(case when tran_dt >= sysdate -30 then amount else 0 end) sales1mo,
stddev(amount) sd,
stddev(amount)*3 sd3
from sales
group by custno;
I want to see if any customers' (I know I only generated 2) 1-month sales are more than the 99th percentile of the 6-month distribution.
If it's easier to use a lower percentile I can use the examples to plug in the 99.
Here is the result of the above query:
CUSTNO SALES6MO SALES1MO SD SD3
5 4506.11111 28.1111111 4142.61146 12427.8344
26 3563.81579 1090.05263 2952.1488 8856.44641
So, from this data I would expect my 99th percentile to be close to the SD3 column. Correct?
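For reference, Oracle can compute this directly with PERCENTILE_CONT(0.99) WITHIN GROUP (ORDER BY amount), which uses linear interpolation, the same default as numpy.percentile. A sanity check outside the database (the amounts below are invented, not the thread's sales data). Note that for a roughly normal distribution the 99th percentile sits near mean + 2.33 * SD, while 3 * SD corresponds to about the 99.87th percentile, so the SD3 column will overshoot somewhat:

```python
import numpy as np

def pct99(amounts):
    """Empirical 99th percentile with linear interpolation,
    like PERCENTILE_CONT(0.99) WITHIN GROUP (ORDER BY amount)."""
    return float(np.percentile(amounts, 99))

# Invented, roughly normal amounts for illustration.
rng = np.random.default_rng(0)
amounts = rng.normal(loc=4000.0, scale=3000.0, size=1000)
threshold = pct99(amounts)

new_number = 15000.0
print(new_number > threshold)  # flag a new number above the 99th percentile
```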
Thanks very much for the assistance!!
-gary
Edited by: garywicke on Sep 5, 2012 9:54 AM