Count(*) with group by max(date)
SQL> select xdesc,xcust,xdate from coba1 order by xdesc,xcust,xdate;
XDESC XCUST XDATE
RUB-A 11026 01-JAN-06
RUB-A 11026 05-JAN-06
RUB-A 11026 08-JAN-06
RUB-A 11027 10-JAN-06
RUB-B 11026 02-JAN-06
RUB-B 11026 08-JAN-06
RUB-B 11026 09-JAN-06
RUB-C 11027 08-JAN-06
I want SQL that produces this result:
XDESC COUNT(*)
RUB-A 2
RUB-B 1
RUB-C 1
Criteria: group by XDESC and XCUST, then count the MAX(XDATE) row of each group.
Below, the rows marked *** are the ones that get counted.
XDESC XCUST XDATE
RUB-A 11026 01-JAN-06
RUB-A 11026 05-JAN-06
RUB-A 11026 08-JAN-06 ***
RUB-A 11027 10-JAN-06 ***
---------------------------------------------------------COUNT RUB-A = 2
RUB-B 11026 02-JAN-06
RUB-B 11026 08-JAN-06
RUB-B 11026 09-JAN-06 ***
---------------------------------------------------------COUNT RUB-B = 1
RUB-C 11027 08-JAN-06 ***
--------------------------------------------------------COUNT RUB-C = 1
Can anybody help?
I tried :
select xdesc,max(xdate),count(max(xdate)) from coba1 group by xdesc
ERROR at line 1:
ORA-00937: not a single-group group function
Thanks
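Not from the thread, but the requested counts can be sketched with a SQLite stand-in for coba1 (hypothetical re-creation of the sample data): group to one row per (XDESC, XCUST) in an inline view, taking MAX(XDATE), then count those rows per XDESC.

```python
import sqlite3

# Hypothetical SQLite re-creation of the coba1 table from the post.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE coba1 (xdesc TEXT, xcust INTEGER, xdate TEXT)")
rows = [
    ("RUB-A", 11026, "2006-01-01"), ("RUB-A", 11026, "2006-01-05"),
    ("RUB-A", 11026, "2006-01-08"), ("RUB-A", 11027, "2006-01-10"),
    ("RUB-B", 11026, "2006-01-02"), ("RUB-B", 11026, "2006-01-08"),
    ("RUB-B", 11026, "2006-01-09"), ("RUB-C", 11027, "2006-01-08"),
]
con.executemany("INSERT INTO coba1 VALUES (?,?,?)", rows)

# One row per (xdesc, xcust) -- its MAX(xdate) is the *** row --
# then count those rows per xdesc.
sql = """
SELECT xdesc, COUNT(*) AS cnt
FROM (SELECT xdesc, xcust, MAX(xdate) FROM coba1 GROUP BY xdesc, xcust)
GROUP BY xdesc
ORDER BY xdesc
"""
result = con.execute(sql).fetchall()
print(result)  # [('RUB-A', 2), ('RUB-B', 1), ('RUB-C', 1)]
```

The inner GROUP BY collapses each (XDESC, XCUST) pair to one row, which is exactly what the *** markers select; the outer COUNT then gives 2/1/1.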
This one is a duplicate; see the following link:
Count(*) with group by max(date)
Thanks
Similar Messages
-
What is HEAD COUNT with regards to HR Data
Hi Experts
Can you please define HEAD COUNT with regards to the HR Data.
Is it a count of employees?
Please update me, thanks.
Hi,
In HR, the head count term refers to the number of employees only... there are some standard reports available for analysing the head count for an org.
Hope it helps
Darshan -
Max, Min and Count with Group By
Hello,
i want the max, min and count of a table, which is grouped by a column
I need a combination of these two selects:
select
max(COUNTRY_S) MAXVALUE,
min(COUNTRY_S) MINVALUE
from
tab_Country
select
count(*)
from
(select COUNTRY_TXT from tab_Country group by COUNTRY_TXT) ;
The result should be one row with the max and min value of the table and with the count of the grouped-by table, not the max and min of each group. I hope you understand my question?
Is this possible in one SQL-select?
Thank you very much
Best regards
Heidi
Hi, Heidi,
It's not clear what you want. Perhaps
SELECT MAX (country_s) AS max_country_s
, MIN (country_s) AS min_country_s
, COUNT (DISTINCT country_txt) AS count_country_txt
FROM tab_country
I hope this answers your question.
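To check that query against hypothetical data (a SQLite stand-in, not part of the original reply): one pass over the table gives the table-wide MAX and MIN plus the number of groups a GROUP BY COUNTRY_TXT would produce.

```python
import sqlite3

# Hypothetical SQLite stand-in for tab_Country (column names from the post).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE tab_country (country_s INTEGER, country_txt TEXT)")
con.executemany("INSERT INTO tab_country VALUES (?,?)", [
    (1, "Germany"), (2, "Germany"), (3, "France"), (7, "Spain"),
])

# COUNT(DISTINCT country_txt) equals the row count of
# (SELECT country_txt FROM tab_country GROUP BY country_txt).
row = con.execute("""
SELECT MAX(country_s), MIN(country_s), COUNT(DISTINCT country_txt)
FROM tab_country
""").fetchone()
print(row)  # (7, 1, 3)
```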
If not, post a little sample data (CREATE TABLE and INSERT statements, relevant columns only) for all the tables involved, and the results you want from that data.
Explain, using specific examples, how you get those results from that data.
Always say what version of Oracle you're using (e.g. 11.2.0.2.0).
See the forum FAQ: https://forums.oracle.com/message/9362002 -
Use of SUM and COUNT with GROUP BY
All,
I have a set like below:
State DOB StartDt EndDt Desc Category MemberID SubID Code
NC 2003-01-30 2014-01-01 2014-01-31 Child B 123456 2 38
NC 2003-01-30 2014-01-01 2014-01-31 Child B 123456 2 39
NC 2003-01-30 2014-02-01 2014-05-31 Child B 123456 2 38
NC 2003-01-30 2014-02-01 2014-05-31 Child B 123456 2 39
NC 2003-01-30 2014-06-01 2014-07-31 Child B 123456 2 38
NC 2003-01-30 2014-06-01 2014-07-31 Child B 123456 2 39
NC 2014-01-17 2014-01-01 2014-07-31 Infant S 456789 1 49
NC 2014-02-04 2014-02-01 2014-07-31 Infant S 246376 3 49
-- MemberID and SubID identify 1 member
-- Member 123456 has 2 distinct "Code" i.e. 38,39 but each code has different "StartDt" and "EndDt"
-- Expected Result
State Desc Category CountOfDistinctCode TotalMonthsEnrolled
NC Child B 2 (38 and 39 for same member) 7 months (1 + 4 + 2) (Difference between StartDt and EndDt:1 (1/31/2014 - 1/1/2014) + 4 (5/31/2014 - 2/1/2014).. )
NC Infant S 2 (Same code 49 but different member) 13 months (7+6) (7/31/2014 - 1/1/2014 = 7 months + 7/31/2014 - 2/1/2014 = 6 months)
I tried doing a count of distinct Code and then summing up the member months, grouped by State, Desc, Category, but I am not able to get 2 and 7 for Child; it somehow calculates the months as 14 (7+7). Please let me know what you guys suggest.
OK, so we need a different approach. We need a table of numbers, a concept that I discuss here:
http://www.sommarskog.se/arrays-in-sql-2005.html#numbersasconcept
(Only read down to the next header.)
We join the numbers to temp table with BETWEEN over the numbers. Then we count the distinct combination of member ID and number.
We also need to make an adjustment to how we build the string for the DISTINCT. I initially assumed that your columns were integer, which is why I used str(), which produces a right-adjusted fixed-length string of 10 characters. (That is the default; you can specify a different width as the second parameter.) But str() expects a float value as input, and will fail if, for instance, the member id is non-numeric. We cannot just concatenate the strings, since in that case (MemberID, SubID) = ('12345', '61') would be the same as ('123456', '1'). So we must convert to char to get fixed length. All that said, and some more test data added, here is a solution:
CREATE TABLE #Test
([State] CHAR(2),
 DOB DATE,
 StartDt DATE,
 EndDt DATE,
 [Desc] VARCHAR(8),
 Category CHAR(1),
 MemberID VARCHAR(10),
 SubID VARCHAR(2),
 Code VARCHAR(5))
INSERT INTO #Test
VALUES
('NC', '20130130', '20140101', '20140120', 'Child', 'B', '123456', '2', '38'),
('NC', '20130130', '20140121', '20140131', 'Child', 'B', '123456', '2', '39'),
('NC', '20130130', '20140201', '20140531', 'Child', 'B', '123456', '2', '38'),
('NC', '20130130', '20140401', '20140613', 'Child', 'B', '123456', '2', '39'),
('NC', '20130130', '20140601', '20140731', 'Child', 'B', '123456', '2', '38'),
('NC', '20130130', '20140614', '20140731', 'Child', 'B', '123456', '2', '39'),
('NC', '20130129', '20140601', '20140731', 'Child', 'B', '9123456', '1', '38'),
('NC', '20130129', '20140614', '20140831', 'Child', 'B', '9123456', '1', '39'),
('NC', '20140117', '20140101', '20140731', 'Infant', 'S', '456789', '1', '49'),
('NC', '20140204', '20140201', '20140731', 'Infant', 'S', '246376', '3', '49')
SELECT * FROM #Test ORDER BY MemberID, StartDt, EndDt
SELECT
[State]
, [Desc]
, Category
, COUNT(DISTINCT convert(char(10), MemberID) +
convert(char(2), SubID) +
convert(char(5), Code)) AS TotalCnt
, COUNT(DISTINCT convert(char(10), MemberID) +
convert(char(2), SubID) +
str(N.Number)) AS CalMonths
FROM #Test t
JOIN Numbers N ON N.Number BETWEEN
datediff(MONTH, '19000101', t.StartDt) AND
datediff(MONTH, '19000101', t.EndDt)
GROUP BY [State], [Desc], Category
go
DROP TABLE #Test
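The distinct-months idea above can be cross-checked outside SQL Server. A Python sketch over the Child/B sample spans: number the months since January 1900 (mirroring datediff(MONTH, '19000101', d)) and count distinct (member, sub, month) triples.

```python
from datetime import date

# Sample enrollment spans from the post: (member, sub, start, end).
rows = [
    ("123456", "2", date(2014, 1, 1), date(2014, 1, 31)),
    ("123456", "2", date(2014, 2, 1), date(2014, 5, 31)),
    ("123456", "2", date(2014, 6, 1), date(2014, 7, 31)),
]

def monthnum(d):
    # Months elapsed since Jan 1900, like datediff(MONTH, '19000101', d).
    return (d.year - 1900) * 12 + (d.month - 1)

# Distinct (member, sub, month) triples across all spans; overlapping
# spans for the same member collapse to one count per month.
months = {(m, s, n)
          for m, s, lo, hi in rows
          for n in range(monthnum(lo), monthnum(hi) + 1)}
print(len(months))  # 7 -> Jan, Feb..May, Jun..Jul
```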
Erland Sommarskog, SQL Server MVP, [email protected] -
Count(*) with group by and a join
I've the below tables:
Tables:
EMPLOYEE
EMP_ID LNAME FNAME
1000 SLITT JOHN
1001 SLITHER STEVE
1002 JACOB MARLYN
1003 STUFFEY NOSE
1004 SLIPPY MOUTH-----
ACCESS_TYPE
ACCESS_TYPE_ID ACCESS_DESCRIPTION
1 EMPLOYEE
2 EMPLOYEE_ADMIN
3 CONTRACTOR-----
EMPLOYEE_ACCESS
ACCESS_ID ACCESS_TYPE_ID EMP_ID ACCESS_EFF_DATE
1 1 1000 01-JAN-13
2 1 1001 10-FEB-13
3 1 1002 18-FEB-13
4 2 1003 10-OCT-12
5 3 1004 10-MAR-08-----
I'm trying to figure out the count of employees grouped by their type, for those whose last name does not start with SL.
I've written the query below and I'm getting ORA-00936: missing expression. What am I doing wrong? Thanks
SELECT EA.COUNT(*) , ACCESS_TYPE_ID
FROM EMPLOYEE_ACCESS EA, EMPLOYEE E
WHERE ACCESS_EFF_DATE >= '31-DEC-12' AND E.LNAME NOT LIKE 'SL%'
GROUP BY ACCESS_TYPE_ID;
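For reference, the shape that avoids the missing-expression error puts COUNT(*) in the select list without a table prefix and joins the two tables explicitly. A sketch against a SQLite stand-in of the sample tables (hypothetical re-creation, not from the thread):

```python
import sqlite3

# Hypothetical SQLite re-creation of the tables from the post.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE employee (emp_id INTEGER, lname TEXT, fname TEXT);
CREATE TABLE employee_access (access_id INTEGER, access_type_id INTEGER,
                              emp_id INTEGER, access_eff_date TEXT);
INSERT INTO employee VALUES (1000,'SLITT','JOHN'),(1001,'SLITHER','STEVE'),
    (1002,'JACOB','MARLYN'),(1003,'STUFFEY','NOSE'),(1004,'SLIPPY','MOUTH');
INSERT INTO employee_access VALUES
    (1,1,1000,'2013-01-01'),(2,1,1001,'2013-02-10'),(3,1,1002,'2013-02-18'),
    (4,2,1003,'2012-10-10'),(5,3,1004,'2008-03-10');
""")

# COUNT(*) stands on its own (not EA.COUNT(*)); the join condition and an
# unambiguous date literal do the rest.
sql = """
SELECT ea.access_type_id, COUNT(*) AS access_count
FROM employee_access ea
JOIN employee e ON ea.emp_id = e.emp_id
WHERE ea.access_eff_date >= '2012-12-31'
  AND e.lname NOT LIKE 'SL%'
GROUP BY ea.access_type_id
"""
result = con.execute(sql).fetchall()
print(result)  # [(1, 1)] -- only JACOB qualifies
```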
Edited by: parsar on Mar 25, 2013 3:54 PM
Thanks for the response.
If I don't use the alias prefix, which table's rows will the query count, since I have two tables in the FROM clause?
I've modified the query as below but it doesn't return any data:
SELECT COUNT(*) AS ACCESS_COUNT, ACCESS_TYPE_ID
FROM EMPLOYEE_ACCESS EA , EMPLOYEE E
WHERE EA.EMP_ID=E.EMP_ID AND
ACCESS_EFF_DATE >= TO_DATE('2012-12-31', 'YYYY-MM-DD') AND
E.LNAME NOT LIKE 'SL%'
GROUP BY ACCESS_TYPE_ID;
Please let me know which part of the query I screwed up. -
COUNT with Group By clause issue
Good Morning everyone. I have a simple query that calculates the count of 3 expressions. It is supposed to group by region and province as a result set, but instead places the TOTAL count for each expression in the region and province fields. What am I doing wrong? This is my query:
SELECT TABLE1."Province",
TABLE1."Region",
(SELECT(COUNT(TABLE1."Nationality"))
FROM TABLE1
WHERE (TABLE1."Nationality" <> 'United States'
AND TABLE1."Nationality" <> 'Nat1')
OR (TABLE1."Medical" <> 'ON MEDICAL'
AND TABLE1."Region" <> 'CONUS')
) "TCN COUNT",
(SELECT(COUNT(TABLE1."Nationality"))
FROM TABLE1
WHERE (TABLE1."Nationality" = 'United States')
OR (TABLE1."Medical" <> 'ON MEDICAL'
AND TABLE1."Region" <> 'CONUS')
) "US COUNT",
(SELECT(COUNT(TABLE1."Nationality"))
FROM TABLE1
WHERE (TABLE1."Nationality" = 'Nat1')
OR (TABLE1."Medical" <> 'ON MEDICAL'
AND TABLE1."Region" <> 'CONUS')
) "HCN COUNT"
FROM TABLE1
GROUP BY TABLE1."Province",
TABLE1."Region";
Any help would be appreciated. Thank you.
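One way to see what goes wrong (a hypothetical SQLite sketch with simplified columns, not the poster's data): a scalar subquery counts over the whole table regardless of the outer group, whereas conditional aggregation (SUM over a CASE expression) counts within each group.

```python
import sqlite3

# Hypothetical, simplified stand-in for TABLE1.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t (province TEXT, region TEXT, nationality TEXT)")
con.executemany("INSERT INTO t VALUES (?,?,?)", [
    ("P1", "R1", "United States"), ("P1", "R1", "Nat1"),
    ("P1", "R1", "France"), ("P2", "R2", "United States"),
])

# SUM(CASE ...) evaluates per row, so GROUP BY splits the counts correctly.
sql = """
SELECT province, region,
       SUM(CASE WHEN nationality = 'United States' THEN 1 ELSE 0 END) AS us_count,
       SUM(CASE WHEN nationality NOT IN ('United States','Nat1') THEN 1 ELSE 0 END) AS tcn_count
FROM t
GROUP BY province, region
ORDER BY province
"""
result = con.execute(sql).fetchall()
print(result)  # [('P1', 'R1', 1, 1), ('P2', 'R2', 1, 0)]
```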
Aqua
Because you are not passing any values to the inner query from the outer one.
Are you looking for this?
SELECT TABLE1."Province",
TABLE1."Region",
sum(case when (TABLE1."Nationality" != 'United States'
               AND TABLE1."Nationality" != 'Nat1')
           OR (TABLE1."Medical" != 'ON MEDICAL'
               AND TABLE1."Region" != 'CONUS')
    then 1 else 0 end) "TCN COUNT",
sum(case when TABLE1."Nationality" = 'United States'
           OR (TABLE1."Medical" != 'ON MEDICAL'
               AND TABLE1."Region" != 'CONUS')
    then 1 else 0 end) "US COUNT",
sum(case when TABLE1."Nationality" = 'Nat1'
           OR (TABLE1."Medical" != 'ON MEDICAL'
               AND TABLE1."Region" != 'CONUS')
    then 1 else 0 end) "HCN COUNT"
FROM TABLE1
GROUP BY TABLE1."Province",TABLE1."Region"; -
Distinct count differnt from variable count on group
I have a date field and a number field. I have grouped the date by week and by month. I have created a DistinctCount summary on number field and placed in each group. I also created a variable formula which gets reset in each group header. I get differnt results on the monthly group footer.
monthly group header formula:
WhilePrintingRecords;
Shared NumberVar Job_Count_Monthly;
Job_Count_Monthly := 0
weekly group footer formula:
WhilePrintingRecords;
Shared NumberVar Job_Count_Monthly;
Job_Count_Monthly := Job_Count_Monthly + 1
monthly group footer formula:
WhilePrintingRecords;
Shared NumberVar Job_Count_Monthly;
Job_Count_Monthly
Results:
11/30/08 102
12/7/08 110
12/14/08 152
12/21/08 79
12/28/08 89
December 08 DistinctCount returns 491 Formulas returns 532
Can someone please explain why, and which is correct?
Thanks Vinay.
I resumed all my details and exported to Excel to do a count. I found that at least one of my numbers was in two different weekly groups. Now I don't understand why this is, because I am grouped by the number; my date group should be the second criteria. My date is from a formula and I wanted it to group by max date, but it appears the group does not do that. I previously tried to use sub-reports to return the number and desired date, but I could not figure out how to create groups from a sub-report variable to display weekly, monthly & yearly totals. Does anyone have any ideas?
Thanks
Wayne -
Count(*) in a analytic situation with group by order by
Hello every body,
I have a count(*) problem in a SQL statement with analytic functions on a table, when I want to have all of its columns in the result.
Say I have a table
mytable1
CREATE TABLE MYTABLE1
(MY_TIME TIMESTAMP(3),
 PRICE NUMBER,
 VOLUME NUMBER);
insert into mytable1 values (to_timestamp ('04-MAR-08 09.00.01.664','DD-MM-YY HH24:MI:SS:FF3' ),49.55,704492 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.00.02.570','DD-MM-YY HH24:MI:SS:FF3' ),49.55,705136 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.00.31.227','DD-MM-YY HH24:MI:SS:FF3' ),49.55,707313 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.00.31.227','DD-MM-YY HH24:MI:SS:FF3' ),49.55,706592 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.00.30.695','DD-MM-YY HH24:MI:SS:FF3' ),49.55,705581 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.00.31.227','DD-MM-YY HH24:MI:SS:FF3' ),49.55,707985 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.00.31.820','DD-MM-YY HH24:MI:SS:FF3' ),49.56,708494 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.00.32.258','DD-MM-YY HH24:MI:SS:FF3' ),49.57,708955 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.00.36.180','DD-MM-YY HH24:MI:SS:FF3' ),49.58,709519 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.00.44.352','DD-MM-YY HH24:MI:SS:FF3' ),49.59,710502 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.00.44.352','DD-MM-YY HH24:MI:SS:FF3' ),49.59,710102 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.00.44.352','DD-MM-YY HH24:MI:SS:FF3' ),49.59,709962 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.00.52.399','DD-MM-YY HH24:MI:SS:FF3' ),49.59,711427 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.00.44.977','DD-MM-YY HH24:MI:SS:FF3' ),49.6,710902 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.00.50.492','DD-MM-YY HH24:MI:SS:FF3' ),49.6,711379 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.00.45.550','DD-MM-YY HH24:MI:SS:FF3' ),49.6,711302 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.00.50.492','DD-MM-YY HH24:MI:SS:FF3' ),49.62,711417 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.01.57.790','DD-MM-YY HH24:MI:SS:FF3' ),49.49,715587 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.01.47.712','DD-MM-YY HH24:MI:SS:FF3' ),49.5,715166 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.01.57.790','DD-MM-YY HH24:MI:SS:FF3' ),49.5,715469 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.01.24.821','DD-MM-YY HH24:MI:SS:FF3' ),49.53,714833 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.01.24.821','DD-MM-YY HH24:MI:SS:FF3' ),49.53,714914 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.01.24.493','DD-MM-YY HH24:MI:SS:FF3' ),49.54,714136 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.01.19.977','DD-MM-YY HH24:MI:SS:FF3' ),49.55,713387 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.01.19.977','DD-MM-YY HH24:MI:SS:FF3' ),49.55,713562 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.01.08.695','DD-MM-YY HH24:MI:SS:FF3' ),49.59,712172 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.01.09.274','DD-MM-YY HH24:MI:SS:FF3' ),49.59,713287 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.01.09.117','DD-MM-YY HH24:MI:SS:FF3' ),49.59,713206 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.01.08.695','DD-MM-YY HH24:MI:SS:FF3' ),49.59,712984 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.01.08.836','DD-MM-YY HH24:MI:SS:FF3' ),49.59,712997 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.01.08.695','DD-MM-YY HH24:MI:SS:FF3' ),49.59,712185 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.01.08.695','DD-MM-YY HH24:MI:SS:FF3' ),49.59,712261 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.02.32.244','DD-MM-YY HH24:MI:SS:FF3' ),49.46,725577 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.02.26.181','DD-MM-YY HH24:MI:SS:FF3' ),49.49,724664 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.02.25.540','DD-MM-YY HH24:MI:SS:FF3' ),49.49,723366 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.02.26.181','DD-MM-YY HH24:MI:SS:FF3' ),49.49,725242 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.02.26.181','DD-MM-YY HH24:MI:SS:FF3' ),49.49,725477 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.02.25.947','DD-MM-YY HH24:MI:SS:FF3' ),49.49,724521 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.02.25.540','DD-MM-YY HH24:MI:SS:FF3' ),49.49,723943 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.02.25.540','DD-MM-YY HH24:MI:SS:FF3' ),49.49,724086 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.02.34.103','DD-MM-YY HH24:MI:SS:FF3' ),49.49,725609 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.02.15.118','DD-MM-YY HH24:MI:SS:FF3' ),49.5,720166 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.02.15.118','DD-MM-YY HH24:MI:SS:FF3' ),49.5,720066 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.02.11.774','DD-MM-YY HH24:MI:SS:FF3' ),49.5,718524 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.02.15.696','DD-MM-YY HH24:MI:SS:FF3' ),49.5,722086 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.02.11.774','DD-MM-YY HH24:MI:SS:FF3' ),49.5,718092 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.02.11.774','DD-MM-YY HH24:MI:SS:FF3' ),49.5,715673 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.02.15.118','DD-MM-YY HH24:MI:SS:FF3' ),49.51,719666 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.02.12.555','DD-MM-YY HH24:MI:SS:FF3' ),49.52,719384 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.03.28.963','DD-MM-YY HH24:MI:SS:FF3' ),49.48,728830 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.03.11.884','DD-MM-YY HH24:MI:SS:FF3' ),49.48,726609 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.03.28.963','DD-MM-YY HH24:MI:SS:FF3' ),49.48,728943 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.03.45.947','DD-MM-YY HH24:MI:SS:FF3' ),49.49,729627 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.03.12.259','DD-MM-YY HH24:MI:SS:FF3' ),49.49,726830 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.03.46.494','DD-MM-YY HH24:MI:SS:FF3' ),49.49,733653 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.03.46.510','DD-MM-YY HH24:MI:SS:FF3' ),49.49,733772 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.03.12.259','DD-MM-YY HH24:MI:SS:FF3' ),49.49,727830 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.03.59.119','DD-MM-YY HH24:MI:SS:FF3' ),49.5,735772 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.03.47.369','DD-MM-YY HH24:MI:SS:FF3' ),49.5,734772 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.04.20.463','DD-MM-YY HH24:MI:SS:FF3' ),49.48,740621 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.04.12.369','DD-MM-YY HH24:MI:SS:FF3' ),49.48,740538 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.04.20.463','DD-MM-YY HH24:MI:SS:FF3' ),49.48,741021 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.04.10.588','DD-MM-YY HH24:MI:SS:FF3' ),49.49,740138 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.04.09.463','DD-MM-YY HH24:MI:SS:FF3' ),49.49,738320 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.04.09.135','DD-MM-YY HH24:MI:SS:FF3' ),49.49,737122 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.04.09.135','DD-MM-YY HH24:MI:SS:FF3' ),49.49,736424 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.04.09.260','DD-MM-YY HH24:MI:SS:FF3' ),49.49,737598 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.04.09.744','DD-MM-YY HH24:MI:SS:FF3' ),49.49,739360 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.04.09.135','DD-MM-YY HH24:MI:SS:FF3' ),49.49,736924 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.04.09.260','DD-MM-YY HH24:MI:SS:FF3' ),49.49,737784 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.04.09.463','DD-MM-YY HH24:MI:SS:FF3' ),49.49,738145 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.04.09.744','DD-MM-YY HH24:MI:SS:FF3' ),49.49,739134 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.04.09.463','DD-MM-YY HH24:MI:SS:FF3' ),49.49,738831 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.04.56.215','DD-MM-YY HH24:MI:SS:FF3' ),49.5,742421 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.04.56.580','DD-MM-YY HH24:MI:SS:FF3' ),49.5,741777 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.04.56.215','DD-MM-YY HH24:MI:SS:FF3' ),49.5,742021 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.04.48.433','DD-MM-YY HH24:MI:SS:FF3' ),49.5,741091 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.04.56.840','DD-MM-YY HH24:MI:SS:FF3' ),49.51,743021 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.04.57.511','DD-MM-YY HH24:MI:SS:FF3' ),49.52,743497 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.05.00.270','DD-MM-YY HH24:MI:SS:FF3' ),49.52,744021 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.05.17.699','DD-MM-YY HH24:MI:SS:FF3' ),49.53,750292 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.05.00.433','DD-MM-YY HH24:MI:SS:FF3' ),49.53,747382 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.05.17.699','DD-MM-YY HH24:MI:SS:FF3' ),49.53,749939 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.05.15.152','DD-MM-YY HH24:MI:SS:FF3' ),49.53,749414 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.05.00.433','DD-MM-YY HH24:MI:SS:FF3' ),49.53,744882 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.05.08.110','DD-MM-YY HH24:MI:SS:FF3' ),49.54,749262 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.05.01.168','DD-MM-YY HH24:MI:SS:FF3' ),49.54,748418 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.05.01.152','DD-MM-YY HH24:MI:SS:FF3' ),49.54,748243 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.05.07.293','DD-MM-YY HH24:MI:SS:FF3' ),49.54,748862 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.06.09.433','DD-MM-YY HH24:MI:SS:FF3' ),49.51,750414 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.06.28.262','DD-MM-YY HH24:MI:SS:FF3' ),49.53,750930 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.06.28.887','DD-MM-YY HH24:MI:SS:FF3' ),49.53,751986 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.06.28.887','DD-MM-YY HH24:MI:SS:FF3' ),49.53,750986 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.06.30.997','DD-MM-YY HH24:MI:SS:FF3' ),49.55,753900 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.06.30.887','DD-MM-YY HH24:MI:SS:FF3' ),49.55,753222 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.06.29.809','DD-MM-YY HH24:MI:SS:FF3' ),49.55,753022 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.06.29.809','DD-MM-YY HH24:MI:SS:FF3' ),49.55,752847 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.06.42.622','DD-MM-YY HH24:MI:SS:FF3' ),49.56,755385 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.06.31.120','DD-MM-YY HH24:MI:SS:FF3' ),49.56,754385 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.06.49.590','DD-MM-YY HH24:MI:SS:FF3' ),49.6,759087 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.06.50.341','DD-MM-YY HH24:MI:SS:FF3' ),49.6,759217 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.06.49.590','DD-MM-YY HH24:MI:SS:FF3' ),49.6,758701 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.06.57.262','DD-MM-YY HH24:MI:SS:FF3' ),49.6,761049 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.06.48.637','DD-MM-YY HH24:MI:SS:FF3' ),49.6,757827 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.06.48.120','DD-MM-YY HH24:MI:SS:FF3' ),49.6,757385 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.06.56.466','DD-MM-YY HH24:MI:SS:FF3' ),49.62,761001 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.06.56.137','DD-MM-YY HH24:MI:SS:FF3' ),49.62,760109 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.06.56.137','DD-MM-YY HH24:MI:SS:FF3' ),49.62,759617 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.06.56.278','DD-MM-YY HH24:MI:SS:FF3' ),49.62,760265 );
insert into mytable1 values (to_timestamp ('04-MAR-08 09.06.56.137','DD-MM-YY HH24:MI:SS:FF3' ),49.62,759954 );
so if I do
SELECT DISTINCT row_number() over( partition BY TRUNC(my_time, 'hh24') + (TRUNC(TO_CHAR(my_time,'mi')))/24/60 order by TRUNC(my_time, 'hh24') + (TRUNC(TO_CHAR(my_time,'mi')))/24/60 ASC ) num,
MIN(price) over (partition BY TRUNC(my_time, 'hh24') + (TRUNC(TO_CHAR(my_time,'mi')))/24/60) low ,
MAX(price) over (partition BY TRUNC(my_time, 'hh24') + (TRUNC(TO_CHAR(my_time,'mi')))/24/60) high ,
-- sum(volume) over( partition by trunc(my_time, 'hh24') + (trunc(to_char(my_time,'mi')))/24/60 order by trunc(my_time, 'hh24') + (trunc(to_char(my_time,'mi')))/24/60 asc ) volume,
TRUNC(my_time, 'hh24') + (TRUNC(TO_CHAR(my_time,'mi')))/24/60 TIME ,
price ,
COUNT( *) over( partition BY TRUNC(my_time, 'hh24') + (TRUNC(TO_CHAR(my_time,'mi')))/24/60 order by TRUNC(my_time, 'hh24') + (TRUNC(TO_CHAR(my_time,'mi')))/24/60 ASC ,price ASC,volume ASC ) TRADE,
first_value(price) over( partition BY TRUNC(my_time, 'hh24') + (TRUNC(TO_CHAR(my_time,'mi')))/24/60 order by TRUNC(my_time, 'hh24') + (TRUNC(TO_CHAR(my_time,'mi')))/24/60 ASC,volume ASC ) OPEN ,
first_value(price) over( partition BY TRUNC(my_time, 'hh24') + (TRUNC(TO_CHAR(my_time,'mi')))/24/60 order by TRUNC(my_time, 'hh24') + (TRUNC(TO_CHAR(my_time,'mi')))/24/60 DESC,volume DESC) CLOSE ,
lag(price) over ( order by TRUNC(my_time, 'hh24') + (TRUNC(TO_CHAR(my_time,'mi')))/24/60) lag_all
FROM mytable1
WHERE my_time > to_timestamp('04032008:09:00:00','DDMMYYYY:HH24:MI:SS')
AND my_time < to_timestamp('04032008:09:01:00','DDMMYYYY:HH24:MI:SS')
GROUP BY TRUNC(my_time, 'hh24') + (TRUNC(TO_CHAR(my_time,'mi')))/24/60 ,
price ,
volume
ORDER BY TRUNC(my_time, 'hh24') + (TRUNC(TO_CHAR(my_time,'mi')))/24/60,
price ,
num;
I get:
NUM|LOW|HIGH|TIME|PRICE|TRADE|OPEN|CLOSE|LAG_ALL
1|49.55|49.62|04/03/2008 09:00:00|49.55|1|49.55|49.59|
2|49.55|49.62|04/03/2008 09:00:00|49.55|2|49.55|49.59|49.55
3|49.55|49.62|04/03/2008 09:00:00|49.55|3|49.55|49.59|49.55
4|49.55|49.62|04/03/2008 09:00:00|49.55|4|49.55|49.59|49.55
5|49.55|49.62|04/03/2008 09:00:00|49.55|5|49.55|49.59|49.55
6|49.55|49.62|04/03/2008 09:00:00|49.55|6|49.55|49.59|49.55
7|49.55|49.62|04/03/2008 09:00:00|49.56|7|49.55|49.59|49.55
8|49.55|49.62|04/03/2008 09:00:00|49.57|8|49.55|49.59|49.56
9|49.55|49.62|04/03/2008 09:00:00|49.58|9|49.55|49.59|49.57
10|49.55|49.62|04/03/2008 09:00:00|49.59|10|49.55|49.59|49.58
11|49.55|49.62|04/03/2008 09:00:00|49.59|11|49.55|49.59|49.59
12|49.55|49.62|04/03/2008 09:00:00|49.59|12|49.55|49.59|49.59
13|49.55|49.62|04/03/2008 09:00:00|49.59|13|49.55|49.59|49.59
14|49.55|49.62|04/03/2008 09:00:00|49.6|14|49.55|49.59|49.59
15|49.55|49.62|04/03/2008 09:00:00|49.6|15|49.55|49.59|49.6
16|49.55|49.62|04/03/2008 09:00:00|49.6|16|49.55|49.59|49.6
17|49.55|49.62|04/03/2008 09:00:00|49.62|17|49.55|49.59|49.6
Which is erroneous, because if I don't put the volume column in the script I get another result:
SELECT DISTINCT row_number() over( partition BY TRUNC(my_time, 'hh24') + (TRUNC(TO_CHAR(my_time,'mi')))/24/60 order by TRUNC(my_time, 'hh24') + (TRUNC(TO_CHAR(my_time,'mi')))/24/60 ASC ) num,
MIN(price) over (partition BY TRUNC(my_time, 'hh24') + (TRUNC(TO_CHAR(my_time,'mi')))/24/60) low ,
MAX(price) over (partition BY TRUNC(my_time, 'hh24') + (TRUNC(TO_CHAR(my_time,'mi')))/24/60) high ,
-- sum(volume) over( partition by trunc(my_time, 'hh24') + (trunc(to_char(my_time,'mi')))/24/60 order by trunc(my_time, 'hh24') + (trunc(to_char(my_time,'mi')))/24/60 asc ) volume,
TRUNC(my_time, 'hh24') + (TRUNC(TO_CHAR(my_time,'mi')))/24/60 TIME ,
price ,
COUNT( *) over( partition BY TRUNC(my_time, 'hh24') + (TRUNC(TO_CHAR(my_time,'mi')))/24/60 order by TRUNC(my_time, 'hh24') + (TRUNC(TO_CHAR(my_time,'mi')))/24/60 ASC ,price ASC ) TRADE,
first_value(price) over( partition BY TRUNC(my_time, 'hh24') + (TRUNC(TO_CHAR(my_time,'mi')))/24/60 order by TRUNC(my_time, 'hh24') + (TRUNC(TO_CHAR(my_time,'mi')))/24/60 ASC ) OPEN ,
first_value(price) over( partition BY TRUNC(my_time, 'hh24') + (TRUNC(TO_CHAR(my_time,'mi')))/24/60 order by TRUNC(my_time, 'hh24') + (TRUNC(TO_CHAR(my_time,'mi')))/24/60 DESC) CLOSE ,
lag(price) over ( order by TRUNC(my_time, 'hh24') + (TRUNC(TO_CHAR(my_time,'mi')))/24/60) lag_all
FROM mytable1
WHERE my_time > to_timestamp('04032008:09:00:00','DDMMYYYY:HH24:MI:SS')
AND my_time < to_timestamp('04032008:09:01:00','DDMMYYYY:HH24:MI:SS')
GROUP BY TRUNC(my_time, 'hh24') + (TRUNC(TO_CHAR(my_time,'mi')))/24/60 ,
price
ORDER BY TRUNC(my_time, 'hh24') + (TRUNC(TO_CHAR(my_time,'mi')))/24/60,
price ,
num;
I get this
NUM|LOW|HIGH|TIME|PRICE|TRADE|OPEN|CLOSE|LAG_ALL
1|49.55|49.62|04/03/2008 09:00:00|49.55|1|49.55|49.55|
2|49.55|49.62|04/03/2008 09:00:00|49.56|2|49.55|49.55|49.55
3|49.55|49.62|04/03/2008 09:00:00|49.57|3|49.55|49.55|49.56
4|49.55|49.62|04/03/2008 09:00:00|49.58|4|49.55|49.55|49.57
5|49.55|49.62|04/03/2008 09:00:00|49.59|5|49.55|49.55|49.58
6|49.55|49.62|04/03/2008 09:00:00|49.6|6|49.55|49.55|49.59
7|49.55|49.62|04/03/2008 09:00:00|49.62|7|49.55|49.55|49.6
How can I have the right count with all the columns of the table?
Babata
I'm not sure what, in your eyes, the "right count" is, but I think the DISTINCT keyword is hiding the problems that you have. It could also be the reason for the different number of results between query one and query two.
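As a cross-check (a SQLite stand-in with a handful of the sample rows): the per-minute low, high, and trade count in this result need no analytic functions at all; a plain GROUP BY on the truncated minute gives one row per minute without any DISTINCT.

```python
import sqlite3

# Hypothetical stand-in for mytable1, a few of the sample rows only.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t (my_time TEXT, price REAL, volume INTEGER)")
con.executemany("INSERT INTO t VALUES (?,?,?)", [
    ("2008-03-04 09:00:01.664", 49.55, 704492),
    ("2008-03-04 09:00:31.820", 49.56, 708494),
    ("2008-03-04 09:00:32.258", 49.57, 708955),
    ("2008-03-04 09:01:08.695", 49.59, 712172),
])

# substr(..., 1, 16) truncates 'YYYY-MM-DD HH:MM' -- the minute bucket.
sql = """
SELECT substr(my_time, 1, 16) AS minute,
       MIN(price) AS low, MAX(price) AS high,
       COUNT(*) AS trades
FROM t
GROUP BY minute
ORDER BY minute
"""
result = con.execute(sql).fetchall()
print(result)
```

Open and close would still need a tie to the first/last timestamp per minute, but low/high/count fall straight out of an ordinary aggregate query.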
-
TSQL query to calculate Count / Sum grouping by date on a Pivot Table
Hi All
I need help to group the pivot table output to group by dates and sum/count the values. I have a table like shown below.
Date        Student  Subject  Hunt  Marks
18/02/2014  Sam      Maths    1     20
18/02/2014  Sam      Maths    1     10
18/02/2014  Sam      Maths    2     30
18/02/2014  Luke     Science  1     50
17/02/2014  Sam      Maths    2     50
17/02/2014  Luke     Science  2     60
16/02/2014  Luke     Science  2     20
16/02/2014  Luke     Science  3     20
I want to group by dates and move the Hunt values to columns, calculating their counts and summing their marks too, like given below.
I wrote a pivot query like the one below, but if I group it by dates and calculate the total marks it throws aggregate errors.
Create Table Student_Log ([Date] datetime ,Student varchar (20), Subject varchar (20) ,Hunt int ,Marks int )
Go
INSERT INTO Student_Log ([Date],Student, [Subject],Hunt,Marks) VALUES('2014-02-18 15:00:00.000','Sam ','Maths','1','20')
INSERT INTO Student_Log ([Date],Student, [Subject],Hunt,Marks) VALUES('2014-02-18 15:00:00.000','Sam ','Maths','1','10')
INSERT INTO Student_Log ([Date],Student, [Subject],Hunt,Marks) VALUES('2014-02-18 15:00:00.000','Sam ','Maths','2','30')
INSERT INTO Student_Log ([Date],Student, [Subject],Hunt,Marks) VALUES('2014-02-18 15:00:00.000','Luke','Science','1','50')
INSERT INTO Student_Log ([Date],Student, [Subject],Hunt,Marks) VALUES('2014-02-17 15:00:00.000','Sam ','Maths','2','50')
INSERT INTO Student_Log ([Date],Student, [Subject],Hunt,Marks) VALUES('2014-02-17 15:00:00.000','Luke','Science','2','60')
INSERT INTO Student_Log ([Date],Student, [Subject],Hunt,Marks) VALUES('2014-02-16 15:00:00.000','Luke','Science','2','20')
INSERT INTO Student_Log ([Date],Student, [Subject],Hunt,Marks) VALUES('2014-02-16 15:00:00.000','Luke','Science','3','20')
Go
select * from Student_Log
select [DATE] , [Student], [Subject] ,[1],[2],[3],[4],Total =([1]+[2]+[3]+[4])
from
( select [Date], [Student], [Subject],[Hunt],[Marks] from Student_Log
)x
pivot
( count([Hunt]) for [Hunt]
in ([1],[2],[3],[4])
) p
order by [Date] desc
I have done this far only. More than this, I need to enhance it with the percentage of hunts for each student,
i.e. like the table below.
On the 18th, Sam in Maths had 2 rows on the 1st hunt and 1 row on the 2nd hunt. So on the pivot table, is it possible to represent it as a percentage using the Total Attempts column?
Thanks a lot in advance.
It's running on SQL Server 2000.
Create Table Student_Log ([Date] datetime ,Student varchar (20), Subject varchar (20) ,Hunt int ,Marks int )
Go
INSERT INTO Student_Log ([Date],Student, [Subject],Hunt,Marks) VALUES('2014-02-18 15:00:00.000','Sam ','Maths','1','20')
INSERT INTO Student_Log ([Date],Student, [Subject],Hunt,Marks) VALUES('2014-02-18 15:00:00.000','Sam ','Maths','1','10')
INSERT INTO Student_Log ([Date],Student, [Subject],Hunt,Marks) VALUES('2014-02-18 15:00:00.000','Sam ','Maths','2','30')
INSERT INTO Student_Log ([Date],Student, [Subject],Hunt,Marks) VALUES('2014-02-18 15:00:00.000','Luke','Science','1','50')
INSERT INTO Student_Log ([Date],Student, [Subject],Hunt,Marks) VALUES('2014-02-17 15:00:00.000','Sam ','Maths','2','50')
INSERT INTO Student_Log ([Date],Student, [Subject],Hunt,Marks) VALUES('2014-02-17 15:00:00.000','Luke','Science','2','60')
INSERT INTO Student_Log ([Date],Student, [Subject],Hunt,Marks) VALUES('2014-02-16 15:00:00.000','Luke','Science','2','20')
INSERT INTO Student_Log ([Date],Student, [Subject],Hunt,Marks) VALUES('2014-02-16 15:00:00.000','Luke','Science','3','20')
Go
select * from Student_Log
;with mycte as
(
Select [Date], [Student], [Subject] ,
Count(CASE WHEN [Hunt]=1 Then Hunt End) as Hunt1,
Count(CASE WHEN [Hunt]=2 Then Hunt End) as Hunt2,
Count(CASE WHEN [Hunt]=3 Then Hunt End) as Hunt3,
Count(CASE WHEN [Hunt]=4 Then Hunt End) as Hunt4,
Count(CASE WHEN [Hunt]=1 Then Hunt End)
+Count(CASE WHEN [Hunt]=2 Then Hunt End)
+Count(CASE WHEN [Hunt]=3 Then Hunt End)
+Count(CASE WHEN [Hunt]=4 Then Hunt End) as Total,
ISNULL(SUM(CASE WHEN [Hunt]=1 Then Marks End),0) as Mark1,
ISNULL(SUM(CASE WHEN [Hunt]=2 Then Marks End),0) as Mark2,
ISNULL(SUM(CASE WHEN [Hunt]=3 Then Marks End),0) as Mark3,
ISNULL(SUM(CASE WHEN [Hunt]=4 Then Marks End),0) as Mark4
from Student_Log
Group By [Date], [Student], [Subject]
)
Select [Date], [Student], [Subject]
, Cast(Hunt1*1./Total*100 as int) as [1]
, Cast(Hunt2*1./Total*100 as int) as [2]
,Cast(Hunt3*1./Total*100 as int) as [3]
,Cast(Hunt4*1./Total*100 as int) as [4]
,Total,Marks=( Mark1+Mark2+Mark3+Mark4)
from mycte
order by [Date] DESC, Student desc
drop table Student_Log
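For readers who want to try the CASE-based approach above without a SQL Server instance, here is a minimal sketch adapted to SQLite via Python's sqlite3 module. The table and column names follow the thread; the percentage arithmetic mirrors the Cast(HuntN*1./Total*100 as int) step, since integer division truncates the same way:

```python
import sqlite3

# Hedged sketch: the conditional-aggregation pivot from the answer above,
# adapted to SQLite so it runs without a SQL Server instance.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Student_Log (Date TEXT, Student TEXT, Subject TEXT, Hunt INT, Marks INT);
INSERT INTO Student_Log VALUES
 ('2014-02-18','Sam','Maths',1,20),
 ('2014-02-18','Sam','Maths',1,10),
 ('2014-02-18','Sam','Maths',2,30),
 ('2014-02-18','Luke','Science',1,50);
""")
rows = conn.execute("""
SELECT Date, Student, Subject,
       COUNT(CASE WHEN Hunt=1 THEN 1 END) AS h1,   -- rows on 1st hunt
       COUNT(CASE WHEN Hunt=2 THEN 1 END) AS h2,   -- rows on 2nd hunt
       COUNT(*) AS total,                          -- total attempts
       100 * COUNT(CASE WHEN Hunt=1 THEN 1 END) / COUNT(*) AS pct1
FROM Student_Log
GROUP BY Date, Student, Subject
ORDER BY Student
""").fetchall()
for r in rows:
    print(r)
```

Sam gets h1=2 of total=3, so pct1 truncates to 66, matching the Cast(... as int) behaviour in the T-SQL answer.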
Using count function with grouped records
Hi all,
This seems like it should be real easy but I have yet to figure out a simple way to do this.
Suppose I want to count Opportunities that are grouped by Sales Rep. At run-time I am filtering this list with a parameter for Sales Stage and created date.
I've simplified this greatly, but here's what my setup looks like now:
Sales Rep* ---------Count*_
<?for-each-group:Opportunity[SalesStage=param1 and Created>param2];./salesrep?>
<?salesrep?>-------<?count(current-group()/Id)?>
<?end for-each-group?>
Total_
The only solution I have so far to get my grand total is to create a variable and keep a running total which I'll then display in the Total column. While it works, it seems like there should be an easier way, like doing a simple count(Id) to get the grand total. But since the Total appears after the end for-each-group, I lose the filter that was applied to the group so that count is invalid.
Any thoughts from the experts?
Thanks!

To get the grand total, use
<?count(/Opportunity[SalesStage=param1 and Created>param2]/Id)?>

Since you did not mention the complete XML, I assumed Opportunity as the root; if not, put the full path from the root.
If you give a sample of the XML and explain the output you want, we can fix it immediately.
Go through these too; something can be pulled from here:
http://winrichman.blogspot.com/search/label/Summation%20In%20BIP
http://winrichman.blogspot.com/search/label/BIP%20Vertical%20sum
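The principle in the answer — compute the grand total from the same filtered set the groups use — can be sketched outside BI Publisher in plain Python (hypothetical records; the real report evaluates the XPath expression above):

```python
from collections import Counter

# Hypothetical opportunity records standing in for the XML rows; the point
# is that the grand total must apply the same filter as the group counts.
opportunities = [
    {"salesrep": "Ann", "SalesStage": "Won",  "Created": 2020},
    {"salesrep": "Ann", "SalesStage": "Lost", "Created": 2020},
    {"salesrep": "Bob", "SalesStage": "Won",  "Created": 2021},
    {"salesrep": "Bob", "SalesStage": "Won",  "Created": 2019},
]

def counts(param1, param2):
    filtered = [o for o in opportunities
                if o["SalesStage"] == param1 and o["Created"] > param2]
    per_rep = Counter(o["salesrep"] for o in filtered)   # per-group counts
    return per_rep, sum(per_rep.values())                # groups + grand total

per_rep, total = counts("Won", 2019)
print(per_rep, total)
```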
Count, Group by between dates
I am trying to count the number of IDs dropped and enrolled in each unit for each of the 4 terms, between their respective dates.
There are 4 terms, and an ID can participate in a unit in any of these terms:
TERM  START_DATE  END_DATE
1     25-Feb-13   18-Mar-13
2     27-May-13   17-Jun-13
3     26-Aug-13   16-Sep-13
4     25-Nov-13   16-Dec-13
I am trying to count how many IDs enrolled in a unit between those dates and how many dropped before the end date.
The ENROL_DATE for each ID in a unit has to be between the given term dates to count.
Unit KIS has one ENROL and one DROP in TERM 1.
Unit POL occurs in TERMs 2 and 4, and both are DROPs.
Unit LIN has one ENROL and one DROP in two different TERMs.
My problem is how to count ENROLs and count DROPs between the term dates and then group by TERM and UNIT.
Please see the tables below for the given data and expected result. It should make sense.
Thanks,
{code}
CREATE TABLE DAN_GR4
(ID NUMBER(12),
STATUS VARCHAR2(12),
TERM NUMBER(12),
ENROL_DTE DATE,
TERM_START_DTE DATE,
TERM_END_DATE DATE,
UNIT VARCHAR2 (12));
INSERT INTO DAN_GR4 (ID,STATUS,TERM,ENROL_DTE,TERM_START_DTE,TERM_END_DATE,UNIT) VALUES ('1', 'ENROL' ,'1', '15-Mar-13' ,'25-Feb-13' ,'18-Mar-13', 'KIS');
INSERT INTO DAN_GR4 (ID,STATUS,TERM,ENROL_DTE,TERM_START_DTE,TERM_END_DATE,UNIT) VALUES ('1', 'DROP' ,'2', '27-MAY-13' ,'27-MAY-13' ,'17-JUN-13', 'POL');
INSERT INTO DAN_GR4 (ID,STATUS,TERM,ENROL_DTE,TERM_START_DTE,TERM_END_DATE,UNIT) VALUES ('1', 'DROP' ,'2', '27-JUN-13' ,'27-MAY-13' ,'17-JUN-13', 'LIN');
INSERT INTO DAN_GR4 (ID,STATUS,TERM,ENROL_DTE,TERM_START_DTE,TERM_END_DATE,UNIT) VALUES ('1', 'DROP' ,'3', '27-JUN-13' ,'27-MAY-13' ,'17-JUN-13', 'PUI');
INSERT INTO DAN_GR4 (ID,STATUS,TERM,ENROL_DTE,TERM_START_DTE,TERM_END_DATE,UNIT) VALUES ('2', 'DROP' ,'3', '26-SEP-13' ,'26-AUG-13' ,'16-SEP-13', 'POL');
INSERT INTO DAN_GR4 (ID,STATUS,TERM,ENROL_DTE,TERM_START_DTE,TERM_END_DATE,UNIT) VALUES ('2', 'ENROL' ,'4', '26-NOV-13' ,'25-NOV-13' ,'16-DEC-13', 'LIN');
INSERT INTO DAN_GR4 (ID,STATUS,TERM,ENROL_DTE,TERM_START_DTE,TERM_END_DATE,UNIT) VALUES ('3', 'DROP' ,'4', '15-DEC-13' ,'25-NOV-13' ,'16-DEC-13', 'LIN');
INSERT INTO DAN_GR4 (ID,STATUS,TERM,ENROL_DTE,TERM_START_DTE,TERM_END_DATE,UNIT) VALUES ('3', 'DROP' ,'4', '15-DEC-13' ,'25-NOV-13' ,'16-DEC-13', 'POL');
INSERT INTO DAN_GR4 (ID,STATUS,TERM,ENROL_DTE,TERM_START_DTE,TERM_END_DATE,UNIT) VALUES ('3', 'DROP' ,'1', '15-DEC-13' ,'25-FEB-13' ,'18-MAR-13', 'KIS');
{code}
GIVES:

ID  STATUS  TERM  ENROL_DTE  TERM_START_DTE  TERM_END_DATE  UNIT
1   ENROL   1     15-Mar-13  25-Feb-13       18-Mar-13      KIS
1   DROP    2     27-May-13  27-May-13       17-Jun-13      POL
1   DROP    2     27-Jun-13  27-May-13       17-Jun-13      LIN
1   DROP    3     27-Jun-13  27-May-13       17-Jun-13      PUI
2   DROP    3     26-Sep-13  26-Aug-13       16-Sep-13      POL
2   ENROL   4     26-Nov-13  25-Nov-13       16-Dec-13      LIN
3   DROP    4     15-Dec-13  25-Nov-13       16-Dec-13      LIN
3   DROP    4     15-Dec-13  25-Nov-13       16-Dec-13      POL
3   DROP    1     15-Dec-13  25-Feb-13       18-Mar-13      KIS
WANT:

UNIT  TERM_START_DTE  DROP_BEFORE_END_DATE  TERM
KIS   1               1                     1
POL   1                                     2
POL   1                                     3
POL   1                                     4
LIN   1               1                     4
LIN   1                                     2
PUI   1                                     3
USING:
Oracle Database 10g Enterprise Edition Release 10.2.0.5.0 - 64bit

with
dan_gr4 as
(select '1' id,
'ENROL' status,
'1' term,
to_date('15-Mar-13','dd-MON-rr') enrol_dte,
to_date('25-Feb-13','dd-MON-rr') term_start_dte,
to_date('18-Mar-13','dd-MON-rr') term_end_dte,
'KIS' unit
from dual
union all
select '1','DROP','2',to_date('27-MAY-13','dd-MON-rr'),to_date('27-MAY-13','dd-MON-rr'),to_date('17-JUN-13','dd-MON-rr'),'POL' from dual union all
select '1','DROP','2',to_date('27-JUN-13','dd-MON-rr'),to_date('27-MAY-13','dd-MON-rr'),to_date('17-JUN-13','dd-MON-rr'),'LIN' from dual union all
select '1','DROP','3',to_date('27-JUN-13','dd-MON-rr'),to_date('27-MAY-13','dd-MON-rr'),to_date('17-JUN-13','dd-MON-rr'),'PUI' from dual union all
select '2','DROP','3',to_date('26-SEP-13','dd-MON-rr'),to_date('26-AUG-13','dd-MON-rr'),to_date('16-SEP-13','dd-MON-rr'),'POL' from dual union all
select '2','ENROL','4',to_date('26-NOV-13','dd-MON-rr'),to_date('25-NOV-13','dd-MON-rr'),to_date('16-DEC-13','dd-MON-rr'),'LIN' from dual union all
select '3','DROP','4',to_date('15-DEC-13','dd-MON-rr'),to_date('25-NOV-13','dd-MON-rr'),to_date('16-DEC-13','dd-MON-rr'),'LIN' from dual union all
select '3','DROP','4',to_date('15-DEC-13','dd-MON-rr'),to_date('25-NOV-13','dd-MON-rr'),to_date('16-DEC-13','dd-MON-rr'),'POL' from dual union all
select '3','DROP','1',to_date('15-DEC-13','dd-MON-rr'),to_date('25-FEB-13','dd-MON-rr'),to_date('18-MAR-13','dd-MON-rr'),'KIS' from dual
)
select unit,term,max(decode(enrolled,'within','Late')) late_enrolment,max(dropped) when_dropped
from (select unit,term,
case when status = 'ENROL'
then case when enrol_dte between term_start_dte and term_end_dte
then 'within'
else 'outside'
end
end enrolled,
case when status = 'DROP'
then case when enrol_dte < term_start_dte
then 'before start'
when enrol_dte < term_end_dte
then 'before end'
else 'after end'
end
end dropped
from dan_gr4
)
group by unit,term
order by unit,term
UNIT  TERM  LATE_ENROLMENT  WHEN_DROPPED
KIS   1     Late            after end
LIN   2                     after end
LIN   4     Late            before end
POL   2                     before end
POL   3                     after end
POL   4                     before end
PUI   3                     after end
Regards
Etbin
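The conditional-count idea — count ENROLs whose date falls inside the term window, and DROPs dated before the term end — can be cross-checked with SQLite via Python. This is an illustration with a subset of the rows and ISO date strings, not the exact Oracle query:

```python
import sqlite3

# Sketch of the requested counts: ENROLs within the term window and DROPs
# before the term end, grouped by unit and term. Adapted to SQLite with
# ISO date strings so string comparison orders dates correctly.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dan_gr4 (id INT, status TEXT, term INT,
                      enrol_dte TEXT, term_start_dte TEXT, term_end_dte TEXT,
                      unit TEXT);
INSERT INTO dan_gr4 VALUES
 (1,'ENROL',1,'2013-03-15','2013-02-25','2013-03-18','KIS'),
 (3,'DROP', 1,'2013-12-15','2013-02-25','2013-03-18','KIS'),
 (1,'DROP', 2,'2013-05-27','2013-05-27','2013-06-17','POL');
""")
rows = conn.execute("""
SELECT unit, term,
       COUNT(CASE WHEN status='ENROL'
                   AND enrol_dte BETWEEN term_start_dte AND term_end_dte
             THEN 1 END) AS enrols_in_term,
       COUNT(CASE WHEN status='DROP' AND enrol_dte < term_end_dte
             THEN 1 END) AS drops_before_end
FROM dan_gr4
GROUP BY unit, term
ORDER BY unit, term
""").fetchall()
print(rows)
```

The KIS drop dated 15-Dec falls after the term 1 end date, so it is excluded from drops_before_end, which is the behaviour the question asks for.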
Select max date from a table with multiple records
I need help writing an SQL to select max date from a table with multiple records.
Here's the scenario. There are multiple SA_IDs repeated with various EFFDT (dates). I want to retrieve the most recent effective date so that the SA_ID is unique. Looks simple, but I can't figure this out. Please help.
SA_ID CHAR_TYPE_CD EFFDT CHAR_VAL
0000651005 BASE 15-AUG-07 YES
0000651005 BASE 13-NOV-09 NO
0010973671 BASE 20-MAR-08 YES
0010973671 BASE 18-JUN-10 NO

Hi,
Welcome to the forum!
Whenever you have a question, post a little sample data in a form that people can use to re-create the problem and test their ideas.
For example:
CREATE TABLE table_x
( sa_id NUMBER (10)
, char_type VARCHAR2 (10)
, effdt DATE
, char_val VARCHAR2 (10)
);
INSERT INTO table_x (sa_id, char_type, effdt, char_val)
VALUES (0000651005, 'BASE', TO_DATE ('15-AUG-2007', 'DD-MON-YYYY'), 'YES');
INSERT INTO table_x (sa_id, char_type, effdt, char_val)
VALUES (0000651005, 'BASE', TO_DATE ('13-NOV-2009', 'DD-MON-YYYY'), 'NO');
INSERT INTO table_x (sa_id, char_type, effdt, char_val)
VALUES (0010973671, 'BASE', TO_DATE ('20-MAR-2008', 'DD-MON-YYYY'), 'YES');
INSERT INTO table_x (sa_id, char_type, effdt, char_val)
VALUES (0010973671, 'BASE', TO_DATE ('18-JUN-2010', 'DD-MON-YYYY'), 'NO');
COMMIT;

Also, post the results that you want from that data. I'm not certain, but I think you want these results:
SA_ID LAST_EFFD
651005 13-NOV-09
10973671 18-JUN-10

That is, the latest effdt for each distinct sa_id.
Here's how to get those results:
SELECT sa_id
, MAX (effdt) AS last_effdt
FROM table_x
GROUP BY sa_id
;
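The GROUP BY answer above can be verified quickly with SQLite via Python; this is a sketch with ISO date strings standing in for the Oracle DATE values:

```python
import sqlite3

# Sketch of the accepted answer: MAX(effdt) per sa_id, adapted to SQLite
# so it can be run without an Oracle instance.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE table_x (sa_id INT, char_type TEXT, effdt TEXT, char_val TEXT);
INSERT INTO table_x VALUES
 (651005,   'BASE', '2007-08-15', 'YES'),
 (651005,   'BASE', '2009-11-13', 'NO'),
 (10973671, 'BASE', '2008-03-20', 'YES'),
 (10973671, 'BASE', '2010-06-18', 'NO');
""")
rows = conn.execute("""
SELECT sa_id, MAX(effdt) AS last_effdt
FROM table_x
GROUP BY sa_id
ORDER BY sa_id
""").fetchall()
print(rows)
```

Each sa_id appears once, paired with its most recent effdt, matching the results shown in the answer.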
Get articles with max date only
Hello Everyone
I've got a little problem with an SQL query. I have two tables and a lot of articles. The articles are listed multiple times with different dates. I want to select the following columns:
p.productname
i.amount
i.date
The problem is, that every product only should be listed one time (something like UNIC or DISTINCT). And it should be the product with the highest date. Is there something like MAX(date) that I can use?
What I already have is...
SELECT
p.productname,
i.amount,
i.date
FROM op_inventory i
LEFT JOIN products p
ON p.ItemID = i.fk_article
Now, how can I solve my problem above?...
Greets Dollique

Hmm, thanks, but that didn't solve my problem.
I've just found a German website with a detailed explanation of the solution that mack gave me.
I used that one and it seems to work quite fine, except that I somehow can't use GROUP BY. Every time I use it, it returns an Executing DB error (like before)...
I'll see if I can solve that problem...
Thx you anyway!
Toby
PS:
My code now is
SELECT
i.id,
i.amount,
i.date,
p.productname
FROM op_inventory i
LEFT JOIN products p
ON p.ItemID = i.fk_article
<!---JOIN (
SELECT
id id_m,
MAX(date) maxdate
FROM op_inventory
GROUP BY id_m
) temp
ON i.id = temp.id_m AND i.date = temp.maxdate--->
<cfoutput>#gi_where#</cfoutput>
i.date = (
SELECT MAX(o.date)
FROM op_inventory o
WHERE o.fk_article = i.fk_article
)
<!---GROUP BY i.id--->
And the content of #gi_where# is:
WHERE p.category = #lb_group# AND
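The correlated subquery used above (i.date equal to the per-article MAX) is the standard latest-row-per-group pattern. A runnable sketch in SQLite via Python, with hypothetical sample rows; table and column names follow the thread:

```python
import sqlite3

# Keep only the inventory row whose date equals the per-article maximum,
# so each product is listed once with its most recent amount.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE products (ItemID INT, productname TEXT);
CREATE TABLE op_inventory (id INT, fk_article INT, amount INT, date TEXT);
INSERT INTO products VALUES (1,'Widget'), (2,'Gadget');
INSERT INTO op_inventory VALUES
 (10, 1, 5, '2024-01-01'),
 (11, 1, 8, '2024-02-01'),
 (12, 2, 3, '2024-01-15');
""")
rows = conn.execute("""
SELECT p.productname, i.amount, i.date
FROM op_inventory i
LEFT JOIN products p ON p.ItemID = i.fk_article
WHERE i.date = (SELECT MAX(o.date)
                FROM op_inventory o
                WHERE o.fk_article = i.fk_article)
ORDER BY p.productname
""").fetchall()
print(rows)
```

Widget's older row (amount 5) is filtered out; only the row at each article's maximum date survives.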
Trying to get max date grouped by type
Hello,
I have a request table with:
request name
request start date
The requests run daily so I have multiple records. In Answers I want to display request name and max(request start date), grouped by request name. Is it possible to do this in Answers only? If I must use the repository, how do I do it? I'm new to building subject areas.

You can do this in Answers only, if you don't have access to the RPD.
Sol 1: Create a report with two columns, request name and request start date. Open the pivot table, add request start date to Measures, and apply an aggregation of Max on the date.
Sol 2: Create a report with two columns, request name and request start date, and change the Fx of request start date to max(request start date by request name). This way the table view also shows the max date by request name.
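Sol 2's max(request start date by request name) is a per-group maximum; outside OBIEE the same idea can be sketched in plain Python (hypothetical request data):

```python
from collections import defaultdict
from datetime import date

# Hypothetical request log; the goal mirrors Sol 2: reduce each request
# name to its most recent start date.
requests = [
    ("Daily Sales", date(2024, 1, 1)),
    ("Daily Sales", date(2024, 1, 3)),
    ("Inventory",   date(2024, 1, 2)),
]

latest = defaultdict(lambda: date.min)
for name, start in requests:
    latest[name] = max(latest[name], start)

# One row per request name with its most recent start date.
result = sorted(latest.items())
print(result)
```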
File Count with selected date range
Hi,
Our requirement is to get the file count for a date range selected by the user from two SharePoint date-time controls, i.e. dtp1 and dtp2, into a data table. I am able to get the file count of a specific folder from the Pages library through the code below. Now I need to take the selected date range from the two date-time picker controls, check whether each item's created date is within that range, and if so include it in the file count.
So please share your ideas/thoughts on how to do this.
SPList list = wikiweb.Lists["Pages"];
SPFolderCollection oFolders = list.RootFolder.SubFolders["foldername"].SubFolders;
DataTable dt = new DataTable();
dt.Columns.Add("Column1");
DataRow dr;
if (oFolders.Count > 0)
{
    foreach (SPFolder oFolder in oFolders)
    {
        if (!oFolder.Name.Equals("Forms"))
        {
            dr = dt.NewRow();
            dr["Column1"] = oFolder.ItemCount.ToString();
            dt.Rows.Add(dr);
        }
    }
}
Regards,
Sudheer

Thanks & Regards, Sudheer

Hi,
I have modified the code as below
if((DateTime)(oFolder.Item.File.TimeCreated>dtFromDate.SelectedDate)&&(DateTime)(oFolder.Item.File.TimeCreated<dtToDate.SelectedDate))
But still it is throwing the error.
Please share your ideas on the same.
Regards,
Sudheer
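A side note on the modified snippet above: casting the boolean comparison to DateTime, as in (DateTime)(a > b), is what makes the compiler complain — a date-range check needs no cast, just two comparisons on the timestamp. The logic, sketched in Python with hypothetical timestamps standing in for the SharePoint objects:

```python
from datetime import datetime

# The range test needs no cast: compare the created timestamp directly
# against both bounds. Only the comparison logic is illustrated here.
def created_in_range(time_created, from_date, to_date):
    return from_date < time_created < to_date

files = [datetime(2024, 1, 5), datetime(2024, 2, 20), datetime(2024, 3, 1)]
count = sum(created_in_range(t, datetime(2024, 1, 1), datetime(2024, 2, 28))
            for t in files)
print(count)
```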