Count query

I already asked a related question, and now I am having a further problem with it.
I created the schema at the following link:
http://sqlfiddle.com/#!3/1fcdb/7
The query works for me, but there is a problem. The data in the database may change from time to time. Sometimes no Engr is posted, which is fine; when an Engr is posted, the output changes.
I use a code (KDPR) for the different grades of people in the company:
1-3 are Engrs, 4-5 are Supdt, and 6 and above are Mechanic.
I do not want to count Engrs from the table; I want to count only Supdt and Mechanic. That is, only rows with KDPR 4-5 (Supdt) or KDPR >= 6 (Mechanic) should be counted.
I am a System Administrator at Vadodara
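A sketch of the kind of conditional count this describes; the table name below is a placeholder based on the question (the real schema is in the sqlfiddle link), with KDPR as the grade code:
SELECT SUM(CASE WHEN KDPR BETWEEN 4 AND 5 THEN 1 ELSE 0 END) AS Supdt_Count,
       SUM(CASE WHEN KDPR >= 6 THEN 1 ELSE 0 END)            AS Mechanic_Count
FROM   Posting          -- hypothetical table name; use the table from your sqlfiddle schema
WHERE  KDPR >= 4        -- Engrs (KDPR 1-3) never enter the count, whether or not any are posted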

Hi, the following video and article may be helpful:
http://www.youtube.com/watch?v=sOp_IS9kQZY 
http://technet.microsoft.com/en-us/library/dd299414(v=sql.100).aspx
Regards, Leo

Similar Messages

  • Incorrect count query

    I'm using CursoredStream to fetch a large result set. When doing a cursor.size() Toplink generates the wrong count query. I do have group value functions in the query as shown below.
    For e.g. this is the cursor query:
    --- Actual select query ---
    SELECT t0.AWARD_ID, t0.AWARD_NUMBER, t0.FINAL_AMOUNT, t2.NAME,
           SUM(t1.REPORT_AMOUNT), t4.FISCAL_YEAR, t5.DESCRIPTION
    FROM REF_ENTITY t6, T_CODE t5, SOL t4, APP t3, APP_VERSION t2, SUBG t1, AWARD t0
    WHERE ((((t4.FISCAL_YEAR = '1993') AND (t6.REF_ENTITY_ID = 5)) AND (t4.SUBG_REPORT = 'Y'))
      AND ((t5.T_CODE_ID = t0.T_CODE_ID_STATUS) AND ((t2.APP_ID = t3.APP_ID)
      AND ((t6.REF_ENTITY_ID = t4.REF_ENTITY_ID) AND ((t4.SOL_ID = t3.SOL_ID)
      AND ((t3.APP_ID = t0.APP_ID) AND (t0.AWARD_ID = t1.AWARD_ID)))))))
    GROUP BY t0.FINAL_AMOUNT, t0.AWARD_ID, t0.AWARD_NUMBER, t4.FISCAL_YEAR, t5.DESCRIPTION, t2.NAME
    ORDER BY t0.AWARD_NUMBER ASC
    The count query generated by Toplink is:
    SELECT COUNT(*)
    FROM REF_ENTITY t4, AWARD t3, APP t2, SOL t1, SUBG t0
    WHERE ((((t1.FISCAL_YEAR = ?) AND (t4.REF_ENTITY_ID = ?))
    AND (t1.SUBG_REPORT = 'Y')) AND ((t4.REF_ENTITY_ID = t1.REF_ENTITY_ID)
    AND ((t1.SOL_ID = t2.SOL_ID) AND ((t2.APP_ID = t3.GMS_APP_ID)
    AND (t3.AWARD_ID = t0.AWARD_ID)))))
    I have to provide my own ValueReadQuery to get the correct count in the above case. Are there any rules for determining when we have to explicitly provide our own count query?
    cheers
    p.s: Toplink version 9.0.3.4

    Abe,
    This sounds like a bug. I assume you are using a ReportQuery to generate the initial SQL. You should only have to provide a ValueReadQuery to a cursored query when using SQL, stored-procedure, or to optimize what is being generated.
    This looks like the group-by on your ReportQuery is being ignored on the size query. The best plan is to submit this to support with the TopLink code used and the information provided in this post.
    In the meantime, the workaround of providing your own query is probably the best solution.
    Doug
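    For reference, when the cursor's query has a GROUP BY, the count the size query should return is the number of groups. A hand-written ValueReadQuery can get that by wrapping the original statement in an inline view; a sketch built from the query above (the inner select list does not affect the count, so it is abbreviated to one column):
    SELECT COUNT(*)
    FROM (SELECT t0.AWARD_ID
          FROM REF_ENTITY t6, T_CODE t5, SOL t4, APP t3, APP_VERSION t2, SUBG t1, AWARD t0
          WHERE ((((t4.FISCAL_YEAR = '1993') AND (t6.REF_ENTITY_ID = 5)) AND (t4.SUBG_REPORT = 'Y'))
            AND ((t5.T_CODE_ID = t0.T_CODE_ID_STATUS) AND ((t2.APP_ID = t3.APP_ID)
            AND ((t6.REF_ENTITY_ID = t4.REF_ENTITY_ID) AND ((t4.SOL_ID = t3.SOL_ID)
            AND ((t3.APP_ID = t0.APP_ID) AND (t0.AWARD_ID = t1.AWARD_ID)))))))
          GROUP BY t0.FINAL_AMOUNT, t0.AWARD_ID, t0.AWARD_NUMBER,
                   t4.FISCAL_YEAR, t5.DESCRIPTION, t2.NAME)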

  • Count Query Performance

    How can we improve the performance of a count query?
    Suppose I have a table A that holds more than 50 million rows.
    Now if I want to count the number of rows, which is the best option?
    1) Select count(*) from A.
    Definitely not as it is doing a full table scan
    2) Select count(primary_key) from A.
    3) Select count(row_id) from A.
    One more question: what's the difference between select count(*) from table_a and select count(1) from table_a? Many people suggest count(1) and I don't see any reason for it.
    Kindly guide me.

    > Please see my points 1,2 and 3.
    Can this change the execution plan (path) of the CBO in any way?
    1. count rows
    2. count rows using primary key
    3. counting rows using physical row addresses
    The fact is that the rows, and all the rows, need to be counted. The CBO will choose the most "attractive" I/O path - i.e. the smallest one, the one with the least amount of I/O. It does not need tricks like COUNT(1) or COUNT(PK) or COUNT(ROWID) in order to make an appropriate decision.
    Example:
    SQL> create table tab1 nologging as select level as ID, LPAD('*',4000,'*') as STUFF from dual connect by level <= 10000;
    Table created.
    SQL> set autotrace on
    Running a SELECT COUNT(*) without any alternate I/O paths (no indexes exist)
    SQL> select count(*) from tab1;
    COUNT(*)
    10000
    Execution Plan
    Plan hash value: 899213575
    | Id | Operation | Name | Rows | Cost (%CPU)| Time |
    | 0 | SELECT STATEMENT | | 1 | 2306 (4)| 00:00:28 |
    | 1 | SORT AGGREGATE | | 1 | | |
    | 2 | TABLE ACCESS FULL| TAB1 | 9982 | 2306 (4)| 00:00:28 |
    Note
    - dynamic sampling used for this statement
    Statistics
    28 recursive calls
    0 db block gets
    10087 consistent gets
    10000 physical reads
    0 redo size
    208 bytes sent via SQL*Net to client
    238 bytes received via SQL*Net from client
    2 SQL*Net roundtrips to/from client
    0 sorts (memory)
    0 sorts (disk)
    1 rows processed
    Creating a PK index
    SQL> alter table tab1 add constraint pk_tab1 primary key(id) using index;
    Table altered.
    Running the same SELECT COUNT(*) - but the CBO now sees that the PK index
    is smaller and cheaper to scan than scanning the table
    SQL> select count(*) from tab1;
    COUNT(*)
    10000
    Execution Plan
    Plan hash value: 1796789124
    | Id | Operation | Name | Rows | Cost (%CPU)| Time |
    | 0 | SELECT STATEMENT | | 1 | 9 (23)| 00:00:01 |
    | 1 | SORT AGGREGATE | | 1 | | |
    | 2 | INDEX FAST FULL SCAN| PK_TAB1 | 9982 | 9 (23)| 00:00:01 |
    Note
    - dynamic sampling used for this statement
    Statistics
    194 recursive calls
    0 db block gets
    131 consistent gets
    20 physical reads
    0 redo size
    222 bytes sent via SQL*Net to client
    238 bytes received via SQL*Net from client
    2 SQL*Net roundtrips to/from client
    5 sorts (memory)
    0 sorts (disk)
    1 rows processed
    SQL>
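    As for COUNT(1) versus COUNT(*): Oracle treats them identically, so COUNT(1) buys you nothing. A quick way to convince yourself (a sketch against the same demo table) is to repeat the autotrace run with COUNT(1) and compare:
    SQL> select count(1) from tab1;
    The plan should again show the INDEX FAST FULL SCAN of PK_TAB1 with the same cost as the COUNT(*) run above.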

  • Count query taking time

    I have a query --> select c1,c2,c3 from table1. This query takes only a few milliseconds. But when I take the count from the same query, i.e. when I execute select count(c1,c2,c3) from table1, it takes a very long time (about 1 min). table1 contains about 25000 rows. Please help me improve the performance of the count query.

    Satej wrote:
    I have a query --> select c1,c2,c3 from table1. This query takes only a few milliseconds. But when I take the count from the same query, i.e. when I execute select count(c1,c2,c3) from table1, it takes a very long time (about 1 min).
    Classic misperception of Toad, SQL Navigator and similar tool users. All these tools fetch just the first screen of results and show the time it took to fetch just that, not the time to fetch all rows. In order to count, you have to fetch all rows; that is why select count(*) takes longer. But 1 min for 25000 rows is a bit long. Check the execution plan to see what is going on.
    SY.
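    To see what the database is actually doing, a minimal sketch using EXPLAIN PLAN (assuming SQL*Plus or a similar client, and the table name from the post; note that COUNT takes at most one expression, so COUNT(*) is used here):
    EXPLAIN PLAN FOR
    SELECT COUNT(*) FROM table1;
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);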

  • Invalid SQL generated for count query.

    A SQL statement was generated with the table name information missing when
    a count SQL query was issued (e.g. SELECT COUNT(*) FROM WHERE <SOME
    EXPRESSION>). The same query worked fine before switching to vertical
    class mapping.
    Basically no queries work with the result set to "count(this)". The
    temporary work-around was to retrieve the collection and get the size. I
    would rather not have to do this.
    Any ideas?

    Brendan,
    What version of Kodo is this happening with? Can you send a test case
    demonstrating this to [email protected]?
    -Patrick
    Brendan Brothers wrote:
    A SQL statement was generated with the table name information missing when
    a count SQL query was issued (e.g. SELECT COUNT(*) FROM WHERE <SOME
    EXPRESSION>). The same query worked fine before switching to vertical
    class mapping.
    Basically no queries work with the result set to "count(this)". The
    temporary work-around was to retrieve the collection and get the size. I
    would rather not have to do this.
    Any ideas?

  • Sub-Select Count query breaking TOAD

    Oracle 10.2.0.4.0
    Running TOAD 9.1
    I am running some SQL on our eBusiness Suite:
    SELECT pha.segment1
         , pha.type_lookup_code
         , (SELECT COUNT(DISTINCT pha2.po_header_id)
              FROM po.po_headers_all pha2
                 , po.po_lines_all pla
             WHERE pha2.po_header_id = pla.po_header_id
               AND pla.contract_id = pha.po_header_id) po_count
         , (SELECT MAX(pha2.creation_date)
              FROM po.po_headers_all pha2
                 , po.po_lines_all pla
             WHERE pha2.po_header_id = pla.po_header_id
               AND pla.contract_id = pha.po_header_id) latest_cpa_po
      FROM po.po_headers_all pha
         , po.po_vendors pv
         , po.po_vendor_sites_all pvsa
    WHERE pha.vendor_id = pv.vendor_id
       AND pha.vendor_site_id = pvsa.vendor_site_id
    --   AND pv.VENDOR_NAME LIKE 'H%'
       AND pha.vendor_id = 98
       AND pha.type_lookup_code = 'CONTRACT'
       AND pha.org_id IN(7041, 7042);
    The above query runs quickly (approx. 1 second). If I take out the AND pha.vendor_id = 98 then the query takes a few minutes to run.
    When I try to export it, or scroll down to view > 500 rows, TOAD crashes.
    I know this isn't a TOAD forum, but I think that this is probably an issue with my no doubt rubbish SQL.
    If I take out this sub-select, then the problem doesn't happen:
         , (SELECT COUNT(DISTINCT pha2.po_header_id)
              FROM po.po_headers_all pha2
                 , po.po_lines_all pla
             WHERE pha2.po_header_id = pla.po_header_id
               AND pla.contract_id = pha.po_header_id) po_count
    However, I can't work out a better way of getting the data I need.
    The sub-select counts POs which have been raised where the contractID on the PO line is the same as the PO Header ID from the main query.
    Any advice please, on what I could do to sort this out would be much appreciated.
    Thanks!

    Hi,
    It looks like you can replace both scalar sub-queries with a join, like this:
    WITH     header_lines_summary     AS
    (
         SELECT    pla.contract_id
              ,       COUNT (DISTINCT pha2.po_header_id)     AS po_count
              ,       MAX (pha2.creation_date)          AS latest_cpa_po
              FROM        po.po_headers_all pha2
                 ,        po.po_lines_all   pla
             WHERE        pha2.po_header_id = pla.po_header_id
          GROUP BY       pla.contract_id
    )                                        -- Everything up to this line is new
    SELECT pha.segment1
         , pha.type_lookup_code
         , hls.po_count                              -- Changed
         , hls.latest_cpa_po                         -- Changed
      FROM po.po_headers_all     pha
         , po.po_vendors           pv
         , po.po_vendor_sites_all      pvsa
         , header_lines_summary     hls                    -- New
    WHERE pha.vendor_id          = pv.vendor_id
       AND pha.vendor_site_id     = pvsa.vendor_site_id
       AND pha.po_header_id          = hls.contract_id (+)          -- New
    --   AND pv.VENDOR_NAME      LIKE 'H%'
       AND pha.vendor_id           = 98
       AND pha.type_lookup_code      = 'CONTRACT'
       AND pha.org_id           IN (7041, 7042);
    Aside from the sub-query (which is entirely new), the query above is just what you posted, with 2 lines changed and 2 lines added, as marked.
    This should be more efficient, but I don't know for certain that it will solve the Toad problem.
    I hope this answers your question.
    If not, post a little sample data (CREATE TABLE and INSERT statements, relevant columns only) for all tables, and also post the results you want from that data.
    It never hurts to say what version of Oracle you're using.

  • Help with a COUNT query...

    I have created the following tables:
    create table ZONE (
    ZoneNo number(2),
    State char(3),
    PRIMARY KEY (ZoneNo)
    );
    create table SCHOOL (
    SchoolType char(1),
    SchoolNumber number(4),
    SchoolName char(40),
    SchoolZone number(2),
    PRIMARY KEY (SchoolNumber,SchoolType),
    FOREIGN KEY (SchoolZone) REFERENCES ZONE
    );
    create table TEACHER (
    TeacherCode char(5),
    Surname char(30),
    FirstName char(30),
    Gender char(1),
    WorksFor number(4),
    YearCommenced number(4),
    LivesIn number(2),
    TeachingType char(1),
    PRIMARY KEY (TeacherCode),
    FOREIGN KEY (WorksFor, TeachingType) REFERENCES SCHOOL(SchoolNumber,SchoolType),
    FOREIGN KEY (LivesIn) REFERENCES ZONE
    );
    create table COMPUTER (
    SerialNumber char(10),
    Description char(50),
    RegisteredTo char(5),
    PRIMARY KEY (SerialNumber),
    FOREIGN KEY (RegisteredTo) REFERENCES TEACHER
    );
    My question is, how would I make a COUNT (or similar) query that would display each SchoolName and show the total number of teachers who work at that school in the second column?

    Yes, you need to join on all the foreign key columns.
    This is the usual, safe, and universal approach.
    insert into ZONE values (6,'CA');
    insert into SCHOOL values ('E',600,'Oracle Elemental School',6);
    insert into SCHOOL values ('J',600,'Oracle Middle School',6);
    insert into SCHOOL values ('S',600,'Oracle High School',6);
    insert into TEACHER values (12001,'Suzuki','Akiko','F',600, 2005, 6, 'E');
    insert into TEACHER values (12002,'Scott','Tiger','M',600, 2005, 6, 'E');
    insert into TEACHER values (12003,'Zhong','San','M',600, 2005, 6, 'J');
    insert into TEACHER values (12004,'Li','Si','M',600, 2005, 6, 'J');
    insert into TEACHER values (12005,'Wang','Wu','M',600, 2005, 6, 'J');
    insert into TEACHER values (12006,'Kim','Youngae','F',600, 2005, 6, 'S');
    insert into TEACHER values (12007,'Kim','Jiu','F',600, 2005, 6, 'S');
    insert into TEACHER values (12008, null,'Joe','M',600, 2005, 6, 'S');
    column schoolname format a30
    set lines 120
    select s.SchoolName
    ,count(*) "ALL"
    ,count(t.TeacherCode) "ALL(CODE)"
    ,count(distinct t.TeacherCode) "DISTINCT(CODE)"
    ,count(t.surname) "ALL(NAME)"
    ,count(distinct t.surname) "DISTINCT(NAME)"
    from School s, Teacher t
    where s.SchoolNumber = t.WorksFor
    and s.SchoolType = t.TeachingType
    group by s.SchoolName
    SCHOOLNAME                            ALL  ALL(CODE) DISTINCT(CODE)  ALL(NAME) DISTINCT(NAME)
    Oracle Elemental School                 2          2              2          2              2
    Oracle High School                      3          3              3          2              1
    Oracle Middle School                    3          3              3          3              3
    select s.SchoolName
    ,count(*) "ALL"
    ,count(t.TeacherCode) "ALL(CODE)"
    ,count(distinct t.TeacherCode) "DISTINCT(CODE)"
    ,count(t.surname) "ALL(NAME)"
    ,count(distinct t.surname) "DISTINCT(NAME)"
    from School s, Teacher t
    where s.SchoolNumber = t.WorksFor
    --and s.SchoolType = t.TeachingType
    group by s.SchoolName
    SCHOOLNAME                            ALL  ALL(CODE) DISTINCT(CODE)  ALL(NAME) DISTINCT(NAME)
    Oracle Elemental School                 8          8              8          7              6
    Oracle High School                      8          8              8          7              6
    Oracle Middle School                    8          8              8          7              6
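    If you also want schools that currently have no teachers to appear with a count of 0, an outer join does it; a sketch against the same sample tables:
    select s.SchoolName
    ,count(t.TeacherCode) "TEACHERS"
    from School s left outer join Teacher t
    on s.SchoolNumber = t.WorksFor
    and s.SchoolType = t.TeachingType
    group by s.SchoolName
    Counting t.TeacherCode (rather than *) is what makes unmatched schools show 0 instead of 1.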

  • Using a "Sum" Calculated Field on a "Count" query column?

    Here's my query using the Query Report Builder:
    SELECT Col1, COUNT(Col1) AS Count_Column
    FROM Table
    GROUP BY Col1
    It generates a report that lists all the values of column
    "Col1" and how many times each value was used in Col1
    For instance, if Col1 contained the value 'A' 12 times, 'B' 6
    times, and 'C' 19 times, the report would generate this:
    A - 12
    B - 6
    C - 19
    What I need as a column footer is the total count of all the
    values, which in this case is 12+6+19=37.
    I am using a calculated field, setting the data type to
    Double, the calculation to Sum, and the "perform calculation on" to
    'query.Count_Column'. Reset Field When is set to None.
    When I run the report, it doubles the last number in the
    report's Count column (19) and displays 38 on the page. I tested
    this with another column and it doubled the last number in the
    report as well.
    How can I get it to properly Sum my Count_Column?

    Hi,
    You need to check note 208366.1 which explains why totals can be blank. Without knowing the detail of you decode function it is hard to say what needs to be changed. Try putting a sum in front of the decode e.g.
    sum(decode(period, 'Jan period', value, 0))
    Hope that helps,
    Rod West
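    If the report tool keeps fighting you, another option is to let the database produce the grand total itself; a sketch, assuming the same column names as the question and a database that supports ROLLUP (Oracle, SQL Server and others do):
    SELECT Col1, COUNT(Col1) AS Count_Column
    FROM Table1          -- hypothetical table name; the original post just says "Table"
    GROUP BY ROLLUP (Col1)
    The extra row where Col1 is NULL carries the overall total (37 in the example), which the report can print as the footer.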

  • Multiple count query

    I'm trying to write a function to compare two different counts.
    I have patients on multiple drugs. Some drugs are allergy drugs, and at least one allergy drug must be marked as primary.
    select count(drug_role) as vAllergyCount, count(primary_flag) as PF, ap.patient_id
    from ae_all_drugs aed, ae_patient ap
    where drug_role = 'Allergy'
    and aed.patient_id = ap.patient_id
    and ap.patient_id like '2012%'
    group by ap.patient_id,primary_flag
    order by ap.patient_id
    when grouped by patient id and primary flag I would receive a record like this:
    VAlllergyCount | PF | patient_id
    1 | 0 | 2012AB1
    3 | 1 | 2012AB1
    If i only group by patient_id I get this (which is what I want):
    VAlllergyCount | PF| patient_id
    4 | 1 | 2012AB1
    But if I try to compare the two:
    IF vAllergyCount > 1 and PF = 1 THEN 'SUCCESS' ELSIF 'FAIL'
    I receive the error message "exact fetch returns more than requested number of rows" because the results are really those of my first query.
    Can anyone help me re-write my query to compare the values?
    P.S. I'm very new to this and this forum and I apologize for not inserting code tags. I don't know how.

    Hi,
    979105 wrote:
    Thank you for your help.
    I need to write a function because the SQL query is going to be used as an edit check for another program. So yes, it will be worth my time.
    You are correct. I have a patient on multiple drugs. I want to know if there is more than one allergy drug (drug_role) and at least ONE primary allergy drug.
    I did initially write my function with the patient_id as an input value. Why would I not need the vAllergyCount or the pf if that is an argument?
    If all you want to know is 'SUCCESS' or 'FAIL', then write a query that produces 'SUCCESS' or 'FAIL'. Inside the query, it may be handy to have something like vAllergyCount or pf, but that doesn't mean that they will be returned by the query, or that you need PL/SQL variables for them.
    Here's my original function:
    That's a PROCEDURE, not a function. Sometimes, people use the word "procedure" loosely, to mean any kind of stored code, including functions, but never the other way around.
    Don't forget to put another code tag at the end of the formatted section.
    Code tags are supposed to occur in pairs. If you have an odd number of code tags, the last one has no effect, and is displayed as regular text.
    PROCEDURE COUNT_ALLERGY_MED (
         ivPatient_ID IN VARCHAR2,
         ovErrorText          OUT     VARCHAR2 )
    IS
         vCase_ID          VARCHAR2 (25);
         vSusCount           NUMBER (3);
         vSuspectMed          VARCHAR2 (1);
         --vPrimaryFlag     VARCHAR2(1);
    cursor c1 is
         select count(*) as vSusCount
         from ae_suspect_drugs aed
         where drug_role = 'S'
         and aed.case_id = ivCase_ID
         --group by ivcase_id;
    BEGIN
    open c1;
    fetch c1 into vSusCount;
    close c1;
    IF vSusCount < 1 /*AND vPrimaryFlag IS NULL*/ THEN
         ovErrorText := 'Suspect Drug Count is less than 1' || 'SusCount: ' || vSusCount;
         ELSIF vSusCount = 1 /*AND vPrimaryFlag IS NULL*/ THEN
         ovErrorText := 'Suspect Drug count is equal to 1' || 'SusCount: ' || vSusCount;
         ELSIF vSusCount > 1 /*AND vPrimaryFlag IS NULL*/ THEN
         ovErrorText := 'Suspect Drug count is greater than 1 ' || 'SusCount: ' || vSusCount;
         ELSE
         ovErrorText := 'All criteria met';
    END IF;
    END COUNT_SUSPECT_MED;
    It seems like it would be more convenient to have a FUNCTION, rather than a PROCEDURE; that way you could call the function directly in a SQL statement (e.g., in the WHERE clause of a query or DML statement) as well as in PL/SQL.
    The procedure you posted seems to be quite a bit different from the problem you asked about earlier.
    Here's one way you might write a function for the earlier problem:
    CREATE OR REPLACE FUNCTION count_allergy_med
    ( ivPatient_ID IN VARCHAR2
    )
    RETURN VARCHAR2
    IS
    return_str VARCHAR2 (50);          -- To be returned
    BEGIN
    SELECT CASE
         WHEN COUNT (drug_role) > 1
              AND COUNT (primary_flag) = 1
              THEN 'SUCCESS'
              ELSE 'FAIL'
         END     
    INTO return_str
    FROM ae_all_drugs     aed
    ,      ae_patient           ap
    WHERE drug_role          = 'Allergy'
    AND     aed.patient_id      = ap.patient_id
    AND     ap.patient_id     = ivPatient_ID;
    RETURN return_str;
    END count_allergy_med;
    Notice that there are no columns in the result set, nor variables in the function, that correspond to vAllergyCount or pf.
    The function above returns 1 of 2 possible values. You could just as well write a function that had 4 possible outcomes, or 4 basic results, each containing a variable number, such as
    Suspect Drug Count is less than 1. SusCount: 0
    I'm not sure why you would want such a result, since 0 is the only possible count that is less than 1, but it's easy enough to produce results like that if you really want them.
    Once again, if you'd like help, post a clear example of what you want. Include CREATE TABLE and INSERT statements for some sample data. Post some code where you might use the function that you're thinking of. Give a couple of different arguments for that function, and the results you would want for each argument given the same sample data.
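    For completeness, a hedged example of how such a function might be called, using a patient id from your earlier post:
    SELECT  count_allergy_med ('2012AB1')   AS allergy_check
    FROM    dual;
    or, as an edit check across patients:
    SELECT  ap.patient_id
    ,       count_allergy_med (ap.patient_id)   AS allergy_check
    FROM    ae_patient  ap
    WHERE   ap.patient_id   LIKE '2012%';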
                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                   
                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                          

  • Customer Count Query

    Hello All
    I need help in writing an optimized query.
    Scenario :
    We have several customers across multiple vendors. We want to get the customer count by vendor for a year.
    In this scenario we want to calculate the customer count for each vendor.
    Example 
    2014  Kroger(Vendor) has 100 customers
    2014  Target(Vendor) has 50 customers
    I want to see, out of those 100 customers for Kroger, which ones were not customers of Kroger in 2012 and 2013 but were customers of Target.
    Similarly, out of those 50 customers for Target, which ones were not customers of Target in 2012 and 2013 but were customers of Kroger.
    In 2012 the same customer might have bought from a different vendor. We are calculating "new to vendor" customers.
    C1  2012   Kroger 100
    C1  2012   Walmart 100
    C2  2012   target 100
    C2  2012   Kroger 100
    C1  2014   target 100
    C2  2014   Kroger 100
    C2  2014   target 100
    C2  2014   walmart 100
    In this case C1 counts towards Target, as he did not previously buy from Target but from other vendors,
    and C2 cannot be counted towards Kroger or Target, because he had already bought from both of them, but he can be counted towards Walmart, as he had not bought from Walmart before.
    I am trying to write a query to get the 2012 and 2013 data for all vendors and then join it with the 2014 data set,
    but I am getting the wrong count.
    Please let me know.

    this?
    SELECT Vendor,
    YEAR(DATEADD(yy,DATEDIFF(yy,0,Datefield),0)) AS Yr,
    COUNT(CustomerID) AS CustomerCount,
    COUNT(CASE WHEN Cnt = 0 THEN CustomerID ELSE NULL END) AS FirstTimeCustomerCount
    FROM Table t
    OUTER APPLY (SELECT COUNT(*) AS Cnt
    FROM Table
    WHERE CustomerID = t.CustomerID
    AND DATEDIFF(yy,0,Datefield) < DATEDIFF(yy,0,t.Datefield)
    AND Vendor = t.Vendor) oa
    GROUP BY Vendor,DATEDIFF(yy,0,Datefield)
    Visakh
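    If you want something closer to the stated rule ("new to this vendor, but seen at some other vendor in 2012-2013"), a sketch using the same hypothetical Table(CustomerID, Vendor, Datefield) layout as above:
    SELECT t.Vendor,
           YEAR(t.Datefield) AS Yr,
           COUNT(DISTINCT t.CustomerID) AS NewToVendorCount
    FROM [Table] t
    WHERE YEAR(t.Datefield) = 2014
      AND NOT EXISTS (SELECT 1 FROM [Table] p      -- was not this vendor's customer in 2012-2013
                      WHERE p.CustomerID = t.CustomerID
                        AND p.Vendor = t.Vendor
                        AND YEAR(p.Datefield) IN (2012, 2013))
      AND EXISTS     (SELECT 1 FROM [Table] o      -- but was some other vendor's customer then
                      WHERE o.CustomerID = t.CustomerID
                        AND o.Vendor <> t.Vendor
                        AND YEAR(o.Datefield) IN (2012, 2013))
    GROUP BY t.Vendor, YEAR(t.Datefield)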

  • Problem with count query

    I am connecting to the excel and running the following query:
    Select Count(*) from [RIF$];
    It is working fine and giving me the total count of rows.
    Now when I try the following version of count:
    Select Count(Distinct Status) from [RIF$];
    where Status is a column name.
    It is giving the following error:
    java.sql.SQLException: [Microsoft][ODBC Excel Driver] Syntax error (missing operator) in query expression 'Count(Distinct Status)'.
    Please help me.
    It's urgent, as always.

    This wouldn't execute in the query analyzer either, so you don't have a Java problem. I am not sure this is optimal, but you can try something like:
    select count(c_id) from (select distinct pcid c_id from contragent) cntr
    HTH
    Mike
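    Applied to the worksheet from the original question, the same idea would look roughly like this (a sketch; the Excel ODBC/Jet dialect does not accept COUNT(DISTINCT ...), but it does accept a derived table):
    SELECT COUNT(*) AS DistinctStatusCount
    FROM (SELECT DISTINCT Status FROM [RIF$]) AS T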

  • Bug SQL Dev 1.5.4 Build MAIN-5940 Exporting count(*) query results

    when trying to export query results containing an analytic count the export dialog fails to open:
    When running the following query
    select count(*) over (partition by dummy order by rownum) xx
      from dual;
    Right-clicking on the results grid and selecting export->xls (or any other format) fails to open the export dialog box.
    Changing the asterisk (*) to another value, e.g. 1 or dummy, corrects this issue for this particular query, as does changing the analytic function to an aggregate function.
    Edited by: Sentinel on Aug 12, 2009 11:05 AM

    I get the same error when exporting the same statement - however, there is a known problem with 1.5.4 and failing on exports of certain statements (see 1.5.3 - Export results to XLS ORA-00936 which talks about there still being problems in 1.5.4 although the thread is titled 1.5.3).
    theFurryOne

  • Simple Count query and fetching data

    Hello,
    I have a table Emp :
    Name, age, sal, dateUpdated
    A 21 100 6/4/10
    B 21 101 6/4/10
    C 32 101 2/2/2
    D 20 100 3/3/3
    I am trying to count the number of people in the same age group AND fetch that number.
    For ex, I want to count the number of people that were updated today and then fetch their age.
    for ex : something like this : select count(dateUpdated) from Emp where date updated is today
    will only return 2 as result.
    How should my query be if I want to also fetch the individual age along with the count?
    cheers,

    You can use {noformat}{noformat} tags to preserve your code format. That would help everyone to read your post better.
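    As for the question itself, a minimal sketch of one way to do it in Oracle, using the column names from the post:
    select age, count(*) as updated_today
    from Emp
    where dateUpdated >= trunc(sysdate)          -- updated today
    and dateUpdated < trunc(sysdate) + 1
    group by age;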

  • Hairy count(*) query

    I am trying to get a count of my rows into 'buckets' by age.
    bucket 1: < 5 minutes
    bucket 2: 5 to 8 minutes
    bucket 3: 8 to 15 minutes
    bucket 4: > 15 minutes
    with c as ( select * from cases b where b.IS_ACTIVE = 'Y')
    select count(*) as count from c where c.RESPONSE_SLA_START_DATE > SYSDATE - (5/24*60)
    union all
    select count(*) as count1 from c where c.RESPONSE_SLA_START_DATE < SYSDATE - (5/(24*60)) and c.RESPONSE_SLA_START_DATe > SYSDATE - (8/(24*60))
    union all
    select count(*) as count2 from c where c.RESPONSE_SLA_START_DATE < SYSDATE - (8/(24*60)) and c.RESPONSE_SLA_START_DATe > SYSDATE - (15/(24*60))
    union all
    select count(*) as count3 from c where c.RESPONSE_SLA_START_DATE < SYSDATE - (15/(24*60));
    Everything works as expected, but my results are all being crammed into a single column "COUNT" rather than 4 separate columns "COUNT", "COUNT1", "COUNT2", "COUNT3".
    COUNT
    0
    0
    0
    137
    Any ideas or suggestions as to how I can get this divided into a single row with 4 columns rather than 4 rows with 1 column?

    Hi,
    Welcome to the forum!
    Here's one way:
    WITH     got_bucket     AS
    (
         SELECT     CASE
                  WHEN  response_sla_start_date > SYSDATE - ( 5 / (24 * 60))
                        THEN  1
                  WHEN  response_sla_start_date > SYSDATE - ( 8 / (24 * 60))
                        THEN  2
                  WHEN  response_sla_start_date > SYSDATE - (15 / (24 * 60))
                        THEN  3
                  WHEN  response_sla_start_date IS NOT NULL
                        THEN  4
              END     AS bucket
         FROM     cases
         WHERE     is_active     = 'Y'
    )
    SELECT       COUNT (CASE WHEN bucket = 1 THEN 1 END)     AS under_5
    ,       COUNT (CASE WHEN bucket = 2 THEN 1 END)     AS from_5_to_8
    ,       COUNT (CASE WHEN bucket = 3 THEN 1 END)     AS from_8_to_15
    ,       COUNT (CASE WHEN bucket = 4 THEN 1 END)     AS from_15_up
    FROM       got_bucket
    ;
    If something is exactly (down to the second) 5, or 8, or 15 minutes old, it will go into the higher-numbered bucket.
    Be careful! When using 2 or more arithmetic operators in the same expression, use parentheses to group them the way you want. "SYSDATE - (5/24*60)" means 12 and a half days ago.
    Whenever you have a question, say which version of Oracle you're running. The query above will work in Oracle 9.1 and higher, but if you have Oracle 11.1 or higher, then you could make it a little simpler, using the SELECT ... PIVOT feature in the main query.
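    For what it's worth, a sketch of that PIVOT variant (same got_bucket subquery as above; 11.1 or higher only):
    WITH got_bucket AS
    (
         SELECT CASE
                     WHEN response_sla_start_date > SYSDATE - ( 5 / (24 * 60)) THEN 1
                     WHEN response_sla_start_date > SYSDATE - ( 8 / (24 * 60)) THEN 2
                     WHEN response_sla_start_date > SYSDATE - (15 / (24 * 60)) THEN 3
                     WHEN response_sla_start_date IS NOT NULL                  THEN 4
                END AS bucket
         FROM   cases
         WHERE  is_active = 'Y'
    )
    SELECT *
    FROM   got_bucket
    PIVOT  (COUNT(*) FOR bucket IN (1 AS under_5, 2 AS from_5_to_8, 3 AS from_8_to_15, 4 AS from_15_up))
    ;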

  • Estimated Row Count Query. How to avoid?

    Hi,
    I use JDeveloper 9.0.3 Production to evaluate its capability with Swing JClient. I found that before the form is shown on the desktop, the framework runs a query like this:
    SELECT /*+ ORDERED USE_NL(RtEmployee) USE_NL(RtUnitmeasure) USE_NL(RtGroupcode) */ count(1) FROM (SELECT RtProduct.PRODUCTID, RtProduct.ID_UM, RtUnitmeasure.SHORTNAMERUS, RtProduct.ID_COMMON, RtProduct.NAMERUS, RtProduct.NAMEENG, RtProduct.ID_GROUPCODE, RtGroupcode.GROUPCODE, RtProduct.IS_GRCODEFAULT, RtProduct.VAT, RtProduct.PRICE_U, RtProduct.MINPRICE, RtProduct.MARKUP, RtProduct.IS_DISCOUNT, RtProduct.NOTE, RtEmployee.SURNAME, RtProduct.CREATEDATE, RtProduct.MODIFYDATE, RtProduct.ID_OPERATOR, RtProduct.ID, RtUnitmeasure.ID AS ID1, RtGroupcode.ID AS ID2, RtEmployee.ID AS ID3 FROM RT_PRODUCT RtProduct, RT_EMPLOYEE RtEmployee, RT_UNITMEASURE RtUnitmeasure, RT_GROUPCODE RtGroupcode WHERE ((RtProduct.ID_OPERATOR = RtEmployee.ID)AND (RtProduct.ID_UM = RtUnitmeasure.ID))AND (RtProduct.ID_GROUPCODE = RtGroupcode.ID))
    to estimate dataset cardinality.
    Attention development team: please note that the statement is badly formed, as hints entered into the "Query Hint" field of the VO are placed in the outer query but are intended for the subquery, and so will not be taken into account by the optimizer!
    My question is how to avoid this query and/or how can I replace a functionality for estimating dataset cardinality by user defined function or so?
    Best regards,
    Vyacheslav

    I tried the suggested solution but could not figure
    out how to get the current query string to extract
    the WHERE clause to use in my own query or to get the
    parameters to build my own WHERE. How about a more
    explicit code example?
    Joe:
    A few different ways to do this. First, note that getQueryHitCount()
    method receives the following parameter:
    ViewRowSetImpl viewRowSet
    To get the entire query statement, you can call
    String entireQuery = vo.getQuery();
    For each row set (identified by the viewRowSet parameter), you have a
    set of where-clause parameters. Some of these are user provided
    (through setWhereClauseParam calls) and some of these are system
    provided (system provided params are primarily for view links).
    To get all where-clause params (both user and system params), you
    call:
    Object[] paramVals = viewRowSet.getParametersAsStorageTypes();
    Again, paramVals include both user and system param vals.
    If you do
    int noUserParams = viewRowSet.getWhereClauseParams().length;
    noUserParams will tell how many of these are user supplied:
    paramVals[0] .. paramVals[noUserParams-1] are user supplied and the
    rest are system supplied.
    If you want to get the ...
    Long postings are being truncated to ~1 kB at this time.
