SQL question too many values?

Hey guys, I have written an SQL script but it keeps erroring with "too many values". I have this at the moment:
SELECT DISTINCT Student.studentID, Student_medical.medical_complaint, Student_medical.medical_treatment
FROM Student, student_medical
IN (SELECT Student.studentID, Matron.staffno
FROM Student, Matron)
Group by Student.studentID;
Basically what I'm trying to do is match the student ID to the medical complaint and treatment they have had, and then match the student ID to the Matron that gave the treatment.
Any help you can provide would be greatly appreciated :)
Thanks for reading.

SomeoneElse wrote:
Your subquery is wrong. You're using an in clause with one column but your subquery has two columns.
Also, you have two tables in the subquery but no join condition.
Can you include the matron table in the join with your other tables instead of a subquery?
I tried this, but when I add the where clause to match up the matron that treated the student, for some reason I get a result where the student ID = the matron ID: if the student ID is 9 the matron ID will be 9, and in the next row the student ID could be 1 and the matron ID will be 1, which is wrong :(! So I'm not sure what to do.
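For what it's worth, here is a minimal sketch of what the suggested three-table join might look like. It assumes Student_medical carries both a studentID column and a staffno column recording which Matron gave the treatment; both column names are guesses, since the table definitions weren't posted.
SELECT DISTINCT s.studentID,
       sm.medical_complaint,
       sm.medical_treatment,
       m.staffno
FROM   Student s
JOIN   Student_medical sm ON sm.studentID = s.studentID   -- student to their treatment
JOIN   Matron m           ON m.staffno    = sm.staffno;   -- treatment to the matron who gave it
If the student ID and matron ID come out equal on every row, the join is almost certainly comparing the wrong columns (for example Student.studentID = Matron.staffno instead of a column on Student_medical that actually stores the matron's staff number).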

Similar Messages

  • Pl/Sql block returning too many values

    Hi
    Below is a simple PL/SQL block. While executing it I'm getting a "too many values" exception:
    declare
    attrib QW_ACCT_ATTR%ROWTYPE;
    begin
    SELECT ATTR_END_DATE, ATTR_NM, ATTR_VAL INTO attrib FROM (SELECT ATTR_END_DATE, ATTR_NM, ATTR_VAL FROM QW_ACCT_ATTR where CUST_ACCT_ID='5158660414' AND ATTR_NM = 'SS' ORDER BY ATTR_END_DATE DESC) where rownum = 1;
    DBMS_OUTPUT.PUT_LINE('end daate is...'||attrib.ATTR_END_DATE);
    end;
    Could anybody please help me rewrite this query to get only one record?
    We are not supposed to use cursors here.
    thanks

    I am just changing your logic,
    declare
    attrib QW_ACCT_ATTR%ROWTYPE;
    begin
    SELECT ATTR_END_DATE, ATTR_NM, ATTR_VAL INTO attrib FROM (SELECT ATTR_END_DATE, ATTR_NM, ATTR_VAL FROM QW_ACCT_ATTR where CUST_ACCT_ID='5158660414' AND ATTR_NM = 'SS' ORDER BY ATTR_END_DATE DESC where rownum = 1) ;
    DBMS_OUTPUT.PUT_LINE('end daate is...'||attrib.ATTR_END_DATE);
    end;
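    A hedged alternative sketch (untested against the poster's table): since attrib is declared as a full %ROWTYPE but only three columns are selected, the simplest repair is to select into variables that match the select list, keeping the ROWNUM filter outside the ordered inline view:
    DECLARE
       v_end_date QW_ACCT_ATTR.ATTR_END_DATE%TYPE;
       v_nm       QW_ACCT_ATTR.ATTR_NM%TYPE;
       v_val      QW_ACCT_ATTR.ATTR_VAL%TYPE;
    BEGIN
       SELECT ATTR_END_DATE, ATTR_NM, ATTR_VAL
         INTO v_end_date, v_nm, v_val
         FROM (SELECT ATTR_END_DATE, ATTR_NM, ATTR_VAL
                 FROM QW_ACCT_ATTR
                WHERE CUST_ACCT_ID = '5158660414'
                  AND ATTR_NM = 'SS'
                ORDER BY ATTR_END_DATE DESC)
        WHERE ROWNUM = 1;
       DBMS_OUTPUT.PUT_LINE('end date is...' || v_end_date);
    END;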

  • PL/SQL: ORA-00913: too many values

    I can't figure out why I'm getting an "ORA-00913: too many values" error.
    This Example works fine:
    DECLARE
    TYPE session_type IS TABLE OF v$session%ROWTYPE ;
    blocking_sessions session_type;
    BEGIN
    select * bulk collect into blocking_sessions from v$session where blocking_session is not null;
    END;
    But in this example I'm getting an ORA-00913. Can anybody tell me what I'm doing wrong?
    DECLARE
    TYPE session_type IS TABLE OF v$session%ROWTYPE ;
    blocking_sessions session_type;
    BEGIN
    select distinct blocking_session bulk collect into blocking_sessions from v$session where blocking_session is not null;
    END;
    select distinct blocking_session bulk collect into blocking_sessions from v$session where blocking_session is not null;
    ERROR at line 7:
    ORA-06550: line 7, column 70:
    PL/SQL: ORA-00913: too many values
    ORA-06550: line 7, column 1:
    PL/SQL: SQL Statement ignored

    OK this one works also:
    DECLARE
    TYPE session_type IS TABLE OF NUMBER ;
    blocking_sessions session_type;
    BEGIN
    select distinct blocking_session bulk collect into blocking_sessions from v$session where blocking_session is not null;
    END;
    But what if I'm selecting about 20 columns from a table with 30 columns? Do I have to declare every single column?
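    One hedged way to avoid declaring a separate collection per column (a sketch, untested): declare a cursor that selects just the columns you need and build the collection from the cursor's %ROWTYPE, so the record structure always matches the select list:
    DECLARE
       CURSOR c IS
          SELECT sid, serial#, username, blocking_session   -- example subset of columns
            FROM v$session
           WHERE blocking_session IS NOT NULL;
       TYPE session_type IS TABLE OF c%ROWTYPE;
       blocking_sessions session_type;
    BEGIN
       OPEN c;
       FETCH c BULK COLLECT INTO blocking_sessions;
       CLOSE c;
    END;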

  • ROWTYPE and "Too many values" error...

    I'm in the middle of trying to understand the inner workings of %ROWTYPE and how I can copy the structure of a table with it. I think I'm understanding the gist of it, but I continue to fail in pinpointing the actual values I try to insert with an example tutorial I'm using at the moment and was hoping someone could help me understand my problem better so that I can bridge this mental gap I seem to be slipping into...
    That said, I have the following table: ct_employee
    The columns of this table (and their schema settings) are as follows:
    empno - NUMBER(4,0)
    ename - VARCHAR2(20)
    job - VARCHAR2(20)
    mgr - NUMBER(4,0)
    hiredate - DATE
    sal - NUMBER(7,2)
    comm - NUMBER(7,2)
    ct_department - NUMBER(2,0)
    The SQL I'm using in all this is the following:
    SET VERIFY OFF
    DEFINE emp_num = 7369;
    DECLARE
      emp_rec ct_retired_emps%ROWTYPE;
    BEGIN
      SELECT *
      INTO emp_rec
      FROM ct_employee
      WHERE empno = &emp_num;
      emp_rec.leavedate := SYSDATE;
      UPDATE ct_retired_emps SET ROW = emp_rec
      WHERE empno = &emp_num;
    END;
    As I hope you can tell from the above, I'm trying to create a variable (emp_rec) to store the structure of ct_employee, whereupon I then copy a record into emp_rec if and only if the empno column from ct_employee matches the emp_num "7369".
    I'm using SQL*Plus with 10g in all this (a program I love, by the way; it's really easy to use and very informative) and when I press the "Run Script" button, I receive the following Script Output:
    Error report:
    ORA-06550: line 6, column 3:
    PL/SQL: ORA-00913: too many values
    ORA-06550: line 4, column 3:
    PL/SQL: SQL Statement ignored
    06550. 00000 - "line %s, column %s:\n%s"
    *Cause:    Usually a PL/SQL compilation error.
    *Action:
    What I translate from this is that there's either a column number mismatch or else there's some value being attempted to be inserted into the variable I created that isn't matching the structure of the variable.
    Anyway, if someone around here could help me with this, you would make my day.

    It's still not updating the table. :(
    Here's where I am...
    I currently have the following pl/sql:
    SET VERIFY OFF
    DEFINE emp_num = 7369;
    DECLARE
      emp_rec ct_retired_emps%ROWTYPE;
    BEGIN
      SELECT *
      INTO emp_rec
      FROM ct_employee
      WHERE empno = &emp_num;
      emp_rec.leavedate := SYSDATE;
      UPDATE ct_retired_emps SET ROW = emp_rec
      WHERE empno = &emp_num;
    END;
    I'm trying to avoid as much hard-coding as possible, hence my use of SELECT *, but because of this, SQL*Plus is now giving me a run-time error of the following:
    Error report:
    ORA-06550: line 6, column 3:
    PL/SQL: ORA-00913: too many values
    ORA-06550: line 4, column 3:
    PL/SQL: SQL Statement ignored
    06550. 00000 - "line %s, column %s:\n%s"
    *Cause:    Usually a PL/SQL compilation error.
    *Action:
    So to remedy this, I then try the following revised PL/SQL:
    SET VERIFY OFF
    DEFINE emp_num = 7369;
    DECLARE
    emp_rec ct_retired_emps%ROWTYPE;
    BEGIN
    SELECT empno, ename, job, mgr, hiredate, SYSDATE as leavdate, sal, comm, ct_department
    INTO emp_rec
    FROM ct_employee
    WHERE empno = &emp_num;
    UPDATE ct_retired_emps SET ROW = emp_rec
    WHERE empno = &emp_num;
    END;
    This time, everything runs smoothly, however, no information is updated into the ct_retired_emps! Ha!
    I've verified that there's a record in ct_employee that has "7369" for its empno column value, so there should be no missing value concern from all this. The only other thing I can think of is that there must be something askew with my logic, but as to what, I have no clue. Below are my columns for both tables:
    ct_employee -
    empno, ename, job, mgr, hiredate, sal, comm, ct_department
    ct_retired_emps -
    empno, ename, job, mgr, hiredate, leavedate, sal, comm, deptno
    My immediate questions:
    1.) I know nothing about debugging or troubleshooting PL/SQL, but I do know that I need to review the contents of the emp_rec variable above to see what all is inside it so that I can better assess what's going on here. I've tried to use DBMS_OUTPUT.PUT_LINE(emp_rec) but this does nothing and only creates another run-time error. What's a way in which I can output the contents of this regardless of run-time success? Would I need to code in an EXCEPTION section and THEN use DBMS_OUTPUT...?
    2.) SELECTING * in the first snippet above meant that I was disallowed the use of emp_rec.leavedate := SYSDATE; after the SELECT. How might one SELECT * and still use emp_rec.leavedate := SYSDATE; in the same execution section? (See the sketch below.)
    3.) With everything I've provided here, do you see any obvious issues?
    It doesn't take a genius to realize I'm new at this stuff, but I am trying to provide everything I can to learn here so if any of you can help this poor guy out, he'd be very grateful.
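    A hedged sketch of one way questions 2 and 3 could play out, assuming the column lists quoted above (ct_retired_emps ending with leavedate, sal, comm, deptno): select the employee columns explicitly in the retired-record order with a NULL placeholder for leavedate, assign leavedate afterwards, and INSERT the record rather than UPDATE it, since an UPDATE only changes rows that already exist in ct_retired_emps (which may be why nothing appeared to change):
    DECLARE
      emp_rec ct_retired_emps%ROWTYPE;
    BEGIN
      SELECT empno, ename, job, mgr, hiredate,
             CAST(NULL AS DATE),          -- placeholder for leavedate
             sal, comm, ct_department     -- ct_department feeds the deptno field
        INTO emp_rec
        FROM ct_employee
       WHERE empno = &emp_num;
      emp_rec.leavedate := SYSDATE;       -- the field assignment now works on the record
      INSERT INTO ct_retired_emps VALUES emp_rec;
    END;
    As for question 1, a record can't be passed to DBMS_OUTPUT.PUT_LINE directly; printing individual fields such as emp_rec.empno is the usual workaround.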

  • 11g-[nQSError: 42029] Subquery contains too many values for the IN predicate

    Hi,
    I have 2 reports; one is a subquery report which returns inputs to the main report. The report was working fine in 10g, but in 11g we are getting the following error:
    View Display Error
    Odbc driver returned an error (SQLExecDirectW).
    Error Details
    Error Codes: OPR4ONWY:U9IM8TAC:OI2DL65P
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 43113] Message returned from OBIS. [nQSError: 43119] Query Failed: [nQSError: 42029] Subquery contains too many values for the IN predicate.Please have your System Administrator look at the log for more details on this error. (HY000)
    Please have your System Administrator look at the log for more details on this error.
    We are getting the same error after modifying the parameter value MAX_EXPANDED_SUBQUERY_PREDICATES to 12000.
    Please suggest what could be the other reason it may fail or any other settings we need to check.
    Regards,
    ckeng

    ckeng,
    Normally the IN clause has a restriction of 10000 values. In plain SQL/PLSQL we would go with inline queries; I think you should model your RPD to generate inner queries, e.g.:
    select * from emp where dept_id in (Select distinct dept_id from dept);
    Or put a condition/filter on the sub-report and make one more inner report with a sub-filter, but that will definitely cause performance issues.
    thanks,
    Saichand.v

  • Too many values. Not displaying all the data

    Hi All,
            I am facing a problem while showing the map for different locations. It says: Too many
    "ColumnName" values.
    I can't find anywhere what the limitation is when representing the maps.
    Please share if anyone is aware of it.
    Thanks in advance........
    ChinniKrishna T Database developer

    Hi,
    I think this article explain it:
    http://office.microsoft.com/en-001/excel/bubble-and-scatter-charts-in-power-view-HA104017427.aspx
     Tip    Pick a category that doesn’t have too many values. If the category has more than 2,000 values, you see a note that the chart is “showing representative sample” rather than all the categories.
    Regards,
    Christian HL
    Microsoft Online Community Support
    Please remember to click “Mark as Answer” on the post that helps you, and to click “Unmark as Answer” if a marked post does not actually answer your question. This can be beneficial to other community members reading the thread.

  • Too many values issues in sub query

    Hi ALL,
    I have a sql query like below :
    SELECT A,B,C,(SELECT D,E,F,G FROM ABC,DEF,... WHERE ABC.ID=DEF.ID......) NAME FROM TEST1,TEST2 WHERE .......
    My subquery is throwing a "too many values" error, so how can I get those column values from my subquery through a single select statement?
    Please help me on this

    user13424229 wrote:
    Hi ALL,
    I have a sql query like below :
    SELECT A,B,C,(SELECT D,E,F,G FROM ABC,DEF,... WHERE ABC.ID=DEF.ID......) NAME FROM TEST1,TEST2 WHERE .......
    My subquery is throwing a "too many values" error, so how can I get those column values from my subquery through a single select statement?
    A scalar subquery in the select list can return only one column and one row at a time.
    If you wish to handle this, you will have to write the same select statement for each column individually, i.e. D, E, F and G would become 4 individual scalar subqueries in your main select statement.
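    A minimal illustration of that suggestion, using hypothetical tables t_main and t_detail rather than the poster's: each extra column becomes its own single-column scalar subquery, though a join is usually the simpler rewrite:
    -- one scalar subquery per column (each must return at most one row)
    SELECT m.a, m.b, m.c,
           (SELECT d.d FROM t_detail d WHERE d.id = m.id) AS d,
           (SELECT d.e FROM t_detail d WHERE d.id = m.id) AS e
      FROM t_main m;
    -- usually simpler: turn the subquery into a join
    SELECT m.a, m.b, m.c, d.d, d.e
      FROM t_main m
      JOIN t_detail d ON d.id = m.id;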

  • Insert ORA-00913: too many values  --  urgent help

    Hi there
    It's pretty urgent, I'm stuck...
    To avoid the undo snapshot error, I am using this procedure to migrate a huge volume of table data into new tables in smaller chunks. The code below works well when there are only a few columns, but the procedure fails when a table has more than 30 columns, throwing the error PL/SQL: ORA-00913: too many values.
    CREATE OR REPLACE PROCEDURE migration AS
       TYPE array_tp IS TABLE OF tranproc%ROWTYPE;
       l_array array_tp;
       CURSOR c IS
          select * from tranproc p where trunc(date)<=trunc(sysdate)-180;
       l_cnt1 NUMBER :=0;
       l_cnt2 NUMBER :=0;
       l_cnt3 NUMBER :=0;
    BEGIN
       OPEN c;
       LOOP
          FETCH c BULK COLLECT INTO l_array LIMIT 10000;
          EXIT WHEN l_array.COUNT = 0;
          l_cnt1 := c%ROWCOUNT;
          FORALL i IN 1 .. l_array.COUNT
             INSERT INTO TMP_Transpoc VALUES l_array(i);
          l_cnt2 := l_cnt2 + SQL%rowcount;
       END LOOP;
       l_cnt3 := c%ROWCOUNT;
       CLOSE c;
       END;
    16    22    PL/SQL: ORA-00913: too many values
    It's failing at
    line 16: INSERT INTO TMP_Transpoc VALUES l_array(i);
    The table above, i.e. tranproc, has around 80 columns.
    I am not a PL/SQL expert; kindly advise how to resolve it. I am fine with an alternative approach, I just need to commit in smaller chunks.

    Actually, Direct Path does not necessarily require NOLOGGING. If you successfully invoke Direct Path (look for LOAD AS SELECT or DIRECT LOAD INTO in the execution plan) then you are inserting into blocks above the high-water mark (HWM) and there is virtually no UNDO generated for the changes in the table segment.
    However, the index maintenance (if any) will require UNDO, and it may be a lot. If this is going into a new table, then you should be able to create the index after the table is populated.
    Also beware of the NOLOGGING advice. In many cases, an individual SQL statement can not disable logging. And yet, if you do bypass REDO logging, be very sure you understand the consequences for your ability to recover.
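    A hedged sketch of the set-based, direct-path alternative described above (it assumes TMP_Transpoc has the same column list as tranproc, and that the filter column really is called DATE, which would need double quotes because it is a reserved word):
    -- one set-based statement instead of the FETCH / FORALL loop;
    -- check the plan for a direct load (e.g. LOAD AS SELECT) to confirm direct path
    INSERT /*+ APPEND */ INTO TMP_Transpoc
    SELECT *
      FROM tranproc p
     WHERE trunc(p."DATE") <= trunc(sysdate) - 180;
    COMMIT;
    If the original ORA-00913 came from TMP_Transpoc having a different column list than tranproc, the column lists would need to be spelled out explicitly here as well.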

  • Too many values when trying insert records by bulk collect

    Hi
    Can anyone advise on the bulk collect error please?
    Following is my code, where I am getting the "too many values" error...
    TYPE p_empid_type IS TABLE OF emp%ROWTYPE;
          v_empid               p_empid_type;
       BEGIN
          SELECT DISTINCT emp_id , 'ABC'
          BULK COLLECT INTO v_empid
                     FROM emp
                    WHERE empid IN (SELECT ord_id
                                       FROM table_x
                                      WHERE column_x = 'ABC');
          FORALL i IN v_empid.FIRST .. v_empid.LAST
             INSERT INTO my_table
                  VALUES v_empid(i);
          COMMIT;
    PL/SQL: ORA-00913: too many values in line - BULK COLLECT INTO v_empid

    Hello, since you're SELECTing a constant string, why not:
    TYPE p_empid_type IS TABLE OF INTEGER;
          v_empid               p_empid_type;
       BEGIN
          SELECT DISTINCT emp_id
          BULK COLLECT INTO v_empid
                     FROM emp
                    WHERE empid IN (SELECT ord_id
                                       FROM table_x
                                      WHERE column_x = 'ABC');
          FORALL i IN v_empid.FIRST .. v_empid.LAST
             INSERT INTO my_table
              VALUES (v_empid(i), 'ABC');
    Edit: untested, may not work.
    This would be the best BULK COLLECT of all:
    INSERT /*+ APPEND */ INTO my_table
    SELECT DISTINCT emp_id , 'ABC'
      FROM emp
    WHERE empid IN (SELECT ord_id
          FROM table_x
         WHERE column_x = 'ABC');
          COMMIT;

  • ORA-00913: too many values - how to change decimal separator?

    I want to use SQL Developer's database export and need advice on how to tweak the decimal separator from , to . in the generated SQL inserts.
    Preferably in SQL Developer, because this error can easily happen and is hard to catch during import.
    See below how to reproduce this:
    1.) DDL (ugly)
    set SERVEROUTPUT on
    CREATE TABLE AADECIMEXPORT (
    "DECIMALNUMBER" NUMBER,
    "HUBBABUBBA" varchar2(20 byte)
    );
    CREATE TABLE succeeded.
    2.) Inserts by hand
    insert into AADECIMEXPORT(DECIMALNUMBER,HUBBABUBBA) values(10,'smells integer');
    insert into AADECIMEXPORT(DECIMALNUMBER,HUBBABUBBA) values(3.141592654,'smells rounded pi');
    1 rows inserted
    1 rows inserted
    3.) select * from AADECIMEXPORT
    DECIMALNUMBER HUBBABUBBA
    10 SMELLS INTEGER
    3,141592654 smells rounded pi
    4.) Then use SQL Developer's Tools > "Database Export" to export this table
    -- File created - tiistai-marraskuu-23-2010
    -- DDL for Table AADECIMEXPORT
    CREATE TABLE "AADECIMEXPORT"
    (     "DECIMALNUMBER" NUMBER,
         "HUBBABUBBA" VARCHAR2(20)
    -- DATA FOR TABLE AADECIMEXPORT
    -- FILTER = none used
    REM INSERTING into AADECIMEXPORT
    Insert into AADECIMEXPORT (DECIMALNUMBER,HUBBABUBBA) values (10,'smells integer');
    Insert into AADECIMEXPORT (DECIMALNUMBER,HUBBABUBBA) values (3,141592654,'smells rounded pi');
    -- END DATA FOR TABLE AADECIMEXPORT
    5.) Test the insert
    Insert into AADECIMEXPORT (DECIMALNUMBER,HUBBABUBBA) values (3,141592654,'smells rounded pi');
    Error starting at line 49 in command:
    Insert into AADECIMEXPORT (DECIMALNUMBER,HUBBABUBBA) values (3,141592654,'smells rounded pi')
    Error at Command Line:49 Column:12
    Error report:
    SQL Error: ORA-00913: too many values
    00913. 00000 - "too many values"
    *CAUSE:
    *Action:

    About
    Oracle SQL Developer 2.1.1.64
    Version 2.1.1.64
    Build MAIN-64.45
    Copyright © 2005,2009 Oracle. All Rights Reserved.
    IDE Version: 11.1.1.2.36.55.30
    Product ID: oracle.sqldeveloper
    Product Version: 11.1.1.64.45
    Version
    Component     Version
    =========     =======
    Java(TM) Platform     1.6.0_14
    Oracle IDE     2.1.1.64.45
    Versioning Support     2.1.1.64.45
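    A hedged aside: the exported insert fails on re-import because 3,141592654 is parsed as two values. In Oracle the session's decimal and group separators come from NLS_NUMERIC_CHARACTERS, so it may be worth testing whether setting it before exporting changes how the numbers are written; whether the Database Export tool honours the session setting in this SQL Developer version is an assumption to verify:
    -- first character = decimal separator, second = group separator
    ALTER SESSION SET NLS_NUMERIC_CHARACTERS = '.,';
    SELECT 3.141592654 AS almost_pi FROM dual;   -- should now display with a period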

  • JDBC receiver error - ORA-00913: too many values

    Hi all,
    Facing a strange issue with a proxy-to-JDBC scenario.
    Message fails with error - ORA-00913: too many values
    Handling the missing fields from the source with "Empty String" in the communication channel.
    Empty string handling works fine: whenever there is no value coming from the source, null is inserted into the field value on the DB.
    Inserting the same data manually works fine, but it gives this error end-to-end.
    When testing end-to-end, if the field values are made a couple of characters shorter, it works fine but fails with the actual data, which seems to point to the field lengths on the DB.
    I increased all the lengths by 10 characters; it still does not work.
    Except the key fields of the DB table, all others are nullable.
    Is there anything else I am missing?
    Note: Generally, ORA-00913: too many values comes when the number of fields and the number of values do not match in an INSERT/UPDATE statement.
    reg

    Hi
    When testing end-to-end, if the field values are made a couple of characters shorter, it works fine but fails with the actual data, which seems to point to the field lengths on the DB.
    I increased all the lengths by 10 characters; it still does not work.
    Well, the error you mention generally occurs when the INSERT statement mentions more fields than the table has, i.e. you are specifying an extra field. If instead the error is due to the length of one of the fields, you have to check that in Oracle itself using the DESCRIBE command; there you will be able to see the length of each and every field.
    Hope your problem gets resolved.
    Regard's
    Chetan Ahuja

  • ORA-00913 too many values issue

    Hi,
    Here is my query. Can anyone please suggest what could be wrong in my update query? I am getting error ORA-00913, which says there are too many values in the clause.
    update [email protected]
    set processed_flag=1
    WHERE gs_FGCODE='SH' and GS_ID in
    (select ID from [email protected]
    WHERE ID in
    (select hid from [email protected]
    WHERE JAG_NAME='JAGET_ASN_Number' and JAG_VALUE in (
    select CUSTOMER_ID, c.CUST_NAME, SALES_ORG, DIVISION,SHIPMENT_ID,BILL_OF_LADING,
    SAP_PROCESS_DATE SHIP_DATE, SAP_CREATE_DATE, IDOC_NUMBER,
    to_char(IDOC_SENT_DATE), IDOC_LAST_UPDATE_DATE, IDOC_STATUS, IDOC_STATUS_DESC
    from [email protected] a, edit_customer_master c
    where
    c.DIV_CUSTOMER_ID=to_char(to_number(a.CUSTOMER_ID)) and SAP_CREATE_DATE > trunc(sysdate-7) and
    customer_id||DIVISION<>'0010000064GVA' and
    BILL_OF_LADING||DIVISION in (
    select substr(BILL_OF_LADING,1,17)||division from [email protected]
    minus
    select distinct substr(asn_number,1,17)||replace(replace(replace(business_id,63,'LLL'),2,'JJJ'),1,'KKK') from EDIT_OUT_TRANSACTION WHERE business_id not in (3,23) and TRANSACTION_TYPE=856
    I appreciate any input.
    TIA,
    RMG

    update [email protected]
    set processed_flag=1
    WHERE gs_FGCODE='SH' and GS_ID in
    (select ID from [email protected]
    WHERE ID in
    (select hid from [email protected]
    WHERE JAG_NAME='JAGET_ASN_Number' and JAG_VALUE in (
    select CUSTOMER_ID, c.CUST_NAME, SALES_ORG, DIVISION,SHIPMENT_ID,BILL_OF_LADING,
    SAP_PROCESS_DATE SHIP_DATE, SAP_CREATE_DATE, IDOC_NUMBER,
    to_char(IDOC_SENT_DATE), IDOC_LAST_UPDATE_DATE, IDOC_STATUS, IDOC_STATUS_DESC
    from [email protected] a, edit_customer_master c
    where
    c.DIV_CUSTOMER_ID=to_char(to_number(a.CUSTOMER_ID)) and SAP_CREATE_DATE > trunc(sysdate-7) and
    customer_id||DIVISION<>'0010000064GVA' and
    BILL_OF_LADING||DIVISION in (
    select substr(BILL_OF_LADING,1,17)||division from [email protected]
    minus
    select distinct substr(asn_number,1,17)||replace(replace(replace(business_id,63,'LLL'),2,'JJJ'),1,'KKK') from EDIT_OUT_TRANSACTION WHERE business_id not in (3,23) and TRANSACTION_TYPE=856
    I think this part (the subquery after JAG_VALUE in, which selects a whole list of columns instead of a single one) is the problem.
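    In general terms, as a generic illustration with hypothetical tables rather than the poster's schema: an IN predicate with a single expression on the left can only be compared against a subquery that projects exactly one column, so the subquery after JAG_VALUE in ( has to return a single column:
    -- raises ORA-00913: two columns returned for a one-column IN
    -- select * from orders where order_id in (select order_id, status from shipped_orders);
    -- works: project only the compared column
    select * from orders where order_id in (select order_id from shipped_orders);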

  • HT201363 My ID is blocked. I answered my security questions wrong too many times and I don't have a rescue email. Please help.

    My ID is blocked. I answered my security questions wrong too many times and I don't have a rescue email. Please help.

    Hi. I forgot the answer to my security question and I don't have a rescue email. How can I change it? Please help me. I am in Iran now and I can't call the security team either. Please help me.
    ipad
    iphone

  • Multiple Selects = Too many values?

    Hi. I have a table of employees, a table of departments (called accounts), and a table containing test results. Every employee is supposed to have a test each year, unless the test was "positive" in a previous year.
    I want a query that lists everyone that hasn't taken the test, but I don't want to include the ones that were "positive".
    I did some incremental query building, making sure each query returned what I wanted before making it more complex (for me). Apologies if I'm providing too much information.
    This returns just the test results that are positive.
    -- Get list of result codes that are positive
    SELECT ResultID FROM Result WHERE ResultTreatAsPositive = 1;
    This returns everyone that has had a positive result at any time.
    -- Get list of people having positive results at any time
    SELECT DISTINCT empID, empFName, empLName, empAcctNum, actDescription
    FROM Employee, Account, TestResult
    WHERE empID = testEmployeeID
    AND empAcctNum = actNumber
    AND empActive = 1
    AND testResult IN (SELECT ResultID FROM Result WHERE ResultTreatAsPositive = 1)
    ORDER BY actDescription;
    This returns everyone that had a test during the previous year
    -- Get everyone having any results during date range
    SELECT DISTINCT empID, empFName, empLName, empAcctNum, actDescription
    FROM Employee, Account, TestResult
    WHERE empID = testEmployeeID
    AND empAcctNum = actNumber
    AND testReadDate BETWEEN to_date('2005/01/01', 'yyyy/mm/dd') AND to_date('2005/12/31', 'yyyy/mm/dd')
    AND empActive = 1
    ORDER BY actDescription;
    So far, so good. However, when I try to put all of these together, I get errors:
    This should return all employees minus ones with a positive result (at any time) and minus any test taken this year
    -- Get everyone else
    SELECT DISTINCT empID, empFName, empLName, empTitle, empAcctNum, actDescription
    FROM Employee, Account, TestResult
    WHERE empID = testEmployeeID
    AND empAcctNum = actNumber
    AND empActive = 1
    AND (empID NOT IN (SELECT DISTINCT empID, empFName, empLName, empAcctNum, actDescription FROM Employee, Account, TestResult WHERE empID = testEmployeeID AND empAcctNum = actNumber AND empActive = 1 AND testResult IN (SELECT ResultID FROM Result WHERE ResultTreatAsPositive = 1)))
    AND (empID NOT IN (SELECT DISTINCT empID, empFName, empLName, empAcctNum, actDescription FROM Employee, Account, TestResult WHERE empID = testEmployeeID AND empAcctNum = actNumber AND testReadDate BETWEEN to_date('1988/01/01', 'yyyy/mm/dd') AND to_date('2007/01/01', 'yyyy/mm/dd') AND empActive = 1))
    ORDER BY actDescription, empLName, empFName;
    I get ERROR at line 6: (my first empID NOT IN line)
    ORA-00913: too many values.
    How can I write this query so that it works? Any tips are appreciated, even those that point out I can't write a query very well!
    --G

    Missed that one and was thrown off by those 2005, 1998, 2007 dates …
    Looks like you're on the right track though.
    Sorry, I cobbled that code together from some of my tests (grab everything -- seeing multiple rows makes me feel better!).
    Looking at your suggestion on TRUNC, I can't really do that, as the user can specify everyone that doesn't qualify for a single month -- apparently, some departments are assigned a month to have their tests, while others can go the entire year.
    Thanks for the help.
    --G
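    A hedged rewrite based on the queries above (untested; it assumes empID alone identifies an employee, reads the excluded IDs straight from TestResult, and keeps the 2005 date range from the earlier working query):
    SELECT DISTINCT empID, empFName, empLName, empTitle, empAcctNum, actDescription
    FROM Employee, Account, TestResult
    WHERE empID = testEmployeeID
    AND empAcctNum = actNumber
    AND empActive = 1
    AND empID NOT IN (SELECT testEmployeeID
                      FROM TestResult
                      WHERE testResult IN (SELECT ResultID FROM Result WHERE ResultTreatAsPositive = 1))
    AND empID NOT IN (SELECT testEmployeeID
                      FROM TestResult
                      WHERE testReadDate BETWEEN to_date('2005/01/01', 'yyyy/mm/dd')
                                             AND to_date('2005/12/31', 'yyyy/mm/dd'))
    ORDER BY actDescription, empLName, empFName;
    Note that, like the original, this still joins TestResult in the outer query, so employees with no test rows at all will not appear; that part of the logic is unchanged.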

  • ORA-00913: too many values when running a Delete

    Hi. When I run both of the select statements, including the minus, it works,
    but when I run the Delete statement it gives me error ORA-00913: too many values.
    Can someone tell me if there is anything I am missing?
    DELETE FROM tablename where rowid in (
    select rowid,
    to_number(lndr_spcl_allow_int_rate_pct),
    tablename.lndr_spcl_end_prin_bal_amt,
    tablename.lndr_spcl_avg_dly_prin_bal_amt,
    tablename.lndr_spcl_adj_avg_dly_prin_amt,
    tablename.calculated_amt,
    tablename.paid_amt,
    tablename.reported_amt,
    --tablename.processing_date_id, this date changes 
    tablename.date_paid_id,
    tablename.lndr_spcl_allwnc_ctgry_id,
    tablename.affected_period_qtr_date_id,
    tablename.loan_type_id,
    tablename.lndr_id,
    tablename.report_fiscal_qtr_date_id,
    tablename.report_qtr_date_id,
    tablename.lndr_geography_id,
    tablename.lndr_billing_code_id,
    tablename.orig_billing_code,
    tablename.orig_allowance_category,
    tablename.form_id,
    tablename.srvcr_id,
    tablename.document_long_desc,
    tablename.document_date_id,
    tablename.affected_period_qtr_date_desc,
    tablename.document_type
    from tablename
    where lndr_spcl_allow_int_rate_pct not like ('%V%')
    and lndr_spcl_allow_int_rate_pct not like ('N%')
    minus
    select min(rowid),
    to_number(lndr_spcl_allow_int_rate_pct),
    tablename.lndr_spcl_end_prin_bal_amt,
    tablename.lndr_spcl_avg_dly_prin_bal_amt,
    tablename.lndr_spcl_adj_avg_dly_prin_amt,
    tablename.calculated_amt,
    tablename.paid_amt,
    tablename.reported_amt,
    --tablename.processing_date_id, this date changes 
    tablename.date_paid_id,
    tablename.lndr_spcl_allwnc_ctgry_id,
    tablename.affected_period_qtr_date_id,
    tablename.loan_type_id,
    tablename.lndr_id,
    tablename.report_fiscal_qtr_date_id,
    tablename.report_qtr_date_id,
    tablename.lndr_geography_id,
    tablename.lndr_billing_code_id,
    tablename.orig_billing_code,
    tablename.orig_allowance_category,
    tablename.form_id,
    tablename.srvcr_id,
    tablename.document_long_desc,
    tablename.document_date_id,
    tablename.affected_period_qtr_date_desc,
    tablename.document_type
    from tablename
    where lndr_spcl_allow_int_rate_pct not like ('%V%')
    and lndr_spcl_allow_int_rate_pct not like ('N%')
    group by to_number(lndr_spcl_allow_int_rate_pct),
    tablename.lndr_spcl_end_prin_bal_amt,
    tablename.lndr_spcl_avg_dly_prin_bal_amt,
    tablename.lndr_spcl_adj_avg_dly_prin_amt,
    tablename.calculated_amt,
    tablename.paid_amt,
    tablename.reported_amt,
    --tablename.processing_date_id, this date changes 
    tablename.date_paid_id,
    tablename.lndr_spcl_allwnc_ctgry_id,
    tablename.affected_period_qtr_date_id,
    tablename.loan_type_id,
    tablename.lndr_id,
    tablename.report_fiscal_qtr_date_id,
    tablename.report_qtr_date_id,
    tablename.lndr_geography_id,
    tablename.lndr_billing_code_id,
    tablename.orig_billing_code,
    tablename.orig_allowance_category,
    tablename.form_id,
    tablename.srvcr_id,
    tablename.document_long_desc,
    tablename.document_date_id,
    tablename.affected_period_qtr_date_desc,
    tablename.document_type)

    Perhaps this should be your final query ->
    DELETE FROM tablename
    where rowid in (
                      select rowid
                      from tablename
                      where lndr_spcl_allow_int_rate_pct not like ('%V%')
                      and lndr_spcl_allow_int_rate_pct not like ('N%')
                      minus
                      select min(rowid)
                      from tablename
                      where lndr_spcl_allow_int_rate_pct not like ('%V%')
                      and lndr_spcl_allow_int_rate_pct not like ('N%')
                      group by to_number(lndr_spcl_allow_int_rate_pct),
                      tablename.lndr_spcl_end_prin_bal_amt,
                      tablename.lndr_spcl_avg_dly_prin_bal_amt,
                      tablename.lndr_spcl_adj_avg_dly_prin_amt,
                      tablename.calculated_amt,
                      tablename.paid_amt,
                      tablename.reported_amt,
                      --tablename.processing_date_id, this date changes
                      tablename.date_paid_id,
                      tablename.lndr_spcl_allwnc_ctgry_id,
                      tablename.affected_period_qtr_date_id,
                      tablename.loan_type_id,
                      tablename.lndr_id,
                      tablename.report_fiscal_qtr_date_id,
                      tablename.report_qtr_date_id,
                      tablename.lndr_geography_id,
                      tablename.lndr_billing_code_id,
                      tablename.orig_billing_code,
                      tablename.orig_allowance_category,
                      tablename.form_id,
                      tablename.srvcr_id,
                      tablename.document_long_desc,
                      tablename.document_date_id,
                      tablename.affected_period_qtr_date_desc,
                      tablename.document_type
                        );
    N.B.: Not tested...
    Regards.
    Satyaki De.
