ROWTYPE and "Too many values" error...

I'm in the middle of trying to understand the inner workings of %ROWTYPE and how I can use it to copy the structure of a table. I think I understand the gist of it, but I keep failing to pin down which values are actually being inserted in the tutorial example I'm working through, and I was hoping someone could help me understand my problem better so I can bridge this mental gap I seem to be slipping into...
That said, I have the following table: ct_employee
The columns of this table (and their schema settings) are as follows:
empno - NUMBER(4,0)
ename - VARCHAR2(20)
job - VARCHAR2(20)
mgr - NUMBER(4,0)
hiredate - DATE
sal - NUMBER(7,2)
comm - NUMBER(7,2)
ct_department - NUMBER(2,0)
The SQL I'm using in all this is the following:
SET VERIFY OFF
DEFINE emp_num = 7369;
DECLARE
  emp_rec ct_retired_emps%ROWTYPE;
BEGIN
  SELECT *
  INTO emp_rec
  FROM ct_employee
  WHERE empno = &emp_num;
  emp_rec.leavedate := SYSDATE;
  UPDATE ct_retired_emps SET ROW = emp_rec
  WHERE empno = &emp_num;
END;
As I hope you can tell from the above, I'm trying to create a variable (emp_rec) to store the structure of ct_employee, whereupon I copy a record into emp_rec if and only if the empno column from ct_employee matches the emp_num value 7369.
I'm using SQL*Plus with 10g in all this (a program I love, by the way; it's really easy to use and very informative) and when I press the "Run Script" button, I receive the following Script Output:
Error report:
ORA-06550: line 6, column 3:
PL/SQL: ORA-00913: too many values
ORA-06550: line 4, column 3:
PL/SQL: SQL Statement ignored
06550. 00000 - "line %s, column %s:\n%s"
*Cause:    Usually a PL/SQL compilation error.
*Action:
What I translate from this is that there's either a column-count mismatch or else some value is being loaded into the variable I created that doesn't match the structure of that variable.
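If that diagnosis is right -- the record is shaped like ct_retired_emps while the SELECT * returns ct_employee's columns -- one workaround is to shape the record after the table being read and build the target row field by field. This is a minimal, untested sketch that assumes the column lists given later in this thread (ct_retired_emps adds leavedate and calls its department column deptno):
DECLARE
  src_rec ct_employee%ROWTYPE;        -- matches SELECT * FROM ct_employee
  ret_rec ct_retired_emps%ROWTYPE;    -- matches the table being written
BEGIN
  SELECT * INTO src_rec FROM ct_employee WHERE empno = &emp_num;
  ret_rec.empno     := src_rec.empno;
  ret_rec.ename     := src_rec.ename;
  ret_rec.job       := src_rec.job;
  ret_rec.mgr       := src_rec.mgr;
  ret_rec.hiredate  := src_rec.hiredate;
  ret_rec.leavedate := SYSDATE;
  ret_rec.sal       := src_rec.sal;
  ret_rec.comm      := src_rec.comm;
  ret_rec.deptno    := src_rec.ct_department;
  UPDATE ct_retired_emps SET ROW = ret_rec WHERE empno = ret_rec.empno;
END;
/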
Anyway, if someone around here could help me with this, you would make my day.

It's still not updating the table. :(
Here's where I am...
I currently have the following pl/sql:
SET VERIFY OFF
DEFINE emp_num = 7369;
DECLARE
  emp_rec ct_retired_emps%ROWTYPE;
BEGIN
  SELECT *
  INTO emp_rec
  FROM ct_employee
  WHERE empno = &emp_num;
  emp_rec.leavedate := SYSDATE;
  UPDATE ct_retired_emps SET ROW = emp_rec
  WHERE empno = &emp_num;
END;
I'm trying to avoid as much hard-coding as possible, hence my use of SELECT *, but because of this, SQL*Plus is now giving me the following run-time error:
Error report:
ORA-06550: line 6, column 3:
PL/SQL: ORA-00913: too many values
ORA-06550: line 4, column 3:
PL/SQL: SQL Statement ignored
06550. 00000 - "line %s, column %s:\n%s"
*Cause:    Usually a PL/SQL compilation error.
*Action:
So to remedy this, I then try the following revised PL/SQL:
SET VERIFY OFF
DEFINE emp_num = 7369;
DECLARE
emp_rec ct_retired_emps%ROWTYPE;
BEGIN
SELECT empno, ename, job, mgr, hiredate, SYSDATE as leavdate, sal, comm, ct_department
INTO emp_rec
FROM ct_employee
WHERE empno = &emp_num;
UPDATE ct_retired_emps SET ROW = emp_rec
WHERE empno = &emp_num;
END;
This time everything runs smoothly; however, no information gets updated in ct_retired_emps! Ha!
I've verified that there's a record in ct_employee with "7369" in its empno column, so a missing source row shouldn't be the problem here. The only other thing I can think of is that something must be askew with my logic, but as to what, I have no clue. Below are my columns for both tables:
ct_employee -
empno, ename, job, mgr, hiredate, sal, comm, ct_department
ct_retired_emps -
empno, ename, job, mgr, hiredate, leavedate, sal, comm, deptno
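One thing worth checking (an untested guess, based only on the column lists above): an UPDATE that matches no existing row in ct_retired_emps completes without error and simply changes nothing, so if empno 7369 has never been inserted into ct_retired_emps, the block will run "smoothly" and still write nothing. In that case a record-style INSERT may be what's intended, roughly:
DECLARE
  emp_rec ct_retired_emps%ROWTYPE;
BEGIN
  SELECT empno, ename, job, mgr, hiredate, SYSDATE, sal, comm, ct_department
    INTO emp_rec
    FROM ct_employee
   WHERE empno = &emp_num;
  -- record-style INSERT (10g+): creates the retiree row instead of updating a row that isn't there
  INSERT INTO ct_retired_emps VALUES emp_rec;
END;
/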
My immediate questions:
1.) I know nothing about debugging or troubleshooting PL/SQL, but I do know that I need to review the contents of the emp_rec variable above to see what's inside it so that I can better assess what's going on here. I've tried DBMS_OUTPUT.PUT_LINE(emp_rec), but that doesn't work and only creates another run-time error. What's a way I can output its contents regardless of run-time success? Would I need to code in an EXCEPTION section and THEN use DBMS_OUTPUT...? (See the sketch after this post.)
2.) SELECTING * in the first snippet above meant that I was disallowed the use of emp_rec.leavedate := SYSDATE; after the SELECT. How might one SELECT * AND use emp_rec.leavedate := SYSDATE; in the same execution section?
3.) With everything I've provided here, do you see any obvious issues?
It doesn't take a genius to realize I'm new at this stuff, but I am trying to provide everything I can to learn here so if any of you can help this poor guy out, he'd be very grateful.
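On question 1, a minimal sketch (assuming SET SERVEROUTPUT ON is enabled in the session): DBMS_OUTPUT.PUT_LINE takes a scalar, not a whole record, so print the individual fields, and an EXCEPTION handler lets the output appear even when the SELECT or UPDATE fails:
SET SERVEROUTPUT ON
DECLARE
  emp_rec ct_retired_emps%ROWTYPE;
BEGIN
  SELECT empno, ename, job, mgr, hiredate, SYSDATE, sal, comm, ct_department
    INTO emp_rec
    FROM ct_employee
   WHERE empno = &emp_num;
  DBMS_OUTPUT.PUT_LINE('empno=' || emp_rec.empno ||
                       ', leavedate=' || TO_CHAR(emp_rec.leavedate, 'YYYY-MM-DD') ||
                       ', deptno=' || emp_rec.deptno);
  UPDATE ct_retired_emps SET ROW = emp_rec WHERE empno = &emp_num;
  DBMS_OUTPUT.PUT_LINE(SQL%ROWCOUNT || ' row(s) updated');
EXCEPTION
  WHEN OTHERS THEN
    DBMS_OUTPUT.PUT_LINE('Failed: ' || SQLERRM);
    RAISE;
END;
/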

Similar Messages

  • Too many values when trying insert records by bulk collect

    Hi
    Can anyone advice on the bulk collect error please?
    Following is my code where I am getting too many values error...
    TYPE p_empid_type IS TABLE OF emp%ROWTYPE;
          v_empid               p_empid_type;
       BEGIN
          SELECT DISTINCT emp_id , 'ABC'
          BULK COLLECT INTO v_empid
                     FROM emp
                    WHERE empid IN (SELECT ord_id
                                       FROM table_x
                                      WHERE column_x = 'ABC');
          FORALL i IN v_empid.FIRST .. v_empid.LAST
             INSERT INTO my_table
                  VALUES v_empid(i);
          COMMIT;
    PL/SQL: ORA-00913: too many values in line - BULK COLLECT INTO v_empid

    Hello, since you're SELECTing a constant string, why not:
    TYPE p_empid_type IS TABLE OF INTEGER;
          v_empid               p_empid_type;
       BEGIN
          SELECT DISTINCT emp_id
          BULK COLLECT INTO v_empid
                     FROM emp
                    WHERE empid IN (SELECT ord_id
                                       FROM table_x
                                      WHERE column_x = 'ABC');
          FORALL i IN v_empid.FIRST .. v_empid.LAST
             INSERT INTO my_table
            VALUES (v_empid(i), 'ABC');
    Edit: untested, may not work.
    This would be the best BULK COLLECT of all:
    INSERT /*+ APPEND */ INTO my_table
    SELECT DISTINCT emp_id , 'ABC'
      FROM emp
    WHERE empid IN (SELECT ord_id
          FROM table_x
         WHERE column_x = 'ABC');
          COMMIT;
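    For completeness, a hedged sketch of how the BULK COLLECT version could be made to line up (assuming my_table really has just the two columns being inserted; column names are copied from the original post): define the collection over a record that has exactly the columns you select, rather than over emp%ROWTYPE:
    DECLARE
       TYPE emp_flag_rec IS RECORD (
          emp_id emp.emp_id%TYPE,
          flag   VARCHAR2(3)
       );
       TYPE emp_flag_tab IS TABLE OF emp_flag_rec;
       v_rows emp_flag_tab;
    BEGIN
       SELECT DISTINCT emp_id, 'ABC'
         BULK COLLECT INTO v_rows
         FROM emp
        WHERE empid IN (SELECT ord_id FROM table_x WHERE column_x = 'ABC');
       FORALL i IN 1 .. v_rows.COUNT
          INSERT INTO my_table VALUES v_rows(i);   -- record insert; the record must match my_table's columns
       COMMIT;
    END;
    /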

  • PL/SQL: ORA-00913: too many values

    I can't figure out why I'm getting an "ORA-00913: too many values" error.
    This Example works fine:
    DECLARE
    TYPE session_type IS TABLE OF v$session%ROWTYPE ;
    blocking_sessions session_type;
    BEGIN
    select * bulk collect into blocking_sessions from v$session where blocking_session is not null;
    END;
    But in this example I'm getting an ORA-00913. Can anybody tell me what I'm doing wrong?
    DECLARE
    TYPE session_type IS TABLE OF v$session%ROWTYPE ;
    blocking_sessions session_type;
    BEGIN
    select distinct blocking_session bulk collect into blocking_sessions from v$session where blocking_session is not null;
    END;
    select distinct blocking_session bulk collect into blocking_sessions from v$session where blocking_session is not null;
    ERROR at line 7:
    ORA-06550: line 7, column 70:
    PL/SQL: ORA-00913: too many values
    ORA-06550: line 7, column 1:
    PL/SQL: SQL Statement ignored

    OK this one works also:
    DECLARE
    TYPE session_type IS TABLE OF NUMBER ;
    blocking_sessions session_type;
    BEGIN
    select distinct blocking_session bulk collect into blocking_sessions from v$session where blocking_session is not null;
    END;
    But what if I'm selecting about 20 columns of a table that has 30 columns? Do I have to declare every single column?
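    Not necessarily: one common pattern (a hedged sketch; the column list here is just an example) is to let a cursor define the record shape, so the collection always matches whatever columns the cursor selects:
    DECLARE
      CURSOR c IS
        SELECT DISTINCT blocking_session, sid, serial#, username   -- any subset of columns
          FROM v$session
         WHERE blocking_session IS NOT NULL;
      TYPE session_type IS TABLE OF c%ROWTYPE;
      blocking_sessions session_type;
    BEGIN
      OPEN c;
      FETCH c BULK COLLECT INTO blocking_sessions;
      CLOSE c;
    END;
    /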

  • JDBC receiver error - ORA-00913: too many values

    Hi all,
    Facing a strange issue in a proxy-to-JDBC scenario.
    Message fails with error - ORA-00913: too many values
    Handling the missing fields from the source with "Empty String" in the communication channel.
    The empty-string handling works fine: whenever there is no value coming from the source, null is inserted into the field value on the DB.
    Inserting the same data manually works fine, but end-to-end it gives this error.
    When testing end-to-end, if the field values are made a couple of characters shorter it works fine; it fails with the actual data, so it seems to be an issue with the field lengths on the DB.
    Increased all the lengths by 10 characters; it still does not work.
    Except for the key fields of the DB table, all the others are nullable.
    Is there anything else I am missing?
    Note: Generally, ORA-00913: too many values comes when the number of fields and the number of values do not match in an INSERT/UPDATE statement.
    Regards

    Hi
    When testing end-to-end, if the field values are made a couple of characters shorter it works fine; it fails with the actual data, so it seems to be an issue with the field lengths on the DB. Increased all the lengths by 10 characters; it still does not work.
    Well, the error you mention generally occurs when the INSERT statement mentions more fields than the table has, i.e. you are specifying an extra field. Or, if the problem really is the length of one of the fields, you can check that in Oracle itself with the DESCRIBE command, which shows the length of each and every field.
    Hope your problem got resolved.
    Regards,
    Chetan Ahuja
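    As a concrete illustration of that advice (a hedged sketch; the table name is a placeholder), the declared lengths can be checked either with DESCRIBE or from the data dictionary:
    DESCRIBE my_target_table
    SELECT column_name, data_type, data_length, nullable
      FROM user_tab_columns
     WHERE table_name = 'MY_TARGET_TABLE'
     ORDER BY column_id;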

  • Insert ORA-00913: too many values  --  urgent help

    Hi there
    It's pretty urgent, I've got stuck...
    To avoid the undo snapshot error, I am using this procedure to migrate a huge volume of table data into new tables in smaller chunks. The code below works well when there are only a few columns, but the procedure does not work when a table has more than 30 columns, and it throws the error PL/SQL: ORA-00913: too many values
    CREATE OR REPLACE PROCEDURE migration AS
       TYPE array_tp IS TABLE OF tranproc%ROWTYPE;
       l_array array_tp;
       CURSOR c IS
          select * from tranproc p where trunc(date)<=trunc(sysdate)-180;
       l_cnt1 NUMBER :=0;
       l_cnt2 NUMBER :=0;
       l_cnt3 NUMBER :=0;
    BEGIN
       OPEN c;
       LOOP
          FETCH c BULK COLLECT INTO l_array LIMIT 10000;
          EXIT WHEN l_array.COUNT = 0;
          l_cnt1 := c%ROWCOUNT;
          FORALL i IN 1 .. l_array.COUNT
             INSERT INTO TMP_Transpoc VALUES l_array(i);
          l_cnt2 := l_cnt2 + SQL%rowcount;
       END LOOP;
       l_cnt3 := c%ROWCOUNT;
       CLOSE c;
       END;
    16    22    PL/SQL: ORA-00913: too many values
    It's failing at
    line 16: INSERT INTO TMP_Transpoc VALUES l_array(i);
    The above table, i.e. tranproc, has around 80 columns.
    I am not a PL/SQL expert; kindly advise how to resolve this. I am fine with an alternative approach, I just need to commit in smaller chunks.

    Actually, Direct Path does not necessarily require NOLOGGING. If you successfully invoke Direct Path (look for LOAD AS SELECT or DIRECT LOAD INTO in the execution plan) then you are inserting into blocks above the high-water mark (HWM) and there is virtually no UNDO generated for the changes in the table segment.
    However, the index maintenance (if any) will require UNDO, and it may be a lot. If this is going into a new table, then you should be able to create the index after the table is populated.
    Also beware of the NOLOGGING advice. In many cases, an individual SQL statement can not disable logging. And yet, if you do bypass REDO logging, be very sure you understand the consequences for your ability to recover.
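    A hedged sketch of the kind of set-based alternative the reply describes (untested; the date column name is a placeholder, and it assumes TMP_Transpoc has exactly the same column list as tranproc -- the ORA-00913 itself usually means the target table and the %ROWTYPE source do not have the same number of columns):
    -- One direct-path insert instead of FETCH / FORALL batches; generates
    -- little undo for the table blocks themselves (index maintenance aside).
    INSERT /*+ APPEND */ INTO TMP_Transpoc
      SELECT *
        FROM tranproc p
       WHERE TRUNC(p.some_date_col) <= TRUNC(SYSDATE) - 180;   -- placeholder for the real date column
    COMMIT;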

  • 11g-[nQSError: 42029] Subquery contains too many values for the IN predicat

    Hi,
    I have 2 reports; one is a subquery report which returns inputs to the main report. The report was working fine in 10g, but in 11g we are getting the following error:
    View Display Error
    Odbc driver returned an error (SQLExecDirectW).
    Error Details
    Error Codes: OPR4ONWY:U9IM8TAC:OI2DL65P
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 43113] Message returned from OBIS. [nQSError: 43119] Query Failed: [nQSError: 42029] Subquery contains too many values for the IN predicate.Please have your System Administrator look at the log for more details on this error. (HY000)
    Please have your System Administrator look at the log for more details on this error.
    We get the same error after modifying the parameter MAX_EXPANDED_SUBQUERY_PREDICATES to 12000.
    Please suggest what else could make it fail, or any other settings we need to check.
    Regards,
    ckeng

    ckeng,
    Normally the IN clause has a restriction of 10000 values; in plain SQL/PL-SQL we would go with inline queries instead. I think you should model your RPD so that it generates inner queries, e.g.
    select * from emp where dept_id in (Select distinct dept_id from dept);
    Or put a condition/filter on the sub-report and make one more inner report with a sub-filter, but that will definitely cause performance issues.
    thanks,
    Saichand.v

  • SQL question too many values?

    Hey guys, I have written an SQL script but it keeps erroring, saying too many values. I have this at the moment:
    SELECT DISTINCT Student.studentID, Student_medical.medical_complaint, Student_medical.medical_treatment
    FROM Student, student_medical
    IN (SELECT Student.studentID, Matron.staffno
    FROM Student, Matron)
    Group by Student.studentID;
    Basically what I'm trying to do is match the student ID to the medical complaint and treatment they have had and then match the student ID to the Matron that gave the treatment
    Any help you can provide would be greatly appreciated :)
    Thanks for reading.

    SomeoneElse wrote:
    Your subquery is wrong. You're using an in clause with one column but your subquery has two columns.
    Also, you have two tables in the subquery but no join condition.
    Can you include the matron table in the join with your other tables instead of a subquery?
    I tried this, but when I add the WHERE clause to match up the matron that treated the student, I for some reason get results where the student ID equals the matron ID: if the student ID is 9, the matron ID will be 9, and on the next row the student ID could be 1 and the matron ID will be 1, which is wrong :(! So I'm not sure what to do.
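    For what it's worth, a hedged sketch of the joined form being suggested. The columns linking the tables are guesses (treatedBy in particular is hypothetical, since the real schema isn't shown); a symptom like the student ID always equalling the matron ID usually means a join condition is missing and a Cartesian product is being filtered by accident:
    SELECT DISTINCT s.studentID,
           sm.medical_complaint,
           sm.medical_treatment,
           m.staffno
      FROM Student         s
      JOIN Student_medical sm ON sm.studentID = s.studentID   -- assumed link column
      JOIN Matron          m  ON m.staffno    = sm.treatedBy  -- hypothetical column
     ORDER BY s.studentID;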

  • Pl/Sql block returning too many values

    Hi
    Below is a simple PL/SQL block. While executing it I'm getting a "too many values" exception.
    declare
    attrib QW_ACCT_ATTR%ROWTYPE;
    begin
    SELECT ATTR_END_DATE, ATTR_NM, ATTR_VAL INTO attrib FROM (SELECT ATTR_END_DATE, ATTR_NM, ATTR_VAL FROM QW_ACCT_ATTR where CUST_ACCT_ID='5158660414' AND ATTR_NM = 'SS' ORDER BY ATTR_END_DATE DESC) where rownum = 1;
    DBMS_OUTPUT.PUT_LINE('end daate is...'||attrib.ATTR_END_DATE);
    end;
    Could anybody please help me rewrite this query to get only one record?
    We are not supposed to use cursors here.
    Thanks

    I am just changing your logic,
    declare
    attrib QW_ACCT_ATTR%ROWTYPE;
    begin
    SELECT ATTR_END_DATE, ATTR_NM, ATTR_VAL INTO attrib FROM (SELECT ATTR_END_DATE, ATTR_NM, ATTR_VAL FROM QW_ACCT_ATTR where CUST_ACCT_ID='5158660414' AND ATTR_NM = 'SS' ORDER BY ATTR_END_DATE DESC where rownum = 1) ;
    DBMS_OUTPUT.PUT_LINE('end daate is...'||attrib.ATTR_END_DATE);
    end;
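    Note that moving the ROWNUM filter inside the inline view after the ORDER BY, as above, will not parse; a hedged sketch of a variant that should compile (it also matches the %ROWTYPE declaration, because the inline view selects the whole row):
    DECLARE
      attrib QW_ACCT_ATTR%ROWTYPE;
    BEGIN
      SELECT *
        INTO attrib
        FROM (SELECT *
                FROM QW_ACCT_ATTR
               WHERE CUST_ACCT_ID = '5158660414'
                 AND ATTR_NM = 'SS'
               ORDER BY ATTR_END_DATE DESC)
       WHERE ROWNUM = 1;
      DBMS_OUTPUT.PUT_LINE('end date is...' || attrib.ATTR_END_DATE);
    END;
    /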

  • Too many values issues in sub query

    Hi ALL,
    I have a sql query like below :
    SELECT A,B,C,(SELECT D,E,F,G FROM ABC,DEF,... WHERE ABC.ID=DEF.ID......) NAME FROM TEST1,TEST2 WHERE .......
    My subquery is throwing a "too many values" error, so how can I get several column values from my subquery in a single SELECT statement?
    Please help me on this

    user13424229 wrote:
    Hi ALL,
    I have a sql query like below :
    SELECT A,B,C,(SELECT D,E,F,G FROM ABC,DEF,... WHERE ABC.ID=DEF.ID......) NAME FROM TEST1,TEST2 WHERE .......
    My subquery is throwing a "too many values" error, so how can I get several column values from my subquery in a single SELECT statement?
    A subquery in the SELECT list can select only 1 column and 1 record at a time.
    If you wish to cater for that, you will have to write the same select statement for each column individually, viz. D, E, F and G would amount to 4 individual scalar subqueries in your main select statement.
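    A hedged sketch of the two usual shapes (table, column and join names are the placeholders from the post; the "..." parts are left exactly as elided there):
    -- One scalar subquery per column (each must return at most one row)
    SELECT t1.A, t1.B, t1.C,
           (SELECT abc.D FROM ABC abc WHERE abc.ID = t1.ID) AS D,   -- correlation column is a placeholder
           (SELECT abc.E FROM ABC abc WHERE abc.ID = t1.ID) AS E
      FROM TEST1 t1, TEST2 t2
     WHERE ...;
    -- Or, usually better, pull all the extra columns at once with a join
    SELECT t1.A, t1.B, t1.C, abc.D, abc.E, abc.F, abc.G
      FROM TEST1 t1
      JOIN ABC   abc ON abc.ID = t1.ID   -- placeholder join condition
     WHERE ...;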

  • ORA-00913 too many values issue

    Hi,
    Here is my query; can anyone please suggest what could be wrong in my UPDATE query? I am getting the error ORA-00913, which says there are too many values in the clause.
    update [email protected]
    set processed_flag=1
    WHERE gs_FGCODE='SH' and GS_ID in
    (select ID from [email protected]
    WHERE ID in
    (select hid from [email protected]
    WHERE JAG_NAME='JAGET_ASN_Number' and JAG_VALUE in (
    select CUSTOMER_ID, c.CUST_NAME, SALES_ORG, DIVISION,SHIPMENT_ID,BILL_OF_LADING,
    SAP_PROCESS_DATE SHIP_DATE, SAP_CREATE_DATE, IDOC_NUMBER,
    to_char(IDOC_SENT_DATE), IDOC_LAST_UPDATE_DATE, IDOC_STATUS, IDOC_STATUS_DESC
    from [email protected] a, edit_customer_master c
    where
    c.DIV_CUSTOMER_ID=to_char(to_number(a.CUSTOMER_ID)) and SAP_CREATE_DATE > trunc(sysdate-7) and
    customer_id||DIVISION<>'0010000064GVA' and
    BILL_OF_LADING||DIVISION in (
    select substr(BILL_OF_LADING,1,17)||division from [email protected]
    minus
    select distinct substr(asn_number,1,17)||replace(replace(replace(business_id,63,'LLL'),2,'JJJ'),1,'KKK') from EDIT_OUT_TRANSACTION WHERE business_id not in (3,23) and TRANSACTION_TYPE=856
    I'd appreciate any inputs.
    TIA,
    RMG

    update [email protected]
    set processed_flag=1
    WHERE gs_FGCODE='SH' and GS_ID in
    (select ID from [email protected]
    WHERE ID in
    (select hid from [email protected]
    WHERE JAG_NAME='JAGET_ASN_Number' and JAG_VALUE in (
    select CUSTOMER_ID, c.CUST_NAME, SALES_ORG, DIVISION,SHIPMENT_ID,BILL_OF_LADING,
    SAP_PROCESS_DATE SHIP_DATE, SAP_CREATE_DATE, IDOC_NUMBER,
    to_char(IDOC_SENT_DATE), IDOC_LAST_UPDATE_DATE, IDOC_STATUS, IDOC_STATUS_DESC
    from [email protected] a, edit_customer_master c
    where
    c.DIV_CUSTOMER_ID=to_char(to_number(a.CUSTOMER_ID)) and SAP_CREATE_DATE > trunc(sysdate-7) and
    customer_id||DIVISION<>'0010000064GVA' and
    BILL_OF_LADING||DIVISION in (
    select substr(BILL_OF_LADING,1,17)||division from [email protected]
    minus
    select distinct substr(asn_number,1,17)||replace(replace(replace(business_id,63,'LLL'),2,'JJJ'),1,'KKK') from EDIT_OUT_TRANSACTION WHERE business_id not in (3,23) and TRANSACTION_TYPE=856
    I think the problem is here (the part shown in bold in the original post): the IN (...) subquery compared against JAG_VALUE selects many columns.
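    In other words, an IN (...) subquery compared against a single column must project exactly one expression. A hedged sketch (which of the selected columns JAG_VALUE is actually meant to match is a guess; SHIPMENT_ID here is only a placeholder, and the remaining predicates stay as in the original):
    ... and JAG_VALUE in (
            select a.SHIPMENT_ID            -- placeholder: the one column JAG_VALUE should match
              from [email protected] a, edit_customer_master c
             where c.DIV_CUSTOMER_ID = to_char(to_number(a.CUSTOMER_ID))
               and SAP_CREATE_DATE > trunc(sysdate - 7)
               -- ... remaining predicates unchanged ...
        )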

  • Document contains too many nodes error when extracting xml tag name

    I have a large XML file in which some tags contain ~: as their value.
    Now I am trying to extract all the tags which have ~: as their value, and store those column names in a table using the following query.
    insert into space_md select distinct xmltype(extract(value(x), '/').getstringval()).getrootelement() COLUMN_NAME
    from gt_xmltype_tab gt, TABLE(XMLSequence(extract(gt.xmlfile1, '/ROWSET/ROW/*'))) x
    where instr(extract(value(x),'/').getstringval(),'~:') > 1;
    The xml file was generated using dbms_xmlgen.getxml. Table has 48 columns and around 4000 rows.
    My insert query above gave me an ORA-31186 "too many nodes" error.
    I am using oracle version 10.2.0.3.
    Following are the set of commands I ran....
    SQL> insert into gt_xmltype_tab(xmlfile1)
    values(XMLType(bfilename('BKUP_RES','QC.xml'),nls_charset_id('AL32UTF8'))); 2
    1 row created.
    SQL> SQL>
    SQL> insert into restore_space_metadata select distinct 'QC', xmltype(extract(value(x), '/').getstringval()).getrootelement() COLUMN_NAME
    2 from gt_xmltype_tab gt, TABLE(XMLSequence(extract(gt.xmlfile1, '/ROWSET/ROW/*'))) x
    3 where instr(extract(value(x),'/').getstringval(),'~:') > 1;
    insert into restore_space_metadata select distinct 'QC', xmltype(extract(value(x), '/').getstringval()).getrootelement() COLUMN_NAME
    ERROR at line 1:
    ORA-31186: Document contains too many nodes
    Is there a better way of extracting the xml tag element name based on the xmltag content?
    There is one other table which has 172 columns but only 100 rows, so it doesn't cause any problem for that table.
    But this QC table has fewer columns and many, many rows...
    Any suggestions?

    There is a requirement to take a backup of a certain type of data and restore it.
    It was implemented with a flat-file approach, which was giving errors.
    So it was implemented using an XML approach:
    read the data, store it in an XML file, then read from the XML file and load it into the table.
    Further, I found that dbms_xmlstore is not able to handle a tag containing only whitespace,
    so I tried loading the XML file into an XMLTYPE table column and extracting the data from there.
    The XMLTYPE column has the same problem of ignoring whitespace when used with the extractvalue functions.
    So as a workaround I update the XML file so that any tag holding only whitespace contains the ~: characters once.
    After restoring the data from XML to the table, I would run an update query to change ~: back to " ".
    Now, instead of running a blind update from ~: to " " for all the tables and all the columns, I thought why not create an XML file of the tags containing ~:
    along with their table names.
    And that's where I found the "too many nodes" problem...
    The insert query you saw populates a table with the table_name and column_name of the tags that contain only ~:.
    I hope this gives you a fair idea of what I am doing.

  • Multiple Selects = Too many values?

    Hi. I have a table of employees, a table of departments (called accounts), and a table containing test results. Every employee is supposed to have a test each year, unless the test was "positive" in a previous year.
    I want a query that lists everyone that hasn't taken the test, but I don't want to include the ones that were "positive".
    I did some incremental query building, making sure each query returned what I wanted before making it more complex (for me). Apologies if I'm providing too much information.
    This returns just the test results that are positive.
    -- Get list of result codes that are positive
    SELECT ResultID FROM Result WHERE ResultTreatAsPositive = 1;
    This returns everyone that has had a positive result at any time.
    -- Get list of people having positive results at any time
    SELECT DISTINCT empID, empFName, empLName, empAcctNum, actDescription
    FROM Employee, Account, TestResult
    WHERE empID = testEmployeeID
    AND empAcctNum = actNumber
    AND empActive = 1
    AND testResult IN (SELECT ResultID FROM Result WHERE ResultTreatAsPositive = 1)
    ORDER BY actDescription;
    This returns everyone that had a test during the previous year
    -- Get everyone having any results during date range
    SELECT DISTINCT empID, empFName, empLName, empAcctNum, actDescription
    FROM Employee, Account, TestResult
    WHERE empID = testEmployeeID
    AND empAcctNum = actNumber
    AND testReadDate BETWEEN to_date('2005/01/01', 'yyyy/mm/dd') AND to_date('2005/12/31', 'yyyy/mm/dd')
    AND empActive = 1
    ORDER BY actDescription;
    So far, so good. However, when I try to put all of these together, I get errors:
    This should return all employees minus ones with a positive result (at any time) and minus any test taken this year
    -- Get everyone else
    SELECT DISTINCT empID, empFName, empLName, empTitle, empAcctNum, actDescription
    FROM Employee, Account, TestResult
    WHERE empID = testEmployeeID
    AND empAcctNum = actNumber
    AND empActive = 1
    AND (empID NOT IN (SELECT DISTINCT empID, empFName, empLName, empAcctNum, actDescription FROM Employee, Account, TestResult WHERE empID = testEmployeeID AND empAcctNum = actNumber AND empActive = 1 AND testResult IN (SELECT ResultID FROM Result WHERE ResultTreatAsPositive = 1)))
    AND (empID NOT IN (SELECT DISTINCT empID, empFName, empLName, empAcctNum, actDescription FROM Employee, Account, TestResult WHERE empID = testEmployeeID AND empAcctNum = actNumber AND testReadDate BETWEEN to_date('1988/01/01', 'yyyy/mm/dd') AND to_date('2007/01/01', 'yyyy/mm/dd') AND empActive = 1))
    ORDER BY actDescription, empLName, empFName;
    I get ERROR at line 6: (my first empID NOT IN line)
    ORA-00913: too many values.
    How can I write this query so that it works? Any tips are appreciated, even those that point out I can't write a query very well!
    --G

    Missed that one and was thrown off by those 2005, 1998, 2007 dates …
    Looks like you're on the right track though.
    Sorry, I cobbled that code together from some of my tests (grab everything -- seeing multiple rows makes me feel better!).
    Looking at your suggestion on TRUNC, I can't really do that, as the user can specify everyone that doesn't qualify for a single month -- apparently, some departments are assigned a month to have their tests, while others can go the entire year.
    Thanks for the help.
    --G
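    For reference, a hedged sketch of the fix the error points at (each NOT IN subquery reduced to the single empID column it is compared against; untested against the real schema):
    SELECT DISTINCT empID, empFName, empLName, empTitle, empAcctNum, actDescription
    FROM Employee, Account, TestResult
    WHERE empID = testEmployeeID
    AND empAcctNum = actNumber
    AND empActive = 1
    AND empID NOT IN (SELECT testEmployeeID
                      FROM TestResult
                      WHERE testResult IN (SELECT ResultID FROM Result WHERE ResultTreatAsPositive = 1))
    AND empID NOT IN (SELECT testEmployeeID
                      FROM TestResult
                      WHERE testReadDate BETWEEN to_date('2005/01/01', 'yyyy/mm/dd')
                                             AND to_date('2005/12/31', 'yyyy/mm/dd'))
    ORDER BY actDescription, empLName, empFName;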

  • ORA-00913: too many values when running a Delete

    Hi, when I run both of the select statements including the MINUS it works,
    but when I run the DELETE statement it gives me the error ORA-00913: too many values.
    Can someone tell me if there is anything I am missing?
    DELETE FROM tablename where rowid in
    select rowid,
    to_number(lndr_spcl_allow_int_rate_pct),
    tablename.lndr_spcl_end_prin_bal_amt,
    tablename.lndr_spcl_avg_dly_prin_bal_amt,
    tablename.lndr_spcl_adj_avg_dly_prin_amt,
    tablename.calculated_amt,
    tablename.paid_amt,
    tablename.reported_amt,
    --tablename.processing_date_id, this date changes 
    tablename.date_paid_id,
    tablename.lndr_spcl_allwnc_ctgry_id,
    tablename.affected_period_qtr_date_id,
    tablename.loan_type_id,
    tablename.lndr_id,
    tablename.report_fiscal_qtr_date_id,
    tablename.report_qtr_date_id,
    tablename.lndr_geography_id,
    tablename.lndr_billing_code_id,
    tablename.orig_billing_code,
    tablename.orig_allowance_category,
    tablename.form_id,
    tablename.srvcr_id,
    tablename.document_long_desc,
    tablename.document_date_id,
    tablename.affected_period_qtr_date_desc,
    tablename.document_type
    from tablename
    where lndr_spcl_allow_int_rate_pct not like ('%V%')
    and lndr_spcl_allow_int_rate_pct not like ('N%')
    minus
    select min(rowid),
    to_number(lndr_spcl_allow_int_rate_pct),
    tablename.lndr_spcl_end_prin_bal_amt,
    tablename.lndr_spcl_avg_dly_prin_bal_amt,
    tablename.lndr_spcl_adj_avg_dly_prin_amt,
    tablename.calculated_amt,
    tablename.paid_amt,
    tablename.reported_amt,
    --tablename.processing_date_id, this date changes 
    tablename.date_paid_id,
    tablename.lndr_spcl_allwnc_ctgry_id,
    tablename.affected_period_qtr_date_id,
    tablename.loan_type_id,
    tablename.lndr_id,
    tablename.report_fiscal_qtr_date_id,
    tablename.report_qtr_date_id,
    tablename.lndr_geography_id,
    tablename.lndr_billing_code_id,
    tablename.orig_billing_code,
    tablename.orig_allowance_category,
    tablename.form_id,
    tablename.srvcr_id,
    tablename.document_long_desc,
    tablename.document_date_id,
    tablename.affected_period_qtr_date_desc,
    tablename.document_type
    from tablename
    where lndr_spcl_allow_int_rate_pct not like ('%V%')
    and lndr_spcl_allow_int_rate_pct not like ('N%')
    group by to_number(lndr_spcl_allow_int_rate_pct),
    tablename.lndr_spcl_end_prin_bal_amt,
    tablename.lndr_spcl_avg_dly_prin_bal_amt,
    tablename.lndr_spcl_adj_avg_dly_prin_amt,
    tablename.calculated_amt,
    tablename.paid_amt,
    tablename.reported_amt,
    --tablename.processing_date_id, this date changes 
    tablename.date_paid_id,
    tablename.lndr_spcl_allwnc_ctgry_id,
    tablename.affected_period_qtr_date_id,
    tablename.loan_type_id,
    tablename.lndr_id,
    tablename.report_fiscal_qtr_date_id,
    tablename.report_qtr_date_id,
    tablename.lndr_geography_id,
    tablename.lndr_billing_code_id,
    tablename.orig_billing_code,
    tablename.orig_allowance_category,
    tablename.form_id,
    tablename.srvcr_id,
    tablename.document_long_desc,
    tablename.document_date_id,
    tablename.affected_period_qtr_date_desc,
    tablename.document_type)

    Perhaps this should be your final query ->
    DELETE FROM tablename
    where rowid in (
                      select rowid
                      from tablename
                      where lndr_spcl_allow_int_rate_pct not like ('%V%')
                      and lndr_spcl_allow_int_rate_pct not like ('N%')
                      minus
                      select min(rowid)
                      from tablename
                      where lndr_spcl_allow_int_rate_pct not like ('%V%')
                      and lndr_spcl_allow_int_rate_pct not like ('N%')
                      group by to_number(lndr_spcl_allow_int_rate_pct),
                      tablename.lndr_spcl_end_prin_bal_amt,
                      tablename.lndr_spcl_avg_dly_prin_bal_amt,
                      tablename.lndr_spcl_adj_avg_dly_prin_amt,
                      tablename.calculated_amt,
                      tablename.paid_amt,
                      tablename.reported_amt,
                      --tablename.processing_date_id, this date changes
                      tablename.date_paid_id,
                      tablename.lndr_spcl_allwnc_ctgry_id,
                      tablename.affected_period_qtr_date_id,
                      tablename.loan_type_id,
                      tablename.lndr_id,
                      tablename.report_fiscal_qtr_date_id,
                      tablename.report_qtr_date_id,
                      tablename.lndr_geography_id,
                      tablename.lndr_billing_code_id,
                      tablename.orig_billing_code,
                      tablename.orig_allowance_category,
                      tablename.form_id,
                      tablename.srvcr_id,
                      tablename.document_long_desc,
                      tablename.document_date_id,
                      tablename.affected_period_qtr_date_desc,
                      tablename.document_type
                    );
    N.B.: Not tested...
    Regards.
    Satyaki De.

  • ORA-00913: too many values - how to change decimal separator?

    I want to use SQL Developer's database export and need advice on how to tweak the decimal separator from , to . in the SQL inserts it creates.
    Preferably in SQL Developer itself, because this error can easily happen and is hard to catch during import.
    See below how to reproduce this:
    1.) DDL (ugly)
    set SERVEROUTPUT on
    CREATE TABLE AADECIMEXPORT
    ( "DECIMALNUMBER" NUMBER
    , "HUBBABUBBA" varchar2(20 byte)
    );
    CREATE TABLE succeeded.
    2.) Inserts by hand
    insert into AADECIMEXPORT(DECIMALNUMBER,HUBBABUBBA) values(10,'smells integer');
    insert into AADECIMEXPORT(DECIMALNUMBER,HUBBABUBBA) values(3.141592654,'smells rounded pi');
    1 rows inserted
    1 rows inserted
    3.) select * from AADECIMEXPORT
    DECIMALNUMBER HUBBABUBBA
    10 SMELLS INTEGER
    3,141592654 smells rounded pi
    4.) Then use SQL Developer's Tools > "Database Export" to export this table
    -- File created - tiistai-marraskuu-23-2010
    -- DDL for Table AADECIMEXPORT
    CREATE TABLE "AADECIMEXPORT"
    (     "DECIMALNUMBER" NUMBER,
         "HUBBABUBBA" VARCHAR2(20)
    -- DATA FOR TABLE AADECIMEXPORT
    -- FILTER = none used
    REM INSERTING into AADECIMEXPORT
    Insert into AADECIMEXPORT (DECIMALNUMBER,HUBBABUBBA) values (10,'smells integer');
    Insert into AADECIMEXPORT (DECIMALNUMBER,HUBBABUBBA) values (3,141592654,'smells rounded pi');
    -- END DATA FOR TABLE AADECIMEXPORT
    5.) Test the insert
    Insert into AADECIMEXPORT (DECIMALNUMBER,HUBBABUBBA) values (3,141592654,'smells rounded pi');
    Error starting at line 49 in command:
    Insert into AADECIMEXPORT (DECIMALNUMBER,HUBBABUBBA) values (3,141592654,'smells rounded pi')
    Error at Command Line:49 Column:12
    Error report:
    SQL Error: ORA-00913: too many values
    00913. 00000 - "too many values"
    *Cause:
    *Action:

    About
    Oracle SQL Developer 2.1.1.64
    Version 2.1.1.64
    Build MAIN-64.45
    Copyright © 2005,2009 Oracle. All Rights Reserved.
    IDE Version: 11.1.1.2.36.55.30
    Product ID: oracle.sqldeveloper
    Product Version: 11.1.1.64.45
    Version
    Component     Version
    =========     =======
    Java(TM) Platform     1.6.0_14
    Oracle IDE     2.1.1.64.45
    Versioning Support     2.1.1.64.45
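    Until the export is configured differently, a common workaround (a hedged sketch, not specific to any particular SQL Developer version) is to force the numeric characters for the session that generates or runs the inserts, so the literals use a decimal point; SQL Developer also has NLS settings in its preferences (under Database) which may need to match:
    -- Make the session render and parse numbers with '.' as the decimal separator
    ALTER SESSION SET NLS_NUMERIC_CHARACTERS = '.,';
    -- With that setting, the generated insert comes out unambiguously:
    Insert into AADECIMEXPORT (DECIMALNUMBER,HUBBABUBBA) values (3.141592654,'smells rounded pi');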
