SQL LOADER DATA INSERT ISSUES

I tried inserting data into an empty table in my database but got the following error:
SQL*Loader: Release 11.1.0.6.0 - Production on Tue Jul 7 22:01:35 2009
Copyright (c) 1982, 2007, Oracle. All rights reserved.
SQL*Loader-350: Syntax error at line 7.
Expecting "," or ")", found "VARCHAR2".
countryname VARCHAR2(25),
^
What is wrong with this, please? Below are the contents of the .CTL and .DAT files.
Content of load_country.CTL
LOAD DATA
infile '/D:/Data/Load/load_country.dat'
INTO TABLE ASSET.COUNTRY
INSERT
FIELDS TERMINATED BY ';'
(countrycode,
countryname VARCHAR2(25),
population NUMBER(10))
The content of the load_country.DAT file:
BEN;Benin;8791832
BFA;Burkina Faso;15746232
CPV;Cape Verde;503000
Thank you.
Ofonime Essien.

Hello,
After solving the Varchar2 problem, I received this error for the NUMBER data type:
SQL*Loader: Release 11.1.0.6.0 - Production on Wed Jul 8 11:47:16 2009
Copyright (c) 1982, 2007, Oracle. All rights reserved.
SQL*Loader-350: Syntax error at line 8.
Expecting "," or ")", found "number".
population number(10))
^
I replaced the Number data type with Float but received this error:
SQL*Loader: Release 11.1.0.6.0 - Production on Wed Jul 8 11:52:27 2009
Copyright (c) 1982, 2007, Oracle. All rights reserved.
SQL*Loader-350: Syntax error at line 8.
Expecting valid column specification, "," or ")", found "(".
population Float(10))
^
After replacing the NUMBER data type with INTEGER, I received this error:
SQL*Loader: Release 11.1.0.6.0 - Production on Wed Jul 8 12:17:11 2009
Copyright (c) 1982, 2007, Oracle. All rights reserved.
Control File: D:\Data\Load\tsl.CTL
Data File: D:\Data\Load\c.dat
Bad File: D:\Data\Load\c.bad
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table C, loaded from every logical record.
Insert option in effect for this table: INSERT
Column Name                  Position   Len  Term Encl Datatype
COUNTRYCODE                  FIRST      *    ;         CHARACTER
COUNTRYNAME                  NEXT       27             VARCHAR
POPULATION                   NEXT       10             INTEGER
SQL*Loader-941: Error during describe of table C
ORA-04043: object C does not exist
Thank you so much.
Ofonime.
Edited by: ofonime on Jul 8, 2009 7:09 PM
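For reference, here is a corrected control file that should get past all three syntax errors (an untested sketch based on the files shown above). SQL*Loader control files use SQL*Loader field types such as CHAR and INTEGER EXTERNAL, not the database column types VARCHAR2, NUMBER or FLOAT. The final ORA-04043 is a separate problem: the control file tsl.CTL says INTO TABLE C and no table named C exists, so the INTO TABLE clause must name the real target (ASSET.COUNTRY). The INFILE path below also drops the stray leading slash from the original.

LOAD DATA
INFILE 'D:\Data\Load\load_country.dat'
INSERT
INTO TABLE ASSET.COUNTRY
FIELDS TERMINATED BY ';'
(countrycode,
 countryname CHAR(25),
 population  INTEGER EXTERNAL(10))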

Similar Messages

  • Using SQL Loader to insert data based on conditional statements

    I would like to use sql loader to insert records from a single text file into an Oracle database table. Sounds easy but the problem is that I need to check the existence of data in another table (Table A) before data can be loaded into Table B.
    Table A has two columns: dept_no (primary key value) and dept_name (char)
    Table B has five columns: emp_no (primary key value), dept_no (foreign key value to Table A), employee (char), job_title (char) and salary (number)
    Text File looks like this:
    Finance Jones President 10000
    HR Smith Admin 2000
    HelpDesk Jenkins Technician 3000
    Problem:
    1. I need SQL Loader to insert records into Table B only after checking that the first field in the file exists in Table A.
    2. If the value exists, get its dept_no value.
    3. If the value doesn't exist, discard the record (I might want to have SQL Loader insert a new record for it into Table A).
    4. Using the value from #2, insert that value into Table B's dept_no column.
    5. Also assign a sequence value for the emp_no value in Table B.
    Any guidance is greatly appreciated.
    Thanks,

    Hello,
    I am not sure this is possible with SQL loader. I would rather use an external table based on your file.
    Then, I would use SQL to load your data into your table, based on your conditions.
    Your request is not very complicated; writing the SQL to do that will be much simpler than trying to do it with SQL Loader.
    Hope it will help.
    Regards,
    Sylvie
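    As a rough illustration of the external-table route (all names here — emp_ext, emp_seq, data_dir, table_a, table_b, emp_file.txt — are hypothetical, and the file is assumed to be space-delimited as in the sample and reachable through a directory object on the database server):

    -- External table over the text file
    CREATE TABLE emp_ext (
      dept_name VARCHAR2(30),
      employee  VARCHAR2(30),
      job_title VARCHAR2(30),
      salary    NUMBER
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY WHITESPACE
      )
      LOCATION ('emp_file.txt')
    );

    -- Load only rows whose department exists in Table A; rows that do not match are simply not selected
    INSERT INTO table_b (emp_no, dept_no, employee, job_title, salary)
    SELECT emp_seq.NEXTVAL, a.dept_no, e.employee, e.job_title, e.salary
    FROM   emp_ext e, table_a a
    WHERE  a.dept_name = e.dept_name;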

  • SQL Loader and Insert Into Performance Difference

    Hello All,
    I'm in a situation where I need to measure the performance difference between SQL Loader and INSERT INTO. Say there are 10,000 records in a flat file and I want to load them into a staging table.
    I know that if I use PL/SQL UTL_FILE to do this job, performance will degrade (don't ask me why I'm going for UTL_FILE instead of SQL Loader). But I don't know by how much. Can anybody tell me the performance difference in % (like a 20% decrease) in the case of 10,000 records?
    Thanks,
    Kannan.

    Kannan B wrote:
    Do not confuse the topic. As I told you, I'm not going to use external tables. This post is to discuss the performance difference between SQL Loader and a simple insert statement.
    I don't think people are confusing the topic.
    External tables are a superior means of reading a file, as they don't require any command-line calls or external control files to be set up. All that is needed is a single external table definition, created in a similar way to any other table (just with the additional external table information). It also eliminates the need for a 'staging' table on the database, since the data can be queried as needed directly from the file; and if the file changes, so does the data seen through the external table, without the need to re-run any SQL*Loader process.
    Who told you not to use External Tables? Do they know what they are talking about? Can they give a valid reason why external tables are not to be used?
    IMO, if you're considering SQL*Loader, you should be considering External tables as a better alternative.

  • SQL Loader UTF8 Charset  issue

    Hi,
    My database character set is UTF8, and the NLS_LANG parameter on the mid tier (Unix) is UTF8. I have created a spool file (Korean language) in the UTF8 character set and verified the .dat file with a Unicode text editor. When I try to load the data back into the database with SQL Loader, it is not uploaded properly; it arrives after some internal encoding/decoding. I have also tried the CHARACTERSET parameter in the control file set to UTF8.
    Could anyone please help me identify the issue and its resolution?
    Thanks in advance.
    Sanyog

    Also useful for you (NLS_LANG and sqlloader) may be this thread:
    sqllder especial carector issue
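    For reference, a minimal sketch of how the character set is usually made explicit on both sides (table, file and column names below are placeholders; whether UTF8 or AL32UTF8 is right depends on how the file was actually written):

    -- On the Unix mid tier, before running sqlldr:
    --   export NLS_LANG=AMERICAN_AMERICA.UTF8
    LOAD DATA
    CHARACTERSET UTF8
    INFILE 'korean_data.dat'
    APPEND
    INTO TABLE my_table
    FIELDS TERMINATED BY ','
    (id,
     name_kr CHAR(300))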

  • Create sql loader data file dynamically

    Hi,
    I want a sample program/approach for creating a SQL Loader data file.
    The program will take a table name as input and will use a
    SELECT statement whose column list is derived from USER_TAB_COLUMNS in the data dictionary,
    assuming there are multiple CLOB columns in the column list.
    Thanks
    Manoj

    I'm writing CLOB and other columns to a SQL Loader .dat file.
    The sample code below for writing a CLOB column is giving a file write error.
    How can I write multiple CLOBs to the .dat file so that the control file will handle them correctly?
    -- l_file_handle and l_rec_type are declared elsewhere in the original program
    offset     NUMBER := 1;
    chunk      VARCHAR2(32000);
    chunk_size NUMBER := 32000;

    WHILE offset < dbms_lob.getlength(l_rec_type.narrative)
    LOOP
      -- read the next chunk of the CLOB and append it to the current output line
      chunk := dbms_lob.substr(l_rec_type.narrative, chunk_size, offset);
      utl_file.put(l_file_handle, chunk);
      utl_file.fflush(l_file_handle);
      offset := offset + chunk_size;
    END LOOP;
    utl_file.new_line(l_file_handle);
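    An alternative often used for multiple CLOB columns (not from this thread; a hedged sketch with made-up names) is to write each CLOB to its own file and let SQL*Loader read it back with a LOBFILE clause, which sidesteps UTL_FILE line-length limits entirely:

    LOAD DATA
    INFILE 'main.dat'
    INTO TABLE my_table
    FIELDS TERMINATED BY ','
    ( id           CHAR(10),
      narrative_fn FILLER CHAR(100),                        -- column in main.dat holding the CLOB's file name
      narrative    LOBFILE(narrative_fn) TERMINATED BY EOF  -- entire secondary file goes into the CLOB column
    )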

  • Why is the delivery date the same date as 'transptn plan date', 'loading date', 'goods issue' and 'GR end date'?

    Hi Experts,
    Why is the delivery date the same date as 'transptn plan date', 'loading date', 'goods issue' and 'GR end date'?
    In the shipping tab I can see Planned Deliv. Time 170 days... what could be the reason?
    Many Thanks:
    Raj Kashyap

    Hi Jurgen,
    Thanks for the quick reply!!
    But I did not find anything like that. What could be the customizing? We are using GATP on the APO side.
    \Raj Kashyap

  • SQL*Loader Date issue

    How do I get SQL Loader to recognize AM/PM? My control file is below. The error I get is: syntax error at line 14. Expecting field-name, found ")".
    OPTIONS ( SKIP=0)
    LOAD DATA
    INFILE '/ftp_data/labor_scheduling/kronos_punch_temp/kronos_punches.csv'
    BADFILE '/ftp_data/labor_scheduling/kronos_punches/kronos_punches.bad'
    DISCARDFILE '/ftp_data/labor_scheduling/kronos_punches/kronos_punches.dsc'
    INTO TABLE "STAFFING"."KRONOS_TEMP_PUNCHES"
    TRUNCATE
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"'
    (EMPLOYEE_ID,
    SCHEDULE_DATE DATE "MM/DD/YY",
    PUNCH DATE "MM/DD/YY HH:MI AM"
    )

    Hi Gary,
    Could your problem be with this?
    (EMPLOYEE_ID,
    You have no data type for that field.
    Regards
    Peter
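    If the DATE mask alone keeps tripping the parser, another technique (not from this thread; an untested sketch) is to read the punch as CHAR and convert it with a SQL string in the control file, which makes the AM/PM handling explicit:

    (EMPLOYEE_ID,
     SCHEDULE_DATE DATE "MM/DD/YY",
     PUNCH CHAR "TO_DATE(:PUNCH, 'MM/DD/YY HH:MI AM')"
    )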

  • Sql loader - Data loading issue with no fixed record length

    Hi All,
    I am trying to load the following data through SQL Loader. However, only records 1, 3 and 4 load successfully into the table; the rest of the records show up as BAD. What is missing in my syntax?
    .ctl file:
    LOAD DATA
    INFILE 'C:\data.txt'
    BADFILE 'c:\data.BAD'
    DISCARDFILE 'c:\data.DSC' DISCARDMAX 50000
    INTO TABLE icap_gcims
    TRAILING NULLCOLS
         CUST_NBR_MAIN          POSITION(1:9) CHAR NULLIF (CUST_NBR_MAIN=BLANKS),
         CONTACT_TYPE          POSITION(10:11) CHAR NULLIF (CONTACT_TYPE=BLANKS),
         INQUIRY_TYPE          POSITION(12:13) CHAR NULLIF (INQUIRY_TYPE=BLANKS),
         INQUIRY_MODEL          POSITION(14:20) CHAR NULLIF (INQUIRY_MODEL=BLANKS),
         INQUIRY_COMMENTS     POSITION(21:60) CHAR NULLIF (INQUIRY_COMMENTS=BLANKS),
         OTHER_COLOUR POSITION(61:75) CHAR NULLIF (OTHER_COLOUR=BLANKS),
         OTHER_MAKE          POSITION(76:89) CHAR NULLIF (OTHER_MAKE=BLANKS),
         OTHER_MODEL_DESCRIPTION POSITION(90:109) CHAR NULLIF (OTHER_MODEL_DESCRIPTION=BLANKS),
         OTHER_MODEL_YEAR POSITION(110:111) CHAR NULLIF (OTHER_MODEL_YEAR=BLANKS)
    data.txt file:
    000000831KHAN
    000000900UHFA WANTS NEW WARRANTY ID 000001017OHAL
    000001110KHAP
    000001812NHDE231291COST OF SERVICE INSPECTIONS TOO HIGH MAXIMA 92 MK
    000002015TPFA910115CUST UPSET WITH AIRPORT DLR. $200 FOR PLUGS,OIL,FILTER CHANGE. FW
    Thanks,

    Hi,
    It would have been better if you had given the table structure. I checked your script and it was fine:
    11:39:01 pavan_Real>create table test1(
    11:39:02   2  CUST_NBR_MAIN  varchar2(50),
    11:39:02   3  CONTACT_TYPE varchar2(50),
    11:39:02   4  INQUIRY_TYPE varchar2(50),
    11:39:02   5  INQUIRY_MODEL varchar2(50),
    11:39:02   6  INQUIRY_COMMENTS varchar2(50),
    11:39:02   7  OTHER_COLOUR varchar2(50),
    11:39:02   8  OTHER_MAKE varchar2(50),
    11:39:02   9  OTHER_MODEL_DESCRIPTION varchar2(50),
    11:39:02  10  OTHER_MODEL_YEAR varchar2(50)
    11:39:02  11  );
    Table created.
    11:39:13 pavan_Real>select  * from test1;
    no rows selected
    C:\Documents and Settings\ivy3905>sqlldr ara/ara@pavan_real control=C:\control.ctl
    SQL*Loader: Release 9.2.0.1.0 - Production on Sat Sep 12 11:41:27 2009
    Copyright (c) 1982, 2002, Oracle Corporation.  All rights reserved.
    Commit point reached - logical record count 5
    11:42:20 pavan_Real>select count(*) from test1;
      COUNT(*)
             5
    control.ctl:
    LOAD DATA
    INFILE 'C:\data.txt'
    BADFILE 'c:\data.BAD'
    DISCARDFILE 'c:\data.DSC' DISCARDMAX 50000
    INTO TABLE test1
    TRAILING NULLCOLS
    CUST_NBR_MAIN POSITION(1:9) CHAR NULLIF (CUST_NBR_MAIN=BLANKS),
    CONTACT_TYPE POSITION(10:11) CHAR NULLIF (CONTACT_TYPE=BLANKS),
    INQUIRY_TYPE POSITION(12:13) CHAR NULLIF (INQUIRY_TYPE=BLANKS),
    INQUIRY_MODEL POSITION(14:20) CHAR NULLIF (INQUIRY_MODEL=BLANKS),
    INQUIRY_COMMENTS POSITION(21:60) CHAR NULLIF (INQUIRY_COMMENTS=BLANKS),
    OTHER_COLOUR POSITION(61:75) CHAR NULLIF (OTHER_COLOUR=BLANKS),
    OTHER_MAKE POSITION(76:89) CHAR NULLIF (OTHER_MAKE=BLANKS),
    OTHER_MODEL_DESCRIPTION POSITION(90:109) CHAR NULLIF (OTHER_MODEL_DESCRIPTION=BLANKS),
    OTHER_MODEL_YEAR POSITION(110:111) CHAR NULLIF (OTHER_MODEL_YEAR=BLANKS)
    data.txt
    000000831KHAN
    000000900UHFA WANTS NEW WARRANTY ID 000001017OHAL
    000001110KHAP
    000001812NHDE231291COST OF SERVICE INSPECTIONS TOO HIGH MAXIMA 92 MK
    000002015TPFA910115CUST UPSET WITH AIRPORT DLR. $200 FOR PLUGS,OIL,FILTER CHANGE. FW
    CUST_NBR_MAIN     CONTACT_TYPE     INQUIRY_TYPE     INQUIRY_MODEL     INQUIRY_COMMENTS     OTHER_COLOUR     OTHER_MAKE     OTHER_MODEL_DESCRIPTION     OTHER_MODEL_YEAR
    000000831     KH     AN     NULL     NULL     NULL     NULL     NULL     NULL
    000000900     UH     FA      WANTS     NEW WARRANTY ID 000001017OHAL     NULL     NULL     NULL     NULL
    000001110     KH     AP     NULL     NULL     NULL     NULL     NULL     NULL
    000001812     NH     DE     231291C     OST OF SERVICE INSPECTIONS TOO HIGH MAXI     MA 92 MK     NULL     NULL     NULL
    000002015     TP     FA     910115C     UST UPSET WITH AIRPORT DLR. $200 FOR PLU     GS,OIL,FILTER C     HANGE. FW     NULL     NULL
    - Pavan Kumar N
    Edited by: Pavan Kumar on Sep 12, 2009 11:46 AM

  • SQL*Loader to insert data file name during load

    I'd like to use a single control file to load data from different files (at different times) to the same table. I'd like this table to have a column to hold the name of the file the data came from. Is there a way for SQL*Loader to automatically do this? (I.e., as opposed to running an update query separately.) I can edit the control file before each load to set a CONSTANT to hold the new data file name, but I'd like it to pick this up automatically.
    Thanks for any help.
    -- Harvey

    Hello Harvey.
    I've previously attempted to store a value in a global/local OS variable and use it within a SQL*Loader control file (Unix OS and Oracle versions between 7.3.4 and 10g). I was unsuccessful in every attempt and approach I could imagine. However, it was very easy to use a sed script to make a copy of the control file, changing a string within it, to achieve this.
    Do you really want to store a file name on each and every record? Perhaps an alternative would be to use a relational model. Create a file upload log table that would store the file name and an upload # and then have the SQL*Loader control file call a function that would read that table for the most recent upload #. You'll save some disk space too.
    Hope this helps,
    Luke
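    A rough sketch of Luke's suggestion (object names are made up, and the EXPRESSION field assumes a conventional-path load):

    -- Log table holding one row per upload, plus a helper that returns the latest file name
    CREATE TABLE file_upload_log (
      upload_no NUMBER PRIMARY KEY,
      file_name VARCHAR2(200) NOT NULL
    );

    CREATE OR REPLACE FUNCTION current_upload_file RETURN VARCHAR2 IS
      l_name file_upload_log.file_name%TYPE;
    BEGIN
      SELECT file_name INTO l_name
      FROM   file_upload_log
      WHERE  upload_no = (SELECT MAX(upload_no) FROM file_upload_log);
      RETURN l_name;
    END;
    /

    -- In the control file, populate the column from the function rather than from the data file:
    --   file_name EXPRESSION "current_upload_file()"

    Insert a row into file_upload_log (and copy the file into place) before each sqlldr run, and every loaded record picks up the current file name without editing the control file.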

  • SQL Loader - data exceeds maximum length

    I am having an issue with SQL Loader falsely reporting that a column is too long in a CSV upload file. The offending column, Notes, is defined in the staging table as VARCHAR2(1000). The text in the Notes column in the upload file for the record being rejected is only 237 characters long. I examined the raw data file with a hex editor and there are no special characters embedded in the column. The CSV upload was recreated but the false error remains.
    Any ideas what to check? Any suggestion appreciated.
    Here are the pertinent files.
    Control File:
    LOAD DATA
       INFILE 'Mfield_Upl.dat'
       BADFILE 'Mfield_Upl.bad'
       TRUNCATE
       INTO TABLE Mfield_UPL_Staging
       FIELDS TERMINATED BY ',' optionally enclosed by '"'
         ControlNo CHAR,
         PatientID CHAR,
         CollectDate DATE "MM/DD/YYYY",
         TestDate DATE "MM/DD/YYYY",
         AnalyteDesc CHAR,
         Results CHAR,
         HiLoFlag CHAR,
         LoRange CHAR,
         HiRange CHAR,
         UnitOfMeas CHAR,
         Comments CHAR,
         Notes CHAR,
         ClinicalEvent CHAR,
         OwnerLName CHAR,
         OwnerFName CHAR,
         PetName CHAR,
         AssecNo CHAR,
         SpecimenID CHAR
    Staging Table:
    CREATE TABLE Mfield_UPL_Staging
        ControlNo      VARCHAR2(20),
        PatientID      VARCHAR2(9),
        CollectDate    DATE,
        TestDate       DATE,
        AnalyteDesc    VARCHAR2(100),
        Results        VARCHAR2(100),
        HiLoFlag       CHAR(10),
        LoRange        VARCHAR2(15),
        HIRange        VARCHAR2(15),
        UnitOfMeas     VARCHAR2(25),
        Comments       VARCHAR2(100),
        Notes          VARCHAR2(1000),
        ClinicalEvent  VARCHAR2(20),
        OwnerLName     VARCHAR(50),
        OwnerFName     VARCHAR(50),
        PetName        VARCHAR(50),
        AssecNo        NUMBER(10),
        SpecimenID     NUMBER(10)
    Error Log File:
    SQL*Loader: Release 9.2.0.1.0 - Production on Wed Aug 11 08:22:58 2010
    Copyright (c) 1982, 2002, Oracle Corporation.  All rights reserved.
    Control File:   Mfield_UPL_CSV.ctl
    Data File:      Mfield_UPL.dat
      Bad File:     Mfield_Upl.bad
      Discard File:  none specified
    (Allow all discards)
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 50
    Bind array:     64 rows, maximum of 256000 bytes
    Continuation:    none specified
    Path used:      Conventional
    Table MFIELD_UPL_STAGING, loaded from every logical record.
    Insert option in effect for this table: TRUNCATE
       Column Name                  Position   Len  Term Encl Datatype
    CONTROLNO                           FIRST     *   ,  O(") CHARACTER           
    PATIENTID                            NEXT     *   ,  O(") CHARACTER           
    COLLECTDATE                          NEXT     *   ,  O(") DATE MM/DD/YYYY     
    TESTDATE                             NEXT     *   ,  O(") DATE MM/DD/YYYY     
    ANALYTEDESC                          NEXT     *   ,  O(") CHARACTER           
    RESULTS                              NEXT     *   ,  O(") CHARACTER           
    HILOFLAG                             NEXT     *   ,  O(") CHARACTER           
    LORANGE                              NEXT     *   ,  O(") CHARACTER           
    HIRANGE                              NEXT     *   ,  O(") CHARACTER           
    UNITOFMEAS                           NEXT     *   ,  O(") CHARACTER           
    COMMENTS                             NEXT     *   ,  O(") CHARACTER           
    NOTES                                NEXT     *   ,  O(") CHARACTER           
    CLINICALEVENT                        NEXT     *   ,  O(") CHARACTER           
    OWNERLNAME                           NEXT     *   ,  O(") CHARACTER           
    OWNERFNAME                           NEXT     *   ,  O(") CHARACTER           
    PETNAME                              NEXT     *   ,  O(") CHARACTER           
    ASSECNO                              NEXT     *   ,  O(") CHARACTER           
    SPECIMENID                           NEXT     *   ,  O(") CHARACTER           
    Record 1042: Rejected - Error on table MFIELD_UPL_STAGING, column NOTES.
    Field in data file exceeds maximum length
    Table MFIELD_UPL_STAGING:
      3777 Rows successfully loaded.
      1 Row not loaded due to data errors.
      0 Rows not loaded because all WHEN clauses were failed.
      0 Rows not loaded because all fields were null.

    Try:
    -- Etc ...
      Notes CHAR(1000),
    -- Etc ...
    SQL*Loader limits character fields to 255 bytes by default unless a larger length is specified.
    :p

  • SQL Loader and whitespace issues

    Hello users,
    I've got an issue that I know is breaking based on a comments field, which allows nearly every type of character in it.
    Long story short, I'm trying to upload an Access table, turned to .csv file into Oracle via SQL Loader.
    I'm able to do it if I delete or set the COMMENTS column to null before doing so.
    If I try the option with the contents of this field populated, it blows up. I've got it terminated by a tilde in the control file, which the COMMENTS field does not contain at all, even across some 5,000+ records.
    However, people upon entering comments have put many carriage returns in there. I noticed once I export the data to text or csv, then open in Excel, it shows the column breaking and spilling over into fields it shouldn't, thereby causing the big problem.
    I've been searching but haven't found a way I can wrap these whitespaces or have SQL Loader ignore them.
    I can't use TERMINATED BY WHITESPACE in the SQL Loader control file because that will cause the field to break without all the comments and I can't use optionally enclosed by " because these people have also used " in the COMMENTS field.
    Do I have any option here, other than to brute force make an update query of some kind to manually populate the data? I can do that if I have to but i'd sure rather have all the data dump in via SQL Loader if I can.
    Thanks!

    Actually, it wasn't related to the size of the column. I made it like 3000, to hold 3000 characters, which was plenty.
    But I used PRESERVE BLANKS and then it worked fine!
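    For context, PRESERVE BLANKS goes in the control file ahead of the INTO TABLE clause; a minimal sketch along the lines described above (table, file and column names are made up):

    LOAD DATA
    INFILE 'comments_export.csv'
    APPEND
    PRESERVE BLANKS
    INTO TABLE access_import
    FIELDS TERMINATED BY '~'
    TRAILING NULLCOLS
    (id,
     comments CHAR(3000))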

  • SQL Loader and INSERT Trigger

    I have problem and your help to solve it would be very much appreciated.
    I am uploading a text file with SQL Loader into a table. Since I used APPEND option in the Loader, I don't want records to be duplicated. So, I wrote a "BEFORE INSERT .. FOR EACH ROW" trigger to check whether that row already exists or not.
    For example, let us consider a table TEST as follows.
    Fld1     NUMBER(2);
    Fld2     VARCHAR2(10);
    Fld3     VARCHAR2(10);
    I have a trigger on this table.
    CREATE OR REPLACE TRIGGER Trg_Bef_Insert_Test
    BEFORE INSERT ON Test FOR EACH ROW
    DECLARE
      vCount       NUMBER(2);
      DuplicateRow EXCEPTION;
    BEGIN
      SELECT COUNT(*) INTO vCount FROM Test
       WHERE fld1 || fld2 || fld3 = :new.fld1 || :new.fld2 || :new.fld3;
      IF vCount > 0 THEN
        RAISE DuplicateRow;
      END IF;
    EXCEPTION
      WHEN DuplicateRow THEN
        Raise_Application_Error(-20001, 'Record already exists');
      WHEN OTHERS THEN
        DBMS_OUTPUT.PUT_LINE('ERROR : ' || SQLCODE || '; ' || SUBSTR(SQLERRM, 1, 150));
    END;
    Please refer to the following SQL statements which I executed in the SQL Plus.
    SQL> insert into test values (1,'one','first');
    1 row created.
    SQL> insert into test values (1,'one','first');
    insert into test values (1,'one','first')
    ERROR at line 1:
    ORA-20001: Record already exists
    ORA-06512: at "CAMELLIA.TRG_TEST", line 13
    ORA-04088: error during execution of trigger 'CAMELLIA.TRG_TEST'
    Would anyone tell me why errors ORA-06512 and ORA-04088 occur?
    Also, if you have any other suggestion to handle this situation, please let me know.
    By the way, I am using Oracle 8.1.7.
    Thank you.

    There are a few things wrong here, but you should really use a unique constraint for this.
    SQL> create table t (a number, b number, c number,
      2      constraint uk unique (a, b, c));
    Table created.
    Here's an example data file with 12 records, three of which are duplicates:
    1,2,3
    3,4,5
    6,7,8
    3,2,1
    5,5,5
    3,4,5
    3,2,1
    1,1,1
    2,2,2
    6,7,8
    8,8,8
    9,9,9
    And a control file:
    load data
    infile 'in.dat'
    append
    into table t
    fields terminated by ',' optionally enclosed by '"'
    (a, b, c)
    Running it with SQL Loader inserts the nine records, outputs the three duplicates to a .bad file and logs all the errors in the .log file. No need for triggers or any code.
    $ sqlldr control=in.ctl
    Username:xxx
    Password:
    SQL*Loader: Release 9.2.0.1.0 - Production on Mon Apr 21 23:16:44 2003
    Copyright (c) 1982, 2002, Oracle Corporation.  All rights reserved.
    Commit point reached - logical record count 12
    $ cat in.bad
    3,4,5
    3,2,1
    6,7,8
    SQL> select * from t;
             A          B          C
             1          2          3
             3          4          5
             6          7          8
             3          2          1
             5          5          5
             1          1          1
             2          2          2
             8          8          8
             9          9          9
    9 rows selected.

  • SQL Loader Date Problems

    Hi Guys,
    We have a SQL Loader script that used to load data in the 'mm/dd/yyyy' format. After migrating the script onto a separate server, the value is being stored in 'dd/mm/yyyy' format in a VARCHAR datatype column for DOB.
    Please find below the Loader script :
    "insert into tsa_lists values ('" & ttsa_list_typ & "','" & ttsa_list_num & "',"
                                        if ttsa_list_typ <> "AuthRep" then
                                        inscmd.CommandText= "insert into tsa_lists values ('" & ttsa_list_typ & "','" & ttsa_list_num & "',?,?,?,?,?,?,?,?,?,?,sysdate,?,?)"
                                        inscmd.Parameters.Append inscmd.CreateParameter("ttsa_sid",200,1,30,"")
                                        inscmd.Parameters.Append inscmd.CreateParameter("ttsa_lastname",200,1,500,"")
                                        inscmd.Parameters.Append inscmd.CreateParameter("ttsa_firstname",200,1,500,"")
                                        inscmd.Parameters.Append inscmd.CreateParameter("ttsa_firstletter",200,1,1,"")
                                        inscmd.Parameters.Append inscmd.CreateParameter("ttsa_middlename",200,1,500,"")
                                        inscmd.Parameters.Append inscmd.CreateParameter("ttsa_dob",200,1,100,"")
                                        inscmd.Parameters.Append inscmd.CreateParameter("ttsa_pob",200,1,200,"")
                                        inscmd.Parameters.Append inscmd.CreateParameter("ttsa_citizenship",200,1,200,"")
                                        inscmd.Parameters.Append inscmd.CreateParameter("ttsa_cleared",200,1,3,"")
                                        inscmd.Parameters.Append inscmd.CreateParameter("ttsa_misc",200,1,2000,"")
                                        inscmd.Parameters.Append inscmd.CreateParameter("ttsa_firstname_orig",200,1,500,"")
                                        inscmd.Parameters.Append inscmd.CreateParameter("ttsa_lastname_orig",200,1,500,"")
    Data coming in the below format:
    SID     CLEARED     LASTNAME     FIRSTNAME     MIDDLENAME     TYPE     DOB     POB     CITIZENSHIP     PASSPORT/IDNUMBER     MISC     
    3799509          A     ABD AL SALAM MARUF ABDALLAH               01-Jul-55                         
    3799512          A     ABD AL SALAM MARUF ABDALLAH               01-Jan-80                         
    3727959          A     KHALID KHALIL IBRAHIM MAHDI               11-Nov-50                         
    3458238          A     KHALID KHALIL IBRAHIM MAHDI               08-Jan-81                         
    3458242          A     KHALID KHALIL IBRAHIM MAHDI               31-Jul-81                         
    3458231          A     KHALID KHALIL IBRAHIM MAHDI               01-Aug-81                         
    2407275          A     MUSA BARARUDDIN Y DAGAM               19-Aug-62                         
    Please can you guys suggest a way.
    Cheers,
    Shazin

    1) That does not look like anything recognized by the SQL*Loader utility.
    2) If you really just mean a SQL statement to load the data, then:
    a) If the table column is a DATE type, use the TO_DATE() function.
    b) If the column is VARCHAR2: you should not store dates in a VARCHAR2 column.
    In any case, check the NLS_DATE_FORMAT parameter on the database and/or the same parameter on the client side.
    :p
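    To illustrate point (a), a hedged example (the table and DATE column names here are hypothetical) of converting the incoming 'DD-Mon-YY' strings with an explicit mask instead of relying on NLS_DATE_FORMAT:

    -- Store the value with an explicit format mask, independent of server or client NLS settings
    INSERT INTO tsa_lists_dates (ttsa_sid, ttsa_dob_dt)
    VALUES ('3799509', TO_DATE('01-Jul-55', 'DD-Mon-RR'));

    -- And format it explicitly on the way back out:
    SELECT TO_CHAR(ttsa_dob_dt, 'MM/DD/YYYY') FROM tsa_lists_dates;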

  • SQL Loader Direct Path issue

    Hi,
    I am trying to load data using SQL Loader direct path. One of the tables has a REC_UPD_DT column of TIMESTAMP datatype. The table was created with a default value of SYSTIMESTAMP on this column. However, when I load the table, REC_UPD_DT is null because of the behavior of the direct path method (it bypasses SQL functions, defaults, and so on).
    What is the way out to achieve my objective?
    I do not want to use the conventional path since that is proven to be slow for my case.
    Thanks in advance
    Pavin

    Hi,
    My apologies for not providing requested information before. Here are the scripts:
    I am testing from my windows m/c.
    Database version: 10.2.0.3
    Table creation script:
    CREATE TABLE TXN_DEFAULT
    (MAPPER_Q_ID NUMBER(1,0) NOT NULL,
    TRANSACTION_SEQUENCE NUMBER(10,0) NOT NULL,
    RECORD_COUNT NUMBER(4,0),
    STATUS VARCHAR2(1 CHAR),
    USER_ID VARCHAR2(20 CHAR),
    USER_IP_ADDRESS VARCHAR2(35 CHAR),
    REC_UPD_DT TIMESTAMP (6) DEFAULT SYSTIMESTAMP
    );
    Loader Control file script:
    OPTIONS (DIRECT=TRUE)
    load data
    CHARACTERSET WE8ROMAN8
    into table TXN_DEFAULT
    append
    FIELDS TERMINATED BY '     '
    trailing nullcols
    MAPPER_Q_ID,
    TRANSACTION_SEQUENCE,
    RECORD_COUNT,
    STATUS,
    USER_ID,
    USER_IP_ADDRESS,
    REC_UPD_DT     
    Sample Dat file contents:
    0     251939112     3     N     chowdsr     10.222.79.20     
    1     251939113     140     N     zhuanre     146.222.52.176     
    2     251939114     41     N     sombupo     146.222.187.155     
    0     251939116     75     N     BaseAdmin     146.222.4.246     
    Log File contents:
    SQL*Loader: Release 10.2.0.1.0 - Production on Mon Dec 7 13:22:23 2009
    Copyright (c) 1982, 2005, Oracle. All rights reserved.
    Control File: Control.ctl
    Character Set WE8ROMAN8 specified for all input.
    Data File: Control.dat
    Bad File: Control.bad
    Discard File: none specified
    (Allow all discards)
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 50
    Continuation: none specified
    Path used: Direct
    Table TXN_DEFAULT, loaded from every logical record.
    Insert option in effect for this table: APPEND
    TRAILING NULLCOLS option in effect
    Column Name                  Position   Len  Term Encl Datatype
    MAPPER_Q_ID                  FIRST      *    WHT       CHARACTER
    TRANSACTION_SEQUENCE         NEXT       *    WHT       CHARACTER
    RECORD_COUNT                 NEXT       *    WHT       CHARACTER
    STATUS                       NEXT       *    WHT       CHARACTER
    USER_ID                      NEXT       *    WHT       CHARACTER
    USER_IP_ADDRESS              NEXT       *    WHT       CHARACTER
    REC_UPD_DT                   NEXT       *    WHT       CHARACTER
    Table TXN_DEFAULT:
    4 Rows successfully loaded.
    0 Rows not loaded due to data errors.
    0 Rows not loaded because all WHEN clauses were failed.
    0 Rows not loaded because all fields were null.
    Bind array size not used in direct path.
    Column array rows : 5000
    Stream buffer bytes: 256000
    Read buffer bytes: 1048576
    Total logical records skipped: 0
    Total logical records read: 4
    Total logical records rejected: 0
    Total logical records discarded: 0
    Total stream buffers loaded by SQL*Loader main thread: 1
    Total stream buffers loaded by SQL*Loader load thread: 0
    Run began on Mon Dec 07 13:22:23 2009
    Run ended on Mon Dec 07 13:22:23 2009
    Elapsed time was: 00:00:00.75
    CPU time was: 00:00:00.08
    If I run using the conventional path, REC_UPD_DT is populated with a SYSTIMESTAMP value. However, the column values are null when I use the direct path. The log files show no errors. What is the way out to achieve my objective?
    Thanks in advance.
    - Pavin
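    One workaround that is often suggested (not from this thread, and worth testing on 10.2 before relying on it) is to stop depending on the column DEFAULT and have SQL*Loader generate the value itself, since generated fields such as SYSDATE are filled in by the loader rather than by the bypassed SQL default. A sketch, with the tab delimiter written explicitly as X'09':

    OPTIONS (DIRECT=TRUE)
    LOAD DATA
    CHARACTERSET WE8ROMAN8
    INFILE 'Control.dat'
    APPEND
    INTO TABLE TXN_DEFAULT
    FIELDS TERMINATED BY X'09'
    TRAILING NULLCOLS
    (MAPPER_Q_ID,
     TRANSACTION_SEQUENCE,
     RECORD_COUNT,
     STATUS,
     USER_ID,
     USER_IP_ADDRESS,
     REC_UPD_DT SYSDATE  -- generated by SQL*Loader; note SYSDATE gives DATE precision, not fractional seconds
    )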

  • Using sql Loader to insert from table?

    Hi ,,
    Can I use the SQL Loader tool to insert data from table to table? I want it to replace the following SQL script:
    DELETE FROM TABLE1 WHERE PK IN (SELECT PK FROM TABLE2);
    INSERT INTO TABLE1 SELECT * FROM TABLE2;
    COMMIT;
    The purpose of using such a method is to get the benefit of SQL Loader features (a detailed log file, inserting only the correct records, etc.).
    Database: 9i
    O/S: Windows 2000 Server
    Edited by: eng. Habeeli on Jan 30, 2011 9:05 AM

    Thanks all,
    I know that I cannot, but I just wanted to make sure from your experience.
    BTW, the following method:
    DELETE FROM TABLE1 WHERE PK IN (SELECT PK FROM TABLE2);
    INSERT INTO TABLE1 SELECT * FROM TABLE2;
    COMMIT;
    lacks the ability to load some rows while simply ignoring the rows with violations, and it also lacks the feature of recording a log entry for each violating record.
    This method stops as soon as an error is raised and rolls back all inserted records.
    So any suggestions for getting such a result?
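    One 9i-compatible option (a technique not mentioned in this thread; a rough sketch that assumes TABLE1 and TABLE2 have identical column lists, and 9iR2 for the record-based VALUES syntax) is bulk binding with SAVE EXCEPTIONS, which keeps going past rows that violate constraints and lets you log each failure, roughly like SQL*Loader's bad/log files:

    DECLARE
      TYPE t_rows IS TABLE OF table2%ROWTYPE;
      l_rows      t_rows;
      bulk_errors EXCEPTION;
      PRAGMA EXCEPTION_INIT(bulk_errors, -24381);
    BEGIN
      SELECT * BULK COLLECT INTO l_rows FROM table2;

      -- Insert everything; offending rows are skipped and collected instead of aborting the load
      FORALL i IN 1 .. l_rows.COUNT SAVE EXCEPTIONS
        INSERT INTO table1 VALUES l_rows(i);
      COMMIT;
    EXCEPTION
      WHEN bulk_errors THEN
        FOR j IN 1 .. SQL%BULK_EXCEPTIONS.COUNT LOOP
          DBMS_OUTPUT.PUT_LINE('Row ' || SQL%BULK_EXCEPTIONS(j).ERROR_INDEX || ' failed: ' ||
                               SQLERRM(-SQL%BULK_EXCEPTIONS(j).ERROR_CODE));
        END LOOP;
        COMMIT;  -- rows that loaded successfully stay inserted
    END;
    /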

Maybe you are looking for

  • Proxy messages not processed in PI 7.1 pipeline coming from local client

    Dear experts, We have recently installed a PI7.1 system with two clients: 200 - Integration Server 250 - Local Integration Engine Now, when we send proxy messages from the 250 client to the 200 client, the messages are not processed in the integratio

  • I want to upgrade my I mac 10.5.8 to yosmite

    WWhat the best way to do this

  • Getting Dump while using PYXX_READ_PAYROLL_RESULT FM

    I m trying to get the gross salary and ESI contribution amount for an indian based employee. While using the FM 'PYXX_READ_PAYROLL_RESULT' i m getting the following dump. *******************************************************************************

  • File sharing problems

    My husband's computer is a G5 using 10.4.9. It is connected to our modem/router by ethernet. My computer is a macbook pro using Snow Leopard. His computer showed up in my sharing menu (yay!) but then disappeared. I went onto his computer and noticed

  • Large File Support

    Hi all, I'm new to the fourms but not new to the products. Recently I've been working on a project that requires very large XML files, like over 200,000 lines of code. I have noticed that when I work on these large files the software is really buggy.