SQL*Loader and timestamp values

I'm loading timestamp values (among other data) from a text file into Oracle using SQL*Loader.
The data I load is formatted like the following:
2008/11/13 23:55:21.366
2008/11/13 23:55:22.782
2008/11/13 23:55:25.879
Hence my control file looks like this:
TSTAMP TIMESTAMP "YYYY/MM/DD HH24:MI:SS.FF",
The timestamp data in the input file are in UTC; however, I load them into a column of type "TIMESTAMP(3)", i.e. in server time. This is on purpose: I do not want the data to be in UTC when I look at them.
Therefore, after the load I have to manually run
UPDATE mytable
SET tstamp = tstamp + (tstamp - sys_extract_utc(tstamp));
This update-after-loading actually takes quite some time due to the number of records. I would like to avoid it and instead do it as part of the load. Is this possible somehow?
I'm on Oracle 10.2.
Thanks.
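A possible direction, untested: SQL*Loader can apply a SQL expression to a field, so the UTC-to-local shift could happen during the load instead of in a separate UPDATE. A sketch, reading the field as CHAR and doing the conversion in the expression (AT LOCAL converts to the session time zone; adjust if "server time" means something else in your setup):
TSTAMP CHAR "CAST(FROM_TZ(TO_TIMESTAMP(:TSTAMP, 'YYYY/MM/DD HH24:MI:SS.FF3'), 'UTC') AT LOCAL AS TIMESTAMP)",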

Hi
What about setting a special Time Zone in your database?
You have two options when setting which time zone the database belongs to. You can either qualify it as a displacement from GMT/UTC in the format 'hh:mm', or you can specify it as a region name that has an entry in the V$TIMEZONE_NAMES view.
select tzname,tzabbrev from V$TIMEZONE_NAMES;
select DBTIMEZONE from dual;
ALTER DATABASE SET TIME_ZONE = 'Europe/Copenhagen';
select SESSIONTIMEZONE from dual;
select CURRENT_TIMESTAMP from dual;
See this link:
http://download.oracle.com/docs/cd/B19306_01/server.102/b14231/create.htm#sthref319
Edited by: Hub on Dec 7, 2008 3:58 PM

Similar Messages

  • SQL Loader and Insert Into Performance Difference

    Hello All,
    I'm in a situation where I need to measure the performance difference between SQL Loader and INSERT INTO. Say there are 10,000 records in a flat file and I want to load them into a staging table.
    I know that if I use PL/SQL UTL_FILE to do this job, performance will degrade (don't ask me why I'm going for UTL_FILE instead of SQL Loader), but I don't know by how much. Can anybody tell me the performance difference in % (e.g. a 20% decrease) for 10,000 records?
    Thanks,
    Kannan.

    Kannan B wrote:
    Do not confuse the topic; as I said, I'm not going to use External tables. This post is to discuss the performance difference between SQL Loader and a simple Insert statement.
    I don't think people are confusing the topic.
    External tables are a superior means of reading a file, as they don't require any command-line calls or external control files to be set up. All that is needed is a single external table definition, created in a similar way to any other table (just with the additional external table information). They also eliminate the need for a 'staging' table in the database, since the data can be queried as needed directly from the file; and if the file changes, so does the data seen through the external table, without re-running any SQL*Loader process.
    Who told you not to use External Tables? Do they know what they are talking about? Can they give a valid reason why external tables are not to be used?
    IMO, if you're considering SQL*Loader, you should be considering External tables as a better alternative.
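    For illustration, a minimal external table definition might look like the following (directory path, table and column names are invented for the example):
    CREATE OR REPLACE DIRECTORY data_dir AS '/data/incoming';
    CREATE TABLE staging_ext (
      id   NUMBER,
      name VARCHAR2(100)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
        MISSING FIELD VALUES ARE NULL
      )
      LOCATION ('staging.csv')
    )
    REJECT LIMIT UNLIMITED;
    -- the file can then be queried or copied directly:
    INSERT INTO staging_table SELECT * FROM staging_ext;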

  • SQL *Loader and External Table

    Hi,
    Can anyone tell me the difference between SQL*Loader and external tables?
    Under what conditions should we use SQL*Loader versus an external table?
    Thanks

    External tables are accessible from SQL, which generally simplifies life if the data files are physically located on the database server since you don't have to coordinate a call to an external SQL*Loader script with other PL/SQL processing. Under the covers, external tables are normally just invoking SQL*Loader.
    SQL*Loader is more appropriate if the data files are on a different server or if it is easier to call an executable rather than calling PL/SQL (i.e. if you have a batch file that runs on a server other than the database server that wants to FTP a data file from a FTP server and then load the data into Oracle).
    Justin
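    As a rough illustration of that second case (all names invented), the remote host only needs the Oracle client installed and can run something like:
    sqlldr userid=scott/tiger@remote_db control=load_emp.ctl data=emp.dat log=emp.log bad=emp.bad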

  • Help in calling sql loader and an oracle procedure in a script

    Hi Gurus,
    please help me in writing a Unix script which will call SQL Loader and also an Oracle procedure.
    I wrote a script, which is as follows:
    #!/bin/sh
    clear
    #export ORACLE_SID='HOBS2'
    sqlldr USERID=load/ps94mfo16 CONTROL=test_nica.ctl LOG=test_nica.log
    retcode=`echo $?`
    case "$retcode" in
    0) echo "SQL*Loader execution successful" ;;
    1) echo "SQL*Loader execution exited with EX_FAIL, see logfile" ;;
    2) echo "SQL*Loader execution exited with EX_WARN, see logfile" ;;
    3) echo "SQL*Loader execution encountered a fatal error" ;;
    *) echo "unknown return code";;
    esac
    sqlplus USERID=load/ps94mfo16 << EOF
    EXEC DO_TEST_SHELL_SCRIPT
    It is loading the data into an Oracle table,
    but the procedure is not executed.
    Any valuable suggestion is highly appreciated.
    Cheers

    multiple duplicate threads:
    to call an oracle procedure and sql loader in an unix script
    Re: Can some one help he sql loader issue.
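    For what it's worth, a frequent cause of "the load works but the procedure never runs" is a here-document that is never closed, so sqlplus never receives the EXEC. A sketch of the script with the heredoc terminated and an explicit EXIT (same names as in the post; only the ending is new):
    #!/bin/sh
    sqlldr USERID=load/ps94mfo16 CONTROL=test_nica.ctl LOG=test_nica.log
    retcode=$?
    case "$retcode" in
      0) echo "SQL*Loader execution successful" ;;
      1) echo "SQL*Loader execution exited with EX_FAIL, see logfile" ;;
      2) echo "SQL*Loader execution exited with EX_WARN, see logfile" ;;
      3) echo "SQL*Loader execution encountered a fatal error" ;;
      *) echo "unknown return code" ;;
    esac
    sqlplus -s load/ps94mfo16 << EOF
    EXEC DO_TEST_SHELL_SCRIPT
    EXIT
    EOF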

  • HELP: SQL*LOADER AND Ref Column

    Hello,
    I have already posted about this and I really need help; I'm not getting any further with it.
    I have the following problem. I have 2 tables which I created the following way:
    CREATE TYPE gemark_schluessel_t AS OBJECT(
    gemark_id NUMBER(8),
    gemark_schl NUMBER(4),
    gemark_name VARCHAR2(45)
    );
    CREATE TABLE gemark_schluessel_tab OF gemark_schluessel_t(
    constraint pk_gemark PRIMARY KEY(gemark_id)
    );
    CREATE TYPE flurstueck_t AS OBJECT(
    flst_id NUMBER(8),
    flst_nr_zaehler NUMBER(4),
    flst_nr_nenner NUMBER(4),
    zusatz VARCHAR2(2),
    flur_nr NUMBER(2),
    gemark_schluessel REF gemark_schluessel_t,
    flaeche SDO_GEOMETRY
    );
    CREATE TABLE flurstuecke_tab OF flurstueck_t(
    constraint pk_flst PRIMARY KEY(flst_id),
    constraint uq_flst UNIQUE(flst_nr_zaehler,flst_nr_nenner,zusatz,flur_nr),
    flst_nr_zaehler NOT NULL,
    flur_nr NOT NULL,
    gemark_schluessel REFERENCES gemark_schluessel_tab
    );
    Now I have data in the gemark_schluessel_tab which looks like this (a sample):
    1 101 Borna
    2 102 Draisdorf
    Now I want to load data into my flurstuecke_tab with SQL*Loader, and there I have problems with my REF column gemark_schluessel.
    One data record looks like this in my file (it is without geometry):
    1|97|7||1|1|
    If I try to load this record, it does not work. The reference (the system-generated OID) should be taken from gemark_schluessel_tab.
    LOAD DATA
    INFILE *
    TRUNCATE
    CONTINUEIF NEXT(1:1) = '#'
    INTO TABLE FLURSTUECKE_TAB
    FIELDS TERMINATED BY '|'
    TRAILING NULLCOLS (
    flst_id,
    flst_nr_zaehler,
    flst_nr_nenner,
    zusatz,
    flur_nr,
    gemark_schluessel REF(CONSTANT 'GEMARK_SCHLUESSEL_TAB',GEMARK_ID),
    gemark_id FILLER)
    BEGINDATA
    1|97|7||1|1|
    Is there an error I made?
    Thanks in advance
    Tig


  • Using SQL*Loader and UTL_FILE to load and unload large files(i.e PDF,DOCs)

    Problem: load PDF or similar files (stored on the operating system) into an Oracle table using SQL*Loader,
    and then unload the files back from the Oracle tables to their previous format.
    I've used the SQL*Loader "sqlldr" command as:
    sqlldr scott/[email protected] control=c:\sqlldr\control.ctl log=c:\any.txt
    Control file is written as :
    LOAD DATA
    INFILE 'c:\sqlldr\r_sqlldr.txt'
    REPLACE
    INTO table r_sqlldr
    Fields terminated by ','
    (id sequence (max,1),
    fname char(20),
    data LOBFILE(fname) terminated by EOF )
    It loads the files (PDF, image and more...) that are mentioned in the file r_sqlldr.txt into the Oracle table r_sqlldr.
    The text file (used as source) is written as:
    c:\kalam.pdf,
    c:\CTSlogo1.bmp
    c:\any1.txt
    After this load I used UTL_FILE to unload the data, writing a procedure like this:
    CREATE OR REPLACE PROCEDURE R_UTL AS
    l_file UTL_FILE.FILE_TYPE;
    l_buffer RAW(32767);
    l_amount BINARY_INTEGER ;
    l_pos INTEGER := 1;
    l_blob BLOB;
    l_blob_len INTEGER;
    BEGIN
    SELECT data
    INTO l_blob
    FROM r_sqlldr
    where id= 1;
    l_blob_len := DBMS_LOB.GETLENGTH(l_blob);
    DBMS_OUTPUT.PUT_LINE('blob length : ' || l_blob_len);
    IF (l_blob_len < 32767) THEN
    l_amount :=l_blob_len;
    ELSE
    l_amount := 32767;
    END IF;
    DBMS_LOB.OPEN(l_blob, DBMS_LOB.LOB_READONLY);
    l_file := UTL_FILE.FOPEN('DBDIR1','Kalam_out.pdf','w', 32767);
    DBMS_OUTPUT.PUT_LINE('File opened');
    WHILE l_pos < l_blob_len LOOP
    DBMS_LOB.READ (l_blob, l_amount, l_pos, l_buffer);
    DBMS_OUTPUT.PUT_LINE('Blob read');
    l_pos := l_pos + l_amount;
    UTL_FILE.PUT_RAW(l_file, l_buffer, TRUE);
    DBMS_OUTPUT.PUT_LINE('writing to file');
    UTL_FILE.FFLUSH(l_file);
    UTL_FILE.NEW_LINE(l_file);
    END LOOP;
    UTL_FILE.FFLUSH(l_file);
    UTL_FILE.FCLOSE(l_file);
    DBMS_OUTPUT.PUT_LINE('File closed');
    DBMS_LOB.CLOSE(l_blob);
    EXCEPTION
    WHEN OTHERS THEN
    IF UTL_FILE.IS_OPEN(l_file) THEN
    UTL_FILE.FCLOSE(l_file);
    END IF;
    DBMS_OUTPUT.PUT_LINE('Its working at last');
    END R_UTL;
    This unloads data from the r_sqlldr table (BLOBs) to files on the operating system.
    The same procedure, with minor changes, is used to unload other similar files like images and text files.
    In the above example: Loading: the 3 files 1) Kalam.pdf 2) CTSlogo1.bmp 3) any1.txt are loaded into the Oracle table r_sqlldr's 3 rows respectively,
    file names into the fname column and the corresponding data into the data (BLOB) column.
    Unload: these files are then written back in their previous format to the operating system using Oracle's UTL_FILE feature.
    So the PROBLEM IS: the files come back at roughly their original size, but with the quality degraded. The PDF file doesn't even display its data; the size is almost equal to the source file, but the data are lost when I open it.
    And for images: the images are getting loaded and unloaded, but with the colors changed.
    Oracle features (like FFLUSH) have also been used, but it never worked.
    Any suggestions or alternate solutions to load and unload PDFs through Oracle are requested.
    ------------------------------------------------------------------------------------------------------------------------

    Thanks Justin, for the quick response.
    Well, I am loading the data into a BLOB only, using SQL*Loader;
    I've never used dbms_lob.loadfromfile to do the loads.
    I've opened a file on the network and then used dbms_lob.read and
    UTL_FILE.PUT_RAW to read and write the data into the target file.
    Actually, my process is working fine with text files, but not with PDFs and images.
    As for your question, "Is the data the proper length after reading it in?", I'm not sure what you are asking, but I think there is no problem regarding data length, except that the source PDF is 90.4 KB and the target is 90.8 KB.
    That's it.
    So may I ask you for some more help, or should I provide some more details?
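    For what it's worth, the posted procedure opens the output file with FOPEN(..., 'w', ...) and writes a NEW_LINE after every chunk; for binary content that both runs the data through text-mode handling and injects extra bytes, which fits the "slightly larger, unreadable PDF" symptom. A sketch of the write loop with the file opened in binary mode and the newlines dropped (otherwise the same logic as above):
    l_file := UTL_FILE.FOPEN('DBDIR1', 'Kalam_out.pdf', 'wb', 32767);  -- 'wb' = write binary
    WHILE l_pos <= l_blob_len LOOP
      l_amount := LEAST(32767, l_blob_len - l_pos + 1);
      DBMS_LOB.READ(l_blob, l_amount, l_pos, l_buffer);
      UTL_FILE.PUT_RAW(l_file, l_buffer, TRUE);  -- raw bytes only, no NEW_LINE between chunks
      l_pos := l_pos + l_amount;
    END LOOP;
    UTL_FILE.FCLOSE(l_file);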

  • Is there any difference in Oracle 9i SQL Loader and Oracle 10g SQL Loader

    Hi
    Can anyone tell me whether there is any difference between the Oracle 9i SQL Loader and the Oracle 10g SQL Loader?
    I am upgrading the 9i DB to 10g and want to run the 9i SQL Loader control files on the upgraded 10g DB, so please let me know whether there are any differences for which I need to make modifications to the control files.
    Thank you in advance
    Adi

    answered

  • Import and process larger data with SQL*Loader and Java resource

    Hello,
    I have a project to import data from a text file on a schedule: a fairly large amount of data, nearly 20,000 records per hour.
    After that, we have to analyse the data and export the results into another database.
    I have researched SQL*Loader and Java resources to do these tasks, but I have no experience with them.
    I'm afraid that with the huge amount of data, Oracle could slow down, or the session in the Java resource application could time out.
    Please give me some advice about the solution.
    Thank you very much.

    With the '?' mark I mean "How can I link this COL1 with a column in the csv file?"
    Attilio

  • Problem with SQL*Loader and different date formats in the same file

    DB: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bi
    System: AIX 5.3.0.0
    Hello,
    I'm using SQL*Loader to import semi-colon separated values into a table. The files are delivered to us by a data provider who concatenates data from different sources and this results in us having different date formats within the same file. For example:
    ...;2010-12-31;22/11/1932;...
    I load this data using the following lines in the control file:
    EXECUTIONDATE1     TIMESTAMP     NULLIF EXECUTIONDATE1=BLANKS     "TO_DATE(:EXECUTIONDATE1, 'YYYY-MM-DD')",
    DELDOB          TIMESTAMP     NULLIF DELDOB=BLANKS          "TO_DATE(:DELDOB, 'DD/MM/YYYY')",
    The relevant NLS parameters:
    NLS_LANGUAGE=FRENCH
    NLS_DATE_FORMAT=DD/MM/RR
    NLS_DATE_LANGUAGE=FRENCH
    If I load this file as-is, the values loaded into the table are 31 Dec 2010 and 22 Nov *2032*, even though the years are given with 4 digits. If I change the NLS_DATE_FORMAT to DD/MM/YYYY then the second date value is loaded correctly, but the first value is loaded as 31 Dec *2020*!
    How can I get both date values to load correctly?
    Thanks!
    Sylvain

    This is very strange, after running a few tests I realized that if the year is 19XX then it will get loaded as 2019, and if it is 20XX then it will be 2020. I'm guessing it may have something to do with certain env variables that aren't set up properly because I'm fairly sure my SQL*Loader control file is correct... I'll run more tests :-(
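    One thing worth checking before blaming the environment (this is a guess at the mechanism, but the workaround is cheap to test): when a field is declared as TIMESTAMP in the control file, SQL*Loader converts it using the session NLS settings before the SQL string runs, and an RR-style mask clips the century. Declaring the fields as CHAR and letting the TO_DATE expressions do all of the conversion avoids that, e.g.:
    EXECUTIONDATE1     CHAR     NULLIF EXECUTIONDATE1=BLANKS     "TO_DATE(:EXECUTIONDATE1, 'YYYY-MM-DD')",
    DELDOB             CHAR     NULLIF DELDOB=BLANKS             "TO_DATE(:DELDOB, 'DD/MM/YYYY')",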

  • SQL Loader and control file

    I know I've done this before, but I don't use SQL Loader often and I'm having issues getting a file to load.
    The table has 6 columns in it - one of which is a timestamp.
    I was initially having date-format issues loading it. I ruled out any problems with the timestamp format by loading a dummy table with some timestamp-based data, which worked fine.
    So I think the issue is that the first column of the table I'm loading is a value I'm attempting to default with a sequence during the load, and that I'm screwing something up there.
    Table is as such:
    CREATE TABLE ACS_IPS
        (seq_ips NUMBER,
        col2   VARCHAR2(100),
        col3  VARCHAR2(100),
        col4    VARCHAR2(100),
        col5 TIMESTAMP,
        col6    VARCHAR2(100),
        col7    VARCHAR2(100),
        col8 DATE  DEFAULT SYSDATE NOT NULL,
        col9 VARCHAR2(30) DEFAULT USER);
    The control file is:
    load data
    truncate
    into table acs_ips
    fields terminated by ","
    trailing nullcols
        seq_ips "seq_ips.nextval",
        col2,
        col3,
        col4,
        col5 TIMESTAMP "YYYY-MM-DD HH24:MI:SS.FF9",
        col6,
        col7
        )
    The sequence column isn't in the file being loaded, and there are additional columns that are defaulted on the table that aren't in the control file.
    Any help is appreciated... The error I'm getting is:
    Rejected - Error on table CAMS.ACS_IPS, column ACTION_START.
    ORA-01841: (full) year must be between -4713 and +9999, and not be 0

    Yeah, not sure that clears it up...
    The sequence in my table is the first column, and I think that's the problem: the sequence is listed as the first column in the control file, but it's not in the file being loaded, so it's skewing (again, I think?) the data being read in, which is why I'm getting the timestamp error on that one column (it's actually reading the next column in the file rather than the actual timestamp one).
    If that's the issue, I'm not sure how to avoid it without restructuring the table to put the sequence physically at the end. I'm fairly certain that's not necessary and that I'm overlooking something that's otherwise simple but is evading me.
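    If that diagnosis is right, one way to keep the sequence column first without it consuming a field from the data file is to mark it as an expression (the same idea as the batch_id example further down this page), so only col2..col7 are mapped to the file. A sketch:
    load data
    truncate
    into table acs_ips
    fields terminated by ","
    trailing nullcols
    (seq_ips expression "seq_ips.nextval",
     col2,
     col3,
     col4,
     col5 TIMESTAMP "YYYY-MM-DD HH24:MI:SS.FF9",
     col6,
     col7)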

  • SQL*Loader and DECODE function

    Hi All,
    I am loading data from data files into oracle tables and while loading the data using SQL*Loader, the following requirement needs to be fulfilled.
    1) If OQPR < 300, RB = $ 0-299, SC = "SC1"
    2) If 300 < OQPR < 1200, RB = $ 300-1199, SC = "SC2"
    3) If 1200 < OQPR < 3000, RB = $ 1200-2999, SC = "SC3"
    4) If OQPR > 3000 USD, RB = > $3000, SC = "SC4"
    Here OQPR is a field in the data file.
    Can anyone suggest how we can handle this using the DECODE function? Triggers and PL/SQL functions are not to be used.
    TIA.
    Regards,
    Ravi.

    The following expression gives you different values for your different intervals and boundaries :
    SIGN(:OQPR - 300) + SIGN(:OQPR - 1200) + SIGN(:OQPR - 3000)
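    Building on that, the bucketing can be folded into the control file by wrapping the SIGN sum in a DECODE (sketch only: OQPR must also be defined as a field in the control file, and the exact boundary values 300/1200/3000 produce sums of -2, 0 or 2, so you would need to decide which bucket they belong to):
    SC "DECODE(SIGN(:OQPR-300)+SIGN(:OQPR-1200)+SIGN(:OQPR-3000), -3,'SC1', -1,'SC2', 1,'SC3', 3,'SC4')",
    RB "DECODE(SIGN(:OQPR-300)+SIGN(:OQPR-1200)+SIGN(:OQPR-3000), -3,'$ 0-299', -1,'$ 300-1199', 1,'$ 1200-2999', 3,'> $3000')",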

  • SQL*Loader and binary data

    i have a C routine that builds SQL*Loader input files. the input files contain multiple records, with a couple of integer columns and a raw(1400) field. the control file specifies a record separator of '|', which seems weird (having text in the middle of raw data) but also is somewhat co-operative (see below).
    so i basically write each integer to the file, then a short (2-byte) length value for the raw field, then the raw field. then the '|' separator.
    i've noticed that if the size of raw field is 400 bytes or less, everything works fine, i get the correct number of records in the database.
    unfortunately, with a size of 401 or more, SQL*Loader parses the thing into twice as many records as it should. so if i've written 3 records to my input data file, with each record's raw field at 400 bytes or less i get 3 records loaded. but any with a raw field of 401+, i get two records for each.
    any ideas why? and how to correct this? also, any ideas on a better way to do this? all the examples of large data in the online doc and the o'reilly book favor showing examples with large character data, which doesn't do me much good.
    tia. john
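    Since the file already carries a 2-byte length in front of the raw field, one alternative worth trying (a sketch only: table and column names are invented, and it assumes the integers are written as 4-byte binary values) is to declare the raw field as VARRAW so SQL*Loader reads the length prefix itself instead of scanning for a terminator, which avoids mis-parsing when a '|' byte happens to occur inside the raw data:
    (col1    integer(4),
     col2    integer(4),
     raw_col varraw(1400))
    How the physical records are delimited (keeping or dropping the '|') would still have to be matched to how the C routine writes the file.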


  • SQL Loader and Batch ID

    Hi All,
    In our application, we allow users to upload data using an Excel sheet in the UI.
    We are using a PHP script in the UI and SQL Loader to load the data from the Excel sheet into temp_table.
    The temp_table has a primary key.
    My question is: is there any way to assign a batch id to every upload in that table automatically,
    so that we can easily extract the data by batch id?
    We are using Oracle 11g.

    All that does is load a constant value, in which case you might as well just use constant 815 in your control file. If you want to automatically increment the value for each batch, then you need to use a different method.
    Please see the example below. Prior to each data load, it loads the next value of the sequence into a separate table, then selects that value during the data load. Note that a SQL*Loader expression that uses select must be enclosed within parentheses within the double quotes.
    SCOTT@orcl_11gR2> host type test1.dat
    1 Prod1
    2 Prod2
    3 Prod3
    4 Prod4
    5 Prod5
    SCOTT@orcl_11gR2> host type test2.dat
    6 Prod6
    7 Prod7
    8 Prod8
    SCOTT@orcl_11gR2> host type batch.ctl
    options(load=1)
    load data
    replace
    into table batch_tab
    (batch_id expression "test_seq.nextval")
    SCOTT@orcl_11gR2> host type data.ctl
    load data
    append
    into table temp_table
    fields terminated by whitespace
    trailing nullcols
    (p_id,
    p_name,
    batch_id expression "(select batch_id from batch_tab)")
    SCOTT@orcl_11gR2> create table temp_table
      2    (p_id      number primary key,
      3     p_name    varchar2(6),
      4     batch_id  number)
      5  /
    Table created.
    SCOTT@orcl_11gR2> create sequence test_seq
      2  /
    Sequence created.
    SCOTT@orcl_11gR2> create table batch_tab
      2    (batch_id  number)
      3  /
    Table created.
    SCOTT@orcl_11gR2> -- first load:
    SCOTT@orcl_11gR2> host sqlldr scott/tiger control=batch.ctl log=batch1.log
    SQL*Loader: Release 11.2.0.1.0 - Production on Fri Apr 19 17:16:33 2013
    Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
    Commit point reached - logical record count 1
    SCOTT@orcl_11gR2> host sqlldr scott/tiger control=data.ctl data=test1.dat log=test1.log
    SQL*Loader: Release 11.2.0.1.0 - Production on Fri Apr 19 17:16:33 2013
    Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
    Commit point reached - logical record count 5
    SCOTT@orcl_11gR2> select * from batch_tab
      2  /
      BATCH_ID
             1
    1 row selected.
    SCOTT@orcl_11gR2> select * from temp_table
      2  /
          P_ID P_NAME   BATCH_ID
             1 Prod1           1
             2 Prod2           1
             3 Prod3           1
             4 Prod4           1
             5 Prod5           1
    5 rows selected.
    SCOTT@orcl_11gR2> -- second load:
    SCOTT@orcl_11gR2> host sqlldr scott/tiger control=batch.ctl log=batch2.log
    SQL*Loader: Release 11.2.0.1.0 - Production on Fri Apr 19 17:16:33 2013
    Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
    Commit point reached - logical record count 1
    SCOTT@orcl_11gR2> host sqlldr scott/tiger control=data.ctl data=test2.dat log=test2.log
    SQL*Loader: Release 11.2.0.1.0 - Production on Fri Apr 19 17:16:33 2013
    Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
    Commit point reached - logical record count 3
    SCOTT@orcl_11gR2> select * from batch_tab
      2  /
      BATCH_ID
             2
    1 row selected.
    SCOTT@orcl_11gR2> select * from temp_table
      2  /
          P_ID P_NAME   BATCH_ID
             1 Prod1           1
             2 Prod2           1
             3 Prod3           1
             4 Prod4           1
             5 Prod5           1
             6 Prod6           2
             7 Prod7           2
             8 Prod8           2
    8 rows selected.

  • SQL*Loader and multiple files

    Hello, I am tasked with loading tables with 21+ million rows. Will SQL*Loader perform better with one large file or with many smaller files? Is there an ideal file size for optimal performance? Thank you
    David

    Don, when I tried to load a 21M-row table using direct path, I got the following messages:
    Record 2373: Rejected - Error on table STAGE_CUSTOMER.
    ORA-03113: end-of-file on communication channel
    SQL*Loader-926: OCI error while uldlfca:OCIDirPathColArrayLoadStream for table STAGE_CUSTOMER
    SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.
    SQL*Loader-925: Error while uldlgs: OCIStmtExecute (ptc_hp)
    ORA-03114: not connected to ORACLE
    SQL*Loader-925: Error while uldlgs: OCIStmtFetch (ptc_hp)
    ORA-24338: statement handle not executed
    Here is the SQL*Loader log:
    SQL*Loader: Release 9.2.0.1.0 - Production on Thu Apr 26 15:38:29 2007
    Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
    Control File: stage_customer.ctl
    Character Set UTF8 specified for all input.
    First primary datafile stage_Customer_20070301.csv has a
    utf8 byte order mark in it.
    Data File: stage_Customer_20070301.csv
    Bad File: stage_Customer_20070301.bad
    Discard File: none specified
    (Allow all discards)
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 1000
    Continuation: none specified
    Path used: Direct
    Silent options: FEEDBACK
    Table STAGE_CUSTOMER, loaded from every logical record.
    Insert option in effect for this table: APPEND
    TRAILING NULLCOLS option in effect
    Column Name Position Len Term Encl Datatype
    ROW_ID SEQUENCE (MAX, 1)
    CUSTOMER_ACCT_NUM FIRST * , O(") CHARACTER
    IS_DELETED NEXT * , O(") CHARACTER
    SQL string for column : "CASE WHEN UPPER(:Is_Deleted) IN ('FALSE','0') THEN 0 ELSE 1 END"
    NAME_PREFIX NEXT * , O(") CHARACTER
    FIRST_NAME NEXT * , O(") CHARACTER
    MIDDLE_NAME NEXT * , O(") CHARACTER
    LAST_NAME NEXT * , O(") CHARACTER
    NAME_SUFFIX NEXT * , O(") CHARACTER
    NICK_NAME NEXT * , O(") CHARACTER
    ALT_FIRST_NAME NEXT * , O(") CHARACTER
    ALT_LAST_NAME NEXT * , O(") CHARACTER
    MARKETING_SOURCE_ID NEXT * , O(") CHARACTER
    HOME_PHONE NEXT * , O(") CHARACTER
    WORK_PHONE NEXT * , O(") CHARACTER
    MOBILE_PHONE NEXT * , O(") CHARACTER
    ALTERNATE_PHONE NEXT * , O(") CHARACTER
    EMAIL_ADDR NEXT * , O(") CHARACTER
    ALT_EMAIL_ADDR NEXT * , O(") CHARACTER
    BIRTH_DATE NEXT * , O(") CHARACTER
    SQL string for column : "TRUNC(TO_DATE(:Birth_Date, 'MM/DD/YYYY HH24:MI:SS'))"
    SALES_CHANNEL_ID NEXT * , O(") CHARACTER
    SQL string for column : "decode(:Sales_Channel_id,NULL,NULL,NULL)"
    ASSOCIATE_NUMBER NEXT * , O(") CHARACTER
    ALT_ASSOCIATE_NUMBER NEXT * , O(") CHARACTER
    UPDATE_LOCATION_CD NEXT * , O(") CHARACTER
    UPDATE_DATE NEXT * , O(") CHARACTER
    SQL string for column : "TRUNC(TO_DATE(:Update_Date, 'MM/DD/YYYY HH24:MI:SS'))"
    CUSTOMER_LOGIN_NAME NEXT * , O(") CHARACTER
    DISCOUNT_CD NEXT * , O(") CHARACTER
    DISCOUNT_PERCENT NEXT * , O(") CHARACTER
    BUSINESS_NAME NEXT * , O(") CHARACTER
    POS_TAX_FLAG NEXT * , O(") CHARACTER
    POS_TAX_PROMPT NEXT * , O(") CHARACTER
    SQL string for column : "CASE WHEN UPPER(:POS_TAX_PROMPT) IN ('FALSE','0') THEN 0 ELSE 1 END"
    POS_DEFAULT_TAX_ID NEXT * , O(") CHARACTER
    POS_TAX_ID_EXPIRATION_DATE NEXT * , O(") CHARACTER
    POS_AUTHORIZED_USER_FLAG NEXT * , O(") CHARACTER
    SQL string for column : "CASE WHEN UPPER(:POS_AUTHORIZED_USER_FLAG) IN ('FALSE','0') THEN 0 ELSE 1 END"
    POS_ALLOW_PURCHASE_ORDER_FLAG NEXT * , O(") CHARACTER
    SQL string for column : "CASE WHEN UPPER(:POS_ALLOW_PURCHASE_ORDER_FLAG) IN ('FALSE','0') THEN 0 ELSE 1 END"
    ADDRESS1 NEXT * , O(") CHARACTER
    ADDRESS2 NEXT * , O(") CHARACTER
    ADDRESS3 NEXT * , O(") CHARACTER
    CITY NEXT * , O(") CHARACTER
    STATE_CD NEXT * , O(") CHARACTER
    POSTAL_CD NEXT * , O(") CHARACTER
    COUNTRY_CD NEXT * , O(") CHARACTER
    ALLOW_UPDATE NEXT * , O(") CHARACTER
    SQL string for column : "CASE WHEN UPPER(:ALLOW_UPDATE) IN ('FALSE','0') THEN 0 ELSE 1 END"
    ACTION_CODE NEXT * , O(") CHARACTER
    SQL string for column : "CASE WHEN UPPER(:action_code) IN ('FALSE','0') THEN 0 ELSE 1 END"
    ACCOUNT_TYPE_ID NEXT * , O(") CHARACTER
    LOCALE_CD NEXT * , O(") CHARACTER
    SQL string for column : "CASE WHEN :Locale_CD IS NOT NULL AND :Locale_CD LIKE '__-__' THEN :Locale_CD ELSE 'en-US' END"
    IS_READY_FOR_PROCESSING CONSTANT
    Value is '1'
    IS_BUSINESS CONSTANT
    Value is '0'
    HAD_ERRORS CONSTANT
    Value is '0'
    Record 2373: Rejected - Error on table STAGE_CUSTOMER.
    ORA-03113: end-of-file on communication channel
    SQL*Loader-926: OCI error while uldlfca:OCIDirPathColArrayLoadStream for table STAGE_CUSTOMER
    SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.
    SQL*Loader-925: Error while uldlgs: OCIStmtExecute (ptc_hp)
    ORA-03114: not connected to ORACLE
    SQL*Loader-925: Error while uldlgs: OCIStmtFetch (ptc_hp)
    ORA-24338: statement handle not executed
    Table STAGE_CUSTOMER:
    0 Rows successfully loaded.
    1 Row not loaded due to data errors.
    0 Rows not loaded because all WHEN clauses were failed.
    0 Rows not loaded because all fields were null.
    Bind array size not used in direct path.
    Column array rows : 5000
    Stream buffer bytes: 256000
    Read buffer bytes: 1048576
    Total logical records skipped: 0
    Total logical records read: 3469
    Total logical records rejected: 1
    Total logical records discarded: 0
    Direct path multithreading optimization is disabled
    Run began on Thu Apr 26 15:38:29 2007
    Run ended on Thu Apr 26 15:38:30 2007
    Elapsed time was: 00:00:01.18
    CPU time was: 00:00:00.32

  • SQL Loader and INSERT Trigger

    I have a problem and your help solving it would be very much appreciated.
    I am uploading a text file with SQL Loader into a table. Since I use the APPEND option in the Loader, I don't want records to be duplicated, so I wrote a "BEFORE INSERT .. FOR EACH ROW" trigger to check whether the row already exists.
    For example, let us consider a table TEST as follows.
    Fld1     NUMBER(2);
    Fld2     VARCHAR2(10);
    Fld3     VARCHAR2(10);
    I have a trigger on this table.
    CREATE OR REPLACE TRIGGER Trg_Bef_Insert_Test
    BEFORE INSERT ON Test FOR EACH ROW
    DECLARE
    vCount NUMBER(2);
    DuplicateRow EXCEPTION;
    BEGIN
    SELECT Count(*) INTO vCount FROM Test
         WHERE fld1 || fld2 || fld3 = :new.fld1 || :new.fld2 || :new.fld3;
    IF vCount > 0 THEN
         RAISE DuplicateRow;
    END IF;
    EXCEPTION
    WHEN DuplicateRow THEN
         Raise_Application_Error (-20001,'Record already exists');
    WHEN OTHERS THEN
         DBMS_OUTPUT.PUT_LINE('ERROR : ' || SQLCODE || '; ' || SUBSTR(SQLERRM, 1, 150));
    END;
    Please refer to the following SQL statements which I executed in the SQL Plus.
    SQL> insert into test values (1,'one','first');
    1 row created.
    SQL> insert into test values (1,'one','first');
    insert into test values (1,'one','first')
    ERROR at line 1:
    ORA-20001: Record already exists
    ORA-06512: at "CAMELLIA.TRG_TEST", line 13
    ORA-04088: error during execution of trigger 'CAMELLIA.TRG_TEST'
    Would anyone tell me why errors -6512 and -4088 occur?
    Also, if you have any other suggestion to handle this situation, please let me know.
    By the way, I am using Oracle 8.1.7.
    Thank you.

    There are a few things wrong here, but you should really use a unique constraint for this. (The ORA-06512 and ORA-04088 lines, by the way, are just Oracle reporting where your own ORA-20001 was raised, i.e. inside the trigger, and that it happened during trigger execution; they are not separate problems.)
    SQL> create table t (a number, b number, c number,
      2      constraint uk unique (a, b, c));
    Table created.
    Here's an example data file with 12 records, three of which are duplicates:
    1,2,3
    3,4,5
    6,7,8
    3,2,1
    5,5,5
    3,4,5
    3,2,1
    1,1,1
    2,2,2
    6,7,8
    8,8,8
    9,9,9
    And a control file:
    load data
    infile 'in.dat'
    append
    into table t
    fields terminated by ',' optionally enclosed by '"'
    (a, b, c)
    Running it with SQL*Loader inserts the nine records, outputs the three duplicates to a .bad file and logs all the errors in the .log file. No need for triggers or any code.
    $ sqlldr control=in.ctl
    Username:xxx
    Password:
    SQL*Loader: Release 9.2.0.1.0 - Production on Mon Apr 21 23:16:44 2003
    Copyright (c) 1982, 2002, Oracle Corporation.  All rights reserved.
    Commit point reached - logical record count 12
    $ cat in.bad
    3,4,5
    3,2,1
    6,7,8
    SQL> select * from t;
             A          B          C
             1          2          3
             3          4          5
             6          7          8
             3          2          1
             5          5          5
             1          1          1
             2          2          2
             8          8          8
             9          9          9
    9 rows selected.
