SQL*Loader: Inserting Log File Statistics into a Table

Hello.
I'm contemplating how to approach gathering the statistics from the SQL*Loader log file to insert them into a table. I've approached this from a Korn shell script perspective previously, but now that I'm working in a Windows environment and my peers aren't keen on batch files and scripting, I thought I'd attempt to use SQL*Loader itself to read the log file and insert one or more records into a table that tracks data uploads. Has anyone created a control file that accomplishes this?
My current environment:
Windows 2003 Server
SQL*Loader: Release 10.2.0.1.0
Thanks,
Luke

Hello.
I've learned a little about inserting into multiple tables with delimited records. Here is my current tested control file:
LOAD DATA
APPEND
INTO TABLE upload_log
WHEN (1:12) = 'SQL*Loader: '
FIELDS TERMINATED BY WHITESPACE
TRAILING NULLCOLS
(  upload_log_id   RECNUM
, filler_field_0  FILLER
, filler_field_1  FILLER
, filler_field_2  FILLER
, filler_field_3  FILLER
, filler_field_4  FILLER
, filler_field_5  FILLER
, day_of_week
, month
, day_of_month
, time_of_day
, year
, log_started_on          "TO_DATE((:month ||' '|| :day_of_month ||' '|| :time_of_day ||' '|| :year), 'Mon DD HH24:MI:SS YYYY')"
)
INTO TABLE upload_log
WHEN (1:11) = 'Data File: '
FIELDS TERMINATED BY ':'
(  upload_log_id    RECNUM
, filler_field_0   FILLER  POSITION(1)
, input_file_name          "TRIM(:input_file_name)"
)
INTO TABLE upload_log
WHEN (1:6) = 'Table '
FIELDS TERMINATED BY WHITESPACE
(  upload_log_id   RECNUM
, filler_field_0  FILLER  POSITION(1)
, table_name              "RTRIM(:table_name, ',')"
)
INTO TABLE upload_rejects
WHEN (1:7) = 'Record '
FIELDS TERMINATED BY ':'
(  upload_rejects_id  RECNUM
, record_number      POSITION(1)  "TO_NUMBER(SUBSTR(:record_number,8,20))"
, reason
)
INTO TABLE upload_rejects
WHEN (1:4) = 'ORA-'
FIELDS TERMINATED BY ':'
(  upload_rejects_id  RECNUM
, error_code         POSITION(1)
, error_desc
)
INTO TABLE upload_log
WHEN (1:22) = 'Total logical records '
FIELDS TERMINATED BY WHITESPACE
(  upload_log_id      RECNUM
, filler_field_0     FILLER  POSITION(1)
, filler_field_1     FILLER
, filler_field_2     FILLER
, action                     "RTRIM(:action, ':')"
, number_of_records
)
INTO TABLE upload_log
WHEN (1:13) = 'Run began on '
FIELDS TERMINATED BY WHITESPACE
TRAILING NULLCOLS
(  upload_log_id   RECNUM
, filler_field_0  FILLER  POSITION(1)
, filler_field_1  FILLER
, filler_field_2  FILLER
, day_of_week
, month
, day_of_month
, time_of_day
, year
, run_began_on            "TO_DATE((:month ||' '|| :day_of_month ||' '|| :time_of_day ||' '|| :year), 'Mon DD HH24:MI:SS YYYY')"
)
INTO TABLE upload_log
WHEN (1:13) = 'Run ended on '
FIELDS TERMINATED BY WHITESPACE
TRAILING NULLCOLS
(  upload_log_id   RECNUM
, filler_field_0  FILLER  POSITION(1)
, filler_field_1  FILLER
, filler_field_2  FILLER
, day_of_week
, month
, day_of_month
, time_of_day
, year
, run_ended_on            "TO_DATE((:month ||' '|| :day_of_month ||' '|| :time_of_day ||' '|| :year), 'Mon DD HH24:MI:SS YYYY')"
)
INTO TABLE upload_log
WHEN (1:18) = 'Elapsed time was: '
FIELDS TERMINATED BY ':'
(  upload_log_id   RECNUM
, filler_field_0  FILLER  POSITION(1)
, filler_field_1  FILLER
, filler_field_2  FILLER
, elapsed_time
)
INTO TABLE upload_log
WHEN (1:14) = 'CPU time was: '
FIELDS TERMINATED BY ':'
(  upload_log_id   RECNUM
, filler_field_0  FILLER  POSITION(1)
, filler_field_1  FILLER
, filler_field_2  FILLER
, cpu_time
)
Here are the basic table create scripts:
TRUNCATE TABLE upload_log;
DROP TABLE upload_log;
CREATE TABLE upload_log
(  upload_log_id      INTEGER
, day_of_week        VARCHAR2(  3)
, month              VARCHAR2(  3)
, day_of_month       INTEGER
, time_of_day        VARCHAR2(  8)
, year               INTEGER
, log_started_on     DATE
, input_file_name    VARCHAR2(255)
, table_name         VARCHAR2( 30)
, action             VARCHAR2( 10)
, number_of_records  INTEGER
, run_began_on       DATE
, run_ended_on       DATE
, elapsed_time       VARCHAR2(  8)
, cpu_time           VARCHAR2(  8)
);
TRUNCATE TABLE upload_rejects;
DROP TABLE upload_rejects;
CREATE TABLE upload_rejects
(  upload_rejects_id  INTEGER
, record_number      INTEGER
, reason             VARCHAR2(255)
, error_code         VARCHAR2(  9)
, error_desc         VARCHAR2(255)
);
Now, if I could only insert a single record into the upload_log table (per table logged), adding separate columns for the skipped, read, rejected, and discarded quantities. Any advice on how to use SQL*Loader to do this? (Writing a procedure would be fairly simple, but I'd like to perform all of the work in one place if at all possible.)
Thanks,
Luke
Edited by: Luke Mackey on Nov 12, 2009 4:28 PM
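A possible follow-up for the single-summary-row question (an untested sketch, not from the original thread): since the control file above loads one upload_log row per 'Total logical records ...' line, with the action word in the action column, a post-load statement can pivot those rows into one row. The upload_log_summary table here is hypothetical; the source columns match the create script above.
-- upload_log_summary is a hypothetical one-row-per-load summary table
INSERT INTO upload_log_summary
  (table_name, records_skipped, records_read, records_rejected, records_discarded)
SELECT MAX(table_name)
     , MAX(CASE WHEN action = 'skipped'   THEN number_of_records END)
     , MAX(CASE WHEN action = 'read'      THEN number_of_records END)
     , MAX(CASE WHEN action = 'rejected'  THEN number_of_records END)
     , MAX(CASE WHEN action = 'discarded' THEN number_of_records END)
FROM upload_log;  -- assumes upload_log holds the rows from a single log file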

Similar Messages

  • SQL*Loader Sequential Data File Record Processing?

    If I use the conventional path will SQL*Loader process a data file sequentially from top to bottom?  I have a file comprised of header and detail records with no value found in the detail records that can be used to relate to the header records.  The only option is to derive a header value via a sequence (nextval) and then populate the detail records with the same value pulled from the same sequence (currval).  But for this to work SQL*Loader must process the file in the exact same sequence that the data has been written to the data file.  I've read through the 11g Oracle® Database Utilities SQL*Loader sections looking for proof that this is what will happen but haven't found this information and I don't want to assume that SQL*Loader will always process the data file records sequentially.
    Thank you

    Oracle Support responded with the following statement.
    "Yes, SQL*LOADER process data file from top to bottom.
    This was touched in the note below:
    SQL*Loader - How to Load a Single Logical Record from Physical Records which Include Linefeeds (Doc ID 160093.1)"
    Jason
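For reference, a minimal control file sketch of the header/detail technique described above (untested; table, column, and sequence names are hypothetical, and it assumes records are flagged 'H' or 'D' in column 1 and a release that supports EXPRESSION fields):
LOAD DATA
INFILE 'mydata.dat'
APPEND
INTO TABLE header_stage
WHEN (1:1) = 'H'
FIELDS TERMINATED BY ','
(  rec_type   FILLER  POSITION(1)
,  header_id  EXPRESSION "hdr_seq.NEXTVAL"
,  header_col1
)
INTO TABLE detail_stage
WHEN (1:1) = 'D'
FIELDS TERMINATED BY ','
(  rec_type   FILLER  POSITION(1)
,  header_id  EXPRESSION "hdr_seq.CURRVAL"
,  detail_col1
)
Because the conventional path processes the file top to bottom (per the support note above), NEXTVAL fires on each header record and CURRVAL repeats that value on the detail records that follow it.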

  • Loading an Excel file into a Z-Table

    Hi Gurus,
Can somebody guide me how to load an Excel file into a Z-Table.

    Hi,
You need to write ABAP code for this....
    The following is the code for loading data into an internal table:
    types: begin of ttab,
           rec(1000) type c,
           end of ttab.
    types: begin of tdat,
           fld1(10) type c,
           fld2(10) type c,
           fld3(10) type c,
           end of tdat.
    data: itab type table of ttab with header line.
    data: idat type table of tdat with header line.
    data: file_str type string.
    parameters: p_file type localfile.
    at selection-screen on value-request for p_file.
      call function 'KD_GET_FILENAME_ON_F4'
           exporting
                static    = 'X'
           changing
                file_name = p_file.
    start-of-selection.
      file_str = p_file.
      call function 'GUI_UPLOAD'
           exporting
                filename                = file_str
           tables
                data_tab                = itab
           exceptions
                file_open_error         = 1
                file_read_error         = 2
                no_batch                = 3
                gui_refuse_filetransfer = 4
                invalid_type            = 5
                no_authority            = 6
                unknown_error           = 7
                bad_data_format         = 8
                header_not_allowed      = 9
                separator_not_allowed   = 10
                header_too_long         = 11
                unknown_dp_error        = 12
                access_denied           = 13
                dp_out_of_memory        = 14
                disk_full               = 15
                dp_timeout              = 16
                others                  = 17.
  " drop the header row from the uploaded file
  delete itab index 1.
      loop at itab.
        clear idat.
        split itab-rec at cl_abap_char_utilities=>horizontal_tab
                              into idat-fld1
                                   idat-fld2
                                   idat-fld3.
        append idat.
      endloop.
      loop at idat.
        write:/ idat-fld1, idat-fld2, idat-fld3.
      endloop.
From this internal table, store it in the database.....
Sharma, one more thing: you should have posted this question in the ABAP forum for better results....
    Hope it helps,
    Thanks,
    Happy Life,
    Aravind

  • SQL*Loader and multiple files

Hello, I am tasked with loading tables with 21+ million rows. Will SQL*Loader perform better with one large file or many smaller files? Is there an ideal file size for optimal performance? Thank you
    David

Don, when I tried to load a 21M-row table using direct path, I got the following messages:
    Record 2373: Rejected - Error on table STAGE_CUSTOMER.
    ORA-03113: end-of-file on communication channel
    SQL*Loader-926: OCI error while uldlfca:OCIDirPathColArrayLoadStream for table STAGE_CUSTOMER
    SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.
    SQL*Loader-925: Error while uldlgs: OCIStmtExecute (ptc_hp)
    ORA-03114: not connected to ORACLE
    SQL*Loader-925: Error while uldlgs: OCIStmtFetch (ptc_hp)
    ORA-24338: statement handle not executed
    Here is the SQL*Loader log:
    SQL*Loader: Release 9.2.0.1.0 - Production on Thu Apr 26 15:38:29 2007
    Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
    Control File: stage_customer.ctl
    Character Set UTF8 specified for all input.
    First primary datafile stage_Customer_20070301.csv has a
    utf8 byte order mark in it.
    Data File: stage_Customer_20070301.csv
    Bad File: stage_Customer_20070301.bad
    Discard File: none specified
    (Allow all discards)
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 1000
    Continuation: none specified
    Path used: Direct
    Silent options: FEEDBACK
    Table STAGE_CUSTOMER, loaded from every logical record.
    Insert option in effect for this table: APPEND
    TRAILING NULLCOLS option in effect
    Column Name Position Len Term Encl Datatype
    ROW_ID SEQUENCE (MAX, 1)
    CUSTOMER_ACCT_NUM FIRST * , O(") CHARACTER
    IS_DELETED NEXT * , O(") CHARACTER
    SQL string for column : "CASE WHEN UPPER(:Is_Deleted) IN ('FALSE','0') THEN 0 ELSE 1 END"
    NAME_PREFIX NEXT * , O(") CHARACTER
    FIRST_NAME NEXT * , O(") CHARACTER
    MIDDLE_NAME NEXT * , O(") CHARACTER
    LAST_NAME NEXT * , O(") CHARACTER
    NAME_SUFFIX NEXT * , O(") CHARACTER
    NICK_NAME NEXT * , O(") CHARACTER
    ALT_FIRST_NAME NEXT * , O(") CHARACTER
    ALT_LAST_NAME NEXT * , O(") CHARACTER
    MARKETING_SOURCE_ID NEXT * , O(") CHARACTER
    HOME_PHONE NEXT * , O(") CHARACTER
    WORK_PHONE NEXT * , O(") CHARACTER
    MOBILE_PHONE NEXT * , O(") CHARACTER
    ALTERNATE_PHONE NEXT * , O(") CHARACTER
    EMAIL_ADDR NEXT * , O(") CHARACTER
    ALT_EMAIL_ADDR NEXT * , O(") CHARACTER
    BIRTH_DATE NEXT * , O(") CHARACTER
    SQL string for column : "TRUNC(TO_DATE(:Birth_Date, 'MM/DD/YYYY HH24:MI:SS'))"
    SALES_CHANNEL_ID NEXT * , O(") CHARACTER
    SQL string for column : "decode(:Sales_Channel_id,NULL,NULL,NULL)"
    ASSOCIATE_NUMBER NEXT * , O(") CHARACTER
    ALT_ASSOCIATE_NUMBER NEXT * , O(") CHARACTER
    UPDATE_LOCATION_CD NEXT * , O(") CHARACTER
    UPDATE_DATE NEXT * , O(") CHARACTER
    SQL string for column : "TRUNC(TO_DATE(:Update_Date, 'MM/DD/YYYY HH24:MI:SS'))"
    CUSTOMER_LOGIN_NAME NEXT * , O(") CHARACTER
    DISCOUNT_CD NEXT * , O(") CHARACTER
    DISCOUNT_PERCENT NEXT * , O(") CHARACTER
    BUSINESS_NAME NEXT * , O(") CHARACTER
    POS_TAX_FLAG NEXT * , O(") CHARACTER
    POS_TAX_PROMPT NEXT * , O(") CHARACTER
    SQL string for column : "CASE WHEN UPPER(:POS_TAX_PROMPT) IN ('FALSE','0') THEN 0 ELSE 1 END"
    POS_DEFAULT_TAX_ID NEXT * , O(") CHARACTER
    POS_TAX_ID_EXPIRATION_DATE NEXT * , O(") CHARACTER
    POS_AUTHORIZED_USER_FLAG NEXT * , O(") CHARACTER
    SQL string for column : "CASE WHEN UPPER(:POS_AUTHORIZED_USER_FLAG) IN ('FALSE','0') THEN 0 ELSE 1 END"
    POS_ALLOW_PURCHASE_ORDER_FLAG NEXT * , O(") CHARACTER
    SQL string for column : "CASE WHEN UPPER(:POS_ALLOW_PURCHASE_ORDER_FLAG) IN ('FALSE','0') THEN 0 ELSE 1 END"
    ADDRESS1 NEXT * , O(") CHARACTER
    ADDRESS2 NEXT * , O(") CHARACTER
    ADDRESS3 NEXT * , O(") CHARACTER
    CITY NEXT * , O(") CHARACTER
    STATE_CD NEXT * , O(") CHARACTER
    POSTAL_CD NEXT * , O(") CHARACTER
    COUNTRY_CD NEXT * , O(") CHARACTER
    ALLOW_UPDATE NEXT * , O(") CHARACTER
    SQL string for column : "CASE WHEN UPPER(:ALLOW_UPDATE) IN ('FALSE','0') THEN 0 ELSE 1 END"
    ACTION_CODE NEXT * , O(") CHARACTER
    SQL string for column : "CASE WHEN UPPER(:action_code) IN ('FALSE','0') THEN 0 ELSE 1 END"
    ACCOUNT_TYPE_ID NEXT * , O(") CHARACTER
    LOCALE_CD NEXT * , O(") CHARACTER
    SQL string for column : "CASE WHEN :Locale_CD IS NOT NULL AND :Locale_CD LIKE '__-__' THEN :Locale_CD ELSE 'en-US' END"
    IS_READY_FOR_PROCESSING CONSTANT
    Value is '1'
    IS_BUSINESS CONSTANT
    Value is '0'
    HAD_ERRORS CONSTANT
    Value is '0'
    Record 2373: Rejected - Error on table STAGE_CUSTOMER.
    ORA-03113: end-of-file on communication channel
    SQL*Loader-926: OCI error while uldlfca:OCIDirPathColArrayLoadStream for table STAGE_CUSTOMER
    SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.
    SQL*Loader-925: Error while uldlgs: OCIStmtExecute (ptc_hp)
    ORA-03114: not connected to ORACLE
    SQL*Loader-925: Error while uldlgs: OCIStmtFetch (ptc_hp)
    ORA-24338: statement handle not executed
    Table STAGE_CUSTOMER:
    0 Rows successfully loaded.
    1 Row not loaded due to data errors.
    0 Rows not loaded because all WHEN clauses were failed.
    0 Rows not loaded because all fields were null.
    Bind array size not used in direct path.
    Column array rows : 5000
    Stream buffer bytes: 256000
    Read buffer bytes: 1048576
    Total logical records skipped: 0
    Total logical records read: 3469
    Total logical records rejected: 1
    Total logical records discarded: 0
    Direct path multithreading optimization is disabled
    Run began on Thu Apr 26 15:38:29 2007
    Run ended on Thu Apr 26 15:38:30 2007
    Elapsed time was: 00:00:01.18
    CPU time was: 00:00:00.32

  • SQL Loader Inserts chr(13) and chr(10) in the first column of every row.

    Hi,
    I have exported a data in a pipe delimited file using TOAD in one database. Now I want to load the data in my local database using SQL Loader. However every time I try to load the data a double quote followed by a new line is entered for the first column of each row. Unfortunately the delimited file is very big and hence can't be posted here. However I tried the same with a customized table and its data and found the same problem. Below are the table structures and control file that I used.
create table test_sql
(  a varchar2(30)
,  b date
);
insert into test_sql values('51146263',sysdate-3);
insert into test_sql values('51146261',sysdate-1);
EXPORTED PIPE DELIMITED FILE:
    A|B|!##!
    51146261|04/14/13 4:55:18 PM|!##!
    51146263|04/12/13 4:55:32 PM|!##!
    create table test_sql1 as select * from test_sql where 1=2;
CONTROL FILE:
OPTIONS(SKIP=1)
LOAD DATA
INFILE 'C:\Users\Prithwish\Desktop\Test.txt' "str '!##!'"
PRESERVE BLANKS
INTO TABLE TEST_SQL1
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(  A CHAR(2000),
   B DATE "MM/DD/YYYY HH12:MI:SS AM"
)
    select * from TEST_SQL1;
    After this when I paste it in notepad I get the following result
    A B
    51146261"     14-APR-0013 16:55:18
    51146263"     12-APR-0013 16:55:32.
    I have no idea how the quotes or the newline appear. Is this a Toad bug? Any help would be greatly appreciated. Really urgent.
    Thanks in advance
    Regards

    Hi Harry,
    I actually thought that the str !##! was causing the problem. Actually my original export has some new lines in some specific columns so I can't keep the new line as my line terminator and hence I kept the !##! as my terminator.
    When I put the same data in a notepad and load it there is no problem at all. For e.g I just typed the following in a notepad and the data loaded just fine.
    A|B|!##!
    51146261|01-01-01 10:10:10 AM|!##!
    51146263|01-01-01 11:11:11 AM|!##!
It's just when I load the exported file that the problem arises, though I have verified the file on UNIX using octal dump and found no hidden characters.
    Regards,
    Prithwish
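A possible explanation and fix (a sketch, not from the original thread): TOAD writes a carriage return/line feed after each !##!, and with "str '!##!'" that CR/LF becomes the first bytes of the next record, which would account for the stray character and newline in the first column. Including the CR/LF in the terminator via the hex form of str ('!' '#' '#' '!' CR LF) should avoid it:
INFILE 'C:\Users\Prithwish\Desktop\Test.txt' "str X'212323210D0A'"
The rest of the control file would stay unchanged.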

  • Using sql load insert multiple fields data into a single column in database

    Hi ,
I have my log file on a Sun OS box, looking something like this:
    =======
    (07/29/2009 00:02:24.467) 367518 (07/29/2009 00:02:26.214) 949384011
    (07/29/2009 00:02:26.236) 3675 (07/29/2009 00:02:28.207) 949395117
    (07/29/2009 00:02:28.240) 337710 (07/29/2009 00:02:30.621) 949400864
    =============
I am trying to insert the data into the Oracle database as follows.
    =============================
    column1 : (07/29/2009 00:02:24.467)
    column2 : 367518
    column3 : (07/29/2009 00:02:26.214)
    column4 : 949384011
    ===========================
    Can anyone help me with the control file format?
    someone suggested me the code below.
    ==========
    LOAD DATA
    INFILE 'D:\work\load.txt'
    INTO TABLE sample
    (col1 POSITION(02:24) char,
    col2 POSITION(27:32) INTEGER EXTERNAL,
    col3 POSITION(35:57) CHAR,
col4 POSITION(60:68) INTEGER EXTERNAL
)
    ===========
But this works only for fixed-length data. Please help.

user11744904 wrote:
(question quoted in full above)
Is the requirement to load all data in a single column or multiple columns? The thread subject and body are conflicting.
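If the requirement is four columns, one way to cope with the variable field widths (an untested sketch against the sample data above): let whitespace terminate the fields but enclose the timestamps in '(' and ')', so the space inside them survives. Note the enclosing parentheses are stripped on load; re-add them with a SQL string if they must be stored.
LOAD DATA
INFILE 'D:\work\load.txt'
APPEND
INTO TABLE sample
FIELDS TERMINATED BY WHITESPACE OPTIONALLY ENCLOSED BY '(' AND ')'
(  col1 CHAR
,  col2 INTEGER EXTERNAL
,  col3 CHAR
,  col4 INTEGER EXTERNAL
)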

  • SQL Loader and control file changes for different users

In the front end of my application I can select a data file and a control file, and load data into the table mentioned in the .ctl file. Every user who logs in uses the same .ctl file and so loads into the same table. Now I want each user to load data into the table in his own schema. I can get the username of the user currently logged in, and I want to insert into that username.table. So can I copy the contents of the .ctl file into a variable, modify the table name to username.table in that string, and pass that variable as a parameter to the sqlldr command instead of the .ctl file?
Or is there a better way to modify the same control file every time, changing tablename to username.tablename in the .ctl file, before passing it to sqlldr to load data into the table in the user's own schema?
    Thanks and Regards

Thanks for the reply. Users do have their own credentials, but only for the application. All users use a common loader and control file once they log into the application. So irrespective of which user is logged in, he selects the same control file and loads into the same table mentioned in the control file. I instead want the user to be able to load into the table named in the control file, but in his own schema - username.tablename instead of just the tablename mentioned in the .ctl file.
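One hedged thought, assuming each user could be given a real database account: an unqualified table name in a control file resolves in the schema of the account sqlldr connects as, so a single shared .ctl can serve every user if the connection, rather than the file, carries the identity.
-- shared.ctl (sketch; names hypothetical)
LOAD DATA
INFILE 'data.csv'
APPEND
INTO TABLE target_table   -- unqualified: resolves in the connecting user's schema
FIELDS TERMINATED BY ','
(  col1
,  col2
)
Invoked as, e.g., sqlldr %APP_USER%/%APP_PASS% control=shared.ctl. If the users only ever have application credentials, then generating a temporary control file from a template with username.tablename substituted in, as you describe, is the usual workaround.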

  • SQL*Loader with multiple files

    Gurus,
I searched the documentation and this forum and haven't found a solution to my issue yet...
I am no expert on SQL*Loader. I have used SQL*Loader to copy from one file to a table many times. But I have not copied multiple files into one table, especially with different names.
More specifically....
I need to load data from multiple files into a table. But the file names will be different each time. A file will be created every hour. The file name will consist of the root file name appended by a time stamp. For example, a file created on 10/07/2010 at 2:15 P.M. would be filea100720101415.txt while a file created on 10/08/2010 at 8:15 A.M. would be filea100820100815.txt. All the files will be in one directory. How can I load the data from the files using SQL*Loader?
    My database: Oracle 10g Release 2
    Operating System: Windows 2003 Server
    Please assist.
    Robert

sect55 wrote:
(question quoted in full above)
Too bad this isn't in *nix, where you get a powerful shell scripting capability.
    That said, here is the core of the solution .... you will also need a way to identify files that have been processed vs. new ones. Maybe rename them, maybe move them. But with this sample you can see the basics. From there it is really an issue of DOS scripting, which would better be found by googling around a bit.
cd c:\loadfiles
FOR %%f IN (*.txt) DO SQLLDR CONTROL=sample.ctl, LOG=sample.log, BAD=baz.bad, DATA=%%f
Try googling "dos scripting language". You'll find lots of tutorials and ideas on "advanced" (well, as advanced as DOS gets) techniques to solve your problem.
    Edited by: EdStevens on Dec 1, 2010 5:03 PM
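Expanding EdStevens' one-liner into a batch sketch (untested; credentials and file names are placeholders; sqlldr sets a nonzero exit code on warnings and errors) that also moves loaded files aside so they are not picked up twice:
@echo off
rem Load every .txt in c:\loadfiles, then move loaded files to done\
cd /d c:\loadfiles
if not exist done mkdir done
for %%f in (*.txt) do (
  sqlldr scott/tiger control=sample.ctl data=%%f log=%%~nf.log bad=%%~nf.bad
  rem errorlevel 1 catches warnings as well as failures; adjust to taste
  if errorlevel 1 (echo load of %%f had problems, see %%~nf.log) else (move "%%f" done\ >nul)
)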

  • Problem import csv file with SQL*loader and control file

I have a *.csv file looking like this:
    E0100070;EKKJ 1X10/10 1 KV;1;2003-06-16;01C;75
    E0100075;EKKJ 1X10/10 1 KV;500;2003-06-16;01C;67
    E0100440;EKKJ 2X2,5/2,5 1 KV;1;2003-06-16;01C;37,2
    E0100445;EKKJ 2X2,5/2,5 1 KV;500;2003-06-16;01C;33,2
    E0100450;EKKJ 2X4/4 1 KV;1;2003-06-16;01C;53
    E0100455;EKKJ 2X4/4 1 KV;500;2003-06-16;01C;47,1
    I want to import this csv file to this table:
    create table artikel (artnr varchar2(10), namn varchar2(25), fp_storlek number, datum date, mtrlid varchar2(5), pris number);
    My controlfile looks like this:
    LOAD DATA
    INFILE 'e:\test.csv'
    INSERT
    INTO TABLE ARTIKEL
    FIELDS TERMINATED BY ';'
    TRAILING NULLCOLS
    (ARTNR, NAMN, FP_STORLEK char "to_number(:fp_storlek,'99999')", DATUM date 'yyyy-mm-dd', MTRLID, pris char "to_number(:pris,'999999D99')")
I can't get SQL*Loader to import the last column (pris) as I want. It ignores my decimal separator, which in this case is "," and not "."; maybe this is the problem. If the decimal separator is the problem, how can I get Oracle to recognize "," as a decimal point?
The result from the import now is that a decimal number (37,2) becomes 372 in the table.

Set the NLS_NUMERIC_CHARACTERS environment variable at the OS level before running SQL*Loader:
    $ cat test.csv
    E0100070;EKKJ 1X10/10 1 KV;1;2003-06-16;01C;75
    E0100075;EKKJ 1X10/10 1 KV;500;2003-06-16;01C;67
    E0100440;EKKJ 2X2,5/2,5 1 KV;1;2003-06-16;01C;37,2
    E0100445;EKKJ 2X2,5/2,5 1 KV;500;2003-06-16;01C;33,2
    E0100450;EKKJ 2X4/4 1 KV;1;2003-06-16;01C;53
    E0100455;EKKJ 2X4/4 1 KV;500;2003-06-16;01C;47,1
    $ cat artikel.ctl
    LOAD DATA
    INFILE 'test.csv'
    replace
    INTO TABLE ARTIKEL
    FIELDS TERMINATED BY ';'
    TRAILING NULLCOLS
    (ARTNR, NAMN, FP_STORLEK char "to_number(:fp_storlek,'99999')", DATUM date 'yyyy-mm-dd', MTRLID, pris char "to_number(:pris,'999999D99')")
    $ sqlldr scott/tiger control=artikel
    SQL*Loader: Release 10.1.0.3.0 - Production on Sat Nov 12 15:10:01 2005
    Copyright (c) 1982, 2004, Oracle.  All rights reserved.
    Commit point reached - logical record count 6
    $ sqlplus scott/tiger
    SQL*Plus: Release 10.1.0.3.0 - Production on Sat Nov 12 15:10:11 2005
    Copyright (c) 1982, 2004, Oracle.  All rights reserved.
    Connected to:
    Oracle Database 10g Enterprise Edition Release 10.1.0.3.0 - Production
    With the Partitioning, OLAP and Data Mining options
    SQL> select * from artikel;
    ARTNR      NAMN                      FP_STORLEK DATUM      MTRLI       PRIS
    E0100070   EKKJ 1X10/10 1 KV                  1 16/06/2003 01C           75
    E0100075   EKKJ 1X10/10 1 KV                500 16/06/2003 01C           67
    E0100440   EKKJ 2X2,5/2,5 1 KV                1 16/06/2003 01C          372
    E0100445   EKKJ 2X2,5/2,5 1 KV              500 16/06/2003 01C          332
    E0100450   EKKJ 2X4/4 1 KV                    1 16/06/2003 01C           53
    E0100455   EKKJ 2X4/4 1 KV                  500 16/06/2003 01C          471
    6 rows selected.
    SQL> exit
    Disconnected from Oracle Database 10g Enterprise Edition Release 10.1.0.3.0 - Production
    With the Partitioning, OLAP and Data Mining options
    $ export NLS_NUMERIC_CHARACTERS=',.'
    $ sqlldr scott/tiger control=artikel
    SQL*Loader: Release 10.1.0.3.0 - Production on Sat Nov 12 15:10:41 2005
    Copyright (c) 1982, 2004, Oracle.  All rights reserved.
    Commit point reached - logical record count 6
    $ sqlplus scott/tiger
    SQL*Plus: Release 10.1.0.3.0 - Production on Sat Nov 12 15:10:45 2005
    Copyright (c) 1982, 2004, Oracle.  All rights reserved.
    Connected to:
    Oracle Database 10g Enterprise Edition Release 10.1.0.3.0 - Production
    With the Partitioning, OLAP and Data Mining options
    SQL> select * from artikel;
    ARTNR      NAMN                      FP_STORLEK DATUM      MTRLI       PRIS
    E0100070   EKKJ 1X10/10 1 KV                  1 16/06/2003 01C           75
    E0100075   EKKJ 1X10/10 1 KV                500 16/06/2003 01C           67
    E0100440   EKKJ 2X2,5/2,5 1 KV                1 16/06/2003 01C         37,2
    E0100445   EKKJ 2X2,5/2,5 1 KV              500 16/06/2003 01C         33,2
    E0100450   EKKJ 2X4/4 1 KV                    1 16/06/2003 01C           53
    E0100455   EKKJ 2X4/4 1 KV                  500 16/06/2003 01C         47,1
    6 rows selected.
SQL>
The control file is exactly as yours; I just put replace instead of insert.

  • Sql loader maximum data file size..?

Hi - I wrote a SQL*Loader script, run through a shell script, which imports data into a table from a CSV file. The CSV file size is around 700MB. I am using Oracle 10g in a Sun Solaris 5 environment.
My question is: is there a maximum data file size? The following code is from my shell script.
    SQLLDR=
    DB_USER=
    DB_PASS=
    DB_SID=
    controlFile=
    dataFile=
    logFileName=
    badFile=
    ${SQLLDR} userid=$DB_USER"/"$DB_PASS"@"$DB_SID \
              control=$controlFile \
              data=$dataFile \
              log=$logFileName \
              bad=$badFile \
              direct=true \
              silent=all \
          errors=5000
Here is my control file code:
    LOAD DATA
    APPEND
    INTO TABLE KEY_HISTORY_TBL
    WHEN OLD_KEY <> ''
    AND NEW_KEY <> ''
    FIELDS TERMINATED BY ','
TRAILING NULLCOLS
(
            OLD_KEY "LTRIM(RTRIM(:OLD_KEY))",
            NEW_KEY "LTRIM(RTRIM(:NEW_KEY))",
            SYS_DATE "SYSTIMESTAMP",
            STATUS CONSTANT 'C'
)
Thanks,
    -Soma
    Edited by: user4587490 on Jun 15, 2011 10:17 AM
    Edited by: user4587490 on Jun 15, 2011 11:16 AM

    Hello Soma.
    How many records exist in your 700 MB CSV file? How many do you expect to process in 10 minutes? You may want to consider performing a set of simple unit tests with 1) 1 record, 2) 1,000 records, 3) 100 MB filesize, etc. to #1 validate that your shell script and control file syntax function as expected (including the writing of log files, etc.), and #2 gauge how long the processing will take for the full file.
    Hope this helps,
    Luke
    Please mark the answer as helpful or answered if it is so. If not, provide additional details.
    Always try to provide actual or sample statements and the full text of errors along with error code to help the forum members help you better.

  • SQL loader with *.csv file

    Good morning,
I have a *.CSV file containing data like this:
    A123456789,Ah Tong Station,Jalan Dungun
I would like to insert the data into a table, one row per line of the file:
id          outlet_name      addr_1
A123456789  Ah Tong Station  Jalan Dungun
etc
    In the stored procedure, SQL loader, I used
Insert into XXXXX values
(SUBSTR(input_buffer, 1, 10),
SUBSTR(input_buffer, 11, 60),
SUBSTR(input_buffer, 61, 110));
but the problem is that the insertion of outlet_name and addr_1 does not follow the subscripts provided. Instead it gave output like this:
    id outlet_name addr_1
    A123456789 ,Ah Tong Station,Jalan Dungun NULL
    Hope to get some advice asap. Thanks. :)
    Pauline

    Why not use the SQL*Loader proper to load this in to your table? It will be significantly faster than anything you could do in PL/SQL, especially if your volumes increase. And it will take care of the parsing for you, as long as you tell it the field delimiter. You've already seen that you can't rely on fixed widths when using delimited data.
    Given this data...
    A123456789,Ah Tong Station,Jalan Dungun
A234,Some Station,Some Place
...the following SQL*Loader control file (stored in a file called mytable.ctl) will load your table (note I've used truncate - you could also append)...
    load data
    infile 'mydata.csv'
    truncate
    into table mytable
    fields terminated by ','
    (  id
    ,  outlet_name
    ,  addr_1
)
This is invoked using the following:
sqlldr userid/password@tns control=mytable.ctl
The results are:
    SQL> select * from mytable;
    ID                             OUTLET_NAME                    ADDR_1
    A123456789                     Ah Tong Station                Jalan Dungun
A234                           Some Station                   Some Place
You will find this much easier. Of course, you could always avoid the database altogether if you don't need to actually store this data. You could just parse the source .csv (if you are on UNIX this will be really easy using an awk one-liner - see the sketch below) and write it to an output file.
    Anyway, hope this helps.
    Regards
    Adrian
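For reference, the awk route Adrian mentions might look like this (a sketch only; assumes comma-delimited input, tab-separated output, and hypothetical file names):
awk -F',' 'BEGIN { OFS = "\t" } { print $1, $2, $3 }' mydata.csv > mydata.out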

  • Unable to view SQL Request in Log files

    Hi Folks,
I am facing an issue: I am unable to view the physical query generated in the log files in Presentation Services.
Below is the logical SQL request, but I want to view the exact physical query, i.e. the SQL that is actually hitting the DB.
Please guide me to resolve this issue; I guess it is because of the initialization blocks created, which are blocking the view of the SQL request.
    -------------------- SQL Request:
    set variable LOGLEVEL = 7;SELECT "- Policy Effective-Start Date"."Start Quarter" saw_0, "- Insurance Policy Facts".Revenue saw_1, "- Insurance Policy Facts"."# Insurance Policies" saw_2, "Insurance Policy".Status saw_3, "Insurance Policy".Type saw_4 FROM "Insurance Policies" WHERE ("Insurance Policy".Type = 'Policy') AND ("- Policy Effective-Start Date"."Start Julian Day Number" BETWEEN VALUEOF(CURRENT_JULIAN_DAY)-365 AND VALUEOF("CURRENT_JULIAN_DAY")) ORDER BY saw_0, saw_3, saw_4
    /* QUERY_SRC_CD='rawSQL' */
    Regards
    Dj

There is no Enterprise Edition of SSMS. There is SSMS Basic and SSMS Complete. Prior to 2012 SP1, only SSMS Basic was available with Express Edition - but as of 2012 SP1, Express also offers SSMS Complete. SSMS Complete is selected by default when you install SSMS (unless you are prior to 2012 SP1 and are using Express, of course).
However, even SSMS Basic *should* show Agent, assuming you have permissions for it. This is hearsay, but from trusted sources. Here is what to do:
Check what is installed on the machine from where you are running SSMS. You can do that using SQL Server Installation Center - see this blog post: http://sqlblog.com/blogs/tibor_karaszi/archive/2011/02/10/what-does-this-express-edition-look-like-anyhow.aspx (towards the end).
On that machine try both the problematic account as well as an account which is sysadmin. Does the sysadmin account see Agent? If so, you know permissions aren't granted properly. If not, then you know the tool is the problem.
Also try the problematic account from a machine where you know you see Agent normally. Again, this will help you assess whether the problem is the tool (SSMS) or permissions for the account.
Tibor Karaszi, SQL Server MVP

  • Help! SQL server database log file increasing enormously

I have 5 SSIS jobs running in the SQL Server job agent, and some of them pull transactional data into our database at a 4-hour interval. The problem is that the log file of our database is growing rapidly; in a day it eats up 160GB of disk space. Since our requirement doesn't need point-in-time recovery, I set the recovery model to SIMPLE, but even so the log consumes more than 160GB in a day. Because the disk fills up, the scheduled jobs often fail. Temporarily I am using the DETACH approach to clean up the log.
FYI: all the SSIS packages in the job use transactions on some tasks, e.g. a Sequence Container.
I want a permanent solution to keep the log file within a particular size limit, and as I said earlier I don't want my log data for future point-in-time recovery, so there is no need to take log backups at all.
And one more problem: in our database, the transactional table has 10 million records in it and some master tables have over 1000 records, but our mdf file size is about 50 GB now. I don't believe that these 10 million records should amount to 50GB of space. What's the problem here?
Help me on these issues. Thanks in advance.

For the SSIS part of the question it would be better if you ask in the SSIS forum, although nothing is going to change about the logging behavior. You can add some space to the log file, and you should also batch your transactions as already suggested.
Regarding the memory question about SQL Server: once it utilizes memory, it is not going to release it unless the Windows OS faces memory pressure and SQLOS asks SQL Server to trim down its memory consumption. So again, if you have set max server memory to somewhere near 50 GB, SQL Server will eventually utilize that much memory, so what you are seeing is totally normal. Remember it is a costly task for SQL Server to release and reacquire memory, so it avoids this by caching as much as possible; it also caches to avoid physical reads, which are costly.
When the log file is getting full, what does the query below return?
    select log_reuse_wait_desc from sys.databases where name='db_name'
Can you manually introduce a checkpoint in the ETL query? Try the sketch below; it might help you.
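A sketch of the manual-checkpoint idea (the logical log file name is hypothetical; under SIMPLE recovery a checkpoint lets committed log space be reused, and SHRINKFILE can then return it to the OS):
CHECKPOINT;
DBCC SHRINKFILE (N'YourDb_log', 1024);  -- target size in MB; substitute your log file's logical name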
    Please mark this reply as answer if it solved your issue or vote as helpful if it helped so that other forum members can benefit from it

  • SQL loader and stream files with new line '

    I have been trying unsuccessfully to load EDI files using SQL loader. The problem
    is that the lines are terminated by ' and when I use the stream file option it does
    not recognise the line terminator given. As I understand it from the documentation
    this should work - but it does not. I have also used the Hex option with no better
    result. Does anyone have any ideas ?
    I can and have used tr "[']" "[\n]" in Unix to convert the ' to newlines - I just
    wonder am I missing something in SQL loader which will allow me to do this ?
    This is the sql loader control file
    LOAD DATA
    INFILE 'WS860685.MFD' "Str ''' "
    BADFILE 'WS860685.bad'
    DISCARDFILE 'WS860685.dsc'
    INTO TABLE "DUND1"."EDI_LOADED_TEMP"
    REPLACE
    FIELDS TERMINATED BY '+'
    TRAILING NULLCOLS
    (L1,
    L2,
    L3,
    L4,
    L5,
    L6,
    L7,
    L8,
    L9,
    L10,
    L11,
    L12,
    L13,
    L14,
    L15,
    L16,
    L17,
    L18,
    L19,
    L20,
    L21,
    L22,
    L23,
    L24,
    L25,
    L26,
    L27,
    L28,
    L29,
    L30,
    L31,
    L32,
    L33,
    L34,
    L35,
    L36,
    L37,
    L38,
    L39,
    L40,
    LNO)
    Heres a sample of the data
    UNB+UNOA:2+5398888501357+5398888501838+080306:0737+395+ DESADV+++1'UNH+0001+DESADV:D:93A:UN:EAN004'

    http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:6020061915147
    answers this question perfectly
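In short, the technique that thread describes (a sketch, untested here): give the apostrophe record terminator as a hex string, so it never collides with the quoting:
LOAD DATA
INFILE 'WS860685.MFD' "str X'27'"
-- remainder of the control file unchanged (X'27' is the hex code for the apostrophe)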

  • SQL Loader and control file

I know I've done this before, but I don't use SQL*Loader often and I'm having issues getting a file to load.
    The table has 6 columns in it - one of which is a timestamp.
I was having date format issues loading it initially. I ruled out any issues with the timestamp format by simply loading a dummy table with some timestamp-based data, and had no issue there.
So - I think the issue is that the first column of the table I'm loading is a value I'm attempting to default with a sequence when loading, and that I'm screwing up something there.
    Table is as such:
    CREATE TABLE ACS_IPS
        (seq_ips NUMBER,
        col2   VARCHAR2(100),
        col3  VARCHAR2(100),
        col4    VARCHAR2(100),
        col5 TIMESTAMP,
        col6    VARCHAR2(100),
        col7    VARCHAR2(100),
        col8 DATE  DEFAULT SYSDATE NOT NULL,
    col9 VARCHAR2(30) DEFAULT USER);
The control file is:
    load data
    truncate
    into table acs_ips
    fields terminated by ","
trailing nullcols
(
        seq_ips "seq_ips.nextval",
        col2,
        col3,
        col4,
        col5 TIMESTAMP "YYYY-MM-DD HH24:MI:SS.FF9",
        col6,
        col7
)
The sequence column isn't in the file being loaded... and - there are additional columns that are defaulted on the table that aren't in the control file.
    Any help is appreciated... The error I'm getting is:
    Rejected - Error on table CAMS.ACS_IPS, column ACTION_START.
    ORA-01841: (full) year must be between -4713 and +9999, and not be 0

    Yeah - not sure that clears it up...
    The sequence in my table is the first column - and - I think that's the problem... the sequence is being loaded as the first column in the control file but - it's not in the file being loaded so - it's skewing (again - I think??) the data being read in - which is why I'm getting the timestamp issues on the one column (it's actually reading the next column in the file vs. the actual timestamp one).
    If that's the issue - I'm not sure how to avoid it without restructuring the table to stick the sequence physically at the end. I'm certain that's not necessary and - I'm overlooking something that's otherwise simple but evading me.
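For the archives, one hedged way out (a sketch, assuming your SQL*Loader release supports EXPRESSION fields, documented under 'generated fields' in the Utilities guide): an EXPRESSION column is populated without consuming a field from the data file, so the remaining delimited fields line back up with their columns and no restructuring of the table is needed:
load data
truncate
into table acs_ips
fields terminated by ","
trailing nullcols
(   seq_ips EXPRESSION "seq_ips.nextval",
    col2,
    col3,
    col4,
    col5 TIMESTAMP "YYYY-MM-DD HH24:MI:SS.FF9",
    col6,
    col7
)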
