Loading leap year date using SQL*Loader

Hello,
I have a problem loading the date '29/02/2000' using SQL*Loader; the date falls in a leap year. SQL*Loader rejects it with 'ORA-01839: date not valid for month specified'. My colleague and I have tried various date functions to convert the data into a date, but no luck.
I would appreciate any help,
Bruce

Thanks for your help. I found the bug in my control file. I was using the RTRIM function to remove the bogus time portion from timestamps such as '29/02/2000 0:00:00'. So instead of using this statement:
LOG_DATE DATE "DD/MM/RRRR" "RTRIM(:LOG_DATE,'0:00:00')"
I was using the statement below, with a space before the '0:00:00' string literal, intending to remove the space as well:
LOG_DATE DATE "DD/MM/RRRR" "RTRIM(:LOG_DATE,' 0:00:00')"
Well, it turned out that RTRIM treats its second argument as a set of characters rather than a literal string. With the space included in the set, RTRIM kept stripping spaces, colons and zeros from the right, past the space, and removed the '000' that belongs to '2000'. Thus, the error.
Thanks again,
Bruce
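For anyone who hits the same thing, the set-based behaviour of RTRIM is easy to see in SQL*Plus; a quick check (not part of the original post, column widths trimmed for readability):
SQL> SELECT RTRIM('29/02/2000 0:00:00', '0:00:00')  AS no_space,
  2         RTRIM('29/02/2000 0:00:00', ' 0:00:00') AS with_space
  3    FROM dual;
NO_SPACE            WITH_SPACE
------------------  ------------------
29/02/2000          29/02/2
With the space included in the trim set, the year is cut down to '2', which the DD/MM/RRRR mask interprets as 2002; 29 February 2002 does not exist, hence ORA-01839.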

Similar Messages

  • CSV FILES DON'T LOAD WITH THE RIGHT DATA USING SQL*LOADER

    Hi pals, I have the following information in csv file:
    MEXICO,Seretide_Q110,2010_SEE_01,Sales Line,OBJECTIVE,MEXICO,Q110,11/01/2010,02/04/2010,Activo,,,MEXICO
    MEXICO,Seretide_Q210,2010_SEE_02,Sales Line,OBJECTIVE,MEXICO,Q210,05/04/2010,25/06/2010,Activo,,,MEXICO
    When I use SQL*Loader the data is loaded as follows:
    EXICO,Seretide_Q110,2010_SEE_01,Sales Line,OBJECTIVE,MEXICO,Q110,11/01/2010,02/04/2010,Activo,,,MEXICO
    And the same happens for the next set of data in a csv file too:
    MX_001,MEXICO,ASMA,20105912,Not Verified,General,,RH469364,RH469364,Change Request,,,,,,,Y,MEXICO,RH469364
    MX_002,MEXICO,ASMA,30094612,Verified,General,,LCS1405,LCS1405,Change Request,,,,,,,Y,MEXICO,LCS1405
    the data is loaded as follows:
    X_001,MEXICO,ASMA,20105912,Not Verified,General,,RH469364,RH469364,Change Request,,,,,,,Y,MEXICO,RH469364
    X_002,MEXICO,ASMA,30094612,Verified,General,,LCS1405,LCS1405,Change Request,,,,,,,Y,MEXICO,LCS1405
    As you can see, the first character is truncated, and this happens with all my data. Any suggestions? I really hope you can help me.

    Your table and view don't make sense so I created a "dummy" table to match your .ctl file.
    SQL> create table CCI_SRC_MX
      2  (ORG_BU               varchar2(30)
      3  ,name                 varchar2(30)
      4  ,src_num              varchar2(30)
      5  ,src_cd               varchar2(30)
      6  ,sub_type             varchar2(30)
      7  ,period_bu            varchar2(30)
      8  ,period_name          varchar2(30)
      9  ,prog_start_dt        date
    10  ,prog_end_dt          date
    11  ,status_cd            varchar2(30)
    12  ,X_ACTUALS_CALC_DATE  date
    13  ,X_ACTUAL_UPDATE_SRC  varchar2(30)
    14  ,prod_bu              varchar2(30)
    15  ,ROW_ID               NUMBER(15,0)
    16  ,IF_ROW_STAT          VARCHAR2(90)
    17  ,JOB_ID               NUMBER(15,0)
    18  );
    Table created.
    SQL> create sequence GSK_GENERAL_SEQ;
    Sequence created.
    I simplified your .ctl file and moved all the constant and sequence stuff to the end. I also changed the format masks to match the dates in your data.
    LOAD DATA
    INFILE 'SBSLSLT.txt'
    BADFILE 'SBSLSLT.bad'
    DISCARDFILE 'SBSLSLT.dis'
    APPEND
    INTO TABLE CCI_SRC_MX
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (ORG_BU
    ,NAME
    ,SRC_NUM
    ,SRC_CD
    ,SUB_TYPE
    ,PERIOD_BU
    ,PERIOD_NAME
    ,PROG_START_DT          DATE 'dd/mm/yyyy'
    ,PROG_END_DT            DATE 'dd/mm/yyyy'
    ,STATUS_CD
    ,X_ACTUALS_CALC_DATE    DATE 'dd/mm/yyyy'
    ,X_ACTUAL_UPDATE_SRC
    ,PROD_BU
    ,row_id                 "GSK_GENERAL_SEQ.nextval"
    ,if_row_stat            CONSTANT 'UPLOADED'
    ,job_id                 constant 36889106
    )
    When I run SQL*Loader, I get this:
    SQL> select * from CCI_SRC_MX;
    ORG_BU  NAME           SRC_NUM      SRC_CD      SUB_TYPE   PERIOD_BU  PERIOD_NAME  PROG_START_DT        PROG_END_DT          STATUS_CD  PROD_BU  ROW_ID IF_ROW_STAT    JOB_ID
    MEXICO  Seretide_Q110  2010_SEE_01  Sales Line  OBJECTIVE  MEXICO     Q110         11-JAN-2010 00:00:00 02-APR-2010 00:00:00 Activo     MEXICO        1 UPLOADED     36889106
    MEXICO  Seretide_Q210  2010_SEE_02  Sales Line  OBJECTIVE  MEXICO     Q210         05-APR-2010 00:00:00 25-JUN-2010 00:00:00 Activo     MEXICO        2 UPLOADED     36889106

  • Loading data with dates using SQL*Loader

    Dear everyone
    I am currently trying to load some data containing dates using SQL*Loader.
    For termination of fields I have been using ^ because I have some book titles which contain " and ' as part of their title. I found that the TO_DATE function did not seem to work using ^ instead of ". Would I be correct? I think the Oracle manual says that " must be used.
    After some Web research I eventually amended my control file to as follows:
    load data
    infile 'h:\insert_statements\22_insert_into_SCAN_FILE_INFO.txt'
    REPLACE
    into table SCAN_FILE_INFO
    fields terminated by "," optionally enclosed by '^'
    TRAILING NULLCOLS
    (scan_id, scan_filename,
    file_format_id,
    orig_scanning_resolution_dpi,
    scanner_id, scanner_operator_id,
    scanning_date "TO_DATE (:scanning_date, 'YYYY-MM-DD')",
    original_map_publication_id,
    reprint_publication_id)
    A simple line of data is as follows:
    280001, ^1910 - London^, 270001, 400, 250001, 260001, "TO_DATE('2007-06-06', 'YYYY-MM-DD')", 200019,
    The final column is null.
    However when I attempt that I get the following error message:
    Record 1: Rejected - Error on table SCAN_FILE_INFO, column SCANNING_DATE.
    ORA-01841: (full) year must be between -4713 and +9999, and not be 0
    If I change the scanning_date part to:
    scanning_date "EXPRESSION TO_DATE (:scanning_date, 'YYYY-MM-DD')",
    or
    scanning_date "CONSTANT TO_DATE (:scanning_date, 'YYYY-MM-DD')",
    I get the error message:
    Record 1: Rejected - Error on table SCAN_FILE_INFO, column SCANNING_DATE.
    ORA-00917: missing comma
    As soon as I do the following:
    scanning_date "EXPRESSION, TO_DATE (:scanning_date, 'YYYY-MM-DD')",
    or
    scanning_date "CONSTANT, TO_DATE (:scanning_date, 'YYYY-MM-DD')",
    I get a 'too many values' error message:
    Record 1: Rejected - Error on table SCAN_FILE_INFO.
    ORA-00913: too many values
    I also tested out scanning_date DATE "YYYY-MM-DD", but that just gave the same ORA-01841 error message as above.
    I must be doing something very simple wrong, but I cannot figure it out.
    Kind regards
    Tim

    And why do you have the scanning date as "TO_DATE('2007-06-06', 'YYYY-MM-DD')" in your infile? All you need is 2007-06-06. If you cannot change the infile generation code, use:
    load data
    infile 'h:\insert_statements\22_insert_into_SCAN_FILE_INFO.txt'
    REPLACE
    into table SCAN_FILE_INFO
    fields terminated by "," optionally enclosed by '^'
    TRAILING NULLCOLS
    (scan_id, scan_filename,
    file_format_id,
    orig_scanning_resolution_dpi,
    scanner_id, scanner_operator_id,
    scanning_date "TO_DATE(REPLACE(REPLACE(:scanning_date,'TO_DATE('),'''YYYY-MM-DD'')'), 'YYYY-MM-DD')",
    original_map_publication_id,
    reprint_publication_id)
    SY.
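    If the infile generation can be changed, a simpler route is to write the bare date into the file and let SQL*Loader apply a date mask itself; a minimal sketch (same table and columns as above, shown only for the date field):
    scanning_date  DATE "YYYY-MM-DD"
    with a data line along the lines of:
    280001, ^1910 - London^, 270001, 400, 250001, 260001, 2007-06-06, 200019,
    This is presumably also why the scanning_date DATE "YYYY-MM-DD" attempt in the original post failed: the field still contained the TO_DATE(...) wrapper rather than a bare date.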

  • Issue while loading a csv file using sql*loader...

    Hi,
    I am loading a csv file using SQL*Loader.
    On the number columns that actually have data in them (decimals/integers), the rows error out with the error:
    ORA-01722: invalid number
    I tried checking the values as picked up from Excel,
    and found stray characters such as chr(13), chr(32) and chr(10) in the value.
    ex: select length('0.21') from dual gives a value of 7 for the value pasted from the file.
    When I checked each character, e.g.
    select ascii(substr('0.21',5,1)) from dual, it returned a value of 9 (a tab), etc.
    I tried the following expression:
    "to_number(trim(replace(replace(replace(replace(:std_cost_price_scala,chr(9),''),chr(32),''),chr(13),''),chr(10),'')))",
    to remove all the non-numeric special characters, but I am still facing the error.
    Please let me know, any solution for this error.
    Thanks in advance.
    Kiran

    control file:
    OPTIONS (ROWS=1, ERRORS=10000)
    LOAD DATA
    CHARACTERSET WE8ISO8859P1
    INFILE '$Xx_TOP/bin/ITEMS.csv'
    APPEND INTO TABLE XXINF.ITEMS_STAGE
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' TRAILING NULLCOLS
    (ItemNum                    "trim(replace(replace(:ItemNum,chr(9),''),chr(13),''))",
    cross_ref_old_item_num               "trim(replace(replace(:cross_ref_old_item_num,chr(9),''),chr(13),''))",
    Mas_description               "trim(replace(replace(:Mas_description,chr(9),''),chr(13),''))",
    Mas_long_description               "trim(replace(replace(:Mas_long_description,chr(9),''),chr(13),''))",
    Org_description               "trim(replace(replace(:Org_description,chr(9),''),chr(13),''))",
    Org_long_description               "trim(replace(replace(:Org_long_description,chr(9),''),chr(13),''))",
    user_item_type                    "trim(replace(replace(:user_item_type,chr(9),''),chr(13),''))",
    organization_code               "trim(replace(replace(:organization_code,chr(9),''),chr(13),''))",
    primary_uom_code               "trim(replace(replace(:primary_uom_code,chr(9),''),chr(13),''))",
    inv_default_item_status          "trim(replace(replace(:inv_default_item_status,chr(9),''),chr(13),''))",
    inventory_item_flag               "trim(replace(replace(:inventory_item_flag,chr(9),''),chr(13),''))",
    stock_enabled_flag               "trim(replace(replace(:stock_enabled_flag,chr(9),''),chr(13),''))",
    mtl_transactions_enabled_flag          "trim(replace(replace(:mtl_transactions_enabled_flag,chr(9),''),chr(13),''))",
    revision_qty_control_code          "trim(replace(replace(:revision_qty_control_code,chr(9),''),chr(13),''))",
    reservable_type               "trim(replace(replace(:reservable_type,chr(9),''),chr(13),''))",
    check_shortages_flag               "trim(replace(replace(:check_shortages_flag,chr(9),''),chr(13),''))",
    shelf_life_code               "trim(replace(replace(replace(replace(:shelf_life_code,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    shelf_life_days               "trim(replace(replace(replace(replace(:shelf_life_days,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    lot_control_code               "trim(replace(replace(:lot_control_code,chr(9),''),chr(13),''))",
    auto_lot_alpha_prefix               "trim(replace(replace(:auto_lot_alpha_prefix,chr(9),''),chr(13),''))",
    start_auto_lot_number               "trim(replace(replace(:start_auto_lot_number,chr(9),''),chr(13),''))",
    negative_measurement_error          "trim(replace(replace(replace(replace(:negative_measurement_error,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    positive_measurement_error          "trim(replace(replace(replace(replace(:positive_measurement_error,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    serial_number_control_code          "trim(replace(replace(:serial_number_control_code,chr(9),''),chr(13),''))",
    auto_serial_alpha_prefix          "trim(replace(replace(:auto_serial_alpha_prefix,chr(9),''),chr(13),''))",
    start_auto_serial_number          "trim(replace(replace(:start_auto_serial_number,chr(9),''),chr(13),''))",
    location_control_code               "trim(replace(replace(:location_control_code,chr(9),''),chr(13),''))",
    restrict_subinventories_code          "trim(replace(replace(:restrict_subinventories_code,chr(9),''),chr(13),''))",
    restrict_locators_code               "trim(replace(replace(:restrict_locators_code,chr(9),''),chr(13),''))",
    bom_enabled_flag               "trim(replace(replace(:bom_enabled_flag,chr(9),''),chr(13),''))",
    costing_enabled_flag               "trim(replace(replace(:costing_enabled_flag,chr(9),''),chr(13),''))",
    inventory_asset_flag               "trim(replace(replace(:inventory_asset_flag,chr(9),''),chr(13),''))",
    default_include_in_rollup_flag          "trim(replace(replace(:default_include_in_rollup_flag,chr(9),''),chr(13),''))",
    cost_of_goods_sold_account          "trim(replace(replace(:cost_of_goods_sold_account,chr(9),''),chr(13),''))",
    std_lot_size                    "trim(replace(replace(replace(replace(:std_lot_size,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    sales_account                    "trim(replace(replace(:sales_account,chr(9),''),chr(13),''))",
    purchasing_item_flag               "trim(replace(replace(:purchasing_item_flag,chr(9),''),chr(13),''))",
    purchasing_enabled_flag          "trim(replace(replace(:purchasing_enabled_flag,chr(9),''),chr(13),''))",
    must_use_approved_vendor_flag          "trim(replace(replace(:must_use_approved_vendor_flag,chr(9),''),chr(13),''))",
    allow_item_desc_update_flag          "trim(replace(replace(:allow_item_desc_update_flag,chr(9),''),chr(13),''))",
    rfq_required_flag               "trim(replace(replace(:rfq_required_flag,chr(9),''),chr(13),''))",
    buyer_name                    "trim(replace(replace(:buyer_name,chr(9),''),chr(13),''))",
    list_price_per_unit               "trim(replace(replace(replace(replace(:list_price_per_unit,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    taxable_flag                    "trim(replace(replace(:taxable_flag,chr(9),''),chr(13),''))",
    purchasing_tax_code               "trim(replace(replace(:purchasing_tax_code,chr(9),''),chr(13),''))",
    receipt_required_flag               "trim(replace(replace(:receipt_required_flag,chr(9),''),chr(13),''))",
    inspection_required_flag          "trim(replace(replace(:inspection_required_flag,chr(9),''),chr(13),''))",
    price_tolerance_percent          "trim(replace(replace(replace(replace(:price_tolerance_percent,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    expense_account               "trim(replace(replace(:expense_account,chr(9),''),chr(13),''))",
    allow_substitute_receipts_flag          "trim(replace(replace(:allow_substitute_receipts_flag,chr(9),''),chr(13),''))",
    allow_unordered_receipts_flag          "trim(replace(replace(:allow_unordered_receipts_flag,chr(9),''),chr(13),''))",
    receiving_routing_code               "trim(replace(replace(:receiving_routing_code,chr(9),''),chr(13),''))",
    inventory_planning_code          "trim(replace(replace(:inventory_planning_code,chr(9),''),chr(13),''))",
    min_minmax_quantity               "trim(replace(replace(replace(replace(:min_minmax_quantity,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    max_minmax_quantity               "trim(replace(replace(replace(replace(:max_minmax_quantity,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    planning_make_buy_code               "trim(replace(replace(:planning_make_buy_code,chr(9),''),chr(13),''))",
    source_type                    "trim(replace(replace(:source_type,chr(9),''),chr(13),''))",
    mrp_safety_stock_code               "trim(replace(replace(:mrp_safety_stock_code,chr(9),''),chr(13),''))",
    material_cost                    "trim(replace(replace(:material_cost,chr(9),''),chr(13),''))",
    mrp_planning_code               "trim(replace(replace(:mrp_planning_code,chr(9),''),chr(13),''))",
    customer_order_enabled_flag          "trim(replace(replace(:customer_order_enabled_flag,chr(9),''),chr(13),''))",
    customer_order_flag               "trim(replace(replace(:customer_order_flag,chr(9),''),chr(13),''))",
    shippable_item_flag               "trim(replace(replace(:shippable_item_flag,chr(9),''),chr(13),''))",
    internal_order_flag               "trim(replace(replace(:internal_order_flag,chr(9),''),chr(13),''))",
    internal_order_enabled_flag          "trim(replace(replace(:internal_order_enabled_flag,chr(9),''),chr(13),''))",
    invoice_enabled_flag               "trim(replace(replace(:invoice_enabled_flag,chr(9),''),chr(13),''))",
    invoiceable_item_flag               "trim(replace(replace(:invoiceable_item_flag,chr(9),''),chr(13),''))",
    cross_ref_ean_code               "trim(replace(replace(:cross_ref_ean_code,chr(9),''),chr(13),''))",
    category_set_intrastat               "trim(replace(replace(:category_set_intrastat,chr(9),''),chr(13),''))",
    CustomCode                    "trim(replace(replace(:CustomCode,chr(9),''),chr(13),''))",
    net_weight                    "trim(replace(replace(replace(replace(:net_weight,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    production_speed               "trim(replace(replace(:production_speed,chr(9),''),chr(13),''))",
    LABEL                         "trim(replace(replace(:LABEL,chr(9),''),chr(13),''))",
    comment1_org_level               "trim(replace(replace(:comment1_org_level,chr(9),''),chr(13),''))",
    comment2_org_level               "trim(replace(replace(:comment2_org_level,chr(9),''),chr(13),''))",
    std_cost_price_scala               "to_number(trim(replace(replace(replace(replace(:std_cost_price_scala,chr(9),''),chr(32),''),chr(13),''),chr(10),'')))",
    supply_type                    "trim(replace(replace(:supply_type,chr(9),''),chr(13),''))",
    subinventory_code               "trim(replace(replace(:subinventory_code,chr(9),''),chr(13),''))",
    preprocessing_lead_time          "trim(replace(replace(replace(replace(:preprocessing_lead_time,chr(9),''),chr(32),''),chr(13),''),chr(10),''))",
    processing_lead_time                "trim(replace(replace(replace(replace(:processing_lead_time,chr(9),''),chr(32),''),chr(13),''),chr(10),''))",
    wip_supply_locator               "trim(replace(replace(:wip_supply_locator,chr(9),''),chr(13),''))"
    )
    Sample data from csv file.
    "9901-0001-35","390000","JMKL16 Pipe bend 16 mm","","JMKL16 Putkikaari 16 mm","","AI","FJE","Ea","","","","","","","","","","","","","","","","","","","","","","","","","21-21100-22200-00000-00000-00-00000-00000","0","21-11100-22110-00000-00000-00-00000-00000","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","0.1","Pull","AFTER PROD","","","Locator for Production"
    The load errors out especially on two columns:
    1) std_cost_price_scala
    2) list_price_per_unit
    Both are number columns.
    When data is provided in them, the rows error out; but if they hold null values, the records go through fine.
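    A more compact way to strip everything that is not part of a number before the conversion is REGEXP_REPLACE (available from Oracle 10g onwards); a sketch only, not tested against this exact file:
    std_cost_price_scala               "to_number(regexp_replace(:std_cost_price_scala,'[^0-9.-]',''))",
    list_price_per_unit               "to_number(regexp_replace(:list_price_per_unit,'[^0-9.-]',''))",
    The character class keeps only digits, the decimal point and a minus sign, so tabs (chr(9)), spaces (chr(32)), carriage returns (chr(13)) and line feeds (chr(10)) are all removed in one pass.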

  • How can we load LOB data using SQL*Loader?

    How can we load LOB data using SQL*Loader?

    Did you go through the SQL*Loader documentation?
    http://download-east.oracle.com/docs/cd/B19306_01/server.102/b14215/ldr_loading.htm#i1006803
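    The usual pattern for large LOBs is to keep each LOB in its own file and point SQL*Loader at it with a LOBFILE clause; a minimal sketch (table, columns and file names are made up for illustration):
    LOAD DATA
    INFILE 'doc_list.dat'
    INTO TABLE documents
    FIELDS TERMINATED BY ','
    ( doc_id    CHAR(10),
      fname     FILLER CHAR(200),
      doc_text  LOBFILE(fname) TERMINATED BY EOF
    )
    Each line of doc_list.dat then carries a key and a file path, e.g. 1,/tmp/chapter1.txt, and the whole file named in fname is loaded into the doc_text CLOB. Smaller LOBs can also be loaded inline from the primary data file; the documentation linked above covers both cases.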

  • Error in loading data using SQL loader

    I am getting an error like 'SQL*Loader-350: syntax error, illegal combination of non-alphanumeric characters' while loading a file using SQL*Loader on RHEL. The command used to run SQL*Loader is:
    Sqlldr userid=<username>/<password> control =data.ctl
    The control file, data.ctl is :
    LOAD data
    infile '/home/oraprod/data.txt'
    append  into table test
    {empid terminated by ',',
    fname terminated by ',',
    lname terminated by ',',
    salary terminated by whitespace}
    The data.txt file is:
    1,Kaushal,halani,5000
    2,Chetan,halani,1000
    I hope, my question is clear.
    Please revert with the reply to my query.
    Regards

    Replace "{" with "(" in your control file:
    LOAD data
    infile 'c:\data.txt'
    append  into table emp_t
    (empid terminated by ',',
    fname terminated by ',',
    lname terminated by ',',
    salary terminated by whitespace)
    C:\>sqlldr user/pwd@database control=c.ctl
    SQL*Loader: Release 10.2.0.3.0 - Production on Wed Nov 13 10:10:24 2013
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Commit point reached - logical record count 1
    Commit point reached - logical record count 2
    SQL> select * from emp_t;
         EMPID FNAME                LNAME                    SALARY
             1 Kaushal              halani                     5000
             2 Chetan               halani                     1000
    Best regards
    Mohamed Houri

  • ORA-01841 Error when value for date col is NULL in .dat (using SQL Loader)

    Hello Gurus,
    I have some data in a .dat file which needs to be loaded into an Oracle table. I am using SQL*Loader to do the job. "NULLIF col_name=BLANKS" works for character datatypes, but when the value for the date column is NULL I get an ORA-01841 error. I need the date column to load as NULL for all rows without a value in it.
    An early reply will be highly appreciated,
    Farooq

    Hi,
    Maybe the problem is not with the NULLIF; the value for the date column may not be in the proper date format.
    create table:
    create table kk (empno number, ename varchar2(20), deptno number, hiredate date)
    Control file:
    LOAD DATA
    INFILE 'd:\kk\empdata.dat'
    insert into TABLE kk ( empno position (1:2) integer external,
    ename position(4:5) char NULLIF ename=BLANKS,
    deptno position (7:8) integer external NULLIF deptno=BLANKS,
    hiredate position (10:20) date NULLIF hiredate=BLANKS)
    data file:
    10 KK 01-jan-2005
    20 10
    SELECT * FROM KK;
    EMPNO ENAME DEPTNO HIREDATE
    10 KK 01-JAN-05
    20 10
    Verify the data file.
    Hope it will help
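    The same idea works for delimited files; a sketch with an explicit date mask (assuming the dates in the file look like 01-JAN-2005; not taken from the original thread):
    LOAD DATA
    INFILE 'empdata.csv'
    APPEND
    INTO TABLE kk
    FIELDS TERMINATED BY ',' TRAILING NULLCOLS
    ( empno     INTEGER EXTERNAL,
      ename     CHAR NULLIF ename=BLANKS,
      deptno    INTEGER EXTERNAL NULLIF deptno=BLANKS,
      hiredate  DATE "DD-MON-YYYY" NULLIF hiredate=BLANKS
    )
    With NULLIF hiredate=BLANKS a blank date field loads as NULL instead of being pushed through the date mask, which is what raises ORA-01841.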

  • How to export & import data using SQL*Loader

    Hi all,
    How do I export & import data using SQL*Loader? Please give me clear steps.
    Thanks in Advance

    Hi, did you already export the data from SQL Server? If not: you cannot export data using SQL*Loader. SQL*Loader is only meant for importing data from flat files (usually text files) into Oracle tables.
    To import data into Oracle tables using SQL*Loader, use the steps below.
    1) Create a SQL*Loader control file.
    It looks like the following:
    LOAD DATA
    INFILE 'sample.dat'
    BADFILE 'sample.bad'
    DISCARDFILE 'sample.dsc'
    APPEND
    INTO TABLE emp
    TRAILING NULLCOLS
    or search the web for a sample control file script.
    2) At the command prompt, issue the following:
    $ sqlldr test/test control=<name of the control file you created earlier>
    and debug any errors (if they occur).
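    Putting the two steps together, a minimal but complete sketch (the emp table and its columns are just an example):
    -- sample.ctl
    LOAD DATA
    INFILE 'sample.dat'
    BADFILE 'sample.bad'
    DISCARDFILE 'sample.dsc'
    APPEND
    INTO TABLE emp
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (empno, ename, job, sal)
    and then:
    $ sqlldr test/test control=sample.ctl log=sample.log
    The log file reports how many rows were loaded and why any were rejected.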

  • Loading millions of rows using SQL*loader to a table with constraints

    I have a table with constraints and I need to load millions of rows in it using SQL*Loader.
    What is the best way to do this? That is, which SQL*Loader options should I use to get the best loading performance, and how should I deal with the constraints?
    Regards

    - Check whether your table has check constraints (such as NOT NULL columns).
    If you trust the data in the file you have to load, you can disable these constraints before the load and re-enable them after the loader finishes.
    - Check whether you can modify the table and place it in NOLOGGING mode (it generates less redo, but only under some conditions).
    Hope it helps
    Rui Madaleno
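    A rough sketch of how those two suggestions combine with a direct path load (the table, constraint and file names are hypothetical, and the exact options depend on your data and hardware):
    -- disable the constraints you trust the data against
    ALTER TABLE big_table DISABLE CONSTRAINT big_table_fk1;
    -- optionally reduce redo during the load
    ALTER TABLE big_table NOLOGGING;
    $ sqlldr scott/tiger control=big_load.ctl direct=true errors=1000 log=big_load.log
    -- restore logging and re-validate the constraint afterwards
    ALTER TABLE big_table LOGGING;
    ALTER TABLE big_table ENABLE VALIDATE CONSTRAINT big_table_fk1;
    Direct path (direct=true) bypasses most of the conventional INSERT overhead; re-enabling the constraints afterwards validates the loaded rows in one pass.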

  • Loading from text file using Sql Loader

    I need to load data from a text file into Oracle table. The file has long strings of text in the following format:
    12342||||||Lots and lots of text with all kinds of characters including
    ^&*!#%#^@ etc.xxxxxxxxxxxxxxxxxxxxxxx
    yyyyyyyyyyyyyyyyyyyyyyyyytrrrrrrrrrrrrrrrrrrr
    uuuuuuuuuuuuuuuuuuurtgggggggggggggggg.||||||||
    45356|||||||||||again lots and lots of text.uuuuuudccccccccccccccccccccd
    gyhjjjjjjjjjjjjjjjjjjjjjjjjkkkkkkkkkkkkklllllllllllnmmmmmmmmmmmmnaaa|||||||.
    There are pipes within the text as well. On the above example, the line starting with 12342 is an entire record that needs to be loaded into a CLOB column. The next record would be the 45356 one. Therefore, all records have a bunch of pipes at the end, so the only way to know where a new record starts is to see where the next number is after all the ending pipes. The only other thing I know is that there are a fixed number of pipes in each record.
    Does anyone have any ideas on how I can load the data into the table, either using SQL*Loader or any other utility? Any input would be greatly appreciated. Thanks.

    STFF: Sqlldr processing of records with embedded newline and delimiter - http://forums.oracle.com/forums/thread.jspa?messageID=1773678&#1764219
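    One technique worth exploring for records that span physical lines is overriding the default record terminator with the INFILE "str" clause; a rough sketch, under the assumption that every logical record ends with exactly eight pipes followed by a newline and that six pipes separate the id from the text (not tested):
    LOAD DATA
    INFILE 'books.txt' "str '||||||||\n'"
    TRUNCATE
    INTO TABLE book_text
    FIELDS TERMINATED BY '||||||' TRAILING NULLCOLS
    ( book_id    CHAR(10),
      book_body  CHAR(1000000)
    )
    Since the second sample record above ends with a different number of pipes, the terminator string (or a CONTINUEIF rule keyed on the leading number) would need adjusting to the real layout.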

  • Problem with loading data using SQL LOADER

    I have the following files with me. When I run the following command at the command prompt:
    sqlldr scott/tiger@genuat control =c:\emp.ctl
    it gives errors like:
    SQL*Loader-500: Unable to open file
    SQL*Loader-553: file not found
    emp.dat file data
    1111,sneha,CLERK     7902,17-Dec-80,800,20
    2222,manoj,SALESMAN,7698,20-Feb-72     ,1600,6500,30
    3333,sheela,MANAGER,7839,8-Apr-81,2975,20     
    emp.ctl file
    LOAD DATA
    INFILE 'c:\emp.dat'
    APPEND
    INTO TABLE emp
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (EMPNO,
    ENAME ,
    JOB,
    MGR,
    HIREDATE,
    SAL ,
    COMM,
    DEPTNO)
    Can anyone tell me what the problem in the above files is and why the data is not loaded into the table?

    I don't find any problem if you invoke SQL*Loader using the command below (and if you are certain that the control file resides on the C: drive).
    sqlldr scott/tiger@genuat control =c:\emp.ctl
    If this doesn't work, then invoke SQL*Loader from the C: prompt itself:
    sqlldr scott/tiger@genuat control=emp.ctl
    It should then locate the control file; check whether sqlldr completes successfully.

  • How to load time only no date using sql loader

    I want to load just the time portion from the data. The data looks like this:
    08/10/09 ,FZ10, AD2R, DFHMIRS , 14, 01:12:07.001230, 01:02:07.112354, TRANS PURGED / TIMED OUT ,
    control file:
    LOAD DATA
    APPEND
    INTO TABLE TRANS_ABENDS
    FIELDS TERMINATED BY ',' TRAILING NULLCOLS
    (
         TRANS_DATE           DATE "MM/DD/YYYY",
         TRANS_ID          ,
         ABEND_CODE          ,
         ABEND_PGM           ,
         ABEND_COUNT           ,
         RESPONSE_TIME      "to_date(:RESPONSE_TIME,'dd/mm/yy HH:MI:SS.ffffff')",
         CPU_TIME          "to_date(:CPU_TIME,'dd/mm/yy HH:MI:SS.ffffff')",
         ABEND_RESOLUTION     ,
         INSTANCE          CONSTANT 'EAST'
    )
    I can load the time using this control file, but it adds the first of the month as the date in the column. I want to load only the time value. I'd appreciate any help on this.

    I meant, how are these columns defined in the table that is being loaded?
    If these are defined as DATE, then as was pointed out Oracle will default the date portions if they are not provided in the formatted string.
    If these are defined as NUMBER (with some fractional portion), then you could use a combination of substr, multiplication and concatenation to derive the time as number in the form <seconds>.<fractional seconds>.
    You could also look into the data type INTERVAL DAY TO SECOND.
    SQL> create table trans_abends (
      2   trans_date date,
      3   trans_id char(4),
      4   abend_code char(4),
      5   abend_pgm char(8),
      6   abend_count number(4),
      7   response_time interval day (0) to second (6),
      8   cpu_time interval day (0) to second (6)
      9  )
    10  /
    Table created.
    SQL> insert into trans_abends values (sysdate, 'xxxx', 'xxxx', 'xxxxxxxx', 1, to_dsinterval('+0 01:12:07.001230'), to_dsinterval('+0 01:02:07.112354') )
      2  /
    1 row created.
    SQL> select response_time, cpu_time from trans_abends;
    RESPONSE_TIME          CPU_TIME
    +0 01:12:07.001230     +0 01:02:07.112354
    For the SQL*Loader control file, try
    "to_dsinterval('+0 '||:RESPONSE_TIME)"
    not tested.
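    Putting that expression into the control file for both interval columns would look roughly like this (same caveat: not tested):
         RESPONSE_TIME      "to_dsinterval('+0 ' || :RESPONSE_TIME)",
         CPU_TIME           "to_dsinterval('+0 ' || :CPU_TIME)",
    to_dsinterval() expects a day component, hence the '+0 ' prefix in front of the hh:mi:ss.ffffff value.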

  • Error while loading data using SQL*Loader

    Hi All,
    I am now in the process of loading data from MS SQL Server into an Oracle database.
    I get the data in Excel format and convert it into CSV.
    Up to the conversion everything works fine.
    In MS SQL Server, the table columns are case sensitive,
    so I created those tables in the Oracle DB the same way.
    There is one column "MaxNumber" which is of type float(49).
    The column is case sensitive.
    In the control file I first put:
    "MaxNumber" "TO_NUMBER(:MaxNumber,'99,999.99')"
    After executing SQL*Loader I got the error:
    SQL*Loader-466: Column MAXNUMBER does not exist in table TABLEONE.
    I changed the control file entry to:
    "MaxNumber" "TO_NUMBER(:"MaxNumber",'99,999.99')"
    After the execution I got the error:
    SQL*Loader-350: Syntax error at line 13.
    Expecting "," or ")", found "MaxDiscount".
    "MaxNumber" "TO_NUMBER(:"MaxNumber",'99,999.99')"
    Please guide me on this issue.
    Regards
    Salih KM

    What I'm saying is: verify the column name. Don't post if that is not possible.
    An example follows, with one table intentionally "hidden".
    SQL> create table "tEsT" ("MaxNumber" float, "MaxnumbeR" number);
    SQL> select table_name,column_name from user_tab_columns where table_name like 't%';
    TABLE_NAME                     COLUMN_NAME
    tEsT                           MaxNumber
    tEsT                           MaxnumbeR
    teST                           iD
    teST                           MaxNumber
    Hth,
    Fredrik

  • Help needed to load data using sql loader.

    Hi,
    I am trying to load data from an xls file into an Oracle table (Solaris OS) and it is failing to load the data.
    Control file:
    LOAD DATA
    CHARACTERSET UTF16
    BYTEORDER BIG ENDIAN
    INFILE cost.csv
    BADFILE consolidate.bad
    DISCARDFILE Sybase_inventory.dis
    INSERT
    INTO TABLE FIT_UNIX_NT_SERVER_COSTS
    FIELDS TERMINATED BY ','
    TRAILING NULLCOLS
    (HOST_NM,
    SERVICE_9071_DOLLAR DOUBLE,
    SERVICE_9310_DOLLAR DOUBLE,
    SERVICE_9700_DOLLAR DOUBLE,
    SERVICE_9701_DOLLAR DOUBLE,
    SERVICE_9710_DOLLAR DOUBLE,
    SERVICE_9711_DOLLAR DOUBLE,
    SERVICE_9712_DOLLAR DOUBLE,
    SERVICE_9713_DOLLAR DOUBLE,
    SERVICE_9720_DOLLAR DOUBLE,
    SERVICE_9721_DOLLAR DOUBLE,
    SERVICE_9730_DOLLAR DOUBLE,
    SERVICE_9731_DOLLAR DOUBLE,
    SERVICE_9750_DOLLAR DOUBLE,
    SERVICE_9751_DOLLAR DOUBLE,
    GRAND_TOTAL DOUBLE
    )
    Log file:
    Table FIT_UNIX_NT_SERVER_COSTS, loaded from every logical record.
    Insert option in effect for this table: INSERT
    TRAILING NULLCOLS option in effect
    Column Name Position Len Term Encl Datatype
    HOST_NM FIRST * , CHARACTER
    SERVICE_9071_DOLLAR NEXT 8 DOUBLE
    SERVICE_9310_DOLLAR NEXT 8 DOUBLE
    SERVICE_9700_DOLLAR NEXT 8 DOUBLE
    SERVICE_9701_DOLLAR NEXT 8 DOUBLE
    SERVICE_9710_DOLLAR NEXT 8 DOUBLE
    SERVICE_9711_DOLLAR NEXT 8 DOUBLE
    SERVICE_9712_DOLLAR NEXT 8 DOUBLE
    SERVICE_9713_DOLLAR NEXT 8 DOUBLE
    SERVICE_9720_DOLLAR NEXT 8 DOUBLE
    SERVICE_9721_DOLLAR NEXT 8 DOUBLE
    SERVICE_9730_DOLLAR NEXT 8 DOUBLE
    SERVICE_9731_DOLLAR NEXT 8 DOUBLE
    SERVICE_9750_DOLLAR NEXT 8 DOUBLE
    SERVICE_9751_DOLLAR NEXT 8 DOUBLE
    GRAND_TOTAL NEXT 8 DOUBLE
    Record 1: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column HOST_NM.
    Field in data file exceeds maximum length
    Table FIT_UNIX_NT_SERVER_COSTS:
    0 Rows successfully loaded.
    1 Row not loaded due to data errors.
    0 Rows not loaded because all WHEN clauses were failed.
    0 Rows not loaded because all fields were null.
    Please help me ASAP.
    Awaiting your reply.

    Hi,
    I verified and everything looks fine to me.
    Table structure:
    HOST_NM VARCHAR2(30)
    SERVICE_9071_DOLLAR NUMBER(8,2)
    SERVICE_9310_DOLLAR NUMBER(8,2)
    SERVICE_9700_DOLLAR NUMBER(8,2)
    SERVICE_9701_DOLLAR NUMBER(8,2)
    SERVICE_9710_DOLLAR NUMBER(8,2)
    SERVICE_9711_DOLLAR NUMBER(8,2)
    SERVICE_9712_DOLLAR NUMBER(8,2)
    SERVICE_9713_DOLLAR NUMBER(8,2)
    SERVICE_9720_DOLLAR NUMBER(8,2)
    SERVICE_9721_DOLLAR NUMBER(8,2)
    SERVICE_9730_DOLLAR NUMBER(8,2)
    SERVICE_9731_DOLLAR NUMBER(8,2)
    SERVICE_9750_DOLLAR NUMBER(8,2)
    SERVICE_9751_DOLLAR NUMBER(8,2)
    GRAND_TOTAL NUMBER(8,2)
    Control file:
    LOAD DATA
    BYTEORDER BIG ENDIAN
    INFILE cost.csv
    BADFILE consolidate.bad
    DISCARDFILE Sybase_inventory.dis
    INSERT
    INTO TABLE FIT_UNIX_NT_SERVER_COSTS
    FIELDS TERMINATED BY ','
    TRAILING NULLCOLS
    (HOST_NM,
    SERVICE_9071_DOLLAR NUMBER(8,2),
    SERVICE_9310_DOLLAR NUMBER(8,2),
    SERVICE_9700_DOLLAR NUMBER(8,2),
    SERVICE_9701_DOLLAR NUMBER(8,2),
    SERVICE_9710_DOLLAR NUMBER(8,2),
    SERVICE_9711_DOLLAR NUMBER(8,2),
    SERVICE_9712_DOLLAR NUMBER(8,2),
    SERVICE_9713_DOLLAR NUMBER(8,2),
    SERVICE_9720_DOLLAR NUMBER(8,2),
    SERVICE_9721_DOLLAR NUMBER(8,2),
    SERVICE_9730_DOLLAR NUMBER(8,2),
    SERVICE_9731_DOLLAR NUMBER(8,2),
    SERVICE_9750_DOLLAR NUMBER(8,2),
    SERVICE_9751_DOLLAR NUMBER(8,2),
    GRAND_TOTAL NUMBER(8,2)
    )
    Sample data file:
    ABOS12,122.46,,1315.00,,1400.00,,,,,,,,1855.62,,4693.07
    ABOS39,6391.16,,1315.00,,1400.00,,,,,,,,,4081.88,13188.04
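    Two things stand out for a plain comma-separated file like the sample rows above: CHARACTERSET UTF16 is unlikely to match a .csv export, and DOUBLE is SQL*Loader's binary 8-byte datatype rather than a text number. A hedged sketch of a delimited version that reads the numbers as text and lets Oracle convert them (not tested against the real file):
    LOAD DATA
    INFILE 'cost.csv'
    BADFILE 'consolidate.bad'
    DISCARDFILE 'Sybase_inventory.dis'
    INSERT
    INTO TABLE FIT_UNIX_NT_SERVER_COSTS
    FIELDS TERMINATED BY ','
    TRAILING NULLCOLS
    ( HOST_NM              CHAR(30),
      SERVICE_9071_DOLLAR  DECIMAL EXTERNAL,
      SERVICE_9310_DOLLAR  DECIMAL EXTERNAL,
      SERVICE_9700_DOLLAR  DECIMAL EXTERNAL,
      SERVICE_9701_DOLLAR  DECIMAL EXTERNAL,
      SERVICE_9710_DOLLAR  DECIMAL EXTERNAL,
      SERVICE_9711_DOLLAR  DECIMAL EXTERNAL,
      SERVICE_9712_DOLLAR  DECIMAL EXTERNAL,
      SERVICE_9713_DOLLAR  DECIMAL EXTERNAL,
      SERVICE_9720_DOLLAR  DECIMAL EXTERNAL,
      SERVICE_9721_DOLLAR  DECIMAL EXTERNAL,
      SERVICE_9730_DOLLAR  DECIMAL EXTERNAL,
      SERVICE_9731_DOLLAR  DECIMAL EXTERNAL,
      SERVICE_9750_DOLLAR  DECIMAL EXTERNAL,
      SERVICE_9751_DOLLAR  DECIMAL EXTERNAL,
      GRAND_TOTAL          DECIMAL EXTERNAL
    )
    DECIMAL EXTERNAL reads the characters between the commas and converts them, and an empty field simply loads as NULL.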

  • Not loading from flat file using SQL*Loader

    Hi,
    I am trying to load from an Excel file.
    First I converted the Excel file into a CSV file and saved it as a .dat file.
    In the Excel file one column is salary and the data looks like $100,000.
    While converting from xls to csv the salary is changed to "$100,000 " (with quotes and a space after the amount);
    a space is added after the last digit.
    In the SQL*Loader control file I had given:
    salary "to_number(:salary, 'L999,999')"
    My problem is the space after the salary in the .dat file: "$100,000 ".
    What changes do I have to make to the to_number function in the control file?
    Please guide me.
    Thanks & Regards
    Salih KM

    Thanks a lot Jens Petersen,
    it is loading now.
    MI means minute,
    am I correct?
    But I didn't get the logic behind that.
    Can you please explain it?
    Thanks & Regards
    Salih KM
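    For reference: in an Oracle number format model MI is not 'minute'; it marks a trailing sign position, and for a positive value it matches a trailing blank, which is exactly the stray character the CSV conversion leaves behind. The working entry was therefore presumably something along these lines (assuming NLS_CURRENCY is '$' so that L matches the currency symbol):
    salary  "to_number(:salary, 'L999,999MI')"
    For a positive value TO_CHAR with an MI mask appends a blank instead of a minus sign, and TO_NUMBER with the same mask accepts that trailing blank going the other way.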
