SQL Loader XML Missing Data

I am trying to load an XML file via SQL*Loader. My loader file is as follows:
LOAD DATA
INFILE 'C:\Infile.xml' "str '</Record>'"
BADFILE 'C:\Infile.bad'
REPLACE
INTO TABLE Cust_Table
TRAILING NULLCOLS
(dummy filler terminated by ">",
Month enclosed by "<Mo>" and "</Mo>",
LOC_ID enclosed by "<LOC_ID>" and "</LOC_ID>",
LOC_DESC enclosed by "<LOC_DESC>" and "</LOC_DESC>",
Cust_ID enclosed by "<Cust_ID>" and "</Cust_ID>",
Cust_Name enclosed by "<Cust_Name>" and "</Cust_Name>" )
The problem I'm having is that for some records the Cust_ID and Cust_Name are missing from the file. The error I'm getting when trying to load the file for those records is:
Rejected - Error on table Cust_Table, column Cust_Id.
Initial enclosure character not found
Not sure what I need to add to the loader file to resolve this error.
TIA,
Todd

I would suggest you use external tables, since they are more flexible than
SQL*Loader and are a better option.
To use external tables:
1) You will have to create a directory first.
2) Creation of the directory is generally done by SYS, so after creating the directory,
grant READ and WRITE privileges on it to the user.
3) Create the external table.
4) Now use the table like a normal table as the source for insert, update, and delete on
your target table.
You can get more information from
http://www.oracle-base.com/articles/9i/SQLNewFeatures9i.php#ExternalTables
CREATE DIRECTORY <directory_name> AS '<directory path where the file will be present>';
GRANT READ, WRITE ON DIRECTORY <directory_name> TO <username>;
CREATE TABLE <table_name>
(<column definitions>)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY <directory_name>
  ACCESS PARAMETERS
  ( RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ',' )
  LOCATION ('<filename>')
)
PARALLEL 5
REJECT LIMIT 200;
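For illustration, here is the same template filled in with hypothetical names (the directory path, table, and columns are assumptions, and it reads a comma-separated file rather than raw XML, so the XML data would first have to be flattened to delimited text):
CREATE DIRECTORY ext_data_dir AS 'C:\data';
GRANT READ, WRITE ON DIRECTORY ext_data_dir TO scott;
CREATE TABLE cust_ext
( loc_id    VARCHAR2(20),
  loc_desc  VARCHAR2(100),
  cust_id   VARCHAR2(20),
  cust_name VARCHAR2(100)
)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY ext_data_dir
  ACCESS PARAMETERS
  ( RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL   -- missing trailing fields load as NULL instead of rejecting the row
  )
  LOCATION ('infile.csv')
)
REJECT LIMIT UNLIMITED;
-- The external table can then be used as the source for the real table:
INSERT INTO cust_table SELECT loc_id, loc_desc, cust_id, cust_name FROM cust_ext;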
Hope this helps.

Similar Messages

  • OWB11gR2 - simple and easy way to load XML formatted data into db tables?

    Hi,
    we're currently trying to load table data stored in XML files into our datawarehouse using OWB 11gR2.
    However, we're finding this is not quite as trivial as loading flat files...
    Most postings on this forum point to the blog entry titled "Leveraging XDB", found here (http://blogs.oracle.com/warehousebuilder/2007/09/leveraging_xdb.html).
    This blog also references the zip-file owb_xml_etl_utils.zip, which seems to have disappeared from its original location and can now be found on SourceForge.
    Anyway, the solution described is for OWB 10g, and when trying to import the experts from the zip-file etc. we end up not being able to run the "Create ETL from XSD" expert, as the 11gR2 client is different from the 10g client and does not have the Experts menu, among other things.
    Also, this solution was published over 3 years ago, and it seems rather strange that importing XML-formatted data should still be so cumbersome in the newer warehouse builder releases.
    The OWB 11gR2 documentation is very sparse (or rather - quite empty) on how to load XML data, all it has is a few lines on "XML Transformations", giving no clue as to how one goes about loading data.
    Is this really the state of things? Or are we missing some vital information here?
    We'd have thought that with 11g-releases, loading XML-data would be rather simple, quick and painless?
    Is there somewhere besides the blog mentioned above where we can find simple and to the point guidelines for OWB 11gR2 on how to load XML-formatted data into Oracle tables?
    Regards,
    -Haakon-

    Yes it is possible to use SQL*Loader to parse and load XML, but that is not what it was designed for and so is not recommended. You also don't need to register a schema just to load/store/parse XML in the DB.
    So where does that leave you?
    Some options
    {thread:id=410714} (see page 2)
    {thread:id=1090681}
    {thread:id=1070213}
    Those talk some about storage options and reading in XML from disk and parsing XML. They should also give you options to consider. Without knowing more about your requirements for the effort, it is difficult to give specific advice. Maybe your 7-8 tables don't exist and so using Object Relational Storage for the XML would be the best solution as you can query/update tables that Oracle creates based off the schema associated to the XML. Maybe an External Table definition works better for reading the XML into the system because this process will happen just once. Maybe using WebDAV makes more sense for loading XML to be parsed (I don't have much experience with this, just know it is possible from what I've read on the forums). Also, your version makes a difference as you have different options available depending upon the version of Oracle.
    Hope all that helps as a starter.
    Edited by: A_Non on Jul 8, 2010 4:31 PM
    A great example, see the answers by mdrake in {thread:id=1096784}

  • How to skip specific field data in a datafile when loading with Oracle 8i SQL*Loader

    Product: ORACLE SERVER
    Date written: 2002-04-09
    How to skip specific field data in a datafile when loading with Oracle 8i SQL*Loader
    ===========================================================================
    As shown in the example below, when variable-length fields are separated by delimiters such as ',' or '|', a field can be marked with the 'FILLER' keyword, available from Oracle 8i onwards, so that it is skipped on insert.
    <Example>
    TABLE : skiptab
    ===========================
         col1 varchar2(20)
         col2 varchar2(20)
         col3 varchar2(20)
    CONTROL FILE: skip.ctl
    load data
    infile skip.dat
    into table skiptab
    fields terminated by ","
    (col1 char,
    col2 filler char,
    col3 char)
    DATAFILE : skip.dat
    SMITH, DALLAS, RESEARCH
    ALLEN, CHICAGO, SALES
    WARD, CHICAGO, SALES
    data loading :
    $sqlldr scott/tiger control=skip.ctl
    Result:
    COL1 COL3
    SMITH RESEARCH
    ALLEN SALES
    WARD SALES

  • Load XML Videos data on tilelist(component) click

    Is it possible to load XML video data on a TileList (component) click?

    yes. works for me:
    import fl.video.FLVPlayback;
    var flv_pb1:FLVPlayback = new FLVPlayback();
    var flv_pb2:FLVPlayback = new FLVPlayback();
    var flv_pb3:FLVPlayback = new FLVPlayback();
    flv_pb1.source="z_flvs/water.flv";
    flv_pb2.source="z_flvs/sample.mp4";
    flv_pb3.source="z_flvs/water.flv";
    tl.addItem({alt:"v 1", src: flv_pb1});
    tl.addItem({alt:"v 2", src: flv_pb2});
    tl.addItem({alt:"v 3", src: flv_pb3});
    tl.labelField = "alt";
    tl.sourceField = "src";
    tl.columnWidth = 400;
    tl.rowHeight = 600;
    tl.columnCount = tl.length;
    tl.rowCount = 1;
    tl.move(10, 10);

  • Most efficient way to load XML file data into tables

    I have a complex XML file running into MBs. I want to load its data into 7-8 tables.
    Which way will be better:
    1) Use SQL*Loader to load directly into the 7-8 tables by modifying the control card.
    Is this really possible and feasible? I am not even sure about it.
    2) Load data as XML Type in a table and register it. Then extract from there to load into various tables.
    Please help. I have to find the most efficient way of doing it.
    Regards,
    Sudhir

    Yes it is possible to use SQL*Loader to parse and load XML, but that is not what it was designed for and so is not recommended. You also don't need to register a schema just to load/store/parse XML in the DB.
    So where does that leave you?
    Some options
    {thread:id=410714} (see page 2)
    {thread:id=1090681}
    {thread:id=1070213}
    Those talk some about storage options and reading in XML from disk and parsing XML. They should also give you options to consider. Without knowing more about your requirements for the effort, it is difficult to give specific advice. Maybe your 7-8 tables don't exist and so using Object Relational Storage for the XML would be the best solution as you can query/update tables that Oracle creates based off the schema associated to the XML. Maybe an External Table definition works better for reading the XML into the system because this process will happen just once. Maybe using WebDAV makes more sense for loading XML to be parsed (I don't have much experience with this, just know it is possible from what I've read on the forums). Also, your version makes a difference as you have different options available depending upon the version of Oracle.
    Hope all that helps as a starter.
    Edited by: A_Non on Jul 8, 2010 4:31 PM
    A great example, see the answers by mdrake in {thread:id=1096784}
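    As a concrete illustration of the XMLType route (a sketch only; the table, directory, file, and element names are hypothetical, and it assumes the document is small enough to read in one piece), one pattern is to build an XMLTYPE from the file and shred it with XMLTABLE, one SELECT per target table:
    INSERT INTO cust_table (cust_id, cust_name)
    SELECT x.cust_id, x.cust_name
    FROM   XMLTABLE('/Records/Record'
             PASSING XMLTYPE(BFILENAME('XML_DIR', 'infile.xml'), NLS_CHARSET_ID('AL32UTF8'))
             COLUMNS
               cust_id   VARCHAR2(20)  PATH 'Cust_ID',
               cust_name VARCHAR2(100) PATH 'Cust_Name') x;
    A similar INSERT ... SELECT would be written for each of the 7-8 target tables.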

  • Load xml file data in oracle 9i table

    Hi,
    I have Oracle 9i R2 (9.2.0.8) and currently I am loading data using SQL*Loader, as I am getting a text file. I will be getting the file in XML format shortly.
    I would like to know if there is a SQL*Loader way to load an XML file, or another easy method that I can use.
    If you can point me to the simplest way, I will really appreciate it.
    Thanks,

    Please refer
    http://download-uk.oracle.com/docs/cd/B12037_01/appdev.101/b10790/xdb25loa.htm
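    One approach shown in that documentation is to load each XML file as a LOB into an XMLTYPE (or CLOB) column. A minimal sketch, assuming a staging table xml_docs and a list file with one XML file name per record (all names here are hypothetical):
    LOAD DATA
    INFILE 'filelist.dat'
    APPEND
    INTO TABLE xml_docs
    FIELDS TERMINATED BY ','
    ( fname   FILLER CHAR(250),                 -- each record of filelist.dat is the path of one XML file
      xml_doc LOBFILE(fname) TERMINATED BY EOF  -- the whole named file becomes the column value
    )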

  • Problem with SQL*Loader and different date formats in the same file

    DB: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bi
    System: AIX 5.3.0.0
    Hello,
    I'm using SQL*Loader to import semi-colon separated values into a table. The files are delivered to us by a data provider who concatenates data from different sources and this results in us having different date formats within the same file. For example:
    ...;2010-12-31;22/11/1932;...
    I load this data using the following lines in the control file:
    EXECUTIONDATE1     TIMESTAMP     NULLIF EXECUTIONDATE1=BLANKS     "TO_DATE(:EXECUTIONDATE1, 'YYYY-MM-DD')",
    DELDOB          TIMESTAMP     NULLIF DELDOB=BLANKS          "TO_DATE(:DELDOB, 'DD/MM/YYYY')",
    The relevant NLS parameters:
    NLS_LANGUAGE=FRENCH
    NLS_DATE_FORMAT=DD/MM/RR
    NLS_DATE_LANGUAGE=FRENCH
    If I load this file as is, the values loaded into the table are 31 dec 2010 and 22 nov *2032*, even though the years are given as 4 digits. If I change the NLS_DATE_FORMAT to DD/MM/YYYY then the second date value will be loaded correctly, but the first value will be loaded as 31 dec *2020* !!
    How can I get both date values to load correctly?
    Thanks!
    Sylvain

    This is very strange, after running a few tests I realized that if the year is 19XX then it will get loaded as 2019, and if it is 20XX then it will be 2020. I'm guessing it may have something to do with certain env variables that aren't set up properly because I'm fairly sure my SQL*Loader control file is correct... I'll run more tests :-(
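    One direction worth testing (a sketch, not a verified fix for this environment): give each field its own date mask directly in the control file, so the conversion no longer depends on NLS_DATE_FORMAT at all, e.g.:
    EXECUTIONDATE1  DATE "YYYY-MM-DD"  NULLIF EXECUTIONDATE1=BLANKS,
    DELDOB          DATE "DD/MM/YYYY"  NULLIF DELDOB=BLANKS,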

  • SQL*Loader and binary data

    I have a C routine that builds SQL*Loader input files. The input files contain multiple records, with a couple of integer columns and a raw(1400) field. The control file specifies a record separator of '|', which seems weird (having text in the middle of raw data) but is also somewhat co-operative (see below).
    So I basically write each integer to the file, then a short (2-byte) length value for the raw field, then the raw field, then the '|' separator.
    I've noticed that if the size of the raw field is 400 bytes or less, everything works fine; I get the correct number of records in the database.
    Unfortunately, with a size of 401 or more, SQL*Loader parses the thing into twice as many records as it should. So if I've written 3 records to my input data file, with each record's raw field at 400 bytes or less, I get 3 records loaded; but for any with a raw field of 401+, I get two records for each.
    Any ideas why, and how to correct this? Also, any ideas on a better way to do this? All the examples of large data in the online doc and the O'Reilly book favor examples with large character data, which does not do me much good.
    TIA. John

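    For what it's worth, SQL*Loader has a VARRAW field type for exactly this layout: a 2-byte binary length subfield followed by the raw data, so no text separator has to be embedded in the binary stream. A rough sketch with hypothetical names, assuming the two integer columns are 4-byte binary integers and each physical record is prefixed with a 5-character record length instead of ending in '|':
    LOAD DATA
    INFILE 'rawdata.dat' "var 5"      -- each record starts with a 5-character length field
    INTO TABLE raw_stage
    ( col1    INTEGER(4),             -- 4-byte binary integer (use INTEGER EXTERNAL if written as text)
      col2    INTEGER(4),
      payload VARRAW(1400)            -- 2-byte binary length subfield, then up to 1400 bytes of raw data
    )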

  • Sql loader and bulk data

    hi,
    I want to insert 100,000 records daily into a table for the first month, and then in the next month these records are going to be replaced by new, updated records.
    There might be a few additions and deletions to the previous records as well.
    Actually it is consumer data, so there might be a few consumers who have withdrawn from the utility, and there will be some more consumers added to the database.
    But almost 99% of the previous month's data has to be updated/replaced with the fresh month's data.
    For instance, what I have in mind is that I will use SQL*Loader to load the data for the first month, and then I will delete the previous data using SQL*Plus and load the fresh month's data using SQL*Loader again.
    1. Is this OK, or is there some better solution to this?
    2. I have heard of external files; are they feasible in my scenario?
    3. I have planned that I will make scripts for SQL*Plus and SQL*Loader and use them in batch files (OS Windows 2003 Server, Oracle 9i database). Is there some better choice to make the whole procedure automatic?
    Looking forward to your suggestions,
    nadeem ameer

    I would suggest you use external tables, since they are more flexible than
    SQL*Loader and are a better option.
    To use external tables:
    1) You will have to create a directory first.
    2) Creation of the directory is generally done by SYS, so after creating the directory,
    grant READ and WRITE privileges on it to the user.
    3) Create the external table.
    4) Now use the table like a normal table as the source for insert, update, and delete on
    your target table.
    You can get more information from
    http://www.oracle-base.com/articles/9i/SQLNewFeatures9i.php#ExternalTables
    CREATE DIRECTORY <directory_name> AS '<directory path where the file will be present>';
    GRANT READ, WRITE ON DIRECTORY <directory_name> TO <username>;
    CREATE TABLE <table_name>
    (<column definitions>)
    ORGANIZATION EXTERNAL
    ( TYPE ORACLE_LOADER
      DEFAULT DIRECTORY <directory_name>
      ACCESS PARAMETERS
      ( RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ',' )
      LOCATION ('<filename>')
    )
    PARALLEL 5
    REJECT LIMIT 200;
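    To sketch how this fits the monthly refresh described above (the table names consumer_ext and consumer_data are hypothetical): point the external table at each new month's file, then truncate and reload the permanent table from it:
    -- consumer_ext is the external table over the new month's file,
    -- consumer_data is the permanent table being refreshed
    TRUNCATE TABLE consumer_data;
    INSERT /*+ APPEND */ INTO consumer_data
    SELECT * FROM consumer_ext;
    COMMIT;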
    Hope this helps.

  • SQL Loader - Field in data file exceeds maximum length

    Dear All,
    I have a file which has more than 4000 characters in a field, and I wish to load the data into a table with field length = 4000, but I receive the error:
    Field in data file exceeds maximum length
    Below are the scripts and the ctl file.
    Table creation script:
    CREATE TABLE "TEST_TAB"
        "STR"  VARCHAR2(4000 BYTE),
        "STR2" VARCHAR2(4000 BYTE),
        "STR3" VARCHAR2(4000 BYTE)
      );Control file:
    LOAD DATA
    INFILE 'C:\table_export.txt'
    APPEND INTO TABLE TEST_TAB
    FIELDS TERMINATED BY '|'
    TRAILING NULLCOLS
    ( STR CHAR(4000) "SUBSTR(:STR,1,4000)" ,
    STR2 CHAR(4000) "SUBSTR(:STR2,1,4000)" ,
    STR3 CHAR(4000) "SUBSTR(:STR3,1,4000)"
    )
    Log:
    SQL*Loader: Release 10.2.0.1.0 - Production on Mon Jul 26 16:06:25 2010
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Control File:   C:\TEST_TAB.CTL
    Data File:      C:\table_export.txt
      Bad File:     C:\TEST_TAB.BAD
      Discard File:  none specified
    (Allow all discards)
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 0
    Bind array:     64 rows, maximum of 256000 bytes
    Continuation:    none specified
    Path used:      Conventional
    Table TEST_TAB, loaded from every logical record.
    Insert option in effect for this table: APPEND
    TRAILING NULLCOLS option in effect
       Column Name                  Position   Len  Term Encl Datatype
    STR                                 FIRST  4000   |       CHARACTER           
        SQL string for column : "SUBSTR(:STR,1,4000)"
    STR2                                 NEXT  4000   |       CHARACTER           
        SQL string for column : "SUBSTR(:STR2,1,4000)"
    STR3                                 NEXT  4000   |       CHARACTER           
        SQL string for column : "SUBSTR(:STR3,1,4000)"
    value used for ROWS parameter changed from 64 to 21
    Record 1: Rejected - Error on table TEST_TAB, column STR.
    Field in data file exceeds maximum length
    MAXIMUM ERROR COUNT EXCEEDED - Above statistics reflect partial run.
    Table TEST_TAB:
      0 Rows successfully loaded.
      1 Row not loaded due to data errors.
      0 Rows not loaded because all WHEN clauses were failed.
      0 Rows not loaded because all fields were null.
    Space allocated for bind array:                 252126 bytes(21 rows)
    Read   buffer bytes: 1048576
    Total logical records skipped:          0
    Total logical records read:             1
    Total logical records rejected:         1
    Total logical records discarded:        0
    Run began on Mon Jul 26 16:06:25 2010
    Run ended on Mon Jul 26 16:06:25 2010
    Elapsed time was:     00:00:00.22
    CPU time was:         00:00:00.15
    Please suggest a way to get it done.
    Thanks for reading the post!
    *009*

    Hi Toni,
    Thanks for the reply.
    Do you mean this?
    CREATE TABLE "TEST"."TEST_TAB"
        "STR"  VARCHAR2(4001),
        "STR2" VARCHAR2(4001),
        "STR3" VARCHAR2(4001)
      );However this does not work as the error would be:
    Error at Command Line:8 Column:20
    Error report:
    SQL Error: ORA-00910: specified length too long for its datatype
    00910. 00000 -  "specified length too long for its datatype"
    *Cause:    for datatypes CHAR and RAW, the length specified was > 2000;
               otherwise, the length specified was > 4000.
    *Action:   use a shorter length or switch to a datatype permitting a
               longer length such as a VARCHAR2, LONG CHAR, or LONG RAW
    *009*
    Edited by: 009 on Jul 28, 2010 6:15 AM
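    For reference, the workaround most often suggested for this particular error is to declare the SQL*Loader field buffer larger than the incoming data, because the buffer defaults to the declared CHAR size and the error is raised before SUBSTR ever runs. A sketch, assuming the overlong values are single-byte text:
    ( STR  CHAR(10000) "SUBSTR(:STR, 1, 4000)",
      STR2 CHAR(10000) "SUBSTR(:STR2, 1, 4000)",
      STR3 CHAR(10000) "SUBSTR(:STR3, 1, 4000)"
    )
    If the data is multibyte, 4000 characters can still exceed the 4000-byte column limit, in which case SUBSTRB or a CLOB column would be needed.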

  • SQL LOADER Problem when data is loaded but not come in standard formate

    Hi guys,
    I have a problem: when the SQL*Loader run completes, the data is loaded successfully into the table, but the UOM data does not come in a standard format.
    The UOM column contains the unit-of-measure data, but in my Excel sheet it looks like:
    EXCEL SHEET DATA:
    1541GAFB07080          0     Metres
    1541GAFE10040          109.6     Metres
    1541GAFE10050          594.2     Metres
    1541GAFE10070          126.26     Metres
    1541GAFE14040          6.12     Metres
    1541GAFE14050          0     Metres
    1541SAFA05210          0     Metres
    1541SAFA07100          0     Metres
    1551EKDA05210          0     Nos
    1551EKDA07100          0     Nos
    1551EKDA07120          0     Nos
    1551EKDA07140          0     Nos
    1551EKDA07200          0     Nos.
    1551EKDA08160          0     Nos.
    1551EKDA08180          0     Nos.
    1551EKDA08200          0     Nos.
    1551EKDA10080          41     Nos.
    1551EKDA10140          85     Nos.
    .ctl file :
    OPTIONS (silent=(header,feedback,discards))
    LOAD DATA
    INFILE *
    APPEND
    INTO TABLE XXPL_PO_REQUISITION_STG
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY'"'
    TRAILING NULLCOLS
    ( ITEM_CODE CHAR,
    ITEM_DESCRIPTION CHAR "TRIM(:ITEM_DESCRIPTION)",
    QUANTITY,
    UOM,
    NEED_BY_DATE,
    PROJECT,
    TASK_NAME,
    BUYER,
    REQ_TYPE,
    STATUS,
    ORGANIZATION_CODE,
    LOCATION,
    SUBINVENTORY,
    LINE_NO,
    REQ_NUMBER,
    LOADED_FLAG CONSTANT 'N',
    SERIAL_NO "XXPL_PRREQ_SEQ.NEXTVAL",
    CREATED_BY,
    CREATION_DATE SYSDATE,
    LAST_UPDATED_BY,
    LAST_UPDATED_DATE,
    LAST_UPDATED_LOGIN
    )
    Some output came into the table like:
    W541WDCA05260 0 Metres|
    W541WDCA05290 3 Metres|
    W541WDCA05264 4 Metres|
    W541WDCA05280 8 Metres|
    1551EADA04240 0 Nos|
    1551EADA07100 0 Nos|
    1551EKDA10080 0 Nos.|
    1551EKDA10080 41 Nos.|
    The problem is the '|' delimiter... how do I remove the '|' from my table when the SQL*Loader program is running? Where do I need to change this, in the .ctl file or the Excel file? It's urgent guys, please help me.
    thanks

    Hi,
    How are you extracting the data to the Excel sheet?
    Please check the format type of the UOM column in the Excel sheet.
    There is no issue in the SQL*Loader control file; the issue is in your source Excel file. (Try using a different method to extract the data to the Excel sheet.)
    Regards,
    Yuvaraj.C
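    If fixing the Excel export is not practical, a possible stopgap (a suggestion separate from the reply above) is to strip the stray pipe in the control file itself with a SQL string on the UOM field:
    UOM CHAR "REPLACE(TRIM(:UOM), '|', '')",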

  • Sql loader error with date format

    Hi everyone,
    I have a table with data in one column, RECORDED_DATE, arriving like '20090224', and my client is asking me to load this column's data in 'yyyymmdd' format. I am stuck for ideas. I used to_date('20090124','yyyymmdd') in the control file, but it is also not working. Here is my control file:
    LOAD DATA
    INFILE 'C:\xxxx\SQLLDR\HE data\HE_data_Feb.txt'
    BADFILE 'C:\xxxx\SQLLDR\HE data.bad'
    DISCARDFILE 'C:\xxxx\SQLLDR\HE data.dsc'
    INTO TABLE LSCCMGR.FASTPAY_HE_DATA
    REPLACE
    fields terminated by X'09'
    TRAILING NULLCOLS
    (RECORDED_DATE "TO_DATE(:RECORDED_DATE,'mm/dd/yyyy')"
    AGENT_ID
    MEASURE
    TRANSACTIONS
    FEES
    If i excute like this i am getting the error like
    Record 1: Rejected - Error on table LSCCMGR.FASTPAY_HE_DATA, column RECORDED_DATE.
    ORA-01843: not a valid month
    I am getting this for all records. What do I need to change to get RECORDED_DATE in date format? Please, anyone, help me resolve this issue.
    How can we perform this using SQL*Loader? Please let me know. Thanks in advance.
    Sravan

    Hi,
    &gt;&gt;(RECORDED_DATE "TO_DATE(:RECORDED_DATE,'mm/dd/yyyy')"
    *Change this line to*
    (RECORDED_DATE "TO_DATE(:RECORDED_DATE,'yyyymmdd')"Regards,

  • SQL Loader Problem with Date Format

    Dear all,
    I am dealing with a problem in loading data with SQL Loader. The problem is in the date format.
    More specifically, I created the following Control File:
    file.ctl
    LOAD DATA
    INFILE 'D:\gbal\chatium.log'
    APPEND INTO TABLE CHAT_SL
    FIELDS TERMINATED BY WHITESPACE
    TRAILING NULLCOLS
    (SL1 DATE "Mon DD, YYYY HH:MI:SS FF3AM",
    SL2 char,
    SL3 DATE "Mon DD, YYYY HH:MI:SS FF3AM",
    SL4 char,
    SL5 char,
    SL6 char,
    SL7 char,
    SL8 char,
    SL9 char,
    SL10 char,
    SL11 char,
    SL12 char,
    SL13 char,
    SL14 char,
    SL15 char)
    The data we want to load are in the following file:
    Apr 29, 2007 12:05:49 AM 1060615 Apr 29, 2007 12:05:35 AM 306978537730 24026384 chatium.user.userinfo WAP 0
    Apr 29, 2007 12:12:51 AM 1061251 Apr 29, 2007 12:12:27 AM 306978537730 24026384 chatium.channel.list WAP 0
    Apr 29, 2007 12:12:51 AM 1061264 Apr 29, 2007 12:12:32 AM 306978537730 24026384 chatium.channel.listdetail WAP 0
    Apr 29, 2007 12:13:51 AM 1061321 Apr 29, 2007 12:13:31 AM 306978537730 24026384 chatium.user.search WAP 0
    Apr 29, 2007 12:13:51 AM 1061330 Apr 29, 2007 12:13:37 AM 306978537730 24026384 chatium.user.userinfo WAP 0
    The error log file is the following:
    SQL*Loader: Release 9.2.0.1.0 - Production on Mon Apr 30 11:29:16 2007
    Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
    Control File: file.ctl
    Data File: D:\gbal\chatium.log
    Bad File: chatium.bad
    Discard File: none specified
    (Allow all discards)
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 50
    Bind array: 64 rows, maximum of 256000 bytes
    Continuation: none specified
    Path used: Conventional
    Table CHAT_SL, loaded from every logical record.
    Insert option in effect for this table: APPEND
    TRAILING NULLCOLS option in effect
    Column Name Position Len Term Encl Datatype
    SL1 FIRST * WHT DATE MonDD,YYYYHH:MI:SS
    SL2 NEXT * WHT CHARACTER
    SL3 NEXT * WHT CHARACTER
    SL4 NEXT * WHT CHARACTER
    SL5 NEXT * WHT CHARACTER
    SL6 NEXT * WHT CHARACTER
    SL7 NEXT * WHT CHARACTER
    SL8 NEXT * WHT CHARACTER
    SL9 NEXT * WHT CHARACTER
    SL10 NEXT * WHT CHARACTER
    SL11 NEXT * WHT CHARACTER
    SL12 NEXT * WHT CHARACTER
    SL13 NEXT * WHT CHARACTER
    SL14 NEXT * WHT CHARACTER
    SL15 NEXT * WHT CHARACTER
    Record 1: Rejected - Error on table CHAT_SL, column SL1.
    ORA-01840: input value not long enough for date format
    Record 2: Rejected - Error on table CHAT_SL, column SL1.
    ORA-01840: input value not long enough for date format
    Record 3: Rejected - Error on table CHAT_SL, column SL1.
    ORA-01840: input value not long enough for date format
    Record 4: Rejected - Error on table CHAT_SL, column SL1.
    ORA-01840: input value not long enough for date format
    I wonder if you could help me.
    Thank you very much in advance.
    Giorgos Baliotis

    SQL> select to_date('Apr 29, 2007 12:05:49 AM','Mon DD, YYYY HH:MI:SS FF3AM') from dual;
    select to_date('Apr 29, 2007 12:05:49 AM','Mon DD, YYYY HH:MI:SS FF3AM') from dual
    ERROR at line 1:
    ORA-01821: date format not recognized
    SQL> ed
    Wrote file afiedt.buf
      1* select to_date('Apr 29, 2007 12:05:49 AM','Mon DD, YYYY HH:MI:SS AM') from dual
    SQL> /
    TO_DATE(
    29/04/07
    SQL>
    Then, you defined blank space as the separator, but there are spaces in your dates inside your file. So you should add double quotes around the date fields like below, and add OPTIONALLY ENCLOSED BY '"' to your ctl file.
    "Apr 29, 2007 12:05:49 AM" 1060615 "Apr 29, 2007 12:05:35 AM" 306978537730 24026384 chatium.user.userinfo WAP 0
    "Apr 29, 2007 12:12:51 AM" 1061251 "Apr 29, 2007 12:12:27 AM" 306978537730 24026384 chatium.channel.list WAP 0
    "Apr 29, 2007 12:12:51 AM" 1061264 "Apr 29, 2007 12:12:32 AM" 306978537730 24026384 chatium.channel.listdetail WAP 0
    "Apr 29, 2007 12:13:51 AM" 1061321 "Apr 29, 2007 12:13:31 AM" 306978537730 24026384 chatium.user.search WAP 0
    "Apr 29, 2007 12:13:51 AM" 1061330 "Apr 29, 2007 12:13:37 AM" 306978537730 24026384 chatium.user.userinfo WAP 0Example :
    http://download-uk.oracle.com/docs/cd/B19306_01/server.102/b14215/ldr_concepts.htm#sthref477
    Nicolas.
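    Putting both corrections together (drop the unrecognized FF3 token and declare the quoted fields as optionally enclosed), the control file would look roughly like this:
    LOAD DATA
    INFILE 'D:\gbal\chatium.log'
    APPEND INTO TABLE CHAT_SL
    FIELDS TERMINATED BY WHITESPACE OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (SL1 DATE "Mon DD, YYYY HH:MI:SS AM",
    SL2 char,
    SL3 DATE "Mon DD, YYYY HH:MI:SS AM",
    SL4 char,
    SL5 char,
    SL6 char,
    SL7 char,
    SL8 char,
    SL9 char,
    SL10 char,
    SL11 char,
    SL12 char,
    SL13 char,
    SL14 char,
    SL15 char)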

  • SQL Loader and formatting dates

    I've got dates formatted like so: 2012-05-10T17:04:51-08:00
    How can I get SQL Loader to load these into a Date column??
    Thanks

    Thanks for the reply-
    SQL Loader is running on Win, Oracle 11 on Linux
    The format is standard UTC
    YYYY-MM-DDTHH24:MI:SS±HH:MM, where the ±HH:MM refers to the time zone offset from GMT. The "T" is just a separator between date and time (and is always "T").
    I looked at your references, and tried a few dozen variants without success.
    Most of my attempts have been using something similar to this
    "YYYY-MM-DDTHH24:MI:SSTZH:TZM"
    or this:
    "YYYY-MM-DD'T'HH24:MI:SSTZH:TZM"
    or this:
    'YYYY-MM-DD"T"HH24:MI:SSTZH:TZM'
    Thanks
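    For what it's worth, the mask itself can be checked in plain SQL with TO_TIMESTAMP_TZ, using a double-quoted literal "T" and TZH:TZM for the offset. The assumption here is that the target column is (or can become) TIMESTAMP WITH TIME ZONE and that the same expression would be wrapped in the control file's SQL string for the field:
    SELECT TO_TIMESTAMP_TZ('2012-05-10T17:04:51-08:00',
                           'YYYY-MM-DD"T"HH24:MI:SSTZH:TZM')
    FROM dual;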

  • SQL * Loader with Spanish data!

    Hi All,
    I have a requirement to load Spanish data into a table. I am using a SQL*Loader control file to load the data.
    Everything is fine, but the Spanish characters are not being inserted as they should be. It is inserting some junk characters.
    How can I solve this issue?
    Thanks in advance.

    Hello,
    Are you using init.ora (pfile) or spfile? Anyway you can do this
    sqlplus "/as sysdba"
    sql> create pfile='/location_to_pfile/init.ora' from spfile;
    File Created
    Modify the init.ora file and add the parameter to support Spanish characters.
    sql>shutdown immediate;
    sql>startup pfile='/path_to_pfile/init.ora';  -- You should be able to mount and open successfully
    Test your data using sqlldr, and if it is doing what you expect it to do, then:
    sql>shutdown immediate;
    sql>create spfile from pfile='/path_to_pfile/init.ora';
    File Created
    sql>startup;
    sql>show parameter your_parameter_name
    or you can try this too:
    sqlplus "/as sysdba"
    sql> alter system set parameter_name=value scope=both sid='*';
    or
    sql>alter system set paramter_name=value sid='*';
    sql> show parameter parameter_name;
    Test your data using sqlldr.
    Regards
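    Another thing worth checking (an assumption, not from the reply above): SQL*Loader converts the data file from the client character set to the database character set, so declaring the file's encoding explicitly with the CHARACTERSET clause in the control file, or setting NLS_LANG for the sqlldr session, often resolves junk characters. A sketch with hypothetical names:
    LOAD DATA
    CHARACTERSET WE8ISO8859P1   -- assumption: the Spanish file is ISO-8859-1; use AL32UTF8 if it is UTF-8
    INFILE 'spanish_data.csv'
    APPEND
    INTO TABLE spanish_stage
    FIELDS TERMINATED BY ','
    (col1, col2, col3)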
