XML with SQL*Loader

Hi,
I tried to load XML data into the database using SQL*Loader.
CREATE DIRECTORY xml_dir AS '/:';
GRANT READ ON DIRECTORY xml_dir TO wiki;
xml.ctl
OPTIONS (ERRORS=100, SILENT=(FEEDBACK))
LOAD DATA
INFILE 'emp.xml'
APPEND
INTO TABLE emp
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
(empno,ename,job,mgr,hiredate,sal,comm,deptno)
*emp.xml*
<name>
<empno>1</empno>
<ename>1</ename>
<job>1</job>
<mgr>1</mgr>
<hiredate></hiredate>
<sal>1</sal>
<comm>1</comm>
<deptno>1</deptno>
</name>
I got the following errors in the log:
Record 1: Rejected - Error on table EMP, column ENAME.
Column not found before end of logical record (use TRAILING NULLCOLS)
Record 2: Rejected - Error on table EMP, column ENAME.
How does SQL*Loader identify the separators in an XML file?
Please help me out with this.
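SQL*Loader has no XML-aware parsing of its own, so a comma-based FIELDS TERMINATED BY clause will not find the values inside the tags. Here is a hedged sketch of a control file that instead treats the tags themselves as enclosure strings, the same technique used in the replies further down; the "str '</name>'" record separator and the handling of the empty <hiredate></hiredate> element are assumptions and untested:

LOAD DATA
INFILE 'emp.xml' "str '</name>'"
APPEND
INTO TABLE emp
TRAILING NULLCOLS
(
dummy    FILLER TERMINATED BY '<name>',
empno    ENCLOSED BY '<empno>'    AND '</empno>',
ename    ENCLOSED BY '<ename>'    AND '</ename>',
job      ENCLOSED BY '<job>'      AND '</job>',
mgr      ENCLOSED BY '<mgr>'      AND '</mgr>',
hiredate ENCLOSED BY '<hiredate>' AND '</hiredate>',
sal      ENCLOSED BY '<sal>'      AND '</sal>',
comm     ENCLOSED BY '<comm>'     AND '</comm>',
deptno   ENCLOSED BY '<deptno>'   AND '</deptno>'
)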

Similar Messages

  • How can I load this XML with SQL*Loader?

    Hi,
    I'm trying to load an XML file into an XMLType table:
    Create table TEST of XMLTYPE;
    XML file is something like this:
    <!DOCTYPE list SYSTEM "myfile.dtd">
    <list>
    <item id = 1> lot of tags an info here </item>
    <item id = 9000> lot of tags an info here </item>
    </list>
    I don't know exactly how to create an appropriate control file for my XML. All the examples I saw in the documentation are similar to this one:
    this is an example control file
    LOAD DATA
    INFILE *
    INTO TABLE test
    APPEND
    XMLTYPE (xmldata)
    FIELDS
    I don't know how to complete this control file. Can anyone help me?
    Thank you!

    Well, I found this working code:
    LOAD DATA
    INFILE *
    INTO TABLE test APPEND
    xmltype(xmldata)
    FIELDS
    (
    ext_fname filler char(30),
    xmldata LOBFILE (ext_fname) TERMINATED BY EOF
    )
    BEGINDATA
    mifile.xml
    mifile2.xml
    mifile3.xml
    The main difference I found is that when I use:
    Insert into test values (XMLType(bfilename('XMLDIR', 'swd_uC000025127.xml'),nls_charset_id('AL32UTF8')));
    the tag <!DOCTYPE list SYSTEM "myfile.dtd"> is expanded
    (I explain this in "DTD insertion and errors with long DTD entities").
    Now, using SQL*Loader, the tag is not modified.
    I don't know if there is any difference between the two cases. Did I lose performance?
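    A hedged way to see whether the two loading paths actually stored different documents is to serialize what is in the table; object_value is the pseudo-column of an XMLType table, though the exact output can vary with database version and storage options:

    -- Compare what was stored by SQL*Loader vs. the INSERT with bfilename
    SELECT XMLSERIALIZE(DOCUMENT object_value AS CLOB) AS stored_doc
    FROM   test;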

  • How to load XML file to table (non-XML) with SQL*Loader -- issue with nulls

    I have been attempting to use SQL*Loader to load an XML file into a "regular" Oracle table. All fields work fine, unless a null is encountered. The way that nulls are represented is shown below:
    <PAYLOAD>
    <FIELD1>ABCDEF</FIELD1>
    <FIELD2/>
    <FIELD3>123456</FIELD3>
    </PAYLOAD>
    In the above example, FIELD2 is a null field and that is the way it is presented. I have searched everywhere and have not found how I could code for this. The issue is that if FIELD2 is present, it is coded like: <FIELD2>SOMEDATA</FIELD2>, but the null is represented as <FIELD2/>. Here is a sample of the control file I am using to attempt the load -- very simplistic, but works fine when fields are present:
    load data
    infile 'testdata.xml' "str '<PAYLOAD>'"
    TRUNCATE
    into table DATA_FROM_XML
    (
    FIELD1 ENCLOSED BY '<FIELD1>' AND '</FIELD1>',
    FIELD2 ENCLOSED BY '<FIELD2>' AND '</FIELD2>',
    FIELD3 ENCLOSED BY '<FIELD3>' AND '</FIELD3>')
    What do I need to do to account for the way that nulls are presented? I have tried everything I could glean from the web and the documentation and nothing has worked. Any help would be really appreciated.

    I hadn't even got that far. Can you direct me to the docs for importing data that is stored within XML when you don't need any XML functionality, and XML just happens to be the format the data is stored in? Thanks.
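    The empty-element form <FIELD2/> has no matching open and close tags for the ENCLOSED BY scan to find, so one hedged alternative (not something proposed in the thread) is to load each <PAYLOAD> record into an XMLType staging column and shred it with XMLTABLE, which simply returns NULL for empty elements. The staging table name below is invented for illustration:

    -- Hypothetical staging table holding one <PAYLOAD> document per row
    CREATE TABLE payload_stage (doc XMLTYPE);

    -- <FIELD2/> comes back as NULL, populated elements come back as text
    INSERT INTO data_from_xml (field1, field2, field3)
    SELECT x.field1, x.field2, x.field3
    FROM   payload_stage s,
           XMLTABLE('/PAYLOAD' PASSING s.doc
                    COLUMNS field1 VARCHAR2(30) PATH 'FIELD1',
                            field2 VARCHAR2(30) PATH 'FIELD2',
                            field3 VARCHAR2(30) PATH 'FIELD3') x;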

  • Load XML with SQL*Loader

    Good morning,
    I have this XML file:
    <?xml version="1.0"?>
    <Header>
    <DocName>NOMRES</DocName>
    <DocVersion>3.2</DocVersion>
    <Sender>TIGF</Sender>
    <Receiver>TIENIG</Receiver>
    <DocNumber>120731</DocNumber>
    <DocDate>2008-12-14T16:36:43.9288481+01:00</DocDate>
    <DocType>J</DocType>
    <Contract>TIGF-TIENIG</Contract>
    </Header>
    <ListOfGasDays>
    <GasDay>
    <Day>2008-12-15</Day>
    <BusinessRuleFlag>Processed by adjacent TSO</BusinessRuleFlag>
    <ListOfLibri>
    <Libro>
    <Logid>62</Logid>
    <Isbn>88-251-7194-3</Isbn>
    <Autore>Elisa Bertino</Autore>
    <Titolo>Sistemi di basi di dati - Concetti e architetture</Titolo>
    <Anno>1997</Anno>
    <Collocazione>Dentro</Collocazione>
    <Genere>Informatica</Genere>
    <Lingua>Italiano</Lingua>
    </Libro>
    <Libro>
    <Logid>63</Logid>
    <Isbn>978-88-04-56981-7</Isbn>
    <Autore>Dan Brown</Autore>
    <Titolo>Crypto</Titolo>
    <Anno>1998</Anno>
    <Collocazione>Dentro</Collocazione>
    <Genere>Thriller</Genere>
    <Lingua>Italiano</Lingua>
    </Libro>
    </ListOfLibri>
    </GasDay>
    <GasDay>
    <Day>2008-12-15</Day>
    <BusinessRuleFlag>Confirmed</BusinessRuleFlag>
    <ListOfLibri>
    <Libro>
    <Logid>64</Logid>
    <Isbn>978-88-6061-131-4</Isbn>
    <Autore>Stephen King</Autore>
    <Titolo>Cell</Titolo>
    <Anno>2006</Anno>
    <Collocazione>Dentro</Collocazione>
    <Genere>Horror</Genere>
    <Lingua>Italiano</Lingua>
    </Libro>
    <Libro>
    <Logid>65</Logid>
    <Isbn>1-56592-697-8</Isbn>
    <Autore>David C. Kreines</Autore>
    <Titolo>Oracle SQL - The Essential Reference</Titolo>
    <Anno>2000</Anno>
    <Collocazione>Dentro</Collocazione>
    <Genere>Informatica</Genere>
    <Lingua>Inglese</Lingua>
    </Libro>
    <Libro>
    <Logid>66</Logid>
    <Isbn>978-88-6061-131-4</Isbn>
    <Autore>Stephen King</Autore>
    <Titolo>Cell</Titolo>
    <Anno>2006</Anno>
    <Collocazione>Dentro</Collocazione>
    <Genere>Horror</Genere>
    <Lingua>Italiano</Lingua>
    </Libro>
    </ListOfLibri>
    </GasDay>
    </ListOfGasDays>
    <ListOfGeneralNotes>
    <GeneralNote>
    <Code>100</Code>
    <Message>Rien a signaler</Message>
    </GeneralNote>
    </ListOfGeneralNotes>
    and use this control file:
    load data
    infile "Esempio.XML" "str '</Libro>'"
    BADFILE "libri.bad"
    DISCARDFILE "libri.dis"
    DISCARDMAX 10000
    truncate
    into table LIBRI
    TRAILING NULLCOLS
    (
    dummy filler terminated by '<Libro>',
    Logid enclosed by "<Logid>" and "</Logid>",
    Isbn enclosed by "<Isbn>" and "</Isbn>",
    Autore enclosed by "<Autore>" and "</Autore>",
    Titolo enclosed by "<Titolo>" and "</Titolo>",
    Anno enclosed by "<Anno>" and "</Anno>",
    Collocazione enclosed by "<Collocazione>" and "</Collocazione>",
    Genere enclosed by "<Genere>" and "</Genere>",
    Lingua enclosed by "<Lingua>" and "</Lingua>"
    )
    It loads, but I always get an error on the first record.
    Can someone tell me why, and whether I should set up the control file differently?
    thanks

    I have the following XML data file and had the same loading issue.
    <?xml version="1.0"?>
    <Settlement_Info>
    <file_header>
    <agency_id>129</agency_id>
    <agency_form_number/>
    <omb_form_number/>
    <treasury_account_symbol/>
    <percent_of_amount>100.00</percent_of_amount>
    </file_header>
    <body_item>
    <item_header>
    <deposit_ticket_number>001296</deposit_ticket_number>
    <total_amount_of_sf215>1,318,542,280.16</total_amount_of_sf215>
    <number_of_collections>3,929</number_of_collections>
    <total_of_all_collections>1,318,542,280.16</total_of_all_collections>
    </item_header>
    <item_detail_record>
    <paygov_tx_id>FMG4</paygov_tx_id>
    <agency_tx_id>0000015901</agency_tx_id>
    <collection_amount>8,688.70</collection_amount>
    <collection_method>ACH</collection_method>
    <deposit_ticket_number>96</deposit_ticket_number>
    <settlement_date>12/15/2009</settlement_date>
    <collection_status>SETTLED</collection_status>
    <submitter_name>MORRIS</submitter_name>
    </item_detail_record>
    <item_detail_record>
    <paygov_tx_id>FMG5</paygov_tx_id>
    <agency_tx_id>0000015902</agency_tx_id>
    <collection_amount>42,198.66</collection_amount>
    <collection_method>ACH</collection_method>
    <deposit_ticket_number>001296</deposit_ticket_number>
    <settlement_date>12/15/2009</settlement_date>
    <collection_status>SETTLED</collection_status>
    <submitter_name>CASTLE</submitter_name>
    </item_detail_record>
    <item_detail_record>
    <paygov_tx_id>4FMG6</paygov_tx_id>
    <agency_tx_id>0000015903</agency_tx_id>
    <collection_amount>57,278.25</collection_amount>
    <collection_method>ACH</collection_method>
    <deposit_ticket_number>001296</deposit_ticket_number>
    <settlement_date>12/15/2009</settlement_date>
    <collection_status>SETTLED</collection_status>
    <submitter_name>FRANKLIN</submitter_name>
    </item_detail_record>
    </body_item>
    <file_footer>
    <file_name>ACHActivityFile_12152009.xml</file_name>
    <file_creation_date>12/15/2009 10:08:31 AM</file_creation_date>
    </file_footer>
    </Settlement_Info>
    Control file
    load data
    infile 'C:\sample1.xml' "str '</item_detail_record>'"
    truncate
    into table xml_test2
    TRAILING NULLCOLS
    (
    dummy filler terminated by "<item_detail_record>",
    paygov_tx_id enclosed by "<paygov_tx_id>" and "</paygov_tx_id>",
    agency_tx_id enclosed by "<agency_tx_id>" and "</agency_tx_id>",
    collection_amount enclosed by "<collection_amount>" and "</collection_amount>",
    collection_method enclosed by "<collection_method>" and "</collection_method>",
    deposit_ticket_number enclosed by "<deposit_ticket_number>" and "</deposit_ticket_number>",
    settlement_date enclosed by "<settlement_date>" and "</settlement_date>",
    collection_status enclosed by "<collection_status>" and "</collection_status>",
    submitter_name enclosed by "<submitter_name>" and "</submitter_name>"
    )
    Table structure
    CREATE TABLE XML_TEST2
    (
    PAYGOV_TX_ID VARCHAR2(30 BYTE),
    AGENCY_TX_ID VARCHAR2(30 BYTE),
    COLLECTION_AMOUNT VARCHAR2(30 BYTE),
    COLLECTION_METHOD VARCHAR2(30 BYTE),
    DEPOSIT_TICKET_NUMBER VARCHAR2(30 BYTE),
    SETTLEMENT_DATE VARCHAR2(30 BYTE),
    COLLECTION_STATUS VARCHAR2(30 BYTE),
    SUBMITTER_NAME VARCHAR2(60 BYTE)
    );
    If I remove the <file_header> and <item_header> blocks, the control file works perfectly; otherwise it skips the first record.
    thanks
    Reji

  • How can I load data into table with SQL*LOADER

    How can I load data into a table with SQL*Loader
    when the column data length is more than 255 bytes?
    When a column exceeds 255 bytes, the data cannot be inserted into the table by SQL*Loader.
    CREATE TABLE A (
    A VARCHAR2 ( 10 ) ,
    B VARCHAR2 ( 10 ) ,
    C VARCHAR2 ( 10 ) ,
    E VARCHAR2 ( 2000 ) );
    control file:
    load data
    append into table A
    fields terminated by X'09'
    (A , B , C , E )
    SQL*LOADER command:
    sqlldr test/test control=A_ctl.txt data=A.xls log=b.log
    data file:
    column E is more than 255 bytes
    1     1     1     1234567------(more than 255bytes)
    1     1     1     1234567------(more than 255bytes)
    1     1     1     1234567------(more than 255bytes)
    1     1     1     1234567------(more than 255bytes)
    1     1     1     1234567------(more than 255bytes)
    1     1     1     1234567------(more than 255bytes)
    1     1     1     1234567------(more than 255bytes)
    1     1     1     1234567------(more than 255bytes)

    Check this out.
    http://download-west.oracle.com/docs/cd/B10501_01/server.920/a96652/ch06.htm#1006961
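    The underlying cause is that SQL*Loader's default character field length is 255 bytes, which is why column E is rejected. A hedged fix is simply to give the long field an explicit length in the control file (the 2000 below only mirrors the table definition above):

    load data
    append into table A
    fields terminated by X'09'
    (A, B, C, E CHAR(2000))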

  • Loading Objects with SQL*Loader

    When loading a column object with SQL*Loader, is it possible to specify a column specific 'TERMINATED BY' clause in the control file?
    I've successfully defined column-level termination characters for regular columns and nested tables, but can't seem to find any way of achieving the same result with column objects.

  • SQL*Loader-704 and ORA-12154 error messages when trying to load data with SQL*Loader

    I have a database with two tables that is used by Apex 4.2. One table has 800,000 records; the other has 7 million records.
    The client recently upgraded from Apex 3.2 to Apex 4.2. We exported/imported the data to the new location with no problems.
    The source of the data is an old mainframe system; I needed to make changes to the source data and then load the tables.
    The first time I loaded the data I did it from a command line with SQL*Loader.
    Now when I try to load the data I get this message:
    SQL*Loader-704: Internal error: ulconnect: OCISERVERATTACH
    ORA-12154: TNS:could not resolve the connect identifier specified
    I've searched for postings on these error messages and they all seem to say that SQL*Loader can't find my TNSNAMES file.
    I am able to connect and load data with SQL Developer, so SQL Developer is able to find the TNSNAMES file.
    However, SQL Developer will not let me load a file this big.
    I have also tried to load the file within Apex (SQL Workshop / Utilities) but again, the file is too big.
    So it seems like SQL*Loader is the only option.
    I did find one post online that said to set an environment variable with the path to the TNSNAMES file, but that didn't work.
    Not sure what else to try or where to look.
    thanks

    Hi,
    You must have more than one tnsnames file or multiple installations of Oracle. What I suggest you do (as I'm sure will be mentioned in Ed's link that you were already pointed at) is the following (I assume you are on Windows?):
    open a command prompt
    set TNS_ADMIN=PATH_TO_DIRECTORY_THAT_CONTAINS_CORRECT_TNSNAMES_FILE (i.e. something like set TNS_ADMIN=c:\oracle\network\admin)
    This will tell Oracle to use the config files it finds there and no others.
    then try sqlldr user/pass@db (in the same DOS window)
    See if that connects and let us know.
    Cheers,
    Harry
    http://dbaharrison.blogspot.com

  • Loading "fixed length" text files in UTF8 with SQL*Loader

    Hi!
    We have a lot of files that we load with SQL*Loader into our database. All data files have fixed-length columns, so we use POSITION(pos1, pos2) in the ctl file. Until now the files were in WE8ISO8859P1 and everything was fine.
    Now the source system generating the files is switching to Unicode and the files are in UTF8!
    The SQL*Loader documentation says "The start and end arguments to the POSITION parameter are interpreted in bytes, even if character-length semantics are in use in a datafile....."
    As I see it now, there is no way to say "column A starts at CHARACTER position pos1 and ends at CHARACTER position pos2".
    I tested with
    load data
    CHARACTERSET AL32UTF8
    LENGTH SEMANTICS CHARACTER
    replace ...
    in the .ctl file, but as soon as a character with a multi-byte encoding (for example ü) appears in the file, all positions of that record are shifted.
    Is there a way to load these files in UTF8 without changing the file definition to a column separator?
    Thanks for any hints - charly

    I have not tested this but you should be able to achieve what you want by using LENGTH SEMANTICS CHARACTER and by specifying field lengths (e.g. CHAR(5)) instead of only their positions. You could still use the POSITION(*+n) syntax to skip any separator columns that contain only spaces or tabs.
    If the above does not work, an alternative would be to convert all UTF8 files to UTF16 before loading so that they become fixed-width.
    -- Sergiusz
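    A hedged sketch of what that suggestion could look like in a control file; the table name, column names and lengths are invented, and POSITION(*+1) is only there to skip a single-byte separator column as described above:

    LOAD DATA
    CHARACTERSET AL32UTF8
    LENGTH SEMANTICS CHARACTER
    INFILE 'fixed_width_utf8.dat'
    REPLACE
    INTO TABLE target_table
    (
    col_a POSITION(1) CHAR(5),   -- first 5 characters of the record
    col_b CHAR(10),              -- next 10 characters, relative to col_a
    col_c POSITION(*+1) CHAR(8)  -- skip one single-byte separator, then 8 characters
    )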

  • Problem with loading file with SQL loader

    I am having a problem loading a file with SQL*Loader. The load
    terminates after around 2,000 rows, whereas there are around 2,700,000 rows in the file.
    The file looks like:
    919879086475,11/17/2004,11/20/2004
    919879698625,11/17/2004,11/17/2004
    919879698628,11/17/2004,11/17/2004
    The control file I am using is:
    load data
    infile 'c:\ran\temp\pps_fc.txt'
    into table bm_05oct06
    fields terminated by ","
    (mobile_no, fcal, frdate )
    I hope my question is clear. Please help me resolve this.
    regards.

    So which thread is telling the truth?
    Doubt with SQL loader file with spaces
    Are the fields delimited with spaces or with commas?
    Perhaps they are a mixture of delimiters and that is where the error is coming in?
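    A hedged note on the symptom itself: SQL*Loader aborts a run once the rejected-row count exceeds the ERRORS limit (50 by default), which could explain a 2,700,000-row file stopping after about 2,000 rows. Raising the limit and re-checking the log and bad file would confirm whether bad delimiters are the cause; the value below is arbitrary:

    OPTIONS (ERRORS=100000)
    load data
    infile 'c:\ran\temp\pps_fc.txt'
    into table bm_05oct06
    fields terminated by ","
    (mobile_no, fcal, frdate)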

  • Ignoring constraints with SQL*Loader & triggers

    When I load a table with SQL*Loader that has foreign key constraints and a before insert trigger, my foreign key constraints are ignored. The end result is a detail table that contains invalid rows (where some column values do not exist in the master table). If I drop the trigger and just load with SQL*Loader, the foreign key constraints are recognized. Any ideas why this is happening? The trigger simply populates the last updated date and user columns.

    I found this from AskTom very helpful. It will help you a lot.
    http://asktom.oracle.com/pls/ask/f?p=4950:9:1988009758486146475::NO:9:F4950_P9_DISPLAYID:8806498660292

  • Need help with SQL*Loader not working

    Hi all,
    I am trying to run SQL*Loader on Oracle 10g UNIX platform (Red Hat Linux) with below command:
    sqlldr userid='ldm/password' control=issue.ctl bad=issue.bad discard=issue.txt direct=true log=issue.log
    And get below errors:
    SQL*Loader-128: unable to begin a session
    ORA-01034: ORACLE not available
    ORA-27101: shared memory realm does not exist
    Linux-x86_64 Error: 2: No such file or directory
    Can anyone help me out with this problem that I am having with SQL*Loader? Thanks!
    Ben Prusinski

    Hi Frank,
    More progress: I exported the ORACLE_SID and tried again, but now I have new errors! We are trying to load an Excel CSV file into a new table in our Oracle 10g database. I created the new table in Oracle and loaded it with SQL*Loader, with the problems below.
    $ export ORACLE_SID=PROD
    $ sqlldr 'ldm/password@PROD' control=prod.ctl log=issue.log bad=bad.log discard=discard.log
    SQL*Loader: Release 10.2.0.1.0 - Production on Tue May 23 11:04:28 2006
    Copyright (c) 1982, 2005, Oracle. All rights reserved.
    Control File: prod.ctl
    Data File: prod.csv
    Bad File: bad.log
    Discard File: discard.log
    (Allow all discards)
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 50
    Bind array: 64 rows, maximum of 256000 bytes
    Continuation: none specified
    Path used: Conventional
    Table TESTLD, loaded from every logical record.
    Insert option in effect for this table: REPLACE
    Column Name Position Len Term Encl Datatype
    ISSUE_KEY FIRST * , CHARACTER
    TIME_DIM_KEY NEXT * , CHARACTER
    PRODUCT_CATEGORY_KEY NEXT * , CHARACTER
    PRODUCT_KEY NEXT * , CHARACTER
    SALES_CHANNEL_DIM_KEY NEXT * , CHARACTER
    TIME_OF_DAY_DIM_KEY NEXT * , CHARACTER
    ACCOUNT_DIM_KEY NEXT * , CHARACTER
    ESN_KEY NEXT * , CHARACTER
    DISCOUNT_DIM_KEY NEXT * , CHARACTER
    INVOICE_NUMBER NEXT * , CHARACTER
    ISSUE_QTY NEXT * , CHARACTER
    GROSS_PRICE NEXT * , CHARACTER
    DISCOUNT_AMT NEXT * , CHARACTER
    NET_PRICE NEXT * , CHARACTER
    COST NEXT * , CHARACTER
    SALES_GEOGRAPHY_DIM_KEY NEXT * , CHARACTER
    value used for ROWS parameter changed from 64 to 62
    Record 1: Rejected - Error on table TESTLD, column DISCOUNT_AMT.
    Column not found before end of logical record (use TRAILING NULLCOLS)
    Record 2: Rejected - Error on table TESTLD, column DISCOUNT_AMT.
    Column not found before end of logical record (use TRAILING NULLCOLS)
    Record 3: Rejected - Error on table TESTLD, column DISCOUNT_AMT.
    Column not found before end of logical record (use TRAILING NULLCOLS)
    Record 4: Rejected - Error on table TESTLD, column DISCOUNT_AMT.
    Column not found before end of logical record (use TRAILING NULLCOLS)
    Record 5: Rejected - Error on table TESTLD, column DISCOUNT_AMT.
    Column not found before end of logical record (use TRAILING NULLCOLS)
    Record 6: Rejected - Error on table TESTLD, column DISCOUNT_AMT.
    Column not found before end of logical record (use TRAILING NULLCOLS)
    Record 7: Rejected - Error on table TESTLD, column DISCOUNT_AMT.
    Column not found before end of logical record (use TRAILING NULLCOLS)
    Record 8: Rejected - Error on table TESTLD, column DISCOUNT_AMT.
    Column not found before end of logical record (use TRAILING NULLCOLS)
    Record 9: Rejected - Error on table ISSUE_FACT_TESTLD, column DISCOUNT_AMT.
    Column not found before end of logical record (use TRAILING NULLCOLS)
    Record 10: Rejected - Error on table TESTLD, column DISCOUNT_AMT.
    Column not found before end of logical record (use TRAILING NULLCOLS)
    Record 11: Rejected - Error on table TESTLD, column DISCOUNT_AMT.
    Column not found before end of logical record (use TRAILING NULLCOLS)
    Record 12: Rejected - Error on table TESTLD, column DISCOUNT_AMT.
    Column not found before end of logical record (use TRAILING NULLCOLS)
    Record 13: Rejected - Error on table TESTLD, column DISCOUNT_AMT.
    Column not found before end of logical record (use TRAILING NULLCOLS)
    Record 14: Rejected - Error on table TESTLD, column DISCOUNT_AMT.
    Column not found before end of logical record (use TRAILING NULLCOLS)
    Record 15: Rejected - Error on table TESTLD, column DISCOUNT_AMT.
    Column not found before end of logical record (use TRAILING NULLCOLS)
    Record 16: Rejected - Error on table TESTLD, column DISCOUNT_AMT.
    Column not found before end of logical record (use TRAILING NULLCOLS)
    Record 17: Rejected - Error on table TESTLD, column DISCOUNT_AMT.
    Column not found before end of logical record (use TRAILING NULLCOLS)
    Record 18: Rejected - Error on table TESTLD, column DISCOUNT_AMT.
    Column not found before end of logical record (use TRAILING NULLCOLS)
    Record 19: Rejected - Error on table TESTLD, column DISCOUNT_AMT.
    Column not found before end of logical record (use TRAILING NULLCOLS)
    Record 20: Rejected - Error on table TESTLD, column DISCOUNT_AMT.
    Column not found before end of logical record (use TRAILING NULLCOLS)
    Record 21: Rejected - Error on table TESTLD, column DISCOUNT_AMT.
    Column not found before end of logical record (use TRAILING NULLCOLS)
    Record 22: Rejected - Error on table TESTLD, column DISCOUNT_AMT.
    Column not found before end of logical record (use TRAILING NULLCOLS)
    Record 23: Rejected - Error on table TESTLD, column DISCOUNT_AMT.
    Column not found before end of logical record (use TRAILING NULLCOLS)
    Record 24: Rejected - Error on table TESTLD, column DISCOUNT_AMT.
    Column not found before end of logical record (use TRAILING NULLCOLS)
    Record 39: Rejected - Error on table TESTLD, column DISCOUNT_AMT.
    Column not found before end of logical record (use TRAILING NULLCOLS)
    MAXIMUM ERROR COUNT EXCEEDED - Above statistics reflect partial run.
    Table TESTLD:
    0 Rows successfully loaded.
    51 Rows not loaded due to data errors.
    0 Rows not loaded because all WHEN clauses were failed.
    0 Rows not loaded because all fields were null.
    Space allocated for bind array: 255936 bytes(62 rows)
    Read buffer bytes: 1048576
    Total logical records skipped: 0
    Total logical records read: 51
    Total logical records rejected: 51
    Total logical records discarded: 0
    Run began on Tue May 23 11:04:28 2006
    Run ended on Tue May 23 11:04:28 2006
    Elapsed time was: 00:00:00.14
    CPU time was: 00:00:00.01
    [oracle@casanbdb11 sql_loader]$
    Here is the control file:
    LOAD DATA
    INFILE issue_fact.csv
    REPLACE
    INTO TABLE TESTLD
    FIELDS TERMINATED BY ','
    (
    ISSUE_KEY,
    TIME_DIM_KEY,
    PRODUCT_CATEGORY_KEY,
    PRODUCT_KEY,
    SALES_CHANNEL_DIM_KEY,
    TIME_OF_DAY_DIM_KEY,
    ACCOUNT_DIM_KEY,
    ESN_KEY,
    DISCOUNT_DIM_KEY,
    INVOICE_NUMBER,
    ISSUE_QTY,
    GROSS_PRICE,
    DISCOUNT_AMT,
    NET_PRICE,
    COST,
    SALES_GEOGRAPHY_DIM_KEY
    )
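    Every record is being rejected on the trailing columns, which is the classic symptom of rows that end before all listed fields have been supplied. A hedged fix, as the error message itself suggests, is to add TRAILING NULLCOLS; it is also worth checking that the INFILE name matches the file actually being loaded, since the log above shows prod.csv while the control file says issue_fact.csv:

    LOAD DATA
    INFILE issue_fact.csv
    REPLACE
    INTO TABLE TESTLD
    FIELDS TERMINATED BY ','
    TRAILING NULLCOLS
    (
    ISSUE_KEY,
    TIME_DIM_KEY,
    -- ... remaining columns exactly as in the original control file ...
    SALES_GEOGRAPHY_DIM_KEY
    )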

  • How to do it with SQL Loader

    All,
    I have two tables, HEADER_TABLE and LINE_TABLE. Each header record can have multiple line records. I have to load data from a flat file into these tables. The flat file can have two types of records: H (header) and L (line). It looks as follows; each H record can have multiple corresponding L records.
    H..........
    L.......
    L......
    L......
    H.........
    L.......
    L......
    L......
    I have HEADER_ID column in HEADER_TABLE and HEADER_ID, LINE_ID columns in the LINE_TABLE.
    While loading data using SQL Loader, I need to generate HEADER_ID and LINE_ID values as follows and load them.
    H..........<HEADER_ID = 1>
    L....... <HEADER_ID = 1><LINE_ID = 1>
    L...... <HEADER_ID = 1><LINE_ID = 2>
    L...... <HEADER_ID = 1><LINE_ID = 3>
    H......... <HEADER_ID = 2>
    L....... <HEADER_ID = 2><LINE_ID = 4>
    L...... <HEADER_ID = 2><LINE_ID = 5>
    L...... <HEADER_ID = 2><LINE_ID = 6>
    Is it possible to do this with SQL*Loader?
    I tried to do this with sequences, but it loaded the tables as follows.
    H..........<HEADER_ID = 1>
    L....... <HEADER_ID = 1><LINE_ID = 1>
    L...... <HEADER_ID = 1><LINE_ID = 2>
    L...... <HEADER_ID = 1><LINE_ID = 3>
    H......... <HEADER_ID = 2>
    L....... <HEADER_ID = 1><LINE_ID = 4>
    L...... <HEADER_ID = 1><LINE_ID = 5>
    L...... <HEADER_ID = 1><LINE_ID = 6>
    Thanks
    Ketha

    Morgan,
    The examples given in the link are quite generic and I have tried them. But my requirement is focused on generating the header_id and line_id values as I have described. It seems that SQLLDR scans all records for a particular WHEN clause and inserts them into the specified table. I think that if SQLLDR were made to read the records in the data file sequentially, this could be done.
    Any idea how to make SQLLDR read the records from the file sequentially?
    Thanks
    Ketha
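    One hedged alternative (not from the thread) is to keep SQL*Loader itself simple and derive the ids afterwards: load every record into a staging table with a RECNUM column to preserve file order, then carry the header numbering down onto its lines with analytic SQL. The staging table and column names below are invented for illustration:

    -- Control file sketch: record type plus raw payload, file order kept in seq
    LOAD DATA
    INFILE 'header_lines.dat'
    INTO TABLE stage_hl
    (
    rec_type POSITION(1:1) CHAR,
    payload  POSITION(2)   CHAR(4000),
    seq      RECNUM
    )

    Afterwards, the running count of H records gives HEADER_ID and a row number over the L records gives LINE_ID:

    SELECT seq,
           rec_type,
           payload,
           SUM(CASE WHEN rec_type = 'H' THEN 1 ELSE 0 END)
             OVER (ORDER BY seq)                                   AS header_id,
           CASE WHEN rec_type = 'L'
                THEN ROW_NUMBER() OVER (PARTITION BY rec_type ORDER BY seq)
           END                                                     AS line_id
    FROM   stage_hl;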

  • Problem with SQL*Loader and different date formats in the same file

    DB: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit
    System: AIX 5.3.0.0
    Hello,
    I'm using SQL*Loader to import semi-colon separated values into a table. The files are delivered to us by a data provider who concatenates data from different sources and this results in us having different date formats within the same file. For example:
    ...;2010-12-31;22/11/1932;...
    I load this data using the following lines in the control file:
    EXECUTIONDATE1     TIMESTAMP     NULLIF EXECUTIONDATE1=BLANKS     "TO_DATE(:EXECUTIONDATE1, 'YYYY-MM-DD')",
    DELDOB          TIMESTAMP     NULLIF DELDOB=BLANKS          "TO_DATE(:DELDOB, 'DD/MM/YYYY')",
    The relevant NLS parameters:
    NLS_LANGUAGE=FRENCH
    NLS_DATE_FORMAT=DD/MM/RR
    NLS_DATE_LANGUAGE=FRENCH
    If I load this file as is, the values loaded into the table are 31 dec 2010 and 22 nov *2032*, even though the years are on 4 digits. If I change the NLS_DATE_FORMAT to DD/MM/YYYY then the second date value is loaded correctly, but the first value is loaded as 31 dec *2020*!!
    How can I get both date values to load correctly?
    Thanks!
    Sylvain

    This is very strange, after running a few tests I realized that if the year is 19XX then it will get loaded as 2019, and if it is 20XX then it will be 2020. I'm guessing it may have something to do with certain env variables that aren't set up properly because I'm fairly sure my SQL*Loader control file is correct... I'll run more tests :-(
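    A hedged workaround (not confirmed in the thread): since the fields are declared as TIMESTAMP, the TO_DATE result appears to be passed back through the session NLS date format (DD/MM/RR here), which would explain the century flips. Giving each field its own datetime mask directly in the datatype, instead of a TO_DATE expression plus NLS settings, may avoid that round trip:

    EXECUTIONDATE1 TIMESTAMP "YYYY-MM-DD" NULLIF EXECUTIONDATE1=BLANKS,
    DELDOB         TIMESTAMP "DD/MM/YYYY" NULLIF DELDOB=BLANKS,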

  • Loading huge file with Sql-Loader from Java

    Hi,
    I have a CSV file with approx. 3.5 million records.
    I load this data with sqlldr from within Java like this:
               String command = "sqlldr userid=" + user + "/" + pass
                        + "@" + service + " control='" + ctlFile + "'";
                System.out.println(command);
                if (System.getProperty("os.name").contains("Windows")) {
                    p = Runtime.getRuntime().exec("cmd /C " + command);
                } else {
                    // pass the command as one "sh -c" argument so it is not split on spaces
                    p = Runtime.getRuntime().exec(new String[] {"sh", "-c", command});
                }
    It does what I want (it loads the data into the table), BUT it takes too much time. Is there a faster way to load data into an Oracle DB from within Java?
    Thanks, any advice is very welcome

    Have your DBA work on this issue - they can monitor and check performance of SQL*Loader
    SQL*Loader performance tips          [Document 28631.1]
    SQL*LOADER SLOW PERFORMANCE          [Document 1026145.6]
    Master Note for SQL*Loader          [Document 1264730.1]
    HTH
    Srini
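    A hedged, generic thing to try before deeper tuning (not specific advice from those notes): conventional path generates INSERT statements, while direct path writes data blocks directly and is usually much faster for large flat files, subject to the usual direct-path restrictions (triggers are not fired and some constraints are disabled during the load). It can be switched on from the control file so the Java command line stays unchanged:

    OPTIONS (DIRECT=TRUE)
    -- ...rest of the existing control file unchanged...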

  • Loading multiple files with SQL Loader

    Hello.
    I would appreciate your recommendation on the way to load multiple files (into multiple tables) using SQL*Loader with only one control file:
    file1 to load into Table1, file2 into Table2, etc.
    What should the control file look like?
    I looked on the web but didn't find exactly what I need.
    Thanks!

    Ctl File : myctl.ctl
    ---------- Start ---------
    LOAD DATA
    INFILE 'F:\sqlldr\abc1.dat'
    INFILE 'F:\sqlldr\abc2.dat'
    INTO TABLE hdfc1
    (TRANS_DATE CHAR,
    NARRATION CHAR,
    VALUE_DATE CHAR,
    DEBIT_AMOUNT INTEGER,
    CREDIT_AMOUNT INTEGER,
    CHQ_REF_NUMBER CHAR,
    CLOSING_BALANCE CHAR)
    INTO TABLE hdfc2
    (TRANS_DATE CHAR,
    NARRATION CHAR,
    VALUE_DATE CHAR,
    DEBIT_AMOUNT INTEGER,
    CREDIT_AMOUNT INTEGER,
    CHQ_REF_NUMBER CHAR,
    CLOSING_BALANCE CHAR)
    -----------End-----------
    Sqlldr Command
    sqlldr scott/tiger@dbtalk control=F:\sqlldr\myctl.ctl log=F:\sqlldr\ddl_file1.txt
    Regards,
    Abu
