SQL*Loader Import Problem

Hi,
I am trying to import data from .txt files into a table in an Oracle DB using the TOAD SQL*Loader Import wizard. All the files are pipe (|) delimited, and I was able to load most of them successfully. But in one of the files, NULL values are represented as ' ' (a space), and I have trouble loading it.
Suppose I have the fields f1 through f5 in the .txt file, with pipe (|) as the delimiter, and data like the following:
f1 f2 f3 f4 f5
r1|01|xyz|123|abc
r2|02| |234|bcd * 3rd column being null(space)
r3| |sam|345|def * 2nd field being null(space)
|04|ram|456|efg * 1st field being null(space)
|05|abc|567|gef * 1st column being null
I have problems loading records 4 and 5 with SQL*Loader: because the first field is null, the data is not loaded properly. Is there an alternative way of loading this file into my table? Please help.

Please post the OS and database versions, along with the contents of the loader control file and the loader log file showing the errors.
http://docs.oracle.com/cd/E11882_01/server.112/e22490/ldr_field_list.htm#i1009544
HTH
Srini
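
In case it helps while that information is gathered: if the "null" fields really contain a single space, one option is to map blank fields to NULL with NULLIF ... = BLANKS in the control file. A minimal sketch, assuming a target table MYTABLE with columns F1 through F5 (the table and column names are placeholders, not from the original post):

LOAD DATA
INFILE 'mydata.txt'
APPEND
INTO TABLE mytable
FIELDS TERMINATED BY '|'
TRAILING NULLCOLS
(
  f1 CHAR NULLIF f1=BLANKS,
  f2 CHAR NULLIF f2=BLANKS,
  f3 CHAR NULLIF f3=BLANKS,
  f4 CHAR NULLIF f4=BLANKS,
  f5 CHAR NULLIF f5=BLANKS
)

The NULLIF ... = BLANKS clauses turn space-only fields into NULLs; fields that are simply empty between two delimiters load as NULL by default.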

Similar Messages

  • SQL*Loader import problem with files containing Eastern European characters

    Hello,
    on Oracle 11g with UTF-8 encoding, I tried to import a CSV file into a table via sqlldr; the separator is the semicolon ";". Everything works fine except for some lines which are not loaded correctly (the files concerned come from Eastern European countries such as Bulgaria, Hungary and the Czech Republic).
    For example, for the input line:
    text_1; text_2; text_with_char_at_end_like_š; new_text
    during the load, instead of getting:
    | text_1 | text_2 | text_with_char_at_end_like_š| new_text |
    I got:
    | text_1 | text_2 | text_with_char_at_end_like_š; new_text | null |
    Does anyone else have this problem? I tried to change the delimiter to its code X'59' and specified ENCODING UTF8 in sqlldr ... but it does not work.
    Do you have any idea about this problem?
    Thank you in advance

    Thanks,
    the problem has since been solved: the file was not in UTF-8 format (it was in a Greek character set, for example) and NLS_LANG was AMERICAN_AMERICA ASCII.
    I then converted all the files to UTF-8 and changed NLS_LANG to UTF8.
    Regards
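    For reference, a rough sketch of the two settings involved (the file, table and column names below are examples, not taken from the thread): the client-side NLS_LANG and the CHARACTERSET clause in the control file should both describe the actual encoding of the data file.
    $ export NLS_LANG=AMERICAN_AMERICA.AL32UTF8
    $ cat load_utf8.ctl
    LOAD DATA
    CHARACTERSET UTF8
    INFILE 'data_utf8.csv'
    APPEND
    INTO TABLE target_table
    FIELDS TERMINATED BY ';'
    (col1, col2, col3, col4)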

  • SQL LOADER Import

    Hello
    So I have an export from a MySQL database, a .sql file, that I would like to import into Oracle. Can I do that without creating a control file? Would I have to create a control file for every table in the database?
    Brian Sims

    Hi Brian,
    when you don't supply a control file, the defaults are used, and the defaults rarely work in the general case. As you can imagine, every table has its own columns and data types, which usually means a dedicated control file for every table.
    There's a dedicated SQL*Loader forum where they might be able to help you automate the process:
    Export/Import/SQL Loader & External Tables
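    To illustrate what a dedicated per-table control file looks like, here is a minimal sketch for one hypothetical table (the table, file and column names are placeholders); each table would get its own variation of this:
    LOAD DATA
    INFILE 'dept.csv'
    APPEND
    INTO TABLE dept
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"'
    (deptno, dname, loc)
    Note also that a MySQL .sql dump normally contains INSERT statements rather than raw delimited data, so it is usually either adapted and run through a SQL client, or the data is re-exported as delimited files that SQL*Loader can read.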

  • SQL loader import issue..

    Hi,
    I am working on a 10g database on IBM AIX.
    I need to use SQL*Loader to insert some data into a table from an input file.
    The input file (data.dat) has the following format:
    001 901 200 1611196 "dis
    ltype
    gu" Mhamicddu kuasa 12as king
    All these values are tab delimited, and one column value, "disltypegu", is enclosed in double quotes and continues on the following lines. This "disltypegu" should be inserted into one column.
    I am using the following control file:
    LOAD DATA
    INFILE '/home/oracle/data.dat'
    INSERT INTO TABLE emp
    (e1,e2,e3,e4,e5,e6,e7,e8,e9)
    Any idea what I need to include in my control file to perform this insert?
    I am now using the control file below:
    LOAD DATA
    INFILE '/home/oracle/data1.csv'
    APPEND
    INTO TABLE emp
    FIELDS TERMINATED BY X'09'
    OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (ename,edd,emob)
    Using this control file, we can import tab-delimited data with columns enclosed by double quotes; it is working.
    The only problem is that when some column data continues on the next line, it is treated as a second record and inserted as a new row.
    How can I get that data into one row?
    This is my data file, with 3 column values (ename = Robin, edd = Address, emob = 13):
    Robin "Address"
    13
    but it is inserting the emob value (13) into ename, treating the next line as a new row.

    oradba11 wrote:
    Only thing is if some column data is in next row , it is treating it as a second row and inserting in to new row..
    How to insert data in to one row..
    How is this data file created, and what processes (automated AND manual) does it pass through before coming to SQL*Loader? It looks to me like it is being manually copied and pasted from Notepad with line wrap turned on. You really need to address why the data is being wrapped across multiple lines (records).
    Edited by: EdStevens on Mar 13, 2012 7:48 AM
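    If fixing the upstream process is not possible and every logical record spans a fixed number of physical lines (three in the data.dat sample above), one SQL*Loader option worth testing is CONCATENATE, which joins that many physical records into one logical record before the fields are parsed. A rough sketch reusing the field names from the post (the fixed count of 3 is an assumption about the file):
    LOAD DATA
    INFILE '/home/oracle/data.dat'
    APPEND
    CONCATENATE 3
    INTO TABLE emp
    FIELDS TERMINATED BY X'09'
    OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (ename, edd, emob)
    If the number of physical lines per record varies, CONTINUEIF with a recognizable continuation marker is the usual alternative; either way, the wrapping itself needs to be consistent for the load to be reliable.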

  • How to filter some illegal rows when SQL Loader import data

    I want to import data from a CSV file using SQL*Loader, but I don't want to import certain invalid rows: specifically, rows where the column 'name' is null.
    How can I modify the SQL*Loader control file to do this?

    Hi,
    refer to this blog post:
    http://gennick.com/allnull.html
    thanks,
    X A H E E R
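    In addition to the blog post, the usual tool for this is a WHEN clause in the control file, which discards any record that fails the condition (such rows go to the discard file rather than the bad file). A minimal sketch, assuming a comma-delimited file and a table T with ID, NAME and CREATED columns; all of these names are placeholders:
    LOAD DATA
    INFILE 'data.csv'
    APPEND
    INTO TABLE t
    WHEN (name != BLANKS)
    FIELDS TERMINATED BY ','
    TRAILING NULLCOLS
    (id, name, created)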

  • SQL Loader Date Problems

    Hi Guys,
    We have a SQL*Loader script that used to load dates in the 'mm/dd/yyyy' format. After migrating the script to a separate server, the dates are being stored in 'dd/mm/yyyy' format in a VARCHAR column for DOB.
    Please find below the Loader script :
    "insert into tsa_lists values ('" & ttsa_list_typ & "','" & ttsa_list_num & "',"
                                        if ttsa_list_typ <> "AuthRep" then
                                        inscmd.CommandText= "insert into tsa_lists values ('" & ttsa_list_typ & "','" & ttsa_list_num & "',?,?,?,?,?,?,?,?,?,?,sysdate,?,?)"
                                        inscmd.Parameters.Append inscmd.CreateParameter("ttsa_sid",200,1,30,"")
                                        inscmd.Parameters.Append inscmd.CreateParameter("ttsa_lastname",200,1,500,"")
                                        inscmd.Parameters.Append inscmd.CreateParameter("ttsa_firstname",200,1,500,"")
                                        inscmd.Parameters.Append inscmd.CreateParameter("ttsa_firstletter",200,1,1,"")
                                        inscmd.Parameters.Append inscmd.CreateParameter("ttsa_middlename",200,1,500,"")
                                        inscmd.Parameters.Append inscmd.CreateParameter("ttsa_dob",200,1,100,"")
                                        inscmd.Parameters.Append inscmd.CreateParameter("ttsa_pob",200,1,200,"")
                                        inscmd.Parameters.Append inscmd.CreateParameter("ttsa_citizenship",200,1,200,"")
                                        inscmd.Parameters.Append inscmd.CreateParameter("ttsa_cleared",200,1,3,"")
                                        inscmd.Parameters.Append inscmd.CreateParameter("ttsa_misc",200,1,2000,"")
                                        inscmd.Parameters.Append inscmd.CreateParameter("ttsa_firstname_orig",200,1,500,"")
                                        inscmd.Parameters.Append inscmd.CreateParameter("ttsa_lastname_orig",200,1,500,"")Data coming in the below format:
    SID     CLEARED     LASTNAME     FIRSTNAME     MIDDLENAME     TYPE     DOB     POB     CITIZENSHIP     PASSPORT/IDNUMBER     MISC     
    3799509          A     ABD AL SALAM MARUF ABDALLAH               01-Jul-55                         
    3799512          A     ABD AL SALAM MARUF ABDALLAH               01-Jan-80                         
    3727959          A     KHALID KHALIL IBRAHIM MAHDI               11-Nov-50                         
    3458238          A     KHALID KHALIL IBRAHIM MAHDI               08-Jan-81                         
    3458242          A     KHALID KHALIL IBRAHIM MAHDI               31-Jul-81                         
    3458231          A     KHALID KHALIL IBRAHIM MAHDI               01-Aug-81                         
    2407275          A     MUSA BARARUDDIN Y DAGAM               19-Aug-62                         
    Can you guys please suggest a way forward?
    Cheers,
    Shazin

    1) That does not look like anything recognized by the SQL*Loader utility.
    2) If you really just mean a SQL statement to load data, then:
    a) If the table column is a DATE type, use the TO_DATE() function.
    b) If the column is VARCHAR2: you should not store dates in a VARCHAR2 column.
    In any case, check the NLS_DATE_FORMAT parameter on the database and/or the same parameter on the client side.
    :p
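    To make point (a) concrete: converting the string with an explicit format mask removes any dependence on NLS_DATE_FORMAT, whether the conversion happens in the INSERT statement or in a loader control file. A hedged sketch (the column name dob is a placeholder; the mask matches sample values such as 01-Jul-55):
    -- explicit conversion in SQL, independent of NLS_DATE_FORMAT
    TO_DATE('01-Jul-55', 'DD-Mon-RR')
    -- equivalent field definition in a SQL*Loader control file
    dob DATE "DD-Mon-RR"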

  • Sql loader performance problem with xml

    Hi,
    I have to load a 400 MB XML file into the free Oracle DB on my local machine.
    I tested a one-record XML file and was able to load it successfully, but the 400 MB file has been frozen for half an hour and has not even started.
    Is that normal? Is there any chance I will be able to load it if I just wait?
    Is there any faster solution?
    I have created the table below:
    CREATE TABLE test_xml (
    COL_ID VARCHAR2(1000),
    IN_FILE XMLTYPE
    )
    XMLTYPE IN_FILE STORE AS CLOB;
    and the control file below:
    LOAD DATA
    CHARACTERSET UTF8
    INFILE 'test.xml'
    APPEND
    INTO TABLE product_xml
    (
    col_id filler CHAR(1000),
    in_file LOBFILE(CONSTANT "test.xml") TERMINATED BY EOF
    )
    Am I doing anything wrong? Thanks for any advice.

    SQL*Loader: Release 11.2.0.2.0 - Production on H. Febr. 11 18:57:09 2013
    Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
    Control File: prodxml.ctl
    Character Set UTF8 specified for all input.
    Data File: test.xml
    Bad File: test.bad
    Discard File: none specified
    (Allow all discards)
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 5000
    Bind array: 64 rows, maximum of 256000 bytes
    Continuation: none specified
    Path used: Conventional
    Table PRODUCT_XML, loaded from every logical record.
    Insert option in effect for this table: APPEND
    Column Name Position Len Term Encl Datatype
    COL_ID FIRST 1000 CHARACTER
    (FILLER FIELD)
    IN_FILE DERIVED * EOF CHARACTER
    Static LOBFILE. Filename is bv_test.xml
    Character Set UTF8 specified for all input.
    SQL*Loader-605: Non-data dependent ORACLE error occurred -- load discontinued.
    ORA-01652: unable to extend temp segment by 128 in tablespace TEMP
    Table PRODUCT_XML:
    0 Rows successfully loaded.
    0 Rows not loaded due to data errors.
    0 Rows not loaded because all WHEN clauses were failed.
    0 Rows not loaded because all fields were null.
    Space allocated for bind array: 256 bytes(64 rows)
    Read buffer bytes: 1048576
    Total logical records skipped: 0
    Total logical records rejected: 0
    Total logical records discarded: 0
    Run began on H. Febr. 11 18:57:09 2013
    Run ended on H. Febr. 11 19:20:54 2013
    Elapsed time was: 00:23:45.76
    CPU time was: 00:05:05.50
    This is the log.
    I have truncated everything; I cannot understand why I am not able to load 400 MB into 4 GB.
    Windows is an unlicensed 32-bit installation.
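    The log shows the run died on ORA-01652 in the TEMP tablespace rather than on the XML parsing itself, which suggests the temporary tablespace ran out of room while the 400 MB LOB was being staged. One thing to try, assuming you have the disk space and DBA rights (the file name and sizes below are only examples), is to give TEMP more room before rerunning the load:
    ALTER TABLESPACE temp ADD TEMPFILE 'C:\oraclexe\oradata\XE\temp02.dbf'
      SIZE 1G AUTOEXTEND ON NEXT 100M MAXSIZE 4G;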

  • BW SQL 2005 import problem

    Hi,
    I'm trying to import one view from a SQL 2005 DB. I took all the rules into consideration, such as the upper-case view name; I gave the required permissions to the BW user and also created the view with the same user. There's no problem reading the data source through BW. I created info objects and defined transfer rules as well. But unfortunately, when I try to load the data into the packages, it stays in a not-completed status with a yellow sign. I have 10 transactions in the view.
    I gave db owner, ddladmin and public permissions to the BW user on the SQL side. Any idea?

    Hi ilker,
    The link below may help you solve the issue and has some relevant information:
    http://technet.microsoft.com/en-us/library/dd299430.aspx
    Regards
    Sudheer

  • Solution to sql developer import problem - nullpointerexception

    This isn't really a question, but I thought I should get the record of my tribulations posted so someone else won't have to work as hard as I did.
    Importing from a spreadsheet that has a blank column will give 'an error has occurred' and a stack trace like:
    java.lang.NullPointerException
         at oracle.dbtools.raptor.dialogs.importdata.ExcelImportDialog.getStringValue(ExcelImportDialog.java:291)
         at oracle.dbtools.raptor.dialogs.importdata.ExcelImportDialog.populateModel(ExcelImportDialog.java:347)
         at oracle.dbtools.raptor.dialogs.importdata.ExcelImportDialog.displayExcelDialog(ExcelImportDialog.java:148)
         at
    Getting rid of that column should get you past this point and into the import wizard.

    The import-from-xls feature has been rewritten from scratch, and a new "Import Data" wizard is available in SQL Developer 1.5 EA3, which is yet to be announced. This should take care of blanks in xls files. The new wizard will also support import from csv files. Watch for the EA3 announcement to try out the new Import Data wizard.

  • Problem import csv file with SQL*loader and control file

    I have a *.csv file that looks like this:
    E0100070;EKKJ 1X10/10 1 KV;1;2003-06-16;01C;75
    E0100075;EKKJ 1X10/10 1 KV;500;2003-06-16;01C;67
    E0100440;EKKJ 2X2,5/2,5 1 KV;1;2003-06-16;01C;37,2
    E0100445;EKKJ 2X2,5/2,5 1 KV;500;2003-06-16;01C;33,2
    E0100450;EKKJ 2X4/4 1 KV;1;2003-06-16;01C;53
    E0100455;EKKJ 2X4/4 1 KV;500;2003-06-16;01C;47,1
    I want to import this csv file to this table:
    create table artikel (artnr varchar2(10), namn varchar2(25), fp_storlek number, datum date, mtrlid varchar2(5), pris number);
    My controlfile looks like this:
    LOAD DATA
    INFILE 'e:\test.csv'
    INSERT
    INTO TABLE ARTIKEL
    FIELDS TERMINATED BY ';'
    TRAILING NULLCOLS
    (ARTNR, NAMN, FP_STORLEK char "to_number(:fp_storlek,'99999')", DATUM date 'yyyy-mm-dd', MTRLID, pris char "to_number(:pris,'999999D99')")
    I can't get SQL*Loader to import the last column (pris) the way I want. It ignores my decimal separator, which in this case is "," and not "."; maybe that is the problem. If the decimal separator is the problem, how can I get Oracle to recognize "," as a decimal separator?
    The result of the import right now is that a decimal number (37,2) becomes 372 in the table.

    Set the NLS_NUMERIC_CHARACTERS environment variable at the OS level before running SQL*Loader:
    $ cat test.csv
    E0100070;EKKJ 1X10/10 1 KV;1;2003-06-16;01C;75
    E0100075;EKKJ 1X10/10 1 KV;500;2003-06-16;01C;67
    E0100440;EKKJ 2X2,5/2,5 1 KV;1;2003-06-16;01C;37,2
    E0100445;EKKJ 2X2,5/2,5 1 KV;500;2003-06-16;01C;33,2
    E0100450;EKKJ 2X4/4 1 KV;1;2003-06-16;01C;53
    E0100455;EKKJ 2X4/4 1 KV;500;2003-06-16;01C;47,1
    $ cat artikel.ctl
    LOAD DATA
    INFILE 'test.csv'
    replace
    INTO TABLE ARTIKEL
    FIELDS TERMINATED BY ';'
    TRAILING NULLCOLS
    (ARTNR, NAMN, FP_STORLEK char "to_number(:fp_storlek,'99999')", DATUM date 'yyyy-mm-dd', MTRLID, pris char "to_number(:pris,'999999D99')")
    $ sqlldr scott/tiger control=artikel
    SQL*Loader: Release 10.1.0.3.0 - Production on Sat Nov 12 15:10:01 2005
    Copyright (c) 1982, 2004, Oracle.  All rights reserved.
    Commit point reached - logical record count 6
    $ sqlplus scott/tiger
    SQL*Plus: Release 10.1.0.3.0 - Production on Sat Nov 12 15:10:11 2005
    Copyright (c) 1982, 2004, Oracle.  All rights reserved.
    Connected to:
    Oracle Database 10g Enterprise Edition Release 10.1.0.3.0 - Production
    With the Partitioning, OLAP and Data Mining options
    SQL> select * from artikel;
    ARTNR      NAMN                      FP_STORLEK DATUM      MTRLI       PRIS
    E0100070   EKKJ 1X10/10 1 KV                  1 16/06/2003 01C           75
    E0100075   EKKJ 1X10/10 1 KV                500 16/06/2003 01C           67
    E0100440   EKKJ 2X2,5/2,5 1 KV                1 16/06/2003 01C          372
    E0100445   EKKJ 2X2,5/2,5 1 KV              500 16/06/2003 01C          332
    E0100450   EKKJ 2X4/4 1 KV                    1 16/06/2003 01C           53
    E0100455   EKKJ 2X4/4 1 KV                  500 16/06/2003 01C          471
    6 rows selected.
    SQL> exit
    Disconnected from Oracle Database 10g Enterprise Edition Release 10.1.0.3.0 - Production
    With the Partitioning, OLAP and Data Mining options
    $ export NLS_NUMERIC_CHARACTERS=',.'
    $ sqlldr scott/tiger control=artikel
    SQL*Loader: Release 10.1.0.3.0 - Production on Sat Nov 12 15:10:41 2005
    Copyright (c) 1982, 2004, Oracle.  All rights reserved.
    Commit point reached - logical record count 6
    $ sqlplus scott/tiger
    SQL*Plus: Release 10.1.0.3.0 - Production on Sat Nov 12 15:10:45 2005
    Copyright (c) 1982, 2004, Oracle.  All rights reserved.
    Connected to:
    Oracle Database 10g Enterprise Edition Release 10.1.0.3.0 - Production
    With the Partitioning, OLAP and Data Mining options
    SQL> select * from artikel;
    ARTNR      NAMN                      FP_STORLEK DATUM      MTRLI       PRIS
    E0100070   EKKJ 1X10/10 1 KV                  1 16/06/2003 01C           75
    E0100075   EKKJ 1X10/10 1 KV                500 16/06/2003 01C           67
    E0100440   EKKJ 2X2,5/2,5 1 KV                1 16/06/2003 01C         37,2
    E0100445   EKKJ 2X2,5/2,5 1 KV              500 16/06/2003 01C         33,2
    E0100450   EKKJ 2X4/4 1 KV                    1 16/06/2003 01C           53
    E0100455   EKKJ 2X4/4 1 KV                  500 16/06/2003 01C         47,1
    6 rows selected.
    SQL>
    The control file is exactly the same as yours; I just used REPLACE instead of INSERT.
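    An alternative that does not depend on the OS environment is to pass the NLS setting directly to TO_NUMBER in the control file, so the decimal comma is handled per field. A sketch of just the last column's definition (the rest of the control file stays as above):
    pris char "to_number(:pris, '999999D99', 'NLS_NUMERIC_CHARACTERS='',.''')"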

  • URGENT: Problems Loading files with SQL Loader into a BLOB column

    Hi friends,
    I have read a lot about how to load files into BLOB columns, but I ran into errors that I can't solve.
    I've read several notes in these forums, one of them:
    sql loader: loading external file into blob
    and tried the solutions but without good results.
    Here are some of my tests:
    With this .ctl:
    LOAD DATA
    INFILE *
    INTO TABLE mytable
    REPLACE
    FIELDS TERMINATED BY ','
    (
    number1 INTEGER EXTERNAL,
    cad1 CHAR(250),
    image1 LOBFILE(cad1) TERMINATED BY EOF
    )
    BEGINDATA
    1153,/opt/oracle/appl/myapp/1.0.0/img/1153.JPG,
    the error when I execute sqlldr is:
    SQL*Loader-350: Syntax error at line 9.
    Expecting "," or ")", found "LOBFILE".
    image1 LOBFILE(cad1) TERMINATED BY EOF
    ^
    What is the problem with LOBFILE?
    (mytable, of course, has number1 as NUMBER, cad1 as VARCHAR2(250) and image1 as BLOB.)
    I also tried with:
    LOAD DATA
    INFILE sample.dat
    INTO TABLE mytable
    FIELDS TERMINATED BY ','
    (cad1 CHAR(3),
    cad2 FILLER CHAR(30),
    image1 BFILE(CONSTANT "/opt/oracle/appl/myapp/1.0.0/img/", cad2))
    sample.dat is:
    1153,1153.JPEG,
    and the error is:
    SQL*Loader-350: Syntax error at line 6.
    Expecting "," or ")", found "FILLER".
    cad2 FILLER CHAR(30),
    ^
    I also tried with a procedure, but without results...
    Any idea about these error messages?
    Thanks a lot.
    Jose L.

    > So you think that if one person put an "urgent" in the subject is screwing the problems of
    other people?
    Absolutely. You are telling them "My posting is more important than yours and deserves faster attention and resolution than yours!".
    So what could a typical response be? Someone telling you that his posting is more important by using the phrase "VERY URGENT!". And the next poster may decide that, no, his problem is even more important - and use "EXTREMELY URGENT!!" as the subject. And the next one then raises the stakes by claiming his problem is "CODE RED! CRITICAL. DEFCON 4. URGENT!!!!".
    Stupid, isn't it? As stupid as your insistence that there is nothing wrong with your pitiful clamoring for attention to your problem by saying it is urgent.
    What do the RFCs say about a meaningful title/subject in a public forum? I trust that you know what an RFC is? After all, you claim to have used public forums on the Internet for some years now.
    The RFC on "public forums" is called The Usenet Article Format. This is what it has to say about the SUBJECT of a public posting:
    =
    The "Subject" line (formerly "Title") tells what the message is about. It should be suggestive enough of the contents of the message to enable a reader to make a decision whether to read the message based on the subject alone. If the message is submitted in response to another message (e.g., is a follow-up) the default subject should begin with the four characters "Re: ", and the "References" line is required. For follow-ups, the use of the "Summary" line is encouraged.
    =
    ([url http://www.cs.tut.fi/~jkorpela/rfc/1036.html]RFC 1036, the Usenet article format)
    Or how about [url http://www.cs.tut.fi/~jkorpela/usenet/dont.html]The seven don'ts of Usenet?
    Point 7 of the Don'ts:
    Don't try to catch attention by typing something foolish like "PLEASE HELP ME!!!! URGENT!!! I NEED YOUR HELP!!!" into the Subject line. Instead, type something informative (using normal mixed case!) that describes the subject matter.
    Please tell me that you are not too thick to understand the basic principles of netiquette, or to argue with the RFCs that govern the very fabric of the Internet.
    As for when I have an "urgent" problem? In my "real" work? I take it up with Oracle Support on Metalink by filing an iTAR/SR. As any non-idiot should do with a real-life Oracle crisis problem.
    I do not barge into a public forum like you do, jump up and down, and demand quick attention by claiming that my problem is more important, more urgent, and more deserving of attention than other people's problems in the very same forum.

  • SQL*Loader problem - not efficient, parsing error for big xml files

    Hi Experts,
    First of all, I would like to store XML files in an object-relational way. Therefore I created a schema and a table for it (see below).
    I want to populate it (using generated XML files), hence I created a control file for SQL*Loader (see below).
    I have two problems with it.
    1. It takes a lot of time: I can upload a ~80 MB file in about two and a half hours.
    2. With bigger files, I get the following error messages (OCI-31011: XML parsing failed, OCI-19202: Error occurred in XML processing, LPX-00243: element attribute value must be enclosed in quotes). This is quite interesting because the XML file is generated, and I was able to generate and upload the first and second halves of the file separately.
    Can you help me to solve these problems?
    Thanks,
    Adam
    Control file
    UNRECOVERABLE
    LOAD DATA
    CHARACTERSET UTF8
    INFILE *
    APPEND
    INTO TABLE coll_xml_objrel
    XMLTYPE(xml)
    FIELDS
    (
    ident constant 2
    ,file_name filler char(100)
    ,xml LOBFILE (file_name) TERMINATED BY EOF
    )
    BEGINDATA
    generated1000x10000.xml
    Sql Loader command
    sqlldr.exe username/password@//localhost:1521/SID control='loader.ctl' log='loadr.log' direct=true
    Schema
    <?xml version="1.0" encoding="UTF-8"?>
    <schema targetNamespace="http://www.something.com/shema/simple_searches" elementFormDefault="qualified" xmlns="http://www.w3.org/2001/XMLSchema" xmlns:tns="http://www.something.com/shema/simple_searches">
        <element name="searches" type="tns:searches_type"></element>
        <element name="search" type="tns:search_type"></element>
        <element name="results" type="tns:results_type"></element>
        <element name="result" type="tns:result_type"></element>
        <complexType name="searches_type">
            <sequence>
                <element ref="tns:search" maxOccurs="unbounded"></element>
            </sequence>
        </complexType>
        <complexType name="search_type">
            <sequence>
                <element ref="tns:results"></element>
            </sequence>
            <attribute ref="tns:id" use="required"></attribute>
            <attribute ref="tns:type" use="required"></attribute>
        </complexType>
        <complexType name="results_type">
            <sequence maxOccurs="unbounded">
                <element ref="tns:result"></element>
            </sequence>
        </complexType>
        <complexType name="result_type">
            <attribute ref="tns:id" use="required"></attribute>
        </complexType>
        <simpleType name="type_type">
            <restriction base="string">
                <enumeration value="value1"></enumeration>
                <enumeration value="value2"></enumeration>
            </restriction>
        </simpleType>
        <attribute name="type" type="tns:type_type"></attribute>
        <attribute name="id" type="string"></attribute>
    </schema>
    Create table
    create table coll_xml_objrel (
    ident Number(20) primary key,
    xml xmltype)
    Xmltype column xml
    store as object relational
    xmlschema "http://www.something.com/schema/simple_searches.xsd"
    Element "searches";

    Hi Odie_63,
    Thanks for your answer.
    I will post this question in the XML DB forum too (edit: I realized that you have done it. Thanks for it).
    1. Version: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
    2. See above.
    3. I have registered my schema using the dbms_xmlschema.registerSchema procedure.
    Cheers,
    Adam
    XML generator:
    import java.io.FileNotFoundException;
    import java.io.FileOutputStream;
    import javax.xml.stream.XMLOutputFactory;
    import javax.xml.stream.XMLStreamException;
    import javax.xml.stream.XMLStreamWriter;
    public class mainGenerator {
        public static void main(String[] args) throws FileNotFoundException, XMLStreamException {
            final long numberOfSearches = 500;
            final long numberOfResults = 10000;
            XMLOutputFactory xof = XMLOutputFactory.newFactory();
            XMLStreamWriter writer = xof.createXMLStreamWriter(new FileOutputStream("C:\\Working\\generated500x10000.xml"));
            writer.writeStartDocument();
            writer.writeStartElement("tns", "searches", "http://www.something.com/schema/simple_searches");
            writer.writeNamespace("tns", "http://www.something.com/schema/simple_searches");
            for (long i = 0; i < numberOfSearches; i++) {
                Long help = new Long(i);
                writer.writeStartElement("tns", "search", "http://www.something.com/schema/simple_searches");
                writer.writeAttribute("tns", "http://www.something.com/schema/simple_searches", "type", "value1");
                writer.writeAttribute("tns", "http://www.something.com/schema/simple_searches", "id", help.toString());
                writer.writeStartElement("tns", "results", "http://www.something.com/schema/simple_searches");
                for (long j = 0; j < numberOfResults; j++) {
                    writer.writeStartElement("tns", "result", "http://www.something.com/schema/simple_searches");
                    Long helper = new Long(i * numberOfResults + j);
                    writer.writeAttribute("tns", "http://www.something.com/schema/simple_searches", "id", helper.toString());
                    writer.writeEndElement(); // result
                }
                writer.writeEndElement(); // results
                writer.writeEndElement(); // search
            }
            writer.writeEndElement(); // searches
            writer.writeEndDocument();
            writer.close();
        }
    }
    registerSchema:
    begin
    dbms_xmlschema.registerSchema(
    'http://www.something.com/schema/simple_searches',
    '<?xml version="1.0" encoding="UTF-8"?>
    <schema targetNamespace="http://www.something.com/schema/simple_searches" elementFormDefault="qualified" xmlns="http://www.w3.org/2001/XMLSchema" xmlns:tns="http://www.something.com/schema/simple_searches">
        <element name="searches" type="tns:searches_type"></element>
        <element name="search" type="tns:search_type"></element>
        <element name="results" type="tns:results_type"></element>
        <element name="result" type="tns:result_type"></element>
        <complexType name="searches_type">
            <sequence>
                <element ref="tns:search" maxOccurs="unbounded"></element>
            </sequence>
        </complexType>
        <complexType name="search_type">
            <sequence>
                <element ref="tns:results"></element>
            </sequence>
            <attribute ref="tns:id" use="required"></attribute>
            <attribute ref="tns:type" use="required"></attribute>
        </complexType>
        <complexType name="results_type">
            <sequence maxOccurs="unbounded">
                <element ref="tns:result"></element>
            </sequence>
        </complexType>
        <complexType name="result_type">
            <attribute ref="tns:id" use="required"></attribute>
        </complexType>
        <simpleType name="type_type">
            <restriction base="string">
                <enumeration value="value1"></enumeration>
                <enumeration value="value2"></enumeration>
            </restriction>
        </simpleType>
        <attribute name="type" type="tns:type_type"></attribute>
        <attribute name="id" type="string"></attribute>
    </schema>',
    TRUE, TRUE, FALSE, FALSE);
    end;
    /

  • Problem using SQL Loader with ODI

    Hi,
    I am having problems using SQL*Loader with ODI. I am trying to fill an Oracle table with data from a txt file. At first I used the "File to SQL" LKM, but due to the size of the source txt file (700 MB), I decided to use the "File to Oracle (SQLLDR)" LKM.
    The error that appears in myFile.txt.log is: "SQL*Loader-101: Invalid argument for username/password"
    I think the problem could be in the definition of the data server (physical architecture in Topology), because I have left Host, User and Password blank.
    Is this the problem? What host and user should I use? With "File to SQL" it works fine leaving these blank, but it takes too much time.
    Thanks in advance

    I tried to use your code, but I couldn't make it work (I don't know Jython). I think the problem could be with the use of quotes.
    Here is what I wrote:
    import os
    retVal = os.system(r'sqlldr control=E:\Public\TXTODI\PROFITA2/Profita2Final.txt.ctl log=E:\Public\TXTODI\PROFITA2/Profita2Final.txt.log userid=MYUSER/myPassword @ mySID')
    if retVal == 1 or retVal > 2:
        raise 'SQLLDR failed. Please check the for details '
    And the error message is:
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (innermost last):
    File "<string>", line 5, in ?
    SQLLDR failed. Please check the for details
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
         at com.sunopsis.dwg.codeinterpretor.k.a(k.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSqlC.treatTaskTrt(SnpSessTaskSqlC.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
         at com.sunopsis.dwg.cmd.e.i(e.java)
         at com.sunopsis.dwg.cmd.h.y(h.java)
         at com.sunopsis.dwg.cmd.e.run(e.java)
         at java.lang.Thread.run(Unknown Source)
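    One thing worth checking, independent of the Jython error handling, is the userid argument itself: SQL*Loader-101 ("Invalid argument for username/password") is what you get when that string is malformed, and the call above has spaces around the @ sign. A sketch of the same call with those spaces removed (user, password, SID and paths are the placeholders from the post):
    import os
    retVal = os.system(r'sqlldr userid=MYUSER/myPassword@mySID control=E:\Public\TXTODI\PROFITA2/Profita2Final.txt.ctl log=E:\Public\TXTODI\PROFITA2/Profita2Final.txt.log')
    # sqlldr exits with 0 on success and 2 on warnings; anything else is treated as failure
    if retVal == 1 or retVal > 2:
        raise Exception('SQLLDR failed. Please check the log for details.')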

  • Problem with field-length in sql-loader

    Hello,
    (sorry I see it's the wrong forum -> SQL-Developer, I searched for SQL-Loader, is there a possibility to change the forum ?)
    I can't find an answer for my question at google, so I hope there is someone in this forum who can help me.
    I have a dat file that contains 12 500 000 records and want to load it via SQL*Loader. The first field contains the ID, and it holds numbers from 1 to 12 500 000.
    When I run SQL*Loader, the IDs run up to 9 999 999 and then, for the last 2 500 001 records, start at 1 again.
    I noticed a few things:
    1. Numbers < 10 000 000 cause no problems.
    2. Numbers >= 10 000 000 cause problems: the first digit (in this example "1") is cut, so the number "10 000 001" is stored as "1". This leads to duplicate entries (IDs 1 to 2 500 000).
    3. I have the same field definition for the third field of the record, and there is no problem there; I can store any number.
    4. I tried to store a number > 100 000 000; the first digit was cut too, but ONLY the first digit.
    5. I am able to store any number manually in the database.
    So, I have a problem with the first field. If the number is greater than 10 000 000, the first digit is cut. It makes no difference whether the number is 10 000 000 or 999 999 999; just the first digit, in the first field, is cut.
    Any idea?
    Here some infos :
    Database :
    Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
    SQL-PLUS :
    SQL*Plus: Release 10.2.0.4.0 - Production
    Script sqlldr :
    sqlplus ${schema}/$2 <<EOF >>LOAD.LOG
    set timing on
    set echo on
    set heading off
    set heading on
    !sqlldr userid=${schema}/$2 control=surface_geometry.ctl log=surface_geometry.log
    exit
    EOF
    ctl-File :
    LOAD DATA
    INFILE imp_surface_geometry_test2
    TRUNCATE
    CONTINUEIF NEXT(1:1) = '#'
    INTO TABLE SURFACE_GEOMETRY
    FIELDS TERMINATED BY '|'
    TRAILING NULLCOLS (
    ID INTEGER EXTERNAL ,
    GMLID,
    GMLID_CODESPACE,
    PARENT_ID NULLIF PARENT_ID = BLANKS,
    ROOT_ID NULLIF ROOT_ID = BLANKS,
    IS_SOLID,
    IS_COMPOSITE,
    IS_TRIANGULATED,
    IS_XLINK,
    IS_REVERSE,
    GEB_ID,
    GEOMETRY COLUMN OBJECT
    (
    SDO_GTYPE INTEGER EXTERNAL,
    SDO_SRID CONSTANT 31468,
    SDO_ELEM_INFO VARRAY TERMINATED BY '|/'
    (X FLOAT EXTERNAL),
    SDO_ORDINATES VARRAY TERMINATED BY '|/'
    (X FLOAT EXTERNAL)
    )
    )
    Table-Definition (sql-File) :
    CREATE TABLE SURFACE_GEOMETRY (
    ID NUMBER,
    GMLID VARCHAR2(256),
    GMLID_CODESPACE VARCHAR2(1000),
    PARENT_ID NUMBER,
    ROOT_ID NUMBER,
    IS_SOLID NUMBER(1,0),
    IS_COMPOSITE NUMBER(1,0),
    IS_TRIANGULATED NUMBER(1,0),
    IS_XLINK NUMBER(1,0),
    IS_REVERSE NUMBER(1,0),
    GEB_ID CHAR(7),
    GEOMETRY MDSYS.SDO_GEOMETRY,
    CONSTRAINT c_unique_id UNIQUE (ID))
    storage (initial 1M next 1M maxextents 1024) ;
    Some Entries in the dat-File :
    12556067| |XXX|12556066|12556066|0|0|0|0|0| |
    #3003|1|1003|1|/
    #4479400.000000|5333360.000000| 526.870000|4479380.000000|5333360.000000| 526.720000|4479400.000000|5333340.000000| 526.980000|4479400.000000|5333360.000000| 526.870000|/
    12556068| |XXX| |12556068|0|0|1|0|0| |
    #||/
    #|/
    12556069| |XXX|12556068|12556068|0|0|0|0|0| |
    #3003|1|1003|1|/
    #4479380.000000|5333380.000000| 526.600000|4479380.000000|5333360.000000| 526.720000|4479400.000000|5333360.000000| 526.870000|4479380.000000|5333380.000000| 526.600000|/
    log-File : (100 records for the test)
    SQL*Loader: Release 10.2.0.4.0 - Production on Fri May 28 15:16:43 2010
    Copyright (c) 1982, 2007, Oracle. All rights reserved.
    Control File: surface_geometry.ctl
    Data File: imp_surface_geometry_test.dat
    Bad File: imp_surface_geometry_test.bad
    Discard File: none specified
    (Allow all discards)
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 50
    Bind array: 64 rows, maximum of 256000 bytes
    Continuation: 1:1 = 0X23(character '#'), in next physical record
    Path used: Conventional
    Table SURFACE_GEOMETRY, loaded from every logical record.
    Insert option in effect for this table: TRUNCATE
    TRAILING NULLCOLS option in effect
    Column Name Position Len Term Encl Datatype
    ID FIRST * | CHARACTER
    GMLID NEXT * | CHARACTER
    GMLID_CODESPACE NEXT * | CHARACTER
    PARENT_ID NEXT * | CHARACTER
    NULL if PARENT_ID = BLANKS
    ROOT_ID NEXT * | CHARACTER
    NULL if ROOT_ID = BLANKS
    IS_SOLID NEXT * | CHARACTER
    IS_COMPOSITE NEXT * | CHARACTER
    IS_TRIANGULATED NEXT * | CHARACTER
    IS_XLINK NEXT * | CHARACTER
    IS_REVERSE NEXT * | CHARACTER
    GEB_ID NEXT * | CHARACTER
    GEOMETRY DERIVED * COLUMN OBJECT
    *** Fields in GEOMETRY
    SDO_GTYPE NEXT * | CHARACTER
    SDO_SRID CONSTANT
    Value is '31468'
    SDO_ELEM_INFO DERIVED * VARRAY
    Terminator string : '|/'
    *** Fields in GEOMETRY.SDO_ELEM_INFO
    X FIRST * | CHARACTER
    *** End of fields in GEOMETRY.SDO_ELEM_INFO
    SDO_ORDINATES DERIVED * VARRAY
    Terminator string : '|/'
    *** Fields in GEOMETRY.SDO_ORDINATES
    X FIRST * | CHARACTER
    *** End of fields in GEOMETRY.SDO_ORDINATES
    *** End of fields in GEOMETRY
    Table SURFACE_GEOMETRY:
    100 Rows successfully loaded.
    0 Rows not loaded due to data errors.
    0 Rows not loaded because all WHEN clauses were failed.
    0 Rows not loaded because all fields were null.
    Space allocated for bind array: 232576 bytes(64 rows)
    Read buffer bytes: 1048576
    Total logical records skipped: 0
    Total logical records read: 100
    Total logical records rejected: 0
    Total logical records discarded: 0
    Run began on Fri May 28 15:16:43 2010
    Run ended on Fri May 28 15:16:43 2010
    Elapsed time was: 00:00:00.19
    CPU time was: 00:00:00.01
    Edited by: user9338988 on 28.05.2010 06:21

    sorry, wrong forum. I opened the thread in forum "Export/Import/SQL-Loader & External Tables"

  • How to Import data via SQL Loader with characterset  UTF16 little endian?

    Hello,
    I'm importing data from a text file into one of my tables, which contains a BLOB column.
    I've specified the following in my control file:
    -----Control file-------
    LOAD DATA
    CHARACTERSET UTF16
    BYTEORDER LITTLE
    INFILE './DataFiles/Data.txt'
    BADFILE './Logs/Data.bad'
    INTO TABLE temp_blob truncate
    FIELDS TERMINATED BY "     "
    TRAILING NULLCOLS
    (GROUP_BLOB,CODE)
    Problem:
    SQL*Loader is always importing the data as big endian. Is there any method available by which we can convert this data to little endian?
    Thanks

    A new preference has been added to customize the import delimiter in the main code line. This should be available as part of a future release.
