Problem in SQL Loader

Hi Experts,
I'm using SQL*Loader to load data from an XML file into the DB. My control file was something like this:
load data
infile 'D:\data.xml' "str '</dataNode>'"
replace
into table MY_TABLE
where </dataNode> acts as the record separator. This was working fine. What I'm trying to do now is write all the parameters to a PARFILE, then pass only the PARFILE to SQL*Loader, like the following:
sqlldr PARFILE=myParaFile.par
where myParaFile.par looks like this:
userid=xxx/xxx
control=xxx.ctl
log=xxx.log
data=D:\data.xml
The problem is that I have now removed the INFILE clause from the control file and put the "data" parameter in the PARFILE instead. The question is: where should I write "str '</dataNode>'" to tell SQL*Loader that the input data is in stream format and that </dataNode> is the record separator?
I really appreciate your help.

My XML File:
<dataNode>
<ProductID>1</ProductID>
<Type>Phone</Type>
</dataNode>
<dataNode>
<ProductID>2</ProductID>
<Type>Sim</Type>
</dataNode>
My Control File:
load data
infile 'D:\data.xml' "str '</dataNode>'"
replace
into table MY_TABLE
(
dummy filler terminated by "<dataNode>",
ProductID enclosed by "<ProductID>" and "</ProductID>",
Type enclosed by "<Type>" and "</Type>"
)
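
For what it's worth, a minimal sketch of the layout being discussed, assuming the "str" record-terminator option can only be given on an INFILE line in the control file (the DATA command-line/parfile parameter takes just a file name), so the control file keeps the INFILE clause with the options string while the PARFILE supplies everything else:

myParaFile.par:
userid=xxx/xxx
control=xxx.ctl
log=xxx.log

xxx.ctl:
load data
infile 'D:\data.xml' "str '</dataNode>'"
replace
into table MY_TABLE
(
dummy filler terminated by "<dataNode>",
ProductID enclosed by "<ProductID>" and "</ProductID>",
Type enclosed by "<Type>" and "</Type>"
)

This is only a sketch of one possible arrangement, not a verified answer.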

Similar Messages

  • Problem using SQL Loader with ODI

    Hi,
I am having problems using SQL*Loader with ODI. I am trying to fill an Oracle table with data from a txt file. At first I used the "File to SQL" LKM, but due to the size of the source txt file (700MB), I decided to use the "File to Oracle (SQLLDR)" LKM.
The error that appears in myFile.txt.log is: "SQL*Loader-101: Invalid argument for username/password"
I think the problem could be in the definition of the data server (physical architecture in topology), because I have left Host, user and password blank.
Is this the problem? What host and user should I use? With "File to SQL" it works fine leaving these blank, but it takes too much time.
    Thanks in advance

I tried to use your code, but I couldn't make it work (I don't know Jython). I think the problem could be with the use of quotes.
    Here is what I wrote:
    import os
    retVal = os.system(r'sqlldr control=E:\Public\TXTODI\PROFITA2/Profita2Final.txt.ctl log=E:\Public\TXTODI\PROFITA2/Profita2Final.txt.log userid=MYUSER/myPassword @ mySID')
if retVal == 1 or retVal > 2:
    raise 'SQLLDR failed. Please check the for details '
    And the error message is:
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (innermost last):
    File "<string>", line 5, in ?
    SQLLDR failed. Please check the for details
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
         at com.sunopsis.dwg.codeinterpretor.k.a(k.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSqlC.treatTaskTrt(SnpSessTaskSqlC.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
         at com.sunopsis.dwg.cmd.e.i(e.java)
         at com.sunopsis.dwg.cmd.h.y(h.java)
         at com.sunopsis.dwg.cmd.e.run(e.java)
         at java.lang.Thread.run(Unknown Source)
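
If the Jython route is used, one thing worth checking is that the connect string reaches sqlldr as a single token, with no spaces around the "@"; a hedged sketch of the command being built (same paths and placeholder credentials as in the post, not a verified fix):

sqlldr userid=MYUSER/myPassword@mySID control=E:\Public\TXTODI\PROFITA2/Profita2Final.txt.ctl log=E:\Public\TXTODI\PROFITA2/Profita2Final.txt.log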

  • Problem with SQL*Loader loading long description with carriage return

I'm trying to load new items into mtl_system_items_interface via a concurrent
program running SQL*Loader. The load is erroring out due to not finding a
delimiter - I'm guessing it's having problems with the long_description.
    Here's my ctl file:
    LOAD
    INFILE 'create_prober_items.csv'
    INTO TABLE MTL_SYSTEM_ITEMS_INTERFACE
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (PROCESS_FLAG "TRIM(:PROCESS_FLAG)",
    SET_PROCESS_ID "TRIM(:SET_PROCESS_ID)",
    TRANSACTION_TYPE "TRIM(:TRANSACTION_TYPE)",
    ORGANIZATION_ID "TRIM(:ORGANIZATION_ID)",
    TEMPLATE_ID "TRIM(:TEMPLATE_ID)",
    SEGMENT1 "TRIM(:SEGMENT1)",
    SEGMENT2 "TRIM(:SEGMENT2)",
    DESCRIPTION "TRIM(:DESCRIPTION)",
    LONG_DESCRIPTION "TRIM(:LONG_DESCRIPTION)")
    Here's a sample record from the csv file:
    1,1,CREATE,0,546,03,B00-100289,PROBEHEAD PH100 COMPLETE/ VACUUM/COAX ,"- Linear
    X axis, Y,Z pivots
    - Movement range: X: 8mm, Y: 6mm, Z: 25mm
    - Probe tip pressure adjustable contact
    - Vacuum adapter
    - With shielded arm
    - Incl. separate miniature female HF plug
    The long_description has to appear as:
    - something
    - something
    It can't appear as:
    -something-something
    Here's the errors:
    Record 1: Rejected - Error on table "INV"."MTL_SYSTEM_ITEMS_INTERFACE", column
    LONG_DESCRIPTION.
    Logical record ended - second enclosure character not present
    Record 2: Rejected - Error on table "INV"."MTL_SYSTEM_ITEMS_INTERFACE", column
    ORGANIZATION_ID.
    Column not found before end of logical record (use TRAILING NULLCOLS)
I've asked for help on the Metalink forum and was advised to add TRAILING NULLCOLS to the ctl, so the ctl line now looks like:
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' TRAILING NULLCOLS
I don't think this was right, because now I'm getting:
    Record 1: Rejected - Error on table "INV"."MTL_SYSTEM_ITEMS_INTERFACE", column LONG_DESCRIPTION.
    Logical record ended - second enclosure character not present
    Thanks for any help that may be offered.
    -Tracy

    LOAD
    INFILE 'create_prober_items.csv'
    CONTINUEIF LAST <> '"'
    INTO TABLE MTL_SYSTEM_ITEMS_INTERFACE
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' TRAILING NULLCOLS
    (PROCESS_FLAG "TRIM(:PROCESS_FLAG)",
    SET_PROCESS_ID "TRIM(:SET_PROCESS_ID)",
    TRANSACTION_TYPE "TRIM(:TRANSACTION_TYPE)",
    ORGANIZATION_ID "TRIM(:ORGANIZATION_ID)",
    TEMPLATE_ID "TRIM(:TEMPLATE_ID)",
    SEGMENT1 "TRIM(:SEGMENT1)",
    SEGMENT2 "TRIM(:SEGMENT2)",
    DESCRIPTION "TRIM(:DESCRIPTION)",
    LONG_DESCRIPTION "REPLACE (TRIM(:LONG_DESCRIPTION), '-', CHR(10) || '-')")

  • Problem specifying SQL Loader Log file destination using EM

    Good evening,
    I am following the example given in the 2 Day DBA document chapter 8 section 16.
In step 5 of 7, EM does not allow me to specify a mapped network drive as the destination of the SQL*Loader log file.
    The question: Does SQL Loader have a limitation that I am not aware of, that prevents placing the log file on a network share or am I getting this error because of something else I am inadvertently doing wrong ?
    Note: I have placed the DDL, load file data and steps I follow in EM at the bottom of this post to facilitate reproducing the problem *(drive Z is a mapped drive)*.
    Thank you for your help,
    John.
    DDL (generated using SQL developer, you may want to change the space allocated to be less)
    CREATE TABLE "NICK"."PURCHASE_ORDERS"
        "PO_NUMBER"      NUMBER NOT NULL ENABLE,
        "PO_DESCRIPTION" VARCHAR2(200 BYTE),
        "PO_DATE" DATE NOT NULL ENABLE,
        "PO_VENDOR" NUMBER NOT NULL ENABLE,
        "PO_DATE_RECEIVED" DATE,
        PRIMARY KEY ("PO_NUMBER") USING INDEX PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS NOCOMPRESS LOGGING TABLESPACE "USERS" ENABLE
      SEGMENT CREATION DEFERRED PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING STORAGE
        INITIAL 67108864
      TABLESPACE "USERS" ;
    Load.dat file contents
    1, Office Equipment, 25-MAY-2006, 1201, 13-JUN-2006
    2, Computer System, 18-JUN-2006, 1201, 27-JUN-2006
    3, Travel Expense, 26-JUN-2006, 1340, 11-JUL-2006
    Steps I am carrying out in EM
    log in, select data movement -> Load Data from User Files
    Automatically generate control file
    (enter host credentials that work on your machine)
    continue
    Step 1 of 7 ->
      Data file is located on your browser machine
      "Z:\Documentation\Oracle\2DayDBA\Scripts\Load.dat"
       click next
    step 2 of 7 ->
      Table Name
      nick.purchase_orders
      click next
    step 3 of 7 ->
      click next
    step 4 of 7 ->
      click next
    step 5 of 7 ->
      Generate log file where logging information is to be stored
      Z:\Documentation\Oracle\2DayDBA\Scripts\Load.LOG
      Validation Error
      Examine and correct the following errors, then retry the operation:
      LogFile - The directory does not exist.

Hi John,
I didn't get any error when I did the same thing you did.
My Oracle version is 10.2.0.1 on Windows XP. Here is what I did, and it worked:
1. I created one table in the scott schema:
    SCOTT@orcl> CREATE TABLE "PURCHASE_ORDERS"
      2  (
      3      "PO_NUMBER"      NUMBER NOT NULL ENABLE,
      4      "PO_DESCRIPTION" VARCHAR2(200 BYTE),
      5      "PO_DATE" DATE NOT NULL ENABLE,
      6      "PO_VENDOR" NUMBER NOT NULL ENABLE,
      7      "PO_DATE_RECEIVED" DATE,
      8      PRIMARY KEY ("PO_NUMBER") USING INDEX PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS NOCOMPRESS LOGGING TABLESPACE "USERS" ENABLE
      9  )
    10  TABLESPACE "USERS";
Table created.
I logged into EM: Maintenance --> Data Movement --> Load Data from User Files --> My Host Credentials.
Here there are a total of 3 text boxes:
1. Server Data File : C:\ORACLE\PRODUCT\10.2.0\ORADATA\ORCL\USERS01.DBF
2. Data File is Located on Your Browser Machine : z:\load.dat <--- Here z:\ is another machine's shared doc folder; I selected this option (radio button) and created the same load.dat you mentioned.
3. Temporary File Location : C:\ORACLE\PRODUCT\10.2.0\ORADATA\ORCL\ <--- I didn't change anything here.
    Step 2 of 7 Table Name : scott.PURCHASE_ORDERS
    Step 3 of 7 I just clicked Next
    Step 4 of 7 I just clicked Next
    Step 5 of 7 I just clicked Next
    Step 6 of 7 I just clicked Next
    Step 7 of 7 Here it is Control File Contents:
LOAD DATA
APPEND
INTO TABLE scott.PURCHASE_ORDERS
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(
PO_NUMBER INTEGER EXTERNAL,
PO_DESCRIPTION CHAR,
PO_DATE DATE,
PO_VENDOR INTEGER EXTERNAL,
PO_DATE_RECEIVED DATE
)
    And i just clicked on submit job.
Now I got all 3 rows in purchase_orders:
SCOTT@orcl> select count(*) from purchase_orders;
  COUNT(*)
         3
So, there is no bug; it worked. Please retry if you get any error/issue.
    HTH
    Girish Sharma

  • Problem with SQL loader - "maximum length"

    using SQL*Loader: Release 8.1.7.0.0
    ===================================
    (full CTL enclosed below)
I have a problem with several rows, in which I'm getting the "Field in data file exceeds maximum length" error.
The DB column (referer) is a VARCHAR2(4000), and the field in the error rows never exceeds a few hundred characters. According to the Oracle docs I should be able to load fields that are no bigger than the DB column, so what gives?
I tried the variation
referer CHAR "SUBSTR(:referer,1,100)"
for this field, which causes all the "referer" columns in the "good" rows to load no more than 100 characters, but the same error repeats for the same rows!
The input file is an IIS log, and the field is the REFERER field. It is pure ASCII; is there some character that causes Oracle to behave this way? Is this a bug?
Here is one "bad" row: the "bad" field starts with "http://web..." and is enclosed in quotes. I have replaced the client IP and other fields with xxx for privacy reasons.
    after that, I have enclosed my CTL as well.
    any help ?
    Yoram Ayalon
    BTW - I verified in the LOG file that the loader is reading my options for the columns as I described in the CTL. no problem there.
    "2003-06-30 11:11:12" xxx.xxx.xxx.xxx WEBSRVXX 80 GET /xxx.xxx 200 0 778 1359 "ht
    tp://web.ask.com/redir?bpg=http%3a%2f%2fweb.ask.com%2fweb%3fq%3dWhat%2bis%2bsign
    al%2bcommunication%253f%26o%3d0%26page%3d1&q=What+is+signal+communication%3f&u=h
    ttp%3a%2f%2ftm.wc.ask.com%2fr%3ft%3dan%26s%3da%26uid%3d032EBF1A318A100F3%26sid%3
    d3d2bbe4f8d2bbe4f8%26qid%3d4B2346DA8A56C6418CB4DCB9091EEBA7%26io%3d0%26sv%3dza5c
    b0db2%"
LOAD DATA
INFILE '/tmp/mod_websrvxx.txt'
APPEND INTO TABLE tmpLogs
FIELDS TERMINATED BY WHITESPACE optionally enclosed by '"'
(LogDate DATE "YYYY-MM-DD HH24:MI:SS", ClientIP, ServerName, ServerPort,
ClientMethod, UriStem, Status, BytesSent, BytesReceived, TimeTaken,
Referer CHAR "SUBSTR(:Referer, 1, 100)")

    Use:
    readsize=3000000
or some large number to raise this limit.
Check the link below for a detailed explanation:
    http://asktom.oracle.com/pls/ask/f?p=4950:8:380010202423963671::NO::F4950_P8_DISPLAYID,F4950_P8_CRITERIA:2167288643374,
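
Another change that is commonly suggested for "Field in data file exceeds maximum length" is to give the field an explicit length in the control file, because SQL*Loader's character fields default to a 255-byte maximum regardless of the width of the target column. A sketch of that column definition (untested against this particular file):

Referer CHAR(4000) "SUBSTR(:Referer, 1, 100)"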

  • A problem about sql*load

How do I load LOB data into the database?
I have already read the Oracle 8i documentation on SQL*Loader; in general, the LOB fields in the samples are smaller than 4000 bytes.
How do I load data larger than 4000 bytes into a LOB?
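For values longer than 4000 bytes, the SQL*Loader documentation describes loading LOB columns from secondary data files with the LOBFILE clause. A minimal sketch, with made-up table, column and file names:

LOAD DATA
INFILE 'master.dat'
INTO TABLE my_docs
FIELDS TERMINATED BY ','
( doc_id    CHAR(10),
  fname     FILLER CHAR(100),
  doc_text  LOBFILE(fname) TERMINATED BY EOF
)

Here each line of master.dat would carry an id and the name of a file whose entire contents are loaded into the doc_text LOB column.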

I already knew the solution to that problem, but I have run into another one.
I checked my SQL 7 database again and found that the large columns are of the ntext datatype. I changed ntext to text and migrated the data again. In SQL*Plus I found that the data in the CLOB columns had turned into '????.?-?'. I know this is caused by NLS_LANG; my NLS_LANG is simplified chinese_china.zhs16gbk. How do I set my NLS_LANG to get the right data?

  • Problem with SQL*Loader and different date formats in the same file

    DB: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bi
    System: AIX 5.3.0.0
    Hello,
    I'm using SQL*Loader to import semi-colon separated values into a table. The files are delivered to us by a data provider who concatenates data from different sources and this results in us having different date formats within the same file. For example:
    ...;2010-12-31;22/11/1932;...
    I load this data using the following lines in the control file:
    EXECUTIONDATE1     TIMESTAMP     NULLIF EXECUTIONDATE1=BLANKS     "TO_DATE(:EXECUTIONDATE1, 'YYYY-MM-DD')",
    DELDOB          TIMESTAMP     NULLIF DELDOB=BLANKS          "TO_DATE(:DELDOB, 'DD/MM/YYYY')",
    The relevant NLS parameters:
    NLS_LANGUAGE=FRENCH
    NLS_DATE_FORMAT=DD/MM/RR
    NLS_DATE_LANGUAGE=FRENCH
If I load this file as is, the values loaded into the table are 31 Dec 2010 and 22 Nov *2032*, even though the years are given with 4 digits. If I change NLS_DATE_FORMAT to DD/MM/YYYY then the second date value is loaded correctly, but the first value is loaded as 31 Dec *2020*!
How can I get both date values to load correctly?
    Thanks!
    Sylvain

This is very strange; after running a few tests I realized that if the year is 19XX it gets loaded as 2019, and if it is 20XX it becomes 2020. I'm guessing it may have something to do with certain environment variables that aren't set up properly, because I'm fairly sure my SQL*Loader control file is correct... I'll run more tests :-(
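
One approach that is often suggested for mixed formats like this is to read both fields as plain character data and let the SQL expression do all of the conversion, so the session's NLS date settings never come into play. A sketch of the two column definitions (not verified against this exact file):

EXECUTIONDATE1  CHAR  NULLIF EXECUTIONDATE1=BLANKS  "TO_DATE(:EXECUTIONDATE1, 'YYYY-MM-DD')",
DELDOB          CHAR  NULLIF DELDOB=BLANKS          "TO_DATE(:DELDOB, 'DD/MM/YYYY')",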

Peculiar problem in SQL*Loader

Hi,
I have a SQL*Loader control file which is used to load data into multiple tables.
The format of the file is something like:
#10#......... <<header record>>
#20#...... <<body record>>
#20# <<body record>>
#EOF# <<marks the end of the file>>
The control file is as follows:
LOAD DATA
INFILE "C:\WINDOWS\system32\multi1.txt"
APPEND
INTO TABLE BROADCAST_HEADER
WHEN (1:4)='#10#'
(tag_number "broadcast_header.nextval",
processing_flag constant 'N'
,A POSITION (5:24) Char
,B POSITION (26:35) Char
,C POSITION (37:46) Char
,D POSITION (48:62) INTEGER EXTERNAL
,E POSITION (64:78) Char
)
INTO TABLE BROADCAST_BODY
WHEN (1:4)='#20#'
(P "broadcast_body.nextval",
Q "broadcast_header.currval",
R POSITION (5:19) Char
,S POSITION (21:45) Char
,T POSITION (47:71) Char "decode(:T,null,'NULL',:T)"
,U POSITION (73:78) Char
,V POSITION (80:85) Char
)
INTO TABLE BROADCAST_DRIVER
WHEN (1:5)='#EOF#'
(RECORD_NUMBER "broadcast_header.currval"
)
When loading a file in the format above, all the header and body records get inserted properly, but no data is loaded into the BROADCAST_DRIVER table, and I get the following error:
Record 21: Discarded - all columns null.
The error is raised on the line containing #EOF#.
Please help me out.
    Regards,
    Sandeep Saxena
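
For what it's worth, the #EOF# record only feeds a SQL expression, so no input field is mapped for BROADCAST_DRIVER and SQL*Loader appears to treat the row as all-null. A workaround that is sometimes suggested is to also map a real field from that record, for example the marker itself; this is only a sketch and assumes BROADCAST_DRIVER has (or can be given) a column to hold it:

INTO TABLE BROADCAST_DRIVER
WHEN (1:5)='#EOF#'
(eof_marker POSITION (1:5) Char
,RECORD_NUMBER "broadcast_header.currval"
)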


  • Problems with SQL*Loader and java Runtime

Hi. I'm trying to start SQL*Loader on Oracle 8 by using the Runtime class in this way:
try{
    Process p = Runtime.getRuntime().exec( "c:\\oracle\\ora81\\bin\\sqlldr.exe parfile=c:\\parfile\\carica.par" );
    /* If I insert this line my application never stops */
    p.waitFor();
}catch( Exception e ){
}
I have seen that if there are fewer than 400 lines to insert everything works fine, but if the number of lines is greater than 400, all the data goes into my tables and yet my log file is always left open for writing. Can anyone tell me why?
    Thanks

Just a note: if the executable "sqlldr.exe" does not stop (quit running) by itself, then p.waitFor() will wait forever.

  • Problem using SQL-LOADER and Unique Identifiers

I'm trying to load a file of fixed-length records containing people's names and phone numbers. The data looks like this:
Toni Tomas66666666669999999999
Jose Luis 33333333330000000000
Notice that a maximum of 2 numbers can follow a person's name, and 0000000000 means "no number specified".
I want to assign a unique identifier to each person (instead of using the NAME field as a primary key) using an Oracle sequence. I did that, but I don't know
how to assign the same ID to each of that person's numbers.
    Considering the 2 previous lines, desired result should be:
    PEOPLE
    ======
    1     Toni Tomas
    2     Jose Luis
    TEL_NUMBERS
    ===========
    1     6666666666
    1     9999999999
    2     3333333333
In order to achieve that, my control file looks like this:
LOAD DATA
INFILE phonenumbers.txt
INTO TABLE people
(    personID "mySequenceName.nextval", --an Oracle sequence
     name POSITION(1:10) CHAR
)
INTO TABLE tel_numbers
WHEN phonenumber != '0000000000'
(    personID !!!DON'T KNOW HOW TO REFERENCE THE SAME ID!!!!
     phonenumber POSITION(11:20) CHAR
)
INTO TABLE tel_numbers
WHEN phonenumber != '0000000000'
(    personID !!!DON'T KNOW HOW TO REFERENCE THE SAME ID!!!!
     phonenumber POSITION(21:30) CHAR
)
I tried lots of things, but none of them works:
a) referencing the ID using something like :\"people.personID\" (or similar approaches)
b) using a BEFORE INSERT trigger that gets the CURRVAL of the sequence. This solution
does not work because it seems that all the people are loaded before any telephone numbers. Hence,
all phone numbers end up associated, wrongly, with the last person in the data file.
    Does anyone know how can I solve this issue?
    Help would be appreciated. Thank you.

Hi V Garcia.
The information in the file is correct. Each line represents a COMPLETE record (part of the line is the parent information and the rest is the child data). As you can see in my first message, there can be more than one detail for a given master (i.e. two phone numbers):
Toni Tomas66666666669999999999
(10 chars for the name, 10 for each phone number; thus, 2 child records to be created)
With the solution given by Sreekanth Reddy Bandi (use of CURRVAL within the SQL*Loader control file), not all of the details end up linked to the parent record in the DB tables. It seems SQL*Loader gets confused with this amount of information.
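
For context, the CURRVAL-based control file referred to above presumably looked something like the sketch below (the sequence name people_seq is hypothetical); per the follow-up, it did not reliably link every detail row, so treat it as the approach under discussion rather than a working answer:

INTO TABLE people
(    personID "people_seq.nextval",
     name POSITION(1:10) CHAR
)
INTO TABLE tel_numbers
WHEN phonenumber != '0000000000'
(    personID "people_seq.currval",
     phonenumber POSITION(11:20) CHAR
)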

  • Problems using SQL*Loader with Oracle SQL Developer

I have been using TOAD and was able to import large files (millions of rows of data) in various formats into a table in an Oracle database. My company recently decided not to renew any more TOAD licenses and to go with Oracle SQL Developer. The Oracle database is on a corporate server and I access it via the Oracle client locally on my machine. Oracle SQL Developer and TOAD are local on my desktop and connect through TNSnames, on the Windows XP platform. I have no issues using SQL*Loader via the import wizard in TOAD to import the data in these large files into an Oracle table and producing a log file. Loading the same files via SQL*Loader in SQL Developer freezes up my machine, and I cannot get it to produce a log file. Please help!

I am using SQL Developer version 3.0.04. Yes, I have tried it with a smaller file, with no success. What is odd is that the log file is not even created. What is created is a .bat file, a control file and a .sh file, but no log file. The steps that I take:
    1.Right click on the table I want to import to or go to actions
    2. Import Data
    3. Find file to import
    4. Data Preview - All fields entered according to file
    5. Import Method - SQL Loader utility
    6. Column Definitions - Mapped
    7. Options - Directory of files set
    8. Finish
    With the above steps I was not able to import 255 rows of data. No log file was produced so I don't know why it is failing.
    thanks.
    Edited by: user3261987 on Apr 16, 2012 1:23 PM

  • Problem about sql loader

Hi, I don't know whether this is a baseless question, as I have never tried it before asking:
I have a .ctl file to load data, like this:
load data
infile 'c:\data\mydata.csv'
into table emp
fields terminated by "," optionally enclosed by '"'
( empno, empname, sal, deptno )
Now I want to take one of the columns, say empno, and pass a value for it when I run it as:
sqlldr userid=' ' control=.ctl csv=.csv

I would suggest you use the external table concept to solve this problem.
An external table can be used like any normal database table.
For example: data is to be loaded into a table countries having the columns
country_code VARCHAR2(5)
country_name VARCHAR2(50)
country_language VARCHAR2(50)
First create a directory object:
CREATE OR REPLACE DIRECTORY EXT_TABLES AS 'C:\temp\';
Now create an external table with the same structure as the file present in the directory C:\temp that needs to be loaded into the database table.
CREATE TABLE countries_ext (
country_code VARCHAR2(5),
country_name VARCHAR2(50),
country_language VARCHAR2(50)
)
ORGANIZATION EXTERNAL (
TYPE ORACLE_LOADER
DEFAULT DIRECTORY ext_tables
ACCESS PARAMETERS (
RECORDS DELIMITED BY NEWLINE
FIELDS TERMINATED BY ','
MISSING FIELD VALUES ARE NULL
(
country_code CHAR(5),
country_name CHAR(50),
country_language CHAR(50)
)
)
LOCATION ('Countries1.txt','Countries2.txt')
)
PARALLEL 5
REJECT LIMIT UNLIMITED;
Now use the query below to load the data into the countries table:
    Insert into countries
    select country_code ,
    country_name ,
    country_language
    from countries_ext
    where (country_code ,
    country_name ,
    country_language)
    not in
    (select country_code ,
    country_name ,
    country_language
    from countries);

  • Problem in Sql Loader With Win2000 Professional

    Hi,
I am using Windows 2000 Professional. When loading data with sqlldr, I do not get back to the prompt after all the data has been inserted into the table.
    c:\>sqlldr system/manager@scs party.ctl silent=ALL --enter
    ^c
    c:\>
    I have to press Ctrl+C to come out from the prompt.
    help me

Hey guys, no reply for this topic?

Problem in SQL*Loader with specific characters like "ù, °, € ..."

Hello,
I have problems loading a flat file which contains specific characters.
These characters double the space taken up in the field.
For example, in strings of 10 characters:
If the value is:
' ABCDEFGHIJ'  OK  (10)
' ABCDEFGHI° ' NOK (9 + 1*2 = 11 > 10)
' Aù°?° '      OK  (2*5 = 10)
' A°°°°° '     NOK (1 + 5*2 = 11 > 10)
    For your information :
    PARAMETER     Base     Instance     Session
    NLS_CALENDAR     GREGORIAN          GREGORIAN
    NLS_CHARACTERSET     AL32UTF8          
    NLS_COMP     BINARY          BINARY
    NLS_CURRENCY     $          F
    NLS_DATE_FORMAT     DD-MON-RR          MM/DD/YYYY HH24:MI:SS
    NLS_DATE_LANGUAGE     AMERICAN          FRENCH
    NLS_DUAL_CURRENCY     $          €
    NLS_ISO_CURRENCY     AMERICA          FRANCE
    NLS_LANGUAGE     AMERICAN     AMERICAN     FRENCH
    NLS_LENGTH_SEMANTICS     BYTE     BYTE     BYTE
    NLS_NCHAR_CHARACTERSET     AL16UTF16          
    NLS_NCHAR_CONV_EXCP     FALSE     FALSE     FALSE
    NLS_NUMERIC_CHARACTERS     .,          ,.
    NLS_RDBMS_VERSION     9.2.0.4.0          
    NLS_SORT     BINARY          FRENCH
    NLS_TERRITORY     AMERICA     AMERICA     FRANCE
    NLS_TIMESTAMP_FORMAT     DD-MON-RR HH.MI.SSXFF AM          DD/MM/RR HH24:MI:SSXFF
    NLS_TIMESTAMP_TZ_FORMAT     DD-MON-RR HH.MI.SSXFF AM TZR          DD/MM/RR HH24:MI:SSXFF TZR
    NLS_TIME_FORMAT     HH.MI.SSXFF AM          HH24:MI:SSXFF
    NLS_TIME_TZ_FORMAT     HH.MI.SSXFF AM TZR          HH24:MI:SSXFF TZR

    On my PC with these parameters we have no problem:
    NLS_CALENDAR     GREGORIAN          GREGORIAN
    NLS_CHARACTERSET     WE8MSWIN1252          
    NLS_COMP     BINARY          BINARY
    NLS_CURRENCY     $          F
    NLS_DATE_FORMAT     DD-MON-RR          MM/DD/YYYY HH24:MI:SS
    NLS_DATE_LANGUAGE     AMERICAN          FRENCH
    NLS_DUAL_CURRENCY     $          €
    NLS_ISO_CURRENCY     AMERICA          FRANCE
    NLS_LANGUAGE     AMERICAN     AMERICAN     FRENCH
    NLS_LENGTH_SEMANTICS     BYTE     BYTE     BYTE
    NLS_NCHAR_CHARACTERSET     AL16UTF16          
    NLS_NCHAR_CONV_EXCP     FALSE     FALSE     FALSE
    NLS_NUMERIC_CHARACTERS     .,          ,.
    NLS_RDBMS_VERSION     9.2.0.1.0          
    NLS_SORT     BINARY          FRENCH
    NLS_TERRITORY     AMERICA     AMERICA     FRANCE
    NLS_TIMESTAMP_FORMAT     DD-MON-RR HH.MI.SSXFF AM          DD/MM/RR HH24:MI:SSXFF
    NLS_TIMESTAMP_TZ_FORMAT     DD-MON-RR HH.MI.SSXFF AM TZR          DD/MM/RR HH24:MI:SSXFF TZR
    NLS_TIME_FORMAT     HH.MI.SSXFF AM          HH24:MI:SSXFF
    NLS_TIME_TZ_FORMAT     HH.MI.SSXFF AM TZR          HH24:MI:SSXFF TZR
    So, what is the solution ?
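
The symptoms are consistent with byte-based length checking in the AL32UTF8 database (NLS_LENGTH_SEMANTICS is BYTE, and characters such as ù, ° and € take more than one byte each there, while they fit in one byte in WE8MSWIN1252). One commonly suggested direction, sketched here with a hypothetical table and column, is to switch the target column to character-length semantics so that 10 characters fit regardless of their byte length:

ALTER TABLE my_table MODIFY (my_col VARCHAR2(10 CHAR));

SQL*Loader itself also accepts a LENGTH SEMANTICS CHAR clause in the control file from 9i onwards, which may be worth checking in the Utilities documentation for this 9.2.0.4 database.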

  • Problem ColCount SQL*Loader

    Hello,
    how can I insert this file:
    Col1;Col2;Col3
    a;b;b
    f;d;e
    in this table
    MyTab
    (Col1 VARCHAR2(8),
    Col2 VARCHAR2(8))
So, I would like to insert a file with 3 columns into a table with 2 columns...
    Thanks a lot

Yes, you are right: the second value from the data file is the one that will be loaded. I was thinking that the OP just wanted to use the name col3, not actually the third value from the data file. The AskTom sample you linked to is the right one if the OP wants the third value to be loaded.
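
For the simple case of just ignoring the third column, a minimal control file sketch (the file name is made up) marks it as FILLER so it is read from the file but never loaded; SKIP=1 skips the Col1;Col2;Col3 header line:

OPTIONS (SKIP=1)
LOAD DATA
INFILE 'mydata.csv'
INTO TABLE MyTab
FIELDS TERMINATED BY ';'
( Col1,
  Col2,
  Col3 FILLER
)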
