Problem using SQL-LOADER and Unique Identifiers

I'm trying to load a fixed-length-record file containing people's names and phone numbers. The data looks like this:
Toni Tomas66666666669999999999
Jose Luis 33333333330000000000
Notice that a maximum of 2 numbers can follow a person's name, and 0000000000 means "no number specified".
I want to assign a unique identifier to each person (instead of using the NAME field as the primary key) using an Oracle sequence. I did that, but I don't know
how to assign the same ID to that person's phone numbers.
Given the 2 lines above, the desired result would be:
PEOPLE
======
1     Toni Tomas
2     Jose Luis
TEL_NUMBERS
===========
1     6666666666
1     9999999999
2     3333333333
To achieve that, my control file looks like this:
LOAD DATA
INFILE phonenumbers.txt
INTO TABLE people
(
     personID    "mySequenceName.nextval",   -- an Oracle sequence
     name        POSITION(1:10) CHAR
)
INTO TABLE tel_numbers
WHEN phonenumber != '0000000000'
(
     personID    !!!DON'T KNOW HOW TO REFERENCE THE SAME ID!!!!
     phonenumber POSITION(11:20) CHAR
)
INTO TABLE tel_numbers
WHEN phonenumber != '0000000000'
(
     personID    !!!DON'T KNOW HOW TO REFERENCE THE SAME ID!!!!
     phonenumber POSITION(21:30) CHAR
)
I have tried lots of things, but none of them works:
a) referencing the ID using something like :\"people.personID\" (or similar approaches)
b) using a BEFORE INSERT trigger that reads the sequence's CURRVAL. This solution
does not work because it seems that all the people are loaded before any telephone number. Hence,
all the phone numbers end up wrongly associated with the last person in the data file.
Does anyone know how I can solve this issue?
Any help would be appreciated. Thank you.

Hi V Garcia.
Information within the file is correct. Each line represents a COMPLETE record (part of the line holds the parent information and the rest is child data). As you can see in my first message, you can have more than one detail for a given master (i.e. two phone numbers):
Toni Tomas66666666669999999999
(10 characters for the name, 10 for each phone number; thus, 2 child records to be created.)
With the solution given by Sreekanth Reddy Bandi (using CURRVAL within the SQL*Loader control file), not all the details end up linked to their parent record in the database tables. It seems SQL*Loader gets confused when there is that amount of data.
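For anyone hitting the same wall: the usual explanation for CURRVAL drifting out of sync at higher volumes is SQL*Loader's bind array, because rows are inserted in batches per table rather than strictly line by line. One commonly suggested workaround (a sketch built from the control file above, not a verified fix) is to force single-row processing with ROWS=1, so NEXTVAL and CURRVAL stay paired for each input line:
OPTIONS (ROWS=1)
LOAD DATA
INFILE phonenumbers.txt
INTO TABLE people
(
     personID    "mySequenceName.nextval",
     name        POSITION(1:10) CHAR
)
INTO TABLE tel_numbers
WHEN phonenumber != '0000000000'
(
     personID    "mySequenceName.currval",
     phonenumber POSITION(11:20) CHAR
)
INTO TABLE tel_numbers
WHEN phonenumber != '0000000000'
(
     personID    "mySequenceName.currval",
     phonenumber POSITION(21:30) CHAR
)
If single-row loading is too slow, another route (mentioned further down this page by other posters) is to expose the file as an external table and drive the inserts from a short PL/SQL block that calls NEXTVAL once per line. Here people_ext is a hypothetical external table over the same fixed-length layout; the other names come from this thread:
-- people_ext is assumed to expose the file as (name CHAR(10), phone1 CHAR(10), phone2 CHAR(10))
DECLARE
  v_id NUMBER;
BEGIN
  FOR r IN (SELECT name, phone1, phone2 FROM people_ext) LOOP
    SELECT mySequenceName.NEXTVAL INTO v_id FROM dual;   -- one ID per input line
    INSERT INTO people (personID, name) VALUES (v_id, r.name);
    IF r.phone1 <> '0000000000' THEN
      INSERT INTO tel_numbers (personID, phonenumber) VALUES (v_id, r.phone1);
    END IF;
    IF r.phone2 <> '0000000000' THEN
      INSERT INTO tel_numbers (personID, phonenumber) VALUES (v_id, r.phone2);
    END IF;
  END LOOP;
  COMMIT;
END;
/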

Similar Messages

  • Using SQL*Loader and UTL_FILE to load and unload large files (i.e. PDFs, DOCs)

    Problem: load PDF or similar files (stored on the operating system) into an Oracle table using SQL*Loader,
    and then unload the files back from the Oracle table to their previous format.
    I've used the SQL*Loader "sqlldr" command as:
    " sqlldr scott/[email protected] control=c:\sqlldr\control.ctl log=c:\any.txt "
    The control file is written as:
    LOAD DATA
    INFILE 'c:\sqlldr\r_sqlldr.txt'
    REPLACE
    INTO TABLE r_sqlldr
    FIELDS TERMINATED BY ','
    (
      id    SEQUENCE (MAX,1),
      fname CHAR(20),
      data  LOBFILE(fname) TERMINATED BY EOF
    )
    It loads the files (PDF, image and more) that are listed in the file r_sqlldr.txt into the Oracle table r_sqlldr.
    The text file (used as the source) is written as:
    c:\kalam.pdf,
    c:\CTSlogo1.bmp
    c:\any1.txt
    After this load I used UTL_FILE to unload the data, with a procedure like this:
    CREATE OR REPLACE PROCEDURE R_UTL AS
      l_file     UTL_FILE.FILE_TYPE;
      l_buffer   RAW(32767);
      l_amount   BINARY_INTEGER;
      l_pos      INTEGER := 1;
      l_blob     BLOB;
      l_blob_len INTEGER;
    BEGIN
      SELECT data
        INTO l_blob
        FROM r_sqlldr
       WHERE id = 1;
      l_blob_len := DBMS_LOB.GETLENGTH(l_blob);
      DBMS_OUTPUT.PUT_LINE('blob length : ' || l_blob_len);
      IF (l_blob_len < 32767) THEN
        l_amount := l_blob_len;
      ELSE
        l_amount := 32767;
      END IF;
      DBMS_LOB.OPEN(l_blob, DBMS_LOB.LOB_READONLY);
      l_file := UTL_FILE.FOPEN('DBDIR1', 'Kalam_out.pdf', 'w', 32767);
      DBMS_OUTPUT.PUT_LINE('File opened');
      WHILE l_pos < l_blob_len LOOP
        DBMS_LOB.READ(l_blob, l_amount, l_pos, l_buffer);
        DBMS_OUTPUT.PUT_LINE('Blob read');
        l_pos := l_pos + l_amount;
        UTL_FILE.PUT_RAW(l_file, l_buffer, TRUE);
        DBMS_OUTPUT.PUT_LINE('writing to file');
        UTL_FILE.FFLUSH(l_file);
        UTL_FILE.NEW_LINE(l_file);
      END LOOP;
      UTL_FILE.FFLUSH(l_file);
      UTL_FILE.FCLOSE(l_file);
      DBMS_OUTPUT.PUT_LINE('File closed');
      DBMS_LOB.CLOSE(l_blob);
    EXCEPTION
      WHEN OTHERS THEN
        IF UTL_FILE.IS_OPEN(l_file) THEN
          UTL_FILE.FCLOSE(l_file);
        END IF;
        DBMS_OUTPUT.PUT_LINE('Its working at last');
    END R_UTL;
    This unloads data from the r_sqlldr table (BLOBs) to files on the operating system.
    -> The same procedure, with minor changes, is used to unload other similar files such as images and text files.
    In the above example, loading: the 3 files 1) kalam.pdf, 2) CTSlogo1.bmp and 3) any1.txt are loaded into 3 rows of the Oracle table r_sqlldr respectively,
    with the file names going into the fname column and the corresponding data into the data (BLOB) column.
    Unloading: these files are then written back to the operating system in their previous format using Oracle's UTL_FILE feature.
    So the PROBLEM IS: the files come back out at roughly their actual size, but with the quality degraded. The PDF file doesn't even display its data: the size is almost equal to the source file, but the content is lost when I open it.
    And for images: the images get loaded and unloaded, but with the colors changed.
    Features of Oracle like FFLUSH have also been used, but it never worked.
    ANY SUGGESTIONS OR ALTERNATE SOLUTIONS TO LOAD AND UNLOAD PDFs THROUGH Oracle ARE REQUESTED.
    ------------------------------------------------------------------------------------------------------------------------

    Thanks Justin, for the quick response.
    Well, I am loading data into a BLOB only, using SQL*Loader;
    I've never used dbms_lob.loadFromFile to do the loads.
    I've opened a file on the network and then used dbms_lob.read and
    UTL_FILE.PUT_RAW to read and write the data into the target file.
    Actually, my process works fine with text files, but not with PDFs and images.
    As for your question, "Is the data the proper length after reading it in?", I'm not sure what you are asking, but I don't think there is a problem with the data length, except that the source PDF is 90.4 KB and the target is 90.8 KB.
    That's it.
    So could you add some more help, or should I provide some more details?
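    In case it helps anyone landing here: a frequent cause of exactly this symptom (output slightly larger than the source, PDFs and images corrupted, text files fine) is that the file is opened in text mode ('w') and UTL_FILE.NEW_LINE is called inside the loop, both of which add or translate bytes in binary content. A minimal sketch of the commonly suggested binary-safe variant, reusing the table, directory and file names from the procedure above, would be:
    CREATE OR REPLACE PROCEDURE r_utl_bin AS
      l_file     UTL_FILE.FILE_TYPE;
      l_buffer   RAW(32767);
      l_amount   BINARY_INTEGER := 32767;
      l_pos      INTEGER := 1;
      l_blob     BLOB;
      l_blob_len INTEGER;
    BEGIN
      SELECT data INTO l_blob FROM r_sqlldr WHERE id = 1;
      l_blob_len := DBMS_LOB.GETLENGTH(l_blob);
      -- 'wb' opens the file in binary mode, so no character-set or newline translation is applied
      l_file := UTL_FILE.FOPEN('DBDIR1', 'Kalam_out.pdf', 'wb', 32767);
      WHILE l_pos <= l_blob_len LOOP
        DBMS_LOB.READ(l_blob, l_amount, l_pos, l_buffer);
        -- write the raw chunk exactly as read; no NEW_LINE, which would inject extra bytes
        UTL_FILE.PUT_RAW(l_file, l_buffer, TRUE);
        l_pos := l_pos + l_amount;
      END LOOP;
      UTL_FILE.FCLOSE(l_file);
    EXCEPTION
      WHEN OTHERS THEN
        IF UTL_FILE.IS_OPEN(l_file) THEN
          UTL_FILE.FCLOSE(l_file);
        END IF;
        RAISE;
    END r_utl_bin;
    /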

  • Problem using SQL Loader with ODI

    Hi,
    I am having problems using SQL Loader with ODI. I am trying to fill an oracle table with data from a txt file. At first I had used "File to SQL" LKM, but due to the size of the source txt file (700MB), I decided to use "File to Oracle (SQLLDR)" LKM.
    The error that appears in myFile.txt.log is: "SQL*Loader-101: Invalid argument for username/password"
    I think the problem could be in the definition of the data server (Physical Architecture in Topology), because I have left the host, user and password blank.
    Is this the problem? What host and user should I use? "File to SQL" works fine leaving these blank, but it takes too much time.
    Thanks in advance

    I tried to use your code, but I couldn't make it work (I don't know Jython). I think the problem could be with the use of quotes.
    Here is what I wrote:
    import os
    retVal = os.system(r'sqlldr control=E:\Public\TXTODI\PROFITA2/Profita2Final.txt.ctl log=E:\Public\TXTODI\PROFITA2/Profita2Final.txt.log userid=MYUSER/myPassword @ mySID')
    if retVal == 1 or retVal > 2:
        raise 'SQLLDR failed. Please check the for details '
    And the error message is:
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (innermost last):
    File "<string>", line 5, in ?
    SQLLDR failed. Please check the for details
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
         at com.sunopsis.dwg.codeinterpretor.k.a(k.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSqlC.treatTaskTrt(SnpSessTaskSqlC.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
         at com.sunopsis.dwg.cmd.e.i(e.java)
         at com.sunopsis.dwg.cmd.h.y(h.java)
         at com.sunopsis.dwg.cmd.e.run(e.java)
         at java.lang.Thread.run(Unknown Source)
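    One thing that might be worth checking (an observation on the snippet above, not a tested fix): sqlldr receives each whitespace-separated token as a separate argument, so the spaces around the @ in "userid=MYUSER/myPassword @ mySID" split the connect string into three arguments, which could explain "SQL*Loader-101: Invalid argument for username/password". Written as a single token, the call would look like:
    sqlldr control=E:\Public\TXTODI\PROFITA2/Profita2Final.txt.ctl log=E:\Public\TXTODI\PROFITA2/Profita2Final.txt.log userid=MYUSER/myPassword@mySID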

  • Problem with SQL*Loader and different date formats in the same file

    DB: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit
    System: AIX 5.3.0.0
    Hello,
    I'm using SQL*Loader to import semi-colon separated values into a table. The files are delivered to us by a data provider who concatenates data from different sources and this results in us having different date formats within the same file. For example:
    ...;2010-12-31;22/11/1932;...
    I load this data using the following lines in the control file:
    EXECUTIONDATE1     TIMESTAMP     NULLIF EXECUTIONDATE1=BLANKS     "TO_DATE(:EXECUTIONDATE1, 'YYYY-MM-DD')",
    DELDOB          TIMESTAMP     NULLIF DELDOB=BLANKS          "TO_DATE(:DELDOB, 'DD/MM/YYYY')",
    The relevant NLS parameters:
    NLS_LANGUAGE=FRENCH
    NLS_DATE_FORMAT=DD/MM/RR
    NLS_DATE_LANGUAGE=FRENCH
    If I load this file as is, the values loaded into the table are 31 Dec 2010 and 22 Nov *2032*, even though the years are given with 4 digits. If I change NLS_DATE_FORMAT to DD/MM/YYYY then the second date value is loaded correctly, but the first value is loaded as 31 Dec *2020*!!
    How can I get both date values to load correctly?
    Thanks!
    Sylvain

    This is very strange; after running a few tests I realized that if the year is 19XX then it gets loaded as 2019, and if it is 20XX then it becomes 2020. I'm guessing it may have something to do with certain environment variables that aren't set up properly, because I'm fairly sure my SQL*Loader control file is correct... I'll run more tests :-(
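    A pattern often suggested for this symptom (offered here as a sketch, not verified against this exact file): declare the fields as plain CHAR instead of TIMESTAMP, so SQL*Loader does not apply an implicit NLS-driven conversion before the SQL expressions run, and let the explicit TO_DATE masks do all the work:
    EXECUTIONDATE1  CHAR  NULLIF EXECUTIONDATE1=BLANKS  "TO_DATE(:EXECUTIONDATE1, 'YYYY-MM-DD')",
    DELDOB          CHAR  NULLIF DELDOB=BLANKS          "TO_DATE(:DELDOB, 'DD/MM/YYYY')",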

  • Problems using SQL*Loader with Oracle SQL Developer

    I have been using TOAD and was able to import large files (millions of rows of data) in various formats into a table in an Oracle database. My company recently decided not to renew any more TOAD licenses and to go with Oracle SQL Developer. The Oracle database is on a corporate server and I access it via the Oracle client installed locally on my machine. Oracle SQL Developer and TOAD are local on my desktop and connect through TNSnames, on the Windows XP platform. I have no issues using SQL*Loader via the import wizard in TOAD to import the data in these large files into an Oracle table and produce a log file. Loading the same files via SQL*Loader in SQL Developer freezes up my machine, and I cannot get it to produce a log file. Please help!

    I am using SQL Developer version 3.0.04. Yes, I have tried it with a smaller file, with no success. What is odd is that the log file is not even created. What is created is a .bat file, a control file and a .sh file, but no log file. The steps that I take:
    1. Right-click on the table I want to import into, or go to Actions
    2. Import Data
    3. Find file to import
    4. Data Preview - All fields entered according to file
    5. Import Method - SQL Loader utility
    6. Column Definitions - Mapped
    7. Options - Directory of files set
    8. Finish
    With the above steps I was not able to import 255 rows of data. No log file was produced so I don't know why it is failing.
    thanks.
    Edited by: user3261987 on Apr 16, 2012 1:23 PM

  • Problems with SQL*Loader and java Runtime

    Hi. I'm trying to start SQL*Loader on Oracle 8 by using the Runtime class in this way:
    try{
        Process p = Runtime.getRuntime().exec( "c:\\oracle\\ora81\\bin\\sqlldr.exe parfile=c:\\parfile\\carica.par" );
        /* If I insert this line my application never stops */
        p.waitFor();
    }catch( Exception e ){
    }
    I have seen that if there are fewer than 400 lines to insert, everything works fine, but if the number of lines is greater than 400, all the data reaches my tables but my log file stays open for writing. Can anyone tell me why?
    Thanks

    Just a note: if the executable "sqlldr.exe" does not stop (quit running) by itself, p.waitFor() will wait forever.

  • SQL*Loader and "Variable length field was truncated"

    Hi,
    I'm experiencing this problem using SQL*Loader: Release 8.1.7.0.0
    Here is my control file (it's actually split into separate control and data files, but the result is the same)
    LOAD DATA
    INFILE *
    APPEND INTO TABLE test
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"'
    (
      first_id,
      second_id,
      third_id,
      language_code,
      display_text VARCHAR(2000)
    )
    BEGINDATA
    2,1,1,"eng","Type of Investment Account"
    The TEST table is defined as:
    Name           Null?     Type
    -------------  --------  --------------
    FIRST_ID       NOT NULL  NUMBER(4)
    SECOND_ID      NOT NULL  NUMBER(4)
    THIRD_ID       NOT NULL  NUMBER(4)
    LANGUAGE_CODE  NOT NULL  CHAR(3)
    DISPLAY_TEXT             VARCHAR2(2000)
    QUESTION_BLOB            BLOB
    The log file displays:
    Record 1: Warning on table "USER"."TEST", column DISPLAY_TEXT
    Variable length field was truncated.
    And the results of the insert are:
    FIRST_ID  SECOND_ID  THIRD_ID  LANGUAGE_CODE  DISPLAY_TEXT
    2         1          1         eng            ype of Investment Account"
    The language_code field is imported correctly, but display_text keeps the closing delimiter, and loses the first character of the string. In other words, it is interpreting the enclosing double quote and/or the delimiter, and truncating the first two characters.
    I've also tried the following:
    LOAD DATA
    INFILE *
    APPEND INTO TABLE test
    FIELDS TERMINATED BY '|'
    (
      first_id,
      second_id,
      third_id,
      language_code,
      display_text VARCHAR(2000)
    )
    BEGINDATA
    2|1|1|eng|Type of Investment Account
    In this case, display_text is imported as:
    pe of Investment Account
    In the log file, I get this table which seems odd as well - why is the display_text column shown as having length 2002 when I explicitly set it to 2000?
    Column Name     Position  Len   Term  Encl  Datatype
    FIRST_ID        FIRST     *     |     O(")  CHARACTER
    SECOND_ID       NEXT      *     |     O(")  CHARACTER
    THIRD_ID        NEXT      *     |     O(")  CHARACTER
    LANGUAGE_CODE   NEXT      3     |     O(")  CHARACTER
    DISPLAY_TEXT    NEXT      2002              VARCHAR
    Am I missing something totally obvious in my control and data files? I've played with various combinations of delimiters (commas vs '|'), trailing nullcols, optional enclosed etc.
    Any help would be greatly appreciated!

    Use CHAR instead of VARCHAR:
    LOAD DATA
    INFILE *
    APPEND INTO TABLE test
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"'
    (
      first_id,
      second_id,
      third_id,
      language_code,
      display_text    CHAR(2000)
    )
    From the documentation:
    A VARCHAR field is a length-value datatype.
    It consists of a binary length subfield followed by a character string of the specified length.
    http://download-west.oracle.com/docs/cd/A87860_01/doc/server.817/a76955/ch05.htm#20324

  • Loading XML file using sql*loader (10g)

    Hi Everyone....
    I have a major problem. I have a 10g database and I need to use SQL*Loader to load a large XML file into the database. This file contains records for many, many customers. This will be done several times and the number of records will vary. I want to use SQL*Loader to load into a staging table, BUT SPLIT THE RECORDS OUT and add a sequence. I am having 2 problems.
    In the 10g documentation, when you want to split the records you use the BEGINDATA clause and indicate something (like a 0) for each instance of a record. Well, my first file has 3,722 records in it. How do I know the number of records in each XML file?
    The second problem is that, because this is XML, I thought I could use ENCLOSED BY. But the start tag has an attribute/counter in it, so SQL*Loader doesn't recognize the starting tag. (i.e.: the start tag is: <CustomerRec number=1>
    and the end tag is: </CustomerRec> )
    So, I used TERMINATED BY '</CustomerRec>'. This will split out the records, but it will NOT include the ending tag of </CustomerRec> and when I use extract, I receive an XML parsing error.
    I have tried to get the ending tag using CONTINUEIF and SKIP. But those options are for "records" and not lines.
    Does anyone have any ideas for the 2 problems above. I am at my wits end and I need to finish this ASAP. Any help would be appreciated.
    Thank you!

    Sorry... here is an example of what my control file looks like. At the end I have 92 "0" lines but, remember, I have 3,722 records in this first file.
    LOAD DATA (SKIP 1)
    INFILE *
    INTO TABLE RETURN_DATA_STG
    TRUNCATE
    XMLType(xmldata)
    FIELDS
    (
      fill    FILLER CHAR(1),
      loadseq SEQUENCE(MAX,1),
      xmldata LOBFILE (CONSTANT F001AB.XML)
              TERMINATED BY '</ExtractObject>'
              -- ":xmldata||'</ExtractObject>'"
    )
    BEGINDATA
    0
    0
    0
    0
    0
    0

  • SQL *Loader and External Table

    Hi,
    Can anyone tell me the difference between SQL*Loader and an external table?
    Under what conditions should we use SQL*Loader versus an external table?
    Thanks

    External tables are accessible from SQL, which generally simplifies life if the data files are physically located on the database server since you don't have to coordinate a call to an external SQL*Loader script with other PL/SQL processing. Under the covers, external tables are normally just invoking SQL*Loader.
    SQL*Loader is more appropriate if the data files are on a different server or if it is easier to call an executable rather than calling PL/SQL (i.e. if you have a batch file that runs on a server other than the database server that wants to FTP a data file from a FTP server and then load the data into Oracle).
    Justin
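    As a rough illustration of the external table route Justin describes (the directory object, file and table names here are made up for the example, not taken from the thread):
    -- directory object pointing at the folder on the database server that holds the file
    CREATE DIRECTORY data_dir AS '/u01/app/loads';
    CREATE TABLE customers_ext (
      cust_id   NUMBER,
      cust_name VARCHAR2(50)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
      )
      LOCATION ('customers.csv')
    )
    REJECT LIMIT UNLIMITED;
    -- once defined, the file can be queried or joined like any other table
    INSERT INTO customers SELECT cust_id, cust_name FROM customers_ext;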

  • ORA-01003 when using SQL*Loader

    Hi people,
    Can anyone shed some light on this?
    We have a multitasked data load operation that uses SQL*Loader, and on occasion we see some rows failing with ORA-01003: no statement parsed. Could this be due to a shortage of some database resource (cursors, connections?), or are there other well-known reasons for this?
    Any help appreciated.
    Regards
    Andy

    Check those records (rows) in the bad file or in the discard file. If the record(s) are in the discard file, it is because the row does not meet the conditions stated in the control file; if those records are in the bad file, it is because the database is rejecting them due to constraints and so on.
    Conclusion: find out in which file those rows end up after being rejected.
    Joel Pérez

  • Problem loading XML-file using SQL*Loader

    Hello,
    I'm using 9.2 and trying to load an XML file using SQL*Loader.
    Loader control file:
    LOAD DATA
    INFILE *
    INTO TABLE BATCH_TABLE TRUNCATE
    FIELDS TERMINATED BY ','
    (
      FILENAME char(255),
      XML_DATA LOBFILE (FILENAME) TERMINATED BY EOF
    )
    BEGINDATA
    data.xml
    The BATCH_TABLE is created as:
    CREATE TABLE BATCH_TABLE (
    FILENAME VARCHAR2 (50),
    XML_DATA SYS.XMLTYPE ) ;
    And the data.xml contains the following lines:
    <?xml version="2.0" encoding="UTF-8"?>
    <!DOCTYPE databatch SYSTEM "databatch.dtd">
    <batch>
    <record>
    <data>
    <type>10</type>
    </data>
    </record>
    <record>
    <data>
    <type>20</type>
    </data>
    </record>
    </batch>
    However, the sqlldr gives me an error:
    Record 1: Rejected - Error on table BATCH_TABLE, column XML_DATA.
    ORA-21700: object does not exist or is marked for delete
    ORA-06512: at "SYS.XMLTYPE", line 0
    ORA-06512: at line 1
    If I remove the first two lines
    "<?xml version="2.0" encoding="UTF-8"?>"
    and
    "<!DOCTYPE databatch SYSTEM "databatch.dtd">"
    from data.xml, everything works and the contents of data.xml are loaded into the table.
    Any idea what I'm missing here? Likely the problem is with special characters.
    Thanks in advance,

    I'm able to load your file just by removing the second line, <!DOCTYPE databatch SYSTEM "databatch.dtd">. I don't have your DTD file, so I skipped that line. Can you check whether the problem is with your DTD?

  • Restrictions for using sql commands and operators in loader control file

    Hi ,
    It seems that there are a lot of restrictions and limitations when using SQL commands and operators in loader control files; for example, it seems I cannot use OR together with a WHEN/CASE construct, and there also seems to be a length limit for the CASE expression.
    So guys, what are the common limitations and restrictions to be avoided in the loader control file?
    Your efforts are highly appreciated
    Ash

    Hi Ash,
    if you need to do more complicated logic it's better to define the file to be loaded as an external table. You can then use any SQL function you like against the external table, rather than messing around with what you can and can't do in a sqlldr control file.
    You can use the external_table option of sqlldr to generate the definition.
    Regards,
    Harry
    http://dbaharrison.blogspot.com/
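    For reference, the option Harry mentions is invoked roughly like this (placeholder connect string and file names); instead of loading anything, SQL*Loader writes the equivalent CREATE TABLE ... ORGANIZATION EXTERNAL statement into the log file, which you can then edit and run:
    sqlldr userid=scott/tiger control=mydata.ctl log=mydata.log external_table=generate_only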

  • Use of SQL*Loader and Thin Client

    I have a Java application that is currently using the thin client. For portability reasons, I am unable to use an OCI driver. My application also uses SQL*Loader.
    My big issue is that the thin client requires the HOST and SID for a connection,
    but for SQL*Loader to work in a client/server environment it needs to use the SERVICE_NAME and Net8. Is there any way to have SQL*Loader use the same information that the thin client is using?
    I know the right solution is to use the OCI drivers, but right now that is not feasible.
    Thanks for any help in advance
    Jeff McNurlan
    Navigation Technologies.

    I assume you are using an ObjectInputStream. Without seeing your code or a snippet I can't be certain...
    The JDBC spec says the following about ObjectInputStream :
    "An ObjectInputStream deserializes primitive data and objects previously written using an ObjectOutputStream"
    Since there was nothing written by ObjectOutputStream, ObjectInputStream has nothing to deserialize and this exception would be raised.

  • SQL loader and stream files with new line '

    I have been trying, unsuccessfully, to load EDI files using SQL*Loader. The problem
    is that the lines are terminated by ' and, when I use the stream file option, it does
    not recognise the line terminator given. As I understand it from the documentation
    this should work - but it does not. I have also used the hex option, with no better
    result. Does anyone have any ideas?
    I can and have used tr "[']" "[\n]" in Unix to convert the ' to newlines - I just
    wonder whether I am missing something in SQL*Loader which would allow me to do this.
    This is the SQL*Loader control file:
    LOAD DATA
    INFILE 'WS860685.MFD' "Str ''' "
    BADFILE 'WS860685.bad'
    DISCARDFILE 'WS860685.dsc'
    INTO TABLE "DUND1"."EDI_LOADED_TEMP"
    REPLACE
    FIELDS TERMINATED BY '+'
    TRAILING NULLCOLS
    (L1,
    L2,
    L3,
    L4,
    L5,
    L6,
    L7,
    L8,
    L9,
    L10,
    L11,
    L12,
    L13,
    L14,
    L15,
    L16,
    L17,
    L18,
    L19,
    L20,
    L21,
    L22,
    L23,
    L24,
    L25,
    L26,
    L27,
    L28,
    L29,
    L30,
    L31,
    L32,
    L33,
    L34,
    L35,
    L36,
    L37,
    L38,
    L39,
    L40,
    LNO)
    Here's a sample of the data:
    UNB+UNOA:2+5398888501357+5398888501838+080306:0737+395+ DESADV+++1'UNH+0001+DESADV:D:93A:UN:EAN004'

    http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:6020061915147
    answers this question perfectly
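    For the record, one way to express a single-quote record terminator that sidesteps the quoting headaches shown in the control file above is the hexadecimal form of the "str" clause, where X'27' is the ASCII code of the apostrophe (a sketch based on that control file, not verified against these EDI files):
    INFILE 'WS860685.MFD' "str X'27'"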

  • Sql Loader and carriage returns

    I am currently trying to use SQL*Loader to load data from flat files that were extracted from Sybase using bcp and are delimited with pipes. There are text and varchar columns that contain carriage-return/line-feed characters. I want to preserve these, but I cannot load them into Oracle 8.0.5 using SQL*Loader, as it interprets them as end-of-record indicators. Does anyone have a way to solve this problem? Any assistance would be appreciated.
    Thanks in advance,
    Jignesh
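    One approach commonly suggested for embedded carriage returns (offered as a sketch, not something verified against this extract): have the bcp extract terminate each record with a marker string that cannot occur in the data, and declare that marker with the "str" option of the INFILE clause so that embedded CR/LF characters are treated as ordinary data. The file name and marker below are hypothetical:
    INFILE 'sybase_extract.dat' "str '|~|\n'"
    FIELDS TERMINATED BY '|'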

