SQL*Loader script

Hi all, I require a SQL*Loader script.
I have some files that I will have to load into an Oracle Database 11g... I don't know the number of files that will land in the directory daily.
How do I do this?
My requirement is that the INFILE parameter in the control file should be dynamically populated with the file names, and all the files should be processed... the structure and columns in all the files are the same.
Please let me know if you have code for this.
regards
raj

Raj,
Here is something you can use to generate a SQL*Loader control file. You can include it in a simple SQL script or call it from a Unix shell script or a batch script. Run this SQL from an IDE or SQL*Plus and pass the table name to generate the control file:
set heading off
set feedback off
set verify off
set echo off
set pagesize 0
set trimspool on
set linesize 250
spool &table_name..ctl
SELECT      'LOAD DATA'
         || CHR (10)
         || 'INFILE '''
         || LOWER (table_name)
         || '.dat'''
         || CHR (10)
         || 'INTO TABLE '
         || table_name
         || CHR (10)
         || 'FIELDS TERMINATED BY '','''
         || CHR (10)
         || 'TRAILING NULLCOLS'
         || CHR (10)
         || '('
  FROM   user_tables
WHERE   table_name = UPPER ('&table_name');
  SELECT   DECODE (ROWNUM, 1, '   ', ' , ') || RPAD (column_name, 33, ' ')
           || DECODE (
                 data_type,
                 'VARCHAR2',
                 'CHAR NULLIF (' || column_name || '=BLANKS)',
                 'FLOAT',
                 'DECIMAL EXTERNAL NULLIF(' || column_name || '=BLANKS)',
                 'NUMBER',
                 DECODE (
                    data_precision,
                    0,
                    'INTEGER EXTERNAL NULLIF (' || column_name || '=BLANKS)',
                    DECODE (
                       data_scale,
                       0,
                       'INTEGER EXTERNAL NULLIF (' || column_name || '=BLANKS)',
                        'DECIMAL EXTERNAL NULLIF (' || column_name || '=BLANKS)')),
                  'DATE',
                  'DATE "&dformat" NULLIF (' || column_name || '=BLANKS)',
                  NULL)
    FROM   user_tab_columns
   WHERE   table_name = UPPER ('&table_name')
ORDER BY   column_id;
SELECT   ')' FROM DUAL;
spool off
http://www.oracleutilities.com/OSUtil/sqlldr.html
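
For the "unknown number of files per day" part, you do not actually need the file names inside the control file: keep one control file and pass each data file on the sqlldr command line, since the data= parameter takes the place of the INFILE name. A rough Unix sketch (the directory, credentials and control file name are only placeholders):

#!/bin/sh
# load every file that arrived in the incoming directory with the same control file,
# then move each one to a processed directory so it is not picked up twice
for f in /data/incoming/*.dat
do
  sqlldr userid=scott/tiger control=mytable.ctl data="$f" log="$f.log" bad="$f.bad"
  mv "$f" /data/processed/
done
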
Regards

Similar Messages

  • Creating SQL-Loader script for more than one table at a time

    Hi,
    I am using OMWB 2.0.2.0.0 with Oracle 8.1.7 and Sybase 11.9.
    It looks like I can create SQL-Loader scripts for all the tables
    or for one table at a time. If I want to create SQL-Loader
    scripts for 5-6 tables, I have to either create scripts for all
    the tables and then delete the unwanted ones, or create the
    scripts for one table at a time and then merge them.
    Is there a simple way to create migration scripts for more than
    one but not all tables at a time?
    Thanks,
    Prashant Rane

    No, there is no multi-select for creating SQL*Loader scripts.
    You can either create them separately, or create them all and
    then discard the ones you do not need.
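    If the goal is just to produce loader control files for a handful of tables outside the Workbench, a small shell loop around a control-file generator (for example the SQL script at the top of this page, assuming it is adapted to take the table name as &1) could do it. The script name, login and table list below are placeholders:
    #!/bin/sh
    # run the control-file generator once per table; each run spools <table>.ctl
    for tab in EMP DEPT BONUS SALGRADE
    do
      echo exit | sqlplus -s scott/tiger @gen_ctl.sql $tab
    done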

  • How to Eliminate Special Character in SQL LOADER Script

    How can I eliminate special characters from the file loaded by a SQL*Loader script, so that they are not inserted into the TABLE?
    An example CSV file looks like this:
    <ABC/ , 7747>
    <DEF/ , 7763>
    <NEW/ , 7779>
    <OLD/, 7795>
    I have to remove the <, > and / characters at the time of loading into the table. How can it be done? It is not possible to remove the <, > and / characters manually from the CSV file.

    On Unix/Linux that's very easy, on Windows... I don't know...
    $ cat myfile.csv
    <ABC/ , 7747>
    <DEF/ , 7763>
    <NEW/ , 7779>
    <OLD/, 7795>
    $ tr -d "\057\074\076" <myfile.csv >outfile.csv
    $ cat outfile.csv
    ABC , 7747
    DEF , 7763
    NEW , 7779
    OLD, 7795
    $

  • ORA-00936 error from SQL expression in SQL*Loader script

    I am getting the above error on the following line in my SQL*Loader script:
    DIA_CLM_RES_OID DECIMAL EXTERNAL
    "SELECT N_ORG_ENTY_ID FROM TESTG4.ORG_ENTITY
    WHERE N_USER_ID =
    (SELECT UNIQUE WSR_NT_ID FROM CONV_CLM_RESOURCE
    WHERE CLM_RES_OID = :DIA_CLM_RES_OID)",
    What I am basically trying to do is a 2-table lookup of a value:
    1. Find a row in table CONV_CLM_RESOURCE where the value in column CLM_RES_OID matches the value in the input file in field DIA_CLM_RES_OID.
    2. Take the value of field WSR_NT_ID from that row and use it to find a row in table TESTG4.ORG_ENTITY.
    3. Take the value of field WSR_NT_ID from that row and set it in the target table in field DIA_CLM_RES_OID.
    In other words, I am essentially trying to translate the input value by using two other tables to lookup the value to translate to. However, no matter how I arrange it, I keep getting the "ORA-00936: missing expression" error on this statement.
    Can anyone see what I am doing wrong, or perhaps suggest a better way of accomplishing a two-table translation of a value?
    Thanks!

    Still not sure why this doesn't work, but I was able to create and use a function to do this instead, which is probably a better approach anyway.
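    For reference, the function-based workaround mentioned above might look roughly like the sketch below. The function name, login and temporary file are made up for the sketch; the table and column names come from the post:
    #!/bin/sh
    # write the lookup function to a small script and compile it with SQL*Plus
    echo "CREATE OR REPLACE FUNCTION lookup_org_enty_id (p_clm_res_oid IN NUMBER)
    RETURN NUMBER IS
      v_id NUMBER;
    BEGIN
      SELECT N_ORG_ENTY_ID INTO v_id
        FROM TESTG4.ORG_ENTITY
       WHERE N_USER_ID = (SELECT UNIQUE WSR_NT_ID
                            FROM CONV_CLM_RESOURCE
                           WHERE CLM_RES_OID = p_clm_res_oid);
      RETURN v_id;
    END;" > /tmp/lookup_fn.sql
    echo "/" >> /tmp/lookup_fn.sql
    echo "exit" >> /tmp/lookup_fn.sql
    sqlplus -s testuser/password @/tmp/lookup_fn.sql
    # the SQL*Loader field could then call the function instead of the inline SELECT:
    #   DIA_CLM_RES_OID DECIMAL EXTERNAL "lookup_org_enty_id(:DIA_CLM_RES_OID)",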

  • Using SQL Loader script in a Stored Procedure

    Can I use a SQL*Loader script in a stored procedure and then execute it from a front-end application? The reason for this seemingly convoluted solution is that the users don't want a batch load, though the record volume is quite high (around 1 million). Other loads using an ODBC connection or OLE DB seem to be inferior to SQL*Loader.

    I would suggest a couple of solutions:
    1. Have a CGI script that can upload the file to the server from a web UI, then have the CGI script call SQL*Loader, which will insert the data into the database.
    2. You can try to use external tables. These are available in 9i onwards; you can query the external table with plain SQL and move the data with ordinary DML.
    I would normally use SQL*Loader to move the data into a NOLOGGING staging table with parallel loading. After it has been loaded into the staging table, I would process it into my main tables. I have used this approach with up to 60 million records in one load.
    You can make calls to C procedures and Pro*C procedures through PL/SQL, as well as Java calls, or use Java stored procedures.
    My experience is that SQL*Loader is the fastest way to load data into the database.
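    A rough sketch of that staging approach (table names, file name and login are only placeholders):
    #!/bin/sh
    # direct-path load into a staging table, then one set-based insert into the main table
    sqlldr userid=scott/tiger control=staging_table.ctl data=big_file.dat direct=true
    echo "INSERT /*+ APPEND */ INTO main_table SELECT * FROM staging_table;
    COMMIT;" | sqlplus -s scott/tiger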

  • To schedule the SQL loader Script

    How do I schedule the SQL*Loader script to run every day at 6:30 PM on Windows Server 2003?
    Could you please help me with this?

    Create a batch file with your SQL*Loader (sqlldr) command and control file.
    Then add a scheduled task on your Windows server to run the batch file at 6:30 PM.

  • SQL*LOADER script with ORA-03135: connection lost contact error

    Hello,
    I've got a really strange problem with a script that transfers data from one database to another. Each day we receive the raw data in zipped text files, then we feed it to the database with SQL*Loader. This worked fine until two weeks ago. At first I thought the error must be in the script, but it hasn't changed at all in three months. And the weirdest thing is that sometimes it works without any problems, but now I get "ORA-03135: connection lost contact" almost every time, and it seems to happen randomly. It never fails at the same place, though the data I feed the script is always the same.
    I'm new to SQL*Loader, and since the script seems to send all the data, I don't understand where that error comes from. Can I do something on my workstation, or is the problem on the database server?
    I think the problem could be that the connection gets timed out on the server, but we never got that problem on the production server, and it has a lot more data to transfer than my little test script.
    Do you have any idea what the problem could be?

    03135, 00000, "connection lost contact"
    // *Cause:  1) Server unexpectedly terminated or was forced to terminate.
    //          2) Server timed out the connection.
    // *Action: 1) Check if the server session was terminated.
    //          2) Check if the timeout parameters are set properly in sqlnet.ora.
    I would suspect networking issues external to sqlldr & Oracle.

  • Run SQL Loader script from Unix env

    Hi,
    We are using an HP-UX server. The SQL*Loader control file is placed in the $ORACLE_HOME/bin folder on the Unix server. I created an executable for this SQL*Loader control file as a concurrent program and attached it to a request group.
    Then, when I ran it from a responsibility, I got the error below.
    +-----------------------------
    | Starting concurrent program execution...
    +-----------------------------
    SQL*Loader-350: Syntax error at line 16.
    Expecting "," or ")", found "TIMESTAMP".
    GL_DATE TIMESTAMP 'yyyy MM dd hh:MI:SS:AM',
    ^
    SQL*Loader: Release 8.0.6.3.0 - Production on Sun Sep 5 15:28:45 2010
    (c) Copyright 1999 Oracle Corporation. All rights reserved.
    SQL*Loader-350: Syntax error at line 16.
    Expecting "," or ")", found "TIMESTAMP".
    GL_DATE TIMESTAMP 'yyyy MM dd hh:MI:SS:AM',
    ^
    My Actual Control file is,
    LOAD DATA
    INFILE '/usr/........xyz.txt'
    BADFILE '/usr/........xyz.bad'
    DISCARDFILE '/usr/........xyz.dsc'
    APPEND
    INTO TABLE gl_interface_stg
    FIELDS TERMINATED BY "|" TRAILING NULLCOLS
    (
    GL_DATE TIMESTAMP 'yyyy MM dd hh:MI:SS:AM',
    CATEGORY_NAME,
    JOURNAL_DESCRIPTION,
    JOURNAL_NAME,
    BATCH_NAME,
    BATCH_DESCRIPTION,
    ACCOUNT_CODE,
    PROCESSING_STATUS CONSTANT 'N'
    )
    If I run the above control file from my local machine (Windows) using sqlldr username/pwd@servername control='c:/abc.ctl' from the command prompt, then all the records from the xyz.txt file are pushed into the staging table.
    Whereas, if I schedule the above control file and try to run it from a GL responsibility, I get the above error message.
    Do I need to modify the control file in order to run it from the UNIX environment?
    Please advise how to do this from the UNIX environment.
    Thanks in advance.

    Hi,
    SQL*Loader-350: Syntax error at line 16.
    Please see if these docs help.
    SQL*LOADER-350 SYNTAX ERROR [ID 1019271.102]
    LDR-00350 TOKEN LONGER THAN MAX ALLOWABLE LENGTH ERROR WHEN LOADING ASCII FILE [ID 1020091.6]
    Do I need to modify the control file in order to run it from the UNIX environment? Please advise how to do this from the UNIX environment.
    See these docs.
    How to Use 9i or 10g Features in SQL*Loader for Apps? [ID 423035.1]
    11i FND:How to specify Record Terminator In Sql*Loader type of concurrent program [ID 252850.1]
    Thanks,
    Hussein

  • ORA-12899 error from function invoked from SQL*Loader

    I am getting the above error when I call a function from my SQL*Loader script, and I am not seeing what the problem is. As far as I can see, there should be no problem with the field lengths, unless the length of the automatic variable within my function is somehow being set at 30? Here are the details (in the SQL*Loader script, the field of interest is the last one):
    ====
    Error:
    ====
    Record 1: Rejected - Error on table TESTM8.LET_DRIVE_IN_FCLTY, column DIF_CSA_ID.
    ORA-12899: value too large for column "TESTM8"."LET_DRIVE_IN_FCLTY"."DIF_CSA_ID" (actual: 30, maximum: 16)
    =======
    Function:
    =======
    CREATE OR REPLACE FUNCTION find_MCO_id (di_oid_in DECIMAL)
    RETURN CHAR IS mco_id CHAR;
    BEGIN
    SELECT AOL_MCO_LOC_CD INTO mco_id
    FROM CONV_DI_FLCTY
    WHERE DIF_INST_ELMNT_OID = di_oid_in;
    RETURN TRIM(mco_id);
    END;
    ==============
    SQL*Loader Script:
    ==============
    LOAD DATA
    INFILE 'LET_DRIVE_IN_FCLTY.TXT'
    BADFILE 'LOGS\LET_DRIVE_IN_FCLTY_BADDATA.TXT'
    DISCARDFILE 'LOGS\LET_DRIVE_IN_FCLTY_DISCARDDATA.TXT'
    REPLACE
    INTO TABLE TESTM8.LET_DRIVE_IN_FCLTY
    FIELDS TERMINATED BY '~' OPTIONALLY ENCLOSED BY '"'
    (
    DIF_DRIVE_IN_OID DECIMAL EXTERNAL,
    DIF_FCLTY_TYPE_OID DECIMAL EXTERNAL NULLIF DIF_FCLTY_TYPE_OID = 'NULL',
    DIF_INST_ELMNT_OID DECIMAL EXTERNAL,
    DIF_PRI_PERSON_OID DECIMAL EXTERNAL NULLIF DIF_PRI_PERSON_OID = 'NULL',
    DIF_SEC_PERSON_OID DECIMAL EXTERNAL NULLIF DIF_SEC_PERSON_OID = 'NULL',
    DIF_CREATE_TS TIMESTAMP "yyyy-mm-dd-hh24.mi.ss.ff6",
    DIF_LAST_UPDATE_TS TIMESTAMP "yyyy-mm-dd-hh24.mi.ss.ff6",
    DIF_ADP_ID CHAR NULLIF DIF_ADP_ID = 'NULL',
    DIF_CAT_CLAIMS_IND CHAR,
    DIF_CAT_DIF_IND CHAR,
    DIF_DAYLT_SAVE_IND CHAR,
    DIF_OPEN_PT_TM_IND CHAR,
    DIF_CSA_ID CONSTANT "find_MCO_id(:DIF_DRIVE_IN_OID)"
    )
    ============
    Table Definitions:
    ============
    SQL> describe CONV_DI_FLCTY;
    Name Null? Type
    DIF_INST_ELMNT_OID NOT NULL NUMBER(18)
    AOL_MCO_LOC_CD NOT NULL VARCHAR2(3)
    SQL> describe LET_DRIVE_IN_FCLTY;
    Name Null? Type
    DIF_DRIVE_IN_OID NOT NULL NUMBER(18)
    DIF_INST_ELMNT_OID NOT NULL NUMBER(18)
    DIF_FCLTY_TYPE_OID NUMBER(18)
    DIF_ADP_ID VARCHAR2(10)
    DIF_CAT_DIF_IND NOT NULL VARCHAR2(1)
    DIF_CAT_CLAIMS_IND NOT NULL VARCHAR2(1)
    DIF_CSA_ID VARCHAR2(16)
    DIF_DAYLT_SAVE_IND NOT NULL VARCHAR2(1)
    DIF_ORG_ENTY_ID VARCHAR2(16)
    DIF_OPEN_PT_TM_IND NOT NULL VARCHAR2(1)
    DIF_CREATE_TS NOT NULL DATE
    DIF_LAST_UPDATE_TS NOT NULL DATE
    DIF_ITM_FCL_MKT_ID NUMBER(18)
    DIF_PRI_PERSON_OID NUMBER(18)
    DIF_SEC_PERSON_OID NUMBER(18)
    =========================
    Thanks for any help with this one!

    I changed one line of the function to:
    RETURN CHAR IS mco_id VARCHAR2(16);
    But I still get the same error:
    ORA-12899: value too large for column "TESTM8"."LET_DRIVE_IN_FCLTY"."DIF_CSA_ID" (actual: 30, maximum: 16)
    I just am not seeing what is being defined as 30 characters. Any ideas much appreciated!

  • Loading Data Error while using SQL Loader

    Hi All,
    I am facing another issue while loading data from a text file into the Oracle DB.
    I have a table into which data is loaded from multiple files. What I want is that, when I load the data, the file name is also loaded into the table against each record. The script works fine when I load data on Windows, but I am getting an error on Unix. My SQL*Loader script is as follows:
    source $HOME/.bash_profile
    export LOGFILE=/NSN/rawfiles/scripts/PAYMOBILE_LOAD/load.log
    ORACLE_ACCESS=cdr/cdr123_awcc@$ORACLE_SID
    FILES=`ls /NSN/rawfiles/bkp/PAYMOBILE/JAN2013/*.txt`;
    for file in ${FILES[@]}
    do
    filename=`expr substr "$file" 43 50`
    echo $filename
    sqlldr $ORACLE_ACCESS control=/NSN/rawfiles/scripts/PAYMOBILE/CAT.ctl log=$LOGFILE data=/NSN/rawfiles/bkp/PAYMOBILE/JAN2013/$filename skip=0 direct=true
    cd
    mv /NSN/rawfiles/bkp/PAYMOBILE/JAN2013/$filename /NSN/backup/PAYMOBILE/JAN2013/
    done
    The control file for the same, which I run on Unix, is:
    Load data
    INFILE "AR_101_1011_01-01-2013-01-32.txt"
    Append
    into table CDR.PAY_MOBILE_012013
    fields terminated by "|" optionally enclosed by " "
    trailing nullcols
    VERSION, TICKETTYPE, GMT_TIME Date 'YYYY-MM-DD HH24:MI:SS' , LOCAL_TIME Date 'YYYY-MM-DD HH24:MI:SS', TRANSACID, EXTERNAL_TRANSACTION_ID, MEDIUM, ALIASCATEGORY, SERVICE, OPERATION, ACTORID,
    ALIASNAME, SENDERACTORID, SENDERALIASNAME, RECIPIENTACTORID, RECIPIENTALIASNAME, THIRDACTORID, THIRDALIASNAME, ERRORHRN, CLASS, SNE,
    MASTERSNE, VOUCHERCATEGORYID, VOUCHERSCENARIOID, EXPIRYDATE Date 'YYYY-MM-DD HH24:MI:SS', UNITVALUEID, CREDIT, VAT, IERROR, OPERSTATE, OPERERROR, CREATION_DATE Date 'YYYY-MM-DD HH24:MI:SS',
    LAST_UPDATE Date 'YYYY-MM-DD HH24:MI:SS', BRANDID, CREDIT_PERIOD, CREDIT_EXPIRY_DATE Date 'YYYY-MM-DD HH24:MI:SS', SENDER_ACCOUNT, SENDER_TRANSAC, SENDER_ACCOUNT_TYPE, RECIPIENT_ACCOUNT,
    RECIPIENT_TRANSAC, RECIPIENT_ACCOUNT_TYPE, RECIPIENT_MEDIUM, TIMESTAMP Date 'YYYY-MM-DD HH24:MI:SS', SENDERBILLINGTYPE, RECIPIENTBILLINGTYPE
    The control file which works fine on Windows is as follows:
    Load data
         Append
         into table CDR.PAY_MOBILE_012013
         fields terminated by "|" optionally enclosed by " "
         trailing nullcols
    VERSION, TICKETTYPE, GMT_TIME Date 'YYYY-MM-DD HH24:MI:SS' , LOCAL_TIME Date 'YYYY-MM-DD HH24:MI:SS', TRANSACID, EXTERNAL_TRANSACTION_ID, MEDIUM, ALIASCATEGORY, SERVICE, OPERATION, ACTORID,
    ALIASNAME, SENDERACTORID, SENDERALIASNAME, RECIPIENTACTORID, RECIPIENTALIASNAME, THIRDACTORID, THIRDALIASNAME, ERRORHRN, CLASS, SNE,
    MASTERSNE, VOUCHERCATEGORYID, VOUCHERSCENARIOID, EXPIRYDATE Date 'YYYY-MM-DD HH24:MI:SS', UNITVALUEID, CREDIT, VAT, IERROR, OPERSTATE, OPERERROR, CREATION_DATE Date 'YYYY-MM-DD HH24:MI:SS',
    LAST_UPDATE Date 'YYYY-MM-DD HH24:MI:SS', BRANDID, CREDIT_PERIOD, CREDIT_EXPIRY_DATE Date 'YYYY-MM-DD HH24:MI:SS', SENDER_ACCOUNT, SENDER_TRANSAC, SENDER_ACCOUNT_TYPE, RECIPIENT_ACCOUNT,
    RECIPIENT_TRANSAC, RECIPIENT_ACCOUNT_TYPE, RECIPIENT_MEDIUM, TIMESTAMP Date 'YYYY-MM-DD HH24:MI:SS', SENDERBILLINGTYPE, RECIPIENTBILLINGTYPE, FILENAME CONSTANT "AR_101_1011_01-12-2012-23-32.txt"
    and the SQL*Loader script which works fine on Windows is as follows:
    cd K:\paymobilefiles\DEC2012\1
    for %%f in (*.txt) do (K:\paymobilefiles\DEC2012\1\LOAD\REPTEXT K:\paymobilefiles\DEC2012\1\LOAD\TEST.ctl "%%f">K:\paymobilefiles\DEC2012\1\LOAD\MYMOBILE.ctl
    sqlldr userid=cdr/cdr123_awcc@tsiindia control=K:\paymobilefiles\DEC2012\1\LOAD\MYMOBILE.ctl rows=50000 bindsize=20000000 readsize=20000000 data=%%f log=K:\paymobilefiles\DEC2012\1\LOAD\MYMOBILE.log
    move %%f K:\paymobilefiles\DEC2012\backup\1
    )
    When using the Windows-based scripts I get the file name in the FILENAME column, but when running the Unix scripts above I am not able to get the file names into the FILENAME column.
    Can anyone please help with this?

    Hi,
    There is a difference between the control files: note the INFILE handling and the FILENAME CONSTANT value. The Windows control file is:
    Load data
    Append
    into table CDR.PAY_MOBILE_012013
    fields terminated by "|" optionally enclosed by " "
    trailing nullcols
    VERSION, TICKETTYPE, GMT_TIME Date 'YYYY-MM-DD HH24:MI:SS' , LOCAL_TIME Date 'YYYY-MM-DD HH24:MI:SS', TRANSACID, EXTERNAL_TRANSACTION_ID, MEDIUM, ALIASCATEGORY, SERVICE, OPERATION, ACTORID,
    ALIASNAME, SENDERACTORID, SENDERALIASNAME, RECIPIENTACTORID, RECIPIENTALIASNAME, THIRDACTORID, THIRDALIASNAME, ERRORHRN, CLASS, SNE,
    MASTERSNE, VOUCHERCATEGORYID, VOUCHERSCENARIOID, EXPIRYDATE Date 'YYYY-MM-DD HH24:MI:SS', UNITVALUEID, CREDIT, VAT, IERROR, OPERSTATE, OPERERROR, CREATION_DATE Date 'YYYY-MM-DD HH24:MI:SS',
    LAST_UPDATE Date 'YYYY-MM-DD HH24:MI:SS', BRANDID, CREDIT_PERIOD, CREDIT_EXPIRY_DATE Date 'YYYY-MM-DD HH24:MI:SS', SENDER_ACCOUNT, SENDER_TRANSAC, SENDER_ACCOUNT_TYPE, RECIPIENT_ACCOUNT,
    RECIPIENT_TRANSAC, RECIPIENT_ACCOUNT_TYPE, RECIPIENT_MEDIUM, TIMESTAMP Date 'YYYY-MM-DD HH24:MI:SS', SENDERBILLINGTYPE, RECIPIENTBILLINGTYPE, FILENAME CONSTANT "AR_101_1011_01-12-2012-23-32.txt"
    )
    Thanks,
    Ajay More
    http://moreajays.blogspot.com
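    A rough Unix equivalent of the Windows REPTEXT step is to generate a per-file control file from a template before each sqlldr call. The template.ctl file and the XFILENAMEX placeholder below are assumptions; the paths and login come from the post:
    #!/bin/sh
    # template.ctl is the Unix control file above with an extra field such as:
    #   , FILENAME CONSTANT "XFILENAMEX"
    for file in /NSN/rawfiles/bkp/PAYMOBILE/JAN2013/*.txt
    do
      fname=`basename "$file"`
      sed "s/XFILENAMEX/$fname/" template.ctl > /tmp/paymobile.ctl
      sqlldr cdr/cdr123_awcc@$ORACLE_SID control=/tmp/paymobile.ctl data="$file" log=/NSN/rawfiles/scripts/PAYMOBILE_LOAD/load.log
      mv "$file" /NSN/backup/PAYMOBILE/JAN2013/
    done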

  • SQL*LOADER and decode

    I'm setting up a SQL*Loader script and trying to use the DECODE function as referred to in 'Applying SQL Operators to Fields', but I'm getting the error 'Token longer than max allowable length of 258 chars'. Is there a limit to the size of the DECODE statement within SQL*Loader, or is it better to use a table trigger to handle this on insert? I ran the DECODE statement as a SELECT through SQL*Plus and it works fine there. Oracle 8.0 Utilities shows an example of DECODE in Ch. 5, but Oracle 9i Utilities Ch. 6 does not. Has anyone done this, and what's the impact on the performance of the load if I can get it to work? See my example below:
    LOAD DATA
    INFILE 'e2e_prod_cust_profile.csv'
    APPEND
    INTO TABLE APPS.RA_CUSTOMER_PROFILES_INTERFACE
    FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"'
    (Insert_update_flag CHAR(1),
    Orig_system_customer_ref CHAR(240),
    customer_profile_class_name CHAR(30) NULLIF customer_profile_class=BLANKS
    "decode(customer_profile_class_name,
    'NORTHLAND Default','(MIA) Default',
    'NORTHLAND Non Consolidated','(MIA) Non Cons',
    'NORTHLAND Consolidated A','(MIA) Cons A',
    'NORTHLAND Consolidated B','(MIA) Cons B',
    'NORTHLAND Consolidated C','(MIA) Cons C',
    'NORTHLAND Consolidated D','(MIA) Cons D',
    'NORTHLAND Cons A NonZS','(MIA) Cons A NonZS',
    'NORTHLAND Cons B NonZS','(MIA) Cons B NonZS',
    'NORTHLAND Cons C NonZS','(MIA) Cons C NonZS',
    'NORTHLAND Cons D NonZS','(MIA) Cons D NonZS',
    'NORTHLAND International Billing','(MIA) International Billing',
    customer_profile_class_name)",
    credit_hold CHAR(1),
    overall_credit_limit INTERGER EXTERNAL,
    "e2e_cust_profile.ctl" 49 lines, 1855 characters
    SQL*Loader-350: Syntax error at line 15.
    Token longer than max allowable length of 258 chars
    'NORTHLAND Consolidated D','(MIA) Cons D',
    ^

    Your controlfile is incomplete and has some typos, but you could try something like:
    create or replace function decode_profile_class_name (p_longname IN VARCHAR2)
    return VARCHAR2
    is
    begin
      CASE p_longname
        WHEN 'NORTHLAND Default'               THEN RETURN '(MIA) Default';
        WHEN 'NORTHLAND Non Consolidated'      THEN RETURN '(MIA) Non Cons';
        WHEN 'NORTHLAND Consolidated A'        THEN RETURN '(MIA) Cons A';
        WHEN 'NORTHLAND Consolidated B'        THEN RETURN '(MIA) Cons B';
        WHEN 'NORTHLAND Consolidated C'        THEN RETURN '(MIA) Cons C';
        WHEN 'NORTHLAND Consolidated D'        THEN RETURN '(MIA) Cons D';
        WHEN 'NORTHLAND Cons A NonZS'          THEN RETURN '(MIA) Cons A NonZS';
        WHEN 'NORTHLAND Cons B NonZS'          THEN RETURN '(MIA) Cons B NonZS';
        WHEN 'NORTHLAND Cons C NonZS'          THEN RETURN '(MIA) Cons C NonZS';
        WHEN 'NORTHLAND Cons D NonZS'          THEN RETURN '(MIA) Cons D NonZS';
        WHEN 'NORTHLAND International Billing' THEN RETURN '(MIA) International Billing';
        ELSE RETURN p_longname;
      END CASE;
    end;
    /
    LOAD DATA
    INFILE 'e2e_prod_cust_profile.csv'
    APPEND
    INTO TABLE APPS.RA_CUSTOMER_PROFILES_INTERFACE
    FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"'
    (
    Insert_update_flag          CHAR(1),
    Orig_system_customer_ref    CHAR(240),
    customer_profile_class_name CHAR(30) NULLIF customer_profile_class_name=BLANKS "decode_profile_class_name(:customer_profile_class_name)",
    credit_hold                 CHAR(1),
    overall_credit_limit        INTEGER EXTERNAL
    )

  • SQL*Loader and integrity constraints

    I am running into trouble with integrity constraints in my SQL*Loader script runs. Should I go into Unix and remove the constraints from SQL*Plus, then ask the wizard to reinstall them after the data load? Is this documented somewhere? I can't find it. Sorry for asking such a basic question.
    Thanks,
    Ann Cantelow

    Hi,
    Your approach is correct.
    Create tables first
    Move data via the scripts
    Create constraints, indexes, primary keys
    See the User Guide for the Oracle Migration Workbench: http://otn.oracle.com/tech/migration/workbench
    Regards
    John
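    If you would rather keep the constraints defined and simply switch them off around the load, a minimal sketch (table, constraint and login names are placeholders):
    #!/bin/sh
    # disable the constraint, run the load, then re-enable it
    echo "ALTER TABLE target_table DISABLE CONSTRAINT target_table_fk1;" | sqlplus -s scott/tiger
    sqlldr userid=scott/tiger control=target_table.ctl log=target_table.log
    echo "ALTER TABLE target_table ENABLE CONSTRAINT target_table_fk1;" | sqlplus -s scott/tiger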

  • Sql Loader and carriage returns

    I am currently trying to use SQL*Loader to load data from flat files that were extracted from Sybase using bcp and delimited with pipes. There are text and varchar columns that contain carriage return/line feeds. I want to preserve these, but I cannot load them into Oracle 8.0.5 using SQL*Loader, as it interprets them as end-of-record indicators. Does anyone have a way to solve this problem? Any assistance would be appreciated.
    Thanks in advance,
    Jignesh

  • SQL *Loader and External Table

    Hi,
    Can anyone tell me the difference between SQL*Loader and an external table?
    Under what conditions should we use SQL*Loader versus an external table?
    Thanks

    External tables are accessible from SQL, which generally simplifies life if the data files are physically located on the database server since you don't have to coordinate a call to an external SQL*Loader script with other PL/SQL processing. Under the covers, external tables are normally just invoking SQL*Loader.
    SQL*Loader is more appropriate if the data files are on a different server or if it is easier to call an executable rather than calling PL/SQL (i.e. if you have a batch file that runs on a server other than the database server that wants to FTP a data file from a FTP server and then load the data into Oracle).
    Justin
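    As an illustration of the external-table side of the comparison, here is a minimal sketch run from the shell; the directory path, table, file and column names and the login are all placeholders:
    #!/bin/sh
    # create a directory object and an external table over a comma-delimited file,
    # then load it with plain SQL (requires the relevant privileges)
    echo "CREATE OR REPLACE DIRECTORY ext_dir AS '/data/incoming';
    CREATE TABLE emp_ext (
      empno NUMBER,
      ename VARCHAR2(30)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY ext_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
      )
      LOCATION ('emp.dat')
    );
    INSERT INTO emp SELECT * FROM emp_ext;
    COMMIT;" | sqlplus -s scott/tiger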

  • SQL-Loader and Pipes

    I have to insert 300,000,000 records (generated dynamically by a separate process) into an Oracle 8.1.6 database. I'd like to do this via SQL*Loader, but since Linux (kernel 2.2 at least) supports only files up to 2 GB and I don't have the necessary space on my machine to generate a large number of ASCII files, I'd like to ask whether there is a possibility to use an ordinary pipe in combination with SQL*Loader. I tried to use a named pipe, but all my attempts failed.
    I don't want to use Perl and the DBI module since this would be far too slow (it would take about one week, I guess).
    Hans

