SQL Loader in .csh file (KSH)

Hi,
We have some scripts (.sh and .csh) on our UNIX boxes with Oracle 10g. To load data into tables we call sqlldr. Interestingly, our .sh scripts work fine but the .csh scripts do not. If we set the SID manually, the .csh script runs correctly. How do I set the SID for .csh files? Here is the sample code that I have:
setenv ORACLE_HOME /folder/oracle/product/10.2.0
setenv PATH ${PATH}:${ORACLE_HOME}/bin
setenv LD_LIBRARY_PATH $ORACLE_HOME/lib
setenv TNS_ADMIN $work_root/common
Here is the call:
sqlldr $DB_LOGIN data=$PRCS_FILE control=$CTL_FILE log=$SQL_LDR_LOG
$DB_LOGIN contains the DB credentials in the form username/password, and the same value is accepted by the .sh scripts. Let me know if you need more info.

Please post the exact scripts and describe what you do to make them work.
The current information is not enough.
The listed part of the csh script does not set $DB_LOGIN.
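For illustration, a minimal csh sketch with the SID exported; the SID value ORCL and the scott/tiger credentials are placeholders, not values from this thread:
# csh does not inherit ORACLE_SID from your .sh environment, so set it explicitly
setenv ORACLE_HOME     /folder/oracle/product/10.2.0
setenv ORACLE_SID      ORCL
setenv PATH            ${PATH}:${ORACLE_HOME}/bin
setenv LD_LIBRARY_PATH ${ORACLE_HOME}/lib
setenv TNS_ADMIN       ${work_root}/common
# or use a full connect string instead: scott/tiger@${ORACLE_SID}
setenv DB_LOGIN        scott/tiger
sqlldr $DB_LOGIN data=$PRCS_FILE control=$CTL_FILE log=$SQL_LDR_LOG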

Similar Messages

  • SQL*Loader Sequential Data File Record Processing?

    If I use the conventional path will SQL*Loader process a data file sequentially from top to bottom?  I have a file comprised of header and detail records with no value found in the detail records that can be used to relate to the header records.  The only option is to derive a header value via a sequence (nextval) and then populate the detail records with the same value pulled from the same sequence (currval).  But for this to work SQL*Loader must process the file in the exact same sequence that the data has been written to the data file.  I've read through the 11g Oracle® Database Utilities SQL*Loader sections looking for proof that this is what will happen but haven't found this information and I don't want to assume that SQL*Loader will always process the data file records sequentially.
    Thank you

    Oracle Support responded with the following statement.
    "Yes, SQL*LOADER process data file from top to bottom.
    This was touched in the note below:
    SQL*Loader - How to Load a Single Logical Record from Physical Records which Include Linefeeds (Doc ID 160093.1)"
    Jason
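    For illustration only, a conventional-path control-file sketch of the nextval/currval technique described above; the sequence hdr_seq, the HDR/DTL record markers and both table names are hypothetical:
    LOAD DATA
    INFILE 'orders.dat'
    APPEND
    INTO TABLE order_header
    WHEN (1:3) = 'HDR'
    FIELDS TERMINATED BY ','
    (rec_type FILLER POSITION(1),
    header_id EXPRESSION "hdr_seq.nextval",
    header_text)
    INTO TABLE order_detail
    WHEN (1:3) = 'DTL'
    FIELDS TERMINATED BY ','
    (rec_type FILLER POSITION(1),
    header_id EXPRESSION "hdr_seq.currval",
    detail_text)
    Because currval picks up whatever nextval last returned in the session, this only works if the records really are processed in file order, which is the behaviour Oracle Support confirms above.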

  • SQL loader and stream files with new line '

    I have been trying unsuccessfully to load EDI files using SQL*Loader. The problem
    is that the lines are terminated by ' and when I use the stream file option it does
    not recognise the line terminator given. As I understand it from the documentation
    this should work - but it does not. I have also used the hex option with no better
    result. Does anyone have any ideas?
    I can and have used tr "[']" "[\n]" in Unix to convert the ' to newlines - I just
    wonder whether I am missing something in SQL*Loader that will allow me to do this.
    This is the SQL*Loader control file:
    LOAD DATA
    INFILE 'WS860685.MFD' "Str ''' "
    BADFILE 'WS860685.bad'
    DISCARDFILE 'WS860685.dsc'
    INTO TABLE "DUND1"."EDI_LOADED_TEMP"
    REPLACE
    FIELDS TERMINATED BY '+'
    TRAILING NULLCOLS
    (L1,
    L2,
    L3,
    L4,
    L5,
    L6,
    L7,
    L8,
    L9,
    L10,
    L11,
    L12,
    L13,
    L14,
    L15,
    L16,
    L17,
    L18,
    L19,
    L20,
    L21,
    L22,
    L23,
    L24,
    L25,
    L26,
    L27,
    L28,
    L29,
    L30,
    L31,
    L32,
    L33,
    L34,
    L35,
    L36,
    L37,
    L38,
    L39,
    L40,
    LNO)
    Here's a sample of the data:
    UNB+UNOA:2+5398888501357+5398888501838+080306:0737+395+ DESADV+++1'UNH+0001+DESADV:D:93A:UN:EAN004'

    http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:6020061915147
    answers this question perfectly
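    A sketch of the tr workaround mentioned above, plus the hex form of the stream terminator; the control-file name edi_loaded_temp.ctl and the dund1/password login are placeholders:
    #!/bin/sh
    # Turn the EDI segment terminator (') into newlines, then load the result
    # with an ordinary newline-terminated control file.
    tr "'" '\n' < WS860685.MFD > WS860685.txt
    sqlldr dund1/password control=edi_loaded_temp.ctl data=WS860685.txt log=WS860685.log
    # If you prefer to keep the stream option, the terminator can also be written in hex:
    #   INFILE 'WS860685.MFD' "str X'27'"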

  • SQL Loader and control file

    I know I've done this before, but I don't use SQL*Loader often and I'm having issues getting a file to load.
    The table has 6 columns in it - one of which is a timestamp.
    I was initially having date-format issues loading it. I ruled out any problem with the timestamp format by loading a dummy table with some timestamp-based data without issue.
    So I think the issue is that the first column of the table I'm loading is a value I'm attempting to populate from a sequence during the load, and that I'm getting something wrong there.
    Table is as such:
    CREATE TABLE ACS_IPS
        (seq_ips NUMBER,
        col2   VARCHAR2(100),
        col3  VARCHAR2(100),
        col4    VARCHAR2(100),
        col5 TIMESTAMP,
        col6    VARCHAR2(100),
        col7    VARCHAR2(100),
        col8 DATE  DEFAULT SYSDATE NOT NULL,
    col9 VARCHAR2(30) DEFAULT USER);
    The control file is:
    load data
    truncate
    into table acs_ips
    fields terminated by ","
    trailing nullcols
        seq_ips "seq_ips.nextval",
        col2,
        col3,
        col4,
        col5 TIMESTAMP "YYYY-MM-DD HH24:MI:SS.FF9",
        col6,
        col7
    )
    The sequence column isn't in the file being loaded... and there are additional columns that are defaulted on the table that aren't in the control file.
    Any help is appreciated... The error I'm getting is:
    Rejected - Error on table CAMS.ACS_IPS, column ACTION_START.
    ORA-01841: (full) year must be between -4713 and +9999, and not be 0

    Yeah - not sure that clears it up...
    The sequence in my table is the first column, and I think that's the problem: the sequence is listed as the first column in the control file, but it's not in the file being loaded, so (again, I think) it's skewing the data being read in - which is why I'm getting the timestamp error on that one column (it's actually reading the next field in the file rather than the actual timestamp).
    If that's the issue, I'm not sure how to avoid it without restructuring the table to put the sequence physically at the end. I'm certain that's not necessary and that I'm overlooking something otherwise simple that is evading me.
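    One way to avoid the shift, sketched on the assumption that a database sequence named seq_ips exists: declare the column as an EXPRESSION so SQL*Loader does not map an input field to it, and the remaining fields line up with the file again.
    load data
    truncate
    into table acs_ips
    fields terminated by ","
    trailing nullcols
    (   seq_ips EXPRESSION "seq_ips.nextval",
        col2,
        col3,
        col4,
        col5 TIMESTAMP "YYYY-MM-DD HH24:MI:SS.FF9",
        col6,
        col7
    )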

  • SQL*Loader and multiple files

    Hello, I am tasked with loading tables of 21+ million rows. Will SQL*Loader perform better with one large file or many smaller files? Is there an ideal file size for optimal performance? Thank you
    David

    Don, when I tried to load a 21M-row table using direct path, I got the following messages:
    Record 2373: Rejected - Error on table STAGE_CUSTOMER.
    ORA-03113: end-of-file on communication channel
    SQL*Loader-926: OCI error while uldlfca:OCIDirPathColArrayLoadStream for table STAGE_CUSTOMER
    SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.
    SQL*Loader-925: Error while uldlgs: OCIStmtExecute (ptc_hp)
    ORA-03114: not connected to ORACLE
    SQL*Loader-925: Error while uldlgs: OCIStmtFetch (ptc_hp)
    ORA-24338: statement handle not executed
    Here is the SQL*Loader log:
    SQL*Loader: Release 9.2.0.1.0 - Production on Thu Apr 26 15:38:29 2007
    Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
    Control File: stage_customer.ctl
    Character Set UTF8 specified for all input.
    First primary datafile stage_Customer_20070301.csv has a
    utf8 byte order mark in it.
    Data File: stage_Customer_20070301.csv
    Bad File: stage_Customer_20070301.bad
    Discard File: none specified
    (Allow all discards)
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 1000
    Continuation: none specified
    Path used: Direct
    Silent options: FEEDBACK
    Table STAGE_CUSTOMER, loaded from every logical record.
    Insert option in effect for this table: APPEND
    TRAILING NULLCOLS option in effect
    Column Name Position Len Term Encl Datatype
    ROW_ID SEQUENCE (MAX, 1)
    CUSTOMER_ACCT_NUM FIRST * , O(") CHARACTER
    IS_DELETED NEXT * , O(") CHARACTER
    SQL string for column : "CASE WHEN UPPER(:Is_Deleted) IN ('FALSE','0') THEN 0 ELSE 1 END"
    NAME_PREFIX NEXT * , O(") CHARACTER
    FIRST_NAME NEXT * , O(") CHARACTER
    MIDDLE_NAME NEXT * , O(") CHARACTER
    LAST_NAME NEXT * , O(") CHARACTER
    NAME_SUFFIX NEXT * , O(") CHARACTER
    NICK_NAME NEXT * , O(") CHARACTER
    ALT_FIRST_NAME NEXT * , O(") CHARACTER
    ALT_LAST_NAME NEXT * , O(") CHARACTER
    MARKETING_SOURCE_ID NEXT * , O(") CHARACTER
    HOME_PHONE NEXT * , O(") CHARACTER
    WORK_PHONE NEXT * , O(") CHARACTER
    MOBILE_PHONE NEXT * , O(") CHARACTER
    ALTERNATE_PHONE NEXT * , O(") CHARACTER
    EMAIL_ADDR NEXT * , O(") CHARACTER
    ALT_EMAIL_ADDR NEXT * , O(") CHARACTER
    BIRTH_DATE NEXT * , O(") CHARACTER
    SQL string for column : "TRUNC(TO_DATE(:Birth_Date, 'MM/DD/YYYY HH24:MI:SS'))"
    SALES_CHANNEL_ID NEXT * , O(") CHARACTER
    SQL string for column : "decode(:Sales_Channel_id,NULL,NULL,NULL)"
    ASSOCIATE_NUMBER NEXT * , O(") CHARACTER
    ALT_ASSOCIATE_NUMBER NEXT * , O(") CHARACTER
    UPDATE_LOCATION_CD NEXT * , O(") CHARACTER
    UPDATE_DATE NEXT * , O(") CHARACTER
    SQL string for column : "TRUNC(TO_DATE(:Update_Date, 'MM/DD/YYYY HH24:MI:SS'))"
    CUSTOMER_LOGIN_NAME NEXT * , O(") CHARACTER
    DISCOUNT_CD NEXT * , O(") CHARACTER
    DISCOUNT_PERCENT NEXT * , O(") CHARACTER
    BUSINESS_NAME NEXT * , O(") CHARACTER
    POS_TAX_FLAG NEXT * , O(") CHARACTER
    POS_TAX_PROMPT NEXT * , O(") CHARACTER
    SQL string for column : "CASE WHEN UPPER(:POS_TAX_PROMPT) IN ('FALSE','0') THEN 0 ELSE 1 END"
    POS_DEFAULT_TAX_ID NEXT * , O(") CHARACTER
    POS_TAX_ID_EXPIRATION_DATE NEXT * , O(") CHARACTER
    POS_AUTHORIZED_USER_FLAG NEXT * , O(") CHARACTER
    SQL string for column : "CASE WHEN UPPER(:POS_AUTHORIZED_USER_FLAG) IN ('FALSE','0') THEN 0 ELSE 1 END"
    POS_ALLOW_PURCHASE_ORDER_FLAG NEXT * , O(") CHARACTER
    SQL string for column : "CASE WHEN UPPER(:POS_ALLOW_PURCHASE_ORDER_FLAG) IN ('FALSE','0') THEN 0 ELSE 1 END"
    ADDRESS1 NEXT * , O(") CHARACTER
    ADDRESS2 NEXT * , O(") CHARACTER
    ADDRESS3 NEXT * , O(") CHARACTER
    CITY NEXT * , O(") CHARACTER
    STATE_CD NEXT * , O(") CHARACTER
    POSTAL_CD NEXT * , O(") CHARACTER
    COUNTRY_CD NEXT * , O(") CHARACTER
    ALLOW_UPDATE NEXT * , O(") CHARACTER
    SQL string for column : "CASE WHEN UPPER(:ALLOW_UPDATE) IN ('FALSE','0') THEN 0 ELSE 1 END"
    ACTION_CODE NEXT * , O(") CHARACTER
    SQL string for column : "CASE WHEN UPPER(:action_code) IN ('FALSE','0') THEN 0 ELSE 1 END"
    ACCOUNT_TYPE_ID NEXT * , O(") CHARACTER
    LOCALE_CD NEXT * , O(") CHARACTER
    SQL string for column : "CASE WHEN :Locale_CD IS NOT NULL AND :Locale_CD LIKE '__-__' THEN :Locale_CD ELSE 'en-US' END"
    IS_READY_FOR_PROCESSING CONSTANT
    Value is '1'
    IS_BUSINESS CONSTANT
    Value is '0'
    HAD_ERRORS CONSTANT
    Value is '0'
    Record 2373: Rejected - Error on table STAGE_CUSTOMER.
    ORA-03113: end-of-file on communication channel
    SQL*Loader-926: OCI error while uldlfca:OCIDirPathColArrayLoadStream for table STAGE_CUSTOMER
    SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.
    SQL*Loader-925: Error while uldlgs: OCIStmtExecute (ptc_hp)
    ORA-03114: not connected to ORACLE
    SQL*Loader-925: Error while uldlgs: OCIStmtFetch (ptc_hp)
    ORA-24338: statement handle not executed
    Table STAGE_CUSTOMER:
    0 Rows successfully loaded.
    1 Row not loaded due to data errors.
    0 Rows not loaded because all WHEN clauses were failed.
    0 Rows not loaded because all fields were null.
    Bind array size not used in direct path.
    Column array rows : 5000
    Stream buffer bytes: 256000
    Read buffer bytes: 1048576
    Total logical records skipped: 0
    Total logical records read: 3469
    Total logical records rejected: 1
    Total logical records discarded: 0
    Direct path multithreading optimization is disabled
    Run began on Thu Apr 26 15:38:29 2007
    Run ended on Thu Apr 26 15:38:30 2007
    Elapsed time was: 00:00:01.18
    CPU time was: 00:00:00.32
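    If you do want to compare one large file against several smaller ones, a sketch using the standard split utility (the chunk size and the scott/tiger login are placeholders; the control file name is the one from the log above):
    #!/bin/sh
    # Break the extract into 1,000,000-line pieces and load them one after another.
    split -l 1000000 stage_Customer_20070301.csv chunk_
    for f in chunk_*
    do
        sqlldr userid=scott/tiger control=stage_customer.ctl data="$f" log="$f.log" bad="$f.bad"
    done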

  • SQL Loader - CSV Data file with carriage returns and line feeds

    Hi,
    I have a CSV data file with occasional carriage returns and line feeds in the middle of records, which throws my SQL*Loader script off. SQL*Loader takes the characters following the carriage return as a new record and gives me an error. Is there a way I could handle carriage returns and line feeds in SQL*Loader?
    Please help. Thank you for your time.
    This is my SQL*Loader control file:
    load data
    infile 'D:\Documents and Settings\user1\My Documents\infile.csv' "str '\r\n'"
    append
    into table MYSCHEMA.TABLE1
    fields terminated by ','
    OPTIONALLY ENCLOSED BY '"'
    trailing nullcols
    ( NAME CHAR(4000),
    field2 FILLER,
    field3 FILLER,
    TEST DEPT CHAR(4000)
    )

    You can "regexp_replace" the columns for special characters

  • SQL*Loader with multiple files

    Gurus,
    I search the documentation and this forum and haven't found a solution to my issue yet...
    I am not expert of SQL*Loader. I have used SQL*Loader to copy from one file to a table many times. But I have not copied multiple files into one table especially with different names.
    More specifically....
    I need to load data from multiple files into a table. But the file names will be different each time. A file will be created every hour. The file name will consist of the root file name appended by a time stamp. For example, a file created on 10/07/2010 at 2:15 P.M. would be filea100720101415.txt while a file created on 10/08/2010 at 8:15 A.M. would be filea100820100815.txt. All the files will be in one directory. How can I load the data from the files using SQL*Loader?
    My database: Oracle 10g Release 2
    Operating System: Windows 2003 Server
    Please assist.
    Robert

    sect55 wrote:
    Gurus,
    I search the documentation and this forum and haven't found a solution to my issue yet...
    I am not expert of SQL*Loader. I have used SQL*Loader to copy from one file to a table many times. But I have not copied multiple files into one table especially with different names.
    More specifically....
    I need to load data from multiple files into a table. But the file names will be different each time. A file will be created every hour. The file name will consist of the root file name appended by a time stamp. For example, a file created on 10/07/2010 at 2:15 P.M. would be filea100720101415.txt while a file created on 10/08/2010 at 8:15 A.M. would be filea100820100815.txt. All the files will be in one directory. How can I load the data from the files using SQL*Loader?
    My database: Oracle 10g Release 2
    Operating System: Windows 2003 Server
    Please assist.
    Robert
    Too bad this isn't in *nix, where you get a powerful shell scripting capability.
    That said, here is the core of the solution .... you will also need a way to identify files that have been processed vs. new ones. Maybe rename them, maybe move them. But with this sample you can see the basics. From there it is really an issue of DOS scripting, which would better be found by googling around a bit.
    cd c:\loadfiles
    FOR %%f IN (*.txt) DO SQLLDR CONTROL=sample.ctl, LOG=sample.log, BAD=baz.bad, DATA=%%f
    Try googling "dos scripting language". You'll find lots of tutorials and ideas on "advanced" (well, as advanced as DOS gets) techniques to solve your problem.
    Edited by: EdStevens on Dec 1, 2010 5:03 PM

  • Problem importing a csv file with SQL*Loader and control file

    I have a *.csv file looking like this:
    E0100070;EKKJ 1X10/10 1 KV;1;2003-06-16;01C;75
    E0100075;EKKJ 1X10/10 1 KV;500;2003-06-16;01C;67
    E0100440;EKKJ 2X2,5/2,5 1 KV;1;2003-06-16;01C;37,2
    E0100445;EKKJ 2X2,5/2,5 1 KV;500;2003-06-16;01C;33,2
    E0100450;EKKJ 2X4/4 1 KV;1;2003-06-16;01C;53
    E0100455;EKKJ 2X4/4 1 KV;500;2003-06-16;01C;47,1
    I want to import this csv file to this table:
    create table artikel (artnr varchar2(10), namn varchar2(25), fp_storlek number, datum date, mtrlid varchar2(5), pris number);
    My controlfile looks like this:
    LOAD DATA
    INFILE 'e:\test.csv'
    INSERT
    INTO TABLE ARTIKEL
    FIELDS TERMINATED BY ';'
    TRAILING NULLCOLS
    (ARTNR, NAMN, FP_STORLEK char "to_number(:fp_storlek,'99999')", DATUM date 'yyyy-mm-dd', MTRLID, pris char "to_number(:pris,'999999D99')")
    I can't get SQL*Loader to import the last column (pris) the way I want. It ignores my decimal separator, which in this case is "," and not "." - maybe that is the problem. If the decimal separator is the problem, how can I get Oracle to recognize "," as the decimal separator?
    The result of the import right now is that a decimal number (37,2) becomes 372 in the table.

    Set the NLS_NUMERIC_CHARACTERS environment variable at the OS level before running SQL*Loader:
    $ cat test.csv
    E0100070;EKKJ 1X10/10 1 KV;1;2003-06-16;01C;75
    E0100075;EKKJ 1X10/10 1 KV;500;2003-06-16;01C;67
    E0100440;EKKJ 2X2,5/2,5 1 KV;1;2003-06-16;01C;37,2
    E0100445;EKKJ 2X2,5/2,5 1 KV;500;2003-06-16;01C;33,2
    E0100450;EKKJ 2X4/4 1 KV;1;2003-06-16;01C;53
    E0100455;EKKJ 2X4/4 1 KV;500;2003-06-16;01C;47,1
    $ cat artikel.ctl
    LOAD DATA
    INFILE 'test.csv'
    replace
    INTO TABLE ARTIKEL
    FIELDS TERMINATED BY ';'
    TRAILING NULLCOLS
    (ARTNR, NAMN, FP_STORLEK char "to_number(:fp_storlek,'99999')", DATUM date 'yyyy-mm-dd', MTRLID, pris char "to_number(:pris,'999999D99')")
    $ sqlldr scott/tiger control=artikel
    SQL*Loader: Release 10.1.0.3.0 - Production on Sat Nov 12 15:10:01 2005
    Copyright (c) 1982, 2004, Oracle.  All rights reserved.
    Commit point reached - logical record count 6
    $ sqlplus scott/tiger
    SQL*Plus: Release 10.1.0.3.0 - Production on Sat Nov 12 15:10:11 2005
    Copyright (c) 1982, 2004, Oracle.  All rights reserved.
    Connected to:
    Oracle Database 10g Enterprise Edition Release 10.1.0.3.0 - Production
    With the Partitioning, OLAP and Data Mining options
    SQL> select * from artikel;
    ARTNR      NAMN                      FP_STORLEK DATUM      MTRLI       PRIS
    E0100070   EKKJ 1X10/10 1 KV                  1 16/06/2003 01C           75
    E0100075   EKKJ 1X10/10 1 KV                500 16/06/2003 01C           67
    E0100440   EKKJ 2X2,5/2,5 1 KV                1 16/06/2003 01C          372
    E0100445   EKKJ 2X2,5/2,5 1 KV              500 16/06/2003 01C          332
    E0100450   EKKJ 2X4/4 1 KV                    1 16/06/2003 01C           53
    E0100455   EKKJ 2X4/4 1 KV                  500 16/06/2003 01C          471
    6 rows selected.
    SQL> exit
    Disconnected from Oracle Database 10g Enterprise Edition Release 10.1.0.3.0 - Production
    With the Partitioning, OLAP and Data Mining options
    $ export NLS_NUMERIC_CHARACTERS=',.'
    $ sqlldr scott/tiger control=artikel
    SQL*Loader: Release 10.1.0.3.0 - Production on Sat Nov 12 15:10:41 2005
    Copyright (c) 1982, 2004, Oracle.  All rights reserved.
    Commit point reached - logical record count 6
    $ sqlplus scott/tiger
    SQL*Plus: Release 10.1.0.3.0 - Production on Sat Nov 12 15:10:45 2005
    Copyright (c) 1982, 2004, Oracle.  All rights reserved.
    Connected to:
    Oracle Database 10g Enterprise Edition Release 10.1.0.3.0 - Production
    With the Partitioning, OLAP and Data Mining options
    SQL> select * from artikel;
    ARTNR      NAMN                      FP_STORLEK DATUM      MTRLI       PRIS
    E0100070   EKKJ 1X10/10 1 KV                  1 16/06/2003 01C           75
    E0100075   EKKJ 1X10/10 1 KV                500 16/06/2003 01C           67
    E0100440   EKKJ 2X2,5/2,5 1 KV                1 16/06/2003 01C         37,2
    E0100445   EKKJ 2X2,5/2,5 1 KV              500 16/06/2003 01C         33,2
    E0100450   EKKJ 2X4/4 1 KV                    1 16/06/2003 01C           53
    E0100455   EKKJ 2X4/4 1 KV                  500 16/06/2003 01C         47,1
    6 rows selected.
    SQL>
    Control file is exactly as yours, I just put replace instead of insert.
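    An alternative that does not depend on the OS environment, sketched from the same control file: pass the NLS setting to TO_NUMBER directly (note the doubled single quotes inside the double-quoted SQL string):
    (ARTNR, NAMN, FP_STORLEK char "to_number(:fp_storlek,'99999')", DATUM date 'yyyy-mm-dd', MTRLID,
    pris char "to_number(:pris,'999999D99','nls_numeric_characters='',.''')")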

  • Sql loader maximum data file size..?

    Hi - I wrote a SQL*Loader script that runs through a shell script and imports data into a table from a CSV file. The CSV file size is around 700MB. I am using Oracle 10g in a Sun Solaris 5 environment.
    My question is: is there any maximum data file size? The following code is from my shell script.
    SQLLDR=
    DB_USER=
    DB_PASS=
    DB_SID=
    controlFile=
    dataFile=
    logFileName=
    badFile=
    ${SQLLDR} userid=$DB_USER"/"$DB_PASS"@"$DB_SID \
              control=$controlFile \
              data=$dataFile \
              log=$logFileName \
              bad=$badFile \
              direct=true \
              silent=all \
              errors=5000
    Here is my control file code:
    LOAD DATA
    APPEND
    INTO TABLE KEY_HISTORY_TBL
    WHEN OLD_KEY <> ''
    AND NEW_KEY <> ''
    FIELDS TERMINATED BY ','
    TRAILING NULLCOLS
            OLD_KEY "LTRIM(RTRIM(:OLD_KEY))",
            NEW_KEY "LTRIM(RTRIM(:NEW_KEY))",
            SYS_DATE "SYSTIMESTAMP",
            STATUS CONSTANT 'C'
    )
    Thanks,
    -Soma
    Edited by: user4587490 on Jun 15, 2011 10:17 AM
    Edited by: user4587490 on Jun 15, 2011 11:16 AM

    Hello Soma.
    How many records exist in your 700 MB CSV file? How many do you expect to process in 10 minutes? You may want to consider performing a set of simple unit tests with 1) 1 record, 2) 1,000 records, 3) 100 MB filesize, etc. to #1 validate that your shell script and control file syntax function as expected (including the writing of log files, etc.), and #2 gauge how long the processing will take for the full file.
    Hope this helps,
    Luke
    Please mark the answer as helpful or answered if it is so. If not, provide additional details.
    Always try to provide actual or sample statements and the full text of errors along with error code to help the forum members help you better.
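    A sketch of the kind of unit test suggested above, reusing the variables from the shell script earlier in the thread (the sample size and file names are placeholders):
    #!/bin/sh
    # Load only the first 1,000 records and time the run to gauge throughput.
    head -1000 $dataFile > sample_1000.csv
    time ${SQLLDR} userid=$DB_USER"/"$DB_PASS"@"$DB_SID \
         control=$controlFile data=sample_1000.csv \
         log=sample_1000.log bad=sample_1000.bad errors=5000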

  • SQL LOADER LPAD CONTROL FILE QUESTION

    Hi, in the flat file
    company_cd is "1" or "01"
    and center_cd is 3 digits, e.g. "493".
    In the table the userid column should be 6 digits.
    Currently I am getting this as userid in the table:
    "010493" is right, because company_cd is "01";
    "10493" is not right, because company_cd is "1".
    If company_cd is 2 digits (01) I get a 6-digit userid, which is OK,
    but when company_cd is a single digit (1) I get a 5-digit userid.
    I need to LPAD with 0 at the front when company_cd is "1". Any suggestions?
    ***********This is the code I am using currently in the CTL file for userid**********
    ,USERID "CONCAT(substr(trim(:company_cd),1,2),lpad(trim(:center_cd),4,0))"
    .......Thank You..........
    Edited by: phani_Marella on Aug 28, 2012 11:12 AM

    Now where does company 'coz' come from all of a sudden?
    I'm sure you read {message:id=9360002} , hence my confusion.
    Anyway, the SQL*Loader forum is @ Export/Import/SQL Loader & External Tables
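    For what it's worth, one way to get a fixed 6-digit userid is to pad the company code to two digits as well - a sketch based on the expression above, not tested against the poster's data:
    ,USERID "lpad(trim(:company_cd),2,'0') || lpad(trim(:center_cd),4,'0')"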

  • Calling SQL Loader with Dynamic file names

    Hi all,
    I would like to know if I can call SQL*Loader as below:
    $ sqlldr userid=uname/pwd control=new.ctl, data=$1
    I have to schedule my loader 3 times a day and each time my file names are different (AAA_BBB_timestamp).
    Thanks

    I have found a solution myself; if anyone is interested, it goes like this:
    for file in `ls -1 /opt/user/from/`
    do
    sqlldr userid=user/pwd@connect_string control=control_file.ctl data="/opt/user/from//$file"
    done
    Ramu
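    To answer the original question directly: yes, the data file name can be passed as a positional argument. A minimal wrapper sketch (the log-file naming is just an illustration):
    #!/bin/sh
    # Usage:  load_file.sh AAA_BBB_20101007_1415.txt
    sqlldr userid=uname/pwd control=new.ctl data="$1" log="$1.log"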

  • SQL Loader reads Data file Sequentially or Randomly?

    Will SQL*Loader load the data read from the file sequentially or randomly?
    I have the data file like the below
    one
    two
    three
    four
    and my control file is
    LOAD DATA
    INFILE *
    TRUNCATE
    INTO TABLE T TRAILING NULLCOLS
    (x RECNUM,
    y POSITION (1:4000))
    so my table will be populated like
    X Y
    1 one
    2 Two
    3 Three
    4 Four
    Will this happen sequentially even for large data sets? Say I have from one to one million rows in my data file.
    Please clarify.
    Thanks,
    Rajesh.

    SQL Loader may read the file sequentially, but you should not rely on the physical ordering of the rows in the table.
    It looks like that's what you were hinting at.
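    In other words, if file position matters later, keep capturing it with RECNUM (as the control file above already does) and order by that column when you query, for example:
    SELECT x, y FROM t ORDER BY x;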

  • SQL Loader Inserting Log File Statistics to a table

    Hello.
    I'm contemplating how to approach gathering the statistics from the SQL*Loader log file and inserting them into a table. I've approached this with a Korn shell script before, but now that I'm working in a Windows environment and my peers aren't keen on batch files and scripting, I thought I'd attempt to use SQL*Loader itself to read the log file and insert one or more records into a table that tracks data uploads. Has anyone created a control file that accomplishes this?
    My current environment:
    Windows 2003 Server
    SQL*Loader: Release 10.2.0.1.0
    Thanks,
    Luke

    Hello.
    Learned a little about inserting into multiple tables with delimited records. Here is my current tested control file:
    LOAD DATA
    APPEND
    INTO TABLE upload_log
    WHEN (1:12) = 'SQL*Loader: '
    FIELDS TERMINATED BY WHITESPACE
    TRAILING NULLCOLS
    (  upload_log_id   RECNUM
    , filler_field_0  FILLER
    , filler_field_1  FILLER
    , filler_field_2  FILLER
    , filler_field_3  FILLER
    , filler_field_4  FILLER
    , filler_field_5  FILLER
    , day_of_week
    , month
    , day_of_month
    , time_of_day
    , year
    , log_started_on          "TO_DATE((:month ||' '|| :day_of_month ||' '|| :time_of_day ||' '|| :year), 'Mon DD HH24:MI:SS YYYY')"
    )
    INTO TABLE upload_log
    WHEN (1:11) = 'Data File: '
    FIELDS TERMINATED BY ':'
    (  upload_log_id    RECNUM
    , filler_field_0   FILLER  POSITION(1)
    , input_file_name          "TRIM(:input_file_name)"
    )
    INTO TABLE upload_log
    WHEN (1:6) = 'Table '
    FIELDS TERMINATED BY WHITESPACE
    (  upload_log_id   RECNUM
    , filler_field_0  FILLER  POSITION(1)
    , table_name              "RTRIM(:table_name, ',')"
    )
    INTO TABLE upload_rejects
    WHEN (1:7) = 'Record '
    FIELDS TERMINATED BY ':'
    (  upload_rejects_id  RECNUM
    , record_number      POSITION(1)  "TO_NUMBER(SUBSTR(:record_number,8,20))"
    , reason
    )
    INTO TABLE upload_rejects
    WHEN (1:4) = 'ORA-'
    FIELDS TERMINATED BY ':'
    (  upload_rejects_id  RECNUM
    , error_code         POSITION(1)
    , error_desc
    )
    INTO TABLE upload_log
    WHEN (1:22) = 'Total logical records '
    FIELDS TERMINATED BY WHITESPACE
    (  upload_log_id      RECNUM
    , filler_field_0     FILLER  POSITION(1)
    , filler_field_1     FILLER
    , filler_field_2     FILLER
    , action                     "RTRIM(:action, ':')"
    , number_of_records
    )
    INTO TABLE upload_log
    WHEN (1:13) = 'Run began on '
    FIELDS TERMINATED BY WHITESPACE
    TRAILING NULLCOLS
    (  upload_log_id   RECNUM
    , filler_field_0  FILLER  POSITION(1)
    , filler_field_1  FILLER
    , filler_field_2  FILLER
    , day_of_week
    , month
    , day_of_month
    , time_of_day
    , year
    , run_began_on            "TO_DATE((:month ||' '|| :day_of_month ||' '|| :time_of_day ||' '|| :year), 'Mon DD HH24:MI:SS YYYY')"
    )
    INTO TABLE upload_log
    WHEN (1:13) = 'Run ended on '
    FIELDS TERMINATED BY WHITESPACE
    TRAILING NULLCOLS
    (  upload_log_id   RECNUM
    , filler_field_0  FILLER  POSITION(1)
    , filler_field_1  FILLER
    , filler_field_2  FILLER
    , day_of_week
    , month
    , day_of_month
    , time_of_day
    , year
    , run_ended_on            "TO_DATE((:month ||' '|| :day_of_month ||' '|| :time_of_day ||' '|| :year), 'Mon DD HH24:MI:SS YYYY')"
    )
    INTO TABLE upload_log
    WHEN (1:18) = 'Elapsed time was: '
    FIELDS TERMINATED BY ':'
    (  upload_log_id   RECNUM
    , filler_field_0  FILLER  POSITION(1)
    , filler_field_1  FILLER
    , filler_field_2  FILLER
    , elapsed_time
    )
    INTO TABLE upload_log
    WHEN (1:14) = 'CPU time was: '
    FIELDS TERMINATED BY ':'
    (  upload_log_id   RECNUM
    , filler_field_0  FILLER  POSITION(1)
    , filler_field_1  FILLER
    , filler_field_2  FILLER
    , cpu_time
    )
    Here are the basic table create scripts:
    TRUNCATE TABLE upload_log;
    DROP TABLE upload_log;
    CREATE TABLE upload_log
    (  upload_log_id      INTEGER
    , day_of_week        VARCHAR2(  3)
    , month              VARCHAR2(  3)
    , day_of_month       INTEGER
    , time_of_day        VARCHAR2(  8)
    , year               INTEGER
    , log_started_on     DATE
    , input_file_name    VARCHAR2(255)
    , table_name         VARCHAR2( 30)
    , action             VARCHAR2( 10)
    , number_of_records  INTEGER
    , run_began_on       DATE
    , run_ended_on       DATE
    , elapsed_time       VARCHAR2(  8)
    , cpu_time           VARCHAR2(  8)
    );
    TRUNCATE TABLE upload_rejects;
    DROP TABLE upload_rejects;
    CREATE TABLE upload_rejects
    (  upload_rejects_id  INTEGER
    , record_number      INTEGER
    , reason             VARCHAR2(255)
    , error_code         VARCHAR2(  9)
    , error_desc         VARCHAR2(255)
    );
    Now, if I could only insert a single record to the upload_log table (per table logged), adding separate columns for skipped, read, rejected, discarded quantities. Any advice on how to use SQL*Loader to do this (writing a procedure would be fairly simple, but I'd like to perform all of the work in one place if at all possible)?
    Thanks,
    Luke
    Edited by: Luke Mackey on Nov 12, 2009 4:28 PM
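    One possible follow-up, sketched on the assumption that each run of the control file above processes a single log file: pivot the per-line rows into one summary row afterwards, for example:
    SELECT MAX(input_file_name)                                AS input_file_name,
           MAX(table_name)                                     AS table_name,
           MAX(DECODE(action, 'skipped',   number_of_records)) AS records_skipped,
           MAX(DECODE(action, 'read',      number_of_records)) AS records_read,
           MAX(DECODE(action, 'rejected',  number_of_records)) AS records_rejected,
           MAX(DECODE(action, 'discarded', number_of_records)) AS records_discarded,
           MAX(run_began_on)                                   AS run_began_on,
           MAX(run_ended_on)                                   AS run_ended_on
    FROM   upload_log;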

  • Does SQL*Loader support .prg files?

    Hi All,
    I have a requirement to migrate data from a .prg file (Clipper) to Oracle 10g. The option I have in mind is to use SQL*Loader to load the data from the .prg file into 10g. My question is: is it possible to use a .prg file with SQL*Loader?

    Yeah, I thought of that too, but is there any other way?

  • SQL loader with *.csv file

    Good morning,
    I have a *.CSV file containing data like this:
    A123456789,Ah Tong Station,Jalan Dungun
    I would like to insert the data into a table, with one row per record:
    id           outlet_name       addr_1
    A123456789   Ah Tong Station   Jalan Dungun
    etc
    In the stored procedure (my own loader), I used
    Insert into XXXXX values
    (SUBSTR(input_buffer, 1, 10),
    SUBSTR(input_buffer, 11, 60),
    SUBSTR(input_buffer, 61, 110));
    but the problem is that outlet_name and addr_1 do not follow the substring positions provided. Instead it gave output like this:
    id outlet_name addr_1
    A123456789 ,Ah Tong Station,Jalan Dungun NULL
    Hope to get some advice asap. Thanks. :)
    Pauline

    Why not use SQL*Loader proper to load this into your table? It will be significantly faster than anything you could do in PL/SQL, especially if your volumes increase. And it will take care of the parsing for you, as long as you tell it the field delimiter. You've already seen that you can't rely on fixed widths when using delimited data.
    Given this data...
    A123456789,Ah Tong Station,Jalan Dungun
    A234,Some Station,Some Place
    ...the following SQL*Loader control file (stored in a file called mytable.ctl) will load your table (note I've used truncate - you could also append)...
    load data
    infile 'mydata.csv'
    truncate
    into table mytable
    fields terminated by ','
    (  id
    ,  outlet_name
    ,  addr_1
    )
    This is invoked using the following:
    sqlldr userid/password@tns control=mytable.ctl
    The results are:
    SQL> select * from mytable;
    ID                             OUTLET_NAME                    ADDR_1
    A123456789                     Ah Tong Station                Jalan Dungun
    A234                           Some Station                   Some Place
    You will find this much easier. Of course, you could always avoid the database altogether if you don't need to actually store this data. You could just parse the source .csv (if you are on UNIX this will be really easy using an awk one-liner) and write it to an output file.
    Anyway, hope this helps.
    Regards
    Adrian
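    For example, a sketch of the awk one-liner route (the fixed column widths are just an illustration):
    awk -F',' '{ printf "%-10s %-25s %s\n", $1, $2, $3 }' mydata.csv > mydata.out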
