SQL Loader Bad File - Line Number

Is there a way to get the line numbers of the records that end up in the bad file or discard file?
Thanks,

Hi,
I think you can get the line numbers of the rows that failed to load from the log file. The failing rows themselves are the ones you will find in the bad file.
I don't know whether the log also contains the line numbers of discarded rows.
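If your log follows the usual "Record N: Rejected ..." / "Record N: Discarded ..." message format (an assumption; check a log from your own release), a quick way to list just those record numbers is a grep against the log file, for example:

    # list the record numbers of rejected/discarded rows reported in the log
    # (load.log is a hypothetical log file name)
    grep -E '^Record [0-9]+: (Rejected|Discarded)' load.log

Keep in mind these are logical record numbers, so they only match physical line numbers in the data file when every record is a single line and nothing was skipped.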
Liron Amitzi
Senior DBA consultant
[www.dbsnaps.com]
[www.obiumsoftware.com]

Similar Messages

  • Sql loader Bad file

    I am calling a SQL*Loader script from a Unix script. In the Unix script I have specified the bad file as
    BADFILE="/web/local/orderlink/linkload/logs/full_prc_stg_bad.log" and in the CTL file I have not specified any bad file path. I am loading 30 flat files in that CTL file, like:
    LOAD DATA
    INFILE '/shared/export/prc_stg1.dat'
    INFILE '/shared/export/prc_stg2.dat'
    INFILE '/shared/export/prc_stg30.dat'
    INTO TABLE L_DRUG_ITEM_PRC_STG_SNGL
    but when I run the script, it creates log files like:
    Data File:      /shared/export/prfl1.dat
      Bad File:     /web/local/orderlink/linkload/logs/full_profile_load_bad.log
      Discard File:  none specified
    (Allow all discards)
    Data File:      /shared/export/prfl2.dat
      Bad File:     /web/local/orderlink/linkload/sql_ctl/prfl2.bad
      Discard File:  none specified
    (Allow all discards)
    Data File:      /shared/export/prfl3.dat
      Bad File:     /web/local/orderlink/linkload/sql_ctl/prfl3.bad
      Discard File:  none specified
    (Allow all discards)
    Data File:      /shared/export/prfl4.dat
      Bad File:     /web/local/orderlink/linkload/sql_ctl/prfl4.bad
      Discard File:  none specified
    For the first file it creates the bad file at the location specified in the Unix script, but for the remaining 29 files it creates the bad file in the directory where the CTL file is kept. How can I change this so that all the bad files go to the folder specified in the Unix script, like the first one?

    For the bad file give the extension *.bad and for the log give *.log.
    One thing I want to know:
    have you given privileges (file permissions) on all the files in Unix?
    Thanks,
    Neerav
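    One more thing worth trying (a sketch, reusing your paths, assuming the per-INFILE BADFILE clause of the control file): the BAD value given on the command line appears to apply only to the first data file, which matches what you are seeing, so you can name a bad file explicitly next to each INFILE instead:
    LOAD DATA
    INFILE '/shared/export/prc_stg1.dat' BADFILE '/web/local/orderlink/linkload/logs/prc_stg1.bad'
    INFILE '/shared/export/prc_stg2.dat' BADFILE '/web/local/orderlink/linkload/logs/prc_stg2.bad'
    INFILE '/shared/export/prc_stg30.dat' BADFILE '/web/local/orderlink/linkload/logs/prc_stg30.bad'
    INTO TABLE L_DRUG_ITEM_PRC_STG_SNGL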

  • Passing file name dynamically to sql loader ctl file

    Hi,
    I am new to scripting and I have a complex requirement involving writing a script.
    Requirement:
    I need to upload a CSV file from an FTP server into an Oracle table using SQL*Loader. The file name looks like APF0912312359.csv, which represents 31-DEC-2009 23:59, and there will be multiple files on the same day, distinguished by the timestamp in the file name. Once I load a file using SQL*Loader, I need to move the file to another directory for archival.
    Since the SQL*Loader CTL file has a fixed file name, I would like to know how I can pass the data file name to the CTL file dynamically so that it loads a new file every time it runs.
    Any help is greatly appreciated..
    Bye
    Sudheer
    Edited by: user2138419 on Oct 28, 2009 4:08 PM

    I agree with Pradeep in regards to declaring the file names on the command line. However, I do have a concern regarding presenting the password on the command line as any user that can issue a ps (assuming Unix ~= Linux here) would be able to read it while the sqlldr command is running.
    Unfortunately, you may not always have the option of declaring the files on the command line. I was recently challenged with this in a Windows 2003 Server environment running SQL*Loader: Release 10.2.0.1.0. In this environment I found that I am able to set a variable file name in a calling batch file and use that value in the control file successfully. Just to demonstrate the approach:
    Batch file:
    set IN_FILE='c:\inbound\load_me.txt'
    sqlldr user/pswd@db PARFILE='c:\parameters.txt'
    Parameter file:
    errors=500000
    rows=50000
    control=%CTL_FILE%
    bad=%BAD_FILE%
    discard=%DSC_FILE%
    log=%LOG_FILE%
    Control file:
    LOAD DATA
    INFILE '%IN_FILE%'
    INSERT
    INTO TABLE table_to_be_loaded
    I'm really interested to see if this would work on Unix.
    -Luke
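    On Unix I would expect the simpler route to be passing the file on the sqlldr command line itself (a sketch with hypothetical names; my recollection, worth verifying for your release, is that a DATA value on the command line takes the place of the first INFILE in the control file, and the inline credentials are shown only for brevity given the ps concern above):
    #!/bin/sh
    # load_one_file.sh <csv file>  -- hypothetical wrapper for the APF*.csv loads
    IN_FILE=$1
    sqlldr userid=user/pswd@db control=apf_load.ctl data="$IN_FILE" \
           log="${IN_FILE%.csv}.log" bad="${IN_FILE%.csv}.bad"
    # archive the file only if the load succeeded
    [ $? -eq 0 ] && mv "$IN_FILE" /path/to/archive/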

  • Create sql loader data file dynamically

    Hi,
    I want a sample program/approach for creating a SQL*Loader data file.
    The program will take a table name as input and build a
    SELECT statement whose column list is derived from USER_TAB_COLUMNS in the data dictionary,
    assuming there are multiple CLOB columns in the column list.
    Thanks
    Manoj

    I'm writing CLOB and other columns to a SQL*Loader .dat file.
    The sample code below for writing a CLOB column is giving a file write error.
    How can I write multiple CLOBs to the .dat file so that the control file will handle them correctly?
    -- writes the CLOB column l_rec_type.narrative to the already-open
    -- UTL_FILE handle l_file_handle in 32000-byte chunks
    offset     NUMBER := 1;
    chunk      VARCHAR2(32000);
    chunk_size NUMBER := 32000;
    WHILE offset < dbms_lob.getlength(l_rec_type.narrative)
    LOOP
      chunk := dbms_lob.substr(l_rec_type.narrative, chunk_size, offset);
      utl_file.put(l_file_handle, chunk);
      utl_file.fflush(l_file_handle);
      offset := offset + chunk_size;
    END LOOP;
    utl_file.new_line(l_file_handle);

  • How to call sql loader ctrl file with in the pl/sql procedure

    Hi Friends,
    I am working on a project that involves transferring data using queues. From the queue I will get tab-delimited data in the form of a CLOB variable/message. I want to store that data in an Oracle table.
    When loading the data into the table:
    1. I don't want to write the data to a file (I want to access it directly after dequeuing it from the specific queue).
    2. As the data is tab-delimited, I want to use the SQL*Loader approach.
    How do I call the SQL*Loader control file from within my PL/SQL procedure? When I searched, most of the forums recommend an external procedure or a Java program.
    Please guide me on this issue. My preference is PL/SQL, but I don't know much about external procedures. If there is no other way, I will try Java.
    I am using oracle 9.2.0.8.0.
    Thanks in advance,
    Vimal..

    Neither SQL*Loader nor external tables are designed to read data from a CLOB stored in the database. They both work on files stored on the file system. If you don't want the data to be written to a file, you're going to have to roll your own parsing code. This is certainly possible. But it is going to be far less efficient than either SQL*Loader or external tables. And it's likely to involve quite a bit more code.
    The simplest possible thing that could work would be to use something like Tom Kyte's string tokenization package to read a line from the CLOB, break it into the component pieces, and then store the different tokens in a meaningful collection (i.e. an object type or a record type that corresponds to the table definition). Of course, you'll need to handle things like converting strings to numbers or dates, rejecting rows, writing log files, etc.
    Justin

  • Define variable in SQL Loader Control File

    Hi,
    I have an input file where the first line is a header record, followed by the detail records. For the processing I do not need to store the fields of the header record, but I do need a date field from the header record, stored as part of each detail record in the Oracle table.
    Is it possible to define a variable within the sql loader control file to store the value that I need in memory then use it when I do the sql insert by the sql loader?
    Thanks for any advice.

    Not sure that you can. But if you're on Unix/Linux/Mac it's easy enough to write a shell script that populates the variables in a template file which you then use as the CTL file; see the sketch below. The Perl Template Toolkit could be an option for that as well.
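    A minimal sketch of that template approach (hypothetical file names and delimiter; HDR_DATE_TOKEN is a placeholder you put into the template yourself, e.g. on a column defined as load_date CONSTANT 'HDR_DATE_TOKEN'):
    #!/bin/sh
    DATAFILE=$1
    # pull the date field out of the first (header) line -- assumes the first pipe-delimited field
    HDR_DATE=`head -1 "$DATAFILE" | cut -d'|' -f1`
    # substitute it into the control-file template, then load only the detail rows
    sed "s|HDR_DATE_TOKEN|$HDR_DATE|" detail_template.ctl > detail_run.ctl
    sqlldr userid=scott/tiger control=detail_run.ctl data="$DATAFILE" skip=1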

  • Way to generate sql*loader ctl file from a table?

    I'm an Oracle newbie (Oracle 8i, HP-UX). Is there any way to take an existing table
    description and generate a SQL*Loader control file from it? If anyone has already written a procedure, or knows where one can be found, I'd really appreciate it. We're doing a mass conversion to Oracle and would like an easy way to re-write all our loads.
    Thanks! Eileen from Polaroid.

    Hi,
    I have a shell program which gets the column names from the data dictionary, creates a temporary control file, calls the sqlldr executable and loads the data automatically.
    You can customise this script according to your needs.
    Shell Program
    if [ $# -ne 2 ]
    then
    echo " Usage : $0 table_name flat file name"
    exit
    fi
    # assign the command-line arguments to shell variables
    table_name=$1
    flat_file=$2
    #creating the control file
    echo "LOAD DATA" > $table_name.ctl
    echo "INFILE '$flat_file'" >> $table_name.ctl
    echo "into table $table_name " >> $table_name.ctl
    echo "fields terminated by '\t'" >> $table_name.ctl
    # build the column-list part of the control file
    #describing the table and spooling into file
    sqlplus -s $CUST_ORA_USER << sql_block
    spool $table_name.lst
    desc $table_name
    spool off
    sql_block
    # create a working file and add the fields to the control file
    # cut off the heading lines
    tail +3 $table_name.lst > temp
    rm -f $table_name.lst
    k=`wc -l < temp`
    k1=`expr $k - 1`
    #cutting the last line
    head -$k1 temp > tempx
    record_count=`wc -l < tempx`
    counter=1
    echo "(" > wxyz.ctl
    # reading file line by line
    cat tempx | while read in_line
    do
    #cutting the first field
    field_name=`echo $in_line | cut -f1 -d' '`
    # count the number of words on the line
    word_cnt=`echo $in_line | wc -w`
    # 2 words = "NAME TYPE"; otherwise "NAME NOT NULL TYPE"
    if [ $word_cnt = 2 ]
    then
    data_type=`echo $in_line | cut -f2 -d' ' | cut -f1 -d'('`
    if [ "$data_type" = "DATE" ]
    then
    data_fmt="DECODE(LENGTH(LTRIM(RTRIM(:$field_name))),'11',to_date(ltrim(rtrim(:$field_name)),'dd-mon-yyyy'),'9',to_date(ltrim(rtrim(:$field_name)),'dd-mm-yy'),'10',to_date(ltrim(rtr im(:$field_name)),'dd-mon-yy'),'yyyy/mm/dd hh24:mi:ss')"
    elif [ "$data_type" = "CHAR" ]
    then
    data_fmt="NVL(RTRIM(LTRIM(:$field_name)),' ')"
    elif [ "$data_type" = "VARCHAR2" ]
    then
    data_fmt="NVL(RTRIM(LTRIM(:$field_name)),' ')"
    else
    data_fmt="NVL(:$field_name,0) "
    fi
    else
    data_type=`echo $in_line | cut -f4 -d' ' | cut -f1 -d'('`
    if [ "$data_type" = "DATE" ]
    then
    data_fmt="DECODE(LENGHTH(LTRIM(RTRIM(:$field_name))),'11',to_date(ltrim(rtrim(:$field_name)),'dd-mon-yyyy'),'9',to_date(ltrim(rtrim(:$field_name)),'dd-mm-yy'),'10',to_date(ltrim(rt rim(:$field_name)),'dd-mon-yy'),'yyyy/mm/dd hh24:mi:ss')"
    elif [ "$data_type" = "CHAR" ]
    then
    data_fmt="NVL(RTRIM(LTRIM(:$field_name)),' ')"
    elif [ "$data_type" = "VARCHAR2" ]
    then
    data_fmt="NVL(RTRIM(LTRIM(:$field_name)),' ')"
    else
    data_fmt="NVL(:$field_name,0) "
    fi
    fi
    #if last line put );
    #else ,
    if test $record_count -eq $counter
    then
    echo $field_name \"$data_fmt\"");" >> wxyz.ctl
    else
    echo $field_name \"$data_fmt\""," >> wxyz.ctl
    fi
    # increment the counter for each record
    counter=`expr $counter + 1`
    done
    #removing the file
    rm -f temp tempx
    cat wxyz.ctl >> $table_name.ctl
    rm -f wxyz.ctl
    # call SQL*Loader
    sqlldr userid=$CUST_ORA_USER control=$table_name.ctl errors=99999999
    #removing the control file
    rm -f $table_name.ctl
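    For example (hypothetical script name), with CUST_ORA_USER exported in the environment as the username/password connect string used by both sqlplus and sqlldr, you would run it as:
    sh gen_ctl_load.sh EMP /tmp/emp_data.txt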

  • Problem specifying SQL Loader Log file destination using EM

    Good evening,
    I am following the example given in the 2 Day DBA document chapter 8 section 16.
    In step 5 of 7, EM does not allow me to specify the destination of the SQL Loader log file to be on a mapped network drive.
    The question: Does SQL Loader have a limitation that I am not aware of, that prevents placing the log file on a network share or am I getting this error because of something else I am inadvertently doing wrong ?
    Note: I have placed the DDL, load file data and the steps I follow in EM at the bottom of this post to facilitate reproducing the problem (drive Z is a mapped drive).
    Thank you for your help,
    John.
    DDL (generated using SQL developer, you may want to change the space allocated to be less)
    CREATE TABLE "NICK"."PURCHASE_ORDERS"
        "PO_NUMBER"      NUMBER NOT NULL ENABLE,
        "PO_DESCRIPTION" VARCHAR2(200 BYTE),
        "PO_DATE" DATE NOT NULL ENABLE,
        "PO_VENDOR" NUMBER NOT NULL ENABLE,
        "PO_DATE_RECEIVED" DATE,
        PRIMARY KEY ("PO_NUMBER") USING INDEX PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS NOCOMPRESS LOGGING TABLESPACE "USERS" ENABLE
      SEGMENT CREATION DEFERRED PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING STORAGE
        INITIAL 67108864
      TABLESPACE "USERS" ;
    Load.dat file contents
    1, Office Equipment, 25-MAY-2006, 1201, 13-JUN-2006
    2, Computer System, 18-JUN-2006, 1201, 27-JUN-2006
    3, Travel Expense, 26-JUN-2006, 1340, 11-JUL-2006
    Steps I am carrying out in EM
    log in, select data movement -> Load Data from User Files
    Automatically generate control file
    (enter host credentials that work on your machine)
    continue
    Step 1 of 7 ->
      Data file is located on your browser machine
      "Z:\Documentation\Oracle\2DayDBA\Scripts\Load.dat"
       click next
    step 2 of 7 ->
      Table Name
      nick.purchase_orders
      click next
    step 3 of 7 ->
      click next
    step 4 of 7 ->
      click next
    step 5 of 7 ->
      Generate log file where logging information is to be stored
      Z:\Documentation\Oracle\2DayDBA\Scripts\Load.LOG
      Validation Error
      Examine and correct the following errors, then retry the operation:
      LogFile - The directory does not exist.

    Hi John,
    But I didn't get any error when I did the same thing you did.
    My Oracle version is 10.2.0.1 on Windows XP. Here is what I did and it worked:
    1. I created a table in the SCOTT schema:
    SCOTT@orcl> CREATE TABLE "PURCHASE_ORDERS"
      2  (
      3      "PO_NUMBER"      NUMBER NOT NULL ENABLE,
      4      "PO_DESCRIPTION" VARCHAR2(200 BYTE),
      5      "PO_DATE" DATE NOT NULL ENABLE,
      6      "PO_VENDOR" NUMBER NOT NULL ENABLE,
      7      "PO_DATE_RECEIVED" DATE,
      8      PRIMARY KEY ("PO_NUMBER") USING INDEX PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS NOCOMPRESS LOGGING TABLESPACE "USERS" ENABLE
      9  )
    10  TABLESPACE "USERS";
    Table created.
    I logged into EM: Maintenance --> Data Movement --> Load Data from User Files --> My Host Credentials
    Here I got a total of 3 text boxes:
    1. Server Data File: C:\ORACLE\PRODUCT\10.2.0\ORADATA\ORCL\USERS01.DBF
    2. Data File is Located on Your Browser Machine: z:\load.dat <--- here z:\ is a mapped share of another machine's shared documents folder; I selected this option (radio button) and created the same load.dat you mentioned.
    3. Temporary File Location: C:\ORACLE\PRODUCT\10.2.0\ORADATA\ORCL\ <--- I didn't specify anything here.
    Step 2 of 7 Table Name : scott.PURCHASE_ORDERS
    Step 3 of 7 I just clicked Next
    Step 4 of 7 I just clicked Next
    Step 5 of 7 I just clicked Next
    Step 6 of 7 I just clicked Next
    Step 7 of 7: Here are the control file contents:
    LOAD DATA
    APPEND
    INTO TABLE scott.PURCHASE_ORDERS
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (
    PO_NUMBER INTEGER EXTERNAL,
    PO_DESCRIPTION CHAR,
    PO_DATE DATE,
    PO_VENDOR INTEGER EXTERNAL,
    PO_DATE_RECEIVED DATE
    )
    And I just clicked Submit Job.
    Now I got all 3 rows in purchase_orders:
    SCOTT@orcl> select count(*) from purchase_orders;
      COUNT(*)
             3
    So there is no bug; it worked. Please retry if you get any error/issue.
    HTH
    Girish Sharma

  • Skipping fields in SQL*LOADER data file

    I have a data file that has more fields than the target table does. How can I write a SQL*LOADER control file to skip some fields in the middle of the text line?

    If you don't want to define input fields by position, the simplest way I think is to use FILLER fields.
    Quoted from SQL*Loader doc:
    "Specifying Filler Fields
    Filler fields have names but they are not loaded into the table. However, filler fields can be used as arguments to init_specs (for example, NULLIF and DEFAULTIF) as well as to directives (for example, SID, OID, REF, BFILE). Also, filler fields can occur anyplace in the data file. They can be inside of the field list for an object or inside the definition of a VARRAY.
    See SQL*Loader DDL Behavior and Restrictions for more information on filler fields and their use.
    A sample filler field specification looks as follows:
    field_1_count FILLER char,
    Ex:
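    A minimal sketch (hypothetical table and column names): the unwanted middle fields are declared as FILLER, so they are read from the data file but not loaded.
    LOAD DATA
    INFILE 'data.txt'
    INTO TABLE target_table
    FIELDS TERMINATED BY ','
    (
      emp_id,
      unused_1   FILLER CHAR,
      emp_name,
      unused_2   FILLER CHAR,
      hire_date  DATE "DD-MON-YYYY"
    )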
    Regards,
    Zoltan

  • Parameters / Variables in SQL Loader Control File

    Here's my problem: I have created a SQL*Loader control file (.ctl) to load a file of employees. This loader program is registered in Oracle Applications. What I want is to take a parameter derived from the profile, such as the user name of the person who executed the loader program, and pass it to the loader program so that the user name is included in the loaded data.
    example
    input data
    123456,John,Smith
    654321,Jane,Doe
    loaded data
    123456,John,Smith,03-JUL-2009,USER13
    654321,Jane,Doe,03-JUL-2009,USER13
    The sysdate is easy because it can be defined in the column definition, but what I really want to achieve is a parameter passed to the CTL file.
    Thanks in advance.

    Dear user!
    Please have a look at this thread.
    {thread:id=915277}
    In the last three posts I explain how to use a shell script with AWK and SED to pass parameters to a control file (see the sketch below).
    Please feel free to post again if you have some questions regarding my explanatory notes.
    Yours sincerely
    Florian W.
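    A minimal sketch of that idea (hypothetical names; here the user name is simply the script's first argument): the control-file template defines the extra column as a CONSTANT whose value is a placeholder token, and the wrapper script substitutes the real user name before calling sqlldr.
    # emp_template.ctl contains, among the other columns:
    #   created_by CONSTANT 'USER_NAME_TOKEN'
    USER_NAME=$1
    sed "s/USER_NAME_TOKEN/$USER_NAME/" emp_template.ctl > emp_run.ctl
    sqlldr userid=apps/apps_pwd control=emp_run.ctl data=employees.csv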

  • Add "Trailing Nullcolls" to sql loader control files generated by OWB

    Hi gurus,
    I've got a problem loading data with SQL*Loader. I need to add the parameter "trailing nullcols" to the SQL*Loader control file generated by OWB. I don't want to do this by saving the script to my hard disk and running it manually, so any ideas where I can put this parameter? I am using OWB 10.1 on a Windows 2000 machine.
    Thanks
    Stephan

    Hi,
    I found the solution to the problem:
    1. Select the mapping where you map your source flat file to a table.
    2. Right-click on the mapping and select "Configure".
    3. Go to Sources and Targets -> YOUR_TABLE_NAME -> SQL*Loader Parameters.
    4. Set Trailing Nullcols = true :-)
    Thank you to anyone looking at this problem.
    Greetings
    Stephan

  • Importing already-made SQL*Loader control files in OWB 10g

    Hi all,
    Suppose that I have a number of already hand-made SQL*Loader control files, say 50 or so.
    Each of these control files has a variable number of fields.
    I feel really lazy, so I would like to know if it is possible to import these control files directly instead of re-typing them in the Design Center. That would save me a lot of time.
    Thanks !
    Burgy

    Hi Burgy
    One option is to use SQL*Loader itself to generate an external table definition from the SQL*Loader control file (search for GENERATE_ONLY in the SQL*Loader documentation). If you define the external table in the database, you can reverse-engineer it into OWB.
    Cheers
    David
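    A sketch of that step (hypothetical file names): run sqlldr with EXTERNAL_TABLE=GENERATE_ONLY so nothing is loaded, then pick the generated CREATE TABLE ... ORGANIZATION EXTERNAL statement out of the log file.
    sqlldr userid=scott/tiger control=my_load.ctl log=my_load.log external_table=generate_only
    # the external table DDL is written into my_load.log; run that DDL in the database,
    # then reverse-engineer the external table into OWB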

  • What is this error java.sql.SQLException: Bad format for number ?

    Dear All,
    I am reading a few values from the database. Then I get this error: "MyError:Error : java.sql.SQLException: Bad format for number 'Sarawak' in column 6." What is this error referring to? I have checked the database column and its values fit the data type. Any hints please?

    "I have checked the database column and its value fits according to the data type."
    Check again. Then check again. Keep checking until you find your error. You are trying to read a string containing "Sarawak" as a number. You have getInt(6) or another numeric getter, and the 6th column in the select statement is not numeric.

  • Read from file, line number N?

    Could someone post some code or explain how I can read line number N from a file?
    Thanks

    This is a simple way but not so professional:
    import java.io.*;
    String filename = ""; // file name
    try {
        int n = 5; // line number to read; the first line is line 0
        BufferedReader in = new BufferedReader(new FileReader(new File(filename)));
        // skip the first n lines
        for (int x = 0; x < n; x++) {
            in.readLine();
        }
        String lineN = in.readLine(); // this is line number n
        in.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
    Alex

  • SQL Loader - read 1st line to one table, rest of data to another

    Hi
    I looked around the FAQs and forums and found similar cases, but not mine...
    I am running Oracle 9i and have a text file which has the 1st line as a control header and everything beneath it as data, something like this:
    14/07/2010|8
    12345678|0
    12345679|0
    12345680|10.87
    12345681|7655.8
    12345682|100
    12345683|0
    12345684|-90.44
    12345685|0
    The first (header) line has a date field and a counter (the number of records expected beneath it)
    The rest of the data is an account number and balance.
    Since SQL Loader is invoked outside of Oracle (Unix in my case) I assume I should create two tables, such as:
    Create table TIF_CURRENT_BALANCE_DTL
    (
      ACCOUNT_REF_NO   VARCHAR2(30),
      BALANCE_AMT      NUMBER(12,2)
    );
    Create table TIF_CURRENT_BALANCE_HDR
    (
      HDR_DATE         DATE,
      HDR_COUNT        NUMBER(10)
    );
    And use a control file which will load line 1 into TIF_CURRENT_BALANCE_HDR and the other lines (SKIP=1) into TIF_CURRENT_BALANCE_DTL.
    Since the header/detail lines are not (necessarily) distinguishable, is there a way to achieve this without modifying the input file in any way?
    Thanks
    Martin

    Thanks for your reply - the solution should not be OS-dependent as it will run on both Linux and UNIX installations.
    The DB will be (for now) Oracle9i Enterprise Edition Release 9.2.0.4.0 - 64bit Production
    I looked at that web page you provided and had some hope with the ROWS option, but this is the number of rows to load before each commit.
    Any other solutions?
    I know I could load it to a common table (text and number) and then convert the values accordingly in the PL/SQL procedure which will run afterwards, but I feel loading the data in the final format is going to be faster.
    I am considering using SQL*Loader to load the records (SKIPping row 1) into the DTL table and, within the PL/SQL, loading the 1st record and performing validation. Since the file has approx. 2 million rows and is processed daily, 1.99999 million records read with SQL*Loader and 1 with conventional methods will still be a vast improvement!
    Thanks
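    A sketch of a two-pass alternative (hypothetical file names) that loads straight into the final tables without modifying the input file: run SQL*Loader twice against the same data file, once taking only the first record into the HDR table and once skipping it for the DTL table.
    # pass 1: header only -- LOAD=1 stops after the first record
    sqlldr userid=scott/tiger control=hdr.ctl data=balances.dat load=1
    # pass 2: everything except the header
    sqlldr userid=scott/tiger control=dtl.ctl data=balances.dat skip=1
    # hdr.ctl loads INTO TABLE TIF_CURRENT_BALANCE_HDR (HDR_DATE DATE "DD/MM/YYYY", HDR_COUNT),
    # dtl.ctl loads INTO TABLE TIF_CURRENT_BALANCE_DTL (ACCOUNT_REF_NO, BALANCE_AMT),
    # both with FIELDS TERMINATED BY '|'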
