SQL*Loader: NULL column load in error table

Hi,
Following is the data file:
123456vijayBangalore
      AjayDehlli
Note: there are 6 blanks before "AjayDehlli".
The file is fixed width: Id (1:6), name (7:12), location (13:40).
When ID is not null, I want to load the record into the PRESCRIBER table.
When ID is null, I want to load the record into the ERR_PRESCRIBER table.
LOAD DATA
APPEND
INTO TABLE ERR_PRESCRIBER_LKP
WHEN IMS_KEY = 'dummy'
(
  IMS_KEY  POSITION(1:6)   CHAR  NULLIF IMS_KEY='dummy',
  NAME     POSITION(7:12)  CHAR  "ltrim(rtrim(:NAME))",
  LOC      POSITION(18:19) CHAR  "ltrim(rtrim(:LOC))"
)
INTO TABLE PRESCRIBER_LKP
WHEN IMS_KEY != ' '
(
  IMS_KEY  POSITION(1:6)   CHAR  NULLIF IMS_KEY='dummy',
  NAME     POSITION(7:12)  CHAR  "ltrim(rtrim(:NAME))",
  LOC      POSITION(18:19) CHAR  "ltrim(rtrim(:LOC))"
)
This script loads the first record into the PRESCRIBER_LKP table, but it does not load the second record into the error table.
Could you please help with this?

There are two parameters to consider:
discardmax – [ALL] The maximum number of discards to allow.
errors – [50] The maximum number of errors to allow on the load.
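Those limits matter because a record that matches no WHEN clause is discarded rather than errored. The likely root cause here, though, is that WHEN IMS_KEY='dummy' can never be true for a blank ID, so the second record is simply discarded. A hedged, untested sketch (the INFILE name is a placeholder) that compares the ID positions against BLANKS instead:
LOAD DATA
INFILE 'prescriber.dat'
APPEND
INTO TABLE ERR_PRESCRIBER_LKP
WHEN (1:6) = BLANKS
(
  IMS_KEY  POSITION(1:6)   CHAR  NULLIF IMS_KEY=BLANKS,
  NAME     POSITION(7:12)  CHAR  "ltrim(rtrim(:NAME))",
  LOC      POSITION(18:19) CHAR  "ltrim(rtrim(:LOC))"
)
INTO TABLE PRESCRIBER_LKP
WHEN (1:6) != BLANKS
(
  IMS_KEY  POSITION(1:6)   CHAR,
  NAME     POSITION(7:12)  CHAR  "ltrim(rtrim(:NAME))",
  LOC      POSITION(18:19) CHAR  "ltrim(rtrim(:LOC))"
)
With this split every record satisfies exactly one WHEN clause, so nothing should land in the discard file.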

Similar Messages

  • CKM IS NOT LOADING IN error tables

    Hi everyone,
    I have created an interface and added a check constraint to the target, but I see that the interface errors out with a check constraint failure and does not load the error table. However, it loads the IKM table and the CKM table. Am I missing something?
    regards
    babu

    Ankit,
    Thanks so much for your help. Let me explain the scenario I am in and what I want. We have a master repository and three work repositories. I am presently working on schema X in dev. I created interface I in X and generated a scenario for it. I exported this scenario and imported it into the operator of QA. I have a schema X in QA too, but this schema is empty; it has no objects. I executed the imported scenario in the QA operator. It ran well, but it executed in the X schema of dev; in the execution options I had selected the QA context and the local agent. b) I executed the scenario again, this time with the QA context and the QA agent as well; this time the operator hung. I was wondering if I need to have the same target table in the QA schema to execute this scenario.
    Also, how are we going to tell the executed file (scenario) that it needs to execute in schema X of the QA environment? Is it by selecting the agent that connects to the QA environment? Please clarify. Thanks so much once again for your help.
    regards,
    Babu

  • Sql Loader - Decimal numbers showing in null column

    Greetings,
    My apologies if this is in the wrong forum section. It seemed to be the most logical.
    I have added a new column to a control file used in a SQL*Loader upload and I am getting unexpected results. Long story short: I copy FoxPro tables from a network directory to my local PC, a FoxPro exe converts these tables to .dat files, and SQL*Loader then uploads the .dat files to matching Oracle tables. I've run this program from my PC for years with no problems.
    Problem now: we added a new column to a FoxPro table and to the matching Oracle table. This column in FoxPro is null for now; it has no data at all. I then added the new column to my ctl file for this table. The program runs and SQL*Loader does its thing with no errors. However, in the new field in Oracle I'm finding decimal numbers in many of the records, when all records should have null values in this field. I've checked all other columns in the Oracle table and the data looks accurate. I'm not sure why I'm getting these decimal values in the new column.
    My log and bad files show no hints of any problems. The bad file is empty for this table.
    At first I thought the position of the new column in the FoxPro table, the .ctl file, and the Oracle table was not lining up correctly, but I checked and it is.
    I've double checked the FoxPro table and all records for this new column are null.
    I'm not sure what to check for next or what to test. I am hoping someone in this forum might lend a clue or has maybe seen this problem before. Below is my control file. The new column is the last one: fromweb_id. It is a number field in both FoxPro and Oracle.
    Thanks for any advice.
    JOBS table control file:
    load data
    infile 'convdata\fp_ora\JOBS.dat' "str X'08'"
    into table JOBS
    fields terminated by X'07'
    TRAILING NULLCOLS
    (SID,
    CO_NAME "replace(replace(:CO_NAME,chr(11),chr(10)),chr(15),chr(13))",
    JOB_TITLE "replace(replace(:JOB_TITLE,chr(11),chr(10)),chr(15),chr(13))",
    CREDITS,
    EARN_DATE date "mm/dd/yyyy",
    COMMENTS CHAR(2000) "replace(replace(:COMMENTS,chr(11),chr(10)),chr(15),chr(13))",
    DONT_SHOW,
    PC_SRC "replace(replace(:PC_SRC,chr(11),chr(10)),chr(15),chr(13))",
    PC_SRC_NO,
    SALARY,
    SALFOR,
    ROOM,
    BOARD,
    TIPS,
    UPD_DATE date "mm/dd/yyyy hh12:mi:ss am",
    STUKEY,
    JOBKEY,
    JO_COKEY,
    JO_CNKEY,
    JO_ZUKEY,
    EMPLID,
    CN_NAME "replace(replace(:CN_NAME,chr(11),chr(10)),chr(15),chr(13))",
    JOB_START date "mm/dd/yyyy",
    JOB_END date "mm/dd/yyyy",
    FROMWEB_ID)

    I apologize for not explaining how this was resolved. SQL*Loader was working as it should.
    The problem was due to new fields being added to the FoxPro table, along with the fromweb_id column, that I was not informed about. I was asked to add a column named fromweb_id to the Oracle jobs table and to the SQL*Loader program. I was not told that other columns had been added at the same time. In the FoxPro table, the fromweb_id column was the last column added.
    The jobs.dat file contained data from all columns in the FoxPro table, including all the new columns. I only added fromweb_id to the control file, which is what I was asked to do. When it ran, SQL*Loader was reading values from one of the other new columns, and those values were being loaded into the fromweb_id column in Oracle. It is that simple.
    When I had checked the FoxPro table earlier, I did not pick up on the other new columns; I was focusing on looking for values in the fromweb_id column. When back-tracing data in the jobs.dat file, I found a value in the fromweb_id column that matched a value in a different (new) column in FoxPro. That is when I noticed the other new columns and instantly knew what the problem was.
    Thanks for all the feedback. I'm sorry if this was an inconvenience to anyone. I'll try to dig a little deeper next time. Lessons learned...
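    For what it's worth, a hedged sketch: if extra columns ever appear in the .dat file again, SQL*Loader FILLER fields can consume them without loading them. The EXTRA_COL name below is hypothetical, and most of the JOBS fields are omitted for brevity:
    load data
    infile 'convdata\fp_ora\JOBS.dat' "str X'08'"
    into table JOBS
    fields terminated by X'07'
    TRAILING NULLCOLS
    (SID,
     CO_NAME,
     EXTRA_COL_1 FILLER,  -- read from the .dat file, kept for alignment, never loaded
     FROMWEB_ID)
    A FILLER field keeps the delimited positions aligned without needing a matching Oracle column.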
    regards,

  • SQL*Loader-930: Error parsing insert statement for column

    We upload data on a daily basis into the application through the apps user, and these tables are involved:
    1. DEV_RA_INTERFACE_LINES_ALL (owner is apps)
    2. RA_INTERFACE_LINES_ALL (owner is AR)
    We do these steps:
    1. Delete records from the DEV_RA_INTERFACE_LINES_ALL table
    2. Delete records from the RA_INTERFACE_LINES_ALL table
    3. Load data using SQL*Loader with the apps user
    4. Insert into the RA_INTERFACE_LINES_ALL table
    We want to change the user: these steps should be done by the dataupload user, not apps. We gave the proper rights (select, delete, and insert on these tables) to the dataupload user, but when we go to load data through SQL*Loader we receive this error:
    SQL*Loader-930: Error parsing insert statement for column APPS.DEV_RA_INTERFACE_LINES_ALL.ORIG_SYSTEM_BILL_ADDRESS_ID.
    ORA-00904: "F_ORIG_SYSTEM_BILL_ADDR_REF": invalid identifier
    If we insert data through apps, it works.

    Make sure that you have no spaces left between lines,
    and give the path of the control file correctly.
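    A hedged aside: ORA-00904 at insert-parse time usually means the INSERT statement SQL*Loader builds references an identifier the connected user cannot resolve: here, something named F_ORIG_SYSTEM_BILL_ADDR_REF that apps can see (perhaps via a synonym or grant) but dataupload cannot. Two diagnostic queries, run as dataupload, that assume only the names from the error message:
    SELECT column_name
    FROM   all_tab_columns
    WHERE  owner = 'APPS'
    AND    table_name = 'DEV_RA_INTERFACE_LINES_ALL';
    -- does dataupload see any object with the name from the ORA-00904?
    SELECT owner, object_name, object_type
    FROM   all_objects
    WHERE  object_name = 'F_ORIG_SYSTEM_BILL_ADDR_REF';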

  • SQL*Loader-929: Error parsing insert statement for table

    Hi,
    I get the following error with SQL*Loader:
    Table MYTABLE loaded from every logical record.
    Insert option in effect for this table: INSERT
    Column Name    Position   Len  Term Encl Datatype
    IDE            FIRST      *    ;         CHARACTER
      SQL string for column : "mysequence.NEXTVAL"
    CSI_NBR        1:10       10   ;         CHARACTER
    POLICY_NBR     11:22      12   ;         CHARACTER
    CURRENCY_COD   23:25      3    ;         CHARACTER
    POLICY_STAT    26:27      2    ;         CHARACTER
    PRODUCT_COD    28:35      8    ;         CHARACTER
    END_DAT        44:53      10   ;         CHARACTER
    FISCAL_COD     83:83      1    ;         CHARACTER
    TOT_VAL        92:112     21   ;         CHARACTER
    SQL*Loader-929: Error parsing insert statement for table MYTABLE.
    ORA-01031: insufficient privileges
    I am positive that I can SELECT the sequence and INSERT into the table with the user invoking sql*loader.
    Where does that "ORA-01031" come from?
    Regards
    ...

    Options:
    1) you are wrong about privileges OR
    2) you have the privilege only when you connect via SQL*Plus (or whichever other tool you used to test the insert).
    Is it possible that during your test you enabled the role which granted you the INSERT privilege - and that SQL*Loader doesn't do this?
    Can you see the table in this list?
    select *
    from user_tab_privs_recd
    where table_name = 'MY_TABLE'
    and owner = 'table owner whoever';
    select *
    from user_role_privs;
    Any roles where DEFAULT_ROLE is not YES?
    HTH
    Regards Nigel
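    If the role theory holds, the usual fix is to grant the privileges directly rather than through a role; a minimal sketch, with the owner schema and grantee names assumed (MYTABLE and mysequence taken from the log and question):
    -- run as the object owner or a DBA; direct grants do not depend on role state
    GRANT SELECT ON owner_schema.mysequence TO loading_user;
    GRANT INSERT ON owner_schema.MYTABLE TO loading_user;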

  • How to load error table from mapping

    I have a doubt about including an error table in a mapping.
    Here is the table structure:
    map_name varchar2(30),
    source_name varchar2(15),
    transformation_stage varchar2(100),
    source_name varchar2(30),
    target_table varchar2(30),
    error_field varchar2(100),
    error_message varchar2(225),
    execution_date date
    I have to include this table in all mappings and load the errors into this table. Please guide me on how to include this table and load the errors. If you have any code, please send it to me.
    with regards
    narendra.p.v

    Just a kind of workout (rather, call it a template) as a post-mapping process:
    declare
      -- in this cursor you can include more attributes in the SELECT clause
      cursor cur_audit is
        select a.rte_iid,
               a.rta_iid,
               a.rte_rowkey,
               a.rte_sqlerrm,
               c.task_name,
               -- return_result gets its value only after the mapping has
               -- completed (or exited), including the post-process
               case when a.rte_iid is null then 'SUCCESS'
                    else 'FAILURE'
               end return_result,
               c.return_code,
               c.number_of_task_errors,
               c.number_of_task_warnings,
               c.creation_date,
               c.last_update_date
          from wb_rt_errors a,
               wb_rt_audit b,
               wb_rt_audit_executions c
         where b.rte_id = any (select max(audit_execution_id)
                                 from wb_rt_audit_executions)
           and b.rta_iid = a.rta_iid(+)
           and b.rte_id = c.audit_execution_id;
    begin
      for i in cur_audit loop
        insert into cs_event_template
        values (ccv_event_seq.nextval, i.task_name, i.return_result,
                i.return_code, i.number_of_task_errors,
                i.number_of_task_warnings, i.rte_sqlerrm, i.rte_rowkey,
                i.rta_iid, i.rte_iid, i.creation_date, i.last_update_date);
      end loop;
      commit;
    exception
      when others then
        dbms_output.put_line('Error: ' || sqlerrm);
    end;
    You will get more than this; and if you have a simpler plan than this one, would you mind letting us know...

  • FDMEE Import error "No periods were identified for loading data into table 'AIF_EBS_GL_BALANCES_STG'"

    Hi,
    We are having trouble importing one ledger, 'GERMANY EUR GGAAP'. It works for Dec 2014, but when trying to import data for 2015 it gives an error.
    The import error shows "RuntimeError: No periods were identified for loading data into table 'AIF_EBS_GL_BALANCES_STG'."
    I tried all the Knowledge documents from Oracle Support but no luck. Please help us resolve this issue, as it is occurring in our Production system.
    I also checked all period settings under Data Management > Setup > Integration Setup > Global Mapping and Source Mapping, and they all look correct.
    Also, it is only happening to this one ledger; all other ledgers are working fine without any issues.
    Thanks

    Hi,
    there are some Support documents related to this issue.
    I would suggest you have a look at them.
    Regards

  • Column not found error while populating an Oracle table with the ODI user

    Hi,
    I am trying to populate a column in an Oracle table with the ODI user name, using the function getUser("USER_NAME") in the mapping column of the interface. But the interface throws the error *Column not found: Supervisor in Statement [Select......]*. It works fine with getUser("I_USER"): that column is populated with the user identifier.
    Can anyone help me understand why the user name is not populating?
    Thanks

    Enclose the call to the getUser API inside single quotes:
    '<%=getUser("USER_NAME")%>'
    I_USER, being a number, can be used directly, but USER_NAME returns a string that needs to be quoted.

  • How to find out the NULL columns in the table?

    Hi,
    Please provide a query to find the null columns in a table. Here, all rows in the table have the same column as null, and that won't change.
    Table
    c1  c2  c3  c4
    X       C   10
    T       D   20
    I want to find out that C2 is the all-null column.
    Thanks in advance !!
    Regards,
    Vissu....

    The code below might be a solution for finding the all-NULL columns in a table.
    declare
      cursor col_cur is
        select column_name
          from user_tab_cols   -- use all_tab_cols instead if the table is in another schema
         where table_name = 'TABLE_NAME';   -- provide the TABLE_NAME
      stmt       varchar2(1000) := '';
      v_count    number;
      count_null number;
    begin
      execute immediate 'select count(*) from TABLE_NAME' into v_count;   -- provide the TABLE_NAME
      for rec in col_cur loop
        execute immediate 'select count(*) from TABLE_NAME where '        -- provide the TABLE_NAME
                          || rec.column_name || ' is null' into count_null;
        if count_null = v_count then
          stmt := stmt || rec.column_name || chr(13);
        end if;
      end loop;
      dbms_output.put_line(stmt);
    end;
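    If optimizer statistics are reasonably fresh, a dictionary-only check avoids the dynamic SQL entirely; a sketch, assuming current stats (num_nulls and num_rows are stale otherwise):
    select column_name
      from user_tab_cols
     where table_name = 'TABLE_NAME'   -- provide the TABLE_NAME
       and num_nulls = (select num_rows
                          from user_tables
                         where table_name = 'TABLE_NAME');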

  • XMLGEN: Produce XML dump of a table WITH tags for null column values

    I am new to generating XML so bear with me....
    We have a customer who needs an XML extract of data, including tags for any column that is null.
    I created a simple test case below using DBMS_XMLGEN.getXML. The first row (A1) has no null values and thus tags for columns A, BEE and CEE are produced. The second row (A2) has null in column BEE and thus tags for only columns A and CEE are produced.
    Is there a way to force a tag for null column BEE in the second row?
    create table foo (A varchar2(10), BEE number, CEE date);
    insert into foo values ('A1',1,sysdate);
    insert into foo values ('A2',null,sysdate);
    SELECT DBMS_XMLGEN.getXML('SELECT * FROM foo') FROM dual;
    <ROWSET>
    <ROW>
    <A>A1</A>
    <BEE>1</BEE>
    <CEE>27-SEP-12</CEE>
    </ROW>
    <ROW>
    <A>A2</A>
    <CEE>27-SEP-12</CEE>
    </ROW>
    </ROWSET>

    What's the database version? (SELECT * FROM v$version)
    Could you use this instead :
    SQL> select xmlserialize(document
      2           xmlelement("ROWSET",
      3             xmlagg(
      4               xmlelement("ROW",
      5                 xmlelement("A", a)
      6               , xmlelement("BEE", bee)
      7               , xmlelement("CEE", cee)
      8               )
      9             )
    10           )
    11         -- for display purpose only :
    12         as clob indent
    13         )
    14  from foo
    15  ;
    XMLSERIALIZE(DOCUMENTXMLELEMEN
    <ROWSET>
      <ROW>
        <A>A1</A>
        <BEE>1</BEE>
        <CEE>2012-09-27</CEE>
      </ROW>
      <ROW>
        <A>A2</A>
        <BEE/>
        <CEE>2012-09-27</CEE>
      </ROW>
    </ROWSET>
    Or,
    SQL> select xmlserialize(document
      2           xmlquery(
      3             '(#ora:view_on_null empty #)
      4             {
      5               <ROWSET>{fn:collection("oradb:/DEV/FOO")}</ROWSET>
      6             }'
      7             returning content
      8           )
      9           as clob indent
    10         )
    11  from dual;
    XMLSERIALIZE(DOCUMENTXMLQUERY(
    <ROWSET>
      <ROW>
        <A>A1</A>
        <BEE>1</BEE>
        <CEE>2012-09-27</CEE>
      </ROW>
      <ROW>
        <A>A2</A>
        <BEE/>
        <CEE>2012-09-27</CEE>
      </ROW>
    </ROWSET>
    (where "DEV" is my test schema)
    If you want to stick with DBMS_XMLGEN, you're going to have to use PL/SQL and the setNullHandling procedure.
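    A minimal PL/SQL sketch of that DBMS_XMLGEN route (EMPTY_TAG renders null columns as empty elements such as <BEE/>):
    DECLARE
      ctx DBMS_XMLGEN.ctxHandle;
      doc CLOB;
    BEGIN
      ctx := DBMS_XMLGEN.newContext('SELECT * FROM foo');
      -- keep a tag for NULL columns instead of dropping them
      DBMS_XMLGEN.setNullHandling(ctx, DBMS_XMLGEN.EMPTY_TAG);
      doc := DBMS_XMLGEN.getXML(ctx);
      DBMS_XMLGEN.closeContext(ctx);
      DBMS_OUTPUT.put_line(dbms_lob.substr(doc, 4000, 1));
    END;
    /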

  • SQL Loader Constraints with Column Objects and Nested Tables

    I am working on loading a table that (god forbid) contains columns, column objects, and nested tables (which contain several depths of column objects). My question is: does SQL*Loader have a hidden, undocumented feature that dictates how the column objects must be grouped in reference to the nested tables within the loader file? I can load the various column objects and nested tables fine right now; however, I am loading them all in a strange and insane order. Can anyone answer this question? Thanks.
    Peter

    I just noticed that my email is wrong. If you can help, please send email to [email protected]
    thanks.

  • Script to add not null columns to all tables in database

    Hello,
    I need to add 5 NOT NULL columns to my existing database (all tables).
    The problem is that I do not want to lose the current data.
    I need a script so that I need not do this manually for each table.
    Can you suggest one?
    Vishal

    I always follow these steps:
    1) Alter table <<tablename>> add(<<columnname>> <<datatype>>)
    2) Update <<tablename>> set <<columnname>>=<<anyvalue>>
    3) Alter table <<tablename>> modify(<<columnname>> <<datatype>> not null)
    Otherwise:
    1) rename <<tablename>> to <<tablenamebk>>
    2) drop table <<tablename>>
    3) Alter table <<tablenamebk>> add(<<columnname>> <<datatype>>)
    4) update <<tablenamebk>> set <<columnname>>=<<anyvalue>>
    5) create table <<tablename>> (with additional columns with not null)
    6) insert into <<tablename>> select * from <<tablenamebk>>
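    A hedged sketch of such a script, using the first approach (the column name, type, and backfill default are placeholders); adding the column with DEFAULT ... NOT NULL in a single statement backfills existing rows with the default, so no data is lost:
    begin
      for t in (select table_name from user_tables) loop
        execute immediate 'alter table "' || t.table_name || '" ' ||
                          'add (new_col1 varchar2(10) default ''N/A'' not null)';
      end loop;
    end;
    /
    Repeat for each of the 5 columns, or list all five in one add (...) clause.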

  • Using dbms_lob to load image into table

    I am trying to load a set of images from my DB drive into a table. This works fine when I load only one record. If I try to load more than one record, the first gets created, but I get this error and it doesn't load the images for the rest:
    ORA-22297:     warning: Open LOBs exist at transaction commit time
    Cause:     An attempt was made to commit a transaction with open LOBs at transaction commit time.
    Action:     This is just a warning. The transaction was commited successfully, but any domain or functional indexes on the open LOBs were not updated. You may want to rebuild those indexes.
    Am I missing something in the code that's needed?
    declare
      in_file     utl_file.file_type;
      bf          bfile;
      b           blob;
      src_offset  integer := 1;
      dest_offset integer := 1;
      l_picture_uploaded varchar2(1);
      cursor get_pics is select id from emp;
    begin
      for x in get_pics loop
        begin
          insert into stu_pic (id, student_picture)
          values (x.id, empty_blob())
          returning student_picture into b;
          l_picture_uploaded := 'Y';
          -- p_image_type is a parameter of the enclosing procedure
          bf := bfilename('INTERFACES', x.id || '.' || p_image_type);
          dbms_lob.fileopen(bf, dbms_lob.file_readonly);
          dbms_lob.open(b, dbms_lob.lob_readwrite);
          -- the offsets are IN OUT parameters, so reset them for every file
          src_offset  := 1;
          dest_offset := 1;
          dbms_lob.loadblobfromfile(b, bf, dbms_lob.lobmaxsize, dest_offset, src_offset);
          dbms_lob.close(b);
          dbms_lob.fileclose(bf);
        exception
          when dup_val_on_index then null;
        end;
      end loop;
    end;

    There are two methods you can use.
    1. Create an external table with those images (BLOB column) and then use that external table to insert into another table.
    Demo as follows:
    These are my pdf files:
    C:\Saubhik\Assembly\Books\Algorithm>dir *.pdf
    Volume in drive C has no label.
    Volume Serial Number is 6806-ABBD
    Directory of C:\Saubhik\Assembly\Books\Algorithm
    08/16/2009  02:11 PM         1,208,247 algorithms.pdf
    08/17/2009  01:05 PM        13,119,033 fci4all.com.Introduction_to_the
    d_Analysis_of_Algorithms.pdf
    09/04/2009  06:58 PM        30,375,002 sedgewick-algorithms.pdf
                   3 File(s)     44,702,282 bytes
                   0 Dir(s)   7,474,593,792 bytes free
    C:\Saubhik\Assembly\Books\Algorithm>
    This is the file with which I'll load the pdf files as BLOBs:
    C:\Saubhik\Assembly\Books\Algorithm>type mypdfs.txt
    Algorithms.pdf,algorithms.pdf
    Sedgewick-Algorithms.pdf,sedgewick-algorithms.pdf
    C:\Saubhik\Assembly\Books\Algorithm>
    Now the actual code:
    SQL> /* This is my directory object */
    SQL> CREATE or REPLACE DIRECTORY saubhik AS 'C:\Saubhik\Assembly\Books\Algorithm';
    Directory created.
    SQL> /* Now my external table */
    SQL> /* This table contains two columns: 1. pdfname contains the name of the file
    DOC>   and 2. pdfFile is a BLOB column that contains the actual pdf */
    SQL> CREATE TABLE mypdf_external (pdfname VARCHAR2(50),pdfFile BLOB)
      2         ORGANIZATION EXTERNAL (
      3           TYPE ORACLE_LOADER
      4            DEFAULT DIRECTORY saubhik
      5            ACCESS PARAMETERS (
      6              RECORDS DELIMITED BY NEWLINE
      7              BADFILE saubhik:'lob_tab_%a_%p.bad'
      8              LOGFILE saubhik:'lob_tab_%a_%p.log'
      9              FIELDS TERMINATED BY ','
    10              MISSING FIELD VALUES ARE NULL
    11               (pdfname char(100),blob_file_name CHAR(100))
    12              COLUMN TRANSFORMS (pdfFile FROM lobfile(blob_file_name) FROM (saubhik) BLOB)
    13            )
    14            LOCATION('mypdfs.txt')
    15         )
    16         REJECT LIMIT UNLIMITED;
    Table created.
    SQL> SELECT pdfname,DBMS_LOB.getlength(pdfFile) pdfFileLength
      2  FROM   mypdf_external;
    PDFNAME                                            PDFFILELENGTH
    Algorithms.pdf                                           1208247
    Sedgewick-Algorithms.pdf                                30375002
    SQL>
    Now you can use this table for any operation very easily, even for loading into another table!
    2. Use of DBMS_LOB like this
    /* Loading an image, Winter.jpg, into the BLOB column as a BLOB! */
    DECLARE
      v_src_blob_locator BFILE := BFILENAME('SAUBHIK', 'Winter.jpg');
      v_amount_to_load   INTEGER := 4000;
      dest_lob_loc BLOB;
    BEGIN
      --Insert a empty row with id 1
      INSERT INTO test_my_blob_clob VALUES(1,EMPTY_BLOB(),EMPTY_CLOB())
       RETURNING BLOB_COL INTO dest_lob_loc;
      DBMS_LOB.open(v_src_blob_locator, DBMS_LOB.lob_readonly);
      v_amount_to_load := DBMS_LOB.getlength(v_src_blob_locator);
      DBMS_LOB.loadfromfile(dest_lob_loc, v_src_blob_locator, v_amount_to_load);
      DBMS_LOB.close(v_src_blob_locator);
      COMMIT;
      --id=1 is created with Winter.jpg populated in BLOB_COL; CLOB_COL stays empty
    END;
    Now use this code to create a procedure with a parameter and call it in a loop.

  • On load, getting error:  Field in data file exceeds maximum length

    Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
    PL/SQL Release 11.2.0.3.0 - Production
    CORE    11.2.0.3.0    Production
    TNS for Solaris: Version 11.2.0.3.0 - Production
    NLSRTL Version 11.2.0.3.0 - Production
    I'm trying to load a table, small in size (110 rows, 6 columns).  One of the columns, called NOTES, errors when I run the load, saying that the column size exceeds the maximum limit.  As you can see here, the table column is set to 4000 bytes:
    CREATE TABLE NRIS.NRN_REPORT_NOTES
    (
      NOTES_CN      VARCHAR2(40 BYTE)               DEFAULT sys_guid()            NOT NULL,
      REPORT_GROUP  VARCHAR2(100 BYTE)              NOT NULL,
      AREACODE      VARCHAR2(50 BYTE)               NOT NULL,
      ROUND         NUMBER(3)                       NOT NULL,
      NOTES         VARCHAR2(4000 BYTE),
      LAST_UPDATE   TIMESTAMP(6) WITH TIME ZONE     DEFAULT systimestamp          NOT NULL
    )
    TABLESPACE USERS
    RESULT_CACHE (MODE DEFAULT)
    PCTUSED    0
    PCTFREE    10
    INITRANS   1
    MAXTRANS   255
    STORAGE    (
                INITIAL          80K
                NEXT             1M
                MINEXTENTS       1
                MAXEXTENTS       UNLIMITED
                PCTINCREASE      0
                BUFFER_POOL      DEFAULT
                FLASH_CACHE      DEFAULT
                CELL_FLASH_CACHE DEFAULT
               )
    LOGGING
    NOCOMPRESS
    NOCACHE
    NOPARALLEL
    MONITORING;
    I did a little investigating, and it doesn't add up.
    When I run:
    select max(lengthb(notes)) from NRIS.NRN_REPORT_NOTES;
    I get a return of 643.
    That tells me that the largest instance of that column is only 643 bytes, yet EVERY insert is failing.
    Here is the loader file header, and first couple of inserts:
    LOAD DATA
    INFILE *
    BADFILE './NRIS.NRN_REPORT_NOTES.BAD'
    DISCARDFILE './NRIS.NRN_REPORT_NOTES.DSC'
    APPEND INTO TABLE NRIS.NRN_REPORT_NOTES
    Fields terminated by ";" Optionally enclosed by '|'
    (NOTES_CN,
      REPORT_GROUP,
      AREACODE,
      ROUND NULLIF (ROUND="NULL"),
      NOTES,
      LAST_UPDATE TIMESTAMP WITH TIME ZONE "MM/DD/YYYY HH24:MI:SS.FF9 TZR" NULLIF (LAST_UPDATE="NULL")
    )
    BEGINDATA
    |E2ACF256F01F46A7E0440003BA0F14C2|;|DEMOGRAPHICS|;|A01003|;3;|Demographic results show that 46 percent of visits are made by females.  Among racial and ethnic minorities, the most commonly encountered are Native American (4%) and Hispanic / Latino (2%).  The age distribution shows that the Bitterroot has a relatively small proportion of children under age 16 (14%) in the visiting population.  People over the age of 60 account for about 22% of visits.   Most of the visitation is from the local area.  More than 85% of visits come from people who live within 50 miles.|;07/29/2013 16:09:27.000000000 -06:00
    |E2ACF256F02046A7E0440003BA0F14C2|;|VISIT DESCRIPTION|;|A01003|;3;|Most visits to the Bitterroot are fairly short.  Over half of the visits last less than 3 hours.  The median length of visit to overnight sites is about 43 hours, or about 2 days.  The average Wilderness visit lasts only about 6 hours, although more than half of those visits are shorter than 3 hours long.   Most visits come from people who are fairly frequent visitors.  Over thirty percent are made by people who visit between 40 and 100 times per year.  Another 8 percent of visits are from people who report visiting more than 100 times per year.|;07/29/2013 16:09:27.000000000 -06:00
    |E2ACF256F02146A7E0440003BA0F14C2|;|ACTIVITIES|;|A01003|;3;|The most frequently reported primary activity is hiking/walking (42%), followed by downhill skiing (12%), and hunting (8%).  Over half of the visits report participating in relaxing and viewing scenery.|;07/29/2013 16:09:27.000000000 -06:00
    Here is the beginning of the loader log, ending after the first rejected row (they ALL show the same error):
    SQL*Loader: Release 10.2.0.4.0 - Production on Thu Aug 22 12:09:07 2013
    Copyright (c) 1982, 2007, Oracle.  All rights reserved.
    Control File:   NRIS.NRN_REPORT_NOTES.ctl
    Data File:      NRIS.NRN_REPORT_NOTES.ctl
      Bad File:     ./NRIS.NRN_REPORT_NOTES.BAD
      Discard File: ./NRIS.NRN_REPORT_NOTES.DSC
    (Allow all discards)
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 50
    Bind array:     64 rows, maximum of 256000 bytes
    Continuation:    none specified
    Path used:      Conventional
    Table NRIS.NRN_REPORT_NOTES, loaded from every logical record.
    Insert option in effect for this table: APPEND
       Column Name                  Position   Len  Term Encl Datatype
    NOTES_CN                            FIRST     *   ;  O(|) CHARACTER
    REPORT_GROUP                         NEXT     *   ;  O(|) CHARACTER
    AREACODE                             NEXT     *   ;  O(|) CHARACTER
    ROUND                                NEXT     *   ;  O(|) CHARACTER
        NULL if ROUND = 0X4e554c4c(character 'NULL')
    NOTES                                NEXT     *   ;  O(|) CHARACTER
    LAST_UPDATE                          NEXT     *   ;  O(|) DATETIME MM/DD/YYYY HH24:MI:SS.FF9 TZR
        NULL if LAST_UPDATE = 0X4e554c4c(character 'NULL')
    Record 1: Rejected - Error on table NRIS.NRN_REPORT_NOTES, column NOTES.
    Field in data file exceeds maximum length...
    I am not seeing why this would be failing.

    Hi,
    The problem is that delimited data defaults to CHAR(255)..... very helpful, I know.....
    What you need to do is tell sqlldr that the data is longer than this:
    change NOTES to NOTES CHAR(4000) in your control file and it should work.
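    Applied to the control file above, only the NOTES entry changes (the rest is copied from the question):
    (NOTES_CN,
      REPORT_GROUP,
      AREACODE,
      ROUND NULLIF (ROUND="NULL"),
      NOTES CHAR(4000),
      LAST_UPDATE TIMESTAMP WITH TIME ZONE "MM/DD/YYYY HH24:MI:SS.FF9 TZR" NULLIF (LAST_UPDATE="NULL")
    )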
    cheers,
    harry

  • How to capture Field validation errors in the Error table in ODI 11g

    Hello,
    We are using ODI 11g (11.1.1.5), and the scenario is to read data from a flat file (.txt) and do a bulk insert into an MS SQL Server database table.
    We need to capture the error records (where the source field size is greater than the target column size) in the error table. However, the interface errors out at step "Loading - SrcSet0 - Load data (BULK INSERT)" with the error message "[SQLServer JDBC Driver][SQLServer]Bulk load data conversion error (truncation) for row 33, column 6", and these errors are not being inserted into the error table.
    Is there a way to capture these errors in the error table? Below is the KM details.
    LKM: LKM File to MSSQL (BULK)
    CKM: CKM SQL
    IKM: IKM MSSQL Incremental Update
    FLOW_CONTROL is set to true for the IKM.
    Thanks,
    Krishna

    Hello,
    I had the same problem with ODI when I was trying to BULK INSERT a txt file into MS SQL. Check the cell(s) in your source file (txt); it looks like a value in a cell has hidden symbols: when pressing F2 and trying to edit the value in the cell, the cursor appeared far to the right of the visible end of the value. So try using backspace to delete the hidden symbols, and verify the above. If everything is OK, then modify your txt file. Let me know if it works.
    BTW, I've created a procedure inside MS SQL 2008 R2 which BULK INSERTed records into a temporary (#...) table, and immediately, without any verification, all the records were inserted into the final table in the DWH. Here is the statement:
    if object_id('TEMPDB..#<table>','U') is not null drop table #<table>
    CREATE TABLE [dbo].[#<table>]
    (
    [1] [varchar] (50) NULL,
    [2] [varchar] (100) NULL,
    [3] [varchar] (100) NULL,
    [4] [varchar] (100) NULL,
    [5] [varchar] (100) NULL,
    [6] [varchar] (100) NULL,
    [7] [varchar] (100) NULL,
    [8] [varchar] (100) NULL,
    [9] [varchar] (100) NULL,
    [10] [varchar] (100) NULL,
    [11] [varchar] (100) NULL
    ) ON [PRIMARY]
    BULK INSERT #<table> FROM 'N:\<table>.txt'
    WITH
    (FIRSTROW=2, KEEPNULLS, CODEPAGE=1252, FIELDTERMINATOR='\t')
    INSERT INTO <table>
    SELECT * FROM #<table>
    and it works! Let me also know if you find any other way around.
    regards
    Anatoli
