External Table Load KUP-04037 Error

I was asked to repost this here. This was a follow-on question to the thread "how to load special characters (diacritics)" in the Export/Import/SQL Loader/External Table forum.
I've defined an external table, and on my one instance running the WE8MSWIN1252 character set everything works fine. On my other instance, running AL32UTF8, I get the KUP-04037 error (terminator not found) on the field that contains "à" (the letter a with a grave accent). Changing it to a standard "a" avoids the error. Changing the column definition in the external table to NVARCHAR2 does NOT help.
Any ideas anyone?
Thanks,
Bob Siegel

Exactly. If you do not specify the CHARACTERSET parameter, the database character set is used to interpret the input file. As the input file is in WE8MSWIN1252, the ORACLE_LOADER driver gets confused trying to interpret single-byte WE8MSWIN1252 codes as multibyte AL32UTF8 codes.
The character set of the input file depends on the way it was created. Even on US Windows, you can create text files in different encodings. Notepad allows you to save the file in ANSI code page (=WE8MSWIN1252 on US Windows), Unicode (=AL16UTF16LE), Unicode big endian (=AL16UTF16), and UTF-8 (=AL32UTF8). The Command Prompt edit.exe editor saves the files in the OEM code page (=US8PC437 on US Windows).
-- Sergiusz
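
In practice that means declaring the input file's encoding in the access parameters. A minimal sketch (the directory, file, and column names here are placeholders, not from the original post):

CREATE TABLE names_ext (
  name VARCHAR2(100)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    CHARACTERSET WE8MSWIN1252  -- tell the driver the file's encoding explicitly
    FIELDS TERMINATED BY ','
  )
  LOCATION ('names.txt')
)
REJECT LIMIT UNLIMITED;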

Similar Messages

  • While creating external table getting KUP-01005 error

    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-00554: error encountered while parsing access parameters
    KUP-01005: syntax error: found "identifier": expecting one of: "badfile, byteordermark, characterset, colon

    Post the Create statement.

  • Create a External Table in Oracle 10g:== ERROR: KUP-01005

    Hello.
    I have a problem working with external tables, hope someone can help me with this problem. Thanks.
    This is the code of the external table
    ========================
    CREATE TABLE SIAFI.RNP_IDS
    NUMERO_ID VARCHAR2(30 BYTE),
    PRIMER_NOMBRE VARCHAR2(300 BYTE),
    SEGUNDO_NOMBRE VARCHAR2(300 BYTE),
    APELLIDO_PATERNO VARCHAR2(300 BYTE),
    APELLIDO_MATERNO VARCHAR2(300 BYTE),
    DEPARTAMENTO VARCHAR2(300 BYTE),
    CORRELATIVO VARCHAR2(300 BYTE)
    ORGANIZATION EXTERNAL
    ( TYPE ORACLE_LOADER
    DEFAULT DIRECTORY SEG_DIRECTORIO
    ACCESS PARAMETERS
    ( records delimited by NEWLINE
    badfile SEG_DIRECTORIO:'censo.bad'
    logfile SEG_DIRECTORIO:'censo.log'
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' and '"'LDRTRIM
    REJECT ROWS WITH ALL NULL FIELDS
    (NUMERO_ID VARCHAR(30) NULLIF NUMERO_ID=BLANKS
    TERMINATED BY "," OPTIONALLY ENCLOSED BY '"' and '"'LDRTRIM,
    PRIMER_NOMBRE VARCHAR(300) NULLIF PRIMER_NOMBRE=BLANKS
    TERMINATED BY "," OPTIONALLY ENCLOSED BY '"' and '"'LDRTRIM,
    SEGUNDO_NOMBRE VARCHAR(300) NULLIF SEGUNDO_NOMBRE=BLANKS
    TERMINATED BY "," OPTIONALLY ENCLOSED BY '"' and '"'LDRTRIM,
    APELLIDO_PATERNO VARCHAR(300) NULLIF APELLIDO_PATERNO=BLANKS
    TERMINATED BY "," OPTIONALLY ENCLOSED BY '"' and '"'LDRTRIM,
    APELLIDO_MATERNO VARCHAR(300) NULLIF APELLIDO_MATERNO=BLANKS
    TERMINATED BY "," OPTIONALLY ENCLOSED BY '"' and '"'LDRTRIM,
    DEPARTAMENTO VARCHAR(300) NULLIF DEPARTAMENTO=BLANKS
    TERMINATED BY "," OPTIONALLY ENCLOSED BY '"' and '"'LDRTRIM,
    CORRELATIVO VARCHAR(300) NULLIF CORRELATIVO=BLANKS
    TERMINATED BY "," OPTIONALLY ENCLOSED BY '"' and '"'LDRTRIM,
    LOCATION (SEG_DIRECTORIO:'censo.txt')
    REJECT LIMIT UNLIMITED
    NOPARALLEL
    NOMONITORING;
    When executing the statement select * from RNP_IDS, it returns the following error message:
    ===========================================================
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-00554: error encountered while parsing access parameters
    KUP-01005: syntax error: found "terminated": expecting one of: "and, comma, defaultif, not, nullif, or, )"
    KUP-01007: at line 6 column 56
    ORA-06512: at "SYS.ORACLE_LOADER", line 19
    ORA-06512: at line 1
    This is the example of the file I'm using:
    ==========================
    "0","DOUGLAS","AUGUSTO","ABBOTT","","1","3672097"
    "0101190600010","MARIA","URBANA","GOMEZ","URBINA","2","1949122"
    "0101190600076","ENRIQUETA","","GARCIA","","2","1162025"
    "0101190800106","LUCILA","","FLORES","","2","1658013"

    Hi
    Here we go...
    I reduced the varchar length from 300 to 30 for testing purposes...
    SQL> CREATE TABLE RNP_IDS
    2 (
    3 NUMERO_ID VARCHAR2(30),
    4 PRIMER_NOMBRE VARCHAR2(30),
    5 SEGUNDO_NOMBRE VARCHAR2(30),
    6 APELLIDO_PATERNO VARCHAR2(30),
    7 APELLIDO_MATERNO VARCHAR2(30),
    8 DEPARTAMENTO VARCHAR2(30),
    9 CORRELATIVO VARCHAR2(30)
    10 )
    11 ORGANIZATION EXTERNAL
    12 ( TYPE ORACLE_LOADER
    13 DEFAULT DIRECTORY LOG
    14 ACCESS PARAMETERS
    15 ( records delimited by NEWLINE
    16 badfile LOG:'censo.bad'
    17 logfile LOg:'censo.log'
    18 FIELDS TERMINATED BY ','
    19 OPTIONALLY ENCLOSED BY '"' and '"'LDRTRIM
    20 REJECT ROWS WITH ALL NULL FIELDS
    21 (
    22 NUMERO_ID ,
    23 PRIMER_NOMBRE ,
    24 SEGUNDO_NOMBRE ,
    25 APELLIDO_PATERNO,
    26 APELLIDO_MATERNO ,
    27 DEPARTAMENTO,
    28 CORRELATIVO
    29 )
    30 )
    31 LOCATION ('sample1.txt')
    32 )
    33 REJECT LIMIT UNLIMITED
    34 NOPARALLEL
    35 NOMONITORING;
    Table created.
    SQL> desc rnp_ids
    Name Null? Type
    NUMERO_ID VARCHAR2(30)
    PRIMER_NOMBRE VARCHAR2(30)
    SEGUNDO_NOMBRE VARCHAR2(30)
    APELLIDO_PATERNO VARCHAR2(30)
    APELLIDO_MATERNO VARCHAR2(30)
    DEPARTAMENTO VARCHAR2(30)
    CORRELATIVO VARCHAR2(30)
    SQL> select numero_id from rnp_ids;
    NUMERO_ID
    0
    0101190600010
    0101190600076
    0101190800106
    - Pavan Kumar N

  • External Table Load Size

    Hi,
    Could you please guide me on how much maximum load(Flat File) size is supported by External Table, in 9i and 10g.
    Thanks,
    Ashish

    I am not sure any size limits exist - what size files are you planning to use?
    HTH
    Srini

  • Need info on using external tables load/write data

    Hi All,
    We are planning to load conversion/interface data using external tables feature available in the Oracle database.
    Also for outbound interfaces, we are planning to use the same feature to write data in the file system.
    If you have done similar exercise in any of your projects, please share sample code units. Also, let me know if there
    are any cons/limitations in this approach.
    Thanks,
    Balaji

    Please see old threads for similar discussion -- http://forums.oracle.com/forums/search.jspa?threadID=&q=external+AND+tables&objID=c3&dateRange=all&userID=&numResults=15&rankBy=10001
    Thanks,
    Hussein
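
    As a quick illustration of both directions (a sketch only; the directory, table, and file names are invented for the example): ORACLE_LOADER reads a flat file in, while ORACLE_DATAPUMP can write query results out to the file system. Note that the ORACLE_DATAPUMP output is a proprietary dump file, not plain text.

    -- Inbound: read a flat file through an ORACLE_LOADER external table
    CREATE TABLE conv_in_ext (
      emp_id   NUMBER,
      emp_name VARCHAR2(100)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY conv_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
        MISSING FIELD VALUES ARE NULL
      )
      LOCATION ('conv_in.csv')
    )
    REJECT LIMIT UNLIMITED;

    -- Outbound: unload query results to the file system with ORACLE_DATAPUMP
    CREATE TABLE conv_out_ext
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_DATAPUMP
      DEFAULT DIRECTORY conv_dir
      LOCATION ('conv_out.dmp')
    )
    AS SELECT emp_id, emp_name FROM conv_in_ext;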

  • External Table Loads - Insufficient Privs

    Hi...please advise:
    Facts:
    1. I have 2 schemas: SCHEMA_A and SCHEMA_B
    2. I have an oracle directory 'VPP_DIR' created under SCHEMA_A and granted WRITE and READ on the dir to SCHEMA_B.
    3. The physical dir on the unix server to which VPP_DIR points has read, write, execute privs for the Oracle user.
    4. I have a procedure in SCHEMA_A (CET_PROC) which dynamically creates the external table with parameters passed to it like directory_name, file_name, column_definitions, load_when_clause etc.
    5. The CET_PROC also does a grant SELECT on external table to SCHEMA_B once it is created.
    6. SCHEMA_B has EXECUTE privs to SCHEMA_A.CET_PROC.
    7. SCHEMA_B has a proc (DO_LOAD_PROC) that calls SCHEMA_A.CET_PROC.
    At the point where SCHEMA_A.CET_PROC tries to run the EXECUTE IMMEDIATE call with the create table code, it fails with "ORA-01031: insufficient privileges".
    If I execute SCHEMA_A.CET_PROC from within SCHEMA_A with the same parameters it works fine.
    If I create CET_PROC inside SCHEMA_B and execute this version from within SCHEMA_B it works fine.
    From across schemas, it fails. Any advice, please?

    Works for me without CREATE ANY TABLE.
    I found it easier to follow the permissions if I replaced SCHEMA_A and SCHEMA_B with OVERLORD and FLUNKY.
    /Users/williamr: cat /Volumes/Firewire1/william/testexttable.dat
    1,Eat,More,Bananas,Today
    As SYS:
    Connected to:
    Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    Account: SYS@//centosvm.starbase.local:1521/dev10g.starbase.local
    SQL> CREATE USER overlord IDENTIFIED BY overlord                         
      2  DEFAULT TABLESPACE users QUOTA UNLIMITED ON users ACCOUNT UNLOCK;
    User created.
    SQL> CREATE USER flunky IDENTIFIED BY flunky
      2  DEFAULT TABLESPACE users QUOTA UNLIMITED ON users ACCOUNT UNLOCK;
    User created.
    SQL> GRANT CREATE SESSION, CREATE TABLE, CREATE PROCEDURE TO overlord,flunky;
    Grant succeeded.
    SQL> GRANT READ,WRITE ON DIRECTORY extdrive TO overlord;
    Grant succeeded.
    As OVERLORD:
    Account: OVERLORD@//centosvm.starbase.local:1521/dev10g.starbase.local
    SQL> get afiedt.buf
      1  CREATE OR REPLACE PROCEDURE build_xt
      2      ( p_data OUT SYS_REFCURSOR )
      3  AS
      4      v_sqlstr VARCHAR2(4000) := q'|
      5          CREATE TABLE test_xt
      6          ( id   NUMBER(8)
      7          , col1 VARCHAR2(10)
      8          , col2 VARCHAR2(10)
      9          , col3 VARCHAR2(10)
    10          , col4 VARCHAR2(10) )
    11          ORGANIZATION EXTERNAL
    12          ( TYPE oracle_loader
    13            DEFAULT DIRECTORY extdrive
    14            ACCESS PARAMETERS
    15            ( RECORDS DELIMITED BY newline
    16              BADFILE 'testexttable.bad'
    17              DISCARDFILE 'testexttable.dsc'
    18              LOGFILE 'testexttable.log'
    19              FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    20              ( id, col1, col2, col3, col4 ) )
    21           LOCATION ('testexttable.dat') )
    22           |';
    23  BEGIN
    24      EXECUTE IMMEDIATE v_sqlstr;
    25      OPEN p_data FOR 'SELECT * FROM test_xt';
    26* END build_xt;
    27 
    28  .
    SQL> @afiedt.buf
    Procedure created.
    SQL> grant execute on build_xt to flunky;
    Grant succeeded.
    SQL> -- Prove it works:
    SQL> var results refcursor
    SQL>
    SQL> exec build_xt(:results)
    PL/SQL procedure successfully completed.
            ID COL1       COL2       COL3       COL4
             1 Eat        More       Bananas    Today
    1 row selected.
    SQL> drop table test_xt purge;
    Table dropped.
    As FLUNKY:
    Account: FLUNKY@//centosvm.starbase.local:1521/dev10g.starbase.local
    SQL> SELECT * FROM user_sys_privs;
    USERNAME                       PRIVILEGE                                ADM
    FLUNKY                         CREATE TABLE                             NO
    FLUNKY                         CREATE SESSION                           NO
    FLUNKY                         CREATE PROCEDURE                         NO
    3 rows selected.
    SQL> var results refcursor
    SQL>
    SQL> exec overlord.build_xt(:results)
    PL/SQL procedure successfully completed.
            ID COL1       COL2       COL3       COL4
             1 Eat        More       Bananas    Today
    1 row selected.

  • External Table loading?

    Hi:
    Suppose I have an external table with following two fields
    FIELD_A VARCHAR(1),
    FIELD_B VARCHAR(1)
    Suppose I have a file on which external table is based with following data:
    A1
    B1
    C1
    A2
    As you can see, in the file I have two rows with a FIELD_A value of A. Can I specify a rule for the external table to only accept the last row, in this case A2?
    Thanks,
    Thomas

    Not sure what your actual data looks like but you may be able to do something like the following when you select from the external table. You will need to be able to specify what qualifies as the 'last row';
    SQL> create table ext_t (
            c1 varchar2(20),
            c2 varchar2(20))
         organization external
         ( type oracle_loader
           default directory my_dir
           access parameters
           ( records delimited by newline
             fields terminated by ','
             missing field values are null
             ( c1,
               c2 ) )
           location('test.txt') );
    Table created.
    SQL> select * from ext_t
    C1                   C2                 
    A1                   some descrition1   
    B1                   some descrition2   
    C1                   some descrition3   
    A2                   some descrition4   
    4 rows selected.
    SQL> select sc1, c1, c2
    from (
       select c1, c2,substr(c1,1,1) sc1,
       row_number() over (partition by substr(c1,1,1) order by c1 desc) rn
       from ext_t)
    where rn = 1
    SC1  C1                   C2                 
    A    A2                   some descrition4   
    B    B1                   some descrition2   
    C    C1                   some descrition3   
    3 rows selected.

  • Error while accessing External table.

    Hi All,
    I am getting an error while accessing an Oracle external table. I created the table with the following query.
    CREATE OR REPLACE DIRECTORY load_dir AS '\\oraaps\Exceldata\'
    CREATE TABLE my_sheet
    DEPTNO NUMBER,
    DNAME VARCHAR2(14),
    LOC VARCHAR2(13)
    ORGANIZATION EXTERNAL
    TYPE oracle_loader
    DEFAULT DIRECTORY load_dir
    ACCESS PARAMETERS
    RECORDS DELIMITED BY NEWLINE
    badfile load_dir:'my_sheet.bad'
    logfile load_dir:'my_sheet.log'
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
    DEPTNO,
    DNAME,
    LOC
    LOCATION ('my_sheet.csv')
    )REJECT LIMIT UNLIMITED;
    I am sure that the table and the directory got created because I can see the table in SQL Developer. But whenever I run select * from my_sheet, I get the following error in the log file.
    LOG file opened at 10/16/06 14:48:21
    Field Definitions for table mysheet
    Record format DELIMITED BY NEWLINE
    Data in file has same endianness as the platform
    Rows with all null fields are accepted
    Fields in Data Source:
    DEPTNO NUMBER Terminated by ","
    Trim whitespace same as SQL Loader
    DNAME VARCHAR2(14),
    Terminated by ","
    Trim whitespace same as SQL Loader
    LOC VARCHAR2(13)
    Terminated by ","
    Trim whitespace same as SQL Loader
    KUP-04001: error opening file \\oraaps\Exceldata\mysheet.csv
    KUP-04017: OS message: The data is invalid.
    Please do reply. It's urgent from my project deliverable point of view.
    Any help appreciated.
    Thanks and Regards.
    V.Venkateswara Rao

    It is not an Oracle error/problem. The error message is quite specific in terms of the actual root cause of the problem:
    KUP-04001: error opening file \\oraaps\Exceldata\mysheet.csv
    KUP-04017: OS message: The data is invalid.
    These are operating system errors. The operating system cannot access/open/read the specific UNC and/or file.
    Fix it at o/s level and it will work when Oracle needs to make that o/s call.

  • External Table Error

    While Creating an external table I got following error
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-00554: error encountered while parsing access parameters
    KUP-01005: syntax error: found "missing": expecting one of: "badfile, byteordermark, characterset, column, data, delimited, discardfile, disable_directory_link_check, exit, fields, fixed, load, logfile, language, nodiscardfile, nobadfile, nologfile, date_cache, processing, readsize, string, skip, territory, variable"
    KUP-01007: at line 2 column 14
    ORA-06512: at
    My CREATE TABLE Syntax is as follow
    CREATE TABLE APPL_NOTE_EXT
    CUS_ID NUMBER(10),
    TEXT VARCHAR2(2000 ),
    PAGE_NUM NUMBER(3)
    ORGANIZATION EXTERNAL
    ( TYPE ORACLE_LOADER
    DEFAULT DIRECTORY EXT_TABLE
    ACCESS PARAMETERS
    RECORDS DELIMITED BY newline
    MISSING FIELD VALUES ARE NULL
    FIELDS
    CUS_ID POSITION (2:10) ,
    TEXT POSITION (39:1737),
    PAGE_NUM POSITION (13:14)
    LOCATION ('APPNOTES.TXT')
    REJECT LIMIT Unlimited
    NOPARALLEL
    NOMONITORING;

    This is the result I obtained after creating the table with your script:
    Table created.
    SQL> desc APPL_NOTE_EXT
    Name                                      Null?    Type
    CUS_ID                                             NUMBER(10)
    TEXT                                               VARCHAR2(2000)
    PAGE_NUM                                           NUMBER(3)
    The only issue I faced was the directory object; I had to create it. Maybe you don't have privileges on the specified ext_table directory. Please verify.
    ~ Madrid
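
    For reference, a sketch of creating the directory object and granting access to the querying schema (the path and grantee are placeholders):

    CREATE OR REPLACE DIRECTORY ext_table AS '/path/to/flat/files';
    GRANT READ, WRITE ON DIRECTORY ext_table TO app_schema;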

  • Os error in external tables in oracle 9i

    Hi, I am getting the following error when selecting from the external table.
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-04063: unable to open log file log.log
    OS error The system cannot find the file specified.
    ORA-06512: at "SYS.ORACLE_LOADER", line 14
    ORA-06512: at line 1
    my script for the external table is as follows
    CREATE OR REPLACE DIRECTORY EXT_data AS 'f:\kindle\daily\zippedfiles\';
    CREATE OR REPLACE DIRECTORY EXT_log AS 'd:\2.fin2004\1.sqlloader\3.logentries\thd0\log';
    CREATE OR REPLACE DIRECTORY EXT_bad AS 'd:\2.fin2004\1.sqlloader\3.logentries\thd0\bad';
    grant read, write on directory EXT_data to fincon1;
    grant read, write on directory EXT_log to fincon1;
    grant read, write on directory EXT_bad to fincon1;
    CREATE TABLE THD0_DATA1
    STATUS_CHARACTER               char(1),
    ACCOUNT_NUMBER                    VARchar(13),
    NODEF1                         VARchar(7),
    TRANSACTION_COUNTER               VARCHAR(5)
    ORGANIZATION EXTERNAL
    TYPE ORACLE_LOADER
    DEFAULT DIRECTORY ext_data
    ACCESS PARAMETERS
         RECORDS DELIMITED BY '@'
         badfile ext_bad:'bad.bad'
         logfile ext_log:'log.log'
    FIELDs
    STATUS_CHARACTER          ,
    ACCOUNT_NUMBER          ,
    NODEF1               ,
    TRANSACTION_COUNTER     
    LOCATION ('thd0data.dat')
    REJECT LIMIT UNLIMITED
    How do I rectify the above error?
    thanks and regards
    S. Djeanthi

    If you have Oracle Metalink support then refer to Note 150737.1. If you don't have support, I can paste it and send it to your email id.
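
    If the note is not at hand, one diagnostic sketch (reusing the object names from the post, and assuming ALTER TABLE ... ACCESS PARAMETERS is available on your version) is to point the log and bad files at the data directory, which is known to be readable, while you verify that the OS paths behind EXT_log and EXT_bad actually exist:

    ALTER TABLE THD0_DATA1
      ACCESS PARAMETERS
      ( RECORDS DELIMITED BY '@'
        badfile ext_data:'bad.bad'
        logfile ext_data:'log.log'
        FIELDS
        ( STATUS_CHARACTER,
          ACCOUNT_NUMBER,
          NODEF1,
          TRANSACTION_COUNTER ) );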

  • Oracle 9i External Tables Error

    Hello,
    I am getting the following error when querying an external table.
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-00554: error encountered while parsing access parameters
    KUP-01005: syntax error: found "identifier": expecting one of: "comma, char, date, defaultif, decimal, double, float, integer, (, nullif, oracle_date, oracle_number, position, raw, recnum, ), unsigned, varrawc, varchar, varraw, varcharc, zoned"
    KUP-01008: the bad identifier was: number
    KUP-01007: at line 8 column 23
    ORA-06512: at "SYS.ORACLE_LOADER", line 14
    OR
    Below is how External table is created.
    CREATE TABLE CXS_Datatraffic (
    tablespace_name varchar2(30),
    owner varchar2(30),
    index_name varchar2(30),
    clustering_factor number,
    leaf_blocks number,
    blevel number,
    next_extent number,
    extents number,
    isegment_type varchar2(18),
    index_bytes number,
    table_tspace_name varchar2(30),
    table_owner varchar2(30),
    table_name varchar2(30),
    table_extent number,
    table_extents number,
    tsegment_type varchar2(18),
    table_bytes NUMBER,
    Acurdate date
    ORGANIZATION EXTERNAL (
    TYPE ORACLE_LOADER
    DEFAULT DIRECTORY CXS_XMLPATH
    ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
    tablespace_name char(30),
    owner char(30),
    index_name char(30),
    clustering_factor number,
    leaf_blocks number,
    blevel number,
    next_extent number,
    extents number,
    isegment_type char(18),
    index_bytes number,
    table_tspace_name char(30),
    table_owner char(30),
    table_name char(30),
    table_extent number,
    table_extents number,
    tsegment_type char(18),
    table_bytes NUMBER,
    Acurdate date
    LOCATION ('DataTraffic.csv')
    PARALLEL 5
    REJECT LIMIT UNLIMITED;
    And below is the datafile content(just one record) I am trying to read.
    "APPLSYSX","APPLSYS","FND_ATTACHED_DOCUMENTS_U1",1100732,17459,2,,56,"INDEX",162660352,"APPLSYSD","APPLSYS",,"FND_ATTACHED_DOCUMENTS",,84,"TABLE",344981504,21-DEC-09
    I am not sure what I am doing wrong.
    Please help.
    Thank you for your time in reading this post.
    -R

    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    I think this is the problem. DELIMITED should be ',' and TERMINATED must be NEWLINE.
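
    For what it's worth, the KUP-01008 message above points at the word "number" in the access-parameter field list: ORACLE_LOADER field definitions take external datatypes such as CHAR (plus date masks), not the SQL type NUMBER. A simpler sketch is to drop the datatypes from the field list entirely and let the fields default to CHAR, so the external table's column definitions drive the conversion (the rest of the DDL stays as posted; the quoted and empty fields in the sample record also suggest OPTIONALLY ENCLOSED BY):

    ACCESS PARAMETERS (
      RECORDS DELIMITED BY NEWLINE
      FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
      MISSING FIELD VALUES ARE NULL
    )

    The Acurdate column may still need an explicit date mask in a field definition (or a matching NLS_DATE_FORMAT) for values like 21-DEC-09.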

  • External Tables (Oracle Loader)

    Hi, while creating the external table I am getting an error.
    ORA-12899: value too large for column PRC_REC_TYPE (actual: 2, maximum: 1)
    But when checking the CSV file, I found that prc_rec_type has a value of length 1.
    -- Create table
    create table ET_PGIT_POL_RISK_COVER
    prc_pol_no VARCHAR2(60),
    prc_end_no_idx VARCHAR2(22),
    prc_sec_code VARCHAR2(12),
    prc_risk_id VARCHAR2(12),
    prc_smi_code VARCHAR2(12),
    prc_code VARCHAR2(12),
    prc_desc VARCHAR2(2000),
    prc_rate VARCHAR2(22),
    prc_rate_per VARCHAR2(22),
    prc_cvr_type VARCHAR2(1),
    prc_add_si_yn VARCHAR2(1),
    prc_si_curr_code VARCHAR2(12),
    prc_prem_curr_code VARCHAR2(12),
    prc_si_fc VARCHAR2(22),
    prc_si_lc_1 VARCHAR2(22),
    prc_si_lc_2 VARCHAR2(22),
    prc_si_lc_3 VARCHAR2(22),
    prc_prem_fc VARCHAR2(22),
    prc_prem_lc_1 VARCHAR2(22),
    prc_prem_lc_2 VARCHAR2(22),
    prc_prem_lc_3 VARCHAR2(22),
    prc_eff_fm_dt VARCHAR2(30),
    prc_eff_to_dt VARCHAR2(30),
    prc_brok_comm_appl_yn VARCHAR2(1),
    prc_no_clm_bonus_appl_yn VARCHAR2(1),
    prc_prof_comm_appl_yn VARCHAR2(1),
    prc_cr_uid VARCHAR2(12),
    prc_rate_eft VARCHAR2(12),
    prc_terrorism_yn VARCHAR2(1),
    prc_lvl VARCHAR2(12),
    prc_tot_si_fc VARCHAR2(22),
    prc_first_loss_perc VARCHAR2(22),
    prc_flxi_risk_cover_desc VARCHAR2(2000),
    prc_rec_type VARCHAR2(1)
    organization external
    type ORACLE_LOADER
    default directory MMI_DATA_MIG
    access parameters
    RECORDS DELIMITED BY NEWLINE BADFILE 'ET_PGIT_POL_RISK_COVER.Bad' LOGFILE 'ET_PGIT_POL_RISK_COVER.Log' FIELDS TERMINATED BY ',' MISSING FIELD VALUES ARE NULL REJECT ROWS WITH ALL NULL FIELDS
    location (MMI_DATA_MIG:'COVER_1.csv')
    reject limit UNLIMITED;
    Not able to upload the file here.

    CHANDAN,0,1001,1,,1004,TPPropertyDamage,2.00000,100.00000,C,0,001,001,10000.000,10000.000,10000.000,10000.000,200.000,200.000,200.000,200.000,14-Feb-13 11:10:39,13-Feb-14 23:59:00,1,0,0,MMI,,0,F,0.000,,,N
    CHANDAN,0,1001,1,,1005,RoadSideAssistance,1.50000,100.00000,C,0,001,001,15000.000,15000.000,15000.000,15000.000,225.000,225.000,225.000,225.000,14-Feb-13 11:10:39,13-Feb-14 23:59:00,1,0,0,MMI,,0,F,0.000,,,N
    There are actually three rows in the CSV. The above two rows are not inserted into the external table, whereas the last row (below) is inserted.
    CHANDAN,0,1001,1,,1099,SASRIA,0.25000,1.00000,C,0,001,001,12500.000,12500.000,12500.000,12500.000,0.250,0.250,0.250,0.250,14-Feb-13 11:10:39,13-Feb-14 23:59:00,0,0,0,MMI,,0,F,0.000,,,N

  • External Table and Direct path load

    Hi,
    I was just playing with the Oracle SQL*Loader and external table features. A few things I observed are that data loading through the direct path method of SQL*Loader is much faster and takes much less hard disk space than the external table method. Here are the stats I found while loading data:
    For Direct Path: -
    # OF RECORDS.............TIME...................SOURCE FILE SIZE...................DATAFILE SIZE(.dbf)
    478849..........................00:00:43.53...................108,638 KB...................142,088 KB
    957697..........................00:01:08.81...................217,365 KB...................258,568 KB
    1915393..........................00:02:54.43...................434,729 KB...................509,448 KB
    For External Table: -
    # OF RECORDS..........TIME...................SOURCE FILE SIZE...................DATAFILE SIZE(.dbf)
    478849..........................00:02:51.03...................108,638 KB...................966,408 KB
    957697..........................00:08:05.32...................217,365 KB...................1,930,248 KB
    1915393..........................00:17:16.31...................434,729 KB...................3,860,488 KB
    1915393..........................00:23:17.05...................434,729 KB...................3,927,048 KB
    (With PARALLEL)
    I used the same files for testing and all other conditions were similar too. In my case the datafile is autoextendable, so its size grows automatically as required and free hard disk space is reduced accordingly. The question is: is this expected behaviour, and why does the external table method use so much more hard disk space than the direct path method? Performance of the external table load is also very poor compared to the direct path load.
    One more thing: when using the external table load with the PARALLEL option it should ideally take less time, but the time I actually get is more than without the PARALLEL option.
    In both cases I am loading data from the same file into the same table (once using direct path and once using the external table). Before every fresh load I truncate the internal table into which the data was loaded.
    any views??
    Deep
    Message was edited by:
    Deep

    Thanx to all for your suggestions.
    John, my scripts are as follows:
    for external table:
    CREATE TABLE LOG_TBL_LOAD
    (COL1 CHAR(20),     COL2 CHAR(2),     COL3 CHAR(20),     COL4 CHAR(400),
         COL5 CHAR(20),     COL6 CHAR(400),     COL7 CHAR(20),     COL8 CHAR(20),
         COL9 CHAR(400),     COL10 CHAR(400))
    ORGANIZATION EXTERNAL
    (TYPE ORACLE_LOADER
    DEFAULT DIRECTORY EXT_TAB_DIR
    ACCESS PARAMETERS
    (RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY WHITESPACE OPTIONALLY ENCLOSED BY '"' MISSING FIELD VALUES ARE NULL)
    LOCATION ('LOGZ3.DAT'))
    REJECT LIMIT 10;
    For loading I did:
    INSERT INTO LOG_TBL (COL1, COL2, COL3, COL4,COL5, COL6,
    COL7, COL8, COL9, COL10)
    (SELECT COL1, COL2, COL3, COL4, COL5, COL6, COL7, COL8,
    COL9, COL10 FROM LOG_TBL_load_1);
    for direct path my control file is like this:
    OPTIONS (
    DIRECT = TRUE
    LOAD DATA
    INFILE 'F:\DATAFILES\LOGZ3.DAT' "str '\n'"
    INTO TABLE LOG_TBL
    APPEND
    FIELDS TERMINATED BY WHITESPACE OPTIONALLY ENCLOSED BY '"'
    (COL1 CHAR(20),
    COL2 CHAR(2),
    COL3 CHAR(20),
    COL4 CHAR(400),
    COL5 CHAR(20),
    COL6 CHAR(400),
    COL7 CHAR(20),
    COL8 CHAR(20),
    COL9 CHAR(400),
    COL10 CHAR(400))
    And yes, I have used the same table in both situations. After each load I truncate my table, LOG_TBL. I used the same source file, LOGZ3.DAT.
    My tablespace USERS is locally managed.
    thanks
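
    For comparison with the SQL*Loader direct path run, the insert from the external table can also be made direct path with the APPEND hint (a sketch; the table and column names are the same as in the scripts above):

    INSERT /*+ APPEND */ INTO LOG_TBL
      (COL1, COL2, COL3, COL4, COL5, COL6, COL7, COL8, COL9, COL10)
    SELECT COL1, COL2, COL3, COL4, COL5, COL6, COL7, COL8, COL9, COL10
    FROM   LOG_TBL_LOAD;
    COMMIT;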

  • While loading through External Tables, Japanese characters wrong load

    Hi all,
    I am loading a text file through an external table. While loading, Japanese characters are loaded as junk characters. In the text file, the characters display correctly.
    My spool file
    SET ECHO OFF
    SET VERIFY OFF
    SET Heading OFF
    SET LINESIZE 600
    SET NEWPAGE NONE
    SET PAGESIZE 100
    SET feed off
    set trimspool on
    spool c:\SYS_LOC_LOGIC.txt
    select CAR_MODEL_CD||',' || MAKER_CODE||',' || CAR_MODEL_NAME_CD||',' || TYPE_SPECIFY_NO||',' ||
         CATEGORY_CLASS_NO||',' || SPECIFICATION||',' || DOOR_NUMBER||',' || RECOGNITION_TYPE||',' ||
         TO_CHAR(SALES_START,'YYYY-MM-DD') ||',' || TO_CHAR(SALES_END,'YYYY-MM-DD') ||',' || LOGIC||',' || LOGIC_DESCRIPTION
    from Table where rownum < 100;
    spool off
    My External table load script
    CREATE TABLE SYS_LOC_LOGIC
         CAR_MODEL_CD                         NUMBER               ,
         MAKER_CODE                              NUMBER,
         CAR_MODEL_NAME_CD                    NUMBER,
         TYPE_SPECIFY_NO                         NUMBER               ,
         CATEGORY_CLASS_NO                    NUMBER               ,
         SPECIFICATION                         VARCHAR2(300),
         DOOR_NUMBER                              NUMBER,
         RECOGNITION_TYPE                    VARCHAR2(30),
         SALES_START                          DATE ,
         SALES_END                               DATE ,
         LOGIC                                   NUMBER,
         LOGIC_DESCRIPTION                    VARCHAR2(100)
    ORGANIZATION EXTERNAL
    TYPE ORACLE_LOADER
    DEFAULT DIRECTORY XMLTEST1
    ACCESS PARAMETERS
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
                        CAR_MODEL_CD,MAKER_CODE,CAR_MODEL_NAME_CD,TYPE_SPECIFY_NO,
                        CATEGORY_CLASS_NO,SPECIFICATION,DOOR_NUMBER,RECOGNITION_TYPE,
                        SALES_START date 'yyyy-mm-dd', SALES_END      date 'yyyy-mm-dd',
                        LOGIC, LOGIC_DESCRIPTION     
    LOCATION ('SYS_LOC_LOGIC.txt')
    --location ('products.csv')
    REJECT LIMIT UNLIMITED;
    How do I solve this?
    Thanks in advance,
    Pal

    Just so I'm clear, user1 connects to the database server and runs the spool to generate a flat file from the database. User2 then uses that flat file to load that data back in to the same database? If the data isn't going anywhere, I assume there is a good reason to jump through all these unload and reload hoops rather than just moving the data from one table to another...
    What is the NLS_LANG set in the client's environment when the spool is generated? Note that the NLS_CHARACTERSET is a database setting, not a client setting.
    What character set is the text file? Are you certain that the text file is UTF-8 encoded? And not encoded using the operating system's local code page (assuming the operating system is capable of displaying Japanese text)
    There is a CHARACTERSET parameter for the external table definition, but that should default to the character set of the database.
    Justin
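
    To check the last point, the database character set (which is what the ORACLE_LOADER driver assumes when no CHARACTERSET access parameter is given) can be queried directly:

    SELECT value
    FROM   nls_database_parameters
    WHERE  parameter = 'NLS_CHARACTERSET';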

  • External tables on ACFS

    hi
    I use Windows 2008R2 Std. and Oracle 11gR2 RAC
    I have successfully mounted ACFS share
    ASMCMD> volinfo -a
    Diskgroup Name: SHARED
    Volume Name: SHARED_ACFS
    Volume Device: \\.\asm-shared_acfs-106
    State: ENABLED
    Size (MB): 8192
    Resize Unit (MB): 256
    Redundancy: UNPROT
    Stripe Columns: 4
    Stripe Width (K): 128
    Usage: ACFS
    Mountpath: C:\SHARED
    I have created a directory in Oracle mapped onto the ACFS share
    and granted read, write access on it to my user.
    Then I created the external table with success BUT...
    though I see metadata
    ADM@proton22> desc t111;
    Name Null? Type
    NAME VARCHAR2(4000)
    VALUE VARCHAR2(4000)
    I got error:
    ADM@proton22> select * from t111;
    select * from t111
    ERROR at line 1:
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-04027: file name check failed: C:\SHARED\EXTTAB\EXT_PARAM.log
    How to cope with this?
    I granted "full control" privileges to "everyone" user at OS level with no avail.
    Edited by: g777 on 2011-06-01 13:47
    SORRY, I MOVED TO RAC FORUM.

    See "Bug 14045247 : KUP-04027 ERROR WHEN QUERY DATA FROM EXTERNAL TABLE ON ACFS" in MOS.
    This is actually reported as not being a Bug:
    "An ACFS directory on the MS-Windows platform is implemented as a JUNCTION,and is therefore a symbolic link. Therefore, DISABLE_DIRECTORY_LINK_CHECK needs to be used, or a non-ACFS directory."
    i.e. when creating the External Table, the DISABLE_DIRECTORY_LINK_CHECK Clause must be used if using the ORACLE_LOADER Access Driver
    e.g. CREATE TABLE ...
    ... ORGANIZATION EXTERNAL
    (TYPE ORACLE_LOADER ... ACCESS PARAMETERS (RECORDS ... DISABLE_DIRECTORY_LINK_CHECK)
    For full syntax see: http://docs.oracle.com/cd/E11882_01/server.112/e22490/et_params.htm
    Also note Security Implications mentioned in above documentation:
    "Use of this parameter involves security risks because symbolic links can potentially be used to redirect the input/output of the external table load operation"
