Handling external table columns

1) I am creating an external table that reads a file with 5 columns, all delimited by a '|' symbol. One of the columns is a combined 'state' and 'pincode' value.
I want to split it within the external table structure itself, without creating a view on top of it. Is it possible to declare an external table structure like this?
2) Also, in the same table one column arrives with leading zeros. I want to use the TRIM function instead of declaring the datatype as NUMBER.
Can anyone tell me how to do these two things?
Thanks & Regards,
Bh.

Hi,
Please use a new name for the external table and create a view with the same name as the old external table.
That way nothing needs to change at the package level.
Regards
K.Rajkumar
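
A minimal sketch of that rename-and-view approach (all object, file, and column names below are assumptions, since the original DDL was not posted): keep the combined field as a single VARCHAR2 column in the renamed external table, and let the view do both the split and the leading-zero trim.

-- Hypothetical names throughout; the real file has 5 pipe-delimited columns.
CREATE TABLE customer_feed_ext        -- the "new name" for the external table
( cust_id        VARCHAR2(20),
  cust_name      VARCHAR2(100),
  state_pincode  VARCHAR2(50),        -- combined 'state' + 'pincode' field
  amount         VARCHAR2(20),        -- arrives with leading zeros
  feed_date      VARCHAR2(20)
)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY feed_dir
  ACCESS PARAMETERS
  ( RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY '|'
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('customer_feed.txt')
)
REJECT LIMIT UNLIMITED;

-- View with the old external table's name, so existing packages need no change.
CREATE OR REPLACE VIEW customer_feed AS
SELECT cust_id,
       cust_name,
       SUBSTR(state_pincode, 1, 2) AS state,    -- assumes a 2-character state code
       SUBSTR(state_pincode, 3)    AS pincode,
       LTRIM(amount, '0')          AS amount,   -- strip the leading zeros
       feed_date
FROM   customer_feed_ext;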

Similar Messages

  • How does impdp handles external tables

    I have just finished a schema import and one of the packages is invalid because it references an external table. I am getting the error "Table or View does not exist".
    How does impdp handle external tables?
    Do we need to replicate the directories and files from the source to the import destination?
    Thanks

    Hi,
    Yes...
    The external table's directory is not available on the destination operating system.
    You need to recreate the directories and copy the files from the source to the import destination.
    Then recompile the invalid package.
    Thanks
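    A hedged sketch of those steps on the destination database (the directory name, path, schema, and package name are all assumptions):
    -- Recreate the directory object the external table expects and point it
    -- at a local path holding a copy of the flat files from the source.
    CREATE OR REPLACE DIRECTORY ext_data_dir AS '/u01/app/oracle/ext_data';
    GRANT READ, WRITE ON DIRECTORY ext_data_dir TO app_schema;
    -- After copying the files into that path, recompile the dependent package.
    ALTER PACKAGE app_schema.load_pkg COMPILE;
    ALTER PACKAGE app_schema.load_pkg COMPILE BODY;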

  • Carriage Return in External table column

    I'm using Oracle 9.2.0.6 on AIX.
    I've created an external table pointing to a csv file.
    When I do a select on the last column of the external table, it contains a carriage return / linefeed. If I make the column specification smaller, it throws an error:
    ORA-01401: inserted value too large for column
    How can I eliminate this unwanted character?
    line from csv file
    2,,,,,,,,,,,,,,BB
    table spec
    create table x_needsprop
    ( external_prop_ref varchar2(20),
      floor_level       varchar2(20),
      dcc               varchar2(4),
      lc                varchar2(4),
      ptd               varchar2(4),
      wc                varchar2(4),
      ec                varchar2(4),
      rspr              varchar2(4),
      err               varchar2(4),
      ctb               varchar2(4),
      bd                varchar2(4),
      nda               varchar2(4),
      ndd               varchar2(4),
      vsam              varchar2(4),
      res_type          varchar2(5)
    )
    ORGANIZATION EXTERNAL
    ( type oracle_loader
      default directory external_dir
      access parameters
      ( fields terminated by ',' )
      location ('needsprop.csv')
    )
    /

    According to the documentation, adding the trim specifier 'RTRIM' directs that "spaces should be trimmed from the end of a text field. Spaces include blanks and other nonprinting characters such as tabs, line feeds, and carriage returns."
    create table x_needsprop
    ( external_prop_ref varchar2(20),
      floor_level       varchar2(20),
      dcc               varchar2(4),
      lc                varchar2(4),
      ptd               varchar2(4),
      wc                varchar2(4),
      ec                varchar2(4),
      rspr              varchar2(4),
      err               varchar2(4),
      ctb               varchar2(4),
      bd                varchar2(4),
      nda               varchar2(4),
      ndd               varchar2(4),
      vsam              varchar2(4),
      res_type          varchar2(5)
    )
    ORGANIZATION EXTERNAL
    ( type oracle_loader
      default directory external_dir
      access parameters
      ( fields terminated by ',' RTRIM )
      location ('needsprop.csv')
    )

  • External Table Authentication in OBIEE 11g

    Hi ,
    I have a security table which contains userid, displayname and group. I have imported the security table into the Physical Layer and I'm creating session variables based on a condition.
    When I try to log in to Analytics I get an error: invalid username and password. I'm using version 11.1.1.6.0.
    How is external table authentication handled in OBIEE 11g?
    Regards,
    Malli

    Hi fiaz,
    That link talks about the 10g version.
    Step 1: We imported the security table into the Physical Layer.
    Step 2: Created a session variable by selecting an initialization block:
    Select user_name, group from security_table where user_id=':USER' and pwd=':password';
    Step 3: Created the DISPLAYNAME, GROUP and USER variables in the Edit Target window.
    After these modifications I tried to log in with a new user that exists in the security table.
    I am still getting an invalid user or password error.
    Are there any other changes required here?
    Regards,
    Malli

  • UPPER function in external table definition ... ?

    Greetings
    Is there a way to specify a SQL function, like UPPER, in an external table column definition?
    I know I can do it with SQL*Loader in the control file, but I haven't found a way to do it in OWB.
    I realize there are ways outside of the external table using a view, etc. but I'd like to reference the external table directly without having to create yet another object.
    Thanks very much!!
    -gary

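    For reference, a minimal sketch of the SQL*Loader control-file approach mentioned above (the data file, table, and column names are made up for illustration):
    -- Hypothetical control file: SQL*Loader applies UPPER() at load time,
    -- something an ORACLE_LOADER external table field list cannot do directly.
    LOAD DATA
    INFILE 'names.dat'
    APPEND
    INTO TABLE stg_names
    FIELDS TERMINATED BY ','
    ( last_name  CHAR "UPPER(:last_name)",
      first_name CHAR "UPPER(:first_name)"
    )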

  • Nested Table columns and ADF BC 11.1.2

    I'm thinking ahead to a new application design, including a new database design. In this application, there are users who may not change the production tables directly, but their changes must be approved (and possibly modified) before they are applied to the production tables. The production tables are part of an existing system, and are fairly well normalized, with a master table and several detail tables.
    So for the new design, I want to have a staging table, mirroring the master table, where user changes are stored until they are approved and applied to the production tables. The staging table contains a few extra columns for "add, change or delete", user who submitted the change, date the change was requested. After applying the change, the staging record is to be copied to a change history table, and deleted from staging. That way, the staging table never has a lot of data in it.
    Here's the question:
    I need to deal with the detail tables. I could have staging versions of each detail table, but I was thinking that it might be easier to handle if the detail tables were included as nested table columns of the master staging table. Most of the detail tables only contain a few rows per master row. But can ADF BC 11.1.2 handle nested table columns? Is it easy to use in an application?

    Thanks for the quick response. I was thinking I might get a response from someone who had tried it, so I didn't waste time trying it myself. I consider your response pretty authoritative, and I'll take that as a sign that I should forget the nested tables design.
    No, polymorphic views probably won't do the job. I think I'll just go ahead and create staging versions of each detail table with foreign keys back to the master staging table. Then I'll let the wizard create the needed association objects and view links in ADF BC. It will complicate the procedure for applying changes, particularly adding a row to the change history table. But that's the price we'll need to pay.
    OTOH - I just had a thought - since the change history table is mostly for auditing, it appears in some reports, but has no CRUD in the application, other than the insert when the changes are applied. If I did that insert with a trigger or some other PL/SQL, then the change history could still be a single table with nested table columns for the details.
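    If that trigger-based route is taken, the history table might look roughly like this (a sketch only, with invented type, table, and column names, just to illustrate a nested table column):
    -- Object and collection types for one kind of detail row (names invented).
    CREATE TYPE change_detail_t AS OBJECT
    ( detail_id   NUMBER,
      detail_text VARCHAR2(200)
    );
    /
    CREATE TYPE change_detail_tab_t AS TABLE OF change_detail_t;
    /
    -- History table storing the detail rows as a nested table column.
    CREATE TABLE change_history
    ( change_id    NUMBER PRIMARY KEY,
      change_type  VARCHAR2(10),       -- add / change / delete
      requested_by VARCHAR2(30),
      requested_on DATE,
      details      change_detail_tab_t
    )
    NESTED TABLE details STORE AS change_history_details;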

  • External Table, Handling Delimited and Special Character in file

    Hi ,
    I have created an external table with these options:
    ( TYPE ORACLE_LOADER
    DEFAULT DIRECTORY ***************************************
    ACCESS PARAMETERS
    ( RECORDS DELIMITED BY NEWLINE
    SKIP 0
    FIELDS TERMINATED BY '|'
    OPTIONALLY ENCLOSED BY '"'
    MISSING FIELD VALUES ARE NULL                                          
    LOCATION
    ( 'test_feed.csv'
    Now, these records come through as valid:
    anupam|anupam2
    anupam"test|anupam"test2
    "anupam|test3"|test3
    anupam""""test5|test5
    anupam"|test7
    but these do not come through as valid:
    "anupam"test4"|test4    --> case where the field is enclosed in quotes but also contains quotes. I guess in this case we could send the field without the closing double quote.
    "anupam|test6   --> when a field starts with a double quote, it fails.
    "anupam"test8|test8"|test8 --> when a field contains both the pipe (|) and double quotes, we send it enclosed in double quotes, but that fails the job.
    Can you suggest the best way to handle such scenarios? (One restriction though: the file is also used by another system, Netezza, which can't take a delimiter longer than one character :'( )

    One approach is to define the external table as a ONE-column table (a single field covering the whole line). This way each line comes in as a row in the external table. Of course you then have to build "parsing logic" on top of that.
    DROP TABLE xtern_table;
    CREATE TABLE xtern_table
      ( c1 VARCHAR2(4000) )
      organization external
      ( type ORACLE_LOADER
        DEFAULT directory xtern_data_dir
        ACCESS PARAMETERS (
            RECORDS DELIMITED BY NEWLINE
            FIELDS TERMINATED BY '~'   -- use a field terminator that is not found in the file
            MISSING FIELD VALUES ARE NULL
            ( c1 CHAR(4000) )
        )
        location ('mycsv.csv')
      );
    > desc xtern_table
    desc xtern_table
    Name Null Type          
    C1        VARCHAR2(4000)
    > column c1 format A40
    > select * from xtern_table
    C1                                    
    anupam|anupam2                          
    anupam"test|anupam"test2                
    "anupam|test3"|test3                    
    anupam""""test5|test5                   
    anupam"|test7                           
    "anupam"test4"|test4                    
    "anupam|test6                           
    "anupam"test8|test8"|test8              
    8 rows selected
    Ideally, it would be good to have an incoming source file with a predictable format.
    Hope this helps.
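    As a rough illustration of that parsing layer (assuming, purely for the sketch, two pipe-separated fields per line; the real rules for enclosing quotes and embedded pipes would need more logic):
    -- Split each raw line on its first '|'; the view and column names are
    -- invented for the example.
    CREATE OR REPLACE VIEW xtern_parsed AS
    SELECT SUBSTR(c1, 1, INSTR(c1, '|') - 1) AS field1,
           SUBSTR(c1, INSTR(c1, '|') + 1)    AS field2
    FROM   xtern_table;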

  • Problem while handling files using External tables

    Hi,
    I created an external table to handle files.
    It works fine.
    But it rejects records where the final column in a record contains no (null) value.
    The data is as follows.
    empno|name|positon
    184|abc|supervisor
    185|xyz|
    186|efg|clerk
    I have given '\n' as the row delimiter and '|' as the field delimiter.
    For the above data I get only the 1st and 3rd records.
    The external table does not accept the 2nd record.
    The log file reports the error "KUP-04023: field start is after end of record".
    Could anyone please give me some idea how to solve this problem?
    Thank you,
    Regards,
    Gowtham Sen.

    Hi Michaels,
    I have one more doubt.
    After I added the clause "MISSING FIELD VALUES ARE NULL", it works fine.
    Case 1:
    The data is as follows.
    empno|name|positon
    184|abc|supervisor
    185|xyz|
    186|efg|clerk
    Now when I query the external table I get the data as follows:
    empno name position
    184 abc supervisor
    185 xyz <null>
    186 efg clerk
    Case 2:
    But there is a case where the data may come in the following way.
    empno|name|positon
    184|abc|supervisor
    185|xyz|
    186|efg|clerk
    18
    187
    For this data, querying the external table gives the following.
    empno name position
    184 abc supervisor
    185 xyz <null>
    186 efg clerk
    18 <null> <null>
    187 <null> <null>
    Here I would like to reject the last two records because they violate the format, but it's not doing what I expected.
    Do I need to add any other clauses?
    Thanks in advance,
    Thank you,
    Regards,
    Gowtham Sen
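    One thing that may be worth testing (it is not suggested in the thread) is a LOAD WHEN condition on a field that must always be present, kept alongside MISSING FIELD VALUES ARE NULL. A sketch using the column names from the sample data:
    -- Sketch only: trailing empty fields still load as NULL, while records
    -- whose name field is absent are discarded. Whether a field that is
    -- missing entirely satisfies the BLANKS test should be verified in the log.
    ACCESS PARAMETERS
    ( RECORDS DELIMITED BY NEWLINE
      LOAD WHEN (name != BLANKS)
      FIELDS TERMINATED BY '|'
      MISSING FIELD VALUES ARE NULL
    )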

  • A special character in the last column in my external table

    Hello,
    I've made an external table that looks like this:
    CREATE TABLE "REC_XLS"
    ( "FINANCE_DATE"       VARCHAR2(50 BYTE),
      "CREATION_DATE"      VARCHAR2(50 BYTE),
      "SENT_DATE"          VARCHAR2(50 BYTE),
      "REMARKS"            VARCHAR2(200 BYTE),
      "REMAINING_QUANTITY" VARCHAR2(30 BYTE),
      "CODE"               VARCHAR2(30 BYTE)
    )
    ORGANIZATION EXTERNAL
    ( TYPE ORACLE_LOADER
      DEFAULT DIRECTORY "LOADER"
      ACCESS PARAMETERS
      ( RECORDS DELIMITED BY newline
        LOAD WHEN CODE != BLANKS
        SKIP 1
        FIELDS TERMINATED BY 0X'09'
        MISSING FIELD VALUES ARE NULL
        REJECT ROWS WITH ALL NULL FIELDS
        ( Finance_date, creation_date, sent_date, Remarks, remaining_quantity, code )
      )
      LOCATION ( 'Finance.tsv' )
    )
    I exported the result to an xls file and I saw a special character in the last column. The result of a copy and paste is presented here: "ABE
    You should notice that the last quote is on the upper line instead of at the end of ABE. This means, I think, that there is a carriage return at the end. (The 0X'09' is the tab separator.) How can I get rid of this character?

    My guess is that you run the load on a unix server, which uses the default LF end-of-line character, while the file has DOS CRLF line endings. The easiest option is to change the record delimiter in the DDL and replace "RECORDS DELIMITED BY newline" with:
    RECORDS DELIMITED BY '\r\n'
    In general the end-of-line can be any character: the sample above shows CRLF, but it can be TAB, space, | or any other character or set of characters. I have samples of special characters on my blog http://jiri.wordpress.com/2009/01/29/oracle-external-tables-by-examples-part-1/
    hope this helps
    jiri
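    In the posted DDL only the record clause would change, roughly:
    -- Same access parameters as before; only the record delimiter differs.
    ACCESS PARAMETERS
    ( RECORDS DELIMITED BY '\r\n'
      LOAD WHEN CODE != BLANKS
      SKIP 1
      FIELDS TERMINATED BY 0X'09'
      MISSING FIELD VALUES ARE NULL
      REJECT ROWS WITH ALL NULL FIELDS
      ( Finance_date, creation_date, sent_date, Remarks, remaining_quantity, code )
    )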

  • How to insert filename as column in external table.

    I have an external table that is made up of a number of different files. I would like to add a new column to the table to store the actual external filename. How can this be done?
    Thanks in advance.

    That's going to be a problem, then. The ISO 8859-1 character set doesn't support that character, so you cannot store a ™ in a CHAR or VARCHAR2 column in your database.
    Potentially, you could use an NCHAR or NVARCHAR2 column, depending on your national character set. Using the national character set, though, can cause other problems, particularly for front-end applications that may not fully support those data types.
    You could also potentially change your existing character set to something that does support that character, but that may be a rather involved sort of change.
    Justin
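    A minimal sketch of the NVARCHAR2 alternative mentioned above (the table and column names are invented); whether the ™ survives depends on the national character set:
    -- Store the trademark-bearing text in a national character set column.
    CREATE TABLE product_labels
    ( product_id NUMBER,
      label      NVARCHAR2(100)
    );
    INSERT INTO product_labels VALUES (1, N'Acme™ Widget');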

  • External Table which can handle appending multiple csv files dynamic

    I need an external table which can handle appending the values of multiple csv files.
    But the problem I am having is that the number of csv files is not fixed.
    I can have anywhere from 2 to 6-7 files suffixed with the current date, say my_file1_aug_08_1.csv, my_file1_aug_08_2.csv, my_file1_aug_08_3.csv and so on.
    I can do it as follows by hardcoding the names if I know the number of files, but unfortunately the number is not fixed and I need to do something dynamic, injecting a wildcard search of the file pattern.
    CREATE TABLE my_et_tbl
    (  my_field1 varchar2(4000),
       my_field2 varchar2(4000)
    )
    ORGANIZATION EXTERNAL
      (  TYPE ORACLE_LOADER
         DEFAULT DIRECTORY my_et_dir
         ACCESS PARAMETERS
           ( RECORDS DELIMITED BY NEWLINE
             FIELDS TERMINATED BY ','
             MISSING FIELD VALUES ARE NULL  )
         LOCATION (UTL_DIR:'my_file2_5_aug_08.csv','my_file2_5_aug_08.csv')
      )
    REJECT LIMIT UNLIMITED
    NOPARALLEL
    NOMONITORING;
    Please advise me with your ideas. Thanks.
    Joshua..

    Well, you could do it dynamically by constructing the location value:
    SQL> CREATE TABLE emp_load
      2      (
      3       employee_number      CHAR(5),
      4       employee_dob         CHAR(20),
      5       employee_last_name   CHAR(20),
      6       employee_first_name  CHAR(15),
      7       employee_middle_name CHAR(15),
      8       employee_hire_date   DATE
      9      )
    10    ORGANIZATION EXTERNAL
    11      (
    12       TYPE ORACLE_LOADER
    13       DEFAULT DIRECTORY tmp
    14       ACCESS PARAMETERS
    15         (
    16          RECORDS DELIMITED BY NEWLINE
    17          FIELDS (
    18                  employee_number      CHAR(2),
    19                  employee_dob         CHAR(20),
    20                  employee_last_name   CHAR(18),
    21                  employee_first_name  CHAR(11),
    22                  employee_middle_name CHAR(11),
    23                  employee_hire_date   CHAR(10) date_format DATE mask "mm/dd/yyyy"
    24                 )
    25         )
    26       LOCATION ('info*.dat')
    27      )
    28  /
    Table created.
    SQL> select * from emp_load;
    select * from emp_load
    ERROR at line 1:
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    SQL> set serveroutput on
    SQL> declare
      2      v_exists      boolean;
      3      v_file_length number;
      4      v_blocksize   number;
      5      v_stmt        varchar2(1000) := 'alter table emp_load location(';
      6      i             number := 1;
      7  begin
      8      loop
      9        utl_file.fgetattr(
    10                          'TMP',
    11                          'info' || i || '.dat',
    12                          v_exists,
    13                          v_file_length,
    14                          v_blocksize
    15                         );
    16        exit when not v_exists;
    17        v_stmt := v_stmt || '''info' || i || '.dat'',';
    18        i := i + 1;
    19      end loop;
    20      v_stmt := rtrim(v_stmt,',') || ')';
    21      dbms_output.put_line(v_stmt);
    22      execute immediate v_stmt;
    23  end;
    24  /
    alter table emp_load location('info1.dat','info2.dat')
    PL/SQL procedure successfully completed.
    SQL> select * from emp_load;
    EMPLO EMPLOYEE_DOB         EMPLOYEE_LAST_NAME   EMPLOYEE_FIRST_ EMPLOYEE_MIDDLE
    EMPLOYEE_
    56    november, 15, 1980   baker                mary            alice     0
    01-SEP-04
    87    december, 20, 1970   roper                lisa            marie     0
    01-JAN-99
    SQL>
    SY.
    P.S. Keep in mind that changing location will affect all sessions referencing external table.

  • SQL Loader: handling difference datatypes in data file and table column

    Hi,
    I am not sure if my question is valid but I am having this doubt.
    I am trying to load data from my data file into a table with just a single column of FLOAT datatype using SQL*Loader. But very few insertions take place, leaving a large number of records rejected for the same reason:
    Record 7: Rejected - Error on table T1, column MSISDN.
    ORA-01722: invalid number
    The data in my datafile goes like this: (with a single space before every field)
       233207332711<EOFD><EORD>    233208660745<EOFD><EORD>    233200767380<EOFD><EORD>
    Here I want to know if there is any way to type cast the data read from the data file to suit my table column's datatype.
    How do I handle this? I want to load all the data from my datafile into my table.

    Please continue the discussion in your original post - Pls help: SQL Loader loads only one record
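    Independent of that other thread, this kind of record is sometimes handled in a SQL*Loader control file by naming the marker strings explicitly and letting a SQL expression do the conversion. A rough sketch (the data file name is an assumption; T1 and MSISDN come from the error message above):
    -- Hypothetical control file: <EORD> is treated as the record terminator,
    -- <EOFD> as the field terminator, and TO_NUMBER(TRIM(...)) does the cast.
    LOAD DATA
    INFILE 'msisdn.dat' "STR '<EORD>'"
    APPEND
    INTO TABLE t1
    ( msisdn CHAR(30) TERMINATED BY '<EOFD>' "TO_NUMBER(TRIM(:msisdn))" )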

  • ORA-01843: not a valid month, external table select fails on date column

    Hi,
    I created an external table as follows:
    CREATE OR REPLACE DIRECTORY sales_feeds AS '/backup/oracle/feeds';
    Directory created.
    CREATE TABLE salesfeed_external_table
    (         PROD_ID               NUMBER
            , CUST_ID               NUMBER
            , TIME_ID               DATE
            , CHANNEL_ID            NUMBER
            , PROMO_ID              NUMBER
            , QUANTITY_SOLD         NUMBER(10,2)
            , AMOUNT_SOLD           NUMBER(10,2)
    )
    ORGANIZATION EXTERNAL
    (       TYPE ORACLE_LOADER
            DEFAULT DIRECTORY sales_feeds
            ACCESS PARAMETERS
            (       RECORDS DELIMITED BY newline
                    FIELDS TERMINATED BY ','
            )
            LOCATION ('salesfeed.dat')
    )
    REJECT LIMIT 0
    Table created.
    I then create the external flat file salesfeed.dat as follows:
    set echo off
    set feedback off
    set linesize 100
    set pagesize 0
    set head off
    set sqlprompt ''
    set trimspool on
    spool on
    spool /backup/oracle/feeds/salesfeed.dat
    select PROD_ID||','||CUST_ID||','||TIME_ID||','||CHANNEL_ID||','||PROMO_ID||','||QUANTITY_SOLD||','||AMOUNT_SOLD
    FROM salesfeed where rownum = 1;
    spool off
    exit
    cat salesfeed.dat
    18,3131,21/03/1998 00:00:00,3,999,1,2600.76
    Note: only one row at the moment (diagnosing).
    Try to read it
    select * from salesfeed_external_table
    ERROR at line 1:
    ORA-29913: error in executing ODCIEXTTABLEFETCH callout
    ORA-30653: reject limit reached
    The log shows:
    cat SALESFEED_EXTERNAL_TABLE_1302.log
    LOG file opened at 11/24/11 12:16:02
    Field Definitions for table SALESFEED_EXTERNAL_TABLE
      Record format DELIMITED BY NEWLINE
      Data in file has same endianness as the platform
      Rows with all null fields are accepted
      Fields in Data Source:
        PROD_ID                         CHAR (255)
          Terminated by ","
          Trim whitespace same as SQL Loader
        CUST_ID                         CHAR (255)
          Terminated by ","
          Trim whitespace same as SQL Loader
        TIME_ID                         CHAR (255)
          Terminated by ","
          Trim whitespace same as SQL Loader
        CHANNEL_ID                      CHAR (255)
          Terminated by ","
          Trim whitespace same as SQL Loader
        PROMO_ID                        CHAR (255)
          Terminated by ","
          Trim whitespace same as SQL Loader
        QUANTITY_SOLD                   CHAR (255)
          Terminated by ","
          Trim whitespace same as SQL Loader
        AMOUNT_SOLD                     CHAR (255)
          Terminated by ","
          Trim whitespace same as SQL Loader
    error processing column TIME_ID in row 1 for datafile /backup/oracle/feeds/salesfeed.dat
    ORA-01843: not a valid month
    I checked my NLS date format etc:
    select * from nls_session_parameters;
    PARAMETER                      VALUE
    NLS_LANGUAGE                   ENGLISH
    NLS_TERRITORY                  UNITED KINGDOM
    NLS_CURRENCY                   £
    NLS_ISO_CURRENCY               UNITED KINGDOM
    NLS_NUMERIC_CHARACTERS         .,
    NLS_CALENDAR                   GREGORIAN
    NLS_DATE_FORMAT                DD/MM/YYYY HH24:MI:SS
    NLS_DATE_LANGUAGE              ENGLISH
    NLS_SORT                       BINARY
    NLS_TIME_FORMAT                HH24.MI.SSXFF
    NLS_TIMESTAMP_FORMAT           DD-MON-RR HH24.MI.SSXFF
    NLS_TIME_TZ_FORMAT             HH24.MI.SSXFF TZR
    NLS_TIMESTAMP_TZ_FORMAT        DD-MON-RR HH24.MI.SSXFF TZR
    NLS_DUAL_CURRENCY              ¤
    NLS_COMP                       BINARY
    NLS_LENGTH_SEMANTICS           BYTE
    NLS_NCHAR_CONV_EXCP            FALSE
    Any ideas?
    Thanks,
    Mich

    Thanks
    The following now works:
    CREATE TABLE salesfeed_external_table
    (         PROD_ID               NUMBER
            , CUST_ID               NUMBER
            , TIME_ID               DATE
            , CHANNEL_ID            NUMBER
            , PROMO_ID              NUMBER
            , QUANTITY_SOLD         NUMBER(10,2)
            , AMOUNT_SOLD           NUMBER(10,2)
    )
    ORGANIZATION EXTERNAL
    (       TYPE ORACLE_LOADER
            DEFAULT DIRECTORY sales_feeds
            ACCESS PARAMETERS
            (       RECORDS DELIMITED BY newline
                    FIELDS TERMINATED BY ','
                    MISSING FIELD VALUES ARE NULL
                    (         PROD_ID
                            , CUST_ID
                            , TIME_ID DATE  'dd/mm/yyyy HH24:MI:SS'
                            , CHANNEL_ID
                            , PROMO_ID
                            , QUANTITY_SOLD
                            , AMOUNT_SOLD
                    )
            )
            LOCATION ('salesfeed.dat')
    )
    REJECT LIMIT 0
    select * from salesfeed_external_table;
       PROD_ID    CUST_ID TIME_ID             CHANNEL_ID   PROMO_ID QUANTITY_SOLD AMOUNT_SOLD
            18       3131 21/03/1998 00:00:00          3        999             1     2600.76
    Regards,
    Mich

  • How to reject external table rows with some blank columns

    I have an external table and I would like to reject rows when a number of fields are empty. Here are the details.
    CREATE TABLE EXTTAB (
      ID NUMBER(10),
      TSTAMP DATE,
      C1 NUMBER(5,0),
      C2 DATE,
      C3 FLOAT(126)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY EXT_DAT_DIR
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        LOAD WHEN (NOT (c1 = BLANKS AND c2 = BLANKS AND c3 = BLANKS))
        LOGFILE EXT_LOG_DIR:'exttab.log'
        BADFILE EXT_BAD_DIR:'exttab.bad'
        DISCARDFILE EXT_BAD_DIR:'exttab.dsc'
        FIELDS TERMINATED BY "|"
        LRTRIM
        MISSING FIELD VALUES ARE NULL
        REJECT ROWS WITH ALL NULL FIELDS (
          ID,
          TSTAMP DATE 'YYYYMMDDHH24MISS',
          C1,
          C2 DATE 'YYYYMMDDHH24MISS',
          C3
        )
      )
      LOCATION ('dummy.dat')
    )
    REJECT LIMIT UNLIMITED
    So, as you can see from the LOAD WHEN clause, I'd like to reject rows when C1, C2 and C3 are empty.
    The above statement works fine and creates the table. However, when I try to load data using it, the following error is produced:
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-00554: error encountered while parsing access parameters
    KUP-01005: syntax error: found "not": expecting one of: "double-quoted-string, identifier, (, number, single-quoted-string"
    KUP-01007: at line 1 column 41
    ORA-06512: at "SYS.ORACLE_LOADER", line 14
    ORA-06512: at line 1
    It seems that the external table driver does not understand the "NOT (...)" condition. Could anyone suggest how I can achieve what I want in a different way?
    Thank you.
    Denis

    Another method would be to simply remove the LOAD WHEN condition and create a view on the external table which filters the data.
    CREATE VIEW EXTTAB_VIEW AS
    SELECT * FROM EXTTAB
    WHERE not (c1 is null and c2 is null and c3 is null);
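    Another possibility, not from the thread and worth verifying against the access-parameter grammar (AND and OR appear in the load_when clause, NOT does not), is to push the NOT inward with De Morgan's law:
    -- Equivalent condition without NOT: load the row when at least one of
    -- c1, c2, c3 is non-blank.
    LOAD WHEN (c1 != BLANKS OR c2 != BLANKS OR c3 != BLANKS)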

  • External Table error: KUP-04043: table column not found in external source

    I am trying to get the syntax correct for an external table.
    I keep getting this error:
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-04043: table column not found in external source: SITE
    29913. 00000 - "error in executing %s callout"
    *Cause:    The execution of the specified callout caused an error.
    *Action:   Examine the error messages take appropriate action.
    The data looks like this (a sample from one of many files, where the character field widths are variable):
    ZZ,ANYOLDDATA,77777,25002000,201103,12,555.555,11.222
    ZZ,ANYOLDDATA,77777,25002300,201103,34,602.162,8.777
    ZZ,ANYOLDDATA,77777,25002400,201103,12,319.127,9.666
    ZZ,OTHERDATA,77121,55069600,201103,34,25.544,1.332
    ZZ,OTHERDATAS,77122,55069600,201103,22, 1.011,0.293
    External table def I have:
    CREATE TABLE MY_INPUT (
      FIRST_CODE VARCHAR2(10),
      SECOND_CODE VARCHAR2(20),
      MY_NUMBER VARCHAR2(20),
      THIRD_CODE VARCHAR2(20),
      YEARMO VARCHAR2(6),
      N NUMBER,
      MEAN NUMBER,
      SD NUMBER
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY INPUT_DIR
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY newline
        BADFILE INPUT_LOGDIR:'bad.bad'
        LOGFILE INPUT_LOGDIR:'log.log'
        DISCARDFILE INPUT_LOGDIR:'discards.log'
        fields terminated by ',' LRTRIM
        MISSING FIELD VALUES ARE NULL
        REJECT ROWS WITH ALL NULL FIELDS
        ( THIRD_CODE, N, MEAN, SD )
      )
      LOCATION ( 'myfile.rpt' )
    )
    NOPARALLEL
    REJECT LIMIT UNLIMITED;
    I have the directories INPUT_DIR and INPUT_LOGDIR defined, with read/write access granted to the user who creates the table and tries to query it.
    I have tried various combinations of VARCHAR2 lengths and NUMBER vs VARCHAR2 for some of the numeric fields.
    I am not getting any Bad, Log or Discard files.
    I can do a GET from the SQL prompt, and see the data:
    SQL> GET 'C:\temp\input_dir\myfile.rpt'
    and I see the data.
    Windows 7
    Oracle 11.2
    I am not positive about the newline record delimiter - these files are generated by an automated system, probably on a UNIX machine.
    Any suggestions on what to try would be helpful.
    The KUP-04043 error message says to check the syntax... I am running out of things to check.
    Thank you - Karen

    And the GET (I created the sanitized file, so we have a real, reproducibly failing, sanitized example):
    SQL> get c:\Inputfiles\myfile.rpt
    1 ZZ,ANYOLDDATA,77777,25002000,201103,12,555.555,11.222
    2 ZZ,ANYOLDDATA,77777,25002300,201103,34,602.162,8.777
    3 ZZ,ANYOLDDATA,77777,25002400,201103,12,319.127,9.666
    4 ZZ,OTHERDATA,77121,55069600,201103,34,25.544,1.332
    5* ZZ,OTHERDATAS,77122,55069600,201103,22, 1.011,0.293
    So the full series is:
    CREATE DIRECTORY INPUT_DIR AS 'C:\InputFiles';
    -- grant READ and WRITE
    GRANT READ ON DIRECTORY INPUT_DIR TO ILQC;
    GRANT WRITE ON DIRECTORY INPUT_DIR TO ILQC;
    -- As SYS, create the bad/log/discard directory:
    CREATE DIRECTORY LOGDIR AS 'C:\InputFiles\Logs';
    -- grant READ and WRITE
    GRANT READ ON DIRECTORY LOGDIR TO ILQC;
    GRANT WRITE ON DIRECTORY LOGDIR TO ILQC;
    CREATE TABLE MY_INPUT (
      FIRST_CODE VARCHAR2(10),
      SECOND_CODE VARCHAR2(20),
      MY_NUMBER VARCHAR2(20),
      THIRD_CODE VARCHAR2(20),
      YEARMO VARCHAR2(6),
      N NUMBER,
      MEAN NUMBER,
      SD NUMBER
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY INPUT_DIR
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY newline
        BADFILE INPUT_LOGDIR:'bad.bad'
        LOGFILE INPUT_LOGDIR:'log.log'
        DISCARDFILE INPUT_LOGDIR:'discards.log'
        fields terminated by ',' LRTRIM
        MISSING FIELD VALUES ARE NULL
        REJECT ROWS WITH ALL NULL FIELDS
        ( THIRD_CODE, N, MEAN, SD )
      )
      LOCATION ( 'myfile.rpt' )
    )
    NOPARALLEL
    REJECT LIMIT UNLIMITED;
    SELECT * FROM my_input;
    and GET is as above.
