Case statement logic for External Tables

Hi All,
Is there any way I can perform CASE logic in an external table creation script?
I have a column which is supposed to receive only numbers, but if I inadvertently receive a string, I want to insert NULL for that instance.
My table has the following creation syntax. (I have to add a check for the NumValue column: although I am using VARCHAR2(50) and have a transformation stage where the NumValue column goes through CASE logic, my entire file is getting rejected just because a single row in the input .dat file comes in as a string.)
CREATE TABLE XYZ_TABLE
(    LineNumber NUMBER(20),
     NumValue VARCHAR2(50 BYTE)
)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY "EXT_TAB_DIR"
  ACCESS PARAMETERS
  ( RECORDS DELIMITED BY NEWLINE
    BADFILE 'EXT_TAB_DIR_LOG':'FILE1.BAD'
    LOGFILE 'EXT_TAB_DIR_LOG':'FILE1.LOG'
    DISCARDFILE 'EXT_TAB_DIR_LOG':'FILE1.DSC'
    FIELDS TERMINATED BY '#|#'
    OPTIONALLY ENCLOSED BY '#$' AND '$#'
    MISSING FIELD VALUES ARE NULL
    ( LINENUMBER,
      NUMVALUE
    )
  )
  LOCATION
  ( 'FILE1.dat' )
)
REJECT LIMIT UNLIMITED;
Thank you,
Chaitanya

Chaitanya wrote:
So here, in the CASE logic, can I perform a check to validate that the value received for NumValue is only a number and not some VARCHAR2 value (which I am currently receiving for some rows)?
Assuming "is only number" means the NumValue column can contain digits only:
insert
  into TableABC(
                linenumber,
                numvalue
               )
  select  linenumber,
          NumValue
    from  TABLEXYZ
    where regexp_like(NumValue,'^\d+$')
/
SY.
P.S. If a NULL NumValue is allowed, add OR NumValue IS NULL.
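
A minimal query-time sketch along the same lines (using the XYZ_TABLE definition above): keep NumValue as VARCHAR2 in the external table and do the conversion in the SELECT, so non-numeric rows become NULL instead of causing the whole file to be rejected.
select linenumber,
       case
         when regexp_like(numvalue, '^\d+$') then to_number(numvalue)
       end as numvalue
  from xyz_table;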

Similar Messages

  • How to use remote directory for external table

    Hi Folks,
    I have two Oracle 11gR2 64-bit databases installed on Windows 2008 servers as prod1 and prod2.
    I have a directory created on the prod1 server as EXT_TAB_DIR, using the path D:\OrsDWtest_dir.
    I want to use this directory from the Prod2 server and build an external table over this remote directory.
    I am able to access the Prod1 directory from the Prod2 machine, and I have also created a mapped network drive Z: pointing to that Prod1 D:\OrsDWtest_dir directory. I also checked that read and write permissions are there. I am able to create the external table, but when I try to fetch the data I get the error below:
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-04040: file IOMM_20121213_060736.csv in EXT_TAB_DIR not found
    Now my doubt: is this possible? Can we use a remote directory for an external table? Or is there any alternative way to achieve the same?
    Thanks & Regards,
    Vikash Jain(DBA)

    Could you confirm the name and the existence of this file, "IOMM_20121213_060736.csv"?
    The same error is shown here:
    http://www.oracle-base.com/articles/9i/external-tables-9i.php
    If the load files have not been saved in the appropriate directory, the following result will be displayed.
    SQL> SELECT *
      2  FROM   countries_ext
      3  ORDER BY country_name;
    SELECT *
    ERROR at line 1:
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-04040: file Countries1.txt in EXT_TABLES not found
    ORA-06512: at "SYS.ORACLE_LOADER", line 14
    ORA-06512: at line 1
    Edited by: Fran on 10-Jan-2013 23:32
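
    For what it's worth, a hedged sketch (the UNC share and grantee are hypothetical): on Windows the directory object generally needs a UNC path that the Oracle service account itself can reach; drive letters mapped in an interactive session are usually not visible to the service.
    -- hypothetical UNC path (share on prod1) and hypothetical grantee
    CREATE OR REPLACE DIRECTORY ext_tab_dir AS '\\prod1\OrsDWtest_dir';
    GRANT READ, WRITE ON DIRECTORY ext_tab_dir TO vikash;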

  • Is it possible to use a case statement when joining different tables based on input parameters?

    Hi,
    I have a scenario where my stored procedure takes 5 parameters; the users can pass NULL or some value for these parameters, and based on the parameters I need to pull data from various tables.
    Is it possible to use a CASE statement in the join, similar to the one in the example below? I'm getting an error when I use this type of statement.
    select a.*
    from a
    case
    when parameter1=1 then
    inner join a on a.id = b.id
    when parameter1=2 then
    inner join a on a.id = c.id
    end;
    Please let me know if this type of statement works, and if it does, whether it will create any performance issues. If the above doesn't work, could you please give me some alternative solutions?
    Thanks.

    Here's a technique for joining A to B or C depending on the input parameters. In theory, you are joining to both tables but the execution plan includes filters to skip whichever join is not appropriate. The drawback is that you have to do outer joins, not inner ones.
    CREATE TABLE A AS SELECT LEVEL ak FROM dual CONNECT BY LEVEL <= 100;
    CREATE TABLE b AS SELECT ak, bk
    FROM A, (SELECT LEVEL bk FROM dual CONNECT BY LEVEL <= 10);
    CREATE TABLE c(ak, ck) AS SELECT ak, bk*10 FROM b;
    variable p1 NUMBER;
    variable p2 NUMBER;
    exec :p1 := 1;
    exec :p2 := 20;
    SELECT /*+ gather_plan_statistics */ A.ak, nvl(b.bk, c.ck) otherk FROM A
    LEFT JOIN b ON A.ak = b.ak AND :p1 IS NOT NULL AND b.bk = :p1
    LEFT JOIN c ON A.ak = c.ak AND :p1 is null and :p2 IS NOT NULL and c.ck = :p2
    WHERE A.ak <= 9;
    SELECT * FROM TABLE(dbms_xplan.display_cursor(NULL,NULL,'IOSTATS LAST'));
    | Id  | Operation             | Name            | Starts | E-Rows | A-Rows |   A-Time   | Buffers |
    |   0 | SELECT STATEMENT      |                 |      1 |        |      9 |00:00:00.01 |       7 |
    |*  1 |  HASH JOIN OUTER      |                 |      1 |      9 |      9 |00:00:00.01 |       7 |
    |*  2 |   HASH JOIN OUTER     |                 |      1 |      9 |      9 |00:00:00.01 |       7 |
    |*  3 |    TABLE ACCESS FULL  | A               |      1 |      9 |      9 |00:00:00.01 |       3 |
    |   4 |    VIEW               | VW_DCL_5532A50F |      1 |      9 |      9 |00:00:00.01 |       4 |
    |*  5 |     FILTER            |                 |      1 |        |      9 |00:00:00.01 |       4 |
    |*  6 |      TABLE ACCESS FULL| B               |      1 |      9 |      9 |00:00:00.01 |       4 |
    |   7 |   VIEW                | VW_DCL_5532A50F |      1 |      9 |      0 |00:00:00.01 |       0 |
    |*  8 |    FILTER             |                 |      1 |        |      0 |00:00:00.01 |       0 |
    |*  9 |     TABLE ACCESS FULL | C               |      0 |      9 |      0 |00:00:00.01 |       0 |
    Predicate Information (identified by operation id):
       1 - access("A"."AK"="ITEM_0")
       2 - access("A"."AK"="ITEM_1")
       3 - filter("A"."AK"<=9)
      5 - filter(:P1 IS NOT NULL)
       6 - filter(("B"."AK"<=9 AND "B"."BK"=:P1))
       8 - filter((:P2 IS NOT NULL AND :P1 IS NULL))
       9 - filter(("C"."AK"<=9 AND "C"."CK"=:P2))
    You can see that table C was not really accessed: the buffer count is 0.
    exec :p1 := NULL;
    SELECT /*+ gather_plan_statistics */ A.ak, nvl(b.bk, c.ck) otherk FROM A
    LEFT JOIN b ON A.ak = b.ak AND :p1 IS NOT NULL AND b.bk = :p1
    LEFT JOIN c ON A.ak = c.ak AND :p1 is null and :p2 IS NOT NULL and c.ck = :p2
    WHERE A.ak <= 9;
    SELECT * FROM TABLE(dbms_xplan.display_cursor(NULL,NULL,'IOSTATS LAST'));
    Now table B is not accessed.
    | Id  | Operation             | Name            | Starts | E-Rows | A-Rows |   A-Time   | Buffers | Reads  |
    |   0 | SELECT STATEMENT      |                 |      1 |        |      9 |00:00:00.02 |       7 |      2 |
    |*  1 |  HASH JOIN OUTER      |                 |      1 |      9 |      9 |00:00:00.02 |       7 |      2 |
    |*  2 |   HASH JOIN OUTER     |                 |      1 |      9 |      9 |00:00:00.01 |       3 |      0 |
    |*  3 |    TABLE ACCESS FULL  | A               |      1 |      9 |      9 |00:00:00.01 |       3 |      0 |
    |   4 |    VIEW               | VW_DCL_5532A50F |      1 |      9 |      0 |00:00:00.01 |       0 |      0 |
    |*  5 |     FILTER            |                 |      1 |        |      0 |00:00:00.01 |       0 |      0 |
    |*  6 |      TABLE ACCESS FULL| B               |      0 |      9 |      0 |00:00:00.01 |       0 |      0 |
    |   7 |   VIEW                | VW_DCL_5532A50F |      1 |      9 |      9 |00:00:00.01 |       4 |      2 |
    |*  8 |    FILTER             |                 |      1 |        |      9 |00:00:00.01 |       4 |      2 |
    |*  9 |     TABLE ACCESS FULL | C               |      1 |      9 |      9 |00:00:00.01 |       4 |      2 |

  • Scripts for external table

    We've lost our backup script for a particularly hairy EXTERNAL TABLE def'n. TOAD is botching the script. Is there a SQL statement I can execute against the db which will deliver back the script?
    -Chuck

    Charles,
    You can use the DBMS_METADATA package.
      1  SELECT dbms_metadata.get_ddl('TABLE', table_name)
      2  from user_tables
      3* where table_name = 'EXT_RN_DEPLOY_DATA' --your table name
    SQL> /
      CREATE TABLE "SCOTT"."EXT_RN_DEPLOY_DATA"
       (    "MONITOR_ID" VARCHAR2(30),
            "SAMPLE_ID" VARCHAR2(30),
            "LATITUDE" VARCHAR2(10),
            "DEW_POINT" NUMBER(18,8)
       ORGANIZATION EXTERNAL
        ( TYPE ORACLE_LOADER
          DEFAULT DIRECTORY "DATA_FILE_DIR2"
          ACCESS PARAMETERS
          ( RECORDS DELIMITED BY NEWLINE SKIP 1
    BADFILE DATA_FILE_DIR2:'REVEXT%A_%P.BAD'
    LOGFILE DATA_FILE_DIR2:'REVEXT%A_%P.LOG'
    FIELDS TERMINATED BY ","
    OPTIONALLY ENCLOSED BY '"'
    MISSING FIELD VALUES ARE NULL
          LOCATION
           ( "DATA_FILE_DIR2":'load.csv'
       REJECT LIMIT UNLIMITED
    SQL> <br>More info <a href=http://download-west.oracle.com/docs/cd/B10501_01/server.920/a96652/ch15.htm#1005930><b>here</b></a></br>
    <br>Nicolas.</br>
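
    One hedged addition: if the CLOB returned by dbms_metadata.get_ddl comes back truncated in SQL*Plus, raising LONG before running the query usually helps, for example:
    SET LONG 100000
    SET PAGESIZE 0
    SELECT dbms_metadata.get_ddl('TABLE', 'EXT_RN_DEPLOY_DATA') FROM dual;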

  • Using MISSING FIELD VALUES ARE NULL for external table

    I want to place a null for values missing in the sub_account field. Here is my external table:
    CREATE OR REPLACE DIRECTORY INCOMING_ORDERS_log_dir
    AS 'c:\starpubs\starpubs\dataformats\logs\INCOMING_ORDERS\log';
    CREATE OR REPLACE DIRECTORY INCOMING_ORDERS_bad_dir
    AS 'c:\starpubs\starpubs\dataformats\logs\INCOMING_ORDERS\bad';
    create table ext_INCOMING_ORDERS_table (
    Account varchar(5),
    Sub_Account varchar(1),
    Override_Code varchar(1),
    Nomenclature varchar(28),
    chg_nbr varchar(3),
    quantity integer,
    U_I varchar(5),
    zipcode varchar(5),
    type_reject varchar(2)
    )
    organization external
    (
    type oracle_loader
    default directory user_dir
    access parameters
    (
    records delimited by newline
    missing field values are null
    badfile INCOMING_ORDERS_bad_dir:'INCOMING_ORDERS%a_%p.bad'
    logfile INCOMING_ORDERS_log_dir:'INCOMING_ORDERS%a_%p.log'
    fields
    (
    Account(1:5) char(5),
    Sub_Account(7:7) char(1),
    Override_Code(10:10) char(1),
    Nomenclature(11:38) char(28),
    chg_nbr(40:42) char(3),
    quantity(44:48) integer external,
    U_I(50:54) char(5),
    zipcode(56:60) char(5),
    type_reject(61:62) char(2)
    )
    )
    location('PTCLICK.MANUAL.NOMEN.TXT','PTCLICK.ORDERS.TXT', 'EUR_RES.TXT', 'MQ.TXT', 'BPRO.TXT')
    )
    reject limit unlimited;
    How can I apply MISSING FIELD VALUES ARE NULL so that missing values for sub_account come through as NULL?

    After I made the change, I received this error:
    SQL> select * from ext_INCOMING_ORDERS_table;
    select * from ext_INCOMING_ORDERS_table
    ERROR at line 1:
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-00554: error encountered while parsing access parameters
    KUP-01005: syntax error: found "no": expecting one of: "comma, date_format,
    defaultif, enclosed, ltrim, lrtrim, ldrtrim, notrim, nullif, optionally, ),
    rtrim, terminated"
    KUP-01007: at line 7 column 26
    CREATE OR REPLACE DIRECTORY INCOMING_ORDERS_log_dir
    AS 'c:\starpubs\starpubs\dataformats\logs\INCOMING_ORDERS\log';
    CREATE OR REPLACE DIRECTORY INCOMING_ORDERS_bad_dir
    AS 'c:\starpubs\starpubs\dataformats\logs\INCOMING_ORDERS\bad';
    create table ext_INCOMING_ORDERS_table (
    Account varchar(5),
    Sub_Account varchar(1),
    Override_Code varchar(1),
    Nomenclature varchar(28),
    chg_nbr varchar(3),
    quantity integer,
    U_I varchar(5),
    zipcode varchar(5),
    type_reject varchar(2)
    )
    organization external
    (
    type oracle_loader
    default directory user_dir
    access parameters
    (
    records delimited by newline
    badfile INCOMING_ORDERS_bad_dir:'INCOMING_ORDERS%a_%p.bad'
    logfile INCOMING_ORDERS_log_dir:'INCOMING_ORDERS%a_%p.log'
    fields
    (
    Account(1:5) char(5),
    Sub_Account(7:7) char(1) NO PRESERVE BLANKS,
    Override_Code(10:10) char(1),
    Nomenclature(11:38) char(28),
    chg_nbr(40:42) char(3),
    quantity(44:48) integer external,
    U_I(50:54) char(5),
    zipcode(56:60) char(5),
    type_reject(61:62) char(2)
    )
    )
    location('PTCLICK.MANUAL.NOMEN.TXT','PTCLICK.ORDERS.TXT', 'EUR_RES.TXT', 'MQ.TXT', 'BPRO.TXT')
    )
    reject limit unlimited;
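
    A hedged sketch of a fields clause that should parse for these positional fields: NO PRESERVE BLANKS is not part of the ORACLE_LOADER access-parameter syntax (hence the KUP-01005 on "no"), and MISSING FIELD VALUES ARE NULL goes inside the FIELDS clause, before the field list:
    fields
    missing field values are null
    (
    Account(1:5) char(5),
    Sub_Account(7:7) char(1),
    Override_Code(10:10) char(1),
    Nomenclature(11:38) char(28),
    chg_nbr(40:42) char(3),
    quantity(44:48) integer external,
    U_I(50:54) char(5),
    zipcode(56:60) char(5),
    type_reject(61:62) char(2)
    )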

  • Unix permission problem for external table in oracle 10g, sun solaris

    Hello All,
    I'm facing a problem in accessing an external table which has stumped me a bit.
    What I'm looking for is to use an external table with restricted permissions for Others (770) on Unix.
    I would appreciate it if someone could help me out here.
    Here are the steps:
    1.create directory ext_tab_dir1 as '/home/ravi/test'
    2.grant read,write on directory ext_tab_dir1 to scott
    3.CREATE TABLE scott.emp_load1(employee_number CHAR(5))
    ORGANIZATION EXTERNAL (
    type oracle_loader
    default directory ext_tab_dir1
    access parameters (
    records delimited by newline
    fields terminated by ' '
    optionally enclosed by '"'
    missing field values are null
    )
    location ('info.dat')
    )
    Now I have added unix user oracle to unix group myDbGroup and provided myDbGroup read/write/exec permission on directory /home/ravi/test.
    info.dat has been placed in /home/ravi/test.
    #pwd
    /home/ravi
    #ls -l
    drwxrwx--- 2 ravi myDbGroup 512 Mar 7 17:35 test
    I have manually logged in as user oracle and successfully created a sample file in /home/ravi/test.
    -rwxrwx--- 1 ravi myDbGroup 13 Mar 7 17:33 info.dat
    -rw-r--r-- 1 oracle oinstall 0 Mar 7 18:05 sampleFile
    I then connect to the db using sqlplus as ravi and do a
    #select * from scott.emp_load1
    I get the following error
    ERROR at line 1:
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-04063: unable to open log file emp_load1_18567.log
    OS error Permission denied
    ORA-06512: at "SYS.ORACLE_LOADER", line 19
    After this, I gave full permission to /home/ravi/test
    drwxrwxrwx 2 ravi myDbGroup 512 Mar 7 17:35 test
    #select * from scott.emp_load1
    And this was successful.
    It created a log file in /home/ravi/test
    -rwxrwx--- 1 ravi myDbGroup 13 Mar 7 17:33 info.dat
    -rw-r--r-- 1 oracle oinstall 0 Mar 7 18:05 sampleFile
    -rw-r--r-- 1 oracle oinstall 0 Mar 7 18:05 emp_load1_18567.log
    Now what stumped me is the owner and group owner of the file emp_load1_18567.log.
    It is the same as the sampleFile which I created manually!
    From this it can be deduced that Oracle is not using the oracle user id while reading/writing to the Unix directory, but is somehow assigning the oracle user id as the owner of the log file at the end.
    If someone has encountered this problem earlier or has some info about this, please share.
    Regards,
    Ravinandan

    Thanks for the reply.
    I checked this earlier with the NOLOGFILE option.
    But no luck.
    I would like to add one more detail (which I missed earlier) about the problem.
    If I give 750 access to /home/ravi (that is, read/exec to Group and no permissions to Others),
    I get the same error (KUP-04063: unable to open log file emp_load1_18567.log).
    This obviously means Oracle is not even able to access the directory (although the unix user oracle has access via the group permissions).
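
    If the log and bad files are not needed at all, one hedged option is to switch them off in the access parameters so nothing has to be written to the directory; also note that Unix processes only pick up new group memberships when they are restarted, so the database may need a bounce after adding the oracle user to myDbGroup. A sketch:
    -- sketch: same field spec as above, with log and bad files disabled
    ALTER TABLE scott.emp_load1 ACCESS PARAMETERS
    ( records delimited by newline
      nologfile
      nobadfile
      fields terminated by ' '
      optionally enclosed by '"'
      missing field values are null
    );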

  • Oracle CASE statement logic

    Hi all,
    I have to compare the value of a varchar variable using a CASE statement and display the corresponding output.
    But when the following code is executed and I give daysrange a value that is anything other than a number, I get the error below.
    The daysrange variable can be a number or a string (hence I declared it as a varchar2).
    Error report:
    ORA-06502: PL/SQL: numeric or value error: character to number conversion error
    ORA-06512: at line 5
    06502. 00000 - "PL/SQL: numeric or value error%s"
    *Cause:   
    *Action:
    declare
    daysrange varchar2(10):='abc';
    x varchar2(100);
    begin
    CASE WHEN DAYSRANGE = 1 THEN x := 'LD';
              WHEN DAYSRANGE BETWEEN 2 AND 7 THEN x := 'LW';
              WHEN DAYSRANGE BETWEEN 8 AND 30 THEN x:= 'LM';
              WHEN DAYSRANGE BETWEEN 31 AND 90 THEN x:= 'L3M';
              WHEN DAYSRANGE BETWEEN 91 AND 180 THEN x:= 'L6M';
              WHEN DAYSRANGE BETWEEN 181 AND 365 THEN x:= 'LY';
              WHEN DAYSRANGE BETWEEN 366 AND 730 THEN x:= 'L2Y';
              WHEN DAYSRANGE > 730 THEN x:= 'O2Y';
         ELSE x:='x:= x';
         END case;--DATERANGE
    exception
    when case_not_found then
    x:='something';
    dbms_output.put_line(x);
    end;
    Edited by: Chaitanya on Nov 25, 2010 1:25 AM

    Hi,
    Chaitanya wrote:
    ... The daysrange variable can be a number or a string (hence I declared it as a varchar2)
    That's usually not a good design. It would be better to have two variables (or columns) if necessary: a VARCHAR2 and a NUMBER.
    If you can't change the design, then test daysrange, and then do different things depending on whether it is a number or not.
    For example:
    declare
         daysrange      varchar2(10)     := '17';
         daysrange_n     NUMBER;
         x           varchar2(100);
    begin
         IF  REGEXP_LIKE (daysrange, '^\d+$')
         THEN
              daysrange_n := TO_NUMBER (daysrange);
              x := CASE 
                   WHEN daysrange_n > 730     THEN 'O2Y'
                   WHEN daysrange_n > 365     THEN 'L2Y'
                   WHEN daysrange_n > 180     THEN 'L1Y'
                   WHEN daysrange_n >  90     THEN 'L6M'
                   WHEN daysrange_n >  30     THEN 'L3M'
                   WHEN daysrange_n >   7     THEN 'LM'
                   WHEN daysrange_n >   1     THEN 'LW'
                   WHEN daysrange_n =   1     THEN 'LD'
                                  ELSE  x          -- If necessary
                   END;
         END IF;
    ...
    The tests in a CASE expression are done in order. The n-th WHEN condition is tried only after conditions 1 through n-1 have failed. That's why we can say, for example, "daysrange_n > 365" instead of "daysrange_n BETWEEN 366 AND 730". If the 2nd test is even being performed, we know that the 1st test failed, and that daysrange_n is not > 730.
    I'm not saying that you have to write CASE expressions like this, or that it's necessarily better. You should know that it's possible, then choose whichever way makes the most sense in this situation.

  • How not to consider a missing field for external tables

    My Oracle vers. is 10gR2
    I've created an external table using this syntax:
    create table ext_table
    (a number(5),
    b number(5),
    c varchar2(1000))
    organization external
    (type ORACLE_LOADER
    default directory FLAISTD
    access parameters (records delimited by newline
    fields terminated by "#"
    (a char(5),
    b char(5),
    c char(1000)))
    location ('file.csv')
    )
    My problem is this: I've got a file.XLS that I save as file.CSV. Sometimes a row of the file.XLS is missing the last column, so in my file.CSV I can have something like this:
    123#123#xxx
    456#456
    and when I try to perform a select * from ext_table I get an error because it expects the missing field.
    What can I do? Can I "say" something in the create table above to warn that the last field might be missing?
    Thanks in advance!

    Solomon Yakobson wrote:
    Use TRAILING NULLCOLS:
    Oops, it is an external table, not SQL*Loader. So it should be MISSING FIELD VALUES ARE NULL:
    create table ext_table
    (a number(5),
    b number(5),
    c varchar2(1000))
    organization external
    (type ORACLE_LOADER
    default directory TEMP
    access parameters (records delimited by newline
    fields terminated by "#" missing field values are null
    (a char(5),
    b char(5),
    c char(1000)))
    location ('file.csv')
    )
    Table created.
    SQL> select  *
      2    from  ext_table
      3  /
             A          B C
           123        123 xxx
           456        456
    SQL>
    SY.

  • Auto stats gathering for partitioned table

    Hi,
    We are on 10gR2 on Sun Solaris. We are using auto stats gathering for our DB. Here is my question:
    I know Oracle gathers statistics for a table if the table changes by more than 10%. How does this work for a partitioned table? If the partitioned table changes by more than 10%, will only the last partition be analyzed, or the full table? We have partitioned based on insertion date.
    I appreciate your response.
    Regards,
    Satheesh Shanmugam
    http://borndba.com

    I would hope that only the current partition with the stale statistics will have its statistics gathered, rather than the full table.
    Anil Malkai
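
    As a hedged illustration (owner, table, and partition names are hypothetical), statistics can also be gathered manually for a single partition alongside the automatic job:
    BEGIN
      DBMS_STATS.GATHER_TABLE_STATS(
        ownname     => 'SALES_OWNER',    -- hypothetical schema
        tabname     => 'SALES_BY_DATE',  -- hypothetical partitioned table
        partname    => 'P_2010_10',      -- hypothetical partition
        granularity => 'PARTITION',
        cascade     => TRUE);
    END;
    /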

  • OMBPLUS Looking for External Table in Mappings

    Hello,
    Can I solve this problem with OMBPLUS?
    I would like to find all the mappings (stage) that have an external table as a source.
    Does anyone have an example?
    Thanks
    christian
    Edited by: user11126676 on 14.09.2009 07:19

    Something like this should get you on your way
    (code assumes that you have logged in and CCd to the project and then the module already)
    set maps [OMBLIST MAPPINGS]
    foreach map $maps {
            set tabs [OMBRETRIEVE MAPPING '$map' GET EXTERNAL_TABLE OPERATORS]
            if {[llength $tabs] == 0} {
                puts "Mapping $map does not depend on external tables"
            } else {
                puts "Mapping $map depends on external tables: "
                foreach tab $tabs {
                     puts $tab
                }
            }
    }
    Cheers,
    Mike

  • Help Need for External Table

    Gurus,
    While reading data from a CSV using an external table, the records in some columns contain a special symbol like a 'new line feed'.
    How can I trim that out?
    Please help with this issue.

    Hi,
    Use Substr or Replace functions on them.
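
    A small hedged example (table and column names are hypothetical): stray carriage-return/line-feed characters can be stripped at query time, for instance:
    -- hypothetical table and column names
    select rtrim(description_col, chr(13) || chr(10)) as description_clean
      from ext_csv_table;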

  • Network location for external table's file

    Hi,
    I am trying to import data from a CSV file into an Oracle 10g database. I can successfully load data from a CSV file located on the server itself using an external table, but when I redefine the directory object to point to a folder on a network location, it doesn't work. Is it a limitation of external tables that they cannot access data from a file on a network location?
    Any clarifications and suggestions would be highly appreciated.
    Thanks,
    Aniket

    Hi Nicolas,
    I created a directory object pointing to a folder on a system in the network (other than the Oracle 10g server). This folder is shared and can be accessed from the Oracle 10g server. When I create an external table with the default directory as the shared folder, it cannot read the CSV file from that folder (a shared folder on another system).
    But when I redefined the directory object on a local folder on the Oracle 10g server, it could read the CSV file in the local folder using the external table.
    My understanding is that with external tables one can only read files that are on the local machine, i.e. the Oracle 10g server, and not on a different system.
    Please correct me if I am wrong. Any further suggestions would be highly appreciated.
    Regards,
    Aniket
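
    For what it's worth, a hedged sketch (the mount path and grantee are hypothetical): the path behind the directory object must be visible to the database server's own operating system, for example a share mounted on the server itself:
    -- hypothetical mount point on the database server and hypothetical grantee
    CREATE OR REPLACE DIRECTORY csv_dir AS '/mnt/shared_csv';
    GRANT READ, WRITE ON DIRECTORY csv_dir TO aniket;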

  • Need Help For External Table

    Gurus,
    I have created an external table with the following script:
    CREATE TABLE ext_wdm_rollout_plan_test
    (structure VARCHAR2(50),
    initial_phase VARCHAR2(50),
    chain_rfi VARCHAR2(50),
    chain_anu VARCHAR2(50),
    protected_by VARCHAR2(255))
    ORGANIZATION EXTERNAL (
    DEFAULT DIRECTORY EXTERNAL_TAB_DIR
    ACCESS PARAMETERS(records delimited BY newline
         badfile EXTERNAL_TAB_DIR:'EXT_WDM_ROLLOUT.bad'
         LOGFILE EXTERNAL_TAB_DIR:'EXT_WDM_ROLLOUT.log'
         SKIP 1
         FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
         missing field VALUES are NULL
    (Structure char
    ,Initial_Phase char
    ,Chain_RFI char
    ,Chain_ANU char
    ,Protected_by char
    )
    )
    LOCATION (
    EXTERNAL_TAB_DIR:'wdm_rollout_plan_test.csv'
    )
    )
    REJECT LIMIT UNLIMITED
    In this table some columns contain a special character like a 'new line feed', i.e. it looks like a square symbol in my database.
    How can I avoid that kind of special symbol when creating the external table?
    I also tried the TRIM command in the external table script, but it doesn't work.
    Please help me with this issue.
    Regards,
    Venugopal

    Hi,
    Use Substr or Replace functions on them.
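
    A hedged example along those lines, using the table above: the square symbols are often carriage returns (CHR(13)) left over from a Windows-format CSV, and they can be stripped in the SELECT:
    select structure,
           replace(replace(protected_by, chr(13)), chr(10)) as protected_by
      from ext_wdm_rollout_plan_test;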

  • How to use sub folder paths for external table location parameter?

    Is it possible to use one Oracle directory object and address multiple files under OS sub-directories beneath that directory? Like this:
    host mkdir /tmp/orcl_dir
    host mkdir /tmp/orcl_dir/fold1
    host mkdir /tmp/orcl_dir/fold2
    CREATE DIRECTORY ext_tab_dir AS '/tmp/orcl_dir';
    CREATE TABLE ext_all_source
        ORGANIZATION EXTERNAL (
           TYPE ORACLE_DATAPUMP
           DEFAULT DIRECTORY ext_tab_dir
           LOCATION ( 'fold1/all_source1.dmp', 'fold2/all_source2.dmp' ) ) PARALLEL 4
        AS SELECT * FROM all_source;
    ERROR at line 1:
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-04076: file name cannot contain a path specification: fold1/all_source1.dmp
    ORA-06512: at "SYS.ORACLE_DATAPUMP", line 19Thank you.
    Message was edited by:
    antu

    Justin, is there a way you are aware of, at the operating system level, to teach Oracle that it has to access, for example, 16 different pieces of files via a meta file? I saw a definition file like this, but I am not sure whether this is the tool's own mapping format:
    $ cat ./CON_PUB/main/CON_xfrm_subs_status_reason_act_lk_20080302.dat
    (object mfile_c_type
    (path "file:OBSOLETE")
    (fs "file://amanos/s01/abinitio/data/prod/mfs/mfs_16way")
    (local_paths 16
      "file://amanos/s01/abinitio/data/prod/mfs/parts/mfs_16way_001/mfs_16way/Applications/RDS/CON_PUB/main/CON_xfrm_subs_status_reason_act_lk_20080302.dat"
      "file://amanos/s01/abinitio/data/prod/mfs/parts/mfs_16way_002/mfs_16way/Applications/RDS/CON_PUB/main/CON_xfrm_subs_status_reason_act_lk_20080302.dat"
      "file://amanos/s01/abinitio/data/prod/mfs/parts/mfs_16way_003/mfs_16way/Applications/RDS/CON_PUB/main/CON_xfrm_subs_status_reason_act_lk_20080302.dat"
      "file://amanos/s01/abinitio/data/prod/mfs/parts/mfs_16way_004/mfs_16way/Applications/RDS/CON_PUB/main/CON_xfrm_subs_status_reason_act_lk_20080302.dat"
      "file://amanos/s01/abinitio/data/prod/mfs/parts/mfs_16way_005/mfs_16way/Applications/RDS/CON_PUB/main/CON_xfrm_subs_status_reason_act_lk_20080302.dat"
      "file://amanos/s01/abinitio/data/prod/mfs/parts/mfs_16way_006/mfs_16way/Applications/RDS/CON_PUB/main/CON_xfrm_subs_status_reason_act_lk_20080302.dat"
      "file://amanos/s01/abinitio/data/prod/mfs/parts/mfs_16way_007/mfs_16way/Applications/RDS/CON_PUB/main/CON_xfrm_subs_status_reason_act_lk_20080302.dat"
      "file://amanos/s01/abinitio/data/prod/mfs/parts/mfs_16way_008/mfs_16way/Applications/RDS/CON_PUB/main/CON_xfrm_subs_status_reason_act_lk_20080302.dat"
      "file://amanos/s01/abinitio/data/prod/mfs/parts/mfs_16way_009/mfs_16way/Applications/RDS/CON_PUB/main/CON_xfrm_subs_status_reason_act_lk_20080302.dat"
      "file://amanos/s01/abinitio/data/prod/mfs/parts/mfs_16way_010/mfs_16way/Applications/RDS/CON_PUB/main/CON_xfrm_subs_status_reason_act_lk_20080302.dat"
      "file://amanos/s01/abinitio/data/prod/mfs/parts/mfs_16way_011/mfs_16way/Applications/RDS/CON_PUB/main/CON_xfrm_subs_status_reason_act_lk_20080302.dat"
      "file://amanos/s01/abinitio/data/prod/mfs/parts/mfs_16way_012/mfs_16way/Applications/RDS/CON_PUB/main/CON_xfrm_subs_status_reason_act_lk_20080302.dat"
      "file://amanos/s01/abinitio/data/prod/mfs/parts/mfs_16way_013/mfs_16way/Applications/RDS/CON_PUB/main/CON_xfrm_subs_status_reason_act_lk_20080302.dat"
      "file://amanos/s01/abinitio/data/prod/mfs/parts/mfs_16way_014/mfs_16way/Applications/RDS/CON_PUB/main/CON_xfrm_subs_status_reason_act_lk_20080302.dat"
      "file://amanos/s01/abinitio/data/prod/mfs/parts/mfs_16way_015/mfs_16way/Applications/RDS/CON_PUB/main/CON_xfrm_subs_status_reason_act_lk_20080302.dat"
      "file://amanos/s01/abinitio/data/prod/mfs/parts/mfs_16way_016/mfs_16way/Applications/RDS/CON_PUB/main/CON_xfrm_subs_status_reason_act_lk_20080302.dat"))or at least a symbolic linking strategy may handle this, but I couldn't figure out how and of course this will have its own costs of course to manage.

  • Need a logic for Internal table processing

    Hi,
    I have a requirement: an internal table contains three fields, material no, bin no, and quantity.
    Mat. No | Bin No | Quantity
    a       | x1     | 10
    a       | x1     | 10
    a       | x2     | 20
    b       | x3     | 10
    c       | x3     | 20
    c       | x4     | 30
    c       | x4     | 40
    From this I need to append the records to a new internal table, say itab1, where multiple entries exist for a material no (like mat no 'a' and 'c'), and
    if the material no exists only once in the table, it has to be moved to another new internal table, say itab2.
    Please suggest some logic that does not have performance issues.
    Thanks in advance
    Saravana

    Hi there,
    a solution in brief...
    data: wa_itab1_a like itab1,
          wa_itab2_b like itab1,
          lv_tabix   type sytabix.
    sort itab1 by matnr.
    loop at itab1.
      wa_itab1_a = itab1.
      at new matnr.
        lv_tabix = sy-tabix + 1.
        clear wa_itab2_b.
        READ TABLE itab1 into wa_itab2_b
                            INDEX lv_tabix.
        if wa_itab2_b-matnr ne wa_itab1_a-matnr.
          append wa_itab1_a to itab2.
          delete itab1 where matnr = wa_itab1_a-matnr.
        endif.
      endat.
    endloop.
    Regards
    George Zervas
    Edited by: gzervas on Oct 20, 2010 12:08 PM
