Validate External Table Fails

Hi there,
I am new to OWB and am trying to set up a simple project that reads from a data file and populates a database table. I have created two locations, one for the database and the other a file location. I have created a module, created the external table in it, and imported the target database table. I already have an external table in my database that reads the data file, but for some reason I was unable to import it (it didn't appear when I tried), so I created it again in OWB. Having done so, when I validate it by right-clicking, I get the following error:
VLD-0187: The location is not configured for module MY_MODULE.
Specify a location in the configuration for the database module. Locations must be specified as data locations in the Module Editor to appear in the configuration location selection list.
Does anybody know what this means? The database module under which I created my external table and imported my table uses the database location I created, and my external table uses the file location I created.
Any help would be much appreciated
thanks, Anil

OK, there are a few things that need to be set:
1. MY_MODULE is your schema (module) in the OWB Project Explorer --> double-clicking on it opens 2 tabs for locations
-- 1. Metadata location --> this is your connection to the DB
-- 2. Data locations --> this is where the data is/will be located
2. In the Connection Explorer you need to have 2 connections
-- 1. DB connection --> this is your connection to the DB
-- 2. File location --> this is the directory where your files are
3. Double-click on your external table --> Location tab --> set the file location
Do you have all of these set?
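For reference, once the module's data location and the external table's file location are both set, what OWB deploys is an ordinary Oracle external table pointing at a directory object. A minimal sketch of that shape, with made-up names purely for illustration:
CREATE TABLE my_ext_table
(
  col1  VARCHAR2(50),
  col2  NUMBER
)
ORGANIZATION EXTERNAL
(
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY my_file_location_dir   -- directory object backing the OWB file location
  ACCESS PARAMETERS
  (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('my_data.dat')
)
REJECT LIMIT UNLIMITED;
If the module has no data location selected in its configuration, OWB has nothing to generate the DEFAULT DIRECTORY part against, which is roughly what VLD-0187 is complaining about.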

Similar Messages

  • File loaded successfully by SQLldr but external table failed

    Hi ,
When I tried to create an external table to load data from a file, it failed with the error below:
"KUP-04018: partial record at end of file"
When I query the external table for rows with rownum < 2500 I can see the records being loaded properly, but when I try to fetch all records it throws the above error.
But when I load the same file with SQL*Loader, it loads all the records successfully.
Can you please let me know the reason for this?
    Regards

    Can you post us your SQLLDR control file, your external table definition, a sample of your data (preferably including the last lines of data which you believe are erroring) and also let us know your database version.
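While gathering that, it may also help to point the external table at an explicit log file and bad file so the offending trailing record is captured and can be compared with what SQL*Loader accepted. A hedged sketch of the access parameters (directory and file names are placeholders):
ACCESS PARAMETERS
(
  RECORDS DELIMITED BY NEWLINE
  BADFILE my_dir:'feed.bad'
  LOGFILE my_dir:'feed.log'
  FIELDS TERMINATED BY ','
  MISSING FIELD VALUES ARE NULL
)
KUP-04018 normally refers to the very last record in the file (for example one written without a terminating newline), so the tail end of the data file is worth inspecting too.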

  • Create external table fails

I'm trying to map a trace file to an external table, but I'm getting a "missing keyword" error.
    set verify off
    create or replace directory 'TRACE_DIR' as '/db_home/oracle/product/diag/rdbms/test/dtms_t/trace';
    column id new_value os_user;
    SELECT sys_context('USERENV', 'OS_USER') id FROM DUAL;
    BEGIN
          EXECUTE IMMEDIATE 'alter session set tracefile_identifier = &os_user' ;
    END;
    column tracefile_loc new_value tracefile
select SUBSTR(tracefile_loc,INSTR(tracefile_loc,'/',-1)+1) tracefile_loc
from
( select value ||'/'||(select instance_name from v$instance) ||'_ora_'||
              (select spid||case when traceid is not null then '_'||traceid else null end
                    from v$process where addr = (select paddr from v$session
                                                 where sid = (select sid from v$mystat
                                                              where rownum = 1))
              ) || '.trc' tracefile_loc
   from v$parameter where name = 'user_dump_dest');
    jbrock@ddtms_t> CREATE TABLE trace_ext
      2                     (line varchar2(80)
      3                     )
      4       ORGANIZATION EXTERNAL
      5       (
      6         TYPE ORACLE_LOADER
      7         DEFAULT DIRECTORY trace_dir
      8         ACCESS PARAMETERS
      9         (
    10           records delimited by newline
    11           missing field values are null
    12           ( line
    13           )
    14         )
    15         LOCATION (&&tracefile)
    16       )
    17       REJECT LIMIT UNLIMITED;
           LOCATION (ddtms_t_ora_9101_JBROCK.trc)
    ERROR at line 15:
    ORA-00905: missing keyword

    Hi,
    jimmyb wrote:
    ... jbrock@ddtms_t> CREATE TABLE trace_ext
    2                     (line varchar2(80)
    3                     )
    4       ORGANIZATION EXTERNAL
    5       (
    6         TYPE ORACLE_LOADER
    7         DEFAULT DIRECTORY trace_dir
    8         ACCESS PARAMETERS
    9         (
    10           records delimited by newline
    11           missing field values are null
    12           ( line
    13           )
    14         )
    15         LOCATION (&&tracefile)
    16       )
    17       REJECT LIMIT UNLIMITED;
    LOCATION (ddtms_t_ora_9101_JBROCK.trc)
    ERROR at line 15:
    ORA-00905: missing keyword
Doesn't the location have to be in single-quotes? For line 15, try
    LOCATION ('&&tracefile')
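In other words, a sketch of the statement with just that clause changed, everything else as posted:
CREATE TABLE trace_ext
  (line VARCHAR2(80))
  ORGANIZATION EXTERNAL
  ( TYPE ORACLE_LOADER
    DEFAULT DIRECTORY trace_dir
    ACCESS PARAMETERS
    ( records delimited by newline
      missing field values are null
      ( line )
    )
    LOCATION ('&&tracefile')
  )
  REJECT LIMIT UNLIMITED;
With the quotes in place the substituted trace file name is passed as a string literal rather than as bare text, which appears to be what ORA-00905 was objecting to.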

  • External table fails to view data after changing file name

    Hello,
    I have previously imported 122 external tables and was able to view their data without any problems.
    The names of these files have since changed. The names have a date in them.
    So I dropped the external tables from that database and recreated them with the new file names.
    I then went into OWB and dropped them from OWB.
    I reimported all of the tables without errors.
I go to view the data and get an error that the file does not exist.
The problem is that the file it says does not exist is the file I had been using last week, not the new file.
    However, when I go and right click the external table and do a configure, the correct name is showing up under datafiles.
    It's almost as though when I dropped the external tables, not everything was cleaned up. For whatever reason, OWB is still using the old names.
    Dan

Hi there... Which OWB version are you using?
If you created these external tables using any other tool (i.e. SQL*Plus, SQLDeveloper, Toad, SQLNavigator, etc.), are you able to query these objects from one of those tools?
If you created these ETs using OWB, are you creating them as external tables or are you mapping them as source flat files?
If you are using OWB to create them, there are a few configurations/properties you should change to have them running well.
    Please, post some more info.
    Thanks
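One quick check that may help narrow this down is to compare what the database-side external table actually points at with what OWB shows in its configuration. For example, against the data dictionary (the table name is a placeholder):
SELECT table_name, directory_name, location
FROM   all_external_locations
WHERE  table_name = 'MY_EXTERNAL_TABLE';
If this still reports last week's file name, the old definition survived the drop/recreate on the database side; if it shows the new name, the stale reference is more likely sitting in the OWB metadata or in a deployed object that was never redeployed.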

  • Failed to parse SQL Query - External Tables [Apex 4.0.2]

    Greetings experts -
    Has anyone encountered errors when creating a report in Apex that sources from an external table within the database? I'm using the 4.0.2 version that is packaged with the 11g XE edition on 64bit CentOS.
    For example, I might run:
    SELECT NULL LINK,
    COL1 LABEL,
    COL2 VALUE
    FROM MYTAB;
    Where MYTAB is an external table sitting on top of a comma-separated file. When I go to create the page, I'm prompted with the "Failed to parse SQL" dialogue.
    I noticed that if I did CTAS on the external table, and referenced the CTAS table, my page ran without problem!
    Any ideas? Is this a known "limitation" of Apex?
    Thanks,
    CJ

    Chiedu,
    Please try removing all declarative validations on this tabular form, and see if it works as expected then. There are some limitations on the type of views and joins that are supported by tabular forms when using the new declarative validations, i.e. you'll need a key preserved table in your view or join.
    Regards,
    Marc
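On the CTAS workaround mentioned in the question: if the parse-time check keeps rejecting the external table, one pragmatic option is to report on a periodically refreshed staging copy instead. A minimal sketch, with illustrative names:
-- one-time creation of the staging copy
CREATE TABLE mytab_stage AS SELECT * FROM mytab;
-- subsequent refreshes, e.g. from a scheduled job
TRUNCATE TABLE mytab_stage;
INSERT INTO mytab_stage SELECT * FROM mytab;
COMMIT;
The report SQL then stays the same apart from the table name, and Apex only ever sees an ordinary heap table.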

  • ORA-01843: not a valid month, external table select fails on date column

    Hi,
    I created an external table as follows:
    CREATE OR REPLACE DIRECTORY sales_feeds AS '/backup/oracle/feeds';
    Directory created.
CREATE TABLE salesfeed_external_table
(
          PROD_ID               NUMBER
        , CUST_ID               NUMBER
        , TIME_ID               DATE
        , CHANNEL_ID            NUMBER
        , PROMO_ID              NUMBER
        , QUANTITY_SOLD         NUMBER(10,2)
        , AMOUNT_SOLD           NUMBER(10,2)
)
ORGANIZATION EXTERNAL
(
        TYPE ORACLE_LOADER
        DEFAULT DIRECTORY sales_feeds
        ACCESS PARAMETERS
        (
                RECORDS DELIMITED BY newline
                FIELDS TERMINATED BY ','
        )
        LOCATION ('salesfeed.dat')
)
REJECT LIMIT 0;
Table created.
I then create that external flat file salesfeed.dat as follows:
    set echo off
    set feedback off
    set linesize 100
    set pagesize 0
    set head off
    set sqlprompt ''
    set trimspool on
    spool on
    spool /backup/oracle/feeds/salesfeed.dat
    select PROD_ID||','||CUST_ID||','||TIME_ID||','||CHANNEL_ID||','||PROMO_ID||','||QUANTITY_SOLD||','||AMOUNT_SOLD
    FROM salesfeed where rownum = 1;
    spool off
    exit
    cat salesfeed.dat
18,3131,21/03/1998 00:00:00,3,999,1,2600.76
Note: only one row at the moment (diagnosing).
    Try to read it
    select * from salesfeed_external_table
    ERROR at line 1:
    ORA-29913: error in executing ODCIEXTTABLEFETCH callout
ORA-30653: reject limit reached
The log shows:
    cat SALESFEED_EXTERNAL_TABLE_1302.log
    LOG file opened at 11/24/11 12:16:02
    Field Definitions for table SALESFEED_EXTERNAL_TABLE
      Record format DELIMITED BY NEWLINE
      Data in file has same endianness as the platform
      Rows with all null fields are accepted
      Fields in Data Source:
        PROD_ID                         CHAR (255)
          Terminated by ","
          Trim whitespace same as SQL Loader
        CUST_ID                         CHAR (255)
          Terminated by ","
          Trim whitespace same as SQL Loader
        TIME_ID                         CHAR (255)
          Terminated by ","
          Trim whitespace same as SQL Loader
        CHANNEL_ID                      CHAR (255)
          Terminated by ","
          Trim whitespace same as SQL Loader
        PROMO_ID                        CHAR (255)
          Terminated by ","
          Trim whitespace same as SQL Loader
        QUANTITY_SOLD                   CHAR (255)
          Terminated by ","
          Trim whitespace same as SQL Loader
        AMOUNT_SOLD                     CHAR (255)
          Terminated by ","
          Trim whitespace same as SQL Loader
    error processing column TIME_ID in row 1 for datafile /backup/oracle/feeds/salesfeed.dat
ORA-01843: not a valid month
I checked my NLS date format etc:
    select * from nls_session_parameters;
    PARAMETER                      VALUE
    NLS_LANGUAGE                   ENGLISH
    NLS_TERRITORY                  UNITED KINGDOM
    NLS_CURRENCY                   £
    NLS_ISO_CURRENCY               UNITED KINGDOM
    NLS_NUMERIC_CHARACTERS         .,
    NLS_CALENDAR                   GREGORIAN
    NLS_DATE_FORMAT                DD/MM/YYYY HH24:MI:SS
    NLS_DATE_LANGUAGE              ENGLISH
    NLS_SORT                       BINARY
    NLS_TIME_FORMAT                HH24.MI.SSXFF
    NLS_TIMESTAMP_FORMAT           DD-MON-RR HH24.MI.SSXFF
    NLS_TIME_TZ_FORMAT             HH24.MI.SSXFF TZR
    NLS_TIMESTAMP_TZ_FORMAT        DD-MON-RR HH24.MI.SSXFF TZR
    NLS_DUAL_CURRENCY              ¤
    NLS_COMP                       BINARY
    NLS_LENGTH_SEMANTICS           BYTE
NLS_NCHAR_CONV_EXCP            FALSE
Any ideas?
    Thanks,
    Mich
    Edited by: Mich Talebzadeh on 24-Nov-2011 04:19

    Thanks
    The following now works
CREATE TABLE salesfeed_external_table
(
          PROD_ID               NUMBER
        , CUST_ID               NUMBER
        , TIME_ID               DATE
        , CHANNEL_ID            NUMBER
        , PROMO_ID              NUMBER
        , QUANTITY_SOLD         NUMBER(10,2)
        , AMOUNT_SOLD           NUMBER(10,2)
)
ORGANIZATION EXTERNAL
(
        TYPE ORACLE_LOADER
        DEFAULT DIRECTORY sales_feeds
        ACCESS PARAMETERS
        (
                RECORDS DELIMITED BY newline
                FIELDS TERMINATED BY ','
                MISSING FIELD VALUES ARE NULL
                (         PROD_ID
                        , CUST_ID
                        , TIME_ID DATE 'dd/mm/yyyy HH24:MI:SS'
                        , CHANNEL_ID
                        , PROMO_ID
                        , QUANTITY_SOLD
                        , AMOUNT_SOLD
                )
        )
        LOCATION ('salesfeed.dat')
)
REJECT LIMIT 0;
    select * from salesfeed_external_table;
       PROD_ID    CUST_ID TIME_ID             CHANNEL_ID   PROMO_ID QUANTITY_SOLD AMOUNT_SOLD
   18       3131 21/03/1998 00:00:00          3        999             1     2600.76
Regards,
    Mich

  • Can BIP be logged in using BIEE user from external tables?

I've set up the BIEE Session Initialization Blocks using data from an external table in the database.
Now Oracle BI Interactive Dashboards can be logged in to using the user "sysadmin", which is stored in the external table.
User "sysadmin" also gets its group information, "XMLP_ADMIN", from the external table.
But BIP can't be logged in to; it fails with an error like the following:
    PS: xmlp-server-config.xml
    <property name="COMPRESS_REPORT_DATA" value="false"/>
    <property name="BI_SERVER_SECURITY_URL" value="jdbc:oraclebi://10.100.100.69:9703/"/>
    <property name="SAW_SESSION_TIMEOUT" value="90"/>
    <property name="BI_SERVER_SECURITY_DRIVER" value="oracle.bi.jdbc.AnaJdbcDriver"/>
    <property name="SAW_PORT" value="9704"/>
    <property name="SAW_PROTOCOL" value="http"/>
    <property name="COMPRESS_REPORT_OUTPUT" value="false"/>
    <property name="SAW_URL_SUFFIX" value="analytics/saw.dll"/>
    <property name="SECURITY_MODEL" value="BI_SERVER"/>
    2009-06-28 17:32:50.484 NOTIFICATION connect to NQSSECONDARYCCS=;PORT=;SSLKEYSTO
    REPASSWORD=***;PRIMARYCCS=;USER=sysadmin;PRIMARYCCSPORT=;TRUSTANYSERVER=;LOGFILE
    PATH=C:\DOCUME~1\ADMINI~1\LOCALS~1\Temp\3\;SECONDARYCCSPORT=;TRUSTSTOREPASSWORD=
    ***;LOGLEVEL=;SSL=;HOST=10.100.100.69;CATALOG=;PASSWORD=***;
    2009-06-28 17:32:50.703 NOTIFICATION connect to NQSSECONDARYCCS=;PORT=;SSLKEYSTO
    REPASSWORD=***;PRIMARYCCS=;USER=Administrator;IMPERSONATE=sysadmin;PRIMARYCCSPOR
    T=;TRUSTANYSERVER=;LOGFILEPATH=C:\DOCUME~1\ADMINI~1\LOCALS~1\Temp\3\;SECONDARYCC
    SPORT=;TRUSTSTOREPASSWORD=***;LOGLEVEL=;SSL=;HOST=10.100.100.69;CATALOG=;PASSWOR
    D=***;
    2009-06-28 17:32:50.921 WARNING java.io.IOException: [nQSError: 43001] validate user sysadmin from St
    ar failed:invalid user/password
It's weird that a user can log in to Oracle BI Interactive Dashboards but can't log in to BIP while they use the same authentication.

That should work.
If you are able to log in to BIEE with the external user, and have configured the BIP security model as BI Server, then you should be able to log in.
Maybe restart the services and try again. This is not going to change anything by itself, but just in case you have modified any setting for which a restart may be required.

  • How to import an external table which exists in an export dump file

My export dump file has one external table. While importing it into my development instance, I am getting the error "ORA-00911: invalid character".
The original definition of the external table is given below:
    CREATE TABLE EXT_TABLE_EV02_PRICEMARTDATA
    EGORDERNUMBER VARCHAR2(255 BYTE),
    EGINVOICENUMBER VARCHAR2(255 BYTE),
    EGLINEITEMNUMBER VARCHAR2(255 BYTE),
    EGUID VARCHAR2(255 BYTE),
    EGBRAND VARCHAR2(255 BYTE),
    EGPRODUCTLINE VARCHAR2(255 BYTE),
    EGPRODUCTGROUP VARCHAR2(255 BYTE),
    EGPRODUCTSUBGROUP VARCHAR2(255 BYTE),
    EGMARKETCLASS VARCHAR2(255 BYTE),
    EGSKU VARCHAR2(255 BYTE),
    EGDISCOUNTGROUP VARCHAR2(255 BYTE),
    EGREGION VARCHAR2(255 BYTE),
    EGAREA VARCHAR2(255 BYTE),
    EGSALESREP VARCHAR2(255 BYTE),
    EGDISTRIBUTORCODE VARCHAR2(255 BYTE),
    EGDISTRIBUTOR VARCHAR2(255 BYTE),
    EGECMTIER VARCHAR2(255 BYTE),
    EGECM VARCHAR2(255 BYTE),
    EGSOLATIER VARCHAR2(255 BYTE),
    EGSOLA VARCHAR2(255 BYTE),
    EGTRANSACTIONTYPE VARCHAR2(255 BYTE),
    EGQUOTENUMBER VARCHAR2(255 BYTE),
    EGACCOUNTTYPE VARCHAR2(255 BYTE),
    EGFINANCIALENTITY VARCHAR2(255 BYTE),
    C25 VARCHAR2(255 BYTE),
    EGFINANCIALENTITYCODE VARCHAR2(255 BYTE),
    C27 VARCHAR2(255 BYTE),
    EGBUYINGGROUP VARCHAR2(255 BYTE),
    QTY NUMBER,
    EGTRXDATE DATE,
    EGLISTPRICE NUMBER,
    EGUOM NUMBER,
    EGUNITLISTPRICE NUMBER,
    EGMULTIPLIER NUMBER,
    EGUNITDISCOUNT NUMBER,
    EGCUSTOMERNETPRICE NUMBER,
    EGFREIGHTOUTBOUNDCHARGES NUMBER,
    EGMINIMUMORDERCHARGES NUMBER,
    EGRESTOCKINGCHARGES NUMBER,
    EGINVOICEPRICE NUMBER,
    EGCOMMISSIONS NUMBER,
    EGCASHDISCOUNTS NUMBER,
    EGBUYINGGROUPREBATES NUMBER,
    EGINCENTIVEREBATES NUMBER,
    EGRETURNS NUMBER,
    EGOTHERCREDITS NUMBER,
    EGCOOP NUMBER,
    EGPOCKETPRICE NUMBER,
    EGFREIGHTCOSTS NUMBER,
    EGJOURNALBILLINGCOSTS NUMBER,
    EGMINIMUMORDERCOSTS NUMBER,
    EGORDERENTRYCOSTS NUMBER,
    EGRESTOCKINGCOSTSWAREHOUSE NUMBER,
    EGRETURNSCOSTADMIN NUMBER,
    EGMATERIALCOSTS NUMBER,
    EGLABORCOSTS NUMBER,
    EGOVERHEADCOSTS NUMBER,
    EGPRICEADMINISTRATIONCOSTS NUMBER,
    EGSHORTPAYMENTCOSTS NUMBER,
    EGTERMCOSTS NUMBER,
    EGPOCKETMARGIN NUMBER,
    EGPOCKETMARGINGP NUMBER,
    EGWEIGHTEDAVEMULTIPLIER NUMBER
    ORGANIZATION EXTERNAL
    ( TYPE ORACLE_LOADER
    DEFAULT DIRECTORY EV02_PRICEMARTDATA_CSV_CON
    ACCESS PARAMETERS
    LOCATION (EV02_PRICEMARTDATA_CSV_CON:'VPA.csv')
    REJECT LIMIT UNLIMITED
    NOPARALLEL
    NOMONITORING;
While importing, when I looked at the log file, it is failing to create the external table with the error "ORA-00911: invalid character".
Can someone suggest how to import external tables?
Help with addressing this issue will be highly appreciated.
    Naveen

    Hi Srinath,
When I looked at the CREATE TABLE syntax of the external table in the import dump log file, it shows a few lines as below. I could not understand these special characters, and the CREATE TABLE definition is failing with an invalid character, viz. ORA-00911: invalid character
    ACCESS PARAMETERS
    LOCATION (EV02_PRICEMARTDATA_CSV_CON:'VPA.csv').
I also looked at the CREATE TABLE DDL from TOAD. It is the same as I mentioned earlier.
    Naveen
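If the import keeps rejecting that DDL, one workaround is to skip the external table during import and recreate it by hand, taking the definition from the source database rather than the dump log. A hedged sketch, run against the source system:
SET LONG 100000 PAGESIZE 0
SELECT DBMS_METADATA.GET_DDL('TABLE', 'EXT_TABLE_EV02_PRICEMARTDATA') FROM DUAL;
The generated statement can then be cleaned of whatever character the import log is choking on and executed manually on the target, provided the directory object EV02_PRICEMARTDATA_CSV_CON exists there as well.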

  • Error in Loading a clob via external table (Oracle 10.2.0.3 On Solaris 8)

    Hi,
I am trying to load a csv file as a single CLOB record (one row for the entire csv file) via an external table, but am unable to do so. Following is the syntax I tried:
create table testext_tab2
  ( file_data clob )
  organization external
  (  TYPE ORACLE_LOADER
     DEFAULT DIRECTORY dir_n1
     access parameters
     (  RECORDS DELIMITED BY NEWLINE
        BADFILE DIR_N1:'lob_tab_%a_%p.bad'
        LOGFILE DIR_N1:'lob_tab_%a_%p.log'
        FIELDS TERMINATED BY ','
        MISSING FIELD VALUES ARE NULL
        ( clob_filename CHAR(100) )
        COLUMN TRANSFORMS  (file_data FROM LOBFILE (clob_filename) FROM (DIR_N1) CLOB)
     )
     LOCATION ('emp.txt')
  )
REJECT LIMIT UNLIMITED;
    --it gives the output that the table is created but the table does not have any rows (select count(*) from testext_tab2 gives 0 rows)
    -- and the logfile has entries like follows:
      Fields in Data Source:
        CLOB_FILENAME                   CHAR (100)
          Terminated by ","
          Trim whitespace same as SQL Loader
      Column Transformations
        FILE_DATA
            is set from a LOBFILE
                directory is from constant DIR_N1
                    directory object list is ignored
                file is from field CLOB_FILENAME
                file contains character data
                    in character set WE8ISO8859P1
    KUP-04001: error opening file /oracle/dba/dir_n1/7369
    KUP-04017: OS message: No such file or directory
    KUP-04065: error processing LOBFILE for field FILE_DATA
    KUP-04101: record 1 rejected in file /oracle/dba/dir_n1/emp.txt
    KUP-04001: error opening file /oracle/dba/dir_n1/7499
    KUP-04017: OS message: No such file or directory
    KUP-04065: error processing LOBFILE for field FILE_DATA
    KUP-04101: record 2 rejected in file /oracle/dba/dir_n1/emp.txt
    KUP-04001: error opening file /oracle/dba/dir_n1/7521
    KUP-04017: OS message: No such file or directory
    KUP-04065: error processing LOBFILE for field FILE_DATA
    KUP-04101: record 3 rejected in file /oracle/dba/dir_n1/emp.txt
    and also the file to be loaded (emp.txt) has data like this:
    7369,SMITH,CLERK,7902,12/17/1980,800,null,20
    7499,ALLEN,SALESMAN,7698,2/20/1981,1600,300,30
    7521,WARD,SALESMAN,7698,2/22/1981,1250,500,30
    7566,JONES,MANAGER,7839,4/2/1981,2975,null,20
    7654,MARTIN,SALESMAN,7698,9/28/1981,1250,1400,30
    7698,BLAKE,MANAGER,7839,5/1/1981,2850,null,30
    7782,CLARK,MANAGER,7839,6/9/1981,2450,null,10
    7788,SCOTT,ANALYST,7566,12/9/1982,3000,null,20
    7839,KING,PRESIDENT,null,11/17/1981,5000,null,10
    7844,TURNER,SALESMAN,7698,9/8/1981,1500,0,30
    7876,ADAMS,CLERK,7788,1/12/1983,1100,null,20
    7900,JAMES,CLERK,7698,12/3/1981,950,null,30
    7902,FORD,ANALYST,7566,12/3/1981,3000,null,20
7934,MILLER,CLERK,7782,1/23/1982,1300,null,10
I will be thankful for help on this. Also I read on the asktom site (http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:1669379500346411993)
that LOBs are not supported for external tables, but there are other sites with examples of a CLOB being loaded by an external table.
    With regards,
    Orausern

    CMcM wrote:
    Hi all
    We have an application that runs fine on 10.2.0.4 on most platforms, but a customer has reported an error when running 10.2.0.3 on HP. We have since reproduced the error on 10.2.0.3 on XP but have failed to reproduce it on Solaris or Linux.
The exact error is within a set of procedures, but the simplest reproducible form of the error is pasted in below.
Except that you haven't pasted output to show us what the actual error is. Are we supposed to guess?
    SQL> select * from v$version;
    BANNER
    Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Prod
    PL/SQL Release 10.2.0.1.0 - Production
    CORE    10.2.0.1.0      Production
    TNS for 32-bit Windows: Version 10.2.0.1.0 - Production
    NLSRTL Version 10.2.0.1.0 - Production
    SQL> ed
    Wrote file afiedt.buf
      1  declare
      2    vstrg clob:= 'A';
      3    Thisstrg varchar2(32000);
      4  begin
      5    for i in 1..31999 loop
      6      vstrg := vstrg||'A';
      7    end loop;
      8    ThisStrg := vStrg;
      9* end;
    SQL> /
    PL/SQL procedure successfully completed.
SQL>
Works ok for me on 10.2.0.1 (Windows 2003 server)
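Going back to the original question: with COLUMN TRANSFORMS ... FROM LOBFILE, the access driver treats the value of clob_filename in each record as the name of a file to open, which is why the log shows it trying to open /oracle/dba/dir_n1/7369, 7499 and so on (the first column of emp.txt). To load the whole csv as a single CLOB row, one approach is a small driver file containing just the csv file name. A hedged sketch, assuming a file driver.txt that holds the single line emp.txt:
CREATE TABLE testext_tab3
( file_data CLOB )
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY dir_n1
  ACCESS PARAMETERS
  ( RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
    ( clob_filename CHAR(100) )
    COLUMN TRANSFORMS (file_data FROM LOBFILE (clob_filename) FROM (DIR_N1) CLOB)
  )
  LOCATION ('driver.txt')
)
REJECT LIMIT UNLIMITED;
Each line of driver.txt then produces one row, with the named file loaded in full into file_data.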

  • Error when loading from External Tables in OWB 11g

    Hi,
    I face a strange problem while loading data from flat file into the External Tables.
    ORA-12899: value too large for column EXPIRED (actual: 4, maximum: 1)
    error processing column EXPIRED in row 9680 for datafile <data file location>/filename.dat
Out of a total of 9771 records, nearly 70 are rejected due to the above-mentioned error. The column (EXPIRED) for which the error is being reported never has a value longer than 1 character at all. I suspect it is a different problem.
Example: one such record that got rejected is as follows:
C|234|Littérature commentée|*N*|2354|123
Highlighted in bold is the EXPIRED column.
When I tried to insert this record into the external table using the UTL_FILE utility, it got loaded successfully. But when I try with the file already existing in the file directory, it again fails with the above error. I would also like to mention that the records which have been loaded are not all OK; please have a look at the DESCRIPTION column, which is highlighted. The original information in the data file looks like:
    C|325|*Revue Générale*|N|2445|132
    In the External Table the Description Value is replaced by the inverted '?' as follows:
    Reue G¿rale
    Please help.
    Thanks,
    JL.

    user1130292 wrote:
    Hi,
    I face a strange problem while loading data from flat file into the External Tables.
    ORA-12899: value too large for column EXPIRED (actual: 4, maximum: 1)
    error processing column EXPIRED in row 9680 for datafile <data file location>/filename.dat
    In a total of 9771 records nearly 70 records are rejected due to the above mentioned error. The column (EXPIRED) where the error being reported doesn't have value greater than 1 at all. I suspect it to be a different problem.
    Example: One such record that got rejected is as follows:
    C|234|Littérature commentée|*N*|2354|123
    highlightened in Bold is the EXPIRED Column.
    When I tried to insert this record into the External Table using UTL_FILE Utility it got loaded successfully. But when I try with the file already existing in the file directory it again fails with the above error, and I would like to mention that all the records which have been loaded are not Ok, please have a look at the DESCRIPTION Column which is highlightened. The original information in the data file looks like:
    C|325|*Revue Générale*|N|2445|132
    In the External Table the Description Value is replaced by the inverted '?' as follows:
    Reue G¿rale
    Please help.
    Thanks,
JL.
Sorry, couldn't see the highlighted text. Could you please enclose it in code tags?
Also post the table definition with attributes. BTW, what is your NLS_LANGUAGE set to?
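Given that the rejected and mangled rows all contain accented characters (Littérature, Générale) and the loaded data shows ¿ replacement characters, a character-set mismatch between the flat file and the external table is worth ruling out before anything else. Two hedged things to try once the file's real encoding has been confirmed:
-- tell the access driver what encoding the flat file actually uses (AL32UTF8 here is only an example)
ACCESS PARAMETERS
( RECORDS DELIMITED BY NEWLINE CHARACTERSET AL32UTF8
  FIELDS TERMINATED BY '|'
)
-- and/or declare the column with character rather than byte semantics
EXPIRED VARCHAR2(1 CHAR)
With byte semantics and a multi-byte file, a field can measure longer in bytes than it looks in characters, which is one way a field that looks like a single character can be reported as longer than 1.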

  • Error in accessing data through external table

    Hi,
    I am getting the following error while trying to view data through external table,
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-04062: no data source specified
    ORA-06512: at "SYS.ORACLE_LOADER", line 19
According to this error, "KUP-04062: no data source specified", a location should be specified, and I have specified the location as
LOCATION (<directory_name>:' '),
but it still gives the same error.
    Please Help!

I also got the same error when retrieving data from an external table.
I resolved the problem like this:
1. I imported the flat file (.csv) into the repository.
2. Mapped the flat file fields to the database table columns.
3. Deployed the mapping using the Control Center and ran the mapping.
4. If it fails there, I loaded the flat file data from the command prompt using "sqlldr", with the mapping's control file.
5. Then it loaded the data into the database table successfully.
    Thanks.
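For reference, KUP-04062 means the access driver was given no usable data file, and LOCATION (<directory_name>:' ') with a blank file name falls into exactly that case. The LOCATION clause has to name a real file, for example (names are placeholders):
LOCATION (my_dir:'mydata.csv')
-- or, if the file sits in the DEFAULT DIRECTORY:
LOCATION ('mydata.csv')
After correcting the clause, querying the table again and checking the generated log file in the same directory will show whether the driver can now open the file.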

  • Error: loading from external table to database table

    hi,
The following error comes while inserting data from the external table into the new_ext table.
What might be the reason?
The error and the external table creation script are as follows.
    regards,
    Avinash
    ERROR at line 1:
    ORA-29913: error in executing ODCIEXTTABLEFETCH callout
    ORA-29400: data cartridge error
    KUP-04020: found record longer than buffer size supported, 524288, in
    /home/oracle/rgtr1.txt
    ORA-06512: at " SYS.ORACLE_LOADER", line 14
    ORA-06512: at line 1
    ORA-06512: at "PRODNDBA.PROC_RGTR1", line 513
    ORA-06512: at line 1
    external table script is as follows:
    DROP
    TABLE EXT_TABLE_RGTR1 CASCADE CONSTRAINTS;
    CREATE
    TABLE EXT_TABLE_RGTR1
    ID
    VARCHAR2(10 BYTE),
    PC
    VARCHAR2(2 BYTE),
    BU
    VARCHAR2(4 BYTE),
    CONSUMER_NO
    VARCHAR2(12 BYTE),
    CUR_READING1
    NUMBER,
    CUR_READING2
    NUMBER,
    ADJ_CONS1
    NUMBER,
    ADJ_CONS2
    NUMBER,
    TOT_EC
    NUMBER,
    TOT_FCA
    NUMBER,
    TOT_ED
    NUMBER,
    ADDLESS_AMT
    NUMBER,
    NETBILL_AMT
    NUMBER,
    ADJ_UNIT1
    NUMBER,
    ADJ_UNIT2
    NUMBER,
    TARIFF_CODE
    NUMBER,
    DUTY_CODE
    NUMBER,
    DISCONN_TAG
    VARCHAR2(1 BYTE),
    MIN_CHRG_IND
    VARCHAR2(1 BYTE),
    OC_CODE1
    VARCHAR2(1 BYTE),
    OC_CODE2
    VARCHAR2(1 BYTE),
    OC_AMT1
    NUMBER,
    OC_AMT2
    NUMBER,
    PREV_DPC
    NUMBER,
    CREDIT_AVG_BILL
    NUMBER,
    ADJ_TYPES
    NUMBER,
    ADJ_EC_FCA
    NUMBER,
    ADJ_ED
    NUMBER,
    ADJ_AMT1
    NUMBER,
    ADJ_AMT2
    NUMBER,
    ADJ_AMT3
    NUMBER,
    ADJ_AMT4
    NUMBER,
    ADJ_AMT5
    NUMBER,
    ADJ_AMT6
    NUMBER,
    ADJ_AMT7
    NUMBER,
    ADJ_AMT8
    NUMBER,
    MR_CYCLE
    VARCHAR2(2 BYTE),
    NOTICE
    VARCHAR2(2 BYTE),
    INSTAL_IND
    VARCHAR2(1 BYTE),
    LAST_RCPT_DT
    VARCHAR2(6 BYTE),
    MS1
    NUMBER,
    MS2
    NUMBER,
    ARR_PL_CR
    NUMBER,
    PREV_READ1
    NUMBER,
    PREV_READ2
    NUMBER,
    OLD_CONS_NUM
    VARCHAR2(16 BYTE),
    MTR_CODE1
    VARCHAR2(2 BYTE),
    MTR_CODE2
    VARCHAR2(2 BYTE),
    ABR_EC1
    NUMBER,
    ABR_FCA1
    NUMBER,
    REV_CAT
    VARCHAR2(2 BYTE),
    BILL_DT
    DATE,
    SD_PAID
    NUMBER,
    SD_ARRS
    NUMBER,
    SD_ADDL
    NUMBER,
    PR_RDNG_IND1
    VARCHAR2(1 BYTE),
    PR_RDNG_IND2
    VARCHAR2(1 BYTE),
    NAB_IND
    VARCHAR2(1 BYTE),
    CONN_LOAD
    NUMBER,
    OLDEST_OS_DT
    VARCHAR2(4 BYTE),
    ABR_RECAMT1
    NUMBER,
    METER_RENT
    NUMBER,
    EX_DUTY
    NUMBER,
    TDL_CHARGES
    NUMBER,
    ABR_MTH
    VARCHAR2(4 BYTE),
    READING_IND
    VARCHAR2(1 BYTE),
    FB_IND
    VARCHAR2(1 BYTE),
    ADJ_AMT
    NUMBER,
    MTR_NUM1
    NUMBER,
    MTR_NUM2
    NUMBER,
    AVG_UNITS1
    NUMBER,
    AVG_UNITS2
    NUMBER,
    MIN_BILL_AMT
    NUMBER,
    METER_COST_DEMANDED
    NUMBER,
    MF1
    VARCHAR2(2 BYTE),
    MF2
    VARCHAR2(2 BYTE),
    MR
    VARCHAR2(2 BYTE),
    ROUTE
    VARCHAR2(4 BYTE),
    SEQ
    VARCHAR2(4 BYTE),
    DISC_IND
    VARCHAR2(1 BYTE),
    FREEZE_CODE
    VARCHAR2(1 BYTE),
    MTH40
    NUMBER,
    CONCES_EC
    NUMBER,
    METER_COST_ARREARS
    NUMBER,
    METER_COST_PAID
    NUMBER,
    SANCTION_LOAD
    NUMBER,
    PENALTY_ON_CL
    NUMBER,
    CL_SLB
    NUMBER,
    UNTSLB
    NUMBER,
    CAPACIT_PENALTY
    NUMBER,
    ARREARS_OF_INT
    NUMBER,
    INT_ON_ARREARS
    NUMBER,
    FIXED_CHARGES
    NUMBER,
    ASSESSED_DPC
    NUMBER,
    FCA_UNT_OLD
    NUMBER,
    FCA_UNT_NEW
    NUMBER,
    DEFECT_RDN1
    NUMBER,
    DEFECT_RDN2
    NUMBER,
    CAP_CT_RENT
    NUMBER,
    CT_RENT
    NUMBER,
    ADV_BILL_IND
    VARCHAR2(1 BYTE),
    ADV_BILL_RD_IND
    VARCHAR2(1 BYTE),
    INST_IND
    VARCHAR2(1 BYTE),
    CONSCUR_STATUS
    VARCHAR2(1 BYTE),
    MTR_COST1
    NUMBER,
    MTR_COST2
    NUMBER,
    MTR_COST3
    NUMBER,
    LOOM_NOS
    NUMBER,
    CONS_OLD_TARIFF
    NUMBER,
    CONS_NEW_TARIFF
    NUMBER,
    PD_TD_DATE
    VARCHAR2(6 BYTE),
    ED5
    VARCHAR2(10 BYTE),
    ED6
    VARCHAR2(10 BYTE),
    ED8
    VARCHAR2(10 BYTE),
    B66_AMT1
    NUMBER,
    BILL_MTH
    VARCHAR2(6 BYTE),
    STD_MTH
    NUMBER,
    PREV_RD_MTH
    VARCHAR2(4 BYTE),
    ACCT_HD_TRF
    VARCHAR2(2 BYTE),
    ADM_CODE
    NUMBER,
    BILL_DUE_DATE
    DATE,
    EC_DUTY
    NUMBER,
    ABNOR_IND
    VARCHAR2(1 BYTE),
    TRF04_CONN_LOAD
    NUMBER,
    PROCESS_DT
    DATE,
    SUPPLY_DATE
    VARCHAR2(6 BYTE),
    BILL_DT_FROM
    DATE,
    RECEIPT_AMT
    NUMBER,
    LOCKED_CR_UNITS
    NUMBER,
    LOCKED_CHARGES
    NUMBER,
    LOCKED_ED
    NUMBER,
    LOCKED_FCA
    NUMBER,
    LOCKED_FC
    NUMBER,
    LOCKED_RLCHARGES
    NUMBER,
    LOCKED_CL
    NUMBER,
    LOCKED_ECEDFIX
    NUMBER,
    LOCKED_ACC_MONTH
    NUMBER,
    AVG_OLD_CONSMP
    NUMBER(6),
    AVG_SINCE
    VARCHAR2(4 BYTE),
    AVG_OCCURANCES
    NUMBER(2),
    AVG_MA_IND
    VARCHAR2(1 BYTE),
    MTR1_PHASE
    NUMBER(1),
    MTR_PHASE1
    VARCHAR2(1 BYTE),
    MTR2_PHASE
    NUMBER(1),
    IND_CAT
    VARCHAR2(1 BYTE),
    L61_TARIFF
    NUMBER(2),
    MTR_CHG_AMT1
    NUMBER(12),
    NO_OF_MONTHS
    NUMBER,
    MTR_CHG_AMT2
    NUMBER(12),
    B66_AMT
    NUMBER(10),
    ADJ_AMT11
    NUMBER,
    B66_ADJ_TYP
    VARCHAR2(1 BYTE),
    B60_AMT
    NUMBER(10),
    ADJ_AMT13
    NUMBER,
    B60_ADJ_TYP
    VARCHAR2(1 BYTE),
    CPF_NO
    NUMBER(8),
    ADJ1
    NUMBER(9),
    ADJ2
    NUMBER(9),
    ADJ3
    NUMBER(9),
    ADJ4
    NUMBER(9),
    ADJ5
    NUMBER(9),
    ADJ6
    NUMBER(9),
    ADJ7
    NUMBER(9),
    ADJ8
    NUMBER(9),
    ADJ11
    NUMBER(9),
    ADJ13
    NUMBER(9),
    DTC_CODE
    VARCHAR2(7 BYTE),
    TITLE
    VARCHAR2(8 BYTE),
    NAME
    VARCHAR2(36 BYTE),
    ADDRESS1
    VARCHAR2(26 BYTE),
    ADDRESS2
    VARCHAR2(26 BYTE),
    VILLAGE
    VARCHAR2(14 BYTE),
    PIN
    VARCHAR2(6 BYTE),
    LOCK_SINCE
    VARCHAR2(4 BYTE),
    FAULTY_SINCE
    VARCHAR2(4 BYTE),
    MTR_DIGIT1
    VARCHAR2(1 BYTE),
    MTR_DIGIT2
    VARCHAR2(1 BYTE),
    FCA_PAISE
    NUMBER(9),
    GTY_EXP_DT
    VARCHAR2(6 BYTE),
    REJ_READ_MTR1
    NUMBER(6),
    REJ_READ_MTR2
    NUMBER(6),
    REJ_ADJ_UNITS1
    NUMBER(6),
    REJ_CONSMP
    NUMBER(6),
    OWNER_CODE
    NUMBER(1),
    MTR_CAP_AMP
    NUMBER(1),
    MTR_A
    VARCHAR2(1 BYTE),
    MTR_C
    VARCHAR2(1 BYTE),
    MTR_D
    VARCHAR2(1 BYTE),
    MTR_E
    VARCHAR2(1 BYTE),
    CUT_OF_DT
    DATE,
    POLE
    VARCHAR2(6 BYTE),
    EDRATE
    NUMBER(9),
    AREA
    VARCHAR2(1 BYTE),
    MTR_BRAND1
    VARCHAR2(2 BYTE),
    MTR_BRAND2
    VARCHAR2(2 BYTE),
    LC1
    NUMBER(3),
    POWER_CUT_IND
    VARCHAR2(1 BYTE),
    SERVICE_DT
    DATE,
    ADJ_UNIT11
    NUMBER,
    ADJ_UNIT22
    NUMBER,
    RECEIPT_AMT1
    NUMBER,
    MTR1
    NUMBER,
    STRK_BLL_AMT
    NUMBER,
    DUTY_UNTS1
    NUMBER,
    DUTY_AMT1
    NUMBER,
    DUTY_UNTS2
    NUMBER,
    DUTY_AMT2
    NUMBER,
    DUTY_UNTS3
    NUMBER,
    DUTY_AMT3
    NUMBER,
    ADJ_UNIT_45
    NUMBER,
    ADV_BIL_EC
    NUMBER,
    NEW_DUTY_RATE
    NUMBER,
    NEW_DUTY_UNTS
    NUMBER,
    NEW_DUTY_AMT
    NUMBER,
    DPC_PAB
    NUMBER,
    MS9_SINCE
    VARCHAR2(4 BYTE),
    DISTRICT_CODE
    NUMBER(2),
    TAX_ON_SALE
    NUMBER,
    LOCKED_CHARGES_TSE
    NUMBER
    ORGANIZATION
    EXTERNAL
    ( TYPE ORACLE_LOADER
    DEFAULT DIRECTORY EXT_DIR
    ACCESS PARAMETERS
    ( records delimited by newline
    badfile
    'ext.bad'
    logfile 'ext.log'
    fields
    ID POSITION
    (1-4),
    PC POSITION
    (7-7),
    BU POSITION
    (8-11),
    CONSUMER_NO POSITION
    (18-29 ),
    CUR_READING1 POSITION
    (33-41 ),
    CUR_READING2 POSITION
    (42-50 ),
    ADJ_CONS1 POSITION
    (51-59 ),
    ADJ_CONS2 POSITION
    (60-68 ),
    TOT_EC POSITION
    (69-86),
    TOT_FCA POSITION
    (87-104 ),
    TOT_ED POSITION
    (105-122 ),
    ADDLESS_AMT POSITION
    (123-140 ),
    NETBILL_AMT POSITION
    (141-158 ),
    ADJ_UNIT1 POSITION
    (159-167 ),
    ADJ_UNIT2 POSITION
    (168-176 ),
    TARIFF_CODE POSITION
    (177-178 ),
    DUTY_CODE POSITION
    (179-180 ),
    DISCONN_TAG POSITION
    (181-181 ),
    MIN_CHRG_IND POSITION
    (182-182 ),
    OC_CODE1 POSITION
    (183-183 ),
    OC_CODE2 POSITION
    (184-184 ),
    OC_AMT1 POSITION
    (185-202 ),
    OC_AMT2 POSITION
    (203-220 ),
    PREV_DPC POSITION
    (221-238 ),
    CREDIT_AVG_BILL POSITION
    (239-256 ),
    ADJ_TYPES POSITION
    (257-264 ),
    ADJ_EC_FCA POSITION
    (265-282 ),
    ADJ_ED POSITION
    (283-300 ),
    ADJ_AMT1 POSITION
    (301-318 ),
    ADJ_AMT2 POSITION
    (319-336 ),
    ADJ_AMT3 POSITION
    (337-354 ),
    ADJ_AMT4 POSITION
    (355-372 ),
    ADJ_AMT5 POSITION
    (373-390 ),
    ADJ_AMT6 POSITION
    (391-408 ),
    ADJ_AMT7 POSITION
    (409-426 ),
    ADJ_AMT8 POSITION
    (427-444 ),
    MR_CYCLE POSITION
    (445-446 ),
    NOTICE POSITION
    (447-448 ),
    INSTAL_IND POSITION
    (449-449 ),
    LAST_RCPT_DT POSITION
    (453-458 ),
    MS1 POSITION
    (459-459),
    MS2 POSITION
    (460-460),
    ARR_PL_CR POSITION
    (461-478 ),
    PREV_READ1 POSITION
    (479-487 ),
    PREV_READ2 POSITION
    (488-496 ),
    OLD_CONS_NUM POSITION
    (497-512 ),
    MTR_CODE1 POSITION
    (513-514 ),
    MTR_CODE2 POSITION
    (515-516 ),
    ABR_EC1 POSITION
    (517-534 ),
    ABR_FCA1 POSITION
    (535-552 ),
    REV_CAT POSITION
    (553-554 ),
    BILL_DT POSITION
    (555-560 ) DATE "DDMMYY",
    SD_PAID POSITION
    (561-569 ),
    SD_ARRS POSITION
    (570-578 ),
    SD_ADDL POSITION
    (579-587 ),
    PR_RDNG_IND1 POSITION
    (588-588 ),
    PR_RDNG_IND2 POSITION
    (589-589 ),
    NAB_IND POSITION
    (590-590 ),
    CONN_LOAD POSITION
    (592-600 ),
    OLDEST_OS_DT POSITION
    (601-604 ),
    ABR_RECAMT1 POSITION
    (605-622 ),
    METER_RENT POSITION
    (623-631 ),
    EX_DUTY POSITION
    (632-649 ),
    TDL_CHARGES POSITION
    (632-649 ),
    ABR_MTH POSITION
    (650-651 ),
    READING_IND POSITION
    (652-652 ),
    FB_IND POSITION
    (653-653 ),
    ADJ_AMT POSITION
    (654-671 ),
    MTR_NUM1 POSITION
    (672-681 ),
    MTR_NUM2 POSITION
    (682-691 ),
    AVG_UNITS1 POSITION
    (692-700 ),
    AVG_UNITS2 POSITION
    (701-709 ),
    MIN_BILL_AMT POSITION
    (710-727 ),
    METER_COST_DEMANDED POSITION
    (710-727 ),
    MF1 POSITION
    (728-729),
    MF2 POSITION
    (730-731),
    MR POSITION
    (732-733),
    ROUTE POSITION
    (734-737),
    SEQ POSITION
    (738-741),
    DISC_IND POSITION
    (742-742 ),
    FREEZE_CODE POSITION
    (743-743 ),
    MTH40 POSITION
    (744-752),
    CONCES_EC POSITION
    (753-770 ),
    METER_COST_ARREARS POSITION
    (753-770 ),
    METER_COST_PAID POSITION
    (771-788 ),
    SANCTION_LOAD POSITION
    (789-797 ),
    PENALTY_ON_CL POSITION
    (798-815 ),
    CL_SLB POSITION
    (816-817 ),
    UNTSLB POSITION
    (818-819 ),
    CAPACIT_PENALTY POSITION
    (820-837 ),
    ARREARS_OF_INT POSITION
    (838-855 ),
    INT_ON_ARREARS POSITION
    (856-873 ),
    FIXED_CHARGES POSITION
    (874-891 ),
    ASSESSED_DPC POSITION
    (892-909 ),
    FCA_UNT_OLD POSITION
    (910-927 ),
    FCA_UNT_NEW POSITION
    (928-945 ),
    DEFECT_RDN1 POSITION
    (982-990 ),
    DEFECT_RDN2 POSITION
    (991-999 ),
    CAP_CT_RENT POSITION
    (1000-1008 ),
    CT_RENT POSITION
    (1009-1017 ),
    ADV_BILL_IND POSITION
    (1018-1018 )
    LOCATION
    (EXT_DIR:'rgtr1.txt')
    REJECT
    LIMIT UNLIMITED
    LOGGING
    NOCACHE
    NOPARALLEL;

    KUP-04020: found record longer than buffer size supported, number, in string
    Cause: a record in the data source was longer than the maximum data size supported. The number reported is the maximum supported size of a record.
    Action: none
Examine your bad file and log file for details of the rows that failed to insert.
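The "Action: none" above is a little unhelpful; in practice the usual remedy for KUP-04020 is to raise the access driver's read buffer with the READSIZE parameter so it is at least as large as the longest record in rgtr1.txt. A hedged sketch of the relevant part of the access parameters (the 10 MB figure is only an example):
ACCESS PARAMETERS
( records delimited by newline
  READSIZE 10485760
  badfile 'ext.bad'
  logfile 'ext.log'
  ...
)
That said, a record longer than half a megabyte in a fixed-position feed like this often means a record terminator went missing in the source file, so checking the file itself before enlarging the buffer is also worthwhile.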

  • Bad file is not created during the external table creation.

    Hello Experts,
I have created a script for an external table in an Oracle 10g DB. Everything is working fine except that it does not create the bad file, though it does create the log file. I can't figure out what the issue is, and because of it my shell script is failing and the entire program fails. I am attaching the table creation script, the shell script that refers to it, and the error. Kindly let me know if something is missing. Thanks in advance.
Table Creation Script:
create table RGIS_TCA_DATA_EXT
(
    guid VARCHAR2(250),
    badge VARCHAR2(250),
    scheduled_store_id VARCHAR2(250),
    parent_event_id VARCHAR2(250),
    event_id VARCHAR2(250),
    organization_number VARCHAR2(250),
    customer_number VARCHAR2(250),
    store_number VARCHAR2(250),
    inventory_date VARCHAR2(250),
    full_name VARCHAR2(250),
    punch_type VARCHAR2(250),
    punch_start_date_time VARCHAR2(250),
    punch_end_date_time VARCHAR2(250),
    event_meet_site_id VARCHAR2(250),
    vehicle_number VARCHAR2(250),
    vehicle_description VARCHAR2(250),
    vehicle_type VARCHAR2(250),
    is_owner VARCHAR2(250),
    driver_passenger VARCHAR2(250),
    mileage VARCHAR2(250),
    adder_code VARCHAR2(250),
    bonus_qualifier_code VARCHAR2(250),
    store_accuracy VARCHAR2(250),
    store_length VARCHAR2(250),
    badge_input_type VARCHAR2(250),
    source VARCHAR2(250),
    created_by VARCHAR2(250),
    created_date_time VARCHAR2(250),
    updated_by VARCHAR2(250),
    updated_date_time VARCHAR2(250),
    approver_badge_id VARCHAR2(250),
    approver_name VARCHAR2(250),
    orig_guid VARCHAR2(250),
edit_type VARCHAR2(250)
)
organization external
(
type ORACLE_LOADER
default directory ETIME_LOAD_DIR
access parameters
(
RECORDS DELIMITED BY NEWLINE
    BADFILE ETIME_LOAD_DIR:'tstlms.bad'
    LOGFILE ETIME_LOAD_DIR:'tstlms.log'
    READSIZE 1048576
    FIELDS TERMINATED BY '|'
    MISSING FIELD VALUES ARE NULL(
    GUID
    ,BADGE
    ,SCHEDULED_STORE_ID
    ,PARENT_EVENT_ID
    ,EVENT_ID
    ,ORGANIZATION_NUMBER
    ,CUSTOMER_NUMBER
    ,STORE_NUMBER
    ,INVENTORY_DATE char date_format date mask "YYYYMMDD HH24:MI:SS"
    ,FULL_NAME
    ,PUNCH_TYPE
    ,PUNCH_START_DATE_TIME char date_format date mask "YYYYMMDD HH24:MI:SS"
    ,PUNCH_END_DATE_TIME char date_format date mask "YYYYMMDD HH24:MI:SS"
    ,EVENT_MEET_SITE_ID
    ,VEHICLE_NUMBER
    ,VEHICLE_DESCRIPTION
    ,VEHICLE_TYPE
    ,IS_OWNER
    ,DRIVER_PASSENGER
    ,MILEAGE
    ,ADDER_CODE
    ,BONUS_QUALIFIER_CODE
    ,STORE_ACCURACY
    ,STORE_LENGTH
    ,BADGE_INPUT_TYPE
    ,SOURCE
    ,CREATED_BY
    ,CREATED_DATE_TIME char date_format date mask "YYYYMMDD HH24:MI:SS"
    ,UPDATED_BY
    ,UPDATED_DATE_TIME char date_format date mask "YYYYMMDD HH24:MI:SS"
    ,APPROVER_BADGE_ID
    ,APPROVER_NAME
    ,ORIG_GUID
,EDIT_TYPE
)
)
location (ETIME_LOAD_DIR:'tstlms.dat')
)
reject limit UNLIMITED;
Shell Script:
    version=1.0
    umask 000
    DATE=`date +%Y%m%d%H%M%S`
    TIME=`date +"%H%M%S"`
    SOURCE=`hostname`
    fcp_login=`echo $1|awk '{print $3}'|sed 's/"//g'|awk -F= '{print $2}'`
    fcp_reqid=`echo $1|awk '{print $2}'|sed 's/"//g'|awk -F= '{print $2}'`
    TXT1_PATH=/home/ac1/oracle/in/tsdata
    TXT2_PATH=/home/ac2/oracle/in/tsdata
    ARCH1_PATH=/home/ac1/oracle/in/tsdata
    ARCH2_PATH=/home/ac2/oracle/in/tsdata
    DEST_PATH=/home/custom/sched/in
    PROGLOG=/home/custom/sched/logs/rgis_tca_to_tlms_create.sh.log
    PROGNAME=`basename $0`
    PROGPATH=/home/custom/sched/scripts
    cd $TXT2_PATH
    FILELIST2="`ls -lrt tstlmsedits*.dat |awk '{print $9}'`"
    NO_OF_FILES2="`ls -lrt tstlmsedits*.dat |awk '{print $9}'|wc -l`"
>$DEST_PATH/tstlmsedits.dat
for i in $FILELIST2
    do
    cat $i >> $DEST_PATH/tstlmsedits.dat
    printf "\n" >> $DEST_PATH/tstlmsedits.dat
    mv $i $i.$DATE
    #mv $i $TXT2_PATH/test/.
    mv $i.$DATE $TXT2_PATH/test/.
    done
    if test $NO_OF_FILES2 -eq 0
    then
    echo " no tstlmsedits.dat file exists " >> $PROGLOG
    else
    echo "created dat file tstlmsedits.dat at $DATE" >> $PROGLOG
    echo "-------------------------------------------" >> $PROGLOG
    fi
    NO_OF_FILES1="`ls -lrt tstlms*.dat |awk '{print $9}'|wc -l`"
    FILELIST1="`ls -lrt tstlms*.dat |awk '{print $9}'`"
>$DEST_PATH/tstlms.dat
for i in $FILELIST1
    do
    cat $i >> $DEST_PATH/tstlms.dat
    printf "\n" >> $DEST_PATH/tstlms.dat
    mv $i $i.$DATE
    # mv $i $TXT2_PATH/test/.
    mv $i.$DATE $TXT2_PATH/test/.
    done
    if test $NO_OF_FILES1 -eq 0
    then
    echo " no tstlms.dat file exists " >> $PROGLOG
    else
    echo "created dat file tstlms.dat at $DATE" >> $PROGLOG
    fi
    cd $TXT1_PATH
    FILELIST3="`ls -lrt tstlmsedits*.dat |awk '{print $9}'`"
    NO_OF_FILES3="`ls -lrt tstlmsedits*.dat |awk '{print $9}'|wc -l`"
>$DEST_PATH/tstlmsedits.dat
for i in $FILELIST3
    do
    cat $i >> $DEST_PATH/tstlmsedits.dat
    printf "\n" >> $DEST_PATH/tstlmsedits.dat
    mv $i $i.$DATE
    #mv $i $TXT1_PATH/test/.
    mv $i.$DATE $TXT1_PATH/test/.
    done
    if test $NO_OF_FILES3 -eq 0
    then
    echo " no tstlmsedits.dat file exists " >> $PROGLOG
    else
    echo "created dat file tstlmsedits.dat at $DATE" >> $PROGLOG
    echo "-------------------------------------------" >> $PROGLOG
    fi
    NO_OF_FILES4="`ls -lrt tstlms*.dat |awk '{print $9}'|wc -l`"
    FILELIST4="`ls -lrt tstlms*.dat |awk '{print $9}'`"
>$DEST_PATH/tstlms.dat
for i in $FILELIST4
    do
    cat $i >> $DEST_PATH/tstlms.dat
    printf "\n" >> $DEST_PATH/tstlms.dat
    mv $i $i.$DATE
    # mv $i $TXT1_PATH/test/.
    mv $i.$DATE $TXT1_PATH/test/.
    done
    if test $NO_OF_FILES4 -eq 0
    then
    echo " no tstlms.dat file exists " >> $PROGLOG
    else
    echo "created dat file tstlms.dat at $DATE" >> $PROGLOG
    fi
    #connecting to oracle to generate bad files
    sqlplus -s $fcp_login<<EOF
    select count(*) from rgis_tca_data_ext;
    select count(*) from rgis_tca_data_history_ext;
    exit;
    EOF
    #counting the records in files
    tot_rec_in_tstlms=`wc -l $DEST_PATH/tstlms.dat | awk ' { print $1 } '`
    tot_rec_in_tstlmsedits=`wc -l $DEST_PATH/tstlmsedits.dat | awk ' { print $1 } '`
    tot_rec_in_tstlms_bad=`wc -l $DEST_PATH/tstlms.bad | awk ' { print $1 } '`
    tot_rec_in_tstlmsedits_bad=`wc -l $DEST_PATH/tstlmsedits.bad | awk ' { print $1 } '`
    #updating log table
    echo "pl/sql block started"
    sqlplus -s $fcp_login<<EOF
    define tot_rec_in_tstlms     = '$tot_rec_in_tstlms';
    define tot_rec_in_tstlmsedits     = '$tot_rec_in_tstlmsedits';
    define tot_rec_in_tstlms_bad     = '$tot_rec_in_tstlms_bad';
    define tot_rec_in_tstlmsedits_bad='$tot_rec_in_tstlmsedits_bad';
    define fcp_reqid ='$fcp_reqid';
    declare
    l_tstlms_file_id number := null;
    l_tstlmsedits_file_id number := null;
    l_tot_rec_in_tstlms number := 0;
    l_tot_rec_in_tstlmsedits number := 0;
    l_tot_rec_in_tstlms_bad number := 0;
    l_tot_rec_in_tstlmsedits_bad number := 0;
    l_request_id fnd_concurrent_requests.request_id%type;
    l_start_date fnd_concurrent_requests.actual_start_date%type;
    l_end_date fnd_concurrent_requests.actual_completion_date%type;
    l_conc_prog_name fnd_concurrent_programs.concurrent_program_name%type;
    l_requested_by fnd_concurrent_requests.requested_by%type;
    l_requested_date fnd_concurrent_requests.request_date%type;
    begin
    --getting concurrent request details
    begin
    SELECT fcp.concurrent_program_name,
    fcr.request_id,
    fcr.actual_start_date,
    fcr.actual_completion_date,
    fcr.requested_by,
    fcr.request_date
    INTO l_conc_prog_name,
    l_request_id,
    l_start_date,
    l_end_date,
    l_requested_by,
    l_requested_date
    FROM fnd_concurrent_requests fcr, fnd_concurrent_programs fcp
    WHERE fcp.concurrent_program_id = fcr.concurrent_program_id
    AND fcr.request_id = &fcp_reqid; --fnd_global.conc_request_id();
    exception
    when no_data_found then
    fnd_file.put_line(fnd_file.log, 'Error:RGIS_TCA_TO_TLMS_CREATE.sh');
    fnd_file.put_line(fnd_file.log, 'No data found for request_id');
    fnd_file.put_line(fnd_file.log, sqlerrm);
    raise_application_error(-20001,
    'Error occured when executing RGIS_TCA_TO_TLMS_CREATE.sh ' ||
    sqlerrm);
    when others then
    fnd_file.put_line(fnd_file.log, 'Error:RGIS_TCA_TO_TLMS_CREATE.sh');
    fnd_file.put_line(fnd_file.log,
    'Error occured when retrieving request_id request_id');
    fnd_file.put_line(fnd_file.log, sqlerrm);
    raise_application_error(-20001,
    'Error occured when executing RGIS_TCA_TO_TLMS_CREATE.sh ' ||
    sqlerrm);
    end;
    --calling ins_or_upd_tca_process_log to update log table for tstlms.dat file
    begin
    rgis_tca_to_tlms_process.ins_or_upd_tca_process_log
                   (l_tstlms_file_id,
                   'tstlms.dat',
                   l_conc_prog_name,
                   l_request_id,
                   l_start_date,
                   l_end_date,
                   &tot_rec_in_tstlms,
                   &tot_rec_in_tstlms_bad,
                   null,
                   null,               
                   null,
                   null,
                   null,
                   null,
                   null,
                   l_requested_by,
                   l_requested_date,
                   null,
                   null,
                   null,
                   null,
                   null);
    exception
    when others then
    fnd_file.put_line(fnd_file.log, 'Error:RGIS_TCA_TO_TLMS_CREATE.sh');
    fnd_file.put_line(fnd_file.log,
    'Error occured when executing rgis_tca_to_tlms_process.ins_or_upd_tca_process_log for tstlms file');
    fnd_file.put_line(fnd_file.log, sqlerrm);
    end;
    --calling ins_or_upd_tca_process_log to update log table for tstlmsedits.dat file
    begin
    rgis_tca_to_tlms_process.ins_or_upd_tca_process_log
                   (l_tstlmsedits_file_id,
                   'tstlmsedits.dat',
                   l_conc_prog_name,
                   l_request_id,
                   l_start_date,
                   l_end_date,
                   &tot_rec_in_tstlmsedits,
                   &tot_rec_in_tstlmsedits_bad,
                   null,
                   null,               
                   null,
                   null,
                   null,
                   null,
                   null,
                   l_requested_by,
                   l_requested_date,
                   null,
                   null,
                   null,
                   null,
                   null);
    exception
    when others then
    fnd_file.put_line(fnd_file.log, 'Error:RGIS_TCA_TO_TLMS_CREATE.sh');
    fnd_file.put_line(fnd_file.log,
    'Error occured when executing rgis_tca_to_tlms_process.ins_or_upd_tca_process_log for tstlmsedits file');
    fnd_file.put_line(fnd_file.log, sqlerrm);
    end;
    end;
    exit;
    EOF
    echo "rgis_tca_to_tlms_process.sql started"
    sqlplus -s $fcp_login @$SCHED_TOP/sql/rgis_tca_to_tlms_process.sql $fcp_reqid
    exit;
    echo "rgis_tca_to_tlms_process.sql ended"
Error:
    RGIS Scheduling: Version : UNKNOWN
    Copyright (c) 1979, 1999, Oracle Corporation. All rights reserved.
    TCATLMS module: TCA To TLMS Import Process
    Current system time is 18-AUG-2011 06:13:27
    COUNT(*)
         16
    COUNT(*)
         25
    wc: cannot open /home/custom/sched/in/tstlms.bad
    wc: cannot open /home/custom/sched/in/tstlmsedits.bad
    pl/sql block started
    old 33:     AND fcr.request_id = &fcp_reqid; --fnd_global.conc_request_id();
    new 33:     AND fcr.request_id = 18661823; --fnd_global.conc_request_id();
    old 63:                &tot_rec_in_tstlms,
    new 63:                16,
    old 64:                &tot_rec_in_tstlms_bad,
    new 64:                ,
    old 97:                &tot_rec_in_tstlmsedits,
    new 97:                25,
    old 98:                &tot_rec_in_tstlmsedits_bad,
    new 98:                ,
    ERROR at line 64:
    ORA-06550: line 64, column 4:
    PLS-00103: Encountered the symbol "," when expecting one of the following:
    ( - + case mod new not null others <an identifier>
    <a double-quoted delimited-identifier> <a bind variable> avg
    count current exists max min prior sql stddev sum variance
    execute forall merge time timestamp interval date
    <a string literal with character set specification>
    <a number> <a single-quoted SQL string> pipe
    <an alternatively-quoted string literal with character set specification>
    <an alternatively-q
    ORA-06550: line 98, column 4:
    PLS-00103: Encountered the symbol "," when expecting one of the following:
    ( - + case mod new not null others <an identifier>
    <a double-quoted delimited-identifier> <a bind variable> avg
    count current exists max min prior sql st
    rgis_tca_to_tlms_process.sql started
    old 12: and concurrent_request_id = '&1';
    new 12: and concurrent_request_id = '18661823';
    old 18: and concurrent_request_id = '&1';
    new 18: and concurrent_request_id = '18661823';
    old 22: rgis_tca_to_tlms_process.run_tca_data(l_tstlms_file_id,&1);
    new 22: rgis_tca_to_tlms_process.run_tca_data(l_tstlms_file_id,18661823);
    old 33: rgis_tca_to_tlms_process.run_tca_data_history(l_tstlmsedits_file_id,&1);
    new 33: rgis_tca_to_tlms_process.run_tca_data_history(l_tstlmsedits_file_id,18661823);
    old 44: rgis_tca_to_tlms_process.send_tca_email('TCATLMS',&1);
    new 44: rgis_tca_to_tlms_process.send_tca_email('TCATLMS',18661823);
    declare
    ERROR at line 1:
    ORA-20001: Error occured when executing RGIS_TCA_TO_TLMS_PROCESS.sql ORA-01403:
    no data found
    ORA-06512: at line 59
    Executing request completion options...
    ------------- 1) PRINT   -------------
    Printing output file.
    Request ID : 18661823      
    Number of copies : 0      
    Printer : noprint
    Finished executing request completion options.
    Concurrent request completed successfully
    Current system time is 18-AUG-2011 06:13:29
    ---------------------------------------------------------------------------

    Hi,
Check the status of the batch in the SM35 transaction.
If the batch is locked by mistake or due to any other error, you can release it now and also process it again.
To release: Shift+F4.
You can also analyse the job status with the F2 button.
    Bye

  • External Table, Handling Delimited and Special Character in file

    Hi ,
I have created an external table with these options:
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY ***************************************
  ACCESS PARAMETERS
  ( RECORDS DELIMITED BY NEWLINE
    SKIP 0
    FIELDS TERMINATED BY '|'
    OPTIONALLY ENCLOSED BY '"'
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION
  ( 'test_feed.csv' )
)
Now the problem is that these come through as valid:
    anupam|anupam2
    anupam"test|anupam"test2
    "anupam|test3"|test3
    anupam""""test5|test5
    anupam"|test7
but these do not come through as valid:
"anupam"test4"|test4    --> Case where the field is enclosed in quotes but also has quotes inside it. I guess in this case we can send the field without the closing double quotes.
"anupam|test6   --> In case the field starts with double quotes, it fails.
"anupam"test8|test8"|test8 --> In case one field contains both a pipe (|) and double quotes, we send it enclosed in double quotes. But that fails the job.
Can you suggest the best way to handle such scenarios? (One restriction though: the file is also used by another system - Netezza - which can't take a delimiter longer than one character :'( )

One approach is to define the external table as a ONE-column table (with a single field covering the whole line). This way each line will come in as a row in the external table. Of course you then have to build "parsing logic" on top of that.
DROP TABLE xtern_table;
CREATE TABLE xtern_table
(
    c1 VARCHAR2(4000)
)
  organization external
  (
    type ORACLE_LOADER DEFAULT directory xtern_data_dir
    ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY '~'   ---- <<<<<<<< Use a field terminator as a character that is not found in the file
        MISSING FIELD VALUES ARE NULL
        ( c1 CHAR(4000) )
     )
     location ('mycsv.csv')
  );
    > desc xtern_table
    desc xtern_table
    Name Null Type          
    C1        VARCHAR2(4000)
    > column c1 format A40
    > select * from xtern_table
    C1                                    
    anupam|anupam2                          
    anupam"test|anupam"test2                
    "anupam|test3"|test3                    
    anupam""""test5|test5                   
    anupam"|test7                           
    "anupam"test4"|test4                    
    "anupam|test6                           
    "anupam"test8|test8"|test8              
    8 rows selected
Ideally, it would be good to have an incoming source file with a predictable format.
    Hope this helps.
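As a small example of the "parsing logic" mentioned above, the pipe-separated pieces can be pulled out of the single column with REGEXP_SUBSTR (a sketch; note this pattern skips empty fields, so it needs adjusting if empty fields are significant):
SELECT REGEXP_SUBSTR(c1, '[^|]+', 1, 1) AS field1,
       REGEXP_SUBSTR(c1, '[^|]+', 1, 2) AS field2
FROM   xtern_table;
The trickier decisions (whether a double quote encloses the field or belongs to the data) then become ordinary SQL or PL/SQL string handling on top of c1, instead of something the access driver has to guess from a one-character delimiter.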

  • Synonym problem with external table in materialized view

    I have a materialized view that includes selects on two external tables.
    However, no matter how I try to access the external tables, the creation fails with a "Synonym Translation Is No Longer Valid" (ORA-00980) error.
    This happens even when I replace the tablename with the full table name, including the link.
    I can create a view just fine, but if I then try something like CREATE MATERIALIZED VIEW mvw_my_view AS SELECT * FROM vw_my_view (where vw_my_view is the view in question), it still throws the Synonym Translation exception.

    00980, 00000, "synonym translation is no longer valid"
    // *Cause: A synonym did not translate to a legal target object. This
    //         could happen for one of the following reasons:
    //         1. The target schema does not exist.
    //         2. The target object does not exist.
    //         3. The synonym specifies an incorrect database link.
    //         4. The synonym is not versioned but specifies a versioned
    //            target object.
    // *Action: Change the synonym definition so that the synonym points at
//          a legal target object.
It is really, Really, REALLY difficult to fix a problem that cannot be seen.
    use COPY & PASTE so we can see what you do & how Oracle responds.
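As a starting point for that copy & paste, listing the synonyms involved and confirming that each target still resolves usually narrows it down quickly, for example:
SELECT synonym_name, table_owner, table_name, db_link
FROM   user_synonyms;
Any entry with a db_link deserves special attention, since per the cause list above a dropped or renamed remote object, or a broken link, surfaces as exactly this ORA-00980.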
