Problem while handling files using external tables

Hi,
I created an external table to handle files.
It's working fine.
But it rejects records where the final column in a record contains no (null) value.
The data is as follows.
empno|name|positon
184|abc|supervisor
185|xyz|
186|efg|clerk
I have given '\n' as the row delimiter and '|' as the field delimiter.
For the above data I am getting only the 1st and 3rd records.
The external table is not accepting the 2nd record.
In the log file it is specifying the error as "KUP-04023: field start is after end of record".
Could anyone please give me some idea how to solve this problem?
Thank you,
Regards,
Gowtham Sen.

Hi Michaels,
I have one more doubt.
After I added the clause "MISSING FIELD VALUES ARE NULL", it's working fine.
Case 1:
The data is as follows.
empno|name|positon
184|abc|supervisor
185|xyz|
186|efg|clerk
Now, when I query the external table, I get the following data:
empno name position
184 abc supervisor
185 xyz <null>
186 efg clerk
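For reference, here is a minimal sketch of the kind of definition I mean (the table name emp_ext, directory EMP_DIR and file name emp.dat are just placeholders, not my actual names):
CREATE TABLE emp_ext
  (empno    NUMBER,
   name     VARCHAR2(50),
   position VARCHAR2(50))
ORGANIZATION EXTERNAL
  (TYPE ORACLE_LOADER
   DEFAULT DIRECTORY emp_dir
   ACCESS PARAMETERS
     (RECORDS DELIMITED BY NEWLINE
      FIELDS TERMINATED BY '|'
      MISSING FIELD VALUES ARE NULL  -- trailing empty fields load as NULL instead of being rejected
      (empno, name, position))
   LOCATION ('emp.dat'))
REJECT LIMIT UNLIMITED;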
Case 2:
But there is a case where the data may come in the following way:
empno|name|positon
184|abc|supervisor
185|xyz|
186|efg|clerk
18
187
For this data, when I query the external table, I get the following:
empno name position
184 abc supervisor
185 xyz <null>
186 efg clerk
18 <null> <null>
187 <null> <null>
Here I would like the last two records to be rejected, because they violate the format. But it's not behaving as I expected.
Do I need to add any other clauses?
Thanks in advance,
Thank you,
Regards,
Gowtham Sen

Similar Messages

  • How to write data to text file using external tables

    Can anybody tell me how to write data to a text file using the external tables concept?

    Hi,
    Using an external table you can load the data into a local table in the database;
    then, using your local db table and the UTL_FILE package, you can write the data to a text file.
    external table
    ~~~~~~~~~~~
    http://download-east.oracle.com/docs/cd/B19306_01/server.102/b14200/statements_7002.htm#i2153251
    UTL_FILE
    ~~~~~~~~~
    http://download-east.oracle.com/docs/cd/B19306_01/appdev.102/b14258/u_file.htm#sthref14093
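    As a rough sketch of the UTL_FILE step (OUT_DIR, my_table, empno/ename and out.txt are placeholder names, not from the original question):
    DECLARE
      v_file UTL_FILE.FILE_TYPE;
    BEGIN
      -- OUT_DIR must be an Oracle directory object you have write access to
      v_file := UTL_FILE.FOPEN('OUT_DIR', 'out.txt', 'W');
      FOR r IN (SELECT empno, ename FROM my_table) LOOP
        UTL_FILE.PUT_LINE(v_file, r.empno || '|' || r.ename);
      END LOOP;
      UTL_FILE.FCLOSE(v_file);
    END;
    /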

  • How to read any file using external tables.

    Hi folks,
    I have written an application that reads a series of csv files using external tables which works fine as long as I specify each file name in the directory i.e.......
    CREATE TABLE gb_test
    (file_name varchar2(10),
     rec_date date,
     rec_name VARCHAR2(20),
     rec_age number)
    ORGANIZATION EXTERNAL
    (TYPE ORACLE_LOADER
     DEFAULT DIRECTORY GB_TEST
     ACCESS PARAMETERS
     (RECORDS DELIMITED BY NEWLINE
      FIELDS TERMINATED BY ',')
     LOCATION ('data1.csv','data2.csv','data3.csv','data4.csv'))
    PARALLEL 5
    REJECT LIMIT 20000;
    However, I have discovered that I may not know the names of the files to be processed prior to the program being run, so I just want to read any file regardless of its name (although it will always be a .csv file).
    Is there a way to avoid having to specify the files to be read in the LOCATION part of the syntax?
    Thanks in advance.
    Graham.

    Right, I have now completed this; however, it's currently only working as SYS as opposed to any user. Here are the details of the scenario and the steps required, in case any of you guys need this in the future ......
    The problem was I needed to search for csv files on my hard-drive. These files would be stored in a series of directories (a through to z), so I needed a way to read all 26 directories and process all files in these directories.
    The problem was, prior to running the program, the user would remove all the files in the directories and insert new ones, but it was never known how many he would decide to do each time.
    Solution: I created a table called stock_data_directories as follows ...
    create table stock_data_directories(sdd_rec_no number,
    sdd_table_name varchar2(50),
    sdd_directory_name varchar2(50),
    sdd_directory_path varchar2(100));
    Then inserted 26 records like ...
    insert into stock_data_directories(sdd_rec_no,sdd_table_name,sdd_directory_name,sdd_directory_path)
    values(1,'rawdata_a','KPOLLOCKA','C:\KPOLLOCK\A');
    insert into stock_data_directories(sdd_rec_no,sdd_table_name,sdd_directory_name,sdd_directory_path)
    values(2,'rawdata_b','KPOLLOCKB','C:\KPOLLOCK\B');
    etc...etc...
    Then created 26 DIRECTORIES E.G.
    CREATE OR REPLACE DIRECTORY KPOLLOCKA AS 'C:\KPOLLOCK\A';
    CREATE OR REPLACE DIRECTORY KPOLLOCKB AS 'C:\KPOLLOCK\B';
    Then created 26 external tables like the following ...
    CREATE TABLE rawdata_a
    (stock varchar2(1000),
     stock_date varchar2(10),
     stock_open VARCHAR2(20),
     stock_high varchar2(20),
     stock_low varchar2(20),
     stock_close VARCHAR2(30),
     stock_qty varchar2(20))
    ORGANIZATION EXTERNAL
    (TYPE ORACLE_LOADER
     DEFAULT DIRECTORY KPOLLOCKA
     ACCESS PARAMETERS
     (RECORDS DELIMITED BY NEWLINE
      FIELDS TERMINATED BY ',')
     LOCATION ('AA.csv'))
    PARALLEL 5
    REJECT LIMIT 20000;
    This basically says that the external table rawdata_a currently reads one file, AA.csv, from its directory.
    Then wrote a procedure as follows ...
    procedure p_process_files(pv_return_message OUT varchar2)is
    cursor c_get_stock_data_directories is
    select distinct sdd_directory_path,
    sdd_table_name
    from stock_data_directories
    order by sdd_table_name;
    vv_return_message varchar2(1000);
    begin
    -- here get the files for each directory
    for r_get_stock_directories in c_get_stock_data_directories loop
    p_build_external_table(r_get_stock_directories.sdd_directory_path,
         r_get_stock_directories.sdd_table_name,
         vv_return_message);
    end loop;
    end;
    then wrote a procedure called p_build_external_table as follows ...
    procedure p_build_external_table(pv_directory_path IN stock_data_directories.sdd_directory_path%type, -- e.g. 'C:\kpollock\A\
    pv_table_name IN stock_data_directories.sdd_table_name%type, -- e.g. rawdata_a
    pv_return_message OUT varchar2) is
    vv_pattern VARCHAR2(1024);
    ns VARCHAR2(1024);
    vv_file_name varchar2(4000);
    vv_start_string varchar2(1) := '''';
    vv_end_string varchar2(3) := ''',';
    vn_counter number := 0;
    vv_err varchar2(2000);
    BEGIN
    vv_pattern := pv_directory_path||'*';
    SYS.DBMS_BACKUP_RESTORE.searchFiles(vv_pattern, ns);
    FOR each_file IN (SELECT FNAME_KRBMSFT AS name FROM X$KRBMSFT) LOOP
    if each_file.name like '%.CSV' then
    vv_file_name := vv_file_name||vv_start_string||substr(each_file.name,instr(each_file.name,'\',1,3)+1)||vv_end_string;
         vn_counter := vn_counter + 1;
    end if;
    END LOOP;
    vv_file_name := substr(vv_file_name,1,length(vv_file_name)-1); -- remove final , from string
    execute immediate 'alter table '||pv_table_name||' location('||vv_file_name||')';
    pv_return_message := 'Successfully changed '||pv_table_name||' at '||pv_directory_path||' to now have '||to_char(vn_counter)||' directories';
    exception
    when others then
    vv_err := sqlerrm;
    pv_return_message := ' Error found updating directories. Error = '||vv_err;
    END;
    This reads every file name in the directory and appends it to a list, so if it finds A.csv and ABC.csv, then, using the dynamic SQL, it alters the location to read 'a.csv','abc.csv'.
    It ignores all other file extensions.
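    For illustration, if the loop found A.csv and ABC.csv in C:\KPOLLOCK\A, the dynamic statement executed above would resolve to something like:
    alter table rawdata_a location ('a.csv','abc.csv');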

  • Need to merge a csv file using external tables into a main table

    Hi,
    I have a csv file which contains the date (with time stamp), column1 (number), column2 (number), column3 (number). I am using the external tables concept to load the data from the csv into this external table and then merge it into the main table. The problem here is that the csv file is system generated and nothing in it can be edited. Our aim is to automate this process of loading data from the csv to the table. In this csv the date/time stamp is not in the proper format; the date is not visible and only minutes and seconds are visible. By changing the format in the csv manually this can be overcome, but we do not want any manual intervention.
    How can I overcome this problem? Please help me...
    The Excel data looks like this:
    (PDH-TSV 4.0) (India Standard Time)(-330)     \\DISAPPSER01\Processor(_Total)\% Privileged Time     \\DISAPPSER01\Processor(_Total)\% Processor Time     \\DISAPPSER01\Web Service(_Total)\Current Connections
    56:59.0               47
    57:09.0     0.72379582     4.204561281     46
    57:19.0     0.916548537     4.006179927     44
    57:29.0     0.663034771     3.674662541     43
    57:39.0     0.750789844     4.093933999     42
    57:49.0     0.721538487     2.650858026     40
    57:59.0     0.594781604     3.333393703     40

    Please format your sample data with column headers so that we can make sense of the values. Also, since only minutes and seconds are given, what date should be used for the records moved to the master table: sysdate, or will the date be passed as a parameter?
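    A hedged sketch of the merge step (the names perf_ext, perf_main, snap_time and col1..col3 are assumptions, and the time stamp is kept as a plain VARCHAR2 in the external table until its full format is known):
    MERGE INTO perf_main m
    USING (SELECT snap_time, col1, col2, col3
           FROM   perf_ext) e                -- external table over the csv file
    ON    (m.snap_time = e.snap_time)
    WHEN MATCHED THEN
      UPDATE SET m.col1 = e.col1, m.col2 = e.col2, m.col3 = e.col3
    WHEN NOT MATCHED THEN
      INSERT (snap_time, col1, col2, col3)
      VALUES (e.snap_time, e.col1, e.col2, e.col3);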

  • How to read specific lines from a text file using external table or any other method?

    Hi,
    I have a text file with delimited data, and I have to pick only the odd-numbered rows and load them into a table...
    Ex:
    row1:  1,2,2,3,3,34,4,4,4,5,5,5,,,5  ( have to load only this row)
    row2:   8,9,878,78,657,575,7,5,,,7,7
    Hope this is enough..
    I am using Oracle 11.2.0 version...
    Thanks

    There are various ways to do this.  I would be inclined to use SQL*Loader.  That way you can load it from the client or the server and you can use a SQL*Loader sequence to preserve the row order in the text file.  I would load the whole row as a varray into a staging table, then use the TABLE and MOD functions to load the individual numbers from only the odd rows.  Please see the demonstration below.
    SCOTT@orcl12c> HOST TYPE text_file.csv
    1,2,2,3,3,34,4,4,4,5,5,5,,,5
    8,9,878,78,657,575,7,5,,,7,7
    101,201
    102,202
    SCOTT@orcl12c> HOST TYPE test.ctl
    LOAD DATA
    INFILE text_file.csv
    INTO TABLE staging
    FIELDS TERMINATED BY ','
    TRAILING NULLCOLS
    (whole_row VARRAY TERMINATED BY '/n' (x INTEGER EXTERNAL),
    rn SEQUENCE)
    SCOTT@orcl12c> CREATE TABLE staging
      2    (rn         NUMBER,
      3     whole_row  SYS.OdciNumberList)
      4  /
    Table created.
    SCOTT@orcl12c> HOST SQLLDR scott/tiger CONTROL=test.ctl LOG=test.log
    SQL*Loader: Release 12.1.0.1.0 - Production on Tue Aug 27 13:48:37 2013
    Copyright (c) 1982, 2013, Oracle and/or its affiliates.  All rights reserved.
    Path used:      Conventional
    Commit point reached - logical record count 4
    Table STAGING:
      4 Rows successfully loaded.
    Check the log file:
      test.log
    for more information about the load.
    SCOTT@orcl12c> CREATE TABLE a_table
      2    (rn       NUMBER,
      3     data  NUMBER)
      4  /
    Table created.
    SCOTT@orcl12c> INSERT INTO a_table (rn, data)
      2  SELECT s.rn,
      3         t.COLUMN_VALUE data
      4  FROM   staging s,
      5         TABLE (s.whole_row) t
      6  WHERE  MOD (rn, 2) != 0
      7  /
    17 rows created.
    SCOTT@orcl12c> SELECT * FROM a_table
      2  /
            RN       DATA
             1          1
             1          2
             1          2
             1          3
             1          3
             1         34
             1          4
             1          4
             1          4
             1          5
             1          5
             1          5
             1
             1
             1          5
             3        101
             3        201
    17 rows selected.

  • Error while fetching data from OWB Client using External Table.

    Dear All,
    I am using Oracle Warehouse Builder 11g & Oracle 10gR2 as repository database on Windows 2000 Server.
    I am facing an issue fetching data from a flat file using an external table from the OWB Client.
    I have performed all the steps without any error, but when I try to view the data, I get the following error.
    ======================================
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-04040: file expense_categories.csv in SOURCE_LOCATION not found
    ORA-06512: at "SYS.ORACLE_LOADER", line 19
    java.sql.SQLException: ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-04040: file expense_categories.csv in SOURCE_LOCATION not found
    ORA-06512: at "SYS.ORACLE_LOADER", line 19
         at oracle.jdbc.driver.SQLStateMapping.newSQLException(SQLStateMapping.java:70)
         at oracle.jdbc.driver.DatabaseError.newSQLException(DatabaseError.java:110)
         at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:171)
         at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:455)
         at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:413)
         at oracle.jdbc.driver.T4C8Oall.receive(T4C8Oall.java:1030)
         at oracle.jdbc.driver.T4CStatement.doOall8(T4CStatement.java:183)
         at oracle.jdbc.driver.T4CStatement.executeForDescribe(T4CStatement.java:774)
         at oracle.jdbc.driver.T4CStatement.executeMaybeDescribe(T4CStatement.java:849)
         at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1186)
         at oracle.jdbc.driver.OracleStatement.executeQuery(OracleStatement.java:1377)
         at oracle.jdbc.driver.OracleStatementWrapper.executeQuery(OracleStatementWrapper.java:386)
         at oracle.wh.ui.owbcommon.QueryResult.<init>(QueryResult.java:18)
         at oracle.wh.ui.owbcommon.dataviewer.relational.OracleQueryResult.<init>(OracleDVTableModel.java:48)
         at oracle.wh.ui.owbcommon.dataviewer.relational.OracleDVTableModel.doFetch(OracleDVTableModel.java:20)
         at oracle.wh.ui.owbcommon.dataviewer.RDVTableModel.fetch(RDVTableModel.java:46)
         at oracle.wh.ui.owbcommon.dataviewer.BaseDataViewerPanel$1.actionPerformed(BaseDataViewerPanel.java:218)
         at javax.swing.AbstractButton.fireActionPerformed(AbstractButton.java:1849)
         at javax.swing.AbstractButton$Handler.actionPerformed(AbstractButton.java:2169)
         at javax.swing.DefaultButtonModel.fireActionPerformed(DefaultButtonModel.java:420)
         at javax.swing.DefaultButtonModel.setPressed(DefaultButtonModel.java:258)
         at javax.swing.AbstractButton.doClick(AbstractButton.java:302)
         at javax.swing.AbstractButton.doClick(AbstractButton.java:282)
         at oracle.wh.ui.owbcommon.dataviewer.BaseDataViewerPanel.executeQuery(BaseDataViewerPanel.java:493)
         at oracle.wh.ui.owbcommon.dataviewer.BaseDataViewerEditor.init(BaseDataViewerEditor.java:116)
         at oracle.wh.ui.owbcommon.dataviewer.BaseDataViewerEditor.<init>(BaseDataViewerEditor.java:58)
         at oracle.wh.ui.owbcommon.dataviewer.relational.DataViewerEditor.<init>(DataViewerEditor.java:16)
         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
         at java.lang.reflect.Constructor.newInstance(Constructor.java:494)
         at oracle.wh.ui.owbcommon.IdeUtils._tryLaunchEditorByClass(IdeUtils.java:1412)
         at oracle.wh.ui.owbcommon.IdeUtils._doLaunchEditor(IdeUtils.java:1349)
         at oracle.wh.ui.owbcommon.IdeUtils._doLaunchEditor(IdeUtils.java:1367)
         at oracle.wh.ui.owbcommon.IdeUtils.showDataViewer(IdeUtils.java:869)
         at oracle.wh.ui.owbcommon.IdeUtils.showDataViewer(IdeUtils.java:856)
         at oracle.wh.ui.console.commands.DataViewerCmd.performAction(DataViewerCmd.java:19)
         at oracle.wh.ui.console.commands.TreeMenuHandler$1.run(TreeMenuHandler.java:188)
         at java.awt.event.InvocationEvent.dispatch(InvocationEvent.java:209)
         at java.awt.EventQueue.dispatchEvent(EventQueue.java:461)
         at java.awt.EventDispatchThread.pumpOneEventForHierarchy(EventDispatchThread.java:242)
         at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:163)
         at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:157)
         at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:149)
         at java.awt.EventDispatchThread.run(EventDispatchThread.java:110)
    ===========================
    The error says that file expense_categories.csv was not found in SOURCE_LOCATION, but I am 100% sure that the file is very much there.
    Has anybody faced the same issue?
    Do we need to configure something before loading data from a flat file from the OWB Client?
    Any help would be highly appreciated.
    Regards,
    Manmohan Sharma

    Hi Detlef / Gowtham,
    Now I am able to fetch data from flat files from OWB Server as well as OWB Client.
    One way I achieved it, as suggested by you:
    1) Created a location on the OWB Client
    2) Sampled the files at the client
    3) Created & configured the external table
    4) Copied all flat files onto the OWB Server
    5) Updated the location which I created at the client.
    The other way:
    1) Created a location on the OWB Client
    2) Sampled the files at the client
    3) Created & configured the external table
    4) Copied the flat files onto the server into the same drive & directory, i.e. if all my flat files are in C:\data at the OWB Client then I copied them to C:\data on the OWB Server. But this is feasible for non-Windows.
    Hence my problem is solved.
    Thanks a lot.
    Regards,
    Manmohan

  • How to find rejected rows using External Table

    Hi all,
    I have written a stored procedure to read a comma (,) separated flat file using external tables in Oracle 9i with the Reject Limit "Unlimited" option. Sometimes all rows are loaded successfully into the external table and sometimes some rows fail; I can find those rows in the log file created by Oracle. I want to insert those rows' key values into some other table. Can anybody suggest how to write this, or point me to links?
    Thanking you in advance

    "Is there a way to have the system truncate the log files besides some O/S utility that will scour the directory every night, week, etc.?"
    You can use UTL_FILE.FREMOVE to delete the file, if you have sufficient privileges. You can schedule it using DBMS_JOB if you like or run it before you recreate the file, as demonstrated below. The pl/sql block to check whether the file exists is used for demonstration purposes only and is not necessary. This is just one method. There are various ways to delete or truncate a file. This just seems like the simplest.
    scott@ORA92> CREATE OR REPLACE DIRECTORY mydir AS 'c:\oracle'
      2  /
    Directory created.
    scott@ORA92> DECLARE
      2    v_bfile BFILE := BFILENAME ('MY_DIR', 'test_tab.log');
      3  BEGIN
      4    IF DBMS_LOB.FILEEXISTS (v_bfile) = 1
      5    THEN DBMS_OUTPUT.PUT_LINE ('test_tab.log exists');
      6    ELSE DBMS_OUTPUT.PUT_line ('test_tab.log does not exist');
      7    END IF;
      8  END;
      9  /
    test_tab.log exists
    PL/SQL procedure successfully completed.
    scott@ORA92> EXECUTE UTL_FILE.FREMOVE ('MY_DIR', 'test_tab.log')
    PL/SQL procedure successfully completed.
    scott@ORA92> DECLARE
      2    v_bfile BFILE := BFILENAME ('MY_DIR', 'test_tab.log');
      3  BEGIN
      4    IF DBMS_LOB.FILEEXISTS (v_bfile) = 1
      5    THEN DBMS_OUTPUT.PUT_LINE ('test_tab.log exists');
      6    ELSE DBMS_OUTPUT.PUT_line ('test_tab.log does not exist');
      7    END IF;
      8  END;
      9  /
    test_tab.log does not exist
    PL/SQL procedure successfully completed.
    scott@ORA92> CREATE TABLE test_tab
      2    (col1 NUMBER,
      3       col2 VARCHAR2(4))
      4  ORGANIZATION external
      5    (TYPE ORACLE_LOADER
      6       DEFAULT DIRECTORY mydir
      7       ACCESS PARAMETERS
      8         (RECORDS DELIMITED BY NEWLINE
      9          BADFILE 'MYDIR':'test_bad.bad'
    10          LOGFILE 'MYDIR':'test_tab.log'
    11          FIELDS TERMINATED BY ","
    12            (col1,
    13             col2))
    14       LOCATION ('test.dat'))
    15  REJECT LIMIT UNLIMITED
    16  /
    Table created.
    scott@ORA92> SELECT * FROM test_tab
      2  /
          COL1 COL2
             1 a
             2 b
    "Or is there a way to prevent the log from being written to every time it's accessed?"
    You can use NOLOGFILE as an access parameter to prevent it from being written to, as shown below.
    scott@ORA92> CREATE OR REPLACE DIRECTORY mydir AS 'c:\oracle'
      2  /
    Directory created.
    scott@ORA92> CREATE TABLE test_tab
      2    (col1 NUMBER,
      3       col2 VARCHAR2(4))
      4  ORGANIZATION external
      5    (TYPE ORACLE_LOADER
      6       DEFAULT DIRECTORY mydir
      7       ACCESS PARAMETERS
      8         (RECORDS DELIMITED BY NEWLINE
      9          BADFILE 'MYDIR':'test_bad.bad'
    10          NOLOGFILE
    11          FIELDS TERMINATED BY ","
    12            (col1,
    13             col2))
    14       LOCATION ('test.dat'))
    15  REJECT LIMIT UNLIMITED
    16  /
    Table created.
    scott@ORA92> SELECT * FROM test_tab
      2  /
          COL1 COL2
             1 a
             2 b
    "Is there a way to have the log written only when an error occurs?"
    Not that I know of, but that does not mean that somebody else doesn't know how or isn't able to figure out a way. If you do find a way, please post it for the benefit of the rest of us.
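    Coming back to the original question about the rejected rows themselves: one possible approach (a sketch only, with my_ext_bad, other_col and rejected_keys as placeholder names) is to point a second external table at the bad file, since rejected records are written there verbatim, and read the key values back from it:
    CREATE TABLE my_ext_bad
      (key_col   VARCHAR2(100),
       other_col VARCHAR2(4000))
    ORGANIZATION EXTERNAL
      (TYPE ORACLE_LOADER
       DEFAULT DIRECTORY mydir
       ACCESS PARAMETERS
         (RECORDS DELIMITED BY NEWLINE
          NOBADFILE NOLOGFILE               -- don't create new bad/log files for this table
          FIELDS TERMINATED BY ','
          MISSING FIELD VALUES ARE NULL
          (key_col, other_col))
       LOCATION ('test_bad.bad'))
    REJECT LIMIT UNLIMITED;
    INSERT INTO rejected_keys (key_value)
    SELECT key_col FROM my_ext_bad;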

  • Report question: how to use external table to bring data from excel file?

    I need to import Excel data into an Oracle report. How do I do that? Thanks.

    Hi... What is the error message you get when using external tables?
    You must create a control center connection in the Connection Explorer.
    I recommend you read this:
    http://download-east.oracle.com/docs/cd/B31080_01/doc/owb.102/b28223/toc.htm
    Look for the following sections:
    - Using External Tables (and the "External Tables versus Flat File Operators" sub section)
    - About Flat File Modules
    - Chapter 16, Deploying Target Systems
    Regards,
    Marcos

  • Error while using External Table

    Hi Guru's,
    I am using external tables in my procedure, and the code to create them is built dynamically inside the procedure itself. I have tested it many times and it works fine. But suddenly it is now throwing an error and I have not been able to figure out the problem.
    Additional Info:
    1. I have provided the full rights to the input directory.
    2. Working in Oracle 10g release 2
    3. Unix OS in server
    4. Error description:
    << UK_TRADINGDATA_LOAD >> ABEND : <<ABEND SYS-000 >> uncatched ORACLE-error : ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-04040: file EXTRACTION COMPLETED in EXT_UK_TRADINGDATA not found
               v_ac_sql:='CREATE TABLE VLDPROCESS'||gv_ac_schema_suffix||'.EXT_UK_TRAD_'||gv_ac_machineid ||'_'||gv_ac_retailer_id||'_ADHOC'||CHR(10)
                            ||'(  AC_RETAILER_ID       VARCHAR2(2 BYTE),'||CHR(10)
                            ||'AC_DEPARTMENT_CD     VARCHAR2(20 BYTE),'||CHR(10)
                            ||'NC_PERIOD_FROM       NUMBER,'||CHR(10)
                            ||'NC_PERIOD_TO       NUMBER'||CHR(10)   
                            ||')'||CHR(10)
                            ||'ORGANIZATION EXTERNAL'||CHR(10)
                            ||'(TYPE ORACLE_LOADER'||CHR(10)
                            ||'DEFAULT DIRECTORY EXT_UK_TRADINGDATA'||CHR(10)
                            ||'ACCESS PARAMETERS'||CHR(10)
                            ||'(RECORDS DELIMITED BY NEWLINE'||CHR(10)
                            ||'badfile     '''||v_ac_failed_files||'.bad'''||CHR(10)
                            ||'discardfile '''||v_ac_failed_files||'.dis'''||CHR(10)
                            ||'logfile     '''||v_ac_failed_files||'.log'''||CHR(10)
                            ||'FIELDS'||CHR(10)
                            ||'('||CHR(10)
                            ||'AC_RETAILER_ID   position( 1: 2) char( 2),'||CHR(10)
                            ||'AC_DEPARTMENT_CD position( 3:22) char(20),'||CHR(10)
                            ||'NC_PERIOD_FROM   position(23:29) char( 7),'||CHR(10)
                            ||'NC_PERIOD_TO     position(30:36) char( 7)'||CHR(10)     
                            ||')'||CHR(10)
                            ||')'||CHR(10)
                            ||'LOCATION ('''||v_ac_file_name ||''')'||CHR(10)
                            ||')'||CHR(10)
                            ||'REJECT LIMIT 100'||CHR(10)
                            ||'NOPARALLEL'||CHR(10)
                            ||'NOMONITORING';  
               EXECUTE IMMEDIATE v_ac_sql;
    Kindly help me!
    With Regards,
    VJ

    Sometimes when troubleshooting issues like this it is helpful to output the dynamic SQL generated to the screen or spool it to a text file. From there it's usually easy to determine where the problem lies.
    If you can provide that, one of the forum members can figure it out.
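    For instance, a minimal way to do that just before the dynamic DDL is executed:
    DBMS_OUTPUT.PUT_LINE(v_ac_sql);   -- inspect the generated CREATE TABLE statement
    EXECUTE IMMEDIATE v_ac_sql;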

  • XML File in External Table - OS error permission denied.

    Hi.
    10g R2, Red Hat Linux
    I'm using the article (see below, taken from http://www.dbazine.com/olc/olc-articles/scardina1 by Mark Scardina) to create an external table where I'd store my XML file.
    So, I
    1. Created a directory xmlfile_dir
    2. Granted access to needed db user
    3. Created the table
    CREATE TABLE relayxml_xt (doc CLOB)
    ORGANIZATION EXTERNAL
    (TYPE ORACLE_LOADER
     DEFAULT DIRECTORY xmlfile_dir
     ACCESS PARAMETERS
     (FIELDS (lobfn CHAR TERMINATED BY ',')
      COLUMN TRANSFORMS (doc FROM lobfile (lobfn)))
     LOCATION ('xml.dat'))
    REJECT LIMIT UNLIMITED;
    4. mv relay.xml /xmlfile_dir/xml.dat
    When I run SELECT * FROM relayxml_xt I get this:
    Error starting at line 1 in command:
    select * from relayxml_xt
    Error report:
    SQL Error: ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-04063: unable to open log file RELAYXML_XT_28773.log
    OS error Permission denied
    ORA-06512: at "SYS.ORACLE_LOADER", line 19
    29913. 00000 - "error in executing %s callout"
    *Cause:    The execution of the specified callout caused an error.
    *Action:   Examine the error messages take appropriate action.
    What am I doing wrong?
    Thanks,
    Using External Tables
    Introduced in Oracle9i, Oracle’s external table feature offers a solution to define a table in the database while leaving the data stored outside of the database. Prior to Oracle Database 10g, external tables can be used only as read-only tables. In other words, if you create an external table for XML files, these files can be queried and the table can be joined with other tables. However, no DML operations, such as INSERT, UPDATE, and DELETE, are allowed on the external tables.
    Note: In Oracle Database 10g, by using the ORACLE_DATAPUMP driver instead of the default ORACLE_LOADER driver, you can write to external tables. In Oracle Database 10g, you can define VARCHAR2 and CLOB columns in external tables to store XML documents. The following example shows how you can create an external table with a CLOB column to store the XML documents. First, you need to create a DIRECTORY to read the data files:
    CREATE DIRECTORY data_file_dir AS 'D:\xmlbook\Examples\Chapter9\src\xml';
    GRANT READ, WRITE ON DIRECTORY data_file_dir TO demo;
    Then, you can use this DIRECTORY to define an external table:
    CREATE TABLE customer_xt (doc CLOB)
    ORGANIZATION EXTERNAL
    (TYPE ORACLE_LOADER
     DEFAULT DIRECTORY data_file_dir
     ACCESS PARAMETERS
     (FIELDS (lobfn CHAR TERMINATED BY ',')
      COLUMN TRANSFORMS (doc FROM lobfile (lobfn)))
     LOCATION ('xml.dat'))
    REJECT LIMIT UNLIMITED;
    The xml.dat file follows:
    customer1.xml
    customer2.xml
    If you describe the table, you can see the following definition:
    SQL> DESC customer_xt;
    Name                                      Null?    Type
    ----------------------------------------- -------- ------------
    DOC                                                CLOB
    Then, you can query the XML document as follows:
    SELECT XMLType(doc).extract('/Customer/EMAIL')
    FROM customer_xt;
    Though the query requires run-time XMLType creation and XPath evaluation, this approach is useful when applications just need a few queries on the XML data and don’t want to upload the XML data into database. In Oracle Database 10g, you cannot create external tables that contain pre-defined XMLType column types.

    You don't have the proper operating system privileges. Be sure that you (= the oracle OS user / the OS Linux user that is starting the database) have read privileges on the path and/or file.
    for example...
    chown -Rf oracle:oinstall /xxxxxxx/xxxx/etc
    ls -l file.xml
    -rw-rw-rw-   oracle oinstall   file.xml

  • Unable to read chinese characters in a flat file to external table

    Hi All,
    We have a flat file containing data in Chinese. We are using an external table to read the data in the files. When I do select <column-name> from <table> it displays boxes for the Chinese characters. The column is of type varchar2.
    The NLS_LANGUAGE is AMERICAN and NLS_CHARACTERSET is AL32UTF8.
    The character set is mentioned as UTF8 in the external table definition.
    Is the external table not reading these characters properly? Should a patch be installed, or should some settings be changed, for SQL Developer to display the Chinese characters?

    The NLS_LANG environment variable / registry string (so NOT NLS_LANGUAGE) on the server hosting the file should be set to anything ending in .AL32UTF8. (You are just one of the thousands of 'anonymous' users refusing to post platform and version.)
    Sybrand Bakker
    Senior Oracle DBA
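    As an illustration only (the table, directory and file names below are placeholders), the flat file's character set can be declared explicitly in the access parameters, and the client NLS_LANG should match:
    CREATE TABLE cn_ext (txt VARCHAR2(4000))
    ORGANIZATION EXTERNAL
      (TYPE ORACLE_LOADER
       DEFAULT DIRECTORY data_dir
       ACCESS PARAMETERS
         (RECORDS DELIMITED BY NEWLINE
          CHARACTERSET AL32UTF8             -- character set of the flat file itself
          FIELDS TERMINATED BY ',')
       LOCATION ('chinese.txt'))
    REJECT LIMIT UNLIMITED;
    -- client side, e.g.: NLS_LANG=AMERICAN_AMERICA.AL32UTF8 before starting SQL Developer / SQL*Plus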

  • Loading xml file into external tables

    emp.xml is an XML file which is saved in C:\Documents and Settings\james\Desktop\emp.xml
    xml file
    <EMPLOYEES>
    <EMP>
    <EMPNO>7369</EMPNO>
    <ENAME>SMITH</ENAME>
    <JOB>CLERK</JOB>
    <HIREDATE>17-DEC-80</HIREDATE>
    <SAL>800</SAL>
    </EMP>
    <EMP>
    <EMPNO>7499</EMPNO>
    <ENAME>ALLEN</ENAME>
    <JOB>SALESMAN</JOB>
    <HIREDATE>20-FEB-81</HIREDATE>
    <SAL>1600</SAL>
    <COMM>300</COMM>
    </EMP>
    </EMPLOYEES>
    CREATE DIRECTORY my_xml_dir AS 'C:\Documents and Settings\james\Desktop\emp.xml';
    CREATE TABLE my_xml_et ( EMPNO NUMBER, EMPNAME VARCHAR2(10), JOB VARCHAR2(10), HIREDATE DATE, SAL NUMBER );
    Using external tables, how can this XML data be loaded into my_xml_et?

    http://download-west.oracle.com/docs/cd/B13789_01/appdev.101/b10790/toc.htm
    Take a look at Examples 4-8 and 4-9. Even though this is 10g documentation, the code should work on 9.2.4 or later.
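    A hedged sketch along the lines of the CLOB-staging approach shown earlier in this thread (it assumes my_xml_dir points at the desktop folder rather than the file, that a listing file xml.dat containing the single line emp.xml sits in that folder, and that XMLTABLE is available, i.e. 10gR2 or later):
    CREATE TABLE emp_xml_stage (doc CLOB)
    ORGANIZATION EXTERNAL
      (TYPE ORACLE_LOADER
       DEFAULT DIRECTORY my_xml_dir
       ACCESS PARAMETERS
         (FIELDS (lobfn CHAR TERMINATED BY ',')
          COLUMN TRANSFORMS (doc FROM lobfile (lobfn)))
       LOCATION ('xml.dat'))
    REJECT LIMIT UNLIMITED;
    INSERT INTO my_xml_et (empno, empname, job, hiredate, sal)
    SELECT x.empno, x.ename, x.job, TO_DATE(x.hiredate, 'DD-MON-RR'), x.sal
    FROM   emp_xml_stage s,
           XMLTABLE('/EMPLOYEES/EMP' PASSING XMLTYPE(s.doc)
                    COLUMNS empno    NUMBER       PATH 'EMPNO',
                            ename    VARCHAR2(10) PATH 'ENAME',
                            job      VARCHAR2(10) PATH 'JOB',
                            hiredate VARCHAR2(11) PATH 'HIREDATE',
                            sal      NUMBER       PATH 'SAL') x;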

  • How to load external data without using external tables

    Hi,
    I'm asked to develop an ETL process for loading external data into a database but, unfortunately, the version is 8i, so I can't use external tables.
    I know that it's possible using SQL*Loader. What I want to know is whether there's another way in 8i to load external data from a text file or a CSV file directly with a stored procedure, without having to write SQL*Loader control files on the server. I mean, is there a way to do everything in the database, in the same way I would write data from a table to a file using, for example, UTL_FILE? Or is the only way to write SQL*Loader control files directly on the server?
    Thanks!

    Mark1970 wrote:
    Thank you very much Karthick
    I didn't know I could use UTL_FILE also for reading files. I've always used it only for writing data into external files.
    Yes, you can use UTL_FILE to read. The version of Oracle you are using I last used in 2004 :) I still remember giving the OS path of the file to open it. That is long gone now. It's surprising that you are still on 8i.
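    A rough sketch of that in 8i (the path must be listed in the utl_file_dir init parameter, and the file, table and column names are placeholders):
    DECLARE
      v_file UTL_FILE.FILE_TYPE;
      v_line VARCHAR2(4000);
    BEGIN
      -- in 8i the first argument is an OS path listed in utl_file_dir
      v_file := UTL_FILE.FOPEN('/data/in', 'load.csv', 'R');
      LOOP
        BEGIN
          UTL_FILE.GET_LINE(v_file, v_line);
        EXCEPTION
          WHEN NO_DATA_FOUND THEN EXIT;   -- end of file reached
        END;
        -- naive split on the first comma; real files need proper parsing
        INSERT INTO stage_tab (col1, col2)
        VALUES (SUBSTR(v_line, 1, INSTR(v_line, ',') - 1),
                SUBSTR(v_line, INSTR(v_line, ',') + 1));
      END LOOP;
      UTL_FILE.FCLOSE(v_file);
      COMMIT;
    END;
    /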

  • How to overwrite a log and bad file in external table in oracle 10g

    Hi,
    I have used an external table in Oracle 10g. Whenever a select query runs against the external table, Oracle internally creates a log file in the specified directory, but this log file keeps growing. How can I overwrite the log file (replace the old one with a new one)? I need to overwrite the log and bad files of an external table.
    Kindly give the solutions.
    By
    Siva

    I don't believe that is possible with the LOGFILE clause, but it may be with the BADFILE clause. Here is an excerpt from the documentation :
    The LOGFILE clause names the file that contains messages generated by the external tables utility while it was accessing data in the datafile. If a log file already exists by the same name, the access driver reopens that log file and appends new log information to the end. This is different from bad files and discard files, which overwrite any existing file. NOLOGFILE is used to prevent creation of a log file.
    If you specify LOGFILE, you must specify a filename or you will receive an error.
    If neither LOGFILE nor NOLOGFILE is specified, the default is to create a log file. The name of the file will be the table name followed by _%p and it will have an extension of .log.
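    So one common pattern (directory and file names below are placeholders) is to name the bad file explicitly, since it is overwritten on each access, and to suppress the log file entirely with NOLOGFILE:
    ACCESS PARAMETERS
      (RECORDS DELIMITED BY NEWLINE
       BADFILE 'LOG_DIR':'ext_tab.bad'   -- overwritten each time the table is queried
       NOLOGFILE                         -- no log file is written at all
       FIELDS TERMINATED BY ',')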

  • How to use external table - creating NFS mount -the details involved

    Hi,
    We are using Oracle 10.2.0.3 on Solaris 10. I want to use external tables to load huge csv data into the database. This concept was tested and found to be working fine. But my doubt is this: since ours is a J2EE application, the csv files have to come from the front end, i.e. from the app server. So in this case, how do we move them to the db server?
    For my testing I just used putty to transfer the file to the db server, then ran the dos2unix command to strip off the control character at the end of the file. But since this is to be done from the app server, putty cannot be used. In this case how can this be done? Are there any risks or security issues involved in this process?
    Regards

    orausern wrote:
    For my testing I just used putty to transfer the file to the db server, then ran the dos2unix command to strip off the control character at the end of the file. But since this is to be done from the app server, putty cannot be used. In this case how can this be done? Are there any risks or security issues involved in this process?
    Not sure why "putty" cannot be used. This s/w uses the standard telnet and ssh protocols. Why would it not work?
    As for getting the files from the app server to the db server. There are a number of options.
    You can look at it from an o/s replication level. The command rdist is common on most (if not all) Unix/Linux flavours and used for remote distribution and sync'ing of files and directories. It also supports scp as the underlying protocol (instead of the older rcp protocol).
    You can use file sharing - the typical Unix approach would be to use NFS. Samba is also an option if NTLM (Windows) is already used in the organisation and you want to hook this into your existing security infrastructure (e.g. using Microsoft's Active Directory).
    You can use a cluster file system - a file system that resides on shared storage and can be used by both app and db servers as a mounted/cooked file system. Cluster file systems like ACFS, OCFS2 and GFS exist for Linux.
    You can go for a pull method - where the db server on client instruction (that provides the file details), connects to the app server (using scp/sftp/ftp), copy that file from the app server, and then proceed to load it. You can even add a compression feature to this - so that the db server copies a zipped file from the app server and then unzip it for loading.
    Security issues. Well, if the internals are not exposed then security will not be a problem. For example, defining a trusted connection between app server and db server - so the client instruction does not have to contain any authentication data. Letting the client instruction only specify the filename and have the internal code use a standard and fixed directory structure. That way the client cannot instruct something like +/etc/shadow+ be copied from the app server and loaded into the db server as a data file. Etc.
