Skipping header in external table...

Is it possible to define an external table within OWB to skip the first few rows of a file? I haven't been able to find anything in the configuration of an external table that allows you to skip lines.
If it is not available, will it be added to OWB in the future?
I am running OWB 9.2.0.2.8
Thanks

I think I have found a bug in OWB. I keep losing the number of rows to skip in external tables.
I defined a File to skip the first three rows, then created an external table based on the File. When I deployed the external table, "SKIP 3" was generated. Later, I made some configuration changes to the external table (set up a BAD file/location, set it to trim the fields, load NULLs, etc.). When I redeployed the external table, "SKIP 3" was missing.
Then I did an MDL export of the File. In the MDL file, "SkipRecords" was 0. I changed it to 3, and then imported the MDL back into OWB (match by UOID, replace existing). Then when I redeployed the external table, "SKIP 3" was back in the definition.
Is this a known issue? Is it fixed in the latest version? I am using OWB 9.2.0.2.8.
Thanks!

Similar Messages

  • SKIP option in External Table

    Hi,
    I would like to skip the first row in the external table.
    I used the SKIP option.
    The table was created successfully, but for some reason I am getting an error related to the SKIP option when selecting from the table.
    Could you please tell me what is missing/wrong with the syntax?
    SQL> CREATE TABLE ext_tab (
    ITEM                VARCHAR2(100),
    MODEL               VARCHAR2(100),
    packing_no          VARCHAR2(100),
    ship_date           VARCHAR2(100),
    Internal_SN         VARCHAR2(100),
    SHIPPING_SN         VARCHAR2(100),
    MACID               VARCHAR2(100),
    CARTON_NO           VARCHAR2(100),
    PALLET_ID           VARCHAR2(100),
    PO                  VARCHAR2(100),
    KEY                 VARCHAR2(100),
    PASSWORD_TR         VARCHAR2(100),
    PASSWORD_ADMIN      VARCHAR2(100))
    ORGANIZATION EXTERNAL
    (TYPE oracle_loader
    DEFAULT DIRECTORY TMP_DIR
    ACCESS PARAMETERS
    (FIELDS TERMINATED BY ','
    SKIP 1
    MISSING FIELD VALUES ARE NULL
    (ITEM,MODEL,packing_no,SHIP_DATE,Internal_SN,SHIPPING_SN,MACID,CARTON_NO,PALLET_ID,PO,KEY,PASSWORD_TR,PASSWORD_ADMIN))
    LOCATION ('SN.csv'))
    PARALLEL
    REJECT LIMIT 0;
    Table created.
    SQL> select * from ext_tab;
    select * from ext_tab
    ERROR at line 1:
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-00554: error encountered while parsing access parameters
    KUP-01005: syntax error: found "skip": expecting one of: "column, enclosed, exit, (, ltrim, lrtrim, ldrtrim, missing, notrim, optionally, rtrim, reject"
    KUP-01007: at line 2 column 1
    ORA-06512: at "SYS.ORACLE_LOADER", line 19

    Hi,
    Kindly find below the syntax for creating the external table
    Line: -----
    CREATE TABLE <table_name> (
    <column_definitions>)
    ORGANIZATION EXTERNAL
    (TYPE oracle_loader
    DEFAULT DIRECTORY <oracle_directory_object_name>
    ACCESS PARAMETERS (
    RECORDS DELIMITED BY newline
    BADFILE <file_name>
    DISCARDFILE <file_name>
    LOGFILE <file_name>
    [READSIZE <bytes>]
    [SKIP <number_of_rows>]
    FIELDS TERMINATED BY '<terminator>'
    REJECT ROWS WITH ALL NULL FIELDS
    MISSING FIELD VALUES ARE NULL
    (<column_name_list>))
    LOCATION ('<file_name>'))
    [PARALLEL]
    REJECT LIMIT <UNLIMITED | integer>;
    Line: -----
    Please note that the order of the items in the ACCESS PARAMETERS clause differs from the order shown above: in your CREATE TABLE statement, SKIP appears after FIELDS TERMINATED BY, but SKIP is a record-level option and must come before the FIELDS clause (see the reordered statement below). Also check whether the .CSV file is available in the TMP_DIR directory path.
    Regards,
    Sandhya G.
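    For reference, here is a reordered version of the original statement - a sketch only, keeping the same columns, directory and file name - with SKIP moved into the record-level section ahead of the FIELDS clause:
    CREATE TABLE ext_tab (
    ITEM                VARCHAR2(100),
    MODEL               VARCHAR2(100),
    packing_no          VARCHAR2(100),
    ship_date           VARCHAR2(100),
    Internal_SN         VARCHAR2(100),
    SHIPPING_SN         VARCHAR2(100),
    MACID               VARCHAR2(100),
    CARTON_NO           VARCHAR2(100),
    PALLET_ID           VARCHAR2(100),
    PO                  VARCHAR2(100),
    KEY                 VARCHAR2(100),
    PASSWORD_TR         VARCHAR2(100),
    PASSWORD_ADMIN      VARCHAR2(100))
    ORGANIZATION EXTERNAL
    (TYPE oracle_loader
    DEFAULT DIRECTORY TMP_DIR
    ACCESS PARAMETERS
    (RECORDS DELIMITED BY NEWLINE
    SKIP 1
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
    (ITEM,MODEL,packing_no,SHIP_DATE,Internal_SN,SHIPPING_SN,MACID,CARTON_NO,PALLET_ID,PO,KEY,PASSWORD_TR,PASSWORD_ADMIN))
    LOCATION ('SN.csv'))
    PARALLEL
    REJECT LIMIT 0;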

  • Skip lines on external tables

    Hello,
    Would it be possible to skip specific lines? For example, I don't want lines 107, 206 and 455.
    Thanks for your answer !

    BLANKS as null value or BLANKS as a character string ?
    You can do it as :
    ORGANIZATION EXTERNAL
    TYPE ORACLE_LOADER DEFAULT DIRECTORY "LOADER" ACCESS PARAMETERS (
    RECORDS DELIMITED BY newline Skip 1 FIELDS TERMINATED BY 0X'09'
    LOAD WHEN (CODE != 'BLANKS')
    MISSING FIELD VALUES ARE NULL
    REJECT ROWS
    WITH
    ALL NULL
    FIELDS (Emp_id, Last_name, First_name, Adr_1, Adr_2, CODE, CITY) ) LOCATION ( 'Emp.tsv' )
    )
    or
    ORGANIZATION EXTERNAL
    TYPE ORACLE_LOADER DEFAULT DIRECTORY "LOADER" ACCESS PARAMETERS (
    RECORDS DELIMITED BY newline Skip 1 FIELDS TERMINATED BY 0X'09'
    LOAD WHEN (CODE is not null)
    MISSING FIELD VALUES ARE NULL
    REJECT ROWS
    WITH
    ALL NULL
    FIELDS (Emp_id, Last_name, First_name, Adr_1, Adr_2, CODE, CITY) ) LOCATION ( 'Emp.tsv' )
    )
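    For completeness, a full CREATE TABLE built around the first fragment above might look like the following. This is only a sketch: the column datatypes and the "LOADER" directory object are assumptions, the comparison uses the documented BLANKS keyword rather than the quoted string 'BLANKS', and LOAD WHEN is placed in the record-level section before FIELDS, which is where the access-parameters syntax expects it. As far as I know there is no access parameter that skips arbitrary line numbers such as 107, 206 and 455; that would need a preprocessing step or a filter on the data itself.
    CREATE TABLE emp_ext (
    Emp_id     VARCHAR2(20),
    Last_name  VARCHAR2(50),
    First_name VARCHAR2(50),
    Adr_1      VARCHAR2(100),
    Adr_2      VARCHAR2(100),
    CODE       VARCHAR2(20),
    CITY       VARCHAR2(50))
    ORGANIZATION EXTERNAL
    (TYPE ORACLE_LOADER
    DEFAULT DIRECTORY "LOADER"
    ACCESS PARAMETERS
    (RECORDS DELIMITED BY NEWLINE
    SKIP 1
    LOAD WHEN (CODE != BLANKS)
    FIELDS TERMINATED BY 0X'09'
    MISSING FIELD VALUES ARE NULL
    REJECT ROWS WITH ALL NULL FIELDS
    (Emp_id, Last_name, First_name, Adr_1, Adr_2, CODE, CITY))
    LOCATION ('Emp.tsv'))
    REJECT LIMIT UNLIMITED;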

  • EXTERNAL TABLE SKIP HEADER :::THANKS

    Hi Everybody,
    How can I skip the header in a CSV file using an external table definition?
    Thanks,
    W

    Did you try SKIP?
    http://download-uk.oracle.com/docs/cd/B10501_01/server.920/a96652/ch12.htm#1009462
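    For example, a minimal sketch (the directory object, file name and columns are placeholders):
    CREATE TABLE csv_ext (
    col1 VARCHAR2(100),
    col2 VARCHAR2(100))
    ORGANIZATION EXTERNAL
    (TYPE ORACLE_LOADER
    DEFAULT DIRECTORY my_dir
    ACCESS PARAMETERS
    (RECORDS DELIMITED BY NEWLINE
    SKIP 1
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL)
    LOCATION ('data.csv'))
    REJECT LIMIT UNLIMITED;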

  • External Table Skip Row

    I've created external table, but how do I get the external table to skip the header and footer?
    CREATE TABLE "KAS_EXT"
    CUSTOMERID VARCHAR2(9),
    CANVASSID VARCHAR2(9),
    REPID VARCHAR2(9)
    ORGANIZATION external
    TYPE oracle_loader
    DEFAULT DIRECTORY SYS_SQLLDR_XT_TMPDIR_00000
    ACCESS PARAMETERS
    RECORDS DELIMITED BY NEWLINE CHARACTERSET WE8MSWIN1252
    BADFILE 'kaslist_ext_devt_water.bad'
    LOGFILE 'kaslist_ext_devt_water.log_xt'
    READSIZE 1048576
    FIELDS TERMINATED BY "," LDRTRIM
    MISSING FIELD VALUES ARE NULL
    SKIP 1 <<<<<<<<<<<<<<< here???
    REJECT ROWS WITH ALL NULL FIELDS
    CUSTOMERID CHAR(255)
    TERMINATED BY ",",
    CANVASSID CHAR(255)
    TERMINATED BY ",",
    REPID CHAR(255)
    TERMINATED BY ","
    location
    'kas.csv'
    )REJECT LIMIT UNLIMITED

    I don't know of any way to make it automatically skip footers. Hopefully, the footer doesn't match your data structure, so it can't be loaded as a record; it would then just go to your bad file.
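    As for the "<<< here???" marker in the question: SKIP is a record-level option, so it belongs before the FIELDS clause rather than among the field options. A sketch of just the access parameters, keeping the posted settings:
    RECORDS DELIMITED BY NEWLINE CHARACTERSET WE8MSWIN1252
    BADFILE 'kaslist_ext_devt_water.bad'
    LOGFILE 'kaslist_ext_devt_water.log_xt'
    READSIZE 1048576
    SKIP 1
    FIELDS TERMINATED BY "," LDRTRIM
    MISSING FIELD VALUES ARE NULL
    REJECT ROWS WITH ALL NULL FIELDS
    (CUSTOMERID CHAR(255) TERMINATED BY ",",
    CANVASSID CHAR(255) TERMINATED BY ",",
    REPID CHAR(255) TERMINATED BY ",")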

  • SQL and External table question

    Hello averyone,
    I have a file to be read as an external table part of it is below:
    ISO-10303-21;
    HEADER;
    FILE_DESCRIPTION((''),'2;1');
    FILE_NAME('BRACKET','2005-07-08T',('broilo'),(''),
    'PRO/ENGINEER BY PARAMETRIC TECHNOLOGY CORPORATION, 2004400',
    'PRO/ENGINEER BY PARAMETRIC TECHNOLOGY CORPORATION, 2004400','');
    FILE_SCHEMA(('CONFIG_CONTROL_DESIGN'));
    ENDSEC;
    DATA;
    #5=CARTESIAN_POINT('',(5.5E0,5.5E0,-5.1E1));
    #6=DIRECTION('',(0.E0,0.E0,1.E0));
    #7=DIRECTION('',(-1.E0,0.E0,0.E0));
    #8=AXIS2_PLACEMENT_3D('',#5,#6,#7);
    The first question is: how do I ignore the lines up to the DATA; line, or does SQL already do that for me?
    The second question is: since the fields of interest are separated by commas and the first field does not interest me (it can be handled as a VARCHAR2), how can I read the following fields as numbers, ignoring the '(', '#' and ')' characters?
    Thanks for any help.
    Sincerely yours,
    André Luiz

    The SKIP option can be used with SQL*Loader to skip a certain number of lines before starting to load. Offhand, I cannot see any easy way to load the data in the format given. The format does not resemble a typical CSV format. You can look at the test cases provided for SQL*Loader in the Oracle® Database Utilities guide - or simply write PL/SQL code to load this data manually using UTL_FILE.
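    If the goal is just to load the lines between DATA; and ENDSEC; - which all start with '#' - one external-table approach (a sketch only, untested against this exact file) is a LOAD WHEN test on the first character of each record; the leading '#n=' token and the trailing ')' would then still need to be cleaned up with string functions such as SUBSTR/REPLACE and TO_NUMBER in the query:
    ACCESS PARAMETERS
    (RECORDS DELIMITED BY NEWLINE
    LOAD WHEN ((1:1) = '#')
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL)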

  • IGNORE COMMENTS IN EXTERNAL TABLE FILE

    I currently have a partitioned table, split by month going back some 7 years.
    The business have now agreed to archive off some of the data but to have it
    available if required. I therefore intend to select month by month, the data
    to a flat file and make use of external tables to provide the data for the
    business if required. Then I will remove the month's partition, thus releasing
    space back to the database. Each month's data is only about 100 bytes long, but
    there could be about 15 million rows.
    I can successfully create the external table and read the data.
    However I would like to place a header in the spooled file but the only method
    I have found to ignore the header is to use the "SKIP" parameter. I would
    rather not use this as it hard codes the number of lines I reserve for a
    header.
    Is there another method to add a comment to an external table's file and have
    selects on the file ignore the comments?
    Could I use "LOAD WHEN ( column1 != '--' )" as each comment line begins with a
    double dash?
    Also, is there a limit to the size of an external table's file?

    When I added the lines below to attempt to
    exclude the header lines:
    RECORDS DELIMITED BY newline
    NOBADFILE
    NODISCARDFILE
    NOLOGFILE
    SKIP 0
    LOAD WHEN ( cvt_seq_num LIKE '--%' )
    FIELDS
    TERMINATED BY ","
    OPTIONALLY ENCLOSED BY '"'
    MISSING FIELD VALUES ARE NULL
    REJECT ROWS WITH ALL NULL FIELDS
    ( cvt_seq_num INTEGER EXTERNAL(12)
    I got:
    select * from old_cvt_external
    ERROR at line 1:
    ORA-29913: error in executing ODCIEXTTABLEOPENcallout
    ORA-29400: data cartridge error
    KUP-00554: error encountered while parsing access parameters
    KUP-01005: syntax error: found "identifier": expecting one of: "equal,notequal"
    KUP-01008: the bad identifier was: LIKE
    KUP-01007: at line 6 column 33
    ORA-06512: at "SYS.ORACLE_LOADER", line 14
    ORA-06512: at line 1
    When I moved the line, so the parameters became:
    RECORDS DELIMITED BY newline
    NOBADFILE
    NODISCARDFILE
    NOLOGFILE
    SKIP 0
    FIELDS
    TERMINATED BY ","
    OPTIONALLY ENCLOSED BY '"'
    MISSING FIELD VALUES ARE NULL
    REJECT ROWS WITH ALL NULL FIELDS
    LOAD WHEN ( cvt_seq_num LIKE '--%' )
    I got:
    select * from old_cvt_external
    ERROR at line 1:
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-00554: error encountered while parsing access parameters
    KUP-01005: syntax error: found "load": expecting one of: "exit, ("
    KUP-01007: at line 11 column 11
    ORA-06512: at "SYS.ORACLE_LOADER", line 14
    ORA-06512: at line 1
    So it appears that the "LOAD WHEN" does not work.
    Can you please suggest anything else?
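    LIKE is indeed not part of the LOAD WHEN condition syntax - as the KUP-01005 message shows, only = and != comparisons (against a string, a hex value or BLANKS) are accepted. A positional comparison on the first two characters may do what you want. A sketch of the record-level options only (untested, and it assumes the comment marker is always in columns 1-2), to be followed by the FIELDS clause exactly as posted:
    RECORDS DELIMITED BY NEWLINE
    NOBADFILE
    NODISCARDFILE
    NOLOGFILE
    LOAD WHEN ((1:2) != '--')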

  • Headers appear in external table

    I have defined several external tables over text files that have been built in the same way - in excel, then saved as a csv file.
    Most of them work fine but there is one that always returns the column headers as data. I can't understand how this happens for this one file when they are all built in the same way. I have checked the DDL for the external tables and they are equivalent.
    I realise I could set it to skip 1 row, but I am loath to do that because if this is some sort of bug then it may be resolved later, resulting in the first row of data being lost - a problem that may not become obvious until found by end users.
    I have tried rebuilding from scratch and get the same result.
    Has anyone else faced this situation?

    I don't know what the exact trick is that OWB does, but when you sample the file you use for the external table and tell OWB to use the first row for the header names, it won't use the first line as a data line, even though there's nothing to be found in the external table definition that indicates that.
    Hi,
    that's no trick. If you sample the file and use the first row for column names, the first row only disappears in the wizard. The resulting external table definition doesn't automatically contain a skip-row clause in this case. The reason why you don't notice this is that, by default, the loader writes no bad files. So the header row is normally rejected (because it doesn't match the structure) and you don't find it in the external table. If you configure an external table based on a file with a header and a skip parameter of 0 to produce a bad file (Bad File Name = file.bad), you will then find the header row in that bad file.
    The only right way is to configure the skip parameter - not to suppress the resulting errors.
    In this special file I guess the header matches the structure of the external table, so it isn't ignored but loaded into the external table.
    Regards,
    Detlef

  • Reject the footer using the external table

    Hi,
    I have a fixed-length flat file which has a header and a footer.
    My file is something like this:
    HADF.TXT0309
    D12345ABCD
    D22345ABCD
    FOOTERHJ
    I want to create an external table based on that file, but
    I don't want to have the header and the footer in my table. To eliminate the header I used skip 1, but I don't know how to eliminate the footer.
    Any example and suggestions will be appreciated
    Thank you for your time and consideration
    Catalin

    Hi,
    This problem may be due to several reasons. I am aware of a few.
    1) Have you deployed the external table first?
    2) If your database client and server are not running on the same machine, you should place the csv file in the database server machine's 'c:\CSV' folder in order to create the external table through the database. Then do a select count(*) statement.
    3) Another reason may be, as gerardnico said, that the file name you referred to is wrong.
    I don't know your requirement. If you could create the external table successfully and if you get a value from select count(*) from <external_table_name>, then try to import the external table into the Design Center and map it to the table you need.
    If you are doing it purely with OWB then,
    Do you have the file Export_WithHeaders.csv in the Server machine's 'c:\CSV_FILE' folder?
    Because while importing the metadata of the CSV file, OWB will point to your local machine's 'c:\CSV_FILE'; that is why your validation and deployment succeed without errors.
    But while executing the map it will take the data from the server machine. It will search for the location 'c:\CSV_FILE' on the server machine and will look for the file Export_WithHeaders.csv there. So create the same folder setup that you have on your client machine and run this again.
    Try this if you do not get any better answers in this thread
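    Coming back to the footer itself: in the sample every data row starts with 'D' while the header and footer do not, so one option (a sketch, untested) is to add a LOAD WHEN test on the first character in the record-level section, right after your RECORDS clause; the header and footer rows are then simply not loaded and SKIP 1 is no longer needed:
    LOAD WHEN ((1:1) = 'D')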

  • External Table Preprocessor

    Dear Experts,
    I was trying to explore the "preprocessor" option for external tables, but I was not able to get it to work.
    I just followed the steps mentioned in the latest Oracle Magazine.
    http://www.oracle.com/technetwork/issue-archive/2012/12-nov/o62asktom-1867739.html
    But finally, when I select from the table DF, I am getting the following error.
    ORA-29913: error in executing ODCIEXTTABLEFETCH callout
    ORA-29400: data cartridge error
    KUP-04095: preprocessor command /export/csrmgr/ripple_dms/run_df.sh encountered error "error during exec: errno is 2
    PS: The output of v$version;
    Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
    PL/SQL Release 11.2.0.2.0 - Production
    "CORE     11.2.0.2.0     Production"
    TNS for IBM/AIX RISC System/6000: Version 11.2.0.2.0 - Production
    NLSRTL Version 11.2.0.2.0 - Production
    Could anyone help me to resolve this issue?

    I'm not able to test Tom's example at the minute.
    However, that new feature was discussed in the March/April 2011 edition of Oracle Magazine too.
    Oracle Magazine March/April 2011 article on new feature in 11gR2 called the external table preprocessor.
    by Arup Nanda
    CHANGE IN PROCESS
    Now suppose that with your external table and its text file in place, the input file for the external table is compressed to reduce the volume of data transmitted across the network. Although compression helps the network bandwidth utilization, it creates a challenge for the ETL process. The file must be uncompressed before its contents can be accessed by the external table.
    Rather than uncompress the file in a separate process, you can use the preprocessor feature of Oracle Database 11g Release 2 with external tables to uncompress the file inline. And you will not need to change the ETL process.
    To use the preprocessor feature, first you need to create a preprocessor program. The external table expects input in a text format but not necessarily in a file. The external table does not need to read a file; rather, it expects to get the file contents “fed” to it. So the preprocessor program must stream the input data directly to the external table—and not
    create another input file. The input to the pre-processor will be a compressed file, and the output will be the uncompressed contents.
    The following is the code for your new preprocessor program, named preprocess.bat:
    @echo off
    C:\oracle\product\11.2.0\dbhome_1\BIN\unzip.exe -qc %1
    The first line, @echo off, suppresses the output of the command in a Windows environment. The remaining code calls the unzip.exe utility located in the Oracle home. The utility needs an input file, which is the first (and only) parameter passed to it, shown as %1. The options q and c tell the utility to uncompress quietly (q) without producing extraneous output such as "Inflating" or "%age inflated" and to write the uncompressed output to standard output (c), respectively.
    Next you need to create the directory object where this preprocessor program is located. Logging in as SYS, issue
    create directory execdir as 'c:\tools';
    And now grant EXECUTE permissions on the directory to the ETLADMIN user:
    grant execute on directory execdir to etladmin;
    Finally, create the new external table:
    create table indata1
    (
      cust_id number,
      cust_name varchar2(20),
      credit_limit number(10)
    )
    organization external
    (
      type oracle_loader
      default directory etl_dir
      access parameters
      (
        records delimited by newline
        preprocessor execdir:'preprocess.bat'
        fields terminated by ","
      )
      location ('indata1.txt')
    )
    /
    It calls the preprocess.bat executable in the directory specified by EXECDIR before the external table accesses the indata1.txt file in the location specified by the ETL_DIR directory. Remember, indata1.txt is now a compressed file. So, in effect, the external table reads not the actual specified input file but rather the output of preprocess.bat, which is the uncompressed data from the indata1.txt file.
    If you select from the external table now, the output will be similar to that of the earlier select * from indata1; query. The preprocessor passed the uncompressed contents of the indata1.txt (compressed) file on to the external table. There was no need to uncompress the file first—saving significant time and the intermediate space required and making it unnecessary to change the ETL process.
    This inline preprocessing unzip example uses a script, but that is not always necessary. An executable can be used instead. For example, in Linux you can use /bin/gunzip. However, the utility can’t accept any parameters. So if you pass parameters (as in this article’s example), you must use a script.
    Security Concerns
    =================
    The EXECUTE privilege on a directory is a new feature introduced in Oracle Database 11g Release 2. It enables the DBA to grant EXECUTE permissions only for certain directories and only to certain users. Without WRITE privileges, users will not be able to
    update the executables inside a directory to insert malicious code, but users will be able to execute the “approved” code accessible in a single location. The DBA can put all the necessary preprocessor tools into a single directory and grant EXECUTE privileges there to the users who may need them. And, of course, the executables and data should be in different directories.
    Preprocessing also requires some special precautions on the part of the DBA. Because the executables called by preprocessing programs will be executed under the privileges of the Oracle software owner and malicious executable code can cause a lot of damage, the DBA should be extremely careful in monitoring executables for potentially harmful code.
    The directory containing the preprocessor executables needs to be accessible to the Oracle software owner for EXECUTE operations only, not for WRITE activity. Therefore, as an added precaution, the system administrator can remove WRITE access to that directory from all users, including the Oracle software owner. This significantly reduces the chance of damage by malicious code.
    Other Uses
    ==========
    Compression is not the only use for inline preprocessing, although it certainly is the most widely used. You can, for example, use this preprocessing technique to show the output of a program as an external table. Consider, for instance, the dir command in Windows for listing the contents of a directory. How would you like to get the output as a table so that you can apply predicates?
    Getting and using this output is quite simple with the preprocessor functionality. Remember, the preprocessor does not actually need a file but, rather, requires the output of the preprocessor program. You can write a preprocessor program to send the
    output of a dir command. The new preprocessor program, named preproc_dir.bat, has only the following two lines:
    @echo off
    dir
    You will also need a file for the external table. The contents of the file are irrelevant, so you can use any file that the Oracle software owner can read in a directory to which that owner has read access. For this example, the file is dirfile.txt, and although the contents of the file are immaterial, the file must exist, because the external table will access it. Listing 1 shows how to create the table.
    Because the dir command displays output in a prespecified manner, the external table easily parses it by reading the fields located in specific positions. For example, positions 1 through 10 display the date, 11 through 20 display the time, and so on. The dir command produces some heading and preliminary information that the external table has to ignore, so there is a skip 5 clause in Listing 1 that skips the first five lines of the output. The last few lines of the output show how many files and directories are present and how much free space remains. This output must be skipped as well, so the external table displays records only when the date column has a value.
    Listing 1 also shows the result of a query against the external table. Because the MOD_DT column is of the date datatype, you can also apply a WHERE condition to select a specified set of records.
    Code Listing 1:
    ===============
    create table dir_tab
    (
      mod_dt date,
      mod_time char(10),
      file_type char(10),
      file_size char(10),
      file_name char(40)
    )
    organization external
    (
      type oracle_loader
      default directory etl_dir
      access parameters
      (
        records delimited by newline
        preprocessor execdir:'preproc_dir.bat'
        skip 5
        load when (mod_dt != blanks)
        fields
        (
          mod_dt position (01:10) DATE mask "mm/dd/yyyy",
          mod_time position (11:20),
          file_type position (21:29),
          file_size position (30:38),
          file_name position (39:80)
        )
      )
      location ('dirfile.txt')
    )
    reject limit unlimited
    -- select from this table
    SQL> select * from dir_tab;
    MOD_DT        MOD_TIME       FILE_TYPE       FILE_SIZE      FILE_NAME
    16-DEC-10     10:12 AM       <DIR> .
    16-DEC-10     10:12 AM       <DIR> ..
    22-MAY-10     09:57 PM       <DIR> archive
    22-MAY-10     10:27 PM                                2,048 hc_alap112.dat
    05-DEC-10     07:07 PM                                   36 indata1.txt
    22-DEC-05     04:07 AM                               31,744 oradba.exe
    16-DEC-10     09:58 AM                                1,123 oradim.log
    28-SEP-10     12:41 PM                                1,536 PWDALAP112.ora
    16-DEC-10     09:58 AM                                2,560 SPFILEALAP112.ORA
    9 rows selected.
    -- select a file not updated in last 1 year
    SQL> select * from dir_tab where mod_dt < sysdate - 365;
    MOD_DT        MOD_TIME       FILE_TYPE       FILE_SIZE      FILE_NAME
    22-DEC-05     04:07 AM                               31,744 oradba.exe

  • KUP-01005 error in external table definition

    Hello,
    I'm facing an issue that I found has been raised in the past but never had an answer.
    I'm using an external table to load a file into the database.
    The file format is CSV (actually using ";" as the delimiter) and it has header and trailer records that have the same format as the data records but contain field names.
    E.g.:
    FIELD_NAME1;FIELD_NAME2;...
    I want to discard these records (for the header it is easy: I can use the SKIP 1 parameter), and the way I was trying to do this was using the following condition:
    LOAD WHEN ( FIELD_NAME1 != "FIELD_NAME1" )
    The problem arises when a record has the first field empty (hence NULL): the record is discarded.
    Therefore I tried modifying the condition in this way:
    LOAD WHEN ( ( FIELD_NAME1 != "FIELD_NAME1" ) OR ( ( FIELD_NAME1 = NULL ) )
    but in this case I get the following error:
    KUP-01005 : syntax error : found "null": expecting on of "blanks, double-quoted-string, hexprefix, identifier, single-quoted-string"
    I've doublechecked Oracle reference and the NULL keyword is admitted - morover it is stated that a NULL value can be tested only against NULL value, otherwise the comparison returns false.
    Can anyone help me solving this issue? Is it a bug in SQLLOADER Driver used by External Tables or an error in Oracle Reference?
    Regards

    It's another option that I've already tried, but it doesn't work - which makes sense, because in the external table statement I've specified field trimming (LRTRIM), so there are no fields containing blanks (they're simply empty).
    However, my main concern is that I tried to use a statement that should be correct - according to Oracle reference.
    Instead I get an error regarding the syntax of the statement...
    I think that I can workaround this problem - but it is a workaround, not a real solution... :-|
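    One more thing that may be worth trying (an untested suggestion): the expected-token list in the KUP-01005 message does include BLANKS, and the condition syntax accepts a comparison against BLANKS, so an OR branch such as the following should at least parse. Whether BLANKS also matches a zero-length field (as opposed to an all-blank one) is worth verifying against your data:
    LOAD WHEN ((FIELD_NAME1 != "FIELD_NAME1") OR (FIELD_NAME1 = BLANKS))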

  • External table.How to load numbers (decimal and scientific notation format)

    Hi all, I need to load into an external table records that contain 7 fields. The last field is called AMOUNT and it is represented in some records in decimal format and in other records in scientific notation, for example:
    CY001_STATU;2009;Jan;11220020GR;'03900;CYZ900;-9,99999999839929e-03
    CY001_STATU;2009;Jan;11200100;'60800;CYZ900;41380,77
    The External table's script is the following:
    CREATE TABLE HYP_DATA
    (
    COUNTRY VARCHAR2(50 BYTE),
    YEAR VARCHAR2(20 BYTE),
    PERIOD VARCHAR2(20 BYTE),
    ACCOUNT VARCHAR2(50 BYTE),
    DEPT VARCHAR2(20 BYTE),
    ACTIVITY_LOC VARCHAR2(20 BYTE),
    AMOUNT VARCHAR2(50 BYTE)
    )
    ORGANIZATION EXTERNAL
    ( TYPE ORACLE_LOADER
    DEFAULT DIRECTORY HYP_DATA_DIR
    ACCESS PARAMETERS
    ( RECORDS DELIMITED BY NEWLINE
    BADFILE 'HYP_BAD_DIR':'HYP_LOAD.bad'
    DISCARDFILE 'HYP_DISCARD_DIR':'HYP_LOAD.dsc'
    LOGFILE 'HYP_LOG_DIR':'HYP_LOAD.log'
    SKIP 0
    FIELDS TERMINATED BY ";"
    MISSING FIELD VALUES ARE NULL
    REJECT ROWS WITH ALL NULL FIELDS
    (
    "COUNTRY" Char,
    "YEAR" Char,
    "PERIOD" Char,
    "ACCOUNT" Char,
    "DEPT" Char,
    "ACTIVITY_LOC" Char,
    "AMOUNT" Char
    )
    )
    LOCATION (HYP_DATA_DIR:'Total.txt')
    )
    REJECT LIMIT UNLIMITED
    NOPARALLEL
    NOMONITORING;
    If, for the field AMOUNT I use the datatype VARCHAR (as above), the table is loaded but I have some records rejected, and all these records contain the last field AMOUNT with the scientific notation as:
    CY001_STATU;2009;Jan;11220020GR;'03900;CYZ900;-9,99999999839929e-03
    CY001_STATU;2009;Feb;11220020GR;'03900;CYZ900;-9,99999999839929e-03
    CY001_STATU;2009;Mar;11220020GR;'03900;CYZ900;-9,99999999839929e-03
    CY001_STATU;2009;Dec;11220020GR;'03900;CYZ900;-9,99999999839929e-03
    All the others records with a decimal AMOUNT are loaded correctly.
    So, my problem is that I NEED to load all the records (with the decimal and the scientific notation format) together (without records rejected), but I don't know which datatype I have to use for the AMOUNT field....
    Anybody has any idea ???
    Any help would be appreciated
    Thanks in advance
    Alex

    @OP,
    What version of Oracle are you using?
    Just cutting and pasting your script and example worked FINE for me.
    However, my question is... An external table will LOAD all data or none at all. How are you validating/concluding that...
    I have some records rejected, and all these records contain the last field AMOUNT with the scientific notation
    select * from v$version where rownum <2;
    Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bi
    select * from mydata;
    CY001_STATU     2009     Jan     11220020GR     '03900     CYZ900     -9,99999999839929e-03
    CY001_STATU     2009     Feb     11220020GR     '03900     CYZ900     -9,99999999839929e-03
    CY001_STATU     2009     Jan     11220020GR     '03900     CYZ900     -9,99999999839929e-03
    CY001_STATU     2009     Jan     11200100     '60800     CYZ900     41380,77
    CY001_STATU     2009     Mar     11220020GR     '03900     CYZ900     -9,99999999839929e-03
    CY001_STATU     2009     Dec     11220020GR     '03900     CYZ900     -9,99999999839929e-03
    CY001_STATU     2009     Jan     11220020GR     '03900     CYZ900     -9,99999999839929e-03
    CY001_STATU     2009     Jan     11200100     '60800     CYZ900     41380,77
    MYDATA table script is...
    drop table mydata;
    CREATE TABLE mydata
    (
    COUNTRY VARCHAR2(50 BYTE),
    YEAR VARCHAR2(20 BYTE),
    PERIOD VARCHAR2(20 BYTE),
    ACCOUNT VARCHAR2(50 BYTE),
    DEPT VARCHAR2(20 BYTE),
    ACTIVITY_LOC VARCHAR2(20 BYTE),
    AMOUNT VARCHAR2(50 BYTE)
    )
    ORGANIZATION EXTERNAL
    ( TYPE ORACLE_LOADER
    DEFAULT DIRECTORY IN_DIR
    ACCESS PARAMETERS
    ( RECORDS DELIMITED BY NEWLINE
    BADFILE 'IN_DIR':'HYP_LOAD.bad'
    DISCARDFILE 'IN_DIR':'HYP_LOAD.dsc'
    LOGFILE 'IN_DIR':'HYP_LOAD.log'
    SKIP 0
    FIELDS TERMINATED BY ";"
    MISSING FIELD VALUES ARE NULL
    REJECT ROWS WITH ALL NULL FIELDS
    (
    "COUNTRY" Char,
    "YEAR" Char,
    "PERIOD" Char,
    "ACCOUNT" Char,
    "DEPT" Char,
    "ACTIVITY_LOC" Char,
    "AMOUNT" Char
    )
    )
    LOCATION (IN_DIR:'total.txt')
    )
    REJECT LIMIT UNLIMITED
    NOPARALLEL
    NOMONITORING;
    vr,
    Sudhakar B.
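    If the aim is simply to get every record in and turn AMOUNT into a number afterwards, one option is to keep AMOUNT as VARCHAR2 in the external table (so nothing is rejected on datatype grounds) and convert it in the query or in the INSERT ... SELECT. A sketch, assuming the decimal separator in the file is always a comma and the session's decimal character is the period:
    SELECT country, year, period, account, dept, activity_loc,
           TO_NUMBER(REPLACE(amount, ',', '.')) AS amount
      FROM hyp_data;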

  • How to ''give'' error for this case of an EXTERNAL TABLE?

    Our external table routine works fine:
    -- We have a csv file with 2 cols.
    -- When we create the table referring the csv it works fine.
    -- Even if the csv has more the 2 cols, the ET command only takes the 2 cols and it works fine.
    -- Now, users are saying that if the csv has more than 2 cols, the ET command should give an error
    I went through the command but cannot find any clause which will do this.
    Is there any other way or workaround?
    CREATE TABLE <table_name> (
    <column_definitions>)
    ORGANIZATION EXTERNAL
    (TYPE oracle_loader
    DEFAULT DIRECTORY <oracle_directory_object_name>
    ACCESS PARAMETERS (
    RECORDS DELIMITED BY newline
    BADFILE <file_name>
    DISCARDFILE <file_name>
    LOGFILE <file_name>
    [READSIZE <bytes>]
    [SKIP <number_of_rows>]
    FIELDS TERMINATED BY '<terminator>'
    REJECT ROWS WITH ALL NULL FIELDS
    MISSING FIELD VALUES ARE NULL
    (<column_name_list>))
    LOCATION ('<file_name>'))
    [PARALLEL]
    REJECT LIMIT <UNLIMITED | integer>;
    Is it possible to use the READSIZE?
    Edited by: Channa on Sep 23, 2010 2:28 AM

    -- Now, users are saying that if the csv has more than 2 cols, the ET command should give an error
    I went through the command but cannot find any clause which will do this.
    Is there any other way or workaround?
    I looked at Serverprocess' SQL*Loader script and did not see how that would answer your question - how to raise an error if the file has more than 2 columns. If I missed something, can Serverprocess explain?
    I can't think of a direct way to do this with your external table either, but there may be indirect ways. Some brainstorming ideas of perhaps dubious usefulness follow.
    Placing a view over the external table can limit results to the first two columns but won't raise an error.
    A pipelined function can read the external table, check for data where there shouldn't be any, and raise an exception when you find data in columns where there should not be any.
    Similarly, you could ditch the external table and use utl_file to read the file, manually parsing and checking the data. LOTS more work but more control on your end. External tables are much easier to use :(
    Or, first load the external table into a work table before the "real" select. Check the work table for the offending data programmatically and raise an error if data is where it should not be. You could keep the existing external table and not have to do a lot of recoding.
    Or, also load the data into an otherwise unneeded global temporary table first. Use a trigger on the load to look for the unwanted data and raise an error if offending data is there.
    These ideas are boiling down to variations on validating the data before you use it.
    Good luck!
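    To make the "work table / check programmatically" idea a bit more concrete, here is a rough sketch (all names are made up): give the external table one spare trailing column, say EXTRA_COL, which MISSING FIELD VALUES ARE NULL leaves NULL for well-formed two-column rows, and fail the load whenever it is populated:
    DECLARE
      l_bad PLS_INTEGER;
    BEGIN
      -- ext_2col is the hypothetical external table with columns col1, col2, extra_col
      SELECT COUNT(*) INTO l_bad FROM ext_2col WHERE extra_col IS NOT NULL;
      IF l_bad > 0 THEN
        RAISE_APPLICATION_ERROR(-20001,
          'Input file has more than 2 columns in ' || l_bad || ' row(s)');
      END IF;
      -- safe to continue, e.g. INSERT INTO target SELECT col1, col2 FROM ext_2col;
    END;
    /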

  • Syntax Error, while Creating External Table

    Hi,
    Can anyone tell this Syntax is correct or not. And how it look like:
    CREATE TABLE emp_load (T_DATE DATE NOT NULL,
    T_NO DOUBLE PRECISION,
    T_TYPE CHAR(1),
    T_CO VARCHAR(30),
    T_CONTRACT DOUBLE PRECISION,
    T_PARTY VARCHAR(15),
    T_BILL_NO DOUBLE PRECISION,
    T_BILL_DATE DATE,
    T_QTY DOUBLE PRECISION,
    T_RATE DOUBLE PRECISION,
    T_BKG DOUBLE PRECISION NOT NULL,
    T_TAX DOUBLE PRECISION,
    T_OTHER DOUBLE PRECISION,
    T_DIV DOUBLE PRECISION,
    T_STAMPS DOUBLE PRECISION,
    T_DC_NO DOUBLE PRECISION,
    T_SPOT CHAR(2),
    T_CITY VARCHAR(3),
    T_ORDER DOUBLE PRECISION,
    T_TRADE DOUBLE PRECISION,
    T_TIME DATE,
    T_FORM CHAR(1),
    HEADER VARCHAR(2),
    T_DC_DATE DATE,
    T_POD VARCHAR(30),
    T_POD_DATE DATE,
    T_SET INTEGER,
    T_MARGIN DOUBLE PRECISION NOT NULL WITH DEFAULT,
    T_MKT_TYPE VARCHAR(3) NOT NULL,
    T_MEMBER_CODE VARCHAR(15),
    T_EXIM SMALLINT,
    T_P VARCHAR(15),
    T_C VARCHAR(30),
    T_TERM VARCHAR(8),
    T_CUST VARCHAR(12),
    T_SUBBKG DOUBLE PRECISION,
    T_BILL_TYPE VARCHAR(1),
    T_OPN_CLS DOUBLE PRECISION,
    T_AUCTION DOUBLE PRECISION,
    T_BRANCH VARCHAR(8),
    T_UNQID VARCHAR(20),
    T_SQ SMALLINT,
    T_CBKG DOUBLE PRECISION,
    T_COTHER DOUBLE PRECISION,
    T_MOD DATE,
    T_ORIG_CUST VARCHAR(20),
    T_TOT DOUBLE PRECISION,
    T_FIRM SMALLINT,
    T_ORDER_TIME DATE,
    T_NON_MT SMALLINT)
    ORGANIZATION EXTERNAL (TYPE ORACLE_LOADER DEFAULT DIRECTORY ext_tab_dir
    ACCESS PARAMETERS (RECORDS FIXED 62 FIELDS (T_DATE DATE NOT NULL,
    T_NO DOUBLE PRECISION,
    T_TYPE CHAR(1),
    T_CO VARCHAR(30),
    T_CONTRACT DOUBLE PRECISION,
    T_PARTY VARCHAR(15),
    T_BILL_NO DOUBLE PRECISION,
    T_BILL_DATE DATE,
    T_QTY DOUBLE PRECISION,
    T_RATE DOUBLE PRECISION,
    T_BKG DOUBLE PRECISION NOT NULL,
    T_TAX DOUBLE PRECISION,
    T_OTHER DOUBLE PRECISION,
    T_DIV DOUBLE PRECISION,
    T_STAMPS DOUBLE PRECISION,
    T_DC_NO DOUBLE PRECISION,
    T_SPOT CHAR(2),
    T_CITY VARCHAR(3),
    T_ORDER DOUBLE PRECISION,
    T_TRADE DOUBLE PRECISION,
    T_TIME DATE,
    T_FORM CHAR(1),
    HEADER VARCHAR(2),
    T_DC_DATE DATE,
    T_POD VARCHAR(30),
    T_POD_DATE DATE,
    T_SET INTEGER,
    T_MARGIN DOUBLE PRECISION NOT NULL WITH DEFAULT,
    T_MKT_TYPE VARCHAR(3) NOT NULL,
    T_MEMBER_CODE VARCHAR(15),
    T_EXIM SMALLINT,
    T_P VARCHAR(15),
    T_C VARCHAR(30),
    T_TERM VARCHAR(8),
    T_CUST VARCHAR(12),
    T_SUBBKG DOUBLE PRECISION,
    T_BILL_TYPE VARCHAR(1),
    T_OPN_CLS DOUBLE PRECISION,
    T_AUCTION DOUBLE PRECISION,
    T_BRANCH VARCHAR(8),
    T_UNQID VARCHAR(20),
    T_SQ SMALLINT,
    T_CBKG DOUBLE PRECISION,
    T_COTHER DOUBLE PRECISION,
    T_MOD DATE,
    T_ORIG_CUST VARCHAR(20),
    T_TOT DOUBLE PRECISION,
    T_FIRM SMALLINT,
    T_ORDER_TIME DATE,
    T_NON_MT SMALLINT))
    LOCATION ('BR271107.DAT'))
    Error at Command Line:28 Column:40
    Error report:
    SQL Error: ORA-00905: missing keyword
    00905. 00000 - "missing keyword"
    Thank u..!
    Ravi

    Where can we find that directory on the server? You have to provide for its existence.
    So you must create C:/Oracle on the server, or have your ext_tab_dir point to some existing directory on the server (preferably one dedicated to external tables, to avoid confusion).
    You must also make sure you have read and write OS rights and have granted read (and write) privileges on directory ext_tab_dir to your_user_name.
    Regards
    Etbin
    If you can use utl_file, try using utl_file_dir as your ext_tab_dir to perform the test => copy your file to the directory utl_file_dir is pointing to and do select * from test
    Message was edited by: Etbin
    user596003
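    For what it is worth, ORA-00905 at line 28, column 40 appears to point at the DB2-style NOT NULL WITH DEFAULT on T_MARGIN; that clause is not Oracle syntax, and the same construct appears again inside the ACCESS PARAMETERS field list, where NOT NULL is not valid either. The field list inside ACCESS PARAMETERS should use ORACLE_LOADER datatypes (CHAR, INTEGER EXTERNAL, POSITION ranges and so on) rather than SQL column datatypes. A cut-down sketch - the positions, lengths and date mask are invented for illustration:
    CREATE TABLE emp_load_sketch (
    t_date DATE,
    t_no   NUMBER,
    t_type CHAR(1))
    ORGANIZATION EXTERNAL
    (TYPE ORACLE_LOADER
    DEFAULT DIRECTORY ext_tab_dir
    ACCESS PARAMETERS
    (RECORDS FIXED 62
    FIELDS
    (t_date POSITION(1:10)  CHAR(10) DATE_FORMAT DATE MASK "dd/mm/yyyy",
    t_no   POSITION(11:20) CHAR(10),
    t_type POSITION(21:21) CHAR(1)))
    LOCATION ('BR271107.DAT'))
    REJECT LIMIT UNLIMITED;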
