Remove Enter Sign in External Table

Dear Gurus,
I have an external table that shows an 'enter sign' (a carriage return character) in the data.
I thought it was a blank space, but it is not.
How do I remove it? The TRIM function can't remove it.
Records are delimited by newline -> is there any syntax to prevent the enter sign from appearing?
Thank you
Regards
JOE

This seems to be a TOAD issue - please ask in the TOAD forums - http://toadworld.com/Discussions/tabid/824/Default.aspx
HTH
Srini

Similar Messages

  • Remove space in external table

    Dear Gurus,
    I generated an external table in Toad, but found that after generation, the data has spaces.
    My Oracle database version is 11g
    OS:Linux redhat 5
    I created a directory on the server, then created file.txt and the external table in the database. I have checked file.txt and there are no spaces.
    Where did the spaces come from?
    Please help
    Thank you
    JOE

    This seems to be a TOAD issue - please ask in the TOAD forums - http://toadworld.com/Discussions/tabid/824/Default.aspx
    HTH
    Srini

  • How to remove carriage returns from a field while loading an external table

    I am facing an issue getting rid of carriage returns present in the fields of the source .csv file while loading the records into an external table.
    I tried using LRTRIM, but it does not help.
    The error I am getting is:
    KUP-04021: field formatting error for field POPULATION_DESCRIPTION
    KUP-04037: terminator not found
    I am pasting one record out of the .csv file which is causing this error as below:
    "Business Card Accounts
    ",123,7 BizCard - Gamers,Control,"Business Card Accounts
    ",75270,75271
    You can see the carriage return in the 1st field as well as the 5th field. Fields are separated by commas and enclosed in double quotes.
    Could anybody help on this please?

    Can you copy the file to an external-table-only version, and then dos2unix it?
    Alternatively, you can use the ACCESS PARAMETERS area in your external table definition to specify that your RECORDS are DELIMITED BY carriage returns, instead of the default.
    Check out the example here http://www.psoug.org/reference/externaltab.html
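    For example, a minimal sketch of such a definition (the directory object, file name and column list are placeholders, not taken from the original post; 0x'0D0A' is the hex form of a CR/LF pair):
    CREATE TABLE pop_ext (
      population_description VARCHAR2(200),   -- placeholder columns
      pop_id                 NUMBER
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir              -- placeholder directory object
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY 0x'0D0A'
        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
        MISSING FIELD VALUES ARE NULL
      )
      LOCATION ('population.csv')             -- placeholder file name
    )
    REJECT LIMIT UNLIMITED;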

  • External Table LKM: error during task interpretation

    I've had this problem more than once now, using "LKM File to Oracle (EXTERNAL TABLE)". Before the interface can even run, I get an error, as if the script can't be output. I've tried removing the LKM and reimporting. No use. I am using this LKM successfully in other projects... not sure why it would have issues here. When I had this problem previously, Oracle Support couldn't reproduce it (they certainly tried). I managed to work around it then, but now... this is getting problematic. Insight welcome.
    Here's the start of the error:
    com.sunopsis.tools.core.exception.SnpsSimpleMessageException: Error during task interpretation
    Task:6
    java.lang.Exception: BeanShell script error: Sourced file: inline evaluation of: ``out.print("The application script threw an exception: java.lang.StringIndexOutOf . . . '' Token Parsing Error: Lexical error at line 2, column 42. Encountered: "\\" (92), after : "": <at unknown location>
    BSF info: Create external table at line: 0 column: columnNo
         at com.sunopsis.dwg.codeinterpretor.a.a(a.java)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
         at com.sunopsis.dwg.cmd.e.i(e.java)
         at com.sunopsis.dwg.cmd.h.y(h.java)
         at com.sunopsis.dwg.cmd.e.run(e.java)
         at java.lang.Thread.run(Unknown Source)
    Text:The application script threw an exception: java.lang.StringIndexOutOfBoundsException: *String index out of range: 2* BSF info: Create external table at line: 0 column: columnNo

    Hi,
    I have this exact problem, did you get a solution?
    Thanks

  • Best way to remove duplicates based on multiple tables

    Hi,
    I have a mechanism which loads flat files into multiple tables (can be up to 6 different tables) using external tables.
    Whenever a new file arrives, I need to insert the duplicate rows into a side table, but the duplicates have to be searched for in all 6 tables according to a given set of columns which exists in all of them.
    In the SQL Server version of the same mechanism (which I'm migrating to Oracle) there is an additional "UNIQUE" table with only 2 columns (Checksum1, Checksum2), which holds the checksum values of 2 different sets of columns per inserted record. When a new file arrives it computes these 2 checksums for every record and looks them up in the unique table, to avoid searching all the different tables.
    We know that working with checksums is not bulletproof but with those sets of fields it seems to work.
    My questions are:
    Should I use the same checksum mechanism? If so, should I use the owa_opt_lock.checksum function to calculate the checksums?
    Or should I look for duplicates in all tables one after the other (indexing some of the columns we check for duplicates with)?
    Note:
    These tables are partitioned with day partitions and can be very large.
    Any advice would be welcome.
    Thanks.

    >
    I need to keep duplicate rows in a side table and not load them into table1...table6
    >
    Does that mean that you don't want ANY row if it has a duplicate on your 6 columns?
    Let's say I have six records that have identical values for your 6 columns. One record meets the condition for table1, one for table2 and so on.
    Do you want to keep one of these records and put the other 5 in the side table? If so, which one should be kept?
    Or do you want all 6 records put in the side table?
    You could delete the duplicates from the temp table as the first step. Or better
    1. add a new column WHICH_TABLE NUMBER to the temp table
    2. update the new column to -1 for records that are dups.
    3. update the new column (might be done with one query) to set the table number based on the conditions for each table
    4. INSERT INTO TABLE1 SELECT * FROM TEMP_TABLE WHERE WHICH_TABLE = 1
    INSERT INTO TABLE6 SELECT * FROM TEMP_TABLE WHERE WHICH_TABLE = 6
    When you are done, WHICH_TABLE will be flagged with:
    1. NULL if a record was not a DUP but was not inserted into any of your tables - possible error record to examine
    2. -1 if a record was a DUP
    3. 1 - if the record went to table 1 (2 for table 2 and so on)
    This 'flag and then select' approach performs better than deleting records after each select, especially if the flagging can be done in one pass (full table scan). A sketch follows at the end of this reply.
    See this other thread (or many, many others on the net) from today for how to find and remove duplicates
    Best way of removing duplicates
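    A minimal sketch of that 'flag and then select' approach (TEMP_TABLE, the key columns KEY1/KEY2 and the routing condition are hypothetical placeholders, not from the original posts):
    ALTER TABLE temp_table ADD (which_table NUMBER);
    -- Step 2: flag duplicates; repeat the EXISTS test for table2 .. table6
    UPDATE temp_table t
       SET t.which_table = -1
     WHERE EXISTS (SELECT 1 FROM table1 x
                   WHERE x.key1 = t.key1 AND x.key2 = t.key2);
    -- Step 3: route the remaining rows; the condition per target table is hypothetical
    UPDATE temp_table t
       SET t.which_table = 1
     WHERE t.which_table IS NULL
       AND t.record_type = 'T1';
    -- Step 4: load each target, then keep the flagged duplicates in the side table
    -- (list the columns explicitly, since WHICH_TABLE was added to TEMP_TABLE)
    INSERT INTO table1 (key1, key2, record_type)
      SELECT key1, key2, record_type FROM temp_table WHERE which_table = 1;
    INSERT INTO side_table (key1, key2, record_type)
      SELECT key1, key2, record_type FROM temp_table WHERE which_table = -1;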

  • External table no data

    I created an external table with no errors, but when I try to issue SQL on that external table there is no data. Here's a sub-set of my data and the SQL used to create the external table. 10.2.0.2 on AIX 5.2 64-bit.
    "BoxC_SL.lwl"|106|"tzdpship52"|1|1|1/28/2008 0:00:00|12/30/1899 9:56:03|||||
    "Genshp_SL.lwl"|106|"tzdpship52"|1|1|1/28/2008 0:00:00|12/30/1899 9:56:04|||||
    "TRLAB-001B_600.lwl"|126|"\\Ibm1\tzsgprod63"|2|1|1/28/2008 0:00:00|12/30/1899 9:56:08|"14928758"|"89-5425.04"|||
    "PalC_SL.lwl"|106|"tzdpship52"|1|1|1/28/2008 0:00:00|12/30/1899 9:56:11|||||
    "SSCC_SL.lwl"|106|"tzdpship52"|1|1|1/28/2008 0:00:00|12/30/1899 9:56:12|||||
    CREATE TABLE "DMR"."LABELS"
    ( "LABEL_NAME" VARCHAR2(100),
    "PRINTER_NUM" NUMBER,
    "PRINTER_NAME" VARCHAR2(100),
    "QTY" NUMBER,
    "DUPLICATES" NUMBER,
    "PRINT_DATE" DATE,
    "PRINT_TIME" TIMESTAMP,
    "LOT" NUMBER,
    "PART" VARCHAR(25),
    "BATCH" VARCHAR(25),
    "WONUM" NUMBER,
    "ITEM" VARCHAR(25)
    ORGANIZATION EXTERNAL
    ( TYPE ORACLE_LOADER
    DEFAULT DIRECTORY "LISTENER_LOG_DIR"
    ACCESS PARAMETERS
    ( records delimited by newline
    nobadfile
    nologfile
    nodiscardfile
    fields terminated by "|" lrtrim
    missing field values are null
    label_name,
    printer_num,
    printer_name,
    qty,
    duplicates,
    print_date char(8) date_format
    date mask "MM/DD/YY",
    print_time char(8) date_format
    date mask "HH24:MI:SS",
    lot,
    part,
    batch,
    wonum,
    item
    LOCATION
    ( 'AUDIT_001.txt'
    REJECT LIMIT UNLIMITED;
    Table created.
    sys@DMRDEV> select * from dmr.labels where rownum < 10;
    no rows selected
    sys@DMRDEV> select count(*) from dmr.labels;
    COUNT(*)
    0

    Hi,
    Here we go
    SQL> drop table t11;
    Table dropped.
    SQL>
    SQL> create table t11 (
    2 "LABEL_NAME" VARCHAR2(20),
    3 "PRINTER_NUM" NUMBER,
    4 "PRINTER_NAME" VARCHAR2(20),
    5 "QTY" NUMBER,
    6 "DUPLICATES" NUMBER,
    7 "PRINT_DATE" date,
    8 "PRINT_TIME" TIMESTAMP,
    9 "LOT" NUMBER,
    10 "PART" VARCHAR(25),
    11 "BATCH" VARCHAR(25),
    12 "WONUM" NUMBER,
    13 "ITEM" VARCHAR(25)
    14 )
    15 organization external (
    16 type oracle_loader
    17 default directory LISTENER_LOG_DIR
    18 access parameters (
    19 records delimited by 0x'0a'
    20 fields terminated by "|" optionally enclosed by '"'
    21 missing field values are null
    22 (
    23 "LABEL_NAME" ,
    24 "PRINTER_NUM" ,
    25 "PRINTER_NAME" ,
    26 "QTY" ,
    27 "DUPLICATES" ,
    28 "PRINT_DATE" DATE "MM/DD/YYYY",
    29 "PRINT_TIME" char(25) DATE_FORMAT timestamp mask "MM/DD/YYYY HH24:MI:SS",
    30 "LOT" ,
    31 "PART" ,
    32 "BATCH" ,
    33 "WONUM" ,
    34 "ITEM"
    35 )
    36 )
    37 location ('holidays.txt')
    38 )
    39 reject limit unlimited;
    Table created.
    SQL>
    I am displaying the count only since you can query select * from t11 and check.. :-)
    SQL> select count(*) from t11;
    COUNT(*)
    5
    Modify the data like this - remove the enclosing double quotes:
    BoxC_SL.lwl|106|tzdpship52|1|1|01/5/2008|12/30/1899 9:56:03|||||
    Genshp_SL.lwl|106|tzdpship52|1|1|01/28/2008|12/30/1899 9:56:04|||||
    PalC_SL.lwl|106|tzdpship52|1|1|1/28/2008|12/30/1899 9:56:11|||||
    SSCC_SL.lwl|106|tzdpship52|1|1|1/28/2008|12/30/1899 9:56:12|||||
    TRLAB-001B_600.lwl|126|\\Ibm1\tzsgprod63|2|1|1/28/2008|12/30/1899 9:56:08|14928758|89-5425.04|||
    Satisfied... Answered question correctly.. ;-)
    - Pavan Kumar

  • How can an external table handle data with line feed between delimiters?

    I have defined an external table as below. My data is pipe delimited and comes from a DOS system.
    I already remove any carriage returns before putting the file into the DATA_DIR for reading. But
    I have found that some of my VARCHAR fields have embedded line feeds.
    Is it possible to have a definition that would remove any line feed characters between the delimiters?
    Below I also threw together a sample data set where ID #2 has that extra character. Yes, I could
    write an awk script to pre-process all my data files. But I am hoping there is a way for Oracle
    to also do this.
    I understand LDRTRIM removes any leading and trailing spaces in the delimited field. Is there a
    REPLACE or TRANSLATE option? I did a bit of searching but I must be asking the wrong things.
    Thanks for any help
    Eric
    CREATE TABLE table_ext
    (
      id        NUMBER,
      desc1     VARCHAR2(64 CHAR),
      desc2     VARCHAR2(255 CHAR),
      add_date  DATE
    )
    ORGANIZATION EXTERNAL
    (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir
      ACCESS PARAMETERS
      (
        RECORDS DELIMITED BY NEWLINE
        CHARACTERSET WE8ISO8859P1
        BADFILE     log_dir:'table_ext.bad'
        DISCARDFILE log_dir:'table_ext.dis'
        LOGFILE     log_dir:'table_ext.log'
        FIELDS TERMINATED BY '|' LDRTRIM
        MISSING FIELD VALUES ARE NULL
        (
          id        INTEGER EXTERNAL(38),
          desc1     CHAR(64),
          desc2     CHAR(255),
          add_date  CHAR DATE_FORMAT DATE MASK "yyyy-mm-dd hh24:mi"
        )
      )
      LOCATION( 'data.txt' )
    )
    PARALLEL
    REJECT LIMIT UNLIMITED;
    1|short desc|long desc|2001-01-01 00:00
    2|short desc| long
    desc |1999-03-03 23:23
    3|short desc|  long  desc  | 2011-02-02 02:02

    Thanks for looking. But that relates to the record delimiter which in my case is the pipe character '|'. In my various data sets this is consistent. I expect each record to be one per line. But between two delimiters some data has a line feed. So I'm looking for a method that will "cleanup" the field data as it gets imported.
    I was hoping there was an option that would ignore any embedded line feeds (\n) characters. I.e., those not at the end of the line.
    Eric
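    One workaround (not mentioned in this thread) is to leave the DOS CR/LF record terminators in the file, delimit records by CR/LF so that a bare line feed inside a field no longer ends a record, and then strip the stray line feeds when querying. A sketch, assuming the table_ext definition above with RECORDS DELIMITED BY 0x'0D0A' instead of NEWLINE:
    CREATE OR REPLACE VIEW table_ext_clean AS
    SELECT id,
           REPLACE(desc1, CHR(10)) AS desc1,   -- REPLACE with no third argument removes the character
           REPLACE(desc2, CHR(10)) AS desc2,
           add_date
    FROM   table_ext;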

  • IGNORE COMMENTS IN EXTERNAL TABLE FILE

    I currently have a partitioned table, split by month going back some 7 years.
    The business has now agreed to archive off some of the data but to have it
    available if required. I therefore intend to select, month by month, the data
    to a flat file and make use of external tables to provide the data for the
    business if required. Then to remove the month's partition, thus releasing
    space back to the database. Each month's data is only about 100 bytes long, but
    there could be about 15 million rows.
    I can successfully create the external table and read the data.
    However I would like to place a header in the spooled file but the only method
    I have found to ignore the header is to use the "SKIP" parameter. I would
    rather not use this as it hard codes the number of lines I reserve for a
    header.
    Is there another method to add a comment to an external table's file and have the
    selects on the file ignore the comments?
    Could I use "LOAD WHEN ( column1 != '--' )" as each comment line begins with a
    double dash?
    Also, is there a limit to the size of an external table's file?

    When I added the following lines to attempt to
    exclude the header lines:
    RECORDS DELIMITED BY newline
    NOBADFILE
    NODISCARDFILE
    NOLOGFILE
    SKIP 0
    LOAD WHEN ( cvt_seq_num LIKE '--%' )
    FIELDS
    TERMINATED BY ","
    OPTIONALLY ENCLOSED BY '"'
    MISSING FIELD VALUES ARE NULL
    REJECT ROWS WITH ALL NULL FIELDS
    ( cvt_seq_num INTEGER EXTERNAL(12)
    I got:
    select * from old_cvt_external
    ERROR at line 1:
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-00554: error encountered while parsing access parameters
    KUP-01005: syntax error: found "identifier": expecting one of: "equal,notequal"
    KUP-01008: the bad identifier was: LIKE
    KUP-01007: at line 6 column 33
    ORA-06512: at "SYS.ORACLE_LOADER", line 14
    ORA-06512: at line 1
    When I moved the line as follows:
    RECORDS DELIMITED BY newline
    NOBADFILE
    NODISCARDFILE
    NOLOGFILE
    SKIP 0
    FIELDS
    TERMINATED BY ","
    OPTIONALLY ENCLOSED BY '"'
    MISSING FIELD VALUES ARE NULL
    REJECT ROWS WITH ALL NULL FIELDS
    LOAD WHEN ( cvt_seq_num LIKE '--%' )
    I got:
    select * from old_cvt_external
    ERROR at line 1:
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-00554: error encountered while parsing access parameters
    KUP-01005: syntax error: found "load": expecting one of: "exit, ("
    KUP-01007: at line 11 column 11
    ORA-06512: at "SYS.ORACLE_LOADER", line 14
    ORA-06512: at line 1
    So it appears that the "LOAD WHEN" does not work.
    Can you please suggest anything else?
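    The ORACLE_LOADER LOAD WHEN condition only supports = and != (no LIKE), which matches the "expecting one of: equal,notequal" message. One possible workaround - a sketch, not tested here, and only if your version accepts a byte-range comparison in the condition - is to test the first two bytes of the record instead, keeping the clause with the other record options:
    RECORDS DELIMITED BY newline
    NOBADFILE
    NODISCARDFILE
    NOLOGFILE
    LOAD WHEN (1:2) != '--'
    FIELDS
    TERMINATED BY ","
    OPTIONALLY ENCLOSED BY '"'
    MISSING FIELD VALUES ARE NULL
    REJECT ROWS WITH ALL NULL FIELDS
    followed by the original field list. Rows that fail the LOAD WHEN test should simply not be loaded (they would normally go to the discard file).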

  • Position in external table

    We have an Excel spreadsheet.
    We wanted to load data from the Excel spreadsheet into Oracle via an external table.
    I saved my file into .txt format.
    Now I need to create the external table in the Oracle DB.
    My question is as follows:
    Do I need to define the exact length of the datatype for each field?
    Right now the data in the CITY column takes 16 characters in the data file,
    but I defined size 30 - is that OK?
    SQL>CREATE TABLE External_City
    (
    CITY VARCHAR2(30),
    STATE VARCHAR2(20),
    ZIP VARCHAR2(10),
    COUNTRY VARCHAR2(30)
    )
    ORGANIZATION EXTERNAL
    (
    TYPE ORACLE_LOADER
    DEFAULT DIRECTORY DATALOAD
    ACCESS PARAMETERS
    (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
    )
    LOCATION ('city_comma.txt')
    )
    REJECT LIMIT UNLIMITED;
    Do we need to define the data position in the external table?

    I do not think it has to do with your data or the position of data in it. I am not sure what or where your file is. You need to ensure accessibility to it - it is not seeing the file at all.
    BTW - avoid using the udump location as your directory location. Change the directory location (I am not saying it is the cause).
    Here is my test. Although my class.dat file is empty, it shows that your external table should work once your file and directory are accessible. I have only replaced DATALOAD with mytest_dir.
    testdb@DBMSDIRECT01:/export/home/oracle/dbmsdirect/test:touch class.dat
    testdb@DBMSDIRECT01:/export/home/oracle/dbmsdirect/test:sqlplus /nolog
    SQL*Plus: Release 10.2.0.2.0 - Production on Mon May 19 15:34:55 2008
    Copyright (c) 1982, 2005, Oracle.  All Rights Reserved.
    SQL> connect testuser
    Enter password:
    Connected.
    SQL> create or replace directory mytest_dir as '/export/home/oracle/dbmsdirect/test';
    Directory created.
    SQL>
    SQL>
    SQL> create table external_class
      2  (
      3  item_no number(10),
      4  class varchar2(255),
      5  level_date date,
      6  note varchar2(4),
      7  sched_level varchar2(10),
      8  min_range number(12),
      9  max_range number(14)
    10  )
    11  ORGANIZATION EXTERNAL
    12  (
    13  TYPE oracle_loader
    14  DEFAULT DIRECTORY mytest_dir
    15  ACCESS PARAMETERS (
    16  RECORDS DELIMITED BY NEWLINE
    17  FIELDS TERMINATED BY ','
    18  MISSING FIELD VALUES ARE NULL
    19  REJECT ROWS WITH ALL NULL FIELDS
    20  (item_no, class, level_date, note,sched_level,min_range,max_range)
    21  )
    22  LOCATION ('class.dat')
    23  )
    24  PARALLEL
    25  REJECT LIMIT 0;
    Table created.
    SQL> desc external_class
    Name                                      Null?    Type
    ITEM_NO                                            NUMBER(10)
    CLASS                                              VARCHAR2(255)
    LEVEL_DATE                                         DATE
    NOTE                                               VARCHAR2(4)
    SCHED_LEVEL                                        VARCHAR2(10)
    MIN_RANGE                                          NUMBER(12)
    MAX_RANGE                                          NUMBER(14)
    SQL>
    SQL> select * from external_class;
    no rows selected
    SQL>
    SQL> ho rm class.dat
    SQL>
    SQL> select * from external_class;
    select * from external_class
    ERROR at line 1:
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-04040: file class.dat in MYTEST_DIR not found
    ORA-06512: at "SYS.ORACLE_LOADER", line 19
    SQL>
    SQL> ho touch class.dat
    SQL>
    SQL> select * from external_class;
    no rows selected
    SQL>
    SQL>
    SQL> col DIRECTORY_PATH format a50
    SQL> set lines 200
    SQL> select * from dba_directories where DIRECTORY_NAME = 'MYTEST_DIR';
    OWNER                          DIRECTORY_NAME                 DIRECTORY_PATH
    SYS                            MYTEST_DIR                     /export/home/oracle/dbmsdirect/test
    SQL>
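    On the original question about positions: for a comma-delimited file like city_comma.txt you do not need to give positions or exact lengths - VARCHAR2(30) for a 16-character city value is fine. Explicit positions are only needed for fixed-width files, along these lines (the column widths here are made up for illustration):
    ACCESS PARAMETERS
    (
      RECORDS DELIMITED BY NEWLINE
      FIELDS
      (
        city    POSITION(1:30)  CHAR(30),
        state   POSITION(31:50) CHAR(20),
        zip     POSITION(51:60) CHAR(10),
        country POSITION(61:90) CHAR(30)
      )
    )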

  • IGNORE COMMENTS IN EXTERNAL TABLE

    I currently have a partitioned table, split by month going back some 7 years.
    The business has now agreed to archive off some of the data but to have it
    available if required. I therefore intend to select, month by month, the data
    to a flat file and make use of external tables to provide the data for the
    business if required. Then to remove the month's partition, thus releasing
    space back to the database. Each month's data is only about 100 bytes long, but
    there could be about 15 million rows.
    I can successfully create the external table and read the data.
    However I would like to place a header in the spooled file but the only method
    I have found to ignore the header is to use the "SKIP" parameter. I would
    rather not use this as it hard codes the number of lines I reserve for a
    header.
    Is there another method to add a comment to an external table's file and have the
    selects on the file ignore the comments?
    Could I use "LOAD WHEN ( column1 != '--' )" as each comment line begins with a
    double dash?
    Also, is there a limit to the size of an external table's file?
    Thanks
    JD

    Hi Gurus..
    any help will be highly appreciated.

  • Oracle External Table SELECT FROM problem from workstation

    At Solaris Oracle database server console.
    Created directory on the Solaris server as Oracle owner.
    As SYS created oracle directory object on that Solaris directory
    Granted read/write to SCHEMA_OWNER on that oracle directory object.
    Using OS Oracle user, moved a tab-delimited flat file into the Solaris directory.
    Logged on as SCHEMA_OWNER using SQLPLUS on the Solaris database server.
    Ran external table script in SQLPLUS.
    External table was created.
    Ran Select * from ext_table.
    Appropriate dataset returned.
    So at the Oracle Solaris server console,
    the create external table script worked without errors,
    and a select from statement on the external table worked without errors.
    Move to developer workstation:
    Logged on as SCHEMA_OWNER using SQLPLUS on Windows workstation.
    TNSNAMES of course points to the Solaris database server.
    Ran the external table script in SQLPLUS.
    External table was created.
    Ran Select * from ext_table.
    This failed with these errors:
    ORA-29913: error executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    can't open logfile.log
    OS permission denied
    ORA-06512: at "SYS.ORACLE_LOADER", line 19
    So how can I select from external tables from a windows workstation?
    Any guidance on this problem would be appreciated.

    Thank you for your response.
    I am not creating the external table on the workstation.
    In both cases,
    from the Solaris console
    and from the Windows workstation
    the external table is being created on the Solaris file system
    in an Oracle Directory object
    running an SQL script from SQLPLUS prompt.
    The external table creator, schema owner, has read and write permissions
    granted by SYS on the oracle directory object.
    The problem is:
    Logged in as the schema owner,
    you can select * from ext_table from the Solaris console SQLPLUS,
    but you can not select * from ext_table from the Windows workstation SQLPLUS.
    You can DROP TABLE ext_table from the Windows workstation SQLPLUS.
    You just cannot select * from ext_table from the Windows workstation SQLPLUS.
    I guess as a test,
    I can drop from the ext_table creation script the references
    to discardfile, badfile, and logfile,
    and remove these other file objects from the equation,
    since this appears to be a file permissions problem.

  • Failed to parse SQL Query - External Tables [Apex 4.0.2]

    Greetings experts -
    Has anyone encountered errors when creating a report in Apex that sources from an external table within the database? I'm using the 4.0.2 version that is packaged with the 11g XE edition on 64-bit CentOS.
    For example, I might run:
    SELECT NULL LINK,
    COL1 LABEL,
    COL2 VALUE
    FROM MYTAB;
    Where MYTAB is an external table sitting on top of a comma-separated file. When I go to create the page, I'm prompted with the "Failed to parse SQL" dialogue.
    I noticed that if I did CTAS on the external table, and referenced the CTAS table, my page ran without problem!
    Any ideas? Is this a known "limitation" of Apex?
    Thanks,
    CJ

    Chiedu,
    Please try removing all declarative validations on this tabular form, and see if it works as expected then. There are some limitations on the type of views and joins that are supported by tabular forms when using the new declarative validations, i.e. you'll need a key preserved table in your view or join.
    Regards,
    Marc

  • Invalid number error when using external table

    Hello.
    I have a problem creating an external table with a number field.
    My txt file looks like:
    11111|text text|03718
    22222|text text text|04208
    33333|text|04215
    I try to create the external table like this:
    create table table_ex (
    id varchar2(5),
    name varchar2(100),
    old_id number(5))
    organization external (Type oracle_loader default directory dir
    access parameters(
    RECORDS DELIMITED BY NEWLINE
    fields terminated by '|'
    (id, name, old_id))
    location ('file.txt'));
    When I create the table and run a select, I get this in the log file:
    Field Definitions for table TABLE_EX
    Record format DELIMITED BY NEWLINE
    Data in file has same endianness as the platform
    Rows with all null fields are accepted
    Fields in Data Source:
    ID CHAR (255)
    Terminated by "|"
    Trim whitespace same as SQL Loader
    NAME CHAR (255)
    Terminated by "|"
    Trim whitespace same as SQL Loader
    OLD_ID CHAR (255)
    Terminated by "|"
    Trim whitespace same as SQL Loader
    error processing column OLD_ID in row 1 for datafile
    /dir/file.txt
    ORA-01722: invalid number
    What's the problem?
    Any idea?
    Thanks
    Message was edited by:
    DejanH

    Try this:
    create table table_ex
    (
    id varchar2(5),
    name varchar2(100),
    old_id number
    )
    organization external
    (Type oracle_loader default directory dir access parameters
    ( RECORDS DELIMITED BY NEWLINE fields terminated by '|'
    (id CHAR(5),
    name CHAR(100),
    old_id CHAR(5)
    )
    )
    location ('file.txt')
    );
    I have removed the precision from the NUMBER field and added the character lengths in the field list instead.
    Aalap Sharma :)
    Message was edited by:
    Aalap Sharma

  • Oracle 11g - External Table/SQL Developer Issue?

    Oracle 11g - External Table/SQL Developer Issue?
    ==============================
    I hope this is the right forum for this issue; if not, let me know where to go.
    We are using Oracle 11g (11.2.0.1.0) on (Platform : solaris[tm] oe (64-bit)), Sql Developer 3.0.04
    We are trying to use an Oracle external table to load text files in .csv format. Here is what our data looks like.
    ======================
    Date1,date2,Political party,Name, ROLE
    20-Jan-66,22-Nov-69,Democratic,"John ", MMM
    22-Nov-70,20-Jan-71,Democratic,"John Jr.",MMM
    20-Jan-68,9-Aug-70,Republican,"Rick Ford Sr.", MMM
    9-Aug-72,20-Jan-75,Republican,Henry,MMM
    ------ ALL NULL -- record
    20-Jan-80,20-Jan-89,Democratic,"Donald Smith",MMM
    ======================
    Our external table structure is as follows:
    CREATE TABLE P_LOAD
    (
    DATE1 VARCHAR2(10),
    DATE2 VARCHAR2(10),
    POL_PRTY VARCHAR2(30),
    P_NAME VARCHAR2(30),
    P_ROLE VARCHAR2(5)
    )
    ORGANIZATION EXTERNAL
    (TYPE ORACLE_LOADER
    DEFAULT DIRECTORY P_EXT_TAB_D
    ACCESS PARAMETERS (
    RECORDS DELIMITED by NEWLINE
    SKIP 1
    FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"' LDRTRIM
    REJECT ROWS WITH ALL NULL FIELDS
    MISSING FIELD VALUES ARE NULL
    (
    DATE1 CHAR (10) Terminated by "," ,
    DATE2 CHAR (10) Terminated by "," ,
    POL_PRTY CHAR (30) Terminated by "," ,
    P_NAME CHAR (30) Terminated by "," OPTIONALLY ENCLOSED BY '"' ,
    P_ROLE CHAR (5) Terminated by ","
    )
    )
    LOCATION ('Input.dat')
    )
    REJECT LIMIT UNLIMITED;
    It was created successfully using SQL Developer.
    Here is the issue:
    It is not loading the records where fields are enclosed in '"' (Rec # 2, 3, 4, 7).
    It is loading the all-NULL record (Rec # 6).
    *** If we remove the '"' from the input data, it loads all records, including the all-NULL record.
    Log file has
    KUP-04021: field formatting error for field P_NAME
    KUP-04036: second enclosing delimiter not found
    KUP-04101: record 2 rejected in file ....
    Our questions
    Why did "REJECT ROWS WITH ALL NULL FIELDS" not working?
    Why did Terminated by "," OPTIONALLY ENCLOSED BY '"' not working?
    Any idea?
    Thanks in helping.

    I don't think this is a SQLDeveloper issue. You will get better answers in the Database - General or perhaps SQL and PL/SQL forums.

  • Oracle 11g - External Table Issue SQL - PL/SQL?

    Oracle 11g - External Table Issue?
    =====================
    I hope this is the right forum for this issue; if not, let me know where to go.
    We are using Oracle 11g (11.2.0.1.0) on (Platform : solaris[tm] oe (64-bit)), Sql Developer 3.0.04
    We are trying to use an Oracle external table to load text files in .csv format. Here is what our data looks like.
    ======================
    Date1,date2,Political party,Name, ROLE
    20-Jan-66,22-Nov-69,Democratic,"John ", MMM
    22-Nov-70,20-Jan-71,Democratic,"John Jr.",MMM
    20-Jan-68,9-Aug-70,Republican,"Rick Ford Sr.", MMM
    9-Aug-72,20-Jan-75,Republican,Henry,MMM
    ALL NULL -- record
    20-Jan-80,20-Jan-89,Democratic,"Donald Smith",MMM
    ======================
    Our external table structure is as follows:
    CREATE TABLE P_LOAD
    (
    DATE1 VARCHAR2(10),
    DATE2 VARCHAR2(10),
    POL_PRTY VARCHAR2(30),
    P_NAME VARCHAR2(30),
    P_ROLE VARCHAR2(5)
    )
    ORGANIZATION EXTERNAL
    (TYPE ORACLE_LOADER
    DEFAULT DIRECTORY P_EXT_TAB_D
    ACCESS PARAMETERS (
    RECORDS DELIMITED by NEWLINE
    SKIP 1
    FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"' LDRTRIM
    REJECT ROWS WITH ALL NULL FIELDS
    MISSING FIELD VALUES ARE NULL
    (
    DATE1 CHAR (10) Terminated by "," ,
    DATE2 CHAR (10) Terminated by "," ,
    POL_PRTY CHAR (30) Terminated by "," ,
    P_NAME CHAR (30) Terminated by "," OPTIONALLY ENCLOSED BY '"' ,
    P_ROLE CHAR (5) Terminated by ","
    )
    )
    LOCATION ('Input.dat')
    )
    REJECT LIMIT UNLIMITED;
    It was created successfully using SQL Developer.
    Here is the issue:
    It is not loading the records where fields are enclosed in '"' (Rec # 2, 3, 4, 7).
    It is loading the all-NULL record (Rec # 6).
    *** If we remove the '"' from the input data, it loads all records, including the all-NULL record.
    Log file has
    KUP-04021: field formatting error for field P_NAME
    KUP-04036: second enclosing delimiter not found
    KUP-04101: record 2 rejected in file ....
    Our questions
    Why did "REJECT ROWS WITH ALL NULL FIELDS" not working?
    Why did Terminated by "," OPTIONALLY ENCLOSED BY '"' not working?
    Any idea?
    Thanks in helping.
    Edited by: qwe16235 on Jun 11, 2011 11:31 AM

    I'm not sure, but maybe you should get rid of the redundancy that you have in your CREATE TABLE statement.
    This line covers all fields:
    FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"'
    So I would change the field list to:
    DATE1 CHAR (10),
    DATE2 CHAR (10),
    POL_PRTY CHAR (30),
    P_NAME CHAR (30),
    P_ROLE CHAR (5)
    It worked on my installation.                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                   
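    Putting that together, the access parameters would look something like this (a sketch based on the posted definition, not re-tested here):
    ACCESS PARAMETERS (
      RECORDS DELIMITED by NEWLINE
      SKIP 1
      FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"' LDRTRIM
      REJECT ROWS WITH ALL NULL FIELDS
      MISSING FIELD VALUES ARE NULL
      (
        DATE1    CHAR(10),
        DATE2    CHAR(10),
        POL_PRTY CHAR(30),
        P_NAME   CHAR(30),
        P_ROLE   CHAR(5)
      )
    )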
