External Table - Record delimiter
Hi,
Is it possible to have a combination of a character and NEWLINE as a record delimiter in an external table, i.e. something like RECORDS DELIMITED BY '#^'||chr(10)?
Thanks
S. Sathish Kumar
You can always read the manuals:
http://download-west.oracle.com/docs/cd/B10501_01/server.920/a96652/ch12.htm#1007431
Assuming you mean that the #^ is at the end of the line, then it's just
RECORDS DELIMITED BY '#^'
If you mean that you sometimes have #^ in the middle, and should only break when it's at the end of the line, then you need to use (on Windows)
RECORDS DELIMITED BY '#^\r\n'
Similar Messages
-
External Table, Handling Delimited and Special Character in file
Hi,
I have created one external table with these options:
( TYPE ORACLE_LOADER
DEFAULT DIRECTORY ***************************************
ACCESS PARAMETERS
( RECORDS DELIMITED BY NEWLINE
SKIP 0
FIELDS TERMINATED BY '|'
OPTIONALLY ENCLOSED BY '"'
MISSING FIELD VALUES ARE NULL
LOCATION
( 'test_feed.csv'
Now the problem is that these come through as valid:
anupam|anupam2
anupam"test|anupam"test2
"anupam|test3"|test3
anupam""""test5|test5
anupam"|test7
but these do not come through as valid:
"anupam"test4"|test4 --> Case where the field is enclosed in quotes but also contains quotes. I guess in this case we can send the field without the closing double quotes.
"anupam|test6 --> If the field starts with a double quote, it fails.
"anupam"test8|test8"|test8 --> If a field contains both a pipe (|) and double quotes, we send it enclosed in double quotes, but that fails the job.
Can you suggest the best way to handle such a scenario? (One restriction, though: the file is used by another system, Netezza, as well, which can't take a delimiter longer than one character. :'( )
One approach is to define the external table as a ONE-column table (with a single field per file line). This way each line comes in as one row in the external table. Of course, you then have to build "parsing logic" on top of that.
DROP TABLE xtern_table;
CREATE TABLE xtern_table (
c1 VARCHAR2(4000)
)
ORGANIZATION EXTERNAL (
TYPE ORACLE_LOADER
DEFAULT DIRECTORY xtern_data_dir
ACCESS PARAMETERS (
RECORDS DELIMITED BY NEWLINE
FIELDS TERMINATED BY '~' ---- <<<<<<<< Use a field terminator that is not found in the file
MISSING FIELD VALUES ARE NULL
( c1 CHAR(4000) )
)
LOCATION ('mycsv.csv')
);
> desc xtern_table
desc xtern_table
Name Null Type
C1 VARCHAR2(4000)
> column c1 format A40
> select * from xtern_table
C1
anupam|anupam2
anupam"test|anupam"test2
"anupam|test3"|test3
anupam""""test5|test5
anupam"|test7
"anupam"test4"|test4
"anupam|test6
"anupam"test8|test8"|test8
8 rows selected
Ideally, it would be good to have an incoming source file with a predictable format.
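The "parsing logic" itself is left to the reader in the thread. As a hedged sketch (the view and output column names are hypothetical, and this simple version splits on every pipe, including pipes inside quotes):

```sql
-- Hypothetical parsing view over the one-column external table.
-- REGEXP_SUBSTR pulls out the Nth pipe-separated token; note that an
-- empty token (two adjacent pipes) simply shifts the later tokens here.
CREATE OR REPLACE VIEW xtern_parsed AS
SELECT REGEXP_SUBSTR(c1, '[^|]+', 1, 1) AS field1,
       REGEXP_SUBSTR(c1, '[^|]+', 1, 2) AS field2
FROM   xtern_table;
```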
Hope this helps. -
External Table : ROW Delimiter
Hi,
I am working on an interesting solution that involves modifying the definitions of external tables dynamically. I have created a procedure that alters the delimiters (ROW and COLUMN) using dynamic SQL.
CREATE OR REPLACE PROCEDURE SP_CHANGE_DELIMITER (
P_RECORD_DELIM VARCHAR2,
P_FIELD_DELIM VARCHAR2,
P_DEBUG NUMBER DEFAULT 0
)
AS
P_RECORD_DELIM_ VARCHAR2(100) := P_RECORD_DELIM;
P_FIELD_DELIM_ VARCHAR2(100) := P_FIELD_DELIM;
sql_ VARCHAR2(4000);
BEGIN
IF P_RECORD_DELIM_ IS NOT NULL
THEN
P_RECORD_DELIM_ := ''''||P_RECORD_DELIM_||'''';
ELSE
P_RECORD_DELIM_ := 'NEWLINE';
END IF;
IF P_FIELD_DELIM_ IS NOT NULL
THEN
P_FIELD_DELIM_ := ''''||P_FIELD_DELIM_||'''';
END IF;
sql_ :=
'ALTER TABLE EXTERN_EMPL_RPT
ACCESS PARAMETERS (
RECORDS DELIMITED BY '||P_RECORD_DELIM_||'
FIELDS TERMINATED BY '||P_FIELD_DELIM_||'
)';
IF NVL(P_DEBUG,0) = 1 THEN
DBMS_OUTPUT.PUT_LINE(sql_);
END IF;
EXECUTE IMMEDIATE sql_;
END;
I am able to dynamically change the definition of the COLUMN delimiter using my procedure:
EXEC SP_CHANGE_DELIMITER(P_RECORD_DELIM=> '', P_FIELD_DELIM => '*', P_DEBUG=>1);
However, when I try to change the ROW delimiter, I get this error:
EXEC SP_CHANGE_DELIMITER(P_RECORD_DELIM=> '|', P_FIELD_DELIM => '#',P_DEBUG=>1);
SQL> SELECT * FROM EXTERN_EMPL_RPT;
SELECT * FROM EXTERN_EMPL_RPT
ERROR at line 1:
ORA-29913: error in executing ODCIEXTTABLEFETCH callout
ORA-30653: reject limit reached
ORA-06512: at "SYS.ORACLE_LOADER", line 52
ORA-06512: at line 1
I am working with data that looks like this:
001 | Sandeep | Seshan
002 | Seshan | Sandeep
If I try to change this to:
001 # Sandeep # Seshan |
002 # Seshan # Sandeep |
I get the ORA-29913 error.
Can you please help me with this sticky bit ?
Please do let me know if I need to include more information.
Thanks,
Sandeep
Try increasing the reject limit to unlimited!
ALTER TABLE EXTERN_EMPL_RPT REJECT LIMIT UNLIMITED; -
External Table - How to load numbers (decimal and scientific notation format)
Hi all, I need to load into an external table records that contain 7 fields. The last field is called AMOUNT and it's represented in some records in decimal format and in other records in scientific notation format, as, for example, below:
CY001_STATU;2009;Jan;11220020GR;'03900;CYZ900;-9,99999999839929e-03
CY001_STATU;2009;Jan;11200100;'60800;CYZ900;41380,77
The External table's script is the following:
CREATE TABLE HYP_DATA
(
COUNTRY VARCHAR2(50 BYTE),
YEAR VARCHAR2(20 BYTE),
PERIOD VARCHAR2(20 BYTE),
ACCOUNT VARCHAR2(50 BYTE),
DEPT VARCHAR2(20 BYTE),
ACTIVITY_LOC VARCHAR2(20 BYTE),
AMOUNT VARCHAR2(50 BYTE)
)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
DEFAULT DIRECTORY HYP_DATA_DIR
ACCESS PARAMETERS
( RECORDS DELIMITED BY NEWLINE
BADFILE 'HYP_BAD_DIR':'HYP_LOAD.bad'
DISCARDFILE 'HYP_DISCARD_DIR':'HYP_LOAD.dsc'
LOGFILE 'HYP_LOG_DIR':'HYP_LOAD.log'
SKIP 0
FIELDS TERMINATED BY ";"
MISSING FIELD VALUES ARE NULL
REJECT ROWS WITH ALL NULL FIELDS
(
"COUNTRY" Char,
"YEAR" Char,
"PERIOD" Char,
"ACCOUNT" Char,
"DEPT" Char,
"ACTIVITY_LOC" Char,
"AMOUNT" Char
)
)
LOCATION (HYP_DATA_DIR:'Total.txt')
)
REJECT LIMIT UNLIMITED
NOPARALLEL
NOMONITORING;
If, for the field AMOUNT, I use the datatype VARCHAR2 (as above), the table is loaded but some records are rejected, and all of these rejected records have the last field AMOUNT in scientific notation, such as:
CY001_STATU;2009;Jan;11220020GR;'03900;CYZ900;-9,99999999839929e-03
CY001_STATU;2009;Feb;11220020GR;'03900;CYZ900;-9,99999999839929e-03
CY001_STATU;2009;Mar;11220020GR;'03900;CYZ900;-9,99999999839929e-03
CY001_STATU;2009;Dec;11220020GR;'03900;CYZ900;-9,99999999839929e-03
All the others records with a decimal AMOUNT are loaded correctly.
So, my problem is that I NEED to load all the records (both the decimal and the scientific notation formats) together, without rejected records, but I don't know which datatype I have to use for the AMOUNT field...
Does anybody have any idea?
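(Editorial aside, not from the thread: one hedged approach is to keep AMOUNT as VARCHAR2 in the external table and convert on read; the view name is hypothetical.)

```sql
-- Sketch: keep AMOUNT as text in the external table and convert on read.
-- Assumes the session's NLS decimal character is '.'; the comma in the data
-- is swapped for a period first. TO_NUMBER accepts E-notation strings, so
-- both '41380,77' and '-9,99999999839929e-03' convert after the REPLACE.
CREATE OR REPLACE VIEW hyp_data_num AS
SELECT country, year, period, account, dept, activity_loc,
       TO_NUMBER(REPLACE(amount, ',', '.')) AS amount
FROM   hyp_data;
```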
Any help would be appreciated
Thanks in advance
Alex
@OP,
What version of Oracle are you using?
Just cutting and pasting your script and example worked FINE for me.
However, my question is... an external table will LOAD all data or none at all. How are you validating/concluding that...
I have some records rejected, and all these records contain the last field AMOUNT with the scientific notation
select * from v$version where rownum <2;
Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bi
select * from mydata;
CY001_STATU 2009 Jan 11220020GR '03900 CYZ900 -9,99999999839929e-03
CY001_STATU 2009 Feb 11220020GR '03900 CYZ900 -9,99999999839929e-03
CY001_STATU 2009 Jan 11220020GR '03900 CYZ900 -9,99999999839929e-03
CY001_STATU 2009 Jan 11200100 '60800 CYZ900 41380,77
CY001_STATU 2009 Mar 11220020GR '03900 CYZ900 -9,99999999839929e-03
CY001_STATU 2009 Dec 11220020GR '03900 CYZ900 -9,99999999839929e-03
CY001_STATU 2009 Jan 11220020GR '03900 CYZ900 -9,99999999839929e-03
CY001_STATU 2009 Jan 11200100 '60800 CYZ900 41380,77
The MYDATA table script is...
drop table mydata;
CREATE TABLE mydata
(
COUNTRY VARCHAR2(50 BYTE),
YEAR VARCHAR2(20 BYTE),
PERIOD VARCHAR2(20 BYTE),
ACCOUNT VARCHAR2(50 BYTE),
DEPT VARCHAR2(20 BYTE),
ACTIVITY_LOC VARCHAR2(20 BYTE),
AMOUNT VARCHAR2(50 BYTE)
)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
DEFAULT DIRECTORY IN_DIR
ACCESS PARAMETERS
( RECORDS DELIMITED BY NEWLINE
BADFILE 'IN_DIR':'HYP_LOAD.bad'
DISCARDFILE 'IN_DIR':'HYP_LOAD.dsc'
LOGFILE 'IN_DIR':'HYP_LOAD.log'
SKIP 0
FIELDS TERMINATED BY ";"
MISSING FIELD VALUES ARE NULL
REJECT ROWS WITH ALL NULL FIELDS
(
"COUNTRY" Char,
"YEAR" Char,
"PERIOD" Char,
"ACCOUNT" Char,
"DEPT" Char,
"ACTIVITY_LOC" Char,
"AMOUNT" Char
)
)
LOCATION (IN_DIR:'total.txt')
)
REJECT LIMIT UNLIMITED
NOPARALLEL
NOMONITORING;
vr,
Sudhakar B. -
Hi there!
I've got a problem that is driving me quite mad.
I'm working on Oracle 9.2.0.7; the Unix is HP-UX.
The problem is:
Using an external table with records in variable format and '^' as the separator, if the file size is an exact power of 1024 (i.e. 1K, 2K, 1M, etc.) I get the error:
reject limit reached
The strange thing is that the file is correct, and other files are loaded with no problem at all.
For now the "workaround" is to edit the file (vi on the file, so we check that there aren't strange chars) and then add a space at the end of it.
After this the file is loaded!
Anyone have an idea?
Thank you,
Antonio
Sorry guys, I was a bit busy.
I've talked with a coworker; we got an example table and a more precise idea: it's not "a power of 1K" but "a power of 1M".
This is the example table:
CREATE TABLE ext_table (
c_field_1 CHAR(1),
v_field_2 VARCHAR2(15),
v_field_3 VARCHAR2(24),
v_field_4 VARCHAR2(24),
v_field_5 VARCHAR2(14),
v_field_6 VARCHAR2(14),
n_field_7 NUMBER(6,0),
n_field_8 NUMBER(6,0),
n_field_9 NUMBER(10,0),
v_field_10 VARCHAR2(79),
v_field_11 VARCHAR2(12),
c_field_12 CHAR(1),
v_field_13 VARCHAR2(2),
v_field_14 VARCHAR2(7),
v_field_15 VARCHAR2(30),
v_field_16 VARCHAR2(14),
d_field_17 DATE,
n_field_18 NUMBER(8,0),
d_field_19 DATE,
n_field_20 NUMBER(8,0),
v_field_21 VARCHAR2(191),
d_field_22 DATE,
n_field_23 NUMBER(8,0),
v_field_24 VARCHAR2(2),
n_field_25 NUMBER(6,0),
n_field_26 NUMBER(10,0))
ORGANIZATION EXTERNAL (
DEFAULT DIRECTORY DR_LOAD
ACCESS PARAMETERS(records delimited by newline
badfile 'DR_ERROR':'EXT_TABLE_LOAD.error'
logfile 'DR_LOG':'EXT_TABLE_LOAD.log'
fields (
c_field_1 POSITION(2:2) CHAR(1)
,v_field_2 POSITION(3:17) CHAR(15)
,v_field_3 POSITION(18:41) CHAR(24)
,v_field_4 POSITION(42:65) CHAR(24)
,v_field_5 POSITION(66:79) CHAR(14)
,v_field_6 POSITION(80:93) CHAR(14)
,n_field_7 POSITION(94:99) INTEGER EXTERNAL(6)
,n_field_8 POSITION(100:105) INTEGER EXTERNAL(6)
,n_field_9 POSITION(106:115) INTEGER EXTERNAL(10)
,v_field_10 POSITION(116:194) CHAR(79)
,v_field_11 POSITION(195:206) CHAR(12)
,c_field_12 POSITION(207:207) CHAR(1)
,v_field_13 POSITION(208:209) CHAR(2)
,v_field_14 POSITION(210:216) CHAR(7)
,v_field_15 POSITION(217:246) CHAR(30)
,v_field_16 POSITION(247:260) CHAR(14)
,d_field_17 POSITION(261:274) DATE YYYYMMDDHH24MISS
,n_field_18 POSITION(261:268) INTEGER EXTERNAL(8)
,d_field_19 POSITION(275:288) DATE YYYYMMDDHH24MISS
,n_field_20 POSITION(275:282) INTEGER EXTERNAL(8)
,v_field_21 POSITION(289:479) CHAR(191)
,d_field_22 POSITION(480:493) DATE YYYYMMDDHH24MISS
,n_field_23 POSITION(480:487) INTEGER EXTERNAL(8)
,v_field_24 POSITION(494:495) CHAR(2)
,n_field_25 POSITION(496:501) INTEGER EXTERNAL(6)
,n_field_26 POSITION(502:511) INTEGER EXTERNAL(10)
)
)
LOCATION (
DR_LOAD:'EXT_TABLE_LOAD'
)
);
and a row:
2T0123456789 11111111 2008091008264220080910092642020202020304000000000200613359956 0006133599560L2*************************************999912310000002008091008264220080910092642L2061335995620202203042008091008264220080910092642 20090319112538000808291000021162
If you replicate that row, which is 512 bytes, 2048 times, so that you have a file that is exactly 1MB, then any select on the external table will fail.
The error log EXT_TABLE_LOAD.log says:
KUP-04021: field formatting error for field C_FIELD_1
KUP-04023: field start is after end of record
KUP-04101: record 2049 rejected in file <file>
(NOTE: this is not true, the file is correct! If you try to load the single row it loads nicely)
and you'll have an empty EXT_TABLE_LOAD.error
Hope this will help.
Bye,
Antonio -
SQL Loader/External Table multiple record delimiters
Hi everyone.
I have a strange problem. I have an external csv file which I wish to deal with (external tables or SQL*Loader). This csv is totally unorganized in structure and contains records that are mixed together, meaning that only some records are delimited by newline characters. So in short, I want to know whether I will be able to load the data in this csv, separating records by the newline character and another character. Is it possible to have multiple record delimiters specified in the same ctl file?
abohsin,
I think using the stream record format would be helpful in your case. Please explore that.
Using the stream record option, instead of the default newline, you can specify a user-defined record delimiter.
Check this link.
http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/ldr_control_file.htm#i1005509
Here is what I did. Not the complete answer, but it might be helpful.
Replace all delimiters with a standard delimiter (in Unix):
sed 's/HEAD,/**DLM**/g' < test.dat > test2.dat
sed 's/TAIL,/**DLM**/g' < test2.dat > test3.dat
create table t(
TEXT varchar2(100)
);
and use that delimiter as the standard delimiter.
load data
infile "test3.dat" "str '**DLM**'"
into table T
TRUNCATE
fields terminated by 'XXXXX' optionally enclosed by '"'
(
TEXT
)
sql> select * from t;
TEXT
1111,2222,
4444,5555,
4444
1111,3333,
8888,6666,
5555
You should also replace newline characters with '**DLM**'.
Thanks,
Rajesh. -
ORA-06502 error with external table having long records
I'm getting a strange error with the oci driver that I don't get with the thin driver.
The basic situation is that we are using external tables, and both the OCI and thin drivers had been working until we tested with a table that had longer records than the previous tables. The new table has 1800-byte records.
The thin driver still works fine with the new table. However, the oci driver generates the following error with the new table:
ORA-06502: PL/SQL: numeric or value error: character string buffer too small
I suspect that with the OCI driver, the OCI DLLs are reading the external file and can't handle the longer record length.
Particulars
- Oracle DB Server is 10.2.0.1 on SunOS 5.9
- OCI Instant Client instantclient_10_2 on Windows XP
- OCI client code from DB server installation running on DB server
- Works with thin driver from Windows XP and on DB server machine for all record lengths
- Works with both OCI drivers for records < 1800 bytes (don't know actual limit)
- Fails with both OCI drivers for records = 1800 bytes.
Does anyone out there have any thoughts?
Thanks in advance.
Your access parameters are in the wrong order. External tables are a bit fussy like that. Refer to the access_parameters section in the Utilities manual and follow the order there. From memory it will go something like this:
RECORDS DELIMITED...
LOG/BAD/DISCARDFILE...
FIELDS TERMINATED...LDRTRIM
MISSING FIELD VALUES...
fields...
)
Regards... -
Issues with external table from text file (tab delimiter)
Hello Guru,
--Data in my file, file name : TEST1 ( tab delimiter )
"C1" "C2"
"test column1" "01/27/2012"
"test column1" "01/27/2012"
"test column1" "01/09/2012"
-- Table
CREATE TABLE EXT_TEST
(
C1 VARCHAR2(50 BYTE),
C2 VARCHAR2(12 BYTE)
)
ORGANIZATION EXTERNAL
(
TYPE ORACLE_LOADER
DEFAULT DIRECTORY "TEST"
ACCESS PARAMETERS (
RECORDS DELIMITED BY NEWLINE
SKIP 1
FIELDS TERMINATED BY X'9'
MISSING FIELD VALUES ARE NULL
REJECT ROWS WITH ALL NULL FIELDS
)
LOCATION ('TEST1')
)
REJECT LIMIT UNLIMITED;
-- my current output
select * from EXT_TEST ;
C1 C2
"test column1" "01/27/2012"
"test column1" "01/27/2012"
"test column1" "01/09/2012"
-- I need the output this way (without the enclosing quotes)
C1 C2
test column1 01/27/2012
test column1 01/27/2012
test column1 01/09/2012
my version :
Oracle Database 10g Enterprise Edition Release 10.2.0.5.0 - 64bi
PL/SQL Release 10.2.0.5.0 - Production
"CORE 10.2.0.5.0 Production"
TNS for HPUX: Version 10.2.0.5.0 - Production
NLSRTL Version 10.2.0.5.0 - Production
Please help me resolve this and learn further steps.
Thank You!
Try this:
-- Etc ---
FIELDS TERMINATED BY X'9'
OPTIONALLY ENCLOSED BY '"'
missing field VALUES are NULL REJECT ROWS
-- Etc ---
:p -
External tables in Oracle 11g database is not loading null value records
We have upgraded our DB from Oracle 9i to 11g...
It was noticed that the data load to external tables in 9i rejected records with null columns. However, after upgrading to 11g, it allows the records with null values.
Is there a way to restrict loading the records that have a few columns that are null?
Can you please share whether this is the expected behaviour in Oracle 11g?
Thanks.
Data isn't really loaded into an external table. Rather, the external table lets you query an external data source as if it were a regular database table. To not see the rows with the NULL value, simply filter those rows out with your SQL statement:
SELECT * FROM my_external_table WHERE colX IS NOT NULL;
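(Editorial aside, not from the thread: the ORACLE_LOADER driver also supports a LOAD WHEN clause in the access parameters, which rejects such records before they are returned at all. A hedged sketch, with colX standing in for a real field name:)

```sql
-- Sketch: skip any record whose colX field is blank.
-- LOAD WHEN conditions are evaluated per record by the access driver,
-- so filtered records never appear in query results.
ACCESS PARAMETERS (
  RECORDS DELIMITED BY NEWLINE
  LOAD WHEN (colX != BLANKS)
  FIELDS TERMINATED BY ','
  MISSING FIELD VALUES ARE NULL
)
```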
HTH,
Brian -
External Table - possible bug related to record size and total bytes in file
I have an External Table defined with fixed record size, using Oracle 10.2.0.2.0 on HP/UX. At 279 byte records (1 or more fields, doesn't seem to matter), it can read almost 5M bytes in the file (17,421 records to be exact). At 280 byte records, it can not, but blows up with "partial record at end of file" - which is nonsense. It can read up to 3744 records, just below 1,048,320 bytes (1M bytes). 1 record over that, it blows up.
Now, If I add READSIZE and set it to 1.5M, then it works. I found this extends further, for instance 280 recsize with READSIZE 1.5M will work for a while but blows up on 39M bytes in the file (I didn't bother figuring exactly where it stops working in this case). Increasing READSIZE to 5M works again, for 78M bytes in file. But change the definition to have 560 byte records and it blows up. Decrease the file size to 39M bytes and it still won't work with 560 byte records.
Anyone have any explanation for this behavior? The docs say READSIZE is the read buffer, but only mentions that it is important to the largest record that can be processed - mine are only 280/560 bytes. My table definition is practically taken right out of the example in the docs for fixed length records (change the fields, sizes, names and it is identical - all clauses the same).
We are going to be using these external tables a lot, and need them to be reliable, so increasing READSIZE to the largest value I can doesn't make me comfortable, since I can't be sure in production how large an input file may become.
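(Editorial aside: READSIZE goes inside the access parameters; a hedged sketch of where the clause sits, with illustrative sizes rather than anything from the thread:)

```sql
-- Sketch: fixed 280-byte records with an explicit 5 MB read buffer.
-- READSIZE is the size in bytes of the access driver's read buffer.
ACCESS PARAMETERS (
  RECORDS FIXED 280
  READSIZE 5242880  -- 5 MB; raise this if large files hit buffer limits
  FIELDS ( ... )
)
```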
Should I report this as a bug to Oracle, or am I missing something?
Thanks,
Bob
External table max record length check
This question is regarding fixed length input file.
I am checking to see if it is possible to add a check to verify the max length of a record when defining an external table.
For example, if I am expecting a fixed length file with max record/row length of 100, is it possible to reject lines in the
file that are more than 100 characters long?
Thanks.
What you can do is something like:
DROP TABLE TBL_EXT
/
CREATE TABLE TBL_EXT(
VAL VARCHAR2(4),
INDICATOR VARCHAR2(1)
)
ORGANIZATION EXTERNAL(
TYPE ORACLE_LOADER
DEFAULT DIRECTORY TEMP
ACCESS PARAMETERS (
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
MISSING FIELD VALUES ARE NULL
REJECT ROWS WITH ALL NULL FIELDS
(
VAL POSITION(1:4),
INDICATOR POSITION(5:5) CHAR NOTRIM
)
)
LOCATION ('tbl_ext.txt')
)
/
SELECT *
FROM TBL_EXT
WHERE INDICATOR IS NULL
/
Now tbl_ext.txt:
X
XX
XXX
XXXX
XXXX XXXXXX
XXXX <-- this line has trailing spaces
XXX
XX
X
Now:
SQL> CREATE TABLE TBL_EXT(
2 VAL VARCHAR2(4),
3 INDICATOR VARCHAR2(1)
4 )
5 ORGANIZATION EXTERNAL(
6 TYPE ORACLE_LOADER
7 DEFAULT DIRECTORY TEMP
8 ACCESS PARAMETERS (
9 FIELDS TERMINATED BY ','
10 OPTIONALLY ENCLOSED BY '"'
11 MISSING FIELD VALUES ARE NULL
12 REJECT ROWS WITH ALL NULL FIELDS
13 (
14 VAL POSITION(1:4),
15 INDICATOR POSITION(5:5) CHAR NOTRIM
16 )
17 )
18 LOCATION ('tbl_ext.txt')
19 )
20 /
Table created.
SQL> SELECT *
2 FROM TBL_EXT
3 WHERE INDICATOR IS NULL
4 /
VAL I
X
XX
XXX
XXXX
XXX
XX
X
7 rows selected.
As you can see, lines 'XXXX XXXXXX' and 'XXXX ' (the one with trailing spaces) were not selected, since WHERE INDICATOR IS NULL evaluates to FALSE for those rows.
SY. -
How to import tab delimited using external table?
Hi all,
I'm using an external table to import data from a tab-delimited file. However, I keep getting an error message. I used both TERMINATED BY 0x'09' and X'09' but still could not get it to work.
Does anyone know how to solve this problem?
Thank you very much.
Hi,
Try this:
import datafile="/myfiles/mydata" out=mydata dbms=tab replace delimiter='&' getnames=yes
run;
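(Editorial aside: the snippet above appears to be SAS import syntax rather than Oracle. In Oracle external-table access parameters, a tab terminator is usually written as a hex literal; a hedged sketch with hypothetical table, directory, and file names:)

```sql
-- Sketch: tab-delimited file read through an external table.
CREATE TABLE tab_ext (
  c1 VARCHAR2(50),
  c2 VARCHAR2(50)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY my_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY 0x'09'  -- the tab character; X'9' also appears in other replies here
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('mydata.txt')
)
REJECT LIMIT UNLIMITED;
```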
Thanks -
Error: while Selecting External table
Hi everybody,
When I select from an external table I am getting this error. The file is like this:
229|1|506460|SIGROUP |4890|100|0|0|10:31:01|2007/12/17|M009|20191395001|L|B|12|CLIENT|INE547A01012|10:31:00|
229|1|506460|SIGROUP |4900|900|0|0|10:31:01|2007/12/17|M009|20191395001|L|B|13|CLIENT|INE547A01012|10:31:00|
229|1|500407|SWARAJENG |21400|300|0|0|10:33:28|2007/12/17|OWN|20191397001|L|B|154|OWN|INE277A01016|10:33:28|
I had created the Table like this:
SQL> CREATE TABLE TEMP_SAUDA
2 (S_A VARCHAR2(20),
3 S_TYPE VARCHAR2(20),
4 S_CO VARCHAR2(20),
5 S_CONAME VARCHAR2(40),
6 S_RATE NUMBER,
7 S_QTY NUMBER,
8 S_G NUMBER,
9 S_H NUMBER,
10 S_TIME TIMESTAMP WITH TIME ZONE,
11 S_DATE DATE,
12 S_PCODE VARCHAR2(20),
13 S_SETNO VARCHAR2(20),
14 S_M VARCHAR2(20),
15 S_N VARCHAR2(20),
16 S_O VARCHAR2(20),
17 S_CLIENTOWN VARCHAR2(10),
18 S_ISIN VARCHAR2(12),
19 S_ORDER_TIME TIMESTAMP WITH TIME ZONE
20 )
21 ORGANIZATION EXTERNAL
22 (TYPE oracle_loader
23 DEFAULT DIRECTORY BSE17122007
24 ACCESS PARAMETERS
25 (RECORDS DELIMITED BY NEWLINE
26 FIELDS
27 (
28 S_A CHAR(20),
29 S_TYPE CHAR(20),
30 S_CO CHAR(20),
31 S_CONAME CHAR(20),
32 S_RATE CHAR(20),
33 S_QTY CHAR(20),
34 S_G CHAR(20),
35 S_H CHAR(20),
36 S_TIME CHAR(35) date_format TIMESTAMP WITH TIMEZONE mask "DD-MON-RR HH.MI.SSXFF AM TZH:TZM",
37 S_DATE CHAR(22) date_format DATE mask "mm/dd/yyyy hh:mi:ss ",
38 S_PCODE CHAR(20),
39 S_SETNO CHAR(20),
40 S_M CHAR(20),
41 S_N CHAR(20),
42 S_O CHAR(20),
43 S_CLIENTOWN CHAR(20),
44 S_ISIN CHAR(20),
45 S_ORDER_TIME date_format TIMESTAMP WITH TIMEZONE mask "DD-MON-RR HH.MI.SSXFF AM TZH:TZM"
46 )
47 )
48 location (BSE17122007:'BR171207.DAT')
49 )
50 ;
Table created.
SQL> SELECT * FROM TEMP_SAUDA;
SELECT * FROM TEMP_SAUDA
ERROR at line 1:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-00554: error encountered while parsing access parameters
KUP-01005: syntax error: found "date_format": expecting one of: "binary_double,
binary_float, comma, char, date, defaultif, decimal, double, float, integer, (,
nullif, oracle_date, oracle_number, position, raw, recnum, ), unsigned,
varrawc, varchar, varraw, varcharc, zoned"
KUP-01007: at line 21 column 14
ORA-06512: at "SYS.ORACLE_LOADER", line 19
Is there any mistake in this table creation?
What do I have to declare for the time format if the format in the file is hh:mm:ss?
Thank you!
Ravi
The output you posted is completely wrong; I could not even create the table without errors.
Try with this.
CREATE TABLE TEMP_SAUDA
(S_A VARCHAR2(20),
S_TYPE VARCHAR2(20),
S_CO VARCHAR2(20),
S_CONAME VARCHAR2(40),
S_RATE NUMBER,
S_QTY NUMBER,
S_G NUMBER,
S_H NUMBER,
S_TIME TIMESTAMP WITH TIME ZONE,
S_DATE DATE,
S_PCODE VARCHAR2(20),
S_SETNO VARCHAR2(20),
S_M VARCHAR2(20),
S_N VARCHAR2(20),
S_O VARCHAR2(20),
S_CLIENTOWN VARCHAR2(10),
S_ISIN VARCHAR2(12),
S_ORDER_TIME TIMESTAMP WITH TIME ZONE
)
ORGANIZATION EXTERNAL
(TYPE oracle_loader
DEFAULT DIRECTORY BSE17122007
ACCESS PARAMETERS
(RECORDS DELIMITED BY NEWLINE
FIELDS terminated by "|"
(
S_A CHAR(20),
S_TYPE CHAR(20),
S_CO CHAR(20),
S_CONAME CHAR(20),
S_RATE CHAR(20),
S_QTY CHAR(20),
S_G CHAR(20),
S_H CHAR(20),
S_TIME CHAR(8) date_format TIMESTAMP WITH TIMEZONE mask "HH.MI.SSXFF AM TZH:TZM",
S_DATE CHAR(10) date_format DATE mask "yyyy/mm/dd",
S_PCODE CHAR(20),
S_SETNO CHAR(20),
S_M CHAR(20),
S_N CHAR(20),
S_O CHAR(20),
S_CLIENTOWN CHAR(20),
S_ISIN CHAR(20),
S_ORDER_TIME char(8) date_format TIMESTAMP WITH TIMEZONE mask "HH.MI.SSXFF AM TZH:TZM"
)
)
LOCATION (BSE17122007:'BR171207.DAT')
);
With this you get:
SQL> col s_time format a40
SQL> col s_date format a40
SQL> col s_order_time format a40
SQL> r
1* select s_time,s_date,s_order_time from temp_sauda
S_TIME S_DATE S_ORDER_TIME
01-JAN-08 10.31.01.000000 AM +00:00 17.DEC.2007 00:00:00 01-JAN-08 10.31.00.000000 AM +00:00
01-JAN-08 10.31.01.000000 AM +00:00 17.DEC.2007 00:00:00 01-JAN-08 10.31.00.000000 AM +00:00
01-JAN-08 10.33.28.000000 AM +00:00 17.DEC.2007 00:00:00 01-JAN-08 10.33.28.000000 AM +00:00
Be aware that your file does not contain date information for the time fields, so as you see above they are defaulted to 01-JAN-08 for the S_TIME and S_ORDER_TIME columns. -
Error while creating external table
Hi, I tried to create an external table. The table is created, but selecting from it throws the errors below:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04040: file Countries1.txt in EXT_TABLES not found
ORA-06512: at "SYS.ORACLE_LOADER", line 19
I've created a temp directory on Windows under the Oracle directory "C:\oracle\product\10.2.0\temp".
In the temp directory I have a text file, countries1.txt.
the text file has the below information
ENG,England,English
SCO,Scotland,English
IRE,Ireland,English
WAL,Wales,Welsh
I've connected as the SYSTEM user, created a directory, and granted read and write permissions to user SCOTT.
SQL> create or replace directory ext_tables as 'C:\oracle\product\10.2.0\temp\';
Directory created.
SQL> grant read,write on directory ext_tables to scott;
Grant succeeded.
The external table creation query is:
CREATE TABLE countries_ext (
country_code VARCHAR2(5),
country_name VARCHAR2(50),
country_language VARCHAR2(50)
)
ORGANIZATION EXTERNAL (
TYPE ORACLE_LOADER
DEFAULT DIRECTORY ext_tables
ACCESS PARAMETERS (
RECORDS DELIMITED BY NEWLINE
FIELDS TERMINATED BY ','
MISSING FIELD VALUES ARE NULL
(
country_code CHAR(5),
country_name CHAR(50),
country_language CHAR(50)
)
)
LOCATION ('Countries1.txt')
)
PARALLEL 5
REJECT LIMIT UNLIMITED;
And the error is:
SQL> select *from countries_ext;
select *from countries_ext
ERROR at line 1:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04040: file Countries1.txt in EXT_TABLES not found
ORA-06512: at "SYS.ORACLE_LOADER", line 19
SQL>
Please help me with this.
You are missing something. Most probably the file does not exist in your specified path. This works in my 10.2.0.3.
Step1: Check the file is actually there.
C:\oracle\product\10.2.0>mkdir temp
C:\oracle\product\10.2.0>cd temp
C:\oracle\product\10.2.0\temp>dir
Volume in drive C is C_Drive
Volume Serial Number is 8A93-1441
Directory of C:\oracle\product\10.2.0\temp
07/30/2011 12:00 PM <DIR> .
07/30/2011 12:00 PM <DIR> ..
07/30/2011 12:00 PM 79 countries1.txt
1 File(s) 79 bytes
2 Dir(s) 50,110,582,784 bytes free
C:\oracle\product\10.2.0\temp>type countries1.txt
ENG,England,English
SCO,Scotland,English
IRE,Ireland,English
WAL,Wales,Welsh
C:\oracle\product\10.2.0\temp>
Step 2: Creating the directory object.
SQL> show user
USER is "SYS"
SQL> SELECT * FROM v$version;
BANNER
Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - Prod
PL/SQL Release 10.2.0.3.0 - Production
CORE 10.2.0.3.0 Production
TNS for 32-bit Windows: Version 10.2.0.3.0 - Production
NLSRTL Version 10.2.0.3.0 - Production
SQL> create or replace directory ext_tables as 'C:\oracle\product\10.2.0\temp';
Directory created.
SQL> grant read,write on directory ext_tables to scott;
Grant succeeded.
SQL>
Step 3: Table definition.
C:\>sqlplus scott@orclsb/tiger
SQL*Plus: Release 10.1.0.4.2 - Production on Sat Jul 30 12:04:24 2011
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Connected to:
Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - Production
With the Partitioning, OLAP and Data Mining options
SQL> CREATE TABLE countries_ext (
2 country_code VARCHAR2(5),
3 country_name VARCHAR2(50),
4 country_language VARCHAR2(50)
5 )
6 ORGANIZATION EXTERNAL (
7 TYPE ORACLE_LOADER
8 DEFAULT DIRECTORY ext_tables
9 ACCESS PARAMETERS (
10 RECORDS DELIMITED BY NEWLINE
11 FIELDS TERMINATED BY ','
12 MISSING FIELD VALUES ARE NULL
13 (
14 country_code CHAR(5),
15 country_name CHAR(50),
16 country_language CHAR(50)
17 )
18 )
19 LOCATION ('Countries1.txt')
20 )
21 PARALLEL 5
22 REJECT LIMIT UNLIMITED;
Table created.
SQL> SELECT * FROM countries_ext;
COUNT COUNTRY_NAME COUNTRY_LANGUAGE
ENG   England      English
SCO   Scotland     English
IRE   Ireland      English
WAL   Wales        Welsh -
Error while selecting date from external table
Hello all,
I am getting the following error while selecting data from an external table. Any idea why?
SQL> CREATE TABLE SE2_EXT (SE_REF_NO VARCHAR2(255),
2 SE_CUST_ID NUMBER(38),
3 SE_TRAN_AMT_LCY FLOAT(126),
4 SE_REVERSAL_MARKER VARCHAR2(255))
5 ORGANIZATION EXTERNAL (
6 TYPE ORACLE_LOADER
7 DEFAULT DIRECTORY ext_tables
8 ACCESS PARAMETERS (
9 RECORDS DELIMITED BY NEWLINE
10 FIELDS TERMINATED BY ','
11 MISSING FIELD VALUES ARE NULL
12 (
13 country_code CHAR(5),
14 country_name CHAR(50),
15 country_language CHAR(50)
16 )
17 )
18 LOCATION ('SE2.csv')
19 )
20 PARALLEL 5
21 REJECT LIMIT UNLIMITED;
Table created.
SQL> select * from se2_ext;
SQL> select count(*) from se2_ext;
select count(*) from se2_ext
ERROR at line 1:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04043: table column not found in external source: SE_REF_NO
ORA-06512: at "SYS.ORACLE_LOADER", line 19
It would appear that your external table definition and the external data file do not match up. Post a few input records so someone can duplicate the problem and determine the fix.
HTH -- Mark D Powell --