External Table loading?
Hi:
Suppose I have an external table with the following two fields:
FIELD_A VARCHAR(1),
FIELD_B VARCHAR(1)
Suppose the file on which the external table is based contains the following data:
A1
B1
C1
A2
As you can see, the file has two rows with a FIELD_A value of A. Can I specify a rule for the external table to accept only the last such row, in this case A2?
Thanks,
Thomas
Not sure what your actual data looks like, but you may be able to do something like the following when you select from the external table. You will need to be able to specify what qualifies as the 'last row':
SQL> create table ext_t (
c1 varchar2(20),
c2 varchar2(20))
organization external (
type oracle_loader
default directory my_dir
access parameters (
records delimited by newline
fields terminated by ','
missing field values are null
( c1,
c2 ))
location('test.txt'));
Table created.
SQL> select * from ext_t;
C1 C2
A1 some description1
B1 some description2
C1 some description3
A2 some description4
4 rows selected.
SQL> select sc1, c1, c2
from (
select c1, c2,substr(c1,1,1) sc1,
row_number() over (partition by substr(c1,1,1) order by c1 desc) rn
from ext_t)
where rn = 1;
SC1 C1 C2
A A2 some description4
B B1 some description2
C C1 some description3
3 rows selected.
Similar Messages
-
External Table Load KUP-04037 Error
I was asked to repost this here. This was a follow-on question to the thread on how to load special characters (diacritics) in the Export/Import/SQL Loader/External Table forum.
I've defined an external table, and on my one instance running the WE8MSWIN1252 character set everything works fine. On my other instance running AL32UTF8 I get the KUP-04037 error, terminator not found, on the field that has "à" (the letter a with a grave accent). Changing it to a standard "a" avoids the error. Changing the column definition in the external table to nvarchar2 does NOT help.
Any ideas anyone?
Thanks,
Bob Siegel
Exactly. If you do not specify the CHARACTERSET parameter, the database character set is used to interpret the input file. As the input file is in WE8MSWIN1252, the ORACLE_LOADER driver gets confused trying to interpret single-byte WE8MSWIN1252 codes as multibyte AL32UTF8 codes.
The character set of the input file depends on the way it was created. Even on US Windows, you can create text files in different encodings. Notepad allows you to save the file in ANSI code page (=WE8MSWIN1252 on US Windows), Unicode (=AL16UTF16LE), Unicode big endian (=AL16UTF16), and UTF-8 (=AL32UTF8). The Command Prompt edit.exe editor saves the files in the OEM code page (=US8PC437 on US Windows).
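Since the input file here is WE8MSWIN1252, declaring that encoding explicitly in the access parameters should avoid the KUP-04037 error on the AL32UTF8 instance. A minimal sketch (directory, column, and file names are placeholders, not from the original post):

```sql
CREATE TABLE ext_demo (
  col1 VARCHAR2(30)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY my_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    CHARACTERSET WE8MSWIN1252  -- encoding of the data file, not of the database
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('data.txt')
);
```

With CHARACTERSET set to the file's actual encoding, the driver converts the data to the database character set during the load instead of misreading the single-byte codes.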
-- Sergiusz -
Hi,
Could you please tell me the maximum flat-file size supported by external tables in 9i and 10g?
Thanks,
Ashish
I am not sure any size limits exist - what size files are you planning to use?
HTH
Srini -
Need info on using external tables load/write data
Hi All,
We are planning to load conversion/interface data using external tables feature available in the Oracle database.
Also, for outbound interfaces, we are planning to use the same feature to write data to the file system.
If you have done similar exercise in any of your projects, please share sample code units. Also, let me know if there
are any cons/limitations in this approach.
Thanks,
Balaji
Please see old threads for similar discussion -- http://forums.oracle.com/forums/search.jspa?threadID=&q=external+AND+tables&objID=c3&dateRange=all&userID=&numResults=15&rankBy=10001
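One note on the outbound direction: the ORACLE_LOADER driver is read-only, so writing data to the file system from SQL is done with the ORACLE_DATAPUMP driver instead. A hedged sketch (directory, table, and file names are illustrative):

```sql
-- Unload query results to a Data Pump file on the server
CREATE TABLE emp_unload
ORGANIZATION EXTERNAL (
  TYPE ORACLE_DATAPUMP
  DEFAULT DIRECTORY my_dir
  LOCATION ('emp_unload.dmp')
)
AS SELECT * FROM emp;
```

The resulting .dmp file can then be attached as an external table in another database. Note this writes Data Pump binary format, not plain text; producing a flat text file still requires UTL_FILE or a SQL*Plus spool.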
Thanks,
Hussein -
External Table Loads - Insufficient Privs
Hi...please advise:
Facts:
1. I have 2 schemas: SCHEMA_A and SCHEMA_B
2. I have an oracle directory 'VPP_DIR' created under SCHEMA_A and granted WRITE and READ on the dir to SCHEMA_B.
3. The physical dir on the unix server to which VPP_DIR points has read, write, execute privs for the Oracle user.
4. I have a procedure in SCHEMA_A (CET_PROC) which dynamically creates the external table with parameters passed to it like directory_name, file_name, column_definitions, load_when_clause etc.
5. The CET_PROC also does a grant SELECT on external table to SCHEMA_B once it is created.
6. SCHEMA_B has EXECUTE privs to SCHEMA_A.CET_PROC.
7. SCHEMA_B has a proc (DO_LOAD_PROC) that calls SCHEMA_A.CET_PROC.
At the point where SCHEMA_A.CET_PROC tries to do the EXECUTE_IMMEDIATE command with the create table code, it fails with "ORA-01031: insufficient privileges"
If I execute SCHEMA_A.CET_PROC from within SCHEMA_A with the same parameters it works fine.
If I create CET_PROC inside SCHEMA_B and execute this version from within SCHEMA_B it works fine.
From across schemas, it fails. Any advice, please?
Works for me without CREATE ANY TABLE.
I found it easier to follow the permissions if I replaced SCHEMA_A and SCHEMA_B with OVERLORD and FLUNKY.
/Users/williamr: cat /Volumes/Firewire1/william/testexttable.dat
1,Eat,More,Bananas,Today
As SYS:
Connected to:
Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
Account: SYS@//centosvm.starbase.local:1521/dev10g.starbase.local
SQL> CREATE USER overlord IDENTIFIED BY overlord
2 DEFAULT TABLESPACE users QUOTA UNLIMITED ON users ACCOUNT UNLOCK;
User created.
SQL> CREATE USER flunky IDENTIFIED BY flunky
2 DEFAULT TABLESPACE users QUOTA UNLIMITED ON users ACCOUNT UNLOCK;
User created.
SQL> GRANT CREATE SESSION, CREATE TABLE, CREATE PROCEDURE TO overlord,flunky;
Grant succeeded.
SQL> GRANT READ,WRITE ON DIRECTORY extdrive TO overlord;
Grant succeeded.
As OVERLORD:
Account: OVERLORD@//centosvm.starbase.local:1521/dev10g.starbase.local
SQL> get afiedt.buf
1 CREATE OR REPLACE PROCEDURE build_xt
2 ( p_data OUT SYS_REFCURSOR )
3 AS
4 v_sqlstr VARCHAR2(4000) := q'|
5 CREATE TABLE test_xt
6 ( id NUMBER(8)
7 , col1 VARCHAR2(10)
8 , col2 VARCHAR2(10)
9 , col3 VARCHAR2(10)
10 , col4 VARCHAR2(10) )
11 ORGANIZATION EXTERNAL
12 ( TYPE oracle_loader
13 DEFAULT DIRECTORY extdrive
14 ACCESS PARAMETERS
15 ( RECORDS DELIMITED BY newline
16 BADFILE 'testexttable.bad'
17 DISCARDFILE 'testexttable.dsc'
18 LOGFILE 'testexttable.log'
19 FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
20 ( id, col1, col2, col3, col4 ) )
21 LOCATION ('testexttable.dat') )
22 |';
23 BEGIN
24 EXECUTE IMMEDIATE v_sqlstr;
25 OPEN p_data FOR 'SELECT * FROM test_xt';
26* END build_xt;
27
28 .
SQL> @afiedt.buf
Procedure created.
SQL> grant execute on build_xt to flunky;
Grant succeeded.
SQL> -- Prove it works:
SQL> var results refcursor
SQL>
SQL> exec build_xt(:results)
PL/SQL procedure successfully completed.
ID COL1 COL2 COL3 COL4
1 Eat More Bananas Today
1 row selected.
SQL> drop table test_xt purge;
Table dropped.
As FLUNKY:
Account: FLUNKY@//centosvm.starbase.local:1521/dev10g.starbase.local
SQL> SELECT * FROM user_sys_privs;
USERNAME PRIVILEGE ADM
FLUNKY CREATE TABLE NO
FLUNKY CREATE SESSION NO
FLUNKY CREATE PROCEDURE NO
3 rows selected.
SQL> var results refcursor
SQL>
SQL> exec overlord.build_xt(:results)
PL/SQL procedure successfully completed.
ID COL1 COL2 COL3 COL4
1 Eat More Bananas Today
1 row selected. -
External Table and Direct path load
Hi,
I was just playing with the Oracle SQL*Loader and external table features. A few things I observed: data loading through SQL*Loader's direct path method is much faster and takes much less disk space than the external table method. Here are the stats I collected while loading:
For Direct Path: -
# OF RECORDS.............TIME...................SOURCE FILE SIZE...................DATAFILE SIZE(.dbf)
478849..........................00:00:43.53...................108,638 KB...................142,088 KB
957697..........................00:01:08.81...................217,365 KB...................258,568 KB
1915393..........................00:02:54.43...................434,729 KB...................509,448 KB
For External Table: -
# OF RECORDS..........TIME...................SOURCE FILE SIZE...................DATAFILE SIZE(.dbf)
478849..........................00:02:51.03...................108,638 KB...................966,408 KB
957697..........................00:08:05.32...................217,365 KB...................1,930,248 KB
1915393..........................00:17:16.31...................434,729 KB...................3,860,488 KB
1915393..........................00:23:17.05...................434,729 KB...................3,927,048 KB
(With PARALLEL)
I used the same files for testing, and all other conditions were similar too. In my case the datafile is autoextensible, so its size grows automatically as required. Is this expected behaviour? Why does the external table load use so much more disk space than the direct path method? The performance of the external table load is also very poor compared to the direct path load.
One more thing: an external table load with the PARALLEL option should ideally take less time, but it actually takes longer than without the PARALLEL option.
In both cases I am loading data from the same file into the same table (once using direct path and once using the external table). Before every fresh load I truncate the internal table into which the data is loaded.
any views??
Deep
Message was edited by:
Deep
Thanks to all for your suggestions.
John, my scripts are as follows:
for external table:
CREATE TABLE LOG_TBL_LOAD
(COL1 CHAR(20), COL2 CHAR(2), COL3 CHAR(20), COL4 CHAR(400),
COL5 CHAR(20), COL6 CHAR(400), COL7 CHAR(20), COL8 CHAR(20),
COL9 CHAR(400), COL10 CHAR(400))
ORGANIZATION EXTERNAL
(TYPE ORACLE_LOADER
DEFAULT DIRECTORY EXT_TAB_DIR
ACCESS PARAMETERS
(RECORDS DELIMITED BY NEWLINE
FIELDS TERMINATED BY WHITESPACE OPTIONALLY ENCLOSED BY '"' MISSING FIELD VALUES ARE NULL)
LOCATION ('LOGZ3.DAT'))
REJECT LIMIT 10;
for loading i did:
INSERT INTO LOG_TBL (COL1, COL2, COL3, COL4,COL5, COL6,
COL7, COL8, COL9, COL10)
(SELECT COL1, COL2, COL3, COL4, COL5, COL6, COL7, COL8,
COL9, COL10 FROM LOG_TBL_LOAD);
for direct path my control file is like this:
OPTIONS (DIRECT=TRUE)
LOAD DATA
INFILE 'F:\DATAFILES\LOGZ3.DAT' "str '\n'"
INTO TABLE LOG_TBL
APPEND
FIELDS TERMINATED BY WHITESPACE OPTIONALLY ENCLOSED BY '"'
(COL1 CHAR(20),
COL2 CHAR(2),
COL3 CHAR(20),
COL4 CHAR(400),
COL5 CHAR(20),
COL6 CHAR(400),
COL7 CHAR(20),
COL8 CHAR(20),
COL9 CHAR(400),
COL10 CHAR(400))
And yes, I used the same table, LOG_TBL, in both situations, truncating it after each load, and the same source file, LOGZ3.DAT.
My tablespace USERS is locally managed.
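One factor worth checking in the comparison above: a plain INSERT ... SELECT from an external table is a conventional-path insert, while SQL*Loader with DIRECT=TRUE bypasses the buffer cache and most undo generation. A fairer comparison would make the external-table load direct-path too, with the APPEND hint. A sketch, assuming the LOG_TBL and LOG_TBL_LOAD tables from the scripts above:

```sql
ALTER SESSION ENABLE PARALLEL DML;

INSERT /*+ APPEND */ INTO log_tbl
  (col1, col2, col3, col4, col5, col6, col7, col8, col9, col10)
SELECT col1, col2, col3, col4, col5, col6, col7, col8, col9, col10
FROM   log_tbl_load;

COMMIT;
```

Direct-path inserts write above the high-water mark, which may also explain part of the datafile-size difference.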
thanks -
While loading through External Tables, Japanese characters wrong load
Hi all,
I am loading a text file through external tables. During the load, Japanese characters come in as junk characters, while in the text file the characters display correctly.
My spool file
SET ECHO OFF
SET VERIFY OFF
SET Heading OFF
SET LINESIZE 600
SET NEWPAGE NONE
SET PAGESIZE 100
SET feed off
set trimspool on
spool c:\SYS_LOC_LOGIC.txt
select CAR_MODEL_CD||',' || MAKER_CODE||',' || CAR_MODEL_NAME_CD||',' || TYPE_SPECIFY_NO||',' ||
CATEGORY_CLASS_NO||',' || SPECIFICATION||',' || DOOR_NUMBER||',' || RECOGNITION_TYPE||',' ||
TO_CHAR(SALES_START,'YYYY-MM-DD') ||',' || TO_CHAR(SALES_END,'YYYY-MM-DD') ||',' || LOGIC||',' || LOGIC_DESCRIPTION
from Table where rownum < 100;
spool off
My External table load script
CREATE TABLE SYS_LOC_LOGIC (
CAR_MODEL_CD NUMBER,
MAKER_CODE NUMBER,
CAR_MODEL_NAME_CD NUMBER,
TYPE_SPECIFY_NO NUMBER,
CATEGORY_CLASS_NO NUMBER,
SPECIFICATION VARCHAR2(300),
DOOR_NUMBER NUMBER,
RECOGNITION_TYPE VARCHAR2(30),
SALES_START DATE,
SALES_END DATE,
LOGIC NUMBER,
LOGIC_DESCRIPTION VARCHAR2(100))
ORGANIZATION EXTERNAL (
TYPE ORACLE_LOADER
DEFAULT DIRECTORY XMLTEST1
ACCESS PARAMETERS (
RECORDS DELIMITED BY NEWLINE
FIELDS TERMINATED BY ','
MISSING FIELD VALUES ARE NULL
(CAR_MODEL_CD,MAKER_CODE,CAR_MODEL_NAME_CD,TYPE_SPECIFY_NO,
CATEGORY_CLASS_NO,SPECIFICATION,DOOR_NUMBER,RECOGNITION_TYPE,
SALES_START date 'yyyy-mm-dd', SALES_END date 'yyyy-mm-dd',
LOGIC, LOGIC_DESCRIPTION))
LOCATION ('SYS_LOC_LOGIC.txt'))
--location ('products.csv')
REJECT LIMIT UNLIMITED;
How to solve this.
Thanks in advance,
Pal
Just so I'm clear: user1 connects to the database server and runs the spool to generate a flat file from the database. User2 then uses that flat file to load that data back into the same database? If the data isn't going anywhere, I assume there is a good reason to jump through all these unload and reload hoops rather than just moving the data from one table to another...
What is the NLS_LANG set in the client's environment when the spool is generated? Note that the NLS_CHARACTERSET is a database setting, not a client setting.
What character set is the text file? Are you certain that the text file is UTF-8 encoded? And not encoded using the operating system's local code page (assuming the operating system is capable of displaying Japanese text)
There is a CHARACTERSET parameter for the external table definition, but that should default to the character set of the database.
Justin -
hi
I use Windows 2008R2 Std. and Oracle 11gR2 RAC
I have successfully mounted ACFS share
ASMCMD> volinfo -a
Diskgroup Name: SHARED
Volume Name: SHARED_ACFS
Volume Device: \\.\asm-shared_acfs-106
State: ENABLED
Size (MB): 8192
Resize Unit (MB): 256
Redundancy: UNPROT
Stripe Columns: 4
Stripe Width (K): 128
Usage: ACFS
Mountpath: C:\SHARED
I have created a directory in Oracle mapped onto the ACFS share
and granted read and write access on it to my user.
Then I created the external table successfully, BUT...
though I see metadata
ADM@proton22> desc t111;
Name Null? Type
NAME VARCHAR2(4000)
VALUE VARCHAR2(4000)
I got error:
ADM@proton22> select * from t111;
select * from t111
ERROR at line 1:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04027: file name check failed: C:\SHARED\EXTTAB\EXT_PARAM.log
How to cope with this?
I granted "full control" privileges to the "everyone" user at OS level, to no avail.
Edited by: g777 on 2011-06-01 13:47
SORRY, I MOVED TO RAC FORUM.
See "Bug 14045247 : KUP-04027 ERROR WHEN QUERY DATA FROM EXTERNAL TABLE ON ACFS" in MOS.
This is actually reported as not being a Bug:
"An ACFS directory on the MS-Windows platform is implemented as a JUNCTION, and is therefore a symbolic link. Therefore, DISABLE_DIRECTORY_LINK_CHECK needs to be used, or a non-ACFS directory."
i.e. when creating the External Table, the DISABLE_DIRECTORY_LINK_CHECK Clause must be used if using the ORACLE_LOADER Access Driver
e.g. CREATE TABLE ...
... ORGANIZATION EXTERNAL
(TYPE ORACLE_LOADER ... ACCESS PARAMETERS (RECORDS ... DISABLE_DIRECTORY_LINK_CHECK)
For full syntax see: http://docs.oracle.com/cd/E11882_01/server.112/e22490/et_params.htm
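Putting the workaround together, a hedged sketch of an external table over a file on the ACFS mount (the t111 column names come from the thread; the directory object and file name are assumptions):

```sql
CREATE TABLE t111 (
  name  VARCHAR2(4000),
  value VARCHAR2(4000)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY exttab_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    DISABLE_DIRECTORY_LINK_CHECK  -- required: ACFS dirs on Windows are junctions
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('ext_param.txt')
)
REJECT LIMIT UNLIMITED;
```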
Also note Security Implications mentioned in above documentation:
"Use of this parameter involves security risks because symbolic links can potentially be used to redirect the input/output of the external table load operation" -
How change NLS_NUMERIC_CHARACTERS parameter for load external table
Hi,
I use this version:
OWB 11gR2
Database 11gR2
Parameter NLS_NUMERIC_CHARACTERS Database ., Instance ,.
When I created the database with the wizard I did not set the Spanish language at that time; later I changed these parameters in the instance parameters.
Now I want to load data from a file into an external table, but I get an error when I try to load data with a decimal point.
why does it use the database parameter instead of instance parameter?
Is possible to change this parameter?
Cheers
Marisol
At this moment this is not possible. See MetaLink note ID 268906.1.
It says:
Currently, external tables always use the setting of NLS_NUMERIC_CHARACTERS
at the database level.
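A common workaround, given that the access driver ignores the instance-level setting, is to declare the field as VARCHAR2 in the external table and convert it on the way in, passing NLS_NUMERIC_CHARACTERS to TO_NUMBER explicitly. A sketch (table and column names are illustrative):

```sql
-- Interpret ',' as the decimal separator and '.' as the group separator
SELECT TO_NUMBER(amount_txt,
                 '999G999G999D99',
                 'NLS_NUMERIC_CHARACTERS='',.''') AS amount
FROM   my_ext_table;
```

This makes the conversion independent of both the database and instance NLS settings.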
Cheers
Marisol -
External table.How to load numbers (decimal and scientific notation format)
Hi all, I need to load into an external table records that contain 7 fields. The last field, AMOUNT, appears in some records in decimal format and in other records in scientific notation, for example:
CY001_STATU;2009;Jan;11220020GR;'03900;CYZ900;-9,99999999839929e-03
CY001_STATU;2009;Jan;11200100;'60800;CYZ900;41380,77
The External table's script is the following:
CREATE TABLE HYP_DATA (
COUNTRY VARCHAR2(50 BYTE),
YEAR VARCHAR2(20 BYTE),
PERIOD VARCHAR2(20 BYTE),
ACCOUNT VARCHAR2(50 BYTE),
DEPT VARCHAR2(20 BYTE),
ACTIVITY_LOC VARCHAR2(20 BYTE),
AMOUNT VARCHAR2(50 BYTE))
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
DEFAULT DIRECTORY HYP_DATA_DIR
ACCESS PARAMETERS
( RECORDS DELIMITED BY NEWLINE
BADFILE 'HYP_BAD_DIR':'HYP_LOAD.bad'
DISCARDFILE 'HYP_DISCARD_DIR':'HYP_LOAD.dsc'
LOGFILE 'HYP_LOG_DIR':'HYP_LOAD.log'
SKIP 0
FIELDS TERMINATED BY ";"
MISSING FIELD VALUES ARE NULL
REJECT ROWS WITH ALL NULL FIELDS
( "COUNTRY" Char,
"YEAR" Char,
"PERIOD" Char,
"ACCOUNT" Char,
"DEPT" Char,
"ACTIVITY_LOC" Char,
"AMOUNT" Char ))
LOCATION (HYP_DATA_DIR:'Total.txt'))
REJECT LIMIT UNLIMITED
NOPARALLEL
NOMONITORING;
If, for the field AMOUNT I use the datatype VARCHAR (as above), the table is loaded but I have some records rejected, and all these records contain the last field AMOUNT with the scientific notation as:
CY001_STATU;2009;Jan;11220020GR;'03900;CYZ900;-9,99999999839929e-03
CY001_STATU;2009;Feb;11220020GR;'03900;CYZ900;-9,99999999839929e-03
CY001_STATU;2009;Mar;11220020GR;'03900;CYZ900;-9,99999999839929e-03
CY001_STATU;2009;Dec;11220020GR;'03900;CYZ900;-9,99999999839929e-03
All the others records with a decimal AMOUNT are loaded correctly.
So, my problem is that I NEED to load all the records (with the decimal and the scientific notation format) together (without records rejected), but I don't know which datatype I have to use for the AMOUNT field....
Anybody has any idea ???
Any help would be appreciated
Thanks in advance
Alex
@OP,
What version of Oracle are you using?
Just cutting and pasting your script and example worked FINE for me.
However, my question is... an external table will LOAD all data or none at all. How are you validating/concluding that...
I have some records rejected, and all these records contain the last field AMOUNT with the scientific notation
select * from v$version where rownum <2;
Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bi
select * from mydata;
CY001_STATU 2009 Jan 11220020GR '03900 CYZ900 -9,99999999839929e-03
CY001_STATU 2009 Feb 11220020GR '03900 CYZ900 -9,99999999839929e-03
CY001_STATU 2009 Jan 11220020GR '03900 CYZ900 -9,99999999839929e-03
CY001_STATU 2009 Jan 11200100 '60800 CYZ900 41380,77
CY001_STATU 2009 Mar 11220020GR '03900 CYZ900 -9,99999999839929e-03
CY001_STATU 2009 Dec 11220020GR '03900 CYZ900 -9,99999999839929e-03
CY001_STATU 2009 Jan 11220020GR '03900 CYZ900 -9,99999999839929e-03
CY001_STATU 2009 Jan 11200100 '60800 CYZ900 41380,77
MYDATA table script is...
drop table mydata;
CREATE TABLE mydata (
COUNTRY VARCHAR2(50 BYTE),
YEAR VARCHAR2(20 BYTE),
PERIOD VARCHAR2(20 BYTE),
ACCOUNT VARCHAR2(50 BYTE),
DEPT VARCHAR2(20 BYTE),
ACTIVITY_LOC VARCHAR2(20 BYTE),
AMOUNT VARCHAR2(50 BYTE))
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
DEFAULT DIRECTORY IN_DIR
ACCESS PARAMETERS
( RECORDS DELIMITED BY NEWLINE
BADFILE 'IN_DIR':'HYP_LOAD.bad'
DISCARDFILE 'IN_DIR':'HYP_LOAD.dsc'
LOGFILE 'IN_DIR':'HYP_LOAD.log'
SKIP 0
FIELDS TERMINATED BY ";"
MISSING FIELD VALUES ARE NULL
REJECT ROWS WITH ALL NULL FIELDS
( "COUNTRY" Char,
"YEAR" Char,
"PERIOD" Char,
"ACCOUNT" Char,
"DEPT" Char,
"ACTIVITY_LOC" Char,
"AMOUNT" Char ))
LOCATION (IN_DIR:'total.txt'))
REJECT LIMIT UNLIMITED
NOPARALLEL
NOMONITORING;
vr,
Sudhakar B. -
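For the AMOUNT field in the thread above, one hedged approach is to keep the VARCHAR2(50) definition in the external table (so no rows are rejected by the driver) and normalize the decimal separator when selecting; Oracle's default text-to-number conversion handles e-notation values such as -9.99999999839929e-03:

```sql
-- Convert the decimal comma to a point, then let TO_NUMBER parse
-- both plain decimals and scientific notation
SELECT country, year, period, account, dept, activity_loc,
       TO_NUMBER(REPLACE(amount, ',', '.')) AS amount_n
FROM   hyp_data;
```

This assumes the session's decimal character is '.'; if not, pass the NLS_NUMERIC_CHARACTERS parameter to TO_NUMBER explicitly.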
Hi,
I am trying to load a csv file as a single CLOB record (one row for the entire csv file) via an external table, but am unable to do so. This is the syntax I tried:
create table testext_tab2
( file_data clob )
organization external
( TYPE ORACLE_LOADER
DEFAULT DIRECTORY dir_n1
access parameters
( RECORDS DELIMITED BY NEWLINE
BADFILE DIR_N1:'lob_tab_%a_%p.bad'
LOGFILE DIR_N1:'lob_tab_%a_%p.log'
FIELDS TERMINATED BY ','
MISSING FIELD VALUES ARE NULL
( clob_filename CHAR(100) )
COLUMN TRANSFORMS (file_data FROM LOBFILE (clob_filename) FROM (DIR_N1) CLOB) )
LOCATION ('emp.txt') )
REJECT LIMIT UNLIMITED;
-- it reports that the table is created, but the table does not have any rows (select count(*) from testext_tab2 gives 0 rows)
-- and the logfile has entries like follows:
Fields in Data Source:
CLOB_FILENAME CHAR (100)
Terminated by ","
Trim whitespace same as SQL Loader
Column Transformations
FILE_DATA
is set from a LOBFILE
directory is from constant DIR_N1
directory object list is ignored
file is from field CLOB_FILENAME
file contains character data
in character set WE8ISO8859P1
KUP-04001: error opening file /oracle/dba/dir_n1/7369
KUP-04017: OS message: No such file or directory
KUP-04065: error processing LOBFILE for field FILE_DATA
KUP-04101: record 1 rejected in file /oracle/dba/dir_n1/emp.txt
KUP-04001: error opening file /oracle/dba/dir_n1/7499
KUP-04017: OS message: No such file or directory
KUP-04065: error processing LOBFILE for field FILE_DATA
KUP-04101: record 2 rejected in file /oracle/dba/dir_n1/emp.txt
KUP-04001: error opening file /oracle/dba/dir_n1/7521
KUP-04017: OS message: No such file or directory
KUP-04065: error processing LOBFILE for field FILE_DATA
KUP-04101: record 3 rejected in file /oracle/dba/dir_n1/emp.txt
and also the file to be loaded (emp.txt) has data like this:
7369,SMITH,CLERK,7902,12/17/1980,800,null,20
7499,ALLEN,SALESMAN,7698,2/20/1981,1600,300,30
7521,WARD,SALESMAN,7698,2/22/1981,1250,500,30
7566,JONES,MANAGER,7839,4/2/1981,2975,null,20
7654,MARTIN,SALESMAN,7698,9/28/1981,1250,1400,30
7698,BLAKE,MANAGER,7839,5/1/1981,2850,null,30
7782,CLARK,MANAGER,7839,6/9/1981,2450,null,10
7788,SCOTT,ANALYST,7566,12/9/1982,3000,null,20
7839,KING,PRESIDENT,null,11/17/1981,5000,null,10
7844,TURNER,SALESMAN,7698,9/8/1981,1500,0,30
7876,ADAMS,CLERK,7788,1/12/1983,1100,null,20
7900,JAMES,CLERK,7698,12/3/1981,950,null,30
7902,FORD,ANALYST,7566,12/3/1981,3000,null,20
7934,MILLER,CLERK,7782,1/23/1982,1300,null,10
I will be thankful for help on this. I also read on the Ask Tom site (http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:1669379500346411993)
that LOB are not supported for external tables but there are other sites with examples of CLOB being loaded by external tables.
With regards,
Orausern
CMcM wrote:
Hi all
We have an application that runs fine on 10.2.0.4 on most platforms, but a customer has reported an error when running 10.2.0.3 on HP. We have since reproduced the error on 10.2.0.3 on XP but have failed to reproduce it on Solaris or Linux.
The exact error is within a set of procedures, but the simplest reproducible form of the error is pasted in below.
Except that you haven't pasted output to show us what the actual error is. Are we supposed to guess?
SQL> select * from v$version;
BANNER
Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Prod
PL/SQL Release 10.2.0.1.0 - Production
CORE 10.2.0.1.0 Production
TNS for 32-bit Windows: Version 10.2.0.1.0 - Production
NLSRTL Version 10.2.0.1.0 - Production
SQL> ed
Wrote file afiedt.buf
1 declare
2 vstrg clob:= 'A';
3 Thisstrg varchar2(32000);
4 begin
5 for i in 1..31999 loop
6 vstrg := vstrg||'A';
7 end loop;
8 ThisStrg := vStrg;
9* end;
SQL> /
PL/SQL procedure successfully completed.
SQL>
Works OK for me on 10.2.0.1 (Windows 2003 server) -
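On the CLOB-loading question above, the KUP-04001/KUP-04065 log entries show what is happening: with COLUMN TRANSFORMS ... FROM LOBFILE, the first field of each record in the location file is treated as the name of a file to load into the CLOB, which is why the driver tries to open /oracle/dba/dir_n1/7369, 7499, and so on. To pull the whole of emp.txt into a single CLOB row, one hedged approach is a one-line "driver" file (emp_list.txt is an assumed name) containing just the data file's name:

```sql
-- emp_list.txt contains exactly one line:  emp.txt
CREATE TABLE testext_tab2 (
  file_data CLOB
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY dir_n1
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
    ( clob_filename CHAR(100) )
    COLUMN TRANSFORMS (file_data FROM LOBFILE (clob_filename) FROM (dir_n1) CLOB)
  )
  LOCATION ('emp_list.txt')
)
REJECT LIMIT UNLIMITED;
```

Each line of the location file then yields one row whose CLOB holds the named file's entire contents.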
Error while loading data into External table from the flat files
HI ,
We have a data load in our project which feeds the oracle external tables with the data from the Flat Files(.bcp files) in unix.
While loading the data, we are encountering the following error.
Error occured (Error Code : -29913 and Error Message : ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04063: un) while loading data into table_ext
Please let us know what needs to be done in this case to solve this problem.
Thanks,
Kartheek
Kartheek,
I used Google (mine still works).... please check those links:
http://oraclequirks.blogspot.com/2008/07/ora-29400-data-cartridge-error.html
http://jonathanlewis.wordpress.com/2011/02/15/ora-29913/
HTH,
Thierry -
External tables in Oracle 11g database is not loading null value records
We have upgraded our DB from Oracle 9i to 11g...
We noticed that the data load through external tables in 9i rejected records with null columns. However, after upgrading to 11g, records with null values are allowed.
Is there a way to restrict loading records that have certain columns null?
Can you please confirm whether this is the expected behaviour in Oracle 11g?
Thanks.
Data isn't really loaded into an external table. Rather, the external table lets you query an external data source as if it were a regular database table. To not see the rows with the NULL value, simply filter those rows out with your SQL statement:
SELECT * FROM my_external_table WHERE colX IS NOT NULL;
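If the goal is to keep such rows out at load time rather than filter them in every query, the ORACLE_LOADER access parameters also support a LOAD WHEN clause. A hedged fragment (the column name colx is illustrative):

```sql
ACCESS PARAMETERS (
  RECORDS DELIMITED BY NEWLINE
  LOAD WHEN (colx != BLANKS)  -- skip records where COLX is blank/null
  FIELDS TERMINATED BY ','
  MISSING FIELD VALUES ARE NULL
)
```

Rows failing the condition go to the discard file instead of the result set, which may reproduce the 9i behaviour the poster saw.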
HTH,
Brian -
Error when loading from External Tables in OWB 11g
Hi,
I face a strange problem while loading data from flat file into the External Tables.
ORA-12899: value too large for column EXPIRED (actual: 4, maximum: 1)
error processing column EXPIRED in row 9680 for datafile <data file location>/filename.dat
Out of a total of 9771 records, nearly 70 are rejected due to the above error. The column (EXPIRED) for which the error is reported never actually has a value longer than 1 character. I suspect it is a different problem.
Example: One such record that got rejected is as follows:
C|234|Littérature commentée|*N*|2354|123
Highlighted in bold is the EXPIRED column.
When I tried to insert this record into the external table using the UTL_FILE utility, it loaded successfully. But when I try with the file already existing in the file directory, it again fails with the above error. I would also like to mention that not all the records which have been loaded are OK; please have a look at the DESCRIPTION column, which is highlighted. The original information in the data file looks like:
C|325|*Revue Générale*|N|2445|132
In the External Table the Description Value is replaced by the inverted '?' as follows:
Reue G¿rale
Please help.
Thanks,
JL.
user1130292 wrote:
Hi,
I face a strange problem while loading data from flat file into the External Tables.
ORA-12899: value too large for column EXPIRED (actual: 4, maximum: 1)
error processing column EXPIRED in row 9680 for datafile <data file location>/filename.dat
In a total of 9771 records nearly 70 records are rejected due to the above mentioned error. The column (EXPIRED) where the error being reported doesn't have value greater than 1 at all. I suspect it to be a different problem.
Example: One such record that got rejected is as follows:
C|234|Littérature commentée|*N*|2354|123
highlightened in Bold is the EXPIRED Column.
When I tried to insert this record into the External Table using UTL_FILE Utility it got loaded successfully. But when I try with the file already existing in the file directory it again fails with the above error, and I would like to mention that all the records which have been loaded are not Ok, please have a look at the DESCRIPTION Column which is highlightened. The original information in the data file looks like:
C|325|*Revue Générale*|N|2445|132
In the External Table the Description Value is replaced by the inverted '?' as follows:
Reue G¿rale
Please help.
Thanks,
JL.
Sorry, couldn't see the highlighted text. Could you please enclose it in tags?
Also post the table definition with attributes. BTW, what is your NLS_LANGUAGE set to?
Error 'Loading Textfile data into external table'
I am using Apex 2.0, Oracle Database 10g and Internet Explorer (version 6.5).
I tried to run the following code using SQL Commands in Apex's SQL Workshop.
Code Here:
declare
ddl1 varchar2(200);
ddl2 varchar2(4000);
begin
ddl1 := 'create or replace directory data_dir as
''C:\LAUSD_DATA\''';
execute immediate ddl1;
ddl2:= 'create table tbl_temp_autoload
(ID NUMBER,
RECTYPE VARCHAR2(2),
EXPORTNBR NUMBER(2),
EXPORTDATE DATE,
BIMAGEID VARCHAR2(11),
APPLICNBR NUMBER(10),
MEALIMAGE VARCHAR2(17),
MEDIIMAGE VARCHAR2(17),
LANGUAGE VARCHAR2(1),
APPLICCNT NUMBER(1),
OTHAPPLICNBR VARCHAR2(10),
PEDETDATE DATE,
PESTATUS VARCHAR2(1),
ERRORREMARKS VARCHAR2(50),
COMMENTS VARCHAR2(50),
SID1 VARCHAR2(10),
WID1 NUMBER(10),
MEDIROW1 VARCHAR2(1),
LASTNAME1 VARCHAR2(20),
FIRSTNAME1 VARCHAR2(20),
SCHOOL1 VARCHAR2(15),
LOCCD1 VARCHAR2(4),
BIRTHDATE1 DATE,
CASENBR1 VARCHAR2(15),
FOSTER1 VARCHAR2(1),
CHILDINC1 VARCHAR2(4),
RACEAIAN VARCHAR2(1),
RACEBAA VARCHAR2(1),
RACENHPI VARCHAR2(1),
RACEASIAN VARCHAR2(1),
RACEWHITE VARCHAR2(1),
HISPANIC VARCHAR2(1),
NOTHISPANIC VARCHAR2(1),
ADULTSIG3 VARCHAR2(1),
ADULTSIGDATE3 VARCHAR2(10),
ADULTNAME3 VARCHAR2(30),
SSN3 VARCHAR2(1),
SSN3NOT VARCHAR2(1),
ADDRESS VARCHAR2(30),
APTNBR VARCHAR2(6),
CTY VARCHAR2(20),
ZIP NUMBER(5),
PHONEHOME VARCHAR2(12),
PHONEWORK VARCHAR2(12),
PHONEEXTEN VARCHAR2(6),
MEALROWA VARCHAR2(1),
LNAMEA VARCHAR2(20),
FNAMEA VARCHAR2(20),
MINITA VARCHAR2(6),
GENDERA VARCHAR2(1),
AGEA NUMBER(2),
MOTYPEA VARCHAR2(1),
MONAMEA VARCHAR2(1),
FATYPEA VARCHAR2(1),
FANAMEA VARCHAR2(1),
FAMNBR NUMBER(1),
PARENTINC NUMBER(5),
FAMILYINC NUMBER(5),
FAMILYSIZE NUMBER(2),
PGSIG1 VARCHAR2(1),
PGNAME1 VARCHAR2(30),
PGSIGDATE1 VARCHAR2(10),
FAMSIZE1 VARCHAR2(2),
FAMINC1 VARCHAR2(6),
PGSIG2 VARCHAR2(1),
PGNAME2 VARCHAR2(30),
PGSIGDATE2 DATE,
FAMSIZE2 VARCHAR2(2),
FAMINC2 VARCHAR2(4),
GRADE NUMBER(2),
SCHOOL VARCHAR2(40),
RACE VARCHAR2(4),
MEDI_CALID VARCHAR2(15),
MEDSTATUS VARCHAR2(1),
ADULT1NAME VARCHAR2(40),
ADULT1TYPE VARCHAR2(1),
ADULT1INC1 VARCHAR2(5),
ADULT1INC2 VARCHAR2(5),
ADULT1INC3 VARCHAR2(5),
ADULT1INC4 VARCHAR2(5),
ADULT2NAME VARCHAR2(40),
ADULT2TYPE VARCHAR2(1),
ADULT2INC1 VARCHAR2(5),
ADULT2INC2 VARCHAR2(5),
ADULT2INC3 VARCHAR2(5),
ADULT2INC4 VARCHAR2(5),
ADULT3NAME VARCHAR2(40),
ADULT3TYPE VARCHAR2(1),
ADULT3INC1 VARCHAR2(5),
ADULT3INC2 VARCHAR2(5),
ADULT3INC3 VARCHAR2(5),
ADULT3INC4 VARCHAR2(5),
ADULT4NAME VARCHAR2(40),
ADULT4TYPE VARCHAR2(1),
ADULT4INC1 VARCHAR2(5),
ADULT4INC2 VARCHAR2(5),
ADULT4INC3 VARCHAR2(5),
ADULT4INC4 VARCHAR2(5),
ADULT5NAME VARCHAR2(40),
ADULT5TYPE VARCHAR2(1),
ADULT5INC1 VARCHAR2(5),
ADULT5INC2 VARCHAR2(5),
ADULT5INC3 VARCHAR2(5),
ADULT5INC4 VARCHAR2(5),
ADULT6NAME VARCHAR2(40),
ADULT6TYPE VARCHAR2(1),
ADULT6INC1 VARCHAR2(5),
ADULT6INC2 VARCHAR2(5),
ADULT6INC3 VARCHAR2(5),
ADULT6INC4 VARCHAR2(5),
AGE1LT19 VARCHAR2(1),
AGE2LT19 VARCHAR2(1),
AGE3LT19 VARCHAR2(1),
AGE4LT19 VARCHAR2(1),
AGE5LT19 VARCHAR2(1),
AGE6LT19 VARCHAR2(1),
MIDINIT1 VARCHAR2(1))
organization external
( type oracle_loader
default directory data_dir
access parameters
( fields terminated by '','' )
location (''DataJuly07.txt''))';
execute immediate ddl2;
end;
Error Received:
ORA-29913: error in executing ODCIEXTTABLEFETCH callout ORA-30653: reject limit reached
Please help ASAP. I am new to Oracle and Apex. Any help will be greatly appreciated.
I downloaded the external table sample application from the Apex packaged applications and installed it. It installed successfully, but when I run it, it doesn't load the data even though I get a message stating it was successful. I am running it on Oracle XE - Apex 3.0.1.
In addition, I tried running the stored procedure directly (ext_employees_load) in SQL Developer and received ORA-30648.
Please help.
thanks,
Tom