IGNORE COMMENTS IN EXTERNAL TABLE
I currently have a partitioned table, split by month and going back some 7 years.
The business has now agreed to archive off some of the data but to keep it
available if required. I therefore intend to select the data, month by month,
into a flat file and make use of external tables to provide the data to the
business if required, then to drop the month's partition, thus releasing
space back to the database. Each row is only about 100 bytes long, but
there could be about 15 million rows per month.
I can successfully create the external table and read the data.
However, I would like to place a header in the spooled file, but the only method
I have found to ignore the header is the "SKIP" parameter. I would
rather not use this, as it hard-codes the number of lines reserved for the
header.
Is there another method to add a comment to an external table's file and have
selects on the file ignore the comments?
Could I use "LOAD WHEN ( column1 != '--' )", as each comment line begins with a
double dash?
Also, is there a limit to the size of an external table's file?
Thanks
JD
Hi gurus,
any help will be highly appreciated.
Similar Messages
-
IGNORE COMMENTS IN EXTERNAL TABLE FILE
(Same question as above.) When I added lines to the access parameters to attempt to exclude the header lines:
RECORDS DELIMITED BY newline
NOBADFILE
NODISCARDFILE
NOLOGFILE
SKIP 0
LOAD WHEN ( cvt_seq_num LIKE '--%' )
FIELDS
TERMINATED BY ","
OPTIONALLY ENCLOSED BY '"'
MISSING FIELD VALUES ARE NULL
REJECT ROWS WITH ALL NULL FIELDS
( cvt_seq_num INTEGER EXTERNAL(12)
I got:
select * from old_cvt_external
ERROR at line 1:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-00554: error encountered while parsing access parameters
KUP-01005: syntax error: found "identifier": expecting one of: "equal,notequal"
KUP-01008: the bad identifier was: LIKE
KUP-01007: at line 6 column 33
ORA-06512: at "SYS.ORACLE_LOADER", line 14
ORA-06512: at line 1
When I moved the line I got:
RECORDS DELIMITED BY newline
NOBADFILE
NODISCARDFILE
NOLOGFILE
SKIP 0
FIELDS
TERMINATED BY ","
OPTIONALLY ENCLOSED BY '"'
MISSING FIELD VALUES ARE NULL
REJECT ROWS WITH ALL NULL FIELDS
LOAD WHEN ( cvt_seq_num LIKE '--%' )
I got:
select * from old_cvt_external
ERROR at line 1:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-00554: error encountered while parsing access parameters
KUP-01005: syntax error: found "load": expecting one of: "exit, ("
KUP-01007: at line 11 column 11
ORA-06512: at "SYS.ORACLE_LOADER", line 14
ORA-06512: at line 1
So it appears that the "LOAD WHEN" does not work.
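The KUP-01005 message ("expecting one of: equal, notequal") suggests the ORACLE_LOADER driver's LOAD WHEN clause only accepts = and != comparisons, not LIKE. A sketch of a position-based test that should skip lines whose first two bytes are the comment marker (untested against this poster's file; field list abbreviated):

```sql
ACCESS PARAMETERS
( RECORDS DELIMITED BY NEWLINE
  LOAD WHEN ((1:2) != "--")      -- load only lines NOT starting with a double dash
  FIELDS
  TERMINATED BY ","
  OPTIONALLY ENCLOSED BY '"'
  MISSING FIELD VALUES ARE NULL
  ( cvt_seq_num INTEGER EXTERNAL(12) )
)
```

The `(1:2)` form compares a byte range of the raw record rather than a parsed field, which is why it can test the comment marker before field conversion happens.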
Can you please suggest anything else? -
[Translated from French:] I made a backup of my library on an external hard drive, but I don't know how to update my backup when new discs are added to it. Thanks in advance.
Go to
http://support.apple.com/downloads/
Find and download the 9.1 updater. Run it. Then run Software Update again.
Regards
TD -
Hi,
I am trying to load an entire csv file as a single CLOB record (one row for the whole csv file) via an external table, but am unable to do so. Following is the syntax I tried:
create table testext_tab2
( file_data clob )
organization external
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY dir_n1
  access parameters
  ( RECORDS DELIMITED BY NEWLINE
    BADFILE DIR_N1:'lob_tab_%a_%p.bad'
    LOGFILE DIR_N1:'lob_tab_%a_%p.log'
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
    ( clob_filename CHAR(100) )
    COLUMN TRANSFORMS (file_data FROM LOBFILE (clob_filename) FROM (DIR_N1) CLOB)
  )
  LOCATION ('emp.txt')
)
REJECT LIMIT UNLIMITED;
It reports that the table is created, but the table does not return any rows (select count(*) from testext_tab2 gives 0 rows), and the log file has entries like the following:
Fields in Data Source:
CLOB_FILENAME CHAR (100)
Terminated by ","
Trim whitespace same as SQL Loader
Column Transformations
FILE_DATA
is set from a LOBFILE
directory is from constant DIR_N1
directory object list is ignored
file is from field CLOB_FILENAME
file contains character data
in character set WE8ISO8859P1
KUP-04001: error opening file /oracle/dba/dir_n1/7369
KUP-04017: OS message: No such file or directory
KUP-04065: error processing LOBFILE for field FILE_DATA
KUP-04101: record 1 rejected in file /oracle/dba/dir_n1/emp.txt
KUP-04001: error opening file /oracle/dba/dir_n1/7499
KUP-04017: OS message: No such file or directory
KUP-04065: error processing LOBFILE for field FILE_DATA
KUP-04101: record 2 rejected in file /oracle/dba/dir_n1/emp.txt
KUP-04001: error opening file /oracle/dba/dir_n1/7521
KUP-04017: OS message: No such file or directory
KUP-04065: error processing LOBFILE for field FILE_DATA
KUP-04101: record 3 rejected in file /oracle/dba/dir_n1/emp.txt
and also the file to be loaded (emp.txt) has data like this:
7369,SMITH,CLERK,7902,12/17/1980,800,null,20
7499,ALLEN,SALESMAN,7698,2/20/1981,1600,300,30
7521,WARD,SALESMAN,7698,2/22/1981,1250,500,30
7566,JONES,MANAGER,7839,4/2/1981,2975,null,20
7654,MARTIN,SALESMAN,7698,9/28/1981,1250,1400,30
7698,BLAKE,MANAGER,7839,5/1/1981,2850,null,30
7782,CLARK,MANAGER,7839,6/9/1981,2450,null,10
7788,SCOTT,ANALYST,7566,12/9/1982,3000,null,20
7839,KING,PRESIDENT,null,11/17/1981,5000,null,10
7844,TURNER,SALESMAN,7698,9/8/1981,1500,0,30
7876,ADAMS,CLERK,7788,1/12/1983,1100,null,20
7900,JAMES,CLERK,7698,12/3/1981,950,null,30
7902,FORD,ANALYST,7566,12/3/1981,3000,null,20
7934,MILLER,CLERK,7782,1/23/1982,1300,null,10
I will be thankful for help on this. Also I read on the asktom site (http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:1669379500346411993)
that LOBs are not supported for external tables, but there are other sites with examples of CLOBs being loaded by external tables.
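The KUP-04001 errors above (trying to open files named 7369, 7499, ...) show the loader treating each value of CLOB_FILENAME - here the first CSV field of emp.txt - as the name of a separate file to open. A sketch of how the LOBFILE pieces are usually arranged (file names here are assumptions): the file named in LOCATION lists one file name per line, and each named file becomes one CLOB row:

```sql
-- Sketch: filelist.txt contains one line per file to load, e.g. "emp.txt";
-- the whole of each listed file is then loaded as a single CLOB row.
CREATE TABLE testext_clob
( file_data CLOB )
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY dir_n1
  ACCESS PARAMETERS
  ( RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    ( clob_filename CHAR(100) )
    COLUMN TRANSFORMS (file_data FROM LOBFILE (clob_filename) FROM (dir_n1) CLOB)
  )
  LOCATION ('filelist.txt')
)
REJECT LIMIT UNLIMITED;
```

With this arrangement, pointing LOCATION directly at emp.txt (as in the post) cannot work, because every data row of emp.txt is then interpreted as a list of LOB file names.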
With regards,
Orausern
CMcM wrote:
Hi all
We have an application that runs fine on 10.2.0.4 on most platforms, but a customer has reported an error when running 10.2.0.3 on HP. We have since reproduced the error on 10.2.0.3 on XP but have failed to reproduce it on Solaris or Linux.
The exact error is within a set of procedures, but the simplest reproducible form of the error is pasted in below.
Except that you haven't pasted output to show us what the actual error is. Are we supposed to guess?
SQL> select * from v$version;
BANNER
Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Prod
PL/SQL Release 10.2.0.1.0 - Production
CORE 10.2.0.1.0 Production
TNS for 32-bit Windows: Version 10.2.0.1.0 - Production
NLSRTL Version 10.2.0.1.0 - Production
SQL> ed
Wrote file afiedt.buf
1 declare
2 vstrg clob:= 'A';
3 Thisstrg varchar2(32000);
4 begin
5 for i in 1..31999 loop
6 vstrg := vstrg||'A';
7 end loop;
8 ThisStrg := vStrg;
9* end;
SQL> /
PL/SQL procedure successfully completed.
SQL>
Works OK for me on 10.2.0.1 (Windows 2003 server) -
Error when loading from External Tables in OWB 11g
Hi,
I face a strange problem while loading data from flat file into the External Tables.
ORA-12899: value too large for column EXPIRED (actual: 4, maximum: 1)
error processing column EXPIRED in row 9680 for datafile <data file location>/filename.dat
In a total of 9771 records nearly 70 records are rejected due to the above mentioned error. The column (EXPIRED) where the error being reported doesn't have value greater than 1 at all. I suspect it to be a different problem.
Example: One such record that got rejected is as follows:
C|234|Littérature commentée|*N*|2354|123
Highlighted in bold is the EXPIRED column.
When I tried to insert this record into the external table using the UTL_FILE utility, it loaded successfully. But when I try with the file already existing in the file directory, it again fails with the above error. I would also like to mention that not all of the records which have been loaded are OK; please have a look at the DESCRIPTION column, which is highlighted. The original information in the data file looks like:
C|325|*Revue Générale*|N|2445|132
In the External Table the Description Value is replaced by the inverted '?' as follows:
Reue G¿rale
Please help.
Thanks,
JL.
user1130292 wrote:
[full question quoted above omitted]
Sorry, couldn't see the highlighted text. Could you please enclose it in tags?
Also post the table definition with attributes. BTW, what's your NLS_LANGUAGE set to? -
Error 'Loading Textfile data into external table'
I am using Apex 2.0, Oracle Database 10g and Internet Explorer (version 6.5).
I tried to run the following code using SQL Commands in the SQL Workshop of Apex.
Code Here:
declare
ddl1 varchar2(200);
ddl2 varchar2(4000);
begin
ddl1 := 'create or replace directory data_dir as
''C:\LAUSD_DATA\''';
execute immediate ddl1;
ddl2:= 'create table tbl_temp_autoload
(ID NUMBER,
RECTYPE VARCHAR2(2),
EXPORTNBR NUMBER(2),
EXPORTDATE DATE,
BIMAGEID VARCHAR2(11),
APPLICNBR NUMBER(10),
MEALIMAGE VARCHAR2(17),
MEDIIMAGE VARCHAR2(17),
LANGUAGE VARCHAR2(1),
APPLICCNT NUMBER(1),
OTHAPPLICNBR VARCHAR2(10),
PEDETDATE DATE,
PESTATUS VARCHAR2(1),
ERRORREMARKS VARCHAR2(50),
COMMENTS VARCHAR2(50),
SID1 VARCHAR2(10),
WID1 NUMBER(10),
MEDIROW1 VARCHAR2(1),
LASTNAME1 VARCHAR2(20),
FIRSTNAME1 VARCHAR2(20),
SCHOOL1 VARCHAR2(15),
LOCCD1 VARCHAR2(4),
BIRTHDATE1 DATE,
CASENBR1 VARCHAR2(15),
FOSTER1 VARCHAR2(1),
CHILDINC1 VARCHAR2(4),
RACEAIAN VARCHAR2(1),
RACEBAA VARCHAR2(1),
RACENHPI VARCHAR2(1),
RACEASIAN VARCHAR2(1),
RACEWHITE VARCHAR2(1),
HISPANIC VARCHAR2(1),
NOTHISPANIC VARCHAR2(1),
ADULTSIG3 VARCHAR2(1),
ADULTSIGDATE3 VARCHAR2(10),
ADULTNAME3 VARCHAR2(30),
SSN3 VARCHAR2(1),
SSN3NOT VARCHAR2(1),
ADDRESS VARCHAR2(30),
APTNBR VARCHAR2(6),
CTY VARCHAR2(20),
ZIP NUMBER(5),
PHONEHOME VARCHAR2(12),
PHONEWORK VARCHAR2(12),
PHONEEXTEN VARCHAR2(6),
MEALROWA VARCHAR2(1),
LNAMEA VARCHAR2(20),
FNAMEA VARCHAR2(20),
MINITA VARCHAR2(6),
GENDERA VARCHAR2(1),
AGEA NUMBER(2),
MOTYPEA VARCHAR2(1),
MONAMEA VARCHAR2(1),
FATYPEA VARCHAR2(1),
FANAMEA VARCHAR2(1),
FAMNBR NUMBER(1),
PARENTINC NUMBER(5),
FAMILYINC NUMBER(5),
FAMILYSIZE NUMBER(2),
PGSIG1 VARCHAR2(1),
PGNAME1 VARCHAR2(30),
PGSIGDATE1 VARCHAR2(10),
FAMSIZE1 VARCHAR2(2),
FAMINC1 VARCHAR2(6),
PGSIG2 VARCHAR2(1),
PGNAME2 VARCHAR2(30),
PGSIGDATE2 DATE,
FAMSIZE2 VARCHAR2(2),
FAMINC2 VARCHAR2(4),
GRADE NUMBER(2),
SCHOOL VARCHAR2(40),
RACE VARCHAR2(4),
MEDI_CALID VARCHAR2(15),
MEDSTATUS VARCHAR2(1),
ADULT1NAME VARCHAR2(40),
ADULT1TYPE VARCHAR2(1),
ADULT1INC1 VARCHAR2(5),
ADULT1INC2 VARCHAR2(5),
ADULT1INC3 VARCHAR2(5),
ADULT1INC4 VARCHAR2(5),
ADULT2NAME VARCHAR2(40),
ADULT2TYPE VARCHAR2(1),
ADULT2INC1 VARCHAR2(5),
ADULT2INC2 VARCHAR2(5),
ADULT2INC3 VARCHAR2(5),
ADULT2INC4 VARCHAR2(5),
ADULT3NAME VARCHAR2(40),
ADULT3TYPE VARCHAR2(1),
ADULT3INC1 VARCHAR2(5),
ADULT3INC2 VARCHAR2(5),
ADULT3INC3 VARCHAR2(5),
ADULT3INC4 VARCHAR2(5),
ADULT4NAME VARCHAR2(40),
ADULT4TYPE VARCHAR2(1),
ADULT4INC1 VARCHAR2(5),
ADULT4INC2 VARCHAR2(5),
ADULT4INC3 VARCHAR2(5),
ADULT4INC4 VARCHAR2(5),
ADULT5NAME VARCHAR2(40),
ADULT5TYPE VARCHAR2(1),
ADULT5INC1 VARCHAR2(5),
ADULT5INC2 VARCHAR2(5),
ADULT5INC3 VARCHAR2(5),
ADULT5INC4 VARCHAR2(5),
ADULT6NAME VARCHAR2(40),
ADULT6TYPE VARCHAR2(1),
ADULT6INC1 VARCHAR2(5),
ADULT6INC2 VARCHAR2(5),
ADULT6INC3 VARCHAR2(5),
ADULT6INC4 VARCHAR2(5),
AGE1LT19 VARCHAR2(1),
AGE2LT19 VARCHAR2(1),
AGE3LT19 VARCHAR2(1),
AGE4LT19 VARCHAR2(1),
AGE5LT19 VARCHAR2(1),
AGE6LT19 VARCHAR2(1),
MIDINIT1 VARCHAR2(1)
organization external
( type oracle_loader
default directory data_dir
access parameters
( fields terminated by '','' )
location (''DataJuly07.txt'')
execute immediate ddl2;
end;
Error Received:
ORA-29913: error in executing ODCIEXTTABLEFETCH callout ORA-30653: reject limit reached
Please help ASAP. I am new to Oracle and Apex. Any help will be appreciated greatly.
I downloaded the External Table Simple application from the Apex packaged applications and installed it. It installed successfully, but when I run it, it doesn't load the data even though I get a message stating it was successful. I am running it on Oracle XE - Apex 3.0.1.
In addition, I tried running the stored procedure directly (ext_employees_load) in SQL Developer and received ORA-30648.
Please help.
thanks,
Tom -
SQL and External table question
Hello averyone,
I have a file to be read as an external table part of it is below:
ISO-10303-21;
HEADER;
FILE_DESCRIPTION((''),'2;1');
FILE_NAME('BRACKET','2005-07-08T',('broilo'),(''),
'PRO/ENGINEER BY PARAMETRIC TECHNOLOGY CORPORATION, 2004400',
'PRO/ENGINEER BY PARAMETRIC TECHNOLOGY CORPORATION, 2004400','');
FILE_SCHEMA(('CONFIG_CONTROL_DESIGN'));
ENDSEC;
DATA;
#5=CARTESIAN_POINT('',(5.5E0,5.5E0,-5.1E1));
#6=DIRECTION('',(0.E0,0.E0,1.E0));
#7=DIRECTION('',(-1.E0,0.E0,0.E0));
#8=AXIS2_PLACEMENT_3D('',#5,#6,#7);
The first question is: how do I ignore the lines up to the DATA; line, or does SQL already do that for me?
The second question is: since the fields of interest are separated by commas and the first field does not interest me (it is handled with a VARCHAR2), how can I read the following fields as numbers, ignoring the '(', '#' and ')' characters?
Thanks for any help.
Sincerely yours,
André Luiz
The SKIP option can be used with SQL*Loader to skip a certain number of lines before starting to load. Offhand I cannot see any easy way to load the data in the format given; the format does not resemble a typical CSV format. You can look at the test cases provided for SQL*Loader in the Oracle® Database Utilities guide - or simply write PL/SQL code to load this data manually using UTL_FILE.
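A minimal PL/SQL sketch of the UTL_FILE approach the reply suggests (the directory name STEP_DIR and file name bracket.stp are assumptions, and the actual parsing/insert is left as a placeholder):

```sql
-- Sketch only: read a STEP file, skip everything up to the DATA; line,
-- then strip the '#', '(' and ')' characters so numeric fields can be parsed.
DECLARE
  f       UTL_FILE.FILE_TYPE;
  line    VARCHAR2(4000);
  in_data BOOLEAN := FALSE;
BEGIN
  f := UTL_FILE.FOPEN('STEP_DIR', 'bracket.stp', 'r');
  LOOP
    BEGIN
      UTL_FILE.GET_LINE(f, line);
    EXCEPTION
      WHEN NO_DATA_FOUND THEN EXIT;          -- end of file
    END;
    IF NOT in_data THEN
      in_data := (line = 'DATA;');           -- header is ignored until this marker
    ELSE
      -- TRANSLATE keeps 'X' and deletes '#', '(' and ')' (an empty "to"
      -- string would return NULL, hence the dummy kept character)
      line := TRANSLATE(line, 'X#()', 'X');
      DBMS_OUTPUT.PUT_LINE(line);            -- parse / INSERT here instead
    END IF;
  END LOOP;
  UTL_FILE.FCLOSE(f);
END;
/
```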
-
Trying to install the External_Table_Simple_0.9 application but I am receiving errors.
I installed the application but it could not see the flat file, so I de-installed and on re-install I got these errors.
report error:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04063: unable to open log file EXT_EMPLOYEES_EXTERNAL_5476_2344.log
OS error The data is invalid.
Has anybody experienced this.
Gus
Gus,
Copy and paste will not do the work every time.
I copied and pasted the package definition you posted and it won't compile. One
of the declarations is not properly quoted:
x :=
q'! comment on table ext_employees is 'Data loaded: !'
|| TO_CHAR (SYSDATE, 'FMDay, Mon ddth, hh:mi am');
x := x || q'! by !' || NVL (v ('APP_USER'), USER) || q'!' !';
EXECUTE IMMEDIATE x;
...Other than that, you also had an improper positioning specification here:
FIRST_NAME position(8:22),
last_name position(23:34),
where you didn't consider whitespace between the two columns.
I assume the file you use is of fixed column length. I used the information
you pasted here and modified it accordingly to get a fixed length for the columns:
EMP_ID FIRST_NAME LAST_NAME EMAIL PHONE_NUMBER HIRE_DATE JOB_ID SALARY COMM_PCT MGR_ID DEPT_ID
100 Steven King SKING 515.123.4567 06/17/1987 AD_PRES 24000 90
101 Neena Kochhar NKOCHHAR 515.123.4568 09/21/1989 AD_VP 17000 100 90
102 Lex De Haan LDEHAAN 515.123.4569 01/13/1993 AD_VP 17000 100 90
103 Al Hunold AHUNOLD 590.423.4567 01/03/1990 IT_PROG 9000 102 60
Saved it as employees.txt into the directory DATA_LOAD.
Running the following script will create an external table:
CREATE TABLE ext_employees_external
employee_id VARCHAR(20),
first_name VARCHAR(20),
last_name VARCHAR(20),
email VARCHAR(20),
phone_number VARCHAR(20),
hire_date VARCHAR(20),
job_id VARCHAR(20),
salary VARCHAR(20),
commission_pct VARCHAR(20),
manager_id VARCHAR(20),
department_id VARCHAR(20)
ORGANIZATION EXTERNAL
( TYPE oracle_loader
DEFAULT DIRECTORY data_load
ACCESS PARAMETERS
( RECORDS DELIMITED BY NEWLINE
SKIP 2
FIELDS MISSING FIELD VALUES ARE NULL(
employee_id POSITION(001:006),
first_name POSITION(008:0017),
last_name POSITION(019:027),
email POSITION(029:036),
phone_number POSITION(038:049),
hire_date POSITION(051:060) ,
job_id POSITION(062:068),
salary POSITION(070:075),
commission_pct POSITION(077:083),
manager_id POSITION(086:092),
department_id POSITION(093:100)
LOCATION (data_load:'employees.txt')
REJECT LIMIT 10
NOPARALLEL
NOMONITORING;
This table is reading the data properly. You may now include this in your procedure.
You certainly need to have a look at your .txt file and exactly determine the column
widths. I was using VARCHAR2 in the column definitions of the external table. Since you are
loading the external table into your "real" table, you may want to perform data
conversion at that point.
The directory I specified in this script is DATA_LOAD. Your directory may be different.
Denes Kubicek -
How can an external table handle data with line feed between delimiters?
I have defined an external table as below. My data is pipe delimited and comes from a DOS system.
I already remove any carriage returns before putting the file into the DATA_DIR for reading. But
I have found that some of my VARCHAR fields have embeded line feeds.
Is it possible to have a definition that would remove any line feed characters between the delimiters?
Below I also threw together a sample data set there ID #2 has that extra character. Yes, I could
write an awk script to pre-process all my data files. But I am hoping there is a way for Oracle
to also do this.
I understand the LDTRIM to remove any leading and trailing spaces in the delimited field. Is there a
REPLACE or TRANSLATE option. I did a bit of searching but I must be asking the wrong things.
Thanks for any help
Eric
CREATE TABLE table_ext
id NUMBER,
desc1 VARCHAR2(64 CHAR),
desc2 VARCHAR2(255 CHAR),
add_date DATE
ORGANIZATION EXTERNAL
TYPE ORACLE_LOADER
DEFAULT DIRECTORY data_dir
ACCESS PARAMETERS
RECORDS DELIMITED BY NEWLINE
CHARACTERSET WE8ISO8859P1
BADFILE log_dir:'table_ext.bad'
DISCARDFILE log_dir:'table_ext.dis'
LOGFILE log_dir:'table_ext.log'
FIELDS TERMINATED BY '|' LDRTRIM
MISSING FIELD VALUES ARE NULL
id INTEGER EXTERNAL(38),
desc1 CHAR(64),
desc2 CHAR(255),
add_date CHAR DATE_FORMAT DATE MASK "yyyy-mm-dd hh24:mi",
LOCATION( 'data.txt' )
PARALLEL
REJECT LIMIT UNLIMITED;
1|short desc|long desc|2001-01-01 00:00
2|short desc| long
desc |1999-03-03 23:23
3|short desc| long desc | 2011-02-02 02:02
Thanks for looking, but that relates to the record delimiter, which in my case is the pipe character '|'. In my various data sets this is consistent. I expect each record to be one per line, but between two delimiters some data has a line feed. So I'm looking for a method that will "clean up" the field data as it gets imported.
I was hoping there was an option that would ignore any embedded line feed (\n) characters, i.e., those not at the end of the line.
Eric -
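The ORACLE_LOADER driver offers no REPLACE/TRANSLATE on input, and with RECORDS DELIMITED BY NEWLINE an embedded line feed splits the record before any field cleanup could run. One workaround, in the spirit of the awk pre-processing the poster mentions but kept inside the database, is a small PL/SQL pre-pass (a sketch; the directory and file names, and the 4-field/3-pipe record shape, are taken from the example data above):

```sql
-- Sketch: glue physical lines together until the logical record has its
-- 3 pipe delimiters, then write the cleaned record for the external table.
DECLARE
  fin  UTL_FILE.FILE_TYPE := UTL_FILE.FOPEN('DATA_DIR', 'data.txt', 'r');
  fout UTL_FILE.FILE_TYPE := UTL_FILE.FOPEN('DATA_DIR', 'data_clean.txt', 'w');
  line VARCHAR2(4000);
  rec  VARCHAR2(4000);
BEGIN
  LOOP
    BEGIN
      UTL_FILE.GET_LINE(fin, line);
    EXCEPTION
      WHEN NO_DATA_FOUND THEN EXIT;
    END;
    rec := CASE WHEN rec IS NULL THEN line ELSE rec || ' ' || line END;
    -- 4 fields => 3 pipes; only then is the logical record complete
    IF LENGTH(rec) - LENGTH(REPLACE(rec, '|')) >= 3 THEN
      UTL_FILE.PUT_LINE(fout, rec);
      rec := NULL;
    END IF;
  END LOOP;
  UTL_FILE.FCLOSE(fin);
  UTL_FILE.FCLOSE(fout);
END;
/
```

The external table's LOCATION would then point at data_clean.txt instead of data.txt.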
Headers appear in external table
I have defined several external tables over text files that have been built in the same way - in excel, then saved as a csv file.
Most of them work fine but there is one that always returns the column headers as data. I cant understand how this happens for this one file when they are all built in the the same way. I have checked the ddl for the external tables and they are equivalent.
I realise I could set it to skip 1 row but I am loathe to do that because if this is some sort of bug then it may be resolved later resulting in the first row of data being lost - a problem that may not become obvious until found by end users.
I have tried rebuilding from scratch and get the same result.
Has anyone else faced this situation?
I don't know what the exact trick is that OWB does,
but when you sample the file you use for the external
table and tell OWB to use the first row for the
header names, it won't use the first line as a data
line, even though there's nothing to be found in the
external table definition that indicates that.
Hi,
that's no trick. If you sample the file and use the first row for column names, the first row only disappears in the wizard. The resulting external table definition doesn't automatically contain a skip-row statement in this case. The reason you don't notice this is that by default the loader writes no bad files, so the header row is normally ignored (because it doesn't match the structure) and you don't find it in the external table. If you configure your external tables, based on a file with a header and a skip parameter of 0, to produce a bad file (Bad File Name = file.bad), you will find the header row in that bad file.
The only right way is to configure the skip parameter - not to suppress the resulting errors.
In this particular file I guess the header matches the structure of the external table, so it isn't ignored but loaded into the external table.
Regards,
Detlef -
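A sketch of the access-parameters fragment Detlef describes - an explicit SKIP plus a bad file so rejects stay visible rather than silently dropped (directory, file and field names here are assumptions):

```sql
ACCESS PARAMETERS
( RECORDS DELIMITED BY NEWLINE
  SKIP 1                             -- skip the header row explicitly
  BADFILE log_dir:'file.bad'         -- rejected rows land here, not nowhere
  FIELDS TERMINATED BY ','
  MISSING FIELD VALUES ARE NULL
)
```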
Use of External tables to load XML data.
Hi,
I have used external table definitions to load various XML files to the database, usually splitting the XML into seperate records - 1 per major element tag, and using PL/SQL to parse out a primary key to store in a relational table with all of the XML relevant to that primary key value stored as an XMLTYPE coumn in a row of the table. This has worked fine for XML with a single major entity (element tag) .
However, I now have an XML file that contains two "major" elements (both children of the root) that I would like to split out and store in seperate tables.
The XML file is of the following basic format:-
<drugs>
<drug>drug 1...</drug>
<drug>drug 2...</drug>
<partners>
<partner>partner 1</partner>
<partner>partner 2</partner>
</partners>
</drugs>
The problem is there are around 18000 elements of the first type, followed by several thousand of the 2nd type. I can create two seperate external tables - one for each element type, but how do I get the external table for the 2nd to ignore all the elements of the first type? My external table definition is :-
CREATE TABLE DRUGBANK_OWNER.DRUGBANK_PARTNERS_XML_EXTERNAL
DRUGBANK_XML CLOB
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
DEFAULT DIRECTORY DRUGBANK_DIR
ACCESS PARAMETERS
( records delimited by "</partner>" SKIP 100000
characterset al32utf8
badfile extlogs:'drugbank_partners_xml.bad'
logfile extlogs:'drugbank_partners_xml.log'
discardfile extlogs:'drugbank_partners_xml.dis'
READSIZE 52428800
fields
drugbank_xml CHAR(50000000) terminated by '</partners>'
LOCATION (DRUGBANK_DIR:'drugbank.xml')
REJECT LIMIT UNLIMITED
PARALLEL ( DEGREE 8 INSTANCES 1 )
NOMONITORING;
The problem is that before the first <partners> element the 1800 or so <drugs> elements cause a data cartrdige error:-
ORA-29913: error in executing ODCIEXTTABLEFETCH callout
ORA-29400: data cartridge error
KUP-04020: found record longer than buffer size supported, 52428800
This happens regardless of the value of the SKIP or the size of the drugbank_xml field.
I have tried using an OR on the "records delimited by" access parameter, to 'delimit by "</partner>" OR "</drug>"', with the intention of filtering out the <drug> elements but this leads to a syntax error.
Anyone ever tried anything similar and got it to work?
Any other suggestions?
Thanks,
Sid.
No, the content inside quotes is spanned across multiple lines... there are line breaks after every HTML tag.
"What's the error message you are getting?"
I am not getting any error while selecting from the external table, but I am getting those rows in the BAD file, and the log file has the following entries:
KUP-04021: field formatting error for field TKBS_DSCN
KUP-04036: second enclosing delimiter not found
Message was edited by:
user627610 -
Oracle 11g - External Table Issue?
I hope this is the right forum for this issue, if not let me, where to go.
We are using Oracle 11g (11.2.0.1.0) on (Platform : solaris[tm] oe (64-bit)), Sql Developer 3.0.04
We are trying to use oracle external table to load text files in .csv format. Here is our data look like.
======================
Date1,date2,Political party,Name, ROLE
20-Jan-66,22-Nov-69,Democratic,"John ", MMM
22-Nov-70,20-Jan-71,Democratic,"John Jr.",MMM
20-Jan-68,9-Aug-70,Republican,"Rick Ford Sr.", MMM
9-Aug-72,20-Jan-75,Republican,Henry,MMM
------ ALL NULL -- record
20-Jan-80,20-Jan-89,Democratic,"Donald Smith",MMM
======================
Our Expernal table structures is as follows
CREATE TABLE P_LOAD
DATE1 VARCHAR2(10),
DATE2 VARCHAR2(10),
POL_PRTY VARCHAR2(30),
P_NAME VARCHAR2(30),
P_ROLE VARCHAR2(5)
ORGANIZATION EXTERNAL
(TYPE ORACLE_LOADER
DEFAULT DIRECTORY P_EXT_TAB_D
ACCESS PARAMETERS (
RECORDS DELIMITED by NEWLINE
SKIP 1
FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"' LDRTRIM
REJECT ROWS WITH ALL NULL FIELDS
MISSING FIELD VALUES ARE NULL
DATE1 CHAR (10) Terminated by "," ,
DATE2 CHAR (10) Terminated by "," ,
POL_PRTY CHAR (30) Terminated by "," ,
P_NAME CHAR (30) Terminated by "," OPTIONALLY ENCLOSED BY '"' ,
P_ROLE CHAR (5) Terminated by ","
LOCATION ('Input.dat')
REJECT LIMIT UNLIMITED;
It created successfully using SQL Developer
Here is the issue.
It is not loading the records, where fields are enclosed in '"' (Rec # 2,3,4,7)
It is loading all NULL value record (Rec # 6)
*** If we remove the '"' from input data, it loads all records including all NULL records
Log file has
KUP-04021: field formatting error for field P_NAME
KUP-04036: second enclosing delimiter not found
KUP-04101: record 2 rejected in file ....
Our questions
Why is "REJECT ROWS WITH ALL NULL FIELDS" not working?
Why is Terminated by "," OPTIONALLY ENCLOSED BY '"' not working?
Any idea?
Thanks in helping.
Edited by: qwe16235 on Jun 10, 2011 2:16 PM
The following worked for me:
drop table p_load;
CREATE TABLE P_LOAD
DATE1 VARCHAR2(10),
DATE2 VARCHAR2(10),
POL_PRTY VARCHAR2(30),
P_NAME VARCHAR2(30),
P_ROLE VARCHAR2(5)
ORGANIZATION EXTERNAL
(TYPE ORACLE_LOADER
DEFAULT DIRECTORY scott_def_dir1
ACCESS PARAMETERS (
RECORDS DELIMITED by NEWLINE
badfile scott_def_dir2:'p_load_%a_%p.bad'
logfile scott_def_dir2:'p_load_%a_%p.log'
SKIP 1
FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"' LDRTRIM
MISSING FIELD VALUES ARE NULL
REJECT ROWS WITH ALL NULL FIELDS
DATE1 CHAR (10) Terminated by "," ,
DATE2 CHAR (10) Terminated by "," ,
POL_PRTY CHAR (30) Terminated by "," ,
P_NAME CHAR (30) Terminated by "," OPTIONALLY ENCLOSED BY '"' ,
P_ROLE CHAR (5) Terminated by ","
LOCATION ('Input.dat')
REJECT LIMIT UNLIMITED;
Note that I had to interchange the two lines:
MISSING FIELD VALUES ARE NULL
REJECT ROWS WITH ALL NULL FIELDS
Just to get the access parameters to parse correctly.
I added two empty lines, one in the middle and one at the end - both were rejected.
In the log file, you will see the rejectiions:
$ cat p_load_000_9219.log
LOG file opened at 07/08/11 19:47:23
Field Definitions for table P_LOAD
Record format DELIMITED BY NEWLINE
Data in file has same endianness as the platform
Reject rows with all null fields
Fields in Data Source:
DATE1 CHAR (10)
Terminated by ","
Enclosed by """ and """
Trim whitespace same as SQL Loader
DATE2 CHAR (10)
Terminated by ","
Enclosed by """ and """
Trim whitespace same as SQL Loader
POL_PRTY CHAR (30)
Terminated by ","
Enclosed by """ and """
Trim whitespace same as SQL Loader
P_NAME CHAR (30)
Terminated by ","
Enclosed by """ and """
Trim whitespace same as SQL Loader
P_ROLE CHAR (5)
Terminated by ","
Enclosed by """ and """
Trim whitespace same as SQL Loader
KUP-04073: record ignored because all referenced fields are null for a record
KUP-04073: record ignored because all referenced fields are null for a record
Input Data:
Date1,date2,Political party,Name, ROLE
20-Jan-66,22-Nov-69,Democratic,"John ", MMM
22-Nov-70,20-Jan-71,Democratic,"John Jr.",MMM
20-Jan-68,9-Aug-70,Republican,"Rick Ford Sr.", MMM
9-Aug-72,20-Jan-75,Republican,Henry,MMM
4-Aug-70,20-Jan-75,Independent
Result:
SQL> select * from p_load;
DATE1 DATE2 POL_PRTY P_NAME P_ROL
20-Jan-66 22-Nov-69 Democratic John MMM
22-Nov-70 20-Jan-71 Democratic John Jr. MMM
20-Jan-68 9-Aug-70 Republican Rick Ford Sr. MMM
9-Aug-72 20-Jan-75 Republican Henry MMM
Regards,
- Allen -
Interactive report errors when trying to query external table.
I'm trying to create an interactive report against an external table and getting the following error. I see a note in metalink the describes its cause, but I doesn't seem to apply here.
Query cannot be parsed, please check the syntax of your query. (ORA-06550: line 2, column 17: PLS-00302: component 'ODCIOBJECTLIST' must be declared ORA-06550: line 2, column 13: PL/SQL: Item ignored ORA-06550: line 4, column 18: PLS-00302: component 'ORACLE_LOADER' must be declared ORA-06550: line 4, column 6: PL/SQL: Statement ignored ORA-06550: line 5, column 12: PLS-00320: the declaration of the type of this expression is incomplete or malformed ORA-06550: line 5, column 6: PL/SQL: Statement ignored)
Metalink Note 437896.1 identifies this same error related to external tables. It says the cause is an object named 'sys' that exists in the user's schema and must be removed. I checked dba_objects and there is no object named 'sys'.
Please ignore this thread - operator error. There was an object named sys.
-
We have a new SCADA and database system being delivered which has a section for alarm data. We have been told that, without extensive re-writing of the ActiveX components of the SCADA, we cannot put this alarm data into Oracle (10.2). Instead, as well as the Oracle database (which resides on a RAID5 server alongside the application server, the flash recovery area (which is our database backup mechanism), the domain controller, and the forms and reports - getting off track with a rant, but they're taking the mick...), they are giving us SQL Express to store the alarm data, which is exported as a flat file.
I am not yet too familiar with anything above 7.3.4, but my limited research suggests to me that using the external tables feature of oracle would solve this problem and allow us to use the flat file to import the data into an oracle instance and manage it alongside the rest of the database.
Am I right? Seems too obvious a solution that they wouldn't have thought of it, but this is just one of the many, many problems we're having that I feel we could provide a better solution for. Comments??hi,
You mention the alarm data - is it in XML format or CSV format?
If it is in CSV format (i.e. stored as a flat file), use external tables.
I have worked with external tables; they let you query flat-file data from Oracle as if it were an ordinary table, and you can then load it into regular Oracle tables.
Regards
SATYA -
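To make the suggestion above concrete, here is a minimal sketch of exposing a CSV alarm-data flat file as an external table. The directory path, file name, and column layout are assumptions - adjust them to match the file your SQL Express export actually produces:

```sql
-- Directory object pointing at wherever the flat file is dropped (assumed path).
CREATE OR REPLACE DIRECTORY alarm_dir AS '/data/alarms';

-- External table over the CSV file; columns are illustrative.
CREATE TABLE alarm_data_ext
( alarm_time VARCHAR2(30),
  alarm_tag  VARCHAR2(50),
  alarm_text VARCHAR2(200) )
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY alarm_dir
  ACCESS PARAMETERS
  ( RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ',' )
  LOCATION ('alarms.csv') )
REJECT LIMIT UNLIMITED;

-- Query it directly, or load it into a permanent table:
-- INSERT INTO alarm_data SELECT * FROM alarm_data_ext;
```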
How to read any file using external tables.
Hi folks,
I have written an application that reads a series of csv files using external tables which works fine as long as I specify each file name in the directory i.e.......
CREATE TABLE gb_test
( file_name VARCHAR2(10),
  rec_date  DATE,
  rec_name  VARCHAR2(20),
  rec_age   NUMBER )
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY GB_TEST
  ACCESS PARAMETERS
  ( RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ',' )
  LOCATION ('data1.csv','data2.csv','data3.csv','data4.csv') )
PARALLEL 5
REJECT LIMIT 20000;
However, I have discovered that I may not know the names of the files to be processed before the program is run, so I just want to read any file regardless of its name (although it will always be a .csv file).
Is there a way to ensure that you don't need to specify the files to be read in the LOCATION part of the syntax.
Thanks in advance.
Graham.

Right, I have now completed this, however it's currently only working as SYS as opposed to any user. Here is a detail of the scenario and the steps required, in case any of you guys need it in the future ......
The problem was I needed to search for csv files on my hard-drive. These files would be stored in a series of directories (a through to z), so I needed a way to read all 26 directories and process all files in these directories.
The problem was, prior to running the program, the user would remove all the files in the directories and insert new ones, but it was never known how many he would decide to do each time.
Solution: I created a table called stock_data_directories as follows ...
create table stock_data_directories(sdd_rec_no number,
sdd_table_name varchar2(50),
sdd_directory_name varchar2(50),
sdd_directory_path varchar2(100));
Then inserted 26 records like ...
insert into stock_data_directories(sdd_rec_no,sdd_table_name,sdd_directory_name,sdd_directory_path)
values(1,'rawdata_a','KPOLLOCKA','C:\KPOLLOCK\A');
insert into stock_data_directories(sdd_rec_no,sdd_table_name,sdd_directory_name,sdd_directory_path)
values(2,'rawdata_b','KPOLLOCKB','C:\KPOLLOCK\B');
etc...etc...
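Rather than typing 26 inserts by hand, the rows could be generated in a loop. A sketch, assuming the naming pattern above holds for all 26 letters:

```sql
BEGIN
  FOR i IN 0 .. 25 LOOP
    INSERT INTO stock_data_directories
      (sdd_rec_no, sdd_table_name, sdd_directory_name, sdd_directory_path)
    VALUES
      (i + 1,
       'rawdata_'     || CHR(ASCII('a') + i),   -- rawdata_a .. rawdata_z
       'KPOLLOCK'     || CHR(ASCII('A') + i),   -- KPOLLOCKA .. KPOLLOCKZ
       'C:\KPOLLOCK\' || CHR(ASCII('A') + i));  -- C:\KPOLLOCK\A .. \Z
  END LOOP;
  COMMIT;
END;
```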
Then created 26 DIRECTORIES E.G.
CREATE OR REPLACE DIRECTORY KPOLLOCKA AS 'C:\KPOLLOCK\A';
CREATE OR REPLACE DIRECTORY KPOLLOCKB AS 'C:\KPOLLOCK\B';
Then created 26 external tables like the following ...
CREATE TABLE rawdata_a
( stock       VARCHAR2(1000),
  stock_date  VARCHAR2(10),
  stock_open  VARCHAR2(20),
  stock_high  VARCHAR2(20),
  stock_low   VARCHAR2(20),
  stock_close VARCHAR2(30),
  stock_qty   VARCHAR2(20) )
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY KPOLLOCKA
  ACCESS PARAMETERS
  ( RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ',' )
  LOCATION ('AA.csv') )
PARALLEL 5
REJECT LIMIT 20000;
This basically says that table rawdata_a currently reads one file, AA.csv, from its directory (KPOLLOCKA).
Then wrote a procedure as follows ...
procedure p_process_files(pv_return_message OUT varchar2) is
  cursor c_get_stock_data_directories is
    select distinct sdd_directory_path,
                    sdd_table_name
    from   stock_data_directories
    order  by sdd_table_name;
  vv_return_message varchar2(1000);
begin
  -- for each directory, rebuild the matching external table's file list
  for r_get_stock_directories in c_get_stock_data_directories loop
    p_build_external_table(r_get_stock_directories.sdd_directory_path,
                           r_get_stock_directories.sdd_table_name,
                           vv_return_message);
  end loop;
  pv_return_message := vv_return_message;  -- pass back the last status message
end;
then wrote a procedure called p_build_external_table as follows ...
procedure p_build_external_table(pv_directory_path IN stock_data_directories.sdd_directory_path%type, -- e.g. 'C:\KPOLLOCK\A\'
pv_table_name IN stock_data_directories.sdd_table_name%type, -- e.g. rawdata_a
pv_return_message OUT varchar2) is
vv_pattern VARCHAR2(1024);
ns VARCHAR2(1024);
vv_file_name varchar2(4000);
vv_start_string varchar2(1) := '''';
vv_end_string varchar2(3) := ''',';
vn_counter number := 0;
vv_err varchar2(2000);
BEGIN
vv_pattern := pv_directory_path||'*';
SYS.DBMS_BACKUP_RESTORE.searchFiles(vv_pattern, ns);
FOR each_file IN (SELECT FNAME_KRBMSFT AS name FROM X$KRBMSFT) LOOP
if each_file.name like '%.CSV' then
vv_file_name := vv_file_name||vv_start_string||substr(each_file.name,instr(each_file.name,'\',1,3)+1)||vv_end_string;
vn_counter := vn_counter + 1;
end if;
END LOOP;
vv_file_name := substr(vv_file_name,1,length(vv_file_name)-1); -- remove final , from string
execute immediate 'alter table '||pv_table_name||' location('||vv_file_name||')';
pv_return_message := 'Successfully changed '||pv_table_name||' at '||pv_directory_path||' to now have '||to_char(vn_counter)||' files';
exception
when others then
vv_err := sqlerrm;
pv_return_message := ' Error found updating directories. Error = '||vv_err;
END;
This reads every file in the directory and appends it to a list, so if it finds A.csv and ABC.csv, then using the dynamic SQL, it alters the location to read ('A.csv','ABC.csv').
It ignores all other file extensions.
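To illustrate, here is roughly what the EXECUTE IMMEDIATE ends up running, plus a typical invocation of the driver procedure (file names are illustrative):

```sql
-- If the scan of C:\KPOLLOCK\A finds A.csv and ABC.csv, p_build_external_table
-- issues the equivalent of:
ALTER TABLE rawdata_a LOCATION ('A.csv','ABC.csv');

-- Driving the whole refresh from an anonymous block:
-- DECLARE
--   v_msg VARCHAR2(1000);
-- BEGIN
--   p_process_files(v_msg);
--   DBMS_OUTPUT.PUT_LINE(v_msg);
-- END;
```

Note that because DBMS_BACKUP_RESTORE.searchFiles and X$KRBMSFT are SYS-owned internals, this approach only works as a privileged user, which matches the "only working as SYS" caveat above.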