IGNORE COMMENTS IN EXTERNAL TABLE FILE
I currently have a partitioned table, split by month, going back some 7 years.
The business has now agreed to archive off some of the data but to have it
available if required. I therefore intend to spool the data, month by month,
to a flat file and make use of external tables to provide the data to the
business if required, then to remove the month's partition, thus releasing
space back to the database. Each row is only about 100 bytes long, but
there could be about 15 million rows per month.
I can successfully create the external table and read the data.
However, I would like to place a header in the spooled file, but the only method
I have found to ignore the header is the "SKIP" parameter. I would
rather not use this, as it hard-codes the number of lines I reserve for the
header.
Is there another method to add a comment to an external table's file and have
selects on the file ignore the comments?
Could I use "LOAD WHEN ( column1 != '--' )" as each comment line begins with a
double dash?
Also, is there a limit to the size of an external tables file?
When I added a clause to attempt to exclude the header lines, making the
access parameters:
RECORDS DELIMITED BY newline
NOBADFILE
NODISCARDFILE
NOLOGFILE
SKIP 0
LOAD WHEN ( cvt_seq_num LIKE '--%' )
FIELDS
TERMINATED BY ","
OPTIONALLY ENCLOSED BY '"'
MISSING FIELD VALUES ARE NULL
REJECT ROWS WITH ALL NULL FIELDS
( cvt_seq_num INTEGER EXTERNAL(12)
I got:
select * from old_cvt_external
ERROR at line 1:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-00554: error encountered while parsing access parameters
KUP-01005: syntax error: found "identifier": expecting one of: "equal,notequal"
KUP-01008: the bad identifier was: LIKE
KUP-01007: at line 6 column 33
ORA-06512: at "SYS.ORACLE_LOADER", line 14
ORA-06512: at line 1
When I moved the line, the access parameters were:
RECORDS DELIMITED BY newline
NOBADFILE
NODISCARDFILE
NOLOGFILE
SKIP 0
FIELDS
TERMINATED BY ","
OPTIONALLY ENCLOSED BY '"'
MISSING FIELD VALUES ARE NULL
REJECT ROWS WITH ALL NULL FIELDS
LOAD WHEN ( cvt_seq_num LIKE '--%' )
I got:
select * from old_cvt_external
ERROR at line 1:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-00554: error encountered while parsing access parameters
KUP-01005: syntax error: found "load": expecting one of: "exit, ("
KUP-01007: at line 11 column 11
ORA-06512: at "SYS.ORACLE_LOADER", line 14
ORA-06512: at line 1
So it appears that the "LOAD WHEN" does not work.
Can you please suggest anything else?
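A possible workaround, sketched (untested) from the errors above: the LOAD WHEN condition grammar only accepts equal/not-equal comparisons (hence "expecting one of: equal,notequal", which rules out LIKE), and the clause must appear before FIELDS (hence the second error). It does, however, accept a byte-range operand, so comparing the first two bytes of each record against the comment marker might achieve the goal:

```sql
RECORDS DELIMITED BY newline
NOBADFILE
NODISCARDFILE
NOLOGFILE
-- Only =/!= comparisons are allowed in LOAD WHEN, but a
-- (start:end) byte range of the record is a valid operand;
-- the clause must appear before the FIELDS clause.
LOAD WHEN ((1:2) != "--")
FIELDS
TERMINATED BY ","
OPTIONALLY ENCLOSED BY '"'
MISSING FIELD VALUES ARE NULL
REJECT ROWS WITH ALL NULL FIELDS
```

The field list would follow unchanged. Note that SKIP 0 is the default and can simply be dropped.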
Similar Messages
-
How to ignore comments in a CSV file
Here's what my CSV file looks like
#Comment 1
#Comment 2
#Comment 3
US,20080101,20080526,20080901
I'm reading in this file. How can I ignore all the comment lines? I can code it to ignore all lines beginning with #, but I thought this feature was built into the language already. Maybe I'm using the wrong class. I'm using Scanner right now, and it reads in all the comments as well.
Scanner sc = new Scanner(new File("test.txt")).useDelimiter(",");
while (sc.hasNext()) {
    System.err.println(sc.next());
}
It's not built into the language. Your best bet is probably to put the CSV-reading logic into its own class.
Or find one of the thousands of classes that already implement such a thing. -
Directory selection in external table
How do we select the directory path for external table files? Can we select any directory on any drive, or do we have to select a particular drive and directory path? How do we check this?
example
create or replace directory ext_dir as 'c:/temp';
or
create or replace directory ext_dir as 'd:/temp';
Is any procedure needed for selecting the drive and directory? Is any read or write permission needed?
Hi,
"any procedure for that for selecting drive and directory?" - Not really clear to me what you mean by that.
"any read or write permission is needed?" - For external tables, READ privileges are required on directory objects that contain data sources, while WRITE privileges are required for directory objects containing bad, log, or discard files.
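For example, a minimal sketch of the required objects and grants (the path and the grantee SCOTT are illustrative, not from the question):

```sql
-- Sketch: a directory object plus the privileges an external table needs.
CREATE OR REPLACE DIRECTORY ext_dir AS 'c:/temp';
GRANT READ ON DIRECTORY ext_dir TO scott;   -- to read the data files
GRANT WRITE ON DIRECTORY ext_dir TO scott;  -- for bad/log/discard files
```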
http://download.oracle.com/docs/cd/B19306_01/server.102/b14231/tables.htm#sthref2462
Just check the docs, I bet your questions will be answered while you read:
http://www.oracle.com/pls/db102/homepage
or
http://www.oracle.com/pls/db111/homepage -
We have a Weather Portlet that runs within Oracle 9iAS Portal. Data for this portlet is currently transferred and loaded using PERL.
Could this process also be done using external tables? For example a script could run to transfer a data file and replace the current external table file. What issues might I run into?
Thanks,
Christopher -
How to read any file using external tables.
Hi folks,
I have written an application that reads a series of csv files using external tables which works fine as long as I specify each file name in the directory i.e.......
CREATE TABLE gb_test
(file_name varchar2(10),
 rec_date date,
 rec_name VARCHAR2(20),
 rec_age number)
ORGANIZATION EXTERNAL
(TYPE ORACLE_LOADER
 DEFAULT DIRECTORY GB_TEST
 ACCESS PARAMETERS
 (RECORDS DELIMITED BY NEWLINE
  FIELDS TERMINATED BY ',')
 LOCATION ('data1.csv','data2.csv','data3.csv','data4.csv'))
PARALLEL 5
REJECT LIMIT 20000;
However, I have discovered that I may not know the names of the files to be processed prior to the program being run, so I just want to read any file regardless of its name (although it will always be a .csv file).
Is there a way to ensure that you don't need to specify the files to be read in the LOCATION part of the syntax.
Thanks in advance.
Graham.
Right, I have now completed this; however, it's currently only working as SYS as opposed to any user. Here is a description of the scenario and the steps required, in case any of you need it in the future ......
The problem was I needed to search for csv files on my hard-drive. These files would be stored in a series of directories (a through to z), so I needed a way to read all 26 directories and process all files in these directories.
The problem was, prior to running the program, the user would remove all the files in the directories and insert new ones, but it was never known how many he would decide to do each time.
Solution: I created a table called stock_data_directories as follows ...
create table stock_data_directories(sdd_rec_no number,
sdd_table_name varchar2(50),
sdd_directory_name varchar2(50),
sdd_directory_path varchar2(100));
Then inserted 26 records like ...
insert into stock_data_directories(sdd_rec_no,sdd_table_name,sdd_directory_name,sdd_directory_path)
values(1,'rawdata_a','KPOLLOCKA','C:\KPOLLOCK\A');
insert into stock_data_directories(sdd_rec_no,sdd_table_name,sdd_directory_name,sdd_directory_path)
values(2,'rawdata_b','KPOLLOCKB','C:\KPOLLOCK\B');
etc...etc...
Then created 26 DIRECTORIES E.G.
CREATE OR REPLACE DIRECTORY KPOLLOCKA AS 'C:\KPOLLOCK\A';
CREATE OR REPLACE DIRECTORY KPOLLOCKB AS 'C:\KPOLLOCK\B';
Then created 26 external tables like the following ...
CREATE TABLE rawdata_a
(stock varchar2(1000),
 stock_date varchar2(10),
 stock_open VARCHAR2(20),
 stock_high varchar2(20),
 stock_low varchar2(20),
 stock_close VARCHAR2(30),
 stock_qty varchar2(20))
ORGANIZATION EXTERNAL
(TYPE ORACLE_LOADER
 DEFAULT DIRECTORY KPOLLOCKA
 ACCESS PARAMETERS
 (RECORDS DELIMITED BY NEWLINE
  FIELDS TERMINATED BY ',')
 LOCATION ('AA.csv'))
PARALLEL 5
REJECT LIMIT 20000;
This basically says that the external table rawdata_a currently has one file, AA.csv, in its directory.
Then wrote a procedure as follows ...
procedure p_process_files(pv_return_message OUT varchar2)is
cursor c_get_stock_data_directories is
select distinct sdd_directory_path,
sdd_table_name
from stock_data_directories
order by sdd_table_name;
vv_return_message varchar2(1000);
begin
-- here get the files for each directory
for r_get_stock_directories in c_get_stock_data_directories loop
p_build_external_table(r_get_stock_directories.sdd_directory_path,
r_get_stock_directories.sdd_table_name,
vv_return_message);
end loop;
end;
then wrote a procedure called p_build_external_table as follows ...
procedure p_build_external_table(pv_directory_path IN stock_data_directories.sdd_directory_path%type, -- e.g. 'C:\kpollock\A\
pv_table_name IN stock_data_directories.sdd_table_name%type, -- e.g. rawdata_a
pv_return_message OUT varchar2) is
vv_pattern VARCHAR2(1024);
ns VARCHAR2(1024);
vv_file_name varchar2(4000);
vv_start_string varchar2(1) := '''';
vv_end_string varchar2(3) := ''',';
vn_counter number := 0;
vv_err varchar2(2000);
BEGIN
vv_pattern := pv_directory_path||'*';
SYS.DBMS_BACKUP_RESTORE.searchFiles(vv_pattern, ns);
FOR each_file IN (SELECT FNAME_KRBMSFT AS name FROM X$KRBMSFT) LOOP
if each_file.name like '%.CSV' then
vv_file_name := vv_file_name||vv_start_string||substr(each_file.name,instr(each_file.name,'\',1,3)+1)||vv_end_string;
vn_counter := vn_counter + 1;
end if;
END LOOP;
vv_file_name := substr(vv_file_name,1,length(vv_file_name)-1); -- remove final , from string
execute immediate 'alter table '||pv_table_name||' location('||vv_file_name||')';
pv_return_message := 'Successfully changed '||pv_table_name||' at '||pv_directory_path||' to now have '||to_char(vn_counter)||' files';
exception
when others then
vv_err := sqlerrm;
pv_return_message := ' Error found updating directories. Error = '||vv_err;
END;
This reads every file in the directory and appends it to a list, so if it finds A.csv and ABC.csv then, using dynamic SQL, it alters the location to read 'a.csv','abc.csv'. It ignores all other file extensions. -
How to import external table, which exist in export dump file.
My export dump file has one external table. While i stated importing into my developement instance , I am getting the error "ORA-00911: invalid character".
The original definition of the extenal table is as given below
CREATE TABLE EXT_TABLE_EV02_PRICEMARTDATA
( EGORDERNUMBER VARCHAR2(255 BYTE),
EGINVOICENUMBER VARCHAR2(255 BYTE),
EGLINEITEMNUMBER VARCHAR2(255 BYTE),
EGUID VARCHAR2(255 BYTE),
EGBRAND VARCHAR2(255 BYTE),
EGPRODUCTLINE VARCHAR2(255 BYTE),
EGPRODUCTGROUP VARCHAR2(255 BYTE),
EGPRODUCTSUBGROUP VARCHAR2(255 BYTE),
EGMARKETCLASS VARCHAR2(255 BYTE),
EGSKU VARCHAR2(255 BYTE),
EGDISCOUNTGROUP VARCHAR2(255 BYTE),
EGREGION VARCHAR2(255 BYTE),
EGAREA VARCHAR2(255 BYTE),
EGSALESREP VARCHAR2(255 BYTE),
EGDISTRIBUTORCODE VARCHAR2(255 BYTE),
EGDISTRIBUTOR VARCHAR2(255 BYTE),
EGECMTIER VARCHAR2(255 BYTE),
EGECM VARCHAR2(255 BYTE),
EGSOLATIER VARCHAR2(255 BYTE),
EGSOLA VARCHAR2(255 BYTE),
EGTRANSACTIONTYPE VARCHAR2(255 BYTE),
EGQUOTENUMBER VARCHAR2(255 BYTE),
EGACCOUNTTYPE VARCHAR2(255 BYTE),
EGFINANCIALENTITY VARCHAR2(255 BYTE),
C25 VARCHAR2(255 BYTE),
EGFINANCIALENTITYCODE VARCHAR2(255 BYTE),
C27 VARCHAR2(255 BYTE),
EGBUYINGGROUP VARCHAR2(255 BYTE),
QTY NUMBER,
EGTRXDATE DATE,
EGLISTPRICE NUMBER,
EGUOM NUMBER,
EGUNITLISTPRICE NUMBER,
EGMULTIPLIER NUMBER,
EGUNITDISCOUNT NUMBER,
EGCUSTOMERNETPRICE NUMBER,
EGFREIGHTOUTBOUNDCHARGES NUMBER,
EGMINIMUMORDERCHARGES NUMBER,
EGRESTOCKINGCHARGES NUMBER,
EGINVOICEPRICE NUMBER,
EGCOMMISSIONS NUMBER,
EGCASHDISCOUNTS NUMBER,
EGBUYINGGROUPREBATES NUMBER,
EGINCENTIVEREBATES NUMBER,
EGRETURNS NUMBER,
EGOTHERCREDITS NUMBER,
EGCOOP NUMBER,
EGPOCKETPRICE NUMBER,
EGFREIGHTCOSTS NUMBER,
EGJOURNALBILLINGCOSTS NUMBER,
EGMINIMUMORDERCOSTS NUMBER,
EGORDERENTRYCOSTS NUMBER,
EGRESTOCKINGCOSTSWAREHOUSE NUMBER,
EGRETURNSCOSTADMIN NUMBER,
EGMATERIALCOSTS NUMBER,
EGLABORCOSTS NUMBER,
EGOVERHEADCOSTS NUMBER,
EGPRICEADMINISTRATIONCOSTS NUMBER,
EGSHORTPAYMENTCOSTS NUMBER,
EGTERMCOSTS NUMBER,
EGPOCKETMARGIN NUMBER,
EGPOCKETMARGINGP NUMBER,
EGWEIGHTEDAVEMULTIPLIER NUMBER
)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY EV02_PRICEMARTDATA_CSV_CON
  ACCESS PARAMETERS
  LOCATION (EV02_PRICEMARTDATA_CSV_CON:'VPA.csv')
)
REJECT LIMIT UNLIMITED
NOPARALLEL
NOMONITORING;
While importing , when i seen the log file , it is failing the create the external table. Getting the error "ORA-00911: invalid character".
Can some one suggest how to import external tables
Addressing this issue will be highly appreciated.
Naveen
Hi Srinath,
When I observed the create table syntax of the external table in the import dump log file, it showed a few lines as below. I could not understand these special characters, and the create table definition is failing with a special character, viz. ORA-00911: invalid character.
ACCESS PARAMETERS
LOCATION (EV02_PRICEMARTDATA_CSV_CON:'VPA.csv').
I even observed the create table DDL from TOAD. It is the same as I mentioned earlier.
Naveen -
DATE fields and LOG files in context with external tables
I am facing two problems when dealing with the external tables feature in Oracle 9i.
I created an external table with some fields of the DATE data type. There were no issues during the creation part, but when I query the table, the DATE fields are not selected properly, even though the data is there in the files. Are there any ideas on how to deal with this?
My next question is regarding the log files. The contents of the log file seem to keep growing when querying the external tables. Is there a way to control this behaviour?
Suggestions / Advices on the above two issues are welcome.
Thanks
Lakshminarayanan
Hi,
If you have date datatypes, then:
select
greatest(TABCASER1.CASERRECIEVEDDATE, EVCASERS.FINALEVDATES, EVCASERS.PUBLICATIONDATE, EVCASERS.PUBLICATIONDATE, TABCASER.COMPAREACCEPDATE)
from TABCASER, TABCASER1, EVCASERS
where ...-- join and other conditions
1. greatest is good enough.
2. to_date creates a DATE datatype from a string, using the format of the format string ('mm/dd/yyyy').
3. decode(a, b, c, d) is a function: if a = b then return c, else d. NULL means that there is no data in that cell of the table.
4. To format the date for display, use the to_char function with a format model, as in the to_date function.
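The reply above covers SQL date functions, but for an external table the usual fix is to declare a date mask in the access parameters, and NOLOGFILE addresses the growing log file. A minimal sketch (the table, directory, column, and mask are illustrative, not from the question):

```sql
-- Sketch: give ORACLE_LOADER an explicit date mask so DATE columns
-- parse correctly; NOLOGFILE stops the log from growing on each query.
CREATE TABLE date_ext
( hire_date DATE )
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY ext_dir
  ACCESS PARAMETERS
  ( RECORDS DELIMITED BY NEWLINE
    NOLOGFILE
    FIELDS TERMINATED BY ','
    ( hire_date CHAR(10) DATE_FORMAT DATE MASK "dd/mm/yyyy" )
  )
  LOCATION ('dates.csv')
)
REJECT LIMIT UNLIMITED;
```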
Ott Karesz
http://www.trendo-kft.hu -
How to split files when using DATA UNLOAD (External Table, 10g)?
Hi,
I am running 10gR2 and need to export partitions by using the data pump driver
via dmp files (external tables).
As a requirement, the created files cannot exceed the 2GB mark.
Some of the partitions are larger than that.
How could I split the partition so that I could create files smaller than 2GB? Is there any parameter I am not aware of, or do I need to do SELECT COUNT(*) FROM source_table PARTITION(partiton_01);
and then work with ROWNUM?
This example works fine for all partitions smaller than 2GB:
CREATE TABLE partiton_01_tbl
ORGANIZATION EXTERNAL
(
  TYPE ORACLE_DATAPUMP
  DEFAULT DIRECTORY def_dir1
  LOCATION ('inv_xt1.dmp')
)
PARALLEL 3
AS SELECT * FROM source_table PARTITION(partiton_01);
You could specify multiple destination files in the LOCATION parameter (the number of files should match the degree of parallelism specified). I am not aware of an option that would let the external table automatically add new data files as the partition size increased, so you'd likely have to do some sort of computation about the expected size of the dump file in order to figure out how many files to create.
Justin -
External table refere to a file in subfolder location
Hi,
I'm trying to link an external table to a file in a subfolder of a file location in OWB. Is this possible, or do I need to make a new connector for every folder I want to use for an external file?
The server is an AIX server. So for example:
- Location1 refers to /POS
But I want to use a file located in /POS/BLA
regards,
Osman
Edited by: Ossy81 on Nov 5, 2009 3:39 PM
Yes. Here is the input from the user guide:
Describing the Flat File Module
Type a name and an optional description for the flat file module.
Recall that you create a module for each folder in your file system from which you want to sample existing files or to which you want to create new files. Therefore,
consider naming the module based on the folder name. Each file module corresponds to one folder in the file system.
For example, to import flat files from c:\folder1 and a subfolder such as c:\folder1\subfolder, create two file modules such as C_FOLDER1 and C_FOLDER1_SUBFOLDER. -
External table: How to load data from a fixed format UTF8 external file
Hi Experts,
I am trying to read data from a fixed format UTF8 external file in to a external table. The file has non-ascii characters, and the presence of the non-ascii characters causes the data to be positioned incorrectly in the external table.
The following is the content's of the file:
20100423094529000000I1 ABÄCDE 1 000004
20100423094529000000I2 OMS Crew 2 2 000004
20100423094529000000I3 OMS Crew 3 3 000004
20100423094529000000I4 OMS Crew 4 4 000004
20100423094529000000I5 OMS Crew 5 5 000004
20100423094529000000I6 OMS Crew 6 6 000004
20100423094529000000I7 Mobile Crew 7 7 000004
20100423094529000000I8 Mobile Crew 8 8 000004
The structure of the data is as follows:
Name Type Start End Length
UPDATE_DTTM CHAR 1 20 20
CHANGE_TYPE_CD CHAR 21 21 1
CREW_CD CHAR 22 37 16
CREW_DESCR CHAR 38 97 60
CREW_ID CHAR 98 113 16
UDF1_CD CHAR 114 143 30
UDF1_DESCR CHAR 144 203 60
UDF2_CD CHAR 204 233 30
DATA_SOURCE_IND CHAR 294 299 6
UDF2_DESCR CHAR 234 293 60
I create the external table as follows:
CREATE TABLE "D_CREW_EXT"
( "UPDATE_DTTM" CHAR(20 BYTE),
"CHANGE_TYPE_CD" CHAR(1 BYTE),
"CREW_CD" CHAR(16 BYTE),
"CREW_DESCR" CHAR(60 BYTE),
"CREW_ID" CHAR(16 BYTE),
"UDF1_CD" CHAR(30 BYTE),
"UDF1_DESCR" CHAR(60 BYTE),
"UDF2_CD" CHAR(30 BYTE),
"DATA_SOURCE_IND" CHAR(6 BYTE),
"UDF2_DESCR" CHAR(60 BYTE)
)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER DEFAULT DIRECTORY "TMP"
ACCESS PARAMETERS ( RECORDS DELIMITED BY NEWLINE
CHARACTERSET UTF8
STRING SIZES ARE IN BYTES
NOBADFILE NODISCARDFILE NOLOGFILE FIELDS NOTRIM
( "UPDATE_DTTM" POSITION (1:20) CHAR(20),
"CHANGE_TYPE_CD" POSITION (21:21) CHAR(1),
"CREW_CD" POSITION (22:37) CHAR(16),
"CREW_DESCR" POSITION (38:97) CHAR(60),
"CREW_ID" POSITION (98:113) CHAR(16),
"UDF1_CD" POSITION (114:143) CHAR(30),
"UDF1_DESCR" POSITION (144:203) CHAR(60),
"UDF2_CD" POSITION (204:233) CHAR(30),
"DATA_SOURCE_IND" POSITION (294:299) CHAR(6),
"UDF2_DESCR" POSITION (234:293) CHAR(60) )
) LOCATION ( 'D_CREW_EXT.DAT' )
)
REJECT LIMIT UNLIMITED;
Check the result in database:
select * from D_CREW_EXT;
I found the first row is incorrect. For each non-ascii character,the fields to the right of the non-ascii character are off by 1 character,meaning that the data is moved 1 character to the right.
Then I tried to use the option STRING SIZES ARE IN CHARACTERS instead of STRING SIZES ARE IN BYTES, it doesn't work either.
The database version is 11.1.0.6.
Edited by: yuan on May 21, 2010 2:43 AM
Hi,
I changed BYTE to CHAR in the create table part, and it still doesn't work. The result is the same. I think the problem is in the ACCESS PARAMETERS.
Any other suggestions? -
Error in creating an external table referring to a XML file
I've got an XML file and I've tried to create an external table referring to it in this way:
CREATE TABLE mytable
( XML_DATA_COLUMN XMLType )
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER DEFAULT DIRECTORY TEST_DIR
  ACCESS PARAMETERS ( records delimited BY newline
  (XML_DATA_COLUMN LOB) )
  LOCATION ( 'myfile.xml' ) );
where TEST_DIR is the directory where myfile.xml is stored,
but I get this message:
Error at Command Line:3 Column:4
Error report:
SQL Error: ORA-30656: column type not supported on external organized table
30656.0000 - "column type not supported on external organized table"
*Cause: Attempt to create an external organized table with a column
of type LONG, LOB, BFILE, ADT, or VARRAY.
*Action: These column types are not supported, change the DDL.
I want to have in the XML_DATA_COLUMN the content of myfile.xml so as to handle it by using extract and extractvalue functions.
My oracle version is 10gR2 Express Edition
Thanks!The examples in the following thread include an insert, but you could also use the select statement alone without the insert.
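One commonly used alternative (a sketch; the character-set id is an assumption) is to skip the external table entirely and construct the XMLType straight from the file:

```sql
-- Sketch: build an XMLType directly from the file via a BFILE.
-- TEST_DIR and myfile.xml are from the question; the character set
-- id (AL32UTF8) is an assumption about the file's encoding.
SELECT XMLType(
         bfilename('TEST_DIR', 'myfile.xml'),
         nls_charset_id('AL32UTF8')
       ) AS xml_data
FROM dual;
```

The result can then be queried with the extract and extractvalue functions as intended.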
http://www.orafaq.com/forum/mv/msg/172162/511897/0/#msg_511897 -
Error while loading data into External table from the flat files
HI ,
We have a data load in our project which feeds the oracle external tables with the data from the Flat Files(.bcp files) in unix.
While loading the data, we are encountering the following error.
Error occured (Error Code : -29913 and Error Message : ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04063: un) while loading data into table_ext
Please let us know what needs to be done in this case to solve this problem.
Thanks,
Kartheek
Kartheek,
I used Google (mine still works)... please check those links:
I used Google (mine still works).... please check those links:
http://oraclequirks.blogspot.com/2008/07/ora-29400-data-cartridge-error.html
http://jonathanlewis.wordpress.com/2011/02/15/ora-29913/
HTH,
Thierry -
Issues with external table from excel file
dear all,
I have been trying to use the statement below to create an external table. This table references an Excel file (saved as .csv).
CREATE TABLE EMPFRAUD_TEST
( SERIAL_NUM VARCHAR2(10 BYTE),
  BRANCH_CODE VARCHAR2(10 BYTE),
  BUSINESS_ADD VARCHAR2(100 BYTE),
  REGIONS VARCHAR2(50 BYTE),
  TRANSACTION_DATE_TIME DATE,
  REPORT_DATE_TIME DATE,
  NO_OF_TRANS VARCHAR2(4 BYTE),
  AMOUNT NUMBER,
  FRAUD_TYPE VARCHAR2(25 BYTE),
  IMPACT_CATEGORY VARCHAR2(10 BYTE)
)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY EXT_EMP_TEST
  ACCESS PARAMETERS
  ( records delimited by newline
    badfile 'empfraud%a.bad'
    logfile 'empfraud%a.log'
    fields terminated by ','
    optionally enclosed by '"' lrtrim
    missing field values are null
  )
  LOCATION ('fraud.csv')
)
REJECT LIMIT UNLIMITED
NOPARALLEL
NOMONITORING;
The problems are as follows:
1) When I run the query above, the table will be created,
but when I try to select from the table, an empty table is displayed.
When I checked the error log file, the following message was given.
It was gotten from an Oracle DB on a Unix server:
"L_NUM
KUP-04036: second enclosing delimiter not found
KUP-04101: record 71 rejected in file /home/oracle/ext_folder_test/fraud.csv
KUP-04021: field formatting error for field ACCOUNT_KEY
KUP-04036: second enclosing delimiter not found
KUP-04101: record 79 rejected in file /home/oracle/ext_folder_test/fraud.csv
KUP-04021: field formatting error for field SERIAL_NUM
KUP-04036: second enclosing delimiter not found
KUP-04101: record 80 rejected in file /home/oracle/ext_folder_test/fraud.csv
error processing column TRANSACTION_DATE_TIME in row 1 for datafile /home/oracle/ext_folder_test/fraud.csv
ORA-01858: a non-numeric character was found where a numeric was expected
error processing column TRANSACTION_DATE_TIME in row 2 for datafile /home/oracle/ext_folder_test/fraud.csv
ORA-01843: not a valid month
error processing column TRANSACTION_DATE_TIME in row 3 for datafile /home/oracle/ext_folder_test/fraud.csv
ORA-01843: not a valid month
error processing column TRANSACTION_DATE_TIME in row 8 for datafile /home/oracle/ext_folder_test/fraud.csv
ORA-01843: not a valid month
error processing column TRANSACTION_DATE_TIME in row 9 for datafile /home/oracle/ext_folder_test/fraud.csv
ORA-01843: not a valid month
error processing column TRANSACTION_DATE_TIME in row 10 for datafile /home/oracle/ext_folder_test/fraud.csv
ORA-01843: not a valid month
error processing column TRANSACTION_DATE_TIME in row 11 for datafile /home/oracle/ext_folder_test/fraud.csv
ORA-01858: a non-numeric character was found where a numeric was expected
error processing column TRANSACTION_DATE_TIME in row 12 for datafile /home/oracle/ext_folder_test/fraud.csv
ORA-01843: not a valid month
error processing column TRANSACTION_DATE_TIME in row 13 for datafile /home/oracle/ext_folder_test/fraud.csv
ORA-01843: not a valid month
error processing column TRANSACTION_DATE_TIME in row 14 for datafile /home/oracle/ext_folder_test/fraud.csv
ORA-01843: not a valid month
error processing column TRANSACTION_DATE_TIME in row 15 for datafile /home/oracle/ext_folder_test/fraud.csv
ORA-01843: not a valid month"
Please, I need help to resolve this fast.
Thanks,
Regards,
Ajani Abdulrahman Olayide
NB:
after conversion to .csv format,
BELOW IS THE DATA I AM TRYING TO ACCESS FROM THE EXCEL FILE
BUSINESS OFFICE,REGIONS,Transaction_Date_Time,Report_Date_Time,Account_Key (Account No), Number_of_Transactions,Total_Amount (N)
1,162,9 ojo street,Lagos South,various ,17/01/10,16200388749987,1,5100000,CHEQUE
2,0238,"10 cyril Road, Enugu",East,21/06/2006,23/12/10,020968765357 09867653920174,1,20000000
3,0127,"261, obiageli Rd, Asaba",Mid-West,22/12/2010,23/12/10,'00160030006149,1,6000000
4,0519,"just road, Onitsha
",East,12/03/2010,14/02/11,0896002416575,1,5000000
5,0519,"just road, Onitsha
",East,03/12/2010,14/02/11,06437890134356,1,5000000
6,149,olayide street,Lagos South,10/02/2010,17/02/11,NGN01492501036 ,1,6108950
7,0066,wale,Mid - west,18/02/2011,18/02/10,'05590020002924,1,55157977.53
8,66,john,Mid- west,11/03/2010,14/03/09,'00660680054177,1,6787500
9,0273,waheed Biem,N/Central,Jan 09 to Dec 2010,01/04/11,Nil,1,14146040
As others suggested, you have to do the debugging yourself. To avoid the date error you may need something like this:
CREATE TABLE EMPFRAUD_TEST
( SERIAL_NUM VARCHAR2(10),
  BRANCH_CODE VARCHAR2(10),
  BUSINESS_ADD VARCHAR2(100),
  REGIONS VARCHAR2(50),
  TRANSACTION_DATE_TIME DATE,
  REPORT_DATE_TIME DATE,
  NO_OF_TRANS VARCHAR2(50),
  AMOUNT NUMBER,
  FRAUD_TYPE VARCHAR2(25),
  IMPACT_CATEGORY VARCHAR2(10)
)
ORGANIZATION EXTERNAL
( type oracle_loader default directory saubhik
  access parameters
  ( records delimited by newline
    badfile 'empfraud%a.bad'
    logfile 'empfraud%a.log'
    skip 1
    fields terminated by ','
    optionally enclosed by '"' ltrim
    missing field values are null
    ( serial_num,
      branch_code,
      business_add,
      regions,
      transaction_date_time date "dd/mm/rrrr",
      report_date_time date "dd/mm/rr",
      no_of_trans,
      amount,
      fraud_type,
      impact_category
    )
  )
  LOCATION ('fraud.csv')
)
REJECT LIMIT UNLIMITED; -
Create an external table based on an XML file
Hi ,
I am trying to create an external table based on XML file .Is it possible to do in OWB ?If Yes .Can someone please let me know the same .
I was successfully able to create an external table based on a flat file which is of the extension *.txt.
I was even able to load an XML file to a normal
table .How do we do it to an external table?
Thanks,
Manjula
Hello Manjula,
Is it possible for you to spare some time to list down the steps to be followed for creating an external table using a flat file (*.txt)?
I am trying to do it but not getting through it.
Thanks in Advance,
Ripesh