SQL loader zoned and nullif
Hi,
I am using SQL loader to insert data from a flat file.
While searching for other options in SQL*Loader, I found the ZONED datatype.
If I have a negative value in the flat file, like 98765.4321-:
I have searched on the internet and found that if I write ZONED(9,4) to store the above value, it will store the negative sign as well. I just want to confirm whether that is so. From what I have read, the sign is handled by the ZONED datatype but not by ZONED EXTERNAL.
Could you confirm this, or send me a link on the subject?
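For reference, a hedged sketch: ZONED expects zoned-decimal encoding (sign carried in the last byte, implied decimal point), so a plain text value with a literal '.' and a trailing '-' such as 98765.4321- is often easier to load as character data converted with a trailing-sign format mask. The column name amount below is made up for illustration:

```
amount CHAR "to_number(:amount, '99999.9999MI')"
```

The MI element in the format mask accepts a trailing minus sign; verify the behavior for unsigned values in your own database version.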
Also, I want to write NULLIF with more than one value for one column. I found two approaches on the internet:
1) OR conditions in NULLIF. For example:
TerminationDate POSITION(58:63) DATE(6) "YYMMDD"
NULLIF(TerminationDate = "000000" OR TerminationDate = "999999" OR
TerminationDate = "731014")
2) Decode the value. For example
TerminationDate POSITION(58:63) "decode (:TerminationDate,
'000000', NULL, '999999', NULL, '731014', NULL, to_date (:TerminationDate,
'YYMMDD') )"
Which one is the better approach out of these 2?
Thanks
user539644 wrote:
1) or condition in nullif. For example
TerminationDate POSITION(58:63) DATE(6) "YYMMDD"
NULLIF(TerminationDate = "000000" OR TerminationDate = "999999" OR
TerminationDate = "731014")
2) Decode the value. For example
TerminationDate POSITION(58:63) "decode (:TerminationDate,
'000000', NULL, '999999', NULL, '731014', NULL, to_date (:TerminationDate,
'YYMMDD') )"
Which one is the better approach out of these 2?
The best one is the one that works correctly with good performance and maintainability - beyond that, you decide.
I personally like the NULLIF answer better because I find DECODE hard to work with; if you would otherwise use DECODE and have a recent version of the database, use CASE instead.
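For example, the DECODE above could be rewritten with CASE (a sketch against the same field, not a tested control file):

```
TerminationDate POSITION(58:63)
  "case when :TerminationDate in ('000000', '999999', '731014') then null
        else to_date(:TerminationDate, 'YYMMDD') end"
```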
Similar Messages
-
SQL*Loader issue with NULLIF
Hi all,
I am trying to use following control file,
LOAD DATA
INFILE *
REPLACE
INTO TABLE T1
FIELDS TERMINATED BY '|'
TRAILING NULLCOLS
(OBJECT_NAME CHAR NULLIF OBJECT_NAME = "NULL" ,
SUBOBJECT_NAME CHAR NULLIF SUBOBJECT_NAME = "NULL" ,
OBJECT_ID DECIMAL EXTERNAL NULLIF OBJECT_ID = "NULL" ,
DATA_OBJECT_ID DECIMAL EXTERNAL NULLIF DATA_OBJECT_ID = "NULL" ,
OBJECT_TYPE CHAR NULLIF OBJECT_TYPE = "NULL" ,
CREATED DATE "DD/MM/YYYY HH24:MI:SS" NULLIF CREATED = "NULL" ,
LAST_DDL_TIME DATE "DD/MM/YYYY HH24:MI:SS" NULLIF LAST_DDL_TIME = "NULL" ,
TIMESTAMP CHAR NULLIF TIMESTAMP = "NULL" ,
STATUS CHAR NULLIF STATUS = "NULL" ,
TEMPORARY CHAR NULLIF TEMPORARY = "NULL" ,
GENERATED CHAR NULLIF GENERATED = "NULL" ,
SECONDARY CHAR NULLIF SECONDARY = "NULL"
)
I am getting the error:
SQL*Loader-350: Syntax error at line 21.
Expecting positive integer or column name, found keyword timestamp.
CHAR NULLIF TIMESTAMP = "NULL" ,
STATUS
The file I am trying to load is a pipe-delimited file and has the string "NULL" for NULL values, so I have added NULLIF for all columns.
Interestingly, Oracle allows us to have column names like TIMESTAMP or GENERATED, but when I use them in the NULLIF clause it is a syntax error.
The table I am using is like this (it is the same as the user_objects view):
SQL> desc t1
Name Null? Type
OBJECT_NAME VARCHAR2(128)
SUBOBJECT_NAME VARCHAR2(30)
OBJECT_ID NUMBER
DATA_OBJECT_ID NUMBER
OBJECT_TYPE VARCHAR2(19)
CREATED DATE
LAST_DDL_TIME DATE
TIMESTAMP VARCHAR2(19)
STATUS VARCHAR2(7)
TEMPORARY VARCHAR2(1)
GENERATED VARCHAR2(1)
SECONDARY VARCHAR2(1)
If I remove the NULLIF clause for the TIMESTAMP and GENERATED columns, there is no problem; the control file works fine.
How can I get around this problem ?
Thanks in advance
TIMESTAMP is a keyword for the loader and confuses it.
Rename your column. -
SQL*Loader-704 and ORA-12154 error messages when trying to load data with SQL*Loader
I have a data base with two tables that is used by Apex 4.2. One table has 800,000 records . The other has 7 million records
The client recently upgraded from Apex 3.2 to Apex 4.2 . We exported/imported the data to the new location with no problems
The source of the data is an old mainframe system; I needed to make changes to the source data and then load the tables.
The first time I loaded the data i did it from a command line with SQL loader
Now when I try to load the data I get this message:
SQL*Loader-704: Internal error: ulconnect OCISERVERATTACH
ORA-12154: tns:could not resolve the connect identifier specified
I've searched for postings on these error message and they all seem to say that SQL Ldr can't find my TNSNAMES file.
I am able to connect and load data with SQL Developer; so SQL developer is able to find the TNSNAMES file
However SQL Developer will not let me load a file this big
I have also tried to load the file within Apex (SQL Workshop/ Utilities) but again, the file is too big.
So it seems like SQL Loader is the only option
I did find one post online that said to set an environment variable with the path to the TNSNAMES file, but that didn't work..
Not sure what else to try or where to look
thanks
Hi,
You must have more than one tnsnames file or multiple installations of Oracle. What I suggest you do (as I'm sure is mentioned in the link you were already pointed at) is the following (I assume you are on Windows?):
open a command prompt
set TNS_ADMIN=PATH_TO_DIRECTORY_THAT_CONTAINS_CORRECT_TNSNAMES_FILE (i.e. something like set TNS_ADMIN=c:\oracle\network\admin)
This will tell Oracle to use the config files it finds there and no others.
then try sqlldr user/pass@db (in the same dos window)
see if that connects and let us know.
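Put together, the whole check in one cmd session looks like this (the path, credentials and control file name are examples only):

```
C:\> set TNS_ADMIN=c:\oracle\network\admin
C:\> sqlldr scott/tiger@mydb control=load.ctl
```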
Cheers,
Harry
http://dbaharrison.blogspot.com -
SQL Loader : Trim and Decode functions help please
Hi,
I have to load data from a flat file, for some columns i need to use TRIM and DECODE functions.It is a pipe delimited file.
I get syntax errors (one is below) same error listed for TRIM.
SQL*Loader-350: Syntax error at line xx.
Expecting "," or ")", found "DECODE".
===========
,FINAL_BILL_DATE CHAR(30) "TRIM(:FINAL_BILL_DATE)"
,BUSINESS_ID "DECODE(:BUSINESS_ID,'B',1,'C',2,'E',3,'G',4,'O',5,'R',6,'T',7,'U',8,'H',9,-1)"
Can anyone please help.
Thanks
Cherrish
Hello Cherrish.
The error you are receiving leads me to believe that at some point prior to the DECODE on the BUSINESS_ID line, probably some line even before the FINAL_BILL_DATE line, there is a syntax error causing the quotes before the DECODE to actually terminate some other syntax. Without all of the lines that could contribute to this, including the header details, this is the best I can advise.
Hope this helps,
Luke
Please mark the answer as helpful or answered if it is so. If not, provide additional details.
Always try to provide create table and insert table statements to help the forum members help you better. -
SQL Loader Truncate and SQL TRUNCATE difference
Could any one let me know what is difference between truncate used by control file of the SQL Loader and TRUNCATE command used by SQL? Is there any impact or difference of these both over the data files.
Thanks
Mr Jens, I think TRUNCATE in a SQL*Loader control file reuses extents, unlike the SQL TRUNCATE command. In my opinion it is best to truncate these to show the normal usage of these tables, not the elevated values.
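The difference can be sketched in SQL (assuming the reuse-extents behavior described above; t is a placeholder table name):

```
-- What SQL*Loader's TRUNCATE effectively does: extents stay allocated
TRUNCATE TABLE t REUSE STORAGE;

-- The SQL command's default: extents are deallocated back to the tablespace
TRUNCATE TABLE t DROP STORAGE;
```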
Could you please further comment? -
Sql loader scheduling and file polling on Windows
Hi there,
I am looking for a tool which can poll a particular folder on a Windows box for file upload events and then kick off the sqlldr process by passing it the file.
I don't want to write a custom utility but looking for something out of the box.
Let me know your thoughts.
Thanks,
oops, I think I did not post correctly.
I have the sqlldr part all coded and working against the csv file and able to load into tables. this is fine.
Also, in Windows I can schedule a batch job and get this running. What I am looking for is a scheduler which also includes a file listener utility. This will run 24x7 on the Windows box and, as and when a file is uploaded, will kick off the sqlldr job by passing the appropriate file name to the sqlldr task.
I believe this has to be at the OS level if I am not mistaken. If there is way to code this in Oracle I am all for it. Let me know
Thanks In Advance.
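One Oracle-side option worth a look (11.2 and later, so check your version): a DBMS_SCHEDULER file watcher can raise an event when a file arrives and start a job that runs your loader. This is only a fragment - a credential, a program with a metadata argument, and a job with a matching queue_spec are also needed, and all names here are made up:

```
BEGIN
  DBMS_SCHEDULER.CREATE_FILE_WATCHER(
    file_watcher_name => 'csv_watcher',    -- hypothetical name
    directory_path    => 'c:\incoming',    -- folder to poll
    file_name         => '*.csv',
    credential_name   => 'watch_cred',     -- OS credential, created separately
    enabled           => FALSE);
END;
/
```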
Edited by: ssp on Feb 10, 2011 2:26 PM -
I'm setting up a SQL*Loader script and trying to use the DECODE function as referred to in 'Applying SQL Operators to Fields'. I'm getting the error message 'Token longer than max allowable length of 258 chars'. Is there a limit to the size of the DECODE statement within SQL*Loader, or is it better to use a table trigger to handle this on insert? I ran the DECODE statement as a SELECT through SQL*Plus and it works okay there. Oracle 8.0 Utilities shows an example of DECODE in Ch. 5, but Oracle 9i Utilities Ch. 6 does not. Has anyone done this, and what's the impact on performance of the load if I can get it to work? See my example below:
LOAD DATA
INFILE 'e2e_prod_cust_profile.csv'
APPEND
INTO TABLE APPS.RA_CUSTOMER_PROFILES_INTERFACE
FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"'
(Insert_update_flag CHAR(1),
Orig_system_customer_ref CHAR(240),
customer_profile_class_name CHAR(30) NULLIF customer_profile_class_name=BLANKS
"decode(customer_profile_class_name,
'NORTHLAND Default','(MIA) Default',
'NORTHLAND Non Consolidated','(MIA) Non Cons',
'NORTHLAND Consolidated A','(MIA) Cons A',
'NORTHLAND Consolidated B','(MIA) Cons B',
'NORTHLAND Consolidated C','(MIA) Cons C',
'NORTHLAND Consolidated D','(MIA) Cons D',
'NORTHLAND Cons A NonZS','(MIA) Cons A NonZS',
'NORTHLAND Cons B NonZS','(MIA) Cons B NonZS',
'NORTHLAND Cons C NonZS','(MIA) Cons C NonZS',
'NORTHLAND Cons D NonZS','(MIA) Cons D NonZS',
'NORTHLAND International Billing','(MIA) International Billing',
customer_profile_class_name)",
credit_hold CHAR(1),
overall_credit_limit INTEGER EXTERNAL,
"e2e_cust_profile.ctl" 49 lines, 1855 characters
SQL*Loader-350: Syntax error at line 15.
Token longer than max allowable length of 258 chars
'NORTHLAND Consolidated D','(MIA) Cons D',
^
Your control file is incomplete and has some typos, but you could try something like:
create or replace function decode_profile_class_name (p_longname IN VARCHAR2)
return VARCHAR2
is
begin
CASE p_longname
WHEN 'NORTHLAND Default' THEN RETURN '(MIA) Default';
WHEN 'NORTHLAND Non Consolidated' THEN RETURN '(MIA) Non Cons';
WHEN 'NORTHLAND Consolidated A' THEN RETURN '(MIA) Cons A';
WHEN 'NORTHLAND Consolidated B' THEN RETURN '(MIA) Cons B';
WHEN 'NORTHLAND Consolidated C' THEN RETURN '(MIA) Cons C';
WHEN 'NORTHLAND Consolidated D' THEN RETURN '(MIA) Cons D';
WHEN 'NORTHLAND Cons A NonZS' THEN RETURN '(MIA) Cons A NonZS';
WHEN 'NORTHLAND Cons B NonZS' THEN RETURN '(MIA) Cons B NonZS';
WHEN 'NORTHLAND Cons C NonZS' THEN RETURN '(MIA) Cons C NonZS';
WHEN 'NORTHLAND Cons D NonZS' THEN RETURN '(MIA) Cons D NonZS';
WHEN 'NORTHLAND International Billing' THEN RETURN '(MIA) International Billing';
ELSE RETURN p_longname;
END CASE;
end;
/
LOAD DATA
INFILE 'e2e_prod_cust_profile.csv'
APPEND
INTO TABLE APPS.RA_CUSTOMER_PROFILES_INTERFACE
FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"'
(Insert_update_flag CHAR(1),
Orig_system_customer_ref CHAR(240),
customer_profile_class_name CHAR(30) NULLIF customer_profile_class_name=BLANKS "decode_profile_class_name(:customer_profile_class_name)",
credit_hold CHAR(1),
overall_credit_limit INTEGER EXTERNAL) -
I know I've done this before but, I don't use SQL Loader often and I'm having issues getting a file to load.
The table has 6 columns in it - one of which is a timestamp.
I was having issues loading it initially with date format issues. I ruled out any issues with the timestamp format by simply loading a dummy table with some timestamp based data and had no issue.
So I think the issue is that the first column of the table I'm loading is a value I'm attempting to populate with a sequence during the load, and that I'm getting something wrong there.
Table is as such:
CREATE TABLE ACS_IPS
(seq_ips NUMBER,
col2 VARCHAR2(100),
col3 VARCHAR2(100),
col4 VARCHAR2(100),
col5 TIMESTAMP,
col6 VARCHAR2(100),
col7 VARCHAR2(100),
col8 DATE DEFAULT SYSDATE NOT NULL,
col9 VARCHAR2(30) DEFAULT USER);The control file is:
load data
truncate
into table acs_ips
fields terminated by ","
trailing nullcols
(seq_ips "seq_ips.nextval",
col2,
col3,
col4,
col5 TIMESTAMP "YYYY-MM-DD HH24:MI:SS.FF9",
col6,
col7
)
The sequence column isn't in the file being loaded, and there are additional columns that are defaulted on the table that aren't in the control file.
Any help is appreciated... The error I'm getting is:
Rejected - Error on table CAMS.ACS_IPS, column ACTION_START.
ORA-01841: (full) year must be between -4713 and +9999, and not be 0
Yeah - not sure that clears it up...
The sequence in my table is the first column, and I think that's the problem... the sequence is being loaded as the first column in the control file, but it's not in the file being loaded, so it's skewing (again, I think??) the data being read in - which is why I'm getting the timestamp issues on that column (it's actually reading the next column in the file vs. the actual timestamp one).
If that's the issue - I'm not sure how to avoid it without restructuring the table to stick the sequence physically at the end. I'm certain that's not necessary and - I'm overlooking something that's otherwise simple but evading me. -
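One thing worth checking (an assumption, since the data file itself isn't shown): a column defined with a SQL string still consumes a field from delimited input unless it is marked as an expression. Adding the EXPRESSION keyword tells SQL*Loader not to read a field for it, which would stop the shift:

```
seq_ips EXPRESSION "seq_ips.nextval",
```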
ORA-12899 error from function invoked from SQL*Loader
I am getting the above error when I call a function from my SQL*Loader script, and I am not seeing what the problem is. As far as I can see, there should be no problem with the field lengths, unless the length of the automatic variable within my function is somehow being set at 30? Here are the details (in the SQL*Loader script, the field of interest is the last one):
====
Error:
====
Record 1: Rejected - Error on table TESTM8.LET_DRIVE_IN_FCLTY, column DIF_CSA_ID.
ORA-12899: value too large for column "TESTM8"."LET_DRIVE_IN_FCLTY"."DIF_CSA_ID" (actual: 30, maximum: 16)
=======
Function:
=======
CREATE OR REPLACE FUNCTION find_MCO_id (di_oid_in DECIMAL)
RETURN CHAR IS mco_id CHAR;
BEGIN
SELECT AOL_MCO_LOC_CD INTO mco_id
FROM CONV_DI_FLCTY
WHERE DIF_INST_ELMNT_OID = di_oid_in;
RETURN TRIM(mco_id);
END;
==============
SQL*Loader Script:
==============
LOAD DATA
INFILE 'LET_DRIVE_IN_FCLTY.TXT'
BADFILE 'LOGS\LET_DRIVE_IN_FCLTY_BADDATA.TXT'
DISCARDFILE 'LOGS\LET_DRIVE_IN_FCLTY_DISCARDDATA.TXT'
REPLACE
INTO TABLE TESTM8.LET_DRIVE_IN_FCLTY
FIELDS TERMINATED BY '~' OPTIONALLY ENCLOSED BY '"'
DIF_DRIVE_IN_OID DECIMAL EXTERNAL,
DIF_FCLTY_TYPE_OID DECIMAL EXTERNAL NULLIF DIF_FCLTY_TYPE_OID = 'NULL',
DIF_INST_ELMNT_OID DECIMAL EXTERNAL,
DIF_PRI_PERSON_OID DECIMAL EXTERNAL NULLIF DIF_PRI_PERSON_OID = 'NULL',
DIF_SEC_PERSON_OID DECIMAL EXTERNAL NULLIF DIF_SEC_PERSON_OID = 'NULL',
DIF_CREATE_TS TIMESTAMP "yyyy-mm-dd-hh24.mi.ss.ff6",
DIF_LAST_UPDATE_TS TIMESTAMP "yyyy-mm-dd-hh24.mi.ss.ff6",
DIF_ADP_ID CHAR NULLIF DIF_ADP_ID = 'NULL',
DIF_CAT_CLAIMS_IND CHAR,
DIF_CAT_DIF_IND CHAR,
DIF_DAYLT_SAVE_IND CHAR,
DIF_OPEN_PT_TM_IND CHAR,
DIF_CSA_ID CONSTANT "find_MCO_id(:DIF_DRIVE_IN_OID)"
============
Table Definitions:
============
SQL> describe CONV_DI_FLCTY;
Name Null? Type
DIF_INST_ELMNT_OID NOT NULL NUMBER(18)
AOL_MCO_LOC_CD NOT NULL VARCHAR2(3)
SQL> describe LET_DRIVE_IN_FCLTY;
Name Null? Type
DIF_DRIVE_IN_OID NOT NULL NUMBER(18)
DIF_INST_ELMNT_OID NOT NULL NUMBER(18)
DIF_FCLTY_TYPE_OID NUMBER(18)
DIF_ADP_ID VARCHAR2(10)
DIF_CAT_DIF_IND NOT NULL VARCHAR2(1)
DIF_CAT_CLAIMS_IND NOT NULL VARCHAR2(1)
DIF_CSA_ID VARCHAR2(16)
DIF_DAYLT_SAVE_IND NOT NULL VARCHAR2(1)
DIF_ORG_ENTY_ID VARCHAR2(16)
DIF_OPEN_PT_TM_IND NOT NULL VARCHAR2(1)
DIF_CREATE_TS NOT NULL DATE
DIF_LAST_UPDATE_TS NOT NULL DATE
DIF_ITM_FCL_MKT_ID NUMBER(18)
DIF_PRI_PERSON_OID NUMBER(18)
DIF_SEC_PERSON_OID NUMBER(18)
=========================
Thanks for any help with this one!
I changed one line of the function to:
RETURN CHAR IS mco_id VARCHAR2(16);
But I still get the same error:
ORA-12899: value too large for column "TESTM8"."LET_DRIVE_IN_FCLTY"."DIF_CSA_ID" (actual: 30, maximum: 16)
I just am not seeing what is being defined as 30 characters. Any ideas much appreciated! -
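One possibility worth checking (an observation, not a confirmed diagnosis): with the CONSTANT keyword, SQL*Loader treats the quoted text as a literal value rather than evaluating it, and the string find_MCO_id(:DIF_DRIVE_IN_OID) is itself exactly 30 characters long - which would match the "actual: 30" in the error. Dropping CONSTANT makes it an evaluated SQL expression:

```
DIF_CSA_ID "find_MCO_id(:DIF_DRIVE_IN_OID)"
```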
Hi, I am trying to use the DECODE function with SQL*Loader and am getting the error
SQL*Loader-350: Syntax error at line 8.
Expecting "," or ")", found keyword nullif.
code(:error_type,'Banned SBI',:error_type)" NULLIF error_type=BLANKS,
My ctl file is below...
LOAD DATA
INFILE 'abc.dat'
BADFILE 'abc.bad'
INTO TABLE xyz_tmp
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(record_id INTEGER EXTERNAL(4) NULLIF record_id=BLANKS,
error_type CHAR(100) "decode(:error_type,'Banned',:error_type)" NULLIF error_type=BLANKS,
error_subtype CHAR(100) NULLIF error_subtype=BLANKS,
number_of_errors INTEGER EXTERNAL(3) NULLIF number_of_errors=BLANKS,
sbi_number INTEGER EXTERNAL(8) NULLIF sbi_number=BLANKS,
team_name CHAR(60) NULLIF team_name=BLANKS
)
Could anyone help me correct it, please?
Hi,
Try changeing the line with decode to,
error_type CHAR(100) "decode(trim(:error_type),NULL, NULL,'Banned',:error_type)" ,
Cheers -
Hi,
Does anybody know if I can generate the unique primary key using an Oracle Sequence for a Database table to which I am inserting records in SQL Loader?
I checked the SQL*Loader manual and there is no information as to how to make use of an Oracle sequence in the control file.
Thanks
Surajit
Yes, you can do it. Create the sequence (suppose you call it "PK_SEQ_X") and then in your control file reference it as "PK_SEQ_X.NEXTVAL". For example, suppose you wanted to put it into a column named 'Y': the entry in your control file will look like 'load data insert into table Z (Y "PK_SEQ_X.NEXTVAL", ....)'
Note that the double quotes around the sequence name are required. -
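Spelled out as a control file, that suggestion looks roughly like this (table Z, column Y, the sequence name and the data file are placeholders from the answer above):

```
LOAD DATA
INFILE 'data.csv'
INSERT INTO TABLE Z
FIELDS TERMINATED BY ','
(Y "PK_SEQ_X.NEXTVAL",
 other_col)
```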
Sql Loader - Decimal numbers showing in null column
Greetings,
My apologies if this is in the wrong forum section. It seemed to be the most logical.
I have added a new column to a control file used in a SQL*Loader upload and I am getting unexpected results. Long story short: I copy FoxPro tables from a network directory to my local PC. A FoxPro exe converts these tables to .dat files. SQL*Loader then uploads the .dat files to matching Oracle tables. I've run this program from my PC for years with no problems.
Problem now: We added a new column to a FoxPro table and to the matching Oracle table. This column in FoxPro is null for now - no data at all. I then added the new column to my ctl file for this table. The program runs, and SQL*Loader does its thing with no errors. However, in the new field in Oracle, I'm finding decimal numbers in many of the records, when all records should have null values in this field. I've checked all other columns in the Oracle table and the data looks accurate. I'm not sure why I'm getting these decimal values in the new column.
My log and bad files show no hints of any problems. The bad file is empty for this table.
At first I thought the positioning of the new column in the fox table, .ctl file and the oracle table were not lining up correctly, but I checked and they are.
I've double checked the FoxPro table and all records for this new column are null.
I'm not sure what to check for next or what to test. I am hoping someone in this forum might lend a clue or has maybe seen this problem before. Below is my control file. The new column is the last one: fromweb_id. It is a number field in both FoxPro and Oracle.
Thanks for any advice.
JOBS table control file:
load data
infile 'convdata\fp_ora\JOBS.dat' "str X'08'"
into table JOBS
fields terminated by X'07'
TRAILING NULLCOLS
(SID,
CO_NAME "replace(replace(:CO_NAME,chr(11),chr(10)),chr(15),chr(13))",
JOB_TITLE "replace(replace(:JOB_TITLE,chr(11),chr(10)),chr(15),chr(13))",
CREDITS,
EARN_DATE date "mm/dd/yyyy",
COMMENTS CHAR(2000) "replace(replace(:COMMENTS,chr(11),chr(10)),chr(15),chr(13))",
DONT_SHOW,
PC_SRC "replace(replace(:PC_SRC,chr(11),chr(10)),chr(15),chr(13))",
PC_SRC_NO,
SALARY,
SALFOR,
ROOM,
BOARD,
TIPS,
UPD_DATE date "mm/dd/yyyy hh12:mi:ss am",
STUKEY,
JOBKEY,
JO_COKEY,
JO_CNKEY,
JO_ZUKEY,
EMPLID,
CN_NAME "replace(replace(:CN_NAME,chr(11),chr(10)),chr(15),chr(13))",
JOB_START date "mm/dd/yyyy",
JOB_END date "mm/dd/yyyy",
FROMWEB_ID)
I apologize for not explaining how this was resolved. SQL*Loader was working as it should.
The problem was due to new fields being added to the FoxPro table, along with the fromweb_id column, that I was not informed about. I was asked to add a column named fromweb_id to the oracle jobs table and to the sql-loader program. I was not told that there were other columns added at the same time. In the foxpro table, the fromweb_id column was the last column added.
The jobs.dat file contained data from all columns in the foxpro table, including all the new columns. I only added the "fromweb_id" to the control file, which is what I was asked to do. When it ran, it was getting values from one of the new columns and the values were being uploaded into the fromweb_id column in Oracle. It is that simple.
When I had checked the FoxPro table earlier, I did not pick up on the other new columns. I was focusing on looking for values in the fromweb_id column. When back-tracing data in the jobs.dat file, I found a value in the fromweb_id column that matched a value in a different column (a new column) in FoxPro. That is when I realized the other new columns existed. I instantly knew what the problem was.
Thanks for all the feedback. I'm sorry if this was an inconvenience to anyone. I'll try to dig a little deeper next time. Lessons learned...
regards, -
Using SQL Loader script in a Stored Procedure
Can I use a SQL*Loader script in a stored procedure and then execute it from a front-end application? The reason for this seemingly convoluted solution is that the users don't want a batch load, though the volume of records is quite high (around 1 million). Other loads using an ODBC connection or OLE DB seem to be inferior to SQL*Loader.
I would suggest a couple of solutions:
1. Have a cgi script that can upload the file to the server from a web ui, then have the cgi script call the sql*loader file, and it will insert into the database.
2. You can try to use external tables. These are available in 9i and onwards. You can query an external table with ordinary SQL and load from it with INSERT ... SELECT.
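A minimal external-table sketch (the directory path, file name, columns, and the staging table name are all hypothetical - adjust to your own data):

```
CREATE DIRECTORY load_dir AS '/data/incoming';

CREATE TABLE staging_ext (
  id   NUMBER,
  name VARCHAR2(100)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY load_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('data.csv')
);

-- Then load it like any other query source:
INSERT /*+ APPEND */ INTO staging SELECT * FROM staging_ext;
```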
I would normally use SQL*Loader, moving the data to a staging table with NOLOGGING and parallel loading. After it has been loaded into the staging table I would process it into my main tables. I have used this approach with up to 60 million records in one load.
You can do calls to C procedures, Pro*C procedures through PLSQL, as well as java calls, or use Java stored procedures.
My experience is that SQL*Loader is the fastest way to load data into the database. -
Track flat files that failed loading in sql loader
Hi,
Can anyone please suggest a way to track the flat files which failed while loading in SQL*Loader, and pass the failed flat file's name to a column of a database table?
Thanks in advance.
Edited by: 806821 on Nov 2, 2010 10:22 AM
Hi Morgan, thanks for your reply.
Define failed. 1 row not loaded ... no rows not loaded ... what operating system ... what version of the Oracle database ... track in a table, send an email?
Your inquiry is long on generalities and short on specifics: Fill in all the blanks ... not just the ones I've listed above.
Even if one row is not loaded, it should be considered a failed load, and the name of that particular flat file should be fetched.
Operating system is unix
Oracle database we are using is R12
Track in a table - yes, and we want to send an email notification whenever a flat file fails to load.
Thanks once again...!! -
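One way to sketch this at the OS level (a Unix shell outline; the table load_failures, the credentials, and the file names are all hypothetical): sqlldr sets its exit code to 0 on success, 2 on warnings such as rejected rows, and other non-zero codes on failure, so a wrapper can record the file name whenever the code is non-zero:

```
#!/bin/sh
# Load one file and record its name in a table if anything went wrong
datafile=$1
sqlldr userid=scott/tiger control=load.ctl data="$datafile" silent=all
rc=$?
# sqlldr exit codes: 0 = success, 2 = warning (e.g. rejected rows), others = failure
if [ "$rc" -ne 0 ]; then
  echo "INSERT INTO load_failures (file_name) VALUES ('$datafile'); COMMIT;" \
    | sqlplus -s scott/tiger
fi
```

An email step (e.g. mailx, or UTL_MAIL from inside the database) could then be driven off the same table.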
Can I have two Data Files in One control file of sql*loader tool
hi,
Can someone help me out. is it possible to have two Data Files in one control file of Sql*loader.
And is it possible to load 10,000 records before lunch, 10,000 before tea, and 10,000 before the evening session, by taking breaks after every 10,000 records?
Thanks
Ram
Yes. You can specify two data files in one control file and load them using SQL*Loader.
Here is a sample control file:
Load DATA
INFILE 'TEST1.CSV'
INFILE 'TEST2.CSV'
TRUNCATE
INTO TABLE TEST_P
FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(COL_1,
COL_2,
COL_n)
Hope It will help you.
-Karthik
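On the second part of the question - loading in batches of 10,000 with breaks in between - the sqlldr command-line options SKIP and LOAD can pick up where the previous run stopped (credentials and file names are examples):

```
REM first session: rows 1-10,000
sqlldr scott/tiger control=test.ctl skip=0 load=10000
REM second session: rows 10,001-20,000
sqlldr scott/tiger control=test.ctl skip=10000 load=10000
```

With multiple INFILEs, SKIP counts records across the files in order, so verify the counts against your own log files before relying on this.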