Automate SQL*Loader
Hi,
I am working on a Decision Support System project.
I need to load flat files into Oracle tables through SQL*Loader, and the entire process should be invoked through a Java front end.
How do I go about it?
I will deeply appreciate any help.
Raghu.
Hi,
In our previous project, we automated SQL*Loader. We were on UNIX, so we used shell scripts to generate the control file (a script can create the control file automatically); the script then called SQL*Loader to load the data into the tables.
You can use the same logic here.
If your flat files always have the same format, use static control files (that is, create the control files at installation time) and invoke them whenever you need them. Don't go for dynamically generated control files.
1. If you are using Java as the front end, you can use native methods to call the sqlldr executable. The problem is that you cannot invoke it from the client machine; you can do it only on the server side.
2. You can also try the external procedure method: the database calls a shared library, and the shared library invokes SQL*Loader (write a small C shared-library program for invoking SQL*Loader). With this approach you can invoke SQL*Loader from a client machine as well.
3. One more way is to use a listener. Create a listener program and run it on the server side as a background process; whenever a request arrives, it invokes SQL*Loader.
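The shell-script approach described above can be sketched as follows. All table, column, and file names here are illustrative assumptions, and the sqlldr call is echoed rather than executed:

```shell
#!/bin/sh
# Generate a control file for a hypothetical EMP_STAGE staging table,
# then invoke SQL*Loader. Names are assumptions for illustration.
DATA_FILE="emp.dat"
CTL_FILE="emp.ctl"

cat > "$CTL_FILE" <<EOF
LOAD DATA
INFILE '$DATA_FILE'
APPEND
INTO TABLE EMP_STAGE
FIELDS TERMINATED BY ','
(EMP_ID, EMP_NAME, DEPT_ID)
EOF

# On a real server, drop the leading 'echo' to actually run the load:
echo sqlldr userid=scott/tiger control="$CTL_FILE" log=emp.log
```

A Java front end would then run such a script on the server side, for example through Runtime.getRuntime().exec().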
With regards,
Boby Jose Thekkanath
[email protected]
Dharma Computers(p) Ltd.
Bangalore-India.
Similar Messages
-
Hi,
Does anybody know if it is possible to export more than one table in loader format in SQL Developer?
I have no idea how it can be possible (scripting, Java or SQL extension...???)
If somebody has already managed to perform this kind of stuff, can he help me? I would give him eternal thanks ;-)
Thanks in advance,
Regards,
Raphael GEGOUE
France
At this stage it is not possible to export more than one table to SQL*Loader format at a time. Please post a request on the Exchange for this.
Regards
Sue -
Hi,
I need to automate a scheduled job to import Excel files into Oracle tables using SQL*Loader.
Can any one please tell me how to do this?
thanks
Jenny
Depends on a couple of factors.
If you are on a UNIX platform, you can create a shell script and schedule it with cron.
If you are on a Windows platform, you can create a batch file and schedule it with the Windows scheduler.
Or, if you are on Oracle 9i or 10g, you could use the external table feature instead of SQL*Loader. Then you could write a stored procedure to process the external table and schedule it using the Oracle scheduler (DBMS_JOB). This would probably be my preference. -
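The DBMS_JOB route from the last suggestion can be sketched like this. The procedure name and schedule are assumptions, and the sqlplus call is echoed rather than executed:

```shell
#!/bin/sh
# Write a one-off SQL script that schedules a hypothetical stored
# procedure (process_external_table) to run daily at 02:00 via DBMS_JOB.
cat > schedule_load.sql <<'EOF'
VARIABLE jobno NUMBER
BEGIN
  DBMS_JOB.SUBMIT(
    job       => :jobno,
    what      => 'process_external_table;',       -- hypothetical procedure
    next_date => SYSDATE,
    interval  => 'TRUNC(SYSDATE) + 1 + 2/24');    -- tomorrow at 02:00, daily
  COMMIT;
END;
/
EOF

# On a real server, drop the leading 'echo' to submit the job:
echo sqlplus scott/tiger @schedule_load.sql
```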
ORA-12899 error from function invoked from SQL*Loader
I am getting the above error when I call a function from my SQL*Loader script, and I am not seeing what the problem is. As far as I can see, there should be no problem with the field lengths, unless the length of the automatic variable within my function is somehow being set at 30? Here are the details (in the SQL*Loader script, the field of interest is the last one):
====
Error:
====
Record 1: Rejected - Error on table TESTM8.LET_DRIVE_IN_FCLTY, column DIF_CSA_ID.
ORA-12899: value too large for column "TESTM8"."LET_DRIVE_IN_FCLTY"."DIF_CSA_ID" (actual: 30, maximum: 16)
=======
Function:
=======
CREATE OR REPLACE FUNCTION find_MCO_id (di_oid_in DECIMAL)
RETURN CHAR IS mco_id CHAR;
BEGIN
SELECT AOL_MCO_LOC_CD INTO mco_id
FROM CONV_DI_FLCTY
WHERE DIF_INST_ELMNT_OID = di_oid_in;
RETURN TRIM(mco_id);
END;
==============
SQL*Loader Script:
==============
LOAD DATA
INFILE 'LET_DRIVE_IN_FCLTY.TXT'
BADFILE 'LOGS\LET_DRIVE_IN_FCLTY_BADDATA.TXT'
DISCARDFILE 'LOGS\LET_DRIVE_IN_FCLTY_DISCARDDATA.TXT'
REPLACE
INTO TABLE TESTM8.LET_DRIVE_IN_FCLTY
FIELDS TERMINATED BY '~' OPTIONALLY ENCLOSED BY '"'
DIF_DRIVE_IN_OID DECIMAL EXTERNAL,
DIF_FCLTY_TYPE_OID DECIMAL EXTERNAL NULLIF DIF_FCLTY_TYPE_OID = 'NULL',
DIF_INST_ELMNT_OID DECIMAL EXTERNAL,
DIF_PRI_PERSON_OID DECIMAL EXTERNAL NULLIF DIF_PRI_PERSON_OID = 'NULL',
DIF_SEC_PERSON_OID DECIMAL EXTERNAL NULLIF DIF_SEC_PERSON_OID = 'NULL',
DIF_CREATE_TS TIMESTAMP "yyyy-mm-dd-hh24.mi.ss.ff6",
DIF_LAST_UPDATE_TS TIMESTAMP "yyyy-mm-dd-hh24.mi.ss.ff6",
DIF_ADP_ID CHAR NULLIF DIF_ADP_ID = 'NULL',
DIF_CAT_CLAIMS_IND CHAR,
DIF_CAT_DIF_IND CHAR,
DIF_DAYLT_SAVE_IND CHAR,
DIF_OPEN_PT_TM_IND CHAR,
DIF_CSA_ID CONSTANT "find_MCO_id(:DIF_DRIVE_IN_OID)"
============
Table Definitions:
============
SQL> describe CONV_DI_FLCTY;
Name Null? Type
DIF_INST_ELMNT_OID NOT NULL NUMBER(18)
AOL_MCO_LOC_CD NOT NULL VARCHAR2(3)
SQL> describe LET_DRIVE_IN_FCLTY;
Name Null? Type
DIF_DRIVE_IN_OID NOT NULL NUMBER(18)
DIF_INST_ELMNT_OID NOT NULL NUMBER(18)
DIF_FCLTY_TYPE_OID NUMBER(18)
DIF_ADP_ID VARCHAR2(10)
DIF_CAT_DIF_IND NOT NULL VARCHAR2(1)
DIF_CAT_CLAIMS_IND NOT NULL VARCHAR2(1)
DIF_CSA_ID VARCHAR2(16)
DIF_DAYLT_SAVE_IND NOT NULL VARCHAR2(1)
DIF_ORG_ENTY_ID VARCHAR2(16)
DIF_OPEN_PT_TM_IND NOT NULL VARCHAR2(1)
DIF_CREATE_TS NOT NULL DATE
DIF_LAST_UPDATE_TS NOT NULL DATE
DIF_ITM_FCL_MKT_ID NUMBER(18)
DIF_PRI_PERSON_OID NUMBER(18)
DIF_SEC_PERSON_OID NUMBER(18)
=========================
Thanks for any help with this one!
I changed one line of the function to:
RETURN CHAR IS mco_id VARCHAR2(16);
But I still get the same error:
ORA-12899: value too large for column "TESTM8"."LET_DRIVE_IN_FCLTY"."DIF_CSA_ID" (actual: 30, maximum: 16)
I just am not seeing what is being defined as 30 characters. Any ideas much appreciated! -
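One thing worth checking on the ORA-12899 question above: with the CONSTANT keyword, SQL*Loader loads the quoted text itself as a literal rather than evaluating it, and the string find_MCO_id(:DIF_DRIVE_IN_OID) is exactly 30 characters long, which matches the "actual: 30" in the error. A sketch of the field definition with CONSTANT dropped, so that the double-quoted string is evaluated as a SQL expression (the explicit CHAR(16) length is an added assumption):

```
DIF_CSA_ID    CHAR(16) "find_MCO_id(:DIF_DRIVE_IN_OID)"
```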
How to have more than one condition on same column --- using SQL Loader
Hi All,
I am stuck with SQL Loader..
How do I filter records before loading in the table using when clause..
i should load data only when
field1 = 'AC' or 'VC'
field2 is NULL
i used various combinations in when clause like
a) when field1='AC' or field1='VC' and field2 = BLANKS
b) when (field1='AC') and (field2 = BLANKS )
& similar...
In all the cases I tried, I could not combine the OR condition on field1 with the null condition on field2.
But my main concern is: can we use OR or IS NULL at all in the WHEN clause of SQL*Loader?
Is it possible to check this anywhere?
Any alternate solution you could suggest?
Thanks
Dikshit
Ok, I'll try that, although I did try it earlier when I had iTunes 5.xx loaded, I think.
As to size of playlists, I have a master (900 songs) that defines what will fit onto the ipod , I then generate all the others as subsets of the master (not of the library)- hence I know they will all fit
Can you also clarify something for me: the dialogue box we are discussing is intended, I think, so that one can set the automatic synching of certain playlists between the PC & the ipod. Is that the only way one can select other playlists to go to the ipod - i.e. as static once-offs, not synchronised ?
Apple's docs, I think, are poor in this regard - they assume most iPods are bigger than the user's song library, and they gloss over the details of this alternate mode of playlist synching.
Thanks - Nick -
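On the WHEN-clause question above: SQL*Loader's WHEN clause supports only AND between comparisons (= and != against a literal or BLANKS), so OR cannot be expressed directly. A common workaround is two INTO TABLE clauses on the same table, one per value; note the POSITION(1) on the first field of the second clause, which is needed to reset the scan for delimited data. The field and table names follow the question and are otherwise assumptions:

```shell
#!/bin/sh
# Generate a control file that emulates
#   (field1 = 'AC' OR field1 = 'VC') AND field2 IS NULL
# using two INTO TABLE clauses, since WHEN has no OR.
cat > filter.ctl <<'EOF'
LOAD DATA
INFILE 'input.dat'
APPEND
INTO TABLE target_tab
WHEN (field1 = 'AC') AND (field2 = BLANKS)
FIELDS TERMINATED BY ',' TRAILING NULLCOLS
(field1, field2, field3)
INTO TABLE target_tab
WHEN (field1 = 'VC') AND (field2 = BLANKS)
FIELDS TERMINATED BY ',' TRAILING NULLCOLS
(field1 POSITION(1), field2, field3)
EOF

# On a real server, drop the leading 'echo' to run the load:
echo sqlldr userid=scott/tiger control=filter.ctl
```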
Comparison of Data Loading techniques - Sql Loader & External Tables
Below are 2 techniques using which the data can be loaded from Flat files to oracle tables.
1) SQL Loader:
a. Place the flat file (.txt or .csv) in the desired location.
b. Create a control file, for example:
LOAD DATA
INFILE 'Mytextfile.txt'    -- file containing the table data; specify the path correctly (it could be .csv as well)
APPEND                     -- or TRUNCATE, based on the requirement
INTO TABLE oracle_table
FIELDS TERMINATED BY ','   -- or whatever delimiter the input file uses
OPTIONALLY ENCLOSED BY '"'
(field1, field2, field3)
c. Now run Oracle's sqlldr utility from the OS command prompt:
sqlldr username/password control=filename.ctl
d. The data can be verified by selecting it from the table:
SELECT * FROM oracle_table;
2) External Table:
a. Place the flat file (.txt or .csv) on the desired location.
abc.csv
1,one,first
2,two,second
3,three,third
4,four,fourth
b. Create a directory
create or replace directory ext_dir as '/home/rene/ext_dir'; -- path where the source file is kept
c. After granting appropriate permissions to the user, we can create the external table like below:
create table ext_table_csv (
  i number,
  n varchar2(20),
  m varchar2(20)
)
organization external (
  type oracle_loader
  default directory ext_dir
  access parameters (
    records delimited by newline
    fields terminated by ','
    missing field values are null
  )
  location ('abc.csv')
)
reject limit unlimited;
d. Verify the data by selecting it from the external table now:
select * from ext_table_csv;
External tables feature is a complement to existing SQL*Loader functionality.
It allows you to –
• Access data in external sources as if it were in a table in the database.
• Merge a flat file with an existing table in one statement.
• Sort a flat file on the way into a table you want compressed nicely
• Do a parallel direct path load without splitting up the input file.
Shortcomings:
• External tables are read-only.
• No data manipulation language (DML) operations or index creation is allowed on an external table.
Using Sql Loader You can –
• Load the data from a stored procedure or trigger (insert is not sqlldr)
• Do multi-table inserts
• Flow the data through a pipelined plsql function for cleansing/transformation
Comparison for data loading
To make the loading operation faster, the degree of parallelism can be set to any number, e.g 4
So, when you created the external table, the database will divide the file to be read by four processes running in parallel. This parallelism happens automatically, with no additional effort on your part, and is really quite convenient. To parallelize this load using SQL*Loader, you would have had to manually divide your input file into multiple smaller files.
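A sketch of what the parallel external-table load described above looks like in SQL; the degree of 4 follows the example, the table names are assumptions, and the sqlplus call is echoed rather than executed:

```shell
#!/bin/sh
# Write the SQL for a degree-4 parallel direct-path load from an
# external table into a heap table. Names are illustrative.
cat > parallel_load.sql <<'EOF'
ALTER SESSION ENABLE PARALLEL DML;
INSERT /*+ APPEND PARALLEL(t, 4) */ INTO oracle_table t
SELECT /*+ PARALLEL(e, 4) */ * FROM ext_table_csv e;
COMMIT;
EOF

echo sqlplus scott/tiger @parallel_load.sql   # drop 'echo' on a real server
```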
Conclusion:
SQL*Loader may be the better choice in data loading situations that require additional indexing of the staging table. However, we can always copy the data from external tables to Oracle tables using DB links.
Please let me know your views on this.
-
Hi All,
In our application, we are allowing user to upload data using excel sheet in UI.
We are using PHP script in UI and using SQL Loader to load data from excel sheet to temp_table.
The temp_table has a primary key.
Here my question is: is there any way to put a batch id on every upload in that table automatically,
so that we can easily extract the data by using the batch id?
we are using Oracle 11g.
All that does is load a constant value, in which case you might as well just use constant 815 in your control file. If you want to automatically increment the value for each batch, then you need to use a different method.
Please see the example below. Prior to each data load, it loads the next value of the sequence into a separate table, then selects that value during the data load. Note that a SQL*Loader expression that uses select must be enclosed within parentheses within the double quotes.
SCOTT@orcl_11gR2> host type test1.dat
1 Prod1
2 Prod2
3 Prod3
4 Prod4
5 Prod5
SCOTT@orcl_11gR2> host type test2.dat
6 Prod6
7 Prod7
8 Prod8
SCOTT@orcl_11gR2> host type batch.ctl
options(load=1)
load data
replace
into table batch_tab
(batch_id expression "test_seq.nextval")
SCOTT@orcl_11gR2> host type data.ctl
load data
append
into table temp_table
fields terminated by whitespace
trailing nullcols
(p_id,
p_name,
batch_id expression "(select batch_id from batch_tab)")
SCOTT@orcl_11gR2> create table temp_table
2 (p_id number primary key,
3 p_name varchar2(6),
4 batch_id number)
5 /
Table created.
SCOTT@orcl_11gR2> create sequence test_seq
2 /
Sequence created.
SCOTT@orcl_11gR2> create table batch_tab
2 (batch_id number)
3 /
Table created.
SCOTT@orcl_11gR2> -- first load:
SCOTT@orcl_11gR2> host sqlldr scott/tiger control=batch.ctl log=batch1.log
SQL*Loader: Release 11.2.0.1.0 - Production on Fri Apr 19 17:16:33 2013
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Commit point reached - logical record count 1
SCOTT@orcl_11gR2> host sqlldr scott/tiger control=data.ctl data=test1.dat log=test1.log
SQL*Loader: Release 11.2.0.1.0 - Production on Fri Apr 19 17:16:33 2013
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Commit point reached - logical record count 5
SCOTT@orcl_11gR2> select * from batch_tab
2 /
BATCH_ID
1
1 row selected.
SCOTT@orcl_11gR2> select * from temp_table
2 /
P_ID P_NAME BATCH_ID
1 Prod1 1
2 Prod2 1
3 Prod3 1
4 Prod4 1
5 Prod5 1
5 rows selected.
SCOTT@orcl_11gR2> -- second load:
SCOTT@orcl_11gR2> host sqlldr scott/tiger control=batch.ctl log=batch2.log
SQL*Loader: Release 11.2.0.1.0 - Production on Fri Apr 19 17:16:33 2013
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Commit point reached - logical record count 1
SCOTT@orcl_11gR2> host sqlldr scott/tiger control=data.ctl data=test2.dat log=test2.log
SQL*Loader: Release 11.2.0.1.0 - Production on Fri Apr 19 17:16:33 2013
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Commit point reached - logical record count 3
SCOTT@orcl_11gR2> select * from batch_tab
2 /
BATCH_ID
2
1 row selected.
SCOTT@orcl_11gR2> select * from temp_table
2 /
P_ID P_NAME BATCH_ID
1 Prod1 1
2 Prod2 1
3 Prod3 1
4 Prod4 1
5 Prod5 1
6 Prod6 2
7 Prod7 2
8 Prod8 2
8 rows selected. -
SQL Loader and Insert Into Performance Difference
Hello All,
I'm in a situation where I need to measure the performance difference between SQL*Loader and INSERT INTO. Say there are 10,000 records in a flat file and I want to load them into a staging table.
I know that if I use PL/SQL UTL_FILE to do this job, performance will degrade (don't ask me why I'm going for UTL_FILE instead of SQL*Loader). But I don't know by how much. Can anybody tell me the performance difference in % (like a 20% decrease) in the case of 10,000 records?
Thanks,
Kannan.
Kannan B wrote:
Do not confuse the topic; as I told you, I'm not going to use external tables. This post is to discuss the performance difference between SQL*Loader and a simple insert statement.
I don't think people are confusing the topic.
External tables are a superior means of reading a file as it doesn't require any command line calls or external control files to be set up. All that is needed is a single external table definition created in a similar way to creating any other table (just with the additional external table information obviously). It also eliminates the need to have a 'staging' table on the database to load the data into as the data can just be queried as needed directly from the file, and if the file changes, so does the data seen through the external table automatically without the need to re-run any SQL*Loader process again.
Who told you not to use External Tables? Do they know what they are talking about? Can they give a valid reason why external tables are not to be used?
IMO, if you're considering SQL*Loader, you should be considering External tables as a better alternative. -
hi,
I want to insert 100,000 records daily into a table for the first month, and then in the next month these records are going to be replaced by new, updated records.
There might be a few additions and deletions to the previous records as well.
It is actually consumer data, so there might be a few consumers who have withdrawn from the utility, and some more consumers added to the database.
But almost 99% of the previous month's data has to be updated/replaced with the fresh month's data.
For instance, what I have in mind is that I will use SQL*Loader to load the data for the first month, then delete the previous data using SQL*Plus and load the fresh month's data using SQL*Loader again.
1. Is this OK? Or is there some better solution to this?
2. I have heard of external files; are they feasible in my scenario?
3. I have planned to make scripts for SQL*Plus and SQL*Loader and use them in batch files (OS Windows 2003 Server, Oracle 9i database). Is there some better choice to make the whole procedure automatic?
Looking for your suggestions,
nadeem ameer
I would suggest you use external tables, since they are more flexible than sqlldr and a better option.
To use external tables:
1) You will have to create a directory first.
2) Creation of the directory is generally done by SYS; after creating the directory, grant read and write privileges on it to the user.
3) Create the external table.
4) Now use the table like a normal table to insert, update, and delete in your own table.
You can get more information from
http://www.oracle-base.com/articles/9i/SQLNewFeatures9i.php#ExternalTables
CREATE DIRECTORY <directory_name> AS '<directory path where the file is present>';
GRANT READ, WRITE ON DIRECTORY <directory_name> TO <username>;
CREATE TABLE <table_name>
(<column definitions>)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY <directory_name>
  ACCESS PARAMETERS
  ( RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('<filename>')
)
PARALLEL 5
REJECT LIMIT 200;
Hope this helps. -
Importing to an Oracle Table from SQL*Loader Fails
Hi ,
When I try to upload an XML file from my server to my table in the Oracle server using SQL*Loader, it fails at times. Sometimes it works perfectly.
This is a daily process which automatically dumps data into my Oracle database.
Please find the error log :
SQL*Loader: Release 10.2.0.4.0 - Production on Thu Dec 5 04:07:32 2013
Copyright (c) 1982, 2007, Oracle. All rights reserved.
Control File: xmlFeedDelta.ctl
Data File: xmlFileNames_Delta.txt
Bad File: xmlFileNames_Delta.bad
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 1000
Bind array: 50000 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table XMLFEEDDELTA, loaded from every logical record.
Insert option in effect for this table: APPEND
Column Name Position Len Term Encl Datatype
FILENAME FIRST 4000 , CHARACTER
FILECONTENT DERIVED * EOF CHARACTER
Dynamic LOBFILE. Filename in field FILENAME
value used for ROWS parameter changed from 50000 to 63
SQL*Loader-643: error executing INSERT statement for table XMLFEEDDELTA
ORA-03113: end-of-file on communication channel
SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.
Table XMLFEEDDELTA:
0 Rows successfully loaded.
0 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 252378 bytes(63 rows)
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 1
Total logical records rejected: 0
Total logical records discarded: 0
Run began on Thu Dec 05 04:07:32 2013
Run ended on Thu Dec 05 04:08:42 2013
Elapsed time was: 00:01:10.05
CPU time was: 00:00:00.28
My Control File Looks like this :
LOAD DATA
INFILE xmlFileNames_Delta.txt
INTO TABLE xmlFeedDelta APPEND
fields terminated by ','
filename CHAR(4000),
filecontent LOBFILE(filename) terminated by eof
My Database version :
Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
PL/SQL Release 11.2.0.2.0 - Production
"CORE 11.2.0.2.0 Production"
TNS for Linux: Version 11.2.0.2.0 - Production
NLSRTL Version 11.2.0.2.0 - Production
I am not sure why this is happening at times. Any help would be appreciated.
Hi,
have you tried with the FILLER command like
LOAD DATA
INFILE xmlFileNames_Delta.txt
INTO TABLE xmlFeedDelta APPEND
fields terminated by ','
filename FILLER CHAR(4000),
filecontent LOBFILE(filename) terminated by eof -
SQL Loader (how to cut data header)
Hi there,
[oracle 11g]
I got the following text file:
mod; DD.MM.YYYY; HH:MM:SS; aligned
src; "ptv "; "15.04.2012"; "10:47:49"
chs; "ISO8859-1"
ver; "V1.0"
ifv; "V1.0"
dve; "V1.0"
fft; "LIO"
tbl; MENGE_FGR
atr; BASIS_VERSION; FGR_NR; FGR_TEXT
frm; num[9.0]; num[5.0]; char[40]
rec; 122; 8; "VVZ"
rec; 123; 18; "VHZ"
rec; 124; 13; "VTZ"
Now I am interested in the columns TBL and ATR and the following raw data.
Do you see a way to automatically create the table MENGE_FR with columns BASIS_VERSION; FGR_NR;FGR_TEST and column types num, num, char and insert the raw data below?
PS: OK, this is MySQL ... so I need to convert this first to SQL. So you should read num as number.
Thx in advance Thorsten
Edited by: Thorsten on 16.05.2013 07:30
Edited by: Thorsten on 16.05.2013 07:32
There are various ways that you could do this. I have demonstrated one method below. I created a table with two columns, then used SQL*Loader to load the data from the text file into those two columns. Skipping the header rows is optional. You could also use an external table instead, if the text file is on your server, not your client. I then used some PL/SQL to create and execute "create table" and "insert" statements. This is just some starter code. You will need to make modifications to handle other data types and such that were not in the example that you provided, but it should give you the general idea.
SCOTT@orcl_11gR2> host type text_file.dat
mod; DD.MM.YYYY; HH:MM:SS; aligned
src; "ptv "; "15.04.2012"; "10:47:49"
chs; "ISO8859-1"
ver; "V1.0"
ifv; "V1.0"
dve; "V1.0"
fft; "LIO"
tbl; MENGE_FGR
atr; BASIS_VERSION; FGR_NR; FGR_TEXT
frm; num[9.0]; num[5.0]; char[40]
rec; 122; 8; "VVZ"
rec; 123; 18; "VHZ"
rec; 124; 13; "VTZ"
SCOTT@orcl_11gR2> host type test.ctl
options(skip=7)
load data
infile text_file.dat
into table tbl
(col1 terminated by ';',
col2 terminated by x'0a')
SCOTT@orcl_11gR2> create table tbl
2 (col1 varchar2(4),
3 col2 varchar2(60))
4 /
Table created.
SCOTT@orcl_11gR2> host sqlldr scott/tiger control=test.ctl log=test.log
SQL*Loader: Release 11.2.0.1.0 - Production on Thu May 16 13:44:24 2013
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Commit point reached - logical record count 6
SCOTT@orcl_11gR2> select * from tbl
2 /
COL1 COL2
tbl MENGE_FGR
atr BASIS_VERSION; FGR_NR; FGR_TEXT
frm num[9.0]; num[5.0]; char[40]
rec 122; 8; "VVZ"
rec 123; 18; "VHZ"
rec 124; 13; "VTZ"
6 rows selected.
SCOTT@orcl_11gR2> declare
2 v_tab varchar2(30);
3 v_atr varchar2(32767);
4 v_frm varchar2(32767);
5 v_sql varchar2(32767);
6 v_cols number;
7 v_next varchar2(32767);
8 begin
9 select col2 into v_tab from tbl where col1 = 'tbl';
10 select col2 || ';' into v_atr from tbl where col1 = 'atr';
11 select col2 || ';' into v_frm from tbl where col1 = 'frm';
12 v_sql := 'CREATE TABLE ' || v_tab || ' (';
13 select regexp_count (col2, ';') + 1 into v_cols from tbl where col1 = 'atr';
14 for i in 1 .. v_cols loop
15 v_sql := v_sql || substr (v_atr, 1, instr (v_atr, ';') - 1) || ' ';
16 v_next := substr (v_frm, 1, instr (v_frm, ';') - 1);
17 v_next := replace (v_next, '[', '(');
18 v_next := replace (v_next, ']', ')');
19 v_next := replace (v_next, '.', ',');
20 v_next := replace (v_next, 'num', 'number');
21 v_next := replace (v_next, 'char', 'varchar2');
22 v_sql := v_sql || v_next || ',';
23 v_atr := substr (v_atr, instr (v_atr, ';') + 1);
24 v_frm := substr (v_frm, instr (v_frm, ';') + 1);
25 end loop;
26 v_sql := rtrim (v_sql, ',') || ')';
27 dbms_output.put_line (v_sql);
28 execute immediate v_sql;
29 for r in (select col2 from tbl where col1 = 'rec') loop
30 v_sql := 'INSERT INTO ' || v_tab || ' VALUES (''';
31 v_sql := v_sql || replace (replace (r.col2, ';', ''','''), '"', '');
32 v_sql := v_sql || ''')';
33 dbms_output.put_line (v_sql);
34 execute immediate v_sql;
35 end loop;
36 end;
37 /
CREATE TABLE MENGE_FGR ( BASIS_VERSION number(9,0), FGR_NR number(5,0),
FGR_TEXT varchar2(40))
INSERT INTO MENGE_FGR VALUES (' 122',' 8',' VVZ')
INSERT INTO MENGE_FGR VALUES (' 123',' 18',' VHZ')
INSERT INTO MENGE_FGR VALUES (' 124',' 13',' VTZ')
PL/SQL procedure successfully completed.
SCOTT@orcl_11gR2> describe menge_fgr
Name Null? Type
BASIS_VERSION NUMBER(9)
FGR_NR NUMBER(5)
FGR_TEXT VARCHAR2(40)
SCOTT@orcl_11gR2> select * from menge_fgr
2 /
BASIS_VERSION FGR_NR FGR_TEXT
122 8 VVZ
123 18 VHZ
124 13 VTZ
3 rows selected. -
SQL*Loader in JSP - I know Oracle 10g has the sqlldr utility
Can I submit a file (CSV) from a remote client to the server, which can then be uploaded into the Oracle DB?
I know how to upload a file into the root directory.
How can I use JSP to run sqlldr on the server so that the process becomes automatic?
I know how to give the sqlldr command at the command prompt, but what is the scripting in JSP so that sqlldr can run and the CSV file gets uploaded?
Please let me know where I can find info about this, if you have any idea.
thanks
Message was edited by:
UDAY
Hi,
Can you please check whether it is possible to execute OS commands in JSP?
If possible, then you can very well execute sqlldr from JSP.
thanks
Note : This is Oracle database general forum. -
SQL*Loader loads duplicate records even when there is a PK defined
Hi,
I have created a table with a PK on one of the columns, and loaded the table using SQL*Loader. The flat file has duplicate records. It loaded all the records without any error, but now the index has become unusable.
The requirement is to fail the process if there are any duplicates. Please help me in understanding why this is happening.
Below is the ctl file
OPTIONS(DIRECT=TRUE, ERRORS=0)
UNRECOVERABLE
load data
infile 'test.txt'
into table abcdedfg
replace
fields terminated by ',' optionally enclosed by '"'
col1 ,
col2
I defined the PK on col1.
Check out:
http://download-east.oracle.com/docs/cd/B19306_01/server.102/b14215/ldr_modes.htm#sthref1457
It states...
During a direct path load, some integrity constraints are automatically disabled. Others are not. For a description of the constraints, see the information about maintaining data integrity in the Oracle Database Application Developer's Guide - Fundamentals.
Enabled Constraints
The constraints that remain in force are:
NOT NULL
UNIQUE
PRIMARY KEY (unique constraints on not-null columns)
Since the OP has the primary key in place before starting the direct path load, this seems contradictory to what the documentation says, or is probably a bug? (Note: during a direct path load, rows that violate a unique constraint are still loaded; the violation is detected only when the unique index is rebuilt at the end of the load, which leaves the index UNUSABLE - exactly what the OP observed.)
Need faster data loading (using sql-loader)
I am trying to load approx. 230 million records (around 60 bytes per record) from a flat file into a single table. I have tried SQL*Loader (conventional load) and I'm seeing performance degrade as the file is being processed. I am avoiding direct path loading because I need to maintain uniqueness using my primary key index during the load, so the degradation of the load performance doesn't shock me. The source data file contains duplicate records and may contain records that are duplicates of those already in the table (I am appending during the SQL*Loader run).
My other option is to unload the entire table to a flat file, concatenate the new file onto it, run it through a unique sort, and then direct-path load it.
Has anyone had a similar experience? Any cool solutions available that are quick?
thanks,
jeff
It would be faster to load it into an Oracle table, call it a temporary table, and then make a final move into the final table.
This way you could direct load into an Oracle table, then you could:
INSERT /*+ APPEND */ INTO Final_Table
SELECT DISTINCT *
FROM Temp_Table
ORDER BY ID;
This would do a 'direct load' type move from your temp table to the final table, automatically merging the duplicate records.
So
1) Direct Load from SQL*Loader into temp table.
2) Place index (non-unique) on temp table column ID.
3) Direct load INSERT into the final table.
Step 2 may make this process faster or slower, only testing will tell.
Good Luck,
Eric Kamradt -
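The unload/merge/unique-sort alternative mentioned in the question above can be sketched in shell; the file contents here are made-up sample rows, and the final sqlldr call is echoed rather than executed:

```shell
#!/bin/sh
# 1. Pretend table_dump.dat is the unloaded table and new_records.dat
#    is the incoming file (rows are illustrative).
printf '2,beta\n1,alpha\n'  > table_dump.dat
printf '1,alpha\n3,gamma\n' > new_records.dat

# 2. Concatenate and drop exact duplicate records with a unique sort:
cat table_dump.dat new_records.dat | sort -u > merged.dat
cat merged.dat   # prints 1,alpha / 2,beta / 3,gamma

# 3. merged.dat can then be direct-path loaded:
echo sqlldr userid=scott/tiger control=final.ctl data=merged.dat direct=true
```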
Hello,
I have got some problems with SQL*Loader syntax.
I'm trying to import some data into my table, and I want the sysdate to be inserted into my table automatically.
This is my char.ctl file :
LOAD DATA
INFILE *
INTO TABLE CHANGE_REQUEST
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(CHAR_ID,DATE_CREATION SYSDATE,STATUS,ABSTRACT,USER_ID_FK,SYSTEM_ID_FK,PRIORITY)
BEGINDATA
1,PENDING,Need to have the firsname when listing users,1,Medium
But I still get the error : ORA-00947: not enough values
I have try different line like :
1, ,PENDING,Need to have the firsname when listing users,1,Medium
or
1,SYSDATE,PENDING,Need to have the firsname when listing users,1,Medium
But nothing is working.
Where is the mistake?
Regards.
Another way, which does not require an ALTER TABLE, is something like this:
LOAD DATA
INFILE *
INTO TABLE CHANGE_REQUEST
FIELDS TERMINATED BY ','
TRAILING NULLCOLS
(CHAR_ID,
STATUS,
ABSTRACT,
USER_ID_FK, SYSTEM_ID_FK, PRIORITY,
DATE_CREATION "sysdate")
BEGINDATA
1,PENDING,Print issue,1,1,High
I put DATE_CREATION in the last position only as an example.
Hope this can help someone.
Bye bye.
Antonio