Log file in SQL*Loader
Hi,
SQL*Loader always creates the log file with the same name as the .ctl file,
but I want the log file to be created only for an unsuccessful run,
not for a successful one.
Is this possible?
Regards,
SS
I don't think it is possible to suppress log file generation on a successful run. The log file contains a detailed summary of the load and helps you identify successful/failed loads.
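It can, however, be approximated from the calling script (a sketch, assuming a Unix shell; credentials and paths are placeholders): SQL*Loader exits with status 0 (EX_SUCC) on a fully successful load, so a wrapper can delete the log only in that case and keep it for any warning or failure.

```shell
#!/bin/sh
# Sketch: keep the SQL*Loader log only when the run was NOT fully successful.
# Credentials and paths are placeholders.
run_load() {
    ctl="$1"
    log="${ctl%.ctl}.log"
    sqlldr userid=scott/tiger control="$ctl" log="$log"
    status=$?
    if [ "$status" -eq 0 ]; then
        rm -f "$log"          # clean run (EX_SUCC): discard the log
    fi                        # any other status: log is kept for diagnosis
    return "$status"
}
```

Non-zero statuses (1 = EX_FAIL, 2 = EX_WARN, 3 = EX_FTL) leave the log in place for diagnosis.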
Similar Messages
-
Unknown issue while loading .dbf file by sql loader
Hi guys,
I am having an unknown issue while loading a .dbf file with SQL*Loader.
I need to load .dbf data into an Oracle table. For this I converted the .dbf file by just changing its extension to .csv. File structure after changing .dbf to .csv --
C_N_NUMBER,COMP_CODE,CPT_CODE,C_N_AMT,CM_NUMBER
1810/4,LKM,30,45,683196
1810/5,LKM,30,45,683197
1810/6,LKM,30,45,683198
1810/7,LKM,30,135,683200
1810/8,LKM,30,90,683201
1810/9,LKM,1,45,683246
1810/9,LKM,2,90,683246
1810/10,LKF,1,90,683286
2810/13,LKJ,1,50.5,680313
2810/14,LKJ,1,50,680316
1910/1,LKQ,1,90,680344
3910/2,LKF,1,238.12,680368
3910/3,LKF,1,45,680382
3910/4,LKF,1,45,680395
7910/5,LKS,1,45,680397
7910/6,LKS,1,90,680400
7910/7,LKS,1,45,680401
7910/8,LKS,1,238.12,680414
7910/9,LKS,1,193.12,680415
7910/10,LKS,1,45,680490
Then I load it with SQL*Loader, but I always get the error below:
Record 1: Rejected - Error on table C_N_DETL_TAB, column CPT_CODE.
ORA-01438: value larger than specified precision allowed for this column
Record 2: Rejected - Error on table C_N_DETL_TAB, column CPT_CODE.
ORA-01438: value larger than specified precision allowed for this column
table structure-
create table C_N_DETL_tab (
"C_N_NUMBER" VARCHAR2(13),
"COMP_CODE" VARCHAR2(3),
"CPT_CODE" NUMBER(4),
"C_N_AMT" NUMBER(20,18),
"CM_NUMBER" NUMBER(7)
);
control file-
options(skip=1)
load data
infile '/softdump/pc/C_N_DETL.csv'
badfile '/softdump/pc/C_N_DETL.bad'
discardfile '/softdump/pc/C_N_DETL.dsc'
into table C_N_DETL_tab
truncate
FIELDS TERMINATED BY ","
OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(C_N_NUMBER CHAR,
COMP_CODE CHAR,
CPT_CODE INTEGER,
C_N_AMT INTEGER,
CM_NUMBER INTEGER)
But when I increase the size of all table columns up to their maximum values, the data is loaded; yet when I check the maximum column length after the load, it is much smaller.
changed table structure-
create table C_N_DETL_tab (
"C_N_NUMBER" VARCHAR2(130),
"COMP_CODE" VARCHAR2(30),
"CPT_CODE" NUMBER(32), ---- max value of number
"C_N_AMT" NUMBER(32,18), ---- max value of number
"CM_NUMBER" NUMBER(32) ---- max value of number
);
Now I am running:
sqlldr express/express control=C_N_DETL.ctl log=C_N_DETL.log
o/p-
Table C_N_DETL_TAB, loaded from every logical record.
Insert option in effect for this table: TRUNCATE
TRAILING NULLCOLS option in effect
Column Name                  Position   Len  Term Encl Datatype
---------------------------- ---------- ---- ---- ---- ---------
C_N_NUMBER                   FIRST      *    ,    O(") CHARACTER
COMP_CODE                    NEXT       *    ,    O(") CHARACTER
CPT_CODE                     NEXT       4              INTEGER
C_N_AMT                      NEXT       4              INTEGER
CM_NUMBER                    NEXT       4              INTEGER
Table C_N_DETL_TAB:
20 Rows successfully loaded.
0 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
select max(length(CPT_CODE)) from C_N_DETL_tab; ---> 9
Can you tell me why I need to increase the table column sizes up to the maximum value, although the length of the data is so much less?
Kindly check it. Thanks in advance.
rgds,
pc
No database version given, of course. Unimportant.
If I recall correctly, it is 'integer external', and you would do best to double-quote the alphanumerics (which you didn't).
Try changing INTEGER to INTEGER EXTERNAL in the .ctl file.
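To make the reason explicit: in a SQL*Loader control file, INTEGER denotes a binary (native 4-byte) field, so character data such as "30" is read as raw bytes and yields out-of-range values; INTEGER EXTERNAL reads the characters as a number. A sketch of the corrected field list (using DECIMAL EXTERNAL for the amount column, to preserve its decimals, is an assumption):

```
(C_N_NUMBER CHAR,
 COMP_CODE CHAR,
 CPT_CODE INTEGER EXTERNAL,
 C_N_AMT DECIMAL EXTERNAL,
 CM_NUMBER INTEGER EXTERNAL)
```

This also explains the length of 9 observed after widening the columns: the binary interpretation produced large garbage numbers, not the original values.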
Sybrand Bakker
Senior Oracle DBA -
Upload Multiple files in SQL * Loader in one session
Dear all,
I want to upload multiple files using SQL*Loader in one go. In Unix it's very easy, as I have quite a bit of experience there. Could anybody tell me the way in Windows to upload multiple files in one go while using SQL*Loader? I want to run it using a DOS batch file.
Thanks
Ghulam
In Unix it's very easy; we just pass a $ positional parameter:
sqlldr userid=?????/?????
control="/u12/cad_delta.ctl",log="/u12/full_extract/$1",data="/u12/$1",bad="/u12//full_extract/$1",errors=500,silent=feedback
Any suggestions for Windows? It should read all .DAT files in the given Windows folder.
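A sketch of a batch-file equivalent (folder, control file, and credentials are placeholders): a FOR loop hands each .DAT file in the folder to SQL*Loader through the data= parameter, with per-file log and bad files.

```
@echo off
rem Load every .DAT file in C:\loads with the same control file.
for %%f in (C:\loads\*.dat) do (
  sqlldr userid=scott/tiger control=C:\loads\cad_delta.ctl data="%%f" log="%%f.log" bad="%%f.bad" errors=500 silent=feedback
)
```

Use a single % (i.e. %f) instead of %%f if you type the loop directly at the command prompt rather than in a .bat file.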
Thanks -
Problem with loading file with SQL loader
I am having a problem loading a file with SQL*Loader. The load gets
terminated after around 2,000 rows, whereas there are around 2,700,000 rows in the file.
The file is like
919879086475,11/17/2004,11/20/2004
919879698625,11/17/2004,11/17/2004
919879698628,11/17/2004,11/17/2004
The control file I am using is:
load data
infile 'c:\ran\temp\pps_fc.txt'
into table bm_05oct06
fields terminated by ","
(mobile_no, fcal, frdate )
I hope my question is clear. Please help in resolving the doubt.
Regards.
So which thread is telling the truth?
Doubt with SQL loader file with spaces
Are the fields delimited with spaces or with commas?
Perhaps they are a mixture of delimiters and that is where the error is coming in? -
Not loading from flat file using SQL*Loader
Hi,
I am trying to load from an Excel file.
First I converted the Excel file into a CSV file and saved it as a .dat file.
In the Excel file one column is salary, and the data looks like $100,000.
While converting .xls to .csv, the salary is changed to "$100,000 " (with quotes and a space after the amount);
a space is put after the last digit.
In the SQL*Loader control file I had given:
salary "to_number('L999,999')"
My problem is the space after the salary in the .dat file ---> "$100,000 ".
What changes do I have to make to the to_number function in the control file?
Please guide me.
Thanks & Regards
Salih KM
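One possible fix (a sketch; the '$999,999' mask, the :salary bind reference, and the trim call are assumptions, not the poster's final solution): trim the stray blank before converting, so the mask only has to match the currency symbol, digits, and group separators.

```
salary "to_number(trim(:salary), '$999,999')"
```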
Message was edited by:
kmsalih
Thanks a lot, Jens Petersen.
It's is loading ..........
MI means miniute.
am i correct.
but i didn't get the logic behind that.
can u please explain that.
Thanks & Regards
Salih KM -
Issue while loading a csv file using sql*loader...
Hi,
I am loading a csv file using sql*loader.
On the number columns that have data populated in them (decimal numbers/integers), the row errors out with the error:
ORA-01722: invalid number
I tried checking the value picked from the Excel sheet,
and found chr(13), chr(32), chr(10) characters in the value.
For example, select length('0.21') from dual (with the value pasted from the file) gives 7.
When I checked each character with
select ascii(substr('0.21',5,1)) from dual, it returned 9 (a tab), etc.
I tried the following command....
"to_number(trim(replace(replace(replace(replace(:std_cost_price_scala,chr(9),''),chr(32),''),chr(13),''),chr(10),'')))",
to remove all the non-number special characters. But still facing the error.
Please let me know, any solution for this error.
Thanks in advance.
Kiran
Control file:
OPTIONS (ROWS=1, ERRORS=10000)
LOAD DATA
CHARACTERSET WE8ISO8859P1
INFILE '$Xx_TOP/bin/ITEMS.csv'
APPEND INTO TABLE XXINF.ITEMS_STAGE
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' TRAILING NULLCOLS
ItemNum "trim(replace(replace(:ItemNum,chr(9),''),chr(13),''))",
cross_ref_old_item_num "trim(replace(replace(:cross_ref_old_item_num,chr(9),''),chr(13),''))",
Mas_description "trim(replace(replace(:Mas_description,chr(9),''),chr(13),''))",
Mas_long_description "trim(replace(replace(:Mas_long_description,chr(9),''),chr(13),''))",
Org_description "trim(replace(replace(:Org_description,chr(9),''),chr(13),''))",
Org_long_description "trim(replace(replace(:Org_long_description,chr(9),''),chr(13),''))",
user_item_type "trim(replace(replace(:user_item_type,chr(9),''),chr(13),''))",
organization_code "trim(replace(replace(:organization_code,chr(9),''),chr(13),''))",
primary_uom_code "trim(replace(replace(:primary_uom_code,chr(9),''),chr(13),''))",
inv_default_item_status "trim(replace(replace(:inv_default_item_status,chr(9),''),chr(13),''))",
inventory_item_flag "trim(replace(replace(:inventory_item_flag,chr(9),''),chr(13),''))",
stock_enabled_flag "trim(replace(replace(:stock_enabled_flag,chr(9),''),chr(13),''))",
mtl_transactions_enabled_flag "trim(replace(replace(:mtl_transactions_enabled_flag,chr(9),''),chr(13),''))",
revision_qty_control_code "trim(replace(replace(:revision_qty_control_code,chr(9),''),chr(13),''))",
reservable_type "trim(replace(replace(:reservable_type,chr(9),''),chr(13),''))",
check_shortages_flag "trim(replace(replace(:check_shortages_flag,chr(9),''),chr(13),''))",
shelf_life_code "trim(replace(replace(replace(replace(:shelf_life_code,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
shelf_life_days "trim(replace(replace(replace(replace(:shelf_life_days,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
lot_control_code "trim(replace(replace(:lot_control_code,chr(9),''),chr(13),''))",
auto_lot_alpha_prefix "trim(replace(replace(:auto_lot_alpha_prefix,chr(9),''),chr(13),''))",
start_auto_lot_number "trim(replace(replace(:start_auto_lot_number,chr(9),''),chr(13),''))",
negative_measurement_error "trim(replace(replace(replace(replace(:negative_measurement_error,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
positive_measurement_error "trim(replace(replace(replace(replace(:positive_measurement_error,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
serial_number_control_code "trim(replace(replace(:serial_number_control_code,chr(9),''),chr(13),''))",
auto_serial_alpha_prefix "trim(replace(replace(:auto_serial_alpha_prefix,chr(9),''),chr(13),''))",
start_auto_serial_number "trim(replace(replace(:start_auto_serial_number,chr(9),''),chr(13),''))",
location_control_code "trim(replace(replace(:location_control_code,chr(9),''),chr(13),''))",
restrict_subinventories_code "trim(replace(replace(:restrict_subinventories_code,chr(9),''),chr(13),''))",
restrict_locators_code "trim(replace(replace(:restrict_locators_code,chr(9),''),chr(13),''))",
bom_enabled_flag "trim(replace(replace(:bom_enabled_flag,chr(9),''),chr(13),''))",
costing_enabled_flag "trim(replace(replace(:costing_enabled_flag,chr(9),''),chr(13),''))",
inventory_asset_flag "trim(replace(replace(:inventory_asset_flag,chr(9),''),chr(13),''))",
default_include_in_rollup_flag "trim(replace(replace(:default_include_in_rollup_flag,chr(9),''),chr(13),''))",
cost_of_goods_sold_account "trim(replace(replace(:cost_of_goods_sold_account,chr(9),''),chr(13),''))",
std_lot_size "trim(replace(replace(replace(replace(:std_lot_size,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
sales_account "trim(replace(replace(:sales_account,chr(9),''),chr(13),''))",
purchasing_item_flag "trim(replace(replace(:purchasing_item_flag,chr(9),''),chr(13),''))",
purchasing_enabled_flag "trim(replace(replace(:purchasing_enabled_flag,chr(9),''),chr(13),''))",
must_use_approved_vendor_flag "trim(replace(replace(:must_use_approved_vendor_flag,chr(9),''),chr(13),''))",
allow_item_desc_update_flag "trim(replace(replace(:allow_item_desc_update_flag,chr(9),''),chr(13),''))",
rfq_required_flag "trim(replace(replace(:rfq_required_flag,chr(9),''),chr(13),''))",
buyer_name "trim(replace(replace(:buyer_name,chr(9),''),chr(13),''))",
list_price_per_unit "trim(replace(replace(replace(replace(:list_price_per_unit,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
taxable_flag "trim(replace(replace(:taxable_flag,chr(9),''),chr(13),''))",
purchasing_tax_code "trim(replace(replace(:purchasing_tax_code,chr(9),''),chr(13),''))",
receipt_required_flag "trim(replace(replace(:receipt_required_flag,chr(9),''),chr(13),''))",
inspection_required_flag "trim(replace(replace(:inspection_required_flag,chr(9),''),chr(13),''))",
price_tolerance_percent "trim(replace(replace(replace(replace(:price_tolerance_percent,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
expense_account "trim(replace(replace(:expense_account,chr(9),''),chr(13),''))",
allow_substitute_receipts_flag "trim(replace(replace(:allow_substitute_receipts_flag,chr(9),''),chr(13),''))",
allow_unordered_receipts_flag "trim(replace(replace(:allow_unordered_receipts_flag,chr(9),''),chr(13),''))",
receiving_routing_code "trim(replace(replace(:receiving_routing_code,chr(9),''),chr(13),''))",
inventory_planning_code "trim(replace(replace(:inventory_planning_code,chr(9),''),chr(13),''))",
min_minmax_quantity "trim(replace(replace(replace(replace(:min_minmax_quantity,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
max_minmax_quantity "trim(replace(replace(replace(replace(:max_minmax_quantity,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
planning_make_buy_code "trim(replace(replace(:planning_make_buy_code,chr(9),''),chr(13),''))",
source_type "trim(replace(replace(:source_type,chr(9),''),chr(13),''))",
mrp_safety_stock_code "trim(replace(replace(:mrp_safety_stock_code,chr(9),''),chr(13),''))",
material_cost "trim(replace(replace(:material_cost,chr(9),''),chr(13),''))",
mrp_planning_code "trim(replace(replace(:mrp_planning_code,chr(9),''),chr(13),''))",
customer_order_enabled_flag "trim(replace(replace(:customer_order_enabled_flag,chr(9),''),chr(13),''))",
customer_order_flag "trim(replace(replace(:customer_order_flag,chr(9),''),chr(13),''))",
shippable_item_flag "trim(replace(replace(:shippable_item_flag,chr(9),''),chr(13),''))",
internal_order_flag "trim(replace(replace(:internal_order_flag,chr(9),''),chr(13),''))",
internal_order_enabled_flag "trim(replace(replace(:internal_order_enabled_flag,chr(9),''),chr(13),''))",
invoice_enabled_flag "trim(replace(replace(:invoice_enabled_flag,chr(9),''),chr(13),''))",
invoiceable_item_flag "trim(replace(replace(:invoiceable_item_flag,chr(9),''),chr(13),''))",
cross_ref_ean_code "trim(replace(replace(:cross_ref_ean_code,chr(9),''),chr(13),''))",
category_set_intrastat "trim(replace(replace(:category_set_intrastat,chr(9),''),chr(13),''))",
CustomCode "trim(replace(replace(:CustomCode,chr(9),''),chr(13),''))",
net_weight "trim(replace(replace(replace(replace(:net_weight,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
production_speed "trim(replace(replace(:production_speed,chr(9),''),chr(13),''))",
LABEL "trim(replace(replace(:LABEL,chr(9),''),chr(13),''))",
comment1_org_level "trim(replace(replace(:comment1_org_level,chr(9),''),chr(13),''))",
comment2_org_level "trim(replace(replace(:comment2_org_level,chr(9),''),chr(13),''))",
std_cost_price_scala "to_number(trim(replace(replace(replace(replace(:std_cost_price_scala,chr(9),''),chr(32),''),chr(13),''),chr(10),'')))",
supply_type "trim(replace(replace(:supply_type,chr(9),''),chr(13),''))",
subinventory_code "trim(replace(replace(:subinventory_code,chr(9),''),chr(13),''))",
preprocessing_lead_time "trim(replace(replace(replace(replace(:preprocessing_lead_time,chr(9),''),chr(32),''),chr(13),''),chr(10),''))",
processing_lead_time "trim(replace(replace(replace(replace(:processing_lead_time,chr(9),''),chr(32),''),chr(13),''),chr(10),''))",
wip_supply_locator "trim(replace(replace(:wip_supply_locator,chr(9),''),chr(13),''))"
Sample data from csv file.
"9901-0001-35","390000","JMKL16 Pipe bend 16 mm","","JMKL16 Putkikaari 16 mm","","AI","FJE","Ea","","","","","","","","","","","","","","","","","","","","","","","","","21-21100-22200-00000-00000-00-00000-00000","0","21-11100-22110-00000-00000-00-00000-00000","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","0.1","Pull","AFTER PROD","","","Locator for Production"
The load errors out especially on two columns:
1) std_cost_price_scala
2) list_price_per_unit
Both are number columns.
And when data is provided in them, it errors out; but if they hold null values, the records go through fine.
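A more compact alternative worth trying (a sketch; regexp_replace requires Oracle 10g or later, and the character class is an assumption): strip everything that is not a digit, sign, or decimal point in one pass, which also catches characters the chr() chain misses (for example a non-breaking space, chr(160), which is common in data pasted from Excel).

```
std_cost_price_scala "to_number(regexp_replace(:std_cost_price_scala, '[^0-9.+-]', ''))"
```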
Message was edited by:
KK28 -
Problem loading XML-file using SQL*Loader
Hello,
I'm using 9.2 and trying to load an XML file using SQL*Loader.
Loader control-file:
LOAD DATA
INFILE *
INTO TABLE BATCH_TABLE TRUNCATE
FIELDS TERMINATED BY ','
(FILENAME char(255),
XML_DATA LOBFILE (FILENAME) TERMINATED BY EOF)
BEGINDATA
data.xml
The BATCH_TABLE is created as:
CREATE TABLE BATCH_TABLE (
FILENAME VARCHAR2 (50),
XML_DATA SYS.XMLTYPE ) ;
And the data.xml contains the following lines:
<?xml version="2.0" encoding="UTF-8"?>
<!DOCTYPE databatch SYSTEM "databatch.dtd">
<batch>
<record>
<data>
<type>10</type>
</data>
</record>
<record>
<data>
<type>20</type>
</data>
</record>
</batch>
However, the sqlldr gives me an error:
Record 1: Rejected - Error on table BATCH_TABLE, column XML_DATA.
ORA-21700: object does not exist or is marked for delete
ORA-06512: at "SYS.XMLTYPE", line 0
ORA-06512: at line 1
If I remove the first two lines
"<?xml version="2.0" encoding="UTF-8"?>"
and
"<!DOCTYPE databatch SYSTEM "databatch.dtd">"
from data.xml, everything works, and the contents of data.xml are loaded into the table.
Any idea what I'm missing here? Likely the problem is with special characters.
Thanks in advance.
I'm able to load your file just by removing the second line <!DOCTYPE databatch SYSTEM "databatch.dtd">. I don't have your DTD file, so I skipped that line. Can you check if it's a problem with your DTD?
-
Can I have two Data Files in One control file of sql*loader tool
hi,
Can someone help me out? Is it possible to have two data files in one control file of SQL*Loader?
And is it possible to load 10,000 records before lunch, 10,000 records before tea, and 10,000 records in the evening session, by taking breaks after every 10,000 records?
Thanks
Ram
Yes. You can specify two data files in one control file and load them using SQL*Loader.
I give you the sample control file.
Load DATA
INFILE 'TEST1.CSV'
INFILE 'TEST2.CSV'
TRUNCATE
INTO TABLE TEST_P
FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(COL_1,
COL_2,
COL_n)
Hope It will help you.
-Karthik -
Need suggestions on loading 5000+ files using sql loader
Hi Guys,
I'm writing a shell script to load more than 5000 files using sql loader.
My intention is to load the files in parallel. When I checked the maximum number of sessions in v$parameter, it is around 700.
Before starting the data load, programmatically I am getting the number of current sessions and maximum number of sessions and keeping free 200 sessions without using them (max. no. of sessions minus 200 ) and utilizing the remaining ~300 sessions to load the files in parallel.
Also I am using a "wait" option to make the shell wait until the 300 concurrent SQL*Loader processes complete before moving further.
Is there any way to make it more efficient? Also, is it possible to reduce the wait time without hard-coding the seconds? (For example: if any of those 300 sessions becomes free, assign the next file to the job queue, and so on.)
Please share your thoughts on this.
Thanks.
Manohar wrote:
I'm writing a shell script to load more than 5000 files using sql loader.
My intention is to load the files in parallel. When I checked the maximum number of sessions in v$parameter, it is around 700.
Concurrent load, you mean? Parallel processing implies taking a workload, breaking it up into smaller workloads, and doing those in parallel. This is what the Parallel Query feature does in Oracle.
SQL*Loader does not do that for you. It uses a single session to load a single file. To make it run in parallel, requires manually starting multiple loader sessions and perform concurrent loads.
Have a look at Parallel Data Loading Models in the Oracle® Database Utilities guide. It goes into detail on how to perform concurrent loads. But you need to parallelise that workload yourself (as explained in the manual).
Before starting the data load, programmatically I am getting the number of current sessions and maximum number of sessions and keeping free 200 sessions without using them (max. no. of sessions minus 200 ) and utilizing the remaining ~300 sessions to load the files in parallel.
Also I am using a "wait" option to make the shell to wait until the 300 concurrent sql loader process to complete and moving further.
Is there any way to make it more efficient? Also is it possible to reduce the wait time without hard coding the seconds (For Example: If any of those 300 sessions becomes free, assign the next file to the job queue and so on..)
Consider doing it the way that Parallel Query does (as I've mentioned above). Take the workload (all files). Break the workload up into smaller sub-workloads (e.g. 50 files to be loaded by a process). Start 100 processes in parallel and provide each one with a sub-workload to do (100 processes each loading 50-odd files).
This is a lot easier to manage than starting, for example, 5,000 load processes and then trying some kind of delay method to ensure that they don't all hit the database at the same time.
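The "assign the next file as soon as a session frees up" behaviour can also come for free from the shell (a sketch; the LOADER command, credentials, and directory are placeholders): xargs -P keeps a fixed number of loader processes busy and starts the next file the moment a slot opens, so no hand-rolled wait logic is needed.

```shell
#!/bin/sh
# Sketch: bounded concurrency via xargs -P instead of manual session counting.
# LOADER (credentials, control file) is a placeholder for the real invocation.
LOADER="${LOADER:-sqlldr userid=scott/tiger control=load.ctl}"
load_all() {
    dir="$1"     # directory containing the .dat files
    jobs="$2"    # number of concurrent loader processes
    ls "$dir"/*.dat 2>/dev/null |
        xargs -P "$jobs" -I{} sh -c "$LOADER data={} log={}.log"
}
```

Note that -P is a GNU/BSD xargs extension, not strict POSIX; check your platform's xargs.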
I'm loading about 100+ files (3+ million rows) every 60 seconds 24x7 using SQL*Loader. Oracle is quite scalable and SQL*Loader quite capable. -
Concatenate positions in control file for sql*loader utility.
Is there a way to concatenate several character positions in a control file for sql*loader???
example... field1 position(1:3) || position(5:7) ????
I would rather not create any unnecessary temp tables for a straight load...
How about:
field1 position(1:7) char "substr(:field1, 1, 3) || substr(:field1, 5)" -
Different log file name in the Control file of SQL Loader
Dear all,
Every day I get 3 log files via FTP from a Solaris server to a Windows 2000 Server machine. On this Windows machine we have an Oracle 9.2 database. These log files are named in the format in<date>.log, i.e. in20070429.log.
I would like to load this log file's data to an Oracle table every day and I would like to use SQL Loader for this job.
The problem is that the log file name is different every day.
How can I give this variable log file name in the Control file, which is used for the SQL Loader?
file.ctl
LOAD DATA
INFILE 'D:\gbal\in<date>.log'
APPEND INTO TABLE CHAT_SL
FIELDS TERMINATED BY WHITESPACE
TRAILING NULLCOLS
(SL1 DATE "Mon DD, YYYY HH:MI:SS FF3AM",
SL2 char,
SL3 DATE "Mon DD, YYYY HH:MI:SS FF3AM",
SL4 char,
SL5 char,
SL6 char,
SL7 char,
SL8 char,
SL9 char,
SL10 char,
SL11 char,
SL12 char,
SL13 char,
SL14 char,
SL15 char)
Do you have any better idea about this issue?
I thought of renaming the log file to a fixed name, such as in.log, but how can I distinguish the desired log file from the other two?
Thank you very much in advance.
Giorgos Baliotis
I don't have a direct solution for your problem.
However if you invoke the SQL loader from an Oracle stored procedure, it is possible to dynamically set control\log file.
# Grant privileges to the user to execute command-prompt statements
BEGIN
dbms_java.grant_permission('bc4186ol','java.io.FilePermission','C:\windows\system32\cmd.exe','execute');
END;
* Procedure to execute operating system commands using PL/SQL (Oracle script making use of Java packages)
CREATE OR REPLACE AND COMPILE JAVA SOURCE NAMED "Host" AS
import java.io.*;
public class Host {
    public static void executeCommand(String command) {
        try {
            String[] finalCommand = new String[4];
            finalCommand[0] = "C:\\windows\\system32\\cmd.exe";
            finalCommand[1] = "/y";
            finalCommand[2] = "/c";
            finalCommand[3] = command;
            final Process pr = Runtime.getRuntime().exec(finalCommand);
            // Drain stdout in a separate thread so the process cannot block.
            new Thread(new Runnable() {
                public void run() {
                    try {
                        BufferedReader br_in = new BufferedReader(new InputStreamReader(pr.getInputStream()));
                        String buff = null;
                        while ((buff = br_in.readLine()) != null) {
                            System.out.println("Process out :" + buff);
                            try { Thread.sleep(100); } catch (Exception e) {}
                        }
                    } catch (IOException ioe) {
                        System.out.println("Exception caught printing process output.");
                        ioe.printStackTrace();
                    }
                }
            }).start();
            // Drain stderr likewise.
            new Thread(new Runnable() {
                public void run() {
                    try {
                        BufferedReader br_err = new BufferedReader(new InputStreamReader(pr.getErrorStream()));
                        String buff = null;
                        while ((buff = br_err.readLine()) != null) {
                            System.out.println("Process err :" + buff);
                            try { Thread.sleep(100); } catch (Exception e) {}
                        }
                    } catch (IOException ioe) {
                        System.out.println("Exception caught printing process error.");
                        ioe.printStackTrace();
                    }
                }
            }).start();
        } catch (Exception ex) {
            System.out.println(ex.getLocalizedMessage());
        }
    }
    public static boolean isWindows() {
        return System.getProperty("os.name").toLowerCase().indexOf("windows") != -1;
    }
}
* Oracle wrapper to call the above procedure
CREATE OR REPLACE PROCEDURE Host_Command (p_command IN VARCHAR2)
AS LANGUAGE JAVA
NAME 'Host.executeCommand (java.lang.String)';
* Now invoke the procedure with an operating system command (execute SQL*Loader)
* The execution of the script would ensure the Prod mapping data file is loaded into the PROD_5005_710_MAP table
* Change the control\log\discard\bad files as appropriate
BEGIN
Host_Command (p_command => 'sqlldr system/tiburon@orcl control=C:\anupama\emp_join'||1||'.ctl log=C:\anupama\ond_lists.log');
END;
Does that help you?
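A much lighter alternative, if the loader is driven from a script rather than from inside the database (a sketch, assuming a Unix-style shell; the same logic works in a batch file, and all paths are placeholders): compute the day's file name in the script and pass it with the data= command-line parameter, which names the input file without touching the control file.

```shell
#!/bin/sh
# Sketch: derive today's file name (in<date>.log) and hand it to SQL*Loader
# on the command line; the control file stays generic.
today=$(date +%Y%m%d)                      # e.g. 20070429
datafile="D:/gbal/in${today}.log"
# sqlldr userid=scott/tiger control=D:/gbal/file.ctl \
#        data="$datafile" log="D:/gbal/load_${today}.log"
echo "$datafile"
```

This also sidesteps the rename problem from the question, since the date in the name selects exactly one of the three files.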
Regards,
Bhagat -
Loading multiple files with SQL Loader
Hello.
I would appreciate your recommendation on the way to load multiple files (into multiple tables) using SQL*Loader with only one control file.
file1 to load to Table1, file2 to load to Table2 etc.
How the Control file should look like?
I was looking on the Web, but didn't find exactly what I need.
Thanks!
Ctl file: myctl.ctl
---------- Start ---------
LOAD DATA
INFILE 'F:\sqlldr\abc1.dat'
INFILE 'F:\sqlldr\abc2.dat'
INTO TABLE hdfc1
(TRANS_DATE CHAR,
NARRATION CHAR,
VALUE_DATE CHAR,
DEBIT_AMOUNT INTEGER,
CREDIT_AMOUNT INTEGER,
CHQ_REF_NUMBER CHAR,
CLOSING_BALANCE CHAR)
INTO TABLE hdfc2
(TRANS_DATE CHAR,
NARRATION CHAR,
VALUE_DATE CHAR,
DEBIT_AMOUNT INTEGER,
CREDIT_AMOUNT INTEGER,
CHQ_REF_NUMBER CHAR,
CLOSING_BALANCE CHAR)
-----------End-----------
Sqlldr Command
sqlldr scott/tiger@dbtalk control=F:\sqlldr\myctl.ctl log=F:\sqlldr\ddl_file1.txt
Regards,
Abu -
Problem when loading xml file using sql loader
I am trying to load data into table test_xml (xmldata XMLType)
i have an xml file and i want whole file to be loaded into a single column
When I use the following control file, executed from the command prompt as follows:
sqlldr $1@$TWO_TASK control=$XXTOP/bin/LOAD_XML.ctl direct=true
LOAD DATA
INFILE *
TRUNCATE INTO TABLE test_xml
xmltype(xmldata)
FIELDS
(ext_fname filler char(100),
xmldata LOBFILE (ext_fname) TERMINATED BY EOF)
BEGINDATA
/u01/APPL/apps/apps_st/appl/xxtop/12.0.0/bin/file.xml
The file is loaded into the table perfectly.
Unfortunately I can't hardcode the file name, as it will change dynamically.
So I removed the block
BEGINDATA
/u01/APPL/apps/apps_st/appl/xxtop/12.0.0/bin/file.xml
from the control file and tried to execute it by giving the file path from the command line as follows:
sqlldr $1@$TWO_TASK control=$XXTOP/bin/LOAD_XML.ctl data=/u01/APPL/apps/apps_st/appl/xxtop/12.0.0/bin/file.xml direct=true
but strangely it's trying to load each line of the XML file into the table instead of the whole file.
Please find the log of the program with error
Loading of XML through SQL*Loader Starts
SQL*Loader-502: unable to open data file '<?xml version="1.0"?>' for field XMLDATA table TEST_XML
SQL*Loader-553: file not found
SQL*Loader-509: System error: No such file or directory
SQL*Loader-502: unable to open data file '<Root>' for field XMLDATA table TEST_XML
SQL*Loader-553: file not found
SQL*Loader-509: System error: No such file or directory
SQL*Loader-502: unable to open data file '<ScriptFileType>' for field XMLDATA table TEST_XML
SQL*Loader-553: file not found
SQL*Loader-509: System error: No such file or directory
SQL*Loader-502: unable to open data file '<Type>Forms</Type>' for field XMLDATA table TEST_XML
SQL*Loader-553: file not found
SQL*Loader-509: System error: No such file or directory
SQL*Loader-502: unable to open data file '</ScriptFileType>' for field XMLDATA table TEST_XML
SQL*Loader-553: file not found
SQL*Loader-509: System error: No such file or directory
SQL*Loader-502: unable to open data file '<ScriptFileType>' for field XMLDATA table TEST_XML
SQL*Loader-553: file not found
SQL*Loader-509: System error: No such file or directory
SQL*Loader-502: unable to open data file '<Type>PLL</Type>' for field XMLDATA table TEST_XML
SQL*Loader-553: file not found
SQL*Loader-509: System error: No such file or directory
SQL*Loader-502: unable to open data file '</ScriptFileType>' for field XMLDATA table TEST_XML
SQL*Loader-553: file not found
SQL*Loader-509: System error: No such file or directory
SQL*Loader-502: unable to open data file '<ScriptFileType>' for field XMLDATA table TEST_XML
Please help me: how can I load the full XML into a single column using the command line, without hardcoding the file name in the control file?
Edited by: 907010 on Jan 10, 2012 2:24 AM
but strangely it's trying to load each line of xml file into table instead of whole file
Nothing strange: the data parameter specifies the file containing the data to load.
If you use the XML filename here, the control file will try to interpret each line of the XML as being separate file paths.
The traditional approach to this is to have the filename stored in another file, say filelist.txt and use, for example :
echo "/u01/APPL/apps/apps_st/appl/xxtop/12.0.0/bin/file.xml" > filelist.txt
sqlldr $1@$TWO_TASK control=$XXTOP/bin/LOAD_XML.ctl data=filelist.txt direct=true; -
OWB 10gR2 : How to configure ctl and log locations for Sql*Loader mappings?
Hi all,
I'm using OWB 10gR2 to load data in tables with Sql*Loader mappings.
In my project I have a datafile module and an Oracle module.
When creating a SQL*Loader mapping in the Oracle module, there are two properties of this mapping that I want to modify. The first is Control File Location and the second is Log File Location. The values of those properties are equal to the data file module location. When trying to change those values I can only choose "Use module configuration location".
Does somebody know how to configure those properties with locations different from the one of the flat file module?
What I want to do is to store the data file in one directory, and control file and log file in other directories.
Thank you for your help.
Bernard
Hi,
You're right; my problem is that the dropdown only shows the location associated with the flat file location, even though I have other file locations created in the design repository.
The good news is that I have found the solution to the problem:
1) Edit the file module and, in the "Data locations" tab, add the locations you want to use for the control file and log file.
2) Open the configuration window of the mapping; the dropdowns for the properties Control File Location and Log File Location then show the new locations.
I have tested my mapping after changing those properties and it's working.
Bernard -
Controller file for SQL Loader
Guys,
My data comes in a flat file from a third party in the below format:
1,Stewart,"Current Address: Street-A, Flat-507, London, UK
Permanent Address: Street-B, Flat-201, London, UK"
2,Patrick,"Current Address: Street-A, Flat-507, Bangalore, India
Permanent Address: Street-B, Flat-201, Delhi, India"
I want to load the data into a custom table using a SQL*Loader program with a control file. The program needs to populate the table in the below manner:
SEQ EMPLOYEE ADDRESS INFORMATION
1 Stewart Current Address: Street-A, Flat-507, London, UK
Permanent Address: Street-B, Flat-201, London, UK
2 Patrick Current Address: Street-A, Flat-507, Bangalore, India
Permanent Address: Street-B, Flat-201, Delhi, India
Could you guys please help me in writing the control file to read the data from the file and populate the table as described above? Thanks.
-Sunil
It is a little hard to tell exactly what your data file looks like and what you want your results to be, due to this forum mangling things a bit and adding extra lines and such. In general you can either use CONTINUEIF or CONCATENATE. I have demonstrated both below. When using CONTINUEIF, it assumes each additional line begins with the word "Permanent". When using CONCATENATE, it assumes each two lines constitute one record. I have also used REPLACE to add a line feed in front of "Permanent", after it is removed during continuation or concatenation.
SCOTT@orcl12c> host type test.dat
1,Stewart,"Current Address: Street-A, Flat-507, London, UK
Permanent Address: Street-B, Flat-201, London, UK"
2,Patrick,"Current Address: Street-A, Flat-507, Bangalore, India
Permanent Address: Street-B, Flat-201, Delhi, India"
SCOTT@orcl12c> host type test.ctl
load data
infile test.dat
continueif next preserve (1:9) = 'Permanent'
into table test_tab
fields terminated by ','
optionally enclosed by '"'
trailing nullcols
(seq, employee,
address_information "replace (:address_information, 'Permanent', CHR(10) || 'Permanent')")
SCOTT@orcl12c> host type test2.ctl
load data
infile test.dat
concatenate 2
into table test_tab
fields terminated by ','
optionally enclosed by '"'
trailing nullcols
(seq, employee,
address_information "replace (:address_information, 'Permanent', CHR(10) || 'Permanent')")
SCOTT@orcl12c> create table test_tab
2 (seq number,
3 employee varchar2(8),
4 address_information varchar2(200))
5 /
Table created.
SCOTT@orcl12c> host sqlldr scott/tiger control=test.ctl log=test.log
SQL*Loader: Release 12.1.0.1.0 - Production on Mon Dec 16 13:11:40 2013
Copyright (c) 1982, 2013, Oracle and/or its affiliates. All rights reserved.
Path used: Conventional
Commit point reached - logical record count 1
Commit point reached - logical record count 2
Table TEST_TAB:
2 Rows successfully loaded.
Check the log file:
test.log
for more information about the load.
SCOTT@orcl12c> column address_information format a60
SCOTT@orcl12c> select * from test_tab
2 /
SEQ EMPLOYEE ADDRESS_INFORMATION
1 Stewart Current Address: Street-A, Flat-507, London, UK
Permanent Address: Street-B, Flat-201, London, UK
2 Patrick Current Address: Street-A, Flat-507, Bangalore, India
Permanent Address: Street-B, Flat-201, Delhi, India
2 rows selected.
SCOTT@orcl12c> truncate table test_tab
2 /
Table truncated.
SCOTT@orcl12c> host sqlldr scott/tiger control=test2.ctl log=test2.log
SQL*Loader: Release 12.1.0.1.0 - Production on Mon Dec 16 13:11:40 2013
Copyright (c) 1982, 2013, Oracle and/or its affiliates. All rights reserved.
Path used: Conventional
Commit point reached - logical record count 2
Table TEST_TAB:
2 Rows successfully loaded.
Check the log file:
test2.log
for more information about the load.
SCOTT@orcl12c> select * from test_tab
2 /
SEQ EMPLOYEE ADDRESS_INFORMATION
1 Stewart Current Address: Street-A, Flat-507, London, UK
Permanent Address: Street-B, Flat-201, London, UK
2 Patrick Current Address: Street-A, Flat-507, Bangalore, India
Permanent Address: Street-B, Flat-201, Delhi, India
2 rows selected.