SQLLoader Problem
Hello everybody,
I have a problem when loading a file with about 3,000,000 records,
but the problem already occurs after record 38111.
The loader reports in the logfile that in various rows a field is too
long for the definition in the database.
Here is the create script for the table:
create table logdaten (
LOGNAME VARCHAR2(2000),
DATUM DATE,
IPADRESSE VARCHAR2(15),
AUFRUF VARCHAR2(4000)
);
The corresponding text in the datafile for the column aufruf has a maximum of 3000 characters, which is hardcoded when I create this file.
I have the loader script and a datafile with about 50000 datasets for download, so you can have a look at what I'm doing wrong, at http://212.185.114.146/result.tar.gz
I already tried to play a bit with bindsize or the rows parameter, but it hasn't helped.
Perhaps someone can help me.
Greetings Markus
When loading character fields longer than 255 characters, you have to set the length explicitly in the control file:
load data
infile 'myfile.dat'
into table logdaten
fields terminated by ',' optionally enclosed by '"'
(logname char(2000),
datum date,
ipadresse char(15),
aufruf char(4000))
Similar Messages
-
SQLLOADER PROBLEM IN LOADING DATA TO MULTIPLE TABLES
My problem is I have to load data from a flat file, which consists of 64 columns and 11040 records, into 5 different tables. The other thing is I have to check that only UNIQUE records go into the database, and then I have to generate a primary key for each record that enters the database.
So I have written a BEFORE INSERT trigger FOR EACH ROW on all 5 tables to check the uniqueness of the arriving record.
Now my problem is that SQL*Loader is loading, for every table, only that number of records which is the minimum loaded uniquely into any one table, i.e.:
TABLES      RECORDS (ORIGINALLY)
TIME        11
STORES      184
PROMOTION   20
PRODUCT     60
Now it is loading only 11 records for all the tables.
with regards
vijayankar
The easiest thing is to do data manipulation in the database; that's what SQL is good for.
So load your file into tables without any unique constraints. Then apply unique constraints using the EXCEPTIONS INTO... clause. This will populate your exceptions table with the rowid of all the non-unique rows. You can then decide which rows to zap.
If you don't already have an exceptions table you'll need to run utlexcpt.sql.
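A minimal sketch of that approach; the names stage_tab, key_col and stage_tab_uk are placeholders for illustration, not from the original post:

```sql
-- Once: create the default EXCEPTIONS table
-- @?/rdbms/admin/utlexcpt.sql

-- Enabling the constraint records every non-unique row's rowid in EXCEPTIONS
ALTER TABLE stage_tab
  ADD CONSTRAINT stage_tab_uk UNIQUE (key_col)
  EXCEPTIONS INTO exceptions;

-- Inspect the offending rows, then decide which ones to zap
SELECT s.*
  FROM stage_tab s
 WHERE s.rowid IN (SELECT row_id FROM exceptions);
```

Note that the ALTER itself fails while duplicates exist; the point is that the exceptions table gets populated so you can clean up and retry.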
HTH
P.S. This isn't the right forum to be posting SQL*Loader enquiries. -
Sqlloader control file when clause
Problem 1
Let's say I have a table x with (f1 number(2), f2 varchar(14), f3 number(1))
datafile1
10,ls
22,st
45,tanveer
77,cool
1,not cool
the requirement is to insert the first 2 columns into f1 and f2, and to insert 0 into f3 only
if f2 = 'tanveer'
Problem 2
Table is y (f4 number (2), f5 varchar (16))
datafile2
ls,xxxx
tanveer,yyyy
cool,zzzz
not cool, mmmm
given the value of the first column I would like to query table x,
find the corresponding value of f1, insert it into table y's column f4,
and the associated string into f5.
Please let me know how to write the control files for sqlloader.
Problem 1:
-- control file:
LOAD DATA
INFILE 'DATAFILE1.DAT'
APPEND
INTO TABLE x
FIELDS TERMINATED BY ','
TRAILING NULLCOLS
(f1, f2,
f3 "DECODE (:f2, 'tanveer', 0, NULL)")
Problem 2:
Create another table z,
load the data into z,
insert into y selecting
from join of x and z:
-- create staging table:
CREATE TABLE z
(f6 VARCHAR2 (14),
f7 VARCHAR2 (16));
-- control file:
LOAD DATA
INFILE 'DATFILE2.DAT'
APPEND
INTO TABLE z
FIELDS TERMINATED BY ','
TRAILING NULLCOLS
(f6, f7)
-- insert:
INSERT INTO y (f4, f5)
SELECT x.f1, z.f7
FROM x, z
WHERE z.f6 = x.f2; -
Problem with sqlloader and rtrim in direct path
I'm trying to load nearly 40 GB of information into Oracle 9i (9.2.0.6.0), so to accelerate the load we are using direct=true, but we are having problems with tables where we need to apply SQL functions. As an example, please check this control file:
LOAD DATA
INFILE 'PSACTIVIMG.dat' "str '~~~~~'"
BADFILE 'PSACTIVIMG.bad'
DISCARDFILE 'PSACTIVIMG.dsc'
INTO TABLE PSACTIVIMG
TRAILING NULLCOLS (
ACTIVITYNAME CHAR terminated by '@@#' "NVL(RTRIM(:ACTIVITYNAME),:ACTIVITYNAME)",
SEQNO terminated by '@@#',
PSIMAGEVER terminated by '@@#',
IMGSEG RAW(65536))
When we try to run it with direct=true we are getting:
SQL*Loader-961: Error calling once/load finishing for table PSACTIVIMG
ORA-26090: row is in partial state
SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.
But in alert.log we are getting:
ORA-00600: internal error code, arguments: [klafre_30], [0], [], [], [], [], [], []
ORA-01403: no data found
If I remove the RTRIM function, or if I use the conventional path, the data is loaded correctly. Does anybody have any suggestions about this problem?
Thanks in advance.
Alejandro Amador.
Could you use an external table and an insert as select with the /*+ append */ hint?
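A sketch of that suggestion. The column list is simplified for illustration (the real PSACTIVIMG file uses a '~~~~~' record separator and has a RAW image column), and load_dir is an assumed directory object:

```sql
-- Hypothetical external table over the same flat file
CREATE TABLE psactivimg_ext (
  activityname VARCHAR2(100),
  seqno        NUMBER,
  psimagever   NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY load_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY '@@#'
  )
  LOCATION ('PSACTIVIMG.dat')
);

-- SQL functions now run inside the database; APPEND requests a direct-path insert
INSERT /*+ APPEND */ INTO psactivimg (activityname, seqno, psimagever)
SELECT NVL(RTRIM(activityname), activityname), seqno, psimagever
  FROM psactivimg_ext;
COMMIT;
```

This sidesteps the direct-path-with-SQL-functions restriction because the function evaluation happens in the INSERT, not in the loader.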
-
Another problem with SqlLoader
I got the following error message when I tried to run Loader from OEM.
SQL*Loader-500: Unable to open file (C:\CONV\TEST.CTL)
SQL*Loader-553: file not found
SQL*Loader-509: System error: The system cannot find the file specified
The file 'C:\CONV\TEST.CTL' exists (I have created it myself) and I have full rights to that directory.
Please let me know if you have any suggestion what could cause this error.
Thanks,
Nina.
Hi Karsten,
I'm not a C++ person, so hopefully someone can help out with the equivalent code, or you can figure out how to access these properties. There are two DAQmx device properties that will give you the information from two perspectives--either a list of the modules in a particular chassis, or the chassis containing a particular module. I've attached a LabVIEW screenshot using these two properties, which require you to write the Active Device property to select the device you're interested in first. You should be able to use one or both of these properties for your application.
Please post if you are successful or if you have more questions.
Regards,
Kyle
Attachments:
propertynodes.GIF 7 KB -
Problem with sqlldr and commit
Hi,
I have a problem with sqlldr and commit.
I have a simple table with one column [ col_id number(6) not null ]. The column "col_id" is the primary key of the table. I have one file with 100,000 records (the numbers from 0 to 99,999).
I want to load the file into the table with sqlldr (SQL*Loader), but I want to commit only if all records are loaded. If one record is rejected, I want all records of the file rejected.
The problem is that in the conventional path the commit happens every 64 rows by default, and setting it to the number of records in the file isn't possible, while in the direct path sqlldr disables the primary key :(
There are a solutions?
Thanks
Sorry for my bad English.
This is my table:
DROP TABLE TEST_SQLLOADER;
CREATE TABLE TEST_SQLLOADER
( COL_ID NUMBER NOT NULL,
CONSTRAINT TEST_SQLLOADER_PK PRIMARY KEY (COL_ID)
);
This is my ctlfile ( test_sql_loader.ctl )
OPTIONS
( DIRECT=false
,DISCARDMAX=1
,ERRORS=0
,ROWS=100000
)
load data
infile './test_sql_loader.csv'
append
into table TEST_SQLLOADER
fields terminated by "," optionally enclosed by '"'
( col_id )
test_sql_loader.csv
0
1
2
3
...
99999
i run sqlloader
sqlldr xxx/yyy@orcl control=test_sql_loader.ctl log=test_sql_loader.log
output on the screen
Commit point reached - logical record count 92256
Commit point reached - logical record count 93248
Commit point reached - logical record count 94240
Commit point reached - logical record count 95232
Commit point reached - logical record count 96224
Commit point reached - logical record count 97216
Commit point reached - logical record count 98208
Commit point reached - logical record count 99200
Commit point reached - logical record count 100000
Logfile
SQL*Loader: Release 11.2.0.1.0 - Production on Sat Oct 3 14:50:17 2009
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Control File: test_sql_loader.ctl
Data File: ./test_sql_loader.csv
Bad File: test_sql_loader.bad
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 0
Bind array: 100000 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table TEST_SQLLOADER, loaded from every logical record.
Insert option in effect for this table: APPEND
Column Name Position Len Term Encl Datatype
COL_ID FIRST * , O(") CHARACTER
value used for ROWS parameter changed from 100000 to 992
Table TEST_SQLLOADER:
100000 Rows successfully loaded.
0 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 255936 bytes(992 rows)
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 100000
Total logical records rejected: 0
Total logical records discarded: 0
Run began on Sat Oct 03 14:50:17 2009
Run ended on Sat Oct 03 14:50:18 2009
Elapsed time was: 00:00:01.09
CPU time was: 00:00:00.06
The commit is every 992 rows;
if I get an error on record 993, the first 992 rows have already been committed :(
Edited by: inter1908 on 3-ott-2009 15.00 -
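The log above shows what happened: the bind array was capped at 256000 bytes, which holds only 992 of these rows, so SQL*Loader silently reduced ROWS from 100000 to 992 and committed at every bind-array flush. One workaround, sketched below with untested byte sizes, is to enlarge BINDSIZE (and READSIZE) so all 100,000 rows fit into a single bind array and hence a single commit:

```sql
OPTIONS
( DIRECT=false
,DISCARDMAX=1
,ERRORS=0
,ROWS=100000         -- one commit for the whole file...
,BINDSIZE=33554432   -- ...only if the bind array is big enough to hold it
,READSIZE=33554432
)
```

If the load must be strictly all-or-nothing, loading into a constraint-free staging table and then issuing one INSERT ... SELECT (which is atomic) is usually the more robust route.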
Multibyte character error in SqlLoader when utf8 file with chars like €Ää
hello,
posting from Germany. With special characters like German umlauts and the euro sign in a UTF8 text file, SqlLoader rejects rows with a multibyte character error.
Oracle Database 11g Release 11.2.0.2.0 - 64bit Production
Database Characterset: WE8MSWIN1252
OS: SLES 11 x86_64
Testcase SqlDeveloper:
CREATE TABLE utf8file_to_we8mswin1252 (
ID NUMBER,
text VARCHAR2(40 CHAR)
);
I can't enter the euro symbol in this posting; it ends up as '€' (?)
SELECT ascii(euro symbol) FROM dual;
128
SELECT chr(128) from dual;
INSERT INTO utf8file_to_we8mswin1252 (ID, text) VALUES (1, '0987654321098765432109876543210987654321');
INSERT INTO utf8file_to_we8mswin1252 (ID, text) VALUES (2, 'äüöäüöäüöäÄÖÜÄÖÜÄÖÜÄßßßßßßßßß߀€€€€€€€€€');
INSERT INTO utf8file_to_we8mswin1252 (ID, text) VALUES (3, 'äüöäüöäüöäÄÖÜÄÖÜÄÖÜÄäüöäüöäüöäÄÖÜÄÖÜÄÖÜÄ');
INSERT INTO utf8file_to_we8mswin1252 (ID, text) VALUES (4, 'ۧۧۧۧۧۧۧۧۧۧ1');
INSERT INTO utf8file_to_we8mswin1252 (ID, text) VALUES (5, 'äüöäüöäüöäÄÖÜÄÖÜÄÖÜÄäüöäüöäüöä');
INSERT INTO utf8file_to_we8mswin1252 (ID, text) VALUES (6, 'ßßßßßßßßß߀€€€€€€€€€1');
INSERT INTO utf8file_to_we8mswin1252 (ID, text) VALUES (7, 'ßßßßßßßßß߀€€€€€€€€€äüöäüöäüöäÄÖÜÄÖÜÄÖÜÄ');
commit;
A SELECT shows the correct result; no character is wrong or missing!
put this in a UTF8 file without delimiter and enclosure like
10987654321098765432109876543210987654321
the SqlLoader controlfile:
LOAD DATA characterset UTF8
TRUNCATE
INTO TABLE utf8file_to_we8mswin1252
( ID CHAR(1)
, TEXT CHAR(40)
)
on a linux client machine, NOT the Oracle-Server
export NLS_LANG=AMERICAN_AMERICA.WE8MSWIN1252
sqlldr user/pwd@connectstring CONTROL=TEST01.ctl DATA=TEST01.dat LOG=TEST01.log
Record 6: Rejected - Error on table UTF8FILE_TO_WE8MSWIN1252, column TEXT.
Multibyte character error.
Record 7: Rejected - Error on table UTF8FILE_TO_WE8MSWIN1252, column TEXT.
Multibyte character error.
A SELECT shows missing characters in rows 4 and 5; SqlLoader loaded only the first 20 characters (seemingly at random),
and, as shown above, rows 6 and 7 were never loaded.
Problem:
I can't load UTF8 flat files with SqlLoader when German umlauts and special characters like the euro symbol are included.
Any hint or help would be appreciated
Regards
Michael
## put this in a UTF8 file without delimiter and enclosure like
The basic question is how you put the characters into the file. Most probably, you produced a WE8MSWIN1252 file and not a UTF8 file. To confirm, a look at the binary codes in the file would be necessary; use a hex-mode-capable editor. If the file is WE8MSWIN1252, and not UTF8, then the SQL*Loader control file should be:
LOAD DATA characterset WE8MSWIN1252
TRUNCATE
INTO TABLE utf8file_to_we8mswin1252
( ID CHAR(1)
, TEXT CHAR(40)
)
-- Sergiusz -
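If no hex-mode editor is at hand, od can show the bytes; as a quick sketch, the euro sign alone already distinguishes the two encodings (the file names here are just examples):

```shell
# The euro sign is 3 bytes (E2 82 AC) in UTF8 but 1 byte (80) in WE8MSWIN1252.
printf '\342\202\254' > euro_utf8.bin    # octal escapes for E2 82 AC
printf '\200'         > euro_cp1252.bin  # octal escape for 80
od -An -tx1 euro_utf8.bin    # prints: e2 82 ac
od -An -tx1 euro_cp1252.bin  # prints: 80
```

Run od on a line of the real data file that contains an umlaut or euro sign and compare against these patterns.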
Problems with creating a complex cursor
Let me preface this post with the fact that I am really new at this, and this is my first shot at creating a stored procedure. I have the shell, and I have tried to code this, but I am having some issues with the cursor. Any tips or tricks that you can give me would be greatly appreciated.
Below is what I am trying to accomplish
Looking for the best approach to work with a complex cursor.
I have 4 files that are going to dump into a temp table. This is a sample of the Temp Table
CHAN_ADDR,BRA,SRC_ID,R_Flag,C_Flag,S_Flag,N_Flag,Expire_Date,Wireless_Flag
1111111111,R,1-a,,,,NDNC,7/7/2006,
2222222222,R,2-b,,,SDNC,NDNC,7/7/2006,WIR
3333333333,R,3-c,,,SDNC,NDNC,7/8/2006,
4444444444,R,4-d,y,,SDNC,NDNC,7/9/2006,WIR
5555555555,R,5-e,y,,SDNC,,7/10/2006,
6666666666,R,6-f,y,,,,,WIR
7777777777,R,7-g,,,,,,
8888888888,R,8-h,y,,,NDNC,7/7/2006,WIR
I need to take this data and dump it into another table that looks like the following:
ADDR Per_ID Method Name Expire Date Flag
1111111111 1-a Phone Nat 7/7/2006 Y
2222222222 2-b Mobile State 7/7/2006 Y
2222222222 2-b Mobile Nat 7/7/2006 Y
4444444444 3-c Mobile R 7/9/2006 y
4444444444 3-c Mobile State 7/9/2006 y
4444444444 3-c Mobile Nat 7/9/2006 y
I know that I need to use a cursor with LOOP and FETCH, but I am kind of confused about how to make this work. I am fairly new to writing PL/SQL, so any tips and tricks would be greatly appreciated.
For each phone number there can be 1 to 3 records written based on the flags. For each of those records I must store the phone number, the id, flag data, expire date (for only State or National) and Flag must always be checked.
I have put together a small shell of the program but what goes in the middle is where I am having some problems.
CREATE OR REPLACE PROCEDURE USP_EIM_CONTACT3_UPD
IS
CURSOR dnc_cursor IS
SELECT CHAN_ADDR,
BRA,
SRC_ID,
R_FLAG,
C_FLAG,
S_FLAG,
N_FLAG,
EBR_EXPIRE_DATE,
WIRELESS_FLAG
FROM eim_admin.RCCL_OPT_OUT_TMP;
v_counter NUMBER := 0;
v_insert NUMBER := 0;
v_sysdate DATE:=SYSDATE;
v_chan_addr eim_admin.RCCL_OPT_OUT_TMP.chan_addr%TYPE;
BEGIN
DBMS_OUTPUT.PUT_LINE ('***Begining USP_EIM_CONTACT3_UPD, time is ' ||
TO_CHAR (v_sysdate, 'MON-DD-YYYY HH24:MI.SS'));
DBMS_OUTPUT.NEW_LINE;
FOR rec IN dnc_cursor LOOP
v_counter:=v_counter+1;
BEGIN
SELECT CHAN_ADDR
INTO v_chan_addr
FROM eim_admin.RCCL_OPT_OUT_TMP;
EXCEPTION
WHEN OTHERS THEN
DBMS_OUTPUT.PUT_LINE('*** Could not lookup RCCL_OPT_OUT_TMP. ***');
DBMS_OUTPUT.PUT_LINE('ORA-'||SQLCODE||' '||SQLERRM);
END;
BEGIN
INSERT INTO siebel.S_PER_COMM_ADDR
(ADDR)
VALUES
(rec.CHAN_ADDR);
v_insert := v_insert + SQL%ROWCOUNT;
EXCEPTION WHEN OTHERS THEN
DBMS_OUTPUT.PUT_LINE('*** Failed to insert into siebel.S_PER_COMM_ADDR ***');
DBMS_OUTPUT.PUT_LINE('ORA-'||SQLCODE||' '||SQLERRM);
END;
END LOOP;
--Output STATISTICS.
DBMS_OUTPUT.PUT_LINE('**Number records read :'||v_counter);
DBMS_OUTPUT.NEW_LINE;
DBMS_OUTPUT.PUT_LINE ('***Completing USP_EIM_CONTACT3_UPD, time is ' ||
TO_CHAR (SYSDATE, 'MON-DD-YYYY HH24:MI.SS'));
EXCEPTION
WHEN OTHERS THEN
DBMS_OUTPUT.PUT_LINE(' ****Error in USP_EIM_CONTACT3_UPD ****'||SQLCODE||SQLERRM);
END USP_EIM_CONTACT3_UPD;
I have already created the table. What we are going to do is use sqlloader to load our text files into our temp table. From there we are going to read the data from the temp table into a table that is already created.
Basically what I am trying to do is the following:
Cursor 1
select * from Temp where, State_Flag and National_Flag is not null
Insert into Siebel.S_PER_COMM_ADDR
set id = source var
set phone = chann_add var
If
wireless flag is not null
set method = 'Mobile'
else
set method = 'Home'
end if
If
Brand = 'r' and R_Flag is not null
set name flg1
set flag as 'T'
and so on.
I am getting confused as to where to call the variables:
set id = source var
set phone = chann_add var
I am also confused on how to do the sets and how to set the date for S_Flag and N_Flag.
One of the other issues is that I need to go and look at the base table to pick up anyone else that has the phone number, and update or delete based on the data in the table.
I am getting confused as to where to call the variables:
set id = source var
set phone = chann_add var
Have you defined your variables in the DECLARE section? Is that a cursor variable or a plain variable? We need more info.
How to make the File Read Adapter handle missing trailing columns like SQL*Loader's TRAILING NULLCOLS
I have to read a CSV file which has a specific format, but sometimes the trailing column values can be missing, and I would like to handle it in such a way that those values are treated as null. This is similar to SQL*Loader's TRAILING NULLCOLS clause.
Currently if my data is as shown below:
As-Of Date,As-Of-Time,Bank ID,Bank Name,State,Acct No,Acct Type,Acct Name,Currency,BAI Type Code,Tran Desc,Debit Amt,Credit Amt,0 Day Flt Amt,1 Day Flt Amt,2+ Day Flt Amt,Customer Ref No,Value Date,Location,Bank Reference,Tran Status,Descriptive Text,Descriptive Text1,Descriptive Text2,Descriptive Text3,Descriptive Text4,Descriptive Text5,Descriptive Text6
20061031,23:59:00,121000248,"WELLS FARGO BANK, N.A.",CA,4121235097,COMMERCIAL DDA,Silicon Image - A/P,USD,475,CHECK PAID,55.86,0,0,0,0,51689,10/31/2006,,IA000313659233,POSTED,,,,,,,
20061031,23:59:00,121000248,"WELLS FARGO BANK, N.A.",CA,4121235097,COMMERCIAL DDA,Silicon Image - A/P,USD,475,CHECK PAID,1377.57,0,0,0,0,51685,10/31/2006,,IA000210166161,POSTED
20061031,23:59:00,121000248,"WELLS FARGO BANK, N.A.",CA,4121235097,COMMERCIAL DDA,Silicon Image - A/P,USD,475,CHECK PAID,1435,0,0,0,0,51621,10/31/2006,,IA000627084628,POSTED
It reads the first row properly, but when it encounters the second row's POSTED column, it tries to find the field terminator, and since it is not present it reads data from the next line until it finds a comma.
This is not the way I would like to handle it.
We get around 1000-2000 records in that CSV from the bank, and they can't change the way they are sending the file. So please help in resolving this issue.
Thanks
Sridhar
Hi, thanks for the reply.
Well, I've been having a play around but haven't got it to work yet.
Here's what I have so far:
private void jButton2MouseClicked(java.awt.event.MouseEvent evt) {
BufferedReader sb = new BufferedReader();
sb.append(""+jTextField1.setText()+",");
sb.append(""+jFormattedTextField1.setText()+"\n");
sb.append(""+jTextField2.setText()+",");
sb.append(""+jFormattedTextField2.setText()+"\n");
sb.append(""+jTextField3.setText()+",");
sb.append(""+jFormattedTextField3.setText()+"\n");
sb.append(""+jTextField4.setText()+",");
sb.append(""+jFormattedTextField4.setText()+"\n");
sb.append(""+jTextField5.setText()+",");
sb.append(""+jFormattedTextField5.setText()+"\n");
ReadFromFile(sb.toString());
private void ReadFromFile(String s) {
try {
BufferedReader in = new BufferedReader(new FileReader("/users/data.txt"));
String str;
while ((str = in.readLine()) != null) {
Process(str);
in.close();
} catch (IOException e) {
}// TODO add your handling code here:
}
I'm thinking the problem is the parts that say sb.append, but I don't know what to put in their place?
SQL LOADER problem: data is loaded but does not come in the standard format
Hi guys,
I got a problem when the sql loader ran: the data loaded successfully into the table, but the UOM data does not come in the standard format.
The UOM table column contains the unit-of-measure data, but in my Excel sheet it looks like:
EXCEl SHEET DATA:
1541GAFB07080 0 Metres
1541GAFE10040 109.6 Metres
1541GAFE10050 594.2 Metres
1541GAFE10070 126.26 Metres
1541GAFE14040 6.12 Metres
1541GAFE14050 0 Metres
1541SAFA05210 0 Metres
1541SAFA07100 0 Metres
1551EKDA05210 0 Nos
1551EKDA07100 0 Nos
1551EKDA07120 0 Nos
1551EKDA07140 0 Nos
1551EKDA07200 0 Nos.
1551EKDA08160 0 Nos.
1551EKDA08180 0 Nos.
1551EKDA08200 0 Nos.
1551EKDA10080 41 Nos.
1551EKDA10140 85 Nos.
.ctl file :
OPTIONS (silent=(header,feedback,discards))
LOAD DATA
INFILE *
APPEND
INTO TABLE XXPL_PO_REQUISITION_STG
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY'"'
TRAILING NULLCOLS
( ITEM_CODE CHAR,
ITEM_DESCRIPTION CHAR "TRIM(:ITEM_DESCRIPTION)",
QUANTITY,
UOM,
NEED_BY_DATE,
PROJECT,
TASK_NAME,
BUYER,
REQ_TYPE,
STATUS,
ORGANIZATION_CODE,
LOCATION,
SUBINVENTORY,
LINE_NO,
REQ_NUMBER,
LOADED_FLAG CONSTANT 'N',
SERIAL_NO "XXPL_PRREQ_SEQ.NEXTVAL",
CREATED_BY,
CREATION_DATE SYSDATE,
LAST_UPDATED_BY,
LAST_UPDATED_DATE,
LAST_UPDATED_LOGIN
)
Some output came in table like:
W541WDCA05260 0 Metres|
W541WDCA05290 3 Metres|
W541WDCA05264 4 Metres|
W541WDCA05280 8 Metres|
1551EADA04240 0 Nos|
1551EADA07100 0 Nos|
1551EKDA10080 0 Nos.|
1551EKDA10080 41 Nos.|
The problem is the '|' delimiter... how do I remove the '|' from my table when the sqlloader program runs? Where do I change it, in the .ctl file or in the Excel file? It's urgent guys, please help me.
thanks
Hi,
How are you extracting the data to Excel sheet ?
Please check the format type of the column in Excel sheet for UOM.
There is no issue in the SQL loader control file, but issue is there in your source excel file. (Try using a different method to extract the data to Excel sheet.)
Regards,
Yuvaraj.C -
Hello,
I've created a Mapping from a Flatfile to a table.
Since SQL*Loader is used, I have to set several properties (TRAILING_NULLCOLS, FILELOCATION, etc.)
However, every property I try to set fails by the same error:
OMBALTER MAPPING 'MAP_DIM_UNIT_TEST' \
SET PROPERTIES (TRAILING_NULLCOLS) VALUES ('true')
--> OMB02902: Error setting property TRAILING_NULLCOLS of MAP_DIM_UNIT_TEST: MMM1034: Property TRAILING_NULLCOLS does not exist.
Reading and setting these properties on mappings created by the Design Center was no problem.
What am I missing?
When configuring a Mapping in the Design Center I have to create the "SQL Loader Data Files".
Maybe that's got something to do with it. I have no idea how to do that with OMBPlus.
Any clue?
Regards
uhuebner
Hi
You should set the GENERATION LANGUAGE property to SQLLOADER for the map. The property set is different depending on the type of the map; the UI does some tricks, I think, and sets this up, but the API does not (it does at certain points like generation/validation).
Here are some other snippets you will find useful.....
1. To add a source data file to a mapping:
OMBALTER MAPPING 'LOAD_LOCATIONS' ADD SOURCE_DATA_FILE 'LOCATIONS_APAC'
2. To define the discard file info
OMBALTER MAPPING 'LOAD_LOCATIONS' MODIFY SOURCE_DATA_FILE 'LOCATIONS_APAC' SET PROPERTIES (DISCARD_FILE_LOCATION,DISCARD_FILE_NAME) VALUES ('APAC_FILE_STG_LOC', 'apac_discard.txt')
3. To define the data file info
OMBALTER MAPPING 'LOAD_LOCATIONS' MODIFY SOURCE_DATA_FILE 'LOCATIONS_APAC' SET PROPERTIES (DATA_FILE_LOCATION, DATA_FILE_NAME) VALUES ('APAC_FILE_STG_LOC', 'apac_data.txt')
4. To define the bad file info
OMBALTER MAPPING 'LOAD_LOCATIONS' MODIFY SOURCE_DATA_FILE 'LOCATIONS_APAC' SET PROPERTIES (BAD_FILE_LOCATION,BAD_FILE_NAME) VALUES ('APAC_FILE_STG_LOC', 'apac_bad.txt')
5. To query the files it is not obvious (or consistent with other commands)...
OMBRETRIEVE MAPPING 'SR_AGS_LDR' GET SOURCE_DATA_FILE
Notice the 'GET SOURCE_DATA_FILE' is singular even though it possibly return a list (in most other commands some form of the plural is used ie. GET TABLE OPERATORS).
Cheers
David -
Problems with autoextend on an 8.1.7.3 64bit installation
Hi,
two weeks ago I installed a 8.1.7.3 database with 4 tablespaces on a testbase,
two for the data, two for the indexes. The data-tablespaces and the index-tablespaces
were splittet to 4 datafiles. The datafiles of the data-tbs had a start size of
2000MB each, autoextent on and next extent of 500MB. The datafiles of the index-tbs
had a start size of 500 MB each, autoextent on and next of 500MB.
I had a look at the datafile parameters every file is set to autoextent, but the
autoextent doesn't work. I analysed this problem completly, everything is set up correct.
The only thing I noticed is that the data is loaded with sqlloader.
Could it be that this is known bug of this version or is this a well known problem?
Does someone of you have a tip for me what i can do?
Thank you for your help!
Tobias
1) What is the actual error you are getting? Cut and paste the error / alert log / OS error.
2) Is the table running out of extents ? ( maxextents ???? )
3) Is your mount where all the datafiles are stored capable of handling files with 2GB+ ? ( see largefiles )
4) Is your sqlldr using "direct=yes" ? If so see the hwm of the tables as well as the tablespace.
5) Are the tables sized correctly ?
6) Are you truncating the tables before the load? If so, are you using the DROP STORAGE clause?
Too many questions - very little details about your problem.
Not to offend you ( apologies ) , but should help: http://www.tuxedo.org/~esr/faqs/smart-questions.html
HTH,
V
SQLLoader - using more than 1 function in a single field
Hi.
I have a problem with sqlloader. I'm trying to call two functions (DECODE and NPCS_FINBOUND_CTRL) for a field, as below:
LOAD DATA
INFILE "TEST.txt"
APPEND
INTO TABLE NPCS_INBOUND_NIZ
TYPE POSITION(1:1) "DECODE(:TYPE,1,2,:TYPE)" "NPCS_FINBOUND_CTRL(:TYPE,NPCS_FSEQCTRL('P_GFA','0',:TYPE),NPCS_FSEQC
TRL('P_GFA','1',:TYPE),NPCS_TRANS_ID_SEQ.nextval,null,null,null)"
,FILE_ID "NPCS_FSEQCTRL('P_GFA','0',DECODE(:TYPE,1,2,:TYPE))"
,BATCH_ID "NPCS_FSEQGET('P_GFA','1',:TYPE)"
,TRANS_ID "TRANS_ID_SEQ.nextval"
,DATA POSITION(1:160)
Does sqlloader allow us to call more than one function? Because when I run the loader I get an error like this:
SQL*Loader-350: Syntax error at line 6.
Expecting valid column specification, "," or ")", found "DECODE(:TYPE,1,2,:TYPE)".
DECODE(:TYPE,1,2,:TYPE)" "DECODE(:TYPE,1,2,:TYPE)"
,FILE_ID "NP
ok
tq.
Create the control file as given below:
load data
infile *
into table emp1 truncate
(empno char terminated by '\n',
ename expression "substr(function1(:empno),1,3)",
ename1 expression "substr(function2(:empno),1,3)")
begindata
111
222
333
Regards,
Abu -
SQL*Loader import problem with Eastern European files
Hello,
on Oracle 11g with UTF-8 encoding, I tried to import a csv file into a table via sqlldr; the separator is the semicolon ";". All works fine except for some lines which are not integrated well (the files concerned come from Eastern European countries like Bulgaria, Hungary and the Czech Republic).
For example:
For:
text_1; text_2; text_with_char_at_end_like_š; new_text
during the integration, instead of getting:
| text_1 | text_2 | text_with_char_at_end_like_š | new_text |
I got:
| text_1 | text_2 | text_with_char_at_end_like_š; new_text | null |
Does anyone have this problem? I tried to change the delimiter to code X'59' and specified ENCODING UTF8 in sqlldr ... but it does not work.
Do you have an idea about this problem?
Thank you in advance.
Thanks,
the problem has been solved since: the file was not in UTF8 format (it was, for example, in a Greek encoding) and the NLS_LANG was AMERICAN_AMERICA ASCII;
I then converted all the files to UTF8 and changed the NLS_LANG to UTF8.
Regards -
Problem importing a csv file with SQL*Loader and a control file
I have a *csv file looking like this:
E0100070;EKKJ 1X10/10 1 KV;1;2003-06-16;01C;75
E0100075;EKKJ 1X10/10 1 KV;500;2003-06-16;01C;67
E0100440;EKKJ 2X2,5/2,5 1 KV;1;2003-06-16;01C;37,2
E0100445;EKKJ 2X2,5/2,5 1 KV;500;2003-06-16;01C;33,2
E0100450;EKKJ 2X4/4 1 KV;1;2003-06-16;01C;53
E0100455;EKKJ 2X4/4 1 KV;500;2003-06-16;01C;47,1
I want to import this csv file to this table:
create table artikel (artnr varchar2(10), namn varchar2(25), fp_storlek number, datum date, mtrlid varchar2(5), pris number);
My controlfile looks like this:
LOAD DATA
INFILE 'e:\test.csv'
INSERT
INTO TABLE ARTIKEL
FIELDS TERMINATED BY ';'
TRAILING NULLCOLS
(ARTNR, NAMN, FP_STORLEK char "to_number(:fp_storlek,'99999')", DATUM date 'yyyy-mm-dd', MTRLID, pris char "to_number(:pris,'999999D99')")
I can't get sql*loader to import the last column (pris) as I want. It ignores my decimal point, which in this case is "," and not "."; maybe this is the problem. If the decimal point is the problem, how can I get Oracle to recognize "," as a decimal point?
The result of the import now is that a decimal number (37,2) becomes 372 in the table.
Set the NLS_NUMERIC_CHARACTERS environment variable at OS level before running SqlLoader:
$ cat test.csv
E0100070;EKKJ 1X10/10 1 KV;1;2003-06-16;01C;75
E0100075;EKKJ 1X10/10 1 KV;500;2003-06-16;01C;67
E0100440;EKKJ 2X2,5/2,5 1 KV;1;2003-06-16;01C;37,2
E0100445;EKKJ 2X2,5/2,5 1 KV;500;2003-06-16;01C;33,2
E0100450;EKKJ 2X4/4 1 KV;1;2003-06-16;01C;53
E0100455;EKKJ 2X4/4 1 KV;500;2003-06-16;01C;47,1
$ cat artikel.ctl
LOAD DATA
INFILE 'test.csv'
replace
INTO TABLE ARTIKEL
FIELDS TERMINATED BY ';'
TRAILING NULLCOLS
(ARTNR, NAMN, FP_STORLEK char "to_number(:fp_storlek,'99999')", DATUM date 'yyyy-mm-dd', MTRLID, pris char "to_number(:pris,'999999D99')")
$ sqlldr scott/tiger control=artikel
SQL*Loader: Release 10.1.0.3.0 - Production on Sat Nov 12 15:10:01 2005
Copyright (c) 1982, 2004, Oracle. All rights reserved.
Commit point reached - logical record count 6
$ sqlplus scott/tiger
SQL*Plus: Release 10.1.0.3.0 - Production on Sat Nov 12 15:10:11 2005
Copyright (c) 1982, 2004, Oracle. All rights reserved.
Connected to:
Oracle Database 10g Enterprise Edition Release 10.1.0.3.0 - Production
With the Partitioning, OLAP and Data Mining options
SQL> select * from artikel;
ARTNR NAMN FP_STORLEK DATUM MTRLI PRIS
E0100070 EKKJ 1X10/10 1 KV 1 16/06/2003 01C 75
E0100075 EKKJ 1X10/10 1 KV 500 16/06/2003 01C 67
E0100440 EKKJ 2X2,5/2,5 1 KV 1 16/06/2003 01C 372
E0100445 EKKJ 2X2,5/2,5 1 KV 500 16/06/2003 01C 332
E0100450 EKKJ 2X4/4 1 KV 1 16/06/2003 01C 53
E0100455 EKKJ 2X4/4 1 KV 500 16/06/2003 01C 471
6 rows selected.
SQL> exit
Disconnected from Oracle Database 10g Enterprise Edition Release 10.1.0.3.0 - Production
With the Partitioning, OLAP and Data Mining options
$ export NLS_NUMERIC_CHARACTERS=',.'
$ sqlldr scott/tiger control=artikel
SQL*Loader: Release 10.1.0.3.0 - Production on Sat Nov 12 15:10:41 2005
Copyright (c) 1982, 2004, Oracle. All rights reserved.
Commit point reached - logical record count 6
$ sqlplus scott/tiger
SQL*Plus: Release 10.1.0.3.0 - Production on Sat Nov 12 15:10:45 2005
Copyright (c) 1982, 2004, Oracle. All rights reserved.
Connected to:
Oracle Database 10g Enterprise Edition Release 10.1.0.3.0 - Production
With the Partitioning, OLAP and Data Mining options
SQL> select * from artikel;
ARTNR NAMN FP_STORLEK DATUM MTRLI PRIS
E0100070 EKKJ 1X10/10 1 KV 1 16/06/2003 01C 75
E0100075 EKKJ 1X10/10 1 KV 500 16/06/2003 01C 67
E0100440 EKKJ 2X2,5/2,5 1 KV 1 16/06/2003 01C 37,2
E0100445 EKKJ 2X2,5/2,5 1 KV 500 16/06/2003 01C 33,2
E0100450 EKKJ 2X4/4 1 KV 1 16/06/2003 01C 53
E0100455 EKKJ 2X4/4 1 KV 500 16/06/2003 01C 47,1
6 rows selected.
SQL>
The control file is exactly like yours; I just put replace instead of insert.
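An alternative that does not depend on the client environment is to pass the numeric characters straight to TO_NUMBER inside the control file; a sketch for the pris field only:

```sql
pris char "to_number(:pris, '999999D99', 'NLS_NUMERIC_CHARACTERS='',.''')"
```

The third TO_NUMBER argument sets the decimal and group characters for this one conversion, so the load behaves the same regardless of the OS-level NLS settings.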