(urgent) SQL*Loader large file support in Oracle 7.3.4 (O734)
hi there,
i get the following SQL*Loader error when trying to load data files,
each 10 GB - 20 GB in size, into an Oracle 7.3.4 database on SunOS 5.6.
>>
SQL*Loader-500: Unable to open file (..... /tstt.dat)
SVR4 Error: 79: Value too large for defined data type
<<
i know there's a bug fix for large file support in Oracle 8 -
>>
Oracle supports files over 2GB for the oracle executable.
Contact Worldwide Support for information about fixes for bug 508304,
which will add large file support for imp, exp, and sqlldr
<<
however, what i really want to know is: is there any such fix for Oracle 7.3.4?
thanks.
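Not an Oracle fix, but one workaround used on releases without large-file support is to split the data file into pieces below the 2 GB limit and list every piece as a separate INFILE. A sketch (the file and table names come from the error message above; everything else is illustrative, and you should split on line boundaries so no record straddles two files):

```shell
# Stand-in for the real 10-20 GB tstt.dat; in practice you would split the
# existing file with something like "split -l 50000000".
printf '1,foo\n2,bar\n3,baz\n' > tstt.dat
split -l 1 tstt.dat tstt_    # tiny line count here just to show the mechanics

# Generate a control file that lists every piece; SQL*Loader reads them in order.
{
  echo "load data"
  for piece in tstt_*; do
    echo "infile '$piece'"
  done
  echo "append"
  echo "into table tstt"
  echo "fields terminated by ','"
  echo "(c1, c2)"
} > tstt.ctl
```

Each piece then stays under the 2 GB boundary that the pre-LFS sqlldr can open.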
Example
Control file
C:\DOCUME~1\MAMOHI~1>type dept.ctl
load data
infile dept.dat
into table dept
append
fields terminated by ',' optionally enclosed by '"'
trailing nullcols
(deptno integer external,
dname char,
loc char)
Data file
C:\DOCUME~1\MAMOHI~1>type dept.dat
50,IT,VIKARABAD
60,INVENTORY,NIZAMABAD
C:\DOCUME~1\MAMOHI~1>
C:\DOCUME~1\MAMOHI~1>dir dept.*
Volume in drive C has no label.
Volume Serial Number is 9CCC-A1AF
Directory of C:\DOCUME~1\MAMOHI~1
09/21/2006 08:33 AM 177 dept.ctl
04/05/2007 12:17 PM 41 dept.dat
2 File(s) 8,043 bytes
0 Dir(s) 1,165 bytes free
Running the sqlldr command
C:\DOCUME~1\MAMOHI~1>sqlldr userid=hary/hary control=dept.ctl
SQL*Loader: Release 10.2.0.1.0 - Production on Thu Apr 5 12:18:26 2007
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Commit point reached - logical record count 2
C:\DOCUME~1\MAMOHI~1>sqlplus hary/hary
SQL*Plus: Release 10.2.0.1.0 - Production on Thu Apr 5 12:18:37 2007
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Connected to:
Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
As I am appending, I get two extra rows: one department in your district and another in mine :)
SQL> select * from dept;
DEPTNO DNAME LOC
10 ACCOUNTING NEW YORK
20 RESEARCH DALLAS
30 SALES CHICAGO
40 OPERATIONS BOSTON
50 IT VIKARABAD
60 INVENTORY NIZAMABAD
6 rows selected.
SQL>
Similar Messages
-
Loading large files in Java Swing GUI
Hello Everyone!
I am trying to load large files (more than 70 MB of XML text) in a Java Swing GUI. I tried several approaches:
1) Byte-based loading with a loop similar to:
pane.setText("");
InputStream file_reader = new BufferedInputStream(new FileInputStream(file));
int BUFFER_SIZE = 4096;
byte[] buffer = new byte[BUFFER_SIZE];
int bytesRead;
String line;
while ((bytesRead = file_reader.read(buffer, 0, BUFFER_SIZE)) != -1) {
    line = new String(buffer, 0, bytesRead);
    pane.append(line);
}
But this gives me unacceptable response times for large files and runs out of Java heap memory.
2) I read in several places that I could load only small chunks of the file at a time, loading the next/previous chunk when the user scrolls down or up. To achieve this, I am guessing extensive manipulation of the scroll bar in the JScrollPane will be needed, or perhaps adding an external JScrollBar. Can anyone provide sample code for that approach? (Bear in mind that I am writing code for an editor, so I will need to interact via clicks, mouse-wheel rotation, keyboard input, and so on.)
If anyone can help me, post sample code or point me to useful links that deal with this issue or with writing code for editors in general, I would be very grateful.
Thank you in advance.
Hi,
I'm replying to your question from another thread.
To handle large files I used the new IO library. I'm trying to remember off the top of my head, but the classes involved were RandomAccessFile, FileChannel and MappedByteBuffer. The MappedByteBuffer was the best way for me to read and write to the file.
When opening the file I had to scan through its contents using a Swing worker thread and a progress monitor. Whilst doing this I indexed the file into manageable chunks. I also created a cache to further optimise file access.
In all it worked really well and I was surprised by the performance of the new IO libraries. I remember loading 1 GB files, and whilst you had to wait a few seconds for the indexing, you wouldn't know that the data for the JList was being retrieved from a file while the application was running.
Good Luck,
Martin. -
Create sql loader data file dynamically
Hi,
I want a sample program/approach for creating a SQL*Loader data file.
The program will read a table name as input and will use a
select statement with a column list derived from user_tab_columns in the data dictionary,
assuming multiple CLOB columns in the column list.
Thanks
Manoj
I'm writing CLOB and other columns to a SQL*Loader dat file.
The sample code below for writing a CLOB column is giving a file write error.
How can I write multiple CLOBs to the dat file so that the control file will handle it correctly?
-- Note: the file must be opened with utl_file.fopen(..., max_linesize => 32767);
-- otherwise utl_file.put raises a write error for chunks longer than the
-- default line size (1024).
offset NUMBER := 1;
chunk VARCHAR2(32000);
chunk_size NUMBER := 32000;
...
WHILE offset < dbms_lob.getlength(l_rec_type.narrative)
LOOP
chunk := dbms_lob.substr(l_rec_type.narrative, chunk_size, offset);
utl_file.put(l_file_handle, chunk);
utl_file.fflush(l_file_handle);
offset := offset + chunk_size;
END LOOP;
utl_file.new_line(l_file_handle); -
How to call a sql loader ctl file from within a pl/sql procedure
Hi Friends,
I am doing a project related to transferring data using queues. In the queue, I will get tab-delimited data in the form of a CLOB variable/message. I want to store that data in an Oracle table.
When updating data into the table ,
1. I don't want to write that data into a file (I want to access it directly after dequeueing from the specific queue).
2. As the data is in tab-delimited form, I want to use the SQL*Loader concept.
How do I call the SQL*Loader ctl file from within my PL/SQL procedure? When I searched, most of the forums recommended an external procedure or a Java program.
Please guide me on this issue. My preference is PL/SQL, but I don't know about external procedures. If there is no other way, I will try Java.
I am using oracle 9.2.0.8.0.
Thanks in advance,
Vimal..
Neither SQL*Loader nor external tables are designed to read data from a CLOB stored in the database. They both work on files stored on the file system. If you don't want the data written to a file, you're going to have to roll your own parsing code. This is certainly possible, but it is going to be far less efficient than either SQL*Loader or external tables. And it's likely to involve quite a bit more code.
The simplest possible thing that could work would be to use something like Tom Kyte's string tokenization package to read a line from the CLOB, break it into the component pieces, and then store the different tokens in a meaningful collection (i.e. an object type or a record type that corresponds to the table definition). Of course, you'll need to handle things like converting strings to numbers or dates, rejecting rows, writing log files, etc.
Justin -
Parameters / Variables in SQL Loader Control File
Here's my problem: I have created a SQL*Loader control file (ctl) to load a file of employees. This loader program is registered in Oracle Applications. What I want is to pass a parameter derived from the profile, like the user name of the person who executed the loader program, so that the user name info is included in the loaded data.
example
input data
123456,John,Smith
654321,Jane,Doe
loaded data
123456,John,Smith,03-JUL-2009,USER13
654321,Jane,Doe,03-JUL-2009,USER13
the sysdate is easy because it can be defined in the column, but what I really want to achieve is a parameter passed to the ctl file.
Thanks in advance.
Dear user!
Please have a look at this thread.
{thread:id=915277}
In the last three posts I explain how to use a shell script with AWK and SED to pass parameters to a control file.
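As a minimal sketch of that sed approach (the file, table and column names here are made up, not from the original post): keep a placeholder in a template control file and substitute the runtime value before calling sqlldr.

```shell
# Template control file with a placeholder for the user name:
cat > emp_template.ctl <<'EOF'
load data
infile 'emp.dat'
append
into table emp_load
fields terminated by ','
(empno, first_name, last_name, load_date sysdate, loaded_by constant "@USER_NAME@")
EOF

# Substitute the value known only at run time, then load:
USER_NAME=USER13
sed "s/@USER_NAME@/$USER_NAME/" emp_template.ctl > emp.ctl
# sqlldr userid=... control=emp.ctl
```

SYSDATE handles the date column directly in the control file; only the user name needs the substitution step.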
Please feel free to post again if you have some questions regarding my explanatory notes.
Yours sincerely
Florian W. -
Define variable in SQL Loader Control File
Hi,
I have an input file where the first line is the header record, followed by the detail records. For the processing, I do not need to store the fields of the header record, but I need a date field from this header record, stored as part of each detail record in an Oracle table row.
Is it possible to define a variable within the SQL*Loader control file to store the value I need in memory, then use it when SQL*Loader does the insert?
Thanks for any advice.
Not sure that you can. But if you're on Unix/Linux/Mac it's easy enough to write a shell script that populates the variables in a template file that you can then use as the ctl file. The Perl Template Toolkit could be an option for that as well.
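For the header-date case above, a shell sketch along those lines (all names hypothetical): pull the date out of the header record, stamp it into the generated control file as a CONSTANT, and load only the detail records.

```shell
# Stand-in input: one header record carrying the date, then detail records.
cat > input.dat <<'EOF'
HDR,20091231
D,1,alpha
D,2,beta
EOF

hdr_date=$(head -1 input.dat | cut -d, -f2)   # date field from the header
tail -n +2 input.dat > details.dat            # detail records only

# Every detail row gets the header date via a CONSTANT column:
cat > details.ctl <<EOF
load data
infile 'details.dat'
append
into table detail_tab
fields terminated by ','
(rec_type filler char, id, name, hdr_date constant "$hdr_date")
EOF
```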
-
Urgent: SQL*Loader-562: record too long
with sqlldr73 there is no problem,
but with the sqlldr from 8.1.7 I get the problem!
any help please ...
Mourad from Paris
Hi Sandeep,
Oracle guru Dave Moore has many sample SQL*Loader control files published:
http://www.google.com/search?&q=oracle+moore+sql%2aloader
Here is a simple sample control file to load many tables:
http://www.dba-oracle.com/t_sql_loader_multiple_tables_sqlldr.htm
Hope this helps. . .
Donald K. Burleson
Oracle Press author -
Way to generate sql*loader ctl file from a table?
I'm an oracle newbie (Oracle 8i, HP-UX).
Is there any way to take an existing table
description and generate a sql*loader control file from it? If anyone has already written a procedure or knows where one can be found I'd really appreciate it. We're doing a mass conversion to oracle and would like an easy way to re-write all our loads.
Thanks! Eileen from Polaroid.
Hi,
I have a shell program which gets the column names from the system table, creates a temporary control file, and calls the sqlldr executable to load the data automatically.
You can customise this file, according to your needs.
Shell Program
if [ $# -ne 2 ]
then
echo " Usage : $0 table_name flat file name"
exit
fi
#assigning envir. variable to unix variables
table_name=$1
flat_file=$2
#creating the control file
echo "LOAD DATA" > $table_name.ctl
echo "INFILE '$flat_file'" >> $table_name.ctl
echo "into table $table_name " >> $table_name.ctl
echo "fields terminated by X'9'" >> $table_name.ctl
#calling functions for making column name part
#describing the table and spooling into file
sqlplus -s $CUST_ORA_USER << sql_block
spool $table_name.lst
desc $table_name
spool off
sql_block
# creating suitable file and adding the fields into the control file
# cutting the heading lines
tail -n +3 $table_name.lst > temp
rm -f $table_name.lst
k=`wc -l < temp`
k1=`expr $k - 1`
#cutting the last line
head -$k1 temp > tempx
record_count=`wc -l < tempx`
counter=1
echo "(" > wxyz.ctl
# reading file line by line
cat tempx | while read -r in_line
do
#cutting the first field
field_name=`echo $in_line | cut -f1 -d' '`
#calculating the no of characters
word_cnt=`echo $in_line | wc -w`
#calculating count in a line
if [ $word_cnt = 2 ]
then
data_type=`echo $in_line | cut -f2 -d' ' | cut -f1 -d'('`
if [ "$data_type" = "DATE" ]
then
data_fmt="DECODE(LENGTH(LTRIM(RTRIM(:$field_name))),'11',to_date(ltrim(rtrim(:$field_name)),'dd-mon-yyyy'),'9',to_date(ltrim(rtrim(:$field_name)),'dd-mm-yy'),'10',to_date(ltrim(rtrim(:$field_name)),'dd-mon-yy'),to_date(ltrim(rtrim(:$field_name)),'yyyy/mm/dd hh24:mi:ss'))"
elif [ "$data_type" = "CHAR" ]
then
data_fmt="NVL(RTRIM(LTRIM(:$field_name)),' ')"
elif [ "$data_type" = "VARCHAR2" ]
then
data_fmt="NVL(RTRIM(LTRIM(:$field_name)),' ')"
else
data_fmt="NVL(:$field_name,0) "
fi
else
data_type=`echo $in_line | cut -f4 -d' ' | cut -f1 -d'('`
if [ "$data_type" = "DATE" ]
then
data_fmt="DECODE(LENGTH(LTRIM(RTRIM(:$field_name))),'11',to_date(ltrim(rtrim(:$field_name)),'dd-mon-yyyy'),'9',to_date(ltrim(rtrim(:$field_name)),'dd-mm-yy'),'10',to_date(ltrim(rtrim(:$field_name)),'dd-mon-yy'),to_date(ltrim(rtrim(:$field_name)),'yyyy/mm/dd hh24:mi:ss'))"
elif [ "$data_type" = "CHAR" ]
then
data_fmt="NVL(RTRIM(LTRIM(:$field_name)),' ')"
elif [ "$data_type" = "VARCHAR2" ]
then
data_fmt="NVL(RTRIM(LTRIM(:$field_name)),' ')"
else
data_fmt="NVL(:$field_name,0) "
fi
fi
#if last line put );
#else ,
if test $record_count -eq $counter
then
echo $field_name \"$data_fmt\"");" >> wxyz.ctl
else
echo $field_name \"$data_fmt\""," >> wxyz.ctl
fi
#counter increamenting for each record
counter=`expr $counter + 1`
done
#removing the file
rm -f temp tempx
cat wxyz.ctl >> $table_name.ctl
rm -f wxyz.ctl
#calling the SQLLOADER
sqlldr $CUST_ORA_USER control=$table_name.ctl errors=99999999
#removing the control file
rm -f $table_name.ctl -
Add "Trailing Nullcols" to sql loader control files generated by OWB
Hi gurus,
I've got a problem loading data with SQL*Loader. I need to add the parameter "trailing nullcols" to the SQL*Loader control file generated by OWB. I don't want to do this by saving the script to my hard disk and running it manually, so any ideas where I can put this parameter? I am using OWB 10.1 on a Windows 2000 machine.
Thanks
Stephan
Hi,
I found the solution to the problem.
1. Select the mapping where you map your source flat file to a table.
2. Right-click on the mapping and select "Configure".
3. Go to Sources and Targets -> YOUR_TABLE_NAME -> SQL*Loader Parameters
4. Set Trailing Nullcols = true :-)
Thank you to anyone looking at this problem.
Greetings
Stephan -
Importing already-made SQL*Loader control files in OWB 10g
Hi all,
Suppose that I have a certain amount of already hand-made SQL*Loader control files, say 50 or so.
Each of these control files have a variable quantity of fields.
I feel really lazy, so I would like to know if it is possible to import directly these control files instead of re-typing them in the design center. That would save me a lot of time.
Thanks !
Burgy
Hi Burgy
One option is to use the SQL*Loader EXTERNAL_TABLE=GENERATE_ONLY option to generate external-table DDL from a control file (search for GENERATE_ONLY in the SQL*Loader documentation). If you define the external table in the database, you can reverse-engineer it into OWB.
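For reference, the command David describes looks roughly like this (the user, control file and log names are placeholders); instead of loading, sqlldr writes the CREATE TABLE ... ORGANIZATION EXTERNAL DDL into the log file, which you can then run and reverse-engineer:

```
sqlldr userid=scott/tiger control=dept.ctl external_table=generate_only log=dept_ddl.log
```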
Cheers
David -
Problem specifying SQL Loader Log file destination using EM
Good evening,
I am following the example given in the 2 Day DBA document chapter 8 section 16.
In step 5 of 7, EM does not allow me to specify the destination of the SQL Loader log file to be on a mapped network drive.
The question: Does SQL Loader have a limitation that I am not aware of, that prevents placing the log file on a network share or am I getting this error because of something else I am inadvertently doing wrong ?
Note: I have placed the DDL, load file data and steps I follow in EM at the bottom of this post to facilitate reproducing the problem (drive Z is a mapped drive).
Thank you for your help,
John.
DDL (generated using SQL developer, you may want to change the space allocated to be less)
CREATE TABLE "NICK"."PURCHASE_ORDERS"
(
"PO_NUMBER" NUMBER NOT NULL ENABLE,
"PO_DESCRIPTION" VARCHAR2(200 BYTE),
"PO_DATE" DATE NOT NULL ENABLE,
"PO_VENDOR" NUMBER NOT NULL ENABLE,
"PO_DATE_RECEIVED" DATE,
PRIMARY KEY ("PO_NUMBER") USING INDEX PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS NOCOMPRESS LOGGING TABLESPACE "USERS" ENABLE
)
SEGMENT CREATION DEFERRED PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING STORAGE
(INITIAL 67108864)
TABLESPACE "USERS" ;
Load.dat file contents
1, Office Equipment, 25-MAY-2006, 1201, 13-JUN-2006
2, Computer System, 18-JUN-2006, 1201, 27-JUN-2006
3, Travel Expense, 26-JUN-2006, 1340, 11-JUL-2006
Steps I am carrying out in EM
log in, select data movement -> Load Data from User Files
Automatically generate control file
(enter host credentials that work on your machine)
continue
Step 1 of 7 ->
Data file is located on your browser machine
"Z:\Documentation\Oracle\2DayDBA\Scripts\Load.dat"
click next
step 2 of 7 ->
Table Name
nick.purchase_orders
click next
step 3 of 7 ->
click next
step 4 of 7 ->
click next
step 5 of 7 ->
Generate log file where logging information is to be stored
Z:\Documentation\Oracle\2DayDBA\Scripts\Load.LOG
Validation Error
Examine and correct the following errors, then retry the operation:
LogFile - The directory does not exist.
Hi John,
But I didn't get any error when doing the same thing you did.
My Oracle version is 10.2.0.1 on Windows XP. Here is what I did, and it worked.
1. I created one table in the scott schema:
SCOTT@orcl> CREATE TABLE "PURCHASE_ORDERS"
2 (
3 "PO_NUMBER" NUMBER NOT NULL ENABLE,
4 "PO_DESCRIPTION" VARCHAR2(200 BYTE),
5 "PO_DATE" DATE NOT NULL ENABLE,
6 "PO_VENDOR" NUMBER NOT NULL ENABLE,
7 "PO_DATE_RECEIVED" DATE,
8 PRIMARY KEY ("PO_NUMBER") USING INDEX PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS NOCOMPRESS LOGGING TABLESPACE "USERS" ENABLE
9 )
10 TABLESPACE "USERS";
Table created.
I logged into EM: Maintenance --> Data Movement --> Load Data from User Files --> My Host Credentials
Here there are 3 text boxes in total:
1. Server Data File: C:\ORACLE\PRODUCT\10.2.0\ORADATA\ORCL\USERS01.DBF
2. Data File is Located on Your Browser Machine: z:\load.dat <--- here z:\ is another machine's shared-docs folder; I selected this option (as the radio button) and created the same load.dat you mentioned.
3. Temporary File Location: C:\ORACLE\PRODUCT\10.2.0\ORADATA\ORCL\ <--- I didn't change anything.
Step 2 of 7 Table Name : scott.PURCHASE_ORDERS
Step 3 of 7 I just clicked Next
Step 4 of 7 I just clicked Next
Step 5 of 7 I just clicked Next
Step 6 of 7 I just clicked Next
Step 7 of 7 Here it is Control File Contents:
LOAD DATA
APPEND
INTO TABLE scott.PURCHASE_ORDERS
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(
PO_NUMBER INTEGER EXTERNAL,
PO_DESCRIPTION CHAR,
PO_DATE DATE,
PO_VENDOR INTEGER EXTERNAL,
PO_DATE_RECEIVED DATE
)
And i just clicked on submit job.
Now i got all 3 rows in purchase_orders :
SCOTT@orcl> select count(*) from purchase_orders;
COUNT(*)
3
So, there is no bug; it worked. Please retry if you get any error/issue.
HTH
Girish Sharma -
Large File Support OEL 5.8 x86 64 bit
All,
I have downloaded some software from OTN and would like to combine the individual zip files into one large file. I assumed that I would unzip the individual files and then create a single zip file. However, my OEL zip utility fails with a "file too large" error.
Perhaps related, I can't ls the directories that contain huge files. For example, a huge VM image file.
Do I need to rebuild the utilities to support huge files, or do I need to install/upgrade RPMs? If I need to relink programs, what are the commands to do that?
Thanks.
Correction: this is occurring on a 5.7 release, not 5.8. The command uname -a yields 2.6.32-200.13.1.2...
The file size I'm trying to assemble is ~ 4 GB (the zip files for 12c DB). As for using tar, as I understand it, EM 12c wants a single zip file for software deployments.
I'm assuming that the large file support was not linked into the utilities. For example when I try to list contents of a directory which has huge/large files, I get the following message:
ls: V29653-01.iso: Value too large for defined data type
I'm assuming that I need to upgrade to a later release.
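Before relinking anything, it may be worth checking which tools actually lack LFS. A quick sketch (standard GNU tools, nothing Oracle-specific): create a sparse file past the 2 GiB mark, which takes no real disk space, and see which commands can report it.

```shell
# Create a 3 GiB sparse file: dd truncates the output to the seek offset,
# so no data is actually written.
dd if=/dev/zero of=big.img bs=1M count=0 seek=3072 2>/dev/null

# stat reports the size via the 64-bit file interfaces:
stat -c '%s %n' big.img

# If plain ls fails here with "Value too large for defined data type",
# that ls binary was built without large-file support:
ls -l big.img
```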
Thanks. -
IdcApache2Auth.so Compiled With Large File Support
Hi, I'm installing UCM 10g on a Solaris 64-bit platform with Apache 2.0.63. Everything went fine until I updated the configuration in the httpd.conf file. When I query server status it seems to be OK:
./idcserver_query
Success checking Content Server idc status. Status: Running
but in the Apache error_log I found the following error description:
Content Server Apache filter detected a bad request_rec structure. This is possibly a problem with LFS (large file support). Bad request_rec: uri=NULL;
Sizing information:
sizeof(*r): 392
[int]sizeof(r->chunked): 4
[apr_off_t]sizeof(r->clength): 4
[unsigned]sizeof(r->expecting_100): 4
If the above size for r->clength is equal to 4, then this module
was compiled without LFS, which is the default on Apache 1.3 and 2.0.
Most likely, Apache was compiled with LFS, this has been seen with some
stock builds of Apache. Please contact Support to obtain an alternate
build of this module.
When I searched My Oracle Support for suggestions on how to solve my problem, I found a thread which basically says that the Oracle ECM support team could give me a copy of IdcApache2Auth.so compiled with LFS.
What do you suggest me?
Should I ask the ECM support team for help? (If yes, please tell me how.)
Or should I update the Apache web server to version 2.2 and use IdcApache22Auth.so, which is compiled with LFS?
Thanks in advance, I hope you can help me.
Hi,
The easiest approach would be to use Apache 2.2 and the corresponding IdcApache22Auth.so file.
Thanks
Srinath -
Skipping fields in SQL*LOADER data file
I have a data file that has more fields than the target table does. How can I write a SQL*LOADER control file to skip some fields in the middle of the text line?
If you don't want to define input fields by position, the simplest way I think is to use FILLER fields.
Quoted from SQL*Loader doc:
"Specifying Filler Fields
Filler fields have names but they are not loaded into the table. However, filler fields can be used as arguments to init_specs (for example, NULLIF and DEFAULTIF) as well as to directives (for example, SID, OID, REF, BFILE). Also, filler fields can occur anyplace in the data file. They can be inside of the field list for an object or inside the definition of a VARRAY.
See SQL*Loader DDL Behavior and Restrictions for more information on filler fields and their use.
A sample filler field specification looks as follows:
field_1_count FILLER char,
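As an illustration of the quoted text (the table and field names are invented for the example): here the second and fourth fields of each input record are parsed but never loaded.

```
load data
infile 'emp.dat'
into table emp
fields terminated by ','
(empno,
 dept_code_skipped filler char,
 ename,
 region_skipped filler char,
 sal)
```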
Regards,
Zoltan -
Passing file name dynamically to sql loader ctl file
Hi,
I am new to scripting and I have a complex requirement involving writing a script.
Requirement:
I need to upload a CSV file from an FTP server into an Oracle table using SQL*Loader. The file name looks like APF0912312359.csv, which represents 31-DEC-2009 23:59, and there will be multiple files in the same day, distinguished by the timestamp in the filename. Once I load a file using SQL*Loader, I need to move the file to another directory for archival.
Since the sql loader ctl file has a fixed file name, I would like to know how I can pass the file name dynamically to ctl file to load a new file every time it runs.
Any help is greatly appreciated..
Bye
Sudheer
Edited by: user2138419 on Oct 28, 2009 4:08 PM
I agree with Pradeep in regards to declaring the file names on the command line. However, I do have a concern about presenting the password on the command line, as any user that can issue a ps (assuming Unix ~= Linux here) would be able to read it while the sqlldr command is running.
Unfortunately, you may not always have the option of declaring the files on the command line. I was recently challenged with this in a Windows 2003 Server environment running SQL*Loader: Release 10.2.0.1.0. In this environment I found that I am able to set a variable file name in a calling batch file and use that value in the control file successfully. Just to demonstrate the approach:
Batch file:
set IN_FILE='c:\inbound\load_me.txt'
sqlldr user/pswd@db PARFILE='c:\parameters.txt'
Parameter file:
errors=500000
rows=50000
control=%CTL_FILE%
bad=%BAD_FILE%
discard=%DSC_FILE%
log=%LOG_FILE%
Control file:
LOAD DATA
INFILE '%IN_FILE%'
INSERT
INTO TABLE table_to_be_loaded
I'm really interested to see if this would work on Unix.
-Luke
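On Unix, rather than relying on sqlldr to expand anything, one sketch is to let the shell do the expansion while writing the control file (the paths and column names below are hypothetical):

```shell
IN_FILE=/inbound/load_me.txt

# The here-document delimiter is unquoted, so $IN_FILE is expanded by
# the shell before sqlldr ever reads the control file:
cat > load.ctl <<EOF
LOAD DATA
INFILE '$IN_FILE'
INSERT
INTO TABLE table_to_be_loaded
FIELDS TERMINATED BY ','
(col1, col2)
EOF
# sqlldr user/pswd@db control=load.ctl
```

This sidesteps the question of whether %VAR% expansion in control files is portable at all.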