SQL*Loader datafile size
Can anyone tell me whether SQL*Loader has a file-size restriction on the datafiles it loads?
Thanks!
Hello,
The size limit mostly comes from the OS. SQL*Loader datafiles are effectively unlimited on *64-bit* platforms but limited to *2GB* on *32-bit* OSes, so check what your OS supports:
http://download.oracle.com/docs/cd/E11882_01/server.112/e10839/appg_db_lmts.htm#UNXAR382
You also need to be able to fit the data into the table, so check that you have enough space in the tablespace and/or on disk (if the tablespace has to be extended).
Here is a link about SQL*Loader; scroll down to Limits / Defaults:
http://www.ordba.net/Tutorials/OracleUtilities~SQLLoader.htm
Hope this helps.
Best regards,
Jean-Valentin
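Before handing a file to sqlldr, you can sanity-check its size against that 2 GB ceiling; a minimal Python sketch, assuming the 32-bit limit described above (the helper name is made up for illustration):

```python
import os
import tempfile

# 2 GB: the SQL*Loader input-file ceiling on many 32-bit platforms;
# 64-bit platforms are generally bounded only by the OS/filesystem.
LIMIT_32BIT = 2 * 1024**3

def fits_32bit_limit(path):
    """Return True if the datafile is under the 32-bit 2 GB ceiling."""
    return os.path.getsize(path) <= LIMIT_32BIT

# Demo with a tiny temporary file standing in for a real datafile.
with tempfile.NamedTemporaryFile(mode="w", suffix=".dat", delete=False) as f:
    f.write("SMITH,RESEARCH\n")
    sample = f.name

print(fits_32bit_limit(sample))
os.remove(sample)
```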
Similar Messages
-
ORACLE 8I: how to skip a specific field of a SQL*Loader datafile while loading
Product : ORACLE SERVER
Date written : 2002-04-09
ORACLE 8I: how to skip a specific field of a SQL*Loader datafile while loading
===========================================================================
When variable-length fields are separated by delimiters such as ',' or '|',
as in the example below, the 'FILLER' field designation introduced in
Oracle 8i can be used to mark a field so that it is skipped on insert.
<Example>
TABLE : skiptab
===========================
col1 varchar2(20)
col2 varchar2(20)
col3 varchar2(20)
CONTROL FILE : skip.ctl
load data
infile skip.dat
into table skiptab
fields terminated by ","
(col1 char,
col2 filler char,
col3 char)
DATAFILE : skip.dat
SMITH, DALLAS, RESEARCH
ALLEN, CHICAGO, SALES
WARD, CHICAGO, SALES
data loading :
$sqlldr scott/tiger control=skip.ctl
Result :
COL1 COL3
SMITH RESEARCH
ALLEN SALES
WARD SALES -
What would be the maximum datafile size that SQL*Loader can support?
Hi,
I would like to load a datafile of nearly 5 GB (from an xls file) into an Oracle table using SQL*Loader. Could you please tell me the maximum datafile size we can load using SQL*Loader?
Thanks
VAMSHI
Hello,
The size limit mostly comes from the OS. SQL*Loader datafiles are effectively unlimited on *64-bit* platforms but limited to *2GB* on *32-bit* OSes, so check what your OS supports:
http://download.oracle.com/docs/cd/E11882_01/server.112/e10839/appg_db_lmts.htm#UNXAR382
You also need to be able to fit the data into the table, so check that you have enough space in the tablespace and/or on disk (if the tablespace has to be extended).
Here is a link about SQL*Loader; scroll down to Limits / Defaults:
http://www.ordba.net/Tutorials/OracleUtilities~SQLLoader.htm
Hope this helps.
Best regards,
Jean-Valentin -
Sql loader maximum data file size..?
Hi - I wrote a SQL*Loader script, run through a shell script, which imports data into a table from a CSV file. The CSV file size is around 700MB. I am using Oracle 10g on Sun Solaris 5.
My question is: is there a maximum data file size? The following code is from my shell script.
SQLLDR=
DB_USER=
DB_PASS=
DB_SID=
controlFile=
dataFile=
logFileName=
badFile=
${SQLLDR} userid=$DB_USER"/"$DB_PASS"@"$DB_SID \
control=$controlFile \
data=$dataFile \
log=$logFileName \
bad=$badFile \
direct=true \
silent=all \
errors=5000
Here is my control file code:
LOAD DATA
APPEND
INTO TABLE KEY_HISTORY_TBL
WHEN OLD_KEY <> ''
AND NEW_KEY <> ''
FIELDS TERMINATED BY ','
TRAILING NULLCOLS
(OLD_KEY "LTRIM(RTRIM(:OLD_KEY))",
NEW_KEY "LTRIM(RTRIM(:NEW_KEY))",
SYS_DATE "SYSTIMESTAMP",
STATUS CONSTANT 'C'
)
Thanks,
-Soma
Edited by: user4587490 on Jun 15, 2011 10:17 AM
Edited by: user4587490 on Jun 15, 2011 11:16 AM
Hello Soma.
How many records exist in your 700 MB CSV file? How many do you expect to process in 10 minutes? You may want to consider performing a set of simple unit tests with 1) 1 record, 2) 1,000 records, 3) 100 MB filesize, etc. to #1 validate that your shell script and control file syntax function as expected (including the writing of log files, etc.), and #2 gauge how long the processing will take for the full file.
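The graduated unit tests suggested above can be scripted; a hypothetical Python sketch that generates CSV files of increasing size for timing trial loads (the file names and the OLD_KEY/NEW_KEY record layout are assumptions based on the control file shown):

```python
import csv
import os

def make_test_csv(path, n_records):
    """Write n_records comma-separated rows shaped like OLD_KEY,NEW_KEY."""
    with open(path, "w", newline="") as f:
        w = csv.writer(f)
        for i in range(n_records):
            w.writerow([f"OLD{i:08d}", f"NEW{i:08d}"])
    return os.path.getsize(path)

# Graduated sizes: time sqlldr against each before the full 700 MB run.
for n in (1, 1000, 100000):
    path = f"key_history_{n}.csv"
    print(n, make_test_csv(path, n))
    os.remove(path)  # clean up the sample file
```

Timing each load gives a rough per-record rate you can extrapolate to the full file before committing to the 700 MB run.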
Hope this helps,
Luke
Please mark the answer as helpful or answered if it is so. If not, provide additional details.
Always try to provide actual or sample statements and the full text of errors along with error code to help the forum members help you better. -
SQL Loader: refer datafile name in the control file
Hi All:
Database: 10GR2
I have a table as following:
test (
filename varchar2(100),
mydata varchar2(1000)
I am using sql loader to populate data for this table. I also need store the datafile name to column test.filename as part of dataload. How to refer datafile name in the control file?
Again, I am processing multiple files in a batch mode. I need store datafile name in filename column so that I can know which "mydata" come from which datafile.
Thanks in advance!
Kevin
As Bravid says, you can't do that from within SQL*Loader, unless the control files are dynamically generated by a script and the filename is included as the default value for that particular column as a hardcoded literal.
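A rough sketch of that script-generated control file idea, in Python for illustration (the table name test and the filename/mydata columns come from the question; everything else here is an assumption):

```python
# Hypothetical template: ordinary SQL*Loader control-file syntax with the
# datafile name hardcoded into the FILENAME column as a CONSTANT.
TEMPLATE = """LOAD DATA
INFILE '{datafile}'
APPEND INTO TABLE test
FIELDS TERMINATED BY ','
(
  filename CONSTANT '{datafile}',
  mydata   CHAR(1000)
)
"""

def write_ctl(datafile):
    """Emit one .ctl per datafile, returning the control file's path."""
    ctl_path = datafile.rsplit(".", 1)[0] + ".ctl"
    with open(ctl_path, "w") as f:
        f.write(TEMPLATE.format(datafile=datafile))
    return ctl_path

# Batch mode: one control file per incoming datafile.
for df in ("batch_001.dat", "batch_002.dat"):
    print(write_ctl(df))
```

Each generated .ctl can then be passed to sqlldr in the same batch loop that discovers the datafiles.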
It's easier if you use external tables, as the table definition contains the "location", i.e. the list of filenames it is accessing; that value can be queried as well as dynamically changed (though if you're dynamically changing it, you'll already know the filename programmatically. hint hint!)
;) -
SQL*LOADER (8I): loading VARIABLE SIZE FIELDs into multiple tables (FILLER)
Product : ORACLE SERVER
Date written : 2004-10-29
==================================================================
SQL*LOADER (8I): loading VARIABLE SIZE FIELDs into multiple tables (FILLER)
==================================================================
PURPOSE
This note describes how to load a data file with variable-length records
and variable-size fields into multiple tables with SQL*Loader
(using the FILLER clause, a new feature in 8i).
Explanation
SQL*LOADER SYNTAX
To load into multiple tables, set up the control file as follows:
INTO TABLE emp
INTO TABLE emp1
To load the same data from a data file with fixed-length fields into
multiple tables, do the following:
INTO TABLE emp
(empno POSITION(1:4) INTEGER EXTERNAL,
INTO TABLE emp1
(empno POSITION(1:4) INTEGER EXTERNAL,
As above, characters 1-4 of each input record can be loaded into the empno
field of both tables. But if the fields are of variable length, the
POSITION clause cannot be used for each field this way.
Example
Example 1>
create table one (
field_1 varchar2(20),
field_2 varchar2(20),
empno varchar(10) );
create table two (
field_3 varchar2(20),
empno varchar(10) );
Assume the records to load are comma-separated with variable-length fields.
<< data.txt >> - data file to load
"this is field 1","this is field 2",12345678,"this is field 4"
<< test.ctl >> - control file
load data infile 'data.txt'
discardfile 'discard.txt'
into table one
replace
fields terminated by ","
optionally enclosed by '"' (
field_1,
field_2,
empno )
into table two
replace
fields terminated by ","
optionally enclosed by '"' (
field_3,
dummy1 filler position(1),
dummy2 filler,
empno )
The dummy1 field is declared as FILLER; a FILLER field is not loaded into the table.
Table two has no dummy1 column; position(1) means parsing restarts at the
beginning of the current record, so the first field is read into the dummy1
filler item and the second field into the dummy2 filler item. The third field,
the employee number that was loaded into table one, is then loaded into table two as well.
<< Execution >>
$sqlldr scott/tiger control=test.ctl data=data.txt log=test.log bindsize=300000
$sqlplus scott/tiger
SQL> select * from one;
FIELD_1 FIELD_2 EMPNO
this is field 1 this is field 2 12345678
SQL> select * from two;
FIELD_3 EMPNO
this is field 4 12345678
Example 2>
create table testA (c1 number, c2 varchar2(10), c3 varchar2(10));
<< data1.txt >> - data file to load
7782,SALES,CLARK
7839,MKTG,MILLER
7934,DEV,JONES
<< test1.ctl >>
LOAD DATA
INFILE 'data1.txt'
INTO TABLE testA
REPLACE
FIELDS TERMINATED BY ","
(c1 INTEGER EXTERNAL,
c2 FILLER CHAR,
c3 CHAR)
<< Execution >>
$ sqlldr scott/tiger control=test1.ctl data=data1.txt log=test1.log
$ sqlplus scott/tiger
SQL> select * from testA;
C1 C2 C3
7782 CLARK
7839 MILLER
7934 JONES
Reference Documents
<Note:74719.1> -
SQL*Loader-418: Bad datafile datatype for column XMLDOC
Hi,
I am trying to load a xml document into a xmltype column in a table using SQL*Loader and I receive this error:
SQL*Loader-418: Bad datafile datatype for column XMLDOC
My ctl is:
LOAD DATA INFILE 'marginalpdbc_xml_20030529.xml'
APPEND INTO TABLE PRUEBA_CARGA
( XMLDOC LOBFILE (CONSTANT 'marginalpdbc_xml_20030529.xml')
TERMINATED BY EOF,
NOMBFICH CONSTANT 'marginalpdbc_xml_20030529.xml' )
And the table is:
create table prueba_carga (NOMBFICH VARCHAR2(200),xmldoc xmltype);
What is wrong with my ctl?
I am using SQL*Loader: Release 9.2.0.1.0 and Oracle9i Enterprise Edition Release 9.2.0.1.0
Thanks in advance.
Looks like there is data which takes > 8 digits for the 'DTE_ADDED' column.
Plz check that first.
Can you provide a couple of lines of data for sampling ??? -
Can we concatenate 2 columns of datafile into one column in sql loader
Hi,
Can we concatenate 2 columns of datafile into one column in sql loader?
For example, suppose there are two fields in the data file:
column1 - lastname, value tiger
column2 - firstname, value scott
Required result: the two fields should be concatenated into one database
column, e.g. a name column whose value is scott tiger.
Thanks
Or try this...
My input file
1,KARTHICK,PATTABIRAMAN,SOFTWARE
2,VIJAY,RENGANATHAN,FINANCE
3,VIMAL,KANTH,SALES
And my control file:
LOAD DATA
INFILE 'EmpDetail.txt'
BADFILE 'EmpDetail.bad'
DISCARDFILE 'EmpDetail.dsc'
INTO TABLE "EMPDETAIL"
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' TRAILING NULLCOLS
(NO,
NAME1 BOUNDFILLER,
NAME ":NAME1||' '||:NAME",
DEPT)
My table is:
SQL> create table empdetail(no integer, name varchar2(50), dept varchar2(50))
2 /
Table created.
Then I loaded the data:
D:\karthick\Akiva\Look up\Akiva\Address Look Up>sqlldr sysadm/sysadm@akivanew empdetail.ctl
SQL*Loader: Release 9.2.0.1.0 - Production on Mon Sep 15 12:23:42 2008
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Commit point reached - logical record count 2
Commit point reached - logical record count 3
And the output is:
SQL> select * from empdetail
2 /
NO NAME DEPT
1 KARTHICK PATTABIRAMAN SOFTWARE
2 VIJAY RENGANATHAN FINANCE
3 VIMAL KANTH SALES
Thanks,
Karthick. -
Help in sql loader loading a large datafile.
Hi All
Could someone help me load an xls file of about 250 MB, with about 2,981,807 rows, into a database table?
When I try to open the XLS file it shows only 65,000 records, but there are a lot more records in it.
Thanks in advance.
Hi, I tried SQL*Loader but the file is too big to handle; it won't open in Notepad or as an xls sheet.
Is there any other way to split the large file?
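For the splitting question, a simple line-based splitter is one option; a rough Python sketch (the chunk size and the .partNNN naming are arbitrary choices, not anything SQL*Loader requires):

```python
def split_file(path, lines_per_chunk=1000000):
    """Split a big text file into numbered chunks of at most
    lines_per_chunk lines each; returns the chunk file names."""
    chunks = []
    with open(path) as src:
        chunk_no, out, count = 0, None, 0
        for line in src:
            if out is None or count >= lines_per_chunk:
                if out:
                    out.close()
                chunk_no += 1
                name = f"{path}.part{chunk_no:03d}"
                out = open(name, "w")
                chunks.append(name)
                count = 0
            out.write(line)
            count += 1
        if out:
            out.close()
    return chunks
```

Each chunk can then be loaded with the same control file, or several chunks can be listed as multiple INFILE clauses in one control file.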
Sql loader and tablespace doubt
While I was inserting data into a table using SQL*Loader, the datafile filled up and I got a SQL*Loader-605 error. I added a new datafile to the tablespace, but the data still didn't get inserted and I got the same SQL*Loader-605 error; only when I increased the size of the original datafile did the error go away. Why is that?
The data didn't get inserted because there was no space in the datafile.
You increased the datafile size, but the insert had already stopped because of the error.
The error didn't happen again because there was now enough space in the datafile.
See the log file for more details of the error. The error is not data related; from 605 that is all we can tell.
Note: Please make short statements and ask your questions clearly. I had to read this five times to even partly understand what you are asking.
SQL*Loader and multiple files
Hello, I am tasked with loading tables of 21+ million rows. Will SQL*Loader perform better with one large file or many smaller files? Is there an ideal file size for optimal performance? Thank you
David
Don, when I tried to load a 21M row table using direct path, I get the following messages:
Record 2373: Rejected - Error on table STAGE_CUSTOMER.
ORA-03113: end-of-file on communication channel
SQL*Loader-926: OCI error while uldlfca:OCIDirPathColArrayLoadStream for table STAGE_CUSTOMER
SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.
SQL*Loader-925: Error while uldlgs: OCIStmtExecute (ptc_hp)
ORA-03114: not connected to ORACLE
SQL*Loader-925: Error while uldlgs: OCIStmtFetch (ptc_hp)
ORA-24338: statement handle not executed
Here is the SQL*Loader log:
SQL*Loader: Release 9.2.0.1.0 - Production on Thu Apr 26 15:38:29 2007
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Control File: stage_customer.ctl
Character Set UTF8 specified for all input.
First primary datafile stage_Customer_20070301.csv has a
utf8 byte order mark in it.
Data File: stage_Customer_20070301.csv
Bad File: stage_Customer_20070301.bad
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 1000
Continuation: none specified
Path used: Direct
Silent options: FEEDBACK
Table STAGE_CUSTOMER, loaded from every logical record.
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
ROW_ID SEQUENCE (MAX, 1)
CUSTOMER_ACCT_NUM FIRST * , O(") CHARACTER
IS_DELETED NEXT * , O(") CHARACTER
SQL string for column : "CASE WHEN UPPER(:Is_Deleted) IN ('FALSE','0') THEN 0 ELSE 1 END"
NAME_PREFIX NEXT * , O(") CHARACTER
FIRST_NAME NEXT * , O(") CHARACTER
MIDDLE_NAME NEXT * , O(") CHARACTER
LAST_NAME NEXT * , O(") CHARACTER
NAME_SUFFIX NEXT * , O(") CHARACTER
NICK_NAME NEXT * , O(") CHARACTER
ALT_FIRST_NAME NEXT * , O(") CHARACTER
ALT_LAST_NAME NEXT * , O(") CHARACTER
MARKETING_SOURCE_ID NEXT * , O(") CHARACTER
HOME_PHONE NEXT * , O(") CHARACTER
WORK_PHONE NEXT * , O(") CHARACTER
MOBILE_PHONE NEXT * , O(") CHARACTER
ALTERNATE_PHONE NEXT * , O(") CHARACTER
EMAIL_ADDR NEXT * , O(") CHARACTER
ALT_EMAIL_ADDR NEXT * , O(") CHARACTER
BIRTH_DATE NEXT * , O(") CHARACTER
SQL string for column : "TRUNC(TO_DATE(:Birth_Date, 'MM/DD/YYYY HH24:MI:SS'))"
SALES_CHANNEL_ID NEXT * , O(") CHARACTER
SQL string for column : "decode(:Sales_Channel_id,NULL,NULL,NULL)"
ASSOCIATE_NUMBER NEXT * , O(") CHARACTER
ALT_ASSOCIATE_NUMBER NEXT * , O(") CHARACTER
UPDATE_LOCATION_CD NEXT * , O(") CHARACTER
UPDATE_DATE NEXT * , O(") CHARACTER
SQL string for column : "TRUNC(TO_DATE(:Update_Date, 'MM/DD/YYYY HH24:MI:SS'))"
CUSTOMER_LOGIN_NAME NEXT * , O(") CHARACTER
DISCOUNT_CD NEXT * , O(") CHARACTER
DISCOUNT_PERCENT NEXT * , O(") CHARACTER
BUSINESS_NAME NEXT * , O(") CHARACTER
POS_TAX_FLAG NEXT * , O(") CHARACTER
POS_TAX_PROMPT NEXT * , O(") CHARACTER
SQL string for column : "CASE WHEN UPPER(:POS_TAX_PROMPT) IN ('FALSE','0') THEN 0 ELSE 1 END"
POS_DEFAULT_TAX_ID NEXT * , O(") CHARACTER
POS_TAX_ID_EXPIRATION_DATE NEXT * , O(") CHARACTER
POS_AUTHORIZED_USER_FLAG NEXT * , O(") CHARACTER
SQL string for column : "CASE WHEN UPPER(:POS_AUTHORIZED_USER_FLAG) IN ('FALSE','0') THEN 0 ELSE 1 END"
POS_ALLOW_PURCHASE_ORDER_FLAG NEXT * , O(") CHARACTER
SQL string for column : "CASE WHEN UPPER(:POS_ALLOW_PURCHASE_ORDER_FLAG) IN ('FALSE','0') THEN 0 ELSE 1 END"
ADDRESS1 NEXT * , O(") CHARACTER
ADDRESS2 NEXT * , O(") CHARACTER
ADDRESS3 NEXT * , O(") CHARACTER
CITY NEXT * , O(") CHARACTER
STATE_CD NEXT * , O(") CHARACTER
POSTAL_CD NEXT * , O(") CHARACTER
COUNTRY_CD NEXT * , O(") CHARACTER
ALLOW_UPDATE NEXT * , O(") CHARACTER
SQL string for column : "CASE WHEN UPPER(:ALLOW_UPDATE) IN ('FALSE','0') THEN 0 ELSE 1 END"
ACTION_CODE NEXT * , O(") CHARACTER
SQL string for column : "CASE WHEN UPPER(:action_code) IN ('FALSE','0') THEN 0 ELSE 1 END"
ACCOUNT_TYPE_ID NEXT * , O(") CHARACTER
LOCALE_CD NEXT * , O(") CHARACTER
SQL string for column : "CASE WHEN :Locale_CD IS NOT NULL AND :Locale_CD LIKE '__-__' THEN :Locale_CD ELSE 'en-US' END"
IS_READY_FOR_PROCESSING CONSTANT
Value is '1'
IS_BUSINESS CONSTANT
Value is '0'
HAD_ERRORS CONSTANT
Value is '0'
Record 2373: Rejected - Error on table STAGE_CUSTOMER.
ORA-03113: end-of-file on communication channel
SQL*Loader-926: OCI error while uldlfca:OCIDirPathColArrayLoadStream for table STAGE_CUSTOMER
SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.
SQL*Loader-925: Error while uldlgs: OCIStmtExecute (ptc_hp)
ORA-03114: not connected to ORACLE
SQL*Loader-925: Error while uldlgs: OCIStmtFetch (ptc_hp)
ORA-24338: statement handle not executed
Table STAGE_CUSTOMER:
0 Rows successfully loaded.
1 Row not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Bind array size not used in direct path.
Column array rows : 5000
Stream buffer bytes: 256000
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 3469
Total logical records rejected: 1
Total logical records discarded: 0
Direct path multithreading optimization is disabled
Run began on Thu Apr 26 15:38:29 2007
Run ended on Thu Apr 26 15:38:30 2007
Elapsed time was: 00:00:01.18
CPU time was: 00:00:00.32 -
Maximum datafile size for sqlloader
Hi,
I would like to load data from a 4 GB xls file into an Oracle database using SQL*Loader. Could you please tell me the maximum datafile size that SQL*Loader can support?
Thanks
Venkat
Edited by: user12052260 on Jul 2, 2011 6:11 PM
I would like to load data from 4GB xls file into oracle database by using sql*loader. Could you please tell me the maximum datafile size that can support sql*loader?
You can post this question in the SQL*Loader forum. Close the thread here.
Export/Import/SQL Loader & External Tables -
ORA-1653 ERROR raised during SQL*LOADER
Product : SQL*PLUS
Date written : 2002-04-25
ORA-1653 raised while running SQL*LOADER
====================================
PURPOSE
The following explains how to deal with an ORA-1653 error raised while
running SQL*Loader.
Explanation
ORA-1653 is raised when a table cannot allocate a new extent because the
tablespace is out of space.
First check the error message to see which tablespace is named.
Then enlarge that tablespace using the following command:
ALTER TABLESPACE tablespace_name ADD DATAFILE '.....' size 100m;
However, if the tablespace is SYSTEM, the user's default tablespace is
probably not set, and a more fundamental fix is needed. In that case, do
not blindly keep growing the tablespace; instead create a default
tablespace and assign it to the user:
CREATE TABLESPACE tablespace_name datafile '...' size 100m;
ALTER USER user_name IDENTIFIED BY passwd
DEFAULT TABLESPACE tablespace_name
TEMPORARY TABLESPACE temp ;
After changing the user's default tablespace as above, re-create the table
in that default tablespace and run SQL*Loader again.
Reference Documents
--------------------
Hi,
You must have more than one tnsnames file or multiple installations of Oracle. What I suggest you do (as I'm sure is mentioned in Ed's link that you were already pointed at) is the following (I assume you are on Windows?):
open a command prompt
set TNS_ADMIN=PATH_TO_DIRECTORY_THAT_CONTAINS_CORRECT_TNSNAMES_FILE (i.e. something like set TNS_ADMIN=c:\oracle\network\admin)
This tells Oracle to use the config files it finds there and no others.
Then try sqlldr user/pass@db (in the same DOS window).
See if that connects and let us know.
Cheers,
Harry
http://dbaharrison.blogspot.com -
I have used the OMWB "Generate SQL*Loader script" option and received a SQL*Loader error.
The previous attempt to use OMWB online loading generated garbage data: the picture did not match the person id.
Table in Sql Server..................
CREATE TABLE [nilesh] (
[LargeObjectID] [int] NOT NULL ,
[LargeObject] [image] NULL ,
[ContentType] [varchar] (40) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[LargeObjectName] [varchar] (255) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[LargeObjectExtension] [varchar] (10) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[LargeObjectDescription] [varchar] (255) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[LargeObjectSize] [int] NULL ,
[VersionControl] [bit] NULL ,
[WhenLargeObjectLocked] [datetime] NULL ,
[WhoLargeObjectLocked] [char] (11) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[LargeObjectTimeStamp] [timestamp] NOT NULL ,
[LargeObjectOID] [uniqueidentifier] NOT NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO
Table in Oracle..............
CREATE TABLE LARGEOBJECT
(
LARGEOBJECTID NUMBER(10) NOT NULL,
LARGEOBJECT BLOB,
CONTENTTYPE VARCHAR2(40 BYTE),
LARGEOBJECTNAME VARCHAR2(255 BYTE),
LARGEOBJECTEXTENSION VARCHAR2(10 BYTE),
LARGEOBJECTDESCRIPTION VARCHAR2(255 BYTE),
LARGEOBJECTSIZE NUMBER(10),
VERSIONCONTROL NUMBER(1),
WHENLARGEOBJECTLOCKED DATE,
WHOLARGEOBJECTLOCKED CHAR(11 BYTE),
LARGEOBJECTTIMESTAMP NUMBER(8) NOT NULL,
LARGEOBJECTOID RAW(16) NOT NULL
)
TABLESPACE USERS
PCTUSED 0
PCTFREE 10
INITRANS 1
MAXTRANS 255
STORAGE (
INITIAL 64K
MINEXTENTS 1
MAXEXTENTS 2147483645
PCTINCREASE 0
BUFFER_POOL DEFAULT
)
LOGGING
NOCOMPRESS
LOB (LARGEOBJECT) STORE AS
( TABLESPACE USERS
ENABLE STORAGE IN ROW
CHUNK 8192
PCTVERSION 10
NOCACHE
STORAGE (
INITIAL 64K
MINEXTENTS 1
MAXEXTENTS 2147483645
PCTINCREASE 0
BUFFER_POOL DEFAULT
)
)
NOCACHE
NOPARALLEL
MONITORING;
Sql Loader script....
SET NLS_DATE_FORMAT=Mon dd YYYY HH:mi:ssAM
REM SET NLS_TIMESTAMP_FORMAT=Mon dd YYYY HH:mi:ss:ffAM
REM SET NLS_LANGUAGE=AL32UTF8
sqlldr cecildata/@ceciltst control=LARGEOBJECT.ctl log=LARGEOBJECT.log
Sql loader control file......
load data
infile 'nilesh.dat' "str '<er>'"
into table LARGEOBJECT
fields terminated by '<ec>'
trailing nullcols
(LARGEOBJECTID,
LARGEOBJECT CHAR(2000000) "HEXTORAW (:LARGEOBJECT)",
CONTENTTYPE "DECODE(:CONTENTTYPE, CHR(00), ' ', :CONTENTTYPE)",
LARGEOBJECTNAME CHAR(255) "DECODE(:LARGEOBJECTNAME, CHR(00), ' ', :LARGEOBJECTNAME)",
LARGEOBJECTEXTENSION "DECODE(:LARGEOBJECTEXTENSION, CHR(00), ' ', :LARGEOBJECTEXTENSION)",
LARGEOBJECTDESCRIPTION CHAR(255) "DECODE(:LARGEOBJECTDESCRIPTION, CHR(00), ' ', :LARGEOBJECTDESCRIPTION)",
LARGEOBJECTSIZE,
VERSIONCONTROL,
WHENLARGEOBJECTLOCKED,
WHOLARGEOBJECTLOCKED,
LARGEOBJECTTIMESTAMP,
LARGEOBJECTOID "GUID_MOVER(:LARGEOBJECTOID)")
Error Received...
Column Name Position Len Term Encl Datatype
LARGEOBJECTID FIRST * CHARACTER
Terminator string : '<ec>'
LARGEOBJECT NEXT ***** CHARACTER
Maximum field length is 2000000
Terminator string : '<ec>'
SQL string for column : "HEXTORAW (:LARGEOBJECT)"
CONTENTTYPE NEXT * CHARACTER
Terminator string : '<ec>'
SQL string for column : "DECODE(:CONTENTTYPE, CHR(00), ' ', :CONTENTTYPE)"
LARGEOBJECTNAME NEXT 255 CHARACTER
Terminator string : '<ec>'
SQL string for column : "DECODE(:LARGEOBJECTNAME, CHR(00), ' ', :LARGEOBJECTNAME)"
LARGEOBJECTEXTENSION NEXT * CHARACTER
Terminator string : '<ec>'
SQL string for column : "DECODE(:LARGEOBJECTEXTENSION, CHR(00), ' ', :LARGEOBJECTEXTENSION)"
LARGEOBJECTDESCRIPTION NEXT 255 CHARACTER
Terminator string : '<ec>'
SQL string for column : "DECODE(:LARGEOBJECTDESCRIPTION, CHR(00), ' ', :LARGEOBJECTDESCRIPTION)"
LARGEOBJECTSIZE NEXT * CHARACTER
Terminator string : '<ec>'
VERSIONCONTROL NEXT * CHARACTER
Terminator string : '<ec>'
WHENLARGEOBJECTLOCKED NEXT * CHARACTER
Terminator string : '<ec>'
WHOLARGEOBJECTLOCKED NEXT * CHARACTER
Terminator string : '<ec>'
LARGEOBJECTTIMESTAMP NEXT * CHARACTER
Terminator string : '<ec>'
LARGEOBJECTOID NEXT * CHARACTER
Terminator string : '<ec>'
SQL string for column : "GUID_MOVER(:LARGEOBJECTOID)"
SQL*Loader-309: No SQL string allowed as part of LARGEOBJECT field specification
what's the cause?
The previous attempt to use OMWB online loading has generated garbage data; the picture was not matching with the person id.
This is being worked on (bug 4119713). If you have a reproducible testcase please send it in (small testcases seem to work ok).
I have the following email about BLOBS I could forward to you if I have your email address:
[The forum may cut the lines in the wrong places]
Regards,
Turloch
Oracle Migration Workbench Team
Hi,
This may provide the solution. Without having the customer files here I can only guess at the problem. But this should help.
This email outlines a BLOB data move.
There are quite a few steps to complete the task of moving a large BLOB into the Oracle database.
Normally this wouldn't be a problem, but as far as we can tell SQL Server's (and possibly Sybase) BCP does not reliably export binary data.
The only way to export binary data properly via BCP is to export it in a HEX format.
Once in a HEX format it is difficult to get it back to binary during a data load into Oracle.
We have come up with the idea of getting the HEX values into Oracle by saving them in a CLOB (holds text) column.
We then convert the HEX values to binary values and insert them into the BLOB column.
The problem here is that the HEXTORAW function in Oracle only converts a maximum of 2000 HEX pairs.
We overcame this problem by writing our own procedure that will convert your HEX data to binary, bit by bit.
NOTE: YOU MUST MODIFY THE START.SQL AND FINISH.SQL TO SUIT YOUR CUSTOMER
The task is split into 4 sub tasks
1) CREATE A TABLESPACE TO HOLD ALL THE LOB DATA
--log into your system schema and create a tablespace
--Create a new tablespace for the CLOB and BLOB column (this may take a while to create)
--You may resize this to fit your data ,
--but I believe you have in excess of 500MB of data in this table and we are going to save it twice (in a clob then a blob)
--Note: This script may take some time to execute as it has to create a tablespace of 1000Mb.
-- Change this to suit your customer.
-- You can change this if you want depending on the size of your data
-- Remember that we save the data once as CLOB and then as BLOB
create tablespace lob_tablespace datafile 'lob_tablespace' SIZE 1000M AUTOEXTEND ON NEXT 50M;
LOG INTO YOUR TABLE SCHEMA IN ORACLE
--Modify this script to fit your requirements
2) START.SQL (this script will do the following tasks)
a) Modify your current schema so that it can accept HEX data
b) Modify your current schema so that it can hold that huge amount of data.
The new tablespace is used; you may want to alter this to your requirements
c) Disable triggers, indexes & primary keys on tblfiles
3)DATA MOVE
The data move now involves moving the HEX data in the .dat files to a CLOB.
The START.SQL script adds a new column to <tablename> called <blob_column>_CLOB.
This is where the HEX values will be stored.
MODIFY YOUR CONTROL FILE TO LOOK LIKE THIS
load data
infile '<tablename>.dat' "str '<er>'"
into table <tablename>
fields terminated by '<ec>'
trailing nullcols
<blob_column>_CLOB CHAR(200000000),
The important part being "_CLOB" appended to your BLOB column name and the datatype set to CHAR(200000000)
RUN sql_loader_script.bat
log into your schema to check if the data was loaded successfully
-- now you can see that the hex values were sent to the CLOB column
SQL> select dbms_lob.getlength(<blob_column>),dbms_lob.getlength(<blob_column>_clob) from <tablename>;
LOG INTO YOUR SCHEMA
4)FINISH.SQL (this script will do the following tasks)
a) Creates the procedure needed to perform the CLOB to BLOB transformation
b) Executes the procedure (this may take some time a 500Mb has to be converted to BLOB)
c) Alters the table back to its original form (removes the <blob_column>_clob)
b) Enables the triggers, indexes and primary keys
Regards,
(NAME)
-- START.SQL
-- Modify this for your particular customer
-- This should be executed in the user schema in Oracle that contains the table.
-- DESCRIPTION:
-- ALTERS THE OFFENDING TABLE SO THAT THE DATA MOVE CAN BE EXECUTED
-- DISABLES TRIGGERS, INDEXES AND SEQUENCES ON THE OFFENDING TABLE
-- 1) Add an extra column to hold the hex string
alter table <tablename> add (FILEBINARY_CLOB CLOB);
-- 2) Allow the BLOB column to accept NULLs
alter table <tablename> MODIFY FILEBINARY NULL;
-- 3) Disable triggers and sequences on tblfiles
alter trigger <triggername> disable;
alter table tblfiles drop primary key cascade;
drop index <indexname>;
-- 4) Allow the table to use the tablespace
alter table <tablename> move lob (<blob_column>) store as (tablespace lob_tablespace);
alter table tblfiles move lob (<blob_column>_clob) store as (tablespace lob_tablespace);
COMMIT;
-- END OF FILE
-- FINISH.SQL
-- Modify this for your particular customer
-- This should be executed in the table schema in Oracle.
-- DESCRIPTION:
-- MOVES THE DATA FROM CLOB TO BLOB
-- MODIFIES THE TABLE BACK TO ITS ORIGINAL SPEC (without a clob)
-- THEN ENABLES THE SEQUENCES, TRIGGERS AND INDEXES AGAIN
-- Currently we have the hex values saved as text in the <columnname>_CLOB column
-- And we have NULL in all rows for the <columnname> column.
-- We have to get BLOB locators for each row in the BLOB column
-- put empty blobs in the blob column
UPDATE <tablename> SET filebinary=EMPTY_BLOB();
COMMIT;
-- create the following procedure in your table schema
CREATE OR REPLACE PROCEDURE CLOBTOBLOB
AS
inputLength NUMBER; -- size of input CLOB
offSet NUMBER := 1;
pieceMaxSize NUMBER := 50; -- the max size of each piece
piece VARCHAR2(50); -- these pieces will make up the entire CLOB
currentPlace NUMBER := 1; -- this is where we are up to in the CLOB
blobLoc BLOB; -- blob locator in the table
clobLoc CLOB; -- clob locator pointing to the value from the dat file
-- THIS HAS TO BE CHANGED FOR SPECIFIC CUSTOMER TABLE AND COLUMN NAMES
CURSOR cur IS SELECT <blob_column>_clob clob_column, <blob_column> blob_column FROM /*table*/<tablename> FOR UPDATE;
cur_rec cur%ROWTYPE;
BEGIN
OPEN cur;
FETCH cur INTO cur_rec;
WHILE cur%FOUND
LOOP
--RETRIEVE THE clobLoc and blobLoc
clobLoc := cur_rec.clob_column;
blobLoc := cur_rec.blob_column;
currentPlace := 1; -- reset every time
-- find the length of the clob
inputLength := DBMS_LOB.getLength(clobLoc);
-- loop through each piece
LOOP
-- get the next piece of the clob
piece := DBMS_LOB.subStr(clobLoc,pieceMaxSize,currentPlace);
-- append this piece to the BLOB
DBMS_LOB.WRITEAPPEND(blobLoc, LENGTH(piece)/2, HEXTORAW(piece));
currentPlace := currentPlace + pieceMaxSize ;
EXIT WHEN inputLength < currentplace;
END LOOP;
FETCH cur INTO cur_rec;
END LOOP;
END CLOBtoBLOB;
-- now run the procedure
-- It will update the blob column with the correct binary representation of the clob column
EXEC CLOBtoBLOB;
-- drop the extra clob column
alter table <tablename> drop column <blob_column>_clob;
-- 2) apply the constraint we removed during the data load
alter table <tablename> MODIFY FILEBINARY NOT NULL;
-- Now re enable the triggers,indexs and primary keys
alter trigger <triggername> enable;
ALTER TABLE TBLFILES ADD ( CONSTRAINT <pkname> PRIMARY KEY ( <column>) ) ;
CREATE INDEX <index_name> ON TBLFILES ( <column> );
COMMIT;
-- END OF FILE -
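Outside the database, the piecewise HEXTORAW idea in CLOBTOBLOB can be illustrated with a short Python sketch; this is purely illustrative of the conversion logic, not part of the migration scripts themselves:

```python
def hex_clob_to_blob(hex_text, piece_max=50):
    """Convert a long hex string to bytes piece by piece, mirroring the
    PL/SQL loop that appends HEXTORAW(piece) to the BLOB locator.
    piece_max should be even so each slice is whole hex pairs."""
    blob = bytearray()
    for start in range(0, len(hex_text), piece_max):
        piece = hex_text[start:start + piece_max]
        blob += bytes.fromhex(piece)  # the HEXTORAW equivalent
    return bytes(blob)

# Round-trip demo: hex-encode some bytes (as BCP's HEX export would),
# then rebuild them piecewise.
data = b"SQL*Loader BLOB move demo"
hex_form = data.hex().upper()
assert hex_clob_to_blob(hex_form) == data
```

The PL/SQL version exists because HEXTORAW alone caps out at 2000 hex pairs per call; slicing the input sidesteps that limit.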
Sql loader direct path leaving indexes unusable
10.2.0.3
redhat 4.5
temp tablespace 6.5 GB
datafile size 33k
I thought that in version 10 you can do direct path loads without indexes going unusable?
I saw this, but we have plenty of temp space:
http://dbasolve.com/OracleError/ora/ORA-01502.txt
Oracle Error:
ORA-01502 Oracle Index in Unusable State
Cause and Solution:
When trying to perform query on Oracle tables with select SQL statement, Oracle returns this error.
The error indicates an attempt has been made to access an index or index partition that has been marked unusable by a direct load or by a DDL operation.
The problem usually happens when using direct path SQL*Loader, direct load, or DDL operations. These require enough temporary space to build all indexes on the table. If there is not enough space in the TEMP tablespace, all rows will still be loaded and imported, but the indexes are left with STATUS = 'INVALID'.
Invalid indexes can be checked with the SQL statement below.
SQL>SELECT * from USER_INDEXES WHERE STATUS = 'INVALID';
Solution to this error is simple. You can:
1. Drop the specified index and/or recreate the index
2. Rebuild the specified index
3. Rebuild the unusable index partition
Generally, the following SQL manipulation language will be able to rebuild the unusable index:
SQL>ALTER INDEX index_name REBUILD online
Hi Alan,
How big is this table and what sort of indexes are they?
An index update using SQL*Loader unrecoverable direct path load is achieved by an isolated sort followed by a nologging merge of the old index and the new mini-index into a new index segment (this according to seminar notes by Jonathan Lewis). This will conceivably take a long time for a large table / large index.
Performance improvements? Are you loading all records into a new partition?
Cheers,
Colin