SQL*Loader suggestion
LOAD DATA
INFILE '/home/dir1'
BADFILE 'abc.bad'
TRUNCATE
INTO TABLE tab1
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
(col1,
col2,
col3,
col4)
Here is my table tab1 structure:
col1 NUMBER
col2 NUMBER
col3 VARCHAR2
col4 VARCHAR2
When I load a record like 1,1,1,2 it also gets loaded, but I only want to load records like 1,1,a,b.
What I suspect is that the control file is wrong somewhere, so that it is accepting numeric data into the character columns of the tab1 table.
Can I change something in the control file so that it will accept data like (1,1,a,b) and not the (1,1,1,2) kind?
thanks
What do you want it to do when it comes across a row of data in the file such as
1,1,1,2
At the moment it treats the last two values as character data and, quite correctly, loads them into your table: '1' and '2' are perfectly valid strings.
Perhaps you want to be selective about your data. See this link...
http://www.orafaq.com/wiki/SQL*Loader_FAQ#Can_one_selectively_load_only_the_records_that_one_need.3F
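If the rule is simply "col3/col4 must never be purely numeric", one alternative to a WHEN clause (a sketch only; the constraint names and regex are made up, and REGEXP_LIKE needs Oracle 10g or later) is a CHECK constraint on the table. With a conventional-path load, rows that violate the constraint are written to the bad file instead of being loaded:

```sql
-- Reject values that consist only of digits; everything else loads normally
ALTER TABLE tab1 ADD CONSTRAINT tab1_col3_not_numeric
  CHECK (NOT REGEXP_LIKE(col3, '^[0-9]+$'));
ALTER TABLE tab1 ADD CONSTRAINT tab1_col4_not_numeric
  CHECK (NOT REGEXP_LIKE(col4, '^[0-9]+$'));
```

Rows like 1,1,1,2 then end up in abc.bad with an ORA-02290 logged, while 1,1,a,b loads cleanly.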
Similar Messages
-
Need suggestions for improving data load performance via SQL Loader
Hi,
Our requirement is to load 512 (1 GB each) files in Oracle database.
We are using SQL*Loader to load the files into the DB (a partitioned table) and have tried almost all the options SQL*Loader offers (direct path load, parallel=true, multithreading=true, unrecoverable).
As the table grows bigger, each file's load time increases. It started at 5 minutes per file and has now reached 2 hours per 3 files, and it keeps increasing with every batch. (Note: we load 3 files concurrently into the target table using the parallel=true option of SQL*Loader.)
Questions 1:
My problem is that somehow multithreading is not working for us (we have multi CPU server and have enabled multithreading=true). Could it be something to do with DB setting which might be hindering the data load to be done in multiple threads?
Question 2:
Would gathering stats on the target table and its partitions help improve load performance? I'm not sure whether stats improve DMLs; they definitely improve SQL queries. Any thoughts?
Question 3:
What would be the best strategy to gather stats on this table (which would end up having 512 GB data) ?
Question 4:
Do you think insertions into a partitioned table (with growing sizes) would perform worse than into a non-partitioned table?
Any other suggestions to improve performance are most welcome!
Thanks,
Sachin
Edited by: Sachin Tiwari on Mar 13, 2013 6:29 AM
Two hours to load just 3 GB of data seems unreasonable regardless of the SQL*Loader settings. It seems likely to me that the problem is not with SQL*Loader but somewhere else.
Have you generated a Statspack/ AWR/ ASH report to see where all that time is being spent? Are there triggers on the table? Are there bitmap indexes?
Is your table partitioned in a way that is designed to improve the efficiency of loads, so that all the data from one file goes into one partition? Or is data from each file getting inserted into many different partitions?
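On questions 2 and 3: statistics won't speed up the inserts themselves, but for a table heading toward 512 GB, incremental statistics (11g and later) keep stats gathering cheap by deriving global stats from per-partition synopses instead of rescanning the whole table. A sketch, with a made-up table name:

```sql
-- Build global stats from partition-level synopses instead of full scans
EXEC DBMS_STATS.SET_TABLE_PREFS(USER, 'BIG_PART_TAB', 'INCREMENTAL', 'TRUE');

-- After each batch, only the freshly loaded partitions are re-analysed
EXEC DBMS_STATS.GATHER_TABLE_STATS(USER, 'BIG_PART_TAB', granularity => 'AUTO');
```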
Justin -
Need suggestions on loading 5000+ files using sql loader
Hi Guys,
I'm writing a shell script to load more than 5000 files using sql loader.
My intention is to load the files in parallel. When I checked the maximum number of sessions in v$parameter, it is around 700.
Before starting the data load, I programmatically get the number of current sessions and the maximum number of sessions, keep 200 sessions free (max. no. of sessions minus 200), and use the remaining ~300 sessions to load the files in parallel.
I am also using a "wait" option to make the shell wait until the 300 concurrent SQL*Loader processes complete before moving on.
Is there any way to make this more efficient? Also, is it possible to reduce the wait time without hard-coding the seconds? (For example: if any of those 300 sessions becomes free, assign the next file to the job queue, and so on.)
Please share your thoughts on this.
Thanks.
Manohar wrote:
I'm writing a shell script to load more than 5000 files using sql loader.
My intention is to load the files in parallel. When I checked the maximum number of sessions in v$parameter, it is around 700.
Concurrent load, you mean? Parallel processing implies taking a workload, breaking it up into smaller workloads, and doing those in parallel. This is what the Parallel Query feature does in Oracle.
SQL*Loader does not do that for you. It uses a single session to load a single file. To make it run in parallel requires manually starting multiple loader sessions and performing concurrent loads.
Have a look at Parallel Data Loading Models in the Oracle® Database Utilities guide. It goes into detail on how to perform concurrent loads. But you need to parallelise the workload yourself (as explained in the manual).
Before starting the data load, programmatically I am getting the number of current sessions and maximum number of sessions and keeping free 200 sessions without using them (max. no. of sessions minus 200 ) and utilizing the remaining ~300 sessions to load the files in parallel.
Also I am using a "wait" option to make the shell to wait until the 300 concurrent sql loader process to complete and moving further.
Is there any way to make it more efficient? Also is it possible to reduce the wait time without hard coding the seconds (For Example: If any of those 300 sessions becomes free, assign the next file to the job queue and so on..)
Consider doing it the way that Parallel Query does (as I've mentioned above). Take the workload (all files). Break it up into smaller sub-workloads (e.g. 50 files per process). Start 100 processes in parallel and give each one a sub-workload to do (100 processes, each loading 50-odd files).
This is a lot easier to manage than starting, for example, 5000 load processes and then trying some kind of delay method to ensure that they do not all hit the database at the same time.
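The "assign the next file as soon as a session frees up" behaviour can also be had for free from the shell: `xargs -P N` keeps at most N jobs running and dispatches the next file the moment a slot opens, with no hand-rolled sleep/wait loop. A sketch (paths and file names are placeholders; in real use, replace the echo with the actual sqlldr call):

```shell
#!/bin/sh
# Fixed worker-pool pattern with xargs -P: at most 3 loads run at once,
# and the next file starts the instant a slot frees.
# Real call would be something like:
#   sqlldr user/pass@db control=generic.ctl data={}
mkdir -p /tmp/pool_demo
cd /tmp/pool_demo || exit 1
for i in 1 2 3 4 5 6; do : > "file$i.dat"; done
ls *.dat | xargs -P 3 -I{} echo "would load {}"
```

One process per file with a bounded pool is the same model the reply above describes, just with the queueing delegated to xargs.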
I'm loading about 100+ files (3+ million rows) every 60 seconds 24x7 using SQL*Loader. Oracle is quite scalable and SQL*Loader quite capable. -
SQL*Loader and DECODE function
Hi All,
I am loading data from data files into oracle tables and while loading the data using SQL*Loader, the following requirement needs to be fulfilled.
1) If OQPR < 300, RB = $ 0-299, SC = "SC1"
2) If 300 < OQPR < 1200, RB = $ 300-1199, SC = "SC2"
3) If 1200 < OQPR < 3000, RB = $ 1200-2999, SC = "SC3"
4) If OQPR > 3000 USD, RB = > $3000, SC = "SC4"
Here OQPR is a field in the data file.
Can anyone suggest how do we handle this using DECODE function? Triggers and PL/SQL functions are not to be used.
TIA.
Regards,
Ravi.
The following expression gives you different values for your different intervals and boundaries:
SIGN(:OQPR - 300) + SIGN(:OQPR - 1200) + SIGN(:OQPR - 3000) -
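For the record, that sum evaluates to -3, -1, 1 or 3 inside the four intervals (and to -2, 0 or 2 exactly on a boundary, which the original conditions leave ambiguous), so wrapping it in a DECODE gives the full mapping in the control file. A sketch only, using the column names from the post:

```sql
RB "DECODE(SIGN(:OQPR - 300) + SIGN(:OQPR - 1200) + SIGN(:OQPR - 3000),
           -3, '$ 0-299',
           -1, '$ 300-1199',
            1, '$ 1200-2999',
            3, '> $3000')",
SC "DECODE(SIGN(:OQPR - 300) + SIGN(:OQPR - 1200) + SIGN(:OQPR - 3000),
           -3, 'SC1', -1, 'SC2', 1, 'SC3', 3, 'SC4')"
```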
How to have more than one condition on same column --- using SQL Loader
Hi All,
I am stuck with SQL*Loader.
How do I filter records before loading them into the table using the WHEN clause?
I should load data only when
field1 = 'AC' or 'VC'
and field2 is NULL.
I used various combinations in the WHEN clause, like
a) when field1='AC' or field1='VC' and field2 = BLANKS
b) when (field1='AC') and (field2 = BLANKS)
and similar.
In all the cases I tried, I could not implement an OR condition on field1 together with a null condition on field2.
But my main concern is: can we use OR or IS NULL at all in the WHEN clause of SQL*Loader?
Is it possible to check this anywhere?
Any alternate solution you could suggest?
Thanks
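For what it's worth, SQL*Loader's WHEN clause only ANDs simple comparisons; there is no OR and no IS NULL (comparing to BLANKS is the closest thing). The usual workaround is one INTO TABLE block per alternative, along these lines (a positional sketch with made-up positions, not the poster's actual layout):

```sql
LOAD DATA
INFILE 'in.dat'
APPEND
INTO TABLE target
  WHEN (1:2) = 'AC' AND (10:12) = BLANKS
  (field1 POSITION(1:2), field2 POSITION(10:12))
INTO TABLE target
  WHEN (1:2) = 'VC' AND (10:12) = BLANKS
  (field1 POSITION(1:2), field2 POSITION(10:12))
```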
Dikshit
-
SQL*Loader-704 and ORA-12154: error messages when trying to load data with SQL*Loader
I have a database with two tables that is used by Apex 4.2. One table has 800,000 records; the other has 7 million records.
The client recently upgraded from Apex 3.2 to Apex 4.2. We exported/imported the data to the new location with no problems.
The source of the data is an old mainframe system; I needed to make changes to the source data and then load the tables.
The first time I loaded the data I did it from a command line with SQL*Loader.
Now when I try to load the data I get this message:
SQL*Loader-704: Internal error: ulconnect OCISERVERATTACH
ORA-12154: tns:could not resolve the connect identifier specified
I've searched for postings on these error message and they all seem to say that SQL Ldr can't find my TNSNAMES file.
I am able to connect and load data with SQL Developer; so SQL developer is able to find the TNSNAMES file
However SQL Developer will not let me load a file this big
I have also tried to load the file within Apex (SQL Workshop/ Utilities) but again, the file is too big.
So it seems like SQL Loader is the only option
I did find one post online that said to set an environment variable with the path to the TNSNAMES file, but that didn't work.
Not sure what else to try or where to look
thanks
Hi,
You must have more than one tnsnames file or multiple installations of Oracle. What I suggest you do (as I'm sure is mentioned in ed's link that you were already pointed at) is the following (I assume you are on Windows?):
open a command prompt
set TNS_ADMIN=PATH_TO_DIRECTORY_THAT_CONTAINS_CORRECT_TNSNAMES_FILE (i.e. something like set TNS_ADMIN=c:\oracle\network\admin)
This tells Oracle to use the config files found there and no others.
Then try sqlldr user/pass@db (in the same DOS window).
See if that connects and let us know.
Cheers,
Harry
http://dbaharrison.blogspot.com -
CSV FILES DON'T LOAD WITH THE RIGHT DATA USING SQL LOADER
Hi pals, I have the following information in csv file:
MEXICO,Seretide_Q110,2010_SEE_01,Sales Line,OBJECTIVE,MEXICO,Q110,11/01/2010,02/04/2010,Activo,,,MEXICO
MEXICO,Seretide_Q210,2010_SEE_02,Sales Line,OBJECTIVE,MEXICO,Q210,05/04/2010,25/06/2010,Activo,,,MEXICO
When I use SQL*Loader the data is loaded as follows:
EXICO,Seretide_Q110,2010_SEE_01,Sales Line,OBJECTIVE,MEXICO,Q110,11/01/2010,02/04/2010,Activo,,,MEXICO
And for the next data in a csv file too:
MX_001,MEXICO,ASMA,20105912,Not Verified,General,,RH469364,RH469364,Change Request,,,,,,,Y,MEXICO,RH469364
MX_002,MEXICO,ASMA,30094612,Verified,General,,LCS1405,LCS1405,Change Request,,,,,,,Y,MEXICO,LCS1405
the data is loaded as follow:
X_001,MEXICO,ASMA,20105912,Not Verified,General,,RH469364,RH469364,Change Request,,,,,,,Y,MEXICO,RH469364
X_002,MEXICO,ASMA,30094612,Verified,General,,LCS1405,LCS1405,Change Request,,,,,,,Y,MEXICO,LCS1405
The first character is truncated, and this happens with all my data. Any suggestions? I really hope you can help me.
Edited by: user11260938 on 11/06/2009 02:17 PM
Edited by: Mariots on 12/06/2009 09:37 AM
Your table and view don't make sense, so I created a "dummy" table to match your .ctl file.
SQL> create table CCI_SRC_MX
2 (ORG_BU varchar2(30)
3 ,name varchar2(30)
4 ,src_num varchar2(30)
5 ,src_cd varchar2(30)
6 ,sub_type varchar2(30)
7 ,period_bu varchar2(30)
8 ,period_name varchar2(30)
9 ,prog_start_dt date
10 ,prog_end_dt date
11 ,status_cd varchar2(30)
12 ,X_ACTUALS_CALC_DATE date
13 ,X_ACTUAL_UPDATE_SRC varchar2(30)
14 ,prod_bu varchar2(30)
15 ,ROW_ID NUMBER(15,0)
16 ,IF_ROW_STAT VARCHAR2(90)
17 ,JOB_ID NUMBER(15,0)
18 );
Table created.
SQL> create sequence GSK_GENERAL_SEQ;
Sequence created.
I simplified your .ctl file and moved all the constant and sequence stuff to the end. I also changed the format masks to match the dates in your data.
LOAD DATA
INFILE 'SBSLSLT.txt'
BADFILE 'SBSLSLT.bad'
DISCARDFILE 'SBSLSLT.dis'
APPEND
INTO TABLE CCI_SRC_MX
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(ORG_BU
,NAME
,SRC_NUM
,SRC_CD
,SUB_TYPE
,PERIOD_BU
,PERIOD_NAME
,PROG_START_DT DATE 'dd/mm/yyyy'
,PROG_END_DT DATE 'dd/mm/yyyy'
,STATUS_CD
,X_ACTUALS_CALC_DATE DATE 'dd/mm/yyyy'
,X_ACTUAL_UPDATE_SRC
,PROD_BU
,row_id "GSK_GENERAL_SEQ.nextval"
,if_row_stat CONSTANT 'UPLOADED'
,job_id constant 36889106
)
When I run SQL*Loader, I get this:
SQL> select * from CCI_SRC_MX;
ORG_BU NAME SRC_NUM SRC_CD SUB_TYPE PERIOD_BU PERIOD_NAME PROG_START_DT PROG_END_DT STATUS_CD PROD_BU ROW_ID IF_ROW_STAT JOB_ID
MEXICO Seretide_Q110 2010_SEE_01 Sales Line OBJECTIVE MEXICO Q110 11-JAN-2010 00:00:00 02-APR-2010 00:00:00 Activo MEXICO 1 UPLOADED 36889106
MEXICO Seretide_Q210 2010_SEE_02 Sales Line OBJECTIVE MEXICO Q210 05-APR-2010 00:00:00 25-JUN-2010 00:00:00 Activo MEXICO 2 UPLOADED 36889106
-
SQL Loader need to insert the input filename into output table
Hi All,
I have a small problem with my SQL*Loader job. My SQL*Loader should read a file and write the data into a table. It should also capture the filename and write it into the same table. Here's my control file:
LOAD DATA
APPEND
INTO TABLE XXMW_STG_SOH_HEADER_UK
WHEN (1:4) = '7010'
TRAILING NULLCOLS
(WH POSITION(5:6)
,ITEM POSITION(9:26)
,PRODUCT_STATUS POSITION(33:34)
,BALANCE_ON_HAND POSITION(35:43)
,TO_SHIP_QTY POSITION(71:79)
,RUSH_TO_SHIP_QTY POSITION(80:88)
,RESERVED_QTY POSITION(175:183)
,SNAPSHOT_DATE POSITION(134:143) CHAR "TO_DATE(:SNAPSHOT_DATE,'YYYY-MM-DD')"
,SNAPSHOT_TIME POSITION(144:151) CHAR "TO_DATE(:SNAPSHOT_TIME,'HH24:MI:SS')"
,PROCESSED_IND CONSTANT "N"
,PROCESSED_DATETIME SYSDATE
,FILENAME POSITION(184) CHAR TERMINATED BY WHITESPACE
)
My program should read the filename dynamically (meaning a shell script calls this .ctl file and feeds it multiple input files) and insert it into the filename field of the xxmw_stg_soh_header_uk table.
Please let me know for any questions/clarifications.
Regards,
Debabrata
While I think Blu's suggestion to use external tables is better, if you need to use SQL*Loader, you could do something like this.
Create a "generic" control file with a placeholder for the filename, something like:
LOAD DATA
APPEND
INTO TABLE XXMW_STG_SOH_HEADER_UK
WHEN (1:4) = '7010'
TRAILING NULLCOLS (
WH POSITION(5:6),
ITEM POSITION(9:26),
PRODUCT_STATUS POSITION(33:34),
BALANCE_ON_HAND POSITION(35:43),
TO_SHIP_QTY POSITION(71:79),
RUSH_TO_SHIP_QTY POSITION(80:88),
RESERVED_QTY POSITION(175:183),
SNAPSHOT_DATE POSITION(134:143) CHAR "TO_DATE(:SNAPSHOT_DATE,'YYYY-MM-DD')",
SNAPSHOT_TIME POSITION(144:151) CHAR "TO_DATE(:SNAPSHOT_TIME,'HH24:MI:SS')",
PROCESSED_IND CONSTANT "N",
PROCESSED_DATETIME SYSDATE,
FILENAME CONSTANT ":FILE"I am assuming that your shell script is looping through a set of file names and loading each one. So make your shell script look something like:
FILES=`ls *.txt`
CTL=generic.CTL
for f in $FILES
do
cat $CTL| sed "s/:FILE/$f/g" > $f.ctl
sqlldr usr/passwd control=$f.ctl data=$f
done
The line cat $CTL | sed "s/:FILE/$f/g" > $f.ctl will create a "custom" control file for each file and add the filename as a constant at the end.
John -
ORA-00936 error from SQL expression in SQL*Loader script
I am getting the above error on the following line in my SQL*Loader script:
DIA_CLM_RES_OID DECIMAL EXTERNAL
"SELECT N_ORG_ENTY_ID FROM TESTG4.ORG_ENTITY
WHERE N_USER_ID =
(SELECT UNIQUE WSR_NT_ID FROM CONV_CLM_RESOURCE
WHERE CLM_RES_OID = :DIA_CLM_RES_OID)",
What I am basically trying to do is a 2-table lookup of a value:
1. Find a row in table CONV_CLM_RESOURCE where the value in column CLM_RES_OID matches the value in the input file in field DIA_CLM_RES_OID.
2. Take the value of field WSR_NT_ID from that row and use it to find a row in table TESTG4.ORG_ENTITY.
3. Take the value of field WSR_NT_ID from that row and set it in the target table in field DIA_CLM_RES_OID.
In other words, I am essentially trying to translate the input value by using two other tables to lookup the value to translate to. However, no matter how I arrange it, I keep getting the "ORA-00936: missing expression" error on this statement.
Can anyone see what I am doing wrong, or perhaps suggest a better way of accomplishing a two-table translation of a value?
Thanks!
Still not sure why this doesn't work, but I was able to create and use a function to do this instead, which is probably a better approach anyway.
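The likely cause of the ORA-00936 is that a SQL*Loader column expression must be a plain SQL expression; a full SELECT statement is not accepted there. The function-based workaround mentioned above could look like this (the function name is made up; table and column names come from the post):

```sql
CREATE OR REPLACE FUNCTION lookup_res_oid (p_oid IN NUMBER)
  RETURN NUMBER
IS
  v_id testg4.org_entity.n_org_enty_id%TYPE;
BEGIN
  -- Two-table lookup: input OID -> WSR_NT_ID -> N_ORG_ENTY_ID
  SELECT oe.n_org_enty_id
    INTO v_id
    FROM testg4.org_entity oe
   WHERE oe.n_user_id = (SELECT DISTINCT wsr_nt_id
                           FROM conv_clm_resource
                          WHERE clm_res_oid = p_oid);
  RETURN v_id;
END;
/
```

The control file entry then shrinks to a simple expression:
DIA_CLM_RES_OID DECIMAL EXTERNAL "lookup_res_oid(:DIA_CLM_RES_OID)",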
-
I am running SQL*Loader 8.1 in an NT4-SP5 environment and getting a 282 - unable to locate character set. The OTN reference on NLS is not clear on what to run to get SQL*Loader to load a simple ASCII text file with a LOAD/APPEND statement. Any suggestions?
In control file replace:
MATUR_DT POSITION(78:85) DATE 'YYYYMMDD',
OR
On WinDoze, edit registry and set NLS_DATE_FORMAT to YYYYMMDD
On *nix set NLS_DATE_FORMAT environment variable to YYYYMMDD -
I have problem and your help to solve it would be very much appreciated.
I am uploading a text file with SQL*Loader into a table. Since I used the APPEND option in the loader, I don't want records to be duplicated, so I wrote a BEFORE INSERT ... FOR EACH ROW trigger to check whether the row already exists.
For example, let us consider a table TEST as follows.
Fld1 NUMBER(2);
Fld2 VARCHAR2(10);
Fld3 VARCHAR2(10);
I have a trigger on this table.
CREATE OR REPLACE TRIGGER Trg_Bef_Insert_Test
BEFORE INSERT ON Test FOR EACH ROW
DECLARE
vCount NUMBER(2);
DuplicateRow EXCEPTION;
BEGIN
SELECT Count(*) INTO vCount FROM Test
WHERE fld1 || fld2 || fld3 = :new.fld1 || :new.fld2 || :new.fld3;
IF vCount > 0 THEN
RAISE DuplicateRow;
END IF;
EXCEPTION
WHEN DuplicateRow THEN
Raise_Application_Error (-20001,'Record already exists');
WHEN OTHERS THEN
DBMS_OUTPUT.PUT_LINE('ERROR : ' || SQLCODE || '; ' || SUBSTR(SQLERRM, 1, 150));
END;
Please refer to the following SQL statements which I executed in the SQL Plus.
SQL> insert into test values (1,'one','first');
1 row created.
SQL> insert into test values (1,'one','first');
insert into test values (1,'one','first')
ERROR at line 1:
ORA-20001: Record already exists
ORA-06512: at "CAMELLIA.TRG_TEST", line 13
ORA-04088: error during execution of trigger 'CAMELLIA.TRG_TEST'
Would anyone tell me why do errors -6512 and -4088 occur ?
Also, if you have any other suggestion to handle this situation, please let me know.
By the way, I am using Oracle 8.1.7.
Thank you.
There are a few things wrong here. The ORA-06512 and ORA-04088 lines are just the error stack: ORA-06512 reports where in the trigger the unhandled exception was raised, and ORA-04088 reports that the trigger as a whole failed, which is exactly what your RAISE_APPLICATION_ERROR asked for. More importantly, you should really use a unique constraint for this.
SQL> create table t (a number, b number, c number,
2 constraint uk unique (a, b, c));
Table created.
Here's an example data file with 12 records, three of which are duplicates.
1,2,3
3,4,5
6,7,8
3,2,1
5,5,5
3,4,5
3,2,1
1,1,1
2,2,2
6,7,8
8,8,8
9,9,9
And a control file
load data
infile 'in.dat'
append
into table t
fields terminated by ',' optionally enclosed by '"'
(a, b, c)
Running it with SQL*Loader inserts the nine records, outputs the three duplicates to a .bad file and logs all the errors in the .log file. No need for triggers or any code.
$ sqlldr control=in.ctl
Username:xxx
Password:
SQL*Loader: Release 9.2.0.1.0 - Production on Mon Apr 21 23:16:44 2003
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Commit point reached - logical record count 12
$ cat in.bad
3,4,5
3,2,1
6,7,8
SQL> select * from t;
A B C
1 2 3
3 4 5
6 7 8
3 2 1
5 5 5
1 1 1
2 2 2
8 8 8
9 9 9
9 rows selected. -
SQL Loader and Error ORA-01847/ORA-01839
Hi,
While using direct path loading in SQL*Loader, when we get ORA-01847/ORA-01839 all the other records get errored out as well. It works fine with conventional loading.
Should I use some parameter or setting to make sure that the other records are not rejected when we get an ORA-01847/ORA-01839 error with DIRECT loading?
Thanks
Jibin
On the internet I found this short message:
“AL32UTF8 is a multi-byte character set, which means some characters are stored in more than one byte; that's true for these special characters.
If you have same table definitions in both databases you likely face error ORA-12899.
This metalink note discusses this problem, it's also applicable to sqlloader:
Import reports "ORA-12899: Value too large for column" when using BYTE semantic
Doc ID: Note:563893.1”
Via Metalink, I can see the Note is linked to an Oracle internal bug for Oracle 11g...
I'm waiting for your suggestions... thanks very much in advance.
Regards.
Giovanni -
hi,
I want to insert 100,000 records daily into a table for the first month, and then the next month these records are going to be replaced by new, updated records.
There might be a few additions and deletions to the previous records as well.
It's consumer data, so there might be a few consumers who have withdrawn from the utility, and some more consumers added to the database.
But almost 99% of the previous month's data has to be updated/replaced with the fresh month's data.
What I have in mind is that I will use SQL*Loader to load data for the first month, then delete the previous data using SQL*Plus and load the fresh month's data using SQL*Loader again.
1. Is this OK, or is there some better solution?
2. I have heard of external files; are they feasible in my scenario?
3. I have planned to make scripts for SQL*Plus and SQL*Loader and use them in batch files (OS: Windows 2003 Server, Oracle 9i database). Is there some better choice to make the whole procedure automatic?
looking for your suggestions
nadeem ameer
I would suggest you use external tables, since they are more flexible than SQL*Loader and a better option.
For using external tables:
1) You will have to create a directory first.
2) Creation of the directory is generally done by SYS; after creating it, grant read and write privileges on the directory to the user.
3) Create the external table.
4) Now use the table as a normal table to insert, update and delete.
You can get more information from
http://www.oracle-base.com/articles/9i/SQLNewFeatures9i.php#ExternalTables
CREATE DIRECTORY <directory_name> AS '<directory path where the file is present>';
GRANT READ, WRITE ON DIRECTORY <directory_name> TO <username>;
CREATE TABLE <table_name>
(<column names>)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY <directory_name>
  ACCESS PARAMETERS
  ( RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ',' )
  LOCATION ('<filename>')
)
PARALLEL 5
REJECT LIMIT 200;
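For the monthly replace itself, once the fresh file is exposed as an external table, a MERGE plus one DELETE covers the update/insert/withdrawn cases in two statements (MERGE is available from 9i). A sketch with made-up table and column names:

```sql
MERGE INTO consumers c
USING consumers_ext e                 -- external table over this month's file
   ON (c.consumer_id = e.consumer_id)
WHEN MATCHED THEN
  UPDATE SET c.name = e.name, c.usage_kwh = e.usage_kwh
WHEN NOT MATCHED THEN
  INSERT (consumer_id, name, usage_kwh)
  VALUES (e.consumer_id, e.name, e.usage_kwh);

-- Consumers who withdrew: present last month, absent from the new file
DELETE FROM consumers c
 WHERE NOT EXISTS (SELECT 1 FROM consumers_ext e
                    WHERE e.consumer_id = c.consumer_id);
```

This avoids the delete-everything-then-reload window in which the table would be empty.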
Hope this helps. -
Substitution for logical OR usage in control file of sql loader
Hi,
can anyone suggest a substitution method to get the functionality of logical OR in a control file developed for SQL*Loader?
Ex:
load data
append
into table ABC
when (1:2) = 'AD'
--AND ((27:28)= '01' OR (27:28)= '02')
AND (1222:1222) = '1'
trailing nullcols
Note: the condition commented out in the above example needs to be replaced.
One way of doing it is splitting into blocks, one per condition.
Then it would look like:
load data
append
into table ABC
when (1:2) = 'AD'
AND (27:28)= '01'
AND (1222:1222) = '1'
trailing nullcols
into table ABC
when (1:2) = 'AD'
AND (27:28)= '02'
AND (1222:1222) = '1'
trailing nullcols
So I'm looking for a better way than this, as I cannot work with the above splitting logic because I'm dealing with a lot of conditions.
Thanks in advance
Kishore
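Another route, in the spirit of the external-table suggestions elsewhere in this thread list: define the flat file as an external table and do the filtering in plain SQL, where OR (or IN) is available. A sketch with made-up names; rec_type, code and flag stand in for positions (1:2), (27:28) and (1222:1222):

```sql
INSERT INTO abc
SELECT *
  FROM abc_ext                 -- ORACLE_LOADER external table over the file
 WHERE rec_type = 'AD'
   AND code IN ('01', '02')    -- the OR that the WHEN clause cannot express
   AND flag = '1';
```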
SQL*LOADER/SQL usage in Migration
I have very limited migration requirements. I DO NOT need to
migrate a database. I DO need to change some SQL and BCP load
scripts from SQL-SERVER 6.5 to their equivalents in ORACLE 8.0.5.
For this limited purpose, should I proceed to handcode these, or
would the workbench be of use to me?
Thanks for your help.
The migration workbench does, as part of the migration, generate the BCP and SQL*Loader files required to migrate a database. However, since you already have the BCP files created, the Workbench would not be able to generate just the other side of the picture (the SQL*Loader files). I can suggest the following to you:
1. Perhaps use the Workbench to run a tiny migration that would show you how we generate the SQL*Loader scripts. It is fairly straightforward; however, we need to do some manipulation on dates.
2. There is a chapter on SQL*Loader as part of the Oracle8i
documentation set.
Chapter 3 "SQL*Loader Concepts"
Oracle8i Utilities, Release 8.1.5
A67792-01
Regards,
Marie
Raja Marla (guest) wrote:
: I have very limited migration requirements. I DO NOT need to
: migrate a database. I DO need to change some SQL and BCP load
: scripts from SQL-SERVER 6.5 to their equivalents in ORACLE 8.0.5.
: For this limited purpose, should I proceed to handcode these, or
: would the workbench be of use to me?
: Thanks for your help.
Oracle Technology Network
http://technet.oracle.com