Control File Help
Hi,
I need help with a SQL*Loader control file:
There is a field (column) called customer line number. Assuming we are uploading from an Excel sheet, if that value is null then it should be populated from a sequence, in the format 1, 2, 3.
Kindly help me make this change in the control file.
regards,
If gaps aren't an issue, you can use a sequence with an external table:
SQL> select col1,
            NVL(col2, new_seq.NEXTVAL) new_col2,
            col3
     from ext_tab;

      COL1   NEW_COL2       COL3
       101        201        301
       102          2        302
       103        203        303
       104          4        304

In this example COL2 is null in the second and fourth rows. Note that the first and third rows still increment the sequence even though its value is not used by the NVL function.
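A fuller end-to-end sketch of the same approach; the sequence, target table and column names here are illustrative, not from the thread:

```sql
-- Hypothetical names: ext_tab is an external table over the uploaded
-- file, customers is the target table, new_seq supplies line numbers.
CREATE SEQUENCE new_seq START WITH 1 INCREMENT BY 1;

-- Fill in customer_line_number only where the file left it null.
-- NEXTVAL is referenced for every row, so gaps appear for rows
-- that already had a value (hence "if gaps aren't an issue").
INSERT INTO customers (col1, customer_line_number, col3)
SELECT col1,
       NVL(col2, new_seq.NEXTVAL),
       col3
FROM   ext_tab;
COMMIT;
```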
Similar Messages
-
SQL Loader Control file help!!!
Hi All,
I am in the process of writing a SQL*Loader control file for an activity I am performing and would appreciate your input.
I have a table patient containing 44 columns. One column, SEC_LANG_NAME, has to be populated with data from an Excel file.
The Excel file contains 3 columns: PATIENT_ID, NAME, SEC_LANG_NAME. I just want to load the values in the SEC_LANG_NAME column of the Excel file into the SEC_LANG_NAME column of the patient table using SQL*Loader, with the condition that PATIENT_ID in the Excel file must match PATIENT_ID in the patient table.
As a first step I am converting the Excel file into a CSV file, and will then run sqlldr once the control file is done. Can someone please help me with the format of the control file for this activity?
thanks in advance,
regards,
Edited by: user10243788 on Jan 3, 2010 12:09 AM
Hello user10243788,
It appears that your intent is to update an existing table that already contains the key data and is only missing one column of information. SQL*Loader itself is not built to do this; what it can do is load records into tables. You will have to use a two-step approach similar to what Srini has suggested:
1) If your process can run local to the database server, an external table is a great option. If it must run remote to the database server, SQL*Loader will let you load your data file into a staging table.
2) You will then need to run a process to update the PATIENT table.
Here is a start of a control file based on the details that you've provided:
LOAD DATA
REPLACE
INTO TABLE patient_enhance
FIELDS TERMINATED BY ","
( patient_id
, name FILLER
, sec_lang_name
)
Hope this helps,
Luke
Please mark the answer as helpful or answered if it is so. If not, provide additional details.
Always try to provide create table and insert table statements to help the forum members help you better.
Edited by: Luke Mackey on Jan 5, 2010 12:30 PM
Oops, you will have to load patient_id as well in order to do the update. -
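The second step Luke describes (updating PATIENT from the loaded staging table) might look like the sketch below; patient_enhance and the column names come from the thread, but the MERGE itself is illustrative, not Luke's posted code:

```sql
-- Assumes patient_enhance was populated by SQL*Loader with
-- (patient_id, sec_lang_name), and patient is the real table.
MERGE INTO patient p
USING patient_enhance e
ON (p.patient_id = e.patient_id)
WHEN MATCHED THEN
  UPDATE SET p.sec_lang_name = e.sec_lang_name;
COMMIT;
```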
ORA-19571: archived-log recid 118360 stamp not found in control file - help
Hi
Every backup I run ends with an error like the one below, and each time it complains about a DIFFERENT archive log.
ORA-19571: archived-log recid 118360 stamp 705446770 not found in control file.
I have set the control file record keep time to 14 days, and my retention is 7 days in RMAN, so I don't know why it's complaining. Below is the full log.
It seems to be having problems with archive logs that were generated while the backup was running, I think.
channel d3: specifying archive log(s) in backup set
input archive log thread=1 sequence=705 recid=124248 stamp=707103829
input archive log thread=1 sequence=706 recid=124249 stamp=707103830
input archive log thread=1 sequence=707 recid=124250 stamp=707103831
input archive log thread=1 sequence=708 recid=124251 stamp=707103831
input archive log thread=1 sequence=709 recid=124252 stamp=707103832
input archive log thread=1 sequence=710 recid=124253 stamp=707103832
input archive log thread=1 sequence=711 recid=124254 stamp=707103833
input archive log thread=1 sequence=712 recid=124255 stamp=707103833
input archive log thread=1 sequence=713 recid=124256 stamp=707103834
input archive log thread=1 sequence=714 recid=124257 stamp=707103835
input archive log thread=1 sequence=715 recid=124258 stamp=707103835
input archive log thread=1 sequence=716 recid=124259 stamp=707103836
input archive log thread=1 sequence=717 recid=124260 stamp=707103836
input archive log thread=1 sequence=718 recid=124261 stamp=707103837
input archive log thread=1 sequence=719 recid=124262 stamp=707103837
input archive log thread=1 sequence=720 recid=124263 stamp=707103838
input archive log thread=1 sequence=721 recid=124264 stamp=707103838
input archive log thread=1 sequence=722 recid=124265 stamp=707103839
input archive log thread=1 sequence=723 recid=124266 stamp=707103840
input archive log thread=1 sequence=724 recid=124267 stamp=707103840
input archive log thread=1 sequence=725 recid=124268 stamp=707103841
input archive log thread=1 sequence=726 recid=124269 stamp=707103841
RMAN-03009: failure of backup command on d3 channel at 01/01/2010 02:11:33
ORA-19571: archived-log recid 118360 stamp 705446770 not found in control file
continuing other job steps, job failed will not be re-run
channel d6: starting compressed archive log backupset
channel d6: specifying archive log(s) in backup set
input archive log thread=1 sequence=641 recid=124034 stamp=707077994
input archive log thread=1 sequence=642 recid=124035 stamp=707077994
...
It's a RAC, so shall I restart one instance at a time, or the whole thing? I am using a recovery catalog, so should I connect to the recovery catalog or just the control file to do a crosscheck of the backup? Also, when it's doing the backup, do you reckon it can't find the archivelog in the control file because it has not done a resync? Should I force a resync after the backup and then back up the archive logs? I really don't want to bounce the database as it's 24/7.
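One commonly suggested way to clear stale archived-log records like this is an RMAN crosscheck followed by deleting expired records; this is a generic sketch, not taken from the thread, so test it on a non-critical system first:

```
-- From RMAN, connected to the target (and the catalog, if you use one):
RMAN> CROSSCHECK ARCHIVELOG ALL;
RMAN> DELETE EXPIRED ARCHIVELOG ALL;
-- When a recovery catalog is involved, a resync can also help:
RMAN> RESYNC CATALOG;
```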
-
Need help to resolve errors in control file
I created a control file to upload data into Oracle.
My data looks roughly like this:
101,1060425123422,100.05
102,106042523422,101.05
103,1060425223532,110.05
104,1060425123422,200
My control file looks like this:
load data
INFILE 'CAP.csv'
APPEND
INTO TABLE STGION
fields terminated by ','
trailing nullcols
(CNTL_NUM,
TRAN_DT_TM "TO_DATE(substr(:TRAN_DT_TM,2,6)||lpad(substr(:TRAN_DT_TM,8,6),6,'0'),'YYMMDDHH24MISS')",
TRAN_AMT)
I got error messages like this.
Record 2: Rejected - Error on table STGION, column TRAN_DT_TM.
ORA-01850: hour must be between 0 and 23
Could anybody help solve this issue?
I am really thankful for your help.
Worked OK here using a 9.2.0.4 database:
SQL*Loader: Release 10.1.0.4.0 - Production on Tue Apr 25 11:21:43 2006
Copyright (c) 1982, 2004, Oracle. All rights reserved.
Control File: stgion.ctl
Data File: CAP.csv
Bad File: CAP.bad
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table STGION, loaded from every logical record.
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
CNTL_NUM FIRST * , CHARACTER
TRAN_DT_TM NEXT * , CHARACTER
SQL string for column : "TO_DATE(substr(:TRAN_DT_TM,2,6)||lpad(substr(:TRAN_DT_TM,8,6),6,'0'),'YYMMDDHH24MISS')"
TRAN_AMT NEXT * , CHARACTER
Table STGION:
4 Rows successfully loaded.
0 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 49536 bytes(64 rows)
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 4
Total logical records rejected: 0
Total logical records discarded: 0
Run began on Tue Apr 25 11:21:43 2006
Run ended on Tue Apr 25 11:21:44 2006
Elapsed time was: 00:00:01.25
CPU time was: 00:00:00.02
Connected to:
Oracle9i Enterprise Edition Release 9.2.0.4.0 - Production
With the Partitioning, OLAP and Oracle Data Mining options
JServer Release 9.2.0.4.0 - Production
SQL> create table STGION(
2 cntl_num number,
3 tran_dt_tm date,
4 tran_amt number)
5 /
Table created.
SQL> select * from STGION;
CNTL_NUM TRAN_DT_T TRAN_AMT
101 25-APR-06 100.05
102 25-APR-06 101.05
103 25-APR-06 110.05
104 25-APR-06 200
SQL> alter session set nls_date_format = 'dd-mon-yy hh24:mi:ss';
Session altered.
SQL> select * from STGION;
CNTL_NUM TRAN_DT_TM TRAN_AMT
101 25-apr-06 12:34:22 100.05
102 25-apr-06 02:34:22 101.05
103 25-apr-06 22:35:32 110.05
104 25-apr-06 12:34:22 200
Could you post your log file? Perhaps you have an old version of SQL*Loader. -
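As a quick sanity check, the control file's date expression can be tested directly in SQL*Plus against the sample values from the data file before running SQL*Loader at all:

```sql
-- Same expression as in the control file, applied to the
-- two sample values from CAP.csv (one with a 5-digit time).
SELECT TO_DATE(substr(s,2,6) || lpad(substr(s,8,6),6,'0'),
               'YYMMDDHH24MISS') AS tran_dt_tm
FROM  (SELECT '1060425123422' AS s FROM dual
       UNION ALL
       SELECT '106042523422' FROM dual);
```

If this runs cleanly but sqlldr still raises ORA-01850, the data file likely contains rows that differ from these samples.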
Message when opening Photo Stream: "This file does not have a program associated with it for performing this action. Please install a program or, if one is already installed, create an association in the Default Programs control panel." Help!
You are posting in the "iCloud on my Mac" forum, but your profile mentions Windows. If using a Mac, you need to have iPhoto or Aperture installed in order to receive new photos via Photo Stream. If using Windows, try posting in the iCloud on a PC forum; you'll get better help there.
https://discussions.apple.com/community/icloud/icloud_on_my_pc -
Need help: how to recover using a backup control file?
Please help:
First computer
1. I create a database
2. put it in archive mode,
3. shutdown and made a cold backup
4. Created a backup control file script (ALTER DATABASE BACKUP CONTROLFILE TO TRACE)
5. I started up and created some archived log files.
Second Computer
1. I copied all files created in the step 3 above except the control file
2. I create all the same map directory on the 2nd computer as in the 1st computer
3. I recreate the control file from the script got in the step 4 above
4. I Copy archived log files generated at the step 5 in the local directory in the second computer
5. I set the logsource and set the autorecovery to on
6. I recover the database: RECOVER DATABASE USING BACKUP CONTROL FILE UNTIL CANCEL
Error got:
The archived log files applied do not go beyond the first one. How can I apply all the archived log files copied from the first computer to the second computer so that both have the same data?
Thank you very much.
Archived log sequences in the first computer:
SQL> select sequence#,first_change#,next_change# from v$log_history;
SEQUENCE# FIRST_CHANGE# NEXT_CHANGE#
1 553723 555484
2 555484 557345
Actions I did on the second computer (after copying the two archived log files from the 1st computer):
SQL> CREATE CONTROLFILE REUSE DATABASE "ORCL" RESETLOGS ARCHIVELOG
2 MAXLOGFILES 16
3 MAXLOGMEMBERS 3
4 MAXDATAFILES 100
5 MAXINSTANCES 8
6 MAXLOGHISTORY 292
7 LOGFILE
8 GROUP 1 'C:\ORACLE\PRODUCT\10.2.0\ORADATA\ORCL\REDO01.LOG' SIZE 50M,
9 GROUP 2 'C:\ORACLE\PRODUCT\10.2.0\ORADATA\ORCL\REDO02.LOG' SIZE 50M,
10 GROUP 3 'C:\ORACLE\PRODUCT\10.2.0\ORADATA\ORCL\REDO03.LOG' SIZE 50M
11 -- STANDBY LOGFILE
12 DATAFILE
13 'C:\ORACLE\PRODUCT\10.2.0\ORADATA\ORCL\SYSTEM01.DBF',
14 'C:\ORACLE\PRODUCT\10.2.0\ORADATA\ORCL\UNDOTBS01.DBF',
15 'C:\ORACLE\PRODUCT\10.2.0\ORADATA\ORCL\SYSAUX01.DBF',
16 'C:\ORACLE\PRODUCT\10.2.0\ORADATA\ORCL\USERS01.DBF'
17 CHARACTER SET WE8ISO8859P1
18 ;
Control file created.
SQL> archive log list;
Database log mode Archive Mode
Automatic archival Disabled
Archive destination USE_DB_RECOVERY_FILE_DEST
Oldest online log sequence 0
Next log sequence to archive 0
Current log sequence 0
SQL> alter database archivelog;
Database altered.
SQL> set logsource C:\local_destination1_orcl
SQL> set autorecovery on;
SQL> recover database using backup controlfile until cancel;
ORA-00279: change 555611 generated at 01/18/2007 14:14:14 needed for thread 1
ORA-00289: suggestion :
C:\LOCAL_DESTINATION1_ORCL\ARCH.1_1_612194518_43F17CF5.ARC
ORA-00280: change 555611 for thread 1 is in sequence #1
ORA-00328: archived log ends at change 555483, need later change 555611
ORA-00334: archived log:
'C:\LOCAL_DESTINATION1_ORCL\ARCH.1_1_612194518_43F17CF5.ARC'
I don't know where the change 555611 is coming from. -
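A common way to see where a "change ... needed" SCN comes from is to compare it with the checkpoint SCNs recorded in the datafile headers; this diagnostic query is a generic sketch, not from the thread:

```sql
-- If a datafile's checkpoint SCN is higher than the last change in
-- your archived logs, the copied datafiles are newer than the cold
-- backup you think you restored, and those logs can never catch up.
SELECT file#,
       checkpoint_change#,
       TO_CHAR(checkpoint_time, 'DD-MON-YYYY HH24:MI:SS') AS ckpt_time
FROM   v$datafile_header;
```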
URGENT HELP ON SQL LOADER CONTROL FILE
Dear All,
Please find my control file below. I need two leading zeros for the column DDA_CLEARING_NUMBER. Presently it loads 18 characters from position (0791:0808). Now I need it loaded with two leading zeros on the left, i.e. the 18 characters padded in front with two zeros. How can I do that? Please help me with it.
OPTIONS (DIRECT=FALSE,
ROWS=30000,
BINDSIZE=3000000,
READSIZE=3000000,
SKIP_UNUSABLE_INDEXES=TRUE)
LOAD DATA
INFILE '/xxx/xxx/xxx.dat'
BADFILE '/xxx/xxx/xxx.bad'
DISCARDFILE '/dev/null'
APPEND
INTO TABLE AAA
WHEN MONTH_END_STATUS = 'O'
(TIME_PERIOD POSITION (0079:0086) DATE "YYYYMMDD",
SERVICE_TYPE POSITION (0077:0078) CHAR,
ACCOUNT_NUMBER POSITION (0001:0020) CHAR,
MONTH_END_STATUS POSITION (0060:0060) CHAR,
CIF_BANK POSITION (0066:0069) CHAR,
CIF_BRANCH POSITION (0070:0074) CHAR,
PLAN_CODE POSITION (0075:0076) CHAR,
DDA_CLEARING_NUMBER POSITION (0791:0808) CHAR,
TAX_ID POSITION (0516:0524) CHAR)
Dear Peterson,
When I do what you said, it inserts nothing, not even a space. Please help me figure out what to do.
thanks and regards -
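Peterson's original suggestion is not quoted in the thread, so as an illustrative sketch: padding like this is usually done with an LPAD expression on the field. Only the DDA_CLEARING_NUMBER line would change; the rest of the control file stays as posted above:

```sql
DDA_CLEARING_NUMBER POSITION (0791:0808) CHAR
  "LPAD(:DDA_CLEARING_NUMBER, 20, '0')",
```

If it then loads nothing, check whether those positions are blank in the data file for the rejected rows: LPAD of NULL is still NULL.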
Help for: ORA-01103: database name PRIMARY in control file is not STANDBY
Hello all, this will be my first post to the support forum. I'm an associate DBA with just 6 months on the job, so if I've forgotten something or not given some information that is needed, please let me know.
I've also combed the forums/internet, and some of the answers haven't helped. The Oracle Document ORA-1103 While Mounting the Database Using PFILE [ID 237073.1] says my init.ora file is corrupted, but creating a new init.ora file from the spfile does not help. Neither does just starting from the spfile. I have older copies of the init.ora file and the spfiles that the database was running on previously, so I believe they are good.
This standby NIRNASD1 has existed previously, I had to refresh the primary NIKNASD2, and then re-instantiate NIRNASD1 after the refresh is complete.
My env is set correctly, and my ORACLE_SID has been exported to NIRNASD1
NIKNASD2 = Primary Database
NIRNASD1 = Secondary/Standby Database
Goal: Creation of Logical Standby NIRNASD1 after creating Physical Standby from NIKNASD2
My database versions are 10.2.0.4.0, and the databases are on a Unix server. Both databases are located on separate servers.
Steps that I have taken:
I used RMAN to backup our primary database to the staging area:
$ rman target /
run {
backup database
format '/datatransa/dg_stage/%U'
include current controlfile for standby;
sql "alter system archive log current";
backup archivelog all format '/datatransa/dg_stage/%U';
I used RMAN to Create Secondary Database utilizing RMAN DUPLICATE command.
RMAN> run {
2> allocate auxiliary channel auxdisk device type disk;
3> duplicate target database for standby NOFILENAMECHECK;
4> }
On Secondary database I started Managed Recovery mode
SQL> shutdown immediate;
ORA-01109: database not open
Database dismounted.
ORACLE instance shut down.
(I used pfile here, thinking that I needed to mount the database to the pfile so that the database would see the change in the dataguard parameters in the init.ora file, the change from logical to physical- I commeneted out the logical and uncommented the physical line)
# Dataguard Parameters
# For logical standby, change db_name to name of standby database.
db_name=NIKNASD2 ### for physical, db_name is same as primary
#db_name=NIRNASD1 ### for logical, db_name is same as unique_name
SQL> STARTUP MOUNT PFILE = /oraa/app/oracle/product/1020/admin/NIRNASD1/pfile/initNIRNASD1.ora;
ORACLE instance started.
Total System Global Area 1577058304 bytes
Fixed Size 2084368 bytes
Variable Size 385876464 bytes
Database Buffers 1174405120 bytes
Redo Buffers 14692352 bytes
Database mounted.
SQL> ALTER DATABASE recover managed standby database using current logfile disconnect;
I then verified the Data Guard Configuration by using “alter system archive log current;” on the primary database and watching the sequence number change in the secondary database.
I made sure that:
• The primary database was in MAXIMUM PERFORMANCE MODE
• Stopped managed recover on the standby database: alter database recover managed standby database cancel;
• Built a logical standby data dictionary on the primary database
• The db_name in init.ora was changed (this is in our document at my job)
• I changed my database name (from physical to logical) in my init.ora pfile (reverse of what I did above)
# Dataguard Parameters
# For logical standby, change db_name to name of standby database.
#db_name=NIKNASD2 ### for physical, db_name is same as primary
db_name=NIRNASD1 ### for logical, db_name is same as unique_name
I then went to shutdown my standby database and re-start it in a mount exclusive state, which is where I get the ORA-01103 Error (Again I used the pfile, thinking that I needed to tell the database it is now a logical standby):
SQL> shutdown immediate;
ORA-01109: database not open
Database dismounted.
ORACLE instance shut down.
SQL> STARTUP EXCLUSIVE MOUNT PFILE = /oraa/app/oracle/product/1020/admin/NIRNASD1/pfile/initNIRNASD1.ora;
ORACLE instance started.
Total System Global Area 1577058304 bytes
Fixed Size 2084368 bytes
Variable Size 385876464 bytes
Database Buffers 1174405120 bytes
Redo Buffers 14692352 bytes
ORA-01103: database name 'NIKNASD2' in control file is not 'NIRNASD1'
From what I understand of the process, the name in the control file is correct, I want it to be NIRNASD1. But the database for some reason thinks it should be NIKNASD2. The following are the parts of my init.ora file that include the dataguard parameters:
# Database Identification
db_domain=""
#db_name=NIRNASD1
#db_unique_name=NIRNASD1
# File Configuration
control_files=("/oradba2/oradata/NIRNASD1/control01.ctl", "/oradba3/oradata/NIRNASD1/control02.ctl", "/oradba4/oradata/NIRNASD1/control03.ctl")
# Instance Identification
instance_name=NIRNASD1
# Dataguard Parameters
#db_name=NIKNASD2 ### for physical, db_name is same as prmary
db_name=NIRNASD1 ### for logical, db_name is same as unique_name
db_unique_name=NIRNASD1
dg_broker_start=TRUE
db_file_name_convert='NIKNASD2','NIRNASD1'
log_file_name_convert='NIKNASD2','NIRNASD1'
log_archive_config='dg_config=(NIRNASD1,NIKNASD2)'
log_archive_dest_1='LOCATION="/oraarcha/NIRNASD1/" valid_for=(ONLINE_LOGFILES,all_roles) db_unique_name=NIRNASD1'
#log_archive_dest_2='LOCATION="/oraarcha/NIKNASD2/" valid_for=(standby_logfiles,standby_roles) db_unique_name=NIRNASD1'
log_archive_dest_2='LOCATION="/oraarcha/NIKNASD2/" valid_for=(standby_logfile,standby_role) db_unique_name=NIRNASD1'
STANDBY_ARCHIVE_DEST='LOCATION=/oraarcha/NIKNASD2/'
# Parameters are not needed since this server will NOT become primary
#log_archive_dest_2='service=NIKNASD2
# valid_for=(online_logfiles,primary_role)
# db_unique_name=NIKNASD2'
fal_server='NIKNASD2'
fal_client='NIRNASD1'
I would appreciate any help, or pointing me in the right direction. I'm just missing something. I am reviewing the documents for building a physical and logical standby from oracle. Just not sure where to go from here.
Thank you
Edited by: 977917 on Dec 19, 2012 5:49 PM
First of all, thank you both for answering my post. I've pulled up Mr. Hesse's page and will make it a go-to staple.
We're in the process of upgrading our databases, but we have 130+ databases and only six Oracle dba's, and I'm one of them. It's a large corporation, and things move at a "slow and tested" pace.
The pfile parameters listed above are from my secondary/standby database. And I do want to create a logical standby.
I forgot to mention that we do use DataGuard Broker, but I did not think that would be the cause of why the database was starting up incorrectly, so I did not mention it. My apologies there.
As far as the db_name, here's my question on that. It's my understanding that the db_name should be the name of the primary database when you are working with a physical standby, but as soon as you convert it to logical, you should change the db_name to that of the secondary/standby database. Am I correct on that?
Leading from that, during the process of creating the physical standby and converting the physical standby to the logical standby, should I change the db_name in the secondary/standby database in the spfile and never use the pfile at all? For instance, when I create the physical standby I have to change the db_name in the standby to the PRIMARY database, so that makes me think I should change db_name in the spfile? (If you see above, I changed db_name in the pfile and did a startup pfile)
This morning I was able to reach out to a fellow DBA (they were asleep when I posted this last night), and we tried a few things. We had a symbolic link in the standby's /oraa/app/oracle/product/1020/dbs directory that looked like this: spfileNIRNASD1.ora -> /oraa/app/oracle/product/1020/admin/NIRNASD1/pfile/spfileNIRNASD1.ora
She removed the link, and the startup mount exclusive then worked without the error.
Thank you again for your help Mr.Quluzade and Mr. Hesse, I appreciate you all taking the time to teach someone new to the craft. I will definitely read up on the link that you sent me.
Chris Cranford -
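When an instance starts with an unexpected db_name, as happened here, it has usually picked up a different parameter file than intended (here, via the spfile symlink). A quick generic check, sketched rather than taken from the thread:

```sql
-- In SQL*Plus: an empty VALUE means the instance started from a pfile;
-- otherwise this shows the spfile actually in use.
SHOW PARAMETER spfile

-- Once mounted, confirm the names the instance believes:
SELECT name, db_unique_name FROM v$database;
```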
I have a data file in .CSV format.
I need to upload this data via a control file, so I wrote one.
My concern is with the date format in the table.
In the .csv file, columns B and C contain a date and a time.
I want to merge these two columns into a single date column in Oracle. One more thing: before the date there is one extra character that I also need to remove.
example data in .csv file.
120060422 122544
220060422 122645
320060422 122744
Please help me write the control file.
Open your CSV and insert a column next to your date column, then enter the formula:
MID(Cell Number, Starting Position, Number of Characters to return)
Drag this formula down to the last cell, then select the column and Copy -> Paste Special -> Values.
Then delete the original column
Also you can merge the two date columns in excel itself by using the formula
Concatenate(Cell A, Cell B)
Then you can upload this data.
Hope this helps.
Rajnish -
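Rajnish's Excel approach works, but the strip-and-merge can also be done in the control file itself; this is an illustrative sketch (the file, table and column names are assumptions, and it treats the sample rows as one 16-character field: extra character, YYYYMMDD, a space, then HHMISS):

```sql
LOAD DATA
INFILE 'data.csv'
INTO TABLE target_tab
TRAILING NULLCOLS
( tran_dt_tm POSITION (1:16)
    "TO_DATE(SUBSTR(:tran_dt_tm, 2), 'YYYYMMDD HH24MISS')"
)
```

SUBSTR(:tran_dt_tm, 2) drops the leading extra character and the date mask consumes the rest, so 120060422 122544 loads as 22-APR-2006 12:25:44.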
Hi,
Can you please tell me the definitions of schema, directory, dictionary, control file and library in Oracle?
Please help me by giving the definitions.
Is this a kind of homework?
The information you ask for is available through many system views. When I'm not sure of a view's name I usually do something like this:
select *
from all_objects
where object_name like '%CONTROL%';
This gets a list of objects that might help me find control files.
Another method is actually searching for information in dictionary view. -
hi all
Newbie here... I want to write a control file for uploading data from a CSV file (Excel) which has 14 columns, e.g.:
col 1, col 2, col 3, col 4, col 5, ... col 14
and the field lengths are not constant.
Now I need to upload only selected columns into a table, say
col2, col3, col4, col5, col8, col9 and col14, and the table has 7 columns.
Can someone please help me write the control file for this requirement?
Many thanks.
In that case, you should consider using EXTERNAL TABLEs.
Instead of having a control file sitting outside the database, and having to run and monitor and managed the SQL*Loader process outside the database, you can do it all from inside the database.
An EXTERNAL TABLE is basically a SQL*Loader control file. You define this once using a CREATE TABLE statement, up front. You can use the ALTER TABLE statement to specify the location and name of the CSV file.
A simple SQL SELECT allows you to load the data.
Details are in [url http://oracle.telkom.co.za:7777/oracle/oradoc102/server.102/b14200/statements_7002.htm#i2159541]Oracle® Database SQL Reference guide.
Here's the example given in the guide:
CREATE TABLE dept_external (
  deptno NUMBER(6),
  dname VARCHAR2(20),
  loc VARCHAR2(25)
)
ORGANIZATION EXTERNAL
(TYPE oracle_loader
 DEFAULT DIRECTORY admin
 ACCESS PARAMETERS
 (
  RECORDS DELIMITED BY newline
  BADFILE 'ulcase1.bad'
  DISCARDFILE 'ulcase1.dis'
  LOGFILE 'ulcase1.log'
  SKIP 20
  FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"'
  (
   deptno INTEGER EXTERNAL(6),
   dname CHAR(20),
   loc CHAR(25)
  )
 )
 LOCATION ('ulcase1.ctl')
)
REJECT LIMIT UNLIMITED; -
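If you stay with SQL*Loader instead, skipping unwanted CSV columns is normally done with FILLER fields. This control file is an illustrative sketch (the file, table and column names are assumptions; it keeps the seven columns named in the question):

```sql
LOAD DATA
INFILE 'data.csv'
INTO TABLE target_tab
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
( col1  FILLER      -- read from the file but not loaded
, col2
, col3
, col4
, col5
, col6  FILLER
, col7  FILLER
, col8
, col9
, col10 FILLER
, col11 FILLER
, col12 FILLER
, col13 FILLER
, col14
)
```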
Help required to build SQL loader control file
I have a table that we need to load using SQL*Loader.
table structure is --
<emp_id>,<first_name>,<middle_name>,<last_name>,<sal>
The structure of flat file is like below,
<emp_id>|<emp_name>|<sal>
The <emp_name> field can contain spaces separating first name, middle name and last name:
if there is no space it means we only need to load the first name, and one space means first and last name should be loaded.
Sample flat file--
1001|Ram|10000
1002|Syam Kumar Sharma|20000
1003|Jadu Prashad|15000
Please help me out building the control file.
Thanks in advance.
Means, can we use DBMS_SCHEDULER for loading data?
Yes, you can create procedures for that and let the scheduler execute them on the desired interval
(you can even execute OS commands through DBMS_SCHEDULER).
Read about it here:
http://www.oracle.com/pls/db102/search?word=DBMS_SCHEDULER&partno=
http://www.oracle-base.com/articles/10g/Scheduler10g.php
By the way, instead of using sqlloader why not switch to using external tables?
http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:6611962171229
http://www.oracle-base.com/articles/9i/SQLNewFeatures9i.php#ExternalTables
A few other approaches (pre 10g)
http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:2048340300346698595
http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:3084681089099 -
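The original name-splitting question can also be handled inside the control file with BOUNDFILLER and SQL expressions; this is a sketch under stated assumptions (emp.dat, emp_tab and the target column names are invented, and it assumes at most one middle name, as in the sample rows):

```sql
LOAD DATA
INFILE 'emp.dat'
TRUNCATE INTO TABLE emp_tab
FIELDS TERMINATED BY '|'
TRAILING NULLCOLS
( emp_id
, emp_name    BOUNDFILLER   -- raw "First[ Middle][ Last]" value
, sal
, first_name  EXPRESSION "REGEXP_SUBSTR(:emp_name, '[^ ]+', 1, 1)"
, middle_name EXPRESSION "CASE WHEN LENGTH(:emp_name) - LENGTH(REPLACE(:emp_name, ' ')) = 2 THEN REGEXP_SUBSTR(:emp_name, '[^ ]+', 1, 2) END"
, last_name   EXPRESSION "CASE WHEN INSTR(:emp_name, ' ') > 0 THEN SUBSTR(:emp_name, INSTR(:emp_name, ' ', -1) + 1) END"
)
```

For 1002|Syam Kumar Sharma|20000 this would load first/middle/last as Syam, Kumar, Sharma; for 1001|Ram|10000 only first_name is populated.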
Help needed to write control file.
Hi Guys,
i need to write one control file to upload data from .txt file to oracle table.
.txt file data contains like this
2006041110:40:22
2006041111:30:42
2006041210:40:22
i need to upload this data into date column in oracle table.
Please help me write the control file for this requirement.
Thanks for your help and time.
data1 "to_date(:data1, 'YYYYMMDDHH24:MI:SS')"
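The one-line answer (the data1 transformation) expands to a complete control file along these lines; the file, table and column names are assumptions:

```sql
LOAD DATA
INFILE 'data.txt'
INTO TABLE target_tab
TRAILING NULLCOLS
( data1 POSITION (1:16)    -- e.g. 2006041110:40:22 is 16 characters
    "TO_DATE(:data1, 'YYYYMMDDHH24:MI:SS')"
)
```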
-
Need help in writing the control file for SQLLOADER
Is it possible to make SQL*Loader error out when the data fields in a row of the data file are more than the fields stated in the control file?
i.e. My data file is something like
aaa,bbb,cc
dd,eee
And my ctl file has just 2 columns in it. Is it possible to write a control file which will cause the Sqlloader to error out?
Thanks...
Nisha,
Again, I posted a test example in your other post, but here is how you can do that:
CREATE TABLE mytest111 (
  col1 NUMBER,
  col2 NUMBER,
  col3 NUMBER
);

#mytest.ctl
LOAD DATA
INFILE 'mytest.dat'
TRUNCATE INTO TABLE MYTEST111
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
( col1 integer external,
  col2 integer external
)
#mytest.dat
1,2,3
1,2
SQL*Loader: Release 10.2.0.1.0 - Production on Fri Apr 10 11:40:39 2009
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Control File: mytest.ctl
Data File: mytest.dat
Bad File: mytest.bad
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table USIUSER.MYTEST111, loaded from every logical record.
Insert option in effect for this table: TRUNCATE
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
COL1 FIRST * , O(") CHARACTER
COL2 NEXT * , O(") CHARACTER
Table MYTEST111:
2 Rows successfully loaded.
0 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 33024 bytes(64 rows)
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 2
Total logical records rejected: 0
Total logical records discarded: 0
Run began on Fri Apr 10 11:40:39 2009
Run ended on Fri Apr 10 11:40:40 2009
Elapsed time was: 00:00:00.99
CPU time was: 00:00:00.06
Regards -
Help in creating sql loader control file
I have a CSV file in this format, with several lines like this:
"XXX", "YYY","ZZZ","01/01/2005", "AAA"
So when I created the control file, I had a problem inserting the date.
I tried to use TO_DATE with the date column name:
load data
infile x.cvt
into table table name
fields terminated by ","
colA
colB
colC
colD "TO_DATE(colD 'dd/mm/yyy')
does not seem to work. Please let me know what I am missing.
Thanks.
If you don't want to load the 5th field, you need to specify it as a FILLER field in the control file:
Load Data
Infile x.cvt
Into Table table name
Fields Terminated by ","
Optionally enclosed by '"'
(colA,
colB,
colC,
colD DATE "dd/mm/yyyy",
colE FILLER
)
http://download-west.oracle.com/docs/cd/B10501_01/server.920/a96652/ch06.htm#1006670
If you want to use the function to_date, don't forget the colon before the column name
Load Data
Infile x.cvt
Into Table table name
Fields Terminated by ","
Optionally enclosed by '"'
(colA,
colB,
colC,
colD char "TO_DATE(:colD, 'dd/mm/yyyy')",
colE FILLER
)
http://download-west.oracle.com/docs/cd/B10501_01/server.920/a96652/ch06.htm#1008153
Message was edited by:
Jens Petersen