SQL Loader - read 1st line to one table, rest of data to another
Hi
I looked around the FAQs and forums and found similar cases, but not mine...
I am running Oracle 9i and have a text file which has the 1st line as a control header and everything beneath it as data, something like this:
14/07/2010|8
12345678|0
12345679|0
12345680|10.87
12345681|7655.8
12345682|100
12345683|0
12345684|-90.44
12345685|0
The first (header) line has a date field and a counter (the number of records expected beneath it)
The rest of the data is an account number and balance.
Since SQL Loader is invoked outside of Oracle (Unix in my case) I assume I should create two tables, such as:
CREATE TABLE TIF_CURRENT_BALANCE_DTL (
ACCOUNT_REF_NO VARCHAR2(30),
BALANCE_AMT NUMBER(12,2)
);
CREATE TABLE TIF_CURRENT_BALANCE_HDR (
HDR_DATE DATE,
HDR_COUNT NUMBER(10)
);
And use a control file which will load line 1 into TIF_CURRENT_BALANCE_HDR and the other lines (SKIP=1) into TIF_CURRENT_BALANCE_DTL.
Since the header/detail lines are not (necessarily) distinguishable, is there a way to achieve this without modifying the input file in any way?
Thanks
Martin
Thanks for your reply - the solution should not be OS dependent, as it will run on both Linux and UNIX installations.
The DB will be (for now) Oracle9i Enterprise Edition Release 9.2.0.4.0 - 64bit Production
I looked at that web page you provided and had some hope with the ROWS option, but this is the number of rows to load before each commit.
Any other solutions?
I know I could load it to a common table (text and number) and then convert the values accordingly in the PL/SQL procedure which will run afterwards, but I feel loading the data in the final format is going to be faster.
I am considering using SQL*Loader to load records (SKIPping row 1) into the DTL table and, within the PL/SQL, loading the 1st record and performing validation. Since the file has approx. 2 million rows and is processed daily, 1,999,999 records read with SQL*Loader and 1 with conventional methods will still be a vast improvement!
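To sketch that split approach without touching the file on disk (the file name balances.dat is a stand-in, and the sqlldr invocation is environment-specific, so it is left commented out):

```shell
#!/bin/sh
# A tiny sample stands in for the real daily feed.
cat > balances.dat <<'EOF'
14/07/2010|3
12345678|0
12345680|10.87
12345682|100
EOF

# Header line: date and expected record count, pipe-delimited.
HDR=`head -1 balances.dat`
HDR_DATE=`echo "$HDR" | cut -d'|' -f1`
HDR_COUNT=`echo "$HDR" | cut -d'|' -f2`

# Detail lines: everything from line 2 onwards.
DTL_ROWS=`tail -n +2 balances.dat | wc -l | tr -d ' '`

# Feed only the detail lines to the loader (or just use SKIP=1 instead):
# tail -n +2 balances.dat | sqlldr userid=... control=tif_dtl.ctl data='-'

echo "header date=$HDR_DATE expecting $HDR_COUNT rows, found $DTL_ROWS"
```

The header values can then go into TIF_CURRENT_BALANCE_HDR with a single conventional INSERT, and the expected count compared against the detail row count for validation.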
Thanks
Similar Messages
-
Creating SQL-Loader script for more than one table at a time
Hi,
I am using OMWB 2.0.2.0.0 with Oracle 8.1.7 and Sybase 11.9.
It looks like I can create SQL-Loader scripts for all the tables
or for one table at a time. If I want to create SQL-Loader
scripts for 5-6 tables, I have to either create script for all
the tables and then delete the unwanted tables or create the
scripts for one table at a time and then merge them.
Is there a simple way to create migration scripts for more than
one but not all tables at a time?
Thanks,
Prashant Rane
No, there is no multi-select for creating SQL-Loader scripts.
You can either create them separately, or create them all and
then discard the ones you do not need. -
Can SQL Loader read newline chars - multiline column?
Hello All,
I am using SQL Ldr (Release 10.2.0.1.0) to load from a .csv file to a single table.
The .csv file, however, has a column which spans multiple lines (with embedded newline chars).
Example below:
Record1:
A, B, "ABCD", "123"
Record2:
X, Y, "XY
Z", "456"
Record 3:
P, Q, "P
QR", "789"
Notice that records 2 & 3 have newlines (it is not word wrap).
My SQL*Loader treats each line as a new record!
I tried using the following in my control file, but to no avail:
INFILE 'C:\xyz.csv' --"STR '\r\n'"
I also tried using the following for that particular column, but with no success:
"REPLACE(:col_name,CHR(13) || CHR(10))",
Is there a way to do this, or is there a restriction in this SQL*Ldr release or the Loader itself?
Someone please,
Thanks!
Did you [check out|http://en.wikipedia.org/wiki/Rtfm] the section "[Assembling Logical Records from Physical Records|http://download.oracle.com/docs/cd/E11882_01/server.112/e10701/ldr_control_file.htm#i1005509]" in the fine Oracle® Database Utilities - SQL*Loader manual?
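Also worth noting: in the attempt above, the leading "--" comments the STR clause out, so it never took effect. A hedged fragment of what that manual section describes, assuming the file's logical records really end in CRLF while the embedded newlines are bare LF (table and field names illustrative):

```
LOAD DATA
INFILE 'C:\xyz.csv' "STR '\r\n'"
INTO TABLE target_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(a, b, col_with_newlines CHAR(4000), d)
```

With STR in effect, a bare newline inside a quoted field no longer ends the record.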
:p -
Procedure to insert data into table by selecting data from another table
Hi all,
I have to create a procedure where I select the data from one table and insert it into another table. Any help on this? I also have to update the 2nd table whenever new records get inserted into the 1st table.
Regards
Hi, you can try something like:
CREATE [OR REPLACE] PROCEDURE procedure_name
IS
BEGIN
INSERT INTO TABLE1
SELECT * FROM TABLE2;
END;
For the other part you may create a trigger on the first table AFTER INSERT to insert the values in the second table too. -
SQL*Loader job exits unexpectedly and causes table to be locked with NOWAIT
I have a weekly report job in which I load about 48 logs with about 750k rows of data in each log. To facilitate this, we have been using a Java job that runs SQL*Loader as an external Process (using ProcessBuilder), one log after the other. Recently, however, this process has been terminating abnormally during the load, which causes a lock on the table and basically grinds the process to a halt until we can open a ticket with the DB team to kill the hung session. Is there a better way to handle this upload than SQL*Loader, or is there some change I could make in either the control file or the command line to stop it from dying a horrible death?
At the start of the process, I truncate the table that I'm loading to and then run this command line with the following control file:
COMMAND LINE:
C:\Oracle\ora92\BIN\SQLLDR.EXE userid=ID/PASS@DB_ID load=10000000 rows=100000 DIRECT=TRUE SKIP_INDEX_MAINTENANCE=TRUE control=ControlFile.ctl data=logfile.log
CONTROL FILE:
UNRECOVERABLE
Load DATA
INFILE *
Append
PRESERVE BLANKS
INTO TABLE MY_REPORT_TABLE
FIELDS TERMINATED BY ","
(
filler_field1 FILLER char(16),
filler_field2 FILLER char(16),
time TIMESTAMP 'MMDDYYYY-HH24MISSFF3' ENCLOSED BY '"',
partne ENCLOSED BY '"',
trans ENCLOSED BY '"',
vendor ENCLOSED BY '"' "SUBSTR(:vendor, 1, 1)",
filler_field4 FILLER ENCLOSED BY '"',
cache_hit_count,
cache_get_count,
wiz_trans_count,
wiz_req_size,
wiz_res_size,
wiz_trans_time,
dc_trans_time,
hostname ENCLOSED BY '"',
trans_list CHAR(2048) ENCLOSED BY '"' "SUBSTR(:trans_list, 1, 256)",
timeouts,
success ENCLOSED BY '"'
)
Once all of the logs have finished loading, I rebuild the indexes on the table and then start the report process. It seems like it's just dying on random logs now; re-running the process, it will fail at a different point each time.
EDIT: The reasons for UNRECOVERABLE and SKIP_INDEX_MAINTENANCE are to speed up the load. As it is, it can still take 7-12 minutes for each log to load, and it's even worse without those options. Overall, it takes about 18 hours for this process to run from start to finish.
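As an aside, when the loader dies mid-load and leaves the table locked, a DBA can usually locate the offending session directly rather than waiting on a ticket. A hedged sketch (requires access to the V$ views; MY_REPORT_TABLE stands in for the real table name):

```sql
-- Sessions currently holding a lock on the loaded table.
SELECT s.sid, s.serial#, s.status, lo.locked_mode
FROM   v$session       s
JOIN   v$locked_object lo ON lo.session_id = s.sid
JOIN   dba_objects     o  ON o.object_id   = lo.object_id
WHERE  o.object_name = 'MY_REPORT_TABLE';
-- The SID/SERIAL# pair can then feed ALTER SYSTEM KILL SESSION 'sid,serial#';
```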
Edited by: user6676140 on Jul 7, 2011 11:37 AM
Please note that my post stated: "I have opened a ticket with Oracle support. After 6 days I have not had the help that I need."
I also agree that applying the latest PSU is a Best Practice, which Oracle defines as "a cumulative collection of high impact, low risk, and proven fixes for a specific product or component".
With that statement I feel there should not be the drastic issues that we have seen. Our policy is to always apply PSUs, no matter what the product or component, without issue.
Except for now. We did our research, and only open an Oracle ticket when we need expert help. That has not been forthcoming from them, but we are still working the ticket.
Hence, I opened this forum because many times I have found help here, where others have faced the same issue and now have an insight. When having a serious problem I like to use all of my resources, this forum being one of those.
To restate the question:
(1) 97% of our databases reside on RAC. From the Search List for Databases, we do not see the columns Sessions:CPU, Sessions: I/O, Sessions: Other, Instance CPU%, and are told this is working as designed because you must monitor the instance, not the database, with RAC.
(a) After applying PSU2 the Oracle Load Map no longer showed any databases.
All of this in (1) is making the tool less useful for monitoring at the database level, which we do most of the time.
(2) Within a few days of applying PSU2, we couldn't log into EM and got the error "Authentication failed. If problem persists, contact your system administrator."
(b) searching through the emoms.trc files we found the errors listed above posting frantically.
After rolling back PSU we are back in business.
However, there is still the need to remain current with the components of EM.
I am looking for suggestions, insights, and experience. While I appreciate Akanksha answering so quickly, a recommendation to open an SR is not what I need.
Sherrie -
SQL Loader reads Data file Sequentially or Randomly?
Will SQL*Loader load the data read from a file sequentially or randomly?
I have the data file like the below
one
two
three
four
and my control file is
LOAD DATA
INFILE *
TRUNCATE
INTO TABLE T TRAILING NULLCOLS
(
x RECNUM,
y POSITION (1:4000)
)
so my table will be populated like
X Y
1 one
2 Two
3 Three
4 Four
Will this happen sequentially even for large data sets? Say I have one to one million rows in my data file.
Please clarify.
Thanks,
Rajesh.
SQL*Loader may read the file sequentially, but you should not rely on the physical ordering of the rows in the table.
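In practice, if the file order matters downstream, the RECNUM column x captured at load time is the handle to use; an explicit ORDER BY is the only guaranteed ordering in SQL:

```sql
SELECT x, y
FROM   t
ORDER  BY x;  -- x = RECNUM, i.e. the line's position in the data file
```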
It looks like that's what you were hinting at. -
Re: Can SQL Loader read newline chars - multiline column?
I have a scenario where the data file has values separated by ~. While loading this data into the table using SQL*Loader, I want to convert it into multiple lines. E.g.:
data file:
1|1013 park ridge~12345~irving|
2|2013 park ridge~12345~irving|
3|1013 park ridge|
While loading it into the table, I want the data like:
ID ADDRESS
1 "1013 hidden ridge
12345
irving"
2 "2013 hidden ridge
12345
irving"
22 "22013 hidden ridge
12345
irving"
1 1013 hidden ridge
My control file says:
load data
infile "/usr2/home2/adistest/h91ftp/temp/owb_test/owb_test1.csv"
preserve blanks INTO TABLE owbrep.owb_test1
TRUNCATE
fields terminated by '|' TRAILING NULLCOLS
(
id,
address
)
Please suggest: what should I do?
You can use the REPLACE function in your control file to replace whatever character (like ~ in your sample data) is where the newline should be with whatever your newline is on your system, like CHR(10), as in the example below.
SCOTT@orcl12c_11gR2> host type owb_test1.csv
1|1013 park ridge~12345~irving|
2|2013 park ridge~12345~irving|
3|1013 park ridge|
SCOTT@orcl12c_11gR2> host type test.ctl
load data
infile "owb_test1.csv"
preserve blanks INTO TABLE owb_test1
TRUNCATE
fields terminated by '|' TRAILING NULLCOLS
(
id,
address "REPLACE (:address, '~', CHR(10))"
)
SCOTT@orcl12c_11gR2> create table owb_test1
2 (id number,
3 address varchar2(60))
4 /
Table created.
SCOTT@orcl12c_11gR2> host sqlldr scott/tiger control=test.ctl log=test.log
SQL*Loader: Release 12.1.0.1.0 - Production on Mon Aug 12 10:17:45 2013
Copyright (c) 1982, 2013, Oracle and/or its affiliates. All rights reserved.
Path used: Conventional
Commit point reached - logical record count 3
Table OWB_TEST1:
3 Rows successfully loaded.
Check the log file:
test.log
for more information about the load.
SCOTT@orcl12c_11gR2> select * from owb_test1
2 /
ID ADDRESS
1 1013 park ridge
12345
irving
2 2013 park ridge
12345
irving
3 1013 park ridge
3 rows selected. -
I am having trouble extracting the 1st line of output from the below script
This script basically lists all SIDs and ORACLE_HOMEs and echoes them for selection.
I want it to automatically select line 1 for +ASM instances without prompting. As part of my script, I want to source the +ASM instance and run the command automatically, without a prompt.
The below script automatically picks up the last line. How do I change it to pick up the 1st line?
Script
if [ `uname -s` = SunOS ]
then
ORATAB=/var/opt/oracle/oratab
else
ORATAB=/etc/oratab
fi
if [ -s "$ORATAB" ]
then
#echo "Standby Oratab Entries:"
#echo " # DATABASE NAME ORACLE_HOME"
#echo "----- ------------------ -----------------------------------"
set -x
CURSOR=1
while read LINE
do
case $LINE in
\#*) ;; #comment line
"") ;; #null line
*)
LASTSID=`echo $LINE | cut -f1 -d":"`
LASTHOME=`echo $LINE | cut -f2 -d":"`
printf "%5s %20s %40s\n" $CURSOR $LASTSID $LASTHOME
CURSOR=`expr $CURSOR + 1`
#CURSOR=1
;;
esac
done < $ORATAB
#if [ $CURSOR -eq 2 ]
#then
#echo""
#echo "only one database found in oratab..."
echo $LASTSID
echo $LASTHOME
LD_LIBRARY_PATH=$ORACLE_HOME/lib:$ORACLE_HOME/ctx/lib:/usr/lib64:/lib64:/lib:/usr/lib
SHLIB_PATH=$ORACLE_HOME/lib:$ORACLE_HOME/ctx/lib
SQLPATH=$ORACLE_HOME/rdbms/admin:$ORACLE_HOME/sqlplus/admin
ORA_NLS33=$ORACLE_HOME/ocommon/nls/admin/data
PATH=/opt/oracle/local/bin:/usr/openwin/bin:/usr/sbin:/usr/bin:/usr/ucb:/etc:/bin:/sbin:$ORACLE_HOME/bin:$ORACLE_HOME/jdk/bin:$ORACLE_HOME/jdk/jre/bin:$ORACLE_HOME/Apache/Apache/bin:$ORACLE_HOME/Apache/perl/bin:/usr/local/bin:/bin:/sbin
`set -o vi`
PS1="$ORACLE_SID/`uname -n`-> "
alias rm="rm -f"
else
echo ""
echo " "
echo -n "Choose the Database [ 1 - "`expr $CURSOR - 1`" ] "
#export SELECTION=1
export $CURSOR=1
#read SELECTION
$CURSOR=1
fi
Output
1
+ASM1
/u01/app/11.2.0/grid
2
rdb1v001 /u01/app/oracle/product/11.2.0/dbhome_1
3
rdb1v001r /u01/app/oracle/product/11.2.0/dbhome_1
4
testrj /u01/app/oracle/product/11.2.0/dbhome_1
5
rdbp01 /u01/app/oracle/product/11.2.0/dbhome_1
6
rmb1r001 /u01/app/oracle/product/11.2.0/dbhome_1
rmb1r001
/u01/app/oracle/product/11.2.0/dbhome_1
Sorry, I don't want to simply post my script here at OTN. I spent several months working on it and put a lot of thought into it. I may at some time in the future share it, but it will have to be under different circumstances.
But this will not work, as I need to source both SID and HOME.
Why not? Simply query the appropriate information. SID and HOME are both stored in /etc/oratab. You simply need to fetch or search the appropriate column and return $1 or $2 from the line, which contain the SID or HOME information.
Get SID and HOME for +ASM:
$ var=`awk -F':' '$1 ~ /+ASM/ { print $1,$2 }' /etc/oratab`
$ echo $var
+ASM /u01/app/oracle/product/11.2.0/grid
Convert result into a variable array using space as delimeter and export each value:
$ var=(${var// / })
$ export a1=`echo ${var[0]}`
$ export a2=`echo ${var[1]}`
$ echo $a1
+ASM
$ echo $a2
/u01/app/oracle/product/11.2.0/grid
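If only the first matching oratab line is wanted (the while loop in the question overwrites LASTSID/LASTHOME on every row, which is why it ends on the last one), awk can stop at the first hit. A hedged, self-contained sketch with an inline sample standing in for the real oratab:

```shell
#!/bin/sh
# Sample oratab stands in for /etc/oratab (or /var/opt/oracle/oratab on Solaris).
cat > oratab.sample <<'EOF'
+ASM1:/u01/app/11.2.0/grid:N
rdb1v001:/u01/app/oracle/product/11.2.0/dbhome_1:N
rdbp01:/u01/app/oracle/product/11.2.0/dbhome_1:N
EOF

# First line whose SID begins with +ASM; 'exit' stops the scan after the
# first match, so only line 1 of the matches is ever returned.
FIRST=`awk -F':' '/^\+ASM/ { print $1, $2; exit }' oratab.sample`
ASMSID=`echo "$FIRST" | cut -d' ' -f1`
ASMHOME=`echo "$FIRST" | cut -d' ' -f2`
echo "$ASMSID $ASMHOME"
```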
If you want to do this for other ASM instances, simply specify the instance name. You can also write a procedure to walk through a list of possible search strings using "for item in "+ASM +ASM1 +ASM2"; do ...; done", if you want to get fancy. -
Give user Read-Only access to one table in a database.
Does anyone know how to give a user account read-only access to 1 table within a SQL Server database using SQL Server Management Studio? I don't want the account to be able to access any other tables in the database, just the one table. I'm not a SQL programmer,
so if there is a way to do it in SQL Server Management Studio settings, that would be best.
Using Management Studio, I assume you already have a login and user for that person. If not,
How to: Create a SQL Server Login http://msdn.microsoft.com/en-us/library/aa337562.aspx
How to: Create a Database User
http://msdn.microsoft.com/en-us/library/aa337545.aspx
1. Then, in Object Explorer, expand the Database, expand Tables, right-click the table you want, and then click Properties.
2. On the Permissions page, under Users or Roles, click Search, then Browse, etc., until you find the user. Click OK until you are back on the Permissions page.
3. In the Permissions for <user> section, find SELECT (that's the read permission) and click the Grant box. Then click OK.
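The same permission can also be granted with one T-SQL statement instead of the GUI, which helps when it must be repeated across environments (user and table names are hypothetical):

```sql
-- Read-only access to exactly one table; nothing else is granted.
GRANT SELECT ON dbo.TheOneTable TO SomeUser;
-- Undo later with:
-- REVOKE SELECT ON dbo.TheOneTable FROM SomeUser;
```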
Rick Byham, Microsoft, SQL Server Books Online, Implies no warranty -
Sql Loader by using shell script, not able to insert data
Hi,
I am trying to load the data by using a shell script (the shell script runs the sqlldr command; it's a host-executable concurrent program).
When I load the data with my files (.ctl, .prog, .csv, symbolic link for .prog) placed in $Custom_top/bin, it loads correctly: 17,000 records are inserted.
But if I load the data with my files placed in $custom_top/custom_folders, it is unable to insert all the data: only 43 records are inserted.
Please any one can help me.
Thanks in advance.
Rama.
Srini, thanks a lot for your reply.
Oracle Apps version R12,
Microsoft Windows XP Professional,
Version 2002, Service Pack 3
My Control file Script is:
load data
infile '$XADP_TOP/data/CPIU/in/XXOKS_Price_Increase.csv'
append
into table XXOKS_CONTRACT_PRICE_INCR_DTLS
fields terminated BY ',' optionally enclosed by '"'
TRAILING NULLCOLS
(EXCLUSION_FLAG,
LEGACY_NUMBER,
CUSTOMER_NUMBER,
CUSTOMER_NAME,
REQUEST_ID,
CONTRACT_NUMBER,
CONTRACT_START_DATE,
CONTRACT_END,
REQUEST_LINE_ID,
LINE_START_DATE,
LINE_END_DATE,
ITEM_NUMBER,
ITEM_DESCRIPTION,
UNIT_PRICE,
QTY,
NEW_UNIT_PRICE,
LINE_AMOUNT,
NEW_LINE_AMOUNT,
PRICE_INCREASED_DATE,
PERCENTAGE_INCREASED,
ORIGINAL_CONTRACT_AMOUNT,
NEW_CONTRACT_AMOUNT,
PRICE_INCREASE_AMOUNT)
My .prog file is below (note that I also created a symbolic link for the .prog).
if [ -z $XADP_TOP ];then
echo "XADP_TOP environment variable is not set!"
exit 1
fi
cd $XADP_TOP/data/CPIU/in
DATE=`date +%y%m%d:%H%M`
i_program_name="$0"
i_ora_pwd="$1"
i_user_id="$2"
i_user_name="$3"
i_request_id="$4"
i_ftp_host_name="$5"
i_ftp_user_name="$6"
i_ftp_user_password="$7"
ftp_prog() {
# FTP Function to reuse the FTP Commands
if [ $# -ne 6 ];then
echo "Usage : ftp_prog <Hostname> <User name> <Password> <Remote Directory> <command> <filename>"
exit 2
fi
l_ftp_host_name="$1"
l_ftp_user_name="$2"
l_ftp_user_password="$3"
l_ftpdir="$4"
l_ftp_command="$5"
l_ftp_filename="$6"
ftp -v -n ${l_ftp_host_name} <<EOF
user ${l_ftp_user_name} ${l_ftp_user_password}
ascii
cd ${l_ftpdir}
${l_ftp_command} ${l_ftp_filename}
quit
EOF
#exit $?
}
# setting the ftp directory
#ftpdir="/`echo ${TWO_TASK:-$ORACLE_SID}|tr "[A-Z]" "[a-z]"`/CPIU"
##ftpdir="/FinTEST/quoting/PS/ar"
ftpdir="$XADP_TOP/data/CPIU/in"
# setting the in directory and out directory
indir="$XADP_TOP/data/CPIU/in"
outdir="$XADP_TOP/data/CPIU/out"
ftp_prog ${i_ftp_host_name} ${i_ftp_user_name} ${i_ftp_user_password} ${ftpdir} get XXOKS_Price_Increase.csv
echo $ftpdir
echo "Converting the data file into unix mode"
dos2unix XXOKS_Price_Increase.csv XXOKS_Price_Increase.csv
chmod 777 XXOKS_Price_Increase.csv
cd $XADP_TOP/bin
echo "Trying to excute sqlldr and entering into the into control file"
$ORACLE_HOME/bin/sqlldr userid=$i_ora_pwd control=XXOKS_PRICE_INCR_LOAD log=$XADP_TOP/log/XXOKS_PRICE_INCR_LOAD_${DATE}.log;
exit_status=$?
echo "Checking the status and giving permissions to the data file which in in dir"
if [ $exit_status -eq 0 ]; then
cd $XADP_TOP/data/CPIU/in
chmod 777 XXOKS_Price_Increase.csv
echo "try to move data file into out dir"
# Moving the file to out directory
mv XXOKS_Price_Increase.csv ${outdir}/XXOKS_Price_Increase.csv_${DATE}
#echo "ready to zip file in out dir step6"
# Zipping the file
#gzip -f ${outdir}/XXOKS_Price_Increase.csv_${DATE}
echo "deleting the file which is in dir"
# Deleting the file from in directory
/bin/rm -f ${indir}/XXOKS_Price_Increase.csv
# Deleting from the remote directory
ftp_prog ${i_ftp_host_name} ${i_ftp_user_name} ${i_ftp_user_password} ${ftpdir} delete XXOKS_Price_Increase.csv
echo "sqlloader finished successfully."
else
echo "Error in loader"
##echo "Loader error in Price Increase Detials File ${i_file}"
fi
exit $exit_status
And My Log file Comments are
SQL*Loader: Release 10.1.0.5.0 - Production on Thu Dec 3 01:32:08 2009
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Control File: XXOKS_PRICE_INCR_LOAD.ctl
Data File: /oesapp/applmgr/GIS11/apps/apps_st/appl/xadp/12.0.0/data/CPIU/in/XXOKS_Price_Increase.csv
Bad File: XXOKS_Price_Increase.bad
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table XXOKS_CONTRACT_PRICE_INCR_DTLS, loaded from every logical record.
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
EXCLUSION_FLAG FIRST * , O(") CHARACTER
LEGACY_NUMBER NEXT * , O(") CHARACTER
CUSTOMER_NUMBER NEXT * , O(") CHARACTER
CUSTOMER_NAME NEXT * , O(") CHARACTER
REQUEST_ID NEXT * , O(") CHARACTER
CONTRACT_NUMBER NEXT * , O(") CHARACTER
CONTRACT_START_DATE NEXT * , O(") CHARACTER
CONTRACT_END NEXT * , O(") CHARACTER
REQUEST_LINE_ID NEXT * , O(") CHARACTER
LINE_START_DATE NEXT * , O(") CHARACTER
LINE_END_DATE NEXT * , O(") CHARACTER
ITEM_NUMBER NEXT * , O(") CHARACTER
ITEM_DESCRIPTION NEXT * , O(") CHARACTER
UNIT_PRICE NEXT * , O(") CHARACTER
QTY NEXT * , O(") CHARACTER
NEW_UNIT_PRICE NEXT * , O(") CHARACTER
LINE_AMOUNT NEXT * , O(") CHARACTER
NEW_LINE_AMOUNT NEXT * , O(") CHARACTER
PRICE_INCREASED_DATE NEXT * , O(") CHARACTER
PERCENTAGE_INCREASED NEXT * , O(") CHARACTER
ORIGINAL_CONTRACT_AMOUNT NEXT * , O(") CHARACTER
NEW_CONTRACT_AMOUNT NEXT * , O(") CHARACTER
PRICE_INCREASE_AMOUNT NEXT * , O(") CHARACTER
value used for ROWS parameter changed from 64 to 43
Table XXOKS_CONTRACT_PRICE_INCR_DTLS:
43 Rows successfully loaded.
0 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 255162 bytes(43 rows)
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 43
Total logical records rejected: 0
Total logical records discarded: 0
Run began on Thu Dec 03 01:32:08 2009
Run ended on Thu Dec 03 01:32:08 2009
Elapsed time was: 00:00:00.19
CPU time was: 00:00:00.04
Plz srini help me.
Thanks in advance
Rama.. -
SQL Loader - Set a field to a CONSTANT and the data is date
Hi.
I have a question on SQL Loader.
I want to load a field (field name is TIME_IN).
In the Oracle table, this field type is DATE.
In my csv file, I do not put any value in this field.
I intend to use a CONSTANT value for this field.
In my control file, the following is the definition for the above field.
However, when I run SQL*Loader, it returns an error message: INVALID MONTH.
I hope somebody can help and advise me on this matter.
Thanks.
INTO TABLE eqreceival_temp
FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"'
(
EQ_NO,
SIZE_TYPE,
CUSTOMER,
CONSTRUCTION,
QUALITY,
CONDITION,
DATE_IN DATE "DD/MM/YYYY",
TIME_IN CONSTANT '01/01/1999',
PARK_LOC CONSTANT 'CP',
STOCK CONSTANT 'S',
DISCHARGE_PORT CONSTANT 'MYPEN',
CTYPE CONSTANT 'P',
ROAD CONSTANT 'Y',
GATE CONSTANT 'Y',
SHIPMENT CONSTANT 'Y'
)
Use the TO_DATE function when you are inserting, or simply give SYSDATE. And first check: you may remove the CONSTANT keyword. -
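To make the reply above concrete: CONSTANT loads the literal text with no date conversion, which is why '01/01/1999' trips INVALID MONTH when the session's date format differs. A hedged fragment of the field definition (field name from the post; EXPRESSION and SYSDATE are documented SQL*Loader keywords, but check your release's Utilities guide):

```
TIME_IN EXPRESSION "TO_DATE('01/01/1999', 'DD/MM/YYYY')",
-- or, to stamp the load time instead:
-- TIME_IN SYSDATE,
```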
How to insert some records in one table and some records in another table
Interview question:
How do I insert records into two tables by using a trigger?
CREATE OR REPLACE TRIGGER Emp_Ins_Upd_Del_Trig
BEFORE DELETE OR INSERT OR UPDATE ON emp
FOR EACH ROW
BEGIN
if UPDATING then
UPDATE emp2
SET
empno = :new.empno,
ename = :new.ename,
--, job = :new.job
--, mgr = :new.mgr
--, hiredate = :new.hiredate
sal = :new.sal,
--, comm = :new.comm
--, deptno = :new.deptno
sdate = :new.sdate,
edate = :new.edate
WHERE empno = :old.empno;
end if;
if INSERTING then
INSERT INTO emp2
VALUES
( :new.empno
, :new.ename
--, :new.job
--, :new.mgr
--, :new.hiredate
, :new.sal
--, :new.comm
--, :new.deptno
, :new.sdate
, :new.edate);
end if;
if DELETING then
DELETE FROM emp2
WHERE empno = :old.empno;
end if;
END;
It is working fine, but he wants to insert a specific limit of records into one table and another specified limit of records into the other.
In this scenario, can I insert records by using a count of records?
Please help me.
Can you be more specific on the "limit"?
Conditional insert can be used in this case. -
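A hedged sketch of the conditional multi-table insert mentioned above, assuming the "limit" means a fixed row-count split (the table names emp_first/emp_rest and the threshold of 100 are purely illustrative):

```sql
-- Route the first 100 rows (by empno order) to one table, the rest to another.
INSERT FIRST
  WHEN rn <= 100 THEN
    INTO emp_first (empno, ename) VALUES (empno, ename)
  ELSE
    INTO emp_rest  (empno, ename) VALUES (empno, ename)
SELECT empno,
       ename,
       ROW_NUMBER() OVER (ORDER BY empno) AS rn
FROM   emp;
```

Unlike the trigger, this does the split in one statement at load time.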
Finding value in one table based on date = maximum date in another
I have an event table that shows event dates and attendees, but a customer now wants to see the titles of attendees at the time of the event. I have an audit table that shows changes in title and the dates on which each change occurred, however I am not sure how to write a SQL statement that finds the title whose date is less than the date of the event but greater than earlier title changes.
Example: in audit table I have:
name1, title1, 1/23/2011
name1, title2, 2/1/2012
name1, title3, 3/1/2013
name2, title1, 5/3/2012
name2, title2, 8/1/2013
In event table I have
event1, name1, 3/2/2012
event2, name1, 1/30/2011
event3, name1, 6/3/2013
event4, name1, 5/3/2012
event4, name2, 6/1/2012
event5, name2, 9/1/2013
Result I want is:
event1, name1, title2 (event date > 2/1/2012 (title2 date) but < 3/1/2013 (title3 date)
event2, name1, title1 (event date > 1/23/2011 but < other title dates)
event3, name1, title3 (event date > highest title date)
event4, name1, title2 same as first example above
event4, name2, title1
event5, name2, title2
Is it possible to get result with SQL alone?
Pat
Analytic function solution:
with audit_tbl as (
select 'name1' name,'title1' title,to_date('1/23/2011','mm/dd/yyyy') dt from dual union all
select 'name1','title2',to_date('2/1/2012','mm/dd/yyyy') from dual union all
select 'name1','title3',to_date('3/1/2013','mm/dd/yyyy') from dual union all
select 'name2','title1',to_date('5/3/2012','mm/dd/yyyy') from dual union all
select 'name2','title2',to_date('8/1/2013','mm/dd/yyyy') from dual
),
event_tbl as (
select 'event1' event,'name1' name,to_date('3/2/2012','mm/dd/yyyy') dt from dual union all
select 'event2','name1',to_date('1/30/2011','mm/dd/yyyy') from dual union all
select 'event3','name1',to_date('6/3/2013','mm/dd/yyyy') from dual union all
select 'event4','name1',to_date('5/3/2012','mm/dd/yyyy') from dual union all
select 'event4','name2',to_date('6/1/2012','mm/dd/yyyy') from dual union all
select 'event5','name2',to_date('9/1/2013','mm/dd/yyyy') from dual
)
select e.event,
e.name,
a.title
from event_tbl e,
(
select name,
title,
dt from_dt,
lead(dt - 1,1,date '9999-12-31') over(partition by name order by dt) to_dt
from audit_tbl
) a
where a.name = e.name
and e.dt between a.from_dt and a.to_dt
order by e.event,
e.name,
e.dt
/
EVENT NAME TITLE
event1 name1 title2
event2 name1 title1
event3 name1 title3
event4 name1 title2
event4 name2 title1
event5 name2 title2
6 rows selected.
SQL>
SY. -
Auto Populate Field in One Table with Primary Key from another table.
Greetings all,
I have created two tables: one for Root Cause, which holds the base description information of an analysis. Each root cause can have many corrective actions.
My Table structure is as follows:
RCCA TABLE:
=====================================
Column Name Data Type Nullable
RCCAID NUMBER No
DESCRIPTION VARCHAR2(4000) Yes
SUMMARY VARCHAR2(4000) Yes
OWNER VARCHAR2(4000) Yes
DATEOFINCIDENT DATE Yes
STATUS VARCHAR2(4000) Yes
CORRECTIVE ACTION TABLE
=====================================
Column Name Data Type Nullable
CAID NUMBER No
RCCAID NUMBER No
CANUMBER NUMBER Yes
CACTION VARCHAR2(4000) Yes
DATEDUE DATE Yes
COMMENTS VARCHAR2(4000) Yes
So I have a form that creates the RCCA, and then I have another form that I want to feed off of the first form. My thought was that when the RCCA is created, it would open a report of the RCCA, and then in another region of the page I would add the corrective action form. What I am looking to do is: when I press Create Corrective Action, it will automatically populate the RCCAID in the Corrective Action table so that it is associated directly with the RCCA. I don't want someone to have to know what the RCCAID is from the RCCA table, because they are auto-generated.
There may be a better way to do this and since I am new to APEX and to Oracle Databases, I am just going with what my logic tells me. Any assistance or thoughts would be appreciated.
Assuming there would be some type of trigger?
I will have to be able to view each RCCA and CA in a report that customers will see.
Thanks in Advance
WallyHi Debasis,
Have a look on this
Quick note on IDENTITY column in SAP HANA
Regards,
Krishna Tangudu -
Update one table based on condition from another table using date ranges
Hello!
I have two tables:
DateRange (consists of ranges of dates):
StartDate FinishDate
Condition
2014-01-02
2014-01-03 true
2014-01-03
2014-01-13
false
2014-01-13
2014-01-14 true
Calendar (consists of three years of dates):
CalendarDate IsParental
2014-01-01
2014-01-02
2014-01-03
2014-01-04
2014-01-05
2014-01-06
2014-01-07
2014-01-08
2014-01-09
2014-01-10
I want to update table Calendar by setting IsParental=1
for those dates that are contained in table DateRange between
StartDate and FinishDate AND Condition IS TRUE.
The query without a loop should look similar to this, but it works incorrectly:
UPDATE
Calendar
SET IsParental = 1
WHERE
CalendarDate BETWEEN
(SELECT
StartDate
FROM DateRange
WHERE Calendar. CalendarDate = DateRange. StartDate
AND
(SELECT StartDate
FROM DateRange
WHERE Calendar. CalendarDate = DateRange. FinishDate
AND Condition
IS TRUE
Is it possible to do without loop? Thank you for help!
Anastasia
Hi
Please post DDL+DML next time :-)
-- This is the DDL! create the database structure
create table DateRange(
StartDate DATE,
FinishDate DATE,
Condition BIT
)
GO
create table Calendar(
CalendarDate DATE,
IsParental BIT
)
GO
-- This is the DML (insert some sample data)
insert DateRange
values
('2014-01-02', '2014-01-03', 1),
('2014-01-03', '2014-01-13', 0),
('2014-01-13', '2014-01-14', 1)
GO
insert Calendar(CalendarDate)
values
('2014-01-01'),
('2014-01-02'),
('2014-01-03'),
('2014-01-04'),
('2014-01-05'),
('2014-01-06'),
('2014-01-07'),
('2014-01-08'),
('2014-01-09'),
('2014-01-10')
select * from DateRange
select * from Calendar
GO
-- This is the solution
select CalendarDate
from Calendar C
where EXISTS (
select C.CalendarDate
FROM DateRange D
where C.CalendarDate between D.StartDate and D.FinishDate and D.Condition = 1
)
UPDATE Calendar
SET IsParental = 1
from Calendar C
where EXISTS (
select C.CalendarDate
FROM DateRange D
where C.CalendarDate between D.StartDate and D.FinishDate and D.Condition = 1
)
[Personal Site] [Blog] [Facebook]
Maybe you are looking for
-
How large of a performance difference between two different graphic cards
Hey I am in the middle of upgrading my system and I am looking to spend my tax return on0 graphics cards. I am going to run an SLI with either the Nvidia GTX 660 or the GTX 680. Now I realize the 680 is hands down better than the 660 what I want to k
-
To Whom want to be helped using the webutil_102 Jacob_17 Form9i builder
Steps Running WEBUTIL Comes With 9i 1) File Distibution; After I unzipped the file webutil_102.ZIP; I distributed the file as follow In D:\Dev9i\forms90\java I have the following file: webutil.jar, jacob.jar, In C:\webutil\lib, I ha
-
Error in Local Message System: Error when opening an RFC connection Message
Hi Gurus, We are not able to create support messages in our ECC production system but it can created in Development and Quality. Support message created appears in solution manager system. ECC System displays error message as Error in Local Message S
-
My Lacie Thunderbolt drive is giving me the following message when I try to back up files. New Vault Could Not be Created The new vault could not be created because the file system of the destination volume is unsupported.
-
Post Author: vitalya CA Forum: Charts and Graphs Hello all, I use Crystal Reports XI. I have report with chart. Chart type is Bar. With Chart expert i defined colors for each type of data in the chart. When i preview my report chart colors looks fin