Date in shell script
How do I get a script to run from a particular time to a particular time?
like say from 5pm on 12/19 to 4pm on 12/20?
I have the steps that need to be done during that period.
Thanks!
Anand
How do I have a counter that does the following:
1. Parse through an access log that keeps getting updated regularly.
2. Check for the occurrence of a particular phrase every hour. Count the number of times that phrase occurs and save that value.
3. Do this every hour.
Something like...
while true
do
    count=$(grep -c Login access)   # number of times the phrase occurs
    echo "$count"                   # save the hourly count
    > access                        # flush previous content after counting
    sleep 3600                      # sleep for an hour
done
Thanks in advance.
Anand
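A runnable sketch of the loop above, factored so a single iteration can be exercised on its own. The log file name and the phrase "Login" are the poster's; the counts file is an assumption, and truncating a log that writers keep open can lose lines (a real setup would rotate the file instead):

```shell
#!/bin/sh
# One iteration of the hourly counter: count the phrase, report the
# count, then truncate the log so the next hour starts fresh.
count_logins() {
    log=$1
    n=$(grep -c Login "$log")   # grep -c prints the matching-line count
    : > "$log"                  # flush previous content after counting
    echo "$n"
}

# The hourly driver would then be:
#   while true; do count_logins access >> counts.log; sleep 3600; done
```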
Similar Messages
-
Pass in date from shell script to java program
Hi,
I need to pass in some parameter to my java program.
eg 'java SomeProgram yyyy mm dd hh mm ss'
How do I use the shell script to generate the values for the year, month, day, etc.?
The program is running once everyday.
I tried
$argu0 = `date+%y`
java SomeProgram $argu0
but it is not working... thanks for your help!
Then why would you want to use 'date' to provide 'now' to a Java program when you could just use new java.util.Date(), which probably makes the same call as the Unix 'date' command?
That works until you need to use a date other than 'now.'
public static final String ISO_DATE = "yyyy-MM-dd HH:mm:ss";

public static void main(String[] args) {
    SimpleDateFormat sdf = (SimpleDateFormat) SimpleDateFormat
            .getDateTimeInstance();
    sdf.applyPattern(ISO_DATE);
    try {
        Date d = sdf.parse(args[0]);
        // rest of code here
    } catch (ParseException e) {
        // handle exception
    }
}
Users will have to enclose their input in quotes, otherwise the JVM will treat their input as two parameters; or you can change the pattern to include a character between dd and HH. -
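For the original 'yyyy mm dd hh mm ss' calling convention, a minimal shell-side sketch that produces the six values as separate arguments (SomeProgram is the poster's name; whether it accepts the arguments in this order is an assumption):

```shell
#!/bin/sh
# Split the current timestamp into six positional parameters:
# year month day hour minute second.
set -- $(date '+%Y %m %d %H %M %S')
echo "java SomeProgram $1 $2 $3 $4 $5 $6"
```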
hi,
I have used the testing database, and in the testing database I have tested the script. If you can, please help me.
I have a backup from yesterday: if today's date is 16042007, I have a backup of 15042007.
What script should I use? I tried $date -d '%y%m%d' but got no output.
My aim is to delete the records from the previous day (subtract 1 day from today's date).
For example, today is 070416
and I want to delete the records from 070415, the previous day. How is that possible? -
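One way to compute yesterday's stamp in the poster's YYMMDD format, assuming GNU date (the `-d` option is a GNU extension; stock Solaris date does not have it):

```shell
#!/bin/sh
# Yesterday's date in YYMMDD form (GNU date only).
YESTERDAY=$(date -d 'yesterday' +%y%m%d)
echo "$YESTERDAY"
```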
Executing shell script through PL/SQL
Hi,
I need some help regarding execution of shell script through Oracle PL/SQL.
I have a shell script present in the /abc/xyz folder named search.sh. Through a PL/SQL procedure I am creating a file to store the report data.
I want to execute /abc/xyz/search.sh from the PL/SQL procedure to delete all files created more than 3 minutes ago.
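For context, a sketch of what such a search.sh might contain. GNU find's -mmin is assumed (stock Solaris find lacks it), and the demo below uses a temporary directory standing in for /abc/xyz:

```shell
#!/bin/sh
# Delete files older than 3 minutes from a directory.
DIR=$(mktemp -d)                          # stands in for /abc/xyz
touch -d '10 minutes ago' "$DIR/stale.csv"  # GNU touch: backdate a file
touch "$DIR/fresh.csv"
find "$DIR" -type f -mmin +3 -exec rm -f {} +
ls "$DIR"                                 # only fresh.csv should remain
```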
1. At first I took the Java route and got the following permissions granted for the RECON user.
GRANT USER SYS java.io.FilePermission <<ALL FILES>> execute ENABLED 351
GRANT USER SYS java.lang.RuntimePermission readFileDescriptor * ENABLED 350
GRANT USER SYS java.lang.RuntimePermission writeFileDescriptor * ENABLED 349
2. Then I created a simple Java class for executing an OS command, as below
CREATE OR REPLACE AND COMPILE JAVA SOURCE NAMED "OSCommand" AS
import java.io.*;
import java.util.*;
public class OSCommand{
    public static String Run(String Command){
        try{
            Process proc = Runtime.getRuntime().exec(Command);
            int ext = proc.waitFor();
            return ("0");
        }
        catch (Exception e){
            System.out.println("Error running command: " + Command +
                "\n" + e.getMessage());
            return(e.getMessage());
        }
    }
}
3. And a wrapper function as below to use this class
create or replace
FUNCTION OSCommand_Run(Command IN STRING)
RETURN VARCHAR2 IS
LANGUAGE JAVA
NAME 'OSCommand.Run(java.lang.String) return java.lang.String';
4. In my PL/SQL procedure I am using the following code to execute the command
v_Return := OSCommand_Run('/abc/xyz/search.sh');
to execute the shell script.
The procedure executes without any error and generates a new csv file with report data; however, the shell script does not get executed, and hence all csv files created earlier remain as they are in the folder.
Please help.
Sven W. wrote:
What happens if you remove the catch exception block from your java command?
I assume you still might have a permission issue. But it could be hidden from you, because the exception is caught and printed into nirvana.
I executed the wrapper function OSCOMMAND_RUN as below
DECLARE
v_Return VARCHAR2(2000);
BEGIN
v_Return := OSCOMMAND_RUN('/recon/html/invoice/search.sh' );
DBMS_OUTPUT.PUT_LINE('v_Return = ' || v_Return);
END;
And following is the result
v_Return = 0
Process exited.
In case of an exception it would have printed the exception.
One more thing I noticed: even though I have taken the following permissions
GRANT RECON SYS java.io.FilePermission /abc/* execute ENABLED 347
GRANT RECON SYS java.io.FilePermission /abc/xyz/* execute ENABLED 351
GRANT RECON SYS java.io.FilePermission <<ALL FILES>> execute ENABLED 352
GRANT RECON SYS java.lang.RuntimePermission readFileDescriptor * ENABLED 350
GRANT RECON SYS java.lang.RuntimePermission writeFileDescriptor * ENABLED 349
When I create a new search.sh in the /abc dir I get the following error:
v_Return = the Permission (java.io.FilePermission /abc/search.sh execute) has not been granted to RECON. The PL/SQL to grant this is dbms_java.grant_permission( 'RECON', 'SYS:java.io.FilePermission', '/abc/search.sh', 'execute' )
Edited by: 960702 on Sep 25, 2012 10:34 AM -
Please guide me with parameter passing at 3 levels. Here is the scenario.
a. A plsql-generateMaster.plsql- invokes stored procedure genMDetails(param1, query)
b. A shell script genM.sh invokes generateMaster.plsql
c. Need to pass date range as parameters (2 dates) to shell script.
d. The shell scripts accepts the date range parameters and passes them to -generateMaster.plsql
e. generateMaster.plsql uses the 2 date parameters as part of the query during the invocation of the stored procedure genMDetails(param1, query)
In short - shell script->plsql->stored procedure
Platform is Sun Unix 8, oracle db 9i R2
Thanks.
This script shows how to pass parameters to a PL/SQL anonymous block.
sqlplus "mob/mob" << EOF
set serveroutput on
begin
dbms_output.put_line('Parameter one: ' || '$1');
dbms_output.put_line('Parameter two: ' || '$2');
-- Here you can invoke procedure proc1($1,$2);
end;
EOF
You can just invoke it like this:
./passParameters.sh 12-13-1981 03-04-2006
Best Regards
Krystian Zieja / mob -
Shell scripts to read data from a text file and to load it into a table
Hi All,
I have a text file consisting of rows and columns as follows,
GEF001 000093625 MKL002510 000001 000000 000000 000000 000000 000000 000001
GEF001 000093625 MKL003604 000001 000000 000000 000000 000000 000000 000001
GEF001 000093625 MKL005675 000001 000000 000000 000000 000000 000000 000001
My requirement is that I should read the first 3 columns of this file using a shell script, and then I have to insert the data into a three-column table in Oracle.
The whole application is deployed on Unix, and the text file comes from a mainframe. I am working on the Unix side of the application and I can't access the data directly from the mainframe, so I am required to write a script which reads the data from the text file placed in a certain location and loads it into the Oracle database.
So I can't use SQL*Loader.
Please help me with this...
Thanks in advance.
1. Create a directory object in Oracle and point it to the folder where your file resides
2. Write a little procedure which opens the file in the newly created directory object using UTL_FILE and, inside a FOR LOOP, does INSERTs into the table you want
3. Create a shell script and call that procedure
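If instead the parsing is done on the Unix side, a minimal sketch that emits INSERT statements from the first three columns (the table name is a placeholder; in the real script the generated statements would be piped into sqlplus):

```shell
#!/bin/sh
# Read the first three whitespace-separated columns of each line and
# turn them into INSERT statements.
cat > /tmp/sample.txt <<'EOF'
GEF001 000093625 MKL002510 000001 000000
GEF001 000093625 MKL003604 000001 000000
EOF
while read -r c1 c2 c3 _; do
    echo "INSERT INTO target_table VALUES ('$c1', '$c2', '$c3');"
done < /tmp/sample.txt
# pipe the generated statements into: sqlplus -s user/pass
```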
You can use the post in my Blog for such issues
[Using Oracle UTL_FILE, UTL_SMTP packages and Linux Shell Scripting and Cron utility together|http://kamranagayev.wordpress.com/2009/02/23/using-oracle-utl_file-utl_smtp-packages-and-linux-shell-scripting-and-cron-utility-together-2/]
Kamran Agayev A. (10g OCP)
http://kamranagayev.wordpress.com -
Error reading data from Infocube using shell script.
Dear all ,
I am facing a problem while reading data from an infocube using a shell script.
The details are as follows.
One of the shell script reads the data from the infocube to extract files with the values.
The tables used for extraction by the shell script are :
from SAPR3."/BIC/F&PAR_CUBE.COPA" FCOPA,
SAPR3."/BIC/D&PAR_CUBE.COPAU" COPAU,
SAPR3."/BIC/D&PAR_CUBE.COPAP" COPAP,
SAPR3."/BIC/D&PAR_CUBE.COPA1" CCPROD,
SAPR3."/BIC/D&PAR_CUBE.COPA2" CCCUST,
SAPR3."/BIC/D&PAR_CUBE.COPA3" COPA3,
SAPR3."/BIC/D&PAR_CUBE.COPA4" COPA4,
SAPR3."/BIC/D&PAR_CUBE.COPA5" COPA5,
SAPR3."/BIC/MCCPROD" MCCPROD,
SAPR3."/BIC/SCCPROD" SCCPROD,
SAPR3."/BIC/MCCCUSTOM" MCCCUSTOM,
SAPR3."/BIC/SCCCUSTOM" SCCCUSTOM,
SAPR3."/BIC/SORGUNIT" SORGUNIT,
SAPR3."/BIC/SUNIMOYEAR" SUNIMOYEAR,
/* SAPR3."/BI0/SFISCPER" SFISCPER, */
SAPR3."/BI0/SREQUID" SREQUID,
SAPR3."/BI0/SCURRENCY" SCURRENCY,
SAPR3."/BIC/SSCENARIO" SSCENARIO,
SAPR3."/BIC/SSOURCE" SSOURCE
The problem is that the file generation by this script (after reading the data from the infocube) is taking an unexpected 2 hours, when it should take at most 10 minutes.
I used RSRV to get the info about these tables for the infocube:
Entry '00046174', SID = 37 in SID table is missing in master data table /BIC/MCUSLEVEL2
Entry '00081450', SID = 38 in SID table is missing in master data table /BIC/MCUSLEVEL2
and so on for SID = 39 and SID = 35 .
Checking of SID table /BIC/SCUSLEVEL2 produced errors
Checking of SID table /BIC/SCUSLEVEL3 produced errors
Can you please let me know if this can be a reason of delay in file generation (or reading of data from the infocube).
Also , Please let me know how to proceed with this issue.
Kindly let me know for more information, if required.
Thanks in advance for your help.
-Shalabh
Hi,
While searching for a solution to the problem, I managed to notice a difference in the partitioning of the infocube's fact table.
Using SE14 -> Storage Parameters, I could find the partition done for the fact table as :
PARTITION BY: RANGE
COLUMN_LIST: KEY_ABACOPA
and subsequently there are partitions with data in it.
I need to understand the details of these partitions .
Do they correspond to individual requests in the infocube? (This may not be the case, as there are 13 requests in the infocube and many more partitions.)
Most importantly, since this partitioning is observed for this infocube only and not for the other infocubes, it is possible that it is a reason for the SLOW RETRIEVAL of data from this infocube (though I am not sure, since partitioning is normally used to speed up retrieval from infocubes).
Kindly help.
Thanks for your co-operation in advance.
-Shalabh -
Shell Script- To FTP the latest file with the date stamp
I have a Solaris 10 based system which generates some application files (multiple) in XML format, where the file name includes the current date.
File Format is as follows;
CX-FIL-20070624000000-2-8452536d-000133.xml
Where 20070624 represents the current date of the file
I want to FTP these files to another server (Solaris 10 based Sun Machine) by comparing the file name with the current date.
Please let me know how I can do this using a shell script.
Assuming you want to ftp the files with today's datestamp, you could match the files you want like so:
CX-FIL-`/bin/date +"%Y%m%d"`*
Use that in your script to generate the file list to be transferred...
-Rob -
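Expanded into a minimal sketch (host, credentials, and remote directory would all need filling in, so the actual transfer is left commented out):

```shell
#!/bin/sh
# Build the glob that matches today's files, then feed it to ftp.
TODAY=$(/bin/date +"%Y%m%d")
PATTERN="CX-FIL-${TODAY}*"
echo "would transfer: $PATTERN"
# ftp -n remotehost <<EOF
# user ftpuser ftppass
# mput $PATTERN
# quit
# EOF
```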
Sql Loader by using shell script, not able to insert data
Hi,
I am trying to load the data using a shell script (the shell script contains the sqlldr command; it is a host-executable concurrent program).
When I load the data by placing my files (.ctl, .prog, .csv, symbolic link for .prog) in $CUSTOM_TOP/bin, it loads exactly: 17000 records are inserted.
But if I load the data by placing my files in $CUSTOM_TOP/custom_folders, I am unable to insert all the data: only 43 records are inserted.
Please any one can help me.
Thanks in advance.
Rama.
Srini, thanks a lot for your reply.
Oracle Apps version R12,
Microsoft Windows XP Professional
Version 2002, Service Pack 3
My Control file Script is:
load data
infile '$XADP_TOP/data/CPIU/in/XXOKS_Price_Increase.csv'
append
into table XXOKS_CONTRACT_PRICE_INCR_DTLS
fields terminated BY ',' optionally enclosed by '"'
TRAILING NULLCOLS
(EXCLUSION_FLAG,
LEGACY_NUMBER,
CUSTOMER_NUMBER,
CUSTOMER_NAME,
REQUEST_ID,
CONTRACT_NUMBER,
CONTRACT_START_DATE,
CONTRACT_END,
REQUEST_LINE_ID,
LINE_START_DATE,
LINE_END_DATE,
ITEM_NUMBER,
ITEM_DESCRIPTION,
UNIT_PRICE,
QTY,
NEW_UNIT_PRICE,
LINE_AMOUNT,
NEW_LINE_AMOUNT,
PRICE_INCREASED_DATE,
PERCENTAGE_INCREASED,
ORIGINAL_CONTRACT_AMOUNT,
NEW_CONTRACT_AMOUNT,
PRICE_INCREASE_AMOUNT)
My .prog file is as follows. Please find that I also created a symbolic link for my .prog.
if [ -z $XADP_TOP ];then
echo "XADP_TOP environment variable is not set!"
exit 1
fi
cd $XADP_TOP/data/CPIU/in
DATE=`date +%y%m%d:%H%M`
i_program_name="$0"
i_ora_pwd="$1"
i_user_id="$2"
i_user_name="$3"
i_request_id="$4"
i_ftp_host_name="$5"
i_ftp_user_name="$6"
i_ftp_user_password="$7"
ftp_prog() {
# FTP Function to reuse the FTP Commands
if [ $# -ne 6 ];then
echo "Usage : ftp_prog <Hostname> <User name> <Password> <Remote Directory> <command> <filename>"
exit 2
fi
l_ftp_host_name="$1"
l_ftp_user_name="$2"
l_ftp_user_password="$3"
l_ftpdir="$4"
l_ftp_command="$5"
l_ftp_filename="$6"
ftp -v -n ${l_ftp_host_name} <<EOF
user ${l_ftp_user_name} ${l_ftp_user_password}
ascii
cd ${l_ftpdir}
${l_ftp_command} ${l_ftp_filename}
quit
EOF
#exit $?
}
# setting the ftp directory
#ftpdir="/`echo ${TWO_TASK:-$ORACLE_SID}|tr "[A-Z]" "[a-z]"`/CPIU"
##ftpdir="/FinTEST/quoting/PS/ar"
ftpdir="$XADP_TOP/data/CPIU/in"
# setting the in directory and out directory
indir="$XADP_TOP/data/CPIU/in"
outdir="$XADP_TOP/data/CPIU/out"
ftp_prog ${i_ftp_host_name} ${i_ftp_user_name} ${i_ftp_user_password} ${ftpdir} get XXOKS_Price_Increase.csv
echo $ftpdir
echo "Converting the data file into unix mode"
dos2unix XXOKS_Price_Increase.csv XXOKS_Price_Increase.csv
chmod 777 XXOKS_Price_Increase.csv
cd $XADP_TOP/bin
echo "Trying to execute sqlldr using the control file"
$ORACLE_HOME/bin/sqlldr userid=$i_ora_pwd control=XXOKS_PRICE_INCR_LOAD log=$XADP_TOP/log/XXOKS_PRICE_INCR_LOAD_${DATE}.log;
exit_status=$?
echo "Checking the status and giving permissions to the data file which is in the in dir"
if [ $exit_status -eq 0 ]; then
cd $XADP_TOP/data/CPIU/in
chmod 777 XXOKS_Price_Increase.csv
echo "try to move data file into out dir"
# Moving the file to out directory
mv XXOKS_Price_Increase.csv ${outdir}/XXOKS_Price_Increase.csv_${DATE}
#echo "ready to zip file in out dir step6"
# Zipping the file
#gzip -f ${outdir}/XXOKS_Price_Increase.csv_${DATE}
echo "deleting the file which is in dir"
# Deleting the file from in directory
/bin/rm -f ${indir}/XXOKS_Price_Increase.csv
# Deleting from the remote directory
ftp_prog ${i_ftp_host_name} ${i_ftp_user_name} ${i_ftp_user_password} ${ftpdir} delete XXOKS_Price_Increase.csv
echo "sqlloader finished successfully."
else
echo "Error in loader"
##echo "Loader error in Price Increase Detials File ${i_file}"
fi
exit $exit_status
And my log file contents are:
SQL*Loader: Release 10.1.0.5.0 - Production on Thu Dec 3 01:32:08 2009
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Control File: XXOKS_PRICE_INCR_LOAD.ctl
Data File: /oesapp/applmgr/GIS11/apps/apps_st/appl/xadp/12.0.0/data/CPIU/in/XXOKS_Price_Increase.csv
Bad File: XXOKS_Price_Increase.bad
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table XXOKS_CONTRACT_PRICE_INCR_DTLS, loaded from every logical record.
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
EXCLUSION_FLAG FIRST * , O(") CHARACTER
LEGACY_NUMBER NEXT * , O(") CHARACTER
CUSTOMER_NUMBER NEXT * , O(") CHARACTER
CUSTOMER_NAME NEXT * , O(") CHARACTER
REQUEST_ID NEXT * , O(") CHARACTER
CONTRACT_NUMBER NEXT * , O(") CHARACTER
CONTRACT_START_DATE NEXT * , O(") CHARACTER
CONTRACT_END NEXT * , O(") CHARACTER
REQUEST_LINE_ID NEXT * , O(") CHARACTER
LINE_START_DATE NEXT * , O(") CHARACTER
LINE_END_DATE NEXT * , O(") CHARACTER
ITEM_NUMBER NEXT * , O(") CHARACTER
ITEM_DESCRIPTION NEXT * , O(") CHARACTER
UNIT_PRICE NEXT * , O(") CHARACTER
QTY NEXT * , O(") CHARACTER
NEW_UNIT_PRICE NEXT * , O(") CHARACTER
LINE_AMOUNT NEXT * , O(") CHARACTER
NEW_LINE_AMOUNT NEXT * , O(") CHARACTER
PRICE_INCREASED_DATE NEXT * , O(") CHARACTER
PERCENTAGE_INCREASED NEXT * , O(") CHARACTER
ORIGINAL_CONTRACT_AMOUNT NEXT * , O(") CHARACTER
NEW_CONTRACT_AMOUNT NEXT * , O(") CHARACTER
PRICE_INCREASE_AMOUNT NEXT * , O(") CHARACTER
value used for ROWS parameter changed from 64 to 43
Table XXOKS_CONTRACT_PRICE_INCR_DTLS:
43 Rows successfully loaded.
0 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 255162 bytes(43 rows)
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 43
Total logical records rejected: 0
Total logical records discarded: 0
Run began on Thu Dec 03 01:32:08 2009
Run ended on Thu Dec 03 01:32:08 2009
Elapsed time was: 00:00:00.19
CPU time was: 00:00:00.04
Please help me, Srini.
Thanks in advance
Rama.. -
Data guard monitoring shell script
uname -a
Linux DG1 2.6.18-164.el5 #1 SMP Thu Sep 3 03:28:30 EDT 2009 x86_64 x86_64 x86_64 GNU/Linux
SQL> select * from v$version;
BANNER
Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
PL/SQL Release 11.2.0.1.0 - Production
CORE 11.2.0.1.0 Production
TNS for Linux: Version 11.2.0.1.0 - Production
NLSRTL Version 11.2.0.1.0 - Production
Hi Guys,
I am looking for a shell script that I can cron, which monitors a Data Guard env (10g and 11g) and sends email alerts if DR goes out of sync by, say, 10 or 15 logs.
I found a couple on the net but they are not working for some reason:
http://emrebaransel.blogspot.com/2009/07/shell-script-to-check-dataguard-status.html
If you guys have some, please share.
You are using an advanced version of Oracle and want to plug an obsolete script into it??
Why not just monitor the Data Guard with EM or Grid Control and set up emails in there? It is 100% more reliable than anything else.
Shell Script Programming -- Loading data into table
Hello Gurus
I am using Oracle's SQL*Loader utility to load data into a table. Lately, I got an unlikely scenario wherein I need to process the data file before loading it into the table, and that is where I need help from you guys.
Consider the following data line
"Emp", DOB, Gender, Subject
"1",01/01/1980,"M","Physics:01/05/2010"
"2",01/01/1981,"M","Chemistry:02/05/2010|Maths:02/06/2011"
"3",01/01/1982,"M","Maths:03/05/2010|Physics:06/07/2010|Chemistry:08/09/2011"
"4",01/01/1983,"M","Biology:09/09/2010|English:10/10/2010"
Employee 1 will get loaded as a single record in the table. But I need to put the Subject value into two separate fields in the table, i.e. Physics into one column and the date 01/05/2010 into a separate column.
Here the big problem starts.
Employee 2 should get loaded as 2 records into the table. The first record should have Chemistry as the subject and the date 02/05/2010, and the next record should have all other fields the same except that the subject should be Maths and the date 02/06/2011. The subjects are separated by a pipe "|" in the data file.
Similarly, Employee 3 should get loaded as 3 records. One as Maths, second as Physics and third as Chemistry along with their respective dates.
I hope I have made my problem clear to everyone.
I am looking to do something in shell scripting such that, before finally running the SQL*Loader script, the above 4 employees have their records repeated as many times as they have subjects.
In summary 2 problems are described above.
1. To load subject and date into 2 separate fields in Oracle table at the time of load.
2. If there exist multiple subjects, then the record is to be loaded as many times as the employee has subjects.
Any help would be much appreciated.
Thanks.
Here are some comments. Perl can be a little cryptic, but once you get used to it, it can be pretty powerful.
#!/usr/bin/perl -w
my $line_count = 0;
open FILE, "test_file" or die $!;
# Read each line from the file.
while (my $line = <FILE>) {
    # Print the header if it is the first line.
    if ($line_count == 0) {
        chomp($line);
        print $line . ", Date\n";
        ++$line_count;
        next;
    }
    # Get all the columns (as separated by ',') into an array.
    my @columns = split(',', $line);
    # Remove the newline from the fourth column.
    chomp($columns[3]);
    # Read the fields (separated by pipe) from the fourth column into an array.
    my @subject_and_date = split('\|', $columns[3]);
    # Loop for each subject and date.
    foreach my $sub_and_date (@subject_and_date) {
        # Print value of Emp, DOB, and Gender first.
        print $columns[0] . "," . $columns[1] . "," . $columns[2] . ",";
        # Remove all double quotes from the subject and date string.
        $sub_and_date =~ s/"//g;
        # Replace ':' with '","'
        $sub_and_date =~ s/:/","/;
        print '"' . $sub_and_date . '"' . "\n";
    }
    ++$line_count;
}
close FILE;
Shell script to monitor the data guard
Hi,
Can any body please provide the shell scripts to monitor the data guard in all scenarios and to get the mail when problem occurs in dataguard.
Thanks,
Mahipal
Sorry Mahi. Looks like all of the scripts I've got are for logical standbys and not physical. Have a look at the link ualual posted - easy enough to knock up a script from one or more of those data dictionary views. I just had a look on Metalink and there's what looks to be a good script in note 241438.1. It's definitely a good starting point.
regards,
Mark -
Shell scripts to monitor data guard
Hi All,
Please help me to have the shell scripts for monitoring the data guard.
Thanks,
Mahi
Here is the shell script we use to monitor Data Guard; it sends mail if there is a gap of more than 20 archive logs.
#set Oracle environment for Sql*Plus
#ORACLE_BASE=/oracle/app/oracle ; export ORACLE_BASE
ORACLE_HOME=/oracle/app/oracle/product/10.2.0 ; export ORACLE_HOME
ORACLE_SID=usagedb ; export ORACLE_SID
PATH=$PATH:/oracle/app/oracle/product/10.2.0/bin
#set working directory. script is located here..
cd /oracle/scripts
#Problem statement is constructed in the MESSAGE variable
MESSAGE=""
#hostname of the primary DB.. used in messages..
HOST_NAME=`/usr/bin/hostname`
#who will receive problem messages.. DBAs' e-mail addresses separated with spaces
DBA_GROUP='[email protected] '
#SQL statements to extract Data Guard info from DB
LOCAL_ARC_SQL='select archived_seq# from V$ARCHIVE_DEST_STATUS where dest_id=1; \n exit \n'
STBY_ARC_SQL='select archived_seq# from V$ARCHIVE_DEST_STATUS where dest_id=2; \n exit \n'
STBY_APPLY_SQL='select applied_seq# from V$ARCHIVE_DEST_STATUS where dest_id=2; \n exit \n'
#Get Data guard information to Unix shell variables...
LOCAL_ARC=`echo $LOCAL_ARC_SQL | sqlplus -S / as sysdba | tail -2|head -1`
STBY_ARC=`echo $STBY_ARC_SQL | sqlplus -S / as sysdba | tail -2|head -1`
STBY_APPLY=`echo $STBY_APPLY_SQL | sqlplus -S / as sysdba | tail -2|head -1`
#Allow 20 archive logs for transport and Apply latencies...
let "STBY_ARC_MARK=${STBY_ARC}+20"
let "STBY_APPLY_MARK= ${STBY_APPLY}+20"
if [ $LOCAL_ARC -gt $STBY_ARC_MARK ] ; then
MESSAGE=${MESSAGE}"$HOST_NAME Standby -log TRANSPORT- error! \n local_Arc_No=$LOCAL_ARC but stby_Arc_No=$STBY_ARC \n"
fi
if [ $STBY_ARC -gt $STBY_APPLY_MARK ] ; then
MESSAGE=${MESSAGE}"$HOST_NAME Standby -log APPLY- error! \n stby_Arc_No=$STBY_ARC but stby_Apply_no=$STBY_APPLY \n"
fi
if [ -n "$MESSAGE" ] ; then
MESSAGE=${MESSAGE}"\nWarning: dataguard error!!! \n .\n "
echo $MESSAGE | mailx -s "$HOST_NAME DataGuard error" $DBA_GROUP
fi -
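To run a monitor like this from cron every ten minutes, the crontab entry might look like the following (the script path is an assumption; redirecting output keeps cron from mailing every run):

```shell
# m    h dom mon dow  command
*/10 * * * * /oracle/scripts/dg_monitor.sh >/dev/null 2>&1
```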
Catch a value from a pl*sql function in a shell script
Hi all,
I have a shell script that simply calls the following pl*sql function.
echo "execute scott.my_pkg.test('FDLmaster');\n exit;" >./pippo.sql
sqlplus scott/tiger @/fidcap_ftp/FDL/SCRIPTS/pippo.sql
What do I have to do to catch the value returned from the function test?
Thanks in advance
best regards
Mario
SQL> create or replace function do_something return varchar2 is
2 begin
3 return ('Something');
4* end;
SQL> /
Function created.
SQL> select do_something from dual;
DO_SOMETHING
Something
SQL> save pippo
Created file pippo.sql
SQL> exit
Disconnected from Oracle Database 10g Enterprise Edition Release 10.1.0.3.0 - Production
With the Partitioning, OLAP and Data Mining options
[linuxas tmp test10]$ echo exit >>pippo.sql
[linuxas tmp test10]$ VAR=`sqlplus -s scott/tiger @pippo`
[linuxas tmp test10]$ echo $VAR | cut -f3 -d" "
Something
[linuxas tmp test10]$ -
Problem-Report generation using shell script
Hi
We have the Production database and the Reporting database (copy of Production database),
both on Oracle 9.2.0.5 and Solaris 5.8. There is a package inside the Oracle database, which extracts some data from inside the
database, and would generate a report. A shell script has been created in Solaris which would
send in the parameters and call the package to generate the report. The parameters it is sending are
the name of report to be generated, and the location where it is to be generated, both hard-coded into
the script. The script is scheduled to run through crontab.
The problem we are facing is that, if we run the script for Reporting database, it successfully
generates the report. But if we use that script for Production database, it gives the error
"Invalid directory Path". I have tried using various other directory paths, even '/tmp'
and '/', but it still gives the same error when executed for the Production database.
Could somebody provide any ideas what might be going wrong.
The reasons it is to be executed on Prod db and not the Reporting database are unavoidable.
It anyway runs in off business hours and takes about 10secs to execute.
Please do let me know if there is any other info that I missed to provide here.
Thanks in advance...
I will just be guessing, because you didn't provide the contents of the script and package.
The "Invalid directory path", as you said, could be ORA-29280 due to a non-existent directory.
Try executing (as sys or system) select * from dba_directories; (or select * from all_directories; as the user the script logs in as) on both databases and compare the results. If your important directory is missing, create it using create directory <dirname> as '<path>'; (from sqlplus, and don't forget to grant rights to the user).
This error could also come from the shell script. In that case you should find the resolution yourself, because you didn't provide the script source.