Is the syntax correct to export tables using shell script
Can you please tell me the syntax to place the exp utility in a shell script?
Appreciate your help
Is the syntax correct for the export?
It is throwing errors.
vi test.sh
echo migrating data...
echo Exporting ADDRESS_COUNTRY_CODE ...
exp userid=scott/tiger@orcl file=address_country_code log=exp_address_country_code tables=ADDRESS_COUNTRY_CODE compress=n indexes=n constraints=n grants=n triggers=n statistics=none consistent=y query='where org_grp_i in ( 66 )' > /dev/null
echo Exporting ASSUMPTION_DETAIL_SUMMARY ...
exp userid=scott/tiger@orcl file=assumption_detail_summary log=exp_assumption_detail_summary tables=ASSUMPTION_DETAIL_SUMMARY compress=n indexes=n constraints=n grants=n triggers=n statistics=none consistent=y query='where org_grp_i in ( 66 )' > /dev/null
Just follow the notes and the example in the utilities guide
http://download-uk.oracle.com/docs/cd/B10501_01/server.920/a96652/ch01.htm#1005843
exp scott/tiger TABLES=emp QUERY=\"WHERE job=\'SALESMAN\' and sal \<1600\"
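One common cause of errors here is the shell mangling the quotes in the QUERY clause. A hedged sketch (the file names are assumptions, and the query is untested against your schema): put the parameters in a parfile so the shell never interprets the quotes at all.

```shell
#!/bin/sh
# Sketch: write the export parameters to a parfile so the QUERY quoting
# is read by exp directly instead of being interpreted by the shell.
PARFILE=/tmp/exp_address_country_code.par
cat > "$PARFILE" <<'EOF'
tables=ADDRESS_COUNTRY_CODE
file=address_country_code.dmp
log=exp_address_country_code.log
compress=n
indexes=n
constraints=n
grants=n
triggers=n
statistics=none
consistent=y
query="where org_grp_i in (66)"
EOF
# On a host with the Oracle client installed, run:
# exp userid=scott/tiger@orcl parfile=$PARFILE
echo "wrote $PARFILE"
```

The same parfile pattern works for each table in the script; only the userid needs to stay on the command line.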
Similar Messages
-
Sqlldr flat file to external table using shell scripts
Hi,
Has anyone done this before? Please give me a hand.
Thanks!
Thanks Justin.
When do I need to create the external table EMP_STAGING ?
These are my steps so far:
- shell script to create the flat file (but I need to change the table name to EMP_STAGING)
- use a script to call sqlldr to load the flat file into the external table
- then the script will call the MERGE sql script to merge the data from the external table into the database table
Am I on the right track?
In which stage should I create and drop the external table?
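A hedged sketch of the overall flow (the table and column names below are assumptions): with an external table you create EMP_STAGING once, pointing it at the flat file's directory, and you do not need sqlldr at all, since the external table reads the flat file directly. Each run then only replaces the flat file and executes the MERGE.

```shell
#!/bin/sh
# Sketch: generate the one-time DDL for the external table and the
# per-run MERGE script. EMP_STAGING, EMP, the columns and the DATA_DIR
# directory object are all assumed names.
cat > /tmp/create_emp_staging.sql <<'EOF'
CREATE TABLE emp_staging (
  empno  NUMBER,
  ename  VARCHAR2(30),
  sal    NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('emp.dat')
);
EOF
cat > /tmp/merge_emp.sql <<'EOF'
MERGE INTO emp e
USING emp_staging s ON (e.empno = s.empno)
WHEN MATCHED THEN UPDATE SET e.ename = s.ename, e.sal = s.sal
WHEN NOT MATCHED THEN INSERT (empno, ename, sal)
VALUES (s.empno, s.ename, s.sal);
EOF
# Per run, after dropping the new flat file into the directory:
#   sqlplus -s user/pass @/tmp/merge_emp.sql
echo "scripts generated"
```

So the external table is created (and dropped, if ever) outside the nightly run; the script itself only swaps the file and runs the MERGE.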
Thanks! -
Generating Text file from table using Shell script
I am using KSH for generating and FTPing a text file from a table.
While generating the text file I am not getting my column names in an orderly manner.
q2="select COLUMN1||' '||COLUMN2||' '||COLUMN3 from table1;"
sqlplus -s $db_user/$db_pass <<EOF
set pagesize 0
set head off
set trimspool on
set trimout on
set colsep ' '
set linesize 1500
spool /ss/app11/oastss/reports/$file2
select 'COLUMN1'||' '||'COLUMN2'||' '||'COLUMN3' from dual;
$q2
spool off;
EOF
I am getting the result somewhat like below in the text file
COLUMN1 COLUMN2 COLUMN3
MALLIK_ACCT 17-SEP-11 908030482
MALLIK_ACCT 17-SEP-11 908266967
MALLIK_ACCT 17-SEP-11 909570766
I want the format like below
COLUMN1........ COLUMN2 .... COLUMN3
MALLIK_ACCT ...17-SEP-11 .... 908030482
MALLIK_ACCT ...17-SEP-11 .... 908266967
MALLIK_ACCT ...17-SEP-11 .... 909570766
I put dots(.) for illustration purpose.
The column data length may increase sometimes. It should automatically adjust the columns and data so that they are in alignment. Thanks in advance.
Mallik wrote:
Hi, my question is how to format the headers so that they will be in alignment with the column data and readable.
So you want to output a query as a fixed-width format data file? How about this (rather than using scripts)...
As sys user:
CREATE OR REPLACE DIRECTORY TEST_DIR AS '/tmp/myfiles'
/
GRANT READ, WRITE ON DIRECTORY TEST_DIR TO myuser
/
As myuser:
CREATE OR REPLACE PROCEDURE run_query(p_sql IN VARCHAR2
,p_dir IN VARCHAR2
,p_header_file IN VARCHAR2
,p_data_file IN VARCHAR2 := NULL) IS
v_finaltxt VARCHAR2(4000);
v_v_val VARCHAR2(4000);
v_n_val NUMBER;
v_d_val DATE;
v_ret NUMBER;
c NUMBER;
d NUMBER;
col_cnt INTEGER;
f BOOLEAN;
rec_tab DBMS_SQL.DESC_TAB;
col_num NUMBER;
v_fh UTL_FILE.FILE_TYPE;
v_samefile BOOLEAN := (NVL(p_data_file,p_header_file) = p_header_file);
BEGIN
c := DBMS_SQL.OPEN_CURSOR;
DBMS_SQL.PARSE(c, p_sql, DBMS_SQL.NATIVE);
d := DBMS_SQL.EXECUTE(c);
DBMS_SQL.DESCRIBE_COLUMNS(c, col_cnt, rec_tab);
FOR j in 1..col_cnt
LOOP
CASE rec_tab(j).col_type
WHEN 1 THEN DBMS_SQL.DEFINE_COLUMN(c,j,v_v_val,2000);
WHEN 2 THEN DBMS_SQL.DEFINE_COLUMN(c,j,v_n_val);
WHEN 12 THEN DBMS_SQL.DEFINE_COLUMN(c,j,v_d_val);
ELSE
DBMS_SQL.DEFINE_COLUMN(c,j,v_v_val,2000);
END CASE;
END LOOP;
-- This part outputs the HEADER
v_fh := UTL_FILE.FOPEN(upper(p_dir),p_header_file,'w',32767);
FOR j in 1..col_cnt
LOOP
CASE rec_tab(j).col_type
WHEN 1 THEN v_finaltxt := v_finaltxt||rpad(lower(rec_tab(j).col_name),rec_tab(j).col_max_len,' ');
WHEN 2 THEN v_finaltxt := v_finaltxt||rpad(lower(rec_tab(j).col_name),rec_tab(j).col_max_len,' ');
WHEN 12 THEN v_finaltxt := v_finaltxt||rpad(lower(rec_tab(j).col_name),greatest(19,length(rec_tab(j).col_name)),' ');
END CASE;
END LOOP;
UTL_FILE.PUT_LINE(v_fh, v_finaltxt);
IF NOT v_samefile THEN
UTL_FILE.FCLOSE(v_fh);
END IF;
-- This part outputs the DATA
IF NOT v_samefile THEN
v_fh := UTL_FILE.FOPEN(upper(p_dir),p_data_file,'w',32767);
END IF;
LOOP
v_ret := DBMS_SQL.FETCH_ROWS(c);
EXIT WHEN v_ret = 0;
v_finaltxt := NULL;
FOR j in 1..col_cnt
LOOP
CASE rec_tab(j).col_type
WHEN 1 THEN DBMS_SQL.COLUMN_VALUE(c,j,v_v_val);
v_finaltxt := v_finaltxt||rpad(nvl(v_v_val,' '),rec_tab(j).col_max_len,' ');
WHEN 2 THEN DBMS_SQL.COLUMN_VALUE(c,j,v_n_val);
v_finaltxt := v_finaltxt||rpad(nvl(to_char(v_n_val,'fm99999999999999999999999999999999999999'),' '),rec_tab(j).col_max_len,' ');
WHEN 12 THEN DBMS_SQL.COLUMN_VALUE(c,j,v_d_val);
v_finaltxt := v_finaltxt||rpad(nvl(to_char(v_d_val,'DD/MM/YYYY HH24:MI:SS'),' '),greatest(19,length(rec_tab(j).col_name)),' ');
END CASE;
END LOOP;
UTL_FILE.PUT_LINE(v_fh, v_finaltxt);
END LOOP;
UTL_FILE.FCLOSE(v_fh);
DBMS_SQL.CLOSE_CURSOR(c);
END;
This allows for the header row and the data to be written to separate files if required.
e.g.
SQL> exec run_query('select * from emp','TEST_DIR','output.txt');
PL/SQL procedure successfully completed.
The output.txt file contains:
empno ename job mgr hiredate sal comm deptno
7369 SMITH CLERK 7902 17/12/1980 00:00:00 800 20
7499 ALLEN SALESMAN 7698 20/02/1981 00:00:00 1600 300 30
7521 WARD SALESMAN 7698 22/02/1981 00:00:00 1250 500 30
7566 JONES MANAGER 7839 02/04/1981 00:00:00 2975 20
7654 MARTIN SALESMAN 7698 28/09/1981 00:00:00 1250 1400 30
7698 BLAKE MANAGER 7839 01/05/1981 00:00:00 2850 30
7782 CLARK MANAGER 7839 09/06/1981 00:00:00 2450 10
7788 SCOTT ANALYST 7566 19/04/1987 00:00:00 3000 20
7839 KING PRESIDENT 17/11/1981 00:00:00 5000 10
7844 TURNER SALESMAN 7698 08/09/1981 00:00:00 1500 0 30
7876 ADAMS CLERK 7788 23/05/1987 00:00:00 1100 20
7900 JAMES CLERK 7698 03/12/1981 00:00:00 950 30
7902 FORD ANALYST 7566 03/12/1981 00:00:00 3000 20
7934 MILLER CLERK 7782 23/01/1982 00:00:00 1300 10
The procedure allows for the header and data to go to separate files if required. Just specifying the "header" filename will put the header and data in the one file.
Adapt to output different datatypes and styles as required (this is currently coded for VARCHAR2, NUMBER and DATE) -
What is the best way to export for use on internet?
what is the best way to export for use on internet?
It depends. Is this for a personal web site or for a site like YouTube, Vimeo or Facebook?
For YouTube, Vimeo and Facebook, use Publish & Share/Computer/AVCHD using one of the YouTube presets. -
Get the daily incremental search crawl information using PowerShell script
Dear Friends ,
I want to get the daily incremental search crawl information using a PowerShell script. I need this information in CSV or txt format. Can you please help me with this.
valmiki
Hi
I have got the below script, which worked.
## SharePoint Reference
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Administration")
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.Office.Server.Search.Administration")
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.Office.Server.Search")
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.Office.Server")
function global:Get-CrawlHistory($url)
{
trap [Exception] {
write-error $("ERROR: " + $_.Exception.GetType().FullName);
write-error $("ERROR: " + $_.Exception.Message);
continue;
}
$s = new-Object Microsoft.SharePoint.SPSite($url);
$c = [Microsoft.Office.Server.Search.Administration.SearchContext]::GetContext($s);
$h = new-Object Microsoft.Office.Server.Search.Administration.CrawlHistory($c);
Write-Output $h.GetCrawlHistory();
$s.Dispose();
}
Get-CrawlHistory -url http://your_site_url/ | Export-Csv F:\temp\search.csv
valmiki -
How to get ORA errors in alertlog file using shell script.
Hi,
Can anyone tell me how to get all ORA- errors between two particular times from the alert log file using a shell script.
Thanks
Hi,
You can define the alert log as an external table, and extract messages with SQL, very cool:
http://www.dba-oracle.com/t_oracle_alert_log_sql_external_tables.htm
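A hedged sketch of that external-table idea (the bdump path, directory object and table name below are assumptions for your environment): map the alert log as a one-column external table so ORA- lines can be pulled out with plain SQL.

```shell
#!/bin/sh
# Sketch: generate DDL that exposes the alert log as an external table.
# The bdump path and SID are assumptions; adjust for your instance.
cat > /tmp/alert_log_ext.sql <<'EOF'
CREATE OR REPLACE DIRECTORY bdump_dir AS '/u01/app/oracle/admin/orcl/bdump';
CREATE TABLE alert_log_ext (msg VARCHAR2(4000))
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY bdump_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS (msg CHAR(4000))
  )
  LOCATION ('alert_orcl.log')
);
-- then, for example:
-- SELECT msg FROM alert_log_ext WHERE msg LIKE 'ORA-%';
EOF
echo "wrote /tmp/alert_log_ext.sql"
```

Filtering between two times still needs the date lines correlated with the error lines, which is what the shell/awk approach below does.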
If you want to write a shell script to scan the alert log, see here:
http://www.rampant-books.com/book_2007_1_shell_scripting.htm
#!/bin/ksh
# log monitoring script
# report all errors (and specific warnings) in the alert log
# which have occurred since the date
# and time in last_alerttime_$ORACLE_SID.txt
# parameters:
# 1) ORACLE_SID
# 2) optional alert exclusion file [default = alert_logmon.excl]
# exclude file format:
# error_number error_number
# error_number ...
# i.e. a string of numbers with the ORA- and any leading zeroes that appear
# e.g. (NB the examples are NOT normally excluded)
# ORA-07552 ORA-08006 ORA-12819
# ORA-01555 ORA-07553
BASEDIR=$(dirname $0)
if [ $# -lt 1 ]; then
echo "usage: $(basename $0) ORACLE_SID [exclude file]"
exit -1
fi
export ORACLE_SID=$1
if [ ! -z "$2" ]; then
EXCLFILE=$2
else
EXCLFILE=$BASEDIR/alert_logmon.excl
fi
LASTALERT=$BASEDIR/last_alerttime_$ORACLE_SID.txt
if [ ! -f $EXCLFILE ]; then
echo "alert exclusion ($EXCLFILE) file not found!"
exit -1
fi
# establish alert file location
export ORAENV_ASK=NO
export PATH=$PATH:/usr/local/bin
. oraenv
DPATH=`sqlplus -s "/ as sysdba" <<!EOF
set pages 0
set lines 160
set verify off
set feedback off
select replace(value,'?','$ORACLE_HOME')
from v\\\$parameter
where name = 'background_dump_dest';
!EOF
`
if [ ! -d "$DPATH" ]; then
echo "Script Error - bdump path returned as $DPATH is not a directory"
exit -1
fi
ALOG=${DPATH}/alert_${ORACLE_SID}.log
# now create awk file
cat > $BASEDIR/awkfile.awk<<!EOF
BEGIN {
# first get excluded error list
excldata="";
while (getline < "$EXCLFILE" > 0)
{ excldata=excldata " " \$0; }
print excldata
# get time of last error
if ((getline < "$LASTALERT") < 1)
{ olddate = "00000000 00:00:00" }
else
{ olddate=\$0; }
errct = 0; errfound = 0;
}
{ if ( \$0 ~ /Sun|Mon|Tue|Wed|Thu|Fri|Sat/ )
{ if (dtconv(\$3, \$2, \$5, \$4) <= olddate)
{ # get next record from file
next; }
# here we are now processing errors
OLDLINE=\$0; # store date, possibly of error, or else to be discarded
while (getline > 0)
{ if (\$0 ~ /Sun|Mon|Tue|Wed|Thu|Fri|Sat/ )
{ if (errfound > 0)
{ printf ("%s<BR>",OLDLINE); }
OLDLINE = \$0; # no error, clear and start again
errfound = 0;
# save the date for next run
olddate = dtconv(\$3, \$2, \$5, \$4);
continue; }
OLDLINE = sprintf("%s<BR>%s",OLDLINE,\$0);
if ( \$0 ~ /ORA-/ )
{ # extract the error
errloc=index(\$0,"ORA-")
if (errloc > 0)
{ oraerr=substr(\$0,errloc);
if (index(oraerr,":") < 1)
{ oraloc2=index(oraerr," ") }
else
{ oraloc2=index(oraerr,":") }
oraloc2=oraloc2-1;
oraerr=substr(oraerr,1,oraloc2);
if (index(excldata,oraerr) < 1)
{ errfound = errfound +1; } } }
else if ( \$0 ~ /[Ff]uzzy/ ) # treat fuzzy as errors
{ errfound = errfound +1; }
} } }
END {
if (errfound > 0)
{ printf ("%s<BR>",OLDLINE); }
print olddate > "$LASTALERT";
}
function dtconv (dd, mon, yyyy, tim, sortdate) {
mth=index("JanFebMarAprMayJunJulAugSepOctNovDec",mon);
if (mth < 1)
{ return "00000000 00:00:00" };
# now get month number - make to complete multiple of three and divide
mth=(mth+2)/3;
sortdate=sprintf("%04d%02d%02d %s",yyyy,mth,dd,tim);
return sortdate;
}
!EOF
ERRMESS=$(nawk -f $BASEDIR/awkfile.awk $ALOG)
ERRCT=$(echo $ERRMESS|awk 'BEGIN {RS="<BR>"} END {print NR}')
rm $LASTALERT
if [ $ERRCT -gt 1 ]; then
echo "$ERRCT Errors Found \n"
echo "$ERRMESS"|nawk 'BEGIN {FS="<BR>"}{for (i=1;NF>=i;i++) {print $i}}'
exit 2
fi -
Make .dmp file using shell script
I am new to Linux.
I want to make a .dmp file (export from a table) in a particular directory using a shell script.
Then from the .dmp file I want to convert it to a .zip file.
Edited by: 855516 on Jul 18, 2011 5:26 PM
The following should give you a head start:
Create the following file i.e. scott_export.sh
#!/bin/bash
MAILADR="[email protected]"
EXPDIR="/home/oracle/datapump"
EXPDIR4SQL="'$EXPDIR'"
DUMPFILE="export_`date +%N`.dmp"
LOGFILE="export_`date +%N`.log"
ORACLE_SID=test; export ORACLE_SID
PATH=$PATH:/usr/local/bin; export PATH
ORAENV_ASK=NO
. /usr/local/bin/oraenv
mkdir -p $EXPDIR 2>/dev/null
sqlplus -s /nolog <<EOF
connect / as sysdba
set pages 0 feed off
create or replace directory datapump as $EXPDIR4SQL;
grant read, write on directory datapump to scott;
exit
EOF
expdp scott/tiger tables=EMP,DEPT \
transport_full_check=y \
directory=datapump dumpfile=$DUMPFILE logfile=$LOGFILE
gzip $EXPDIR/$DUMPFILE >> $EXPDIR/$LOGFILE
cat $EXPDIR/$LOGFILE | mailx -s "$LOGFILE" $MAILADR
# END
Give execute privileges:
$ chmod 750 scott_export.sh
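If it is to run from cron, an entry along these lines could be added (the script path and the 02:00 daily schedule are assumptions):

```shell
#!/bin/sh
# Sketch: stage a 02:00 daily crontab entry for the export script.
# The path is an assumption; review the entry before installing it.
ENTRY='0 2 * * * /home/oracle/scripts/scott_export.sh > /dev/null 2>&1'
echo "$ENTRY" > /tmp/cron_entry.txt
# To install on a real host (appends to the current user's crontab):
# ( crontab -l 2>/dev/null; cat /tmp/cron_entry.txt ) | crontab -
cat /tmp/cron_entry.txt
```

Remember that cron runs with a minimal environment, which is why the script above sets ORACLE_SID and PATH itself before calling oraenv.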
The above script should also work as a cron job. -
How to write CLOB parameter in a file or XML using shell script?
I executed an Oracle stored procedure using a shell script. How can I get the OUT parameter of the procedure (CLOB) and write it to a file or XML in a UNIX environment using a shell script?
SQL> var c clob
SQL>
SQL> begin
2 select
3 DBMS_XMLGEN.getXML(
4 'select rownum, object_type, object_name from user_objects where rownum <= 5'
5 ) into :c
6 from dual;
7 end;
8 /
PL/SQL procedure successfully completed.
SQL>
SQL> set long 999999
SQL> set heading off
SQL> set pages 0
SQL> set feedback off
SQL> set termout off
SQL> set trimspool on
// following in the script is not echo'ed to screen
set echo off
spool /tmp/x.xml
select :c from dual;
spool off
SQL>
SQL> --// file size
SQL> !ls -l /tmp/x.xml
-rw-rw-r-- 1 billy billy 583 2011-12-22 13:35 /tmp/x.xml
SQL> --// file content
SQL> !cat /tmp/x.xml
<?xml version="1.0"?>
<ROWSET>
<ROW>
<ROWNUM>1</ROWNUM>
<OBJECT_TYPE>TABLE</OBJECT_TYPE>
<OBJECT_NAME>BONUS</OBJECT_NAME>
</ROW>
<ROW>
<ROWNUM>2</ROWNUM>
<OBJECT_TYPE>PROCEDURE</OBJECT_TYPE>
<OBJECT_NAME>CLOSEREFCURSOR</OBJECT_NAME>
</ROW>
<ROW>
<ROWNUM>3</ROWNUM>
<OBJECT_TYPE>TABLE</OBJECT_TYPE>
<OBJECT_NAME>DEPT</OBJECT_NAME>
</ROW>
<ROW>
<ROWNUM>4</ROWNUM>
<OBJECT_TYPE>TABLE</OBJECT_TYPE>
<OBJECT_NAME>EMP</OBJECT_NAME>
</ROW>
<ROW>
<ROWNUM>5</ROWNUM>
<OBJECT_TYPE>TABLE</OBJECT_TYPE>
<OBJECT_NAME>EMPTAB</OBJECT_NAME>
</ROW>
</ROWSET>
SQL> -
Sql Loader by using shell script, not able to insert data
Hi,
I am trying to load the data by using a shell script (the shell script contains the sqlldr command; it is a host-executable concurrent program).
When I load the data with my files (.ctl, .prog, .csv, and the symbolic link for .prog) placed in $Custom_top/bin, it loads exactly: 17000 records inserted.
But if I load the data with my files placed in $custom_top/custom_folders, it is unable to insert all the data; only 43 records are inserted.
Please can anyone help me.
Thanks in advance.
Rama.
Srini, thanks a lot for your reply.
Oracle Apps version R12,
Microsoft Windows XP Professional
Version 2002 Service Pack 3
My Control file Script is:
load data
infile '$XADP_TOP/data/CPIU/in/XXOKS_Price_Increase.csv'
append
into table XXOKS_CONTRACT_PRICE_INCR_DTLS
fields terminated BY ',' optionally enclosed by '"'
TRAILING NULLCOLS
(EXCLUSION_FLAG,
LEGACY_NUMBER,
CUSTOMER_NUMBER,
CUSTOMER_NAME,
REQUEST_ID,
CONTRACT_NUMBER,
CONTRACT_START_DATE,
CONTRACT_END,
REQUEST_LINE_ID,
LINE_START_DATE,
LINE_END_DATE,
ITEM_NUMBER,
ITEM_DESCRIPTION,
UNIT_PRICE,
QTY,
NEW_UNIT_PRICE,
LINE_AMOUNT,
NEW_LINE_AMOUNT,
PRICE_INCREASED_DATE,
PERCENTAGE_INCREASED,
ORIGINAL_CONTRACT_AMOUNT,
NEW_CONTRACT_AMOUNT,
PRICE_INCREASE_AMOUNT)
My .prog file is below. Please find that I created a symbolic link for my .prog as well.
if [ -z $XADP_TOP ];then
echo "XADP_TOP environment variable is not set!"
exit 1
fi
cd $XADP_TOP/data/CPIU/in
DATE=`date +%y%m%d:%H%M`
i_program_name="$0"
i_ora_pwd="$1"
i_user_id="$2"
i_user_name="$3"
i_request_id="$4"
i_ftp_host_name="$5"
i_ftp_user_name="$6"
i_ftp_user_password="$7"
ftp_prog() {
# FTP Function to reuse the FTP Commands
if [ $# -ne 6 ];then
echo "Usage : ftp_prog <Hostname> <User name> <Password> <Remote Directory> <command> <filename>"
exit 2
fi
l_ftp_host_name="$1"
l_ftp_user_name="$2"
l_ftp_user_password="$3"
l_ftpdir="$4"
l_ftp_command="$5"
l_ftp_filename="$6"
ftp -v -n ${l_ftp_host_name} <<EOF
user ${l_ftp_user_name} ${l_ftp_user_password}
ascii
cd ${l_ftpdir}
${l_ftp_command} ${l_ftp_filename}
quit
EOF
#exit $?
}
# setting the ftp directory
#ftpdir="/`echo ${TWO_TASK:-$ORACLE_SID}|tr "[A-Z]" "[a-z]"`/CPIU"
##ftpdir="/FinTEST/quoting/PS/ar"
ftpdir="$XADP_TOP/data/CPIU/in"
# setting the in directory and out directory
indir="$XADP_TOP/data/CPIU/in"
outdir="$XADP_TOP/data/CPIU/out"
ftp_prog ${i_ftp_host_name} ${i_ftp_user_name} ${i_ftp_user_password} ${ftpdir} get XXOKS_Price_Increase.csv
echo $ftpdir
echo "Converting the data file into unix mode"
dos2unix XXOKS_Price_Increase.csv XXOKS_Price_Increase.csv
chmod 777 XXOKS_Price_Increase.csv
cd $XADP_TOP/bin
echo "Trying to excute sqlldr and entering into the into control file"
$ORACLE_HOME/bin/sqlldr userid=$i_ora_pwd control=XXOKS_PRICE_INCR_LOAD log=$XADP_TOP/log/XXOKS_PRICE_INCR_LOAD_${DATE}.log;
exit_status=$?
echo "Checking the status and giving permissions to the data file which in in dir"
if [ $exit_status -eq 0 ]; then
cd $XADP_TOP/data/CPIU/in
chmod 777 XXOKS_Price_Increase.csv
echo "try to move data file into out dir"
# Moving the file to out directory
mv XXOKS_Price_Increase.csv ${outdir}/XXOKS_Price_Increase.csv_${DATE}
#echo "ready to zip file in out dir step6"
# Zipping the file
#gzip -f ${outdir}/XXOKS_Price_Increase.csv_${DATE}
echo "deleting the file which is in dir"
# Deleting the file from in directory
/bin/rm -f ${indir}/XXOKS_Price_Increase.csv
# Deleting from the remote directory
ftp_prog ${i_ftp_host_name} ${i_ftp_user_name} ${i_ftp_user_password} ${ftpdir} delete XXOKS_Price_Increase.csv
echo "sqlloader finished successfully."
else
echo "Error in loader"
##echo "Loader error in Price Increase Detials File ${i_file}"
fi
exit $exit_status
And My Log file Comments are
SQL*Loader: Release 10.1.0.5.0 - Production on Thu Dec 3 01:32:08 2009
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Control File: XXOKS_PRICE_INCR_LOAD.ctl
Data File: /oesapp/applmgr/GIS11/apps/apps_st/appl/xadp/12.0.0/data/CPIU/in/XXOKS_Price_Increase.csv
Bad File: XXOKS_Price_Increase.bad
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table XXOKS_CONTRACT_PRICE_INCR_DTLS, loaded from every logical record.
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
EXCLUSION_FLAG FIRST * , O(") CHARACTER
LEGACY_NUMBER NEXT * , O(") CHARACTER
CUSTOMER_NUMBER NEXT * , O(") CHARACTER
CUSTOMER_NAME NEXT * , O(") CHARACTER
REQUEST_ID NEXT * , O(") CHARACTER
CONTRACT_NUMBER NEXT * , O(") CHARACTER
CONTRACT_START_DATE NEXT * , O(") CHARACTER
CONTRACT_END NEXT * , O(") CHARACTER
REQUEST_LINE_ID NEXT * , O(") CHARACTER
LINE_START_DATE NEXT * , O(") CHARACTER
LINE_END_DATE NEXT * , O(") CHARACTER
ITEM_NUMBER NEXT * , O(") CHARACTER
ITEM_DESCRIPTION NEXT * , O(") CHARACTER
UNIT_PRICE NEXT * , O(") CHARACTER
QTY NEXT * , O(") CHARACTER
NEW_UNIT_PRICE NEXT * , O(") CHARACTER
LINE_AMOUNT NEXT * , O(") CHARACTER
NEW_LINE_AMOUNT NEXT * , O(") CHARACTER
PRICE_INCREASED_DATE NEXT * , O(") CHARACTER
PERCENTAGE_INCREASED NEXT * , O(") CHARACTER
ORIGINAL_CONTRACT_AMOUNT NEXT * , O(") CHARACTER
NEW_CONTRACT_AMOUNT NEXT * , O(") CHARACTER
PRICE_INCREASE_AMOUNT NEXT * , O(") CHARACTER
value used for ROWS parameter changed from 64 to 43
Table XXOKS_CONTRACT_PRICE_INCR_DTLS:
43 Rows successfully loaded.
0 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 255162 bytes(43 rows)
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 43
Total logical records rejected: 0
Total logical records discarded: 0
Run began on Thu Dec 03 01:32:08 2009
Run ended on Thu Dec 03 01:32:08 2009
Elapsed time was: 00:00:00.19
CPU time was: 00:00:00.04
Please help me, Srini.
Thanks in advance
Rama.. -
How to Email Concurrent Program Output to Email using Shell Script
Hi All,
Have a Nice Day,
I have a tricky requirement which I was not able to achieve; let me explain.
I have created a PL/SQL concurrent program named "Approval Update". This does an update and displays the number of rows updated.
Now I need to take this concurrent program's output and send it, as an email using shell scripts, to the person who submits the program.
I have referred to a Metalink note as well as some OTN posts, but I was not able to achieve this.
Please help me to complete this as soon as possible. Thanks in advance for your help.
Let me know if you need more clarifications.
Regards,
CSK
I don't have much idea about shell scripts. All I want is: in my shell script I need to get the parent concurrent program's output, and that needs to be emailed to the intended person.
Please help me to get the shell script commands for this.
I do not have any shell script to share, sorry! If you want the query to get the parent request_id, so you can get the log/out file name/location from it, then please refer to:
REQUESTS.sql Script for Parent/Child Request IDs and Trace File IDs [ID 280295.1]
http://etrm.oracle.com/pls/et1211d9/etrm_pnav.show_object?c_name=FND_CONC_REQ_SUMMARY_V&c_owner=APPS&c_type=VIEW
http://etrm.oracle.com/pls/et1211d9/etrm_pnav.show_object?c_name=FND_CONCURRENT_REQUESTS&c_owner=APPLSYS&c_type=TABLE -- LOGFILE_NAME & OUTFILE_NAME
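Once you have the request id, a minimal hedged sketch of the mailing step (the APPLCSF/APPLOUT layout, the default request id and the recipient address are all assumptions):

```shell
#!/bin/sh
# Sketch: e-mail a concurrent request's output file with mailx.
# The o<request_id>.out naming and directory layout are assumptions.
REQUEST_ID=${1:-123456}
APPLCSF=${APPLCSF:-/u01/app/applcsf}
APPLOUT=${APPLOUT:-out}
OUTFILE="$APPLCSF/$APPLOUT/o${REQUEST_ID}.out"
RECIPIENT="user@example.com"
# Record the command we would run, then run it only if the file exists.
CMD="mailx -s \"Concurrent request $REQUEST_ID output\" $RECIPIENT < $OUTFILE"
echo "$CMD" > /tmp/mail_cmd.txt
if [ -f "$OUTFILE" ]; then
  eval "$CMD"
else
  echo "output file $OUTFILE not found, command saved to /tmp/mail_cmd.txt"
fi
```

The recipient would come from looking up the submitting user's email (e.g. from FND_USER) rather than being hard-coded.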
Thanks,
Hussein -
Problem-Report generation using shell script
Hi
We have the Production database and the Reporting database (copy of Production database),
both on Oracle 9.2.0.5 and Solaris 5.8. There is a package inside the Oracle database, which extracts some data from inside the
database, and would generate a report. A shell script has been created in Solaris which would
send in the parameters and call the pakage to generate the report. The parameters it is sending is
the name of report to be generated, and the location where it is to be generated, both hard-coded into
the script. The script is scheduled to run through crontab.
The problem we are facing is that, if we run the script for Reporting database, it successfully
generates the report. But if we use that script for Production database, it gives the error
"Invalid directory Path". I have tried using various other directory paths, even '/tmp'
and '/', but it still gives the same error when executed for Production dataabse.
Could somebody provide any ideas what might be going wrong.
The reasons it is to be executed on Prod db and not the Reporting database are unavoidable.
It anyway runs in off business hours and takes about 10secs to execute.
Please do let me know if there is any other info that I missed to provide here.
Thanks in advance...I will be just guessing because you didn't provide contents of script and package.
The "Invalid directory path" error, as you said, could be ORA-29280 due to a non-existent directory.
Try executing (as sys or system) select * from dba_directories; (or select * from all_directories; as the user the script logs in as) on both databases and compare the results. If your important directory is missing, then create it using CREATE DIRECTORY <dirname> AS '<path>'; (from SQL*Plus, and don't forget to grant rights to the user).
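A hedged sketch of that comparison, driven from the shell (the connect method is an assumption):

```shell
#!/bin/sh
# Sketch: generate the directory-check script to run against each database,
# then diff the two spools to spot the missing directory object.
cat > /tmp/check_dirs.sql <<'EOF'
set pagesize 100 linesize 200
col directory_name format a30
col directory_path format a60
SELECT directory_name, directory_path
FROM dba_directories
ORDER BY directory_name;
EOF
# On each database (as sys or system), for example:
#   sqlplus -s "/ as sysdba" @/tmp/check_dirs.sql > /tmp/dirs_$ORACLE_SID.txt
# then: diff /tmp/dirs_PROD.txt /tmp/dirs_REPT.txt
echo "wrote /tmp/check_dirs.sql"
```

Any directory present on the reporting copy but absent on production would explain a script that works on one and fails on the other.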
This error could come from shell script. In that case you should find resolution yourself because you didn't provide script source. -
Hi,
I want to study in detail how to use shell scripts in PL/SQL code. Can anyone suggest some useful links on the same.
Nordiksmon wrote:
BluShadow wrote:
miriam_omaha wrote:
If you mean execute a shell script from a procedure, there are ways, not easy ways, to execute a shell script - try this thread on that topic:
Not easy? What's not easy about setting up a job using DBMS_SCHEDULER, which can directly call any executable at the operating system level?
That doesn't meet the requirement; it's about executing host commands from within PL/SQL.
Which you can do with DBMS_SCHEDULER by creating a job to execute straight away. DBMS_SCHEDULER can execute operating system commands...
http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14258/d_sched.htm#i1010013
>
'EXECUTABLE'
This specifies that the program is external to the database. External programs implies anything that can be executed from the operating system's command line. AnyData arguments are not supported with job or program type EXECUTABLE. -
Passing flexfield from report wrapper to seeded report using shell script
I am unable to pass flexfield values from the report wrapper to the seeded report using a shell script.
Below is the shell script I am using to transfer GL Account High and GL Account Low to the seeded program Receipt Journal Report (ARXRJR). The flexfields get passed through the parameters v_param_12 and v_param_13, but all I am getting in the seeded report log file is the reporting level, reporting context, set_of_books, period, and gl_date_low/high.
#TEST Program for Passing GL Account High and GL Account Low
v_user_name=`echo $3`
v_login_pwd=`echo $FCP_LOGIN`
v_prog_name=`echo TESTARXRJR`
echo 'v_prog_name: '$v_prog_name
# testing for passing the arguments from CM ...
if [ -z "$1" ]
then
echo "Please pass the first argument...."
exit 1
fi
v_request_id=`echo $4`
echo 'v_request_id: '$v_request_id
v_param_1=`echo $5`
echo 'v_param_1: '$v_param_1
v_param_2=`echo $6`
echo 'v_param_2: '$v_param_2
v_param_3=`echo $7`
echo 'v_param_3: '$v_param_3
v_param_4=`echo $8`
echo 'v_param_4: '$v_param_4
v_param_5="$9"
echo 'v_param_5: '$v_param_5
shift
v_param_6=`echo $9`
echo 'v_param_6: '$v_param_6
shift
v_param_7=`echo $9`
echo 'v_param_7: '$v_param_7
shift
v_param_8=`echo "$9"`
echo 'v_param_8: '$v_param_8
shift
v_param_9=`echo $9`
echo 'v_param_9: '$v_param_9
shift
v_param_10=`echo $9`
echo 'v_param_10: '$v_param_10
shift
v_param_11=`echo $9`
echo 'v_param_11: '$v_param_11
shift
v_param_12=`echo $9`
echo 'v_param_12: '$v_param_12
shift
v_param_13=`echo $9`
echo 'v_param_13: '$v_param_13
shift
v_param_14=`echo $9`
echo 'v_param_14: '$v_param_14
shift
v_param_15=`echo $9`
echo 'v_param_15: '$v_param_15
shift
v_param_16=`echo $9`
echo 'v_param_16: '$v_param_16
shift
v_param_17=`echo $9`
echo 'v_param_17: '$v_param_17
shift
v_param_18=`echo $9`
echo 'v_param_18: '$v_param_18
shift
v_param_19=`echo $9`
echo 'v_param_19: '$v_param_19
shift
v_param_20=`echo $9`
echo 'v_param_20: '$v_param_20
shift
v_param_21=`echo $9`
echo 'v_param_21: '$v_param_21
shift
v_param_22=`echo $9`
echo 'v_param_22: '$v_param_22
shift
v_param_23=`echo $9`
echo 'v_param_23: '$v_param_23
echo "Executing SQL to obtain org id and responsibility name"
(sqlplus -s /nolog <<end_of_sql
conn $v_login_pwd
set heading off
set echo off
set feedback off
SELECT 'ORGID:'||b.profile_option_value || ':' ||e.responsibility_name
FROM fnd_profile_options a,
fnd_profile_option_values b,
fnd_concurrent_requests c,
fnd_responsibility d,
fnd_responsibility_tl e
WHERE c.request_id = $v_request_id
AND c.responsibility_id = b.level_value
AND b.profile_option_id = a.profile_option_id
AND a.application_id = b.application_id
AND c.responsibility_application_id = d.application_id
AND d.application_id = e.application_id
AND a.profile_option_name = 'ORG_ID'
AND c.responsibility_id = d.responsibility_id
AND d.responsibility_id = e.responsibility_id;
exit
end_of_sql
) > a.log
v_var=`grep -iv "ORG_ID" a.log|sed 's/Connected.//'`
echo 'v_var: ' $v_var
v_org=`echo $v_var| awk 'FS=":" {print $2 }'`
v_resp_name=`echo $v_var | awk 'FS=":" {print $3}'`
echo 'v_org :'$v_org
echo 'v_resp_name :'$v_resp_name
echo 'Submitting Receipt Journal Report...'
user_request=`CONCSUB $v_login_pwd GL "$v_resp_name" $v_user_name WAIT=Y CONCURRENT AR ARXRJR \
\"$v_param_3\"\
\"$v_param_4\"\
\"$v_param_5\"\
\"$v_param_6\"\
\"$v_param_7\"\
\"$v_param_8\"\
\"$v_param_9\"\
\"$v_param_10\"\
\"$v_param_11\"\
\"$v_param_12\"\
\"$v_param_13\"\
\"$v_param_14\"\
\"$v_param_15\"\
\"$v_param_16\"\
\"$v_param_17\"\
\"$v_param_18\"\
\"$v_param_19\"\
\"$v_param_20\"\
\"$v_param_21\"\
\"$v_param_22\"\
\"$v_param_23\"`
user_request=`echo $user_request|awk '{print $3}'`
echo Request is $user_request
trx_rep_file="$user_request"
echo 'trx_rep_file: '$trx_rep_file
sleep 10
for i in 1 2 3 4 5
do
if test -f $APPLCSF/$APPLLOG/l$user_request.req
then
if test [`grep -i '"Concurrent process completed successfully"' $APPLCSF/$APPLLOG/l$user_request.req`]
then
if test -f $APPLCSF/$APPLOUT/o$trx_rep_file.out
then
echo +++++++++++++++++++++++++++++++++
echo 'output file found'
echo ++++++++++++++++++++++++++++++++++++++
else
echo +++++++++++++++++++++++++++++++++++++++++
echo failure in Report listing production
echo +++++++++++++++++++++++++++++++++++++++
fi
break
else
echo +++++++++++++++++++++++++++++++++++++++
echo Job not Over Yet
echo +++++++++++++++++++++++++++++++++++++++++++++
fi
echo +++++++++++++++++++++++++++++++++++++++
echo File not found yet
echo ++++++++++++++++++++++++++++++++++++++
fi
sleep 10
done
echo ----------------------------------------------------------
echo Transaction Listing process over
echo ---------------------------------------------------------
echo ----------------------------------------------------------
echo Executing SQL to derive Search Pattern
echo ----------------------------------------------------------
v_pattern=`sqlplus -s /nolog <<EOF1
conn $v_login_pwd
set heading off
set echo off
set feedback off
SELECT 'SEARCH^'||meaning||'^'||attribute1||'^'||attribute2||'^'||attribute3
FROM fnd_lookup_values
WHERE lookup_type = '$v_param_2'
AND lookup_code = '$v_prog_name';
exit
EOF1`
echo 'v_pattern: '$v_pattern
v_pattern_n1=`echo $v_pattern|sed 's/Connected.//'|awk 'FS="^" {print $2}'`
echo 'v_pattern_n1: '$v_pattern_n1
v_pattern_n2=`echo $v_pattern|sed 's/Connected.//'|awk 'FS="^" {print $3}'`
echo 'v_pattern_n2: '$v_pattern_n2
v_pattern_n3=`echo $v_pattern|sed 's/Connected.//'|awk 'FS="^" {print $4}'`
echo 'v_pattern_n3: '$v_pattern_n3
v_pattern_n4=`echo $v_pattern|sed 's/Connected.//'|awk 'FS="^" {print $4}'|sed 's/ //'`
echo 'v_pattern_n4: '$v_pattern_n4
echo 'v_param_12: '$v_param_12
echo 'v_param_13: '$v_param_13
echo -----------------------------------
echo Deriving final output
echo ----------------------------------
v_var=`cat $APPLCSF/$APPLOUT/o$trx_rep_file.out | grep -i "$v_pattern_n1"|\
sed "s/(/-/g"|sed "s/)/ /g"|\
sed "s/$v_pattern_n1/$v_pattern_n2/g"|\
sed 's/ /~/g'|\
sed 's/~~*/~/g'|\
sed 's/ *//g'|\
sed 's/,//g'`
echo 'v_var: '$v_var
v_var_2=`cat $APPLCSF/$APPLOUT/o$trx_rep_file.out|grep -i "$v_pattern_n3"|\
uniq`
v_out_1=`echo "$v_var"|grep -i "$v_pattern_n2"|awk 'FS="~" {print $3}'`
echo 'v_out_1: '$v_out_1
echo 'v_var_2: '$v_var_2
echo ------------------------------------------------------------
v_out_2=`echo "$v_var_2"|grep -i "$v_pattern_n3"|awk 'FS=":" {print $2}'`
echo 'v_out_2: '$v_out_2
echo ----------------------------------------------
echo Process completed
echo ----------------------------------------------------
echo ================================================================================
#---------------------------------------------------------------------------------------------------
The name of this OTN group is "Database - General".
Your question relates to an unnamed application that might be E-Business Suite, or PeopleSoft, or Siebel; who can say.
Please re-ask your question in the appropriate applications forum and be sure to clearly name the full product name and version. We are not mind readers and we cannot look over your shoulder. -
Is there a Sun forum to post useful shell scripts?
Look at Bigadmin's main page. There is an option there to let you share your scripts and other resources with the community.
-
How to submit request(Report+Template) using Shell Scripts?
Hi Friends,
How to submit request + Add layout using shell scripts..
If anybody has sample code..Can you please send me to [email protected]
Please help me..
Its Urgent.
Thanks and Regards,
A Swain
Message was edited by:
SwainA
Following is a package where the request is submitted and the layout is added.
Also check the other way, by adding the layout first and then submitting the request.
CREATE OR REPLACE PACKAGE BODY try
AS
PROCEDURE try_proc (
errbuf OUT VARCHAR2,
retcode OUT NUMBER
)
IS
l_mode BOOLEAN;
l_request_id NUMBER;
xml_layout BOOLEAN;
BEGIN
fnd_file.put_line (fnd_file.output, 'USER_ID :' || fnd_global.user_id);
fnd_file.put_line (fnd_file.output, 'RESP_ID :' || fnd_global.resp_id);
fnd_file.put_line (fnd_file.output,
'RESP_APPL_ID :' || fnd_global.resp_appl_id
);
l_mode := fnd_request.set_mode (db_trigger => TRUE);
IF l_mode IS NOT NULL
THEN
fnd_file.put_line (fnd_file.output,
'Concurrent MODE Option is Success'
);
END IF;
-- if a wrong-parameters error is received, give all 100 arguments as null
l_request_id :=
fnd_request.submit_request (application => 'app_short_name',
program => 'shortname',
sub_request => FALSE
);
fnd_file.put_line (fnd_file.output,
'Request_Id 1 is :' || l_request_id
);
xml_layout :=
fnd_request.add_layout (template_appl_name => 'SQLAP',
template_code => 'CAPINEF01',
template_language => 'en',
template_territory => 'US',
output_format => 'PDF'
);
COMMIT;
EXCEPTION
WHEN OTHERS
THEN
fnd_file.put_line (fnd_file.LOG,
'Error in procedure package procedure :'
|| SUBSTR (SQLCODE, 1, 20)
|| ':'
|| SUBSTR (SQLERRM, 1, 250)
);
END try_proc;
END try;
/