Error executing shell script using dbms_scheduler

I have a job that executes a shell script, and it fails with the following error:
SQL> SELECT additional_info
2 FROM user_scheduler_job_run_details
3 WHERE log_date = (SELECT MAX (log_date)
4 FROM user_scheduler_job_run_details);
ADDITIONAL_INFO
ORA-27369: job of type EXECUTABLE failed with exit code: No such file or directory
STANDARD_ERROR="mkdir: Failed to make directory "/export/home/bwsolaris/abc"; Permission denied"
These are the contents of my shell script:
#!/bin/ksh
/bin/mkdir /export/home/bwsolaris/abc
Can anyone suggest a way out of this?
Thanks in advance!!!

Does Oracle still create a user "nobody"? "nobody" is a "standard" low-privileged Unix/Linux user, and it is the user DBMS_SCHEDULER runs external jobs as by default.
You should change permissions on /export/home/bwsolaris, e.g.
$ chmod 777 /export/home/bwsolaris
or use a different directory, where everyone has access, for example /tmp.
Or take a look at Metalink Note:391820.1 - Scheduled Job Running Shell Script Fails With ORA-27369
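A minimal sketch of how that permission fix might look from a root shell (the group used for "nobody" is an assumption; on some platforms it is "nogroup"):
# See who owns the parent directory and what its permissions are
ls -ld /export/home/bwsolaris

# Quick but insecure: let any user (including "nobody") write into it
chmod 777 /export/home/bwsolaris

# Tighter alternative: give the "nobody" group write access to the parent directory
chgrp nobody /export/home/bwsolaris
chmod 775 /export/home/bwsolaris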

Similar Messages

  • Execute unix shell script using DBMS_SCHEDULER

    Hi,
    I am trying to run a shell script using DBMS_SCHEDULER.
    1) I checked: the "nobody" user exists on my HP-UX box.
    2) I also checked externaljob.ora (10.2.0.2.0). It has the entries:
    run_user = nobody
    run_group = nobody
    3) I created job successfully and enabled it.
    BEGIN
      DBMS_SCHEDULER.CREATE_JOB (
        job_name   => 'test_unix_script',
        job_type   => 'EXECUTABLE',
        job_action => '/tmp/test.ksh',
        start_date => '08-NOV-2006 04:45:16 PM',
        job_class  => 'DEFAULT_JOB_CLASS',
        enabled    => TRUE,
        auto_drop  => FALSE,
        comments   => 'test_unix_script.');
    END;
    /
    EXEC DBMS_SCHEDULER.enable('test_unix_script');
    4) The test.ksh script has -r-xr-xr-x permissions.
    5) When I check the dba_scheduler_job_run_details view, the ADDITIONAL_INFO column displays the following error message:
    ORA-27369: job of type EXECUTABLE failed with exit code: No such file or directory
    Did I miss anything?
    Any help will be appreciated!!
    Thanks..

    My /tmp/test.ksh tries to find the database status:
    #!/bin/ksh
    . ~oracle/.profile > /dev/null
    db_status=$(sqlplus -s 'system/passwd@DEV' << EOF
    set pagesize 0 feedback off verify off heading off echo off
    select status from v\$instance;
    exit
    EOF
    )
    echo $db_status > /tmp/db_status_out
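    A common reason a script like this works from a terminal but fails from DBMS_SCHEDULER is that the external job runs as the low-privileged "nobody" user with an almost empty environment, so sourcing ~oracle/.profile can fail and sqlplus may not be on the PATH. A minimal sketch of a self-contained variant (ORACLE_HOME and the output location are assumptions; adjust them to your installation):
    #!/bin/ksh
    # Set the Oracle environment explicitly instead of relying on ~oracle/.profile
    ORACLE_HOME=/opt/oracle/product/10.2.0/db_1   # assumed path
    ORACLE_SID=DEV
    PATH=$ORACLE_HOME/bin:/usr/bin:/bin
    export ORACLE_HOME ORACLE_SID PATH

    db_status=$($ORACLE_HOME/bin/sqlplus -s 'system/passwd@DEV' << EOF
    set pagesize 0 feedback off verify off heading off echo off
    select status from v\$instance;
    exit
    EOF
    )
    # Write to a location the job's OS user is allowed to create files in
    echo "$db_status" > /tmp/db_status_out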

  • Issue with calling Shell Script using DBMS_SCHEDULER

    Hi All,
    I am executing a shell script using DBMS_SCHEDULER from an APEX web page. The execution part is working fine without any issues.
    In my shell script file (abc.sh) I have a few Oracle SQL procedure calls which connect back to the same database, and those SQL calls are not executing for some reason; they do not raise any errors.
    Inside my shell script the code looks like this:
    sqlplus -silent $USER/$PASSCODE@$SCONNECT > /dev/null <<END
    set pagesize 0 feedback off verify off heading off serveroutput on
    set echo off linesize 1000
    WHENEVER SQLERROR EXIT 9
    BEGIN
      dbms_output.enable(1000000);
      do_enable_cons();
      dbms_output.disable;
    END;
    /
    spool off;
    exit;
    END
    When I run this shell script from the command line it works fine, no issues.
    Are there any restrictions on executing SQL code using DBMS_SCHEDULER? Anyone's help is much appreciated.
    -Regards

    james. wrote:
    Thank you sb and Sybrand. It was a problem with environment variables. After sourcing .bash_profile at the beginning of the shell script, it is working fine.
    One issue: when I check the process, it shows two entries with two different process IDs. The command I used was:
    ps -ef | grep <my script>
    Is there something wrong with my code, or is this normal? Is it really executing two times?
    -Regards

    Is COPY & PASTE broken for you? Any reason why you did not show us EXACTLY what was produced by the OS command above?
    bcm@bcm-laptop:~$ sqlplus user1/user1
    SQL*Plus: Release 11.2.0.1.0 Production on Fri Jul 20 15:14:15 2012
    Copyright (c) 1982, 2009, Oracle.  All rights reserved.
    Connected to:
    Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    15:14:15 SQL> !ps -ef | grep sqlplus
    bcm      24577  1903  1 15:14 pts/0    00:00:00 sqlplus           
    bcm      24579 24577  0 15:14 pts/0    00:00:00 /bin/bash -c ps -ef | grep sqlplus
    bcm      24581 24579  0 15:14 pts/0    00:00:00 grep sqlplus
    15:14:23 SQL>
    How many different copies of "sqlplus" are running on my laptop, based upon the actual output above?
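    The extra entry is usually just the grep command itself showing up in the process list. A quick sketch of the standard way to filter it out (plain ps/grep idiom, nothing Oracle-specific):
    # The bracket trick: the pattern [s]qlplus still matches "sqlplus" but no longer
    # matches the grep command line itself, so grep does not report its own process.
    ps -ef | grep '[s]qlplus'

    # Equivalent alternative: explicitly drop the grep line
    ps -ef | grep sqlplus | grep -v grep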

  • Executing shell script using OSLinetoken fetchlet

    Hi,
    I have a requirement: I need to use a shell script in the OSLineToken fetchlet. In the response metric I will be checking whether a directory exists on the server. To check the existence of the directory, I have created a shell script. But how can I relate its result to the Response metric? The shell script is as follows:
    Shell Script:
    if test -d "$1" ; then
        echo "DIR exist"
    else
        echo "false"
    fi
    The Response metric for the same will be:
    <QueryDescriptor FETCHLET_ID="OSLineToken">
         <Property NAME="command" SCOPE="GLOBAL">
              sh {dir_name where the shell script is uploaded}/{shell script file name} {dir_name_parameter} </Property>
         <Property NAME="startsWith" SCOPE="GLOBAL">em_result=</Property>
         <Property NAME="delimiter" SCOPE="GLOBAL">|</Property>
    </QueryDescriptor>
    Please suggest: what is the use of em_result here? (A sketch of a script that emits em_result is at the end of this thread.)
    Once the existence of the directory is checked, if it is up then I need to call another shell script to concatenate the contents of all the files with extension .log (this will be the parameter of the shell script), get the output from the shell script, and display it in the custom management plug-in. As I am using a cat *.log >> consolidatefile command to concatenate the data, I need to read the consolidatefile file from the server and return this concatenated file data to the plug-in. Again, how can I read the content of the consolidatefile file in EMF? I will be creating another metric for this purpose, say "read_content". The query descriptor for it will be as follows:
    <QueryDescriptor FETCHLET_ID="OSLineToken">
         <Property NAME="command" SCOPE="GLOBAL">
              sh {dir_name where the shell script is uploaded}/{shell script file name} {dir_name_parameter} {extension of the files to concatenated} </Property>
         <Property NAME="startsWith" SCOPE="GLOBAL">em_result=</Property>
         <Property NAME="delimiter" SCOPE="GLOBAL">|</Property>
    </QueryDescriptor>
    I am not sure which properties should be used in this case. I have seen multiple sample files; some of them use perlBin and scriptsDir, but some of them do not. The related PDF also does not say anything about such properties. Please suggest.
    I hope the explanation of the problem is not too cumbersome. Please let me know if you have any questions.
    Thanks,
    AS

    If you notice, localScriptsDir is a directory within scriptsDir. If you package your plug-in up and deploy it through the UI, any scripts you create will go into %scriptsDir%/emx/<target_type>. So localScriptsDir just specifies that directory for you. You don't need it, but then in the command parameter you'll have something like:
    sh %scriptsDir%/emx/yourtargettype/yourscript...
    So whether you specify it in the command or another property (localScriptsDir) doesn't really matter.
    You can create your own properties in the QueryDescriptor. Just make sure you have the correct scope specified and it should be fine (options for scope are described in the Enterprise Manager DTD section of the Extensibility Guide).
    Metric collection isn't really meant for dynamic specification of input parameters. I can think of a few solutions:
    1) Create a target instance for each log directory. When you create the instance, the directory is specified. If you need to monitor a different directory, you can just create another instance. Upside is that it's flexible and scalable, and also, when you get an error you'll know exactly which directory it is based on which instance throws the error. Downside is that you have to have a separate instance for each directory.
    2) If the log directories are well known and finite (and won't change names), hardcode them into the target metadata. Have a different metric collect for each log directory, so you'll have as many metrics as log directories you want to monitor. Even if the names of the directories are different, you can use instance properties to map them, so if you know there will always be 5 log directories you want to monitor, you can have 5 instance properties to map the names into the metrics, although this won't work if you don't have the same number each time. Upside is that there is only a single target instance. Downside is that it's not as flexible.
    3) Use a job rather than a target type to find out this information. You could create a new job type which scans the logs for information and have the directory as an input parameter to the job. You could have this job on a repeating schedule to duplicate the effect you are trying to get out of creating a target type. The upside is that you can start the job whenever you want from the UI and specify exactly which directory whenever you run it. The downside is that the job system is centered on the OMS rather than the agent, so every time it runs it will have to contact the agent to do the work. In the case of the target type, the agent acts autonomously without contact from the OMS.
    There are probably other options, but these are the quick ones off the top of my head.
    Chris
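    Regarding the em_result question above: the OSLineToken fetchlet keeps only the output lines that start with the startsWith value and splits them on the delimiter, so the script itself has to print a line beginning with em_result=. A minimal sketch (the column layout after the delimiter is an assumption and must match the metric's column order):
    #!/bin/sh
    # Usage: check_dir.sh <directory>
    # Prints one line in the form the fetchlet is configured to parse:
    #   em_result=<status>|<directory>
    DIR=$1
    if test -d "$DIR" ; then
        echo "em_result=DIR exist|$DIR"
    else
        echo "em_result=false|$DIR"
    fi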

  • Execve: Exec format error to execute shell script

    I made a job to execute a shell script:
    begin
      dbms_scheduler.create_job (
        job_name   => 'run_sh',
        job_type   => 'EXECUTABLE',
        job_action => '/fsoracle/app/oracle/inst2/if_cft/send_file_susin.sh',
        start_date => sysdate + 1/5760,
        enabled    => TRUE);
    end;
    /
    send_file_susin.sh
    #!/bin/ksh
    set -v
    . /fsutil/ndm/axway/profile
    /fsutil/ndm/axway/Synchrony/Transfer_CFT/bin/CFTUTIL << EOJ
    CONFIG TYPE=COM,MEDIACOM=TCPIP,FNAME=$CFTTCP
         SEND PART=ZADA,IDF=1363X1364,
              FNAME='/fsoracle/app/oracle/product/rdbms/log/KFG.DD.SHRCOM.HRD.A03',
              NFNAME=KFG.DD.SHRCOM.HRD.A03,
              FLRECL=25,NLRECL=25
         SWAITCAT SELECT='IDTU=="%_CAT_IDTU%"'     
    EOJ
    This script executes fine in terminal mode, but it threw the error [ORA-27369: job of type EXECUTABLE failed with exit code: Unknown error STANDARD_ERROR="execve: Exec format error"] when the job executed the shell script send_file_susin.sh.
    I changed the owner and permissions of some specific files, and t.sh executed fine.
    t.sh
    #!bin/ksh
    /usr/bin/mkdir /tmp/test
    I don't know what's wrong with the script send_file_susin.sh.
    Does anyone know about this? Please tell me the solution.
    Thanks in advance,
    Jinbae Kim.

    Hi,
    Posting this in case anyone else finds it.
    The key error here is "execve: Exec format error" .
    The following things should be checked (a sketch of the corresponding commands follows below):
    - that the shell script is a text file and has UNIX line terminators
    - that the shell script is set to be executable by the user that the job runs as
    - that the shell script begins with a hashbang line - e.g. #!/bin/sh (I suspect this is the problem)
    For running external jobs on 10gR2 or below, refer to this post Guide to External Jobs on 10g with dbms_scheduler e.g. scripts,batch files
    For running external jobs on 11g and up, please use a credential.
    Hope this helps,
    Ravi.
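    A quick sketch of how those three checks might look from a shell prompt (dos2unix may not be installed everywhere; the tr alternative does the same job):
    # 1) Check for Windows (CRLF) line terminators and strip them if present
    file send_file_susin.sh          # GNU file reports "with CRLF line terminators" for DOS-format files
    dos2unix send_file_susin.sh      # or: tr -d '\r' < send_file_susin.sh > send_file_susin.fixed.sh

    # 2) Make sure the OS user the job runs as can read and execute the script
    ls -l send_file_susin.sh
    chmod 755 send_file_susin.sh

    # 3) Confirm the very first line is a hashbang pointing at an existing shell
    head -1 send_file_susin.sh       # should print something like #!/bin/ksh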

  • Executing a shell script with DBMS_SCHEDULER

    Hi,
    when I execute a shell script with DBMS_SCHEDULER it doesn't work correctly:
    BEGIN
      DBMS_SCHEDULER.create_job (
        job_name        => 'job_AR',
        job_type        => 'EXECUTABLE',
        job_action      => '/home/crm/crmdw/AR/start_execution.sh',
        enabled         => TRUE,
        start_date      => systimestamp,
        repeat_interval => 'FREQ=MINUTELY;INTERVAL=15',
        comments        => 'Test Job AR');
    END;
    /
    Inside the shell script there is code that calls a hierarchy of processes.
    If I execute it manually or with cron, it works perfectly, but when I execute it with the job described above it starts all the processes at the same time and it doesn't work.
    What can I do to fix the issue? Any ideas? (See also the sketch at the end of this thread.)
    Thanks in advance...

    #!/usr/bin/ksh
    #test_dbms_scheduler.ksh
    echo $1
    echo "I am in Unix"
    exit 0
    chmod 755 test_dbms_scheduler.ksh
    Create or replace procedure test_dbms_scheduler
    as
      v_text varchar2(255) := 'Parameter passed from Oracle to Unix';
    Begin
      dbms_output.put_line('I am in Procedure');
      -- the job gets its own name so it does not collide with this procedure
      -- in the schema object namespace
      dbms_scheduler.create_job
        (job_name            => 'test_dbms_scheduler_job',
         job_action          => '/usr/bin/test_dbms_scheduler.ksh',
         number_of_arguments => 1,
         job_type            => 'executable',
         start_date          => SYSDATE,
         repeat_interval     => 'FREQ=SECONDLY; INTERVAL=1',
         enabled             => false,
         auto_drop           => TRUE,
         comments            => 'Run shell-script test_dbms_scheduler.ksh');
      dbms_scheduler.set_job_argument_value(job_name => 'test_dbms_scheduler_job', argument_position => 1, argument_value => v_text);
      dbms_scheduler.enable('test_dbms_scheduler_job');
      dbms_output.put_line('I am back in Procedure');
    Exception
      when others then
        dbms_output.put_line(sqlcode||sqlerrm);
    end;
    /
    set serveroutput on
    exec test_dbms_scheduler;
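    Back to the original question about the hierarchy of processes: if start_execution.sh launches its child processes in the background, they will all start at once when the scheduler runs the script (no controlling terminal, minimal environment). A minimal sketch of a wrapper that forces sequential execution and reports failures back to the scheduler (the child script names and the profile path are assumptions):
    #!/bin/ksh
    # start_execution.sh - run each step of the hierarchy in order
    . /home/crm/.profile                       # assumed: load the environment the cron run relies on

    for step in step1.sh step2.sh step3.sh     # assumed child script names
    do
        /home/crm/crmdw/AR/$step
        rc=$?
        if [ $rc -ne 0 ]; then
            echo "$step failed with exit code $rc" >&2
            exit $rc                           # a non-zero exit code shows up in ADDITIONAL_INFO
        fi
    done
    exit 0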

  • Execute Shell Script through Demantra Workflow

    Hi All ,
    Can we execute a shell script from a Demantra workflow?
    If yes, where should we place the shell script file, and what should the command-line command in the step be?
    We have Demantra installed on a Windows server and the workflow manager is on a Linux server.
    The batch file on the Windows server can be executed through a secure shell from the Linux server.
    I am trying to achieve the same through the Demantra workflow.
    Appreciate any input on this.
    Thanks and regards
    Suzy

    Hi,
    Shell scripts are not supported in workflows up to and including Demantra 7.2.0.2.
    I have also checked with the Oracle team, and the reply I got is below for your reference:
    QUESTION
    =========
    As per your details, shall we conclude like this:
    "only *.bat and *.exe files can be used in workflow. Demantra standard functionality does not support shell script in workflow"
    ANSWER
    =======
    Hi,
    The Demantra standard functionality 7.2.0.2 does not support shell script in workflow.
    Thanks,
    Asya
    Please review the note#468071.1-Unable to Run EBS Workflows that Call EngineManager.exe in
    You will find this note is referring to the Enhancement Request Bug 6644455-- ANALYTICAL ENGINE
    NOT AVAILABLE ON UNIX/LINUX
    But the enhancement bug still exists, and I hope it is fixed in Demantra 7.3.
    Tks
    MJ

  • HFM 933, error executing vb script %0 (for rules file)

    I have an HFM rules file that is error-free, but sometimes, when I run a consolidation via SmartView, it is aborted and the error is "Error Executing VB Script %0".
    Does anyone know why this occurs?
    Thanks in advance.

    The error message you shared may not have anything to do with HFM rules. A number of things can generate a VBScript error: HFM rules, HFM custom member lists, or permission/configuration problems with IIS. Smart View uses IIS, so maybe this is a configuration issue there. If you log out and log back in again and can perform the very same activity successfully, it could mean you have a load-balancing problem or some other network-related error. If it is a permission problem on the server, it will affect all users in the same way. An HFM member list problem would reveal itself any time the member list is called, whether through HFM's forms, grids, Smart View, or reports.
    --Chris

  • OS Problem executing shell script

    Hi,
    I am executing a shell script using 'Runtime.getRuntime().exec()'. I need to view the iteration data on my shell as the execution progresses.
    I get the input streams from the process and buffer them in order to do this.
    I can see the progress successfully on SunOS, but not on HP. The shell script itself runs fine on HP.
    Any possible solution to this is greatly appreciated. I searched the forum and also some other sites like Java World, but haven't found anything useful yet.
    Thanks.

    Here's a detailed explanation of the problem:
    I have a shell script that makes a single call to an executable. The output is piped to the shell one line at a time (the number of lines depends on the number of iterations I specify in the script). This shell script, run by itself, works on SunOS and on HP-UX 11 with the output piped to the shell as the execution progresses.
    I am trying to execute this shell script from my Java code. For this I use the "runtime" environment, and get the output from the "process" using the "getInputStream" and "getErrorStream" methods. I buffer these streams using "BufferedReader", and output them to the shell.
    My code works as desired on SunOS, with the output from the process piped to the shell one line at a time as the execution progresses. But on HP-UX 11, the output is piped to the shell all at once, after the process runs to completion. I am not able to understand this behavior.
    Hope my description of the problem is clear. Thank you.

  • Run shell script using Host Command

    How do I run Unix Shell Script using Host command?
    Please help me......

    Are you running the shell script on the same box as the one where the forms are deployed? If not, then the HOST call will try to execute on the machine where the forms are hosted (the application server).

  • Scheduling an sql script using dbms_scheduler

    Hi Experts,
    I have an Oracle 10g database on the Windows platform. I have a SQL script which contains a normal set of SQL statements (inserts and updates).
    I would like to schedule this SQL script with dbms_scheduler, but I've gone through certain sites and came to understand that it's not possible to schedule a SQL script directly with dbms_scheduler. Please let me know how I can schedule this script using dbms_scheduler.

    It is possible - in 10g and above you can use DBMS_SCHEDULER to run an external (EXECUTABLE) job, which in this case could be a wrapper that runs your SQL file.
    It gets a bit more complicated with older versions, but it is still doable.
    But - unless there is a really good reason why you cannot do so, move this into a PL/SQL procedure as suggested.
    Carl
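    For example, the EXECUTABLE job can point at a small wrapper script that feeds the SQL file to sqlplus. A minimal sketch for a Unix host (on Windows the equivalent would be a .bat file calling sqlplus.exe); the paths and connect string are assumptions:
    #!/bin/sh
    # run_sql.sh - lets a DBMS_SCHEDULER job of type EXECUTABLE run a SQL script
    ORACLE_HOME=/u01/app/oracle/product/10.2.0/db_1    # assumed path
    PATH=$ORACLE_HOME/bin:$PATH
    export ORACLE_HOME PATH
    sqlplus -s scott/tiger @/home/oracle/scripts/myscript.sql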

  • How to execute a script using ombplus

    Hi,
    I have created a script using OMB Plus; now I want to execute that script using the OMB Plus editor.
    Can anyone please let me know how to execute it using the OMB Plus editor?
    Thanks,
    keka.

    Hi,
    We have created a batch file, say a.bat, and in that batch file we have given the OMB Plus path "C:\oracle\OWBHome\owb\bin\win32\OMBPlus.bat" and the path of the script file "C:\OWB\test.tcl". If you run the a.bat file it connects to OMB Plus, but the script is not executed.
    Let me know how to achieve this?
    thanks.

  • Upgraded ROX filer not executing shell scripts

    Hi all,
    I run ICEWM+ROX filer as my lightweight desktop, but after a recent upgrade ROX does not execute shell scripts on mouse click. It appears that filetype detection has changed so that all scripts are shown with the shellscript icon, whereas before, those without a .sh extension were shown with the KDE-type "cogwheel".
    Any tips much appreciated.
    My thanks in advance!

    I've found a lot of things I don't like about the latest ROX upgrade... too much caching nowadays in my  opinion.

  • How to execute SQL Script using windows powershell(using invoke-sqlcmd or any if)

    OS : Windows server 2008
    SQL Server : SQL Server 2012
    Script: Test.sql (T-SQL), for example: "select name from sys.databases"
    Batch script: Windows MyBatchscript.bat (it connects to SQL Server using sqlcmd and writes the output to c:\Testput.txt)
    (sqlcmd.exe -S DBserverName -U username -P p@ssword -i C:\test.sql -o "c:\Testoutput.txt") --- it is working without any issues.
    This executes if I double-click the MyBatchscript.bat file, and I can see the output in c:\testput.txt.
    PowerShell: similarly, how can I do this in PowerShell 2.0 or higher? Can anyone give full details with each step?
    I found some examples online, but nowhere have I seen clear details or examples, and it is not executing through the command line (or a batch script).
    Example: invoke-sqlcmd -ServerInstance ServerName -inputfile "c:\test.sql" | out-File -filepath "c:\psOutput.txt"  --(call this file MyTest.ps1)
    (The above script works if I run it manually. I want to run it automatically, like a double-click (or scheduled with a 3rd-party tool/scheduler) in a batch file, and see the output on the C drive (c:\psOutput.txt).)
    Can any PowerShell experts give/suggest full details/steps for this? How should I proceed? Are there any configurations required to run it automatically?
    Thanks in advance.

    Tested the following code and it's working. Thanks all.
    Execute a SQL script using invoke-sqlcmd, with a batch script and without a batch script.
    Option 1: importing sqlps
    1. Save the SQL script as "C:\scripts\Test.sql". Script inside Test.sql: select name from sys.databases
    2. Save the batch script as "C:\scripts\MyTest.bat". Script inside the batch script:
    powershell.exe C:\scripts\mypowershell.ps1
    3. Save the PowerShell script as "C:\scripts\mypowershell.ps1":
    import-module "sqlps" -DisableNameChecking
    invoke-sqlcmd -ServerInstance ServerName -inputFile "C:\scripts\Test.sql" | out-File -filepath "C:\scripts\TestOutput.txt"
    4. Run the batch script from the command line or double-click it; you can then see the output in the "C:\scripts\TestOutput.txt" file.
    5. Change to the scripts location and run it:
    cd C:\scripts (enter)
    dir (enter)
    MyTest.bat (enter)
    Note: you can see the output in the "C:\scripts" location as the file "TestOutput.txt".
    Option 2: another way, without importing sqlps inside the script
    1. Save the SQL script as "C:\scripts\Test.sql". Script inside Test.sql: select name from sys.databases
    2. Save the PowerShell script as "C:\scripts\mypowershell.ps1":
    # import-module "sqlps" -DisableNameChecking   # not required here
    invoke-sqlcmd -ServerInstance ServerName -inputFile "C:\scripts\Test.sql" | out-File -filepath "C:\scripts\TestOutput.txt"
    3. Change to the scripts location and run:
    cd C:\scripts (enter)
    dir (enter)
    powershell.exe sqlps C:\scripts\mypowershell.ps1 (enter)
    Note: you can see the output in the "C:\scripts" location as the file "TestOutput.txt".

  • Finding path to source executed shell script

    Hello,
    I have shell script A which executes shell script B. The scripts A and B reside in the same directory.
    Following requirements:
    Scripts A and B need to be source executed.
    The directory where A and B reside can be anywhere.
    The present working directory from which A is executed can be anywhere.
    The directory of scripts A and B may not be defined in $PATH.
    The current shell could be any Bourne or Korn shell, Linux or Unix.
    Problem:
    Let's assume scripts A and B are installed in "/custom".
    The user's working directory is /home/oracle and he executes "source /custom/A".
    If script A executes "source ./B", the result would be "source /home/oracle/B", which will fail.
    Possible Solution:
    Since script A is being sourced, $0 will always be the name of the current shell and not the name and path to script A.
    I have not found a universal solution to locate the absolute path of a shell script that is being source executed.
    So I was wondering about defining a variable in script A, e.g. SCRIPT_HOME=/custom. When script A executes "source $SCRIPT_HOME/B" it will work, but then script A needs to be customized by the user.
    I could write a script C that edits script A to search for and set the SCRIPT_HOME variable. For this to work I will have to verify that script C is not source executed, and that script A is in the same directory as C, based on $0. This is not a problem.
    The question, however, is whether I need to create script C, or whether I could simply execute script A with a -config argument to have it edit itself and exit. I would prefer the latter, and from what I understand Unix and Linux do not lock files by default, so this should be possible.
    Any comments or suggestions please?
    Thanks!
    Edited by: Dude on Apr 5, 2011 3:59 AM

    The $_ solution is actually very simple:
    $ cat thatsme
    mypath=$(dirname $_)
    echo "$mypath/B"
    $ source /home/oracle/thatsme
    /home/oracle/B
    I do not need the absolute path of script A. The relative path from my current directory will do as well.
    But, it does not work with an alias:
    $ alias thatsme='source /home/oracle/thatsme'
    $ thatsme
    ./B
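    Building on the $_ approach above, a minimal sketch of what script A could look like (assumes A is invoked as "source /path/to/A" from ksh or bash and, as noted, not through an alias):
    # Script A -- capture $_ on the very first line, before any other command overwrites it
    _self=$_
    # Optionally make the path absolute; the relative form also works, as noted above
    SCRIPT_HOME=$(cd "$(dirname "$_self")" && pwd)
    . "$SCRIPT_HOME/B"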
