Unix PL/SQL batch job Toad

The code below is a shell script that runs the insert as a batch job. The insert code works in Toad, but in Unix SQL*Plus doesn't seem to recognize anything from the DECLARE onward. What could be the issue?
function insertNewPQRData
{
printf "Extracting previous month data records \n"
sqlplus -s $USERID_PASSWORD@$ORACLE_SID << THE_END >> $FILENAME
SET NEWPAGE 0
SET SPACE 0
SET LINESIZE 476
SET PAGESIZE 0
SET ECHO OFF
SET FEEDBACK OFF
SET HEADING OFF
SET TERMOUT OFF
whenever sqlerror exit sql.sqlcode
-- spool $FILENAME
DECLARE
     MinCount NUMBER;
BEGIN
     SELECT TO_NUMBER(T2.PROP_VAL) INTO MinCount
       FROM LMN.XYZ_CONFIG T2
      WHERE T2.PROP_NAM = 'CountMin';
     INSERT INTO LMN.XYZ_XYZ_ T1
          (_ID, STATCD, NUM, CRTE_DT, ENTERED_DTTM, ENTERED_USER_ID,
           LAST_UPD_DTTM, LAST_UPD_USER_ID, VER_ID)
     SELECT LMN.XYZ_XYZ__SEQ.nextval, 'AP', V1._N, TRUNC(SYSDATE), TRUNC(SYSDATE),
            'XYZ_BATCH', TRUNC(SYSDATE), 'XYZ_BATCH', 0
       FROM (SELECT DISTINCT V2._NUM N, SUM(V2.CNT) _COUNT
               FROM LMN.XYZ_PQR__VW V2
              WHERE V2._CNT >= MinCount
              GROUP BY V2._NUM) V1
      WHERE NOT EXISTS (SELECT * FROM LMN.XYZ_XYZ_ WHERE V1._N = LMN.XYZ_XYZ_._NUM);
END;
-- spool off
THE_END
RC=$?
if [ $RC -ne 0 ]
then
printf "Error occurred while Inserting New Alerts Data into LMN.XYZ_XYZ_ Table!\n"
exit -1
fi
}
# MAIN PROCESSING SECTION
# Set environment
. /apps_01/XYZ/etc/XYZEnv
APP=`basename $0`
TODAY=`date '+%m%d%Y%H%M%S'`
# Set logfile directory and log files
LOG=${XYZ_LOG}/${APP}.log.${TODAY}
#LOG=/home/XYZ/log/${APP}.log.${TODAY}
# Direct all stdout and stderr to the log file
exec >>$LOG 2>&1
printf "\n${APP} processing started at `date` \n"
printf "${APP} data extract started at `date` \n"
#Create the output filename
FILENAME=$XYZ_LOG/XYZAcctDtl.out.${TODAY}
#FILENAME=/home/XYZ/log/XYZAcctDtl.out.${TODAY}
# Retrieve user id and password
USERID_PASSWORD=`cat $XYZ_ETC/userid.dat`
# Do processing
insertNewPQRData
# Data extract done! Start file transfer
printf "${APP} data purge/insert completed successfully at `date` \n"
RETURN_CD=$?
# Return successful/unsuccessful code
echo "Return Code::::$RETURN_CD"
exit $RETURN_CD

Oracle version is 9i. Actually, there are no error messages. I also have two other functions, but they contain only SQL statements, not a PL/SQL block. Those two functions work fine when the script is run, but I don't see any changes as far as the insert is concerned.
The shell script doesn't throw an error.
If I take the PL/SQL block out and run it in Toad, it works fine.
Any experience dealing with this before?
Tiger
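
In case it helps anyone hitting the same symptom: SQL*Plus only executes an anonymous PL/SQL block once it sees a slash (/) on a line of its own after the END; line. Toad runs the block without the slash, which is why the same code behaves differently in the two tools. Without it, SQL*Plus just leaves the block sitting in the buffer and exits silently when the heredoc ends, with no error and no insert. A minimal sketch of the pattern (the table and column names here are placeholders):

sqlplus -s $USERID_PASSWORD@$ORACLE_SID << THE_END
WHENEVER SQLERROR EXIT SQL.SQLCODE
DECLARE
    v_cnt NUMBER;
BEGIN
    SELECT COUNT(*) INTO v_cnt FROM dual;
    INSERT INTO some_table (n) VALUES (v_cnt);   -- some_table/n are placeholders
    COMMIT;   -- explicit; SQL*Plus would also commit on a clean exit by default
END;
/
THE_END
RC=$?
echo "sqlplus exit code: $RC"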

Similar Messages

  • How to get all AD User accounts, associated with any application/MSA/Batch Job running in a Local or Remote machine using Script (PowerShell)

    Dear Scripting Guys,
I am working on an AD migration project (migration from old legacy AD domains to a single AD domain) and we are in the transition phase. Our infrastructure contains lots
    of users, servers and workstations. Authentication is done through AD only. Many UNIX and Linux based boxes are authenticated against AD through an AD bridge.
    We have a lot of applications in our environment. Many applications are configured to use Managed Service Accounts. Many workstations and servers are running batch
    jobs with AD user credentials. Many applications are using AD user accounts to carry out their processes.
    We need to find out all those AD users which are configured as MSAs, which are configured for batch jobs, and which are being used by the different applications on
    our network (we need to find this out for every machine on the network).
    These identified AD users will be migrated to the new domain with top priority. I am stuck with this requirement, and your support will be deeply appreciated.
    I hope a well designed PS script can achieve this. 
    Thanks in advance...
    Thanks & Regards Bedanta S Mishra

    Hey Satyajit,
Thank you for your valuable reply. It is really a great notion to enable account logon auditing and collect those events for analysis. But you know it is also a tedious job when thousands of users come into the picture. You can imagine how complex the analysis will be
    when more than 200,000 users are getting logged in through AD. It is a fact that when a batch job, MSA or application uses a domain user's credentials successfully, a successful logon event is automatically triggered in the associated
    DC. But there are also many users which are not part of these accounts, i.e. not MSAs or batch-job accounts and not linked to any application. In that case we have to wade through unwanted events.
    Recently jrv, provided me a beautiful script to find out all MSA from a machine or from a list of machines in an AD environment. (Covers MSA part.)
$Report = 'Audit_Report.html'
    $Computers = Get-ADComputer -Filter 'Enabled -eq $True' | Select -Expand Name
    $head = @'
    <title>Non-Standard Service Accounts</title>
    <style>
    BODY{background-color:#FFFFFF}
    TABLE{border-width:thin;border-style:solid;border-color:Black;border-collapse:collapse;}
    TH{border-width:1px;padding:2px;border-style:solid;border-color:black;background-color:ThreeDShadow}
    TD{border-width:1px;padding:2px;border-style:solid;border-color:black;background-color:Transparent}
    </style>
    '@
    $sections = @()
    foreach ($Computer in $Computers) {
        $sections += Get-WmiObject -ComputerName $Computer -Class Win32_Service -ErrorAction SilentlyContinue |
            Select-Object -Property StartName, Name, DisplayName |
            ConvertTo-Html -PreContent "<H2>Non-Standard Service Accounts on '$Computer'</H2>" -Fragment
    }
    $body = $sections | Out-String
    ConvertTo-Html -Body $body -Head $head | Out-File $Report
    Invoke-Item $Report
A script can be designed to get all scheduled background batch jobs on a machine, from which the author/owner of each scheduled job can be extracted, like the one below...
Function Get-ScheduledTasks {
        Param (
            [Alias("Computer","ComputerName")]
            [Parameter(Position=1,ValuefromPipeline=$true,ValuefromPipelineByPropertyName=$true)]
            [string[]]$Name = $env:COMPUTERNAME,
            [switch]$RootOnly = $false
        )
        Begin {
            $tasks = @()
            $schedule = New-Object -ComObject "Schedule.Service"
        }
        Process {
            Function Get-Tasks {
                Param($path)
                $out = @()
                $schedule.GetFolder($path).GetTasks(0) | % {
                    $xml = [xml]$_.xml
                    $out += New-Object psobject -Property @{
                        "ComputerName"   = $Computer
                        "Name"           = $_.Name
                        "Path"           = $_.Path
                        "LastRunTime"    = $_.LastRunTime
                        "NextRunTime"    = $_.NextRunTime
                        "Actions"        = ($xml.Task.Actions.Exec | % { "$($_.Command) $($_.Arguments)" }) -join "`n"
                        "Triggers"       = $(If ($xml.task.triggers) { ForEach ($task in ($xml.task.triggers | gm | Where { $_.membertype -eq "Property" })) { $xml.task.triggers.$($task.name) } })
                        "Enabled"        = $xml.task.settings.enabled
                        "Author"         = $xml.task.principals.Principal.UserID
                        "Description"    = $xml.task.registrationInfo.Description
                        "LastTaskResult" = $_.LastTaskResult
                        "RunAs"          = $xml.task.principals.principal.userid
                    }
                }
                If (!$RootOnly) {
                    $schedule.GetFolder($path).GetFolders(0) | % {
                        $out += Get-Tasks($_.Path)
                    }
                }
                $out
            }
            ForEach ($Computer in $Name) {
                If (Test-Connection $Computer -Count 1 -Quiet) {
                    $schedule.Connect($Computer)
                    $tasks += Get-Tasks "\"
                }
                Else {
                    Write-Error "Cannot connect to $Computer. Please check its network connectivity."
                    Break
                }
            }
            $tasks
        }
        End {
            [System.Runtime.Interopservices.Marshal]::ReleaseComObject($schedule) | Out-Null
            Remove-Variable schedule
        }
    }
    Get-ScheduledTasks -RootOnly | Format-Table -Wrap -AutoSize -Property RunAs,ComputerName,Actions
So I think a PS script could be designed to report all running applications that use domain accounts for authentication, and from that result we could filter out the AD accounts being used by those
    applications. After that, these three individual modules could be combined into a single script to provide the desired output, as per the requirement, in a single report.
    Thanks & Regards Bedanta S Mishra

  • Can a long-running batch job causing deadlocks bring server performance down

    Hi
I have a customer with a long-running batch job (approx 6 hrs); recently we experienced a performance issue where the job now takes >12 hrs. The database server is crawling, and the alert.log shows some deadlocks.
    The batch job is in fact many parallel child batch jobs running at the same time, which would explain the deadlocks.
    So I am wondering: is it possible that deadlocks can cause the whole server to crawl, to the point where even connecting to the database using Toad is slow, or even doing ls -lrt?
    Thanks
    Rgds
    Ung

    Kok Aik wrote:
According to the documentation, a complex deadlock can make the job appear hung and affect throughput, but it didn't mention how it would make the whole server slow down. My initial thought would be the rolling back and reconstruction of CR copies, which would have used up the CPU.
    I think your ideas on rolling back, CR construction etc. are good guesses. If you have deadlocks, then you have multiple processes working in the same place in the database at the same time, so there may be other "near-deadlocks" that cause all sorts of interference problems.
    Obviously you could have processes queueing for the same resource for some time without getting into a deadlock.
You can have a long-running update hit a row which was changed by another user after the update started, which would cause the long-running update to roll back and start again (Tom Kyte refers to this as 'write consistency' if you want to search his website for a discussion on the topic).
    Once concurrent processes start sliding out of their correct sequences because of a few delays, it's possible for reports that used to run when nothing else was going on suddenly finding themselves running while updates are going on - and doing lots more reads (physical I/O) of the undo tablespace to take blocks a long way back into the past.
    And so on...
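    To make the deadlock mechanics concrete, here is a minimal two-session sketch of the classic ordering deadlock (table, columns and values are hypothetical):
    -- Session 1:
    UPDATE t SET val = 1 WHERE id = 1;   -- locks row 1
    -- Session 2:
    UPDATE t SET val = 2 WHERE id = 2;   -- locks row 2
    -- Session 1:
    UPDATE t SET val = 1 WHERE id = 2;   -- waits on session 2
    -- Session 2:
    UPDATE t SET val = 2 WHERE id = 1;   -- waits on session 1: Oracle detects the
                                         -- cycle and raises ORA-00060 in one session
    Parallel child jobs that touch the same rows in different orders hit exactly this pattern, and the waits themselves, not just the detected deadlocks, are what eat into throughput.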
Anyway, according to the customer, the problem seems to be related to lgpr_size, as the problem disappeared after they reverted it to its original default value, 0. I couldn't figure out what lgpr_size is. Can you explain?
    Thanks
    Jonathan Lewis
    http://jonathanlewis.wordpress.com
    http://www.jlcomp.demon.co.uk
    "Science is more than a body of knowledge; it is a way of thinking" Carl Sagan

  • How should I execute file transfer via batch job scheduling

    Hi guys,
i want to call a unix transfer script via batch job scheduling. Executing system commands within ABAP is as below:
             DATA: BEGIN OF ITAB occurs 0,
                         LINE(200),
                        END OF ITAB.
UNIXCOMM = '/usr/sap/trans/data/*******'.
            CALL 'SYSTEM' ID 'COMMAND' FIELD UNIXCOMM
                    ID 'TAB'     FIELD itab-sys.
Now I plan to assign an application server ('Exec Target') to it and let it run in the background.
      Should I do it via a batch job?
    Any info is appreciated very much.

Instead of using
    CALL 'SYSTEM' you can use the FM
    SXPG_COMMAND_EXECUTE, to which you have to pass a command name which you can create or look up in SM49.
    With CALL 'SYSTEM' you cannot catch exceptions, but with the FM given above you can check
    SY-SUBRC, STATUS and EXITCODE for successful command execution:
    sy-subrc = 0
    STATUS = 'O'  " capital O
    exitcode = 0
    For batch scheduling, you can use this FM in a report with one parameter for the additional parameters which you need to pass to this FM, then create a job for that report and schedule it.
    I've used it and found it useful even for batch scheduling.
    Reward if useful
    Regards
    Prax

  • Problem with PLSQL dbms job in apex

    Hi,
I am completely new to APEX, and am facing an issue which I feel is strange.
    There is a button on the APEX page and a process is associated with that button. Some tables are updated in the code written in that process, and a DB procedure is called which creates some files on the Unix box (utl_file). The type of that process is PLSQL DBMS job.
    Now it has been observed that the code behind that button is executed at odd times, i.e. when no one has pressed the button.
    So my questions are:
    Is this PLSQL DBMS job the same as the database dbms_job? The logs have no entry.
    Is there a case where the button might be pressed earlier but the job executed late because resources were not available?
    Are there any logs created by APEX so that I can track the job?
    Any idea if I can replace this PLSQL DBMS job with any other process type?
    Secondly, there are 4 files being generated, of which 3 are generated with 644 permissions and 1 file is generated with 600. Why does this happen?
    Apex version :3.1
    db version: 10.2.0.4
    Regards,
    Ankit
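    For what it's worth, if that process type really does submit via the database job queue, the queue can be inspected directly from SQL*Plus; a quick sketch (requires access to the DBA views):
    -- Jobs defined in the database job queue:
    SELECT job, what, last_date, next_date, broken FROM dba_jobs;
    -- Jobs executing right now:
    SELECT * FROM dba_jobs_running;
    If the button's code shows up in WHAT, the "odd times" should line up with NEXT_DATE.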

Looks like it's still a database bug. In 10.1.0.4, I get the correct results in sqlplus from the query, but I get incorrect results in sqlplus if I do it like this:
    set serveroutput on
    declare
        l_cursor    integer := DBMS_SQL.OPEN_CURSOR;
        l_desc_tbl  DBMS_SQL.DESC_TAB2;
        l_ignore    number;
        l_col_cnt   integer;
        l_col_val   varchar2(32767);
    BEGIN
        DBMS_SQL.PARSE(l_cursor,
    'select connect_by_isleaf LEAF,DESCRIPTION,PLACE from test_tab start with id = 1 connect by prior id = id_master',
            DBMS_SQL.NATIVE);
        l_ignore := DBMS_SQL.EXECUTE(l_cursor);   
        DBMS_SQL.DESCRIBE_COLUMNS2(l_cursor, l_col_cnt, l_desc_tbl );
        for i in 1 .. l_col_cnt loop              
            DBMS_SQL.DEFINE_COLUMN(l_cursor, i, l_col_val, 32767 );
        end loop;
        while (DBMS_SQL.FETCH_ROWS(l_cursor) > 0)
        loop
            for i in 1 .. l_col_cnt loop
                DBMS_SQL.COLUMN_VALUE(l_cursor, i, l_col_val);           
                -- print the column value
                dbms_output.put_line(l_col_val);
            end loop;
            dbms_output.put_line(chr(10));      
        END LOOP;
        if DBMS_SQL.IS_OPEN(l_cursor) then
            DBMS_SQL.CLOSE_CURSOR(l_cursor);
        end if;
    END;
/
    Scott

  • Schedule a Batch job to run every one min

    Hello all,
I'm new to Data Services. My requirement is to schedule a batch job to run every minute.
    Please help me on this.
    Thanks,
    David king J

    Hi David,
Regarding your query, below is the suggestion from my side:
    Option I: Schedule the job from the Data Services Management Console; below are the steps.
    (a) Log in to the SAP Business Objects Data Services Management Console
    Link: <Server Name>:<Port Number>/DataServices
    (b) Go to the Administrator tab and click on Batch Job
    (c) Select the batch job and on the right-hand side click on Add Schedule,
    where you have the option to schedule as per your requirement
    Note: You can use the Data Services scheduler or the BOE scheduler.
Option II: Schedule the job from the UNIX environment; below are the steps.
    (a) Crontab access rights are required.
    For checking crontab rights use the commands below:
    crontab -l  ---> list of scheduled jobs
    crontab -e  ---> for editing or creating a new schedule
    (b) Log in to the SAP Business Objects Data Services Management Console
    Link: <Server Name>:<Port Number>/DataServices
    (c) Go to the Administrator tab and click on Batch Job
    (d) Select the batch job and on the right-hand side click on Export Execution Command
    (e) Update the details regarding the job server and global variables (if used in the job) and click on Export. Now check the shell script (.SH file) mentioned in the Data Services Management Console link.
    (f) Copy and paste the script to your desired location.
    (g) Then schedule that shell script in cron; below is a sample entry (note the * in the minute field: 1 * * * * would run the job only once an hour, at minute 1, not every minute):
    * * * * * /usr/sap/BODS/BO_Script/ABC.SH
    For cronjobs details please refer below link
    Run crontab (cron jobs) Every 10 Minutes
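    One caution with a one-minute schedule: if a run ever takes longer than a minute, cron will start a second copy alongside the first. A minimal lockfile wrapper (paths are placeholders) avoids the overlap:
    #!/bin/sh
    # Skip this run if the previous one is still going (mkdir is atomic).
    LOCK=/tmp/abc_job.lock
    if mkdir "$LOCK" 2>/dev/null; then
        trap 'rmdir "$LOCK"' EXIT
        /usr/sap/BODS/BO_Script/ABC.SH >> /tmp/abc_job.log 2>&1
    fi
    Point the crontab entry at the wrapper instead of at ABC.SH directly.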
    Note: As Michael suggested that option is also good, you can also implement the same.
Hope this helps!
    Thanks,
    Daya

  • Add storage location to a batch job

    Hi all,
How do I add a storage location in a batch job?
    Please help!
    Thanks,
    Avani.

    I hope you realize that you will have to reimport those corrected photos into iPhoto for it to recognize the new times.
    Do you Twango?
    TIP: For insurance against the iPhoto database corruption that many users have experienced I recommend making a backup copy of the Library6.iPhoto database file and keep it current. If problems crop up where iPhoto suddenly can't see any photos or thinks there are no photos in the library, replacing the working Library6.iPhoto file with the backup will often get the library back. By keeping it current I mean backup after each import and/or any serious editing or work on books, slideshows, calendars, cards, etc. That insures that if a problem pops up and you do need to replace the database file, you'll retain all those efforts. It doesn't take long to make the backup and it's good insurance.
    I've written an Automator workflow application (requires Tiger), iPhoto dB File Backup, that will copy the selected Library6.iPhoto file from your iPhoto Library folder to the Pictures folder, replacing any previous version of it. You can download it at Toad's Cellar. Be sure to read the Read Me pdf file.

  • Difference between batch job and Background Job

    Hi Forum,
Can anyone distinguish between a batch job and a background job for me, please?
    SK

    Hi,
    Batch Job:
    A batch job is a process that runs in the background, often deferred and unattended, to process data in groups (batch) rather than by individual transactions (e.g. a monthly phone bill rather than a bill for each individual phone call). A batch job executes a sequence of programs and technical instructions that are stored in a command file. Progress and error messages are output to a log file allowing users to determine, at any time, if the batch job completed successfully or identify the cause of the problem. Because batch jobs run in the background they are less visible to the end user.
    In a business-computing context, batch job scheduling implies the automatic execution of background tasks (batch jobs) at pre-determined points in time (e.g. every day at 8pm, midday on Wednesday).
    3 types of batch job scheduling can be distinguished: native, basic and advanced batch job scheduling.
Most operating systems and some business solutions software come equipped with native batch job scheduling tools that provide a limited service (e.g. Windows Scheduled Tasks, UNIX crontab, SAP CCMS) locally to each installation.
    However, business processes may span multiple platforms, applications, countries and companies. Their complexity may require much more functional power, as provided by basic batch job scheduling: national and regional variations in the working calendar, sequence variations according to the day of the month, triggering of jobs by the successful completion of preceding jobs, elimination of gaps and reduced batch windows. Major benefits of basic batch job scheduling are enhanced productivity, operations reliability and cost reduction.
    For e-business applications that require real-time processing, the distance between interactive individual processing and batch processing tends to decrease. Advanced batch job scheduling can handle these advanced requirements: event-driven scheduling for real-time synchronization with interactive processing, just-in-time scheduling to run operations as soon as possible, cross-platform and cross-application services for the entire IT landscape, and real-time overall monitoring to track background operations for all applications on all servers.
    The standard benefits of batch job scheduling are drastically amplified when job schedulers can handle the end-to-end automation and monitoring requirements for all background operations.
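    As a concrete instance of the native scheduling mentioned above, two UNIX crontab entries (the script paths are placeholders) matching the timing examples given earlier:
    # min hour day-of-month month day-of-week  command
    0 20 * * *  /opt/batch/nightly_job.sh   >> /var/log/nightly_job.log 2>&1    # every day at 8pm
    0 12 * * 3  /opt/batch/wednesday_job.sh >> /var/log/wednesday_job.log 2>&1  # midday on Wednesday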
To schedule a background job, follow the steps below:
    1. Use Transaction SM36.
    2. Assign a job name.
    3. Set the job’s priority, or “Job Class”:
    High --- Class A
    Medium --- Class B
    Low ---Class C
4. Here you can specify when the job is to start by choosing Start Condition. If you want the job to repeat, i.e. be periodic, check the box at the bottom.
    Else click on Immediate and save.
    5. Now define the job's steps by choosing Step.
    Here you need to give the ABAP program that is to be used and the name of the variant that is being used.
    6. Save the fully defined job to submit it to the background processing system. (You need to click the Save button on the main screen, i.e. SM36.)
    7. When you need to modify, reschedule, or otherwise manipulate a job after you've scheduled it the first time, you'll manage jobs from the Job Overview.
    8. Release the job so that it can run.
    Jobs, even those scheduled for immediate processing, cannot run without first being released. So do remember to release.
    Hope this helps you.
    Regards,
    Rakesh

  • PL/SQL batch jobs and error reports handling

Has anyone ever had to write exception and statistics reports from a PL/SQL batch job, and how were those handled? I have 3 options below but am not sure which is best (or whether there is another way to do this).
    I have a series of batch jobs that are written in PL/SQL. A unix script will invoke a SQL*Plus script that calls a stored procedure. The stored procedure may call other procedures and functions. I'd like to collect all the errors and write them out to a report that the user can review. I was thinking of just writing messages using DBMS_OUTPUT and spooling that to a file in my script, but I have concerns that other messages in the buffer may end up in the report. The other thought was to use UTL_FILE, but then I would have to make sure the file handle got passed through all the procedures and functions appropriately. The last thought was to write to a table (a temporary PL/SQL table) and then read that table and write to a file using UTL_FILE (no need to pass file handles that way), but I'm not sure if a called function or procedure can access that table.

Just a thought:
    If you use UTL_FILE, you can put all file operations in a separate package, with public variables for the file handles and so on. Then it is not necessary to pass those variables to each procedure and function.
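    A minimal sketch of that idea (the package, directory and file names are hypothetical):
    CREATE OR REPLACE PACKAGE batch_log AS
        PROCEDURE open_log (p_name IN VARCHAR2);
        PROCEDURE put (p_line IN VARCHAR2);
        PROCEDURE close_log;
    END batch_log;
    /
    CREATE OR REPLACE PACKAGE BODY batch_log AS
        g_file UTL_FILE.FILE_TYPE;   -- package-level handle: callers never see it
        PROCEDURE open_log (p_name IN VARCHAR2) IS
        BEGIN
            -- 'LOG_DIR' is a hypothetical directory known to UTL_FILE
            g_file := UTL_FILE.FOPEN('LOG_DIR', p_name, 'w');
        END;
        PROCEDURE put (p_line IN VARCHAR2) IS
        BEGIN
            UTL_FILE.PUT_LINE(g_file, p_line);
        END;
        PROCEDURE close_log IS
        BEGIN
            IF UTL_FILE.IS_OPEN(g_file) THEN
                UTL_FILE.FCLOSE(g_file);
            END IF;
        END;
    END batch_log;
    /
    Any procedure or function in the batch run can then call batch_log.put('message') without a file handle ever being passed around. The third option works for the same reason: a package-level PL/SQL table is visible to called procedures in the same session, just like g_file is here.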

  • Batch job

Hi, I am working on a batch job.
    My program prints an invoice as well as downloading data, and I have to run this program in batch.
    I am using the FMs JOB_OPEN, JOB_SUBMIT and JOB_CLOSE,
    but it is failing in JOB_SUBMIT with sy-subrc eq 1,
    giving me the error "bad print parameter". I haven't done this before.
    I think it may be an options issue.
CALL FUNCTION 'JOB_SUBMIT'
        EXPORTING
          authcknam = sy-uname  "tbtcjob-authcknam
          jobcount  = tbtcjob-jobcount
          jobname   = p_jobnam
          language  = sy-langu
          report    = c_reprot
          variant   = pvariant
        EXCEPTIONS
          OTHERS    = 01.
Could anybody please guide me as to why this is so?
    Regards.
    Kusum.

You can use the code below:
    OPEN DATASET gv_file FOR INPUT IN TEXT MODE ENCODING DEFAULT
                                WITH SMART LINEFEED.
        IF sy-subrc EQ 0.
          WHILE sy-subrc IS INITIAL.
            READ DATASET gv_file INTO gwa_header_file.
            IF sy-subrc NE 0.
              EXIT.
            ELSE.
              APPEND gwa_header_file TO gt_header_file.
            ENDIF.
          ENDWHILE.
          CLOSE DATASET gv_file.
        ENDIF.

  • Display error message in batch job log

    Hello
I have a batch job running, and an error occurs during some validation logic.
    The problem is that I need the batch job to continue when this error message comes; it should not cancel the batch job, as it currently does, but display that error message in the batch job log. There are more, similar error messages coming into the job log and the job finishes, but when my error message comes the job gets cancelled.
    I cannot issue it as an info message, as that would give the wrong idea about the message type.
    Is there an FM by which we can add a message to the job log?

Sanjeev, I have done that, but the problem is that I do not want to issue it as an information message but as an error message only, and continue processing.
    If you look at the screenshot, the 3rd message was issued by me as information, and you can see error messages as well (the 6th and 7th), and the job continued until it finished.
    Basically, I want that 'I' to be displayed as an 'E'.

  • Report to be sent to a list of recipients in an e-mail (part of batch job)

    Hi,
    I need to generate a report using ALV functionality.
Currently my report requirement is for it to be sent to a list of recipients in an e-mail (part of the batch job set-up), and the recipients just download the report in spreadsheet format.
    Could you please suggest the approach I need to follow, and how I can set this report up as part of a batch job which will send the report details to the users in the form of an email?
    Points will be rewarded for the answers.
    Regards,
    Ravi Ganji

    Hi,
In SM36 you will see a button for "Spool list recipient", which is next to the Target Server button.
    Press that button.
    Give the email address in the recipient field.
    Give the steps and start condition, and then release the job.
    THanks,
    Naren

  • I Need to Create a report for batch jobs Based on Subject Area.

    Hi SAP Guru's,
I need to create a report that shows the completion times and status of batch jobs based on subject area (SD, MM, FI).
    Please help me with this issue ASAP.
    Thanks in Advance.
    Krishna.

You may need to activate some additional Business Content if it is not already installed, but there are a lot of BI statistics you can report on. Have a look at this:
    http://help.sap.com/saphelp_nw70ehp1/helpdata/en/46/f9bd5b0d40537de10000000a1553f6/frameset.htm

  • Batch job for collecting Blocked Deliveries

    Hi,
Can anyone help me? I need to collect all the orders that have been blocked for delivery and send the results to an email address.
    I created a batch job and gave it the program for SD documents blocked for delivery, but it doesn't seem to work. Can anyone please give a step-by-step procedure?
    Thanks a lot,
    Michelle.

1. T-code SM36 - here you will create the job.
    2. Give the appropriate variants and the background user id.
    3. Once this is done, go to the Spool recipient and give the email id where you want the mail sent. Run the job now. You should receive the mail, provided the connections are maintained.
    Hope this will resolve the issue.
    Mani

  • Duplicate deliveries getting created by batch job against STO

    Hi Experts,
I am facing an issue where duplicate deliveries are getting created by a batch job against an intercompany STO.
    The scenario is a PO having one line item with an ordered qty of 8000 kg.
    Through the batch job, two deliveries got created on the same day for the PO line item:
    one delivery for 8000 kg and another for 7000 kg. So the user deleted the second delivery of 7000 kg.
    The next day, a delivery for 8000 kg was again created for the same PO line item through the batch job.
    I am wondering how duplicate deliveries are getting created by the batch job for the PO even though it has no open items.
    All the deliveries were created through the batch job only, as cross-checked via the user name in the deliveries.
    Kindly help me fix the issue.

    Hi Amit
I assume you are talking about outbound deliveries. In this case it would be worth checking the customer master record for the receiving plant. In the sales area data there is a shipping tab which contains several settings used to control delivery creation for customers.
    It is possible to control how the system behaves when you have a stock shortage and to restrict the number of partial deliveries. This might be the cause, and might help you control the situation.
    Regards
    Robyn
