Table lock in Batch jobs

There is a program which runs in batch mode every 30 minutes. After a successful execution, the program updates the last run timestamp in a Z table. We set up one batch job per sales org, so several jobs run at the same time, and some of them do not update the timestamp. My gut feeling is that the reason could be a table lock. We can't ask the batch team to schedule all jobs sequentially within the 30-minute window because of the huge number of jobs.
How can we overcome this problem? Is checking for a table lock using ENQUEUE_E_TABLES and waiting for some time a feasible solution? Can someone guide me in this regard, please?
Thanks in advance.

Hi Perez C,
You may try the ENQUEUE and, if it fails, skip the update for later reprocessing. You can also set _WAIT = 'X' in the ENQUEUE call; the enqueue then retries for a preset time (the default is, I think, around 3 seconds) to obtain the lock. In most cases this helps. But if you still cannot get the lock, you should at least write an entry to the application log.
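A minimal sketch of that approach, assuming the timestamp sits in a hypothetical table ZLASTRUN and using the generic table lock ENQUEUE_E_TABLE (if you have a dedicated lock object for your Z table, use its generated ENQUEUE_/DEQUEUE_ modules instead):
* Try to lock the Z table; _WAIT = 'X' makes the enqueue retry for a short,
* profile-dependent period instead of failing immediately.
CALL FUNCTION 'ENQUEUE_E_TABLE'
  EXPORTING
    mode_rstable   = 'E'
    tabname        = 'ZLASTRUN'          " hypothetical timestamp table
    _wait          = 'X'
  EXCEPTIONS
    foreign_lock   = 1
    system_failure = 2
    OTHERS         = 3.
IF sy-subrc = 0.
  UPDATE zlastrun SET last_run = sy-uzeit
                  WHERE vkorg = p_vkorg. " hypothetical field names
  CALL FUNCTION 'DEQUEUE_E_TABLE'
    EXPORTING
      mode_rstable = 'E'
      tabname      = 'ZLASTRUN'.
ELSE.
* Lock not granted even after waiting: log it and let the next run catch up.
  MESSAGE i398(00) WITH 'ZLASTRUN locked, timestamp update skipped'.
ENDIF.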
Regards,
Clemens

Similar Messages

  • Batch Job creation by adding entries to SAP tables

    Hi Experts,
    I understand there are at least 4 SAP tables involved with Batch Jobs.
    I need to create many variants and manually creating them using sm36 is tedious.
    If I add entries to the TBTC* tables directly, can the batch jobs be created without causing inconsistencies?
    The batch jobs are all similar, based on a Z program that selects data from SAP tables and updates a Z table. There are many variants I need to create and assign to the batch jobs. So if I can update the table entries to achieve the same, it would be much quicker, reduce human error, and make checking the job setup faster and easier.
    Please advise.
    regards
    M Russo

    Hi,
    There are several function modules which help you to create variants.
    You need to write a simple report to automate your variant creation and batch submission:
    use function module RS_CREATE_VARIANT to create the variant, and use the structure RSPARAMS to specify the values of your selection screen.
    Then go to function group BTCH and use JOB_OPEN / JOB_CLOSE (or any other FM in the BTCH function group) to submit the batch jobs with those variants; see the sketch below.
    I hope this helps, rather than writing to the tables directly.
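    A rough sketch of that flow, assuming a hypothetical report ZREPORT with a parameter P_VKORG (please verify the RS_CREATE_VARIANT parameters in SE37 before using this):
    DATA: ls_varid   TYPE varid,
          lt_varit   TYPE TABLE OF varit    WITH HEADER LINE,
          lt_params  TYPE TABLE OF rsparams WITH HEADER LINE,
          l_jobname  TYPE tbtcjob-jobname VALUE 'ZREPORT_VAR_0001',
          l_jobcount TYPE tbtcjob-jobcount.
    * Variant header and one selection-screen value (all names are examples)
    ls_varid-report   = 'ZREPORT'.
    ls_varid-variant  = 'VAR_0001'.
    lt_varit-mandt    = sy-mandt.
    lt_varit-langu    = sy-langu.
    lt_varit-report   = 'ZREPORT'.
    lt_varit-variant  = 'VAR_0001'.
    lt_varit-vtext    = 'Generated variant'.
    APPEND lt_varit.
    lt_params-selname = 'P_VKORG'.
    lt_params-kind    = 'P'.
    lt_params-sign    = 'I'.
    lt_params-option  = 'EQ'.
    lt_params-low     = '1000'.
    APPEND lt_params.
    CALL FUNCTION 'RS_CREATE_VARIANT'
      EXPORTING
        curr_report    = ls_varid-report
        curr_variant   = ls_varid-variant
        vari_desc      = ls_varid
      TABLES
        vari_contents  = lt_params
        vari_text      = lt_varit
      EXCEPTIONS
        variant_exists = 1
        OTHERS         = 2.
    CHECK sy-subrc = 0.
    * Schedule a job that runs the report with the new variant (FMs of group BTCH)
    CALL FUNCTION 'JOB_OPEN'
      EXPORTING
        jobname  = l_jobname
      IMPORTING
        jobcount = l_jobcount
      EXCEPTIONS
        OTHERS   = 4.
    IF sy-subrc = 0.
      SUBMIT zreport USING SELECTION-SET ls_varid-variant
             VIA JOB l_jobname NUMBER l_jobcount AND RETURN.
      CALL FUNCTION 'JOB_CLOSE'
        EXPORTING
          jobcount  = l_jobcount
          jobname   = l_jobname
          strtimmed = 'X'   " start immediately; use the start date/time fields to schedule
        EXCEPTIONS
          OTHERS    = 8.
    ENDIF.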
    Thanks

  • Batch Jobs fail because User ID is either Locked or deleted from SAP System

    Business users release batch jobs under their own user IDs.
    When these user IDs are later deleted or locked by the system administrator, the batch jobs fail because the user is locked or no longer exists in the system.
    Is there any way to stop these batch jobs from being cancelled, or is there any standard SAP report to check whether batch jobs are running under a specific user ID?

    Ajay,
    What you can do is this: if you want the jobs to still appear under the particular user's name (I know people crib about anything and everything) and not worry about the jobs failing when the user is locked out, create a system user (e.g. BKGRJOBS) and run the steps of the jobs under that system user. You can specify this user when defining the step (SM36), or by changing the job in SM37.
    This way the jobs will keep running under the business user's name and will not fail if he/she is locked out. But make sure the system user has the necessary authorizations, or the job will fail. SAP_ALL would be fine, but again that really depends on your company.
    Kunal

  • Create Pivot Table in Background using a Batch Job

    I'm using 4.7 and I have a request to create a daily batch job that runs an ALV report and creates a pivot table. How do I create the pivot table? I tried using the FM 'EXCEL_OLE_STANDARD_DAT' but I get a 'File do not exist' message. I tried looking at the documentation of this FM, but it doesn't have any. If anybody can explain to me how this function works, that would really be great. Or if you have a better solution, that would be even greater than great.
    Thanks in advance!

    Hello Liz,
    I think it is not possible, because to generate an Excel file SAP needs the Excel program (via OLE).
    But when the program is executed in background, the SAP server has no way to access the Excel program to generate the file.
    Hope I am wrong...
    Regards,
    Mauricio

  • CK24 Marking Batch Jobs causes SM12 Record locks

    When finance runs the CK24 marking job at month end, it seems to leave some record locks hanging in SM12. The batch job is successful and says it has processed 23,699 materials, but it always leaves locks on some 700 materials.
    The materials are ROHs, HALBs, and FERTs. I cannot figure out why only some of them cause this problem.
    I am able to clean up the locks by deleting them in SM12, but I want to know how to prevent this, because it stops all functions, such as shipping, for these materials.
    Thanks,
    Bev

    Dear,
    Please check the job log in SM37.
    Have you checked the enqueue monitoring in RZ20?
    In SM12, by double-clicking a lock entry, you can display detailed information, including the host name and number of the SAP system in which the lock was generated.
    Check this SAP help:
    http://help.sap.com/saphelp_nw04/helpdata/en/7b/f9813712f7434be10000009b38f8cf/frameset.htm
    Regards,
    R.Brahmankar

  • Question On Table Locks

    Hello,
    I have a question on locks.
    USER A runs a batch job of insert statements that inserts 100,000 records and commits after every 1,000 records.
    USER B is also running a batch job on similar tables and is blocked by locks held by USER A.
    I have identified that USER A is blocking USER B, and now I need USER B to continue with its batch job. My question is that I need to kill USER A's session without making him lose all the data he has already inserted. In short, as SYS, can I commit his inserted transactions on his behalf?
    I assume that if I kill his session he will lose all the INSERTs he performed, since he hasn't committed up to that point.
    Please Help.

    "Is there any way I could save the inserted transactions of USER A?" No. Either USER A commits, or USER A is killed and the transaction rolls back.
    "I'm also confused about this type of lock I see." If you supply details of what the sessions are waiting on / which locks are held, that will clarify it.
    "Do you suggest I commit every 500 records?" No. In general, you should commit at the end of a transaction - all or nothing. Committing every X rows is nasty.
    "There are a lot of referential constraints with other tables and this could be causing the lock." You can get locking problems with unindexed foreign keys.
    If you could provide more details of what's going on in both sessions / what they're waiting on, this should clarify things.

  • Sessions from apex are blocking by batch job

    Hi All,
    In my 10.2.0.3 linux with apex 3.1.1
    We have a batch job using a package that performs DML operations on tables, depending on some conditions, and this job runs daily for about 5 hours.
    The sessions from APEX also use some procedures in the same package. On some days the sessions from APEX are blocked by the batch job; while blocked, the wait event of the APEX sessions is 'library cache pin'. Once the batch job completes, the block is released automatically.
    What may be the reason?
    Thanks in Advance,
    Sunil

    The query against v$lock is not relevant if the blocked sessions are waiting on "library cache pin" - so you seem to have two separate problems.
    If you have a session waiting on another session's transaction slot in mode 4 there are several possible causes - often related to indexes, but there are a couple of "internal" problems as well. If you see this locking issue again some of the simplest things to check for would be session 1 inserting (without commit) some rows in a table with a unique key, and session 2 then trying to insert a duplicate. Session 2 has to wait for session 1 to commit or rollback before deciding whether to return a "duplicate key" error, or to continue processing.
    The "library cache pin" waits suggest that the package has become invalid while the batch job is running it, and one of the Apex sessions is trying to recompile it. But if the batch job is currently running (hence pinning) the "executable", then the Apex session can't get the necessary exclusive pin until the batch job ends and releases its pin.
    Read the notes in the script $ORACLE_HOME/rdbms/admin/catblock.sql about creating views that let you see more of the information about library cache (KGL) locks and pins.
    Regards
    Jonathan Lewis
    http://jonathanlewis.wordpress.com
    http://www.jlcomp.demon.co.uk
    To post code, statspack/AWR report, execution plans or trace files, start and end the section with the tag {noformat}{noformat} (lowercase, curly brackets, no spaces) so that the text appears in fixed format.
    "Science is more than a body of knowledge; it is a way of thinking"
    Carl Sagan

  • How to get all AD User accounts, associated with any application/MSA/Batch Job running in a Local or Remote machine using Script (PowerShell)

    Dear Scripting Guys,
    I am working on an AD migration project (migration from old legacy AD domains to a single AD domain) and we are in the transition phase. Our infrastructure contains lots of users, servers and workstations. Authentication is done through AD only, and many UNIX- and Linux-based boxes are authenticated against AD through an AD bridge.
    We have a lot of applications in our environment. Many applications are configured to use Managed Service Accounts (MSAs). Many workstations and servers run batch jobs with AD user credentials, and many applications use AD user accounts to carry out their processes.
    We need to find all those AD users which are configured as MSAs, which are configured for batch jobs, and which are used by different applications on our network (we need to find this out for every machine on the network).
    These identified AD users will be migrated to the new domain with top priority. I am stuck with this requirement and your support will be deeply appreciated.
    I hope a well designed PS script can achieve this.
    Thanks in advance...
    Thanks & Regards Bedanta S Mishra

    Hey Satyajit,
    Thank you for your valuable reply. It is really a great notion to enable account logon auditing and collect those events for analysis. But you know it is also a tedious job when thousands of users come into the picture. You can imagine how complex the analysis will be when more than 200,000 users log on through AD. It is a fact that when a batch job, MSA or application uses a domain user's credentials successfully, a successful logon event is automatically triggered in the associated DC. But there are also many users which are not part of these accounts (MSA/batch jobs) and not linked to any application, so we would have to wade through a lot of unwanted events.
    Recently jrv provided me a beautiful script to find all MSAs on a machine, or on a list of machines, in an AD environment (this covers the MSA part):
    $Report = 'Audit_Report.html'
    $Computers = Get-ADComputer -Filter 'Enabled -eq $True' | Select -Expand Name
    # HTML header and styles for the report
    $head = @'
    <title>Non-Standard Service Accounts</title>
    <style>
    BODY{background-color:#FFFFFF}
    TABLE{border-width:thin;border-style:solid;border-color:Black;border-collapse:collapse;}
    TH{border-width:1px;padding:2px;border-style:solid;border-color:black;background-color:ThreeDShadow}
    TD{border-width:1px;padding:2px;border-style:solid;border-color:black;background-color:Transparent}
    </style>
    '@
    $sections = @()
    foreach ($Computer in $Computers) {
        $sections += Get-WmiObject -ComputerName $Computer -Class Win32_Service -ErrorAction SilentlyContinue |
            Select-Object -Property StartName, Name, DisplayName |
            ConvertTo-Html -PreContent "<H2>Non-Standard Service Accounts on '$Computer'</H2>" -Fragment
    }
    $body = $sections | Out-String
    ConvertTo-Html -Body $body -Head $head | Out-File $Report
    Invoke-Item $Report
    A script can also be designed to get all scheduled background batch jobs on a machine, from which the author / owner of each scheduled job can be extracted, like the one below...
    Function Get-ScheduledTasks {
        Param(
            [Alias("Computer","ComputerName")]
            [Parameter(Position=1,ValueFromPipeline=$true,ValueFromPipelineByPropertyName=$true)]
            [string[]]$Name = $env:COMPUTERNAME,
            [switch]$RootOnly = $false
        )
        Begin {
            $tasks = @()
            $schedule = New-Object -ComObject "Schedule.Service"
        }
        Process {
            # Recursively collect the tasks of a Task Scheduler folder
            Function Get-Tasks {
                Param($path)
                $out = @()
                $schedule.GetFolder($path).GetTasks(0) | % {
                    $xml = [xml]$_.xml
                    $out += New-Object psobject -Property @{
                        "ComputerName"   = $Computer
                        "Name"           = $_.Name
                        "Path"           = $_.Path
                        "LastRunTime"    = $_.LastRunTime
                        "NextRunTime"    = $_.NextRunTime
                        "Actions"        = ($xml.Task.Actions.Exec | % { "$($_.Command) $($_.Arguments)" }) -join "`n"
                        "Triggers"       = $(If($xml.task.triggers){ForEach($task in ($xml.task.triggers | gm | Where{$_.membertype -eq "Property"})){$xml.task.triggers.$($task.name)}})
                        "Enabled"        = $xml.task.settings.enabled
                        "Author"         = $xml.task.principals.Principal.UserID
                        "Description"    = $xml.task.registrationInfo.Description
                        "LastTaskResult" = $_.LastTaskResult
                        "RunAs"          = $xml.task.principals.principal.userid
                    }
                }
                If(!$RootOnly) {
                    $schedule.GetFolder($path).GetFolders(0) | % {
                        $out += Get-Tasks $_.Path
                    }
                }
                $out
            }
            ForEach($Computer in $Name) {
                If(Test-Connection $Computer -Count 1 -Quiet) {
                    $schedule.Connect($Computer)
                    $tasks += Get-Tasks "\"
                }
                Else {
                    Write-Error "Cannot connect to $Computer. Please check its network connectivity."
                    Break
                }
            }
            $tasks
        }
        End {
            [System.Runtime.InteropServices.Marshal]::ReleaseComObject($schedule) | Out-Null
            Remove-Variable schedule
        }
    }
    Get-ScheduledTasks -RootOnly | Format-Table -Wrap -AutoSize -Property RunAs,ComputerName,Actions
    So I think a PS script can be designed to report all running applications that use domain accounts for authentication, so that from that result we can filter out the AD accounts used by those applications. After that, these three individual modules can be combined into a single script to provide the desired output, as per the requirement, in a single report.
    Thanks & Regards Bedanta S Mishra

  • Batch job creation for sending email if the invoice is aged

    Positive confirmation: Send system notification to the requestor if:
    a) An invoice is received and the GR is not yet posted in the ECC system.
    b) The invoice is 'aged' and the GR is not yet posted in the ECC system.
    This requirement is for the USA only, i.e. we have only one company code, which is 8960.
    Identifiers of the invoice are
    Company Code = '8960'
    Invoice Document Type = 'R9'
    Use table EKBE for relationship between Invoice, Goods Receipt and Purchase Order.
    The field EKPO-BEDNR contains the Shopping Cart Number
    Implementation Strategy
    1. To send email if the invoice is aged.
    Create a program that will run as a job. This job will regularly check for invoices related to SRM shopping carts received from E2OPEN. If an invoice is aged and no goods receipt has been posted through a confirmation from SRM, an email goes out from the ECC system to the requestor. Using the shopping cart number on the PO, find the requestor information (email ID) in SRM through an RFC call. This job will run once a day, and the program should have a parameter for the age of the invoice.
    Both emails should contain a link (to be taken from the SRM system, to be supplied) to log in to the SRM system. The email will also contain the shopping cart number and the description of the item, and a message asking to confirm it, including the age of the invoice if it is aged.
    Please let me know how to create the batch job.
    Regards,
    Venkat
    Edited by: VenkatG on Sep 1, 2009 3:14 PM

    Do not pass the COMMIT_WORK flag - leave it blank. In any transaction, for the sake of data integrity, one should never use COMMIT WORK related statements inside such code. Normally SAP transactions issue the COMMIT WORK at the end of the transaction, which is enough for anything added as part of the customer exits.
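    The COMMIT_WORK flag mentioned above is, for instance, a parameter of the classic SAPoffice send FM. Purely as an illustration of the kind of call involved (assuming SO_NEW_DOCUMENT_SEND_API1 and example values; verify the parameters in SE37):
    DATA: ls_docdata  TYPE sodocchgi1,
          lt_body     TYPE TABLE OF solisti1  WITH HEADER LINE,
          lt_receiver TYPE TABLE OF somlreci1 WITH HEADER LINE.
    ls_docdata-obj_descr = 'Aged invoice - goods receipt missing'.  " mail subject
    lt_body-line = 'Please confirm the goods receipt for your shopping cart.'.
    APPEND lt_body.
    lt_receiver-receiver = 'requestor@example.com'.  " email ID from the RFC call to SRM
    lt_receiver-rec_type = 'U'.                      " 'U' = internet address
    APPEND lt_receiver.
    * COMMIT_WORK is deliberately not passed (left initial), as recommended above
    CALL FUNCTION 'SO_NEW_DOCUMENT_SEND_API1'
      EXPORTING
        document_data     = ls_docdata
        document_type     = 'RAW'
      TABLES
        object_content    = lt_body
        receivers         = lt_receiver
      EXCEPTIONS
        document_not_sent = 1
        OTHERS            = 2.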
    Hope this helps.
    Regards
    Anjaiah

  • How to find out batch job failure and taking action:

    Normally we monitor batch jobs through transaction code SM37. In SM37 we give a batch job name, date, and time as input. As a first step we check the job log for the reason for the failure, or check the spool request of the batch job; both help in analysing the error.
    From my experience, a batch job may fail for the reasons below.
    1. Data issues: e.g. an invalid character in the quantity (MEINS) field. We correct the corresponding document with the correct value, or we rerun manually, or we request the team to rerun the batch job after excluding the problematic documents from the job variant so that it can process the other documents.
    2. Configuration issues: e.g. material XXXX is not extended to the plant. We contact the material master team or the business to correct the data, or we raise a call with the support team to correct it. Once the data has been corrected, we request the team to rerun the batch job.
    3. Performance issues: the volume of data processed by the batch job, or network problems. Normally we encounter these issues during month-end processing, when a lot of accounting transactions and documents are posted by the business. The job may fail because there is not enough memory to complete the program, or because SELECT queries in the program time out due to the volume of records.
    4. Network issues: temporary connectivity issues with partner systems. An outage in a partner system such as APO or GTS causes the batch job to fail because the job cannot connect to the other system to get the information it needs for its further steps. Normally we check the RFC destination status with a custom program to see whether connectivity between the systems is working, then inform the other system's team of the required actions. Once the partner system comes back online, we ask the team to restart or manually resubmit the batch job.
    Sometimes we create a manual job with transaction code SM36.

    I'm not sure what the question is among all that, but if you want to check on jobs that are viewable via SM37 and started via SM36, the tables are TBTCP (Background Job Step Overview) and TBTCO (Job Status Overview Table).
    You can use the following FM to get job details:
    GET_JOB_RUNTIME_INFO - Reading Background Job Runtime Data; a small sketch follows.
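    For a program that wants to know about the job it is currently running in, a minimal sketch (parameter names from memory - please verify them in SE37):
    DATA: l_jobname  TYPE tbtco-jobname,
          l_jobcount TYPE tbtco-jobcount,
          l_status   TYPE tbtco-status.
    * Only returns data when the program is actually running as a background job
    CALL FUNCTION 'GET_JOB_RUNTIME_INFO'
      IMPORTING
        jobname         = l_jobname
        jobcount        = l_jobcount
      EXCEPTIONS
        no_runtime_info = 1
        OTHERS          = 2.
    IF sy-subrc = 0.
      " With job name and count you can read TBTCO / TBTCP for status, start time, steps
      SELECT SINGLE status FROM tbtco INTO l_status
             WHERE jobname  = l_jobname
             AND   jobcount = l_jobcount.
    ENDIF.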

  • Launch batch job thru SOAP call : no execution, connection OK.

    Hello,
    I am experiencing some problems launching batch jobs thru Web Services. I have added batch job "EPN_Test_Webservices" to the Web Services and enabled the Job Attributes.
    When I try to launch the job thru a SOAP call, I get a reply but the job is not executed.
    This is the SOAP envelope:
    <SOAP-ENV:Envelope xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/" xmlns:SOAP-ENC="http://schemas.xmlsoap.org/soap/encoding/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
         <SOAP-ENV:Body>
              <m:EPN_Test_Webservices_Job xmlns:m="http://www.businessobjects.com/DataIntegrator/ServerX.xsd">
                   <job_parameters>
                        <job_system_profile>String</job_system_profile>
                        <sampling_rate>10</sampling_rate>
                        <auditing>true</auditing>
                        <recovery>true</recovery>
                        <job_server>JS_ict</job_server>
                        <trace>String</trace>
                   </job_parameters>
              </m:EPN_Test_Webservices_Job>
         </SOAP-ENV:Body>
    </SOAP-ENV:Envelope>
    This is the reply I get:
    <soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
         <soapenv:Body>
              <BatchJobResponse>
                   <pid>3888</pid>
                   <cid>13</cid>
              </BatchJobResponse>
         </soapenv:Body>
    </soapenv:Envelope>
    When I use the "ping" operation I get the following reply:
    <soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
         <soapenv:Body>
              <pingVersion>
                   <version>Business Objects Data Integrator Version 11.7.2.0</version>
              </pingVersion>
         </soapenv:Body>
    </soapenv:Envelope>
    ...which indicates that the connection works. Also the "processed counter" increases, but the job does not get executed. There is nothing in the job log and there is no result (the job is supposed to write a record to a database table). What can be wrong?

    If I run the following (on the machine I'm trying to set up the share)
    smbclient -L localhost -U%
    I get the following output:
    Connection to localhost failed (Error NT_STATUS_CONNECTION_REFUSED)
    so I thought it might be something incorrect on the iptables side of things; however, I haven't really touched that at all and it seems to look correct:
    iptables -nvL
    Chain INPUT (policy ACCEPT 667 packets, 79977 bytes)
    pkts bytes target prot opt in out source destination
    Chain FORWARD (policy ACCEPT 0 packets, 0 bytes)
    pkts bytes target prot opt in out source destination
    Chain OUTPUT (policy ACCEPT 157 packets, 20724 bytes)
    pkts bytes target prot opt in out source destination
    So from my (very little) knowledge this appears correct (I think)... However it appears that something is blocking access somewhere.

  • How to send an updated list using batch job

    Hi All,
    The program displays data on the screen; if the data looks OK, there is an option to update.
    When I run the update, the program submits a batch job and the basic list gets updated, but my batch job is still sending the original data to the spool. How can I send the updated list using the batch job?
    Example: output of the program
                    1         2
    There is an update button on the screen; when I press it, my program submits the batch job and the above list becomes
                    1        2
                    3        4
    but when I check the spool, it shows the output as 1 2 - it is not sending the updated list.
    Please suggest how I can send the updated data.
    Thanks,
    Kumar

    Hi Krishna,
    I have added a button on the ALV list. When I press the update button, my program updates the list and then submits the batch job. I am attaching the sample test program I am trying with; please suggest how I can get the updated list.
    *& Report  ZTESTSSSSS
    REPORT  ztestsssss.
    TYPE-POOLS: slis.
    DATA: gt_fieldcat TYPE slis_fieldcat_alv,
          lt_fieldcat TYPE slis_t_fieldcat_alv,
          gt_sort     TYPE slis_t_sortinfo_alv,
          g_repid     LIKE sy-repid,
          gt_layout   TYPE slis_layout_alv.
    * Output table, job data and update flag used further down
    DATA: lt_return TYPE TABLE OF bapiret2 WITH HEADER LINE,
          w_name    TYPE tbtcjob-jobname VALUE 'ZTESTSSSSS',
          w_number  TYPE tbtcjob-jobcount,
          l_upd     TYPE c.
    PARAMETERS: p_recnnr TYPE i DEFAULT 0.
    START-OF-SELECTION.
      lt_return-type = 'S'.
      lt_return-message = 'test message'.
      APPEND lt_return.
      CLEAR gt_fieldcat.
      gt_fieldcat-fieldname = 'TYPE'.
      gt_fieldcat-outputlen = '3'.
      gt_fieldcat-tabname   = 'LT_RETURN'.
      gt_fieldcat-seltext_l  =  'Type'.
      gt_fieldcat-seltext_m  =  'Type'.
      gt_fieldcat-seltext_s  =  'Type'.
      APPEND gt_fieldcat TO lt_fieldcat.
      CLEAR gt_fieldcat.
      gt_fieldcat-fieldname = 'MESSAGE'.
      gt_fieldcat-outputlen = '15'.
      gt_fieldcat-tabname   = 'LT_RETURN'.
      gt_fieldcat-seltext_l  =  'Message'.
      gt_fieldcat-seltext_m  =  'Message'.
      gt_fieldcat-seltext_s  =  'Message'.
      APPEND gt_fieldcat TO lt_fieldcat.
      CALL FUNCTION 'REUSE_ALV_GRID_DISPLAY'
        EXPORTING
          I_CALLBACK_PROGRAM       = sy-repid
          I_CALLBACK_PF_STATUS_SET = 'SET_PF_STATUS'
          I_CALLBACK_USER_COMMAND  = 'USER_COMMAND'
          IT_FIELDCAT              = lt_fieldcat
        TABLES
          T_OUTTAB                 = lt_return
        EXCEPTIONS
          PROGRAM_ERROR            = 1
          OTHERS                   = 2.
    *&      Form  set_pf_status
    *       text
    *      -->RT_EXTAB   text
    FORM set_pf_status USING rt_extab TYPE slis_t_extab.
      SET PF-STATUS 'STANDARD'.
    ENDFORM. "Set_pf_status
    *&      Form  user_command
    *       text
    *      -->R_UCOMM      text
    *      -->RS_SELFIELD  text
    FORM user_command USING r_ucomm     LIKE sy-ucomm
                            rs_selfield TYPE slis_selfield.
      DATA: li_count TYPE I.
      IF r_ucomm EQ 'UPD'.
    * Adding another message
        lt_return-type = 'S'.
        lt_return-message = 'Another test message'.
        APPEND lt_return.
        rs_selfield-refresh = 'X'.
        rs_selfield-col_stable = 'X'.
        rs_selfield-row_stable = 'X'.
        l_upd = 'X'.
       LOOP AT lt_return.
         WRITE: / lt_return-type, lt_return-message.
       ENDLOOP.
        IF sy-batch IS INITIAL.
          l_upd = 'X'.
    * Open the Job
          CALL FUNCTION 'JOB_OPEN'
            EXPORTING
              jobname          = w_name
            IMPORTING
              jobcount         = w_number
            EXCEPTIONS
              cant_create_job  = 1
              invalid_job_data = 2
              jobname_missing  = 3
              OTHERS           = 4.
          IF sy-subrc = 0.
            SUBMIT ('ZTESTSSSSS') VIA JOB w_name NUMBER w_number
                    AND RETURN
                    WITH p_recnnr = p_recnnr.
            CALL FUNCTION 'JOB_CLOSE'
              EXPORTING
                jobcount             = w_number
                jobname              = w_name
                strtimmed            = 'X'
              EXCEPTIONS
                cant_start_immediate = 1
                invalid_startdate    = 2
                jobname_missing      = 3
                job_close_failed     = 4
                job_nosteps          = 5
                job_notex            = 6
                lock_failed          = 7
                OTHERS               = 8.
          ENDIF.
        ENDIF.
      ENDIF.
    ENDFORM.  "User_command
    Thanks,
    Kumar

  • Pass lock to background job

    Hi,
    I'm looking for ideas.
    We have an inbound process for special messages transferred from the XI system via an asynchronous call. Messages are identified by a GUID.
    The inbound process will save the messages in a database table and create background jobs for processing big messages while small messages are processed directly.
    To avoid any interference, we want to lock messages as long as they are processed. For this we have a lock object and the Enqueue/dequeue modules.
    Now my question is: How can I pass a lock to the background job (created by SUBMIT VIA JOB) and have it released there after processing is done. The Job will run in a new LUW.
    And then, if any error in background (or online) processing occurs, how to make sure all existing locks are released?
    Thanks for some good ideas on how to proceed.
    Regards,
    Clemens

    Resolved - we won't do it. We'll create some kind of queue instead: a fully buffered database table where we put the "lock" entries and remove them when it's time to do so. This way we can also integrate an activity monitor watching the objects being processed; a rough sketch of the idea follows.
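    A minimal sketch of that queue idea, assuming a hypothetical fully buffered table ZMSG_QUEUE keyed by the message GUID (illustration only, not the actual implementation):
    * Caller side: register the message before submitting the background job
    DATA: ls_queue    TYPE zmsg_queue,             " hypothetical queue table
          lv_guid     TYPE guid_32,                " GUID of the inbound message
          lv_jobname  TYPE tbtcjob-jobname VALUE 'ZPROCESS_MESSAGE',
          lv_jobcount TYPE tbtcjob-jobcount.
    ls_queue-guid   = lv_guid.
    ls_queue-status = 'P'.                         " P = in process
    INSERT zmsg_queue FROM ls_queue.
    COMMIT WORK.
    * ...JOB_OPEN as usual, then:
    SUBMIT zprocess_message WITH p_guid = lv_guid
           VIA JOB lv_jobname NUMBER lv_jobcount AND RETURN.
    * ...JOB_CLOSE as usual.
    * In the background job (and in its error handling): remove the entry when
    * done, so that no stale "lock" entries survive a failed run.
    DELETE FROM zmsg_queue WHERE guid = p_guid.
    COMMIT WORK.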
    Thanks for reading
    Clemens

  • Creating batch job with 5 classes

    Hi All,
    I have a requirement in which I have to perform different operations at different times:
    1. Generate a report at 11.00 in the morning.
    2. Update a table with more than 6,000 rows at 12.30 at night.
    3. Generate a second report, querying the database, at 10.00 AM every day.
    4. Generate an automatic mail at 11.30 AM every day.
    All these are plain Java classes, not web components. How can I design the batch job effectively so that it does not take much memory, and design the classes so that they are reusable (DB connection, getting a DB field value frequently, etc.)?
    Can anyone help me with this?

    http://www.google.com/search?q=job+schedule+in+java&client=netscape-pp&rls=com.netscape:en-US

  • Program or function module to get active Batch jobs

    Hi Experts,
    I need a program or an FM that gives me the list of active batch jobs and also how long they have been running.
    Please help me out.

    Just follow the code below; it is close to your requirement (the select-options are added here so the snippet compiles, and 'R' is the TBTCO status for active/running jobs):
    TYPES:   BEGIN OF ty_itbl,
               jobname   TYPE   tbtcp-jobname,
               jobcount  TYPE   tbtcp-jobcount,
               stepcount TYPE   i,"tbtcp-stepcount,
               sdldate   TYPE   tbtcp-sdldate,
               sdltime   TYPE   tbtcp-sdltime,
               sdluname  TYPE   tbtcp-sdluname,
               status    TYPE   tbtco-status,
             END OF ty_itbl.
    DATA :   wt_itbl TYPE TABLE OF ty_itbl,
             wa_itbl TYPE ty_itbl.
    TABLES:  tbtcp.
    SELECT-OPTIONS: wp_prog FOR tbtcp-progname,
                    wp_date FOR tbtcp-sdldate.
    SELECT     a~jobname
               a~jobcount
               a~stepcount
               a~sdldate
               a~sdltime
               a~sdluname
               b~status
               INTO CORRESPONDING FIELDS OF TABLE wt_itbl
               FROM tbtcp AS a
               INNER JOIN tbtco AS b
               ON    b~jobname    EQ    a~jobname
               AND   b~jobcount   EQ    a~jobcount
               WHERE a~progname IN wp_prog
               AND   a~sdldate  IN wp_date
               AND   b~status   EQ    'R'.   "'R' = active (running) jobs

Maybe you are looking for

  • How to Set default value for taxonomywebtagging control with terms and nested terms

    Hi, I have created taxonomy control in custom aspx page and I am able to select terms but I am trying to setup default value to that control. Can anybody let me know how to set the default value for TaxonomyWebTagging control in custom.aspx page with

  • Problem with Clearbox Lightbox

    Hi Everyone, I'm using Clearbox Lightbox 2.0. My thumbnails keep "jumping around" when I move the curser over them. Can anyone advise me on how I can get these thumbnails to remain still? Thanks!

  • Macbook Pro 13" + Dell US2410

    Hello fellas, bought a dell 2410 2 weeks ago and find that when i edit stuffs on lightroom 3, it kind of lags as compared to when i am running LR3 on my macbook itself without using the external monitor. would like to ask if the graphic card on board

  • March Madness...???

    What's the story this year with March Madness on iTune...?

  • Where is: SampleShapefileToJGeomFeature.java

    We are starting an Oracle Spatial project and need to convert a few hundred ESRI shapefiles to a SDO_GEOMETRY datatype. Where can I find the SampleShapefileToJGeomFeature.java file and JAR files to help with the conversion? Thanks in advance...