Background Jobs from WF-BATCH

Hi,
     In my workflow, I escalate along the org path (approver levels) using the function module RH_STRUC_GET; this step is assigned as a background step. My question is: how can I see who the current approver is,
I mean, in which approver's inbox is the work item sitting at the moment?
How can I check this?
Thanks and regards
Raj

Hi Aditya,
   I checked in the workflow log; in the agents view of the log it does not find any agents.
Can you please help me with this?
Thanks
Raj
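
A programmatic way to check this, as a minimal sketch: the workflow API function module SAP_WAPI_WORKITEM_RECIPIENTS returns the current recipients of a work item, i.e. the users in whose inboxes it is sitting. Please verify the exact interface in SE37 first; the parameter and structure names below follow the usual WAPI pattern and are assumptions.

      REPORT zshow_wi_recipients.

      PARAMETERS p_wiid TYPE sww_wiid OBLIGATORY.  "work item ID from the workflow log

      DATA: lt_recipients TYPE STANDARD TABLE OF swhactor,
            ls_recipient  TYPE swhactor,
            lv_retcode    TYPE sy-subrc.

      CALL FUNCTION 'SAP_WAPI_WORKITEM_RECIPIENTS'
        EXPORTING
          workitem_id = p_wiid
        IMPORTING
          return_code = lv_retcode
        TABLES
          recipients  = lt_recipients.

      IF lv_retcode = 0.
        LOOP AT lt_recipients INTO ls_recipient.
          "OTYPE 'US' = user; OBJID is then the user whose inbox holds the item
          WRITE: / ls_recipient-otype, ls_recipient-objid.
        ENDLOOP.
      ELSE.
        WRITE: / 'Could not read recipients, return code', lv_retcode.
      ENDIF.

If this also comes back empty, the agent determination in the RH_STRUC_GET step most likely failed, which would match the empty agents view you see in the log.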

Similar Messages

  • Triggering Real-time Jobs from Batch Jobs

    Can we trigger real-time jobs from batch jobs? As soon as a batch job completes, we need to start a real-time job automatically. Is this possible with BODI XI?

    Greetings Post Originator,
    This post is older than 60 days and there are no entries in the past 30 days. Based on the content discussed, it appears that your question has been answered. This message is being marked as answered and points are being assigned if available where possible.
    Thank you for being an active participant in the SAP Forums,
    Rob Siegele
    Forum Moderator
    SAP Americas

  • Is there a way to access ABAP OO instances from a batch job report

    Hello,
    I am looking for a way to access ABAP OO instances from a batch job report. My circumstances are the following:
    I have some ABAP OO coding that identifies other objects (class instances) that have to be processed (they have a DoIt method that does some calculation). As this processing is time-consuming and performance-relevant, I have to parallelize it in batch jobs. According to SM36, these batch jobs can only be "simple" ABAP reports. The problem is that I don't really know how to tell the batch job report which objects to process; somehow I have to access these instances from that batch job report (a key-handover sketch follows at the end of this thread).
    Does anybody have an idea?
    Greetings
    Matthias

    Hi David,
    Thanks a lot for your help. After a lot of searching on the net, this seems to be the only way to cope with it. However, I am not sure about the locking mechanisms and whether it is suitable for mass data processing. The help page you suggested states the following, which I do not fully understand:
    "The current lock logic does not enable you to set specific locks for the following requirements:
    ·        Many parallel read and write accesses
    ·        Frequent write accesses
    ·        Division into changeable and non-changeable areas
    Although the lock logic makes the first two points technically possible, they are not practical because most accesses would be rejected."
    Does this mean
    a) if I don't want to set "specific locks" for frequent write accesses, I am fine,
    or
    b) I am discouraged from using shared memory techniques for frequent write accesses?
    In the latter case I will have a problem...
    What do you think?
    Greets
    Matthias
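
    One commonly used workaround for the original question, as a minimal sketch (this is not the shared-objects approach discussed in the reply above): don't try to hand over live instances at all, but pass only the keys that identify them through the INDX database cluster and rebuild the objects inside the batch report. The report name ZPARALLEL_WORKER, the cluster area 'ZP' and the class ZCL_WORKER with its IV_KEY constructor parameter are hypothetical; DO_IT stands for the DoIt method mentioned above.

      REPORT zparallel_worker.

      PARAMETERS p_id TYPE indx-srtfd OBLIGATORY.  "handover key chosen by the dispatcher

      TYPES ty_key(32) TYPE c.

      DATA: lt_keys TYPE STANDARD TABLE OF ty_key,
            lv_key  TYPE ty_key,
            ls_indx TYPE indx,               "cluster administration record
            lo_obj  TYPE REF TO zcl_worker.

      " The dispatcher stores the keys before submitting this report, for example:
      "   EXPORT keys = lt_keys TO DATABASE indx(zp) FROM ls_indx ID 'ZPAR_KEYS_001'.
      "   SUBMIT zparallel_worker WITH p_id = 'ZPAR_KEYS_001'
      "          VIA JOB ... NUMBER ... AND RETURN.

      IMPORT keys = lt_keys FROM DATABASE indx(zp) TO ls_indx ID p_id.

      LOOP AT lt_keys INTO lv_key.
        "Rebuild each instance from its key and run the time-consuming method
        CREATE OBJECT lo_obj
          EXPORTING
            iv_key = lv_key.
        lo_obj->do_it( ).
      ENDLOOP.

    For the shared-objects variant discussed above, a minimal read-side sketch (the area class ZCL_JOB_AREA, root class ZCL_JOB_ROOT and its GET_KEYS method are hypothetical and would be created via transaction SHMA): many readers can attach in parallel, but only one writer can hold the change lock at a time, which is what the quoted help text warns about for write-heavy scenarios.

      DATA: lo_handle TYPE REF TO zcl_job_area,
            lt_keys   TYPE string_table.

      TRY.
          "Read lock: several batch jobs may attach for read at the same time
          lo_handle = zcl_job_area=>attach_for_read( ).
          lt_keys   = lo_handle->root->get_keys( ).  "hypothetical root-class method
          lo_handle->detach( ).                      "release the read lock quickly
        CATCH cx_shm_attach_error.
          "Area not built yet, or exclusively locked by a writer: retry or fall back
      ENDTRY.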

  • Printing to the wrong Printer from the Batch Job

    Hi,
    Good morning, everyone!
    I am facing a problem with a printout coming from a batch job. The batch job contains 10 steps. Previously, the output of all steps was configured to one printer. Recently, as requested, I changed the output of one step to another printer, but the print is still going to the old printer.
    When I check batch job -> Steps -> specific step -> Print Specifications -> Output Device, I can see the new printer.
    But under batch job -> Steps -> select specific step -> Spool -> double-click, the spool is still going to the old printer. I am facing this problem for only one step.
    The issue has been dragging on for the past week; I have tried a couple of options with no luck.
    I would appreciate any inputs or solutions.
    Env:  ECC5.0/Solaris/Oracle
    Thanks,
    Sri.

    I finally got the solution from the ABAP team. The printer is maintained in the condition records of the variant for that particular step. Even though I assigned the printer in the batch job, it takes the printer from the condition record.
    Thanks,
    Sri.

  • Parallel batch jobs from ABAP

    Hi all -
    Any good suggestions on the simplest way to submit and track parallel batch jobs from ABAP? I have used the function modules in function group BTCH (e.g. BP_JOB_CREATE, etc.) in the past with good results, but are there any more current and simpler approaches (e.g. class-based)? Running WAS 7.0 / ECC 6.0. (A sketch of the classic approach follows below.)
    Thanks,
    Pat

    This may not be the right forum for such a query; post it on the Managing Oracle Applications forums.
    I am not aware of the GL jobs, but as long as a concurrent program is not incompatible with itself, there is no problem running multiple jobs at the same time.
    Thanks
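
    Back to the original question, a minimal sketch using the classic approach (JOB_OPEN / SUBMIT ... VIA JOB / JOB_CLOSE, then a simple poll on the job status table TBTCO for tracking). The worker report ZBTC_WORKER and its P_PACKNO parameter are hypothetical; I am not aware of a newer class-based wrapper for this.

      REPORT zsubmit_parallel_jobs.

      CONSTANTS c_jobname TYPE btcjob VALUE 'ZBTC_WORKER'.

      DATA: lv_jobcount TYPE btcjobcnt,
            lt_counts   TYPE STANDARD TABLE OF btcjobcnt,
            lv_status   TYPE tbtco-status,
            lv_open     TYPE i.

      DO 4 TIMES.                           "start four parallel background jobs
        CALL FUNCTION 'JOB_OPEN'
          EXPORTING
            jobname  = c_jobname
          IMPORTING
            jobcount = lv_jobcount.

        SUBMIT zbtc_worker WITH p_packno = sy-index  "hypothetical worker report
               VIA JOB c_jobname NUMBER lv_jobcount
               AND RETURN.

        CALL FUNCTION 'JOB_CLOSE'
          EXPORTING
            jobcount  = lv_jobcount
            jobname   = c_jobname
            strtimmed = 'X'.                "release immediately
        APPEND lv_jobcount TO lt_counts.
      ENDDO.

      DO.                                   "simple tracking loop
        CLEAR lv_open.
        LOOP AT lt_counts INTO lv_jobcount.
          SELECT SINGLE status FROM tbtco INTO lv_status
            WHERE jobname  = c_jobname
              AND jobcount = lv_jobcount.
          IF sy-subrc = 0 AND lv_status <> 'F' AND lv_status <> 'A'.
            lv_open = lv_open + 1.          "neither Finished nor Aborted yet
          ENDIF.
        ENDLOOP.
        IF lv_open = 0.
          EXIT.
        ENDIF.
        WAIT UP TO 10 SECONDS.
      ENDDO.

      WRITE: / 'All parallel jobs have finished (or aborted).'.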

  • How to display custom error message in Job log for batch processing

    Hi All,
    I am executing an R/3 report in batch mode and I want to display all the custom errors I have handled in the job log when it is executed from SM36/SM37. The custom errors are things like 'Delivery/Shipment does not exist' or others, which in online mode we can display with MESSAGE e100(ZFI) or similar and then control the program flow accordingly (e.g. exit the program or LEAVE TO TRANSACTION 'Zxxx'). But I want my program to run to completion and accumulate all the errors in the job log of the batch run.
    Can anyone tell me how I can do this?
    Thanks,
    Amrita

    Hi,
    That's what I have done from the beginning. I have written the message like this:
    MESSAGE i100(ZFI).
    I was hoping to see this message in the log, but I can't see it. Can you help me please?
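
    A minimal sketch of one way to get the errors into the job log, assuming the report itself is the job step and therefore runs with sy-batch = 'X': messages issued with the MESSAGE statement in background mode are written to the job log, so the errors can be collected during processing and only written out at the end. The message numbers in class ZFI and the error table layout are assumptions.

      REPORT zbatch_error_log.

      TYPES: BEGIN OF ty_error,
               msgv1 TYPE symsgv,
             END OF ty_error.

      DATA: lt_errors TYPE STANDARD TABLE OF ty_error,
            ls_error  TYPE ty_error.

      " ... main processing: instead of stopping, collect each problem ...
      ls_error-msgv1 = 'Delivery 80001234 does not exist'.  "example entry
      APPEND ls_error TO lt_errors.

      " ... at the very end of the report, push everything into the job log ...
      LOOP AT lt_errors INTO ls_error.
        "Type I/S messages do not abort the job but do appear in the job log (SM37)
        MESSAGE i100(zfi) WITH ls_error-msgv1.
      ENDLOOP.

      IF lt_errors IS NOT INITIAL.
        "Optional: finish with an E message so the job ends as cancelled,
        "after all individual errors have already been written.
        MESSAGE e101(zfi).
      ENDIF.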

  • Print the job from the spool in batch mode

    I want to be able to print a job from the spool in batch mode. In other words, a job currently exists that creates a report which goes to a printer.
    The user wants the report to print on two different printers. This extra printer cannot be placed in the job, since SAP only allows one printer per output. What I would like is to find an SAP program that accepts a spool name and a printer destination. If necessary, we can write a program that executes an SAP function to do the same.
    SP01, the spool transaction, does not easily adapt itself to batch execution, and I don't want to create a BDC session using it.

    Hi,
    If the domain of these fields is of type SYTIME, then adding the fields like numeric values should suffice.

  • Submit batch job dynamically using batch user ID

    Hi,
    I need to submit a background job dynamically from an ABAP program under a BATCH user ID.
    I have created a new program; users will execute the program in the foreground.
    I have an issue with the user ID: if I use my own user ID, the batch job is created successfully, but if I use the batch user ID ('BATCH_FI' in the code below) it gives an error.
    Am I doing anything wrong here?
    Code:
      " Open job
      CALL FUNCTION 'JOB_OPEN'
        EXPORTING
          jobname  = jobname
        IMPORTING
          jobcount = w_jobcount.

      SUBMIT zrufilep WITH p_file1 = sourfile
                      WITH p_file2 = destfile
                      VIA JOB jobname NUMBER w_jobcount
                      USER 'BATCH_FI'
                      AND RETURN.

      " Schedule and close the job
      CALL FUNCTION 'JOB_CLOSE'
        EXPORTING
          jobcount  = w_jobcount
          jobname   = jobname
          sdlstrtdt = sy-datum
          sdlstrttm = sy-uzeit.
    Thanks

    Can anybody tell me: if that authorization is given to the user, can he then submit any other job under a different user from the front end?
    If he can, it will conflict with the security requirements of the business. So if he must not be allowed to execute this kind of statement in any other program, would he be confined to using a different user in this program only?
    Or is there another way to restrict the user so that this authorization applies only to a single transaction code?
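
    For reference, a minimal sketch of the authorization side: scheduling a job step under another user with SUBMIT ... USER requires the authorization object S_BTCH_NAM (field BTCUNAME) for the calling user, and the program can check this explicitly before submitting. The target user BATCH_FI is taken from the code above; whether this alone resolves the error depends on the system.

      AUTHORITY-CHECK OBJECT 'S_BTCH_NAM'
        ID 'BTCUNAME' FIELD 'BATCH_FI'.
      IF sy-subrc <> 0.
        "The calling user lacks S_BTCH_NAM for BATCH_FI, so SUBMIT ... USER
        "would fail - stop with a clear message instead.
        MESSAGE 'No authorization to schedule jobs under user BATCH_FI' TYPE 'E'.
      ENDIF.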

  • How Can I Run a SQL Loader Job from a Scheduler

    How can I run a SQL*Loader job from a scheduler, so that it runs every day?

    Depends on a couple of factors.
    If you are on a UNIX platform, you can create a shell script and schedule it with cron.
    If you are on a Windows platform, you can create a batch file and schedule it with the Windows scheduler.
    Or, if you are on Oracle 9i or 10g, you could use the external table feature instead of SQL*Loader. Then you could write a stored procedure to process the external table and schedule it using the Oracle scheduler (DBMS_JOB). This would probably be my preference.

  • Cannot delete old sessions from SM35 - Batch Input...

    We have an issue: we are trying to delete old batch input sessions via SM35 and, no matter how hard we try, we cannot get rid of them. They are still in status "In Process", but that status is incorrect.
    Those sessions finished long ago and some are more than 5 years old.
    We have tried running RSBDCREO and RSBDC_REORG, but they only return a TemSe error, and running SP12 does not list any inconsistencies. Basically, we want only the sessions that ran this year to remain in the SM35 queue; we have sessions from 2004 to 2008 that we wish to delete.
    Has anybody experienced anything similar, or does anyone have any advice? Sorry, this is an environment that I have inherited and wish to clean up.
    We have checked notes 706478, 76422 and many more.
    Thanks
    Suleman

    When you try to delete, you will be asked whether you want to delete the logs too.
    Try answering 'No' here.
    Hope this helps.

  • How to trigger a job from another job?

    Dear Pals,
    I'm new to BusinessObjects Data Integrator 11.7.2.3.
    Requirement:
    I have two jobs, JOB1 and JOB2. JOB2 should be executed only after completion of JOB1. Please let me know how to meet this requirement.
    Thanks in advance.
    Regards,
    Diras

    Export the execution command for both jobs from the Web Admin batch job history page.
    This exports the job launch command to a .bat or .sh file (depending on the OS) in the %LINK_DIR%/log directory.
    You can then call these jobs from another script, in sequence.
    Or, if you want to launch Job2 only when Job1 is successful, add a script at the end of Job1 and call the exported execution command for Job2 using the exec function.

  • Do we have the option to copy all the jobs from one OEM admin to another?

    Hi Experts,
    I have a number of jobs created under one administrator account in OEM 10g, and now my requirement is to re-create these jobs under another administrator user account.
    So, is there any process in OEM where we can copy all the jobs from one user account to other user accounts?
    I guess the "Create Like" option is only helpful if you are planning to copy a single OEM job. Also, the administrator who copies the jobs would need super admin rights so that the jobs are visible to him.
    So, can someone help me out by letting me know if there is a process for doing this in a single shot?
    Thank you very much for your Help.
    Thanks,
    NAV.

    Do you need the jobs to be OWNED by the new admin, or just accessible and modifiable? You can edit the job and grant the user Full permissions in the Access tab. Otherwise, the only option is Create Like, as you mentioned. There is no batch method.

  • How to execute a job from a script?

    How can I execute a job from a script? I have 2 jobs, A and B. I want to execute job B from job A's script. How can I do that?

    Hi Kishore,
    Please refer to the links below for BODS job execution using a script:
    Executing a job by another job in BODS 4.1 using simple script
    http://scn.sap.com/community/data-services/blog/2013/12/04/executing-a-job-by-another-job-in-bods-41-using-simple-script
    Steps for executing BODS job from Unix Script with user defined global parameters
    http://scn.sap.com/community/data-services/blog/2013/09/02/steps-for-executing-bods-job-from-unix-script-with-user-defined-global-parameters
    Executing a job using batch file
    http://scn.sap.com/thread/3503338
    How to add a schedule for job2 with a condition after job 1 is finished
    http://scn.sap.com/message/14523514#14523514
    Scheduling BODS Jobs Sequentially and Conditionally
    http://scn.sap.com/docs/DOC-34648
    Thanks,
    Daya

  • Migrating BO11 jobs from one CMS to another

    I need to migrate about 50 jobs from a BO11 CMS on one server to another CMS on another server. Is there any way to do this in a batch? I assume I can't do a restore because I don't want to disturb the existing jobs on the target CMS.

    You can use the Import Wizard for this. Just select the reports you want to transfer and do not forget to select the option to include the instances in the transfer. The jobs are defined as report instances with status "Recurring".
    Regards,
    Stratos

  • Cannot see running jobs from package

    I have a problem with parallel running jobs:
    In a main job I create two other jobs that run immediately. Those two parallel jobs (2 loads from different databases) have to finish before I run the next operation (processing the loaded data). The problem is that, after starting those 2 parallel jobs, I cannot see them by selecting from ALL_SCHEDULER_RUNNING_JOBS immediately after creating them in the package. If I look at ALL_SCHEDULER_RUNNING_JOBS from an anonymous statement, I can see all my running jobs. Let's sum it up:
    1/ Start of the main job
    2/ Run the 2 immediately created jobs (load data)
    3/ Check in a loop whether the jobs created in step 2 are still running
    3.1/ Jobs are running (ALL_SCHEDULER_RUNNING_JOBS check) - sleep for a while - this never happens: I can't see any running jobs from the select executed in the package, but I can see them in the anonymous statement
    3.2/ Jobs finished - start processing the loaded data
    Can somebody help me with this task?
    Thanks alot!
    Jakub

    Hi,
    There is no reason a job should be visible from an anonymous block but not from inside a job. There are two things that may be happening here.
    - Jobs scheduled to run immediately may not start running as soon as they are created/enabled; you may need to wait a bit before they start running (they will appear in ALL_SCHEDULER_JOBS immediately, but maybe not in ALL_SCHEDULER_RUNNING_JOBS immediately).
    - You may be running into privilege issues. Is the user that executes the anonymous block the same as the user the job runs as (the job's schema)? If not, maybe the job user does not have privileges to see the job (you can grant ALTER on the job to the user to ensure this).
    Can you see the jobs in the ALL_SCHEDULER_JOBS view from within the job with status RUNNING? If you can see jobs in ALL_SCHEDULER_JOBS as RUNNING but not in ALL_SCHEDULER_RUNNING_JOBS, then this is a bug of some sort.
    Thanks,
    Ravi.
