Sequence Periodic Jobs

Hello All,
I have 3 jobs that need to run one after the other:
Job A, Job B and Job C.
When Job A is complete, Job B needs to trigger, and when Job B is done, Job C.
I have set up the jobs in SM36. First I set up Job A with a time and date, scheduled to run every day as a periodic job.
For Job B, I clicked the "Start after job" option and set it to start after Job A, and Job C after Job B.
However, the problem is that it all works fine only on the first day: all jobs ran and everything looked good.
But from the second day onwards Job B and Job C are not running, while Job A completes every day.
Any suggestions on why this happens and what can be done to resolve it?
Thanks,
Ster

Hi,
I think if you schedule Job B after Job A, there is no need to make Job B periodic: logically, if Job A is periodic, then Job B will become periodic as well.
For documentation you can go through this link:
[http://help.sap.com/saphelp_erp60_sp/helpdata/EN/c4/3a7f87505211d189550000e829fbbd/frameset.htm]
Regards,
Ankur Parab

Similar Messages

  • Drop/Create sequence using Oracle Job Scheduler

    IDE for Oracle SQL Development: TOAD 9.0
    Question: I am trying to do the following:
    1. Check if a certain sequence exists in the user_sequences table
    2. Drop the sequence if it exists
    3. Re-create the same sequence afterward
    All in a job that is scheduled to run daily at 12:00 AM.
    What I would like to know is whether this is even possible in the first place with Oracle jobs. I tried the following:
    1. Create the actual "BEGIN...END" anonymous block in the job.
    2. Create a procedure that builds a dynamic SQL string from the same "BEGIN...END" block and drops and recreates the sequence using EXECUTE IMMEDIATE.
    But I have failed on all counts. It always produces some sort of authorization error, which leads me to believe that DDL statements cannot be executed from jobs, only DML statements.
    BTW, by Oracle jobs I mean SYS.DBMS_JOB.SUBMIT, not the Scheduler.
    Please do not ask me why I need to drop and recreate the sequence. It's just a business requirement that my clients gave me. I just want to know if it can be done using jobs. If not, I would like to know if there are any workarounds possible.
    Thank you.

    >
    Please do not ask me why I need to drop and recreate the sequence. It's just a business requirement that my clients gave me. I just want to know if it can be done using jobs. If not, I would like to know if there are any workarounds possible.
    >
    Well, I won't ask you then, but can you ask your clients why on earth they would want that?
    Do they know that doing DDL 'on the fly' will invalidate the dependent objects?
    Your best shot is to reset the sequence instead. And you could do it in a job, yes, as long as its interval falls within some maintenance window (no active users).
    Regarding resetting a sequence, you, (and your clients) should read this followup:
    http://asktom.oracle.com/pls/apex/f?p=100:11:0::::P11_QUESTION_ID:1119633817597
    (you can find lots more info on sequences and jobs by doing a search from the homepage http://asktom.oracle.com)
    Regarding the authorization errors: your DBA should be able to grant you the necessary privileges.
    But in the end, this is something I'd rather not see implemented on a production system...
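    For what it's worth, DDL can run from a DBMS_JOB job, with the caveat that EXECUTE IMMEDIATE in a definer's-rights procedure ignores privileges granted through roles; the CREATE/DROP privileges must be granted directly to the job owner, which is the usual cause of the authorization errors described above. A minimal sketch (the sequence name my_seq and the procedure name are made up for illustration):

```sql
-- Definer's-rights procedure; CREATE SEQUENCE must be granted
-- directly to the owner, not via a role, or ORA-01031 is raised.
CREATE OR REPLACE PROCEDURE recreate_my_seq AS
  v_count PLS_INTEGER;
BEGIN
  SELECT COUNT(*) INTO v_count
    FROM user_sequences
   WHERE sequence_name = 'MY_SEQ';
  IF v_count > 0 THEN
    EXECUTE IMMEDIATE 'DROP SEQUENCE my_seq';
  END IF;
  EXECUTE IMMEDIATE 'CREATE SEQUENCE my_seq START WITH 1';
END;
/

-- Submit with DBMS_JOB to run daily at midnight.
DECLARE
  v_job BINARY_INTEGER;
BEGIN
  DBMS_JOB.SUBMIT(
    job       => v_job,
    what      => 'recreate_my_seq;',
    next_date => TRUNC(SYSDATE) + 1,
    interval  => 'TRUNC(SYSDATE) + 1');
  COMMIT;
END;
/
```

    Whether any of this is a good idea on a production system is, as noted above, a separate question.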

  • How to avoid Sequence Gapping in ODI Interface

    I have an employee table and a job map table in Oracle. Based on the employee type/group and job type/group, I need to pick the right sequence, generate an id,
    and update the employee id column of the employee table using ODI.
    employee table                job table
    name   type   group   id     type   group   sequencename   seqmin   seqmax
    pat    1      1              1      1       seq1           1        10
    dan    1      2              1      2       seq2           30       40
    john   1      3              1      3       seq3           20       100
    When I select the sequence using an IF condition, CASE statement or DECODE and call the sequence <%=snpRef.getObjectName( "L" , "My_SEQ" , "D" )%>.nextval,
    the sequences create gaps on every call, as each sequence is incremented internally even for the non-matching rows. How can I get rid of these gaps?
    In the Oracle database we call functions in the CASE condition; these functions contain the seq.nextval code, and the unwanted incremental gapping is avoided.
    But how can we get this in ODI?
    Thanks,
    Vikram

    I am facing this issue when I execute on the source or staging area. When I try to execute on the target, ODI
    doesn't allow me to execute and gives the following warning:
    "A mapping executed on the target cannot reference source columns. Move this mapping on the source or the staging area. Target Column Employeeid"
    In my case I am using IKM Oracle Incremental Update.
    The source datastores are the employee and job tables, and the target is a copy of the employee table (as I need to update the employee id column with sequence numbers by picking the right sequence from the job table's sequence name column).
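    For reference, the database-side technique described above (a function called inside the CASE condition, so only the matching branch's sequence is ever incremented) could be sketched like this; the function name next_emp_id is made up for illustration:

```sql
-- Hypothetical wrapper: NEXTVAL fires only when the function is
-- actually called, so non-matching branches never touch a sequence.
CREATE OR REPLACE FUNCTION next_emp_id(p_seq_name IN VARCHAR2)
  RETURN NUMBER
AS
  v_id NUMBER;
BEGIN
  -- SIMPLE_SQL_NAME guards against injection via the name column
  EXECUTE IMMEDIATE
    'SELECT ' || DBMS_ASSERT.SIMPLE_SQL_NAME(p_seq_name)
    || '.NEXTVAL FROM dual'
    INTO v_id;
  RETURN v_id;
END;
/
```

    In the ODI mapping the call would then be something like next_emp_id(JOB_TABLE.SEQUENCENAME), executed on the staging area or target.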

  • Background job schedule (emergency)

    Hi all ,
    I have scheduled a background job in the Production system at 11 AM. The job is now in Running status, but I wonder why, even before the status changed to Completed, the
    LAST RUN column was already filled with the same date and time, 11 AM.
    Please find the job log details below; I am pasting the last few lines of the job log:
    Jun 5, 2008 2:15:22 PM com.virsa.cc.xsys.riskanalysis.AnalysisEngine performActPermAnalysis
    INFO:  Job ID:6 : Before GC => memory usage: free=481M, total=1009M
    Jun 5, 2008 2:15:22 PM com.virsa.cc.xsys.riskanalysis.AnalysisEngine performActPermAnalysis
    INFO:  Job ID:6 : Analysis starts: HTREMGMT2
    I would like to know whether the job is running or terminated.
    What is the sequence of job statuses in CC 5.2?
    Kindly do the needful.

    Hi Amol,
    Thank you very much for your quick response.
    Please find the job log details below
    (last few lines):
    Jun 5, 2008 3:07:12 PM com.virsa.cc.xsys.riskanalysis.AnalysisEngine performActPermAnalysis
    INFO:  Job ID:6 : 464 out of 818 (56%) done
    Jun 5, 2008 3:07:12 PM com.virsa.cc.xsys.bg.BgJob setStatus
    INFO: Job ID: 6 Status: Running
    Jun 5, 2008 3:07:12 PM com.virsa.cc.xsys.riskanalysis.AnalysisEngine performActPermAnalysis
    INFO:  Job ID:6 : Before GC => memory usage: free=347M, total=1009M
    Jun 5, 2008 3:07:12 PM com.virsa.cc.xsys.riskanalysis.AnalysisEngine performActPermAnalysis
    INFO:  Job ID:6 : Analysis starts: MACTUSER17
    Jun 5, 2008 3:07:21 PM com.virsa.cc.xsys.meng.ObjAuthMatcher <init>
    FINEST: ObjAuthMatcher constructed: 0ms, #singles=970, #ranges=0, #super=0
    Jun 5, 2008 3:07:22 PM com.virsa.cc.dataextractor.bo.DataExtractorSAP getObjPermissions
    FINEST: getObjPermissions: elapsed time=891ms
    Can you tell me the job status sequence?

  • How to stop process chain job in middle

    Hi,
    I have designed a process chain to trigger the following sequence of jobs:
    Step 1 Switching from Transactional cube to Basic cube and Vice versa.
    Step 2 Loading data from Transactional cube to basic cube.
    Step 3 Buffering the Scorecards.
    My question is about Step 1,
    where I am using an ABAP program as the process type. The program's function is to switch the transactional cube to a basic cube if there is a yellow request in the transactional cube. If there is no yellow request, the chain should stop at Step 1 itself. I wrote code to throw an error to stop the job, but in that case I can see a cancelled job for this process in SM37. I don't want to see cancelled jobs in SM37, since this process chain is scheduled every hour. Is there any way to stop this job so that it does not proceed to the successive processes? Please give me some input.
    Thanks in advance.
    Best Regards
    Shiva

    Hi Shiva,
    This is what we have done here:
    1. Change the ABAP process type from "Process ends non-specifically" to "Process ends successful or incorrect" via Settings => Maintain Process Type.
    2. Copy the SE37 function RSPC_ABAP_FINISH to a Z-function. Add the state as an input parameter and pass it to l_s_log-state and i_state (in the function call).
    3. Now, in your program, if you want to execute the next step, call this Z-function with status 'F'; otherwise call it with 'R', and pass your ABAP variant name as the other input.
    Let me know if you need more help.
    Regards,
    RB
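    A rough ABAP sketch of step 3, assuming the Z-copy is named Z_RSPC_ABAP_FINISH and exposes the variant and state as parameters (the function name and parameter names here are illustrative):

```abap
* Step 3 sketch: finish the ABAP process without cancelling the job.
* 'F' lets the successor processes run; 'R' ends the process "red",
* so the chain stops but no cancelled job appears in SM37.
IF lv_yellow_request_exists = abap_true.
  lv_state = 'F'.
ELSE.
  lv_state = 'R'.
ENDIF.

CALL FUNCTION 'Z_RSPC_ABAP_FINISH'
  EXPORTING
    i_variant = p_variant   " ABAP process variant name
    i_state   = lv_state.
```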

  • How to limit operators swapping the discrete job sequence on the shop floor

    Our planner prioritizes and sequences the jobs released to the shop floor and posts them to the dispatch board. The current issue is that an operator can easily swap the sequence on the dispatch board, and we are unable to enforce the priority and sequence set by the planner.
    Can someone help me with this issue, so that even if an operator swaps jobs on the dispatch board, the system will not allow them to perform the WIP transaction and will mandate that they follow the sequence?
    Hoping for your help.
    Thanks,
    Edited by: user604737 on Nov 10, 2010 9:56 AM

    This is a classic example of relying on computers when common sense and business policy/procedures should take over.
    There are many ways to achieve this:
    1) Write a database trigger that stops the move transaction record from being committed if prior transactions are not done.
    2) Personalize the move transaction screen to show only the appropriate jobs in the LOV for the job # field.
    But the sensible thing to do is instruct floor personnel that they should pick up jobs from a queue displayed on the floor board. And if they don't, then slap their wrists or more.
    Hope that the first 2 suggestions don't help,
    Sandeep Gandhi

  • Process chain in APO - batch jobs

    Dear Friends
    What is a process chain in APO used for?
    For instance, in the system I noticed
    there are many process chains created:
    Daily master data
    Daily transfer of forecast
    There are so many process chains created.
    What are they used for?
    Can you please give me a clear idea?
    Thanks & Regards
    Raj

    Hi
    In APO DP or SNP we have many jobs which have to follow a certain sequence. For example, only after loading the sales history can you generate the CVCs. All these jobs have to start and finish in sequence. In a process chain you can define and sequence all your jobs according to your need. You can also schedule these process chains as per your requirement.
    So I would say a process chain is a better and more flexible way of scheduling and sequencing different jobs in APO. In your case, for example, you have a daily process chain which might be loading data into different InfoObjects, which are master data.
    Thanks
    Amol

  • A workflow to burn AVCHD camcorder video and 5.1 sound to BluRay disc.

    After weeks searching through many posts by the Forum’s experts with Final Cut Pro and Compressor, I have a crude workflow that seems to work for me, and might be of interest to others.  This information I couldn’t find in the Help files of either FCP or Compressor, and would appreciate feedback on other options. Thanks to the Forum contributors who have helped me with suggestions.
    I have successfully burned a BluRay DVD of the video and audio outputs of my new Panasonic AVCHD camcorder, that is, 1920x1080 video and 5.1 surround sound,  using Final Cut Studio 2  (FCP 7.0.2, Compressor 3.5.4.) on an Intel iMac with  OS 10.6.3
    Using AUNSOFT-PAVTUBE or CLIPWRAP, I converted the camcorder’s MTS files to ProRes 422 .mov files containing 5.1 six channels of audio.
    In FCPro,  set the Sequence / Settings / Audio Outputs to 5.1 Monitoring:
         L+R Stereo,     Stereo
         Center             Dual Mono
         LFE                Dual Mono
         Ls +Rs            Stereo
    Uncheck the “Downmix…to Stereo”  in Warning box that pops up when this step is completed.
    Drag the ProRes .mov file into the time line of FCP 7:
    The next steps are important for assigning each of the six audio channels to the Dolby 5.1 configuration (L, R, C, Lfe, Ls, Rs).  This step was new to me and something I couldn’t find in the FCPro or Compressor Help file manual. If anyone can reference a page number, I would appreciate that info.
    In the FCP time line, unlink the video from the audio channels (Linked Selection) in the upper right corner of the FCP time line.
    Select each audio channel, then right click in the area of the blank column near the padlock. Assign A1, A2, A3 etc to each of the audio channels as they fit the Dolby configuration ( L+R, C, Lfe, Ls+Rs) This process is kind of clunky and it may take patience to accomplish.
    The best Forum ideas that I could find for setting up the six channel audio came from the following posts which I credit for their help:
    https://discussions.apple.com/message/9095726#9095726
    https://discussions.apple.com/message/12525373#12525373
    In FCPro, Mark In and Out points, Select In to Out
    Under the File / Share option, select the Blu-ray job, then select either "Export" or "Send to Compressor".
    (Note: at this point the Export option, which enables FCPro to burn a BluRay DVD, seems to work well, and the resulting DVD video and six channels of audio seem as good as those produced by the more complicated "Send to Compressor" option, described below.)
    If the Send to Compressor option is selected, Compressor is automatically started by this selection. At this point, do not quit FCPro, because FCPro still needs to transfer file data to Compressor.
    In the job pane of Compressor there will be two targets: H264 for BluRay and Dolby Digital.
    I deleted the Dolby Digital and replaced it with Dolby Digital Professional Auto.
    Clicking anywhere in the Sequence 1 job pane (not in the H264 or Dolby target rows) will reveal the A/V attributes:
    Under the Job Action tab, select the BluRay unit that will eventually do the burning.
    Clicking on the target H264 for BluRay located in the larger Job Pane reveals the settings I chose:
    Clicking on the target Dolby Digital Professional (Auto) reveals these settings I chose:
    I left the settings on the Bitstream tab untouched. However, on the Preprocessing tab, I set the Compression Preset to "None".
    I found that turning on the BluRay burner with a preloaded BD/RE is best to do at this time, or even better, to energize the burner at the time that the “Send to Compressor” function is activated.
    When settings are complete, select “Submit” in the lower corner of the Compressor larger pane and processing will begin, and a status pane indicating time elapsed and time remaining will appear. These estimates are not very accurate.
    I found that my 1-minute test video was initially estimated to require about two hours of processing time, but actually required only 1 hour, which is still unusually long. Based on my experience, be prepared for "overnight" processing for longer movie durations.
    This is the part of the overall process I need to understand better:  How to estimate the duration to encode and burn 1 minute of video / audio?  This 1 hour duration for 1 minute of video/audio was the same whether I SHARE-Exported to FCPro to burn DVD, or SHARE-Send to Compressor option.
    Wondering if the encoding of the six 5.1 audio channels caused the lengthy processing(?).  Perhaps settings that I made in Compressor affected time to process and burn.
    Finally, I hope this roughly written process will help someone who has been looking for the same information as I had been. I would appreciate feedback from those who have already done this: what OS are you using, what hardware are you using, what software package have you tried? And lastly, thanks to all who contribute to these Community Forums and take the time to detail their processes. You all have helped me to get this far.
    BoBo

    Go to https://discussions.apple.com/thread/4719249
    BoBo

  • Materialized View on Top of View Refresh Question

    I have a scenario where we have an MV on top of a view. How can I refresh the MV every time the view's underlying data changes?
    The MV is very small (fewer than 150 rows), and it's straight data from the view.
    Any suggestions?

    Hi dude,
    Please refer to the code below.
    Materialized View Built on View Rewritten for FAST REFRESH
    SQL> DROP MATERIALIZED VIEW scott.emp_v_MV;
    SQL> CREATE MATERIALIZED VIEW scott.emp_v_MV
    NOLOGGING
    PARALLEL
    BUILD IMMEDIATE
    REFRESH FORCE ON DEMAND
    ENABLE QUERY REWRITE
    AS
    select * from emp_v;
    SQL> truncate table mv_capabilities_table;
    SQL> exec dbms_mview.explain_mview('scott.emp_v_mv');
    SQL> set linesize 100
    SQL> SELECT capability_name,  possible, SUBSTR(msgtxt,1,60) AS msgtxt
               FROM mv_capabilities_table
               WHERE capability_name like '%FAST%';
    CAPABILITY_NAME                P MSGTXT
    REFRESH_FAST                   N
    REFRESH_FAST_AFTER_INSERT      N named view in FROM list not supported   for this type MV
    REFRESH_FAST_AFTER_INSERT      N named view in FROM list not supported for this type MV
    REFRESH_FAST_AFTER_INSERT      N view or subquery in from list
    REFRESH_FAST_AFTER_INSERT      N the detail table does not have a materialized view log
    REFRESH_FAST_AFTER_ONETAB_DML  N see the reason why REFRESH_FAST_AFTER_INSERT is disabled
    REFRESH_FAST_AFTER_ANY_DML     N see the reason why REFRESH_FAST_AFTER_ONETAB_DML is disabled
    REFRESH_FAST_PCT               N PCT is not possible on any of the detail tables in the mater
    SQL> DROP MATERIALIZED VIEW scott.emp_v_MV;
    SQL> CREATE MATERIALIZED VIEW scott.emp_v_MV
    NOLOGGING
    PARALLEL
    BUILD IMMEDIATE
    REFRESH FORCE ON DEMAND
    ENABLE QUERY REWRITE
    AS
    select * from emp;
    SQL> TRUNCATE TABLE mv_capabilities_table;
    SQL> EXEC dbms_mview.explain_mview('scott.emp_v_mv');
    SQL> SELECT capability_name,  possible, SUBSTR(msgtxt,1,60) AS msgtxt
               FROM mv_capabilities_table
               WHERE capability_name like '%FAST%';
    CAPABILITY_NAME                P MSGTXT
    REFRESH_FAST                   Y
    REFRESH_FAST_AFTER_INSERT      Y
    REFRESH_FAST_AFTER_ONETAB_DML  Y
    REFRESH_FAST_AFTER_ANY_DML     Y
    REFRESH_FAST_PCT               N PCT is not possible on any of the detail tables in the mater
    Materialized View Aggregation with Required Materialized View Logs:
    SQL> CREATE MATERIALIZED VIEW LOG ON scott.emp
    WITH SEQUENCE, ROWID (JOB, DEPTNO, SAL)
    INCLUDING NEW VALUES;
    SQL> CREATE MATERIALIZED VIEW LOG ON scott.dept
    WITH SEQUENCE, ROWID (DEPTNO)
    INCLUDING NEW VALUES;
    SQL> DROP MATERIALIZED VIEW scott.sal_dept_mv;
    SQL> CREATE MATERIALIZED VIEW scott.sal_dept_mv
               NOLOGGING
               PARALLEL
               BUILD IMMEDIATE
               REFRESH FORCE ON DEMAND
               ENABLE QUERY REWRITE
               AS
              SELECT e.job, e.deptno, sum(e.sal)
              FROM emp e,
                   dept d
              WHERE e.deptno=d.deptno
              GROUP BY e.job, e.deptno;

  • How to identify deltas

    Hi,
    create table t1(no number primary key, name varchar2(10), email varchar2(100));
    I don't have who columns(last_udpate_date, last_updated_by, creation_date, created_by) in my table t1
    I want to send records to my other team on a daily basis,
    Assume on day 1 I have 10 records in my table t1; since it is day 1, I send all 10 records.
    On day 2 I have 20 records in my table t1,
    meaning the scenarios for the 20 records are:
    a) 10 records newly added -> send only the newly added 10
    b) 10 records added + 5 records updated -> send only the newly added 10 and the 5 updated
    c) 14 records added + 3 records updated + 4 records deleted -> send the 14 newly added + 3 updated + 4 deleted

    >
    create table t1(no number primary key, name varchar2(10), email varchar2(100));
    I don't have who columns(last_udpate_date, last_updated_by, creation_date, created_by) in my table t1
    I want to send records to my other team on a daily basis,
    >
    Create a MATERIALIZED VIEW LOG on your table and let Oracle do all of the work of capturing which rows changed.
    That way you don't have to modify your table at all and the MV log will also capture DELETEs (you didn't mention how to tell the other site about rows that have been deleted).
    See CREATE MATERIALIZED VIEW LOG in the SQL Language doc
    http://docs.oracle.com/cd/B28359_01/server.111/b28286/statements_6003.htm
    >
    When DML changes are made to master table data, Oracle Database stores rows describing those changes in the materialized view log
    >
    Try this example in the scott schema:
    drop materialized view log on emp_copy;
    create table emp_copy as select * from emp;
    alter table emp_copy add constraint pk_emp_copy primary key (empno);
    create materialized view log on emp_copy
      with sequence (ename, job, mgr, hiredate, sal, comm, deptno)
      including new values;
    update emp_copy set job = 'TEST1' where empno = 7654;
    Then take a look at the MV log (MLOG$_EMP_COPY) records and you will see all of the information you were wanting to capture.
    For your use case you just need the distinct ROWID or PRIMARY KEY values from the log. Use those to grab the rows from your table and then delete the log rows.
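    Building on that, the daily extract step could be sketched roughly as follows, assuming the primary key column (empno) is captured in the log, which is the default when the table has a primary key:

```sql
-- Distinct changed keys captured since the last extract
SELECT DISTINCT empno
  FROM mlog$_emp_copy;

-- Pull the current rows for those keys...
SELECT e.*
  FROM emp_copy e
 WHERE e.empno IN (SELECT empno FROM mlog$_emp_copy);

-- ...then purge the processed log rows
DELETE FROM mlog$_emp_copy;
COMMIT;
```

    Keys that appear in the log but no longer exist in the table are your deletes.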

  • Design and save workflow diagrams using flex

    Hi,
    I am new to Flex. I need to design workflow diagrams using Flex and save the information. Could you please help me with this?
    What exactly I want is this:
    In my project we have some jobs; a job is a bunch of scripts. These jobs run on different hosts (servers). In my project we have a provision to run these jobs in order on different hosts, but not for running the scripts in a particular sequence.
    For example::
    We have a,b,c,d,e,f jobs..
    1) These jobs run in order if we select one check box; this provision is there. Order means after a runs, b should run; after b is done, c should run, and so on.
    2) There is another scenario:
       After job a runs, b and c should run in parallel, and when these 2 are done, d should run; after it is done, e, f, g should run in parallel. This is a sequence of scripts running on different hosts. For this we need to design a UI like a workflow diagram.
       In the UI we show all jobs in a data grid; the user will drag the jobs into an area and connect them with arrowed lines (to identify the order). There he connects the jobs in whichever sequence he wants them to run.
       For this I need to design a diagram and save the data from the UI. From the UI we have to capture the sequence of jobs and store the sequence in the DB.
    Could you please provide me a solution for this? I am not sure how to draw a flow diagram using Flex, capture the sequence from the UI and store the data in the DB.
    Please help me out with this issue.
    Thanks in advance.

    I am trying to create an image file with Flex and want to save it only in a particular directory, i.e., the user should not be given any option to choose the location. An AIR application uses resolvePath, where we can specify the path, but I don't know how this can be achieved for a web-based application.
    Is there any workaround for this?
    Thanks.

  • Can HP Softpaqs be installed during deployment using offline media?

    I have some systems that use HP Softpaqs to install drivers/applications that need an installer to run and there is no apparent silent command that works on the softpaqs directly.  I tried /s /silent /quiet etc. and they all still pop up a wizard.
    So we will need to use HP SDM/SSM to quietly install these application-based drivers, but the process seems to rely on connecting to a UNC path. However, we will be using offline USB deployments for these systems with no network.
    Is there any way to get this done other than manually installing these Softpaqs or making a model specific image that contains all the drivers already installed? 

    In the majority of cases I find the Softpaq exes are just self-extracting to the swsetup folder. You can sometimes get away with adding the app in as an application and using a VB wrapper.
    If they contain drivers only, add them in to deploy as normal; if they contain applications, I'd control it through the task sequence.
    Crack open your task sequence and, under the custom tasks folder, create a folder for whichever model you're imaging. Place a condition on the folder, such as a WMI query on the model, so the tasks inside the folder will only run if the condition is
    met; then place your Softpaqs in as applications with a silent install command. I blogged about this, it may help you. See what you think:
    http://www.deploymentshare.com/post/singular-task-sequence-multiple-jobs
    If you create a "softpaqs" folder within Applications and add them in as apps, you will be able to select that folder to be included in your offline media when you regenerate it, so it won't look to any UNC path at all.
    Regards
    Jonnie
    MCP, MCTS, MCITP, MCSA.. Gunning for MCSE Cloud | Please visit www.deploymentshare.com | If I help you solve your issue please mark my reply as the answer.
    The problem is that some of the Softpaqs will not install silently if you execute them directly, and if you use SSM to run them, I don't see any way of controlling the installation order.
    Some will install silently, but there are some really messy "driver" installs that install applications, make registry edits and run batch files that do other things, yet they do not document any silent commands to launch them other than using
    SSM.
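    For the model condition Jonnie mentions, the folder condition is typically entered as a WMI query in the task sequence editor; a sketch (the model string is an example only):

```
SELECT * FROM Win32_ComputerSystem WHERE Model LIKE "%HP EliteBook 840%"
```

    Tasks inside the folder then run only on machines whose reported model matches.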

  • Sending image sequence job to compressor in command line : how to set the framerate?

    Hello,
    In Compressor, when you import an image sequence, it's possible to set the frame rate and add an audio file before choosing a preset and launching the encoding task.
    I want to do the same thing from the command line. I know how to send a Compressor job from the command line, but not how to add these settings.
    In the compressor command line help, we've got this :
    Job Info: Used when submitting individual source files. Following parameters are repeated to enter multiple job targets in a batch
              -jobpath <url> -- url to source file.
                                               -- In case of Image Sequence, URL should be a file URL pointing to directory with image sequence.
                                               -- Additional parameters may be specified to set frameRate (e.g. frameRate=29.97) and audio file (e.g. audio=/usr/me/myaudiofile.mov).
    So there are also frameRate and audio parameters on the command line, but I've no idea how to write the command line with these parameters.
    Here is an example of a command line for Compressor (from Apple):
    ./Compressor -clusterid tcp://127.0.0.1:51737 -batchname myBatch -jobpath /Volumes/Source/ShortClips/NTSC24p.mov -settingpath /Users/stomper10/Library/Application\ Support/Compressor/PhotoJPEG.setting -destinationpath /Users/machinename/Movies/myDestinationFilename.mov
    Thank you for your help!
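    Going by the help text quoted in the question, the frameRate and audio parameters seem to be appended to the -jobpath value. The exact joining syntax isn't shown in the help, so the following is only a guess (paths and the setting name are illustrative):

```
./Compressor -clusterid tcp://127.0.0.1:51737 -batchname mySeqBatch \
  -jobpath "/Volumes/Source/MySequenceFolder?frameRate=29.97&audio=/Users/me/myaudio.mov" \
  -settingpath ~/Library/Application\ Support/Compressor/MySetting.setting \
  -destinationpath ~/Movies/myDestination.mov
```

    If that form is rejected, it may be worth trying frameRate=29.97 and audio=... as separate tokens after the -jobpath URL instead.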

    You can see it in the output of the sh running-config command.
    show running-config : Displays the current access point operating configuration
    Use the guest-mode SSID configuration mode command to configure the radio interface (for the specified SSID) to support guest mode. Use the no form of the command to disable the guest mode.
    [no] guest-mode .
    Here is the guideline for usage
    The access point can have one guest-mode SSID or none at all. The guest-mode SSID is used in beacon frames and response frames to probe requests that specify the empty or wildcard SSID. If no guest-mode SSID exists, the beacon contains no SSID and probe requests with the wildcard SSID are ignored. Disabling the guest mode makes the networks slightly more secure. Enabling the guest mode helps clients that passively scan (do not transmit) associate with the access point. It also allows clients configured without a SSID to associate.
    Examples
    This example shows how to set the wireless LAN for the specified SSID into guest mode:
    AP(config-if-ssid)# guest-mode
    This example shows how to reset the guest-mode parameter to default values:
    AP(config-if-ssid)# no guest-mode

  • Error in sequence jobs run in backround

    Hi All,
    I am getting a problem while submitting background jobs in sequence.
    I need to execute the jobs in a loop.
    I have put in this logic:
    * loop over the number of jobs
    CALL FUNCTION 'JOB_OPEN'
      EXPORTING
        jobname   = l_f_jobname
        sdlstrtdt = l_r_tbtcjob-strtdate
        sdlstrttm = l_r_tbtcjob-strttime
      IMPORTING
        jobcount  = l_f_jobcount.
    SUBMIT (g_c_prog_arch_control)
      VIA JOB l_f_jobname NUMBER l_f_jobcount
      WITH s_vbeln IN s_vbeln
      WITH p_jobnam = l_f_jobname
      AND RETURN.
    l_r_tbtcjob-strtdate = sy-datum.
    l_r_tbtcjob-strttime = sy-uzeit + ( 60 * p_wait ).
    CALL FUNCTION 'JOB_CLOSE'
      EXPORTING
        jobcount  = l_f_jobcount
        jobname   = l_f_jobname
        strtimmed = 'X'.
    But with this, the status of the job is not changed to Finished. My concern is how to pass the start date and time each time, as on the first execution the start date is sy-datum.
    thanks

    Do not fill SDLSTRTDT and SDLSTRTTM in the JOB_OPEN call (see the documentation); fill them in the JOB_CLOSE call instead. And do not fill STRTIMMED in the JOB_CLOSE call, because you either want to start immediately or at a predefined date/time.
    Thomas
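    Applied to the snippet in the question, that advice would make the calls look roughly like this (variable names taken from the question; p_wait is assumed to be the delay in minutes):

```abap
* JOB_OPEN only names the job; no start parameters here.
CALL FUNCTION 'JOB_OPEN'
  EXPORTING
    jobname  = l_f_jobname
  IMPORTING
    jobcount = l_f_jobcount.

SUBMIT (g_c_prog_arch_control)
  VIA JOB l_f_jobname NUMBER l_f_jobcount
  WITH s_vbeln IN s_vbeln
  WITH p_jobnam = l_f_jobname
  AND RETURN.

* Compute the scheduled start for this iteration.
l_r_tbtcjob-strtdate = sy-datum.
l_r_tbtcjob-strttime = sy-uzeit + ( 60 * p_wait ).

* Start date/time go into JOB_CLOSE; STRTIMMED stays unset.
CALL FUNCTION 'JOB_CLOSE'
  EXPORTING
    jobcount  = l_f_jobcount
    jobname   = l_f_jobname
    sdlstrtdt = l_r_tbtcjob-strtdate
    sdlstrttm = l_r_tbtcjob-strttime.
```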

  • What are some best practices for Effective Sequences on the PS job record?

    Hello all,
    I am currently working on an implementation of PeopleSoft 9.0, and our team has come up against a debate about how to handle effective sequences on the job record. We want to fully grasp and figure out what is the best way to leverage this feature from a functional point of view. I consider it to be a process-related topic, and that we should establish rules for the sequence in which multiple actions are inserted into the job record with a same effective date. I think we then have to train our HR and Payroll staff on how to correctly sequence these transactions.
    My questions therefore are as follows:
    1. Do you agree with how I see it? If not, why, and what is a better way to look at it?
    2. Is there any way PeopleSoft can be leveraged to automate the sequencing of actions if we establish a rule base?
    3. Are there best practice examples or default behavior in PeopleSoft for how we ought to set up our rules about effective sequencing?
    All input is appreciated. Thanks!

    As you probably know by now, many PeopleSoft configuration/data (not transaction) tables are effective dated. This allows you to associate a dated transaction on one day with a specific configuration description, etc for that date and a different configuration description, etc on a different transaction with a different date. Effective dates are part of the key structure of effective dated configuration data. Because effective date is usually the last part of the key structure, it is not possible to maintain history for effective dated values when data for those configuration values changes multiple times in the same day.

    This is where effective sequences enter the scene. Effective sequences allow you to maintain history regarding changes in configuration data when there are multiple changes in a single day. You don't really choose how to handle effective sequencing. If you have multiple changes to a single setup/configuration record on a single day and that record has an effective sequence, then your only decision is whether or not to maintain that history by adding a new effective sequenced row or updating the existing row.

    Logic within the PeopleSoft delivered application will either use the last effective sequence for a given day, or the sequence that is stored on the transaction. The value used by the transaction depends on whether the transaction also stores the effective sequence. You don't have to make any implementation design decisions to make this happen. You also don't determine what values or how to sequence transactions. Sequencing is automatic. Each new row for a given effective date gets the next available sequence number. If there is only one row for an effective date, then that transaction will have a sequence number of 0 (zero).
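    In SQL terms, against the delivered JOB record key columns (EMPLID, EMPL_RCD, EFFDT, EFFSEQ), picking the current row for each effective date looks like this sketch:

```sql
-- Highest EFFSEQ per EMPLID/EMPL_RCD/EFFDT is the "current" action
-- for that day; a single action on a date has EFFSEQ = 0.
SELECT j.emplid, j.empl_rcd, j.effdt, j.effseq, j.action
  FROM ps_job j
 WHERE j.effseq = (SELECT MAX(j2.effseq)
                     FROM ps_job j2
                    WHERE j2.emplid   = j.emplid
                      AND j2.empl_rcd = j.empl_rcd
                      AND j2.effdt    = j.effdt);
```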
