Jobs running at a different time than scheduled

Hi All,
I have a 10g (10.2.0.2) database on Linux; it is in Austin.
Recently, scheduled jobs have been running at times different from their scheduled time.
The time in EM and the output of the Linux date command also differ:
$ date
Wed Oct 31 03:59:01 CDT 2007
while at the same moment EM shows 2:59:42 AM.
Some of the jobs are failing as well.
I checked the following
SQL> select sessiontimezone from dual;
SESSIONTIMEZONE
-05:00
SQL> select dbtimezone from dual;
DBTIME
+00:00
Can anybody please suggest a solution?
Thanks in advance,
saf

There are a number of threads in this forum which deal with this type of issue:
Re: changing SO hour - problem to run a scheduled job
Did it start on Oct 28th? It may be a DST issue. Look up MetaLink note 402742.1.
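If one side (OS or EM/JVM) is missing the 2007 US DST patch, it falls back to standard time on the old date (last Sunday of October, Oct 28 in 2007) while the patched side stays on daylight time until Nov 4 - exactly the one-hour 3:59 vs 2:59 gap above. A quick sketch of what the correct offset should have been, using Python's zoneinfo (an illustration, not part of the original thread):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+, uses the system tz database

central = ZoneInfo("America/Chicago")  # Austin is US Central time

# Under the post-2005 US rules, DST in 2007 ran until Nov 4,
# so Oct 31 03:59 is still CDT (UTC-5).
t = datetime(2007, 10, 31, 3, 59, tzinfo=central)
print(t.tzname())                         # CDT
print(t.utcoffset().total_seconds() / 3600)  # -5.0

# An unpatched system applying the old rules (DST ends Oct 28 in 2007)
# would already be on CST (UTC-6), one hour behind -- matching
# EM's 2:59 against the OS's 3:59.
```

The fix described in note 402742.1 is to apply the DST/timezone patches so both sides agree on the transition date.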

Similar Messages

  • Cancelling PeopleSoft job if running longer than max time

    We have some PeopleSoft jobs that we want Tidal to automatically cancel if they are still running after 2 hours, then automatically set the job to Completed Normally so the next job in the jobstream will start running.  We tried using a Job Event triggered by "Job running longer than its maximum time", with Selected Action = "Set To Completed Normally".  The problem is that it sets the job to completed normally on Tidal, but the job continues to run on PeopleSoft.  In other words, it's not cancelling the job.
    Next, we changed the Selected Action to "Cancel/Abort".  This worked in terms of cancelling the job on PeopleSoft, but we can't figure out how to automatically set the job to Completed Normally on Tidal after the job is cancelled.  There is an Event Trigger for "Operator cancelled the job", but apparently the event isn't triggered if Tidal cancels the job.  We also tried using Event Trigger = "Job completed", but that didn't work either since apparently Tidal doesn't consider a job in Aborted status as completed.
    Anyone have any other ideas on how to automatically cancel a PeopleSoft job and then automatically set the Tidal job to Completed Normally?  Thanks.

    Hello Richard,
    I would use sacmd (Windows) or tesmcmd (Unix) to interact with the job and set it to whatever status you want:
    - First, let the event cancel/abort the job if it runs longer than expected, using the built-in Cancel action.
    - Then, from the same event, trigger another job action that runs the tesmcmd/sacmd jobset command to set the job to Completed Normally; for this you may need to create a new job that calls tesmcmd/sacmd.
    Finally, create a job action that triggers that job, which lets you override the command parameters, especially the RunID of the failed job. Hope I got that right, unless you had already figured it out.
    -rami
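    For what it's worth, the general pattern here - kill a task once it exceeds its maximum runtime, then report it so the stream can continue - can be sketched generically in Python (a hypothetical watchdog, not Tidal's sacmd/tesmcmd API):

```python
import subprocess

def run_with_max_time(cmd, max_seconds):
    """Run cmd, killing it if it exceeds max_seconds.

    Returns "completed" on a normal exit within the limit, or
    "cancelled" if the process had to be killed -- the caller can
    then mark the job "Completed Normally" so the next job starts.
    """
    try:
        subprocess.run(cmd, timeout=max_seconds, check=True)
        return "completed"
    except subprocess.TimeoutExpired:
        return "cancelled"  # subprocess.run() kills the child on timeout
```

    In Tidal itself the cancel is the built-in event action; only the follow-up status change needs the tesmcmd/sacmd call.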

  • I entered a new calendar event in iCloud on my laptop. The event showed up on my iPhone 4S, but the start and finish times on the entry in iCloud were different from the times shown on the iPhone calendar. How do I sync the times on my iPhone with iCloud?

    I entered a new calendar event using iCloud on my laptop. The event transferred to my iPhone 5S, but the start and finish times shown in iCloud on the laptop are different from the times shown on the iPhone. How do I sync the times on the iPhone with the times shown in the laptop's iCloud calendar?

    Turn off time zone support on the phone.

  • DBMS_JOB not running at the scheduled time, but runs with the .run function

    Hi all,
    I am running a job every 15 minutes which should delete rows created more than 30 minutes ago.
    ========================================
    Jobs Submitted as -
    begin
    sys.dbms_job.isubmit(job => 202,what => 'del_test_info_p;',next_date =>sysdate,interval => 'sysdate + 15/1440');
    commit;
    end;
    ==================================
    Procedure that is running in Job -
    CREATE OR REPLACE procedure del_test_info_p is
    cnt number;
    begin
    select count(*) into cnt from test where ((created_at+30/1440)<=systimestamp);
    update jobcount set cnt=cnt+1 ;
    delete from test where ((created_at+30/1440)<systimestamp);
    update jobcount set cnt=cnt+1 ;
    commit;
    dbms_output.put_line (cnt ||' ROWS DELETED');
    end del_test_info_p;
    =====================================
    PROBLEM - The job runs every 15 minutes according to USER_JOBS, but rows whose creation time is more than 30 minutes before SYSTIMESTAMP are not being deleted from the test table, and the views have no logs of any job run.
    I checked the queries and ran the job manually using dbms_job.run; that gives the correct output and deletes the rows.
    Please suggest where the problem is and how I can correct it.
    With Regards
    Amit Nanote

    Hi All,
    I have found a solution for this: don't use SYSTIMESTAMP in DML statements inside a scheduled job.
    There is a statement in procedure del_test_info_p
    delete from test where ((created_at+30/1440)<systimestamp);
    and the use of SYSTIMESTAMP there was preventing the procedure from executing.
    Create the procedure as:
    CREATE OR REPLACE procedure del_test_info_p is
    tstamp timestamp;
    begin
    select systimestamp into tstamp from dual;
    delete from test where ((created_at+30/1440)<tstamp);
    commit;
    end del_test_info_p;
    Thank You.
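    As a side note, the 30/1440 in these predicates is Oracle date arithmetic: DATE + n adds n days, so 30/1440 of a day is 30 minutes. The cutoff logic in Python, just to make the arithmetic explicit (an illustration, not part of the fix):

```python
from datetime import datetime, timedelta

# Oracle's "created_at + 30/1440 < systimestamp" means
# "created_at is more than 30 minutes in the past".
assert timedelta(days=30 / 1440) == timedelta(minutes=30)

def is_expired(created_at: datetime, now: datetime) -> bool:
    """True when the row is older than 30 minutes and may be deleted."""
    return created_at + timedelta(minutes=30) < now

now = datetime(2007, 10, 31, 12, 0, 0)
print(is_expired(datetime(2007, 10, 31, 11, 0, 0), now))   # True: 60 min old
print(is_expired(datetime(2007, 10, 31, 11, 45, 0), now))  # False: 15 min old
```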

  • Background job running for a long time - trace is not capturing.

    Hi guys,
    My background job sometimes runs for more than 4 hours (normally it completes within one hour).
    I tried to trace it with ST12 and executed the program, but I did not get a complete trace report covering the 4 hours.
    What is the best way to trace long-running programs like this? Is there any alternative?
    Regards
    Girish

    Giri,
    There is no need to trace a program for the full 4 hours. There is usually just one piece of rogue code causing the problem, and it can be identified quite quickly.
    If I were you I would just watch the job run via SM50, to see if it lingers on any database tables. Also check if the memory is filling up - this is the sign of a memory leak.
    You can also try stopping the program (via SM50 debugging) at random points, to see what piece of code it is stuck in.
    The issue should reveal itself fairly quickly.
    Tell us what you find!
    cheers
    Paul

  • Job starts at an unscheduled time

    Hi All,
    I have created a job which runs fine for some days, but sometimes it runs at a different time than I scheduled. Here is the job script; it is supposed to run every day at 5:00 PM.
    Please advice on the same.
    DECLARE
    X NUMBER;
    BEGIN
    SYS.DBMS_JOB.SUBMIT
    ( job => X
    ,what => 'mypackage.myproc(''/u000/app/oracle/myfol'', ''NL''||TO_CHAR(SYSDATE-1, ''YYMMDD'')||''.TXT'', TO_CHAR(SYSDATE, ''YYYYMMDD''));'
    ,next_date => to_date('10/04/2009 17:00:00','dd/mm/yyyy hh24:mi:ss')
    ,interval => 'TRUNC(SYSDATE + 1) + 17/24'
    ,no_parse => TRUE
    );
    SYS.DBMS_OUTPUT.PUT_LINE('Job Number is: ' || to_char(X));
    COMMIT;
    END;

    "But sometimes it runs at a different time than what I scheduled." What time are you seeing it run? Are we talking about major changes (i.e., it's running at 3 AM), or is it running a few minutes after 5 PM?
    Do you have other jobs? What is JOB_QUEUE_PROCESSES set to? Any chance there are more jobs that want to run at 5 than JOB_QUEUE_PROCESSES allows to run simultaneously at 5 so your job is getting queued?
    Is the clock on the server correct? Is the clock drifting over time?
    Justin
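    One more thing worth knowing about the script itself: the interval 'TRUNC(SYSDATE + 1) + 17/24' is re-evaluated each time the job runs, and it always resolves to 5:00 PM of the following day. The date arithmetic in Python terms (an illustration, nothing Oracle-specific):

```python
from datetime import datetime, timedelta

def next_run(evaluated_at: datetime) -> datetime:
    """Oracle's TRUNC(SYSDATE + 1) + 17/24: midnight of the next day
    plus 17 hours, i.e. tomorrow at 17:00."""
    tomorrow_midnight = (evaluated_at + timedelta(days=1)).replace(
        hour=0, minute=0, second=0, microsecond=0
    )
    return tomorrow_midnight + timedelta(hours=17)

# Evaluated right after a 5 PM run, the next run is 5 PM the next day:
print(next_run(datetime(2009, 4, 10, 17, 0, 5)))  # 2009-04-11 17:00:00

# But if a queue backlog delays a run past midnight, "tomorrow" shifts
# too, and a whole day's run can be skipped -- one way a dbms_job can
# drift from its schedule.
print(next_run(datetime(2009, 4, 11, 0, 30, 0)))  # 2009-04-12 17:00:00
```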

  • Backing up iPhoto '08 for the first time (different than last time I did it)

    The last time I backed up my photo library was before I got iPhoto '08. I know I should have done it long ago and backed things up at least once a week, but I got busy. Since my last backup I have also upgraded to Leopard; I did back up my hard drive when I did that, but I also had the advantage of an IT person personally helping me.
    Today, when I went to back up my library, things were different. Rather than the old "iPhoto Library" folder that used to be in the Pictures section, I just see a single file named "iPhoto Library", and when I click it, it launches iPhoto.
    So my question is how do I back up my library now, and what ever happened to the "Original" and "Modified" folders? What I would really like to do is save my current library on an external with everything intact. Events, Albums, the order I added them in, etc. I want to clear iPhoto out completely and then re-import my current library.
    I should probably mention that on the external hard drive right now is a back up of my old library from before my upgrade to Leopard and it still has the old setup of the iphoto folder and then inside that is the "Original" and "Modified" folders. How should I handle this, just delete it?

    With iPhoto 7 (iLife 08) the old iPhoto Library Folder is now a Package File. This is simply a folder that looks like a file in the Finder. The change was made to the format of the iPhoto library because many users were inadvertently corrupting their library by browsing through it with other software or making changes in it themselves.
    Want to see inside? Go to your Pictures Folder and find the iPhoto Library there. Right (or Control-) Click on the icon and select 'Show Package Contents'. A finder window will open with the Library exposed.
    Look familiar? Standard Warning: Don't change anything in the iPhoto Library Folder via the Finder or any other application. iPhoto depends on the structure as well as the contents of this folder. Moving things, renaming things or otherwise making changes will prevent iPhoto from working and could even cause you to damage or lose your photos.
    What I would really like to do is save my current library on an external with everything intact. Events, Albums, the order I added them in
    Simply copy the iPhoto Library from the Pictures Folder to the External. That's it.
    I want to clear iPhoto out completely and then re-import my current library.
    I'm afraid that doesn't make a lot of sense to me. If you back up a library then trash it, then restore the selfsame library, how does that constitute a “clear out”?
    I would delete nothing until I had the Maintenance project finished.
    As an Fyi
    There are many, many ways to access your files in iPhoto:
    *For Users of 10.5 and later*
    You can use any Open / Attach / Browse dialogue. On the left there's a Media heading, your pics can be accessed there. Command-Click for selecting multiple pics.
    (Note: this is not a Finder window; it's the dialogue you get when you go File -> Open.)
    You can also access the Library from the New Message window in Mail.
    *For users of 10.4 and later* ...
    Many internet sites such as Flickr and SmugMug have plug-ins for accessing the iPhoto Library. If the site you want to use doesn't, the following approaches will also work:
    To upload to a site that does not have an iPhoto Export Plug-in the recommended way is to Select the Pic in the iPhoto Window and go File -> Export and export the pic to the desktop, then upload from there. After the upload you can trash the pic on the desktop. It's only a copy and your original is safe in iPhoto.
    This is also true for emailing with Web-based services. However, if you're using Gmail you can use iPhoto2GMail
    If you use Apple's Mail, Entourage, AOL or Eudora you can email from within iPhoto.
    If you use a Cocoa-based Browser such as Safari, you can drag the pics from the iPhoto Window to the Attach window in the browser.
    *If you want to access the files with iPhoto not running*:
    For users of 10.6 and later:
    You can download a free Services component from MacOSXAutomation which will give you access to the iPhoto Library from your Services Menu. Using the Services Preference Pane you can even create a keyboard shortcut for it.
    For Users of 10.4 and later:
    Create a Media Browser using Automator (takes about 10 seconds) or use this free utility Karelia iMedia Browser
    Other options include:
    1. *Drag and Drop*: Drag a photo from the iPhoto window to the desktop; iPhoto will make a full-sized copy of the pic there.
    2. *File -> Export*: Select the files in the iPhoto Window and go File -> Export. The dialogue will give you various options, including altering the format, naming the files and changing the size. Again, producing a copy.
    3. *Show File*: Right- (or Control-) Click on a pic and in the resulting dialogue choose 'Show File'. A Finder window will pop open with the file already selected.
    You can set Photoshop (or any image editor) as an external editor in iPhoto. (Preferences -> General -> Edit Photo: Choose from the Drop Down Menu.) This way, when you double click a pic to edit in iPhoto it will open automatically in Photoshop or your Image Editor, and when you save it it's sent back to iPhoto automatically. This is the only way that edits made in another application will be displayed in iPhoto.
    Regards
    TD

  • Running more than one Time Machine backup drive

    I have a backup drive assigned as my office's Time Machine backup drive.
    I need to set up an off-site backup drive too. I know that Time Machine will only allow a connection to one backup destination at a time.
    What problems could I encounter if I set up an off-site drive as another Time Machine drive, disconnect the on-site drive, and connect the off-site drive to back up?
    Thanks

    That will work just fine, other than the hassle of having to tell Time Machine every time you swap.
    Give the drives at least slightly different names (that's for you; Time Machine knows they're different).
    Each set of backups will be complete and independent. When you do a backup to either one, it will contain all the changes you've made since the last backup *to that drive*.
    The first backup after a swap may take a bit longer than usual, as there may be more changes to "catch up" with, but otherwise there should be no troubles.

  • Job running longer than usual.

    Hi Sappers!!
    We have a job that runs every day at 1:00 AM for about 2-3 hours. The job that started at around 1:00 AM today is still running; it's been over 14 hours now.
    Job Description:
    Extract of open A/R items from table BSID with corresponding data from customer master records, billing document data, and cash application data. Sent to the GetPaid application database daily.
    Would appreciate any pointers on this issue!
    Thanks for your time!

    VR,
    Since you haven't mentioned a t-code, I'm assuming it's a custom report/program.
    Far too many things could have gone wrong with a custom program to warrant a mention in this forum. However, suffice it to say that a suitably smart ABAPer can try and debug the batch job: go to t-code SM50, select the PID for your job, and choose "Program/Mode --> Program --> Debugging".
    You can then analyze what is stopping the job or causing it to run so slowly.
    Trust this helps.
    Remember to assign points if found useful.
    Regards
    Gulshan

  • LabWindows/CVI timer runs faster than system time in multithreaded application on Win 2000

    The same application runs well on Win NT

    This is a rarely seen and unconfirmed problem with the Timer function. I have posted some information, along with a workaround from another Developer Exchange user, at the following link:
    http://exchange.ni.com/servlet/ProcessRequest?RHIVEID=101&RPAGEID=135&HExpertOnly=&HFORCEKWTID=11348:5&HOID=506500000008000000AE1F0000

  • Notification time zone different than calendar time zone

    On the iPad...
    Notification of a calendar event is displayed in Eastern time zone while the calendar event is scheduled in Central time zone.
    Notifications and Calendar are both set to automatic.
    How does one get Notifications to display Calendar events as Central time zone?
    The MacBook Pro displays both Notifications and Calendar events as Central time zone.

    "srvctl setenv database -d fawms -t TZ=CST6CDT" but it didn't work. The DB is taking the timezone CST5CDT.
    As long as you set the timezone that way, it should be good.
    Just remember, though, that if you logged in as SYS you will not see the timezone change from your command at the CRS level.

  • Identify Jobs that are not running on their Scheduled date time

    I have 29 scheduled jobs that run at different intervals. Some run once a day, a few run hourly, and others run on Sundays.
    I was working on a query that would let me know if a particular job did not run on its scheduled date and time.
    SELECT * from all_scheduler_jobs WHERE state <>'DISABLED'; will give me a list of all jobs that I have to monitor and that are not in the disabled state. But how can I verify that the jobs are running at their scheduled date time?
    Any help please? I need to create a view of all such jobs and then plan to send an alert so that appropriate action can be taken and it is assured that all important jobs run as per schedule.
    Thanks.

    Hi,
    I can see 2 approaches.
    - for jobs that have run but ran very late, query dba_scheduler_job_run_details and filter on the difference between req_start_date and actual_start_date
    - for jobs that should have run but didn't, query DBA_SCHEDULER_JOBS for jobs in the SCHEDULED state whose next_run_date is in the past
    Hope this helps,
    Ravi.

  • Making FCSvr run more than one job at a time?

    Hello everyone,
    How do I make FCsvr run more than one job at a time? I'm watching my Job Progress window and only one clip is being worked on at a time (creating thumb, poster, etc.) Also, when I look at the QMaster Admin it shows only one core working (even though I have all eight cores set for compressing).
    Thanks for any help you can give me
    Micheal

    It's actually too long to explain fully in this post, so I recommend you read the whole Qmaster manual to understand the way it works. Just to help you along:
    1) What you see in the Qmaster admin is a cluster, which can have several instances of Compressor within it to run several processes at the same time.
    2) If you do have more than one instance of Compressor in the cluster (let's say 4), you have two options depending on the Compressor setting you are using. If Allow Job Segmenting is ON, all 4 instances are used to transcode the clip, and in Batch Monitor you can see the segments into which the file is partitioned so they can be transcoded separately (in this case 8 segments of video and 1 of audio). If the option is OFF, only one instance runs per file, but you can have up to 4 files transcoding at the same time.
    3) Make sure you have the Qmaster cluster selected in the Compressor option in the preferences panel of the FCS admin window.
    4) The number of cores used to transcode files is managed by Qmaster, but I can tell you that on an 8-core Mac Pro with 4 instances running, around 90% of the CPU goes to Compressor when you have 4 jobs running at the same time, or 1 job with segmenting ON. With segmenting OFF and only 1 job running, I saw over 50% of the system CPU on that process.
    5) Probably the most important point: I really hope, for your own good, that you have Qmaster and FCS running on two different machines. As you can see, Qmaster uses serious CPU during transcoding.
    Hope this helps

  • Can an EDQ job preserve rather than overwrite its previous outputs

    We have a job running on a scheduled basis that outputs data quality issues to a spreadsheet.
    We would like to keep a history of specific error details (e.g. which records had which data quality issues) over time (i.e. more details than the simple measures that can be published to the Dashboard for seeing trends over time).
    I believe we could do this kind of thing using an export to an external database table (or even a file in the EDQ server's landing area) using the 'Append to current data' option however is that a recommended way to go or is there any alternative e.g. is it possible for the job to write to
    (a) a spreadsheet/file with a different name each time it runs (e.g. suffixed with the run timestamp) or
    (b) an EDQ staged data area without deleting what was previously in the table (we could include a run timestamp column to distinguish the runs).
    Thanks, Nik

    Thanks for the update, Mike, and for reminding me of that option (I hadn't been using run labels until now because they are not compatible with Results Book exports to spreadsheets).
    I guess in that latter case we would have to use the command line utility runopsjob in conjunction with a scheduled script (run at the operating system level) that changes the -runlabel argument passed to it.
    Thanks, Nik

  • SQL Agent job running a DTS package in SQL 2005 is unable to run if the job owner is not logged onto the server

    I am currently working on a SQL Server upgrade project for a client, converting old DTS packages to SSIS. For a few of the packages, however, no conversion is allowed, so I have to run them using DTS legacy components in SQL 2005 on a Windows 2003 server.
    These packages use CAPICOM via an ActiveX script to envelope connection-string data for security. Consequently I have to register capicom.dll for the job owner (which executes the job via proxy) and install the private and public key files via Internet Explorer. I also do this for my own account so I can test the package.
    I have created a SQL Server Agent job to execute the package. We have a schedule account which is a local admin on the server and sysadmin within SQL Server. This account is used to create a credential, and a proxy for the CmdExec subsystem is then created from that credential. A CmdExec job step is added to the job, with the path of the cmd file that calls the DTS package entered in the command window.
    Finally, a recurring schedule is added to execute the job every 5 minutes.
    If I am logged in to the server with the schedule account, the job runs successfully, and I can also run the command file manually by double-clicking it; the DTS package runs fine. However, once the schedule is set up and I log off the server, log onto my development machine with my normal account, and connect to that instance in SQL Server, I see the job failing with an ActiveX error in the package. From experience with this package, that ActiveX error occurs when the user executing it has not registered capicom.dll - which has already been done for the schedule account, since the job runs while that account is logged in on the server.
    It almost seems as though the job will only run if the schedule account is logged on. If I log directly onto the server with my own account, I can manually execute the package via the cmd file, which indicates that capicom.dll is registered under my account. Yet if I run the job in SQL Server while logged in under my account (using the schedule-account proxy), the job fails.
    Does anybody have any idea why this may be happening? Any ideas would be much appreciated.

    Run the job SSIS step under a proxy account that is derived from the domain account, non-expiring password and has been set to have all the necessary rights.
    How to is here: http://www.mssqltips.com/sqlservertip/2163/running-a-ssis-package-from-sql-server-agent-using-a-proxy-account/
    Arthur
    MyBlog
    Twitter
