HCM Time and Attendance - data requirements

Can anyone help with what data is required from an external time and attendance collection application?
E.g. First Name, Last Name, Employee Code, Location, Clock In, Clock Out; and the big one: is 'hours calculated' data required?
Regards
Ray
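Not an official HCM layout, but as a rough sketch of the kind of record a time-and-attendance feed usually carries (all field names below are illustrative assumptions, not a documented interface):

```javascript
// Hypothetical clock record from an external T&A system; the employee
// code is the key used to match the person in HCM. "Hours calculated"
// can be derived from the punches rather than supplied by the source.
const clockRecord = {
  employeeCode: "E01234",
  firstName: "Ray",
  lastName: "Smith",
  location: "PLANT-01",
  clockIn: "2014-03-19T08:00:00",
  clockOut: "2014-03-19T16:30:00",
};

// Derive hours worked from the two timestamps (milliseconds -> hours).
const hoursWorked =
  (new Date(clockRecord.clockOut) - new Date(clockRecord.clockIn)) / 3600000;
console.log(hoursWorked); // 8.5
```

Whether the target system expects the hours pre-calculated or derives them itself is exactly the question to settle with the HCM side before fixing the feed layout.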

You would need some JavaScript to refresh, but this works in the standard form without any extra coding.
Or you can check this:
The flag event.target.dirty should be set to true at an appropriate
point (probably the form:ready event) for Adobe Reader versions < 9.
Given below is a sample (JavaScript):
if (xfa.host.version < 9) { // handles 8.x.x and earlier
    event.target.dirty = true;
}
That is: during the initialization of a PDF form, since 'dirtyState' is a
property of Reader, Reader automatically sets that property to "false"
when the form is opened. That is why the dirty state cannot be changed
to true in certain events (such as initialize or form:ready).
In order to set the dirty state to true, we need to set it after the
initialization of the PDF form. Otherwise, the dirty state is reset to
false during initialization by default.
The solution would be to try this script:
ContainerFoundation_JS.setGlobalValue(event.target, "saveDirtyState",
true);
saveDirtyState is a property used by the ZCI code to judge whether a PDF
form is dirty or not.

Similar Messages

  • Does Cloud HCM time and attendance allow tracking by state?

    Does Cloud HCM time and attendance allow tracking by state, to enable proper payroll withholding for state taxes?
    That is, tracking time worked in each state you work in, with wages properly withheld in payroll. The time system has to be able to capture and process time in different states and pass it along; the payroll system then has to process multi-state taxes and withhold properly to ensure compliance.
    Can Oracle Fusion Time and Labor support this?

    I ended up creating a second tracked object (102 in the example below) that tracks the first object (2 in the example below) such that it is only up when the other is down. This way only one of the two objects can ever be up at a time, and you can trigger the EEM applets based on that. I have been using this for several years in a fielded system with no issues.
    ! Probe 10.10.10.10 once per second via the primary subinterface
    ip sla 2
     icmp-echo 10.10.10.10 source-interface GigabitEthernet0/0.174
     timeout 1000
     threshold 30000
     frequency 1
    ip sla schedule 2 life forever start-time now
    ! Track 2 follows the probe; track 102 is its boolean inverse
    track 2 ip sla 2 reachability
     delay down 3 up 3
    track 102 list boolean and
     object 2 not
    ! When the probe fails (track 102 comes up), move the subinterface to VLAN 175
    event manager applet BYPASS
     event track 102 state up
     action 010 cli command "enable"
     action 020 cli command "config t"
     action 100 cli command "int Gi0/0.174"
     action 101 cli command "encapsulation dot1q 175"
    ! When the probe recovers (track 2 comes up), restore VLAN 174
    event manager applet UNBYPASS
     event track 2 state up
     action 010 cli command "enable"
     action 020 cli command "config t"
     action 100 cli command "int Gi0/0.174"
     action 101 cli command "encapsulation dot1q 174"

  • Time and/or Date incorrect after moving/copying/importing

    This has been going on for years and has messed up the time/date on hundreds, maybe more, of my photos in Aperture.
    If I select my entire photo library (typically preset to 'Date - Ascending') in Aperture, I can find hundreds of photos out of order. When I look at them in 'Info', the time, and sometimes the date, has changed by several hours. I'm not sure if this is an Aperture thing or a Mac thing. Over the years I have moved and/or copied photos from smaller HDDs to larger HDDs and eventually to a NAS unit, and/or from iPhoto to Aperture.
    I have made calls to Tech Support, but they have always made me feel as if I made some mistake on my end or I didn't set the time correctly to begin with (now this could be correct with the P&S I share with my girlfriend, but very rarely with my DSLR). But I also explained that it was random (3 to 5 photos in a set).
    So today as I was trying to update some of my photos, I came across some photos (and videos) I took last year with my iPhone (my iPhone always has the correct time and date or 99.9%). I originally uploaded these to an old iPhoto library I still have (all the dates and times are correct). These same photos and videos (55 over a 3 day period) were moved to my Aperture library some time ago and 3 of the 55 have the incorrect time by -7 hours (ie. 3:30 pm is now 8:30 am).
    Why does this happen?? I have not set any time changes to these photos. If I did, they would all have the wrong time and not just 3 of them would be off.
    Is anyone having this issue, and if so, is there a way to get the correct time/date from the photo hidden deep down in the EXIF? This is driving me crazy.
    Here are some screen shots of the same photo in iPhoto and Aperture.
    iPhoto 5:28:09 PM - Correct
    Aperture 10:28:09 AM - Incorrect -7 hrs
    TIA if anyone can help. I just don't have the time to go back and try and change all these photos.
    Narvon

    DF
    Thanks for the reply, I will try to post the screenshots again (they still show for me, odd?).
    Anyway, this is not a zoning problem. This is a moving or copying problem.
    The photos I tried to show in the screenshots were taken last year at a concert at Coachella (same time zone as my residence in Redondo Beach). The shots were taken using my iPhone, which has GPS and constantly updated time. All the photos and videos show up correctly in Places (GPS) and all the correct times show in iPhoto.
    The issue occurs when the photos are moved to Aperture. Randomly, the time has changed by -7 hours on a few photos. This same randomness has occurred to hundreds of my photos that I have imported into Aperture over the years. As I said before, Tech Support and everyone else has placed the blame on me, on my camera settings or improper importing.
    Now that I use my iPhone more often, I have proof that the settings are correct and that Aperture or Finder or something within Apple has changed the time randomly. Also, if I had 'accidentally' opened the 'Time Zone' brick when importing (as suggested by Tech Support), all the photos would have the incorrect time, not just a few random ones.
    iPhoto original from iPhone - Time 5:28:09 correct
    Aperture photo from iPhoto - Time 10:28:09 incorrect -7 Hrs
    Thanks again,
    Narvon

  • SAP time and attendance

    We are evaluating implementing the time and attendance module of SAP for
    5K+ users. Please help me find technical or roadmap documentation. I need the information for proper project planning and costing.
    Thanks, Al Mamun

    Hi,
    Do you need configuration document of time evaluation or do you need any help on how to get the time evaluation results?
    Regards,
    Vasu,

  • Select Records between Begin Date/Time and End Date/Time

    Hi, I need to select records from table GLPCA where the CPUDT and CPUTM are between a START DATE/TIME and END DATE/TIME. 
    I have the below logic from an SAP Solution, but it doesn't seem to be working right in my opinion.  It is picking up records earlier than the date ranges.  Can anyone tell me how I might be able to accomplish this?  I'm hoping this is an easy one for the ABAPPERs... 
    Thanks,
    START DATE 20091022
    START TIME 125736
    END DATE 20091022
    END TIME 135044
    CPUDT 20091022
    CPUTM 100257
          SELECT * FROM GLPCA
             WHERE ( CPUDT >= STARTDATE AND ( CPUTM >= STARTTIME OR ( CPUDT <= ENDDATE AND CPUTM <= ENDTIME ) ) ).

    Thank you all!  I ended up using the following:
    SELECT * FROM GLPCA
              WHERE RYEAR IN L_R_RYEAR
                AND ( ( CPUDT = STARTDATE AND CPUTM >= STARTTIME ) OR CPUDT > STARTDATE )
                AND ( ( CPUDT = ENDDATE   AND CPUTM <= ENDTIME )   OR CPUDT < ENDDATE ).
    This child was born from the following thread that was found:
    update date and time of client record
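    The corrected predicate can be sanity-checked outside ABAP. Here is a minimal JavaScript sketch of the same (date, time) pair comparison, using the values from the question (the function name is just for illustration):

```javascript
// A record qualifies when its (CPUDT, CPUTM) pair is >= the start pair
// and <= the end pair -- the same logic as the corrected SELECT above.
// Dates/times are SAP-style strings (YYYYMMDD / HHMMSS), which compare
// correctly as plain strings because they are fixed-width and zero-padded.
function inRange(cpudt, cputm, startDate, startTime, endDate, endTime) {
  const afterStart =
    (cpudt === startDate && cputm >= startTime) || cpudt > startDate;
  const beforeEnd =
    (cpudt === endDate && cputm <= endTime) || cpudt < endDate;
  return afterStart && beforeEnd;
}

// The record from the question (10:02:57) is before the 12:57:36 start,
// so it is correctly excluded:
console.log(inRange("20091022", "100257",
                    "20091022", "125736",
                    "20091022", "135044")); // false
```

    The original predicate failed exactly this case: its OR let a record on the start date pass as long as it was merely before the end time.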

  • Time and GPS data windows

    Hello all
    I was presented this sample 1080 frame and was asked if I could do this if time and GPS metadata were part of the video recording. In other words, how do you burn this data into the video itself? I have Premiere Pro CS6 at home and am a complete novice at video editing. The final product may or may not need this; I'm more curious as to how it's done: a special camera system, or in post-processing?
    Thanks!

    If whoever shot the video recorded time-of-day timecode synced from a GPS receiver that was close to the camera (in your case that looks like it would mean mounted in the helicopter), you can use the timecode data to match your GPS metadata to the video. I use an automated process to incorporate the metadata into the video, but the process is all done with custom software that the CEO of our company wrote. The other people I've seen incorporating GPS metadata into video are also using customized, proprietary processes. If you're adding the data in post, you will have to translate the data into Premiere titles or After Effects text keyframes or something else that you can overlay on the video.
    You can also use hardware during your shoot to overlay the GPS data onto the video. I don't use any products that do this, and I don't know how easy they are to find for purchase, but there definitely are people in the geospatial sector who feed video through a GPS-aware piece of video processing hardware that overlays time and position data in real time. In this case you would feed video (most likely through HD-SDI) from your camera to the GPS overlay hardware and then to a recorder.
    There certainly aren't any GPS features built into Premiere Pro. If you want to add titles manually, you'll have to match the time code with your GPS data and make a new title each time the GPS info is supposed to update. GPS data is usually some form of table or spreadsheet with time, coordinates, and other positional data, so you can find the correct time in the video and make a title that includes the corresponding coordinates for that time. If you're doing this for more than a few seconds of video, it will be an unreasonably long and tedious process, but it is possible.
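    If you go the manual-title route, the core of the matching step is a simple time lookup against the GPS log. A hypothetical sketch (field names and data invented for illustration):

```javascript
// Hypothetical GPS log: one fix every few seconds of video time.
// fixAt() finds the most recent fix at or before a given timecode,
// which is the coordinate pair a title at that time should show.
const gpsLog = [
  { t: 0,  lat: 33.67, lon: -116.17 },
  { t: 5,  lat: 33.68, lon: -116.16 },
  { t: 10, lat: 33.69, lon: -116.15 },
];

function fixAt(seconds) {
  let latest = gpsLog[0];
  for (const fix of gpsLog) {
    if (fix.t <= seconds) latest = fix; // keep the last fix not after `seconds`
  }
  return latest;
}

console.log(fixAt(7).lat); // 33.68 -- the fix logged at t=5
```

    Each lookup result would become one title spanning the interval until the next fix, which is why doing this by hand for more than a few seconds of video gets tedious fast.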

  • UTC Date Time and Normal Date Time

    Hi All,
    1. How do UTC date time and normal date time differ in Siebel?
    2. If legacy data needs to be loaded into Siebel, and in Siebel some fields are date time fields and some are UTC date time fields, what would happen if we load both normal date time and UTC date time without considering them technically?
    3. Does UTC date time have a specific format in the physical database? If we want to load legacy data in UTC date time format, what is the query to manipulate it?
    Thankyou
    Sean

    Sean,
    Please check document below, I believe it has most of the answers to the questions you have:
    http://download.oracle.com/docs/cd/E14004_01/books/GlobDep/GlobDepUTC.html
    Hope it helps,
    Wilson

  • Difference between a starting date and time and ending date and time

    Hi All,
    I need to work out the difference between a starting date and time and an ending date and time. The difference should be expressed as a time, i.e. in hours, minutes, or seconds.
    I am sure there must be a function module for this. Has anyone been in search of this kind of FM? Kindly suggest one. It is urgent.
    Thanks
    Mahen

    Hi,
    Check this out.
    data : date1 type dats,   " system data type for date (sy-datum)
           date2 type dats,
           time1 type tims,   " system data type for time (sy-timlo)
           time2 type tims,
           days  type i,
           scd   type i,
           t_mt  type i.
    days = date1 - date2. " difference in days
    scd  = time1 - time2. " difference in seconds
    t_mt = ( days * 24 * 60 ) + ( scd / 60 ). " total difference in minutes
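    The same arithmetic, sketched in JavaScript for anyone outside ABAP (function and parameter names are illustrative):

```javascript
// Total difference in minutes = (day difference * 24 * 60)
//                             + (second difference / 60),
// mirroring the t_mt computation in the ABAP snippet above.
function diffMinutes(dayDiff, secondDiff) {
  return dayDiff * 24 * 60 + secondDiff / 60;
}

console.log(diffMinutes(1, -3600)); // 1380: one day minus one hour
```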

  • Selection based on previous run date , time and current date a

    HI all,
    I have the selection as follows.
    Previous run date: 12/14/2005          Previous run time: 14:30:56
    Current run date: 12/17/2009           Current run time: 07:05:31
    These are all parameters; the current run date is the system date and the current run time is the system time.
    I want to fetch all the deliveries from the above selection, including the previous run date and time.
    Any inputs..

    Hi
    Append the current run date and previous run date into a date range parameter, and the current run time and previous run time into a time range parameter, then fetch the data based on the range parameters.
    Regards
    Srilaxmi

  • Daylight Savings time, and how dates are stored internally and displayed

    This is probably a question that appears here annually, but I couldn't really find clear answers, so I'll try asking this in my own words:
    I'm in the Eastern timezone, and this Sunday we'll be turning our clocks back an hour at 2:00 AM. That means that, according to us humans, the time 1:30 AM will occur twice on Sunday.
    I've got an Oracle application that runs every 5 minutes around the clock, and it selects records from a certain table whose updated timestamp (TIMESTAMP(6)) is greater than SYSDATE - 5/1440, meaning any record that was updated in the last 5 minutes. Will we have a problem with some records being processed twice on Sunday morning? I'm theorizing that everything will be OK, that internally, Oracle stores DATE fields using something like an epoch which then gets interpreted when we display them. An epoch value will continue to grow each second no matter what “time” it is according to the U.S. Congress.
    A simpler way to look at the question might be as follows:
    If you store SYSDATE in a DATE column in row “X” at 1:30 AM before the time change, and you store sysdate in row “Y” exactly one hour later, will Oracle say that X’s timestamp is 60 minutes less than Y’s timestamp? All fields that are related to my particular situation are either DATE or TIMESTAMP(6).
    We use 11g.

    >
    That settles that! Thank you! My theory was wrong! I appreciate the help.
    >
    You may think it settles that but, sorry to burst your bubble, that doesn't really settle much of anything.
    One thing that was settled is the answer to this question
    >
    But are they talking about what you can EXTRACT and DISPLAY from the field or what is actually STORED internally?
    >
    which is, as Mark stated, they are talking about what is stored internally.
    The other thing that was settled is that you will pull the same, or mostly the same, data twice during that one hour. I say 'mostly the same' because of the major flaw your extraction method has to begin with.
    >
    If you store SYSDATE in a DATE column in row “X” at 1:30 AM before the time change, and you store sysdate in row “Y” exactly one hour later, will Oracle say that X’s timestamp is 60 minutes less than Y’s timestamp?
    >
    No - they will have the same time since 'one hour later' would have been 2:30 AM but the clock was turned back an hour so is again 1:30 AM. So the second time your job runs for 5 minutes at 1:30 AM it will pull both the original 1:30 AM data AND the data inserted an hour later.
    And Oracle will say that data stored in row "Z" exactly 45 minutes later than "X" at 1:30 AM will have a date of 1:15 AM and will appear to have been stored earlier.
    Your method of extracting data is seriously flawed to begin with so the daylight savings time issue is the least of your problems. The reason is related to the answer to this question you asked
    >
    do people avoid using DATE and TIMESTAMP datatypes because they are too simple?
    >
    That method isn't reliable - that is why people avoid using a date/timestamp value for pulling data. And the more often you pull data the worse the problems will be.
    >
    I've got an Oracle application that runs every 5 minutes around the clock, and it selects records from a certain table whose updated timestamp (TIMESTAMP(6)) is greater than SYSDATE - 5/1440, meaning any record that was updated in the last 5 minutes
    >
    No - it doesn't do that at all, at least not reliably. And THAT is why your method is seriously flawed.
    The reason is that the value that you use for that DATE or TIMESTAMP column (e.g. SYSDATE) is assigned BEFORE the transaction is committed. But your code that extracts the data is only pulling data for values that HAVE BEEN committed.
    1. A transaction begins at 11:59 AM and performs an INSERT of one (or any number) of records. The SYSDATE value used is 11:59 AM.
    2. The transaction is COMMITTED at 12:03 PM.
    3. Your job, which runs every five minutes, pulls data for the period 11:55:00 AM to 11:59:59 AM. This run will NOT see the records inserted in step #1 because they had not been committed when the job's query began execution (read consistency).
    4. Your job next pulls data for the period 12:00:00 PM to 12:04:59 PM. This run will also NOT see the records inserted in step #1 because the SYSDATE value used was 11:59 AM, which is BEFORE this job's time range.
    You have one or ANY NUMBER of records that ARE NEVER PULLED!
    That is why people don't (or shouldn't) use DATE/TIMESTAMP values for pulling data. If you only pull data once per day (e.g. after midnight, to get 'yesterday's' data) then the only data you will miss is rows where the transaction began before midnight but the commit happened after midnight. Some environments have no, or very little, activity at that time of night and so may never have a 'missing data' problem.
    Creating your tables with ROW DEPENDENCIES will store an SCN at the row level (at a cost of about 6 bytes per row) and you can use the commit SCN to pull data.
    Just another caveat though - either of those approaches will still NEVER detect rows that have been deleted. So if you need those you need yet a different approach such as using a materialized view log that captures ALL changes.
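    The missed-row race in steps 1-4 above is easy to reproduce in miniature. A JavaScript simulation (all times and names invented for illustration; times are minutes past midnight) of a row stamped before its commit:

```javascript
// The row's timestamp is assigned at INSERT time (11:59), but the row
// only becomes visible to other sessions at COMMIT time (12:03).
const row = { stamped: 11 * 60 + 59, committedAt: 12 * 60 + 3 };

function pull(windowStart, windowEnd, queryTime) {
  // The poller sees only committed rows whose timestamp is in its window.
  const visible = row.committedAt <= queryTime;
  const inWindow = row.stamped >= windowStart && row.stamped < windowEnd;
  return visible && inWindow;
}

// Run at 12:00 covering 11:55-12:00 -> not yet committed, missed:
console.log(pull(11 * 60 + 55, 12 * 60, 12 * 60));       // false
// Run at 12:05 covering 12:00-12:05 -> committed, but the stamp is
// 11:59, before this window, so missed again:
console.log(pull(12 * 60, 12 * 60 + 5, 12 * 60 + 5));    // false
```

    The row is never pulled by either window, which is the "ANY NUMBER of records that ARE NEVER PULLED" failure described above.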

  • How do I get the difference of time and/or date using JavaScript?

    Hi All,
    I have four user text entry values (Date and Time) which require a difference to be performed on them.  The results would be put into a fifth text box named "Burn_Time". The entries are
    Start_Date, Start_Time, Stop_Date, and Stop_Time.
    How do I calculate the elapsed time between the entries?  For ease of calculations I surmised putting the values in 24 hour format would be helpful (not sure if this is true or not).
    Here is an example:
    Start_Date = 2012-04-10
    Start_Time = 10:00
    Stop_Date = 2012-04-12
    Stop_Time = 10:00
    Burn_Time = 48 (hours).
    Any help would be greatly appreciated!

    Here's the first in a series of three tutorials on the subject: http://acrobatusers.com/tutorials/date_time_part1
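    As a minimal sketch of the calculation itself (field names taken from the question; ISO-style date strings and local time are assumptions), the elapsed hours can be computed by combining each date/time pair into a Date and subtracting:

```javascript
// Combine the Start_Date/Start_Time and Stop_Date/Stop_Time values into
// Date objects, then convert the millisecond difference to hours.
function burnTime(startDate, startTime, stopDate, stopTime) {
  const start = new Date(startDate + "T" + startTime + ":00");
  const stop  = new Date(stopDate + "T" + stopTime + ":00");
  return (stop - start) / (1000 * 60 * 60);
}

// Example from the question:
console.log(burnTime("2012-04-10", "10:00", "2012-04-12", "10:00")); // 48
```

    Using 24-hour times, as surmised in the question, does make this easier: the strings concatenate directly into a parseable timestamp with no AM/PM handling.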

  • HCM Process and Forms - Clarification required reg BADI QISR1.

    Hi all,
    I want to have some extra validations done on some of the fields of my HCM Process Adobe form. For this, I created a new implementation of the BADI QISR1 with the filter value set to my custom scenario and implemented the method INT_SERVICE_REQUEST_CHECK. Validation is working now, but it's not updating any data to the back end after I submit. When I checked, I found that SAP has written the whole logic to update the data to the infotypes in another implementation of the same BADI, which was like the default implementation. But that implementation is not getting triggered now, as I have a new one.
    Do I have to copy the whole update logic to this Implementation now? Or is there any other setting for this?
    Kindly help me.
    Regards!
    Mahesh

    Not sure I follow you here, but here's some basics.
    For any backend updates to infotypes, you should use the standard services SAP_PA, SAP_PT and SAP_PD (if on EhP4).
    In some cases, this won't work. So you have two choices:
    (1) write an advanced generic service to handle this (difficult, and no examples from SAP)
    (2) write custom code triggered in workflow (such as a custom function module) to handle your updates (past EhP2, you have to use the same classes as the decoupled framework uses... not the good ol' "infotype operations" standard FMs).
    Now, as for validations, you can handle those separately. In fact, talk about timely! (haha) I just posted a blog on this just now. =)
    /people/christopher.solomon/blog/2009/06/22/hcm-processes-forms-required-fields-arent-what-they-use-to-be

  • Query Execution/Elapsed Time and Oracle Data Blocks

    Hi,
    I have created 3 tables with one column only. As an example Table 1 below:
    SQL> create table T1 ( x char(2000));
    So 3 tables are created in this way i.e. T1,T2 and T3.
    T1 = in the default database tablespace of 8k (11g v11.1.0.6.0 - Production) (O.S=Windows).
    T2 = I created in a Tablespace with Blocksize 16k.
    T3 = I created in a Tablespace with Blocksize 4k. In the same Instance.
    Each table has approx. 500 rows (so the table sizes are the same in all cases, to test query execution time). As these 3 tables are created with different data block sizes, the allocated number of data blocks differs in each case.
    T1 =  8k = 256 blocks = 00:00:04.76 (query execution/elapsed time)
    T2 = 16k = 121 blocks = 00:00:04.64
    T3 =  4k = 490 blocks = 00:00:04.91
    Table access is FULL, i.e. I have used select * from table_name; in all 3 cases. No index, nothing.
    My question is: why is the query execution time nearly the same in all 3 cases, when Oracle has to read all the data blocks in each case to fetch the records and there is a big difference in the allocated number of blocks???
    In the 16k block size example, Oracle has to read just 121 blocks, and it takes nearly the same time as reading the 490 blocks of the 4k case???
    This is just one example of different data block sizes. I have around 40 tables in each block size tablespace and the results are nearly the same. It's very strange to me, because there is a big difference in the number of allocated blocks but the execution time is almost the same, differing only in milliseconds.
    I'll highly appreciate the expert opinions.
    Bundle of thanks in advance.
    Best Regards,

    Hi Chris,
    No, I'm not using separate databases; it's an 8k database with non-standard block sizes of 16k and 4k.
    Actually I wanted to test the elapsed time for these 3 tables, so I tried to create tables of the same size.
    The way I equalized them was to create a one-column table with char(2000).
    555 MB is the figure I wanted to use for all 3 tables (no special figure, just big enough to exceed the RAM used by my DB at startup, to be sure the records aren't retrieved from cache):
    row size with overhead = 2006 bytes * 290,000 rows = 581,740,000 bytes / 1024 = 568,105 KB / 1024 = 555 MB.
    Through this calculation I thought that would be the total table size, so I created the same number of rows in all 3 block sizes.
    If that's wrong, then what a mess, because I have been calculating table sizes this way for the last few months.
    Can you please explain a little how you found the table sizes in the different block sizes? Though I understood how you calculated the size in MB from these 3 block counts:
    T8K  =  97,177 blocks =   759 MB  (97177 * 8 = 777,416 KB / 1024 = 759 MB)
    T16K =  41,639 blocks =   650 MB
    BT4K = 293,656 blocks = 1,147 MB
    Calculating the size of a table is new to me. Can you please tell me how many rows I should create in each of these 3 tables to make them equal in MB for the elapsed-time test?
    Then I'll run my test again and post the results here, because if I've calculated the table sizes wrongly there is no point talking about elapsed time; first I must equalize the table sizes properly.
    SQL> select sum(bytes)/1024/1024 "Size in MB" from dba_segments
      2  where segment_name = 'T16K';
    Size in MB
    655
    Is the above SQL correct for calculating the size, or is it a valid alternative to your method of calculating the size?
    I created the same table again with everything the same, and the result is:
    SQL> select num_rows, blocks from user_tables where table_name = 'T16K';
    NUM_ROWS     BLOCKS
      290000      41703
    64 more blocks are allocated this time, so maybe that's why it's showing a total size of 655 instead of 650.
    Thanks a lot for your help.
    Best Regards,
    KAm
    Edited by: kam555 on Nov 20, 2009 5:57 PM
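    For reference, the blocks-to-MB arithmetic used in the reply can be sketched as follows (block counts copied from above; rounding down to whole MB is an assumption):

```javascript
// Segment size in MB from allocated blocks and the tablespace block size:
// size_MB = blocks * block_size_KB / 1024, truncated to whole MB.
function tableSizeMB(blocks, blockKB) {
  return Math.floor(blocks * blockKB / 1024);
}

console.log(tableSizeMB(97177, 8));   // 759
console.log(tableSizeMB(41639, 16));  // 650
console.log(tableSizeMB(293656, 4)); // 1147
```

    The same figure comes from summing `bytes` in `dba_segments`, which is why the 655 MB result above tracks the extra 64 blocks that were allocated on the second run.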

  • Sales order with Compl.deliv., Lead time and Shipping date problems

    Hello, SAP gurus!
    We have some concerns with sales orders that won't start processing, and I'm wondering if there's anyone here who have had experience with this problem.
    If not, hopefully you understand this scenario.
    Scenario:
    * A sales order with three lines, of which two lines are not in stock (item category YBAB) and one line is in stock (item category TAN)
    * Requested delivery date is ASAP, i.e. 19/03-2014
    * Marked with Complete delivery
    * In the Shipping tab, it seems all dates are copied from the lead time of the YBAB lines: 11/06-2014.
    * We just received the two YBAB lines on a PO, but the date still shows 11/06-2014, almost a month from now.
    Is it possible to make SAP automatically change the shipping date to the date we actually receive the items and are ready to process the order?
    What we have now is a sales order made 2 months ago, ready to be packed and shipped today, but SAP seems to hold on to the "expected" date a month from now. It's frustrating!
    Hope to hear from you

    Hello,
    You will need to trigger an availability check again to factor in the receipt of the YBAB items.
    You could evaluate backorder processing to automate this as a daily run that reprocesses the sales orders relevant for re-ATP based on the new availability situation.
       Thanks,
    Sudhir

  • Help with HCM PA and PD data transfer

    Hi,
    We managed to transfer PA data, but our PD data (in particular OU, Job, Position) are not transferred. Hence, the respective fields (Org Unit, Job and Position) in IT0001 are reflected as "99999999".
    May I know how to transfer PD objects in the context above? When specifying the PD selection, I specified Plan Version = 01, Object Type = O, Object ID = root org unit, Evaluation Path = OSP and Status Vector = 1.
    However, the PD objects are still not transferred, and the IT0001 values for Org Unit, Job and Position are still "99999999". Please help. Thank you.
    Regards
    Kir Chern

    In the Transfer Selection Criteria activity, under PD Selection > PD delete
    and Target Options > Target Options:
    what target plan have you selected?
    If the plan variant is the same as the integration plan variant
    in the target system, the relevant PA tables of infotype 0001 fields
    (organizational unit, position and job,
    if contained in the evaluation path) are updated.
    Kindly get back in case of issues.
    Best regards,
    Krishna.
