Best Practice for converting Temps to Permanent Employees

I'm interested in how other companies handle the conversion of Temporary Employees to Regular Full-Time Employees. Currently, my company has an entirely separate "Temp Hire" action, which we use to get them in the system. When they convert, we perform a Termination action and then a Rehire action to put them in the Employee position. This "Rehire" action is causing problems for other systems and we would like to use a straight "Hire" action - but we can't, because they already have a "Temp Hire".
Is this generally how it is handled?  Has anyone seen other ways to handle this?

Hi Ktenes,
I believe you have employee groups and subgroups set up to distinguish between temps and permanent employees. If that is the case, I'd suggest you consider running the org. reassignment action with a reason of "permanent job offer" and changing the position along with the employee group, subgroup, payroll area, etc. You might want to include certain infotypes (through an infogroup) that are only valid for permanent employees and delimit the ones that had been maintained exclusively for temps. This action would not change the employment status.
The other option is to get away from the termination-rehire process, because in reality the employees are not being terminated; they are being absorbed as/converted to permanent employees, so running a termination-rehire process is not justified in this case. It would also make it difficult to distinguish this case from a normal termination and rehire. As an alternative, you can have a new action/reason that does not change the employment status but only the other attributes such as employee group, subgroup, payroll area, etc., as mentioned in the paragraph above.
Both options are also easy to report on. Let us see what other experts have to suggest.
Donnie

Similar Messages

  • Best Practices for converting SAP HR data (4.7 to ECC)

    Hello Experts ...
    We are going from 4.6 to ECC ... no upgrade ... it will be a new implementation ...
    I am looking for best practices to convert SAP HR data from one SAP instance (4.6) to another (ECC) ...
    I am not sure if direct input or LSMW or any other method/tool is the best way ...
    Will really appreciate and award points if I can get good advice or documentation ...
    Let me know if my question is not clear enough.
    Thanks,

    Hi
    You can check the SAP Marketplace and follow the download link there. You will need a SAP Marketplace login to download the Best Practices. Fortunately, Best Practices for ECC 5.00 and 6.00 are available there, but they are country-specific versions. I know HCM for the US is available there.
    Reward points, if helpful.
    Regards
    Waz

  • Need to know the best practice for converting string to double

    I have a string and want to convert it to a double if it is a valid number, else keep it as it is. There are a couple of ways of doing this, and I want to know which one is best if I have lots of strings, especially from a performance point of view.
    1) Use Double.parseDouble(myString) and catch the NumberFormatException to detect that it is not a number. One of my colleagues said this does not perform well because of the exception catching.
    2) Use org.apache.commons.lang.math.NumberUtils.isNumber() and parse only if it returns true - so we don't rely on the exception.
    I did some performance testing - putting each approach in a loop and trying two scenarios: one loop with proper numeric strings and another with non-numeric ones. What I found was that if the strings are not proper numbers, parseDouble() takes a long time (because of the exception catching), and in that case using NumberUtils.isNumber() makes sense.
    Would like to hear expert views on this.
    Thanks
    Manisha

    If you need it as a double you must convert it to a double and catch the exception. This means that testing it first is a waste of time in the case when the test succeeds - did your colleague think of that?
    Catching the exception is possibly slower than the test. Whether this is significant depends on the relative timings of the test and catching the exception, and also on the expected error rate. If this is below about 40% I suggest your colleague is talking through his esteemed hat.
    And of course the best test by far is the conversion itself. Using any other test runs the risk of its rules being different from those applied by the conversion.
    In any case you are obliged to write the code that catches the exception. You're not obliged to write the pre-test code.
    My personal rule for efficiency is to minimize lines of code until hard evidence to the contrary proves that further improvement is required.
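    For illustration, here is a minimal sketch of the two approaches being compared (the class name and sample strings are invented for the example; NumberUtils is the Apache Commons Lang class mentioned in the question):

        import org.apache.commons.lang.math.NumberUtils;

        public class ParseDemo {

            // Approach 1: attempt the conversion itself and catch the failure.
            static Double parseOrNull(String s) {
                try {
                    return Double.parseDouble(s);
                } catch (NumberFormatException e) {
                    return null; // not a valid double - caller keeps the original string
                }
            }

            // Approach 2: pre-check with NumberUtils.isNumber(), then parse.
            // Note the pre-check's rules may not match parseDouble()'s exactly,
            // which is the risk pointed out in the reply above.
            static Double checkThenParse(String s) {
                if (NumberUtils.isNumber(s)) {
                    return Double.parseDouble(s);
                }
                return null;
            }

            public static void main(String[] args) {
                String[] samples = { "3.14", "10", "abc", "" };
                for (String s : samples) {
                    System.out.println("'" + s + "' -> " + parseOrNull(s) + " / " + checkThenParse(s));
                }
            }
        }

    Timing either variant over a large array of mostly-valid versus mostly-invalid strings reproduces the trade-off described above: the pre-check only pays off when a large fraction of the inputs are invalid.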

  • Best Practice for Conversion Workflow

    Hello,
    I'm converting video files from our "home grown" virtual media reserve to iTunes U. Some of the files are in RM format, some are already compressed .mov's (not H.264) and some I have the original DV files for.
    Anyone out there have a best practice for converting these file types for posting to iTunes U? I have Final Cut Studio (Compressor), QT Pro and Squeeze available to me.
    Any experience you have with this would be helpful.
    Thanks,
    Jeana

    For converting old files to a podcast-compatible video, and based on the machine you have, consider the Elgato Turbo.264. It is a fairly priced "co-processor" for video conversion, comprising an application and a small USB device with an encoder chip in it. In my experience, it is the fastest way to create podcast video files. The amount of time that you will save will pay for the device quickly (about $100). Plus, it does batch conversion of any video that your system currently plays through QuickTime. It has all the necessary presets and you can create your own. It has a few minor limitations, such as not supporting (at this moment) enhanced podcasting features like chapter markers and closed captions, but since you have old files for conversion, that won't matter.
    For creating new content, the workflow varies a lot. Since you mention MP3s, I guess you are also interested in audio files. I would stick with GarageBand, especially if you are a beginner, plus it supports enhanced podcasts.
    In any case, the most important goal is to have the simplest and fastest way to go from recording to publication. The less editing the better. To attain that, the best methods will require the largest investments. For example, for video production the best way is to produce the content live, so when you finish recording it is only a matter of encoding and publishing. That will require the use of a video switcher that can ingest at least one video camera and a computer output to properly capture presentation material. That's the minimum. There are several devices that can do this for you. Some are disguised PCs and some connect to a PC for tapeless recording. You can check the TriCaster, which I like but wish it was a Mac and not a Windows XP PC. Other routes may include video mixers from manufacturers like Edirol, Panasonic or Sony connected to a VTR or directly to a Mac for direct-to-disc capture. If you look at some of the content available in iTunes U, you will see what I explain here. This workflow requires preparation and sufficient live support, but you will have your material ready for delivery almost immediately after the recorded event. No editing required. Finally, the most intensive workflow is to record everything separately and edit it later, which is extremely time consuming.

  • What are the Best Practices for Optimizing Images in InDesign Files

    Is there a best practice for using images in InDesign to optimize the document before converting to a PDF? Specifically, what I'm asking is, will the PDF file compress better if the images are cropped prior to placing them in InDesign? I'd like to know the answer both for creating PDF files for printing using images that are 300 dpi and for creating PDF files for online delivery using images that are 72 dpi. I have an employee who insists images need to be cropped to their actual dimensions before being placed in the InDesign document. I've never done it that way and believe that her recommended process is way too time consuming and leaves you with no leeway to tweak your page design, since the images are tightly cropped.

    As for absolute cropping, I agree with your stance. Until the layout is fixed, preserving your ability to easily manipulate photo size and positioning is key.
    Some clever image management methods have been described in the discussion forums, and one that appealed most to me was the use of duplicate linked image folders. Having a high-res (CMYK) folder and a low-res (RGB) folder to switch between for different output enables you to use both to your advantage. Use the low-res images for layout, for internal proofing, and for EPUB/online PDF/HTML output. Then it's simply a quick switch to the high-res image folder for print purposes. You can easily prepare the alternate collection of images with a Photoshop batch convert script or with the Photoshop Image Processor. Save your presets!

  • Best practice for iTunes' music folder

    I keep my music on an external drive, but want iTunes to be able to play the songs.
    Currently, the iTunes Music folder is set to its default location. I changed the preference to prevent iTunes from copying music to this location, and I added music to iTunes using the File > 'Add Folder to Library' menu.
    My friend, who also has his music on an external drive, set his iTunes Music folder to the Music folder on his external drive.
    What are the differences between these two approaches? What are the issues?
    Is there a best practice for using iTunes with music stored on an external drive?
    Thanks for your time.
    craig

    Thanks Paul for helping
    I am getting the symbol and can locate the song, but it is very time consuming and I can't do whole albums.
    I tried dragging the entire music folder into iTunes. Is this it, iTunes Music Library.xml? These are all the files and folders I found:
    iTunes 3 Music Library Data file
    iTunes 4 Music Library Data File
    iTunes 4 Music Library (Old) Data File
    iTunes Music folder
    iTunes Music Library.xml document
    Temp File Document
    I unchecked the "Copy files to iTunes Music folder" option
    before I dragged the .xml document onto the iTunes icon in the Dock.
    This seems to have made matters worse. Now I can't find the file at all except through the Finder.
    Remember this is 10.3.9 with v4.7
    Powerbook   Mac OS X (10.4.6)   Panther eMac

  • Best Practice for FlexConnect Wireless roaming in MediaNet environment?

    Hello!
    Current Cisco best practice recommendations for enterprise MediaNet design specify that VLANs be local to a switch / switch stack (i.e., to limit the scope of spanning tree).
    In the wireless world, this causes problems if you want roaming users to keep real-time applications up and running. Every time they connect to a new AP on a different VLAN, they need to get a new IP address, which interrupts real-time apps.
    So... best practice for LAN users causes real problems for wireless users.
    I thought I'd post here in case there's a best practice for implementing wireless roaming in a routed environment that we might have missed so far!
    We have a failover pair of FlexConnect 7510s, btw, configured for local switching for Internal users, and central switching with an anchor controller on the DMZ for Guest users.
    Thanks,
    Deb

    Thanks for your replies, Stephen and JSnyder.
    The situation here is that the original design engineer is no longer here, and the original design was not MediaNet-friendly, in that it had a very few /20 subnets bridged over entire large sites. 
    These several large sites (with a few hundred wireless users per site) are connected to an HQ location (where the 7510s in failover mode are installed) via 1G Ethernet hand-offs (MPLS at the WAN provider). The 7510s are new, and are replacing older controllers at the HQ location.
    The internal employee wireless users use resources both local to their site, as well as centralized resources.  There are at least as many Guest wireless users per site as there are internal employee users, and the service to them consists of Internet traffic only.  (When moved to the 7510s, their traffic will continue to be centrally switched and carried to an anchor controller in the DMZ.) 
    (1) So, going local mode seems impractical due to the sheer number of users whose traffic bound for their local site would be traversing the WAN twice.  Too much bandwidth would be used.  So, that implies the need to use Flex / HREAP mode instead.
    (2) However, re-designing each site's IP environment for MediaNet would suggest to go routed to the closet.  However, this breaks seamless roaming for users....
    So, this conundrum is why I thought I'd post here, and see if there was some other cool / nifty solution I wasn't yet aware of. 
    The only other (possibly friendly to both needs) solution I'd thought of was to GRE tunnel a subnet from each closet to the collapsed Core / Disti switch at each site.  Unfortunately, GRE tunnels are not supported in the rev of IOS on the present equipment, and so it isn't possible to try this idea.
    Another "blue sky" idea I had (not for this customer, but possibly elsewhere in the future), is to use LAN switches such as 3850s that have WLC functionality built-in.  I haven't yet worked with the WLC s/w available on those, but I was thinking it looks like they could be put into a mobility group, and L3 user roaming between them might then work.  Do you happen to know if this might be a workable solution to the overall big-picture problem? 
    Thanks again for taking the time and trouble to reply!
    Deb

  • ISE Best Practice for Purging Endpoints

    Maybe I haven't looked long enough or deep enough through the documents and guides, but I am wondering if there is a best practice for purging endpoints in general. For my guest endpoints, I have them set to purge every 3 days. When I look at how many endpoints I have profiled at the current time, it's a very large number of devices. I'm sure a large number of these are no longer connecting to our network and probably won't in the future.
    If there isn't a current best practice, would it sound logical to purge every 180 to 190 days? We are a public school district and we have 180 instructional days. Employees and students alike are able to bring their own devices. I figure with 190 day purge, it would cover the time that employees and students are in session.
    Thoughts, opinions?
    Thank you for your time.
    Kevin

    A lot of vendors will also suggest having one SSID if possible, but the rule of thumb is 3-4 max. The main issue is the differences required for specific WLANs, which isn't just Data versus Voice; you also have to look at mDNS, multicast, 802.11r, DTIMs, MFP, etc. You can combine all devices onto one, but then all the features/settings will be the same, which isn't ideal all the time. There are attributes which you can set from ISE to push out to the WLC(s), but it's the other unique values that you need to research and understand.

  • What is the best practice for inserting (unique) rows into a table with a key-column constraint when the source may contain duplicate (already existing) rows?

    My final data table has a unique key constraint on two key columns. I insert data into this table from a daily capture table, which also contains the two columns that make up the key in the final data table, but they are not constrained (not unique) in the daily capture table. I don't want to insert rows from daily capture which already exist in the final data table (based on the two key columns). Currently, what I do is select * into a #temp table from the join of the daily capture and final data tables on these two key columns. Then I delete the rows in the daily capture table which match the #temp table. Then I insert the remaining rows from daily capture into the final data table.
    Would it be possible to simplify this process by using an INSTEAD OF trigger on the final table and just inserting directly from the daily capture table? How would this look?
    What is the best practice for inserting unique (new) rows and ignoring duplicate rows (rows that already exist in both the daily capture and final data tables) in my particular operation?
    Rich P

    Please follow basic netiquette and post the DDL we need to answer this. Follow industry and ANSI/ISO standards in your data. You should follow ISO-11179 rules for naming data elements. You should follow ISO-8601 rules for displaying temporal data. We need to know the data types, keys and constraints on the table. Avoid dialect in favor of ANSI/ISO Standard SQL. And you need to read and download the PDF for:
    https://www.simple-talk.com/books/sql-books/119-sql-code-smells/
    >> My final data table contains a two key columns unique key constraint. [unh? one two-column key or two one-column keys? Sure wish you posted DDL] I insert data into this table from a daily capture table (which also contains the two columns that make up the key in the final data table but are not constrained (not unique) in the daily capture table). <<
    Then the "capture table" is not a table at all! Remember the first day of your RDBMS class? A table has to have a key. You need to fix this error. What ETL tool do you use?
    >> I don't want to insert rows from daily capture which already exists in final data table (based on the two key columns). <<
    MERGE statement; Google it. And do not use temp tables. 
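    To make the MERGE suggestion concrete, here is a rough sketch (all table and column names are placeholders, since the actual DDL was never posted); it is shown run from JDBC purely so the snippet is self-contained, but the MERGE itself is plain T-SQL. It inserts only the daily-capture rows whose two-column key is not already present in the final table:

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.Statement;

        public class MergeSketch {
            public static void main(String[] args) throws Exception {
                // Placeholder connection string - adjust server, database and authentication.
                try (Connection con = DriverManager.getConnection(
                        "jdbc:sqlserver://localhost;databaseName=Staging;integratedSecurity=true");
                     Statement stmt = con.createStatement()) {

                    // Group the source on the two key columns so duplicate capture rows
                    // cannot violate the unique constraint on the final table.
                    String merge =
                        "MERGE INTO FinalData AS tgt "
                      + "USING (SELECT key_col1, key_col2, MAX(other_col) AS other_col "
                      + "         FROM DailyCapture "
                      + "        GROUP BY key_col1, key_col2) AS src "
                      + "   ON tgt.key_col1 = src.key_col1 "
                      + "  AND tgt.key_col2 = src.key_col2 "
                      + "WHEN NOT MATCHED BY TARGET THEN "
                      + "  INSERT (key_col1, key_col2, other_col) "
                      + "  VALUES (src.key_col1, src.key_col2, src.other_col);";

                    int rows = stmt.executeUpdate(merge);
                    System.out.println("Rows inserted: " + rows);
                }
            }
        }

    The same MERGE can of course be run directly in T-SQL or from an ETL step; no #temp table is needed.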
    --CELKO-- Books in Celko Series for Morgan-Kaufmann Publishing: Analytics and OLAP in SQL / Data and Databases: Concepts in Practice / Data, Measurements and Standards in SQL / SQL for Smarties / SQL Programming Style / SQL Puzzles and Answers / Thinking in Sets / Trees and Hierarchies in SQL

  • Best practice for importing non-"Premiere-ready" video files

    Hello!
    I work with internal clients that provide me with a variety of different video types (could be almost ANYTHING: WMV, MP4, FLV). I of course ask for AVIs when possible, but unfortunately, I have no control over the type of file I'm given.
    And, naturally, Premiere (just upgraded to CS5) has a hard time dealing with these files. It's unpredictable, ranging from working fine to not working at all, and everything in between. Naturally, it's become a huge issue for turnaround time.
    Is there a best practice for preparing files for editing in Premiere?
    I've tried almost everything I can think of: converting the file(s) to AVIs using a variety of programs/methods. Most recently, I tried creating a Watch Folder in Adobe Media Encoder and setting it for AVI with the proper aspect ratio. It makes sense to me that that should work: using an Adobe product to render the file into something Premiere can work with.
    However, when I imported the resulting AVI into Premiere, it gave me the Red Line of Un-renderness (that is the technical term, right?), and had the same sync issue I experienced when I brought it in as a WMV.
    Given our environment, I'm completely fine with adding render time to the front-end of projects, but it has to work.  I want files that Premiere likes.
    THANK YOU in advance for any advice you can give!
    -- Dave

    I use an older conversion program (my PrPro has a much older internal AME, unlike yours), DigitalMedia Converter 2.7. It is shareware, and has been replaced by Deskshare with newer versions, but my old one works fine. I have not tried the newer versions yet. One thing that I like about this converter is that it ONLY uses System CODECs, and does not install its own, like a few others do. This DOES mean that if I get footage with an oddball CODEC, I need to go get it and install it on the System.
    I can batch process AV files of most types/CODECs, and convert to DV-AVI Type II with 48 kHz 16-bit PCM/WAV audio at 29.97 FPS (I am in NTSC land). So far, 99% of the resultant converted files have been perfect, whether from DivX, WMV, MPEG-2, or almost any other format/CODEC. If there is any OOS, my experience has been that it will be static, so I just have to adjust the sync offset by a few frames, and that takes care of things.
    In a few instances, the PAR flag has been missed (Standard 4:3 vs Widescreen 16:9), but Interpret Footage has solved those few issues.
    The only oddity that I have observed (mostly with DivX or WMVs) is that occasionally PrPro cannot get the file's Duration correct. I found that if I Import those problem files into PrElements, and then just do an Export to the same exact specs, the resulting file (it seems to be 100% identical, but something has to be different - maybe in the header info?) Imports perfectly into PrPro. This happens rarely, and I have the workaround, though it is one more step for those. I have yet to figure out why one very similar file will convert with the Duration info perfect, and then a companion file will not. Nor have I figured out exactly what is different after running through PrE. Every theory that I have developed has been shot down by my experiences. A mystery still.
    AME works well for most, as a converter, though there are just some CODECs that Adobe programs do not like, such as DivX and Xvid. I doubt that any Adobe program will handle those suckers easily, if at all.
    Good luck,
    Hunt

  • Best practice for partitioning 300 GB disk

    Hi,
    I would like to seek advice on how I should partition a 300 GB disk on Solaris 8.x, and what the optimal size for each partition would be.
    The system will be used internally for running web/application servers and database servers.
    Thanks in advance for your help.

    There is no "best practice" regardles of what others might say. I depends entirely on how you plan on using and maintaining the system. I have run into too many situations where fine-grained file system sizing bit the admins in the backside. For example, I've run into some that assumed that /var is only going to be for logging and printing, so they made it nice and small. What they didn't realize is that patch and package information is also stored in /var. So, when they attempted to install the R&S cluster, they couldn't because they couldn't put the patch info into /var.
    I've also run into other problems where a temp/export system that was mounted on a root-level directory. They made the assumption that "Oh, well, it's root. It can be tiny since /usr and /opt have their own partitions." The file system didn't mount properly, so any scratch files in that directory that were created went to the root file system and filled it up.
    You can never have a file system that's too big, but you can always have a file system that's too small.
    I will recommend the following, however:
    * /var is the most volatile directory and should be on its own several GB partition to account for patches, packages, and logs.
    * You should have another partition as big as your system RAM and assign that partition as the system/core dump area for system crashes.
    * /usr or whatever file system it's on must be big enough to assume that it will be loaded with FOSS/Sunfreeware tools, even if at this point you have no plans on installing them. I try to make mine 5-6 GB or more.
    * If this is a single-disk system, do not use any kind of parallel access structure, like what Oracle prefers, as it will most likely degrade system performance. Single disks can only make single I/O calls, obviously.
    Again, there is no "best practice" for this. It's all based on what you plan on doing with it, what applications you plan on using, and how you plan on using it. There is nothing that anyone here can tell you that will be 100% applicable to your situation.

  • Best practice for exporting from iMovie '08 to iDVD

    I am looking to find out what is the best practice for exporting from iMovie '08 to iDVD. I have read the other postings that give the basic howto (export to Media Browser then select the video in iDVD). However, my question is a little more technical. I have 1080i HD projects. I am interested in burning them to DVD in the best possible quality. What setting should I be using when I publish to Media Browser?
    I am wondering about quality loss due to more than one conversion/compression. I suspect that this is occurring when I export to the Media Browser. If I am not mistaken, iMovie is using something like H.264 for this. Then, when I run iDVD, I suspect it will do another conversion/compression, I think to get to MPEG-2. Not only could this result in a loss of quality, but it will also take extra time. I am interested to know what others think about this.
    Finally, I am looking to create DVDs for a lot of video. I am wondering if there are any USB or firewire hardware devices out there that could speed up the compression. I use the Elgato Turbo.264 when I want to encode to H.264 but I wonder if there is something similar for DVD creation.
    Thanks in advance.

    The standards for video DVD are 720x480, and usually MPEG-2 encoded...
    so your HiDef project HAS to be 'downsampled' somehow.
    I would export with QuickTime/Apple Intermediate => which is the 'format' your project is in already, so you avoid any useless 'in-between encoding'.
    iDVD will 'swallow' this huge export file - don't mind: iDVD cares about length, not size.
    iDVD will then convert into DVD standards.
    You can 'raise' quality by using projects <60 min - this automatically sets iDVD to the highest technically possible bitrate.
    Hint: judge picture quality on a DVD player + TV, not on your computer (DVDs are meant for TV delivery).

  • Best practice for dealing with Recordsets

    Hi all,
    I'm wondering what is best practice for dealing with data retrieved via JDBC as ResultSets without involving third-party products such as Hibernate etc. I've been told NOT to use ResultSets throughout my applications since they hold resources and are expensive. I'm wondering which collection type is best to convert ResultSets into. The apps I'm building are web-based, using JSPs as the presentation layer, with beans and servlets.
    Many thanks
    Erik

    There is no requirement that DAOs have a direct mapping to database tables. One of the advantages of the DAO pattern is that the business layer isn't directly aware of the persistence layer. If the joined data is used in the business code as if it were an unnormalized table, then you might want to provide a DAO for the joined data. If the joined data provides a subsidiary object within some particular object, you might add the access method to the DAO for the outer object.
    eg:
    In a user permissioning system where:
    1 user has many userRoles
    1 role has many userRoles
    1 role has many rolePermissions
    1 permission has many rolePermissions
    i.e. there is a many-to-many relationship between users and roles, and between roles and permissions.
    The administrator needs to be able to add and delete permissions for roles and roles for users, so the CRUD for the rolePermissions table is probably most useful in the RoleDAO, and the CRUD for the userRoles table in the UserDAO. DAOs can also call each other.
    During operation the system needs to be able to get all permissions for a user at login, so the UserDAO should provide a readPermissions method that does a rather complex join across the user, userRole, rolePermission and permission tables.
    Note that if the system I just described were done with LDAP, a hierarchical database or an object database, the userRoles and rolePermissions tables wouldn't even exist; these are RDBMS artifacts, since relational databases don't understand many-to-many relationships. This is a good reason to avoid providing DAOs that give access to those tables.
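    As a rough sketch of the readPermissions idea (the class, table and column names here are invented for the example), the DAO can consume the ResultSet itself and hand a plain List back to the servlet or bean layer:

        import java.sql.Connection;
        import java.sql.PreparedStatement;
        import java.sql.ResultSet;
        import java.sql.SQLException;
        import java.util.ArrayList;
        import java.util.List;
        import javax.sql.DataSource;

        public class UserDAO {

            private final DataSource dataSource;

            public UserDAO(DataSource dataSource) {
                this.dataSource = dataSource;
            }

            // Joins user -> userRole -> rolePermission -> permission and returns the
            // permission names as a simple List, so the ResultSet never leaves the DAO.
            public List<String> readPermissions(long userId) throws SQLException {
                String sql =
                    "SELECT DISTINCT p.name " +
                    "FROM users u " +
                    "JOIN user_roles ur ON ur.user_id = u.id " +
                    "JOIN role_permissions rp ON rp.role_id = ur.role_id " +
                    "JOIN permissions p ON p.id = rp.permission_id " +
                    "WHERE u.id = ?";
                List<String> permissions = new ArrayList<String>();
                try (Connection con = dataSource.getConnection();
                     PreparedStatement ps = con.prepareStatement(sql)) {
                    ps.setLong(1, userId);
                    try (ResultSet rs = ps.executeQuery()) {
                        while (rs.next()) {
                            permissions.add(rs.getString("name"));
                        }
                    }
                }
                return permissions;
            }
        }

    Because the ResultSet is read and closed inside the DAO, nothing outside the persistence layer ever holds a JDBC resource; the rest of the application only works with ordinary collections.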

  • Noticing a lot of database index fragmentation yet no Health Analyzer alerts...? Best practice for database maintenance in 2013?

    Could someone point me to a document on best practices for database maintenance with SharePoint 2013? I have read the 2010 document, but I'm hoping there is an updated one that I'm just missing.
    My problem is that our DBA recently noticed that many of our SharePoint databases have high index fragmentation. I have the Health Analyzer rules enabled for index fragmentation and they run daily, but I've never received an alert despite the majority of our databases having greater than 40% fragmentation, and some are even above 95%.
    Obviously it has our attention now and we want to get this addressed. My understanding (which I now fear is at best incomplete, more likely just plain wrong) was that a maintenance plan wasn't needed for index fragmentation in 2010/2013 like it was in 2007.
    Thanks,
    Troy

    It depends. Here are the rules for that job (sampled mode):
    Page count > 24 and avg fragmentation in percent > 5
    or
    Page count > 8 and avg page space used in percent < fill_factor * 0.9
    (The fill factor in SharePoint 2013 varies from 80 to 100 depending on the index; it is important not to adjust index fill factors.)
    I have seen cases where the indexes are not automatically managed by the rule and require a manual defragmentation with a Full Scan, instead of Sampled. Once the Full Scan defrag completed, the timer job started handling the index fragmentation automatically.
    Trevor Seward
    This post is my own opinion and does not necessarily reflect the opinion or view of Microsoft, its employees, or other MVPs.

  • SAP HCM Implementation: Best Practice for configuring

    Hi,
    This is my first independent project of HCM implementation. I have just started the system configuration. I am done with setting up the PA, PSA, EG and ESG, and have assigned them to the CC. At this stage, I have a very basic question: what is the best practice for the next steps of configuration? Do I set up OM next from the SAP Easy Access menu? How should I go from here?
    Would really appreciate some explanatory assistance.
    Thanks in advance.
    Papri
    Edited by: papri_rc on Jul 8, 2011 6:40 AM

    It all depends on the business requirement.
    To start, as I advised, review your BBP and make sure you configure everything.
    For your reference, I am giving the following data for OM and PA config.
    As part of OM:
    1. Depict the client org structure using Simple Maintenance; with this you can create large structures in less time (while doing the org structure, be careful and refer to the BBP)
    2. Maintain integration switches
    3. Maintain plan version
    4. Maintain number ranges
    Configuration for PA
    HR Enterprise / Personnel Structure
    • Personnel Areas
    • Personnel Sub Areas
    • Employee Group
    • Employee Sub Group
    • Assignment of Personnel Area to Company Code
    • Assignment of Employee Sub Group to Employee Group
    Basic Settings
    1.     Maintain Number Range Intervals for Personnel Numbers     
    2.     Determine defaults for number ranges     
    Personal Data     
    1.     Create Forms of Address     
    2.     Create Marital Status     
    Family     
    1.     Defined Possible Family Members     
    Addresses
    1.     Create Address Type     
    Communication     
    1.     Create Communication Types     
    Contractual and Corporate Agreements
    1.     Define Contract Types     
    2.     Determine periods of notice     
    Employee Qualifications
    1.     Create education establishment types     
    2.     Define Education Training     
    3.     Create educational Certificates     
    4.     Create branches of study     
    5.     Determine permissible certificates for education type     
    Infotype Menus     
    1.     User Group Dependency on Menus and Info groups     
    2.     Infotype Menu     
    3.     Determine choice of Infotype menus     
    4.     Infotype Menus     
    Actions     
    1.     User Group Dependency on Menus and Infogroups     
    2.     Info Group     
    3.     Personnel Action Types     
    4.     Create reasons for personnel actions     
    5.     Change Action Menu     
    Developments (ABAP consultant will do): Field Enhancements (any field enhancements in infotypes)
    Customer Infotypes - develop any customer infotypes if required for the business, from the 9000 series
    Edited by: Piscian . on Jul 8, 2011 9:08 AM
