HR Benefits Best Practice

Is there an HR Benefits Best Practice document floating around out there?

Old question, but I have the same problem at this moment.
We are also looking at two solutions:
1. Use substitution cost centers.
2. Create a custom function in our payroll schema and modify the cost center during the payroll run. Normally the cost center is saved in the clusters and is read from the clusters in the FI/CO run.
Does anybody have good advice on this?

Similar Messages

  • Best Practice in using Business Packages

    Hi All,
    Are there any best practices for using Business Package content?   Do you assign the roles delivered by the Business Package and make changes to the original iViews?
    or
    Do you copy the content delivered in the Business Package to a new folder and work from there?
    These questions are purely at the configuration level, not the Java coding level.   For instance, I might want to turn off the iView tray, change a parameter such as height, or even remove an iView from a page or role.
    I would like to know the various approaches the SDN community uses and the different challenges and benefits that result from each approach.
    Look forward to hearing from you all
    Paul

    Hi Paul,
    I also build my own roles. The only time I might use the standard roles is for demo purposes early in a project.  You will find that in some cases the business packages like MSS don't always even include standard roles, so you have no choice but to build.
    I never change any of the standard iViews/Pages/Worksets - ever.
    The most contentious issue seems to be whether to do a full or delta link copy of the standard objects.  I tend to initially do a full copy of the objects into a custom folder set in the PCD and modify those. Then I only use delta links from Page to iViews where I need the option of setting different properties for the same iView if it appears in multiple pages.  Delta links can be a bit flakey at times, so I tend to only use them where I have to.  I suspect that I may get to a point where I don't use them at all.
    Just my 2 cents worth....
    Regards,
    John

  • Best practice for TM on AEBS with multiple macs

    Like many others, I just plugged a WD 1TB drive (mac ready) into the AEBS and started TM.
    But in reading here and elsewhere I'm realizing that there might be a better way.
    I'd like suggestions for best practices on how to setup the external drive.
    The environment is...
    ...G4 Mac mini, 10.4 PPC - this is the system I'm moving from; it has all the iPhoto and iTunes libraries and is being left untouched until I get the TM/backup setup tested. But it will go to 10.5 eventually.
    ...Intel iMac, 10.5 soon to be 10.6
    ...Intel Mac mini, 10.5, soon to be 10.6
    ...AEBS with (mac ready) WD-1TB usb attached drive.
    What I'd like to do...
    ...use the one WD-1TB drive for all three backups, AND keep a copy of system and iLife DVD's to recover from.
    From what I'm reading, I should have a separate partition for each mac's TM to backup to.
    The first question is partitioning... Disk Utility sees my iMac's internal HD and DVD drive, but doesn't see the WD 1TB on the AEBS. (When TM is active it appears in Disk Utility, but when TM ends, it drops off the list.)
    I guess I have to connect it via USB to the iMac for the partitioning, right?
    I've also read the benefits of keeping a copy of the install DVD's on the external drive... but this raises more questions.
    How do I get an image of the install DVD onto the 1TB drive?
    How do I do that? (install?, ISO image?, straight copy?)
    And what about the 2nd disk (for iLife?) - same partition, a different one, ISO image, straight copy?
    Can I actually boot from the external WD 1TB while it is connected to the AEBS, or do I have to temporarily plug it in via USB?
    And if I have to boot the OS from USB, once I load it and it wants to restore from the TM backup, do I leave it on USB or move it back to the AEBS? (I've heard the way the backups are created differs local vs. network.)
    I know it's a lot of questions, but here are the two objectives...
    1. Use TM in typical fashion, to recover the occasional deleted file.
    2. The ability to perform a bare-metal point-in-time recovery (not always to the very last backup, but sometimes to a day or two before.)

    dmcnish wrote:
    From what I'm reading, I should have a separate partition for each mac's TM to backup to.
    Hi, and welcome to the forums.
    You can, but you really only need a separate partition for the Mac that's backing-up directly. It won't have a Sparse Bundle, but a Backups.backupdb folder, and if you ever have or want to delete all of them (new Mac, certain hardware repairs, etc.) you can just erase the partition.
    The first question is partitioning... Disk Utility sees my iMac's internal HD and DVD drive, but doesn't see the WD 1TB on the AEBS. (When TM is active it appears in Disk Utility, but when TM ends, it drops off the list.)
    I guess I have to connect it via USB to the iMac for the partitioning, right?
    Right.
    I've also read the benefits of keeping a copy of the install DVD's on the external drive... but this raises more questions.
    Can I actually boot from the external WD 1TB while it is connected to the AEBS, or do I have to temporarily plug it in via USB?
    I don't think so. I've never tried it, but even if it works, it will be very slow. So connect via F/W or USB (the PPC Mac probably can't boot from USB, but the Intels can).
    And if I have to boot the O/S from USB, once I load it and it wants to restore from the TM, do I leave it USB or move it to the AEBS? (I've heard the way the backups are created differ local vs network)
    That's actually two different questions. To do a full system restore, you don't load OSX at all, but you do need the Leopard Install disc, because it has the installer. See item #14 of the Frequently Asked Questions *User Tip* at the top of this forum.
    If for some reason you do install OSX, then you can either "transfer" (as part of the installation) or "Migrate" (after restarting, via the Migration Assistant app in your Applications/Utilities folder) from your TM backups. See the *Erase, Install, & Migrate* section of the Glenn Carter - Restoring Your Entire System / Time Machine *User Tip* at the top of this forum.
    In either case, If the backups were done wirelessly, you must transfer/migrate wirelessly (although you can speed it up by connecting via Ethernet).
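For the partitioning and disc-imaging steps discussed above, a rough command sketch (the disk identifiers disk2/disk3, the partition names, and the sizes are assumptions; run `diskutil list` first to confirm, and note that partitioning erases the drive):

```shell
# With the WD drive attached to the iMac via USB:
diskutil list    # find the WD drive's identifier (assumed to be disk2 below)

# Three journaled HFS+ partitions: one per Mac backing up directly,
# plus one for the install-disc images ("R" = use the remaining space)
diskutil partitionDisk disk2 3 GPT JHFS+ TM-iMac 400G JHFS+ TM-mini 300G JHFS+ Archive R

# Image an install DVD onto the Archive partition (DVD assumed to be disk3)
hdiutil create -srcdevice /dev/disk3 -format UDZO /Volumes/Archive/LeopardInstall.dmg
```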

  • Upscale / Upsize / Resize - best practice in Lightroom

    Hi, I'm using LR 2 and CS4.
    Before I had Lightroom I would open a file in Bridge and in ACR I would choose the biggest size that it would interpolate to before doing an image re-size in CS2 using Bicubic interpolation to the size that I wanted.
    Today I've gone to do an image size increase but since I did the last one I have purchased OnOne Perfect Resize 7.0.
    As I have been doing re-sizing before I got the Perfect Resize I didn't think about it too much.
    Whilst the re-size ran it struck me that I may not be doing this the best way.
    Follow this logic if you will.
    Before:
    ACR > select biggest size > image re-size bicubic interpolation.
    Then with LR2
    Ctrl+E to open in PS (not using ACR to make it the biggest it can be) > image re-size bicubic interpolation.
    Now with LR2 and OnOne Perfect Resize
    Ctrl+E to open in PS > Perfect Resize.
    I feel like I might be "missing" the step of using the RAW engine to make the file as big as possible before I use OnOne.
    When I Ctrl+E I get the native image size (which for the 5D MkII is 4368x2912 px, or 14.56x9.707 inches).
    I am making a canvas 24x20"
    If instead I open from LR as a Smart Object in PS and then double-click the smart object icon, I can click the link at the bottom and choose a size of 6144 by 4096; but when I go back to the main document it is the same size... maybe if I saved that, then opened the saved TIFF and ran OnOne, I would end up with a "better" resized document.
    I hope that makes sense!?!?!?!
    Anyway I was wondering with the combo of software I am using what "best practice" for large scale re-sizing is. I remember that stepwise re-sizing fell out of favour a while ago but I'm wondering what is now the considered best way to do it if you have access to the software that was derived from Genuine Fractals.

    I am indeed. LR3 is a nice-to-have. What I use does the job I need, but I can see the benefits of LR3 - just no cash for it right now.

  • SAP PI conceptual best practice for synchronous scenarios

    Hi,
    Apologies for the length of this post, but I'm sure this is an area most of you have thought about in your journey with SAP PI.
    We have recently upgraded our SAP PI system from 7.0 to 7.1 and I'd like to document best practice guidelines for our internal development team to follow.
    I'd be grateful for any feedback on my thoughts below, which may help to consolidate my knowledge to date.
    Prior to the upgrade we implemented a number of synchronous and asynchronous scenarios using SAP PI as the hub at runtime, configured via the Integration Directory.
    No interfaces to date are exposed directly from our backend systems using transaction SOAMANAGER.
    Our asynchronous scenarios operate through the SAP PI hub at runtime, which builds in resilience and harnesses the benefits of the queue-based approach.
    My queries relate to the implementation of synchronous scenarios where there is no mapping or routing requirement.  Perhaps it's best that I outline my experience/thoughts on the 3 options and summarise my queries/concerns afterwards.
    1) Use the SAP PI Integration Directory.  I appreciate that going through SAP PI at runtime is not necessary and adds latency to the process, but the monitoring capability in transaction SXMB_MONI provides full access for audit purposes, and we have alerting running hourly so all process errors are raised and handled accordingly.  In our SAP PI Production system we have a full record of sync messages, while these don't show in the backend system as we don't have propagation turned on.  When we first looked at this, the reduction in speed seemed to be outweighed by the quality of the monitoring/alerting, given that none of the processes are particularly intensive or require instant responses.  We have some inbound interfaces called by two sender systems, so we have the overhead of maintaining the Integration Repository/Directory design/configuration twice for these systems, but the nice thing is SXMB_MONI shows which system sent the message.  Extra work, but seemingly for improved visibility of the process.  I'm not suggesting this is the correct long-term approach; it just states where we are currently.
    2) Use the Advanced Adapter Engine.  I've heard mixed reviews of this functionality: there are obvious improvements in speed from avoiding the ABAP stack on the SAP PI server at runtime, but some people have complained about the lack of SXMB_MONI support.  I don't know if this is still the case as we're at SAP PI 7.1 EHP1, but I plan to test and evaluate once Basis have set up the prerequisite RFCs etc.
    3) Use the backend system's SOAP runtime and SOAMANAGER.  With this option I can still model inbound interfaces in SAP PI but expose them using transaction SOAMANAGER in the backend ABAP system.  [I would have tested the direct P2P connection option, but our backend systems are still at NetWeaver 7.0 and this option is not supported until 7.1, so that's out for now.]  The clear benefit of exposing the service directly from the backend system is obviously performance, which in some of our planned processes would be desirable.  My understanding is that the logging/tracing options in SOAMANAGER have to be switched on while you investigate, so there is no automatic recording of interface detail for retrospective review.
    Queries:
    I have the feeling that there is no clear-cut answer to which of the options you select from above; the decision should be based upon the requirements.
    I'm curious to understand SAP's intention with these options:
    - For synchronous scenarios, is it assumed that the client should always handle errors, so the lack of monitoring should be less of a concern and option 3 is desirable when no mapping/routing is required?
    - Not only does option 3 offer the best performance, but the generated WSDL, once built, is ready for any further system to implement, thereby offering the maximum benefit of SOA. Should we therefore always use option 3 whenever possible?
    - Is it intended that the AAE runtime should be used when available, but only for asynchronous scenarios or those requiring SAP PI functionality like mapping/routing, and otherwise customers should use option 3?  I accept there are some areas of functionality not yet supported by the AAE, so that would be another factor.
    Thanks for any advice, it is much appreciated.
    Alan
    Edited by: Alan Cecchini on Aug 19, 2010 11:48 AM
    Edited by: Alan Cecchini on Aug 19, 2010 11:50 AM
    Edited by: Alan Cecchini on Aug 20, 2010 12:11 PM

    Hi Aaron,
    I was hoping for a better, more concrete answer to my questions.
    I've had discussion with a number of experienced SAP developers and read many articles.
    There is no definitive paper that sets out the best approach here but I have gleaned the following key points:
    - Make interfaces asynchronous whenever possible to reduce system dependencies and improve the user experience (e.g. by eliminating wait times when they are not essential, such as by sending them an email with confirmation details rather than waiting for the server to respond)
    - It is the responsibility of the client to handle errors in synchronous scenarios, hence the monitoring lost with point-to-point services, compared to the detailed information in transaction SXMB_MONI for PI services, is not such a big issue.  You can always turn on monitoring in SOAMANAGER to trace errors if need be.
    - The choice of integration technique varies considerably by release level (for PI and NetWeaver), so the system landscape will be a significant factor.  For example, we have some systems on NetWeaver 7.0 and others on 7.1.  As you need 7.1 for direct-connection PI services, we'd rather wait until all systems are at the higher level than have mixed usage in our landscape - it is already complex enough.
    - We've not tried the AAE option in a Production scenario yet, but this is only really important for high-volume interfaces, something that is not a concern at the moment.  Obviously cumulative performance may become an issue in time, so we plan to start looking at the AAE soon.
    Hope these comments may be useful.
    Alan

  • Best practices for loading swf's

    Greetings,
    Using CS5 AS2
    I'm creating a website in flash (all the files will be in one directory/folder on SharePoint) and want to make sure that what seems to be working fine is best practice.
    I have an index.swf with many buttons which will take the user to landing pages/content/other swfs. On these different buttons I have the script...
    on (release) {loadMovieNum("name.swf", 0);}                I could also do just {loadMovie("name.swf", 0);} ??
    The movie transitions nicely to name.swf, and on this page I have a button that returns the user to index.swf...
    on (release) {loadMovieNum("index.swf", 0);}   Things move back to index.swf nicely and the user can choose to go to another landing page.
    It looks like I'm on the right track, because nothing is going awry, but I want to check. Am I following best practices for moving from one swf to another within a website?
    Thanks for help or confirmation!!

    loading into _level0 (and you should use loadMovieNum, not loadMovie, when loading into a level) undermines some of the benefits of a flash application:  all the assets must load with each display change and the user sees a flash instead of appearing to transition seamlessly from one display to the next.
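If the index should keep running while content loads on top of it, one alternative is to load into a higher level instead of _level0. An AS2 sketch (the button placement and swf names are illustrative):

```actionscript
// On a landing-page button in index.swf: load content above the index,
// so index.swf (on _level0) stays in memory and doesn't reload.
on (release) {
    loadMovieNum("name.swf", 1);
}

// On a "back" button (living in the loaded swf on _level1):
// unload the content level to reveal the index underneath.
on (release) {
    unloadMovieNum(1);
}
```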

  • Best practice for the use of reserved words

    Hi,
    What is the best practice to observe for using reserved words as column names?
    For example if I insisted on using the word comment for a column name by doing the following:
    CREATE TABLE ...
    "COMMENT" VARCHAR2(4000),
    What impact down the track could I expect and what problems should I be aware of when doing something like this?
    Thank You
    Ben

    Hi, Ben,
    Benton wrote:
    Hi,
    What is the best practice to observe for using reserved words as column names?
    Sybrand is right (as usual): the best practice is not to use them.
    For example if I insisted on using the word comment for a column name by doing the following:
    CREATE TABLE ...
    "COMMENT" VARCHAR2(4000),
    What impact down the track could I expect and what problems should I be aware of when doing something like this?
    Using reserved words as identifiers is asking for trouble. You can expect to get what you ask for.
    Whatever benefits you may get from naming the column COMMENT rather than, say, CMNT or EMP_COMMENT (if the table is called EMP) will be insignificant compared to the extra debugging you will certainly need.
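To illustrate the kind of trouble: once the column is created with quotes, it must be referenced with quotes and the exact same case everywhere afterwards. A sketch against a hypothetical table T (exact error messages vary by release):

```sql
CREATE TABLE t ("COMMENT" VARCHAR2(4000));

SELECT "COMMENT" FROM t;  -- works
SELECT comment   FROM t;  -- fails: COMMENT is a reserved word
SELECT "Comment" FROM t;  -- fails: quoted identifiers are case-sensitive
```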

  • CF10 Production Best Practices

    Is there a document or additional information on the best way to configure multiple instances of CF10 in a production environment? Do most folks install CF10 as a ear/war J2EE deployment under JBoss or Tomcat with Apache as the webserver?

    There’s no such document that I know of, no.
    And here’s a perfect example where “best practices” is such a loaded phrase.
    You wonder if you should “install CF10 as an ear/war J2EE deployment under JBoss or Tomcat with Apache as the webserver”. I’d say the answer to that is “absolutely not”. Most folks do NOT deploy CF as a JEE ear/war. It’s an option, yes. And if you are running a JEE server already, then it does make great sense to deploy CF as an ear/war on said container.
    But would it be a recommended practice for someone installing CF10 without interest in JEE deployment? I’d say not likely, unless they already have familiarity with JEE deployment.
    Now, could one argue “but there are benefits to deploying CF on a JEE container”? Sure, they could. But would it be a “best practice”? Only in the minds of a small minority, I think (those who appreciate the benefits of native JEE deployment and containers). Of course, CF already deploys on a JEE container (Tomcat in CF10, JRun in CF 6-9), but the Standard and Enterprise Server forms of deployment hide all that detail, which is best for most. With those, we just have a ColdFusion directory and are generally none the wiser that it runs on JRun or Tomcat.
    That leads then to the crux of your first sentence: you mention multiple instances. That does change things quite a bit.
    First, a couple of points of clarification before proceeding: in CF 7-9, such “multiple instance” deployment was for most folks enabled using the Enterprise Multiserver form of deployment, which created a JRun4 directory where instances were installed (as distinguished from the Enterprise Server form I just mentioned above, which hid the JRun guts).
    In CF10, though, there is no longer a “multiserver” install option. It’s just that CF10 Enterprise (or the Trial or Developer editions) lets you create new instances, using the same Instance Manager in the CF Admin that existed for CF Enterprise Multiserver in 7-9. CF10 still only lets you create instances with the Enterprise (or Trial or Developer) edition, not Standard.
    (There is a change in CF10 about multiple instances, though: note that in CF10, you never see a Tomcat directory, even if you want “multiple instances”. When you create them, they are created right under the CF10 directory, as siblings to the cfusion directory. And while that cfusion directory previously existed only in the CF 7-9 multiserver form of deployment, in CF10 it exists even in Standard, as the only instance Standard can use.)
    So all that is a lot of info, not any “best practices”, but you asked if there was any “additional info”, and I thought that helpful for you to have as you contemplate your options. (And of course, CF10 Enterprise does still let you deploy as a JEE ear/war if you want.)
    But no, doing that would not be a best practice. If someone asked for “the best way to configure multiple instances of CF10 in a production environment”, I’d tell them to just proceed as they would have in CF 7-9, using the same CF Admin Instance Manager capability to create them (and optionally cluster them).
    All that said, everything about CF10 does now run on Tomcat instead of JRun, and some things are improved under the covers, like clustering (and related things, like session replication), because those are now Tomcat-based features (which are actively updated and used by the Tomcat community), rather than JRun-based (which were pretty old and hardly used by anyone since JRun was EOL-ed several years ago).
    I’ll note that I offer a talk with a lot more detail contrasting CF10 on Tomcat to CF9 and earlier on JRun. That may interest you, snormo, so check out the presentations page at carehart.org.
    Hope all that’s helpful.
    /charlie
    PS: You conclude with a mention of Apache as the web server. And sure, if one is on a *nix deployment or just favors Apache, it’s a fine option. But someone running CF10 on Windows should not be discouraged from running on IIS. It’s come a long way and is now very secure, flexible, and capable, whether used for one or multiple instances of CF.

  • Data access best practice

    Oracle's web site has an article about 9iAS best practices. Predefining column types in the select statement is one of the topics. The details follow.
    3.5.5 Defining Column Types
    Defining column types provides the following benefits:
    (1) Saves a roundtrip to the database server.
    (2) Defines the datatype for every column of the expected result set.
    (3) For VARCHAR, VARCHAR2, CHAR and CHAR2, specifies their maximum length.
    The following example illustrates the use of this feature. It assumes you have
    imported the oracle.jdbc.* and java.sql.* interfaces and classes.
    // ds is a DataSource object
    Connection conn = ds.getConnection();
    PreparedStatement pstmt =
        conn.prepareStatement("select empno, ename, hiredate from emp");
    // Avoid a roundtrip to the database to describe the columns
    ((OraclePreparedStatement) pstmt).defineColumnType(1, Types.INTEGER);
    // Column #2 is a VARCHAR2; we need to specify its maximum length
    ((OraclePreparedStatement) pstmt).defineColumnType(2, Types.VARCHAR, 12);
    ((OraclePreparedStatement) pstmt).defineColumnType(3, Types.DATE);
    ResultSet rset = pstmt.executeQuery();
    while (rset.next()) {
        System.out.println(rset.getInt(1) + "," + rset.getString(2) + "," + rset.getDate(3));
    }
    rset.close();
    pstmt.close();
    conn.close();
    Since I'm new to 9iAS, I'm not sure whether it's true that 9iAS really does an extra roundtrip to the database just for the data types of the columns and then another roundtrip to get the data. Can anyone confirm this? Besides, the above example uses Oracle proprietary interfaces.
    Is there any way to trace the db activities on the application server side without using enterprise monitor tool? Weblogic can dump all db activities to a log file so that they can be reviewed.
    thanks!

    Dear Srini,
    Data-level security is not an issue for me at all. I have already implemented it, and so far not a single bug has been caught in testing.
    It's about object-level security, and that too for 6 different types of users demanding different reports, i.e. the columns and detailed drill-downs are different.
    Again, these 6 types of users can be read-only users or power users (who can do ad hoc analysis), maybe BICONSUMER and BIAUTHOR.
    So I need help regarding that, as we have to take a decision soon.
    thanks,
    Yogen

  • Best practices to generate archivelog

    Hi,
    What are the best practices to generate archivelog?
    Regards,
    RJiv.

    Are you using DataGuard to create & manage the physical standby? If so, what DataGuard protection mode are you using?
    Realistically, there is no ideal frequency at which to generate new archived logs. The faster you generate them, the greater the performance implications, but the less data is potentially at risk. You need to weigh the performance costs and recoverability benefits for your particular application to determine what sort of target to set. That probably requires some benchmarking to quantify the performance costs for your particular application.
    If the goal is minimal data loss, however, running DataGuard in a mode that transfers redo in real time rather than copying archived log files would be preferable and render the discussion on how frequently to generate archived log files moot.
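    If the goal is simply to bound how old the newest archived log can be (rather than shipping redo in real time), one hedged sketch, assuming Oracle 9i or later and an spfile:

```sql
-- Force a log switch (and hence an archived log) at least every 1800 seconds
ALTER SYSTEM SET ARCHIVE_LAG_TARGET = 1800 SCOPE=BOTH;

-- Or force one manually, e.g. just before a backup
ALTER SYSTEM SWITCH LOGFILE;
```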
    Justin

  • Failover cluster File Server role best practices

    We recently implemented a Hyper-V Server Core 2012 R2 cluster with the sole purpose of running our server environment.  I started with our file servers and decided to create multiple file servers and put them in a cluster for high availability.  So now I have a cluster of VMs, which I have now learned is called a guest cluster, and I added the File Server role to this cluster.  It then struck me that I could have just as easily created the File Server role under my Hyper-V Server cluster and removed this extra virtual layer.
    I'm reaching out to this community to see if there are any best practices on using the File Server role.  Are there any benefits to having a guest cluster provide file shares? Or am I making things overly complicated for no reason?
    Just to be clear, I'm just trying to make a simple Windows file server with folder shares that have security enabled on them for users to access internally. I'm using Hyper-V Server Core 2012 R2 on my physical servers, and right now I have Windows Server Standard 2012 R2 on the VMs in the guest cluster.
    Thanks for any information you can provide.

    Hi,
    Generally, with Hyper-V VMs available, we install all roles into virtual machines, as that is easier for management purposes.
    In your situation the host system is Server Core, so it seems that managing file shares from a guest with a GUI is much better.
    I cannot find an article specifically regarding best practices for setting up a failover cluster. Here are 2 articles, one on building a guest cluster (which you have already done) and one on the steps to create a file server cluster.
    Hyper-V Guest Clustering Step-by-Step Guide
    http://blogs.technet.com/b/mghazai/archive/2009/12/12/hyper-v-guest-clustering-step-by-step-guide.aspx
    Failover Cluster Step-by-Step Guide: Configuring a Two-Node File Server Failover Cluster
    https://technet.microsoft.com/en-us/library/cc731844(v=ws.10).aspx

  • BI Best Practice for Chemical Industry

    Hello,
    I would like to know if anyone is aware of an SAP BI Best Practices package for Chemicals. If so, can anyone please post a link as well?
    Thanks

    Hi Naser,
    Below information will helps you in detail explanation regarding Chemical industry....
    SAP Best Practices packages support best business practices that quickly turn your SAP ERP application into a valuable tool used by the entire business. You can evaluate and implement specific business processes quickly - without extensive customization of your SAP software. As a result, you realize the benefits with less effort and at a lower cost than ever before. This helps you improve operational efficiency while providing the flexibility you need to be successful in highly demanding markets. SAP Best Practices packages can benefit companies of all sizes, including global enterprises creating a corporate template for their subsidiaries.
    Extending beyond the boundaries of conventional corporate divisions and functions, the SAP Best Practices for Chemicals package is based on SAP ERP; the SAP Environment, Health & Safety (SAP EH&S) application; and the SAP Recipe Management application. The business processes supported by SAP Best Practices for Chemicals encompass a wide range of activities typically found in chemical industry practice:
    • Sales and marketing
      – Sales order processing
      – Presales and contracts
      – Sales and distribution (including returns, returnables, and rebates, with quality management)
      – Inter- and intracompany processes
      – Cross-company sales
      – Third-party processing
      – Samples processing
      – Foreign trade
      – Active-ingredient processing
      – Totes handling
      – Tank-trailer processing
      – Vendor-managed inventory
      – Consignment processing
      – Outbound logistics
    • Supply chain planning and execution
      – Supply and demand planning
    • Manufacturing planning and execution
      – Manufacturing execution (including quality management)
      – Subcontracting
      – Blending
      – Repackaging
      – Relabeling
      – Samples processing
    • Quality management and compliance
      – EH&S dangerous goods management
      – EH&S product safety
      – EH&S business compliance services
      – EH&S industrial hygiene and safety
      – EH&S waste management
    • Research and development
      – Transformation of general recipes
    • Supplier collaboration
      – Procurement of materials and services (including quality management)
      – Storage tank management
      – E-commerce (Chemical Industry Data Exchange)
    • Enterprise management and support
      – Plant maintenance
      – Investment management
      – Integration of the SAP NetWeaver Portal component
    • Profitability analysis
    More Details
    This section details the most common business scenarios u2013 those that benefit most from the application of best practices.
    Sales and Marketing
    SAP Best Practices for Chemicals supports the following sales and marketingu2013related business processes:
    Sales order processing u2013 In this scenario, SAP Best Practices for Chemicals supports order entry, delivery, and billing. Chemical industry functions include the following:
    u2022 Triggering an available-to-promise (ATP) inventory check on bulk orders after sales order entry and automatically creating a filling order (Note: an ATP check is triggered for packaged material.)
    u2022 Selecting batches according to customer requirements:
    u2022 Processing internal sales activities that involve different organizational units
    Third-party and additional internal processing u2013 In this area, the SAP Best Practices for Chemicals package provides an additional batch production step that can be applied to products previously produced by either continuous or batch processing. The following example is based on further internal processing of plastic granules:
    u2022 Purchase order creation, staging, execution, and completion
    u2022 In-process and post process control
    u2022 Batch assignment from bulk to finished materials
    u2022 Repackaging of bulk material
    SAP Best Practices for Chemicals features several tools that help you take advantage of chemical industry best practices. For example, it provides a fully documented and reusable prototype that you can turn into a productive solution quickly. It also provides a variety of tools, descriptions of business scenarios, and proven configuration of SAP software based on more than 35 years of working with the chemical industry.
    SAP Functions in Detail – SAP Best Practices for Chemicals
    The package can also be used to support external toll processing such as that required for additional treatment or repackaging.
    Tank-trailer processing – In this scenario, SAP Best Practices for Chemicals helps handle the selling of bulk material, liquid or granular. It covers the process that automatically adjusts the differences between the original order quantities and the actual quantities filled in the truck. To determine the quantity actually filled, the tank trailer is weighed before and after loading. The delta weight – the quantity filled – is transmitted to the SAP software via an order confirmation. When the delivery for the sales order is created, the software automatically adjusts the order quantity with the confirmed filling quantity. The customer is invoiced for the precise quantity filled and delivered.
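    The weigh-in/weigh-out adjustment described above is simple delta arithmetic. A minimal sketch (hypothetical figures, plain Python rather than SAP order-confirmation logic):

    ```python
    def confirmed_fill_quantity(weight_before, weight_after):
        """Quantity actually filled = trailer weight after loading
        minus weight before loading."""
        return weight_after - weight_before

    def adjust_order(order_qty, weight_before, weight_after):
        """Replace the original order quantity with the confirmed filling
        quantity, as the delivery and billing documents would be adjusted."""
        filled = confirmed_fill_quantity(weight_before, weight_after)
        return {"ordered": order_qty,
                "delivered": filled,
                "delta": filled - order_qty}

    # Hypothetical: 25.0 t ordered; trailer weighs 14.2 t empty, 38.9 t loaded,
    # so roughly 24.7 t was actually filled and is invoiced.
    result = adjust_order(order_qty=25.0, weight_before=14.2, weight_after=38.9)
    print(result)
    ```
    
    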
    Supply Chain Planning and Execution
    SAP Best Practices for Chemicals supports supply chain planning as well as supply chain execution processes:
    Supply and demand planning – Via the SAP Best Practices for Chemicals package, SAP enables complete support for commercial and supply chain processes in the chemical industry, including support for integrated sales and operations planning, planning strategies for bulk material, and a variety of filling processes with corresponding packaging units. The package maps the entire supply chain – from sales planning to material requirements planning to transportation procurement.
    Supplier Collaboration
    In the procurement arena, best practices are most important in the following scenario:
    Procurement of materials and services – In this scenario, SAP Best Practices for Chemicals describes a range of purchasing processes, including the following:
    • Selection of delivery schedules by vendor
    • Interplant stock transfer orders
    • Quality inspections for raw materials, including sampling requests triggered by goods receipt
    Manufacturing Scenarios
    SAP Best Practices for Chemicals supports the following manufacturing-related business processes:
    Continuous production – In a continuous production scenario, SAP Best Practices for Chemicals typifies the practice used by basic or commodity chemical producers. For example, in the continuous production of plastic granules, production order processing is based on run-schedule headers. This best-practice package also describes batch and quality management in continuous production. Other processes it supports include handling of byproducts, co-products, and the blending process.
    Batch production – For batch production, SAP Best Practices for Chemicals typifies the best practice used by specialty chemical producers. The following example demonstrates batch production of paint, which includes the following business processes:
    • Process order creation, execution, and completion
    • In-process and post-process control
    • Paperless manufacturing using XML-based process integration sheets
    • Alerts and events
    • Batch derivation from bulk to finished materials
    Enterprise Management and Support
    SAP Best Practices for Chemicals also supports a range of scenarios in this area:
    Plant maintenance – SAP Best Practices for Chemicals allows for management of your technical systems. Once the assets are set up in the system, it focuses on preventive and emergency maintenance. Tools and information support the setup of a production plant with assets and buildings.
    Revenue and cost controlling – The package supports the functions that help you meet product-costing requirements in the industry. It describes how cost centers can be defined, attached to activity types, and then linked to logistics. It also supports costing and settlement of production orders for batch and continuous production. And it includes information and tools that help you analyze sales and actual costs in a margin contribution report.
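    As a rough illustration of the margin contribution analysis mentioned above (hypothetical product data in plain Python, not CO-PA configuration): contribution margin is simply revenue less actual cost, typically reported per product with a percentage of revenue.

    ```python
    # Hypothetical revenue and settled actual cost per product.
    products = [
        {"product": "Paint A", "revenue": 120_000, "actual_cost": 85_000},
        {"product": "Granulate B", "revenue": 200_000, "actual_cost": 164_000},
    ]

    for p in products:
        margin = p["revenue"] - p["actual_cost"]
        p["margin"] = margin
        p["margin_pct"] = 100 * margin / p["revenue"]
        print(f'{p["product"]}: contribution {margin} ({p["margin_pct"]:.1f}%)')
    ```
    
    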
    The SAP Best Practices for Chemicals package supports numerous integrated business processes typical of the chemical industry, including the following:
    • Quality management – Supports integration of quality management concepts across the entire supply chain (procurement, production, and sales), including batch recall and complaint handling
    • Batch management – Helps generate batches based on deliveries from vendors or from company production or filling, with information and tools for total management of batch production and associated processes, including batch derivation, the batch information cockpit, and a batch where-used list
    • Warehouse management – Enables you to identify locations where materials or batch lots are stored, recording details such as bin location and other storage information on dangerous goods to help capture all information needed to show compliance with legal requirements
    Regards
    Sudheer

  • Best practice in HCM

    Hi,
    1) Can you please advise what processes come under SAP Best Practices for HCM?
    2) What infotypes come under payroll implementations?
    Thanks!
    Manish

    Hi,
    You can read every thread in this forum.
    1. Personnel Administration – deals with the HR master data.
    2. Organization Management – deals with personnel planning: business units, cost centers, etc. This gives the picture of how your organization looks virtually.
    Keep these modules as the must-know base.
    3. Time Management – manages employee times, leaves, quotas, shifts, working schedules, working patterns, etc.
    CATS (Cross-Application Time Sheets) and Shift Planning are part of Time Management, but they are individually separate; knowledge of them is not necessary when you start your career, though going forward it would help you grow.
    4. Payroll – manages the employee payment part, including off-cycle payments.
    These four can be considered the basic modules of HCM.
    5. Recruitment – deals with the process of recruitment, from manpower planning to on-boarding.
    Apart from these we have:
    6. Benefits – deals with fringe benefits (applicable to the USA and some other countries): insurance, medical, etc.
    7. ESS/MSS – the self-service modules, Employee Self-Service and Manager Self-Service; these are portal-based.
    We also have new-generation modules, such as:
    8. E-Recruitment – portal-based as well.
    Learning Solution
    Succession Planning
    Personnel Development
    Talent Visualization by Nakisa
    Enterprise Compensation Management (again a separate module)
    These are all part of the Talent Management suite and are portal-based modules; knowing them would give you an edge. This area is very upcoming and hot.
    To become an effective HCM consultant you need good business process knowledge combined with in-depth knowledge of some of these modules.
    Outside HCM, knowledge of HR ABAP, MS Excel, and MS Word is a great advantage.
    To start, choose Personnel Management and Org Management combined with Time Management and Payroll, or ESS/MSS with Talent Management.
    Search in transaction PA30- Hr Master data and press F4 , you will find entire details 0000 Actions 0001 Organizational Assignment 0002 Personal Data 0003 Payroll Status 0004 Challenge 0005 Leave Entitlement 0006 Addresses 0007 Planned Working Time 0008 Basic Pay 0009 Bank Details 0011 External Transfers 0014 Recurring Payments/Deductions 0015 Additional Payments 0016 Contract Elements 0017 Travel Privileges 0019 Monitoring of Tasks 0021 Family Member/Dependents 0022 Education 0023 Other/Previous Employers 0024 Qualifications 0025 Appraisals 0027 Cost Distribution 0028 Internal Medical Service 0030 Powers of Attorney 0031 Reference Personnel Numbers 0032 Internal Data 0033 Statistics 0034 Corporate Function 0035 Company Instructions 0037 Insurance 0040 Objects on Loan 0041 Date Specifications 0045 Loans 0048 Residence Status 0050 Time Recording Info 0054 Works Councils 0057 Membership Fees 0077 Additional Personal Data 0078 Loan Payments 0080 Maternity Protection/Parental Leave 0081 Military Service 0083 Leave Entitlement Compensation 0105 Communication 0121 RefPerNo Priority 0123 Germany only 0124 Disruptive Factor D 0128 Notifications 0130 Test Procedures 0139 EE's Applicant No. 0165 Deduction Limits 0167 Health Plans 0168 Insurance Plans 0169 Savings Plans 0171 General Benefits Information 0185 Personal IDs 0219 External Organizations 0236 Credit Plans 0262 Retroactive accounting 0267 Add. Off-Cycle Payments w/Acc.***. 0267 Additional Off-Cycle Payments 0283 Archived Objects 0290 Documents and Certificates (RU) 0292 Add. Social Insurance Data (RU) 0293 Other and Previous Employers (RU) 0294 Employment Book (RU) 0295 Garnishment Orders (RU) 0296 Garnishment Documents (RU) 0297 Working Conditions (RU) 0298 Personnel Orders (RU) 0299 Tax Privileges (RU) 0302 Additional Actions 0315 Time Sheet Defaults 0330 Non-Monetary Remuneration 0334 Suppl. 
it0016 (PT) 0376 Benefits Medical Information 0377 Miscellaneous Plans 0378 Adjustment Reasons 0379 Stock Purchase Plans 0380 Compensation Adjustment 0381 Compensation Eligibility 0382 Award 0383 Compensation Component 0384 Compensation Package 0395 External Organizational Assignment 0396 Expatriation 0402 Payroll Results 0403 Payroll Results 2 0415 Export Status 0416 Time Quota Compensation 0429 Position in PS 0439 Data Transfer Information 0458 Monthly Cumulations 0459 Quarterly Cumulations 0460 Annual Cumulations 0468 Travel Profile (not specified) 0469 Travel Profile (not specified) 0470 Travel Profile 0471 Flight Preference 0472 Hotel Preference 0473 Rental Car Preference 0474 Train Preference 0475 Customer Program 0476 Garnishments: Order 0477 Garnishments: Debt 0478 Garnishments: Adjustment 0483 CAAF data clearing (IT) 0484 Taxation (Enhancement) 0485 Stage 0491 Payroll Outsourcing 0503 Pensioner Definition 0504 Pension Advantage 0529 Additional Personal Data for (CN) 0552 Time Specification/Employ. 
Period 0553 Calculation of Service 0559 Commuting allowance Info JP 0560 Overseas pay JP 0565 Retirement Plan Valuation Results 0567 Data Container 0569 Additional Pension Payments 0573 Absence for Australia PS 0576 Seniority for Promotion 0579 External Wage Components 0580 Previous Employment Tax Details 0581 Housing(HRA / CLA / COA) 0582 Exemptions 0583 Car & Conveyance 0584 Income From Other Sources 0585 Section 80 Deductions 0586 Section 80 C Deductions 0587 Provident Fund Contribution 0588 Other Statutory Deductions 0589 Individual Reimbursements 0590 Long term reimbursements 0591 Nominations 0592 Public Sector - Foreign Service 0593 Rehabilitants 0597 Part Time Work During ParentalLeave 0601 Absence History 0602 Retirement Plan Cumulations 0611 Garnishments: Management Data 0612 Garnishments: Interest 0614 HESA Master Data 0615 HE Contract Data 0616 HESA Submitted Data 0617 Clinical Details 0618 Academic Qualification 0624 HE Professional Qualifications 0648 Bar Point Information 0650 BA Statements 0651 SI Carrier Certificates 0652 Certificates of Training 0653 Certificates to Local Authorities 0655 ESS Settings Remuneration Statement 0659 INAIL Management 0666 Planning of Pers. Costs 0671 COBRA Flexible Spending Accounts 0672 FMLA Event 0696 Absence Pools 0702 Documents 0703 Documents on Dependants 0704 Information on Dependants 0705 Information on Checklists 0706 Compensation Package Offer 0707 Activation Information 0708 Details on Global Commuting 0709 Person ID 0710 Details on Global Assignment 0712 Main Personnel Assignment 0713 Termination 0715 Status of Global Assignment 0722 Payroll for Global Employees 0723 Payroll for GE: Retro. 
Accounting 0724 Financing Status 0725 Taxes SA 0742 HDB Concession 0745 HDB Messages in Public Sector 0746 De Only 0747 DE Only 0748 Command and Delegation 0758 Compensation Program 0759 Compensation Process 0760 Compensation Eligibility Override 0761 LTI Granting 0762 LTI Exercising 0763 LTI Participant Data 0783 Job Index 0784 Inquiry Family Court 0785 Pension Equalization Payment 0787 Germany Only 0788 Germany Only 0789 Germany Only 0790 Germany Only 0792 Organizational Additional Data 0794 Pensioner Message A 0795 Certification and Licensing 0796 Duty Assignments 0800 Material Assignment 0802 Sanctions / Offense 0803 Seniority Ranked List 0804 Personal Features 0805 Honors 0806 Course Data 0813 Historical Additional Fees A 0815 Multiple Checks in One Cycle 0845 Work Relationships 0846 Reimbursements 0851 Shukko Cost Charging 0852 Shukko Cost Charging Adjustment 0853 Shukko External Org. Assignment 0860 Sanctions / Offense 0861 Award/Decorations 0863 Verdict 0865 Mobility 0873 Additional Amount - Garnishment FR 0875 Events - My Simplification 0881 Expense Information 0882 Insurability Basic Data 0883 Entitlement Periods 0884 Insurability Calculation 0887 Garnishments (ES) 0900 Sales Data 0901 Purchasing Data 0904 Override Garnishable Amount D 0908 Info. about Annual Income Check 0942 Capital Payment 0976 Municipal Tax per Person 0978 Pension Contribution A 0979 Pension A 2001 Absences 2002 Activity Allocation (Attendances) 2002 Attendances 2002 Cost Assignment (Attendances) 2002 External Services (Attendances) 2002 Order Confs.(Att) 2003 Substitutions 2003 Substitutions: Indiv. Working Times 2004 Availability 2005 Overtime 2006 Absence Quotas 2007 Attendance Quotas 2010 Cost Allocation (EE Rem. Info) 2010 Cost Assignment (EE Rem. 
Info) 2010 Employee Remuneration Info 2011 Completed Time Events 2011 Time Events 2011 Time Events (CO) 2011 Time Events (PM) 2011 Time Events (PP) 2012 Time Transfer Specifications 2013 Quota Corrections 2050 Annual Calendar 2051 Monthly Calendar 2052 Absence Recording 2052 List Entry for Attendances/Absences 2052 Weekly Calendar w/Cost Assignment 2052 Weekly Entry w/Activity Allocation 3003 Materials Management 3202 Addresses 3215 SWF Staff Details 3216 SWF Contract Details 3217 SWF Qualifications 3893 Time Account Status 3894 Factoring Information BPO 
    Thanks and Regards,
    Revathi.

  • SAP Best Practice for Chemicals in an oil refinery

    Does anyone have experience in implementing the SAP Best Practice for Chemicals in an oil refinery that will also use IS-OIL?
    What would be the pros and cons?
    What is the implementation price of considering a best practice solution for a large complex organization?
    Oded Dagan
    Oil Refineries LTD.
    Israel

    Hi Oded,
    I don't know of any Best Practices for Chemicals implementation in a refinery so far.
    But you can use Best Practices for Chemicals with SAP IS-Oil for the non-IS-Oil functionality as well.
    Best Practices for Chemicals gives you benefits within standard business processes, but for the IS-Oil business processes you have to consider the traditional implementation methods.
    Best Practices for Chemicals gives you a broad business process set, as usually used in a chemical corporation. If you can cover 50% of your needed business processes out of Best Practices for Chemicals, you save approximately 50% of the implementation time. And it is not only implementation: you save a lot in documentation and training material as well. Most of our Best Practices for Chemicals implementations used 60-80% of the package. At a large corporation the percentage of standard ERP processes is normally smaller, because of other additionally needed SAP solutions, e.g. APO, SRM, CRM, etc.
    regards Rainer

  • Require official Oracle Best Practices about PSU patches

    A customer complained about the following
    Your company statements are not clear...
    On your web page - http://www.oracle.com/security/critical-patch-update.html
    The following is stated!
    Critical Patch Update
    Fixes for security vulnerabilities are released in quarterly Critical Patch Updates (CPU), on dates announced a year in advance and published on the Oracle Technology Network. The patches address significant security vulnerabilities and include other fixes that are prerequisites for the security fixes included in the CPU.
    The major products patched are Oracle Database Server, Oracle Application Server, Oracle Enterprise Manager, Oracle Collaboration Suite, Oracle E-Business Suite, PeopleSoft Enterprise Tools, PeopleSoft CRM, JD Edwards EnterpriseOne, JD Edwards OneWorld XE, Oracle WebLogic Suite, Oracle Communications and Primavera Product Suite.
    Oracle recommends that CPUs be the primary means of applying security fixes to all affected products as they are released more frequently than patch sets and new product releases.
    BENEFITS
    * Maximum Security—Vulnerabilities are addressed through the CPU in order of severity. This process ensures that the most critical security holes are patched first, resulting in a better security posture for the organization.
    * Lower Administration Costs—Patch updates are cumulative for many Oracle products. This ensures that the application of the latest CPU resolves all previously addressed vulnerabilities.
    * Simplified Patch Management—A fixed CPU schedule takes the guesswork out of patch management. The schedule is also designed to avoid typical "blackout dates" during which customers cannot typically alter their production environments.
    PROGRAM FEATURES
    * Cumulative versus one-off patches—The Oracle Database Server, Oracle Application Server, Oracle Enterprise Manager, Oracle Collaboration Suite, Oracle Communications Suite and Oracle WebLogic Suite patches are cumulative; each Critical Patch Update contains the security fixes from all previous Critical Patch Updates. In practical terms, the latest Critical Patch Update is the only one that needs to be applied if you are solely using these products, as it contains all required fixes. Fixes for other products, including Oracle E-Business Suite, PeopleSoft Enterprise Tools, PeopleSoft CRM, JD Edwards EnterpriseOne, and JD Edwards OneWorld XE are released as one-off patches, so it is necessary to refer to previous Critical Patch Update advisories to find all patches that may need to be applied.
    * Prioritizing security fixes—Oracle fixes significant security vulnerabilities in severity order, regardless of who found the issue—whether the issue was found by a customer, a third party security researcher or by Oracle.
    * Sequence of security fixes—Security vulnerabilities are first fixed in the current code line. This is the code being developed for a future major release of the product. The fixes are scheduled for inclusion in a future Critical Patch Update. However, fixes may be backported for inclusion in future patch sets or product releases that are released before their inclusion in a future Critical Patch Update.
    * Communication policy for security fixes—Each Critical Patch Update includes an advisory. This advisory lists the products affected by the Critical Patch Update and contains a risk matrix for each affected product.
    * Security alerts—Security alerts provide a notification designed to address a single bug or a small number of bugs. Security Alerts have been replaced by scheduled CPUs since January 2005. Unique or dangerous threats can still generate Security Alert email notifications through MetaLink and the Oracle Technology Network.
    Nowhere in that statement is the Patch Set Update even mentioned. If Oracle intends to recommend to all customers that Patch Set Updates are the recommended means of Patching for Security and Functionality then it should be stated so here!
    Please clarify!
    Where can I find the current information so that I can use the official Oracle statement as a reference for my Enterprise Practices and Standards document? The individual patch package references you are giving me do not state an Oracle-recommended best practice; they only speak to the specific patch package they describe. These do not help me in making an enterprise statement of practices and standards.
    I need to close the process out to capture a window of availability for Practices and Standards approval.
    Do we have any Best Practice document about PSU patches available for customers?

    cnawrati wrote:
    A customer complained about the following
    Your company statements are not clear...
    On your web page - http://www.oracle.com/security/critical-patch-update.html
    Who is the "your" to which you are referring?
    <snip>
    Nowhere in that statement is the Patch Set Update even mentioned. If Oracle intends to recommend to all customers that Patch Set Updates are the recommended means of Patching for Security and Functionality then it should be stated so here!
    Um. OK.
    Please clarify!
    Of whom are you asking for a clarification?
    Where can I find the current information so that I can use to Official Oracle statement as a reference for my Enterprise Practices and Standards document? The individual patch package references you are giving me do not state Oracle recommended Best Practice, they only speak to the specific patch package they describe. These do not help me in making an Enterprise statement of Practices and Standards.
    Who is the "you" to which you refer?
    I need to close the process out to capture a window of availability for Practices and Standards approval.
    Be our guest.
    Do we have any Best Practice document about PSU patches available for customers?
    What do you mean "we", Kemosabi?
    This is a very confusing posting, but overall it looks like you are under the impression that this forum is some kind of channel for communicating back to Oracle Corp anything that happens to be on your mind about their corporate web site and/or policies and practices. Please be advised that this forum is simply a platform provided BY Oracle Corp as a peer-operated user support group. No one here is responsible for anything on any Oracle web site. No one here is responsible for any content anywhere in the oracle.com domain, outside of their own personal postings on this forum. In other words, you can complain all you want about Oracle's policy, practice, and support, but "there's no one here but us chickens."
