Best Practice links for HCM/HR

Hello Gurus...
I am trying to find some best practices for HR/HCM for a BI implementation, but the links I got from SDN threads don't work.
Can anyone please point me to the correct URL?
Also, is there any difference between HR and HCM?
Is there any change in HR between R/3 4.6 and ECC? I have documents for 4.6.
Thank you,
Kris

http://help.sap.com/bp_bw370/html/index.htm
then Business Intelligence -> Preconfigured Scenarios ... here you can find best practices for HCM.
HCM is the newer terminology used in NW BI for what was called HR in BW.
There are some differences between 4.6 and ECC, but most of the ECC content is also available as 3.x content.
http://help.sap.com/saphelp_nw70/helpdata/en/2a/77eb3cad744026e10000000a11405a/frameset.htm

Similar Messages

  • Best JRE link for a consumer JNLP product

    I have a JNLP-deployed, consumer product close to being done. Supported platforms will be Windows & Snow Leopard. For Windows, I need a link to direct customers to get a runtime. I can make NO assumption that they even know what Java is. I want the experience to be as simple as the Acrobat or Flash runtime.
    The standard spot I use for myself, http://java.sun.com/javase/downloads/index.jsp , has all kinds of stuff that they will have no idea about. Giving them choices they do not understand just as a prerequisite to installing your product seems like a really bad idea.
    The standard page points to https://cds.sun.com/is-bin/INTERSHOP.enfinity/WFS/CDS-CDS_Developer-Site/en_US/-/USD/ViewProductDetail-Start?ProductRef=jre-6u18-oth-JPR@CDS-CDS_Developer
    This has the release baked into the GET URL. Will it suddenly stop working when u19 is out? Is this page meant to be visited directly? I have heard that there is, or will be, a Java store, but I am using a lease / monthly subscription business model, and already have my own credit-card back-end interface. Where do users of that store get their JRE?

    I have similar questions. I have a game that is a Java applet. On the main webpage (no applet on this page) there is a "play" button. Clicking the play button loads a new tab/window with the applet on it. I am using Nimbus, so the user needs the 1.6.0_10 or later plugin.
    I am trying to make things as easy for users as possible, and so far, the solutions I have found have subtle problems.
    If I use the normal deployJava.js script from Sun, when a user clicks on my "play" link, instead of going to the applet page it takes them to the Sun page to download the plugin. This seems very confusing to me. If you know nothing about Java, and click on a "Play" link, and see a totally unrelated page asking you to download something, wouldn't that be really confusing? I know if my dad did this, there is no way he would click the download link because it seems very suspicious.
    The previous post recommends using deployJava.setInstallerType("kernel"); to make it easier, but a little googling makes it sound like this is no less confusing. I tried it, and instead of a web page, it pops up a window saying something like "you have chosen to download...", which also seems confusing.
    I followed the example from http://java.sun.com/docs/books/tutorial/deployment/deploymentInDepth/ensuringJRE.html to make it pop up a message, but when using IE it says the check passes and loads the applet with the outdated JRE plugin. It works in Firefox, but the title bar, etc. are ugly/messy (with the whole URL displaying, etc.).
    Suggestions?

  • Best practices for GATP by SAP

    Hi all,
    I am not able to download the best practices for GATP by SAP from http://help.sap.com/bp_scmv250/BBLibrary/HTML/ATP_EN_DE.htm. It seems the documents have been removed. Could someone who has already downloaded them share them with me?
    Also, can you provide working links for best practices for SNP and PP/DS?
    Thank you,
    Ram

    Hello Ram
    Please check this wiki page - it has good content and some useful links
    APO-GATP General Information - Supply Chain Management (SCM) - SCN Wiki
    and
    Find out more on the RDS solution for GATP at: http://service.sap.com/rds-gatp
    If you search http://service.sap.com/bestpractices you will find documents about best practices for GATP. The help.sap.com content for GATP is also a good resource to start with.
    You can also read the blog below, written by me:
    Global Available To Promise (GATP) Overview
    Hope this will help
    Thank you
    Satish Waghmare

  • ADF Faces & BC: Best practices for project layout

    Season greetings my fellow JDevelopers!
    Our software group has been working with ADF for around 5 years, and through the years we have accumulated a good amount of knowledge working with JDeveloper and ADF. Much of our current application structure dates back to the early days of JDeveloper 10, when there was more sample code floating around than "best practice" documentation. I understand this is a subjective topic and varies site to site, but I believe there is a set of common practices our group has started to identify as critical to streamlining a development process (reusable decorated UI components, modular common business logic, team development with SVN, continuous integration/build, etc.). One of our development goals is to minimize dependencies between engineers, as everyone is responsible for both the client and middle layer implementation, without losing coding consistency. After speaking with a couple of the ACEs at the last OpenWorld, I understand that much of our anticipated architectural requirements are met by JDeveloper 11 (with the introduction of templates, declarative components, bounded task flows, etc.), but due to time constraints on upcoming deliverables we are still about a year away from moving to that new release. The following is a little bit about our group/application.
    JDeveloper version: 10.1.3.4
    Number of developers: 7
    Developer responsibilities: Build both Faces & BC code
    We have two applications currently in our production environments.
    1. A flavor of Steve Muench's dynamic JDBC credentials login module
    2. Core ADF Faces & BC application
    In our Core ADF Faces application, we have the following structure:
    OurApplication
         -OurApplicationLib (Common framework files)
         -OurApplicationModel (BC project)
              -src/org/ourapp/module1
              -src/org/ourapp/module2
         -OurApplicationView (Faces project)
              public_html/ourapp/module1
              public_html/ourapp/module2
              src/org/ourapp/backing/module1
              src/org/ourapp/backing/module2
              src/org/ourapp/pageDefs/
    Total Number of Application Modules: 15 (Including one RootApplicationModule which references module specific AMs)
    Total Number of View Objects: 171
    Total Number of Entities: 58
    Total Number of BC Files: 1734
    Total Number of JSPs: 246
    Total Number of pageDefs: 236
    Total Number of navigation cases in faces-config.xml: 127
    Total Number of application files: 4183
    Total application size: 180 MB
    Are there any other ways to divide up this application? I.e., module-specific projects with separate faces-config files/DataBindings? If so, how can these files be "hooked" together? A couple of the ACEs have recommended that we separate all the entity files into their own project, which makes sense. Also, we are looking into Maven builds, which should remove those pesky model.jpr files that constantly get "touched". I would love to hear how other groups are organizing their applications, and anything else they would like to share as an ADF best practice.
    Cheers,
    Wes

    After discussions over the summer/autumn by members of the ADF Methodology Group I have published an ADF Coding Standards wiki page that people may find useful:
    [http://wiki.oracle.com/page/ADF+Coding+Standards]
    It's aimed at ADF 11g and is intended to be a living document - if you have comments or suggestions please post them to the ADF Methodology google group ( [http://groups.google.com/group/adf-methodology?hl=en] ).

  • Any Oracle best practices/standards for inter-data-center links for Oracle RAC

    Hello Oracle Experts,
    I am working for a customer to set up an Oracle RAC architecture hosting SAP/non-SAP applications per SLA-level (MC/BC/Standard) specs. Currently my network team needs a calculation to decide whether we will go for one (1), two (2) or three (3) 10Gig links between the data centers (DCs) for Oracle RAC. Below is additional background:
    •     Porting all client SAP/Non-SAP Oracle databases to new 2 data-centers.
    •     There will be 10 blades (4x BL680s and 6x BL460s) in each DC (can scale-up/out later on).
    •     Cluster architecture to support the Extended/Stretched RAC feature
    •     Clusters are 2-node each (one node in datacenter1, one in datacenter2), with nodes distributed across 2 x c7000 enclosures such that no cluster has more than one node in an enclosure.
    •     Each node will have - 4 NIC ports ( 2 x public and 2 x private) , 2 dual-port HBA
    •     Oracle ASM/ACFS (ASM Cluster File System), Voting Disk, OCR and Database files
    •     The versions are Oracle 11g RAC, Oracle 10g RAC and Oracle 9i (for Data Guard/standby) on RHEL 6 on ProLiant blades (x86) + BladeMatrix
    My network colleagues are considering using DWDM across the 2 DCs (given the lower cost?). I am still looking around for any Oracle/industry best practices on this, and for a calculation to support the decision.
    Many Thanks in advance..
    Regards,
    Abhijit

  • Best practices for ODI interfaces

    I was wondering how everyone is handling the errors that occur when running an interface with ODI.
    Our scenario:
    We have customer data that we want to load each night via ODI. The data is in a flat file and a new file is provided each night.
    We came across an issue where a numeric field contained non-numeric data, so ODI created a bad file containing the bad record, plus an error file with the error message. We also had some defined constraints that forced records into the E$ table.
    My question is: how does everyone handle looking for these errors? We would like them all reported in one place (an Oracle table), so that when the process runs we can just look at that one table and act on the issues. As shown above, ODI puts errors in two different places: database errors in a flat file, and user-defined constraint violations in the E$ tables.
    I was wondering if anyone has come across this issue and might be able to tell me what was done to handle the errors that occur, or what the best practices might be for handling these errors?
    Thanks for any assistance.
    Edited by: 832187 on Sep 29, 2011 1:18 PM

    If you have only a few fields affected by the conversion problem, you could try adding an ODI constraint, or you could modify the LKM to load the bad file if present.
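    As a rough illustration of the single reporting table asked about above, constraint violations can be copied out of an E$ table into one consolidated audit table. This is only a sketch: load_errors, the interface name and e$_customer are hypothetical, and the E$ column names used here (cons_name, err_mess, check_date) vary slightly between ODI releases:
        -- hypothetical consolidated audit table
        create table load_errors (
          interface_name varchar2(128),
          err_source     varchar2(30),    -- 'E$ TABLE' or 'BAD FILE'
          err_message    varchar2(4000),
          err_date       date
        );

        -- copy constraint violations from one interface's error table
        insert into load_errors (interface_name, err_source, err_message, err_date)
        select 'INT_LOAD_CUSTOMER', 'E$ TABLE',
               cons_name || ': ' || err_mess, check_date
          from e$_customer;
    The database-level rejects in the .bad file could be exposed through an external table over that file and inserted into the same audit table, so both kinds of errors end up in one place.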

  • Best practices for customizing with respect to performance

    Hello,
    I would like to know the list of best practices for customizing BPC NW 7.5 with respect to performance.
    Best regards
    Bastien

    Hi,
    There are a few how-to guides on SDN which will give you a basic idea of script logic. Apart from this, you can refer to the help guide on help.sap.com.
    The templates might also affect performance. The number of EVDRE functions, the number of expansion dimensions, and the number of members on which expansion takes place will all affect performance. Complex formatting in the template will also have an effect.
    Hope this helps.

  • Link for Best practice

    Can anybody tell me the link for the Best Practices for the steel industry?
    Regards

    http://help.sap.com/bp_bblibrary/600/BBlibrary_start.htm
    Here choose Industry & Country...
    But I don't think SAP provides a Best Practices package for steel.
    Edited by: Manohar Raju on Apr 29, 2008 12:28 PM

  • Best practices for RMAN backup management for many databases

    Dear all,
    We have many 10g databases (>40) hosted on multiple Windows servers, which are backed up using RMAN.
    A year ago, all backups were implemented through Windows Scheduled Tasks using some batch files.
    We have been busy (re)implementing / migrating these backups to Grid Control.
    I personally prefer to maintain the backup management in Grid Control, but a colleague now wants to go back to the batch files.
    What I am looking for here is advice on the management of RMAN backups for multiple databases: do you use Grid Control, a third-party backup management tool, or your own home-made solution?
    One of the discussion topics is the work involved in case the central backup location changes.
    Well... any real-life advice on best practices / strategies for RMAN backup management for many databases will be appreciated!
    Thanks,
    Thierry

    Hi Thierry,
    Thierry H. wrote:
    Thanks for your reaction.
    So, I understand that you do not use Grid Control to manage the backups, and as a consequence you also have no 'direct' overview of the job schedules.
    One of my concerns is also to avoid too many backups starting at the same time, to prevent network/storage overload. Such an overview is available in Grid Control's Jobs screen.
    And, based on your strategy, do you recreate a 'one-time' Oracle scheduled job for every backup, or do your scripts create an Oracle job with a repeating schedule?
    You're very welcome!
    Well, Grid Control is not an option for us, since each customer is in a separate infrastructure, with their own licensing. I have no real way (in contrast to your situation) to have a centralized point of control, but on the other hand that means I don't have to consider network/storage congestion like you have to.
    The script is run from a "permanent" job within the database scheduler, created like this:
    begin
      dbms_scheduler.create_job(
        job_name        => 'BACKUP',
        job_type        => 'EXECUTABLE',
        job_action      => '/home/oracle/scripts/rman_backup.sh',
        start_date      => trunc(sysdate) + 1 + 7/48,    -- tomorrow at 03:30
        repeat_interval => 'trunc(sysdate) + 1 + 7/48',  -- every day at 03:30
        enabled         => true,
        auto_drop       => false,
        comments        => 'execute backup script at 03:30');
    end;
    /
    The master script then determines which backup level to use, based on the weekday taken from the OS. The actual job schedule (start date, run interval, etc.) is set together with the customer IT/IS dept, to avoid congestion on the backup resources.
    I have no overview of the backup status, run times etc., but I have made monitoring scripts that alert me if/when a backup either fails or runs for too long. This, together with scheduled disaster/recovery tests, makes me sleep rather well at night.. ;-)
    I realize that there might be better ways of doing backup scheduling in your environment, since my requirements are so completely different from yours, but I guess we all face the same challenge of unifying the environments as much as possible, to minimize the amount of actual work we have to do. :-)
    Good luck!
    //Johan
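    A minimal sketch of the kind of monitoring query Johan describes, assuming the job name used above and the standard scheduler history view (the actual alerting wrapper is left out):
        -- list the last day's runs of the backup job that did not succeed
        select job_name, status, actual_start_date, run_duration
          from dba_scheduler_job_run_details
         where job_name = 'BACKUP'
           and status <> 'SUCCEEDED'
           and actual_start_date > systimestamp - interval '1' day;
    A second check comparing run_duration against an expected maximum would cover the "runs for too long" case.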

  • VMware Data Recovery best practices

    Hi,
    I am looking for VMware Data Recovery best practices. Everywhere on the internet I find the following link: http://viops.vmware.com/home/docs/DOC-1551
    But this is no longer a valid link, and I can't find the document anywhere...
    Thanks

  • Remove link for uploaded files in application

    Hi all,
    I've been following the tutorial on how to upload and download files in an application at http://download-uk.oracle.com/docs/cd/B32472_01/doc/appdev.300/b32469/up_dn_files.htm. I have now created my own table which stores the uploaded files, and I can download them successfully. I would now like to add the ability to delete these files within the application, and was wondering what the best way of doing this is.
    I would like to add another column to the displayed report with a 'Remove file' link for each listed file, which when clicked would delete the file from the table. I've already tried using a method similar to the one used to upload the files (instead of uploading the file, the called procedure deletes it from the table using the specified ID), but this causes the application to display a blank page when the link is clicked, which I don't want (although it does delete the file).
    I would appreciate any suggestions.
    Regards,
    Dave

    Dave,
    See the login page:
    http://htmldb.oracle.com/pls/otn/f?p=31517:101
    There are details on how to access the application builder, where you can see how that download, delete, upload application has been set up.
    Denes Kubicek
    http://deneskubicek.blogspot.com/
    http://htmldb.oracle.com/pls/otn/f?p=31517:1
    -------------------------------------------------------------------
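    For reference, the blank page Dave describes usually comes from the called procedure producing no output and not redirecting after the delete. A minimal sketch, assuming a hypothetical uploaded_files table and passing the report page URL in (names are illustrative, not taken from the tutorial):
        create or replace procedure delete_uploaded_file (
          p_id         in number,
          p_return_url in varchar2)
        as
        begin
          -- remove the stored file, then send the browser back to the report page
          delete from uploaded_files where id = p_id;
          commit;
          owa_util.redirect_url(p_return_url);
        end delete_uploaded_file;
        /
    Within APEX itself, a report link that sets a hidden item plus a conditional page process doing the DELETE achieves the same thing without ever leaving the page.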

  • How to create a link for a PDF file in a Discoverer worksheet

    Hi All,
    I would like to create a link to a PDF file in a Discoverer report (worksheet),
    so that I can open the linked PDF file using the worksheet link.
    Can anyone tell me how to do this?
    Thanks
    Ravi

    Hi,
    I think the best way to do this is to use a database directory so that you can access your PDF file as a BFILE. Then create a mod_plsql procedure that sets the MIME type and downloads the PDF file. You then create a calculation in the EUL that returns the URL of your mod_plsql procedure. You can then include this calculation as a hyperlink in your worksheet.
    It is not as difficult as it sounds. Please let me know if you need any more details.
    Rod West
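    A rough sketch of the mod_plsql procedure Rod outlines, assuming a hypothetical PDF_DIR database directory and that the file name is passed in as a parameter (the EUL calculation would then just build the URL to this procedure):
        create or replace procedure download_pdf (p_filename in varchar2)
        as
          l_bfile bfile := bfilename('PDF_DIR', p_filename);
          l_blob  blob;
          l_dest  integer := 1;
          l_src   integer := 1;
        begin
          -- copy the BFILE into a temporary BLOB so it can be streamed back
          dbms_lob.createtemporary(l_blob, true);
          dbms_lob.open(l_bfile, dbms_lob.lob_readonly);
          dbms_lob.loadblobfromfile(l_blob, l_bfile, dbms_lob.lobmaxsize, l_dest, l_src);
          dbms_lob.close(l_bfile);
          -- set the MIME type and hand the content to mod_plsql for download
          owa_util.mime_header('application/pdf', false);
          htp.p('Content-Length: ' || dbms_lob.getlength(l_blob));
          owa_util.http_header_close;
          wpg_docload.download_file(l_blob);
        end download_pdf;
        /
    The EUL calculation would then return something like 'http://yourserver/pls/dad/download_pdf?p_filename=' || filename_column, where the server and DAD names are placeholders for your own setup.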

  • HT4664 What is the best graphics card for FCPX?

    In the nonstop anti-FCPX propaganda is an article of interest — posted 7/9/12 — comparing the benchmarks of FCPX and PP6.
    http://www.streamingmedia.com/Producer/Articles/ReadArticle.aspx?ArticleID=83582&PageNum=1
    The system used was a 2 x 2.93 GHz Quad-Core Mac Pro from early 2009 running MacOS X version 10.7.4 with 12 GB of RAM and an NVIDIA Quadro FX 4800 graphics card with 1.5 GB of onboard RAM.
    In most cases PP6 outperformed FCPX with this configuration. However, in the comments Ben Balser pointed out that FCP X's A/VFoundation engine wasn't ideal on the NVIDIA card:
    "Quadro is actually not the best card for FCP X's A/VFoundation engine, but great for CS6's Mercury engine, so the test is amazingly flawed right there. Try both on a 5780 card and watch things drastically change. I've done that test myself. Exporting to Compressor uses a MUCH more sophisticated encoding engine meant for higher level, professional transcoding, not simple outputs, which are faster using Export Media…"
    Apple lists this card on its support page: http://support.apple.com/kb/HT4664
    So,
    What is the best graphics card for FCPX?

    Ben,
    Thanks for chiming in on that article. It would be good to have a benchmark comparison with the two systems each with a preferred card.
    I'm hoping to see some other comparisons on this thread. Also, some links to other articles about best practices and configurations.

  • Problem providing a download link for BLOB data in an APEX report

    Hi,
    I am trying to create a report on a table with BLOB data so that the report displays a "download" link for the BLOB data. I am trying to give users the ability to download files from a website.
    I have been following the instructions in OBE "Defining and Viewing BLOB Data in Oracle Application Express 3.1".
    In the report, for the BLOB column, I have specified values for the following fields in Report Attributes->Column Formatting->Number/Date Format->BLOB Download Format Mask:
    BLOB Table
    BLOB Column
    Primary Key Column 1
    MIME Type Column
    Filename Column
    BLOB Last Updated Column
    However, when I run the report, I get the following error, even though there is data loaded into the table:
    report error:
    ORA-01403: no data found
    When I remove the BLOB Download Format Mask information for the BLOB column, then the data displays in the report, but without the "download" link for the BLOB data (which is expected):
    E.g.
    Title Author Created BLOB_CONTENT
    Siebel & RAC Best Practices     Erik.Peterson     17-SEP-08     [datatype]
    Siebel Grid Case Studies     Mark.MacDonald     17-SEP-08     [datatype]
    Siebel CRM Applications protected by Oracle Clusterware     Kelly.Tan     17-SEP-08 [datatype]
    Oracle Grid: The Optimal Platform for Siebel Solutions Mark.MacDonald 17-SEP-08     [datatype]
    Any ideas why I'm seeing the ORA-01403 error when I specify the BLOB Download Format Mask?
    Edited by: user716599 on Sep 18, 2008 12:31 AM

    Good morning,
    Here is how I have solved this problem.
    1. The SELECT statement in the SQL for the report should not include the BLOB column. I decided to select only two columns: the column that has the key and the column with the filename.
    2. On the first column (the primary key) I put in a format statement that was simply DOWNLOAD:TABLENAME:BLOB_COLUMN:PRIMARY_KEY
    This works. I believe the Oracle error I was getting occurred because I was trying to apply this format statement to the actual BLOB column.
    So, it appears that you can use any of the columns in the report to hold the DOWNLOAD format statement, since the format statement itself defines the BLOB table, the BLOB column and the primary key.
    Hope this helps,
    Don.
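    A minimal concrete version of the pattern Don describes, with hypothetical DOCUMENTS table and column names (the format mask goes on the key column, not on the BLOB column itself):
        -- report query: select the key and the display columns, but not the BLOB
        select id, filename, author, created
          from documents;

        -- Number/Date Format (format mask) entered on the ID column:
        -- DOWNLOAD:DOCUMENTS:BLOB_CONTENT:ID
    Only the names differ from Don's DOWNLOAD:TABLENAME:BLOB_COLUMN:PRIMARY_KEY example above.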

  • Best practice guide for Batch Load utility in Oracle UCM.

    Hi,
    Is there any best practice guide for the Oracle UCM Batch Loader utility?
    We are looking for information regarding batch size, in terms of the number and size of content items. Also, are there any loading-time standards, considering the content is uploaded to a file system where the file store provider is configured?
    Thanks,
    Krishnendu

    Hi ,
    There is no specific set of steps/practices for batch loading content to UCM. It depends very much on how many content items the user has to load into UCM and how well the server is configured in terms of performance.
    You can get more details from the following documentation link: http://docs.oracle.com/cd/E21043_01/doc.1111/e10792/c02_settings009.htm
    Thanks,
    Srinath

Maybe you are looking for

  • Regarding the exchange rate used in shopping cart

    Hi, I am facing a problem regarding the currency conversion while creating the shopping cart. The basic currency in my organizational profile is CHF. But I have a requirement to create a shopping cart in NZD (New Zealand dollars). After entering all

  • Time Machine problem: backup disk ran out of space unexpectedly

    Hi all, I'm having trouble with Time Machine.  Each time it tries to do a backup, it errors out. Here is the most recent log from TM Buddy: Starting manual backup Backing up to: /Volumes/Time Machine Backup/Backups.backupdb Using file event preflight

  • Version 2.0 - where is the epub?

    In ADE 1.7 whenever I downloaded an ebook with name like "URLink.ascm", I would rename the file something like: BookName_Author.ascm and then proceed to open it with double click which would open it with ADE. No matter where BookName_Author.ascm is o

  • Use of transactions

    Hi, What is the recommended way to use transactions in the following system: One transaction-enabled-environment, with several DBs. A thread can handle one or more DBs, one DB is not handled by more than one thread. Currently we have one transaction

  • AppServer8 destroyed by JMS attempt

    Trying to get JMS going on Sun Java System Application Server Platform Edition 8.0.0_01 (build b08-fcs). There are many demos; NONE seem to work. 1) http://java.sun.com/j2ee/1.4/docs/tutorial/doc/JMS5.html#wp80279 is close but NOT identical to the real