Best Practices for converting SAP HR data (4.7 to ECC)

Hello Experts ...
We are going from 4.6 to ECC ... no upgrade ... it will be a new implementation ...
I am looking for best practices to convert SAP HR data from one SAP instance (4.6) to another (ECC) ...
I am not sure if direct input or LSMW or any other method/tool is the best way ...
Will really appreciate and award points if I can get good advice or documentation ...
Let me know if my question is not clear enough.
Thanks,

Hi
You can check SAP Marketplace and follow the download link there. You will need an SAP Marketplace login to download the Best Practices. Fortunately, Best Practices for ECC 5.00 and 6.00 are available there, but they are country-specific versions. I know HCM for the US is available there.
Reward points, if helpful.
Regards
Waz

Similar Messages

  • Best practice for Plan and actual data

    Hello, what is the best practice for Plan and actual data?  should they both be in the same app or different?
    Thanks.

    Hi Zack,
    It will be easier for you to maintain the data in a single application. Every application must have the category dimension, so you can use this dimension to maintain the actual and plan data.
    Hope this helps.

  • In the Begining it's Flat Files - Best Practice for Getting Flat File Data

    I probably should have posed this question here before I delved into writing Java to get data for reports, but better late than never.
    Our ERP is written in COBOL. We have a third party ODBC which allows us to access data using a version of SQL. I have several Java sources compiled in my database that access the data and return something relevant. The Java sources are written in a procedural style rather than taking advantage of object oriented programming with attributes and methods.
    Now that I am becoming more comfortable with the Java language, I would greatly appreciate any feedback as to best practices for incorporating Java into my database.
    My guess is that it would be helpful to model the ERP "tables" with Java classes that would have attributes, which correspond to the fields, and methods to return the attributes in an appropriate way. Does that sound reasonable? If so, is there a way to automate the task of modeling the tables? If not reasonable, what would you recommend?
    Thanks,
    Gregory

    Brother wrote:
    I probably should have posed this question here before I delved into writing Java to get data for reports, but better late than never.
    Our ERP is written in COBOL. We have a third party ODBC which allows us to access data using a version of SQL. I have several Java sources compiled in my database that access the data and return something relevant. The Java sources are written in a procedural style rather than taking advantage of object oriented programming with attributes and methods.
    OO is a choice, not a mandate. Using Java in a procedural way is certainly not ideal, but given that it is existing code I would look more into whether it is well-written procedural code rather than at the lack of OO.
    Now that I am becoming more comfortable with the Java language, I would greatly appreciate any feedback as to best practices for incorporating Java into my database.
    My guess is that it would be helpful to model the ERP "tables" with Java classes that would have attributes, which correspond to the fields, and methods to return the attributes in an appropriate way. Does that sound reasonable? If so, is there a way to automate the task of modeling the tables? If not reasonable, what would you recommend?
    Normally you create a data model driven by business need. You then implement, using whatever means seem expedient in terms of other business constraints, to closely model that data model.
    It is often the case that there is a strong correlation between data models and tables, but certainly in my experience it is rare that there are not other needs driven by the data model (such as how foreign keys and link tables are implemented and used).
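    To make that concrete, here is a minimal sketch in Java (with a hypothetical CUSTOMER table and made-up column names, purely for illustration) of a class whose attributes mirror a table's fields and whose methods return them in a report-friendly way:
    // Minimal sketch: one row of a hypothetical CUSTOMER table mapped to a Java object.
    // Field names are illustrative only; the real ERP column names would differ.
    public final class CustomerRow {
        private final String custId;   // corresponds to a CUST_ID column
        private final String name;     // corresponds to a NAME column

        public CustomerRow(String custId, String name) {
            this.custId = custId;
            this.name = name;
        }

        public String getCustId() { return custId; }

        // Example of a method that returns an attribute "in an appropriate way":
        // trimmed and upper-cased for report headings.
        public String getNameForReport() {
            return name == null ? "" : name.trim().toUpperCase();
        }
    }
    Whether generating such classes automatically is worthwhile depends mostly on how stable the ERP schema is; a plain ResultSet-based loader can populate them row by row either way.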

  • Best practice for including additional DLLs/data files with plug-in

    Hi,
    Let's say I'm writing a plug-in which calls code in additional DLLs, and I want to ship these DLLs as part of the plug-in.  I'd like to know what is considered "best practice" in terms of whether this is ok  (assuming of course that the un-installer is set up to remove them correctly), and if so, where is the best place to put the DLLs.
    Is it considered ok at all to ship additional DLLs, or should I try and statically link everything?
    If it's ok to ship additional DLLs, should I install them in the same folder as the plug-in DLL (e.g. the .8BF or whatever), in a subfolder of the plug-in folder or somewhere else?
    (I have the same question about shipping additional files too, such as data or resource files.)
    Thanks
                             -Matthew

  • Best Practices for Accessing the Configuration data Modelled as XML File in

    Hi,
    I have referred to a couple of blog posts/forum threads on how to model and access configuration data as XML inside OSB.
    One of the easiest ways is described in:
    Re: OSB: What is best practice for reading configuration information
    Another could be:
    Uploading the XML data as an .xq file (creating an .xq file and copy-pasting all the configuration as XML)
    I need expert answers for following.
    1] I have an .xsd file which represents the configuration data. The structure of the XSD is:
    <FrameworkConfig>
    <Config type="common" key="someKey">propertyvalue</Config>
    </FrameworkConfig>
    2] As my project moves from one environment to another, the property value will change according to the environment...
    For Dev:
    <FrameworkConfig>
    <Config type="common" key="someKey">propertyvalue_Dev</Config>
    </FrameworkConfig>
    For Stage:
    <FrameworkConfig>
    <Config type="common" key="someKey">propertyvalue_Stage</Config>
    </FrameworkConfig>
    3] Let's say I create the following folder structure to store the configuration files specific to the dev/stage/prod instances:
    OSB Project Folder
    |
    |---Dev
    |
    |--Dev_Config_file.xml
    |
    |---Stage
    |
    |--Stage_Config_file.xml
    |
    |---Prod
    |
    |-Prod_Config_file.xml
    4] I need a way to load these property files as an XML element/variable inside the OSB message flow. I can't use the XPath function fn:doc("URL") because I don't know the exact path of the XML on the deployed server.
    5] I also need to look up/model a value which specifies the current server type (Dev/Stage/Prod) on which the OSB message flow is running, i.e. some construct which acts as a global configuration and is accessible inside the OSB message flow. If the value of this global variable is Dev, I will load the XML config file under the Dev directory at runtime, containing the key/value pairs for the Dev environment.
    6] This Re: OSB: What is best practice for reading configuration information
    suggests designing a web application which serves the XML file over HTTP and getting the contents into a variable (which in turn can be used in the OSB message flow). Can we address this problem without creating an extra project and adding dependencies? I read about the configuration file approach too, but the sample configuration file doesn't show an entry for an .xml file as a resource.
    Hope I am clear... I really appreciate your comments and suggestions.
    Sushil

    If you can enforce some sort of naming convention for the transport endpoint for this proxy service across the environments, where the environment name is part of the endpoint, you may be able to retrieve it from $inbound in the message pipeline.
    e.g. http://osb_host/service/prod/service1 ==> prod and http://osb_host/service/stage/service1 ==> stage; then I think $inbound/ctx:transport/ctx:uri can give you /service/prod/service1 or /service/stage/service1, and by applying appropriate XPath functions you will be able to extract the environment name.
    Check this link for details on $inbound/ctx:transport: http://download.oracle.com/docs/cd/E13159_01/osb/docs10gr3/userguide/context.html#wp1080822
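    The same path-parsing idea, shown as a hedged sketch in plain Java (inside OSB you would of course do this with XPath/XQuery against $inbound rather than Java; the /service/<env>/<name> URI layout below is assumed from the example above):
    // Sketch: pull the environment token out of a URI such as /service/stage/service1,
    // assuming the second path segment is the environment (dev/stage/prod).
    public final class EnvFromUri {
        public static String environmentOf(String uriPath) {
            String[] parts = uriPath.split("/");          // ["", "service", "stage", "service1"]
            return parts.length > 2 ? parts[2] : "unknown";
        }

        public static void main(String[] args) {
            System.out.println(environmentOf("/service/stage/service1")); // prints "stage"
            System.out.println(environmentOf("/service/prod/service1"));  // prints "prod"
        }
    }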

  • Graphical display of best practice for implementing SAP NetWeaver

    Hi,
    in a presentation I need to show that the best practice for implementing SAP NetWeaver is to evaluate the required KPIs, characteristics and so on, and subsequently to verify the Business Content functionality and objects against these requirements.
    Can anybody provide a presentation or PDF document in which this process is displayed graphically? I am looking for a significant picture / diagram ...
    Any document or link to public SAP material which demonstrates the described process would be highly appreciated.
    [email protected]
    Best regards,
    Björn

    Hi,
    https://www.sdn.sap.com/irj/sdn/developerareas/bi?rid=/webcontent/uuid/e78a5148-0701-0010-7da9-a6c721c6112e [original link is broken]
    Regards,
    San!

  • Best practice for phone number column data type

    Hi,
    I hope I have posted this in the right forum; if not, just notify me and I will move it to the correct area.
    What would be considered best practice for storing a phone number:
    Number or Varchar2?
    Ben

    Well, I was thinking that Number would have the following disadvantages:
    1. Users entering phone numbers into a form may be tempted to pre-format with spaces, thereby throwing up an error.
    2. Mobile phone numbers may have leading zeros.
    3. Calculations are not carried out on phone numbers.
    I was leaning towards a varchar2 type.
    Ben
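    A small, hedged illustration of points 1 and 2 in Java (the numbers are made up): once a phone number is treated as a numeric value, formatted input is rejected and leading zeros are silently lost, which is the usual argument for a character type such as Varchar2.
    // Illustration only: what happens if phone numbers are treated as numbers.
    public final class PhoneAsNumberDemo {
        public static void main(String[] args) {
            String withLeadingZero = "0151234567";
            String withSpaces = "0151 234 567";

            // The leading zero is silently lost once the value becomes numeric.
            System.out.println(Long.parseLong(withLeadingZero)); // prints 151234567

            // Formatted input is rejected outright.
            try {
                Long.parseLong(withSpaces);
            } catch (NumberFormatException e) {
                System.out.println("Not a valid number: " + withSpaces);
            }
        }
    }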

  • Best Practice for Significant Amounts of Data

    This is basically a best-practice/concept question and it spans both Xcelsius & Excel functions:
    I am working on a dashboard for the US Military to report on some basic financial transactions that happen on bases around the globe.  These transactions fall into four categories, so my aggregation is as follows:
    Year,Month,Country,Base,Category (data is Transaction Count and Total Amount)
    This is a rather high level of aggregation, and it takes about 20 million transactions and aggregates them into about 6000 rows of data for a two year period.
    I would like to allow the users to select a Category and a country and see a chart which summarizes transactions for that country ( X-axis for Month, Y-axis Transaction Count or Amount ).  I would like each series on this chart to represent a Base.
    My problem is that 6000 rows still appears to be too many rows for an Xcelsius dashboard to handle.  I have followed the Concatenated Key approach and used SUMIF to populate a matrix with the data for use in the Chart.  This matrix would have Bases for row headings (only those within the selected country) and the Column Headings would be Month.  The data would be COUNT. (I also need the same matrix with Dollar Amounts as the data). 
    In Excel this matrix works fine and seems to be very fast.  The problem is with Xcelsius.  I have imported the spreadsheet, but have NOT even created the chart yet and Xcelsius is CHOKING (and crashing).  I changed Max Rows to 7000 to accommodate the data.  I placed a simple combo box and a grid on the Canvas - BUT NO CHART yet - and the dashboard takes forever to generate and is REALLY slow to react to a simple change in the Combo Box.
    So, I guess this brings up a few questions:
    1)     Am I doing something wrong and did I miss something that would prevent this problem?
    2)     If this is standard Xcelsius behavior, what are the Best Practices to solve the problem?
    a.     Do I have to create 50 different Data Ranges in order to improve performance (i.e. Each Country-Category would have a separate range)?
    b.     Would it even work if it had that many data ranges in it?
    c.     Do you aggregate it as a crosstab (Months as Column headings) and insert that crosstabbed data into Excel?
    d.     Other ideas that I'm missing?
    FYI:  These dashboards will be exported to PDF and distributed.  They will not be connected to a server or data source.
    Any thoughts or guidance would be appreciated.
    Thanks,
    David

    Hi David,
    I would leave your query
    "Am I doing something wrong and did I miss something that would prevent this problem?"
    to the experts/ gurus out here on this forum.
    From my end, you can follow
    TOP 10 EXCEL TIPS FOR SUCCESS
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/204c3259-edb2-2b10-4a84-a754c9e1aea8
    Please follow the Xcelsius Best Practices at
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/a084a11c-6564-2b10-79ac-cc1eb3f017ac
    In order to reduce the size of xlf and swf files follow
    http://myxcelsius.com/2009/03/18/reduce-the-size-of-your-xlf-and-swf-files/
    Hope this helps to certain extent.
    Regards
    Nikhil

  • Best practice for having separate clone data for development purposes?

    Hi
    I am on a hosted Apex environment
    I have a workspace containing two instances/ copies of the application: DEV and PROD
    I would like to be able to develop functionality and data in/with the DEV instance and then insert it into PROD.
    I gather that I can insert pages from DEV to PROD via Create -> New page as copy -> Page in another application
    But I don't know how I can mimic this process with database objects, eg. if I want to create a new table or manipulate the data in an existing table in a DEV environment before implementing in a PROD environment.
    Ideally this would be done in such a way that minimises changing table names etc when elevating pages from DEV to PROD.
    Would it be possible to create a clone schema that could contain the same tables (with the same names) as PROD?
    Any tips, best practices appreciated :)
    Thanks

    Hi,
    ideally you should have a little more separation between your dev and prod environments. At the minimum you should have separate workspaces each addressing separate schemas. Apex can be a little difficult if you want to move individual Apex application objects, such as pages, between applications (a much requested improvement), but this can be overcome by exporting and importing the whole application. You should also have some form of version control/backup of export files.
    As far as database objects go, tables etc, if you have tns access to your hosted environment, then you can use SQL Developer to develop, maintain and synchronize between your development and production schemas and objects in the different environments should have identical names. If you don't have that access, then you can use the Apex SQL Workshop features, but these are a little more cumbersome than a tool like SQL Developer. Once again, scripts for creating and upgrading your database schemas should be kept under some sort of version control.
    All of this is supposing your hosting solution allows more than one workspace and schema, if not you may have to incur the cost of a second environment. One other option would be to do your development locally in an instance of Oracle XE, ensuring you don't have any version conflicts between the different database object features and the Apex version.
    I hope this helps.
    Regards
    Andre

  • Best practices for administering Oracle Big Data Appliance

    -        Best practices as part of administration of Oracle Big Data Infrastructure
    -        How do we lock down max space usage per project
    Eg: Project team A can have a max limit of 10 TB space allocated
    -        Restricting roles and access (read, write), placeholder for common shared artifacts
    -        Template/procedure for code migration across dev,qa and prod environments etc

    Your data is bigger than I run, but what I have done in the past is to restrict their accounts to a separate datafile and limit its size to the max that I want for them to use: create objects restricted to accommodate the location.

  • Procedure/Best Practices - For providing SAP BASIS SUPPORT

    Hi All,
    I need information on SAP BEST PRACTICES on providing BASIS SUPPORT for SAP 4.7.
    If you can give me any link or document regarding this that would be of great help to me.
    Expecting the replies...
    Thanks
    Cheenu

    Hi Cheenu,
    I'm not quite sure what you mean by "providing SAP BASIS SUPPORT".
    Run SAP roadmaps describe the recommended methodology for end-to-end operations of an SAP solution, including technical operations (Basis). You can check them at service.sap.com/roadmaps (go to "Run SAP Roadmap"). If you need a list of tasks you should perform in the scope of solution operations, you'll find it there as well.
    SAP solution management is based on ITIL methodology. You can find more about ITIL at www.best-management-practice.com/IT-Service-Management-ITIL/.
    Hope this helps.
    KR,
    Grigor

  • Best Practice for Storing Program Config Data on Vista?

    Hi Everyone,
    I'm looking for recommendations as to where (and how) to best store program configuration data for a LV executable running under Vista.  I need to store a number of things like window location, values of controls, etc.  Under XP I just stored it right in the VI's own execution path.  But under Vista, certain directories (such as C:\Program Files) are now restricted without administrator rights, so if my program is running from there, I don't think it'll be able to write its config file.
    Also, right now I'm just using the Write to Spreadsheet File block to store my variables.  Does this sound alright, or are there better suggestions?
    Thanks!

    I found some stuff on a Microsoft page. Here is the link and a short paste from that document:
    http://www.microsoft.com/downloads/details.aspx?FamilyID=BA73B169-A648-49AF-BC5E-A2EEBB74C16B&displa...
    Application settings that need to be changed at run time should be stored in one of the following locations:
    - CSIDL_APPDATA
    - CSIDL_LOCAL_APPDATA
    - CSIDL_COMMON_APPDATA
    Documents saved by the user should be stored in the CSIDL_MYDOCUMENTS folder.
    Can't tell you more as I have no Vista around to look for the CSIDL stuff.
    Felix
    www.aescusoft.de
    My latest community nugget on producer/consumer design
    My current blog: A journey through uml
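    For comparison, a hedged sketch of the same idea in Java (CSIDL_APPDATA is the per-user roaming application-data folder, normally exposed on Windows through the APPDATA environment variable; LabVIEW has its own functions for resolving such paths, so this only illustrates where the config should live):
    import java.io.File;

    // Sketch: resolve a per-user config folder instead of writing next to the executable.
    // On Windows, the APPDATA environment variable points at the roaming profile folder
    // (the CSIDL_APPDATA location); fall back to the user's home directory elsewhere.
    public final class ConfigLocation {
        public static File configDir(String appName) {
            String appData = System.getenv("APPDATA");
            File base = (appData != null) ? new File(appData) : new File(System.getProperty("user.home"));
            File dir = new File(base, appName);
            dir.mkdirs(); // create it if it does not exist yet
            return dir;
        }

        public static void main(String[] args) {
            System.out.println(configDir("MyLvApp")); // e.g. C:\Users\me\AppData\Roaming\MyLvApp
        }
    }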

  • Best practice for large form input data.

    I'm developing an HIE system with Flex. I'm looking for a strategy and management layout for pretty close to 20-30 TextInput/ComboBox/Grid controls.
    What is the best way to present this in Flex without a lot of clutter, and without a given panel being way too large and unwieldy?
    The options that I have come up with so far.
    1) Use lots of Tabs combined with lots of accordions, and split panes.
    2) Use popup windows
    3) Use panels that appear, capture the data, then, make the panel go away.
    I'm shying away from the popup windows, as that strategy ALWAYS results in performance issues.
    Any help is greatly appreciated.
    Thanks.

    In general the Flex navigator containers are the way to go. ViewStack is probably the most versatile, though TabNavigator and Accordion are good. It all depends on your assumed workflow.
    If this post answers your question or helps, please mark it as such.

  • Need to know the best practice for converting string to double

    I have a string and want to convert it to a double if it is a valid number, else keep it as it is. There are a couple of ways of doing it and I want to know which one is best if I have lots of strings, especially from a performance point of view.
    1) Use Double.parseDouble(myString) and catch NumberFormatException to detect that it is not a number. One of my colleagues said this does not give good performance because of the exception catching.
    2) Use org.apache.commons.lang.math.NumberUtils.isNumber() and only parse if it returns true, so don't rely on the exception.
    I did some performance testing, putting it in a loop and trying out two scenarios: one loop for proper numeric value strings and another for non-numeric ones. What I found was that if the strings are not proper numbers, parseDouble() takes a long time (because of the exception catching), and in that case using NumberUtils.isNumber() makes sense.
    Would like to hear expert views on this.
    Thanks
    Manisha

    If you need it as a double you must convert it to a double and catch the exception. This means that testing it first is a waste of time in the case when the test succeeds - did your colleague think of that?
    Catching the exception is possibly slower than the test. Whether this is significant depends on the relative timings of the test and catching the exception, and also on the expected error rate. If this is below about 40% I suggest your colleague is talking through his esteemed hat.
    And of course the best test by far is the conversion itself. Using any other test runs the risk of its rules being different from those applied by the conversion.
    In any case you are obliged to write the code that catches the exception. You're not obliged to write the pre-test code.
    My personal rule for efficiency is to minimize lines of code until hard evidence to the contrary proves that further improvement is required.
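    A minimal sketch of the two approaches being compared, assuming commons-lang is on the classpath for option 2:
    import org.apache.commons.lang.math.NumberUtils;

    // Sketch: convert-and-catch vs. pre-test with NumberUtils.isNumber().
    public final class ToDouble {

        // Option 1: let parseDouble decide; the exception is the only authoritative test.
        public static Double parseOrNull(String s) {
            try {
                return Double.parseDouble(s);
            } catch (NumberFormatException e) {
                return null; // or keep the caller's original string instead, if preferred
            }
        }

        // Option 2: cheap pre-test, then parse. Note the pre-test's rules may not
        // match parseDouble exactly (e.g. leading/trailing whitespace is rejected here
        // but accepted by parseDouble).
        public static Double parseIfNumber(String s) {
            return NumberUtils.isNumber(s) ? Double.valueOf(s) : null;
        }
    }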

  • Best Practice for ViewObjects when inserting data through pl/sql procedure

    My application is an Oracle Forms based enterprise-level application and we are now developing a new module in ADF 11g, but there is a restriction that all data insertion, updating, and deletion must go through Oracle PL/SQL procedures. Now my question is whether ADF pages should be bound to ViewObjects based on Entity Objects, or to ViewObjects not based on an Entity Object / SQL query. Currently I have developed pages with programmatic ViewObjects which are neither based on Entity Objects nor on a SQL query. In those view objects, I create transient attributes and then use them to create ADF pages. Then on save, I extract the data from the ViewObject's current row and pass it to the procedure. This is working fine, but I'm just wondering whether this approach is OK or there is a better alternative. Ideally I want to create ViewObjects based on Entity Objects, but I don't find any way to synchronize Entity Objects with data inserted through procedures.

    Hi,
    I create an EO for the database view and override the doDML() method. For insert/update and delete I call the PL/SQL functions.
    See "38.5 Basing an Entity Object on a PL/SQL Package API" in Oracle® Fusion Middleware Fusion Developer's Guide for Oracle Application Development
    Framework.
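    A hedged sketch of the kind of call such a doDML() override delegates to, shown here with plain JDBC for clarity rather than the actual ADF EntityImpl API; the package and procedure names (emp_pkg.insert_emp) are made up for illustration:
    import java.sql.CallableStatement;
    import java.sql.Connection;
    import java.sql.SQLException;

    // Sketch: delegate an insert to a PL/SQL procedure instead of issuing direct DML.
    // "emp_pkg.insert_emp" and its parameters are hypothetical names.
    public final class PlsqlInsert {
        public static void insertEmployee(Connection conn, long id, String name) throws SQLException {
            try (CallableStatement cs = conn.prepareCall("{call emp_pkg.insert_emp(?, ?)}")) {
                cs.setLong(1, id);
                cs.setString(2, name);
                cs.execute();
            }
        }
    }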
