Using the Best Practices personalization tool

Hello,
We wish to use the Best Practices personalization tool for customer-specific data in our baseline package.
I cannot tell from the documentation whether it can be used after installing the Baseline, or whether it has to be done beforehand (meaning the personalized data has to be ready in the files before starting the baseline installation)?
Thank you,
Michal

Hi,
Please refer to:
http://help.sap.com/bp_bl603/BL_IN/html/index.htm
Your personalized files need to be prepared before the implementation, because you will be using them during the installation process.
The XML file and the TXT files you create with the personalization tool are used to upload your scenario to the system; otherwise the default data will be uploaded.
Also refer to SAP Note 1226570 (here I am referring to the IN version); you can check the corresponding note for other country versions as well.
Thanks & Regards,
Balaji.S

Similar Messages

  • How to use the Best Practices installation tool?

    Hello!
    Can anyone share some useful links/documents that can guide me in how to use the Best Practices installation tool (/SMB/BBI)?
    Any responses will be awarded.
    Regards,
    Samson

    Hi,
    Will you please share the same?
    Thanks in advance

  • I found warnings after running the Best Practices Analyzer tool in Exchange 2010

    Hello,
    When I ran the Best Practices Analyzer tool I found some warnings:
    1- DNS 'Host' record appears to be missing
    2- Active Server Pages is not installed
    3- Application log size
    4- Self-signed certificate found:
    It is strongly recommended that you install an authority-signed or trusted certificate.
    The SSL certificate for 'https://exchange.mydomain.com/Autodiscover/Autodiscover.xml' is self-signed. It does not provide any of the security guarantees provided
    by authority-signed or trusted certificates. (I have an SSL certificate from GeoTrust.) All users can access mail from OWA and can connect to their
    mailboxes using Outlook Anywhere, but with an SSL warning.
    5- Single global catalog in topology:
    There is only one global catalog server in the Directory Service Access (DSAccess) topology on server CADEXCHANGE. This configuration should be avoided for fault-tolerance
    reasons.
    I already checked the links below, but I did not understand them well:
    http://technet.microsoft.com/en-us/library/6ec1c7f7-f878-43ae-bc52-6fea410742ae.aspx
    http://technet.microsoft.com/en-us/library/4fa708a1-a118-4953-8956-3c50399499f8.aspx
    http://technet.microsoft.com/en-us/library/8867bba7-7f81-42f9-96b6-2feb7e0cea4e.aspx
    Please advise me on how to avoid these issues.
    Thanks

    I have 2 servers and both are global catalog servers.
    My question is why the warning reports only one global catalog.
    Please explain this.
    When I test Autodiscover, the test is successful, but when I expand the menu
    I find some errors:
    Attempting to test potential Autodiscover URL https://Mydomain.com:443/Autodiscover/Autodiscover.xml
    Testing the SSL certificate to make sure it's valid.
    Validating the certificate name.
    Certificate name validation failed 

  • Use Best Practice to Import Master Data

    Hi,
    I am a SAP beginner, and I would be glad if someone could guide me on my issue. How can I use Best Practices to import 1000+ material master records into the SAP system? I have already prepared the data in an Excel spreadsheet. Can anyone guide me through the steps?
    Thanks.

    Hi,
    LSMW is a very good tool for master data upload. The tool is rich in features but also complex. Being a beginner, you should check with a consultant to learn how you can use LSMW to upload your 1000+ records. The tool itself is quite intuitive. After entering the LSMW transaction you create the project, subproject, and the object you are going to upload. When you enter the next screen you see several radio buttons. Typically every upload requires all the steps behind those radio buttons, and in the same sequence. It is not possible to describe what has to be done behind each of those radio buttons in this forum. Please take a consultant's help in your vicinity.
    thanx
    Bala

  • Implementing a "login" using best practices

    I have a little bit of time now for my project, where I'd like to refactor it a bit and take the opportunity to learn about the best practices to use with JSP/Servlets, but I'm having some trouble thinking about what goes where and how to organize things.
    Here's my current login functionality. I have not separated my "business logic" from my "presentation logic", as you can see in this simple starting example.
    index.html:
    <html>
    <body>
    <form action="login.jsp" method="post">
        <h1>Please Login</h1>
        User Name:    <input type="text" name="login"><br>
        Password: <input type="password" name="password"><br>
        <input type="submit" value="Login">
    </form>
    </body>
    </html>
    login.jsp:
    <jsp:useBean id="db" type="database.DatabaseContainer" scope="session"/>
    <%
    if (session.getAttribute("authorized") == null || session.getAttribute("authorized").equals("no") || request.getParameter("login") != null) {
        String login = request.getParameter("login");
        String password = request.getParameter("password");
        if (login != null && db.checkLogin(login, password)) {
            // Valid login
            session.setAttribute("authorized", "yes");
            session.setAttribute("user", login);
        } else {
            // Invalid login
            session.setAttribute("authorized", "no");
            %><jsp:forward page="index.html"/><%
        }
    } else if (session.getAttribute("authorized").equals("no")) {
        // Not yet authorized; send back to the login form
        %><jsp:forward page="index.html"/><%
    } else {
        System.out.println("Other");
    }
    %>
    <html>
    <body>
    <h1>Welcome <%= " "+session.getAttribute("user") %></h1>
    <!-- links to other JSPs are here -->
    </body>
    </html>
    What should I be doing instead? Should I make the form action a Servlet rather than a JSP? I don't want to be writing HTML in my servlets, though. Do I do the authentication in a servlet that I make the form action, and then have the servlet forward to some standard HTML page?

    OK, so I'm starting things off simply by just converting what I have to use better practices. For now I just want to get the basic flow of how I transition from page to servlet to page.
    Here's my index.html page:
    <html>
    <body>
    <form action="login" method="post">
        <h1>Please Login</h1>
        Phone Number:    <input type="text" name="login"><br>
        Password: <input type="password" name="password"><br>
        <input type="submit" value="Login">
    </form>
    </body>
    </html>
    I have a mapping that says login goes to LoginServlet, which is here:
    import java.io.IOException;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import javax.servlet.http.HttpSession;
    import db.DatabaseContainer;
    public class LoginServlet extends HttpServlet {
        public void doGet(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
            HttpSession session = request.getSession();
            if (session.getAttribute("authorized") == null || session.getAttribute("authorized").equals("no") || request.getParameter("login") != null) {
                String login = request.getParameter("login");
                String password = request.getParameter("password");
                DatabaseContainer db = (DatabaseContainer) request.getSession().getAttribute("db");
                if (login != null && db.checkLogin(login, password)) {
                    // Valid login
                    session.setAttribute("authorized", "yes");
                    session.setAttribute("user", login);
                    // forward to home page
                } else {
                    // Invalid login
                    session.setAttribute("authorized", "no");
                    // forward back to login page
                }
            } else if (session.getAttribute("authorized").equals("no")) {
                // forward back to login page
            }
        }

        public void doPost(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
            doGet(request, response);
        }
    }
    If I'm not logged in, I want to simply forward back to the login page for now. If I am logged in, I want to forward to my home page. If my home page is a simple HTML page, though, then what's to stop a person from just typing in the home page URL and getting to it? I would think it would need to be a JSP page, but then the JSP page would have to have code in it to see if the user was logged in, and then I'd be back to where I was before.
    Edited by: JFactor2004 on Oct 21, 2009 7:38 PM
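    Regarding the last question about protecting the home page: a common approach in the Servlet API is to put the login check in a javax.servlet Filter that is mapped (in web.xml) to the pages that require authentication, so the check does not have to be repeated inside every JSP. The sketch below is only an illustration of that idea, not code from the original thread; the class name AuthFilter and the redirect target are hypothetical, while the "authorized" session attribute is reused from the LoginServlet above.

    import java.io.IOException;
    import javax.servlet.Filter;
    import javax.servlet.FilterChain;
    import javax.servlet.FilterConfig;
    import javax.servlet.ServletException;
    import javax.servlet.ServletRequest;
    import javax.servlet.ServletResponse;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import javax.servlet.http.HttpSession;

    // Hypothetical filter: map it in web.xml to the protected pages (e.g. /home.jsp or /app/*).
    // Anonymous requests are redirected to the login form instead of reaching the page.
    public class AuthFilter implements Filter {
        public void init(FilterConfig config) throws ServletException {
            // no initialization needed for this sketch
        }

        public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
                throws IOException, ServletException {
            HttpServletRequest request = (HttpServletRequest) req;
            HttpServletResponse response = (HttpServletResponse) res;

            // Reuse the same session flag the LoginServlet sets.
            HttpSession session = request.getSession(false);
            boolean loggedIn = session != null && "yes".equals(session.getAttribute("authorized"));

            if (loggedIn) {
                // Authenticated: let the request through to the protected page.
                chain.doFilter(req, res);
            } else {
                // Not authenticated: send the user back to the login form.
                response.sendRedirect(request.getContextPath() + "/index.html");
            }
        }

        public void destroy() {
            // nothing to clean up
        }
    }

    With a filter like this in place, the protected pages can stay plain JSP/HTML, because the login check happens once in the filter rather than being duplicated in every page.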

  • Job (C) use best practices

    Experts,
    This question is in regard to best practices/common ways that various companies ensure the proper use of the Job (C) object in HCM systems. For example, if there are certain jobs in the system that should only be assigned to a position if that position is in certain areas of the business (i.e. belongs to specific organizational areas), how is this type of restriction maintained? Is it simply through business processes? Is there a way/relationship that can be assigned? Are there typical customizations and/or processes that are followed?
    I'm looking to begin organizing jobs into job families, and I'm currently trying to determine and maintain the underlying organization of our company's jobs in order to ensure this process is functional.
    Any insight, thoughts, or advice would be greatly appreciated.
    Best regards,
    Joe

    Hi Joe,
    You can embed the business area info into the job description, and this would be a part of best practice.
    What I mean is that:
    e.g. In your company you have 4 managers:
    HR Manager
    IT Manager
    Procurement Manager
    Production Manager
    Then, as part of SAP Best Practice, you will have 4 positions (1 position per person).
    My advice is that you should also have 4 jobs that describe the positions.
    Then, in order to group all managers, you may have one job family "Managers" and assign all four jobs to that family.
    In this way you can report on all the managers as well as area-specific managers (e.g. HR Manager).
    As far as I know, there is no standard relationship that holds business area info.
    For further info check table T778V via SM31.
    Regards,
    Dilek

  • VC table use, best practices?

    Hi,
    I'm updating a table in the back end with an RFC. I would like to send only the rows I've modified or added on the VC client to the RFC and not the whole table. Is this possible?

    Hey Joel,
    Add a condition (say, a check box) for changing a table row (the user needs to select which rows he is modifying).
    While sending the values to the RFC, set the guard condition to be valid only when the check box is selected.
    Regards,
    Pradeep

  • Skinning; Best Practices & Tools? (Newbie)

    I'm something less than a newbie so I may not be asking this correctly... I am NOT a Java programmer.
    This may be slightly off-topic for this forum, but I think this is the best group for this question.
    I am looking for best practices and tools recommendations for skinning a Java app or game. I need to learn all I can about the steps, tools and potential pitfalls.
    I've got great graphic designers who can make the GUI for me. Now I just need to know my steps for applying it and what I need to watch out for.
    Thanks for any help; links, comments, titles, etc.

    The Skin Look and Feel seems to be designed for desktop applications, and uses KDE and Gnome skins from X. This would be an awkward, if not impossible, way of skinning a game, in my opinion. I'm interested in any other solutions out there.

  • Best practice for migrating data tables- please comment.

    I have 5 new tables seeded with data that need to be promoted from a development to a production environment.
    Instead of the DBAs just using a tool to migrate the data they are insistent that I save and provide scripts for every single commit, in proper order, necessary to both build the table and insert the data from ground zero.
    I am very unaccustomed to this kind of environment and it seems much riskier for me to try and rebuild the objects from scratch when I already have a perfect, tested, ready model.
    They also require extensive documentation where every step is recorded in a document and use that for the deployment.
    I believe their rationale is they don't want to rely on backups but instead want to rely on a document that specifies each step to recreate.
    Please comment on your view of this practice. Thanks!

    >
    Please comment on your view of this practice. Thanks!
    >
    Sounds like the DBAs are using best practices to get the job done. Congratulations to them!
    >
    I have 5 new tables seeded with data that need to be promoted from a development to a production environment.
    Instead of the DBAs just using a tool to migrate the data they are insistent that I save and provide scripts for every single commit, in proper order, necessary to both build the table and insert the data from ground zero.
    >
    The process you describe is what I would expect, and require, in any well-run environment.
    >
    I am very unaccustomed to this kind of environment and it seems much riskier for me to try and rebuild the objects from scratch when I already have a perfect, tested, ready model.
    >
    Nobody cares if it is riskier for you. The production environment is sacred. Any and all risk to it must be reduced to a minimum at all cost. In my opinion a DBA should NEVER move ANYTHING from a development environment directly to a production environment. NEVER.
    Development environments are sandboxes. They are often not backed up. You or anyone else could easily modify tables or data with no controls in place. Anything done in a DEV environment is assumed to be incomplete, insecure, disposable and unvetted.
    If you are doing development and don't have scripts to rebuild your objects from scratch then you are doing it wrong. You should ALWAYS have your own backup copies of DDL in case anything happens (and it does) to the development environment. By 'have your own' I mean there should be copies in a version control system or central repository where your teammates can get their hands on them if you are not available.
    As for data - I agree with what others have said. Further - ALL data in a dev environment is assumed to be dev data and not production data. In all environments I have worked in ALL production data must be validated and approved by the business. That means every piece of data in lookup tables, fact tables, dimension tables, etc. Only computed data, such as might be in a data warehouse system generated by an ETL process might be exempt; but the process that creates that data is not exempt - that process and ultimately the data - must be signed off on by the business.
    And the business generally has no access to, or control of, a development environment. That means using a TEST or QA environment for the business users to test and validate.
    >
    They also require extensive documentation where every step is recorded in a document and use that for the deployment.
    I believe their rationale is they don't want to rely on backups but instead want to rely on a document that specifies each step to recreate.
    >
    Absolutely! That's how professional deployments are performed. Deployment documents are prepared and submitted for sign-off by each of the affected groups. Those groups can include security, DBA, business users, IT and even legal. The deployment documents always include recovery steps so that if something goes wrong or the deployment can't proceed, there is a documented procedure for how to restore the system to a valid working state.
    The deployments themselves that I participate in have representatives from each of those groups in the room or on a conference call as each step of the deployment is performed. Your 5 tables may be used by stored procedures, views or other code that has to be deployed as part of the same process. Each step of the deployment has to be performed in the correct order. If something goes wrong, the responsible party is responsible for assisting in the retry or recovery of their component.
    It is absolutely vital to have a known, secure, repeatable process for deployments. There are no shortcuts. I agree that for a simple scenario of 5 new tables and a small amount of data it may seem like overkill.
    But, despite what you say, it simply cannot be that easy, for one simple reason. Adding 5 tables with data to a production system has no business impact or utility at all unless there is some code, process or application somewhere that accesses those tables and data. Your post didn't mention what changes are being made to actually USE what you are adding.

  • EFashion sample Universes and best practices?

    Hi experts,
    Do you all think that the eFashion sample Universe was developed based on the best practices of Universe design? Below is one of my questions/problems:
    A Universe is designed to hide technical details and answer all valid business questions (queries/reports). For nonsensical questions, it will show an 'incompatible' error, etc. In the eFashion sample, I tried to compose a query to answer "for a period of time, e.g. from 2008.5 to 2008.9, in each week for each product (article), its MSRP (sales price), sold price, margin, quantity sold and promotion flag". I dragged Product.SKUnumber, week from Time period, Unit Price MSRP from Product, Sold at (unit price) from Product, Promotions.promotion, and Margin and Quantity sold from Measures into the Query Panel. It gives me an 'incompatible' error message when I try to run it.
    I think the whole sample (from the database data model to the Universe schema structure/joins) is flawed. In the Product_promotion_facts table, it seems that if a promotion lasts for more than one week, the weekid is the starting week and the duration indicates how long it lasts. With this design, answering "what promotions ran in what weeks" is not easy, because you need to join Product_promotion_facts with the Time dimension using "time.weekid between p_prom.weekid and p_prom.weekid+duration" (assuming weekid is sequential), instead of a simple "time.weekid=p_prom.weekid".
    The weekid joins between Shop_facts, Product_promotion_facts and Calendar_year_lookup are very confusing, because one is about "the week the sale happened" and the other about "the week the promotion started". No tool is smart enough to resolve this ambiguity automatically. Then there is the shortcut join between Shop_facts and Product_promotion_facts: it is based on articleid alone. Obviously the two have to be joined on both article and time (using between/and, not the simple weekid=weekid in this design), otherwise the join doesn't make sense (a sale of one article on one day would join to all the promotions for that article of all time).
    What do you think?
    thanks.
    Edward

    You seem to have the idea that finding out whether a project uses "best practices" is the same as finding out whether a car is blue. Or perhaps you think there is a standards board somewhere which reviews projects for the use of "best practices".
    Well, it isn't like that. The most cynical viewpoint is that "best practices" is simply an advertising slogan used by IT consultants to make them appear competent to their prospective clients. But basically it's a value judgement. For example using Hibernate may be a good thing to do in many projects, but there are projects where it would not be a good thing to do. So you can't just say that using Hibernate is a "best practice".
    However it's always a good idea to keep your source code in a repository (CVS, Subversion, git, etc.) so I think most people would call that a "best practice". And you could talk about software development techniques, but "best practice" for a team of three is very different from "best practice" for a team of 250.
    So you aren't going to get a one-paragraph description of what features you should stick in your project to qualify as "best practices". And you aren't going to get a checklist off the web whereby you can rate yourself for "best practices" either. Or if you do, you'll find that the "best practice" involves buying something from the people who provided the checklist.

  • SAP SCM and SAP APO: Best practices, tips and recommendations

    Hi,
    I have been gathering useful information about SAP SCM and SAP APO (e.g., advanced supply chain planning, master data and transaction data for advanced planning, demand planning, cross-plant planning, production planning and detailed scheduling, deployment, global available-to-promise (global ATP), CIF (core interface), SAP APO DP planning tools (macros, statistical forecasting, lifecycle planning, data realignment, data upload into the planning area, mass processing – background jobs, process chains, aggregation and disaggregation), and PP/DS heuristics for production planning).
    I am especially interested about best practices, tips and recommendations for using and developing SAP SCM and SAP APO. For example, [CIF Tips and Tricks Version 3.1.1|https://service.sap.com/form/sapnet?_FRAME=CONTAINER&_OBJECT=011000358700006480652001E] and [CIF Tips and Tricks Version 4.0|https://service.sap.com/form/sapnet?_FRAME=CONTAINER&_OBJECT=011000358700000596412005E] contain pretty useful knowledge about CIF.
    If you know any useful best practices, tips and recommendations for using and developing SAP SCM and SAP APO, I would appreciate if you could share those assets with me.
    Thanks in advance of your help.
    Regards,
    Jarmo Tuominen

    Hi Jarmo,
    Apart from what DB has suggested, you should give a good reading to the following:
    -Consulting Notes (use the application component filters in search notes)
    -Collective Notes (similar to the one above)
    -Release Notes
    -Release Restrictions
    -If $$ permit subscribe to www.scmexpertonline.com. Good perspective on concepts around SAP SCM.
    -There are a couple of blogs (e.g. www.apolemia.com), but all lack breadth; some cover individual topics in depth.
    -The "Articles" section on this site (not all are classified well; see ECC Ops, Mfg, SCM, Logistics etc.)
    -Service.sap.com - check the solution details overview in the knowledge exchange tab. There are product presentations and collaterals for every release. Good breadth but no depth.
    -Building Blocks - available for all application areas. These are limited to a vanilla configuration for just making a process work and nothing more than that.
    -Get the book "Sales and Operations Planning with SAP APO" from SAP Press. It's got plenty of easy-to-follow material, a good perspective and lots of screenshots to make life easier.
    -help.sap.com - the last thing most people refer to after all the "handy" options (incl. this forum) are exhausted. Nevertheless, this is the superset of all "secondary" documents. But the maze of hyperlinks that starts at APO might lead you to something like an XML schema.
    Key Tip: Appreciate that SAP SCM is largely driven by the connected execution systems (SAP ECC/ERP). So the best place to start should be a good overview of the ERP OPS solution, at least at a significant level of depth. Check the document "ERP ops architecture overview" on the SDN wiki.
    I have a good collection of documents, though many I haven't read myself. If you need them, let me know.
    Regards,
    Loknath

  • Best practice in implementation of SEM-CPM

    Does someone have experience implementing SEM-CPM using Best Practices? And if so, does it reduce implementation time?

    We should be able to adopt the best practices when the software finally gets integrated into NetWeaver.
    Ravi Thothadri

  • What is best practice for dealing with Engineering Spare Parts?

    Hello All,
    I am after some advice regarding the process for handling engineering spare parts in PM. (We run ECC 5)
    Our current process is as follows:
    All materials are set up as HIBEs
    Each material is batch managed
    The Batch field is used for the Bin location
    We are now looking to roll out PM to a site that has in excess of 50,000 spare parts and want to make sure we use best practice for handling the spare parts. We are now considering using a basic WM setup to handle the movement of parts.
    Please can you provide me with some feedback on what you feel the best practice is for dealing with these parts?
    We are looking to set up a solution that will allow us to generate pick lists etc. and implement a scanning solution to move parts in and out of stores.
    Regards
    Chris

    Hi,
    I hope all the 50,000 spare parts are maintained as stock items.
    1. Based on the usage of those spare parts, try to define safety stock & set the MRP type to "Reorder Point Planning". By this, you can avoid petty cash purchases.
    2. By keeping the spare parts (at least the critical components) in stock, planned maintenance as well as unplanned maintenance will not get delayed.
    3. By doing GI based on a reservation, the qty can be tracked against the order & equipment.
    As this question is MM & WM related, they can give better clarity on this.
    Regards,
    Maheswaran.

  • Best Practices v3.31 - SAP InfoSet Query connection

    Hi,
    I have a problem with adapting a Crystal Report from Best Practices for Business Intelligence v3.31 to our SAP system. The report "Cost Analysis Planned vs. Actual Order Costs.rpt" uses the SAP InfoSet Query "CO_OM_OP_20_Q1". This InfoSet Query works fine in SAP. My SAP user has access to the following user groups:
    - /SREP/IS_UG
    - /KYK/IS_UG
    - ZBPBI131_USR
    The Controlling Area in SAP is '1000'. Crystal Reports generates this error message:
    - Failed to retrieve data from the database.
    - Database Connector Error: "Controlling area does not exist"
    - Database Connector Error: 'RFC_CLOSED'
    But InfoSet Query "CO_OM_CA_20_Q1" has no problem with Controlling Area '1000' in Crystal Reports and is working fine in SAP!
    Can somebody help?
    Thanks in advance.
    Peter

    Hello Peter,
    I'm using Best Practices for BI v1.31, and it also includes the report you are talking about.
    I face the same issue when trying to adapt it to my ERP, but if I run it in the SAP GUI, it goes smoothly without any issue.
    Please advise,
    Thanks in advance,
    Carlos Henrique Matos da Silva - SAP BusinessObjects BI - Porto Alegre/Brazil.

  • SAP Best Practice for Chemicals in an oil refinery

    Does anyone have experience in implementing the SAP Best Practice for Chemicals in an oil refinery that will also use IS-OIL?
    What would be the pros and cons?
    What is the implementation cost of adopting a Best Practices solution for a large, complex organization?
    Oded Dagan
    Oil Refineries LTD.
    Israel

    Hi Oded,
    I don't know of any Best Practice Chemicals implementation in a refinery so far.
    But you can use Best Practice Chemicals with SAP IS-Oil for the non-IS-Oil functionality as well.
    Best Practice Chemicals gives you benefits within standard business processes, but for the
    IS-Oil business processes you have to consider the traditional implementation methods.
    Best Practice Chemicals gives you a broad business process set, as usually used in a chemical
    corporation. If you can cover 50% of your needed business processes with Best Practice Chemicals, you save approx. 50% of the implementation time. It is not only implementation effort you save; you save a lot in documentation and training material as well. Most of our Best Practice Chemicals implementations used
    60-80% of Best Practice Chemicals. At a large corporation the percentage of standard ERP processes is normally smaller, because of other additionally needed SAP solutions, e.g. APO, SRM, CRM etc.
    regards Rainer
