BI BPC server strategy and best practices

Hi Everyone,
I have a couple of questions:
1) Is there any white paper or documentation on the pros and cons of having BPC NW installed as an add-on to a BW system, where planning and reporting take place on the same BW system, versus BPC as a separate system used primarily for planning and consolidation only?
2) Is there a best-practice document with performance considerations for BPC development from SAP?
Any answers are appreciated.
Regards
AK

Hi AK,
Both scenarios work well, but for the first scenario, with BPC on top of existing BW reporting, you need to pay special attention to sizing: BPC requires additional capacity, so you need to account for it.
Also, if you have the SEM component on your BW system, check SAP Note 1326576 – SAP NW system with SAP ERP software components.
And before you install BPC, it is recommended to run a quick test of the existing BW reporting process once you have upgraded to NW EhP1 (a prerequisite for BPC).
regards,
Sreedhar

Similar Messages

  • Monthly Windows Server maintenance and patching, best practices?

    WSUS, or System Center Essentials, or something similar from Microsoft itself; it is quite good, and you can delay deployment or manage it as you need.

    Greetings All,
    I would like to reduce our reliance on an MSP for performing server maintenance and patching, and would like to know some best practices for doing updates on production servers.
    Are there programs or services out there that track Microsoft updates for Windows Server 2008+ and list the possible issues they may cause? How do you go about monthly updates for your critical servers?
    Thanks

  • What is the best practice and Microsoft-recommended procedure for placing FSMO roles on the Primary Domain Controller (PDC) and an Additional Domain Controller (ADC)?

    Hi,
    I have Windows Server 2008 Enterprise and have
    two Domain Controllers in my company:
    Primary Domain Controller (PDC)
    Additional Domain Controller (ADC)
    My (PDC) went down due to a hardware failure, but somehow I got a chance to bring it up, and I transferred
    the (5) FSMO roles from the (PDC) to the (ADC).
    Now my (PDC) is repaired and up with the same configuration and settings. (I did not install a new OS or Domain Controller on the existing PDC server.)
    Finally, I want to move the FSMO roles back from the
    (ADC) to the (PDC), so that the (PDC) is up and operational as primary.
    (Before the disaster my PDC held all 5 FSMO roles.)
    Here I want to know the best practice and Microsoft-recommended procedure for the placement of FSMO roles on both the (PDC) and the (ADC).
    If the primary (DC) fails, the additional (DC) should automatically take over without any problem in the live environment.
    For example, the FSMO role distribution between the two servers should be……. ???
    Primary Domain Controller (PDC) should contain: ????
    Schema Master
    Domain Naming Master
    Additional Domain Controller (ADC) should contain: ????
    RID Master
    PDC Emulator
    Infrastructure Master
    Please let me know the best practice and Microsoft-recommended procedure for the placement of FSMO roles.
    I will be waiting for your valuable comments.
    Regards,
    Muhammad Daud

    > Here I want to know the best practice and Microsoft best recommended procedure for the placement of “FSMO Roles both on (PDC) and (ADC)”?
    There is a good article I would like to share with you: http://oreilly.com/pub/a/windows/2004/06/15/fsmo.html
    For me, I do not really see a need to spread the FSMO roles across multiple servers in your case. I would recommend keeping it simple and having a single DC hold all the FSMO roles.
    > In case if Primary (DC) fails then automatically other Additional (DC) should take care without any problem in live environment.
    No, this is not true. Each FSMO role is unique, and if a DC fails, its FSMO roles will not be transferred automatically.
    There are two approaches that can be followed when an FSMO role holder is down:
    1. If the DC can be recovered quickly, I would recommend taking no action.
    2. If the DC will be down for a long time or cannot be recovered, I would recommend that you seize the FSMO roles and do a metadata cleanup.
    Attention! For (2), the old FSMO holder should never be brought back up and online again once the FSMO roles have been seized. Otherwise, your AD may face huge impacts and side effects.
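    As a reference for (2), here is a minimal sketch of seizing the roles with ntdsutil; DC02 is a placeholder for the surviving DC's name, not taken from this thread:
      ntdsutil
      roles
      connections
      connect to server DC02
      quit
      seize schema master
      seize naming master
      seize rid master
      seize pdc
      seize infrastructure master
      quit
      quit
    ntdsutil asks for confirmation on each seize, and it first attempts a graceful transfer before actually seizing the role.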
    This posting is provided "AS IS" with no warranties or guarantees , and confers no rights.

  • What is the best practice to improve MDIS performance when setting up file aggregation and chunk size?

    Hello Experts,
    in our project we plan to change some parameters to improve MDIS performance, and we want to know the best practice for setting file aggregation and chunk size when importing a large number of small files (one file contains one record, and each file is 2 to 3 KB) through the automatic import process.
    Below are the current settings in production:
    Chunk Size = 2000
    No. Of Chunks Processed In Parallel = 40
    File aggregation = 5
    Records per minute processed = 37
    And these are the settings we made in the development system:
    Chunk Size = 70000
    No. Of Chunks Processed In Parallel = 40
    File aggregation = 25
    Records per minute processed = 111
    After making the above changes the import process improved, but we would like an expert opinion before making these changes in production, because there is a huge difference between what is in prod and what we changed in dev.
    thanks in advance,
    Regards
    Ajay

    Hi Ajay,
    The SAP default values are as below:
    Chunk Size = 50000
    No. of Chunks Processed in Parallel = 5
    File aggregation: depends largely on the data. If you have only one or two records being sent at a time, it is better to cluster them together and send them in one shot, instead of sending one record at a time.
    Records per minute processed: same as above.
    Regards,
    Vag Vignesh Shenoy
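    For reference, these parameters live in the Import Server's configuration file, mdis.ini; a minimal sketch using the SAP default values quoted above (the [GLOBAL] section name is an assumption on my part, so check your own mdis.ini for the exact layout):
      [GLOBAL]
      Chunk Size=50000
      No. Of Chunks Processed In Parallel=5
    Note that MDIS typically has to be restarted for changes in mdis.ini to take effect.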

  • What is the best practice concerning View Objects and Lists of Values?

    Hi,
    Let's take these two tables:
    Market_Descriptions
    Id
    Name
    Desc
    Language_Id
    Languages
    Id
    Code
    Desc
    Now, if I generate these business components with the help of the wizard, I will have two entities and two views.
    If I create a list of values on the field Language_Id in the Market_Descriptions table, showing the language code, everything works fine as long as it is dropped as a form, but not as a table or read-only form.
    What is the best practice to include/replace the language_id with the language code, since it is more user-friendly, on a read-only form or a table?
    Thanks.

    Hi,
    Always have the LOV on the field you want to show on the UI. In the scenario you describe, if you want to display the language description on the UI, put the LOV on Desc, not on Language_Id. Even though we can set a display attribute in the LOV configuration, it will only work in forms, not in tables. In tables it will show the language id instead of the description.
    Have the LOV on Desc and add one more join on the Id in the LOV configuration, so that as soon as you select a desc in the LOV, the language_id is populated.
    Here is the join condition inside the LOV:
    Languages.Desc = ViewAccessor.Desc
    Languages.Id = ViewAccessor.Id
    Hope this will help you.
    Dileep.

  • What is the best practice to create IDM user and target accts via recon

    usecase:
    LDAP <---> IDM ----> AD.
    Users exist in LDAP; IDM and AD are empty. I need to create the IDM user and the AD account from the LDAP data.
    I can recon against LDAP and create the IDM user, but I cannot create the AD account in the same recon process. What is the best practice to do the above?

    I think you have to have a "Per-account Workflow" set, which creates the user in AD.

  • What is the best practice for creating master pages and styles with translated text?

    I format translated text all the time for my company. I want to create a set of master pages and styles for each language and then import those styles into future translated documents. That way, the formatting can be done quickly and easily.
    What are the best practices for doing this? As a company we have tried this in the past, but without success. I'd like to know what other people are doing in this regard.
    Thank you!

    I create a master template that is usually void of content, except that I define as many of the paragraph styles as I believe can/will be used, with examples of their use in the body of the document: a style guide for that client. When beginning a new document for that client, I import those styles from the paragraph styles panel.
    The exception to this is when, in a rush, I begin documentation first and then begin new work. Then, in the new work, I still pull those defined paragraph and/or object styles into the new work via their panels.
    There are times I need new styles. If they have broader applicability than a one-off instance or publication, I open the style template for that client, import the style(s) from the publication containing them, and create example paragraphs and usage instructions.
    Take care, Mike

  • What is the Best Practice for Server Pool...

    hi,
    what is the best practice for having server pools, i.e.
    1) having a single large server pool consisting of "n" guest VMs, or
    2) having multiple small server pools, each consisting of a smaller number of guest VMs?
    please suggest.....

    Raja Kondar wrote:
    > wht is the Best Practice for having server pool i.e
    > 1) having a single large serverpool consisting of "n" number of guest vm
    > 2) having a multiple small serverpool consisting of less of number of guest vm
    I prefer option 1, as this gives me the greatest amount of resources available. I don't have to worry about resources in smaller pools. It also means there are more resources across the pool for HA purposes. I am not sure if this is official best practice, but it is a simpler configuration.
    Keep in mind that a server pool should probably have no more than about 20 servers in it: OCFS2 starts to strain after that.

  • What's the 'best practice' way to get email and fax number from vendor?

    Hello *,
    could anybody let me know the best practice for getting the fax number and SMTP address from the vendor master? Is there a preferred function module I should use?
    Thanks a lot,
    Torsten

    Hi,
    try this:
    TYPE-POOLS: szadr.
    TABLES: lfa1.    " vendor master work area: fill lfa1-adrnr first, e.g. SELECT SINGLE adrnr FROM lfa1
    DATA: adr_kompl TYPE szadr_addr1_complete,
          admail    TYPE szadr_adsmtp_line,
          adfax     TYPE szadr_adfax_line.
    * target fields for the result
    DATA: BEGIN OF atab,
            mail       TYPE adr6-smtp_addr,
            fax_number TYPE adr3-fax_number,
          END OF atab.
    * read the complete address data for the vendor's address number
    CALL FUNCTION 'ADDR_GET_COMPLETE'
      EXPORTING
        addrnumber              = lfa1-adrnr
      IMPORTING
        addr1_complete          = adr_kompl
      EXCEPTIONS
        parameter_error         = 1
        address_not_exist       = 2
        internal_error          = 3
        wrong_access_to_archive = 4
        OTHERS                  = 5.
    IF sy-subrc = 0.
    * mail (if several SMTP addresses exist, the last one wins)
      LOOP AT adr_kompl-adsmtp_tab INTO admail.
        MOVE admail-adsmtp-smtp_addr TO atab-mail.
      ENDLOOP.
    * fax
      LOOP AT adr_kompl-adfax_tab INTO adfax.
        MOVE adfax-adfax-fax_number TO atab-fax_number.
      ENDLOOP.
    ENDIF.
    regards Andreas

  • Application Hierarchy and Package Hierarchy - what is the best practice?

    Hi!
    We plan to create a package hierarchy in our system, but then realized that the Application Hierarchy is built the way we were thinking of building the package hierarchy. We planned to build a main package called, for example, HR, below it a structure package called HR-PA, and below that ordinary packages within that area. But should we rather create our own application hierarchy for this?
    Or should we assign the packages to the SAP standard Application Hierarchy in the HR area?
    What is the best practice in this area? Has anyone tried to create their own hierarchies?
    Regards, Tine

    Hi!
    You can create your own Application Hierarchy as well.
    From the ABAP Workbench select Overview -> Application Hierarchy -> SAP or Customer.
    This is what SAP says about this:
    "The application component hierarchy is a method of splitting up the SAP System from a logical or business point of view. Packages are a method of modularizing the system from a technical point of view. This technical modularization can, but need not, match the logical division of the system. Assignments should be made, however, between these two views."
    Regards, Tine

  • What are the best practices for Database management and performance tuning?

    Hello,
    I want to ensure that I am using the best practices for managing and maintaining our Database.
    Is there any documentation out there that outlines how to maintain and ensure top performance out of our database?
    Thank you!
    John Sefton

    I appreciate the responses; however, this is not the information I am looking for.
    I am specifically looking for best practices involving management and performance tuning.
    For example: are there tools that I can install that will monitor the size and response time of the database and alert me if there is degradation in performance?
    Are there specific periodic activities I should be doing to guarantee that my database will continue to function the way it is supposed to?
    Or is this a fire-and-forget solution that does not need this attention?

  • Database log file becomes very big. What's the best practice to handle it?

    The log of my production database is getting very big and the hard disk is almost full. I am pretty new to SAP but familiar with SQL Server; can anybody give me advice on the best practice to handle this issue?
    Should I shrink the database?
    I know increasing the hard disk is needed for the long term.
    Thanks in advance.

    Hi Finke,
    Usually the log file fills up and grows huge due to not having regular transaction log backups. If your database is in FULL recovery mode, every transaction is logged in the transaction log file, and the log gets cleared when you take a log backup. If it is a production system and you don't have regular transaction log backups, the problem is just sitting there waiting to explode when you need a point-in-time restore. Please check your backup/restore strategy.
    Follow these steps to get the transaction log file back into normal shape (see the sketch below):
    1.) Take a transaction log backup.
    2.) Shrink the log file: DBCC SHRINKFILE('logfilename', 10240).
          The above command will shrink the file to 10 GB (a recommended size for highly transactional systems).
    > Finke Xie wrote:
    > Should I Shrink the Database?
    "NEVER SHRINK DATA FILES", shrink only the log file.
    3.) Schedule log backups every 15 minutes.
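    A minimal T-SQL sketch of steps 1 and 2; MyDB, the logical log file name MyDB_log, and the backup path are placeholders, not from this thread:
      -- 1) back up the transaction log (in FULL recovery this clears the inactive log space)
      BACKUP LOG MyDB TO DISK = 'E:\Backup\MyDB_log.trn';
      -- 2) shrink only the log file; the target size is in MB (10240 MB = 10 GB)
      DBCC SHRINKFILE ('MyDB_log', 10240);
    For step 3, the BACKUP LOG statement can be scheduled as a SQL Server Agent job that runs every 15 minutes.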
    Thanks
    Mush

  • What are the best practices to migrate VPN users for an inter-forest migration?

    What are the best practices to migrate VPN users for an inter-forest migration?

    It depends on various factors. There is no generic solution or best-practice recommendation. Which migration tool are you planning to use?
    Quest (QMM) has a VPN migration solution/tool.
    With ADMT, you can develop your own service-based solution if required. I believe it was mentioned in my blog post.
    Santhosh Sivarajan | Houston, TX | www.sivarajan.com
    ITIL,MCITP,MCTS,MCSE (W2K3/W2K/NT4),MCSA(W2K3/W2K/MSG),Network+,CCNA
    Windows Server 2012 Book - Migrating from 2008 to Windows Server 2012
    This posting is provided AS IS with no warranties, and confers no rights.

  • What is the best practice to display info of a completed task in the process flow?

    Hi all,
    I'm starting to study BPM modeling with CE 7.1 EHP1. Thanks to the tutorials and examples on the SDN site, I can easily build my own process in NWDS, deploy it to the server, start it, and finish it.
    I like the new runtime, which can show a BPMN diagram to the processors. However, I can't find a way to let the follow-up processor review the task result completed in the previous step. I'm more familiar with Guided Procedures, and know there is a "Display Callable Object" which can be used to show some info of a completed task when the processor/owner/admin/overseer clicks on it. Where is this feature in BPM? What is the best practice to show such task information in the BPM environment?
    For example, in a multi-level approval process, the higher-level approver needs to see the comment written by the previous approver. Can he read this information from the process flow?
    I think this is a very important feature for a BPM platform. In Guided Procedures, such a requirement can be met with Display Callable Object + view permission, and you just need some coding for the UI. If BPM is superior to GP, I think there must be a way to achieve this; I just do not know how.
    Can anyone shed some light on it?

    Oliver,
    Thanks for your quick reply.
    Yes, Notes and Attachments CAN BE USED for this purpose. But I'm still looking for a more elegant solution.
    With the Notes/Attachments solution, the processor needs to give input in two places, the task UI and the note/attachment, with similar or the same data. It is really annoying.
    Are there any real-world SAP BPM deployments? Has no customer had this requirement?

  • Informatica and Essbase Best Practice questions

    We now have the Informatica adapter for Essbase installed and working, and we have been able to get Informatica to load data successfully. Now I have a few questions that I have not been able to find answers to in any documentation or forums for Informatica or Essbase. I have submitted these same questions to Informatica Support, but thought I would also post them here to see whether many folks are using Informatica against Essbase.
    We are using:
    Informatica 8.6.1 (Linux)
    Essbase 11.1.1.3 (Windows 2003)
    1) I can see in Informatica that when we load data to Essbase (target) it gives me the option to run a calc script AFTER it loads the data. However, if I need to run a calc script BEFORE the load to Essbase (target), what is the best practice? The workaround I have found is to add the same session twice and, for the first instance, select the option to 'ONLY RUN THE CALC SCRIPT' on the mapping tab. The problem with this is that the log shows it will still run the query against the source tables. This will impact run times and double the querying against the source database. What is the best practice and proper way to build the workflow to run a calc script BEFORE the load?
    2) Since you do not see the list of calc scripts for Essbase in Informatica (you have to type the calc name manually), if I want to run the 'default' calc for Essbase, what is the syntax? I tried 'Default' but it didn't seem to work.
    3) I have other tasks in Essbase I want to do before actually having Informatica load the data. I would like to run MaxL commands via a Command task. What is the best practice for doing this, and what is the syntax to run MaxL commands in a Command task in Informatica (see the MaxL sketch after this post)? I previously had shell scripts built on the Informatica server that would be kicked off from within Informatica, but we are trying to move away from shell scripts and instead have the scripting code IN the workflows/sessions, to make it easier to review the code and follow the logic, rather than having to find the scripts and open each of them.
    Any assistance you can give with making the two products work together would be GREATLY appreciated!
    Robert
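    For reference on (2) and (3), a minimal MaxL sketch; the credentials, server, and the 'MyApp'/'MyDb'/'MyCalc' names are placeholders, not from this thread. A Command task could call the MaxL shell with something like essmsh run_calc.msh, where run_calc.msh contains:
      login 'admin' 'password' on 'essbase-server';
      /* run a named calc script before the load */
      execute calculation 'MyApp'.'MyDb'.'MyCalc';
      /* or run the database's default calc instead */
      execute calculation default on 'MyApp'.'MyDb';
      logout;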

    As far as I know, addUser(User user) { ... } is much more useful, for several reasons:
    1. It's object oriented.
    2. It's easy to write, because if an object has many parameters it is very painful to write a method with comma-separated parameters.
