SLD Landscape Best Practice Recommendation

I am seeking advice on setting up SLD in my ever-growing landscape.
I currently have a master SLD on the same system as my NWDI system and a local SLD on my Solution Manager 7.0 system that is updated from the master.
The master SLD is shared by our ECC 6.0 dual-stack landscape and my BI 7.0 dual-stack portal landscape.
I have upcoming projects of implementing a PI 7.1 landscape, implementing CTS+, and a Solution Manager Enterprise upgrade, all of which will be heavily dependent on SLD.
I have seen documentation that PI would like its own local SLD.
My question is: what would be the preferred SLD landscape, and how do I get there? Any recommendations or best practices would be most appreciated.
Bill Stouffer
Basis Administrator.

Hi,
The SLD setup we have implemented in our landscape is as follows:
1) All PI and Portal systems have a local SLD.
2) All non-production systems share one SLD, and the production systems have a separate SLD.
3) This means we are following the 3-tier SLD landscape recommended by SAP.
4) The main SLD resides on Solution Manager; production and non-production each have their own SLD, and both send their data on to the main SLD on Solution Manager.
5) All systems except PI and Portal send data directly to the production or non-production SLD. PI and Portal systems send data to their local SLD first, which in turn forwards it to the production or non-production SLD.
6) This keeps the whole environment secure, because the production SLD is kept separate.
So I recommend the 3-tier SLD approach. One important point: don't use a central user to send data across SLDs, because a single user lock would then take down data collection for the whole environment. Instead, create a system-specific user for each system's data transfer, so that a user lock on one system does not impact the others.
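The per-system data-supplier user advice above can be sketched as a small check. This is a minimal illustration, assuming a SLDDS_&lt;SID&gt; naming convention, which is an example pattern of my own, not an SAP default:

```python
# Sketch: one SLD data-supplier user per system instead of a shared
# central user, so a lock on one user cannot stop every data supplier.
# The SLDDS_<SID> pattern is an illustrative convention, not SAP's.

def supplier_user(sid: str) -> str:
    """Return a per-system SLD data-supplier user name for a system ID."""
    sid = sid.strip().upper()
    if len(sid) != 3 or not sid.isalnum():
        raise ValueError(f"not a valid 3-character SAP system ID: {sid!r}")
    return f"SLDDS_{sid}"

def audit_suppliers(assignments: dict) -> list:
    """Flag systems that share a data-supplier user with another system."""
    seen = {}
    shared = []
    for sid, user in assignments.items():
        if user in seen:
            shared.extend([seen[user], sid])  # both sharers are flagged
        seen[user] = sid
    return sorted(set(shared))

if __name__ == "__main__":
    # Two systems sharing one central user -> both get flagged.
    print(audit_suppliers({"DEV": "SLD_CENTRAL", "QAS": "SLD_CENTRAL",
                           "PRD": supplier_user("PRD")}))  # ['DEV', 'QAS']
```

A report like this makes it easy to spot a central user creeping back into the landscape before it causes a lockout.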
If you need any other information please let me know.
Thanks
Sunny

Similar Messages

  • SLD Best Practices

    Hi
    From where can I get the best practices related to SLD?
    My XI system Landscape contains following systems.
    Sandbox
    Development
    Quality
    Production
    I am considering following options:
    Individual SLD for each system.
    One SLD for Sandbox, Second for Development and Quality and Third for Production.
    One for Sandbox, Development and Quality and Second for Production.
I have some advantages and disadvantages in mind for each approach, but I would ask the forum to please share their experiences.
Further to this, there is also a Solution Manager in the landscape, which has its own SLD. Should that be utilized, or should it be kept separate from the XI systems?

    Hi,
In my opinion, one SLD for each environment is good.
Just go through the following links:
    SLD
    Central SLD vs SLD for each XI instance
    SLD best practices
    SAP NW 2004(s) SOLMAN/SLD/SLM best practice
    SLD best practices
    Check how to guides from service.sap.com.
    Hope this helps..
    Regards,
    Moorthy

  • Best Practices in SLD

Which best practices need to be followed for SLD?
Currently our SLDs are maintained by Basis, and our development team has only read access. Is this normal?
    Thanks

Though it is irritating for an XI developer, from a best-practice perspective it is better that not everyone on the development team has access to modify the SLD.
SLD configuration is generally a one-time activity; changes are only needed when you add or delete systems in your system landscape, and those should be made by the Basis team.
Cheers

  • Best Practice of using ERM (Role Expert) in Landscape

    Hello,
    Can anyone tell me what is the best practice (choice) of using ERM in the SAP landscape?
    1. Creating a role in DEV system using ERM and using SAP standard transport process to transport role to QAS and PRD systems.
    OR
2. Creating a role in all systems in the landscape (DEV, QAS and PRD).
    Please share if you have any best practice implementation scenarios.
    Appreciate for the help.
    Thanks
    Harry.

    Harry,
The best practice is to follow option 1. You should never create a role directly in the production system. This is what SAP recommends as well.
    Alpesh

  • WCEM Best Practice deployment in a multi CRM Landscape

    Hi SCN
I'm looking for advice on best-practice deployment of WCEM, specifically in a multi-CRM landscape scenario.
    Do best practices exist?
    JR

    Look into using NWDI as your source code control (DTR) and transport/migration from dev through to production.  This also will handle the deployment to your dev system (check-in/activate).
    For unit testing and debugging you should be running a local version (NWDW).  This way once the code is ready to be shared with the team, you check it in (makes it visible to other team members) and activate it (deploys it to development server).
We are currently using a separate server for WD applications rather than running them on the portal server. However, this does not allow the WD app to run in the new WD iView, so it depends on what the WD app needs to do and have access to. Of course there is always the Federated Portal Network as an option, but that is a whole other topic.
    For JCo connections, WD uses a connection name and this connection can be set up to point to different locations depending on which server it is on.  So on the development server the JCo connection can point to the dev back-end and in prod point to the prod back-end.  The JCo connections are not migrated, but setup in each system.
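The per-environment destination idea can be sketched as a simple lookup: the same logical connection name resolves to a different back-end depending on which server it lives on. The host names and the WD_ERP_BACKEND destination name below are made-up examples, not values from the thread:

```python
# Sketch: one logical JCo destination name, resolved per environment.
# On the dev server it points to the dev back-end; in prod, to prod.
# All hosts, system numbers, and clients here are hypothetical.

BACKENDS = {
    "WD_ERP_BACKEND": {
        "dev":  {"ashost": "erpdev.example.com", "sysnr": "00", "client": "100"},
        "qa":   {"ashost": "erpqas.example.com", "sysnr": "00", "client": "200"},
        "prod": {"ashost": "erpprd.example.com", "sysnr": "00", "client": "300"},
    },
}

def resolve(destination: str, environment: str) -> dict:
    """Look up connection parameters for a logical destination name."""
    try:
        return BACKENDS[destination][environment]
    except KeyError:
        raise KeyError(f"no {environment!r} settings for {destination!r}") from None

if __name__ == "__main__":
    print(resolve("WD_ERP_BACKEND", "dev")["ashost"])  # erpdev.example.com
```

Because the mapping is maintained in each system rather than migrated, the WD code itself never needs to change when it moves from dev to prod, which matches the point above that JCo connections are set up per system.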
    I hope this helps.  There is a lot of documentation available for NWDI to get you started.  See:  http://help.sap.com/saphelp_erp2005/helpdata/en/01/9c4940d1ba6913e10000000a1550b0/frameset.htm
    -Cindy

  • Best practice SAP landscape

    Hello all,
    I would like to know if there is some kind of best practice regarding SAP landscape in a big company.
For example, is it recommended to have a SAP Quality Assurance system in the landscape open for customizing (transaction SCC4), so that quick customizing tests can be performed at any moment, instead of customizing in the Development system and then transporting to QAS? (The latter can be very frustrating, because solving and testing an issue may require numerous customizing tasks and resets of customizing.)
    How SAP compliant would this solution be?
    Thank you very much for your help!
    Daniel Nicula

Hmmm, I do not know whether this question can be posed here in the GRC-related threads, but it seemed to me that it is somehow connected.
Anyway, I agree with you that final customizing should be done in DEV and then transported to QAS.
What I am not sure about is whether it is against SAP recommendations to have a QAS open for customizing and to try out all the solutions for an issue there. Then, once you are sure of what you want to do and obtain, you do the customizing in DEV as well and follow the normal transport route.
What could the risks be if you have a QAS open for customizing?
Thank you.
    Thank you.

  • Question about Best Practices - Redwood Landscape/Object Naming Conventions

    Having reviewed documentation and posts, I find that there is not that much information available in regards to best practices for the Redwood Scheduler in a SAP environment. We are running the free version.
1) The job scheduling for SAP reference book (SAP Press) recommends multiple Redwood installations and using export/import to move jobs and other Redwood objects from, say, DEV->QAS->PROD. Presentations from the help.sap.com web site show the Redwood Scheduler linked to Solution Manager and handling job submissions for DEV-QAS-PROD. Point and Shoot (just be careful where you aim!) functionality is described as an advantage of the product. There is a SAP note (#895253) on making Redwood highly available. I am open to comments, inputs, and suggestions on this issue based on SAP client experiences.
    2) Related to 1), I have not seen much documentation on Redwood object naming conventions. I am interested in hearing how SAP clients have dealt with Redwood object naming (i.e. applications, job streams, scripts, events, locks). To date, I have seen in a presentation where customer objects are named starting with Z_. I like to include the object type in the name (e.g. EVT - Event, CHN - Job Chain, SCR - Script, LCK - Lock) keeping in mind the character length limitation of 30 characters. I also have an associated issue with Event naming given that we have 4 environments (DEV, QA, Staging, PROD). Assuming that we are not about to have one installation per environment, then we need to include the environment in the event name. The downside here is that we lose transportability for the job stream. We need to modify the job chain to wait for a different event name when running in a different environment. Comments?

    Hi Paul,
As suggested in the book 'Job Scheduling for SAP' from SAP Press, it is better to have multiple instances of Cronacle (at least 2: one for development and quality, and a separate one for production). This avoids confusion.
Regarding transporting/replicating the object definitions: it is really easy to import and export objects like events, job chains, scripts, locks, etc. It is also easy and quick to create them afresh in each system; only complicated job chains can be time-consuming to build.
In normal cases, testing of background jobs mostly happens only in the SAP quality instance, with final scheduling in production. So it is very much possible to simply export the verified script / job chain from the Cronacle quality instance and import it into the Cronacle production instance (use of the Cronacle shell is recommended for fast processing).
Regarding OSS note 895253: yes, it is highly recommended to keep your central repository, processing server, and licensing information in a highly available clustered environment. This is very much required, as Redwood Cronacle acts as the central job scheduler in your SAP landscape (with the OEM version).
As you have confirmed, you are using OEM, and hence you have only one process server.
Regarding naming conventions, it is recommended to create a centrally accessible naming convention document and then follow it. For example, in my company we use a naming convention for jobs such as Z_AAU_MM_ZCHGSTA2_AU01_LSV, where A is for the APAC region, AU is for Australia (country), MM is for Materials Management, and ZCHGSTA2_AU01_LSV is free text as provided by the batch job requester.
For other Redwood Cronacle-specific objects you can also derive naming conventions based on SAP instances; for example, if you want all related scripts / job chains to be stored in one application, its name can be APPL_<logical name of the instance>.
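A naming convention document like this can also be enforced mechanically. Here is a minimal sketch, assuming an illustrative Z_&lt;TYPE&gt;_&lt;ENV&gt;_&lt;NAME&gt; pattern built from the type prefixes (EVT/CHN/SCR/LCK), environments, and 30-character limit mentioned in the question; the exact pattern is my assumption, not a Redwood requirement:

```python
# Sketch: validate Redwood/Cronacle object names against a convention
# such as Z_<TYPE>_<ENV>_<NAME>. The pattern, prefixes, and environment
# tags are illustrative choices, not Redwood defaults.
import re

MAX_LEN = 30
TYPES = {"EVT", "CHN", "SCR", "LCK"}
ENVS = {"DEV", "QA", "STG", "PRD"}
PATTERN = re.compile(r"^Z_(?P<type>[A-Z]{3})_(?P<env>[A-Z]{2,3})_(?P<name>[A-Z0-9_]+)$")

def check_name(name: str) -> list:
    """Return a list of problems; an empty list means the name conforms."""
    problems = []
    if len(name) > MAX_LEN:
        problems.append(f"longer than {MAX_LEN} characters")
    m = PATTERN.match(name)
    if not m:
        problems.append("does not match Z_<TYPE>_<ENV>_<NAME>")
        return problems
    if m["type"] not in TYPES:
        problems.append(f"unknown object type {m['type']!r}")
    if m["env"] not in ENVS:
        problems.append(f"unknown environment {m['env']!r}")
    return problems

if __name__ == "__main__":
    print(check_name("Z_EVT_DEV_ORDER_LOAD_DONE"))  # []
    print(check_name("Z_XXX_DEV_ORDER_LOAD_DONE"))  # unknown object type
```

Note that embedding the environment tag in the name illustrates the transportability trade-off raised in the question: a job chain waiting on Z_EVT_DEV_... must be adjusted when it moves to another environment.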
So in a nutshell, this approach is highly recommended.
The integration of SAP Solution Manager with Redwood is there to receive monitoring and alerting data and to pass Redwood Cronacle information to SAP Solution Manager, creating a single point of control. You can find information on the purpose of the XAL and XMW interfaces in the Cronacle help (F1).
Hope this answers your queries. Please write if you need more information / help in this regard.
    Best regards,
    Vithal

  • Best Practice to transport MDM Workflows trough a 3-tier Landscape

    Hello Experts,
    I wanted to ask you for some Best practices to transport MDM Workflows from D-to Q/P?
    Well besides the Archive/Unarchive method because we want to keep the Data of P in the repository
    and do not want to import the whole.
    Any ideas? Points are for sure!!
    Best regards
    Carsten

    Hi Carsten,
Currently, with MDM version 5.5, the only transport mechanism that will allow you to move your workflows from the development server to quality or production is the Archive/Unarchive method,
but this method archives the data along with the workflow. There is no method to export the workflow alone.
With the advent of MDM 7.1, however, new transport mechanisms are included, and hopefully they address this issue.
A workaround that I can think of at the moment:
- If you have your development server as your master repository and the production server as your slave,
- then you can call the workflow from D to P (provided you do not presently have any data in dev),
- and then normalize the production server to make it an individual master repository with its own data.
I am not sure how well this solution will help; it is just a suggestion.
Hope it helped.
    Thanks & Regards
    Simona Pinto

  • Best Practice of copying Production to Development

Hi everyone. My management would like some documentation on the best practice of copying Production systems to Development.
1. In the past, we've set up the clients and connections/interfaces and shrunk the database by deleting unneeded production data. Now that there is note 130906, saving the ABAP versions may be easier.
2. Every two or three years we copy our BIW Production to Development. This is also labor-intensive, since the development system has to be set up again with its source systems.
3. We've copied our PI 7.0 system to a sandbox, and again this is labor-intensive because the PI object names have to be reset, since the system names are embedded in the object names.
4. I've done Java systems too, so I'm not too concerned about these.
5. There is talk of copying the following Production systems to Development systems: SLD, PI 7.1, Solution Manager, SAP R/3, BIW, MII... some of these make sense; others we question whether they are worth it.
As a note, we copy our SAP R/3 Production instance to our QA environment about twice a year, and to another QA system monthly. We also copy our CRM Production to our QA environment twice a year and BIW Production to the QA environment once a year. We can leverage what we know about copying PRD to QA, but we want to give them alternatives for copying PRD to DEV.
We'd like to use SAP's TDMS to move data from PRD to DEV, but unfortunately that project has not (yet) gotten management attention as a high priority.
Rather than take my word for it, is there any SAP-provided documentation that would give my management a set of alternatives on their desire to copy our Production systems to Development?
Thanks... Scott B.

From an E2E200 perspective, SAP recommends the following landscape:
Maint = Dev -> QA -> Pre-Prod -> Prod
N+1 = Dev2 -> QA2 -> Pre-Prod -> Prod
In the Maint track, all object versions ideally remain the same, with transports used only to fix a short-dumping Prod system. The N+1 track is where implementation/upgrade projects are worked on; it allows you to import the entire project of transports.
Another consideration: when you restore a Prod system onto a Dev (or new Dev) system, you destroy all of your version info, which means your developers can't view or revert changes to objects.

  • PI best practice and integration design ...

I'm currently at a site that has multiple PI instances, one for each region, and the question of inter-region integration has been raised. My initial impression is that each PI will be in charge of integration for its regional landscape, and inter-region communications will be conducted through a PI-to-PI interface. I haven't come across any best practice in this regard and have never been involved with a multiple-PI landscape.
Any thoughts? Or links to best practice for this kind of landscape?
To summarize:
I think this is the best way to set it up. Although numerous other combinations are possible, this seems to be the best way to avoid any significant system coupling when talking about ECC-to-ECC inter-region communications:
AUS ECC -> AUS PI -> USA PI -> USA ECC

    abhishek salvi wrote:
I need to get data from my local ECC to the USA ECC; do I send the data to their PI / my PI / directly to their ECC? All will work; all are valid.
If LocalECC --> one PI --> USA ECC is valid, then you don't have to go for another PI in between... why increase the processing time? It seems to be a good option to bet on.
    The issue is
1. Which PI system should any given piece of data be routed through, and how do you manage the resulting spider web of interfaces, with PI AUS talking to ECC US, ECC AU, BI US, and BI AU, and the reverse for the PI USA system?
2. The increased Integration Engine-to-Integration Engine processing time should be minimal, and this approach means a consistent set of interfaces for support and debugging, not to mention the simplification of the SLD contents in each PI system.
I tend to think of it like network routing: the PI system is the default gateway for any data not bound for a local system. You send the data and let PI figure out what the next step is.
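That default-gateway analogy can be sketched as a small routing table. The system names below are made up to stand in for the regional landscapes discussed in the thread:

```python
# Sketch: each sender forwards to its regional PI; each PI delivers
# locally or forwards to the peer region's PI, like a default gateway.
# System names (AUS_PI, USA_ECC, ...) are hypothetical placeholders.

LOCAL_SYSTEMS = {
    "AUS_PI": {"AUS_ECC", "AUS_BI"},
    "USA_PI": {"USA_ECC", "USA_BI"},
}
PEER = {"AUS_PI": "USA_PI", "USA_PI": "AUS_PI"}

def route(current_pi: str, target: str, hops=None) -> list:
    """Return the hops a message takes from a PI to the target system."""
    hops = (hops or []) + [current_pi]
    if target in LOCAL_SYSTEMS[current_pi]:
        return hops + [target]          # deliver within the local landscape
    if len(hops) > len(PEER):           # visited every PI without a match
        raise ValueError(f"no PI owns target system {target!r}")
    return route(PEER[current_pi], target, hops)

if __name__ == "__main__":
    print(route("AUS_PI", "USA_ECC"))  # ['AUS_PI', 'USA_PI', 'USA_ECC']
```

The appeal of this scheme is exactly what the post describes: each sender only ever needs to know its own regional PI, and the inter-region complexity stays inside the PI-to-PI hop.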
    abhishek salvi wrote:
But then what about this statement (is it a restriction or a business requirement)?
Presently the directive is that each PI will manage communications only with its own respective landscape.
When talking about multiple landscapes (dev / test / QA / prod), each landscape generally has its own PI system; this is an extension of the same idea, except that here both systems are productive. From an interface and customization point of view, given the geographical remoteness of each system, local interface development and support for local systems makes sense. While not limited to this kind of interaction, interfaces for a given business function at a given location (location-specific logic) would typically be developed in concert with that interface, and as such have no real place on the remote PI system.
    To answer your question there is no rule, it just makes sense.

  • Best Practice for setting systems up in SMSY

    Good afternoon - I want to cleanup our SMSY information and I am looking for some best practice advice on this. We started with an ERP 6.0 dual-stack system. So I created a logical component Z_ECC under "SAP ERP" --> "SAP ECC Server" and I assigned all of my various instances (Dev, QA, Train, Prod) to this logical component. We then applied Enhancement Package 4 to these systems. I see under logical components there is an entry for "SAP ERP ENHANCE PACKAGE". Now that we are on EhP4, should I create a different logical component for my ERP 6.0 EhP4 systems? I see in logical components under "SAP ERP ENHANCE PACKAGE" there are entries for the different products that can be updated to EhP4, such as "ABAP Technology for ERP EHP4", "Central Applications", ... "Utilities/Waste&Recycl./Telco". If I am supposed to change the logical component to something based on EhP4, which should I choose?
    The reason that this is important is that when I go to Maintenance Optimizer, I need to ensure that my version information is correct so that I am presented with all of the available patches for the parts that I have installed.
    My Solution Manager system is 7.01 SPS 26. The ERP systems are ECC 6.0 EhP4 SPS 7.
    Any assistance is appreciated!
    Regards,
    Blair Towe

    Hello Blair,
In this case you have to assign the products 'EHP 4 for ERP 6' and 'SAP ERP 6' to your system in SMSY.
You will then have 2 entries in SMSY, one under each product; the main instance for EHP 4 for ERP 6 must be Central Applications, and the one for SAP ERP 6 is SAP ECC Server.
This way your system should be correctly configured to use MOPZ.
Unfortunately, I'm not aware of a guide explaining these details.
Sometimes the System Landscape guide at service.sap.com/diagnostics can be very useful. See also note 987835.
    Hope it can help.
    Regards,
    Daniel.

  • Basic Strategy / Best Practices for System Monitoring with Solution Manager

I am very new to SAP and to the Basis group at my company. I will be working on a project to identify best practices for system- and service-level monitoring using Solution Manager. I have read a good amount about SAP Solution Manager and the concept of monitoring, but need to begin mapping out a monitoring strategy.
We currently utilize the RZ20 transaction and basic CCMS monitors, such as watching for update errors, availability, short dumps, etc. What else should be monitored in order to proactively find possible issues? Are there any best practices you have found when implementing monitoring for new solutions added to the SAP landscape? What are common things we would want to monitor over, say, ERP, CRM, SRM, etc.?
    Thanks in advance for any comments or suggestions!

    Hi Mike,
    Did you try the following link ?
    If not, it may be useful to some extent:
    http://service.sap.com/bestpractices
    ---> Cross-Industry Packages ---> Best Practices for Solution Management
    You have quite a few documents there - those on BPM may also cover Solution Monitoring aspects.
    Best regards,
    Srini

  • Best practice for extracting data to feed external DW

    We are having a healthy debate with our EDW team about extracting data from SAP.  They want to go directly against ECC tables using Informatica and my SAP team is saying this is not a best practice and could potentially be a performance drain.  We are recommending going against BW at the ODS level.  Does anyone have any recommendations or thoughts on this?

    Hi,
As you asked for best practice, here it is in a SAP landscape.
1. Full load or delta load data from SAP ECC to SAP BI (BW): SAP BI understands the data element structure of SAP ECC, and the delta mechanism is a continuous process of data loading from SAP ECC (the transaction system) to BI (the analytic system).
2. You can store transaction data in DSOs (at a granular level) and in InfoCubes (at a summarized level) within SAP BI. Master data from SAP ECC can come into SAP BI separately.
3. Within SAP BI, you SHOULD use the OpenHub service to provide SAP BI data to external systems. You must not connect an external extractor to fetch data directly from DSOs and InfoCubes into the target system. The OpenHub service is the tool that facilitates data feeds to external systems; you can have Informatica take data from the OpenHubs of SAP BI.
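For illustration, here is a minimal sketch of what the consuming side of an OpenHub file destination might look like. The semicolon-delimited layout and the field names are assumptions on my part; the actual format depends entirely on how the OpenHub destination is configured in BI:

```python
# Sketch: an external consumer picking up an OpenHub *file* destination
# export, instead of reading DSO/InfoCube tables directly. The layout
# below (semicolon-delimited with a header row) is a hypothetical
# example, not a fixed OpenHub format.
import csv
import io

SAMPLE_EXPORT = """MATERIAL;PLANT;QUANTITY
M-01;1000;25
M-02;1000;40
"""

def load_openhub_file(fh) -> list:
    """Parse one OpenHub file export into a list of row dicts."""
    return [dict(row) for row in csv.DictReader(fh, delimiter=";")]

if __name__ == "__main__":
    rows = load_openhub_file(io.StringIO(SAMPLE_EXPORT))
    print(len(rows), rows[0]["MATERIAL"])  # 2 M-01
```

The point is that the external tool consumes a deliberately published interface (the OpenHub target) rather than the internal DSO/InfoCube tables, which keeps BI free to change its internal data model.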
Hope I have explained this to your satisfaction.
    Thanks,
    S

  • Best practice architecture Wireless security

What is the best-practice architecture for connecting wireless to the wired network?
Use an AP to a firewall, and that to a router using RADIUS?
Is using a controller a safe approach?
Which models does Cisco recommend (hardware and software)?
Is there any place on Cisco's site with architecture recommendations that integrate wireless, radio (microwave), and Voice over IP into a complete system?

Use one of the 802.1x types (i.e. LEAP, EAP-FAST, PEAP) with WPAv2 (AES encryption). Too bad that not many wireless adapters support AES.
All Cisco wireless products recently added AES support in 12.3(2)JA.
Also, you may want to configure WDS for radio management.

  • Best practice in migrating to a production system

    Dear experts,
What is the best practice to follow during an implementation project for organizing the development, quality, and production environments?
In my case, considering that SRM is connected to the back-end development system, what should be done to connect SRM to the back-end quality environment:
- connect the same SRM server to the back-end quality environment, even though in this case the old data remains in SRM, or
- connect another SRM server to the back-end quality environment?
thanks,
    thanks,

    Hello Gaia,
If you have a 3-system landscape, the back-end connections should be like this:
    SRM DEV   - ERP DEV
    SRM QAS   - ERP QAS
    SRM PRD - ERP PRD
If you have a 2-system landscape:
    SRM(client 100) - ERP DEV
    SRM(client 200) - ERP QAS
    SRM PRD         - ERP PRD
    Regards,
    Masa
