Best Practices in SLD

Best practices need to be followed for the SLD.
Currently our SLDs are maintained by the Basis team, and our development team has only read access. Is this normal?
Thanks

Though it's irritating for an XI developer, from a best-practice perspective it is better if not everyone in development has access to modify the SLD.
SLD configuration is generally a one-time activity; changes are only made when you need to add or delete systems in your system landscape, and that should be done by the Basis team.
Cheers

Similar Messages

  • SLD Best Practices

    Hi
    Where can I get the best practices related to the SLD?
    My XI system landscape contains the following systems:
    Sandbox
    Development
    Quality
    Production
    I am considering the following options:
    An individual SLD for each system.
    One SLD for Sandbox, a second for Development and Quality, and a third for Production.
    One for Sandbox, Development and Quality, and a second for Production.
    I have some advantages and disadvantages in mind for each approach, but I would ask the forum to please share their experiences.
    Further to this, there is also a Solution Manager in the landscape which has its own SLD. Should that be utilized, or should it be kept separate from the XI system?

    Hi,
    In my opinion, one SLD for each environment is good.
    Just go through the following links:
    SLD
    Central SLD vs SLD for each XI instance
    SLD best practices
    SAP NW 2004(s) SOLMAN/SLD/SLM best practice
    SLD best practices
    Check how to guides from service.sap.com.
    Hope this helps..
    Regards,
    Moorthy

  • SLD Landscape Best Practice Recommendation

    I am seeking advice on setting up the SLD in my ever-growing landscape.
    I currently have a master SLD on the same system as my NWDI system, and a local SLD on my Solution Manager 7.0 system that is updated from the master.
    The master SLD is shared by our ECC 6 dual-stack landscape and my BI 7.0 dual-stack portal landscape.
    I have upcoming projects of implementing a PI 7.1 landscape, implementing CTS+, and a Solution Manager Enterprise upgrade, all of which will be heavily dependent on the SLD.
    I have seen documentation indicating that PI would like its own local SLD.
    My question is: what would be the preferred SLD landscape, and how do I get there?  Any recommendations or best practices would be most appreciated.
    Bill Stouffer
    Basis Administrator

    Hi,
    The SLD setup that we have implemented in our landscape is as follows:
    1) All PI and Portal systems have a local SLD.
    2) We have one SLD for all non-production systems and a separate SLD for production systems.
    3) This means we are following the three-tier SLD landscape recommended by SAP.
    4) The main SLD lies on Solution Manager; production and non-production each have their own SLD, and both send their data to the main SLD on Solution Manager.
    5) All systems except PI and Portal send data directly to the production or non-production SLD. PI and Portal systems first send data to their local SLD, which in turn forwards it to the production or non-production SLD.
    6) This way your whole environment is secure, because the production SLD is kept separate.
    So I recommend the three-tier SLD approach. One important thing: don't use a central user to send data across SLDs, because a lock on that one user would bring down the whole environment. Instead, create a system-specific user for each data transfer, so that a user lock on one system will not impact the others.
    If you need any other information please let me know.
    Thanks
    Sunny

  • Best Practice of copying Production to Development

      Hi everyone.  My management would like some documentation on the best practice of copying Production systems to Development.
    1. In the past, we've set up the clients and connections/interfaces and shrunk the database by deleting unneeded production data. Now that there is Note 130906, saving the ABAP versions may be easier.
    2. Every two or three years we copy our BIW Production to Development.  It is also labor-intensive to set up the development system again with source systems.
    3. We've copied our PI 7.0 system to a sandbox, and again this is labor-intensive because the PI object names have to be reset, since the system names are embedded in the object names.
    4. I've done Java systems too, so I'm not too concerned about those.
    5. There is talk of copying the following Production systems to Development systems: SLD, PI 7.1, Solution Manager, SAP R/3, BIW, MII. Some of these make sense; for others we question whether it is worth it.
      As a note, we copy our SAP R/3 Production instance to our QA environment about twice a year and to another QA system monthly.  We also copy our CRM Production to our QA environment twice a year and BIW Production to QA once a year.  We can leverage what we know about copying PRD to QA, but we want to give them alternatives for copying PRD to DEV.
    We'd like to use SAP's TDMS to move data from PRD to DEV, but unfortunately that project has not (yet) gotten management attention as a high priority.
    Rather than take my word for it, is there any SAP-provided documentation that would give my management a set of alternatives regarding their desire to copy our Production systems to Development?
    Thanks...Scott B.

    From an E2E200 perspective, SAP recommends the following landscape:
    Maint: Dev -> QA -> Pre-Prod -> Prod
    N+1:   Dev2 -> QA2 -> Pre-Prod -> Prod
    The maintenance track is where all object versions ideally remain the same, with only the transports needed to fix a short-dumping Prod system.  The N+1 track is where implementation/upgrade projects are worked on, and it allows you to import the entire project's transports.
    Another consideration: when you restore a Prod system onto a Dev (or new Dev) system, you destroy all of your version information, which means your developers can't view or revert changes to objects.

  • Upgrade from XI 3.0 (SP17) to PI 7.0 (SP12): Best Practices

    Hi mates,
    I have a few questions regarding the upgrade to PI 7.0.
    If the upgrade process is done in a phased manner, i.e. DEV, QA, PRD in that order over a period of 3 weeks,
    how do we handle the transports from DEV->QA->PRD when DEV is on 7.0 and QA & PRD are on 3.0, using CMS?
    Will there be inconsistencies when transports are moved, because a few features could be different, or the internal representation of IR/ID content could differ between 3.0 and 7.0?
    What are the upgrade best practices: phased, or all at once?
    Please throw more light on this. I would appreciate input from people who have helped clients upgrade from 3.0 to 7.0. Is there any SAP documentation related to this?
    I looked at the thread "Transports from XI 3.0 to PI 7.0" but the answer is very generic, without any supporting documentation or real-world experience.
    Thanks in advance,
    praveen

    <i>> Hi mates,
    >
    > I've a few questions regarding the upgrade to PI
    > 7.0.
    >
    > If the upgrade process is done in phased manner i.e.
    > DEV, QA, PRD in that order over a period of 3 weeks,
    > How do we handle the transports from DEV->QA->PRD
    > when DEV is 7.0 and QA & PRD on 3.0 using CMS?</i>
    In my experience you should implement a change freeze for this period, with the exception of emergency transports, and these have to be justified.
    <i>> Will there be inconsistencies when transports are
    > moved, because a few features could be different or
    > internal representation of IR/ID content could be
    > different in 3.0 & 7.0?</i>
    Full regression testing will be needed on each interface.  If you implement a change freeze, this process becomes easier.
    <i>>
    > What are the upgrade best practices, phased manner or
    > all at a time?</i>
    Allow plenty of resources to fully test every interface, as you would with any upgrade.
    You could also take the time to implement the new <a href="https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/80bbbd55-5004-2a10-f181-8bb096923cb6">CTS+</a> for this new version to avoid any further transport issues (PI 7.0 SP12+).  A new method for a new system!
    Also, how is your SLD set up for XI?  This can have an impact.

  • Help Please!!  Best Practices for building an NDS Project...

    We are doing a Proof of Concept on using NDS to develop non-SAP Java applications. We are attempting to determine if we can replace our current Java development tools with NDS/WAS.
    We are struggling with SAP's terminology and "plumbing" for setting up and defining Java projects. For example, what are Tracks, Software Components, Development Components, etc., and when do you define them? All of these terms are totally foreign to us and do not relate to our current Java environment (at least not that we can see). We are also struggling with how the DTR and activities tie into those components.
    If anyone has defined best practices for setting up Java projects, or has struggled with and overcome these same issues, please give us some guidance. This is a very frustrating and time-consuming issue for us.
    Thank you!!

    Hello Peggy,
    this is my first post but I hope it helps you anyway.
    To learn the SAP "language" I additionally used an SAP presentation regarding the SAP JDI.
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/documents/a1-8-4/java development infrastructure real world use webinar.pdf
    I think this one is quite useful as an add-on to the other links you already got. Your name also indicates that your mother tongue is German. If so, the German version of the book (Java Programming with the SAP WAS) is already available for purchase and is really useful. You can also use the information provided by the University of Potsdam; they have an introduction on how to set up a track in the SLD and then how to set up SCs.
    http://epic.hpi.uni-potsdam.de/nwlab/SC+Track.html
    Hope this helps...

  • Product and SWCV best practice

    We have a 3rd-party product that tends to change product versions as frequently as once every 3-5 months.
    Since SAP's software logistics mechanism is based on the hierarchy
    Product -> Product Version -> SWC -> SWCV,
    my question is:
    what is the best way to maintain this product versioning in the SLD and IR,
    to allow best-practice software logistics in XI and maintenance?
    Please share from your knowledge and experience only.
    Nimrod

    Structuring Integration Repository Content - Part 1: Software Component Versions
    Have a look at that weblog, and also search for the subsequent parts of the same series. That should give you a good idea.

  • PI best practice and integration design ...

    I'm currently on a site that has multiple PI instances, one for each region, and the question of inter-region integration has been raised. My initial impression is that each PI will be in charge of integration for its regional landscape, and inter-region communications will be conducted through a PI-to-PI interface. I haven't come across any best practice in this regard and have never been involved with a multiple-PI landscape.
    Any thoughts? Or links to best practice for this kind of landscape?
    To summarize:
    I think this is the best way to set it up; although numerous other combinations are possible, this seems to be the best way to avoid any significant system coupling. When talking about ECC-to-ECC inter-region communications:
    AUS ECC -> AUS PI -> USA PI -> USA ECC

    abhishek salvi wrote:
    I need to get data from my local ECC to USA ECC; do I send the data to their PI / my PI / directly to their ECC? All will work, all are valid.
    If LocalECC --> onePI --> USA ECC is valid, then you don't have to go for another PI in between... why increase the processing time... and it seems to be a good option to bet on.
    The issues are:
    1. Which PI system should any given piece of data be routed through, and how do you manage the subsequent spider web of interfaces resulting from PI AUS talking to ECC US, ECC AU, BI US, BI AU, and the reverse for the PI USA system?
    2. The increased processing time from Integration Engine to Integration Engine should be minimal, and it means a consistent set of interfaces for support and debugging, not to mention the simplification of the SLD contents in each PI system.
    I tend to think of it like network routing: the PI system is the default gateway for any data not bound for a local system; you send it and let PI figure out what the next step is.
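The "default gateway" analogy above can be sketched as code (a hypothetical illustration, not PI configuration; all system names are made up):

```python
# Hypothetical illustration of "PI as the default gateway" for inter-region
# traffic. Each region's PI delivers locally when it can; otherwise it
# forwards to its peer PI, like a default route in network routing.

LOCAL_SYSTEMS = {
    "AUS_PI": {"AUS_ECC", "AUS_BI"},
    "USA_PI": {"USA_ECC", "USA_BI"},
}
PEER_PI = {"AUS_PI": "USA_PI", "USA_PI": "AUS_PI"}

def route(target, via_pi):
    """Return the list of hops a message takes to reach its target system."""
    if target in LOCAL_SYSTEMS[via_pi]:
        return [via_pi, target]        # local delivery
    peer = PEER_PI[via_pi]             # "default gateway": forward to peer PI
    return [via_pi, peer, target]
```

So AUS ECC to USA ECC becomes route("USA_ECC", "AUS_PI"), i.e. AUS_PI -> USA_PI -> USA_ECC, matching the diagram above: each PI only needs to know its own landscape plus one peer.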
    abhishek salvi wrote:
    But then what about this statement (is it a restriction or a business requirement)?
    Presently the directive is that each PI will manage communications with its own landscape only.
    When talking about multiple landscapes (dev / test / qa / prod), each landscape generally has its own PI system; this is an extension of the same idea, except that here both systems are productive. From an interface and customisation point of view, given the geographical remoteness of each system, developing and supporting local interfaces for local systems makes sense: typically the interfaces for a given business function at a given location (location-specific logic) are developed in concert with that location's systems, and as such have no real place on the remote PI system.
    To answer your question: there is no rule, it just makes sense.

  • Best practices in choosing Product Version / App Comp Type

    Unfortunately I didn't find a document like an SAP XI best-practice guide.
    I am a little bit confused by the terms Product Version, Software Component Version, Main Instance...
    Are the following scenarios right?
    1. <b>I want to integrate a non-SAP system.</b>
         a) I have to create a SYSTEM and a PRODUCT in the SLD
         b) I have to create an integration scenario in my PRODUCT
         c) Which type of Application Component do I have to choose?
            (help.sap.com says: <b>choose Template if it is a non-SAP product that is not defined in the SLD</b>)
            It is a non-SAP product <b>but it is defined</b> in the SLD. Which type of Application Component is best?
            What is the point of the Application Component Type?
    2. <b>I want to integrate SAP SRM and the Oracle DB of a non-SAP application</b>
         Do I need to use the SAP SRM Product Version and create the Integration Scenario there?
         Which Application Component Type do I need to choose for the non-SAP application? Template?

    Hi Sergey,
    >>1. I want to integrate a non-SAP system.
    >>>> b) I have to create an integration scenario in my PRODUCT
    No, you don't have to create integration scenarios;
    they are only created to show the flow of our processes,
    but you don't have to create/use them.
    Take a look at this weblog if you really want to create one:
    /people/siva.maranani/blog/2005/08/27/modeling-integration-scenario146s-in-xi
    >>> Do I need to use the SAP SRM Product Version and create the Integration Scenario there?
    You don't need an integration scenario here either,
    but if you want, you can create one.
    Regards,
    michal

  • Logical level in Fact tables - best practice

    Hi all,
    I am currently working on a complex OBIEE project/solution where I am going straight to the production tables, so the fact (and dimension) tables are pretty complex, since I am using multiple sources in the logical tables to increase performance. Anyway, what I often struggle with is the Logical Levels (in the Content tab) where the level of each dimension is to be set. In a star schema (one-to-many) this is pretty straightforward and easy to set up, but when the business model (and physical model) gets more complex, I sometimes struggle with the aggregates, trying to get them to work/appear with different dimensions. (Using the menu "More" > "Get Levels" does not always give the best solution... far from it.) I have some combinations of left and right outer joins as well, making it even more complicated for the BI server.
    For instance: I have about 10-12 different dimensions; should all of them always be connected to each fact table, either at Detail or Total level? I can see the use of the logical levels when using aggregate fact tables (on quarter, month, etc.), but is it better just to skip the logical level setup when no aggregate tables are used? Sometimes it seems like that is the easiest approach...
    Does anyone have a best practice concerning this issue? I have googled for this but I haven't found anything good yet. Any ideas/articles are highly appreciated.

    Hi User,
    "For instance - I have about 10-12 different dimensions - should all of them always be connected to each fact table? Either on Detail or Total level." It is not necessary to connect all dimensions; it depends on the report you are creating. But as a best practice, we should maintain them all at Detail level whenever you mention join conditions in the physical layer.
    For example, for the sales table: if you want to report at the ProductDimension.ProductName level, then you should use Detail level; otherwise use Total level (at the Product or Employee level).
    "Get Levels. (Available only for fact tables) Changes aggregation content. If joins do not exist between fact table sources and dimension table sources (for example, if the same physical table is in both sources), the aggregation content determined by the Administration Tool will not include the aggregation content of this dimension."
    Source: Admin Guide ("Get Levels" definition)
    thanks,
    Saichand.v

  • Best practices for setting up users on a small office network?

    Hello,
    I am setting up a small office and am wondering what the best practices/steps are to set up and manage the admin and user logins and sharing privileges for the setup below:
    Users: 5 users on new iMacs (x3) and upgraded G4s (x2)
    Video editing suite: want to connect a new iMac and a Mac Pro, on an open login (multiple users)
    All machines are to be able to connect to the network, peripherals and the external hard drive. Also, I would like to set up drop boxes to easily share files between the computers (I was thinking of using the external hard drive for this).
    Thank you,

    Hi,
    Thanks for your posting.
    When you install AD DS in the hub or staging site, disconnect the installed domain controller, and then ship the computer to the remote site, you are disconnecting a viable domain controller from the replication topology.
    For more detailed information, please refer to:
    Best Practices for Adding Domain Controllers in Remote Sites
    http://technet.microsoft.com/en-us/library/cc794962(v=ws.10).aspx
    Regards.
    Vivian Wang

  • Add fields in transformations in BI 7 (best practice)?

    Hi Experts,
    I have a question regarding transformation of data in BI 7.0.
    Task:
    Add new fields in a second level DSO, based on some manipulation of first level DSO data. In 3.5 we would have used a start routine to manipulate and append the new fields to the structure.
    Possible solutions:
    1) Add the new fields to first level DSO as well (empty)
    - Pro: Simple, easy to understand
    - Con: Disc space consuming, performance degrading when writing to first level DSO
    2) Use routines in the field mapping
    - Pro: Simple
    - Con: Hard to performance optimize (we could of course fill an internal table in the start routine and then read from this to get some performance optimization, but the solution would be more complex).
    3) Update the fields in the End routine
    - Pro: Simple, easy to understand, can be performance optimized
    - Con: We need to ensure that the data we need also exists (i.e. if we have one field in DSO 1 that we only use to calculate a field in DSO 2, this would also have to be mapped to DSO 2 in order to exist in the routine).
    Does anybody know what the best practice is? Or do you have any experience regarding what you see as the best solution?
    Thank you in advance,
    Mikael

    Hi Mikael.
    I like the 3rd option and have used it many, many times.  In answer to your question:
    Update the fields in the end routine
    - Pro: Simple, easy to understand, can be performance-optimized. Yes, I have read and tested that it works faster.  An OSS consulting note is out there indicating the speed of the end routine.
    - Con: We need to ensure that the data we need also exists (i.e. if we have one field in DSO 1 that we only use to calculate a field in DSO 2, this would also have to be mapped to DSO 2 in order to exist in the routine). Yes, but by using the result package the manipulation can be done easily.
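In ABAP the end routine loops over RESULT_PACKAGE against a lookup table filled once up front. As a rough, hypothetical sketch of the same pattern (in Python rather than ABAP; all field and DSO names are invented):

```python
# Python analogue (not ABAP) of option 3: derive a new field in the end
# routine from data read out of the first-level DSO. Doing one up-front
# lookup-table fill, instead of a select per row, is what makes the end
# routine easy to performance-optimize.

def end_routine(result_package, dso1_lookup):
    """Fill NEW_FIELD on every row of the result package.

    dso1_lookup plays the role of an internal table filled once from the
    first-level DSO (e.g. via SELECT ... FOR ALL ENTRIES in ABAP).
    """
    for row in result_package:
        src = dso1_lookup.get(row["DOC_NO"], {})
        # Derived field: unit amount from DSO 1 times quantity from DSO 2.
        row["NEW_FIELD"] = src.get("AMOUNT", 0) * row.get("QUANTITY", 0)
    return result_package
```

The con mentioned above shows up here too: DOC_NO must exist in the result package for the lookup to work, so any key field used only for the derivation still has to be mapped through to DSO 2.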
    Hope it helps.
    Thanks,
    Pom

  • Temp Tables - Best Practice

    Hello,
    I have a customer who uses temp tables all over their application.
    This customer is a novice and the app has its roots in VB6. We are converting it to .NET.
    I would really like to know the best practice for using temp tables.
    I have seen code like this in the app.
    CR2.Database.Tables.Item(1).Location = "tempdb.dbo.[##Scott_xwPaySheetDtlForN]"
    That seems to work, though I do not know why the full tempdb.dbo.[## prefix is required.
    However, when I use this in the new report I am building, I get runtime errors.
    I also tried this:
    CR2.Database.Tables.Item(1).Location = "##Scott_xwPaySheetDtlForN"
    I did not get errors, but I was returned data I did not expect.
    Before I delve into different ways to do this, I could use some help with a good pattern to use.
    thanks

    Hi Scott,
    Are you using the RDC still? It's not clear but looks like it.
    We had an API that could piggyback the HDBC handle in the RDC (craxdrt.dll), but that API is no longer available in .NET. Also, the RDC is not supported in .NET, since .NET uses the framework and the RDC is COM.
    The workaround is to copy the temp data into a DataSet and then set the location to the DataSet. There is no way that I know of to get to tempdb from .NET. The reason is that there is no CR API to set the owner of the table to the user; MS SQL Server locks the tempdb table so that the creating user has exclusive rights on it.
    Thank you
    Don
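The scoping behaviour behind this answer can be illustrated with a small sketch. It uses SQLite rather than SQL Server (SQLite has no shared ## global temp tables), but it shows the underlying point: an ordinary temp table created on one connection is invisible to every other connection, which is why a report engine opening its own connection cannot see it.

```python
import os
import sqlite3
import tempfile

# Two connections to the same database file.
db = os.path.join(tempfile.mkdtemp(), "demo.db")
conn_a = sqlite3.connect(db)
conn_b = sqlite3.connect(db)

# A temp table created on connection A...
conn_a.execute("CREATE TEMP TABLE pay_sheet (amount INTEGER)")
conn_a.execute("INSERT INTO pay_sheet VALUES (42)")

# ...is visible on A, but not on B, even though both point at the same file.
rows_on_a = conn_a.execute("SELECT amount FROM pay_sheet").fetchall()
try:
    conn_b.execute("SELECT amount FROM pay_sheet")
    visible_on_b = True
except sqlite3.OperationalError:  # "no such table: pay_sheet"
    visible_on_b = False
```

SQL Server's ##-prefixed tables are global temp tables that relax this per-connection scoping, which is presumably why the report only resolves them via the fully qualified tempdb.dbo.[##...] name.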

  • Best Practice for Significant Amounts of Data

    This is basically a best-practice/concept question and it spans both Xcelsius & Excel functions:
    I am working on a dashboard for the US Military to report on some basic financial transactions that happen on bases around the globe.  These transactions fall into four categories, so my aggregation is as follows:
    Year,Month,Country,Base,Category (data is Transaction Count and Total Amount)
    This is a rather high level of aggregation, and it takes about 20 million transactions and aggregates them into about 6000 rows of data for a two year period.
    I would like to allow the users to select a Category and a country and see a chart which summarizes transactions for that country ( X-axis for Month, Y-axis Transaction Count or Amount ).  I would like each series on this chart to represent a Base.
    My problem is that 6000 rows still appears to be too many rows for an Xcelsius dashboard to handle.  I have followed the Concatenated Key approach and used SUMIF to populate a matrix with the data for use in the Chart.  This matrix would have Bases for row headings (only those within the selected country) and the Column Headings would be Month.  The data would be COUNT. (I also need the same matrix with Dollar Amounts as the data). 
    In Excel this matrix works fine and seems to be very fast.  The problem is with Xcelsius.  I have imported the spreadsheet, but have NOT even created the chart yet, and Xcelsius is CHOKING (and crashing).  I changed Max Rows to 7000 to accommodate the data.  I placed a simple combo box and a grid on the canvas, BUT NO CHART yet, and the dashboard takes forever to generate and is REALLY slow to react to a simple change in the combo box.
    So, I guess this brings up a few questions:
    1)     Am I doing something wrong and did I miss something that would prevent this problem?
    2)     If this is standard Xcelsius behavior, what are the Best Practices to solve the problem?
    a.     Do I have to create 50 different Data Ranges in order to improve performance (i.e. Each Country-Category would have a separate range)?
    b.     Would it even work if it had that many data ranges in it?
    c.     Do you aggregate it as a crosstab (Months as Column headings) and insert that crosstabbed data into Excel.
    d.     Other ideas that I'm missing?
    FYI:  These dashboards will be exported to PDF and distributed.  They will not be connected to a server or data source.
    Any thoughts or guidance would be appreciated.
    Thanks,
    David
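The concatenated-key/SUMIF pattern described above can be sketched outside Excel as follows (a hypothetical miniature; the column layout and data are invented):

```python
from collections import defaultdict

# Aggregated rows: (year, month, country, base, category, transaction_count),
# mirroring the Year,Month,Country,Base,Category aggregation above.
rows = [
    ("2009", "01", "DE", "Ramstein",    "Fuel", 120),
    ("2009", "01", "DE", "Ramstein",    "Food",  80),
    ("2009", "01", "DE", "Grafenwoehr", "Fuel",  60),
    ("2009", "02", "DE", "Ramstein",    "Fuel", 140),
]

def sumif_matrix(rows, country, category):
    """Emulate the SUMIF matrix: one cell per (base, month) concatenated key."""
    matrix = defaultdict(int)
    for year, month, ctry, base, cat, count in rows:
        if ctry == country and cat == category:
            # The concatenated key plays the role of Excel's helper column.
            matrix[base + "|" + month] += count
    return dict(matrix)
```

Precomputing one such matrix per Country-Category selection (question 2a above) is essentially this filter done ahead of time: it trades spreadsheet size for the per-click SUMIF work that appears to be choking Xcelsius.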

    Hi David,
    I would leave your query
    "Am I doing something wrong and did I miss something that would prevent this problem?"
    to the experts/ gurus out here on this forum.
    From my end, you can follow
    TOP 10 EXCEL TIPS FOR SUCCESS
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/204c3259-edb2-2b10-4a84-a754c9e1aea8
    Please follow the Xcelsius Best Practices at
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/a084a11c-6564-2b10-79ac-cc1eb3f017ac
    In order to reduce the size of xlf and swf files follow
    http://myxcelsius.com/2009/03/18/reduce-the-size-of-your-xlf-and-swf-files/
    Hope this helps to certain extent.
    Regards
    Nikhil

  • Best Practice for Catalog Views?

    Hello community,
    A best practice question:
    The situation: I have several product categories (110), several items in those categories (4000) and 300 end users.    I would like to know the best practice for segmenting the catalog.   I mean, some users should only see categories 10, 20 & 30; other users only category 80, etc.    The problem is how I can implement this.
    My first idea is:
    1. Create 110 procurement catalogs (one for every product category).   Each catalog should contain only its product category.
    2. Assign in my Org Model, at user level, all the "catalogs" that the user should access.
    Do you have any ideas on how to improve this?
    Greetings from Mexico,
    Diego

    Hi,
    Your way of doing it will work, but you'll get maintenance issues (too many catalogs, and catalog links to maintain for each user).
    The other way is to build your views in CCM and assign these views to the users, either on the roles (PFCG) or on the user (SU01). The problem is that with CCM 1.0 this is limited, because you'll have to assign the items to each view one by one (no dynamic or mass processes); it has been enhanced in CCM 2.0.
    My advice:
    - Challenge your customer about views, and try to limit the number of views, for example to strategic and non-strategic.
    - With CCM 1.0, stick to the procurement catalogs, or implement BAdIs to assign items to the views (I have tried this; it works, but is quite difficult), but with a limited number of views.
    Good luck.
    Vadim
