Trade-offs for spreading organizations across suffixes - best practices?

Hey everyone, I am trying to figure out some best practices here. I've looked through the docs but have not found anything that quite touches on this.
In the past, here is how I created my directory (basically using dsconf create-suffix for each branch I needed):
dsconf list-suffixes
dc=example,dc=com
ou=People,dc=example,dc=com
ou=Groups,dc=example,dc=com
o=Services,dc=example,dc=com
ou=Groups,o=Services,dc=example,dc=com
ou=People,o=Services,dc=example,dc=com
o=listserv,dc=example,dc=com
ou=lists,o=listserv,dc=example,dc=com
A few years later, having learned more and while setting up replication, it seems I may have made my life a bit more complicated than it should be. It seems I would need many more replication agreements to get every branch of the tree replicated. It also means that different parts of the directory are stored in different backend database files.
It seems like I should have something like this:
dsconf list-suffixes
dc=example,dc=com
Instead of creating all the branches as suffixes or sub-suffixes, maybe I should have just created organization and organizational unit entries within a single suffix, "dc=example,dc=com". This way I can replicate all the data by replicating just one suffix. Is there a downside to having one backend db file containing all the data instead of spreading it across multiple files (we're talking possibly 90K entries across the entire directory)?
Can anyone confirm the logic here or provide any insight?
Thanks much in advance,
Deejam

Well, there are a couple of dimensions to this question. The first is simply whether your DIT ought to have more or less depth. This is an old design debate that goes back to problems with changing DNs in X.500-style DITs, which had lots of organizational information embedded in the DN. Nowadays DITs tend to be flatter, even though there are more tools for renaming entries. You still can't rename entries across backends, though. The second dimension is, given a DIT, how you should distribute the containers in your DIT across the backend databases.
As you have already determined, the principal design consideration for your backend configuration will be replication, though scalability and backup configuration might also come into it. From what you have posted, though, it does not look like you have that much data. So yes, you should configure database backends and associated suffixes with sufficient granularity to support your replication requirements. So, if a particular suffix needs to be replicated differently than another suffix, they need to be defined as distinct suffixes/backends. Usually we define the minimal number of suffixes and backends needed to satisfy the topological requirements, though I can imagine there might be cases where suffixes might be more fine grained.
For large, extensible Directory topologies, I usually look for data that's sensibly divisible into "building blocks". So for instance you might have a top-level suffix "dc=example,dc=com" with a bunch of global ACIs, system users and groups that are going to need to be everywhere. Then you might have a large chunk of external customer data, and a small amount of internal employee data. I would consider putting the external users in a distinct suffix from the employees, because the two types of entries are likely to be quite different. If I have a need to build a public Directory somewhere, all I have to do is configure the external suffix and replicate it. The basic question I would be asking there is if I might ever need to expose a subset of the Directory, will the data already be partitioned for me or will I have to do data reorganization.
In your case, it does not look likely you will need to chop up your data much, so it's probably simpler to stay monolithic and use only one backend.
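To make the single-backend layout concrete, here is a minimal LDIF sketch (using the entry names from the post above, with the standard objectClass values) of the branches created as ordinary entries under the one suffix via ldapadd, instead of with dsconf create-suffix:

```ldif
# Children of the single suffix dc=example,dc=com, all stored in one backend
dn: ou=People,dc=example,dc=com
objectClass: organizationalUnit
ou: People

dn: o=Services,dc=example,dc=com
objectClass: organization
o: Services

dn: ou=People,o=Services,dc=example,dc=com
objectClass: organizationalUnit
ou: People
```

With this layout, dsconf list-suffixes reports only dc=example,dc=com, so a single replication agreement (and a single backup) covers the whole tree.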

Similar Messages

  • I want to create a checklist for RPD that tells about best practices

    Hi all,
    I want to create a checklist for the RPD that covers all the best practices we have to follow.
    We have to write a script, and based on that script alone it has to create the checklist.
    Thanks in advance

    Hi,
    Please refer to the following link:
    http://www.peakindicators.com/media_pi/Knowledge/25%20-%20twenty%20golden%20rules%20for%20rpd%20design.pdf
    Thanks,
    Jprakash

  • Set filter criteria on page 1 for page 2 OData model - "best practice"?

    Hello, I have a problem with an app where I want to filter data on a second page based on settings from the first page. I use an OData model.
    The collections on the two pages are not related in terms of "navigation" properties; that is my problem, and I cannot change the data source...
    So I am looking for ideas/best practices to solve this, because sometimes my filtering doesn't work and the following problem occurs: Request aborted
    I have a page with a sap.m List with items="{/tabWorkPlace}" and a local JSON model where I store relevant data during the app lifecycle.
    handleListSelect - first page
    var context = evt.getParameter("listItem").getBindingContext();
    var dataModel = sap.ui.getCore().getModel("dataModel");
    var workplace = context.getProperty("WORKPLACE_ID");
    dataModel.setProperty("/WORKPLACE_ID", workplace);
    this.nav.to("SubMaster", context);
    The general App.controller.js handles the nav.to function:
    var app = this.getView().app;
    var page = app.getPage(pageId);
    if(pageId == "secondPage") {
         page.getController().filterData();
    And the controller of the second page:
    filterData: function() {
    var oModel = sap.ui.getCore().getModel("odata");
    var dataModel = sap.ui.getCore().getModel("dataModel");
    var workplace = dataModel.getProperty("/WORKPLACE_ID");
    var items = this.getView().byId("list");
    var oFilter=new sap.ui.model.Filter("WORKPLACE_ID",sap.ui.model.FilterOperator.EQ,workplace);
    items.getBinding("items").filter(oFilter);
    I don't put this code into the onInit() or beforeRendering() functions, because they are called only once: the pages are created only once, I navigate back and forth between the two pages, and "just" the data is changed.
    The desired page looks like this - with an other collection bound to it:
    <List
      id="list"
      select="handleListSelect"
      items="{/tabWorkstep_Status}"
    >
    But when I call it - then the request gets aborted:
    The following problem occurred: Request aborted
    But despite the fact the Request is aborted, the list on the second page is filtered!
    The filter criteria for the model work when I type them into the browser as a URL. Maybe this fails because the data binding for the list didn't take place at this phase?
    I have this pattern (filter criteria on one page and results on the second page) several more times (I think a data model with navigation properties would be better, but I cannot change it).
    But in another constellation the filtering doesn't work - same error... the following problem occurred: Request aborted
    I also don't want to change the pattern (page 1 to page 2) into popup lists or the fancy new filtering possibilities, because they are not suitable for my use case.
    Is there maybe a more elegant solution? Because sometimes the filtering works and sometimes it doesn't... do I have an error in my solution (general approach)?
    Many thanks for any input!
    BR,
    Denise

    Hello, yeah you are right, but it works without the odata> prefix because of this in App.controller.js:
    var uri = "http://localhost:32006/JsonOdataService.svc";
    var oModelMS = new sap.ui.model.odata.ODataModel(uri);
    sap.ui.getCore().setModel(oModelMS, "odata");
    oView.setModel(oModelMS);
    So my question is: how do I navigate from one page to another, and on the other page first bind a collection to a select, and then, on selection, bind certain elements (a text field) to the selected filtered entity?
    The approach with context and binding won't work, because the two collections don't have a navigation/association property between them...
    So for example:
    page1
    select a list item with property color: red and year 1985. Press one of the buttons and pass this criteria to another page.
    page 2:
    show a dropdown box with all car names which fulfill these criteria, and when one car is selected, display the data for THIS car in several text fields.
    This is not a master->detail navigation example, because on page 1 I select certain criteria and then with buttons I navigate to several pages with those criteria.
    But since the OData model has no relationships, it is really hard to do it manually... With a dummy mock.json, as in DJ Adams' Fiori-like SAPUI5 apps, it is no problem... But with OData and nothing related to each other, it is hard...

  • Syncing App IDs across servers -- Best Practice?

    This is prompted by a comment chrisstephens made in the thread at non-existent applications in non-existent workspaces reserving app id's
    Our developers are convinced that the application id's between our dev + staging + production environments need to be synchronized.
    Our team also keeps our dev, test, and prod server app IDs synchronized -- for instance, the Widget Reporting App is always app # 38 on all three servers. For us, it's not something we see as REQUIRED, but it is convenient, and a general sanity check. If the numbers didn't sync, it seems it would be all too easy to get values mixed up and accidentally field an app to the wrong place (possibly overwriting some other application).
    What are the community's opinions on this? Would you consider this an Apex Best Practice? Just a habit for some groups? Or overly rigid thinking?
    (I personally fall in the Best Practice group.)

    One good reason to keep them the same is so that there are no differences between what is tested in one environment and what is deployed in another. Case in point, just last week someone demonstrated that an application's authentication scheme failed when the application ID was changed from xxx to xxxxxxxxx (a longer string of digits). Of course this was due to a previously unknown bug, but that's what testing should reveal.
    Another good reason is to make it possible to export application components (pages, etc.) from one database (say, dev) and install them into an application in another database (say, prod). This is not possible if the application IDs are different.
    Scott

  • Looking for a hardening guide or best practices in production for WLS 8.1.6

    Hi,
    I'm working to deliver to my government customer best practices in the form of a hardening guide that conforms to NIST SP800-44. I am aware of http://edocs.bea.com/wls/docs81/lockdown/practices.html which adds some great operational tips.
    Is anyone aware of other resources, or has delivered anything similar to their customer? I would greatly appreciate any guidance here.
    Thanks,
    Rich

    Hi, I would take a guide that covers any version, or other Oracle products.
    -Rich

  • Procedure for spreading screenshots across two macs

    Hello All!
    My name is Michael.
    I have a new Macbook Pro 13" and an old G4 PowerBook 15".
    Is there a way to connect the two units via FireWire and view screenshot content over both monitors?
    If so and you answer, it would be very helpful if you could send a duplicate answer to my email at [email protected]
    Thank you very much in advance!

    The only way I can think of to do this would be:
    1. Connect via FireWire
    2. Go to your Network system preference and make sure IP over FireWire networking is set up and on
    3. Use a program like ScreenRecycler to enable the second Mac as a second monitor to the first Mac, since it accomplishes this using the IP network
    4. Take the screen shot
    I do not know if this will actually work. It depends on if the way ScreenRecycler sets up the second Mac makes it appear enough like a second monitor that the default screen shot behavior (capture all monitors) works the same.
    In any case, you need to describe more specifically what it is you are trying to achieve here. Do you want the second Mac to be a monitor to the first and a screen shot of this extended desktop configuration? Do you want a screen shot of each Mac booted into its respective OS, with its own desktop? Or something else entirely? My suggestion above only covers the first instance.

  • DB2 to Oracle conversion best practices

    My company is enhancing an existing application, adding a new J2EE web interface with DB2 as the database. I am new to J2EE. If we want to migrate the database to Oracle in the future, what are the best things to do now?
    Which J2EE framework is good with respect to JDBC connectivity and a future migration from DB2 to Oracle (minimal changes at migration time)?
    It is a medium-size application with 5000 users. Which other best practices should we follow in development, keeping the migration in mind? Thanks..

    Yes, you should log in as system, create a user (appowner, or whatever you call it), and assign that user a default tablespace of 'USERS' or whatever tablespace you decide on. Then grant that user all the privileges to create objects, i.e., create table, create procedure, create synonym, etc.
    Then log out as system, log in as appowner, and do all your object creation from there.
    A user is a set of credentials that allow you access to the system. It defines your identity and your privileges and authority to do various things. A schema is the set of objects owned by a particular user. As soon as a user owns at least one object, that implicitly defines his schema. It's not possible for a user to own or control multiple schemas. If you want multiple schemas, that's fine, but you'll need multiple users, and each user will manage his own schema.
    Hope that's clear,
    -Mark
    PS I strongly suggest you review the Concepts Guide, it really is quite good. It can be found here: http://download.oracle.com/docs/cd/E11882_01/server.112/e10713/toc.htm
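    The steps Mark describes can be sketched in SQL; the user name, password, and tablespace below are placeholders, not prescribed values:

    ```sql
    -- As SYSTEM (or another DBA account): create the application owner
    CREATE USER appowner IDENTIFIED BY changeme
      DEFAULT TABLESPACE users
      QUOTA UNLIMITED ON users;

    -- Grant the privileges needed to create objects in its own schema
    GRANT CREATE SESSION, CREATE TABLE, CREATE PROCEDURE,
          CREATE SYNONYM, CREATE VIEW, CREATE SEQUENCE TO appowner;

    -- Then reconnect as appowner and create all application objects there;
    -- the first object created implicitly defines the APPOWNER schema.
    ```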

  • Won a brand new iPad 2 on the radio in the US, can I trade in for the New iPad??

    I won a brand new iPad 2 from a local radio station here in southern California. When I get the call to pick it up, will Apple allow me to upgrade it to the new iPad? My son won one too about 10 months ago, so I know it's sealed in the regular retail packaging.

    To get the highest trade-in value, you're probably better off selling it yourself, but I am pretty sure the Apple Store will accept it as a trade-in, especially if you haven't opened the package yet. Other electronics stores might also do a trade-in for you. The best way to find out is to call the stores in your area and ask.

  • Searching for Best Practice links that work

    Hi,
    For the past few years I have been able to access SAP Best Practices documents like SAP Best Practices for CP and Wholesale Industries
    (this one still works and guides me to the building block and process overview documents!).
    Recently, any link I can find to SAP Industry or Baseline Best Practices ends up dead. See, for example, trying to get from SAP Best Practices Baseline packages – SAP Help Portal Page
    to Localized for Netherlands V1.607 SAP Best Practices package further down on that page, which results in the screen shot attached. I have seen the same in many more examples (different countries, or Industry Best Practice packages instead of Country Baseline packages...).
    Does anyone know whether and how SAP redesigned access to their Best Practices documents (Configuration Guides, eCATTs, Scenario Process Overviews, etc.)?
    Thanks for your reply.
    Thijs

    Hi, Thijs,
    There is currently a problem with Best Practices on the Help Portal.  On the home page of the portal (http://help.sap.com/) there is a message that reads "Stay Tuned - There are temporary problems when accessing some content types, for example PDF documents or Best Practices. We are working on a solution."
    Our Wholesale Distribution industry group does not manage the Help Portal pages, so, unfortunately, I don't know the status of the problem or when it might be resolved.
    Lynn

  • SAP Best Practices for CRM 5.0 is available

    Hello,
    I would like to announce the availability of SAP Best Practices for CRM 5.0.
    SAP Best Practices for CRM allows a fast, safe and predictable implementation of pre-configured CRM business scenarios.
    It can be used to accelerate customer implementation projects as well as setting up demo or evaluation systems.
    For details about SAP Best Practices in general please see:
    <a href="http://www.service.sap.com/bestpractices">http://www.service.sap.com/bestpractices</a>
    For the concrete content of Best Practices for CRM 5.0 please see:
    <a href="http://help.sap.com/bp_crmv150/CRM_DE/index.htm">http://help.sap.com/bp_crmv150/CRM_DE/index.htm</a>
    Best regards,
    Joerg

    Hi Devendra!
    For the Best Practices you can find all the useful installation and information guides here.
    <a href="http://help.sap.com/">Best Practices for SAP</a>
    Choose the Best Practices tab on the line menu there.
    You have to be careful while installing the BP: you always have to use the BP release that matches your SAP release.
    I hope this helps you!
    Best regards,
    Zsolt

  • Best Practice for Expired updates cleanup in SCCM 2012 SP1 R2

    Hello,
    I am looking for assistance in finding a best practice method for dealing with expired updates in SCCM SP1 R2. I have read a blog post: http://blogs.technet.com/b/configmgrteam/archive/2012/04/12/software-update-content-cleanup-in-system-center-2012-configuration-manager.aspx
    I have been led to believe there may be a better method, or a more up to date best practice process in dealing with expired updates.
    On one side I was hoping to keep a software update group intact, to have a history of what was deployed, but I also want to keep things clean and avoid issues down the road, as I did in 2007 with expired updates.
    Any assistance would be greatly appreciated!
    Thanks,
    Sean

    The best idea is still to remove expired updates from software update groups. The process described in that post is still how it works. That also means that if you don't remove the expired updates from your software update groups, the expired updates will still show...
    To automatically remove the expired updates from a software update group, have a look at this script:
    http://www.scconfigmgr.com/2014/11/18/remove-expired-and-superseded-updates-from-a-software-update-group-with-powershell/
    My Blog: http://www.petervanderwoude.nl/
    Follow me on twitter: pvanderwoude

  • Best practice for Error logging and alert emails

    I have SQL Server 2012 SSIS. I have Excel files that are imported with an Excel Source and an OLE DB Destination. Scheduled jobs run the SSIS packages every night.
    I'm looking for advice on what the best practice is for a production environment. The requirements are the following:
    1) If error occurs with tasks, email is sent to admin
    2) If error occurs with tasks, we have log in flat file or DB
    Kenny_I

    Are you asking about the difference between using standard logging and event handlers? I prefer the latter, as standard logging will not always log data in the way we desire. So we've developed a framework to add the necessary functionality inside event handlers and log the required data in the required format to a set of tables that we maintain internally.
    Visakh

  • Best practice for exporting a dps folio so a third party can work on it?

    Hi All,
    Keen for some thoughts on the best practice for exporting a dps folio, indd files, links and all, so a third party can carry on the work. Is there a better alternative to packaging each individual indd file and sharing the folio through Adobe?
    I know there have been similar questions here in the past, but the last (that I've found) was updated mid-2011, and I was wondering if there have been any improvements to this seemingly backwards workflow since then.
    Thanks in advance!

    Nothing better than packaging them and using Dropbox to share. I caution you on one thing as far as packaging:
    http://indesignsecrets.com/file-packaging-feature-can-cause-problems-in-dps-workflows.php

  • Best practice for linking fields from multiple entity objects

    I am currently transitioning from PHP to ADF. I'm looking for the best practice for linking data from multiple entity objects.
    Example:
    EO 'REQUESTS' has fields: req_id, name, dt, his_stat_id, her_stat_id
    EO 'STATUSES' has fields: stat_id, short_txt_descr
    'REQUESTS' is linked to EO 'STATUSES' on: STATUSES.stat_id = REQUESTS.his_status_id
    'REQUESTS' is also linked to EO 'STATUSES' on: STATUSES.stat_id = REQUESTS.her_status_id
    REQUESTS.his_status_id is independent of REQUESTS.her_status_id
    When I create a VO for REQUESTS, I want to display: REQUESTS.name, REQUESTS.dt, STATUSES.short_txt_descr (for his_stat_id), STATUS.short_txt_descr (for her_stat_id)
    What is the best practice for accomplishing this? It appears I could do it a few different ways:
    1. Create the REQUESTS VO with a LOV for his_stat_id and her_stat_id
    2. Create the REQUESTS VO with the join to STATUSES performed within the query for the VO. This would require joining on the STATUSES EO twice (his_stat_id, her_stat_id)
    3. I just started reading about View Links - would that somehow do what I'm looking for?
    I also need to be able to update his_status_id and her_status_id by selecting a STATUSES.short_txt_descr from a dropdown.
    Any suggestions on how to approach such a stupidly simple task?
    Using jDeveloper 11.1.2.2.0 if that makes a difference in the solution.
    Thanks ahead of time,
    CJ

    CJ,
    I vote for solution 1, as it matches your use case. As you said, you want to update his_status_id and her_status_id by selecting a STATUSES.short_txt_descr from a dropdown. This is exactly the LOV solution.
    View Links are used for master-detail navigation (which you don't do here), and joining the data makes it difficult to update (and you would still need an LOV for the dropdown box).
    Timo

  • Best Practice transport procedure for SRM-MDM Catalogue repositories change

    Hi,
    I have a question regarding SRM-MDM Catalogue repository change transports.
    We currently have two QA servers and two Production servers (main and fail-over).
    We are investigating the need of a Development server.
    Changes are being made to the repositories on the system, and I see the need of a dev server.
    What is best practice for SRM-MDM Catalogue?
    With only QA and Prod environments, I guess repository schema transport is the best option, since a reference file (which is needed for change file transport) has not been created.
    Any other options?
    We are running MDM as well, with dev, QA and prod environments. Here we use CTS+ for transports.
    Is it best practice to use CTS+ also for SRM-MDM Catalogue transports?
    KR,
    Thomas

    Hi Thomas.
    What is best practice for SRM-MDM Catalogue?
    SAP recommends a landscape model like DEV-QA-PROD.
    So for the catalogue as well, following the same technique will help you to have a successful implementation.
    Any other options?
    To proceed with CTS+ you need to create a reference file.
    Refer the Link: [CTS+|http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/d0dd1ae0-36e5-2b10-f8b4-e6365e643c0b?quicklink=index&overridelayout=true] for more details
    Is it best practice to use CTS+ also for SRM-MDM Catalogue transports?
    It depends on the requirement. If there are many changes to the catalogue XML schema in various phases and you want them handled automatically, you can go ahead with CTS+; otherwise you can keep the existing method of exporting and importing the schema to the repository.
    Hope it helps.
    Best Regards
    Bala
