GRC 10 implementation approach

Hi All,
I am in the process of implementing GRC 10 with the following components for a pharma client:
Access Risk Analysis (ARA)
Access Request Management (ARM)
Emergency Access (Firefighter)
Business Role Management (BRM)
I just wanted to know your views on which approach would be better. Please comment with your preferred approach.
My view is to first implement ARA and EAM together and then follow with ARM and BRM packaged together.
Thank you for your ideas.

Hello Mani,
Apart from the technical details provided by Madhu and Plaban, here is some additional information.
As per my understanding, any successful implementation will follow the ASAP methodology; check the link below for information on it:
Basic understanding on ASAP Methodology for beginners
You need to define resources and a timeline before starting the project; check the link below:
What is the ideal composition of an implementation team
Also check the threads below, which give additional information:
Implementation of GRC AC 10.1
GRC 10 implementation | SCN
If you run into any issues while implementing, the forum will help you.
All the best!!!
Regards
Baithi

Similar Messages

  • Scope and Proposal preparation for SAP GRC AC Implementation.

    Hi ,
    I would like to know the procedure for how RFPs (Requests for Proposals) are prepared for a SAP GRC AC module implementation, and also how the scope and effort estimation are calculated.
    Are there any guidelines and checklists available to be followed?
    Is there a generic approach followed during SAP GRC AC implementation for proposals as well as for effort estimation?
    Please clarify my doubts.
    Regards,
    RK

    Hi,
    Use the RTA for SAP Basis 7.00 systems for any system other than ECC that has ABAP and Basis level 7.00 with kernel 7.00.
    I don't see any option for 7.20; maybe it will be available later on.
    We have SRM connected to GRC, so I can confirm that it is possible to connect SRM to GRC and work with the SRM rulebook. The RTA and post-RTA activities, as well as the JCo and RAR connectors, are a must to get it working properly. In fact, all configurations are required, just as in an ECC system.
    No HR RTA is required, as there is no HR module in SRM.
    Regards,
    Sabita

  • BW Implementation Approach

    Hi Guys,
    I am looking for the ASAP paper on the BW implementation approach, which details at length how we go about implementing it, either together with an R/3 implementation or after R/3. I can't find it on SAPNet, though.
    Can anyone help me with the link, please?
    Cheers
    Kirk

    Hi Kirk,
    As for the timing, BW in itself can function without an SAP environment altogether, as it can be implemented into any landscape. With or after R/3? With is always the better approach, since master data issues can be resolved at an early stage. Another reason is modeling purposes. There are so many variables to consider that, depending on the landscape, the future development, and the modules you will be implementing, it could go either route.
    My personal suggestion based on experience is to do it with the R/3 implementation, but in rhythm with it. Keep in mind that the whole architecture of BW modelling needs to adapt to all the different source systems - you probably know this - and that this can really influence the decision on when what will be done.
    If you do find that ASAP paper you are looking for, please drop me a message...
    Regards,
    Caz

  • Implementation approach/Guidelines for MDS.

    Hi Team,
    I am looking for a document that can give me the implementation approach or guidelines for Master Data Services (MDS).
    I would appreciate your quick help with this.
    Thanks
    Arsa.

    Arsa, 
    There is a paper, Implementing a Phased Approach to Master Data Management, written for SQL Server 2008 R2 but still relevant.
    The complete series is here: Organizational Approaches to Master Data Management
    I would also recommend the Master Data Services Support Site.
    Hope this helps.
    Reeves
    Denver, CO

  • Planning implementation approach

    Hi guys,
    I'm curious about a peculiar approach to designing a Planning application: simply making all members (accounts, entities, ...), regardless of their number or any other criteria, "Never Share", as opposed to using Dynamic Calc/Store etc. (the normal way, I'd say).
    Have you seen this before? If so, what do you think?
    This is not a technical issue; I just need to know what you think.
    Cheers,

    When the storage type of a member is Never Share or Store (I don't want to dig into Dynamic Calc and Store), Essbase reserves storage space for it; the smaller the block size, the faster the calculations. My general take, if Accounts is dense and there are few of them:
    1. Identify the members that don't actually need to store values. Example:
    Ratios:
    - Sales Ratio
    - Margin Ratio
    In such cases we can make the Ratios Label Only.
    2. Dynamic Calc: the rest of the parents can be Dynamic Calc (though there are exceptions).
    3. Use shared members instead of member formulas to make calculations faster.
    4. There shouldn't be a Stored or Never Share parent of Dynamic Calc members.
    If you want to understand more, then for the outline you are talking about, click on that outline in the EAS console and click Optimize. It's not a perfect method, but it will optimize the outline to a certain extent; then apply customer-specific changes.
    Cheers..!!

  • Preferred portlet implementation approach in 10.3

    What should I use to implement portlets in WebLogic Portal 10.3: Struts, page flows, or JSF? I need something that is stable, time-tested, and reliable.

    In my opinion, page flows were the best suited and most easily maintained. But Oracle has killed them, so new work probably shouldn't use them. The same applies to Struts: what works today will probably continue working, but don't expect to see too much new stuff here.
    JSF is recommended by Oracle. There is a whitepaper by Peter Laird that explains some of this. There are also only a couple of supported frameworks, if I recollect correctly. I detest JSF, but that's just my opinion :) - if you are familiar with JSF, you should probably go for it.
    JSR 286 is probably reasonable, but not supported yet. JSR 168 is an extremely crippled spec (again, my opinion).
    So, in summary: probably JSF.

  • PI 7.11 mapping lookup - data enrichment - appropriate approach?

    Hi guys,
    we just upgraded from PI 7.0 to PI 7.11.
    Now I'm facing a new scenario where an incoming order has to be processed
    (HTTP to RFC).
    Furthermore, each item of the order has to be enriched with data looked up in an SAP ERP 6.0 system.
    The lookup functionality can be accessed via RFC or ABAP proxy.
    With the new PI release, we have several possibilities for implementing this scenario:
    (1) graphical RFC Lookup in message mapping
    (2) ccBPM
    (3) using of the lookup API in java mapping
    (4) message mapping RFC Lookup in a UDF
    For performance reasons, I prefer to make use of the Advanced Adapter Engine, if possible.
    Furthermore, there should be only one lookup request for all items of the order instead of one per order item.
    I tried to implement possibility (1), but it seems hard to fill the request table structure of the RFC function module. All the examples on SDN use only simple (single) input parameters instead of tables. Parsing the result table of the RFC seems tricky as well.
    Afterwards I tried to implement approach (3), using a SOAP adapter as a proxy with the XI 3.0 protocol
    (new functionality in PI 7.11).
    But this ends in a strange error message, so it seems the SOAP adapter cannot be used as a proxy adapter in this case.
    ccBPM also seems to be a good and transparent approach, because there is no need for complex Java code or the lookup API.
    So the choice is not so easy.
    What's the best approach for this scenario?
    Are my notes on the approaches correct, or am I interpreting them wrong?
    Any help or ideas appreciated.
    Kind regards
    Jochen
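
    As a rough sketch of option (4), the lookup API called from a UDF, the code below follows the commonly documented com.sap.aii.mapping.lookup pattern. The business component "BC_ERP", the channel "RFC_LOOKUP_CHANNEL", and the request XML are hypothetical placeholders, not values from this thread; the real request has to match the imported RFC function module's structure, and the whole item table is sent in one call so only a single round trip per order is needed.
    // Hypothetical sketch only: names of the business component, channel and RFC
    // structure are placeholders. The UDF would build requestXml from the order items.
    import java.io.ByteArrayInputStream;
    import java.io.InputStream;
    import com.sap.aii.mapping.lookup.Channel;
    import com.sap.aii.mapping.lookup.LookupService;
    import com.sap.aii.mapping.lookup.Payload;
    import com.sap.aii.mapping.lookup.RfcAccessor;
    import com.sap.aii.mapping.lookup.XmlPayload;
    public class ItemLookup {
        public static String lookupItems(String requestXml) throws Exception {
            RfcAccessor accessor = null;
            try {
                // Resolve the RFC receiver channel configured in the Integration Directory.
                Channel channel = LookupService.getChannel("BC_ERP", "RFC_LOOKUP_CHANNEL");
                accessor = LookupService.getRfcAccessor(channel);
                // One request carrying the item table for the whole order.
                InputStream in = new ByteArrayInputStream(requestXml.getBytes("UTF-8"));
                XmlPayload request = LookupService.getXmlPayload(in);
                Payload response = accessor.call(request);
                // Return the raw response XML; the UDF then parses the result table.
                InputStream content = response.getContent();
                StringBuilder result = new StringBuilder();
                byte[] buffer = new byte[4096];
                int read;
                while ((read = content.read(buffer)) != -1) {
                    result.append(new String(buffer, 0, read, "UTF-8"));
                }
                return result.toString();
            } finally {
                if (accessor != null) {
                    accessor.close();
                }
            }
        }
    }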

    Hi,
    The error while trying to use the SOAP channel for proxy communication is:
    com.sap.aii.mapping.lookup.LookupException: Exception during processing the payload. Error when calling an adapter by using the communication channel SOAP_RCV_QMD_100_Proxy (Party: , Service: SAP_QMD_MDT100_BS, Object ID: 579b14b4c36c3ca281f634e20b4dcf78) XI AF API call failed. Module exception: 'com.sap.engine.interfaces.messaging.api.exception.MessagingException: java.io.IOException: Unexpected length of element <sap:Error><sap:Code> = XIProxy; HTTP 200 OK'. Cause Exception: 'java.io.IOException: Unexpected length of element <sap:Error><sap:Code> = XIProxy; HTTP 200 OK'.
    So this feature seems not to work for SOAP lookups, doesn't it?
    Kind regards
    Jochen

  • Creating Web Services using Java Implementation

    Hi,
    This is quite a general question about how to create a web service using a Java implementation that needs to conform to a client XML schema.
    Here are my version specs:
    I am using JDeveloper 10.1.3.4.0 and deploying onto OAS 10.1.3.
    I will be creating a J2EE 1.4 (JAX-RPC) web service using document/wrapped style.
    I have been provided an XML schema by the client, which is quite complex.
    Using a top-down approach, I can create my WSDL file and import the XML schema for my type definitions.
    The web service's aim is to accept some parameters and return some data from the Oracle database. The XML response from the web service must conform to the element and attribute definitions in the provided XML schema.
    From a Java implementation approach, what is the best (simplest or quickest) way to retrieve data from the Oracle tables and map each field/column to the required XML output (defined in the XML schema)?
    I'm not too concerned with using Java to retrieve data from the database, more with how I can map the data returned to the required output. Can this mapping task be controlled within the Java program?
    Thanks in advance.
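
    One common way to handle this, sketched below purely for illustration: let the WSDL-to-Java tooling generate value types from the imported schema, then populate those generated types from a JDBC query inside the service implementation, so the serialized response conforms to the schema by construction. The OrderResponse and OrderLine classes, the data-source JNDI name, and the table and column names are hypothetical stand-ins for whatever the client's schema actually generates.
    // Hypothetical sketch: OrderResponse and OrderLine stand in for the value types
    // generated from the client's schema; table, column and JNDI names are invented.
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.util.ArrayList;
    import java.util.List;
    import javax.naming.InitialContext;
    import javax.sql.DataSource;
    public class OrderServiceImpl {
        public OrderResponse getOrder(String orderId) throws Exception {
            DataSource ds = (DataSource) new InitialContext().lookup("jdbc/OrdersDS");
            Connection con = ds.getConnection();
            List lines = new ArrayList();
            try {
                PreparedStatement ps = con.prepareStatement(
                    "SELECT line_no, material, quantity FROM order_lines WHERE order_id = ?");
                ps.setString(1, orderId);
                ResultSet rs = ps.executeQuery();
                while (rs.next()) {
                    // Each column is copied into one element of the schema-generated type,
                    // so the marshalled SOAP response conforms to the schema automatically.
                    OrderLine line = new OrderLine();
                    line.setLineNo(rs.getInt("line_no"));
                    line.setMaterial(rs.getString("material"));
                    line.setQuantity(rs.getBigDecimal("quantity"));
                    lines.add(line);
                }
                rs.close();
                ps.close();
            } finally {
                con.close();
            }
            OrderResponse response = new OrderResponse();
            response.setLines((OrderLine[]) lines.toArray(new OrderLine[lines.size()]));
            return response;
        }
    }
    With this pattern the mapping is controlled entirely inside the Java class: the generated types carry the schema structure, and the implementation only decides which column feeds which setter.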

  • Custom Expiry Implementation Advice Needed

    Hi Guys,
    I need help identifying the best implementation approach. A requirement is to transactionally persist data that is being expired from a cache to either an MQ queue or a database. I need to make sure that if the persistence operation fails (i.e., the database/MQ is down), the entry is retained in the cache until the other systems are operational again.
    Implementation Options:
    1) Periodically scan all cached entries and manually identify which ones are ready for expiration. Then use Coherence transactions linked through XA with the persistence operation.
    2) Periodically scan all cached entries and manually identify which ones are ready for expiration. Remove an entry only if the persistence operation is successful.
    3) Use a custom eviction policy? I didn't see a good way of plugging in there.
    4) Extend LocalCache.Entry and override the isExpired method. I'm not too sure if that is a good option or if it will even work.
    5) ??
    Thanks,
    Dmitriy

    Hi Dmitriy,
    Personally, I wouldn't do anything off the back of Coherence's built-in expiry. Going through your options...
    1) While you are scanning the entries, Coherence has probably already evicted them. Whenever you do an action on a cache, the first thing Coherence does is evict anything that has expired. So if you tried to scan a cache, Coherence would evict the entries before you could scan them. A second point: I wouldn't use Coherence transactions for what you suggest, as you will find they are too restrictive to be of much use.
    2) See 1 above - again, you will have trouble scanning the cache.
    3) Custom eviction policies are really just a way to tell Coherence whether it can evict something.
    4) This could get tricky, and trying to do some sort of persistence off the back of isExpired is just wrong - methods should do what their name says. You have no idea how often Coherence will call your overridden method, and if it is doing persistence you could have a big impact on the performance of your cluster.
    5) I would choose this option.
    I am pretty sure that triggers get fired when an entry is evicted, so you could add a trigger to the cache that does the persistence when an entry is evicted. When an entry is evicted you basically get a synthetic remove event, which you can detect in the trigger (although it is not nice code, as you need to use reflection to get to the isSynthetic() method). A trigger that looks like this might be workable...
    import com.tangosol.util.MapTrigger;
    import com.tangosol.util.ValueExtractor;
    import com.tangosol.util.extractor.ReflectionExtractor;
    public class MyTrigger implements MapTrigger {
        // Reflection-based extractor used to reach the isSynthetic() method on the entry
        private static final ValueExtractor syntheticExtractor = new ReflectionExtractor("isSynthetic");
        @Override
        public void process(Entry entry) {
            if (entry.isOriginalPresent() && !entry.isPresent() && isSyntheticEntryEvent(entry)) {
                // The entry is being evicted - do the MQ/database persistence here
            }
        }
        private boolean isSyntheticEntryEvent(Entry entry) {
            boolean isSynthetic;
            try {
                Object synthetic = syntheticExtractor.extract(entry);
                isSynthetic = (synthetic instanceof Boolean) ? (Boolean) synthetic : false;
            } catch (Throwable e) {
                isSynthetic = false;
            }
            return isSynthetic;
        }
    }
    JK
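
    As a brief follow-up sketch (not part of the original reply): a MapTrigger like the one above is normally attached to the cache by wrapping it in a MapTriggerListener. The cache name below is a placeholder.
    import com.tangosol.net.CacheFactory;
    import com.tangosol.net.NamedCache;
    import com.tangosol.util.MapTriggerListener;
    public class RegisterTrigger {
        public static void main(String[] args) {
            // "expiring-cache" is a placeholder; MapTriggerListener is the wrapper that
            // registers a MapTrigger against a NamedCache.
            NamedCache cache = CacheFactory.getCache("expiring-cache");
            cache.addMapListener(new MapTriggerListener(new MyTrigger()));
        }
    }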

  • Oracle IAM integration with GRC 10

    Hi All,
    Our client is using Oracle IAM for their user provisioning process. Now they have SAP GRC being implemented for two of their SAP systems, and the client wants to integrate the Access Risk Analysis (ARA) module for SoD analysis and the User Access Management (UAM) module for user provisioning in SAP GRC 10 with Oracle IAM.
    As far as I know, web services need to be activated in GRC 10, and I have done that. Now I want to know how Oracle IAM communicates with GRC 10: how connectors need to be developed, what user account has to be created for web service access, and how the parameters are passed from Oracle to GRC.
    Also, how many different scenarios are there in Oracle IAM for this integration?
    In the SAP IDM to SAP GRC integration we have two scenarios:
    1. Request raised in IDM -> SOD analysis in GRC -> Provisioning in GRC -> Return success/failure status back to IDM
    2. Request raised in IDM -> SOD analysis in GRC -> Return SOD success/failure status back to IDM -> Provisioning in IDM
    So can anyone help with the possible scenarios for this integration process?

    Hi Vikas and Frank,
    Do you have any information on how to enable the web services in GRC 10 (does NWBC hold the key)? If you have any information related to this, please share it with me.
    Thanks and regards,
    keerthi
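
    For illustration only, a generic JAX-WS Dispatch call is one way an external IdM such as Oracle IAM can invoke whichever GRC 10 web service has been activated (typically via SOAMANAGER). The service and port QNames, endpoint URL, credentials, and request payload below are placeholders; the real values have to be taken from the activated service's WSDL.
    // Hedged sketch: every QName, URL, user and payload here is a placeholder, not the
    // actual GRC 10 service definition, which comes from the activated service's WSDL.
    import java.io.StringReader;
    import javax.xml.namespace.QName;
    import javax.xml.transform.Source;
    import javax.xml.transform.stream.StreamSource;
    import javax.xml.ws.BindingProvider;
    import javax.xml.ws.Dispatch;
    import javax.xml.ws.Service;
    import javax.xml.ws.soap.SOAPBinding;
    public class GrcWebServiceCall {
        public static void main(String[] args) {
            QName serviceName = new QName("urn:example", "GrcService");      // placeholder
            QName portName = new QName("urn:example", "GrcPort");            // placeholder
            String endpoint = "https://grc-host:443/sap/bc/srt/placeholder"; // placeholder
            Service service = Service.create(serviceName);
            service.addPort(portName, SOAPBinding.SOAP11HTTP_BINDING, endpoint);
            Dispatch<Source> dispatch =
                    service.createDispatch(portName, Source.class, Service.Mode.PAYLOAD);
            // Basic authentication with the web service user created for the IdM integration.
            dispatch.getRequestContext().put(BindingProvider.USERNAME_PROPERTY, "IAM_WS_USER");
            dispatch.getRequestContext().put(BindingProvider.PASSWORD_PROPERTY, "secret");
            // Placeholder payload; the real element names come from the service's WSDL/schema.
            String request = "<RiskAnalysisRequest xmlns=\"urn:example\"><UserId>JDOE</UserId></RiskAnalysisRequest>";
            Source response = dispatch.invoke(new StreamSource(new StringReader(request)));
            System.out.println("Response received: " + response);
        }
    }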

  • Difference between EDN-DB and EDN-JMS implementation

    Experts,
    I am looking for the difference between the EDN-DB and EDN-JMS implementations. We have an EDN implementation going on where multiple source systems will be publishing business events. Events can be published using SOA Suite or a JMS bridge from AQ and MQ. In this scenario, which implementation approach is better suited?

    The Oracle Apps Adapter can be used for many products other than E-Business Suite (JD Edwards E1, OTM, and others). The Apps Adapter was mainly used in ERP systems and was strong when you needed added functionality beyond just polling a DB for a status or calling a stored procedure... It could connect out to other business services, XML gateways, and so on.
    From a technical/development perspective, I always thought that the Oracle Apps Adapter simplifies development but may not be flexible. The Apps Adapter's strength is the services that are difficult to integrate with (business services, XML gateway). If you need to hook into these types of services, use the Apps Adapter. (Again, as its name says, the "Apps Adapter" is for interacting with other applications to enable real-time processing/functionality.)
    That being said, the DB Adapter can be very powerful but, as its name says, is database-focused (polling some table to know when to kick off other logic and enabling real-time functionality through database interaction).
    Looking at the Oracle 12c roadmap, I had heard that there is more planned for the Oracle Apps Adapter moving forward, especially, as Anuj said, on the "Financial Apps" side of the aisle at Oracle. Who knows what is real and what isn't, though =]
    Hope this helps!
    -Luke

  • GRC AC Help

    Hello community!
    I'm working on a new GRC AC implementation. I have never done one before, and at first I thought it should not be so hard; well... looks like I was wrong. To start with, I should implement only BRM and ARM. Users should be able to manage roles from all plugged-in systems through GRC AC and also use the default workflows for access request and approval.
    I'm following the guide http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/80063a8e-1da6-2e10-aaa5-fda1f0936c37?QuickLink=index&… for BRM, but it seems like some steps are not present.
    I'm wondering if someone with implementation experience can give me answers to the questions below:
    1 - How long on average does a project for an ARM and BRM implementation take, considering an almost "default" implementation?
    2 - Can someone send me a task schedule for an AC implementation, so I can use it as a base for my own schedule?
    3 - I purchased the "Implementing SAP Governance, Risk, and Compliance" book from the SAP Press website. Basically, that book and the aforementioned guide are my references. If someone can contribute by commenting or sending new references, I would really appreciate it.
    Thanks in advance,
    Pedro

    Hi Pedro,
    See my answers embedded below.
    Pedro Botega wrote:
    1 - How long on average does a project for an ARM and BRM implementation take, considering an almost "default" implementation?
    It really depends on how many systems you are going to be connecting to the GRC system and of what types (i.e. ECC, SRM, BW, Portal, non-ABAP, etc.). It also depends on how complex your custom access approval workflow may be. In addition, if you are a big company, you may phase your go-lives by region, lengthening the effort not just technically but also in terms of finalising and agreeing the GRC business processes, test phases, training, etc. In short, there is a lot to think about and consider.
    2 - Can someone send me a task schedule for an AC implementation, so I can use it as a base for my own schedule?
    Most companies retain this as their own IP, so probably not. However, from experience, a simple implementation of GRC AC with ARA/EAM from start to finish should not take more than 3 months (being very generous here!). A full-on implementation involving all 4 modules (ARA, EAM, ARM, BRM) can be simple and last a generous 4-5 months if the approval flows are designed simply (or the SAP-delivered workflows are used).
    However, if the landscape gets technically complex (number of target systems, 3rd-party systems, etc.) and the approval flows get complex, you could be looking at a 5-10 month project. Sometimes the customer has multiple test phases and dry runs, making the project last longer(!). If you are implementing BRM to build your roles from start to end, expect some challenges, hence adding time. Needless to say, delays are easily caused by technical gremlins within the product itself. Start befriending the SAP Support Marketplace and digging for new SAP Notes.
    Some companies offer a Rapid Deployment capability (both SAP and 3rd-party specialists). This would be worth researching if you wish to save some money on implementation costs.
    3 - I purchased the "Implementing SAP Governance, Risk, and Compliance" book from the SAP Press website. Basically, that book and the aforementioned guide are my references. If someone can contribute by commenting or sending new references, I would really appreciate it.
    Personally, I would start by reading the documents on the SAP Access Control 10.0 page and obviously dig around this SCN forum; there are enough talented individuals sharing their experience here, and a lot of your future questions are likely to have already been asked, discussed, or answered on this forum.

  • What about GRC Process Control 3.0

    Hi experts!
    I've read a lot of information about GRC AC 5.2/5.3, but not so many documents about the new version of PC.
    Right now we are trying to install and test a parameterized system in our laboratory.
    Is there a specific forum for PC?
    Has anyone installed it successfully?
    What are the main steps?
    Is the documentation useful?
    Thanks a lot

    Hi,
    There are quite a few successful GRC PC implementations around the world. If you are representing a customer / partner, you can touch base with me and I will be more than happy to help you.
    Application help for PC is available on the SAP Help Portal; the installation, upgrade, and security guides are available on the SAP Service Marketplace.
    Which version of PC are you planning to implement?
    Regards
    Rohit Balu

  • Global SAP HR Implementation

    Hello Forum,
    I am planning the implementation strategy for a global SAP HR implementation across 30 countries.
    Processes to be implemented:
    1. Performance management
    2. Hiring and firing
    3. Payroll
    4. Manage Employee Info.
    5. Recruiting
    6. ESS
    7. MSS
    etc.
    I am wondering if someone can answer the following queries given the above inputs:
    1. The cutover approach and principles
    2. The data conversion sequence
    3. List the interim processes required
    4. Expected resources to support implementation
    Thanks in advance.
    Sanjay

    Hi,
    I have worked on the world's largest SAP HCM implementation.
    A few points from my side:
    1. The cutover approach and principles
    It is best to start with a country with a smaller population (for the first go-live).
    Maintain a separate testing team for SIT as well.
    Adhere to global implementation strategies and consistent terminology in all countries (configuration part).
    2. The data conversion sequence
    Maintain a separate team for data conversion; data cleansing should be the job of the client. You can use different methods for uploading data for different countries (e.g., for the US specifically we have the DTT tool, etc.).
    3. List the interim processes required
    4. Expected resources to support implementation
    With respect to the implementation approach:
    How many countries are you planning to take live on the same date (parallel implementation)?
    Or are you planning a sequence of rollouts, etc.?
    Regards
    Thomas.
    Edited by: Thomas Padiyara on Jul 29, 2008 4:50 AM
    Edited by: Thomas Padiyara on Jul 30, 2008 8:48 AM

  • Retrieve nested LDAP groups independent from the network env. (five different approaches)

    Hi all,
    I want to retrieve a list of nested LDAP groups per user from Active Directory. I have been searching Google for half a day now, but I'm still not sure which approach to use. I have the following requirements:
    * The script/program must run in different network environments (I can't be sure whether there is a global catalog, AD DS, or AD LDS, etc.). I will write my own program.
    * The membership info will be used in combination with directory ACLs and must be as complete as possible (global groups, universal groups, local groups, perhaps different domains). Distribution groups are not really necessary, because they are not used in the directory ACLs.
    * It would be nice to support LDAP implementations other than Active Directory using the same code, but that is not a hard requirement. I could use another approach to support a different LDAP.
    Now I have figured out five possible approaches (info comes from different sites, please correct me if I'm wrong):
    1) tokengroups attribute:
    - The attribute contains Universal groups of the forest, global groups from the local domain, domain local groups from the local domain (assuming native mode) and local groups from the local machine.
    - Returns a list of SIDs which will have to be translated to group names
    - The tokenGroups attribute exists on both AD DS and AD LDS
    - For AD DS, the tokenGroups attribute is not present if no GC server is available to evaluate the transitive reverse memberships.
    - quote from site "Now that I have had a chance to test it though I can definitely say that tokenGroups WILL get the Universal groups from the other domains even if is NOT a GC. I just did it in my test lab."
    - Token Groups cannot be retrieved if no Global Catalog is present to retrieve the transitive reverse memberships.
    2) tokenGroupsGlobalAndUniversal
    - A subset of the tokenGroups attribute. Only the global and universal group SIDs are included.
    - If you want consistent results, read tokenGroupsGlobalAndUniversal that will return the same result no matter which DC you are connected to. However, it will not include local groups.
    - Another source says "tokenGroups will give you all the security groups this user belongs to, including nested groups and domain users, users, etc. tokenGroupsGlobalAndUniversal will include everything from tokenGroups AND distribution groups". Not sure if this is correct; I think it doesn't contain local groups.
    - The tokenGroupsGlobalAndUniversal attribute exists on AD DS but not on AD LDS.
    3) LDAP_MATCHING_RULE_IN_CHAIN / 1.2.840.113556.1.4.1941
    - Use a recursive search query which returns all nested groups for user at once.
    - Returns all groups except for the primary group
    - It's a fast approach, see performance test from Richard Mueller:
    http://social.technet.microsoft.com/Forums/fr-FR/f238d2b0-a1d7-48e8-8a60-542e7ccfa2e8/recursive-retrieval-of-all-ad-group-memberships-of-a-user?forum=ITCG
    - It only works on Active Directory, not for other LDAP implementations
    4) Recursive retrieval of the memberOf attribute
    - Retrieves all groups except the primary group. (also local groups from other domains??)
    - works for all LDAP implementations
    - executes a lot of queries to the LDAP, especially if you want to scan all users/groups (perhaps limited on OU, but still)
    5) Store memberOf attribute in local database and calculate the nested groups using recursive queries to the local database
    - No heavy load to the LDAP
    - Needs space to store the user/group info locally (embedded Derby database perhaps)
    - Performs fast since the queries are executed locally
    - Works for all LDAP implementations
    My thoughts on these different approaches:
    * approach 1) I understand that the tokenGroups attribute is not present if no GC server is available. In how many network environments is this the case? This option won't work, because I want to support different network environments.
    * approach 2) The tokenGroupsGlobalAndUniversal attribute exists on AD DS but not on AD LDS. Same here: in how many network environments is this the case? I don't think I can rely on this approach.
    * approach 3) Seems to be a good option. How will it perform compared to approach 5 (local recursive queries)? It won't work for other LDAP implementations.
    * approach 4) I don't think I want to execute that many queries against the LDAP. I can limit the scan to an OU, but companies can still have thousands of users and groups.
    * approach 5) Perhaps the best approach. I want to store user/group info locally for fast filtering/reporting (only group DNs, user names, database IDs, and membership info as id-id pairs). I only need the memberOf attribute of users and groups; recursive loops are done locally. It will work for all LDAP implementations.
    What do you guys think? I'm not a network admin but a programmer, so I'm no expert in network setups and when to use AD DS or AD LDS. The thing is, I want to use this code at different customers without knowing their network setup (except for the domain name(s), LDAP host/port, and bind user to connect to LDAP).
    Thanks a lot!
    Paul

    I want to write a tool that can answer questions like "which users from group ABC have delete permission in all the (sub)directories of server MyDataServer?". This results in a list of directories and users and includes nested group membership, so it's about effective permissions. That's why I want all the information in a SQL database, so I can answer these questions with a single query in milliseconds. Otherwise, in order to answer these questions, I would have to get all members from group ABC and determine the nested groups for all these members (which can be thousands) for every report. Using a SQL database I can retrieve this information once a night for all the members.
    But I guess I will use the LDAP_MATCHING_RULE_IN_CHAIN syntax, which gives me all nested groups for a member and should work for all AD installations from W2K3 SP2 and higher. When I want to support other LDAPs I will use another method for that specific LDAP.
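
    For illustration, here is a minimal JNDI sketch of approach (3): one subtree search using the LDAP_MATCHING_RULE_IN_CHAIN matching rule (OID 1.2.840.113556.1.4.1941), which returns every group the user is transitively a member of (except the primary group, as noted above). The host, credentials, and DNs are placeholders, and a real implementation should escape special characters in the user DN before putting it into the filter.
    // Hypothetical sketch of approach (3); host, credentials and DNs are placeholders.
    import java.util.Hashtable;
    import javax.naming.Context;
    import javax.naming.NamingEnumeration;
    import javax.naming.directory.InitialDirContext;
    import javax.naming.directory.SearchControls;
    import javax.naming.directory.SearchResult;
    public class NestedGroupLookup {
        public static void main(String[] args) throws Exception {
            Hashtable<String, String> env = new Hashtable<String, String>();
            env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
            env.put(Context.PROVIDER_URL, "ldap://dc.example.com:389");
            env.put(Context.SECURITY_PRINCIPAL, "binduser@example.com");
            env.put(Context.SECURITY_CREDENTIALS, "secret");
            InitialDirContext ctx = new InitialDirContext(env);
            String userDn = "CN=Some User,OU=Users,DC=example,DC=com";
            // Matches every group whose member attribute chains (transitively) to the user.
            String filter = "(member:1.2.840.113556.1.4.1941:=" + userDn + ")";
            SearchControls sc = new SearchControls();
            sc.setSearchScope(SearchControls.SUBTREE_SCOPE);
            sc.setReturningAttributes(new String[] { "distinguishedName" });
            NamingEnumeration<SearchResult> results =
                    ctx.search("DC=example,DC=com", filter, sc);
            while (results.hasMore()) {
                SearchResult group = results.next();
                System.out.println(group.getNameInNamespace());
            }
            ctx.close();
        }
    }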
    Again - note that this question has nothing to do with LDAP or AD. It just asks which groups have permissions on which resources.
    I really think you would do well to spend time understanding NTFS and its security, along with how we use security in Windows. By assuming this has something to do with AD you are making it a bigger issue than needed. AD is a repository for accounts and trusts and manages authentication and security group membership. All file security is managed by the OS that hosts the files, not by AD. Users are not normally granted access to resources through direct inclusion in the DACL but are given access through membership in one or more groups. Loading AD into a SQL database will not help you.
    ¯\_(ツ)_/¯
