RAR Rules - DEV and QA

Hi,
Our RAR Dev system has two active connectors, one to SAP Dev and one to QA. We have uploaded a rule set for QA, which is working fine: we can run a risk analysis and get results. However, we are struggling to get results for Dev. How do we go about uploading a test rule set for Dev?
Please assist.
Regards.

Hi,
Thank you all, my problem is solved :)
The Master User Source was pointing to Dev, and the function actions did include both Dev and QA. I reran a full sync of users, roles and profiles, as well as a full batch risk analysis, and that did the trick.
Thanks once again.
Regards.

Similar Messages

  • RAR: SoD Risks and Critical Actions risks

    Hi all,
    I would like to get your input on the different approaches for loading SoD risks and critical action risks into RAR.
    1) Do you load all of them under the same rule set?
    2) Do you think it is better to load them under two different rule sets, one for SoD and the other for critical actions?
    Since the AC modules use the default SoD rule set when calling RAR, my preference would be to define everything under the same single rule set. Do you agree with that?
    Keep in mind that all four GRC AC modules are implemented.
    Thanks for all. Kind regards,
      Imanol

    Hi Imanol,
      It depends on the client requirements. If the client wants to see critical risks as well as SoD risks in CUP, then a single ruleset is the way to go. If the client doesn't want to confuse approvers by showing critical risks, then a separate ruleset is the right way. At my current client, we have separate rulesets for SoD and critical actions. We ask role owners to reaffirm, quarterly, all role assignments which contain critical actions, so we are covered from that angle.
    Regards,
    Alpesh

  • Need information on the new RAR Rule Architect/Rule Set functions

    Does anyone have any information on the new 5.3 functions listed under Rule Architect/Rule Sets, specifically the Compare function?
    My 5.3 Config manual mentions this area but doesn't describe anything about it.  I have a request from our user group and need to determine if this can fit that request.
    What they are looking for is an easy way to compare our RAR Rule Set with the latest SAP version (Q2 2010 is the most recent I believe).  Just from the screen shots, it looks like we could maybe use the Rule Sets functions for that.  Load the new SAP one into RAR as a separate ruleset and then run this Compare function.  However I haven't been able to find any documentation on this function, so I don't know if it really does what we are looking for.
    Thanks.

    Hi,
    The 'NullPointerException' error is very common in GRC.
    If you search, you will find lots of threads and notes on this.
    Check your permission TXT file; it probably contains a null value somewhere.
    Especially check the SD01 & SD02 tcodes.
    Also open the permission file in Word and check all the TABs and ENTERs with formatting marks displayed.
    Regards,
    Surpreet
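    For reference, here is a minimal Java sketch of the kind of check Surpreet describes: scanning a tab-delimited rule file for empty fields or rows with the wrong number of columns. The file name and expected column count are placeholders, not the actual template values, so adjust them to the file you are checking.

        import java.io.BufferedReader;
        import java.io.FileReader;
        import java.io.IOException;

        // Sketch: flag blank lines, empty fields and ragged rows in a tab-delimited rule file.
        public class PermissionFileCheck {
            public static void main(String[] args) throws IOException {
                String file = args.length > 0 ? args[0] : "function_permission.txt"; // placeholder path
                int expectedColumns = 6; // assumption: set this to the column count of your template
                try (BufferedReader in = new BufferedReader(new FileReader(file))) {
                    String line;
                    int lineNo = 0;
                    while ((line = in.readLine()) != null) {
                        lineNo++;
                        if (line.trim().isEmpty()) {
                            System.out.println("Line " + lineNo + ": blank line");
                            continue;
                        }
                        String[] fields = line.split("\t", -1); // -1 keeps trailing empty fields
                        if (fields.length != expectedColumns) {
                            System.out.println("Line " + lineNo + ": " + fields.length
                                    + " columns instead of " + expectedColumns);
                        }
                        for (int i = 0; i < fields.length; i++) {
                            if (fields[i].trim().isEmpty()) {
                                System.out.println("Line " + lineNo + ", column " + (i + 1) + ": empty value");
                            }
                        }
                    }
                }
            }
        }

    Running this over the file before an upload is a quick way to spot stray tabs and line breaks without opening it in Word.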

  • RAR - Rules Upload

    Hi Experts,
    In RAR, I can see the default "Global" ruleset. I went to the Configuration tab, navigated to Rule Upload > Generate Rules, clicked the Foreground button, and saw a list of risk descriptions, conflicts, etc.
    However, I do not want to use the Global ruleset, as I have a customized ruleset which addresses my client's SoD concerns very specifically. What I did first was export all the components of the rules (as a backup); then I navigated to the Rule Architect tab and manually deleted all the Risks, Functions, Rule Sets and Business Processes (in that order).
    I then went to the Configuration tab > Rule Upload and uploaded the Business Process, Function, Function Authorization, Rule Set and Risk files. No error messages were encountered, as I followed the Rule File Templates from the configuration guide, but it also does not tell me whether the import was successful (so I assumed no error message = import successful).
    However, when I navigated to Rule Upload > Generate Rules and clicked the Foreground button, no list was generated this time. To troubleshoot, I exported all the components of the rules (based on what I had imported) and found that the "function_permission.txt" and "Risk_desc.txt" portions were missing from the exported text file, while all the information from the other text files was present.
    From my initial analysis, it seems the Function Authorization and Risk files may not have been imported successfully. Has anyone encountered this problem, and what actions should be taken to rectify it?
    Thanks!

    Hi Experts,
    Thanks for your response.
    I followed the Rule Set Template from the configuration guide to the letter.
    Upon closer inspection of the content of what I exported, I discovered that in the "function_action.txt" portion the tcodes for some business processes were missing. For example,
    business process FA has tcodes under the function action tab, but business process IM seems to have no tcodes under the function action tab. I suspect that during the import certain business processes were not "picked up", whereas others were; it was a clean omission of tcodes for the IM business process. Does the naming convention for business processes rely on reserved words (i.e. financial accounting must be FA, procurement must be PR, etc., the same as in the global ruleset)?
    In addition, for those business processes which do have tcodes in the function action tab, I tried to click the "+" to expand and see the objects, fields and values under the function permission tab, but it cannot be expanded (i.e. it is blank).
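    If it helps to narrow this down, below is a small Java sketch that counts how many actions (tcodes) each business process ends up with after an export, so you can see at a glance which ones came through empty. The file names and column layouts are assumptions (a FUNCTION-to-BUSINESS-PROCESS mapping file plus function_action.txt with FUNCTION and TCODE in the first two columns); verify them against the rule file templates before trusting the output.

        import java.io.BufferedReader;
        import java.io.FileReader;
        import java.io.IOException;
        import java.util.HashMap;
        import java.util.Map;
        import java.util.TreeMap;

        // Sketch: count exported actions per business process, under assumed file layouts.
        public class ActionsPerBusinessProcess {
            public static void main(String[] args) throws IOException {
                // Assumption: a mapping file with FUNCTION<TAB>BUSINESS_PROCESS per line.
                Map<String, String> functionToBp = new HashMap<>();
                try (BufferedReader in = new BufferedReader(new FileReader("function_bp.txt"))) {
                    String line;
                    while ((line = in.readLine()) != null) {
                        String[] f = line.split("\t", -1);
                        if (f.length >= 2) functionToBp.put(f[0].trim(), f[1].trim());
                    }
                }

                // Assumption: function_action.txt with FUNCTION<TAB>TCODE in the first two columns.
                Map<String, Integer> actionsPerBp = new TreeMap<>();
                for (String bp : functionToBp.values()) actionsPerBp.put(bp, 0);
                try (BufferedReader in = new BufferedReader(new FileReader("function_action.txt"))) {
                    String line;
                    while ((line = in.readLine()) != null) {
                        String[] f = line.split("\t", -1);
                        if (f.length < 2) continue;
                        String bp = functionToBp.get(f[0].trim());
                        if (bp != null) actionsPerBp.merge(bp, 1, Integer::sum);
                    }
                }

                // Business processes reported with 0 actions are the ones that lost their tcodes.
                actionsPerBp.forEach((bp, count) -> System.out.println(bp + ": " + count + " actions"));
            }
        }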

  • SAP GRC RAR Rules Generation Job Error - SP13 application

    Hello,
    We applied SP13 on GRC and the RAR Rule Generation job always ends in "error" status; below is an example of the job log:
    INFO: -
    Scheduling Job =>237----
    Apr 4, 2011 1:36:12 PM com.virsa.cc.xsys.bg.BgJob run
    INFO: --- Starting Job ID:237 (RULE_GENERATION) - generate f113
    Apr 4, 2011 1:36:12 PM com.virsa.cc.xsys.util.Lock lock
    FINEST: Lock:1007
    Apr 4, 2011 1:36:12 PM com.virsa.cc.xsys.bg.BgJob setStatus
    INFO: Job ID: 237 Status: Running
    Apr 4, 2011 1:36:12 PM com.virsa.cc.xsys.bg.BgJob updateJobHistory
    FINEST: --- @@@@@@@@@@@ Updating the Job History -
    1@@Msg is generate f113 started :threadid: 1
    Apr 4, 2011 1:36:12 PM com.virsa.cc.xsys.bg.dao.BgJobHistoryDAO insert
    INFO: -
    Background Job History: job id=237, status=1, message=generate f113 started :threadid: 1
    Apr 4, 2011 1:36:12 PM com.virsa.cc.xsys.util.Lock unlock
    FINEST: Unlock:1007
    Apr 4, 2011 1:36:12 PM com.virsa.cc.xsys.bg.BgJob ruleGeneration
    INFO: @@@--- Rule ruleGeneration Started ....237
    Apr 4, 2011 1:36:12 PM com.virsa.cc.xsys.bg.BgJob run
    WARNING: *** Job Exception: null
    java.lang.NullPointerException
         at com.virsa.cc.xsys.bg.BgJob.ruleGeneration(BgJob.java:1245)
         at com.virsa.cc.xsys.bg.BgJob.runJob(BgJob.java:609)
         at com.virsa.cc.xsys.bg.BgJob.run(BgJob.java:363)
         at com.virsa.cc.xsys.riskanalysis.AnalysisDaemonBgJob.scheduleJob(AnalysisDaemonBgJob.java:375)
         at com.virsa.cc.xsys.riskanalysis.AnalysisDaemonBgJob.start(AnalysisDaemonBgJob.java:92)
         at com.virsa.cc.comp.BgJobInvokerView.wdDoModifyView(BgJobInvokerView.java:444)
         at com.virsa.cc.comp.wdp.InternalBgJobInvokerView.wdDoModifyView(InternalBgJobInvokerView.java:1236)
         at com.sap.tc.webdynpro.progmodel.generation.DelegatingView.doModifyView(DelegatingView.java:78)
         at com.sap.tc.webdynpro.progmodel.view.View.modifyView(View.java:337)
         at com.sap.tc.webdynpro.clientserver.cal.ClientComponent.doModifyView(ClientComponent.java:481)
         at com.sap.tc.webdynpro.clientserver.window.WindowPhaseModel.doModifyView(WindowPhaseModel.java:551)
         at com.sap.tc.webdynpro.clientserver.window.WindowPhaseModel.processRequest(WindowPhaseModel.java:148)
         at com.sap.tc.webdynpro.clientserver.window.WebDynproWindow.processRequest(WebDynproWindow.java:335)
         at com.sap.tc.webdynpro.clientserver.cal.AbstractClient.executeTasks(AbstractClient.java:143)
         at com.sap.tc.webdynpro.clientserver.session.ApplicationSession.doProcessing(ApplicationSession.java:333)
         at com.sap.tc.webdynpro.clientserver.session.ClientSession.doApplicationProcessingStandalone(ClientSession.java:741)
         at com.sap.tc.webdynpro.clientserver.session.ClientSession.doApplicationProcessing(ClientSession.java:694)
         at com.sap.tc.webdynpro.clientserver.session.ClientSession.doProcessing(ClientSession.java:253)
         at com.sap.tc.webdynpro.clientserver.session.RequestManager.doProcessing(RequestManager.java:149)
         at com.sap.tc.webdynpro.serverimpl.defaultimpl.DispatcherServlet.doContent(DispatcherServlet.java:62)
         at com.sap.tc.webdynpro.serverimpl.defaultimpl.DispatcherServlet.doGet(DispatcherServlet.java:46)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:740)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
         at com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl.runServlet(HttpHandlerImpl.java:401)
         at com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl.handleRequest(HttpHandlerImpl.java:266)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.startServlet(RequestAnalizer.java:386)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.startServlet(RequestAnalizer.java:364)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.invokeWebContainer(RequestAnalizer.java:1060)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.handle(RequestAnalizer.java:265)
         at com.sap.engine.services.httpserver.server.Client.handle(Client.java:95)
         at com.sap.engine.services.httpserver.server.Processor.request(Processor.java:175)
         at com.sap.engine.core.service630.context.cluster.session.ApplicationSessionMessageListener.process(ApplicationSessionMessageListener.java:33)
         at com.sap.engine.core.cluster.impl6.session.MessageRunner.run(MessageRunner.java:41)
         at com.sap.engine.core.thread.impl3.ActionObject.run(ActionObject.java:37)
         at java.security.AccessController.doPrivileged(Native Method)
         at com.sap.engine.core.thread.impl3.SingleThread.execute(SingleThread.java:104)
         at com.sap.engine.core.thread.impl3.SingleThread.run(SingleThread.java:176)
    Apr 4, 2011 1:36:12 PM com.virsa.cc.xsys.bg.BgJob setStatus
    INFO: Job ID: 237 Status: Error
    Apr 4, 2011 1:36:12 PM com.virsa.cc.xsys.bg.BgJob updateJobHistory
    FINEST: --- @@@@@@@@@@@ Updating the Job History -
    2@@Msg is Error while executing the Job:null
    Apr 4, 2011 1:36:12 PM com.virsa.cc.xsys.bg.dao.BgJobHistoryDAO insert
    INFO: -
    Background Job History: job id=237, status=2, message=Error while executing the Job:null
    Apr 4, 2011 1:36:12 PM com.virsa.cc.xsys.riskanalysis.AnalysisDaemonBgJob scheduleJob
    INFO: -
    Complted Job =>237----
    Apr 4, 2011 1:36:13 PM com.virsa.cc.xsys.util.Lock lock
    WARNING: It is used by the same owner: For current thread retrying to get lock : 1001
    Apr 4, 2011 1:36:13 PM com.virsa.cc.xsys.util.Lock lock
    FINEST: Lock:1001
    Apr 4, 2011 1:36:13 PM com.virsa.cc.xsys.util.Lock unlock
    FINEST: Unlock:1001
    Is there someone who can help me?
    I checked and it seems that "Use NetWeaver Logical Lock" in the Configuration tab has to be set to "No". Is that correct in your experience, or do you have other tips?
    Thanks to all.

    Hello,
    Actually, the current values are:
    Row | CNFGPARAM | CNFGSEQ | CNFGVALUE
    35  | 250       | 0       | NO
    36  | 251       | 0       | YES
    The value for 250 is OK based on your feedback.
    The value for 251 is based on SAP Note 1508611, even though the SDN forum suggests "0", contrary to the note.
    Have you got any tips?

  • Dev and Production BW systems not in sync.

    Hello All,
    We have been using two BW systems for our implementation project: one development system and one production BW system. I have observed that a few BW objects have a different configuration in production (update rules, user exits, queries, web templates, etc.) compared to the development system. My requirement now is to find all the BW objects in production that have a different status than in development, so that I can make the corresponding changes in the development system and bring the dev and production systems back in sync.
    1.     Is there any tool/transaction available with which I can retrieve the list of BW objects, and possibly the delta changes, between the dev and prd systems?
    2.     Or does any standard process exist to sync the two versions?
    Any help with this will be highly appreciated.
    Thanks in advance.
    Regards
    Harsh.

    Hi Harsh,
    For transfer rules/update rules you can do a remote compare of the generated program, which will give you an idea of whether they are in sync or not. Likewise, you can do a remote compare of the user exit function module.
    Bye
    Dinesh

  • Risk Analysis result different in DEV and PROD

    Hi Gurus,
    I have modified a few functions in development and transported the changes. After the transport, a new SoD violation appears at the user level in production, while the same access in development shows no violation. When I checked the function permissions, there are two duplicate entries in the production ruleset compared to development. Do I need to remove the duplicate entries in production and then run the risk analysis again, and will that fix the SoD?
    My assumption is that GRC 10 does not have the ability to transport deletions made in functions.
    Regards,
    Salman

    Dear Salman,
    It looks like the rules have been appended, so they show up twice.
    I suggest downloading the correct rules from DEV and uploading them into PROD again. You can use program GRAC_UPLOAD_RULES for the upload into PROD. Please make sure you set the option to overwrite, not append.
    With GRAC_RULE_DELETE you can also delete the rules before you upload (not necessary, but possible).
    Hope this helps.
    Regards,
    Alessandro
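    As a quick cross-check before re-uploading, the small Java sketch below reports lines that occur more than once in a rules export, which is one way to confirm the "appended twice" suspicion Alessandro describes. The export file name is a placeholder.

        import java.io.BufferedReader;
        import java.io.FileReader;
        import java.io.IOException;
        import java.util.HashMap;
        import java.util.Map;

        // Sketch: count identical lines in a downloaded rules file and print the duplicates.
        public class DuplicateRuleLines {
            public static void main(String[] args) throws IOException {
                String file = args.length > 0 ? args[0] : "prod_rules_export.txt"; // placeholder name
                Map<String, Integer> counts = new HashMap<>();
                try (BufferedReader in = new BufferedReader(new FileReader(file))) {
                    String line;
                    while ((line = in.readLine()) != null) {
                        String normalized = line.trim();
                        if (!normalized.isEmpty()) counts.merge(normalized, 1, Integer::sum);
                    }
                }
                counts.forEach((entry, count) -> {
                    if (count > 1) System.out.println(count + "x  " + entry);
                });
            }
        }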

  • Org.structure differs on dev and production

    Hi,
    I'm developing a workflow, and in one of the steps (an activity step) I assign the agent using the organizational structure (a position from HR). The problem is that the org structure differs between the dev system and the production system; mainly, the IDs of the positions differ. How can I solve this? Thanks in advance.
    Best regards,
    Tomas

    >
    Tomas Linhart wrote:
    > Thanks for the answer, I guess that's the best way of handling such a situation. I'm going to create a custom table holding the position used in the workflow mapped to the real position ID from the org structure. Or, more generally, holding the WF post mapped to an org structure object type (position, job, ...) with the corresponding ID in the org structure.
    I don't see why you need a custom table.
    The org issues aside, all that is needed is to assign your org unit to the task, then leave the agent assignment blank in your WF. Job done.
    The other option is to create a dummy rule, which you thought was to be avoided. I don't agree: it takes 5 minutes, is less effort than a table, and is less non-standard stuff ==> less explaining to people how to maintain it and easier to troubleshoot.
    Edit: Forgot to answer your other question
    > I'm not sure if you consider org replication the same as a transport of the org. I was thinking of a transport myself, I just didn't know how to do it. I've found transaction RE_RHMOVE30 (also accessible from Customizing), which should do that, but I am not sure if it's the right one and/or which options to select when transporting. Also, I'm not sure whether this would ensure the position/job IDs are the same on both systems.
    Replication would be carried out via transports. The RHMOVE* reports can also be used, or you could even set up an ALE connection between DEV and PRD. However, I still would not recommend it. The development system is where you would need to create test users and/or org units. The other alternative is to have a master data client. All of these are a great deal of work which only makes sense in certain types of environments - certainly not for the sake of a workflow.
    Edited by: Mike Pokraka on Sep 27, 2009 9:52 PM

  • Mass RAR Rule Set Changes

    My integrator is telling me that there is no way to complete a mass update to the authorizations/restrictions in our RAR rule set (AC 5.3). That is, at the recommendation of our external auditor, we added additional transactions to existing rules but failed to activate the company code restrictions to ignore display-only access, and therefore I am receiving a significant number of SoDs which are false positives.
    I find it hard to believe that there is no easy way to activate the company code authorization objects (and others) for the additional transactions in the rule set. The integrator is telling me that this has to be done one by one. Please tell me there is an easier way.
    Apologies if this is a repeat; if this topic is already out there, could someone point me in the right direction? Thank you in advance!

    Is there any easy way? That depends on what you think is easy.
    For mass updates to functions I typically use the Configuration -> Rule Upload feature. To perform an update to an authorization object, you would use the 'Function Authorization' selection.
    To upload the function you'd want to use the file formats of the 9 upload files SAP provides for the ruleset. If I recall correctly, function uploads overwrite the existing function, so it is important that your upload file contains all existing function data plus the additional auth objects you want to activate.
    As with any text file manipulation and download/upload or export/import feature in GRC, you want to be particularly careful with formatting and attention to detail. It is probably a good idea to take a backup of the rules if this is your first time working with the ruleset files.
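    Because the upload overwrites rather than merges, one way to prepare the file is to combine the current export with your additions before uploading. Below is a minimal Java sketch of that step; the file names are placeholders, and the additions file is assumed to already be in the same tab-delimited format as the export.

        import java.io.IOException;
        import java.nio.file.Files;
        import java.nio.file.Paths;
        import java.util.LinkedHashSet;
        import java.util.List;
        import java.util.Set;

        // Sketch: build a complete Function Authorization upload file from the current
        // export plus the additional entries, so the overwrite on upload loses nothing.
        public class BuildFunctionAuthUpload {
            public static void main(String[] args) throws IOException {
                List<String> existing = Files.readAllLines(Paths.get("function_permission_export.txt")); // current export (placeholder name)
                List<String> additions = Files.readAllLines(Paths.get("additional_auth_objects.txt"));   // new entries, same format (placeholder name)

                // LinkedHashSet keeps the original order and drops exact duplicate lines.
                Set<String> merged = new LinkedHashSet<>();
                for (String line : existing)  if (!line.trim().isEmpty()) merged.add(line);
                for (String line : additions) if (!line.trim().isEmpty()) merged.add(line);

                Files.write(Paths.get("function_permission_upload.txt"), merged);
                System.out.println("Wrote " + merged.size() + " lines to function_permission_upload.txt");
            }
        }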

  • GRC-AC v5.3 SP11 -- RAR Rules for BI, GTS, SRM, XI, GRC-AC, SolMan

    Hi!
    Has SAP released RAR Rule sets for BI, GTS, SRM, XI, GRC-AC, or Solution Manager?
    Let me know if anyone else has found them.
    Thanks,
    -john

    Hi John,
       SRM rules have always been available. I have not seen rules for BI, GTS, XI, AC or SolMan. I would definitely like to see rules for XI, BI and SolMan.
    Alpesh

  • Problem in BEx analyzer output in BI dev and BI Quality

    Hi All
    I am facing a problem with the BEx Analyzer output on the BI Dev and BI Quality servers, where the query is built on a BI InfoSet.
    When I execute the query on the BI Dev server it returns results, but when I try to execute the same query on the BI Quality server I get a 'No available data found' result.
    I have also checked the same selection input directly against the InfoProviders of the InfoSet, and output is returned there.
    Please suggest a solution.
    thanks & regards,
    Gourav Sekhri

    Hi,
    Check whether you have loaded data into the quality server; if not, please load data for all the targets involved in the InfoSet. If data is loaded, check that all the targets have the 'request for reporting available' status.
    Also check whether data is showing in the InfoSet or not; if data is available there, adjust your selection criteria accordingly.
    Thanks
    BVR
    Edited by: binu v. rajan on Aug 20, 2011 4:23 PM

  • Key figure difference in Dev and Pro

    Hi Experts,
    I have the same cube in Dev and Production, but the output of the key figure Current Price is different in the two systems:
    in Dev it comes out as 30.12, while in Production it shows as 30,12,
    even though I am using the same key figure.
    Thanks
    David

    Check the user default settings in transaction SU01D in Dev and Pro.
    SU01D -> enter your user ID -> go to the Defaults tab and check the decimal notation.
    I guess it is 1,234,567.89 in DEV and 1.234.567,89 in PRO.
    Please make sure that this setting is the same in DEV and PRO.
    You can ask your Basis consultant to change the settings....
    --- Thanks...
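    To make the symptom concrete, here is a tiny Java illustration (not SAP code) showing that the same stored value renders as 30.12 or 30,12 depending only on the decimal notation in effect, which is what the SU01 default setting controls for the user.

        import java.text.NumberFormat;
        import java.util.Locale;

        // Illustration: one number, two decimal notations.
        public class DecimalNotationDemo {
            public static void main(String[] args) {
                double currentPrice = 30.12;
                NumberFormat dotNotation = NumberFormat.getNumberInstance(Locale.US);        // 1,234,567.89 style
                NumberFormat commaNotation = NumberFormat.getNumberInstance(Locale.GERMANY); // 1.234.567,89 style
                System.out.println("DEV-style output : " + dotNotation.format(currentPrice));   // prints 30.12
                System.out.println("PROD-style output: " + commaNotation.format(currentPrice)); // prints 30,12
            }
        }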

  • Dev and Prod Environment Sync.

    Hello Experts,
    I would like to ask for your kind help with the following matter: how to resync the dev and production environments.
    For some (inexplicable) reason, the production and development servers are out of sync. Mainly, the SLD content is at different versions, with production being the older one. Oh, and there is no QA environment.
    Besides that, the business systems were created manually in each server and have inconsistencies. No SLD information was transported between them, only the repository and some directory information. Other directory information was created directly in the production environment, and I am afraid some development took place directly in the production server too.
    So I have been asked to produce a plan to get all this mess into shape. So far, at a macro level, I have these tasks planned:
    10.- Backup prod and dev servers.
    20.- Load latest CIM model and CR content on both servers (and see what happens).
    30.- Connect the SAP systems as SLD data sources to prod SLD.
    40.- Redo configuration in prod to point to automatically created business systems (and retest).
    50.- Cleanup of manually created business systems in the prod environment.
    60.- Config the prod SLD as data source for dev SLD.
    70.- Redo configuration in dev server to make it match prod SLD.
    80.- Cleanup of manually created business systems in the dev environment.
    90.- Create in the dev server the repository objects that were directly created in prod.
    100.- Test everything for consistency.
    Please help me improve the task list/order.
    -Sam.

    Hi Sam
    You are planning a lot of activity on a production environment. Two comments:
    1. When you have your dev SLD configured as you require, with the CIM model, PI content, business systems etc. updated, perhaps you should do a level of regression testing on this environment before embarking on changing the live environment.
    2. Once you are happy with the dev environment, it is possible to manually export and import the SLD content from Dev to Live to align both systems; refer to this link:
    https://websmp104.sap-ag.de/~sapidb/011000358700000315022005E.PDF
    The export is available through the Export link on the Administration page.
    Thanks
    Damien

  • Analysis Authorization In Dev and impact of reports and roles in prod trans

    Hello,
    We are planning to switch to analysis authorizations. We plan to make that change first in Dev, and we were wondering what the impact would be on the roles and reports we transport from Dev (switched to analysis authorizations) to Production (still on the old authorization concept). We won't transport anything new to Production until we switch to the new authorizations in Prd.
    Thanks a lot,
    BP.

    Hello
    Even if you transport the roles from dev to quality and production, the analysis authorization objects will not be checked until you set the "current procedure" in RSCUSTV23.
    So there is no harm in transporting the roles and authorizations until you switch the concept to analysis authorizations.
    regards,
    Payal

  • Performance problems between dev and prod

    I run the same query with identical data and indexes, but one system takes 0.01 seconds to run while the production system takes 1.0 seconds. The TKPROF output for dev is:
    Rows Row Source Operation
    1 TABLE ACCESS BY INDEX ROWID VAP_BANDVALUE
    3 NESTED LOOPS
    1 NESTED LOOPS
    41 NESTED LOOPS
    41 NESTED LOOPS
    1 TABLE ACCESS BY INDEX ROWID VAP_PACKAGE
    1 INDEX UNIQUE SCAN SYS_C0032600 (object id 51356)
    41 TABLE ACCESS BY INDEX ROWID VAP_BANDELEMENT
    41 AND-EQUAL
    82 INDEX RANGE SCAN IDX_BE2 (object id 53559)
    41 INDEX RANGE SCAN IDX_BE1 (object id 53558)
    41 TABLE ACCESS BY INDEX ROWID VAP_BAND
    41 INDEX UNIQUE SCAN SYS_C0034599 (object id 53556)
    1 INDEX UNIQUE SCAN SYS_C0032549 (object id 51335)
    1 INDEX RANGE SCAN IDX_BV1 (object id 53557)
    The TKPROF output for Prod is:
    Rows Execution Plan
    0 SELECT STATEMENT MODE: ALL_ROWS
    1 TABLE ACCESS MODE: ANALYZED (BY INDEX ROWID) OF 'VAP_BANDVALUE' (TABLE)
    52001 NESTED LOOPS
    26000 NESTED LOOPS
    26000 NESTED LOOPS
    26000 NESTED LOOPS
    1 TABLE ACCESS MODE: ANALYZED (BY INDEX ROWID) OF 'VAP_PACKAGE' (TABLE)
    1 INDEX MODE: ANALYZED (UNIQUE SCAN) OF 'SYS_C0018725' (INDEX (UNIQUE))
    26000 TABLE ACCESS MODE: ANALYZED (BY INDEX ROWID) OF 'VAP_BANDELEMENT' (TABLE)
    26000 INDEX MODE: ANALYZED (RANGE SCAN) OF 'IDX_BE2' (INDEX)
    26000 TABLE ACCESS MODE: ANALYZED (BY INDEX ROWID) OF 'VAP_BAND' (TABLE)
    26000 INDEX MODE: ANALYZED (UNIQUE SCAN) OF 'SYS_C0030648' (INDEX (UNIQUE))
    26000 INDEX MODE: ANALYZED (UNIQUE SCAN) OF 'SYS_C0018674' (INDEX (UNIQUE))
    26000 INDEX MODE: ANALYZED (RANGE SCAN) OF 'IDX_BV1' (INDEX)
    The row count varies greatly. But it shouldn't as the data is the same.
    Any ideas?

    From DEV you show the Row Source Operations for the query. The column named "Rows" gives the actual number of rows processed in each step.
    From PROD you show the Execution Plan for the query; that is, tkprof was executed with the EXPLAIN option, which generates the execution plan as of the time when tkprof was run. The "Rows" column in the Explain Plan output comes from PLAN_TABLE.CARDINALITY, which represents the CBO's estimate of the number of rows expected to be processed in each step.
    So, if by <quote>The row count varies greatly</quote> you meant these "Rows" columns, then you are comparing actuals from one database with estimates from another. Get the Row Source Operations from both.
    "Identical data and indexes":
    1. The data may be the same, but it is not necessarily stored physically in the same way.
    2. The indexes being the same means their definitions are the same; again, physically they are not necessarily identical.
    In other words, the data in PROD (the way it is stored on disk) may have evolved as a result of discrete deletes/updates/inserts, while in DEV it could, for example, be stored more compactly if you took a copy of PROD and moved it into DEV. So the number of blocks for your segments will likely differ between PROD and DEV, the clustering factor of your indexes is likely different, etc. ... things which can [and do] influence the CBO. The statistics may be different too.
    I guess what I'm saying is ... it is quite hard, if not outright impossible, to get two identical databases/instances/loads ... hence, don't expect the executions to be 100% identical, even if you have "identical data and indexes". By all means compare DEV and PROD (make sure you compare the same thing, though) and use the observed differences as an indicator for further investigation ... just don't chase the goal of 100% identical behavior.
    Now, by all means look at that query taking 1 second in PROD ... I have only addressed <quote>The row count varies greatly. But it shouldn't as the data is the same.</quote>
