Standard Role to Middleware (data exchange)

Hello
We have an SAP CRM project where the customer is very restrictive about authorization management in productive systems.
We can't assign SAP_ALL to the connection user (RFCUSER) used for data exchange, so we are trying to find a standard PFCG role with all the authorizations necessary in SAP CRM and SAP ECC to exchange master data and ERP customizing.
I'm worried because if I use an authorization trace (ST01) to identify the required authorization objects, I could end up building an incomplete role.
Does anybody have any information about this issue? Do you know a standard role that fulfills this security requirement?
Thanks in advance,
Pilar.

I see too many variables here to solve this with standard roles.
You could analyze the users who already have roles in both systems covering the objects that will be integrated and assign those roles to the RFC users... I don't know, just an idea.
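A minimal sketch of that idea in ABAP (the report name Z_LIST_AUTH_OBJECTS is made up; AGR_USERS and AGR_1251 are the standard PFCG tables for role assignments and role authorization data): it lists, for a reference user, the assigned roles and the authorization objects they contain, so they can be compared with what the RFC user actually needs.

REPORT z_list_auth_objects.

" Hypothetical helper: list the roles of a reference user together
" with the authorization objects contained in those roles.
TYPES: BEGIN OF ty_obj,
         agr_name TYPE agr_users-agr_name,  " role name
         object   TYPE agr_1251-object,     " authorization object
       END OF ty_obj.

DATA: lt_objects TYPE STANDARD TABLE OF ty_obj,
      ls_object  TYPE ty_obj.

PARAMETERS p_uname TYPE agr_users-uname DEFAULT 'RFCUSER'.

" Roles assigned to the user, joined with the authorization data
" of those roles.
SELECT DISTINCT a~agr_name b~object
  FROM agr_users AS a
  INNER JOIN agr_1251 AS b ON b~agr_name = a~agr_name
  INTO TABLE lt_objects
  WHERE a~uname = p_uname.

LOOP AT lt_objects INTO ls_object.
  WRITE: / ls_object-agr_name, ls_object-object.
ENDLOOP.

The result is only a starting point: SU24 proposals and an authorization trace are still needed to see which field values (not just which objects) the RFC user requires.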

Similar Messages

  • Middleware technologies used for data exchange in Cloud for Customer system

    Hi Techies,
    I would like to know which middleware technologies (ALE, EDI/IDoc, web services, or anything else) play a role in data exchange between SAP ERP and the SAP Cloud for Customer system.
    My project includes an implementation of SAP Cloud for Sales. I've read many documents and watched various videos on ERP configuration and cloud configuration, and I see that there is a standard report we need to execute, specifying the type of data to be exchanged between Cloud for Customer and SAP ERP via the SAP NW PI system.
    When executing the report we select the IDoc type and run it, and after execution the data is copied to the SAP Cloud for Customer system.
    What about the configuration of the IDocs? Do we need to maintain the port, partner profiles, logical system, etc., as we usually do when working on an interface between two SAP systems?
    Or is that maintained when we make all the communication settings between SAP ERP <-> SAP NW PI <-> SAP Cloud for Customer?
    Can anyone help me understand this better?
    Thanks,
    Gowri Shankar

    Hi Gowri,
    The standard report does exactly that.
    It generates the ports, partner profiles, RFC destinations, and other objects required to configure the communication from SAP to C4C.
    If you are not using HCI you can connect directly to C4C; otherwise you will have to edit the RFC destination manually and provide the HCI worker node URL.
    Note that this report is part of an add-on that is available only as of a specific SAP ECC version.
    If you are on a lower version, you will have to create these objects manually.
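    If you do end up creating the objects manually, a quick check like the following can at least confirm afterwards that the RFC destination exists (the report and destination names are made-up examples; RFCDES is the standard table behind SM59, and ports/partner profiles are easiest to verify in WE21/WE20):

    REPORT z_check_c4c_dest.

    " Hypothetical check: does the RFC destination created for the
    " C4C (or HCI) connection exist?
    PARAMETERS p_dest TYPE rfcdes-rfcdest DEFAULT 'C4C_DESTINATION'.

    DATA lv_dest TYPE rfcdes-rfcdest.

    SELECT SINGLE rfcdest FROM rfcdes INTO lv_dest WHERE rfcdest = p_dest.
    IF sy-subrc = 0.
      WRITE: / 'RFC destination exists:', lv_dest.
    ELSE.
      WRITE: / 'RFC destination not found - create it in SM59:', p_dest.
    ENDIF.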
    regards,
    Anirudh Vyas.

  • Problem with data exchange for a sales order from CRM to R/3

    Dear Friends:
          I did the data exchange for a sales order from CRM to R/3 today; the details of the problem are as follows:
          When I save a sales order in CRM (version 5.0), it automatically generates a BDoc of type BUS_TRANS_MSG, but the BDoc status is always "Sent to receivers (not all have confirmed)" and the original order in CRM cannot be changed; it reports "Document is being distributed - changes are not possible". So I checked the order status analysis in detail, and it shows two error messages: "Event 'BEFORE_CHANGE', attribute 'FINI': Error code for function module 'CRM_STATUS_BEFORE_COMPLETED_EC'" and "Item is not yet completed in OLTP system". I then checked the order in R/3; it has already been created, without any error messages.
       Could you tell me how to solve this? Thanks for any ideas.

    Hi Benjamin,
    When performing uploads to R/3 from CRM there is a response from the OLTP system that is sent back to the CRM Middleware to confirm that the data records were received and processed correctly.
    Here is a checklist you can run through to verify that the connections, systems and objects that are needed are all in place:
    <b>On R/3 system:</b>
    - Check R/3 outbound queue (transaction SMQ1) for any entries that are not reaching CRM.
    - Check that all RFC destinations on R/3 are defined correctly and are pointing to CRM
    - Check the CRMCONSUM table in R/3 to ensure CRM is registered as a consumer.
    - Check the CRMRFCPAR table in R/3 to ensure that order objects are valid for exchange between R/3 and CRM (a quick check for both tables is sketched below).
    - Check for any short dumps in R/3 (ST22/ST21)
    <b>On CRM:</b>
    - Are there entries stuck in the inbound queue (SMQ2) with R3AU* names?
    - What does the CRM Middleware Trace show (SMWT)?  Sometimes this has more detail than the specific BDoc overview (SMW01)
    - Check for short dumps in CRM (ST22)
    Let us know what else you uncover and we can work from there.
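    The two R/3 table checks above can also be scripted. A minimal sketch (the report name is made up, and only row counts are read so that no release-specific field names are assumed):

    REPORT z_check_crm_mw_setup.

    DATA lv_count TYPE i.

    " Is CRM registered as a consumer in this R/3 system?
    SELECT COUNT( * ) FROM crmconsum INTO lv_count.
    WRITE: / 'Entries in CRMCONSUM:', lv_count.

    " Are any objects registered for exchange with CRM?
    SELECT COUNT( * ) FROM crmrfcpar INTO lv_count.
    WRITE: / 'Entries in CRMRFCPAR:', lv_count.

    Zero rows in either table means the consumer setup is incomplete; the individual entries are best reviewed in SE16 against the objects you actually exchange.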
    Brad

  • Managed bean/Data exchange between two ADF Rich Faces based applications

    Hi,
    I have been trying to research what seems to be a small issue. My requirements are as follows.
    1. I need to be able to pass managed bean information from one ADF Rich Faces based application to another (in two separate ears) at runtime (e.g. from Ear1: SenderApp/Sender.jspx -> Ear2: ReceiverApp/Receiver.jspx).
    2. I do not want to use the database as my applications need to be performant.
    3. Serialization/de-serialization would fall pretty much under the database category. In other words, I'd like to avoid serialization/de-serialization of the managed bean.
    4. I cannot use query string due to security issues.
    My question is as follows:
    1. Is there any standard/architecture/best practices for data exchange of backing beans or other forms between two ADF Rich Faces based apps (in two separate ears)?
    2. Has someone found anything similar to an applicationScope that works across applications?
    I would appreciate any ideas.
    Thanks very much,

    Hi,
    it's not an ADF Faces problem, but it is not possible in standard Java EE. You can, however, fall back to vendor-specific implementations, as in WLS. From the WebLogic documentation: http://e-docs.bea.com/wls/docs103/webapp/sessions.html
    Enabling Web applications to share the same session
    By default, Web applications do not share the same session. If you would like Web applications to share the same session, you can configure the session descriptor at the application level in the weblogic-application.xml deployment descriptor. To enable Web applications to share the same session, set the sharing-enabled attribute in the session descriptor to true in the weblogic-application.xml deployment descriptor. See “sharing-enabled” in session-descriptor.
    The session descriptor configuration that you specify at the application level overrides any session descriptor configuration that you specify at the Web application level for all of the Web applications in the application. If you set the sharing-enabled attribute to true at the Web application level, it will be ignored.
    All Web applications in an application are automatically started using the same session instance if you specify the session descriptor in the weblogic-application.xml deployment descriptor and set the sharing-enabled attribute to true as in the following example:
    +<?xml version="1.0" encoding="ISO-8859-1"?>+
    +<weblogic-application xmlns="http://www.bea.com/ns/weblogic/90";;>+
    +...+
    <session-descriptor>     
    +<persistent-store-type>memory</persistent-store-type>+
    +<sharing-enabled>true</sharing-enabled>+
    +...+
    +</session-descriptor>+
    +...+
    +</weblogic-application>+
    Frank

  • Break sap standard role into two sub roles

    Hi,
    I have one SAP standard role and I want to break it into two sub-roles. How should I do it?
    Please suggest.
    Regards,
    Ramesh

    Hi Ramesh,
    When you say that you want to split the SAP Standard role into two roles:
    1.Do you mean to say that you want to split the transactions and authorization data of the SAP Standard role into two separate Z* or Y* roles?
    2.Do you want to copy the SAP Standard role into two different Z* or Y* roles and then modify the authorization data according to your company's requirements?
    In the above two scenarios you must copy the SAP Standard role into Z* or Y* roles in PFCG transaction with the appropriate naming convention and make necessary changes in both the transaction data and the authorization data.
    Please clarify which SAP standard role you want to split, and I can provide more details.
    Hope this helps.
    Regards,
    Kiran Kandepalli

  • Notifications of failed or partially failed load processes in the Data Exchange

    Hello,
    I've recently completed quite a few data integrations (to maintain coexistence) between external systems at my company and Oracle Fusion. The majority include data-out (Extracts and BI Reports), and data-in (via FBL from UCM).
    I'm wondering what the standard is for notifications on failed FBL loads. After an FBL succeeds with the RIDC, the most information I know is the process ID of the process loading my data into Fusion. In order to check to see if it succeeded or not, I have to go into the Data Exchange and check the process manually in the "Load Batch Data" GUI.
    Is there a way to get emailed notifications if a process finishes with any failures? The only automated way I know of to check on statuses is to schedule the seeded Batch Load Summary HCM extract and have something on our end check for anything that has failed. But this is far from ideal when all I want is an immediate notification of failed or troubled FBL loads.
    What's the easiest/best/quickest way to be automatically notified when an FBL load is having issues?
    Thanks,
    Tor

    I am not an expert on FBL, but I think there is an ESS process involved. Could you configure alerts to monitor its state and have incidents sent to the interested parties? See Monitoring Oracle Enterprise Scheduler.
    Jani Rautiainen
    Fusion Applications Developer Relations                             
    https://blogs.oracle.com/fadevrel/

  • BI & Oracle XI data exchange using web services (in both directions)

    Hello,
    Would you be so kind and give me suggestions.
    We have:
    <b>BI 7 server – Oracle Exchange Infrastructure – Oracle DB server</b>
    Our client wants to implement this data exchange solution:
    <u>Scenario A: I have to load data into BI from the Oracle DB</u>
    My steps are: from <i>BI</i> I have to call a web service on <i>Oracle Exchange Infrastructure</i>, then <i>Oracle Exchange Infrastructure</i> calls a web service on the <i>Oracle DB</i>, and the <i>Oracle DB</i> returns the data set to <i>BI</i> via <i>Oracle Exchange Infrastructure</i>.
    How do I schedule a job for calling the web services on <i>Oracle Exchange Infrastructure</i>? What do I have to configure in <i>BI</i>?
    <u>Scenario B: An application based on the Oracle DB wants to get data from BI</u>
    The steps are: the <i>Oracle DB</i> calls a web service on <i>Oracle Exchange Infrastructure</i>, then <i>Oracle Exchange Infrastructure</i> calls a web service on <i>BI</i>, and <i>BI</i> sends the data to the <i>Oracle DB</i> via <i>Oracle Exchange Infrastructure</i>.
    <i>BI</i> offers the <i>Open Hub Service</i> for data distribution from <i>BI</i>, but I didn't find a description of how to distribute data using web services.
    Is it possible to implement Scenario A and Scenario B in BI with standard tools?
    Could you give me detailed answers (step by step, what I have to do)?
    Thanks in advance.
    Best Regards,
    Arunas Stonys

    Arunas,
    Quite an interesting landscape....
    Also what do you mean by standard tools ?
    Option A :
    You can use the XML DataSource for this. Once the XML DataSource is called, the data enters the delta queue in the BI server, and from there you can use the normal InfoPackage / real-time daemon to load the data into your cubes / DSOs. The XML DataSource works on SOAP, and this has to be supported by the Oracle XI.
    Option B:
    Slightly trickier, since you are hitting the BI server directly...
    I am not sure whether an InfoSpoke can be exposed as a web service, but some of the ways this could be done are:
    a. Have a function module which acts as a web service and have that FM return the data
    b. Have SAP XI in between to do the same
    Also, regarding the landscape: depending on the nature of the data loads / data requests, if BI-to-Oracle traffic dominates you can look at having SAP XI there instead...
    Arun
    Hope it helps....
    P.S. I would also suggest that you post the same question in the Enterprise SOA / enterprise web services forums, where people like Karthik Iyengar, Durairaj, etc. can respond much better than I am able to right now...

  • WMI High CPU Usage on Hyper-V VMs - Related to Data Exchange Integration Service

    Title pretty much says it all.  Some of my VMs have high CPU and moderate usage going to the WMI Integration Service.  I have tracked it down to the Data Exchange Integration Service.  If I de-select the service under the VM configuration,
    everything works normally.  Has anyone seen anything like this yet?
    Thanks, TJ

    Hi,
    Could you provide more information about your environment? For example: What is the exact text of any error messages associated with this problem? Which server version shows the problem? What are you trying to do when the issue occurs, and what does the system log record at that moment? Screenshots are the best information.
    More information:
    Event Logs
    http://technet.microsoft.com/en-us/library/cc722404.aspx
    If you are using Server 2008 R2 or 2008 R2 SP1, please confirm that your hardware is not the same as described in the following hotfix:
    Performance decreases in Windows Server 2008 R2 when the Hyper-V role is installed on a computer that uses Intel Westmere or Sandy Bridge processors
    http://support.microsoft.com/kb/2517329
    Thanks.

  • Can an IDoc be used for data exchange from R/3 to CRM?

    Hi All,
    First, can an IDoc be used for data exchange from R/3 to CRM?
    I need to update a few fields of an SAP CRM sales order with fields from an SAP R/3 work order.
    Can I use an IDoc to achieve this?
    Or should I update the R/3 sales order from the R/3 work order (using some interface or workflow), so that the sales order data flows from the R/3 SO to the CRM SO?
    Please respond immediately.
    Regards,
    Gopinath

    IDocs can be processed by the CRM system via the XML/XIF adapter. As this would most probably be a new interface that is not yet set up, it would be easier to change the orders in R/3 via an appropriate FM, which should automatically generate a delta download BDoc.
    Even if they are not downloaded automatically, a request download (defined via R3AR2/3/4) should take care of it.
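    A minimal sketch of that approach on the R/3 side (the report name, order number, and the header field being changed - the customer PO number - are examples only; in a real interface you would map whatever work-order fields you need):

    REPORT z_update_r3_order.

    " Hypothetical example: change one header field of an R/3 sales
    " order so that the normal delta download BDoc to CRM is triggered.
    DATA: ls_header  TYPE bapisdh1,
          ls_headerx TYPE bapisdh1x,
          lt_return  TYPE STANDARD TABLE OF bapiret2,
          ls_return  TYPE bapiret2.

    ls_header-purch_no_c  = 'WO-4711'.       " example value from the work order
    ls_headerx-updateflag = 'U'.
    ls_headerx-purch_no_c = 'X'.

    CALL FUNCTION 'BAPI_SALESORDER_CHANGE'
      EXPORTING
        salesdocument    = '0000012345'      " example order number
        order_header_in  = ls_header
        order_header_inx = ls_headerx
      TABLES
        return           = lt_return.

    CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
      EXPORTING
        wait = 'X'.

    LOOP AT lt_return INTO ls_return WHERE type = 'E' OR type = 'A'.
      WRITE: / ls_return-type, ls_return-message.
    ENDLOOP.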
    Hope this helps,
    Kai

  • Standard tcode for (mass) data change of internal orders or ... ??

    Hi!
    I really need some information on whether SAP by any chance has a standard transaction for mass data changes of internal orders (more particularly, the distribution rules in the settlement rule section, which can be found in transaction <b>KO02</b>).
    I am trying to change the distribution rules for settlement receivers in the <b>settlement rule section</b>, that is, close out past distribution rules by filling the TO PERIOD and TO FISCAL YEAR fields on the right of each rule, and then enter new rules (which I get from an external source: flat file, MS Excel, CSV...).
    If I wanted to import the data into SAP, I guess I would have to develop a batch input, but that would take me some time because it is pretty complicated.
    I found tcode KO08, but I do not really know how to use it. Maybe there is another tcode I am not aware of?
    I would appreciate any suggestions!
    Thnx, UK

    Hi Srilakshimi,
    If you are familiar with the MASS transaction, you can modify the User Responsible field for internal orders from transaction KOK2.
    As a first step you must create a selection variant to define which orders you want to modify. Once the selection variant is created, execute the transaction with it and you'll get a screen similar to the MASS transaction. Select the field you want and mass-replace it. Do not forget to save.
    Best Regards!
    Mgitur
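    If the settlement-rule changes do end up requiring a batch input on KO02, a bare skeleton might look like the sketch below. The program name, screen number, and field name are placeholders that must be taken from an SHDB recording of KO02; only the BDCDATA mechanics are shown.

    REPORT z_ko02_settlement_bdc.

    DATA: lt_bdcdata TYPE STANDARD TABLE OF bdcdata,
          ls_bdcdata TYPE bdcdata,
          lt_msg     TYPE STANDARD TABLE OF bdcmsgcoll.

    " First screen of KO02: enter the order number.
    CLEAR ls_bdcdata.
    ls_bdcdata-program  = 'SAPMKAUF'.    " placeholder - take from the SHDB recording
    ls_bdcdata-dynpro   = '0110'.        " placeholder screen number
    ls_bdcdata-dynbegin = 'X'.
    APPEND ls_bdcdata TO lt_bdcdata.

    CLEAR ls_bdcdata.
    ls_bdcdata-fnam = 'COAS-AUFNR'.      " placeholder field name
    ls_bdcdata-fval = '100000123456'.    " example order number
    APPEND ls_bdcdata TO lt_bdcdata.

    " ... further screens from the recording: navigate to the settlement
    " rule, fill TO PERIOD / TO FISCAL YEAR on the old rules, and add
    " the new rules taken from the flat file ...

    CALL TRANSACTION 'KO02' USING lt_bdcdata
                            MODE 'N'
                            UPDATE 'S'
                            MESSAGES INTO lt_msg.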

  • Synchronous data exchange over JCaps without TCP/IP or WebService...

    Hi all,
    the subject may sound like a little crazy request, but that is what we actually need.
    Just to explain: we have an SAP R/3 system (v. 4.72) which is not able to call web services and is also not able to open a TCP/IP connection to a foreign host to exchange data.
    But what we need is a synchronous data exchange: after pressing a button in SAP, we should query some database tables of another sub-system with JCaps and send the retrieved information back to SAP.
    Do you have any ideas how this synchronous request from SAP to JCaps can be fulfilled with JCaps (our version is 5.1.3)?
    We thought about using an HTTP server on the JCaps side: SAP just sends an HTTP request to the specified address, and we could then use the data received from this call to read the sub-system and send the result back to SAP over an RFC or something similar - that is the easier part (sending data back to SAP). The harder part, in my opinion, is to create a possibility for SAP to call JCaps immediately - i.e. not asynchronously, which we have already implemented via a file export...
    So, is it possible to use the HTTP server from JCaps for our needs? Or is there another, easier possibility?
    Any help highly appreciated...
    Regards
    Bernhard Böhm

    Hi Chris,
    thanks for the input - we also have a similar thing running, also using our BW-Server (SAP ERP 6.0) as the "web service engine"....
    But now we want a solution without another server (like the BW in the case above) involved!
    So we thought about using an HTTP server on the JCaps side, which would be invoked by a simple HTTP request from SAP (also possible in 4.72).
    Now I have tried to set up a simple HTTP server project in JCaps 5.1.3 and it is driving me crazy...
    I just cannot get it to work - all I want is a simple JCD that prints a line to the log file when started. The JCD has just a "processRequest" method from the HTTPS Server eWay. In the connectivity map I set up the connection to the HTTP server with the servlet-url-name property:
    http://localhost:18001/dpListenHTTP_servlet_HttpServerServlet (like described in the userGuide).
    But when trying to build the project I get this error:
    com.stc.codegen.framework.model.CodeGenException: code generation error at = HTTP_Listen_cmListenHTTP_jcListenHTTP1 - HTTP Server e*Way Code GeneratorProblem creating war: C:\temp\dpListenHTTPprj_WS_serTestHTTP\12217262314811\WEB-INF\classes\..\dpListenHTTP_servlet_http:\localhost:18001\dpListenHTTP_servlet_HttpServerServlet.war (The filename, directory name, or volume label syntax is incorrect) (and the archive is probably corrupt but I could not delete it)
         at com.stc.codegen.frameworkImpl.model.CodeGenFrameworkImpl.process(CodeGenFrameworkImpl.java:1569)
         at com.stc.codegen.frameworkImpl.model.DeploymentVisitorImpl.process(DeploymentVisitorImpl.java:405)
         at com.stc.codegen.frameworkImpl.model.DeploymentVisitorImpl.process(DeploymentVisitorImpl.java:308)
         at com.stc.codegen.frameworkImpl.model.DeploymentVisitorImpl.traverseDeployment(DeploymentVisitorImpl.java:268)
         at com.stc.codegen.driver.module.DeploymentBuildAction.loadCodeGen(DeploymentBuildAction.java:923)
         at com.stc.codegen.driver.module.DeploymentBuildAction.access$1000(DeploymentBuildAction.java:174)
         at com.stc.codegen.driver.module.DeploymentBuildAction$1.run(DeploymentBuildAction.java:599)
         at org.openide.util.Task.run(Task.java:136)
         at org.openide.util.RequestProcessor$Processor.run(RequestProcessor.java:599)
    Caused by: Problem creating war: C:\temp\dpListenHTTPprj_WS_serTestHTTP\12217262314811\WEB-INF\classes\..\dpListenHTTP_servlet_http:\localhost:18001\dpListenHTTP_servlet_HttpServerServlet.war (The filename, directory name, or volume label syntax is incorrect) (and the archive is probably corrupt but I could not delete it)
         at org.apache.tools.ant.taskdefs.Zip.executeMain(Zip.java:509)
         at org.apache.tools.ant.taskdefs.Zip.execute(Zip.java:302)
         at com.stc.codegen.frameworkImpl.model.util.AntTasksWrapperImpl.war(AntTasksWrapperImpl.java:404)
         at com.stc.connector.codegen.httpserveradapter.HSEWCodelet.generateFiles(HSEWCodelet.java:608)
         at com.stc.codegen.frameworkImpl.model.CodeGenFrameworkImpl.processCodelets(CodeGenFrameworkImpl.java:640)
         at com.stc.codegen.frameworkImpl.model.CodeGenFrameworkImpl.process(CodeGenFrameworkImpl.java:1546)
         ... 8 more
    Caused by: java.io.FileNotFoundException: C:\temp\dpListenHTTPprj_WS_serTestHTTP\12217262314811\WEB-INF\classes\..\dpListenHTTP_servlet_http:\localhost:18001\dpListenHTTP_servlet_HttpServerServlet.war (The filename, directory name, or volume label syntax is incorrect)
         at java.io.FileOutputStream.open(Native Method)
         at java.io.FileOutputStream.<init>(FileOutputStream.java:179)
         at java.io.FileOutputStream.<init>(FileOutputStream.java:131)
         at org.apache.tools.zip.ZipOutputStream.<init>(ZipOutputStream.java:252)
         at org.apache.tools.ant.taskdefs.Zip.executeMain(Zip.java:407)
          ... 13 more
    Does anyone have any idea how to set up an HTTP server project?
    Thanks and regards
    Bernhard Böhm
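    On the SAP side of this scenario, the synchronous HTTP call itself is straightforward even on WAS 6.20 / R/3 4.72 using CL_HTTP_CLIENT. A minimal sketch (the report name is made up, the URL is the servlet address from the post, and error handling is reduced to sy-subrc checks):

    REPORT z_call_jcaps_http.

    DATA: lo_client   TYPE REF TO if_http_client,
          lo_response TYPE REF TO if_http_response,
          lv_result   TYPE string.

    " Open an HTTP connection to the JCaps HTTP Server eWay.
    CALL METHOD cl_http_client=>create_by_url
      EXPORTING
        url    = 'http://localhost:18001/dpListenHTTP_servlet_HttpServerServlet'
      IMPORTING
        client = lo_client
      EXCEPTIONS
        OTHERS = 1.
    IF sy-subrc <> 0.
      WRITE: / 'Could not create the HTTP client.'.
      RETURN.
    ENDIF.

    " Send the request and wait synchronously for the response.
    CALL METHOD lo_client->send EXCEPTIONS OTHERS = 1.
    IF sy-subrc = 0.
      CALL METHOD lo_client->receive EXCEPTIONS OTHERS = 1.
    ENDIF.
    IF sy-subrc <> 0.
      WRITE: / 'HTTP communication failed.'.
      RETURN.
    ENDIF.

    " Read the response body returned by JCaps.
    lo_response = lo_client->response.
    lv_result   = lo_response->get_cdata( ).
    WRITE: / lv_result.

    CALL METHOD lo_client->close.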

  • PeopleSoft CS SAIP Announce Status Issue in Bulk Data Exchange Status

    The XML is generated in the provided directory path under SAIP "Web Service Targets", but "Announce Status" is blank under Bulk Data Exchange Status, and even the "Event Message Monitor" shows nothing!
    We have activated all SAIP service operations and their delivered routings on our side.
    The transaction status on the Bulk Data Exchange Status page says Announced, but "Announce Status" is blank on the same page.
    Per PeopleBooks, Announce Status should have one of these possible values: Connector Error, Failure, Processing, Success, Unsupported.
    What could be wrong? Please help. Thank You...
    Regards,
    Ashish

    You are welcome. I'm glad you got it back up.
    (1) You say you did the symbolic link. I will assume this is set correctly; it's very important that it is.
    (2) I don't know what you mean by "Been feeding the [email protected] for several weeks now, 700 emails each day at least." After the initial training period, SpamAssassin doesn't learn from mail it has already processed correctly. At this point, you only need to teach SpamAssassin when it is wrong. [email protected] should only be getting spam that is being passed as clean. Likewise, [email protected] should only be getting legitimate mail that is being flagged as junk. You are redirecting mail to both [email protected] and [email protected] ... right? SpamAssassin needs both.
    (3) Next, as I said before, you need to implement those "Frontline spam defense for Mac OS X Server." Once you have that done and issue "postfix reload" you can look at your SMTP log in Server Admin and watch as Postfix blocks one piece of junk mail after another. It's kind of cool.
    (4) Add some SARE rules:
    Visit http://www.rulesemporium.com/rules.htm and download the following rules:
    70sareadult.cf
    70saregenlsubj0.cf
    70sareheader0.cf
    70sarehtml0.cf
    70sareobfu0.cf
    70sareoem.cf
    70sarespoof.cf
    70sarestocks.cf
    70sareunsub.cf
    72sare_redirectpost
    Visit http://www.rulesemporium.com/other-rules.htm and download the following rules:
    backhair.cf
    bogus-virus-warnings.cf
    chickenpox.cf
    weeds.cf
    Copy these rules to /etc/mail/spamassassin/
    Then stop and restart mail services.
    There are other things you can do, and you'll find differing opinions about such things. In general, I think implementing the "Frontline spam defense for Mac OS X Server" and adding the SARE rules will help a lot. Good luck!

  • Standard roles in BW (SU01) to use BPC 10.1?

    Hi experts,
    I'm currently working on BPC 10.1 and I'm trying to figure out which roles are necessary in transaction SU01 to use BPC 10.1. The 10.0 security guide specified two standard roles needed for the users, /POA/BUI_FLEX_CLIENT and /POA/BUI_UM_USER. However, the 10.1 security guide does not say anything about standard roles, so I'm wondering whether it's the same roles as in 10.0...
    Please advise
    Best regards,
    Lars

    Hi Shekar,
    The solution provided in Note 2068917 can resolve the issue. It worked for me.
    Just create an ABAP program with the following code and execute it.
    Provide the appropriate 'username' and '_AppsetName' as per your requirement.
    DELETE FROM RSBPC_WEB_UP WHERE user_id = 'username' AND NAME = 'colSeq' AND CATEGORY = 'members_AppsetName'.
    WRITE sy-subrc.

  • Standard Role "SAP_QAP_BW_DASHBOARDS" Unavailable in ECC System

    Hello Experts.
    We are trying to implement embedded analytics in our R/3 system. From the SAP documentation I learned that, as a prerequisite for executing the standard reports/dashboards, the role "SAP_QAP_BW_DASHBOARDS" needs to be assigned to the user, but we are unable to find this standard role in our system.
    We have already raised an OSS message, but it has not been of much help.
    we are at SAP ECC 6.0 EHP 6.
    Any help/comment would be appreciated.
    Thanks & Regards
    Neeraj.

    We also got a reply from SAP saying
    "The role SAP_QAP_BW_DASHBOARDS was never delivered (and it was not
    planned to be delivered, too). This role was just meant for SAP
    internal usage during dashboard development. Thus, the documentation
    which states that you should assign this role to your users for
    dashboard usage is incorrect."
    They suggested that we create our own roles based on our business needs; as an example of what such roles should look like, they created note 2001264, which clearly describes how to create the role.
    Hope this helps anyone else having the same problem.
    Thanks & Regards
    Neeraj.

  • How to track data exchange

    Hi All,
    I am working on a load generator application where I record client interactions (request and response objects) in XML-serialized form.
    Now I want to replay that recorded data. I am able to replay simple calls (requests) using the reflection APIs.
    But the problem comes when I have to use data returned by one call at a later stage in another call. So there should be some mechanism to track the data exchange while recording and to use that tracking while replaying.
    I think maybe I need some HashMap-type structure, but I'm not sure. :)

    Thanks, Roy.
    JMeter deals with web applications but not EJB application clients.
    I have recorded EJB calls in XML-serialized form. When I went to replay those recorded scenarios, I found a very big problem. The problem is divided into three cases:
    /* Case 1 */
    Var x = func1.a();
    ... // some intermediate statements
    func2.b(a, b, x);
    /* Case 2 */
    func2.b(a, func1.a(), b);
    /* Case 3 */
    Var x = func1.a();
    ... // some intermediate statements
    // variable x is updated here
    func2.b(a, b, x);
    During replay of those recorded calls, you can see that the values returned by the func1 call (which was already recorded) are used by func2 (also already recorded) in three different ways.
    I have to detect these correlations at runtime (while replaying the recorded calls) and pass the dynamic values returned by func1 to func2 instead of the values in the recorded script.
