Track data exchange

Hi All,
I am working on a load generator application where I record client interactions (request and response objects) in XML-serialized form.
Now I want to replay those recordings. I am able to replay simple calls (requests) using the reflection APIs.
The problem comes when I have to use data returned by one call at a later stage in another call. So there should be some mechanism to track data exchange while recording, and to use that tracking mechanism while replaying.
I think I need some HashMap-type structure, but I'm not sure! :)

What? I don't understand the question. What is the problem?

Similar Messages

  • How to track data exchange


    Thanks Roy.
    JMeter deals with web applications, not EJB application clients.
    I have recorded EJB calls in XML-serialized form. When I went to replay those recorded scenarios, I found a very big problem. It breaks down into 3 cases:
    // Case 1:
    var x = func1.a();
    ... // some intermediate statements
    func2.b(a, b, x);
    // Case 2:
    func2.b(a, func1.a(), b);
    // Case 3:
    var x = func1.a();
    ... // some intermediate statements
    // variable x is updated here
    func2.b(a, b, x);
    During replay of those recorded calls, you can see that the values returned by the func1 call (already recorded) are used in func2 (also already recorded) in 3 different ways.
    I have to find these correlations at runtime (while replaying the recorded calls) and pass the dynamic values actually returned by func1 to func2, instead of the stale values stored in the recorded script.
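
    A common way to implement this tracking (a minimal sketch only; the class and method names below are invented for illustration, not taken from any framework) is to key every recorded return value by object identity at record time, serialize correlated arguments as symbolic references, and resolve those references against the live return values at replay time:

    import java.util.HashMap;
    import java.util.IdentityHashMap;
    import java.util.Map;

    /** Hypothetical sketch of correlation tracking for record/replay. */
    public class CorrelationRegistry {

        // Record side: maps a returned object (by identity, not equals()) to a symbolic ID.
        private final Map<Object, String> recordedReturns = new IdentityHashMap<>();
        // Replay side: maps a symbolic ID to the value the replayed call actually returned.
        private final Map<String, Object> liveValues = new HashMap<>();
        private int counter = 0;

        /** Record side: called after each recorded call to remember its return value. */
        public String register(Object returnValue) {
            String id = "ref-" + (counter++);
            recordedReturns.put(returnValue, id);
            return id;
        }

        /** Record side: called while serializing an argument. If the argument is the
            same object as an earlier return value, emit a reference marker instead
            of the literal value; null means "plain literal, serialize as-is". */
        public String referenceFor(Object argument) {
            return recordedReturns.get(argument);
        }

        /** Replay side: store the value a replayed call actually returned. */
        public void bind(String id, Object liveValue) {
            liveValues.put(id, liveValue);
        }

        /** Replay side: resolve a recorded reference to the current live value
            before invoking the next call via reflection. */
        public Object resolve(String id) {
            return liveValues.get(id);
        }
    }

    Case 2 falls out naturally, because the nested func1.a() return value passes through register() before func2 is serialized. Case 3 is the hard one: an updated value may no longer be the identical object, so a field-level diff or deep-equality fallback would be needed there.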

  • WMI High CPU Usage on Hyper-V VMs - Related to Data Exchange Integration Service

    Title pretty much says it all. Some of my VMs show high CPU, with moderate usage attributed to the WMI integration services. I have tracked it down to the Data Exchange integration service: if I deselect that service under the VM configuration, everything works normally. Has anyone seen anything like this yet?
    Thanks, TJ

    Hi,
    Could you provide more information about your environment? For example: what is the exact text of any error messages you received in connection with this problem? Which server version is the problem on? What are you trying to do when the issue occurs, and what does the system log record at that moment? Screenshots are the best information.
    More information:
    Event Logs
    http://technet.microsoft.com/en-us/library/cc722404.aspx
    If you are using Server 2008 R2 or 2008 R2 SP1, please confirm that your hardware environment is not the one described in the following hotfix:
    Performance decreases in Windows Server 2008 R2 when the Hyper-V role is installed on a computer that uses Intel Westmere or Sandy Bridge processors
    http://support.microsoft.com/kb/2517329
    Thanks.

  • I'm in cycle mode, and 'merge' is checked in preferences. However, when I record, my previous track data keeps getting overwritten. Does anyone know what I'm doing wrong? Interestingly, I can still see the data in the region.


    jamestait wrote:
    when I record, my previous track data keeps getting overwritten.
    since you didn't specify, are you recording in a single take?
    http://www.bulletsandbones.com/GB/GBFAQ.html#multipassrecording
    (Let the page FULLY load. The link to your answer is at the top of your screen)

  • Problem with data exchange for a sales order from CRM to R/3

    Dear Friends:
    I performed the data exchange for a sales order from CRM to R/3 today, and the problem's details are as follows:
    When I save a sales order in CRM (version 5.0), it automatically generates a BDoc of type BUS_TRANS_MSG, but the BDoc status is always "Sent to receivers (not all have confirmed)", and the original order in CRM cannot be changed; it reports "Document is being distributed - changes are not possible". So I checked the order status analysis in detail, and it shows two error messages: "Event 'BEFORE_CHANGE', attribute 'FINI': Error code for function module 'CRM_STATUS_BEFORE_COMPLETED_EC'" and "Item is not yet completed in OLTP system". I then checked the order in R/3: it has already been created, without any error messages.
    Could you tell me how to solve this? Thanks for any ideas.

    Hi Benjamin,
    When performing uploads to R/3 from CRM, there is a response from the OLTP system that is sent back to the CRM Middleware to confirm that the data records were received and processed correctly.
    Here is a checklist you can run through to verify that the connections, systems and objects that are needed are all in place:
    On the R/3 system:
    - Check the R/3 outbound queue (transaction SMQ1) for any entries that are not reaching CRM.
    - Check that all RFC destinations on R/3 are defined correctly and are pointing to CRM.
    - Check the CRMCONSUM table in R/3 to ensure CRM is registered as a consumer.
    - Check the CRMRFCPAR table in R/3 to ensure that order objects are valid for exchange between R/3 and CRM.
    - Check for any short dumps in R/3 (ST22/ST21).
    On CRM:
    - Are there entries stuck in the inbound queue (SMQ2) with R3AU* names?
    - What does the CRM Middleware trace show (SMWT)? Sometimes this has more detail than the specific BDoc overview (SMW01).
    - Check for short dumps in CRM (ST22).
    Let us know what else you uncover and we can work from there.
    Brad

  • Can an IDoc be used for data exchange from R/3 to CRM

    Hi All,
    First: can an IDoc be used for data exchange from R/3 to CRM?
    I need to update a few fields of an SAP CRM sales order with fields from an SAP R/3 work order.
    Can I use an IDoc to achieve this?
    Or do I update the R/3 sales order from the R/3 work order (using some interface or workflow), so that the sales order data flows from the R/3 SO to the CRM SO?
    Please respond immediately.
    Regards,
    Gopinath

    IDocs can be processed by the CRM system via the XML/XIF adapter. As this would most probably be a new interface that is not yet set up, it would be easier to change the orders in R/3 via an appropriate FM, which should automatically generate a delta-download BDoc.
    Even if they are not downloaded automatically, a request download (defined via R3AR2 / 3 / 4) should take care of this.
    Hope this helps,
    Kai

  • Need help restoring original track data in Match

    I truly hope someone can offer advice! I know there are several threads about changes made to track data while Match is syncing not sticking, but I have a slightly different problem:
    I upgraded TuneUp and, without knowing it, TuneUp started "fixing" my entire library by deleting many composers and ruining other carefully edited information.
    By the time I realized what was happening and shut it down, havoc had been wreaked on some 2,000 tracks.
    I restored the library from Time Machine, but apparently Match had had time to sync, so the wrong edits were flushed back into my library.
    I have now tried several ways of fixing this:
    Restoring the library and using one of Doug's scripts to add a space at the end of all track names to make "my version" the latest; but this process takes so long that it times out, and by then Match has again ruined my track info.
    Turning off Match, restoring the library, making changes to some track info for all tracks, and turning Match on again. Under this approach, Match seems to treat the cloud copy as the "original" even if I made changes to my local copy later.
    Restoring the library, allowing Match to start syncing so that my local library is viewed as part of the Match set, then disconnecting the internet connection, making track changes, and syncing Match again. Still, the Match changes are flushed back into my local library.
    Unless there are other suggestions, I see no way out but to leave Match turned off... Anyone with better insight than me into how Match treats changes, and which ones take priority, who could help?
    Bjorn

    Hi,
    I'm not sure this will work. I use an app (no bearing on this issue) called iVolume, which adjusts the volume of tracks. To get Match to recognise the volume change, and that the file has been updated, iVolume adds information to the comments box. My understanding is that this change allows the latest file to replace the one in the cloud.
    My thought is that, with Match turned off, you restore the library from before the problems, then add a comment to the affected tracks. Once done, turn Match back on. A long shot, but worth considering.
    Jim

  • Trying to import boujou .ma tracking data into After Effects; it seems it doesn't support the file.

    I have tried everything to get this to work: I have tried multiple settings in boujou, and I have imported the .ma file into Maya and exported with baked keyframes and only the camera selected, etc. I don't know what I'm doing wrong. I am wondering whether Adobe has cut support for .ma files altogether, which is a real pain in my *** as the built-in 3D tracker just isn't good enough. I have tried dragging the file into After Effects, and I have tried importing it; neither does anything. The .ma files don't even show up; I have to select 'all files' under the file type when I choose import to even see them, and when I click them it doesn't recognize them as tracking data; it tries to import them as a .PNG file or something. I can't seem to find any plugins that would enable me to import .ma files. I just need that tracking data as a camera in my scene, and I don't care how it is done. Additionally, I have tried exporting as a .txt file and copying and pasting that data into a camera, which doesn't work (unless I'm doing it completely wrong).
    Thank you for reading

    I also experience this problem. My mp4 clips were taken with a Samsung NV24HD camera and are HD (720p). I haven't tried to convert these clips, but I think it should not be necessary, since I can play them in QuickTime. However, I have Perian installed; perhaps that's why they play in QuickTime? The next question is then: is there a similar plugin for iMovie?

  • What is the recommended way to obtain tracking data from carriers post XSI

    We currently run an old version of SAP Business Connector. We are in the process of migrating all interfaces off BC onto PI. The one remaining interface we have problems with is the XSI (Express Delivery Interface) interface we have between ECC06 and UPS via the BC server. The interface works but is not stable, and we would like to decommission it if we can.
    I'm not 100% clear on this, but it appears that XSI is no longer the recommended solution for obtaining tracking data from carriers. What is the recommended method today? We'd be happy to use a PI or ABAP solution, but would prefer a standard solution that is supported by SAP and UPS.

    Using Time Machine is the simplest way to back up.
    debasencio wrote:
    They don't fit on a standard data DVD and when I try to back it up to my 500GB external hard drive it says its not formatted to files that size.
    How is the drive formatted?
    If it's DOS/FAT32, file sizes are limited to 4GB.
    If you are using it on a Mac only, format it Mac OS X HFS+.

  • Synchronous data exchange over JCaps without TCP/IP or WebService...

    Hi all,
    the subject may sound like a slightly crazy request, but it is what we actually need.
    Just to explain: we have an SAP R/3 system (v. 4.72) which is not able to call web services and is also not able to open a TCP/IP connection to a foreign host to exchange data.
    But what we need is a synchronous data exchange: after pressing a button in SAP, we should query some database tables of another sub-system with JCaps and send the retrieved information back to SAP.
    Do you have any ideas how this synchronous request from SAP to JCaps can be fulfilled with JCaps (our version is 5.1.3)?
    We thought about using an HTTP server on the JCaps side, where SAP just sends an HTTP request to the specified address; we could then use the data received from this call to get data from the sub-system and send it back to SAP over an RFC or something similar - that is the easier part (sending data back to SAP). The harder part, in my opinion, is to create a way for SAP to call JCaps immediately - that is, not asynchronously, which we have already implemented via a file export...
    So, is it possible to use the HTTP server from JCaps for our needs? Or is there another, easier possibility?
    Any help highly appreciated...
    Regards
    Bernhard Böhm
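
    Not a JCaps answer as such, but as a sketch of the pattern under discussion (a plain HTTP endpoint that answers a synchronous request by querying a sub-system and returning the result), here is a minimal example using the JDK's built-in com.sun.net.httpserver; the port, path and lookup logic are invented for illustration:

    import com.sun.net.httpserver.HttpServer;
    import java.io.OutputStream;
    import java.net.InetSocketAddress;
    import java.nio.charset.StandardCharsets;

    public class SyncLookupEndpoint {
        public static void main(String[] args) throws Exception {
            // Listen where the caller (SAP, in the scenario above) can reach us.
            HttpServer server = HttpServer.create(new InetSocketAddress(18001), 0);

            // The path name is hypothetical; the caller issues a plain HTTP GET against it.
            server.createContext("/lookup", exchange -> {
                // The query string carries the key, e.g. /lookup?id=4711
                String query = exchange.getRequestURI().getQuery();
                String id = (query != null && query.startsWith("id=")) ? query.substring(3) : "";

                // Stand-in for the real sub-system query (JDBC call, etc.).
                String result = lookUpInSubSystem(id);

                byte[] body = result.getBytes(StandardCharsets.UTF_8);
                exchange.sendResponseHeaders(200, body.length);
                try (OutputStream os = exchange.getResponseBody()) {
                    os.write(body);
                }
            });
            server.start();
        }

        // Hypothetical lookup; in the real system this would hit the sub-system's tables.
        private static String lookUpInSubSystem(String id) {
            return "value-for-" + id;
        }
    }

    The synchronous behaviour comes from HTTP itself: the caller blocks on its request until the handler writes the response, which is exactly the "call JCaps immediately" requirement described above.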

    Hi Chris,
    thanks for the input - we have a similar thing running already, also using our BW server (SAP ERP 6.0) as the "web service engine"...
    But now we want a solution without another server (like the BW in the case above) involved!
    So we thought about using the HTTP server on the JCaps side, which would be invoked by a simple HTTP request from SAP (also possible in 4.72).
    Now I have tried to set up a simple HTTP server project in JCaps 5.1.3, and it is driving me crazy...
    I just cannot get it to work - all I want is a simple JCD that just prints a line to the log file when started. The JCD has just a "processRequest" method from the HTTP-Server eWay. In the connectivity map I set up the connection to the HTTP server with the servlet-url-name property:
    http://localhost:18001/dpListenHTTP_servlet_HttpServerServlet (as described in the user guide).
    But when trying to build the project I get this error:
    com.stc.codegen.framework.model.CodeGenException: code generation error at = HTTP_Listen_cmListenHTTP_jcListenHTTP1 - HTTP Server e*Way Code GeneratorProblem creating war: C:\temp\dpListenHTTPprj_WS_serTestHTTP\12217262314811\WEB-INF\classes\..\dpListenHTTP_servlet_http:\localhost:18001\dpListenHTTP_servlet_HttpServerServlet.war (The filename, directory name, or volume label syntax is incorrect) (and the archive is probably corrupt but I could not delete it)
         at com.stc.codegen.frameworkImpl.model.CodeGenFrameworkImpl.process(CodeGenFrameworkImpl.java:1569)
         at com.stc.codegen.frameworkImpl.model.DeploymentVisitorImpl.process(DeploymentVisitorImpl.java:405)
         at com.stc.codegen.frameworkImpl.model.DeploymentVisitorImpl.process(DeploymentVisitorImpl.java:308)
         at com.stc.codegen.frameworkImpl.model.DeploymentVisitorImpl.traverseDeployment(DeploymentVisitorImpl.java:268)
         at com.stc.codegen.driver.module.DeploymentBuildAction.loadCodeGen(DeploymentBuildAction.java:923)
         at com.stc.codegen.driver.module.DeploymentBuildAction.access$1000(DeploymentBuildAction.java:174)
         at com.stc.codegen.driver.module.DeploymentBuildAction$1.run(DeploymentBuildAction.java:599)
         at org.openide.util.Task.run(Task.java:136)
         at org.openide.util.RequestProcessor$Processor.run(RequestProcessor.java:599)
    Caused by: Problem creating war: C:\temp\dpListenHTTPprj_WS_serTestHTTP\12217262314811\WEB-INF\classes\..\dpListenHTTP_servlet_http:\localhost:18001\dpListenHTTP_servlet_HttpServerServlet.war (The filename, directory name, or volume label syntax is incorrect) (and the archive is probably corrupt but I could not delete it)
         at org.apache.tools.ant.taskdefs.Zip.executeMain(Zip.java:509)
         at org.apache.tools.ant.taskdefs.Zip.execute(Zip.java:302)
         at com.stc.codegen.frameworkImpl.model.util.AntTasksWrapperImpl.war(AntTasksWrapperImpl.java:404)
         at com.stc.connector.codegen.httpserveradapter.HSEWCodelet.generateFiles(HSEWCodelet.java:608)
         at com.stc.codegen.frameworkImpl.model.CodeGenFrameworkImpl.processCodelets(CodeGenFrameworkImpl.java:640)
         at com.stc.codegen.frameworkImpl.model.CodeGenFrameworkImpl.process(CodeGenFrameworkImpl.java:1546)
         ... 8 more
    Caused by: java.io.FileNotFoundException: C:\temp\dpListenHTTPprj_WS_serTestHTTP\12217262314811\WEB-INF\classes\..\dpListenHTTP_servlet_http:\localhost:18001\dpListenHTTP_servlet_HttpServerServlet.war (The filename, directory name, or volume label syntax is incorrect)
         at java.io.FileOutputStream.open(Native Method)
         at java.io.FileOutputStream.<init>(FileOutputStream.java:179)
         at java.io.FileOutputStream.<init>(FileOutputStream.java:131)
         at org.apache.tools.zip.ZipOutputStream.<init>(ZipOutputStream.java:252)
         at org.apache.tools.ant.taskdefs.Zip.executeMain(Zip.java:407)
         ... 13 more
    Does anyone have any idea how to set up an HTTP server project?!
    Thanks and regards
    Bernhard Böhm

  • PeopleSoft CS SAIP Announce Status Issue in Bulk Data Exchange Status

    The XML is generated in the provided directory path under the SAIP "Web Service Targets", but "Announce Status" is blank under Bulk Data Exchange Status, and even the "Event Message Monitor" shows nothing!
    We have activated all SAIP service operations and their delivered routings on our side.
    The transaction status on the Bulk Data Exchange Status page says Announced, but "Announce Status" is blank on the same page.
    Announce Status should have one of these possible values per PeopleBooks (Connector Error, Failure, Processing, Success, Unsupported).
    What could be wrong? Please help. Thank you...
    Regards,
    Ashish

    You are welcome. I'm glad you got it back up.
    (1) You say you did the symbolic link. I will assume this is set correctly; it's very important that it is.
    (2) I don't know what you mean by "Been feeding the [email protected] for several weeks now, 700 emails each day at least." After the initial training period, SpamAssassin doesn't learn from mail it has already processed correctly. At this point, you only need to teach SpamAssassin when it is wrong. [email protected] should only be getting spam that is being passed as clean. Likewise, [email protected] should only be getting legitimate mail that is being flagged as junk. You are redirecting mail to both [email protected] and [email protected] ... right? SpamAssassin needs both.
    (3) Next, as I said before, you need to implement the "Frontline spam defense for Mac OS X Server" steps. Once you have that done and issue "postfix reload", you can look at your SMTP log in Server Admin and watch as Postfix blocks one piece of junk mail after another. It's kind of cool.
    (4) Add some SARE rules:
    Visit http://www.rulesemporium.com/rules.htm and download the following rules:
    70sareadult.cf
    70saregenlsubj0.cf
    70sareheader0.cf
    70sarehtml0.cf
    70sareobfu0.cf
    70sareoem.cf
    70sarespoof.cf
    70sarestocks.cf
    70sareunsub.cf
    72sare_redirectpost
    Visit http://www.rulesemporium.com/other-rules.htm and download the following rules:
    backhair.cf
    bogus-virus-warnings.cf
    chickenpox.cf
    weeds.cf
    Copy these rules to /etc/mail/spamassassin/
    Then stop and restart mail services.
    There are other things you can do, and you'll find differing opinions about such things. In general, I think implementing the "Frontline spam defense for Mac OS X Server" and adding the SARE rules will help a lot. Good luck!

  • Managed bean/Data exchange between two ADF Rich Faces based applications

    Hi,
    I have been trying to research what seems to be a small issue. My requirements are as follows:
    1. I need to be able to pass managed bean information from one ADF Rich Faces based application to another (in two separate EARs) at runtime (e.g. from Ear1: SenderApp/Sender.jspx -> Ear2: ReceiverApp/Receiver.jspx).
    2. I do not want to use the database, as my applications need to be performant.
    3. Serialization/de-serialization would fall pretty much under the database category. In other words, I'd like to avoid serialization/de-serialization of the managed bean.
    4. I cannot use the query string, due to security issues.
    My questions are as follows:
    1. Are there any standards/architectures/best practices for data exchange of backing beans, or other forms, between two ADF Rich Faces based apps (in two separate EARs)?
    2. Has someone found anything similar to an applicationScope that works across applications?
    I would appreciate any ideas.
    Thanks very much,
    Edited by: user11219846 on Jul 23, 2009 2:38 PM
    Edited by: user11219846 on Jul 23, 2009 2:42 PM

    Hi,
    this is not an ADF Faces problem; it is simply not possible in Java EE. You can, however, fall back to vendor-specific implementations, like in WLS. From the WebLogic documentation: http://e-docs.bea.com/wls/docs103/webapp/sessions.html
    Enabling Web applications to share the same session
    By default, Web applications do not share the same session. If you would like Web applications to share the same session, you can configure the session descriptor at the application level in the weblogic-application.xml deployment descriptor. To enable Web applications to share the same session, set the sharing-enabled attribute in the session descriptor to true in the weblogic-application.xml deployment descriptor. See "sharing-enabled" in session-descriptor.
    The session descriptor configuration that you specify at the application level overrides any session descriptor configuration that you specify at the Web application level for all of the Web applications in the application. If you set the sharing-enabled attribute to true at the Web application level, it will be ignored.
    All Web applications in an application are automatically started using the same session instance if you specify the session descriptor in the weblogic-application.xml deployment descriptor and set the sharing-enabled attribute to true, as in the following example:
    <?xml version="1.0" encoding="ISO-8859-1"?>
    <weblogic-application xmlns="http://www.bea.com/ns/weblogic/90">
      ...
      <session-descriptor>
        <persistent-store-type>memory</persistent-store-type>
        <sharing-enabled>true</sharing-enabled>
        ...
      </session-descriptor>
      ...
    </weblogic-application>
    Frank
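
    To make the quoted behaviour concrete, here is a hedged sketch (standard Servlet API; the servlet classes and the attribute name are invented for illustration) of two web modules in the same EAR exchanging a value through the shared session once sharing-enabled is true. Note that this works only within one application, which is exactly why it does not solve the original two-EAR requirement:

    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import java.io.IOException;

    // Deployed in web module A (same EAR): stores a value in the shared session.
    public class SenderServlet extends HttpServlet {
        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
            // "handoff" is a hypothetical attribute name; the bean must still be
            // serializable if the session is persisted or replicated.
            req.getSession(true).setAttribute("handoff", "value from module A");
            resp.getWriter().println("stored");
        }
    }

    // Deployed in web module B (same EAR): reads the value back from the same session.
    class ReceiverServlet extends HttpServlet {
        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
            Object value = req.getSession(true).getAttribute("handoff");
            resp.getWriter().println("received: " + value);
        }
    }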

  • Restoring Original Track Data

    I recently edited the details of a large number of tracks, and I want to restore the original data, i.e. the artist's name instead of "Various Artists", which I had changed it to. I thought that if I deleted the tracks from iTunes and re-imported them this would restore the original track data, but it doesn't seem to have worked. I also tried changing the location of the folder I was importing from to see if this would make any difference, but it seems that wherever I import these tracks from, they retain the changed information and not the original...
    Anybody able to help???

    It depends on whether or not you have iTunes organise the files too. If so, it will have renamed the files to match the new information; otherwise you can use a tagging tool to reset the tags from the file properties, such as the filename (track no./name), folder name (album) and parent folder name (artist).
    tt2
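
    As an illustration of the approach tt2 describes (a minimal sketch using only the JDK; it assumes the iTunes-style Artist/Album/NN Title.mp3 layout and leaves the actual tag writing to a tagging tool), this walks a library folder and derives the tag values from the path:

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.util.stream.Stream;

    public class TagFromPath {
        public static void main(String[] args) throws IOException {
            Path library = Paths.get(args.length > 0 ? args[0] : "Music");

            // Expect the iTunes-style layout: <library>/Artist/Album/NN Title.mp3
            try (Stream<Path> files = Files.walk(library, 3)) {
                files.filter(p -> p.toString().toLowerCase().endsWith(".mp3"))
                     .filter(p -> library.relativize(p).getNameCount() == 3)
                     .forEach(p -> {
                         String artist = p.getParent().getParent().getFileName().toString();
                         String album = p.getParent().getFileName().toString();
                         String name = p.getFileName().toString().replaceFirst("(?i)\\.mp3$", "");

                         // In iTunes-organised folders the file name starts with "NN " (track number).
                         String trackNo = name.matches("\\d+\\s.*")
                                 ? name.replaceFirst("^(\\d+)\\s.*$", "$1") : "";
                         String title = name.replaceFirst("^\\d+\\s+", "");

                         // Print what a tagging tool would write back into the file.
                         System.out.printf("%s | artist=%s album=%s track=%s title=%s%n",
                                 p, artist, album, trackNo, title);
                     });
            }
        }
    }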

  • Notifications of failed or partially failed load processes in the Data Exchange

    Hello,
    I've recently completed quite a few data integrations (to maintain coexistence) between external systems at my company and Oracle Fusion. The majority include data-out (extracts and BI reports) and data-in (via FBL from UCM).
    I'm wondering what the standard is for notifications on failed FBL loads. After an FBL succeeds over the RIDC, the most information I have is the process ID of the process loading my data into Fusion. To check whether or not it succeeded, I have to go into the Data Exchange and check the process manually in the "Load Batch Data" GUI.
    Is there a way to get emailed notifications if a process finishes with any failures? The only automated way I know of to check on statuses is to schedule the seeded Batch Load Summary HCM extract and have something on our end check for anything that has failed. But this is far from ideal when all I want is an immediate notification of failed or troubled FBL loads.
    What's the easiest/best/quickest way to be automatically notified when an FBL load is having issues?
    Thanks,
    Tor

    I am not an expert on FBL, but I think there is an ESS process involved; could you configure alerts to monitor its state and have incidents sent to the interested parties? See Monitoring Oracle Enterprise Scheduler.
    Jani Rautiainen
    Fusion Applications Developer Relations                             
    https://blogs.oracle.com/fadevrel/

  • Middleware technologies used for data exchange in Cloud for Customer system

    Hi Techies,
    I would like to know which middleware technologies, such as ALE, EDI and IDoc, web services, or any other technology, play a role in data exchange between SAP ERP and the SAP Cloud for Customer system.
    My project includes an implementation of SAP Cloud for Sales. I've read many documents and seen various videos on ERP configuration and cloud configuration, and I see that there is a standard report that we need to execute, specifying the type of data to be exchanged between Cloud for Customer and SAP ERP via the SAP NW PI system.
    When executing the report we select the IDoc type and run the report; after execution, all data is copied to the SAP Cloud for Customer system.
    What about the configuration of the IDocs? Do we need to maintain the port, partner profiles, logical system etc., as we usually do when working on an interface between SAP systems?
    Or is that maintained when we make all the communication settings between SAP ERP <-> SAP NW PI <-> SAP Cloud for Customer?
    Can anyone help me understand this better?
    Thanks,
    Gowri Shankar

    Hi Gowri,
    The standard report does exactly that.
    It generates the ports, partner profiles, RFC destinations and other objects required for the communication configuration in SAP to connect to C4C.
    If you are not using HCI, you can connect directly to C4C; otherwise you will have to manually edit the RFC destination and provide the HCI worker node URL.
    Note that this report is part of an add-on which is only available from a specific SAP ECC version onwards.
    If you are on a lower version, you will have to create them manually.
    Regards,
    Anirudh Vyas.
