[Ann] FirstACT 2.2 released for SOAP performance testing

Empirix Releases FirstACT 2.2 for Performance Testing of SOAP-based Web Services
FirstACT 2.2 is available for free evaluation immediately at http://www.empirix.com/TryFirstACT
Waltham, MA -- June 5, 2002 -- Empirix Inc., the leading provider of test and monitoring
solutions for Web, voice and network applications, today announced FirstACT™ 2.2,
the fifth release of the industry's first and most comprehensive automated performance
testing tool for Web Services.
As enterprise organizations begin to adopt Web Services, the types of Web
Services being developed and their testing needs are in a state of change. As a major
software testing solution vendor, Empirix is committed to ensuring that organizations
developing enterprise software with Web Services can continue to verify the performance
of their applications as quickly and cost-effectively as possible, regardless of the
architecture they are built upon.
Working with organizations developing Web Services, we have observed several emerging
trends. First, organizations are tending to develop Web Services that transfer a
sizable amount of data within each transaction by passing in user-defined XML data
types as part of the SOAP request. As a result, they require a solution that automatically
generates SOAP requests using XML data types and allows them to be quickly customized.
Second, organizations require highly scalable test solutions. Many organizations
are using Web Services to exchange information between business partners and have
Service Level Agreements (SLAs) in place specifying guaranteed performance metrics.
Organizations need to performance test to these SLAs to avoid financial and business
penalties. Finally, many organizations just beginning to use automated testing tools
for Web Services have already made significant investments in making SOAP scripts
by hand. They would like to import SOAP requests into an automated testing tool
for regression testing.
Empirix FirstACT 2.2 addresses these emerging trends in Web Services
testing by offering the following new functionality:
1. Automatic and customizable test script generation for XML data types – FirstACT
2.2 will generate complete test scripts and allow the user to graphically customize
test data without requiring programming. FirstACT now includes a simple-to-use XML
editor for data entry or more advanced SOAP request customization.
2. Scalability Guarantee – FirstACT 2.2 has been designed to be highly scalable to
performance test Web Services. Customers using FirstACT today regularly simulate
from several hundred to several thousand users. Empirix guarantees support for
performance testing the number of users an organization needs to test to meet its
business needs.
3. Importing Existing Test Scripts – FirstACT 2.2 can now import existing SOAP requests
directly into the tool on a user-by-user basis. As a result, some simulated users
can use imported SOAP requests, while others can be generated automatically by FirstACT.
Web Services facilitate the easy exchange of business-critical data and information
across heterogeneous network systems. Gartner estimates that 75% of all businesses
with more than $100 million in sales will have begun to develop Web Services applications
or will have deployed a production system using Web Services technology by the end
of 2002. As part of this move to Web Services, "vendors are moving forward with
the technology and architecture elements underlying a Web Services application model,"
Gartner reports. While this model holds exciting potential, the added protocol layers
necessary to implement it can have a serious impact on application performance, causing
delays in development and in the retrieval of information for end users.
"Today Web Services play an increasingly prominent but changing role in the success
of enterprise software projects, but they can only deliver on their promise if they
perform reliably," said Steven Kolak, FirstACT product manager at Empirix. "With
its graphical user interface and extensive test-case generation capability, FirstACT
is the first Web Services testing tool that can be used by software developers or
QA test engineers. FirstACT tests the performance and functionality of Web Services
whether they are built upon J2EE, .NET, or other technologies. FirstACT 2.2 provides
the most comprehensive Web Services testing solution that meets or exceeds the changing
demands of organizations testing Web Services for performance, functionality, and
demands of organizations testing Web Services for performance, functionality, and
functionality under load."
Learn more:
Read about Empirix FirstACT at http://www.empirix.com/FirstACT. FirstACT 2.2 is
available for free evaluation immediately at http://www.empirix.com/TryFirstACT.
Pricing starts at $4,995. For additional information, call (781) 993-8500.

Simon,
I will admit, I almost never use SQL Developer. I have been a long-time Toad user, but for this tool, I fumbled around a bit and got everything up and running quickly.
That said, I tried the new GeoRaptor tool using this tutorial (which I think is close enough to get the gist): http://sourceforge.net/apps/mediawiki/georaptor/index.php?title=A_Gentle_Introduction:_Create_Table,_Metadata_Registration,_Indexing_and_Mapping
As I stumble around it, I'll try to leave some feedback, and probably ask some rather stupid questions.
Thanks for the effort,
Bryan

Similar Messages

  • A quick counting loop for a performance test

    Hi all. I'm writing this small performance test using the add method of an arraylist. How would I wrap a for (or while) loop around this code so that it runs 1000 times, averages the total times of all the tests, and then prints the average out? Thanks.
    startTime = System.nanoTime();
    for (int i = 0; i < last.length; i++) {
        arrayListLast.add(last[i]);
    }
    stopTime = System.nanoTime();
    totalTime = stopTime - startTime;

    What part are you having trouble with?
    Do you not know how to repeat something a certain number of times?
    Do you not know how to compute an average?
    Do you not know how to print something out?
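    A minimal standalone version of the harness the question describes might look like the following; the names `last` and `arrayListLast` are carried over from the snippet, while the 10,000-element test data and the 1000-run count are made-up stand-ins:

```java
import java.util.ArrayList;
import java.util.Arrays;

public class AddBenchmark {

    // Time a single run of the add loop from the original snippet.
    static long timeOneRun(String[] last) {
        ArrayList<String> arrayListLast = new ArrayList<>();
        long startTime = System.nanoTime();
        for (int i = 0; i < last.length; i++) {
            arrayListLast.add(last[i]);
        }
        return System.nanoTime() - startTime;
    }

    public static void main(String[] args) {
        // Stand-in test data; the real `last` array comes from elsewhere.
        String[] last = new String[10_000];
        Arrays.fill(last, "x");

        int runs = 1000;
        long totalTime = 0;
        for (int run = 0; run < runs; run++) {
            totalTime += timeOneRun(last);
        }
        System.out.println("average ns per run: " + (totalTime / runs));
    }
}
```

    Averaging over many runs smooths out JIT and GC noise; for serious measurements a warm-up phase before timing helps as well.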

  • Looking for a Performance testing tool

    We want to stress test our multiserver Adobe Connect before putting into production.  HP tool was not successful.  Any Recommendations on tool and method?

    Cameron,
    Thanks for your input! Sounds like you are probably using these
    products.
    I saw the PerformAssure demo. Their GUI is very slick!! PerformAssure
    does some cool stuff like comparing EJB metrics with system metrics,
    and it was able to show me the EJB method that was causing a problem
    with a petstore application.
    I heard about this new tool from a company called Performant
    (http://www.performant.com). I am signed up for a demo with these
    folks. Will let you all know what I find.
    I looked at the Wily site and didn't pursue them, as their tool seems to be
    aimed at deployment in the production environment. Our application is not live
    yet, but we want to be able to make sure it can handle a good load ;)
    Thanks!
    --N
    "Cameron Purdy" <[email protected]> wrote in message news:<[email protected]>...
    I haven't had a chance yet to look at Sitraka's software, but Precise
    Indepth/J2EE and Wily Introscope are both pretty popular with Weblogic, and
    I think they both do a good job of what they do.
    Peace,
    Cameron Purdy
    Tangosol, Inc.
    http://www.tangosol.com/coherence.jsp
    Tangosol Coherence: Clustered Replicated Cache for Weblogic
    "Neil" <[email protected]> wrote in message
    news:[email protected]..
    Hello All!
    I need help! I am trying to find out which of the tools is better
    suited for me to be able to do performance testing of an application
    under load.
    I am looking for a tool that can provide me detailed application
    performance metrics. Seems like both Insight and Perform Assure can do
    this job for me.
    If any of you have had experience using performance tools such as
    Precise Insight or Sitraka's PerformAssure, I would appreciate any
    input you might have on your experience with these tools.
    Also if you know of any other tools that might provide similar
    information such as the above tools, that would be really helpful.
    Thanks a bunch!
    Neil.

  • Loadrunner 12 for Mobile Performance Testing

    I need to know whether HP LoadRunner or UFT supports telecom protocols such as
    MML
    SMPP
    DCC
    SFTP
    SOAP
    Can it support telecom apps:
    SMS
    MMS
    testing calls, etc.? If yes, then how?

    Hi Marc,
    Have a look at tx. R3AR2 / R3AR3 / R3AR4 to load data from CRM to the CDB (and further to the mobile clients). You should be able to schedule the creation of such data loads also as a job: e.g. if you define a request for 1 BP and start this every 3 seconds, this would be equivalent to a BP change every 3 seconds. The same is true for sales orders or activities.
    An alternative would be to use the ASCII Adapter (see help.sap.com) on this to mass-generate also BDocs.
    I am not a Windows / network guy, so I cannot say how to measure performance here. Nevertheless, you should have a look at commstation behaviour and also client behaviour when you execute such tests, especially since client data processing takes a very long time if too much data is sent to the clients.
    Hope this helps,
    Kai

  • Performance Testing - Upgrade from 4.6B to ECC6.0

    Hi,
    We are doing an upgrade from SAP 4.6B to ECC6.0. I would like to know what would be the best approach for doing a performance test in an upgrade project. More specifically,
    1. What are the main components that need to be tested for performance?
    2. What are the important transaction codes/external applications (if any) that can be used to do performance testing in both 4.6B and ECC6.0? (ST05 or ST30 or something else?)
    3. Any best practice recommended by SAP for doing performance tests?
    Thanks in Advance,
    Reena

    > We are doing an upgrade from SAP 4.6B to ECC6.0. I would like to know what would be the best approach for doing a performance test in an upgrade project. More specifically,
    >
    > 1. What are the main components that need to be tested for performance?
    Those components you use.
    > 2. What are the important transaction codes/external applications (if any) that can be used to do performance testing in both 4.6B and ECC6.0? (ST05 or ST30 or something else?)
    What is "important" for you?
    Markus

  • Performance Test - Massive data generation

    I would like to generate a massive quantity of data in SAP ERP, to be extracted by SAP BW, in an effort to create a baseline volume of 1-2 TB of data for a performance test activity in BW.
    We're investigating tools like Quest's DataFactory, which generates data directly in the Oracle tables.
    Does anyone have experience with such activities or scenarios?

    Hi,
    on the search for another tool i found this one
    http://www.turbodata.ca/help/testdatageneratoroverview.htm
    Claudius

  • Performance test planning:  timing

    Hi folks,
    I'm wondering if anyone has suggestions or input on the timing of test planning specific to performance testing. In particular, how far in advance is it recommended that performance test planning begin relative to the intended test window(s)?

    Hi Jack,
    My preference is to start planning for a performance test as soon as possible, even before there is an application to test.
    The first stage in planning should be an analysis of how the application will be used. This information can be gathered from a business plan if the application is new, or from site usage metrics from tools like Omniture or Web Trends if the application is presently deployed.
    From the business plan or metrics you can begin to work out the key transactions (most heavily used and most business critical), as well as planning how users will execute those transaction (what percentages, what think times, etc).
    Then as the test date gets closer and the application becomes available/stable/etc you can begin to flesh out the details of the plan. But the overall goals, analysis and high level planning can begin very early in the development cycle.
    CMason
    Senior Consultant - eLoadExpert
    Empirix

  • SAP Performance Testing - Manual or Automated?

    Our organization is attempting to develop a regular performance testing effort.  Everything we have read points to using a tool, such as LoadRunner, to do performance testing.  However, we're just starting and simply want to baseline several transactions, jobs, programs, etc. (fewer than 30 items).  We have tools to monitor the backend results and grab metrics, but no tools to automate the testing itself.  Does anyone do their performance testing manually?  What are some advantages to doing this?

    Hi Yogi,
    I think HP LoadRunner is one of the best tools for SAP performance testing. I did it for many years. It is now included with Solution Manager. Here is the link for HP Mercury regarding performance testing.
    https://h10078.www1.hp.com/cda/hpms/display/main/hpms_content.jsp?zn=bto&cp=1-11-126_4000_100__
    Please check this site as well, it has lot of valuable information.
    http://www.wilsonmar.com/1loadrun.htm
    Regards, Nabi.

  • MSI Z97 GAMING 3 Review--Performance Testing

    After the previous hardware and software introduction, I believe the Z97 GAMING 3 will meet gamers' expectations.
    The Z97 GAMING 3 is integrated with Killer E2200 LAN, Audio Boost 2, an M.2 interface and the normal array of connections;
    it is truly a good gaming motherboard. Can all these features offer great performance and a good experience?
    Today I will test the performance of the Z97 GAMING 3 and see how good it is.
    MSI Z97 GAMING 3 Testing
    My test platform is an MSI Z97 GAMING 3, an Intel® Core i7-4770K and an MSI GeForce GTX 750 graphics card. The test
    consists of two parts:
    CPU Performance: Super PI, PCMark Vantage and Cinebench R11.5.
    GAMING Performance: 3DMark 11, Resident Evil 6 Benchmark and FFXIV Benchmark.
    Test Part 1
    CPU : Intel Core i7-4770K @ 3.5 GHz
    CPU Cooler : Thermaltake TT-8085A
    Motherboard : MSI Z97 GAMING 3
    RAM : Corsair DDR 3-1600 4GB X 2
    PSU : Cooler Master 350W
    OS : Windows 7 64 bit
    Basic performance testing (CPU setting by default)
    CPU Mark Score : 679.
    Super PI 32M Result – 8m53.897s.
    Graphics Performance Testing: 3DMark 11
    3DMark 11 is designed to measure a PC's performance. It makes extensive use of all the new features in DirectX 11,
    including Tessellation, Compute Shader and Multi-threading.
    With the Intel® HD4600 iGPU in 3DMark 11 basic mode testing, the result is a score of X385.
    The performance mode test score is P1511.
    System Performance: PCMark Vantage
    PCMark Vantage is a PC analysis and benchmarking tool consisting of a mix of application-based and
    synthetic tests that measure system performance.
    From the test results, the score of the Z97 GAMING 3 with the Intel® HD4600 iGPU is 11,946.
    MSI  GeForce GTX 750 Testing
    Test  Part 2
    CPU : Intel Core i7-4770K @ 3.5 GHz
    CPU Cooler : Thermaltake TT-8085A
    Motherboard : MSI Z97 GAMING 3
    Graphics Card:MSI GeForce GTX 750
    RAM : Corsair DDR 3-1600 4GB X 2
    PSU : Cooler Master 350W
    OS : Windows 7 64 bit
    Graphics Performance Testing: 3DMark 11
    With the GeForce GTX 750, the Z97 GAMING 3 scores X1653 in 3DMark 11 basic test mode; the performance
    mode test score is P5078.
    System Performance: PCMark Vantage
    From the test results, the Z97 GAMING 3 with the GeForce GTX 750 scores 11,518.
    System Performance: Cinebench R11.5
    Cinebench is benchmarking software developed by MAXON, the maker of Cinema 4D. Cinebench can test CPU and GPU performance with
    different processes at the same time. For the CPU part, Cinebench tests CPU performance by rendering a photorealistic 3D
    scene. For the GPU part, Cinebench tests GPU performance based on OpenGL capability.
    Main Processor Performance (CPU) - The test scenario uses all of your system's processing power to render a photorealistic
    3D scene. Graphics Card Performance (OpenGL) - This procedure uses a complex 3D scene depicting a car chase, which
    measures the performance of your graphics card in OpenGL mode.
    In the Cinebench R11.5 test, the MSI Z97 GAMING 3 with GeForce GTX 750 scores 6.87 pts in the multi-core test; the OpenGL score is 73.48 fps.
    Z97 GAMING 3 with HD 4600 and GeForce GTX 750 in the GAME Benchmark Test
    For game performance testing, I will use the Resident Evil 6 and FFXIV Benchmarks on the same platform.
    Resident Evil 6 Benchmark
    CPU: Core i7-4770K
    Game resolution setting: 1920X1080
    Other setting: Default
    On the Z97 GAMING 3 with the Intel® HD4600 iGPU, the score is 1175 (Rank D).
    On the Z97 GAMING 3 with the GeForce GTX 750, the score is 5874 (Rank A).
    I used the Fraps tool to record FPS status during benchmark testing. The Z97 GAMING 3 with the GeForce GTX 750
    averages 202 FPS; with the Intel® HD4600 iGPU it averages 32 FPS.
    FFXIV Benchmark
    CPU: Core i7-4770K
    Game resolution setting: 1920X1080
    Other setting: Default
    At 1920x1080 resolution, the Intel® HD4600 iGPU score is only 910.
    However, the GeForce GTX 750 testing score is 4167. According to the official classification system, a score
    between 3000 and 4499 means high performance.
    I used the Fraps tool to record FPS status during benchmark testing:
    the GeForce GTX 750 averages 111 FPS; the Intel® HD4600 iGPU averages 19 FPS.
    Test Summary
    The MSI Z97 GAMING 3 is not very expensive. It has many features which are specially designed for the gaming experience
    and performs well in benchmarks. Even at 1920x1200 resolution and high-quality display settings, the Z97 GAMING 3
    with an Intel Core i7-4770K and MSI GeForce GTX 750 can easily handle any kind of game. The FPS of this system is
    higher than 60, and users will enjoy lag-free gaming. It is really a good and affordable choice for gamers.

    Thanks for sharing, since there are not many reviews of the Z97 GAMING 3 yet.

  • Inline performance testing

    Hello,
    I'm looking for a performance testing product that I can run inline with an antivirus product to gauge the impact on performance while a full scan is running. I've looked into software like PCMark 7 and SiSandra, but they're not quite what I'm looking for. Your suggestions are greatly appreciated!
    This topic first appeared in the Spiceworks Community

    Dear Customer,
    Testing an application with LoadRunner sometimes generates undesirable results.
    This could be mainly due to the algorithm you are following in your LoadRunner scripts. There might be an issue of LoadRunner issuing the sequence of actions without giving think time to the application, and hence further scripts might be failing.
    I would suggest adding some think time/delay in your LoadRunner scripts and seeing if that helps.
    I hope this helps.
    Best Regards,
    Chetan

  • ANN: New XDK release for 9.0.2B and 9.0.1.1.0A available

    New XDK releases for the 9.0.2B Beta version and the 9.0.1.1.0A Production version are online at:
    http://technet.oracle.com/tech/xml/xdkhome.html
    What's new:
    - Oracle9i XDK for C and C++ released on Linux
    - Oracle TransX Utility aids loading data and text. This release is part of the XDK for Java 9.0.2 Beta.
    - Oracle SOAP APIs added to the XDK for Java
    Support for SOAP services has been added to the XDK for Java 9.0.1.1.0A Production.
    - XML Schema Processor for Java supports both LAX Mode and STRICT Mode Validation
    The XML Schema Processor for Java now supports both LAX Mode and STRICT
    Mode validation. Compared with STRICT Mode, where every element from the root on down is required to be defined and validated, LAX Mode schema validation provides developers the ability to limit the XML Schema
    validation to subsections of the XML document.
    This will simplify XML Schema definitions and the validation process by skipping ignorable content. Furthermore, with LAX mode, developers can divide their XML documents into small fragments and carry out the XML Schema validation in a parallel or progressive way. This results in a much more flexible and productive schema validation process.
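    To illustrate the fragment-validation idea outside the Oracle XDK, here is a sketch using the generic JAXP validation API (not the XDK's LAX mode itself); the schema, document and tag names are invented for the example:

```java
import javax.xml.XMLConstants;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamSource;
import javax.xml.validation.SchemaFactory;
import javax.xml.validation.Validator;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.xml.sax.InputSource;
import org.xml.sax.SAXException;
import java.io.StringReader;

public class FragmentValidation {

    // Validate only the first element with the given tag name,
    // ignoring the rest of the document.
    static boolean fragmentIsValid(String xml, String xsd, String tag) throws Exception {
        DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
        dbf.setNamespaceAware(true);
        Document doc = dbf.newDocumentBuilder()
                          .parse(new InputSource(new StringReader(xml)));
        Element fragment = (Element) doc.getElementsByTagName(tag).item(0);

        Validator v = SchemaFactory
                .newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI)
                .newSchema(new StreamSource(new StringReader(xsd)))
                .newValidator();
        try {
            v.validate(new DOMSource(fragment));  // validates just this subtree
            return true;
        } catch (SAXException e) {
            return false;
        }
    }

    public static void main(String[] args) throws Exception {
        // Schema covers only <item>; <envelope> and <header> are never validated.
        String xsd = "<xs:schema xmlns:xs='http://www.w3.org/2001/XMLSchema'>"
                   + "<xs:element name='item' type='xs:string'/></xs:schema>";
        String xml = "<envelope><header>h</header><item>payload</item></envelope>";
        System.out.println("fragment valid: " + fragmentIsValid(xml, xsd, "item"));
    }
}
```

    The schema only needs to describe the fragment being checked, which mirrors the benefit the announcement claims: the enclosing document structure can stay undeclared.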
    - SAX2 Extension support in the Java XML Parser
    The XML Parser for Java now supports two new handlers - LexicalHandler and DeclHandler.
    XML documents now can be parsed through Oracle XML Parser SAX APIs with full access
    to the DTD declarations and the lexical items like comments and CDATA sections.
    This ensures that the complete content model is preserved.
    - JAXP 1.1 support is now added to the XDK for Java
    - New Differ Bean in the XDK for JavaBeans
    A new bean has been added to the XDK for JavaBeans which analyzes the differences between two XML documents and outputs the XSL stylesheet that will convert one into the
    other. This is extremely useful when converting an XML document retrieved from a SQL query into an XHTML page on the web.
    Stylesheets can now be automatically created for use in XSQL pages.
    - XML Compression now supported in the Java XML Parser
    Now developers can take advantage of a compressed XML stream when serializing their DOM and SAX outputs. This new functionality significantly reduces the size without losing any information. Both DOM and SAX compressed streams are fully compatible and the SAX stream can be used to generate a
    corresponding DOM tree. These compressed streams can also be stored in Oracle as a CLOB for efficient storage and retrieval.


  • PO Release for PO Change - Release already performed!?

    I have a question regarding the following:
    I have a PO Release triggered whenever a change is made to a PO.
    So for example I change a PO and trigger a release strategy with one approver level.
    Then the workitem for release is sent to an approver who releases the PO.
    All fine so far.
    A change is made to the same PO again regarding a value change but this value change is not big enough to trigger a higher approval level than has already been released before.
    The workflow is triggered correctly and sending out a workitem for release of PO, but when this workitem is opened, the PO is already released for that level.
    There is a requirement that it should somehow reset the previously done release in such a case so a new release can begin.
    Is there a good way of doing this?
    One way I thought was to reset the release strategy in a user-exit for PO change, but it does not seem like the cleanest of solutions...

    If I understand correctly, on a change to a PO you want it to retrigger an approval if the value of the PO is increased...
    Do you have a net value characteristic which is set to > $$$?... For example, if you had it set to > $0.00, then any change to the net value would invoke a new release.
    Also,
    check the following: SPRO - MM - Purchase Order - Release Procedure for PO - Define Release Procedure for PO - Release indicator...
    If you have a number in the % value change field, then a small change to the value will NOT invoke the release... For example, if it was at 10% and you made a change to the net value from $100 to $101, it would not invoke a release.

  • "Channel started but inactive" in RWB Monitor for SOAP sender channel

    Dear XI specialists,
    I have configured a communication scenario involving SOAP Adapter (sender) usage, but I'm facing an issue when testing the scenario. The problem is that none of the messages are coming into the SOAP Adapter (checked via transaction SXI_MONITOR in XI) when using 3rd-party SOAP clients (XML Spy, soapUI) based on the WSDL file generated and exported from XI.
    I checked the sender communication channel for SOAP Adapter - it is activated in Integration Builder's Directory. SOAP Adapter itself is also started.
    I checked Runtime Workbench monitors and in the Communication Channels Monitor of the Components Monitor received the warning saying "Channel started but inactive" for my configured channel as well as for other channels working on SOAP Adapter.
    Activities which were performed (in the order in which they were performed and results tested):
    1. the configuration channel restart;
    2. the communication channel re-activation and further re-start;
    3. SOAP Adapter restart;
    4. XI system's Java instance restart.
    All mentioned activities didn't give any positive effect in part of this issue resolution.
    On the other hand, if I use test message functionality of the Runtime Workbench (Component Monitor), then messages are seen in SXI_MONITOR and expected response messages are also returned to XI from integrated business system.
    Would you please advise what can be checked else here and where the root of the problem may be detected?
    All your answers and feedback will be greatly appreciated and awarded accordingly.
    Many thanks in advance and my regards,
    Vadim

    Dear all,
    Thank you for your input! Unfortunately, the problem still exists after the tests and checks which you suggested.
    TO Seshagiri:
    I checked message monitor and found only several messages for the required sender/receiver services pair in the used namespace - all of them were successfully processed (these are messages which I sent by means of RWB Component Monitor's test message functionality).
    TO Sarvesh:
    Yes, sender agreement is configured properly. I configured it when configuring communication scenario in the Configuration Wizard and then manually re-checked its definition. Moreover, when performing Test Configuration in ID, I received positive results.
    TO Nallam Guna:
    I cleared Data Cache in Integration Builder, but this didn't help.
    TO Gujjeti:
    I'm on SAP NetWeaver 2004s PI 7.0. Service Pack level for PI is 13.
    TO Durga:
    Thank you for the provided links! I have already had a look at them, but the proposed solutions don't seem to be working in my situation.
    Thank you all once again for your prompt answers! I really appreciate your attempts in helping me!
    I would appreciate if you could share any further ideas on this issue.
    My best regards,
    Vadim

  • Proxy messgs on ECC in scheduled status - released for processing (PI 7.1)

    Hi guys,
    I have set up a scenario (ABAP Proxy (ECC) -> (using WS adapter) PI 7.1 -> SOAP) and the outbound messages on the ECC remain in scheduled status (green flag) saying "Released for processing (WS)". Why? Must some processing be executed? Manually, or via a job?
    Any help on this appreciated, Olian

    Hi Olian
    Try applying the solution in the SAP note #1129614 SP2: WSRM sequences process only the first 3 messages.
    Regards
    Mark

  • SOAP performance issue

    Hi,
    My problem is that the NE I am connecting to was exposing its interfaces as LDAP, and now they are changing those interfaces to SOAP. We are getting a very high load, and I feel that since LDAP is optimized for fast read speeds, replacing the LDAP interfaces with Web Services would have performance implications, since SOAP performs poorly compared to LDAP because of transport inefficiencies and marshalling/unmarshalling overhead.
    Although the NE team says that they will be able to give a response within the defined SLA time (around 1 sec), what they are not understanding is that even if they give a response within 1 sec, Web Service invocation operations would also be expensive. Overhead on our end (the client) would also include extracting the SOAP envelope and parsing the XML. Further, XML data cannot be optimized much for all cases.
    Pls guide me if I am correct or wrong in my opinion.
    Thanks
    AA

    If I get it right, LDAP is just a protocol for calling a directory service. The backend performs with the same throughput independent of the type of protocol.
    Our performance tests on our SOA servers showed that the total overhead, including network, logging, application-server work and XML parsing, is about 10%. We got this value testing with JMeter and logging statements inside the code.
    CPU, memory (incl. garbage collection) and network traffic were very, very good; we didn't expect some of these results (~40 samples / second -> 3% CPU on a Sun T5120 machine). Most overhead was from the backends.
    I guess you're correct if you say SOAP needs more resources than LDAP, but it is less than you think (in my opinion, of course).
    You definitely should do some performance tests and compare the results with the LDAP variant.
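    A rough sketch of the comparison suggested above, in plain Java: time N calls through each client path and compare the averages. The two Runnable placeholders are hypothetical stand-ins for the real LDAP and SOAP clients:

```java
public class LatencyCompare {

    // Average wall-clock nanoseconds per call over n invocations.
    static double avgNanos(Runnable call, int n) {
        long total = 0;
        for (int i = 0; i < n; i++) {
            long start = System.nanoTime();
            call.run();
            total += System.nanoTime() - start;
        }
        return (double) total / n;
    }

    public static void main(String[] args) {
        // Placeholders: swap in the real lookup and the real SOAP invocation.
        Runnable ldapCall = () -> { /* real LDAP lookup goes here */ };
        Runnable soapCall = () -> { /* real SOAP invocation goes here */ };

        int n = 1000;
        System.out.printf("ldap avg: %.0f ns, soap avg: %.0f ns%n",
                avgNanos(ldapCall, n), avgNanos(soapCall, n));
    }
}
```

    Running both paths against the same backend isolates the protocol overhead, which is exactly the 10% figure the post above measured with JMeter.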
    regards
    slowfly
