Load Test Results - time series request data by URL in VS2013

I am trying to figure out how to export and then analyze the results of a load test, but after the test is over I cannot find the data for each individual request by URL. This data is shown during the load test itself, but once the test finishes it no longer seems to be accessible and all I can find are totals. The data I want is under the "Page response time" graph in the graphs window during the test. I know this is not the response time for every single request and is probably averaged, but that would suffice for the calculations I want to make.
I have looked in the database on my local machine (LoadTest2010, where all of the summary data is stored) and I cannot find the data I'm looking for.
My goal is to plot (probably in Excel) each request URL against the user load and analyze the slope of the response time averages to determine which requests scale the worst (and best). During the load test I can see this data and get a visual idea, but when it ends I cannot seem to find it to export.
A) Can this data be exported from within Visual Studio? Is there a setting required to make VS persist this data to the database? Under Run Settings, in the "Results" section, I have "Timing Details Storage" set to "All individual details" and the Storage Type set to "Database".
B) If this data isn't available from within VS, is it in any of the tables in the LoadTest2010 database where all of the summary data is stored?
Thanks
Luke

Hi Luke,
Since a load test is used to simulate many users accessing a server at the same time, it is mainly intended to verify how a web server behaves under load.
As you said you want to find the data for each individual request by URL: generally we can analyze the per-URL requests from the Summary view.
>> I have looked in the database on my local machine (LoadTest2010, where all of the summary data is stored) and I cannot find the data I'm looking for.
I suggest adding a SQL Tracing Connect String in the Run Settings properties to trace the data.
Reference:
https://social.msdn.microsoft.com/Forums/en-US/74ff1c3e-cdc5-403a-b82f-66fbd36b1cc2/sql-server-tracing-in-visual-studio-load-test?forum=vstest
In addition, you can create an Excel report to analyze the load test results; for more information:
http://msdn.microsoft.com/en-us/library/dd997707.aspx
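If you would rather script the slope analysis than build it by hand in Excel, here is a minimal sketch in Python. It assumes you have already pulled rows of (url, user_load, avg_response_time_sec) into a CSV, for example with your own query against the results repository (the exact table and column names differ between schema versions, so inspect the tables in your LoadTest2010 database before writing it) or by exporting the data behind the "Page response time" graph. The file name and column headers below are assumptions for illustration only, not something Visual Studio produces by default.

# Sketch: rank request URLs by how steeply their average response time grows
# with user load. The CSV layout (url, user_load, avg_response_time_sec) is an
# assumption; adapt it to however you export the data.
import csv
from collections import defaultdict

import numpy as np

samples = defaultdict(list)  # url -> [(user_load, avg_response_time_sec), ...]
with open("page_response_times.csv", newline="") as f:
    for row in csv.DictReader(f):
        samples[row["url"]].append(
            (float(row["user_load"]), float(row["avg_response_time_sec"]))
        )

slopes = {}
for url, points in samples.items():
    if len(points) < 2:
        continue  # need at least two load levels to fit a line
    loads, times = zip(*sorted(points))
    # Linear fit: the slope is the extra response time per additional user.
    slope, _intercept = np.polyfit(loads, times, 1)
    slopes[url] = slope

# URLs that scale worst (steepest growth in response time) come first.
for url, slope in sorted(slopes.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{slope:+.5f} sec/user  {url}")

The same ranking can of course be reproduced in Excel with SLOPE() over the per-URL series; the script is just a shortcut once the data is out of the repository.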
Hope it helps!
Best Regards,

Similar Messages

  • Not clear about Load Test Result analysis

    Hello Team,
    I ran the load test in VSTS 2012.
    I am getting the result below:
    Overall Results
    Max User Load                  50
    Tests/Sec                      1.85
    Tests Failed                   3,288
    Avg. Test Time (sec)           25.7
    Transactions/Sec               0
    Avg. Transaction Time (sec)    0
    Pages/Sec                      46.7
    Avg. Page Time (sec)           0.26
    Requests/Sec                   221
    Requests Failed                13,315
    Requests Cached Percentage     43.2
    Avg. Response Time (sec)       0.097
    Avg. Content Length (bytes)    11,888
    I am not clear about Avg. Response Time: 0.097 - it is in seconds, right?
    So 0.097 means less than 1 second, is that true?
    Sometimes I observe an avg. response time of 0.67, which is again less than 1 second.
    So how is it actually calculated in VSTS web test 2012?
    When I took a reading from Fiddler or a network capture for the same scenario, I observed it taking 7-8 seconds, yet here it shows 0.097.
    I understand that here the load is 50 users, but for a single user I also observe an avg. response time of 0.29 or 0.037 or similar.
    I am very confused about this result.
    Could you please give me the formula or a detailed explanation?
    If I could get details about all the parameters, that would be great.
    Thanks in Advance

    Hi Mon_bk,
    As Jack has stated, the response times are measured in seconds.
    If you think the response time is unrealistic, have you tried doing a single test run directly from your script? What response time do you get?
    Also, how have you set up your users? Do they all have caching enabled, or are they 100% new users? You can check this in the load test by clicking on the scenario name; in the properties section there is a field called "Percentage of new users". If this is 0, all your users have caching enabled, and that could be the reason why the response times are so low.
    Kind Regards

  • OATS Load test is not able to scale for more than 10 users

    Hi,
    I am using the OATS load test runner for performance testing a Fusion-based application on WebCenter Portal. I am not able to run the OATS load tester successfully for more than 10 VUs. Beyond that it gives a "component not found" issue or a loop error. Please let me know if anybody has faced a similar issue and was able to resolve it. Is there some setting to be done?
    Thanks,
    Ritesh

    Hi Jean,
    Our application contains a customised retail WebCenter Portal on which the retail application is hosted. This portal application is not able to scale up beyond 10 users. I created a portal application without the customised framework and, for a similar use case, it was able to scale to more than 50 users, but once it hit the 100 VU mark it started giving errors. If it were an issue with client tokens or login, I would have been able to have more than 1 user at a time. It seems to be an issue with the custom framework developed on top of WebCenter Portal. I have faced a similar issue while testing this app with JMeter as well: I was able to test the WebCenter Portal app without any customisation with more than 100 concurrent users, but I was not able to test with more than 1 user on the customised retail WebCenter Portal.
    Thanks,
    Ritesh

  • Time Series initialization dates with fiscal periods!

    Dear Experts
    Problem:
    I cannot initialize the planning area. The error is:
    Period YYYY00X is invalid for periodicity P FYV
    Configuration
    Storage Buckets Profile: Week, Month, Quarter, Year, Post Period FYV all checked.
    Horizon
    Start: 24.02.2013
    End: 31.12.2104
    No Time stream defined.
    Fiscal Year Variant (Edate = period end day, FP = fiscal period)
    2012:  Month  1   2   3   4   5   6   7   8   9   10  11  12
           Edate  28  25  31  28  26  30  28  25  29  27  24  31
           FP     1   2   3   4   5   6   7   8   9   10  11  12
    2013:  Month  1   2   3   4   5   6   7   8   9   10  11  12
           Edate  26  23  30  27  25  29  27  24  28  26  23  31
           FP     1   2   3   4   5   6   7   8   9   10  11  12
    2014:  Month  1   2   3   4   5   6   7   8   9   10  11  12
           Edate  25  22  29  26  24  28  26  23  27  25  22  31
           FP     1   2   3   4   5   6   7   8   9   10  11  12
    2015:  Month  1   2   4   5   5   7   8   8   10  10  11  12
           Edate  31  28  4   2   30  4   1   29  3   31  28  31
           FP     1   2   3   4   5   6   7   8   9   10  11  12
    Question
    What dates should I enter for the planning area initialization start and end to initialize for the maximum duration, given the settings above?
    I tried a few dozen combinations but none is accepted. For a start I tried the same dates as in the Horizon of the storage bucket profile. But given the error text, I cannot decipher what dates I am expected to enter for time series creation.
    Thanks
    BS

    Thanks Mitesh,
    No its 2014
    Here is what worked.
    Storage Bucket Horizon
    Start: 24.02.2013
    End: 22.11.2014
    Time Series Initialization
    Start: 01.02.2013
    End Date: 31.12.2014
    The fiscal year variant is what I pasted above.
    I thought time series could only be initialized for a subset of the periods in the storage bucket profile! This is my first experience of this kind, where the initialization period is larger than the storage horizon. Is this really the case? I have always gone by the book, and this was just a chance discovery on my part.
    I was aware of the limitation mentioned in the SAP notes about needing one more year on either side in the fiscal year variant for it to be used for initialization, hence the range of dates I tried.
    Appreciate your comments on this.
    All dates in dd/mm/yyyy.
    Thanks
    BS

  • Compare time dependent master data for two key dates in a report?

    Hello all,
    We have a requirement where we want to see the difference in master data at particular key dates (time dependent master data).
    For example, an employee has State = OH until yesterday (12/01/2008) and today it is changed to State = KY (12/02/2008 to 12/31/9999), loaded into BW as time dependent master data.
    So basically, if you run the query for key date 12/01 or before you will see OH as the state, and if you run the query for key date 12/02 or later you will see KY.
    We want to create a report where we list employees and compare the previous value and the current value in two columns. The input would be two key date variables, and the report would look like:
    Employee                  Key date 1(state)                  Key date 2 (state)
    Employee1                         OH                                  KY   
    Has anyone done this before, can someone help me doing this?
    Thanks,

    Hi,
    For this scenario, can you please create a variable (from value - to value) on the key date? By using the time period in the variable you can get the employees for both states.
    Hope this helps.
    Thanks
    PT

  • Unix Time Stamp to Date for display

    Post Author: merph
    CA Forum: Formula
    I need to convert a Unix time stamp to a readable date for a report. Does anyone out there have a good formula for this?
    Thanks
    Merph

    Post Author: V361
    CA Forum: Formula
    If you can pull a couple of Unix epoch times from the database, then let's run these formulae against them and see if they bring up the correct date and time separately. These formulae are pulled from another post by UberWolfe.
    ctime(( - 18000) / 86400+25569)
    cdate(( - 18000) / 86400+25569)
    The 18000 adjusts for the time zone; for example, 18000 corresponds to GMT-5.
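    For what it's worth, here is a small Python sketch of the arithmetic those formulae rely on, assuming the field reference that appears to be missing from them held a Unix epoch value in seconds (UTC); the sample timestamp is made up:

# Sketch of the arithmetic behind the formulae above. Assumes the missing field
# held a Unix epoch value in seconds (UTC); the sample timestamp is made up.
from datetime import datetime, timedelta, timezone

epoch_seconds = 1200000000          # example Unix timestamp (assumption)
tz_offset_seconds = -18000          # -5 hours, matching the "- 18000" above

# 86400 = seconds per day; 25569 = days from 30 Dec 1899 (the serial-date base
# the formulae assume) to 1 Jan 1970 (the Unix epoch).
serial = (epoch_seconds + tz_offset_seconds) / 86400 + 25569
print("date serial:", serial)

# The same conversion done directly, for comparison.
readable = datetime(1970, 1, 1, tzinfo=timezone.utc) + timedelta(
    seconds=epoch_seconds + tz_offset_seconds
)
print("readable date/time:", readable.strftime("%Y-%m-%d %H:%M:%S"))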

  • HT203177 Can Time Machine back up data for a newer OS?

    I have upgraded my MacBook to OS X 10.7.
    Can I continue to back up data with Time Machine to the same backup that was used for this MacBook on the previous version of the OS?

    Yes. Time Machine will continue to back up your system as it should.

  • Unable Update profile due request data for "state" field

    Anyone else having a problem updating their profile? Once I log in, the site sends me a page to update my profile. When I click update, it comes back with an error message requesting that the "state" field be completed. But there is no "state" field provided, or it is not visible.
    Whoever created the profile page needs to check their work.

    Thank you all for responding.
    @Sushant - We definitely have data in the PSA table linked to the data source (8ZECPCO22, where ZECPCO22 is the direct update DSO), and this data is what we want to delete through process chains. As mentioned earlier, we are able to delete these PSA requests manually via Manage on the data source.
    A few more details below:
    ZECPCO22 is the direct update DSO, and as per the data flow the data is loaded further from this DSO (data source 8ZECPCO22) into a master data infoobject or other targets via an infopackage with processing type "PSA and then into Data target in series".
    So looking at the infopackage one can definitely confirm that the PSA table is used, and since the DSO is a direct update we can't have a change log table.
    However, we are unable to delete these PSA requests through process chains using either the "Delete PSA request" or "Delete Change log Request" process types.
    Any suggestion is appreciated.
    Warm Regards
    Vikram S Reddy

  • Request data for HR ABAP.

    Dear All,
    I would like to take an HR ABAP course from a good institute. Before taking this course, what are the basic things I should know? Could you please give me some details? If possible, some information also.
    Regards,
    Chandra.

    Hi,
    HR deals with INFOTYPES, which are similar to tables in general ABAP.
    There are different ways of fetching data from these infotypes.
    There are different areas in HR, like Personnel Administration, Organizational Management, Benefits, Time Management, Event Management, Payroll, etc.
    The infotypes for these areas are different from one area to another.
    The way records are stored also differs for each type of area.
    LDBs (logical databases) like PNP are used in HR programming.
    Instead of SELECT we use routines and PROVIDE..ENDPROVIDE, etc.,
    and in the case of payroll we use clusters, which we IMPORT and EXPORT for data fetching.
    On the whole, normal ABAP is different from HR ABAP.
    For Personnel Administration the infotypes start with PA0000 to PA1999.
    Time-related infotypes start with PA2000 to PA2999.
    Organization-related infotypes start with HRP1000 to HRP1999.
    All custom developed infotypes start with PA9000 onwards.
    In payroll processing we use clusters like PCL1, 2, 3 and 4.
    Instead of a SELECT query we use PROVIDE and ENDPROVIDE.
    You have to assign the logical database PNP in the program attributes.
    Go through the SAP documentation for HR programming and start practising.
    http://www.sapdevelopment.co.uk/hr/hrhome.htm
    See:
    http://help.sap.com/saphelp_46c/helpdata/en/4f/d5268a575e11d189270000e8322f96/content.htm
    http://help.sap.com/printdocu/core/Print46c/en/data/pdf/PAPA/PAPA.pdf
    http://help.sap.com/printdocu/core/Print46c/en/data/pdf/PAPD/PAPD.pdf
    http://help.sap.com/printdocu/core/Print46c/en/data/pdf/PYINT/PYINT_BASICS.pdf
    http://www.atomhr.com/training/Technical_Topics_in_HR.htm
    http://www.planetsap.com/hr_abap_main_page.htm
    You can see some standard program examples here:
    http://www.sapdevelopment.co.uk/programs/programshr.htm
    http://searchsap.techtarget.com/originalContent/0,289142,sid21_gci1030179,00.html?Offer=SAlgwn12604#Certification
    These FAQs might also help you:
    http://www.sap-img.com/human/hr-faq.htm
    http://www.sapgenie.com/faq/hr.htm

  • Can Time Capsule back up data for a PC?

    Can Time Capsule also back up data from a PC automatically?

    From the box:
    System Requirements
    For Time Machine backup:
    Mac with OS X Leopard or later
    For set-up and administration:
    Mac computer with Mac OS X v.10.5.7 or later and Ethernet or wireless networking capability
    PC with Windows XP (SP3) or Windows Vista (SP2) or Windows 7 (SP1) or later and Ethernet or wireless networking capability
    For wireless client access:
    Mac with AirPort or AirPort Extreme wireless capability
    PC with 802.11a/b/g/n
    For shared hard drive:
    Mac with Mac OS X v10.4.8 or later
    PC with Windows XP (SP2) or Windows Vista; Bonjour for Windows included with AirPort Utility available as download via Software Update.
    For shared printing:
    USB printer
    Mac with Mac OS X v10.2.7 or later
    PC with Windows XP (SP2) or Windows Vista; Bonjour for Windows included with AirPort Utility available as download via Software Update.

  • Error importing result file from Visual Studio Online load test - Incorrect Syntax near ')'

    We are seeing the exact same issue as reported in:
    https://social.msdn.microsoft.com/Forums/vstudio/en-US/4a338992-1a46-47ae-9f13-10a53a6264b5/error-downloading-load-test-report-from-visual-studio-service-load-test-incorrect-syntax-near?forum=TFService#cb43383b-c60d-4dcc-a353-352ac6908b35
    When clicking "Download report" after running a load test using Visual Studio Online, an error occurs. The download completes, but the import fails at 15% with the "Incorrect Syntax near ')'" error from the title.
    This is happening for 2 different users on 2 different systems.
    Any assistance in resolving this would be greatly appreciated!
    Let me know if additional information is needed.
    Thanks!

    Hi,
    We are researching your issue. Please clarify the following questions with detailed information:
    1. Do you get the error in your first post when you click "Download report" after you run the load test from Visual Studio 2013 against Team Foundation Service?
    2. You said "I am trying to import the test results into a Visual Studio 2012 (Update 3) on-prem instance". What and where is your load test result repository: a SQL Server database, or the default LoadTest2010 database in SQL Express on the same machine as Visual Studio 2013?
    3. Did you specify the load test result repository under Load Test -> Manage Test Controller?
    4. What error do you get? Please provide more detailed information about the error message or share a screenshot with us.
    5. Can you run the load test successfully locally and get the load test result with the same load test result database?
    6. You said that your co-worker's setup works fine. Please check whether your account has enough permissions on the load test result database to download the load test result from TFS, compared with the permissions of your co-worker's account.
    Thanks,

  • How to find out the top 10 data loads based on time duration?.

    Hi,
    We are working on performance tuning, so we want to find the top 10 loads that ran the longest, by date.
    We need the start time and end time of each load process, plus the InfoSource and data target name.
    There are nearly 1000 loads, so it is very difficult to collect the load timings in RSMO.
    Is there any alternative way to collect the top 10 loads by duration, per date?
    Thanks & Regards,
    Raju

    Hi Gangaraju,
    You can install BI Statistics to get this type of data.
    Or you can check RSDDSTAT or the RSMDATASTATE_EXT table for the load process times.
    Or go to transaction ST13 to get a detailed analysis of a process chain for a given period.
    Hope this helps.
    Regards,
    Ravi Kanth

  • FileSaveService - test results requested

    This is a follow-on from the thread
    'FileSaveService hides but enforces the file type/xtn'..
    <http://forum.java.sun.com/thread.jspa?threadID=5234123>
    Since my other post on this matter only asked for explanations, I thought I'd start this one purely for test results. See this page for details..
    <http://www.physci.org/test/jws/fss1/>
    2 Dukes to anyone that can provide an answer to both of the first two questions shown here..
    <http://www.physci.org/test/jws/fss1/#question>
    The 'Launch Test' button is here.
    <http://www.physci.org/test/jws/fss1/#launch>

    Yes, this is a bug.
    <http://bugs.sun.com/bugdatabase/view_bug.do?bug_id=6646622>
    If you have an interest, please vote for the bug.

  • Data not uploading in Time dependent Master data Infoobject

    Hello All,
    I have a master data infoobject for HR Entity and have to load data from the PSA to that infoobject.
    The HR Entity infoobject already has some data, as below:
    HR Entity   Version   Date from    Date To
    x           A         01.07.2013   31.12.9999
    x           A         19.04.2013   30.06.2013
    x           A         01.09.2012   18.04.2013
    x           A         01.01.2012   31.08.2012
    x           A         01.01.1000   31.12.2011
    Now the data in the PSA is as follows:
    HR Entity   Start Date   End Date
    X           01.01.2012   18.12.2013
    Once I loaded this data to the infoobject, I could not see this value, which is the latest value for this HR Entity.
    Can somebody please explain how data gets loaded into a time dependent master data infoobject, and why this entry is not getting loaded into the infoobject?
    Regards
    RK

    Hi,
    Did you activate the master data after your load?
    You can also check the version 'M' records and see if your record is there.
    Did the load finish green?
    The problem is that your entry overlaps all existing time intervals, which can't be deleted or merged as there may be dependent transactional data. You first have to delete the transactional data for this entity.
    Then you can delete the time-dependent data and reload it from your PSA.
    BW will then build the correct time intervals.
    The easiest approach is to change the time interval in the PSA; see the example below.
    At the moment the time interval is not accepted, but you can add time intervals before 31.12.2011 and after 01.07.2013; the system will then create the remaining time intervals. For example, if your new record is:
    HR Entity   Start Date   End Date
    X           01.08.2013   18.12.2013
    Result will be:
    HR Entity   Version   Date from    Date To
    x           A         19.12.2013   31.12.9999
    x           A         01.08.2013   18.12.2013
    x           A         01.07.2013   31.07.2013
    Regards, Jürgen

  • Remove test run from Load Test Manager in Visual Studio Online Load Testing

    I have been using the Visual Studio Online Azure load testing for a while now, and I have a number of test runs that I would like to remove. I am not referring to my local Load Test Results Store (an on-premises SQL DB), as I can remove test runs from there without a problem.
    What I mean is: how do I remove them from "the cloud" so that the test results can no longer be re-downloaded?

    Hi David,
    As far as I know, it's not supported for Visual Studio Online to run load tests for solutions hosted on GitHub. You can submit a user voice suggestion here.
    Best regards,
