A JMeter Test Plan must have a listener to show the results of a performance test execution.
Listeners capture the responses coming back from the server while JMeter runs and present them in the form of trees, tables, graphs, and log files.
A listener also allows you to save the results to a file for future reference. JMeter provides many types of listeners, such as Summary Report, Aggregate Report, Aggregate Graph, View Results Tree, and View Results in Table.
Here is a detailed explanation of each field in the Summary Report.
Referring to the figure:
Label: The name/URL of the specific HTTP(S) request. If you have selected the “Include group name in label?” option, the name of the Thread Group is applied as a prefix to each label.
Samples: The total number of requests executed for the specific label. It equals the number of virtual users multiplied by the loop count; for example, 20 users each sending the request once produce 20 samples.
Average: The average time, in milliseconds, taken by all samples for the specific label. In our case, the average time for Label 1 is 942 milliseconds and the total (all labels combined) average is 584 milliseconds.
Min: The shortest time taken by any sample for the specific label. Looking at the Min value for Label 1, out of 20 samples the shortest response time was 584 milliseconds.
Max: The longest time taken by any sample for the specific label. Looking at the Max value for Label 1, out of 20 samples the longest response time was 2867 milliseconds.
Std. Dev.: The standard deviation of the sample response times, i.e. how far individual samples deviate from the average response time. The lower this value, the more consistent the response times. As a rule of thumb, the standard deviation should be less than or equal to half of the average time for a label (a small sketch of this check appears after the field descriptions below).
Error %: The percentage of failed requests for the label.
Throughput: The number of requests processed by the server per unit of time (seconds, minutes, or hours). The time is calculated from the start of the first sample to the end of the last sample. Higher throughput is better.
endTime = lastSampleStartTime + lastSampleLoadTime
startTime = firstSampleStartTime
conversion = unit time conversion value
Throughput = Numrequests / ((endTime - startTime)*conversion)
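For example, here is a minimal Python sketch of this calculation using made-up timestamps (these are not the values behind the figure above):

# Hypothetical timestamps in milliseconds; illustration only.
first_sample_start_time = 0
last_sample_start_time = 9500
last_sample_load_time = 500
num_requests = 20

start_time = first_sample_start_time
end_time = last_sample_start_time + last_sample_load_time   # 10000 ms

elapsed_seconds = (end_time - start_time) / 1000             # convert ms to seconds
throughput = num_requests / elapsed_seconds                  # 2.0 requests per second
print(f"{throughput:.1f} requests/second")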
KB/sec: The amount of data downloaded from the server during the test execution. In short, it is the throughput measured in kilobytes per second.
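As mentioned under Std. Dev. above, here is a small Python sketch of that consistency check, using made-up response times (not the data behind the figure):

import statistics

# Hypothetical response times in milliseconds for one label.
response_times_ms = [584, 720, 810, 905, 942, 980, 1020, 1100, 1250, 2867]

average = statistics.mean(response_times_ms)
std_dev = statistics.pstdev(response_times_ms)   # population standard deviation

print(f"Average: {average:.0f} ms, Std. Dev.: {std_dev:.0f} ms")
print("Within the rule of thumb?", std_dev <= average / 2)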
Latency is the difference between the time when the request was sent and the time when the response started to be received (roughly, time to first byte).
Response time (= sample time = load time = processing time + latency) is the difference between the time when the request was sent and the time when the response was fully received.
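To make the distinction concrete, here is a tiny Python sketch with made-up timestamps:

# Illustrative timestamps in milliseconds for a single sample; all values are made up.
request_sent_at = 0
first_byte_received_at = 150
last_byte_received_at = 400

latency = first_byte_received_at - request_sent_at        # 150 ms until the response starts arriving
response_time = last_byte_received_at - request_sent_at   # 400 ms until the response is fully received
processing_time = response_time - latency                 # the remaining 250 ms

print(f"Latency: {latency} ms, Response time: {response_time} ms, Processing: {processing_time} ms")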