My scenario: comparing web request response times between two UAT builds. The goal is to identify the major code change that causes a significant difference in the new UAT build.
I am using Visual Studio web tests with an added "x-dynaTrace" header, which lets me filter out my test cases in Dynatrace.
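For anyone reproducing this outside Visual Studio, the tagging idea is just attaching an "x-dynaTrace" header to every request in the test run. A minimal sketch in Python, assuming a hypothetical test-case name and the `NA=<name>` value format (check your Dynatrace version's documentation for the exact format it expects):

```python
import urllib.request

# Hypothetical test-case tag; the exact x-dynaTrace value format
# depends on your Dynatrace/AppMon setup.
TEST_CASE = "UAT-Build-1234-Login"

def tagged_request(url: str, test_case: str) -> urllib.request.Request:
    """Build a request carrying an x-dynaTrace header so Dynatrace
    can group all traffic from this test case under one tag."""
    req = urllib.request.Request(url)
    req.add_header("x-dynaTrace", f"NA={test_case}")
    return req

req = tagged_request("http://uat.example.com/login", TEST_CASE)
# urllib stores header names capitalized, so look it up as "X-dynatrace"
print(req.get_header("X-dynatrace"))
```

The URL and tag value here are placeholders; the point is only that every request in a run carries the same tag so its PurePaths can be filtered together.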
However, the comparison between builds always shows variance ranging from +/-5% to +/-120%, which is caused by database calls alone (as shown by PurePath comparison).
Is there a better approach to comparing test results?
If you're just looking to compare the web request times between two tests, you could chart the average response time of these web requests, creating one chart for the time frame of each test.
Then set both charts to the same axis scale, and you should be able to quickly visualize and compare the response times of each test over time.
You should be able to generate something similar to what I have built below (this uses a dual-axis chart so that I can visualize other metrics as well).