
Explanation of Apdex value changes with increase/decrease in resolution



I'm looking for an explanation of how Apdex values are computed when the query resolution changes.

For example,

  1. I call the Metrics API using a resolution of 10 minutes and retain the results.
  2. I make a second call with the same metric and parameters, changing only the resolution to 30 minutes.
  3. What is the origin of the values returned at 30-minute resolution?  Are they averages of the interim points that were obtained at 10-minute resolution?
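The two calls above can be sketched as follows. This is a minimal sketch assuming the Dynatrace Metrics API v2 `/api/v2/metrics/query` endpoint; the metric key `builtin:apps.web.apdex` and the time range are illustrative placeholders, and only the `resolution` parameter differs between the two requests.

```python
def build_query(metric_selector, resolution, time_from="now-2h"):
    """Build the query-string parameters for a /api/v2/metrics/query call.

    The metric selector and time range here are placeholders for whatever
    was actually queried; only `resolution` changes between the two calls.
    """
    return {
        "metricSelector": metric_selector,
        "resolution": resolution,  # e.g. "10m" or "30m"
        "from": time_from,
    }

# Step 1: query at 10-minute resolution.
q10 = build_query("builtin:apps.web.apdex", "10m")
# Step 2: same metric and parameters, only the resolution changed.
q30 = build_query("builtin:apps.web.apdex", "30m")

# Confirm the two requests differ only in resolution.
diff = {k for k in q10 if q10[k] != q30[k]}
print(diff)  # {'resolution'}
```

Each parameter dict would be sent as the query string of a GET request (with an `Authorization: Api-Token ...` header) against the environment's API base URL.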

I've attached a screenshot (below) as an example of what I'm comparing.  In this example, the value at 10:30am at 30-minute resolution (.63) turns out to be a rounded average of the interim points leading up to the same timestamp at 10-minute resolution.  But is this just a coincidence?  I would like a definitive explanation.
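The rounded-average hypothesis can be checked with hypothetical numbers (the three 10-minute values below are illustrative, not the ones in the screenshot). Note that the real rollup may instead recompute Apdex from the underlying user-action counts, weighting each interval by traffic rather than averaging the pre-computed ratios, which is exactly why a definitive answer matters.

```python
# Three consecutive Apdex points at 10-minute resolution (hypothetical values),
# covering the same half hour as a single point at 30-minute resolution.
ten_minute_points = [0.60, 0.63, 0.66]

# Simple unweighted arithmetic mean, rounded to two decimals as shown in the UI.
rolled_up = round(sum(ten_minute_points) / len(ten_minute_points), 2)
print(rolled_up)  # 0.63
```

If the intervals carry very different numbers of user actions, a traffic-weighted recomputation could diverge noticeably from this unweighted mean, so matching values at low traffic do not by themselves prove the averaging behavior.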







DynaMight Guru


I have noticed discrepancies like these in the past. The way Apdex is calculated in Data Explorer differs from how it is calculated in Applications, so you may also see differences when comparing the two. You will likewise see differences if you play with dimensions.

I would suggest submitting a support ticket so someone there can explain it better, or even help you work around it. I believe Grail will eventually unify these differing data values, but in the meantime be careful how you present Apdex values from Data Explorer...

Antonio Sousa
