16 Jun 2023 11:38 PM - last edited on 19 Jun 2023 08:02 AM by MaciejNeumann
Hello,
I'm looking for an explanation of how Apdex values are obtained when the resolution changes.
For example,
I've attached a screenshot (below) showing what I'm comparing. In this example, the value at 10:30 AM at 30-minute resolution (0.63) appears to be a rounded average of the intermediate points leading up to the same timestamp at 10-minute resolution. But is this just a coincidence? I'd like a definitive explanation.
17 Jun 2023 12:36 AM
I have noticed discrepancies like these in the past. Apdex is calculated differently in Data Explorer than it is in Applications, so you may see differences when comparing the two. You will also see differences if you play with dimensions.
I would suggest submitting a support ticket so someone there can explain it better, or even help you work around it. I believe Grail will eventually unify these differing data values, but in the meantime, be careful how you present Apdex values from Data Explorer...
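One possible source of the discrepancy is worth illustrating. Assuming the standard Apdex formula, (satisfied + tolerating/2) / total, a simple unweighted average of per-interval Apdex scores generally differs from the Apdex computed over the whole window, unless every interval carries the same request count. The numbers below are hypothetical, purely to show the effect:

```python
# Standard Apdex formula: (satisfied + tolerating / 2) / total
def apdex(satisfied, tolerating, total):
    return (satisfied + tolerating / 2) / total

# Three hypothetical 10-minute intervals: (satisfied, tolerating, total requests)
intervals = [
    (90, 10, 100),  # apdex 0.95
    (40, 20, 100),  # apdex 0.50
    (5,  2,  10),   # apdex 0.60
]

# Unweighted average of the per-interval scores (what a chart might show
# when it averages the finer-resolution points)
avg_of_scores = sum(apdex(s, t, n) for s, t, n in intervals) / len(intervals)

# Apdex computed over the whole 30-minute window (count-weighted)
S = sum(s for s, t, n in intervals)
T = sum(t for s, t, n in intervals)
N = sum(n for s, t, n in intervals)
combined = apdex(S, T, N)

print(round(avg_of_scores, 3))  # 0.683
print(round(combined, 3))       # 0.719
```

If the interim intervals happen to have similar traffic volumes, the two numbers come out close, which would make a "rounded average" look correct by coincidence.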