
Value of Average Changes as Resolution changes, why?

As you can see, it is the same graph and the same time frame (last 7 days); the only difference is that the first one uses daily resolution while the second one uses hourly resolution, yet the averages differ. Does anybody know why?


The summary value shown on the chart is calculated from the data points displayed on the chart. That is why, when the resolution changes, the average operation time in each "time bucket" changes as well: operation time is a weighted average, weighted by the number of operations in each bucket, so regrouping the data into different buckets produces different averages.

Let me illustrate that. In the image below there are operation times for the same period (see report (demouser / demouser123) ) but with different resolutions; the smallest table simply shows the average value for the given period without a split into time buckets.

Now, I exported the data into a spreadsheet and did some calculations below. See how the average of the per-bucket averages differs from the correct value, which is the "sum of operation time" divided by the "sum of operations". The latter is the value you see in the smallest table, where there is no split by time.
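The same effect can be reproduced without a spreadsheet. The snippet below is a minimal sketch with made-up numbers (the bucket values are hypothetical, not taken from the report): each bucket holds a per-bucket average operation time and an operation count, and the average of averages diverges from the operation-weighted average whenever busy buckets behave differently from quiet ones.

```python
# Hypothetical per-hour buckets: (average operation time in ms, number of operations).
buckets = [
    (100, 10),    # quiet hour: few operations, fast
    (100, 10),    # another quiet hour
    (400, 1000),  # busy hour: many operations, slow
]

# Average of the per-bucket averages -- what you get when the chart
# summarizes values that are already split into time buckets.
avg_of_avgs = sum(t for t, _ in buckets) / len(buckets)

# Weighted average: total operation time divided by total operations --
# the single value shown when there is no split by time.
total_time = sum(t * n for t, n in buckets)
total_ops = sum(n for _, n in buckets)
weighted_avg = total_time / total_ops

print(avg_of_avgs)   # 200.0
print(weighted_avg)  # ~394.1 -- dominated by the busy hour
```

The quiet hours and the busy hour each contribute one value to the average of averages, but the busy hour contributes 50 times more operations to the weighted average, which is why the two numbers disagree.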

How to fix your chart? Since DC RUM 2017 you can have multiple benchmarks and can attach additional data with the same filters and metrics, but without the time dimension. See the aforementioned report for details: