This product reached the end of support date on March 31, 2021.

BT Violation Detection Settings - Relative versus Absolute


In the BT configuration Baseline Settings "Violation Detection Settings..." for response times, there is a setting for Relative Deviation (by %) and for Absolute Deviation (by time). Does one dominate, or can either of them trigger a violation?

I have a monitored application that handles web service requests. Most of these are simple data queries with a rapid turnaround in the sub-millisecond range. But periodically there are requests for big updates that take as much as 12 seconds to complete. With the overall average time being less than 1 millisecond, a couple of big requests trigger a violation. If I set the absolute deviation to 1,200 milliseconds, do I also need to set the Relative deviation to perhaps 2400% so the occasional, but expected, long requests don't trigger an alert?

So the question is: what is the relationship between the two types of deviation? Does the most restrictive always dominate (probably the Relative %)? And if so, is there some way to use only the Absolute?



I may see the solution to this application's needs. In the Application Overview page for any of the defined applications, in the associated Business Transaction strip there is a gear symbol beside the BT name. One of the options is to open the "Configure Baseline" dialog, where you can opt to use a static baseline with a fixed response time in milliseconds. I assume that overrides the settings in the Business Transaction "Violation Detection Settings...".

This will set a static baseline that overrides the dynamically-updated baseline. Then, the settings in the BT determine how baseline deviations are treated in both cases.

In the BT settings:

  • The setting for "Relative Deviation %" determines when the dynamic baseline is considered in violation. Since it changes daily, this is set as a relative percentage.
  • The setting for "Absolute Deviation" determines when the static baseline is considered in violation, if you specified this setting. Since this baseline is static, an absolute value is used.

If you specify a static baseline, it will be used instead of the dynamic baseline.

Dynatrace Pro

Hi Steve,

The higher value will be the relevant one. As an example, take response time. Let's say you configure an absolute deviation of 200 ms and a relative deviation of 100%. If your baseline is 100 ms, a 250 ms response time would not be considered a significant violation, but 350 ms would.

If your baseline is 300 ms, 550 ms wouldn't be considered a significant violation, but 650 ms would.

The formula for the comparison value is basically max(baseline + abs, baseline * (1 + rel)).

Note that the actual values depend on the metric; different statistical methods are used for the different baselines.
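The formula above can be sketched as a small helper. This is only an illustration of the rule as described in this thread; the function name and millisecond units are assumptions, not dynaTrace internals:

```python
def violation_threshold(baseline_ms, abs_deviation_ms, rel_deviation_pct):
    """Response time above which a measurement counts as a significant
    baseline violation: the higher of the two configured limits wins.
    Hypothetical helper, not part of the dynaTrace API."""
    return max(baseline_ms + abs_deviation_ms,
               baseline_ms * (1 + rel_deviation_pct / 100))

# abs = 200 ms, rel = 100%, as in the examples above:
print(violation_threshold(100, 200, 100))  # 300.0 -> 250 ms is fine, 350 ms violates
print(violation_threshold(300, 200, 100))  # 600.0 -> 550 ms is fine, 650 ms violates
```

Note how the absolute deviation dominates for the small 100 ms baseline, while the relative deviation dominates for the larger 300 ms baseline.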

If setting this on the BT level isn't flexible enough, you have the option of setting similar limits per BT splitting from the respective application overview subscreen. This would only make sense if your outliers fall into a specific splitting, though.

Or you could fix your outliers to be faster 😉

Best regards,



Thanks Peter. I especially like your explanation in the form of a math expression.