In my system profile I have 3 applications, one of which is the main application, receiving 90% of the total traffic to this system profile. One of the other applications shows a high failure rate because of the "/robots.txt" splitting (Google or Yahoo search bots), which I can see in the Business Transactions view in the Monitoring tab. Basically all of the robots.txt requests fail, and they often push the total failure rate of that application close to 80-90%, which in turn pushes the overall failure rate of the whole system profile above the default/ideal 3% threshold. Consequently I get too many high-failure-rate email alerts that end up being false positives, since the total failure rate of my main application is just 1%.
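To make the arithmetic concrete, here is a minimal sketch (plain Python, not Dynatrace code; all request counts are hypothetical) of how a low-traffic application whose /robots.txt splitting fails almost every time can drag the profile-wide failure rate past a 3% threshold even though the main application sits at 1%:

```python
# Hypothetical traffic: main app carries 90% of requests at a 1% failure
# rate; a small app's robots.txt splitting fails at ~85%.
apps = {
    # name: (total_requests, failed_requests)
    "main_app":  (9000, 90),   # 90% of traffic, 1% failure rate
    "robot_app": (500, 425),   # robots.txt requests failing -> 85% rate
    "other_app": (500, 5),     # 1% failure rate
}

total = sum(r for r, _ in apps.values())
failed = sum(f for _, f in apps.values())
overall = 100.0 * failed / total

for name, (r, f) in apps.items():
    print(f"{name}: {100.0 * f / r:.1f}% failure rate")
print(f"profile-wide: {overall:.1f}% (alert threshold: 3%)")
# profile-wide: 5.2% -> above the 3% threshold despite main_app at 1.0%
```

Only 5% of the traffic is failing robots.txt requests, yet the weighted overall rate (5.2%) clears the 3% threshold, which is why excluding that one splitting matters.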
What I am looking for is a way to exclude the failure-rate contribution of a single splitting/web request/URI (/robots.txt in my case) from the total failure rate of the application it belongs to.
And/or a way to delete or ignore a particular web request from the list of monitored web page requests.
Thanks, that worked! Well, partially…
I am no longer seeing the high failure rate or that robot URI in the Dynatrace Monitoring tab, but I am still getting the high overall failed-transaction-rate email alerts. The email says the overall failure rate is above 3%, yet in the UI I see the application is below 1% in the Monitoring tab. The profile is still set to alert if the average failure rate goes beyond 3% for more than 5 minutes. I even checked the Errors dashboard and didn't find robots.txt there, and I made a chart of the failed percentage for that application, which never went near 2%.
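One possible explanation worth ruling out: a "for more than 5 minutes" rule evaluates per-sample values, so a short burst above the threshold can fire the alert even when charts averaged over a longer period look fine. The sketch below is a generic illustration of that kind of rule (hypothetical values and logic, not how Dynatrace actually evaluates it):

```python
# Generic sketch of a "fire if above threshold for more than N consecutive
# minutes" rule. THRESHOLD/DURATION mirror the profile settings described
# above; the rate series are made up for illustration.
THRESHOLD = 3.0   # percent
DURATION = 5      # minutes

def alert_fires(rates, threshold=THRESHOLD, duration=DURATION):
    """Return True if `rates` contains a run of more than `duration`
    consecutive per-minute samples above `threshold`."""
    run = 0
    for r in rates:
        run = run + 1 if r > threshold else 0
        if run > duration:
            return True
    return False

quiet = [1.0] * 10                      # steady 1% -> never fires
burst = [1.0] * 3 + [4.0] * 6 + [1.0]   # 6 minutes above 3% -> fires

print(alert_fires(quiet), alert_fires(burst))  # False True
```

If the incident is per-minute while the dashboard chart is averaged over hours, the two can disagree exactly as described.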
Let me know what you think. I'll also open a support ticket for this.
This is a built-in incident, and we don't know which aggregation was used to trigger the message, because each aggregation calculates the value differently.
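The aggregation matters because the same measure series can sit above or below a 3% threshold depending on how it is summarized. A minimal sketch (made-up per-minute values, not Dynatrace's actual evaluation):

```python
# The same per-minute failure-rate series (%) summarized three ways.
# One spike is enough to push the average over 3% even though most
# samples, and the latest one, are around 1%.
window = [0.8, 1.2, 0.9, 14.0, 1.1]

avg_rate = sum(window) / len(window)  # 3.6 -> an avg-based 3% rule fires
max_rate = max(window)                # 14.0 -> a max-based rule fires
last_rate = window[-1]                # 1.1 -> a last-value rule stays quiet

print(avg_rate, max_rate, last_rate)  # 3.6 14.0 1.1
```

So depending on whether the built-in incident uses average, maximum, or last value, it can legitimately fire while the chart you built (which may use a different aggregation or time frame) stays under 2%.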
I guess you could also ask support.