
This product reached the end of support date on March 31, 2021.

Reliability of data with poor-quality traffic



We have a customer whose traffic quality is poor: they have peaks of a 97% gap rate, 30% duplicated traffic, and one machine with a 72% two-way-loss rate. One of the AMDs is dropping packets as well.

We are worried about the reliability of the data. In other words: is the information DCRUM shows for performance, availability, and response times real, or is it untrustworthy because of the poor traffic quality?

We think poor traffic quality affects the quantity of the data, not its reliability. That is: with a high gap rate we will see fewer requests than we expect, but availability, response times, etc. should still be as expected. Is this true?

Thanks in advance.




Dynatrace Pro

Those measurements that completed successfully, with no accompanying errors, will be correct. The exception is FDI: reasons for slowness may be reported incorrectly, because (falsely reported) poor network quality can affect the FDI reasoning algorithms. And of course the network quality metrics themselves are excluded.

In other words, it will be an uncontrolled sample of the real transaction load. Therefore I'd say that despite the correctness of the response time measurements (and thus of the performance data), they can't be relied on. Availability will also be affected by falsely reported TCP connectivity issues.

Long story short, there's no point in discussing the reliability of such an uncontrolled sample of measurements.
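To see why an uncontrolled sample misleads even when each individual measurement is accurate, here is a toy simulation (not DCRUM internals, purely an assumed model): if packet loss is more likely during congested periods, and congested periods are also when transactions are slow, then the slow transactions are disproportionately dropped from the sample, and the average response time you observe is biased low even though every surviving measurement is correct.

```python
import random

random.seed(42)

# Assumed model: 70% of transactions are fast (~200 ms), 30% are slow
# (~500 ms) because they occur during congested periods.
true_times = [
    random.gauss(200, 30) if random.random() < 0.7 else random.gauss(500, 80)
    for _ in range(10_000)
]

def captured(t):
    # Hypothetical capture behavior: during congestion (slow transactions)
    # the probe misses more packets, so slow measurements are dropped
    # from the sample far more often than fast ones.
    drop_prob = 0.1 if t < 350 else 0.6
    return random.random() > drop_prob

sample = [t for t in true_times if captured(t)]

true_mean = sum(true_times) / len(true_times)
sample_mean = sum(sample) / len(sample)
print(f"true mean: {true_mean:.0f} ms, sampled mean: {sample_mean:.0f} ms")
```

Under these assumptions the sampled mean comes out noticeably lower than the true mean: the individual numbers are all real, but the aggregate is not representative, which is the sense in which the data can't be relied on.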