12 Aug 2025 02:44 PM
Hello, community!!
We need to create a problem every time a log value in Grail is >= a configured number. We have been able to create the graphical representation in DQL:
fetch logs, from: -24h, scanLimitGBytes:1, samplingRatio:1
| filter matchesValue(content, "*ranchdressing*") AND (offset1 >= "0.005" or offset2 >= "0.005")
But with traditional Metric Events we can only go up to a 60-minute window (after converting the DQL into a metric).
Another problem is converting the DQL above into a metric: OpenPipeline won't accept the DQL as it is. We have tried several combinations, but we end up having to take the >= off in order to get something, which is not useful.
With workflows we have the same problem, as we would first need to convert the DQL into an event.
Any ideas?
Thanks
12 Aug 2025 08:00 PM
Have you tried setting up Davis Anomaly Detection for that?
Wondering if the 24h timeframe will be a problem here.
12 Aug 2025 09:07 PM
@dannemca and @Puche, I believe the best approach here is using Site Reliability Guardian (SRG) for that. You need to set it up.
This approach works well for evaluating any data, not just logs.
12 Aug 2025 10:07 PM
Hi @Julius_Loman,
Very nice solution!!! Thanks for sharing it!!!
Best regards,
János
13 Aug 2025 01:19 PM
Indeed a Dynatrace Master!!! 👨🏫
13 Aug 2025 08:49 AM
Hello!
@Julius_Loman, thanks for your answer, but the same DQL that works in a Dashboard or Workflow doesn't work with SRG.
This is a handicap: we have to rewrite queries instead of copying and pasting.
Any ideas?
Thanks
13 Aug 2025 11:14 AM
What do you want to evaluate? The count of such logs? In that case you just need to add a summarize to your query.
18 Aug 2025 01:29 PM
Use summarize count() to provide a single value.
13 Aug 2025 01:31 PM
Yes, the count of such logs, but only for entries where either of the two attributes (the offsets) is >= 0.005, and a problem should be created every time the count reaches 10 over 24h.
We were expecting that a DQL query that works in a Dashboard or a Workflow would also work in other DQL contexts such as SRG, log metric extraction, or OpenPipeline.
Thanks @Julius_Loman
13 Aug 2025 01:55 PM
@Puche yes, it works, but SRG needs a single value, just like a metric. You can also achieve that by creating a log metric and evaluating it.
In your case you probably want:
fetch logs
| filter matchesValue(content, "*ranchdressing*") AND (offset1 >= "0.005" or offset2 >= "0.005")
| summarize count()
This will output just a single number: the count of log lines matching the conditions above. I also removed the from:, samplingRatio:, etc., as the timeframe is handled by SRG.
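One caveat worth double-checking (an assumption on my side, since the attribute types aren't shown here): if offset1 and offset2 are stored as string attributes, the comparison >= "0.005" is lexicographic, so a value like "9.5" would sort above "10.2". If that applies in your environment, converting to numbers first is safer:

fetch logs
| filter matchesValue(content, "*ranchdressing*")
| filter toDouble(offset1) >= 0.005 or toDouble(offset2) >= 0.005
| summarize count()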
13 Aug 2025 03:05 PM
Thank you very much, that now works for creating the SRG. I'll keep working on your other tips and let you know the results.
18 Aug 2025 11:55 AM
The workflow with the SRG has worked and fulfilled the needs of the request.
But some other questions have come up during this process.
Why is it necessary to create an SRG for the workflow instead of just executing a DQL query directly?
Another question: how can you use the SRG to create a pipeline in OpenPipeline, as you mention in step 3?
Thank you @Julius_Loman
18 Aug 2025 02:20 PM
@Puche, sure, it's possible to do that using a workflow only. But workflows are, in my experience, too costly for this ($260/year for a single workflow at list prices), and they don't leave much of a historical track record for evaluations. I believe SRG is the way to do it.
SRG sends bizevents with the results and evaluations. You create an OpenPipeline pipeline for those bizevents, and there you can create a Davis event, essentially triggering a problem-opening event.
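As a rough sketch, the matching condition for routing those bizevents into your OpenPipeline pipeline could look like the DQL below. The event.provider and event.type values are my assumptions about how SRG tags its validation bizevents, so verify them against an actual bizevent in Grail before relying on them:

matchesValue(event.provider, "dynatrace.site.reliability.guardian")
and matchesValue(event.type, "guardian.validation.finished")

In that pipeline, a Davis event processor can then raise the problem-opening event whenever the validation result field indicates a failed evaluation.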
19 Aug 2025 12:03 PM
Thank you for your help.
Best regards