Create a Dynatrace problem using log data every time it reaches 10 hits in 24h

Puche
Participant

Hello, community!!

We need to create a problem every time a log value in Grail is >= a configured number. We have been able to create the graphical representation in DQL:

fetch logs, from: -24h, scanLimitGBytes:1, samplingRatio:1
| filter matchesValue(content, "*ranchdressing*") AND (offset1 >= "0.005" or offset2 >= "0.005")

But with traditional Metric Events we can only go up to a 60-minute timeframe (after converting the DQL into a metric).

Another problem is converting the DQL above into a metric: OpenPipeline won't accept the DQL as it is. We have tried several combinations, but we end up having to drop the >= conditions in order to get something, which is not useful.

With workflows we have the same problem, as we would first need to convert the DQL into an event.

 

Any ideas?

 

Thanks

13 Replies

dannemca
DynaMight Guru

Have you tried to set up the Davis Anomaly Detection for that?

https://docs.dynatrace.com/docs/discover-dynatrace/platform/davis-ai/anomaly-detection/set-up-a-cust...

Wondering if the 24h timeframe will be a problem here.

Site Reliability Engineer @ Kyndryl

@dannemca and @Puche, I believe the best approach here is using Site Reliability Guardian (SRG). You need to set up:

  • a SRG which checks the logs
  • a simple workflow to trigger the SRG every 24 hours (or every hour based on your preference)
  • a pipeline in OpenPipeline for your SRG results, with a data extraction rule that raises a Davis event to open a problem when the SRG evaluation fails

This approach is effective and works well for evaluating any data, not just logs.
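
For the second step, the workflow only needs a time-based trigger; Dynatrace Workflows support cron-style schedules, so something along these lines (configured in the workflow's trigger settings, not in DQL) would fire the SRG evaluation on the cadence you prefer:

0 0 * * *    (once every 24 hours, at midnight)
0 * * * *    (once every hour)

The exact trigger UI may differ between tenant versions, so treat these as a sketch of the schedule rather than a copy-paste config.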

Certified Dynatrace Master | Alanata a.s., Slovakia, Dynatrace Master Partner

Hi @Julius_Loman,

Very nice solution!!! Thanks for sharing it!!!

Best regards,

János

Dynatrace Community RockStar 2024, Certified Dynatrace Professional

Indeed a Dynatrace Master!!! 👨‍🏫

 

Site Reliability Engineer @ Kyndryl

Puche
Participant

Hello!

@Julius_Loman, thanks for your answer, but the same DQL that works in a dashboard or workflow doesn't work with SRG.

 

[Screenshot attachment: Puche_0-1755071270300.png]

This is a handicap: we need to redo queries instead of copying and pasting.

 

Any ideas?

 

Thanks

What do you want to evaluate? The count of such logs? In that case you just need to add summarize to your query.

Certified Dynatrace Master | Alanata a.s., Slovakia, Dynatrace Master Partner

PacoPorro
Dynatrace Leader

summarize count to provide a single value

Puche
Participant

Yes, the count of such logs, but only for values that are >= 0.005 in either of two variables (offsets). And every time it reaches 10 over 24h, it should create a problem.

We were expecting that if a DQL query works in a dashboard or a workflow, it would work in other DQL consumers such as SRG, log metric extraction, or OpenPipeline.

 

Thanks @Julius_Loman 

@Puche yes, it works, but SRG needs a single value - just like a metric. You can also do that by creating a log metric and evaluating that.

In your case you probably want:

fetch logs
| filter matchesValue(content, "*ranchdressing*") AND (offset1 >= "0.005" or offset2 >= "0.005")
| summarize count()

This will output just a single number: the count of log lines matching the conditions above. I also removed the from:, sampling, etc., as these are handled by SRG.

Certified Dynatrace Master | Alanata a.s., Slovakia, Dynatrace Master Partner

Puche
Participant

Thank you very much, that works now to create the SRG. I'll keep on working on your other tips and I'll let you know the results.

Puche
Participant

The workflow with the SRG has worked and fulfilled the requirements of the request.

But some other questions have come up during this process.

Why is it necessary to create an SRG for the workflow instead of just executing a DQL query directly?

And another question: how can you use the SRG to create a pipeline in OpenPipeline, as you mention in step 3?

Thank you @Julius_Loman 

 

@Puche, sure, it's possible to do that using a workflow only. But workflows are, in my experience, too costly for that ($260/year for a single workflow at list prices), and they don't leave much of a historical track record for evaluations. I believe SRG is the way to do it.

SRG sends bizevents with the results and evaluations. You create an OpenPipeline pipeline for those bizevents, and there you can create a Davis event - essentially triggering a problem-opening event.
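
If it helps to see the raw data before wiring up the pipeline, you can inspect the SRG result bizevents in a notebook. A rough sketch (the event.provider value and the field names here are assumptions based on my tenants - check the actual payload of your guardian's events before building the matching condition):

fetch bizevents
| filter event.provider == "dynatrace.site.reliability.guardian"
| fields timestamp, event.type, guardian.id, validation.status

Whatever fields you see there are what you would use in the pipeline's matching condition and in the Davis event extraction rule.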

Certified Dynatrace Master | Alanata a.s., Slovakia, Dynatrace Master Partner

Puche
Participant

Thank you for your help.

 

Best regards
