11 Jun 2024 05:45 PM
Hello Community,
I need to create a metric using the countDistinct function and save that value over time. The DQL query is the following:
fetch logs
| FILTER matchesValue(k8s.container.name, "onboarding-service") AND matchesPhrase(content, "\"result\":\"ok\",\"step\":\"cvu\"") AND matchesPhrase(content, "triggering next step with state machine")
| summarize countDistinct(client.id), alias: CvuCreadoPorCliente
but log metric extraction only supports the matchers documented at https://docs.dynatrace.com/docs/shortlink/lma-log-processing-matcher
Any other ideas on how to do this?
Thanks
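If a stored metric turns out not to be strictly necessary, a query-time alternative is to bucket the same filter into a chart-ready series with makeTimeseries. A sketch, assuming countDistinct is available as a makeTimeseries aggregation and that client.id is a parsed field in your environment:

```
fetch logs
| filter matchesValue(k8s.container.name, "onboarding-service")
    AND matchesPhrase(content, "\"result\":\"ok\",\"step\":\"cvu\"")
    AND matchesPhrase(content, "triggering next step with state machine")
| makeTimeseries CvuCreadoPorCliente = countDistinct(client.id), interval: 1h
```

This charts the trend directly from logs on each refresh, so it is bound by log retention and query cost rather than by a metric's retention.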
15 Jul 2024 09:37 AM
Any update on this, please? Or is there any way to do summarization in log processing (an example, etc.)?
15 Jul 2024 08:48 PM
I can imagine some possible solutions (or rather workarounds)....
What is the cardinality of client.id? How many of them are active at the same moment (e.g., within 1 minute)? And what is the desired time resolution for this metric (does it have to be 1 minute, or is 1 hour good enough)?
16 Jul 2024 07:49 AM
If you are thinking about recreating timestamps based on interval and timespan, that is too convoluted to be used in dynamic dashboards.
16 Jul 2024 05:05 PM
Not thinking about it at all....
But answering your question regarding examples for summarization in log processing:
https://docs.dynatrace.com/docs/observe-and-explore/logs/lma-use-cases/lma-e2e-create-log-metric#lma...
or if you use OpenPipeline already:
https://docs.dynatrace.com/docs/platform/openpipeline/use-cases/tutorial-log-processing-pipeline
My question was aimed at: how can we use the supported summarizations (sum or count) to get data that is reduced enough in volume, yet still detailed enough to calculate countDistinct?
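To illustrate that direction, one possible workaround (a sketch with assumed names): extract a plain count metric from the matching log lines with client.id as a dimension, e.g. under a hypothetical metric key log.cvu_created. At query time, each distinct client then appears as its own series, so counting the series approximates countDistinct over the selected timeframe:

```
timeseries perClient = sum(log.cvu_created, default: 0), by: { client.id }
| summarize CvuCreadoPorCliente = count()
```

This only stays practical if client.id cardinality is low enough to be acceptable as a metric dimension, which is exactly the cardinality question above.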
07 Aug 2025 04:25 AM
Hi there!
We're working on a metric to count distinct users per page. We initially tried using UserId and Page.name as dimensions, but ran into issues with high cardinality.
Would you be open to sharing any ideas or suggestions on how we might approach this more efficiently?
16 Oct 2024 01:50 PM
I have a similar challenge. I would like to create a metric for "SynchronizationTime"; it is the sum of
(ResponseTimestamp - EventTimestamp) / 1000
My DQL running directly on the Logs looks like this:
fetch logs
| filter query.name == "IntegrationLog"
| fieldsAdd Synctime = (c_ResponseTimestamp__c - c_EventTimestamp__c) / 1000
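To turn that into a chartable series at query time, one possible completion (a sketch: the makeTimeseries step and the 1-hour interval are assumptions, as is the fields already being numeric):

```
fetch logs
| filter query.name == "IntegrationLog"
| fieldsAdd Synctime = (c_ResponseTimestamp__c - c_EventTimestamp__c) / 1000
| makeTimeseries SynchronizationTime = sum(Synctime), interval: 1h
```

Since sum is among the supported summarizations for metric extraction, a value metric on the computed field may also be an option if the query-time approach proves too costly.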