03 Feb 2025 06:57 AM
Hi Dynatrace Community,
As a newbie navigating Dynatrace, I'm on a mission to create great dashboards from Kong API Gateway logs. Things are going well so far: OpenPipeline makes it pretty smooth to generate metrics from the logs and visualize them on dashboards. However, I've hit a couple of bumps on this journey.
First, what works with processing in OpenPipeline:
Adding fields like status_count: 1 and dt.source_entity: HOST-***
Using DQL for transformations like:
fieldsAdd upstream_status_code = if(upstream_status == "-", "499", else:substring(upstream_status, from:0, to:3))
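Put together as one sketch, those working steps look roughly like this (just illustrative; dt.source_entity is left out because its value is masked above):
fieldsAdd status_count = 1
| fieldsAdd upstream_status_code = if(upstream_status == "-", "499", else: substring(upstream_status, from: 0, to: 3))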
The issues are:
1) Missing Metric Mystery:
I have a DQL statement that works perfectly when testing with "Run sample data." However, despite waiting patiently for days, the metric doesn’t show up anywhere. Here's the snippet:
2) Array Field Extraction Troubles:
I'm trying to extract the IP from the first object in the tries array. I've tried various combinations of parse try, "JSON:tryJson" and JSON_ARRAY: without success.
The current metric field extraction is status_count, with these dimensions:
I’ve attached a sample JSON for reference. If you have any insights or DQL wizardry up your sleeve, I’d love to hear them.
Thanks in advance for your help, and for sympathizing with a newbie learning the ropes! 😊
Cheers,
Erwin
03 Feb 2025 07:18 PM
Hi Erwin.
I created a sample record using the JSON you provided. By the looks of it, that is valid JSON, meaning you can add a simple | parse content, "JSON:j" at the end of the DQL query. It should end up looking something like this:
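For example (just a sketch; the filter is only illustrative, use whatever you normally filter your Kong logs on):
fetch logs
| filter matchesPhrase(content, "kong")
| parse content, "JSON:j"
| fieldsFlatten j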
Hope this helps!
04 Feb 2025 05:29 AM
Hi Keegan,
Thanks for checking and confirming the validity of the JSON. Indeed I'm parsing the JSON. With the processing instruction
fieldsAdd upstream_status_code = if(upstream_status == "-", "499", else:substring(upstream_status, from:0, to:3))
the JSON is parsed implicitly. With the following instruction I parse the JSON explicitly:
parse latencies, "JSON:latenciesJson"
| fieldsFlatten latenciesJson, prefix: "lat."
| fieldsAdd upstream_latency = if(toDouble(lat.proxy) < 1000, ceil(toDouble(lat.proxy) / 100) * 100, else:ceil(toDouble(lat.proxy) / 1000) * 1000)
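In other words, the last fieldsAdd buckets lat.proxy: values below 1000 are rounded up to the nearest 100, everything else up to the nearest 1000. A quick standalone check of that rounding with made-up values (not from my logs):
data record(proxy = 250), record(proxy = 1499)
| fieldsAdd bucket = if(toDouble(proxy) < 1000, ceil(toDouble(proxy) / 100) * 100, else: ceil(toDouble(proxy) / 1000) * 1000)
// 250 -> 300, 1499 -> 2000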
In OpenPipeline this looks good; however, the metric is afterwards not available when querying it, e.g. in a notebook, where it should be selectable:
04 Feb 2025 07:21 AM
OK, found it, the solution is easier than I thought 🤣
When not parsing the JSON explicitly it's parsed implicitly, so the following DQL is enough:
fieldsAdd upstream_latency = if(toDouble(latencies.proxy) < 1000, ceil(toDouble(latencies.proxy) / 100) * 100, else:ceil(toDouble(latencies.proxy) / 1000) * 1000)
The "Run sample data" preview in OpenPipeline shows null; however, the metric upstream_latency is available in notebooks/dashboards. My mission to create great dashboards from Kong API Gateway logs continues...
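For completeness, querying the extracted metric in a notebook looks roughly like this (the metric key log.upstream_latency and the dt.source_entity dimension are just placeholders for whatever is configured in the metric extraction):
timeseries latency = avg(log.upstream_latency), by: { dt.source_entity }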
06 Feb 2025 08:41 AM
Final addition to this question: the solution above answers the first question. The second question was how to get metrics from the first object in the tries array.
The answer to this is:
parse arrayFirst(tries), "JSON:my_try"
| fieldsFlatten my_try, prefix: "try."
| fieldsAdd upstream_system = concat(try.ip, ":", toString(try.port))
Hope this answer helps future similar questions 😀