
Help Needed with Extracting Metrics from Kong API Gateway Logs

ErwinB
Visitor

Hi Dynatrace Community,

As a newbie navigating Dynatrace, I'm on a mission to create great dashboards from Kong API Gateway logs. Things are going well so far: OpenPipeline makes it pretty smooth to generate metrics from the logs and visualize them on dashboards. However, I've hit a couple of bumps on this journey.

First, what already works with processing in OpenPipeline:
  • Adding fields like status_count: 1 and dt.source_entity: HOST-*** (a DQL sketch of this step follows below)
  • Using DQL for transformations like:
fieldsAdd upstream_status_code = if(upstream_status == "-", "499", else:substring(upstream_status, from:0, to:3))
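
For the first bullet, a DQL equivalent of that add-fields step could look like this (just a sketch; in OpenPipeline the same fields can also be set with a dedicated processor, and HOST-*** stands for the redacted host entity id):
fieldsAdd status_count = 1, dt.source_entity = "HOST-***"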

The issues are:
1) Missing Metric Mystery:
I have a DQL statement that works perfectly when testing with "Run sample data." However, despite waiting patiently for days, the metric doesn’t show up anywhere. Here's the snippet:

parse latencies, "JSON:latenciesJson"
| fieldsFlatten latenciesJson, prefix: "lat."
| fieldsAdd upstream_latency = if(toDouble(lat.proxy) < 1000, ceil(toDouble(lat.proxy) / 100) * 100, else:ceil(toDouble(lat.proxy) / 1000) * 1000)
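
To illustrate the intended bucketing with a couple of hypothetical values:
// lat.proxy = 250  -> ceil(250 / 100) * 100 = 300      (below 1000: rounded up to the next 100)
// lat.proxy = 2500 -> ceil(2500 / 1000) * 1000 = 3000  (1000 and above: rounded up to the next 1000)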

2) Array Field Extraction Troubles:
I'm trying to extract the IP from the first object in the tries array. I've tried various combinations of parse try, "JSON:tryJson" and JSON_ARRAY: without success.

The current metric extraction uses the field status_count, with these dimensions:

  • route.name
  • `request.headers.x-field`
  • kong_env
  • dt.source_entity
  • upstream_status_code
  • upstream_latency
  • service.name
  • consumer.username
  • service.host

I’ve attached a sample JSON for reference. If you have any insights or DQL wizardry up your sleeve, I’d love to hear them.

Thanks in advance for your help, and for sympathizing with a newbie learning the ropes! 😊

Cheers,
Erwin

4 REPLIES

KeeganNelson
Dynatrace Participant

Hi Erwin. 

I created a sample record using the JSON you provided. By the looks of it, that is valid JSON, which means you can add a simple | parse content, "JSON:j" at the end of the DQL query. This should end up looking something like this.

[Screenshots: the sample record parsed with the added j field]
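
In query form that would be something along these lines (a sketch only; the fetch and the final fields line are illustrative, not the exact query from the screenshots):
fetch logs
// narrow this down to the Kong access log records, e.g. with a filter on log.source
| parse content, "JSON:j"
| fields content, j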


Hope this helps!

Hi Keegan,

Thanks for checking and confirming that the JSON is valid. I am indeed parsing the JSON: with the processing instruction
fieldsAdd upstream_status_code = if(upstream_status == "-", "499", else:substring(upstream_status, from:0, to:3))
the JSON is parsed implicitly, and with the following instructions I parse it explicitly:
parse latencies, "JSON:latenciesJson"
| fieldsFlatten latenciesJson, prefix: "lat."
| fieldsAdd upstream_latency = if(toDouble(lat.proxy) < 1000, ceil(toDouble(lat.proxy) / 100) * 100, else:ceil(toDouble(lat.proxy) / 1000) * 1000)

[Screenshot: OpenPipeline processing preview]

And in OpenPipeline this looks good; however, the metric is afterwards not available when querying it. For example, in a notebook it should be available to select:

[Screenshot: the metric missing from the notebook metric selector]

 

upstream_latency is also added to the dimensions in the Metrics Extraction tab of OpenPipeline, so imho it should work, but it doesn't 😕
 
Or should I first parse the content as you showed, and then the "Complex record" latencies? How do you get the field proxy out of content.latencies?
 
Again thanks for your help!
Have a nice day,
Erwin

OK, found it, the solution is easier than I thought 🤣

When the JSON is not parsed explicitly, it is parsed implicitly, so the following DQL works:
fieldsAdd upstream_latency = if(toDouble(latencies.proxy) < 1000, ceil(toDouble(latencies.proxy) / 100) * 100, else:ceil(toDouble(latencies.proxy) / 1000) * 1000)

[Screenshot: OpenPipeline "Run sample data" preview]

The "Run sample data" preview in OpenPipeline shows null; however, the metric upstream_latency is available in notebooks and dashboards. My mission to create great dashboards from Kong API Gateway logs continues...
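
For anyone finding this later: the extracted metric can then be charted in a notebook with a timeseries query along these lines (a sketch; the metric key log.status_count is only an assumption and depends on how the metric extraction is configured):
timeseries requests = sum(log.status_count), by: { upstream_latency, upstream_status_code }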

A final addition to this question: the solution above answers the first question. The second question was how to get metrics from the first object in the tries array.
The answer to that is:
parse arrayFirst(tries), "JSON:my_try"
| fieldsFlatten my_try, prefix: "try."
| fieldsAdd upstream_system = concat( try.ip, ":", toString(try.port) )
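
For illustration, assuming the first element of tries looks like this (made-up values):
//   { "ip": "10.0.0.1", "port": 8080 }
// the parse and fieldsFlatten above produce try.ip = "10.0.0.1" and try.port = 8080,
// and the concat yields upstream_system = "10.0.0.1:8080"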

 

Hope this answer helps others with similar questions in the future 😀
