10 Nov 2025 01:50 PM - last edited on 12 Nov 2025 07:40 AM by GosiaMurawska
Good morning fellow Dynatracers? Dynatrac-ians? (Whichever you prefer)
First off, we have an on-prem deployment of Dynatrace.
We have several third-party vendors that make performance and monitoring data available through
their own custom API interfaces. I have read up on, and am familiar with, the part of Dynatrace's API that
allows you to post data into the Dynatrace DB/on-prem cluster, to then be used with the Data Explorer
tool OR pulled back out of Dynatrace through its API.
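For reference, that "post data in" step is a single HTTP POST against the Metrics v2 ingest endpoint. A minimal sketch in stdlib Python; the tenant URL, token, metric key, and dimension names below are all placeholders, not real values:

```python
# Sketch of posting a data point to the Dynatrace Metrics v2 ingest API.
# TENANT_URL and API_TOKEN are placeholders; the token needs the
# metrics.ingest scope. The metric key and dimensions are made-up examples.
import urllib.request

TENANT_URL = "https://your-cluster.example.com/e/your-environment-id"
API_TOKEN = "dt0c01.REPLACE_ME"

def to_metric_line(key, dimensions, value):
    """Build one line of the Dynatrace metrics line protocol:
    metric.key,dim1=val1,dim2=val2 <value>"""
    dims = ",".join(f"{k}={v}" for k, v in dimensions.items())
    return f"{key},{dims} {value}" if dims else f"{key} {value}"

def push_metrics(lines):
    """POST a line-protocol payload to /api/v2/metrics/ingest
    (defined here for illustration, not called in this sketch)."""
    req = urllib.request.Request(
        f"{TENANT_URL}/api/v2/metrics/ingest",
        data="\n".join(lines).encode("utf-8"),
        headers={
            "Authorization": f"Api-Token {API_TOKEN}",
            "Content-Type": "text/plain; charset=utf-8",
        },
        method="POST",
    )
    return urllib.request.urlopen(req)

line = to_metric_line("vendor.response.time", {"host": "app01"}, 42.5)
print(line)  # vendor.response.time,host=app01 42.5
```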
What I am looking for is a generic tool that can go out to an API and retrieve the data; I know curl does exactly this.
However, the challenge we have is processing the returned response/information afterwards.
Let's say I run a curl command and get my JSON data back. At that point, the data
is sitting at the command line (or in a file, if I redirect the output there).
Is there a tool that specializes in taking this JSON data and ETL'ing the results?
If there is a tool that does both the curl call AND the ETL of the responses, that would be ideal
(we would want to run this at regular intervals via cron, etc.).
Do such tools exist? I'm trying to find a low-code solution for taking
data from multiple API sources and processing it down to a file that I can then post into Dynatrace.
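For what it's worth, if a short script counts as low-code, the whole fetch-transform-write loop fits in a few lines of stdlib Python. A sketch under the assumption that the vendor API returns a JSON array of records; the URL and the "name"/"value" field names are invented placeholders:

```python
# Sketch: fetch JSON from a vendor API, reshape it, and write a file
# that can later be posted into Dynatrace. The URL and the "name"/"value"
# field names are invented stand-ins for whatever the vendor returns.
import json
import urllib.request

def fetch(url):
    """Retrieve and parse a JSON response (the 'curl' step)."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def transform(records):
    """The 'ETL' step: reduce each record to one flat output line."""
    return [f"{r['name']} {r['value']}" for r in records]

# Inline stand-in for a real API response, so the transform runs offline:
sample = [{"name": "cpu.busy", "value": 12.3}, {"name": "mem.used", "value": 4096}]

with open("vendor_metrics.txt", "w") as out:
    out.write("\n".join(transform(sample)))

# In production you would chain the two steps, e.g.
#   transform(fetch("https://vendor.example.com/api/metrics"))
# and run the script from cron at whatever interval you need.
```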
Linux would be ideal, but Windows is also an option. Maybe even Windows as the preference if a GUI
is available. 🙂
This is a gap in my knowledge, so I am looking for suggestions to help with
these upcoming projects.
As always, thanks to the community for being so supportive and engaging!
12 Nov 2025 08:32 AM
Hi,
You can try Logstash with plugins: http_poller (input) → json / mutate / ruby (filter) → http (output to Dynatrace).
Or n8n: Cron → HTTP Request → Set / Function / IF → HTTP Request (Dynatrace).
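To illustrate the Logstash route, a skeleton pipeline might look like the following. Both URLs, both tokens, the five-minute schedule, and the `metric_line` field name are placeholders to adapt:

```
input {
  http_poller {
    urls => {
      vendor_api => {
        url => "https://vendor.example.com/api/metrics"   # placeholder vendor endpoint
        headers => { "Authorization" => "Bearer VENDOR_TOKEN" }
      }
    }
    schedule => { "every" => "5m" }
    codec => "json"
  }
}

filter {
  # reshape the vendor JSON here with json / mutate / ruby as needed,
  # e.g. build a line-protocol string into a field called metric_line
}

output {
  http {
    url => "https://your-cluster/e/ENV_ID/api/v2/metrics/ingest"
    http_method => "post"
    headers => {
      "Authorization" => "Api-Token DT_TOKEN"
      "Content-Type" => "text/plain; charset=utf-8"
    }
    format => "message"
    message => "%{metric_line}"   # field produced in the filter stage
  }
}
```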