
Fetch data from external API with new Dynatrace feature

I'm not familiar with the new Dynatrace features, and I'm trying to better understand the correct steps to set up a configuration.
I'd like to showcase a business flow within Dynatrace, and most of the critical data can only be retrieved via an external API (an agent-based approach is not possible).

I'd like to understand what the best path to follow is with the newest Dynatrace SaaS capabilities.

It's my understanding that I should start with a Notebook, which will call the external API and retrieve JSON data (I have tried that and it works).
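As a sketch of what that Notebook code cell might do — the URL and field names here are hypothetical, purely for illustration:

```javascript
// Hypothetical parse step for the JSON returned by the external API:
// keep only the fields worth storing later. All names are illustrative.
function extractOrders(jsonText) {
  const payload = JSON.parse(jsonText);
  return (payload.orders || []).map((o) => ({
    orderId: o.id,
    platform: o.source,
    amount: Number(o.amount),
  }));
}

// In the Notebook you would fetch first, e.g. (hypothetical endpoint):
// const res = await fetch("https://example.com/api/sales-orders");
// const orders = extractOrders(await res.text());
```

Separating the parse step from the `fetch` call keeps the transformation easy to test and to reuse later in a Workflow.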

But then I'll have to store those JSON payloads somewhere and fetch them at a scheduled frequency (should this still be configured in the Notebook?).

After that, I'll have to manipulate the data inside those JSON payloads with DQL (maybe), and finally I can display it?
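For the DQL step, once the JSON records are in Grail (for example ingested as logs), a query along these lines could reshape them for a dashboard tile — the source attribute and field names here are hypothetical:

```
fetch logs
| filter log.source == "sales-orders-poller"
| parse content, "JSON:order"
| fields timestamp, order.id, order.platform, order.amount
| summarize total = sum(order.amount), by: {order.platform}
```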

Is there a general picture that describes the correct steps?

Many thanks


DynaMight Champion

Hi @y_buccellato 

A Notebook is one-time polling. From what I know, you will need to use a Workflow to schedule the polling and store the results in Grail.

Here is some insight about your use case:

If you want to poll data periodically from the Dynatrace API via the classic environment SDK and ingest it into Grail, then currently the best way is to go via Workflows, because in a Workflow you can set the execution interval.
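As a rough sketch of that pattern — the endpoint, field names, and the ingest step are assumptions, not the exact Dynatrace SDK API — a Workflow's JavaScript task could fetch and reshape the records like this:

```javascript
// Shape external records as log-ingest events so they land in Grail
// and can be queried with DQL later. All names are illustrative.
function toLogRecords(orders, nowIso) {
  return orders.map((o) => ({
    timestamp: nowIso,
    content: JSON.stringify(o),
    "log.source": "sales-orders-poller", // attribute to filter on in DQL
  }));
}

// Hypothetical task body, run on the Workflow's schedule:
// export default async function () {
//   const res = await fetch("https://example.com/api/sales-orders");
//   const { orders } = await res.json();
//   const records = toLogRecords(orders, new Date().toISOString());
//   // send `records` to the log ingest endpoint, e.g. POST /api/v2/logs/ingest
// }
```

The Workflow's trigger (fixed interval or cron) then takes the place of any scheduling you would otherwise have to build yourself.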

Don't hesitate to ask your questions in the Developers section :dynatrace:

Observability Engineer at Phenisys - Dynatrace Professional

I'm moving in the direction of Workflows.

I will admit it's a completely new experience, and all these new features are like a puzzle to me. Add the fact that you need to shift your attention to additional skills involving JavaScript and new features, and the puzzle becomes even harder 😄

Dynatrace Guru

We're still mainly using the Extensions framework for getting data from external sources, as it can be bundled with dashboards, alerts, screens, log patterns, and metric metadata. It can also run on your existing ActiveGates, has built-in configuration flows, and supports polling any technology.

That's not to say there isn't an overlap in use cases with Workflows, but the Extensions framework is usually the go-to for the solutions we build for customers.


In my case, I'm supposed to track differences in sales orders between platforms (D365, SAP CPI, and SAP S/4HANA ERP).

The first two are completely SaaS, so an ActiveGate with an extension running doesn't make much sense to me, mainly for two reasons:

  1. I would have to take the time to understand where to place an ActiveGate able to get the data from those platforms.
  2. I would have to take the time to develop a whole extension if none is available for my specific use case (no out-of-the-box extension exists for those platforms).

For the third platform (which is on-premises), the Database query extension on top of an ActiveGate could be a solution to capture valuable data to cross-reference.


Thank you

For endpoints that are easily accessible from our SaaS cluster and that output a format that is easy to parse with JavaScript, such as JSON, the line between the two approaches becomes very fuzzy. You can't go wrong either way.


Thank you Mike, I appreciate your constructive feedback.

Wish you the best
