Retrieving DQL Data while connecting to PowerBI


We are trying to connect Dynatrace to PowerBI. We want to run a DQL query and load the results into PowerBI, using the DQL queries API:

However, this endpoint uses a POST method, not GET. A colleague and I can authenticate with tokens for the classic platform APIs and use PowerBI/Excel to integrate with the methods mentioned here and in the Dynatrace and PowerBI (YT Video).

I have seen something similar, but it is not the same; that one is more of an RFE (Grail to PowerBI Connector).

How can we connect from PowerBI/Excel to Dynatrace to retrieve the DQL response using the DQL API (POST method)?


DynaMight Guru


Do you know if your token has the "Read logs" permission?

I would check Log Monitoring API.
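If you want to try the classic Log Monitoring API route from outside PowerBI first, a minimal Python sketch of the call might look like the following. This is an assumption-laden illustration: the tenant host, token value, and query string are placeholders, and the endpoint path and `Api-Token` header format are taken from the Dynatrace API v2 documentation.

```python
# Sketch: querying the classic Log Monitoring API v2 with an API token.
# Tenant, token, and query values are placeholders, not real credentials.
import json
import urllib.parse
import urllib.request

def build_request(tenant: str, token: str, query: str) -> urllib.request.Request:
    """Build a GET request for /api/v2/logs/search using Api-Token auth."""
    url = (f"https://{tenant}/api/v2/logs/search"
           f"?query={urllib.parse.quote(query)}")
    # Classic API v2 endpoints authenticate with an "Api-Token <token>" header
    return urllib.request.Request(url, headers={"Authorization": f"Api-Token {token}"})

def fetch_logs(tenant: str, token: str, query: str) -> dict:
    """Execute the request and decode the JSON response body."""
    with urllib.request.urlopen(build_request(tenant, token, query)) as resp:
        return json.loads(resp.read())
```

If this call also returns "Token Authentication failed", the token itself (or its scopes) is the problem, independent of PowerBI.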

Best regards

❤️ Emacs ❤️ Vim ❤️ Bash ❤️ Perl

I have the read logs permission.
I have a client token for OAuth and a token ID for API v2. When I use the token ID method to get logs via the GET method, I get the error "Token Authentication failed".

That is where I got stuck. Since I am new to using APIs, I don't know how to generate a bearer token.
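For reference, generating a bearer token is a standard OAuth client_credentials exchange against the Dynatrace SSO token endpoint (the one documented for accessing platform APIs from outside). A hedged Python sketch, with placeholder client id, secret, and scope:

```python
# Sketch: OAuth client_credentials flow to obtain a bearer token for
# Dynatrace platform APIs. Client id/secret/scope are placeholders.
import json
import urllib.parse
import urllib.request

TOKEN_URL = "https://sso.dynatrace.com/sso/oauth2/token"

def build_token_body(client_id: str, client_secret: str, scope: str) -> bytes:
    """Form-encode the client_credentials grant parameters."""
    return urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": scope,
    }).encode()

def get_bearer_token(client_id: str, client_secret: str, scope: str) -> str:
    """POST the grant request and return the access_token from the response."""
    req = urllib.request.Request(
        TOKEN_URL,
        data=build_token_body(client_id, client_secret, scope),
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["access_token"]
```

The returned `access_token` is what goes into the `Authorization: Bearer <token>` header of subsequent API calls.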


For me, the easier way is from here:


You can create your token there and play with Dynatrace API Explorer.

Best regards


DynaMight Guru

Remember also to select the appropriate scope of such a token:

Have a nice day!

Dynatrace Enthusiast


Yes, it is possible, and here is a rudimentary piece of code to help.
Note that you would need to create an OAuth client on your tenant to be able to use this, as described here: Access platform APIs from outside | Dynatrace Developer

Then, in your Power Query connection in PowerBI, use the following (this example runs a DQL query that fetches 1000 log records):

let
    // Step 1: obtain an OAuth bearer token via the client_credentials grant
    body = "grant_type=client_credentials&client_id=<your oauth client id here>&client_secret=<your oauth secret here>&scope=<your scope required here (e.g. storage:logs:read+storage:buckets:read)>",
    Data = Web.Contents("https://sso.dynatrace.com/sso/oauth2/token",
        [Headers = [#"Content-Type" = "application/x-www-form-urlencoded"], Content = Text.ToBinary(body)]),
    DataRecord = Json.Document(Data),
    token = DataRecord[access_token],
    // Step 2: POST the DQL query to the Grail query endpoint
    postBody = "<your dql query here, e.g. {""query"": ""fetch logs | limit 1000"",""timezone"": ""UTC"",""locale"": ""en_US"",""requestTimeoutMilliseconds"": 60000,""maxResultRecords"": 1000}>",
    Source = Json.Document(Web.Contents("https://<your DT tenant here>/platform/storage/query/v1/query:execute?enrich=metric-metadata",
        [Headers = [#"Authorization" = "Bearer " & token, #"Content-Type" = "application/json"], Content = Text.ToBinary(postBody)])),
    // Step 3: flatten the result records into a table
    result = Source[result],
    records = result[records],
    #"Converted to Table" = Table.FromList(records, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    #"Expanded Column1" = Table.ExpandRecordColumn(#"Converted to Table", "Column1",
        {"timestamp", "content", "dt.entity.container_group", "dt.entity.container_group_instance", "dt.entity.process_group", "dt.entity.process_group_instance", "dt.source_entity", "event.type", "log.source", "loglevel", "status", "log.iostream"},
        {"Column1.timestamp", "Column1.content", "Column1.dt.entity.container_group", "Column1.dt.entity.container_group_instance", "Column1.dt.entity.process_group", "Column1.dt.entity.process_group_instance", "Column1.dt.source_entity", "Column1.event.type", "Column1.log.source", "Column1.loglevel", "Column1.status", "Column1.log.iostream"})
in
    #"Expanded Column1"
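Before wiring this into Power Query, it can help to verify the OAuth client and the query outside PowerBI. A minimal Python sketch of the same POST to query:execute, under the assumption that the query completes synchronously and returns records directly (tenant host and token are placeholders):

```python
# Sketch: executing a DQL query against the Grail query endpoint, mirroring
# the Power Query M code. Tenant and bearer token are placeholders; assumes
# the query completes synchronously and returns result.records directly.
import json
import urllib.request

def build_payload(query: str, max_records: int = 1000) -> dict:
    """Build the JSON request body for query:execute."""
    return {
        "query": query,
        "timezone": "UTC",
        "locale": "en_US",
        "requestTimeoutMilliseconds": 60000,
        "maxResultRecords": max_records,
    }

def execute_dql(tenant: str, token: str, query: str) -> list:
    """POST the DQL query and return the list of result records."""
    url = f"https://{tenant}/platform/storage/query/v1/query:execute?enrich=metric-metadata"
    req = urllib.request.Request(
        url,
        data=json.dumps(build_payload(query)).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["result"]["records"]
```

If this returns records, any remaining issues are in the Power Query side rather than in the token or the query.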
