09 Oct 2024 03:00 PM
While working with Workflows, I noticed that the result returned from a task1 DQL query in a workflow is in JSON format, and when I try passing it on to another DQL query, it gives an error.
So I would like to know how to pass the result of task1 to the next task in the workflow, and also how to parse it.
09 Oct 2024 03:24 PM
I'm not exactly sure what you're asking about, but you might need to use the expression filters to_json or from_json, like so: {{result("task1")|from_json}}
09 Oct 2024 03:30 PM
Thanks for your reply.
I am passing the result from Task1 to Task2 in a workflow. Right now, I am getting the result of Task1 in JSON format and am trying to parse it into a human-readable format.
09 Oct 2024 03:32 PM
You can use the parse command in DQL to parse the JSON into a record with fields.
Extraction and parsing commands - Dynatrace Docs
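A minimal sketch of how that can look (the field name "parsed" is illustrative, and this assumes the content field actually holds JSON):

```
fetch logs
| parse content, "JSON:parsed"   // parse the JSON in content into a record field named parsed
| fields parsed                  // the extracted fields are now accessible on that record
```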
09 Oct 2024 03:40 PM
This is the error I get when running it:
Error evaluating 'query' in input: the JSON object must be str, bytes or bytearray, not dict. Use the expression preview in edit mode to review the input.
09 Oct 2024 03:58 PM
That's where you'll need to use the "to_json" expression filter like I mentioned earlier. Something like this: {{result("task1")|to_json}}
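For example, one way to feed the serialized result of the first task into a DQL data command in the next task might be (a sketch based on this thread's task names, not a verified pattern):

```
data json:"""{{ result("task1") | to_json }}"""
```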
10 Oct 2024 12:21 PM - edited 10 Oct 2024 01:49 PM
I tried to parse it into a record with fields, but it is not fetching the result.
data json:"""[{{result("task1")|to_json}}]"""
| parse content, "JSON:json"
| fields "Namespace"
I am trying to print the output of Task1 in a tabular format.
10 Oct 2024 04:28 PM
When viewing your workflow execution, you can see the result of your "task1". You may need to drill into a field before you can actually access the data returned. For example, you may need to use {{ result("task1").records }} to access the "records" field of the result.
It's hard to determine exactly what you're trying to do with limited information, so sorry if this doesn't help you.
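As a concrete sketch, assuming the result exposes a "records" array (the field names here are illustrative), you could reference a single value like this:

```
{{ result("task1")["records"][0]["dt.entity.host"] }}
```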
14 Oct 2024 02:55 PM
@nicemukesh ,
Let me try to explain this with an example.
The first task is a DQL query, and its result will naturally be a JSON value.
What I assume is that, from the JSON you are getting, you want to use a particular field in another DQL query, which requires you to loop through the result of the first task.
Here I use two DQL queries; the first one gets the hosts with average CPU usage greater than 50%:
timeseries by:{dt.entity.host}, cpu_usage = avg(dt.host.cpu.usage)
| filter arrayAvg(cpu_usage) > 50
| lookup [fetch dt.entity.host], sourceField:dt.entity.host, lookupField:id, fields:{entity.name}
The second DQL query gets the host details by host ID:
fetch dt.entity.host
| filter id == ""
Now, I do not want to pass the ID as a static or hardcoded value; rather, I would like to pass the host IDs of the hosts that came up in the first DQL result.
A sample result of the first DQL query would look something like this:
[
  {
    "status": "NONE",
    "content": "2023-01-01T01:01:01Z localhost proxy[12529]: 10.0.0.10:38440 http-in~ individual_servers/abc 217/0/0/1/218 HTTP_STATUS 200 284 - - --NN 5749/5745/0/1/0 0/0 {|||some.domain.com:443} {|} \"POST /path?param=foo HTTP/1.1\" ECDHE-RSA-AES256-GCM-SHA384 TLSv1.2",
    "content2": "2023-01-01T01:01:01Z localhost proxy[12529]: 10.0.0.10:38440 http-in~ individual_servers/abc 217/0/0/1/218 HTTP_STATUS 200 284 - - --NN 5749/5745/0/1/0 0/0 {|||some.domain.com:443} {|} \"POST /path?param=foo HTTP/1.1\" ECDHE-RSA-AES256-GCM-SHA384 TLSv1.2",
    "loglevel": "NONE",
    "host.name": "HOST-IG-12-34567",
    "timestamp": "2023-01-01T01:01:01.000000001Z",
    "event.type": "LOG",
    "log.source": "/var/spool/some.log",
    "dt.entity.host": "HOST-1234567890",
    "dt.source_entity": "PROCESS_GROUP_INSTANCE-ABCDEF1234567",
    "dt.security_context": "application-a",
    "dt.entity.process_group": "PROCESS_GROUP-ABCDEF1234567",
    "dt.entity.process_group_instance": "PROCESS_GROUP_INSTANCE-ABCDEF1234567"
  },
  {
    "status": "NONE",
    "content": "2023-01-01T01:01:01Z localhost proxy[12529]: 10.0.0.10:38440 http-in~ individual_servers/abc 217/0/0/1/218 HTTP_STATUS 200 284 - - --NN 5749/5745/0/1/0 0/0 {|||some.domain.com:443} {|} \"POST /path?param=foo HTTP/1.1\" ECDHE-RSA-AES256-GCM-SHA384 TLSv1.2",
    "content2": "2023-01-01T01:01:01Z localhost proxy[12529]: 10.0.0.10:38440 http-in~ individual_servers/abc 217/0/0/1/218 HTTP_STATUS 200 284 - - --NN 5749/5745/0/1/0 0/0 {|||some.domain.com:443} {|} \"POST /path?param=foo HTTP/1.1\" ECDHE-RSA-AES256-GCM-SHA384 TLSv1.2",
    "loglevel": "NONE",
    "host.name": "HOST-IG-12-34567",
    "timestamp": "2023-01-01T01:01:01.000000002Z",
    "event.type": "LOG",
    "log.source": "/var/spool/some.log",
    "dt.entity.host": "HOST-1234567890",
    "dt.source_entity": "PROCESS_GROUP_INSTANCE-ABCDEF1234567",
    "dt.security_context": "application-a",
    "dt.entity.process_group": "PROCESS_GROUP-ABCDEF1234567",
    "dt.entity.process_group_instance": "PROCESS_GROUP_INSTANCE-ABCDEF1234567"
  }
]
From this I want to extract the field dt.entity.host and then use it in the second DQL query.
Now let's look at the workflow part of it.
This is what our workflow looks like; let me break it down step by step.
The first step is hosts_cpu, which is pretty standard: just use the first DQL query mentioned above.
Now coming to the second step: this should be a loop task, so go to Options first and enable the loop task. We want this task to loop over all the records in the output of step 1 (hosts_cpu), so we configure it like this.
In the List field, you have to give something like:
{{ result("hosts_cpu")["records"] }}
Don't worry about how I got this. Just under the List field you'll see an option to expand the list; click it, start typing "records" in the box, and you'll get suggestions.
Now the loop is in place, and we have to use that loop variable to extract dt.entity.host from it.
So the actual DQL in the second task, expanded, looks like this:
fetch dt.entity.host
| filter id == "{{ _.item['dt.entity.host'] | replace('"', "") }}"
Also, don't worry about the replace part: I had to use it because the value came through with a stray trailing quote, like HOST-ABC12345", and I had to remove that quote for the DQL to work properly.
Hope this helps.
Regards,
@Maheedhar_T
15 Oct 2024 08:20 AM
Thanks for your detailed explanation; it is a great help.
I will check it on my end and keep you posted.
Mukesh
15 Oct 2024 01:36 PM
The main blocker in my DQL is how to pass JSON outputs.