26 May 2025 10:16 AM - last edited on 30 May 2025 08:58 AM by MaciejNeumann
I am trying to build an integration from Zscaler to Dynatrace. Zscaler sends a specific payload which needs to be transformed into a Dynatrace-specific event payload before it can be ingested into Dynatrace. Is there a webhook proxy in Dynatrace to do this transformation, even if it is in the form of an extension? I appreciate your support on this urgent requirement.
26 May 2025 03:41 PM
Thank you for the feedback Antonio, but your feedback is very generic. Zscaler sends a webhook call and its payload is different from DT's; a transformation is required. Any other comments on this?
26 May 2025 09:53 PM
Hi,
It would depend on which information you need to ingest, and whether you are using Grail or Managed... I recommend checking OpenPipeline, where you can transform data before it is ingested into the Dynatrace SaaS server.
Best regards
28 May 2025 12:30 PM
None of the ingest options worked for this kind of interface; we always get a webhook configuration error. Is there a webhook proxy in DT? If not, is OpenPipeline the only solution for this?
28 May 2025 05:48 PM
Bear in mind we are only sending events at the moment from Zscaler to Dynatrace.
28 May 2025 09:44 PM
Hi @MarwanC ,
If you have syslog enabled on Zscaler, you might consider integrating those syslogs into Dynatrace. https://docs.dynatrace.com/docs/analyze-explore-automate/logs/lma-log-ingestion/lma-log-investion-sy...
For event-only ingestion, maybe ingesting the event via the API should help? [ You might need to adjust your payload according to the DT format. ]
https://docs.dynatrace.com/docs/discover-dynatrace/references/dynatrace-api/environment-api/events-v...
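A rough Python sketch of such a call (untested; the environment URL, token, and the Zscaler field names below are placeholders/assumptions you would adjust):

import requests

DT_URL = "https://YOUR_ENV.live.dynatrace.com"   # placeholder
API_TOKEN = "dt0c01.XXXX"                        # needs the events.ingest scope

# Hypothetical Zscaler fields; adjust to whatever Zscaler actually sends.
zscaler_alert = {
    "severity": "HIGH",
    "category": "Malware",
    "description": "Suspicious activity detected",
}

# Map the Zscaler payload onto a Dynatrace Events API v2 payload.
event = {
    "eventType": "CUSTOM_ALERT",
    "title": f"Zscaler {zscaler_alert['category']} alert",
    "properties": {k: str(v) for k, v in zscaler_alert.items()},
}

r = requests.post(
    f"{DT_URL}/api/v2/events/ingest",
    json=event,
    headers={"Authorization": f"Api-Token {API_TOKEN}"},
    timeout=10,
)
print(r.status_code, r.text)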
BR,
28 May 2025 10:43 PM
We have had a similar challenge in our environment, where there was a sending entity whose payload could not be made to fit within the constraints of Dynatrace's available API endpoints (for logs or events). After many conversations with our Dynatrace account team, we had to admit defeat. Our eventual solution was to implement an intermediate (and custom) OpenTelemetry collector to receive the incoming data using a compatible receiver and then push them into Dynatrace using the OTLP Log exporter.
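For reference, a minimal collector configuration along those lines might look like the following (a sketch only, not our exact setup; the webhookevent receiver from opentelemetry-collector-contrib and the endpoint/path values are assumptions to adapt):

receivers:
  webhookevent:              # generic webhook receiver from collector-contrib
    endpoint: 0.0.0.0:8088
    path: /zscaler

exporters:
  otlphttp:
    endpoint: https://YOUR_ENV.live.dynatrace.com/api/v2/otlp
    headers:
      Authorization: "Api-Token ${env:DT_API_TOKEN}"

service:
  pipelines:
    logs:
      receivers: [webhookevent]
      exporters: [otlphttp]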
30 May 2025 02:29 PM
Thanks Marco, I fear I am headed for the same fate in the end: "OpenTelemetry collector to receive the incoming data using a compatible receiver and then push them into Dynatrace using the OTLP Log exporter". Can you clarify the last part, the OTLP log exporter? Is that a Dynatrace interface; does it use the Dynatrace API? Thanks kindly for your reply.
29 May 2025 09:14 PM
Additionally, two options not yet mentioned (if the source system can only emit data and call a webhook):
1. Create an extension that receives and transforms the data.
2. Write your own webhook as a custom Dynatrace app function.
Anyway, be sure to double-check and explore the standard options already mentioned. If it's log / event data, I'd use the standard Log ingest API and OpenPipeline to process it.
30 May 2025 02:03 PM
Thanks Julius for the summary of the options available.

Option one is to create a complete extension for this simple event mapping from Zscaler to Dynatrace. What is the simplest extension documentation/code that you recommend to achieve this? Is this a long shot for a simple interface?

Option two seems interesting: an app function would accept the event payload from Zscaler in its format and then transform it into a Dynatrace-format event payload. The documentation refers to doing authentication ourselves if we want to expose a function to the outside; what is meant by this? Is this app function written in Python, in the same concept as a Lambda function?

Option three seems the most viable, but I am currently struggling with the OpenPipeline concept, both at the entrance and where it ends up in Dynatrace in the area of events. If I understand correctly, the Log ingest API is called, then we design a transformer/processor (that part is clear) to map the event coming in with a different payload to the Dynatrace payload. The last part is confusing: it ends up in Grail, and then what? Where do we see it in Dynatrace; is it only visible and extracted in a Notebook using DQL? If you can kindly clarify those three options, even at a high level and without hands-on detail, it will be much appreciated.

The webhook interface is a standard interface in the industry and ought to be supported out of the box in Dynatrace, with the mapping as the only required input. I do not wish to re-invent the wheel here and struggle with a solution that is difficult to maintain. I appreciate any feedback on this.
30 May 2025 10:23 PM
Option 1 (Extension) - I don't have anything prepared at the moment, but I'd choose this path if this has to be a reusable thing and has to work with Dynatrace Managed (on-premises). Especially if you need to combine it with other data.
Option 2 - I'd choose this if you really need to write your own webhook and it won't be called frequently. Most likely, you don't need to choose this path at all.
Option 3 - Ingest using the API (preferred). You call the log ingest API and construct the payload and headers so that the event gets ingested. Then with OpenPipeline, you can process the data (transform it, create a metric, set the storage bucket, etc.) - see the processor sketch below this list. Since you sent it as a log, it can be seen in the Logs app, in Notebooks/Dashboards, or in any other app querying the logs in Grail, such as Security Investigator.
If you are on Dynatrace Managed, you can still use this option; it just does not have OpenPipeline, but you have other processing options.
Option 4 (bizevents, not yet mentioned, noting it for reference) is similar to Option 3; the data just has slightly different characteristics and purpose.
If you need some help with that, can you please elaborate more on the constraints of the webhook configuration on the Zscaler side and provide some sample of the payload?
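For illustration, an OpenPipeline DQL processor for such a payload could be as small as this (a sketch; the zs alias and the field names are assumptions until you see the real Zscaler payload):

parse content, "JSON:zs"
| fieldsAdd event.severity = zs["severity"], event.category = zs["category"]
| fieldsRemove zs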
31 May 2025 10:13 AM
Thank you Julius, your recommendations make sense. I have already decided to use OpenPipeline; it seems to fit the solution, and the filtered data can be presented on dashboards or pushed as another Davis event, which is the intention, though I am still a long way from achieving that (the pipeline is easy to construct; just using a rename-field processor will do the trick).

The issue I am having at the moment is basic: Zscaler does not publish its payloads anywhere, so we are more or less guessing what kind of data is coming through to the logs. We wrote a simple Python script to push a sample Zscaler payload we found somewhere on the internet, and the ingest goes through with response code 204, indicating success, but we cannot see the entry in the Logs or Events apps. Any idea what search criteria, based on this ingested payload, we could search on? We checked the exact date but cannot find the entry. Strange... any feedback is much appreciated. I am stuck here now.
[2025-05-31 11:00:22] Sending raw Zscaler payload...
Sending payload to: https://DT_URL
Headers: {'Authorization': 'Api-Token REMOVED', 'Content-Type': 'application/json'}
Payload:
{
  "eventType": "SECURITY_ALERT",
  "timestamp": "2025-05-31T23:00:00Z",
  "source": "Zscaler",
  "alertDetails": {
    "severity": "HIGH",
    "category": "Malware",
    "description": "Suspicious activity detected",
    "affectedUser": "user@example.com",
    "affectedDevice": "MY_LAPTOP",
    "ipAddress": "192.168.1.100",
    "location": "London, UK"
  },
  "actionTaken": "Blocked",
  "correlationId": "a770faffe1384991"
}
Response status code: 204
Response body:
31 May 2025 10:30 AM
We also used curl against the DT API and are unable to find the entry; maybe I am not searching well 😞
curl -X 'POST' \
  'https://DT_URL/api/v2/logs/ingest' \
  -H 'accept: application/json; charset=utf-8' \
  -H 'Authorization: Api-Token OUR_TOKEN' \
  -H 'Content-Type: application/json' \
  -d '{
    "eventType": "SECURITY_ALERT",
    "timestamp": "2025-05-31T23:00:00Z",
    "source": "Zscaler",
    "alertDetails": {
      "severity": "HIGH",
      "category": "Malware",
      "description": "Suspicious activity detected",
      "affectedUser": "user@example.com",
      "affectedDevice": "L51W0V7",
      "ipAddress": "192.168.1.100",
      "location": "London, UK"
    },
    "actionTaken": "Blocked",
    "correlationId": "a770faffe1384991"
  }'
Output:
Code: 204
Response headers:
access-control-allow-origin: https://URL_DT
access-control-expose-headers: *
cache-control: no-store,no-cache
date: Sat, 31 May 2025 09:25:11 GMT
dynatrace-response-source: Cluster
pragma: no-cache
server: ruxit gateway
strict-transport-security: max-age=31536000;includeSubDomains
traceresponse: 00-d6fb8d25873473cd5eaad7bf8d84cb92-b781e801e0f0df41-01
vary: Origin
x-dt-tracestate: ec907795-f23c233@dt
x-robots-tag: noindex
31 May 2025 03:39 PM - edited 31 May 2025 03:40 PM
@MarwanC 204 response code means the data has been successfully ingested.
You can look up the log entry, for example, by the API token it was ingested with (just the public part):
fetch logs
| filter matchesValue(dt.auth.origin, "dt0c01.JUR2SVT6KBWHDJ7JWERHQ5UB")
If you still can't see it, please verify:
Actually, it's not important to know the payloads in advance; that's where the power of Grail comes in. You can just ingest it (maybe easily enrich it by adding log.source or some other data) and you can parse the content on query.
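For example, with the sample payload you sent, something like this should pull the interesting attributes out at query time (a sketch, assuming the JSON body ended up in the content field):

fetch logs
| filter matchesValue(dt.auth.origin, "dt0c01.JUR2SVT6KBWHDJ7JWERHQ5UB")
| parse content, "JSON:zs"
| fieldsAdd severity = zs["alertDetails"]["severity"],
            category = zs["alertDetails"]["category"],
            affected_user = zs["alertDetails"]["affectedUser"]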
This is your curl example in my environment (I just pointed it to my environment and used my API token). I executed the curl twice; that's why you can see two records here.
31 May 2025 05:31 PM
Indeed, thank you. I knew my data was ingested but was not searching in an efficient way - thank you.
Just a silly question: how did you change the filter box to be able to run DQL? Mine always forces a filter, so I used a Notebook to do the search (see screenshots). The next step is to build the pipeline - thanks very much for your feedback.
31 May 2025 09:15 PM
@MarwanC the switch to the DQL input in the Logs app is hidden a little bit; I'd give it a more prominent space in the app (FYI @michal_franczak )
02 Jun 2025 08:40 AM
Thanks for the reply. Indeed, I agree, maybe the switch option should be at the top. As I mentioned, the simple ingest API call I conducted from either curl or a Python program seems to work, but when we run the actual test on Zscaler itself, providing the log ingest call and the token, I get the following error: Failed : Invalid Webhook Config {"RESPONSE_MESSAGE":"The webhook configuration is not valid. Check the values of the configuration fields if they meet requirements."} What is meant by the webhook configuration? Your feedback is much appreciated.
02 Jun 2025 08:47 AM
@MarwanC that seems to be an issue on the zscaler side - not accepting the configuration you provided. Unfortunately, I don't have access to any zscaler environment. Can you share the configuration you are trying to do in zscaler?
02 Jun 2025 11:53 AM
Thanks once more Julius. The only screenshot I can share from Zscaler is where we define the configuration for DT and its bearer token; I attach the screenshot below. I do not see any data ingested in DT after putting in the API call/DT token. This is needed to see what kind of payload is ingested in the content, as we did before. Once I know the payload, the next step is to define the OpenPipeline for the integration, but I am stuck at this first step, which is essential before I can proceed. Do I need to change strategy and use something else? See the attached screenshot from Zscaler.
02 Jun 2025 11:55 AM
Test Webhook gives:
Failed : Invalid Webhook Config
{"RESPONSE_MESSAGE":"The webhook configuration is not valid. Check the values of the configuration fields if they meet requirements."}
02 Jun 2025 12:23 PM
@MarwanC you have the URL and the bearer token empty. Can you share the values you are defining there? Blur the middle part of the token and hostname of the URL.
02 Jun 2025 02:30 PM
We removed them for security reasons. But trust me, they were entered correctly.
02 Jun 2025 02:32 PM
One idea which springs to mind: if Zscaler sends XML out, would we still need to tell the Dynatrace API something?
02 Jun 2025 02:55 PM
Also, what is the difference between an access token and a bearer token? Is it possible that Zscaler uses a different format when injecting the header it sends to Dynatrace, e.g. Authorization: Bearer <your_token_here>?
02 Jun 2025 03:50 PM
@MarwanC
Based on your screenshot, you have 3 options to authorize: Basic, Bearer, OAuth. You were trying to send it with API token authentication, which is a different method.
Dynatrace supports OAuth2 - https://docs.dynatrace.com/docs/shortlink/oauth. What options do you have for OAuth2 authentication on the Zscaler side?
03 Jun 2025 08:35 AM
Thanks Julius for your useful input, as usual. Indeed, I noticed just in the last tests that the GUI input always forces adding Bearer in front of the DT token; what does that mean for our ingest API call? The link you provided is not working for me. I also include the screenshot from Zscaler again to show that the UI supports the three types; can the ingest call be used with any of the three, and what needs to be done to configure it?
03 Jun 2025 01:42 PM
After reading through the docs, I believe you need to use the value
api-token dt0c01.N3BJLL......<rest of your api-token>
Maybe just add it and save (don't click on test webhook).
03 Jun 2025 05:20 PM
Just an update: we have added api-token <token> in the token field in the Zscaler UI, but upon saving it always comes up as Bearer api-token <token>. We are generating alerts to check if anything is being ingested. We also created an OAuth2 client and we see that the ingest permissions are all available; are you sure that an OAuth client is not supported for ingesting logs with this OAuth bearer token?
03 Jun 2025 05:26 PM
The docs that we followed for OAuth client creation: https://docs.dynatrace.com/docs/manage/identity-access-management/access-tokens-and-oauth-clients/oa...
03 Jun 2025 06:04 PM
The scopes we will use in the OAuth client are as follows (we are hoping that at least the event ingest one works for the pipeline):
"scope":"openpipeline:events.sdlc.custom:ingest openpipeline:events.sdlc:ingest openpipeline:security.events.custom:ingest openpipeline:security.events:ingest openpipeline:events.custom:ingest openpipeline:events:ingest storage:logs:read storage:events:read storage:events:write storage:metrics:read storage:logs:write","token_type":"Bearer"
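For completeness, this is roughly how we request the token from the SSO endpoint (following the OAuth docs; the client id/secret below are placeholders):

import requests

resp = requests.post(
    "https://sso.dynatrace.com/sso/oauth2/token",
    data={
        "grant_type": "client_credentials",
        "client_id": "dt0s02.XXXX",           # placeholder
        "client_secret": "dt0s02.XXXX.YYYY",  # placeholder
        "scope": "storage:logs:write storage:events:write",
    },
    timeout=10,
)
# Print the token type and lifetime to confirm the request worked.
print(resp.status_code, resp.json().get("token_type"), resp.json().get("expires_in"))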
03 Jun 2025 07:27 PM
Julius, I can confirm that after working hard to create an OAuth client and a token, and giving all the relevant scopes to the token, the log ingest API is not accepting the bearer token to ingest logs or events into OpenPipeline, despite our adding the above scopes, which are all available during OAuth client creation.
03 Jun 2025 09:11 PM
Yes, you are right; currently the log ingest API does not accept OAuth bearer tokens.
If you have an option to run a "proxy" between Zscaler and Dynatrace, this could be the easiest method - just rewriting the authorization headers (see the sketch below). If not, I'd probably go with creating a Dynatrace app function which will consume the log from Zscaler and send it as a log event internally. This won't require any additional infrastructure.
When called externally, app functions can be authenticated using platform bearer tokens.
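If the proxy route is an option, a minimal sketch could look like this (assuming Flask and requests are available; the URL, token, and route name are placeholders, and you'd want to add validation and error handling for real use):

import os
import requests
from flask import Flask, request

app = Flask(__name__)
DT_LOG_INGEST = os.environ["DT_URL"] + "/api/v2/logs/ingest"  # e.g. https://YOUR_ENV.live.dynatrace.com
DT_TOKEN = os.environ["DT_TOKEN"]                             # API token with the logs.ingest scope

@app.route("/zscaler-webhook", methods=["POST"])
def forward():
    # Accept whatever Zscaler posts and forward it to the log ingest API,
    # rewriting the Authorization header to the Api-Token scheme.
    payload = request.get_json(force=True)
    resp = requests.post(
        DT_LOG_INGEST,
        json=payload,
        headers={"Authorization": f"Api-Token {DT_TOKEN}"},
        timeout=10,
    )
    return ("", resp.status_code)

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)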
03 Jun 2025 10:59 PM
Julius, thanks once more, I am glad we are on the same page. My next venture is to try a Dynatrace app; this will be easier than an external webhook proxy using a Lambda function or an OpenTelemetry collector. Could you get me up to speed on how to write this function quickly, as the deadline to deliver this integration is approaching? Where will this app be hosted, on the SaaS server or on one of our ActiveGates? And I guess it is written in Python, which I favour? Your support is much appreciated.
04 Jun 2025 12:22 AM
Julius, I would also raise the concern that if OAuth worked in Dynatrace with bearer tokens for log ingest, this task could easily be solved with pipelines. Can we put this issue to the product team to see why they discontinued OAuth support for the ingest API? The scopes are all there and the token is generated, but the API does not accept it. How can I escalate this, as it seems to be the easiest solution for this integration? Can you help raise a product feature request to the product team, on the community or directly via the support line? I have a ticket for this as well.
04 Jun 2025 09:57 AM
Julius, on the subject of the OAuth client: it currently does not support ingesting logs into Grail, but what about fetching data, e.g. api/v1/entity/infrastructure/hosts? What scope do I need to assign to get a full extract of the hosts using the OAuth client? Thanks for your support.
03 Jun 2025 08:59 AM
Does it mean we have been using the wrong token all the time?