07 Jul 2025 03:19 PM - edited 07 Jul 2025 03:30 PM
We recently set up log forwarding to the Dynatrace API via Logstash.
In Logstash we can see these responses from the Dynatrace API:
body=>"{\"success\":{\"code\":200,\"message\":\"Some events were limited. Following limits were applied: Log Event attribute value size is too large, will be truncated.\"}}"
We can see in Dynatrace that the content of the logs ingested via Logstash is indeed huge, but it turns out that each log record contains multiple log records merged inside it (see the attached log content with 15 merged logs). Moreover, I suspect this is also why almost no fields are extracted for these logs in Dynatrace.
Is there a way to split the log records? I noticed there is a setting for splitting logs ingested with OneAgent, but I haven't found an equivalent option for logs ingested in other ways.
Thanks!
09 Jul 2025 11:28 PM
The ingest endpoint on Dynatrace SaaS supports receiving multiple events in a single payload if JSON is used, but the format has to match what the API expects: the events must come in as an array of JSON objects for them to be split automatically. Reference is available here.
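To illustrate the expected shape, here is a minimal sketch of such a payload. It assumes the v2 log ingest endpoint (`/api/v2/logs/ingest`) and uses illustrative field values; the attribute names (`content`, `severity`, `log.source`) follow Dynatrace's semantic log attributes, but check them against your environment's documentation.

```python
import json

# Sketch of a multi-event payload: a JSON array of event objects,
# one object per log record. Values below are made up for illustration.
events = [
    {
        "content": "2025-07-07 15:19:02 INFO payment accepted",
        "severity": "info",
        "log.source": "logstash",
    },
    {
        "content": "2025-07-07 15:19:03 ERROR payment rejected",
        "severity": "error",
        "log.source": "logstash",
    },
]

payload = json.dumps(events)
# This string would be POSTed to
#   https://<environment-id>.live.dynatrace.com/api/v2/logs/ingest
# with "Content-Type: application/json; charset=utf-8" and an Api-Token header.
print(payload)
```

Because the top-level value is an array, the API can split it into separate log events, instead of treating everything as one oversized record.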
For plain-text events, only a single event is supported per API call. In that case, Logstash would need to be adjusted so it does not send multiple log events in a single API call.
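On the Logstash side, one way to approach both problems is sketched below: the standard `split` filter can break a merged `message` field back into separate events, and the standard `http` output's `json_batch` format sends each batch as a single JSON array, which matches the format described above. This is an unverified sketch, not a tested pipeline; the URL and token are placeholders, and the split terminator depends on how your records are actually concatenated.

```
filter {
  # If several records arrive merged into one "message" field,
  # split them into separate events (here: one per line).
  split {
    field      => "message"
    terminator => "\n"
  }
}

output {
  http {
    url         => "https://<environment-id>.live.dynatrace.com/api/v2/logs/ingest"
    http_method => "post"
    format      => "json_batch"   # whole batch sent as one JSON array
    headers     => {
      "Authorization" => "Api-Token <token-with-log-ingest-scope>"
      "Content-Type"  => "application/json; charset=utf-8"
    }
  }
}
```

With `json_batch`, Logstash's own batching decides how many events share one API call, so tune the pipeline batch size if you still hit payload limits.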