OpenTelemetry Implementation with Dynatrace for Large JSON HTTP Responses

We are implementing OpenTelemetry with Dynatrace as the backend to instrument HTTP communications between our system and a search provider. The responses are large JSON objects that we cannot truncate, since all of the information is valuable. We understand that Dynatrace has a 4 MB limit for OTLP messages, including traces, but we have not found specific information in the official documentation about storage limits for traces in Dynatrace.

Can you confirm if there is a specific storage limit for individual traces in Dynatrace (e.g., per span or event) and, if so, what is that limit?
Since our response objects may exceed the payload limit, we are considering storing them externally (e.g., in AWS S3 or Azure Blob Storage) and referencing them in the traces via an ID or URL. Is this a recommended practice by Dynatrace, and are there any specific configurations in OpenTelemetry or Dynatrace that we should implement to optimize this integration?
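To make the external-storage idea concrete, here is a minimal sketch of the offloading pattern we have in mind. The bucket URL, attribute names (`app.response.ref`, `app.response.body`), and the 4 MB cutoff are our own assumptions, not anything prescribed by Dynatrace; the uploader is stubbed out where a real S3/Blob client would go.

```python
import hashlib
import json

# Size threshold above which payloads are offloaded instead of attached
# to the span. 4 MB mirrors the OTLP message limit mentioned above; the
# exact cutoff is an assumption to be tuned per environment.
MAX_INLINE_BYTES = 4 * 1024 * 1024

def store_payload(payload: bytes) -> str:
    """Hypothetical uploader: a real implementation would write to
    S3 / Azure Blob Storage and return the object URL. Stubbed here
    with a content-addressed placeholder key."""
    key = hashlib.sha256(payload).hexdigest()
    return f"s3://example-bucket/responses/{key}.json"  # placeholder URL

def response_span_attributes(response_obj: dict) -> dict:
    """Build span attributes for a large JSON response: keep size
    metadata inline, offload oversized bodies, and record only a
    reference (URL/ID) on the span."""
    body = json.dumps(response_obj).encode("utf-8")
    attrs = {"http.response.body.size": len(body)}
    if len(body) > MAX_INLINE_BYTES:
        attrs["app.response.ref"] = store_payload(body)  # reference only
    else:
        attrs["app.response.body"] = body.decode("utf-8")  # small enough to inline
    return attrs
```

The returned dict would then be applied to the active span, e.g. with `span.set_attribute(key, value)` for each entry.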
Does Dynatrace recommend any specific external storage provider for this case, or do we have flexibility to choose based on our needs?
Are there additional configurations in OpenTelemetry (such as compression or batch processing) that we should consider to ensure traces are stored correctly without loss, especially for large objects?
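For reference, this is the kind of exporter configuration we are considering in the Python SDK: gzip compression on the OTLP/HTTP exporter plus a batch processor with a reduced batch size, so each exported message stays well under the payload limit. The endpoint and token are placeholders; the batch-size value is a guess, not a Dynatrace recommendation.

```python
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http import Compression
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Placeholders: substitute your Dynatrace environment's OTLP endpoint
# and an API token with the trace-ingest scope.
exporter = OTLPSpanExporter(
    endpoint="https://{your-environment-id}.live.dynatrace.com/api/v2/otlp/v1/traces",
    headers={"Authorization": "Api-Token <token>"},
    compression=Compression.Gzip,  # gzip-compress OTLP request bodies
)

provider = TracerProvider()
provider.add_span_processor(
    BatchSpanProcessor(
        exporter,
        max_export_batch_size=128,   # smaller batches keep each message under the limit (assumed value)
        schedule_delay_millis=5000,  # flush interval
    )
)
trace.set_tracer_provider(provider)
```

The same settings can also be supplied via standard SDK environment variables (`OTEL_EXPORTER_OTLP_COMPRESSION=gzip`, `OTEL_BSP_MAX_EXPORT_BATCH_SIZE`) without code changes.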

