03 Jul 2025 08:44 AM
Hi Dynatrace team and community,
I am currently deploying Dynatrace OneAgent in a Kubernetes cluster, along with an ActiveGate container (running as a Pod). I have two different traffic flow designs for how the ActiveGate connects to Dynatrace SaaS, and I would like your recommendation on which one is the best practice or officially supported.
Picture 1
Each Worker Node sends OneAgent data to the ActiveGate Container
The ActiveGate Container then forwards the data to the Environment ActiveGate
Finally, the data is sent to Dynatrace SaaS directly over HTTPS (port 443)
Picture 2
Each Worker Node sends OneAgent data to the ActiveGate Container
The ActiveGate Container sends data to Dynatrace SaaS
My Questions:
Between Picture 1 and Picture 2, which one is the best practice when deploying ActiveGate as a container on Kubernetes?
In the case of Picture 2 (proxy):
Is it acceptable and supported to have the ActiveGate container send data via a proxy?
Do all Worker Nodes need outbound proxy access, or only the nodes running ActiveGate?
Are there advantages or trade-offs in terms of security, scalability, maintenance, or performance between the two designs?
Thank you in advance for your support and recommendations.
(I’ve attached both Picture 1 and Picture 2 diagrams for clarity.)
04 Jul 2025 09:38 PM
The design in Picture 1 is officially unsupported: you must not route ActiveGate traffic through another Environment ActiveGate.
On the other hand, traffic via an HTTP proxy is possible and supported.
The best practice is to deploy the ActiveGate into the Kubernetes environment as part of the Dynatrace Operator deployment and route OneAgent traffic to SaaS through this ActiveGate (or ActiveGates); this is the standard behaviour. If an HTTP proxy is required for outbound communication, for example due to network policies, it can be configured directly in the DynaKube custom resource.
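For illustration only, here is a minimal DynaKube sketch along those lines. The apiUrl, proxy address, and capability list are placeholders, and the field names assume the dynatrace.com/v1beta1 DynaKube API, so please check the Dynatrace Operator documentation for the exact schema of your version:

apiVersion: dynatrace.com/v1beta1
kind: DynaKube
metadata:
  name: dynakube
  namespace: dynatrace
spec:
  # Placeholder environment API URL
  apiUrl: https://<your-environment-id>.live.dynatrace.com/api

  # Optional outbound HTTP proxy for communication with Dynatrace.
  # Use valueFrom with a secret name instead of value if the proxy URL
  # contains credentials.
  proxy:
    value: http://proxy.example.com:3128

  oneAgent:
    # Full-stack monitoring of the worker nodes
    classicFullStack: {}

  activeGate:
    capabilities:
      - routing                # OneAgents send their traffic through this in-cluster ActiveGate
      - kubernetes-monitoring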
07 Jul 2025 08:45 AM
As a fallback communication path, I always implement a firewall rule between the worker nodes and SaaS or Managed on port 443 (if it is allowed). If something happens to the ActiveGate, you still receive OneAgent data.
Best regards,
János
08 Jul 2025 07:36 AM - edited 08 Jul 2025 08:01 AM
Hi all, and thank you @Julius_Loma and @Mizső for your previous insights.
I have a follow-up scenario based on the discussion here.
We are running an on-premise Kubernetes environment where:
The in-cluster ActiveGate container cannot access the internet, so it cannot send data to Dynatrace SaaS directly.
We also cannot define an HTTP proxy in the DynaKube CRD (due to policy restrictions or lack of proxy infrastructure).
Given that both outbound direct access and proxy-based access are not allowed, what are the available options to make Dynatrace work in this kind of environment?
Specifically:
Can we route data from OneAgent (inside the K8s cluster) through the internal network to an external Environment ActiveGate (e.g., deployed in a DMZ or management network that has internet access)?
Any guidance, architecture recommendation, or documentation regarding air-gapped or network-restricted Kubernetes environments would be greatly appreciated.
Thanks in advance!