09 Oct 2025 12:12 PM - last edited on 17 Dec 2025 11:34 AM by Michal_Gebacki
I'm trying to use ArgoCD to manage upgrades of the Dynatrace Operator in OpenShift. My Subscription YAML is below. Everything I have been seeing says that startingCSV only refers to the version a fresh install starts at. Let's say, for example, I want to upgrade to version 1.7.0: how can I do that without uninstalling and reinstalling the Subscription? I only see an install plan for 1.7.1; it seems OpenShift only ever offers the latest version, but I don't want to always be forced to go to the latest one.
apiVersion: operators.coreos.com/v1alpha1
kind: Subscription
metadata:
  labels:
    operators.coreos.com/dynatrace-operator.dynatrace: ""
  name: dynatrace-operator
  namespace: dynatrace
spec:
  channel: alpha
  installPlanApproval: Manual
  name: dynatrace-operator
  source: certified-operator-index
  sourceNamespace: openshift-marketplace
  startingCSV: dynatrace-operator.v1.6.2
16 Dec 2025 03:58 PM
Hi, @sivart_89!
Does this Dynatrace Documentation page help address your issue in any way?
-> Update or uninstall Dynatrace Operator
Please let us know, thank you!
23 Dec 2025 07:52 PM
Hi @Michal_Gebacki, no, this doc does not help. I am looking for how to do an upgrade with more of a GitOps approach: ArgoCD picks up the change I push to my GitHub repo and then applies it to the cluster. Specifically, my question is about using ArgoCD in OpenShift and how to set things up so that ArgoCD upgrades the Dynatrace Operator to a specific version.
24 Dec 2025 05:53 AM
@sivart_89 it will really depend on how you have set up your deployment structure in Argo.
You don't really specify whether you use the Helm or the manifest approach.
For Helm, it will look something like this:
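(A minimal sketch of an Argo CD Application with a Helm source. The chart repo URL, chart name, and namespaces are assumptions to verify against the Dynatrace docs and your own setup; pinning targetRevision to a chart version is what keeps the operator at the version you choose.)

apiVersion: argoproj.io/v1alpha1
kind: Application
metadata:
  name: dynatrace-operator
  namespace: openshift-gitops
spec:
  project: default
  source:
    # Assumed chart repository and chart name - confirm against the Dynatrace docs
    repoURL: https://raw.githubusercontent.com/Dynatrace/dynatrace-operator/main/config/helm/repos/stable
    chart: dynatrace-operator
    targetRevision: 1.7.0   # pin the chart/operator version you want to move to
  destination:
    server: https://kubernetes.default.svc
    namespace: dynatrace
  syncPolicy:
    automated:
      prune: true
      selfHeal: true
    syncOptions:
      - CreateNamespace=true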
For manifests, it will look something like this:
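(Again just a sketch; the Git repo URL and path are placeholders for wherever you keep the operator manifests in your own GitOps repo.)

apiVersion: argoproj.io/v1alpha1
kind: Application
metadata:
  name: dynatrace-operator
  namespace: openshift-gitops
spec:
  project: default
  source:
    # Placeholder repo and path - point these at the directory that holds the
    # operator manifests for the release you want to deploy.
    repoURL: https://github.com/your-org/your-gitops-repo.git
    targetRevision: main
    path: operators/dynatrace-operator
  destination:
    server: https://kubernetes.default.svc
    namespace: dynatrace
  syncPolicy:
    automated:
      prune: true
      selfHeal: true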
Replace the manifests with those from the new Operator release.
Sync the CRDs before the Operator manifests (use sync waves or PreSync hooks); see the sketch below.
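(Illustrative fragment only: an Argo CD sync-wave annotation on the operator's DynaKube CRD so it syncs before the operator workloads; lower wave numbers sync first. The rest of the CRD body comes from the operator release manifests.)

apiVersion: apiextensions.k8s.io/v1
kind: CustomResourceDefinition
metadata:
  name: dynakubes.dynatrace.com
  annotations:
    argocd.argoproj.io/sync-wave: "-1"   # earlier wave than the operator Deployment
# ... remainder of the CRD definition from the operator release ...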
The instructions provided before are correct and relevant; you just need to massage them to fit your CD tool and do the groundwork for your deployment approach.
I use both Flux and Argo depending on the k8s cluster being used. The fundamentals and the approach are the same; the "how I do it" just gets adapted for the relevant tool. You will know best what you have set up and how to do it in a way that fits your environment.