29 Jan 2025 10:45 PM
Hi everyone,
I'm working on implementing a Terraform configuration for Dynatrace across multiple environments (e.g., prod, test). I have no issues exporting the existing configuration or importing state, but I’m looking for advice on how to best organize the exported data.
By default, the export creates a configuration folder containing modules. Is this the recommended approach, or is there a better way to structure it—especially when managing multiple environments?
Since I'm new to this, I’d appreciate any guidance on how to visualize and structure the project effectively. Here’s what my directory looks like after a partial export:
C:\Users\Lance\source\repos\
Sincerely,
Lance
03 Feb 2025 11:00 AM
Hi @lwaldrop ,
I created a Terraform structure where I use separate Terraform workspaces to keep the state of the individual tenants (Dev/Test/Acc/Prod), and created a settings.yaml file per environment:
In locals.tf, I read the correct settings file per tenant based on the workspace name. So if I run "terraform workspace select develop", my /environments/develop/settings.yaml is picked up automatically.
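Not my exact files, but a minimal sketch of that workspace-driven lookup (the environments/ path and the keys inside settings.yaml are assumptions on my side):

# locals.tf -- minimal sketch; the path layout is an assumption
locals {
  # terraform.workspace holds the currently selected workspace name,
  # e.g. "develop" after running: terraform workspace select develop
  settings = yamldecode(file("${path.module}/environments/${terraform.workspace}/settings.yaml"))
}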
This way I can easily create variables for configuration items/names that are specific to an environment (e.g. HOST-GROUP-12345678) or for name prefixes that differ between environments (e.g. subdirectory names in log directories).
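For example (again just a sketch; the key names host_group_id and log_prefix are made up for illustration):

locals {
  # Values come from the per-environment settings.yaml decoded above
  host_group_id = local.settings.host_group_id                 # e.g. "HOST-GROUP-12345678" in prod
  log_directory = "${local.settings.log_prefix}/application"   # prefix differs between environments
}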
Further, I created multiple "top-level" main entries, like "system" and "sre", to split responsibilities. Inside those directories I keep the original module names as they appear in the export, because that gives me a hint of the original API/rule that is used to inject the setting...
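A rough sketch of how such a split could look (only "system" and "sre" are literal here; the other names, and the module below, are just examples of keeping the exported module name):

# Hypothetical layout:
#   environments/
#     develop/settings.yaml
#     prod/settings.yaml
#   system/          <- top-level entry per responsibility
#     main.tf
#     modules/...    <- module names kept exactly as exported
#   sre/
#     main.tf
#     modules/...
#
# system/main.tf -- illustrative module call keeping the exported name
module "dynatrace_host_naming" {
  source = "./modules/dynatrace_host_naming"   # the exported name hints at the API/rule behind the setting
}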
So, hope this helps somewhat...