You can now audit the following:
- Login events
- Logout events
- Any change to a configuration
- Any change to API tokens
To enable and use the new environment audit logs API:
1. Enable the audit logs feature for your environment.
2. Go to Settings > Integration > Dynatrace API > Generate token. Give the token a name and enable Read audit logs. Alternatively, use an existing API token by adding this access scope to it.
3. Copy the generated token value.
4. Call the /auditlogs API endpoint, which is available in the Dynatrace Environment API v2.
5. Use the copied API token in the Authorization header to get the audit logs for a given timeframe.
Environment audit logs are stored once the audit feature is enabled, as explained in the first step of the setup above. Events that occur before the feature is enabled aren’t stored!
Audit logs are retained for 30 days and then automatically deleted.
Note: If you need to store audit logs for a longer period of time, for example, to meet compliance standards, we recommend that you set up an automated process that downloads audit logs every day and stores them in your own infrastructure.
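The steps above, plus the daily-download recommendation, can be sketched in a few lines of Python. This is a minimal sketch using only the standard library; the environment URL and token below are placeholders you would replace with your own, while the /api/v2/auditlogs path and the Api-Token authorization scheme come from the Dynatrace Environment API v2:

```python
import json
import urllib.request

def build_audit_request(env_url, token, from_ts="now-1d"):
    """Build an authenticated request for the Environment API v2 /auditlogs endpoint."""
    url = f"{env_url}/api/v2/auditlogs?from={from_ts}"
    # Dynatrace expects the token in an "Api-Token <value>" Authorization header
    return urllib.request.Request(url, headers={"Authorization": f"Api-Token {token}"})

def fetch_audit_logs(env_url, token, from_ts="now-1d"):
    """Download the audit records for the given timeframe (e.g. run daily from cron)."""
    req = build_audit_request(env_url, token, from_ts)
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp).get("auditLogs", [])

# Placeholder values -- substitute your own environment ID and token:
# records = fetch_audit_logs("https://abc12345.live.dynatrace.com", "dt0c01.your-token")
```

Scheduling this once a day with `from="now-1d"` and writing the returned records to your own storage covers the retention recommendation above.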
A full article on this can be found here:
Well, I spent some time yesterday attempting to get this into Splunk. I have been trying to take the approach that is as easy and as vendor-supported as possible. I have not had much luck yet after trying a few things, so to save people some time, here is what I have tried to date.
I found 2 different apps that have been designed to work with Dynatrace.
The two apps work with one another, and after reading their documentation it did not appear they would work for this, but I gave them a try anyway and had no luck. They seem to be focused mostly on time-series metrics. Also, the last release was Oct. 30, 2018, so I am not expecting an update anytime soon to add the new audit log feature.
I also did some searching on GitHub. Within that search I was able to locate both of the above Splunk apps and thought perhaps I could get my hands on the master as I believe both were built using the Splunk Add-On Builder. I did find one repository with it, but when you attempt to import it into Add-On Builder (v2.2.0), it fails. There is a new version of Add-On Builder which is 3.0.1, but I have not been able to test that yet as we first need to do a Splunk upgrade.
Splunk Add-On Builder
After going through all the above with no success, I decided to try using the Splunk Add-On Builder to just create my own. I got off to a good start, but it's far from working.
I spent just about the entire day trying different things around this. In doing so, I found that it comes down to 4 options that I can think of at this time.
Option 1: License the 3rd-party Splunkbase app "REST API Modular Input"
I won't go into details here, but I found many people currently using this app for exactly this type of need. This Splunk app provides a straightforward, time-saving method for polling data from RESTful endpoints. While you could do this yourself with something like the Splunk Add-On Builder, which of course is free, it's also a lot of work. The developer of this app has done the hard work for you, at least from what I can tell so far. I plan to test this out to see just how good it is and will provide an update later. You can find the app on Splunkbase here.
Option 2: Start from scratch with Splunk Add-On Builder
As much as I love to learn new things and really get into the gears of how things work, I also don't have the time right now. I have started to build an app using this method. Setting up the connection and all the initial steps are straightforward, and I was able to pull the JSON of the audit log. That's where things get difficult, and it is where you will end up spending most of your time if you go with this option. As before, I am not going to go into details here around the configuration of Splunk, but between all the mapping and other configuration that would need to be done, it makes option 1 above look much more inviting, for myself anyway.
You can find the Splunk Add-On Builder here.
Note: Pay attention to the version you download. There are currently 2 versions. If your instance of Splunk is below version 8.0, Splunk Add-On Builder v3.0.1 will not work correctly. Unfortunately, Splunk allows you to install apps that may not be compatible, meaning there is no check in their install engine. This is a pet peeve of mine with Splunk, but that's another story.
Option 3: Set up the Splunk HTTP Event Collector
My thought here is to create a script that can be executed by cron or some other scheduler, which would pull the audit log from Dynatrace on a time interval of my choosing and then send that log to Splunk via the HTTP Event Collector.
You can find the documentation on the Splunk HTTP Event Collector here.
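To make the idea concrete, here is a minimal sketch of the sending half of such a script. The host, port, and token are placeholders, while the /services/collector/event endpoint and the "Splunk <token>" Authorization header are the standard HEC conventions:

```python
import json
import urllib.request

def build_hec_event(record, sourcetype="dynatrace:auditlog"):
    """Wrap one audit record in the HEC event envelope (sourcetype is a placeholder name)."""
    return {"event": record, "sourcetype": sourcetype}

def send_to_hec(hec_url, hec_token, records):
    """POST each pulled audit record to the Splunk HTTP Event Collector."""
    for record in records:
        body = json.dumps(build_hec_event(record)).encode("utf-8")
        req = urllib.request.Request(
            f"{hec_url}/services/collector/event",
            data=body,
            headers={"Authorization": f"Splunk {hec_token}"},
        )
        urllib.request.urlopen(req, timeout=30).read()

# Placeholder values for illustration:
# send_to_hec("https://splunk.example.com:8088", "your-hec-token", records)
```

Sending one event per audit record keeps each record individually searchable in Splunk rather than indexing the whole JSON response as a single event.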
Option 4: Custom script to pull down the audit log and use a Splunk Forwarder
Same thing as option 3 for the most part, with the only difference being that your custom script would write the Dynatrace audit data to a local log, where a configured Splunk Forwarder would pick it up and index it in Splunk like any other log. You would most likely want to ensure you have set up this custom script with rolling logs, which the Splunk Forwarder knows how to work with.
Out of all these options, I think option 1 is the one I am going to try first. While I have not tested it yet, it does appear that it would make everything straightforward and much easier to set up and have working. It's also a licensed app with support, and it's not expensive. Plus, I can think of many other things I could use that app for beyond just Dynatrace, providing even more value. Again, I have not tested this yet, so it is only theory right now.
Recommendation to Dynatrace
Like with all vendors, there are pros and cons. Splunk is a great tool, but when it comes to polling for data, this has never been an area where they shine. I would love to see Dynatrace add the ability to stream this data. That option exists today for user sessions, and I think providing the same option for the audit log would be a huge value. As a customer you could then stay in a 100% vendor-supported solution, because you could configure the Splunk HTTP Event Collector with a listener and point the Dynatrace audit log stream right at it. In fact, it could really be considered an out-of-the-box integration.
I will update when I do more testing. Hope this all helps!
Very good write-up! Thanks for following up on this and providing your experiences so far. I guess it is safe to say that the Splunk apps are not working as expected when it comes to ingesting Dynatrace data into Splunk.
You're welcome! To be clear, the Splunk apps for Dynatrace do work for what they were intended for at the time; audit logs were not available when the apps were last updated. The problem, from what I can tell since I was able to find the repository on GitHub, is that the apps have been somewhat abandoned. They were built using the Splunk Add-On Builder, but when I attempted to import the master into our instance, it failed. There is a 50/50 chance that they might end up getting updated to include ingestion of the audit log, but I am not holding my breath considering the last update to them was in 2018.
I really think the perfect solution here would be if Dynatrace adds the ability to stream this log like they did with user sessions. That in my opinion would be ideal. Since this is still in early adopter status, perhaps they will take that into consideration.
No, as we had really found no need to use Splunk with Dynatrace until the audit logs came along. I am not opposed to it; I just have not seen any real reason to do it until now.
I guess if you are a company that has gone all out with Splunk, then I can see where bringing in the metrics from Dynatrace would be of high value, but for us that is not the case. It's actually the other way around 🙂
We want Dynatrace to be the one thing that is everywhere. It all really just depends on what your company uses Splunk for in the end. I know our instance is mainly a single point to view all logs and to conduct advanced searches on them, plus being able to retain the data based on our needs.
Yes, I am going to PERFORM 2020 and can't wait! I went last year as well. I think that would be great to meet up and talk about this. I know there were a few other community members on here that said the same thing. Let's plan on it.
Thank you! I will get over there and accept it in just a few. In your case, I would for sure then check the Dynatrace App For Splunk on Splunkbase.
There are 2 apps in total that are dependent on each other.
Both are free, and the install is very straightforward and easy. From what I've seen of it while looking into the audit stuff, it would most likely give you what you are looking for. I don't think it offers bidirectional functionality, though; it only pulls from Dynatrace. Worth a try anyway. Keep in mind, though, that the last update to it was in 2018, and everything I was able to find about it made it look like it has been abandoned.
Also, look into the "REST API Modular Input" app on Splunkbase. You can find it here. This is a 3rd-party app which you license. It has been around for some time, gets updated frequently, and because it's a paid-for app, it has support. This is the next thing I am trying for the audit logs, but it would most likely also give you the other functionality you are looking for.
Good news all - I have Dynatrace SaaS audit logs going to Splunk as of this morning.
I have been experimenting with what I thought was the best solution, and it turned out to be exactly that, and very simple to do. I am currently writing a tutorial on it and will post it later today. All it takes is a 3rd-party add-on app for Splunk, which gives Splunk the ability to conduct pulls instead of just listening like it does with the Event Collector, plus a custom class I wrote that you add to a Python file to act as the ResponseHandler, and you are off and running.
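As a preview of the ResponseHandler piece, here is a hedged sketch of the kind of class involved. The `__call__` signature follows the response-handler pattern documented by the REST API Modular Input app, but treat the exact argument names as assumptions until the tutorial; the real handler would also emit each event through the app's own output helper rather than collecting a list, which is done here only so the behavior is easy to see:

```python
import json

class DynatraceAuditResponseHandler:
    """Hypothetical handler: splits the /auditlogs JSON payload into one
    Splunk event per audit record (signature modeled on the app's
    documented response-handler pattern)."""

    def __init__(self, **args):
        self.events = []  # the real app would stream these out, not store them

    def __call__(self, response_object, raw_response_output,
                 response_type, req_args, endpoint):
        payload = json.loads(raw_response_output)
        for record in payload.get("auditLogs", []):
            self.events.append(json.dumps(record))  # one Splunk event per record
        return self.events
```

Splitting the response this way is what makes each audit record show up as its own searchable event in Splunk instead of one big JSON blob.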
Look for the tutorial later today 🙂