
AppMon Splunk Plugin


The Splunk Data Forwarder plugin lets you take dashboard data from AppMon and export it to Splunk.



Download the plugin's .jar file here and install it on your Dynatrace server. It will then appear in the Installed Plugins window.

Bring up the plugin's properties window here to set the configuration the plugin needs to access Dynatrace and Splunk.

SplunkUsername / SplunkPassword – Credentials for a user with access to Splunk

SplunkPort – Port Splunk is running on; the default is 8089

SplunkServerAddress – Address of the Splunk server

SplunkHost – Determines which Splunk host the data will be listed under

SplunkIndex – Determines which Splunk index will be used

DTUser / DTPassword – Credentials of a Dynatrace user with access to the dashboards in the REST API

DTDashboard – Name of the dashboard that will be used

DTAddress – Address of the Dynatrace server (normally localhost unless you are running the plugin from a Collector)
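As a quick reference, a filled-in configuration might look like the following. All values here are examples only; substitute your own servers, credentials, and dashboard name:

```properties
SplunkUsername = admin
SplunkPassword = changeme
SplunkPort = 8089
SplunkServerAddress = splunk.example.com
SplunkHost = appmon-export
SplunkIndex = dynatrace
DTUser = dtadmin
DTPassword = dtpassword
DTDashboard = Splunk Export Dashboard
DTAddress = localhost
```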

Create a Task

Now go to a System Profile and create a new task. Select “Splunk Data Forward”.

The settings you configured for the plugin are copied over to this new task. Change what you need, then go to the Schedule tab.

This plugin can be run manually or on a schedule. If you run it on a schedule, make sure the schedule interval matches the timeframe set in the dashboard being used. For this example, the task is set to run every 10 minutes.

Also be sure to execute the task on the Server, unless you’ve specified the DTAddress field accordingly.

Set up the Dashboard

Now you’ll need to define what data will be sent over to Splunk. This is determined entirely by the results and dashlets that appear in the defined dashboard.

The following dashlets are supported for data export:

- Tagged Web Request

- Web Request

The amount of data exported is limited by the Reporting settings of the dashlet/dashboard. Be sure to change the "Maximum number of lines per table" setting according to how many events you want shipped to Splunk.

You'll also need to change the Result Limit of any PurePath dashlet you are using.

There is no upper limit on the amount of data you can export; trials have shipped over 4 million events to Splunk. Keep in mind that exports of that size take some time to run.

The columns added to a dashlet determine the data that is exported.

Chart dashlets should be given meaningful names, as the dashlet name is used to label the measures in Splunk.

Once you have your dashboard set up, you can preview the data being shipped to Splunk at a URL similar to the one below. Just replace the server and dashboard name portions accordingly.
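The exact URL depends on your setup, but the AppMon REST interface generally exposes dashboard results at a path like /rest/management/dashboard/&lt;name&gt;. A rough sketch of assembling that preview URL (the server, port, and dashboard name are placeholders, and the path may vary between versions, so check your server's REST documentation):

```python
from urllib.parse import quote

def dashboard_preview_url(server: str, dashboard: str, port: int = 8021) -> str:
    """Build the REST URL that returns a dashboard's results as XML.

    The /rest/management/dashboard/<name> path is the usual AppMon
    endpoint; confirm it against your server's REST documentation.
    """
    # Dashboard names often contain spaces, so URL-encode them.
    return f"http://{server}:{port}/rest/management/dashboard/{quote(dashboard)}"

print(dashboard_preview_url("localhost", "Splunk Export Dashboard"))
```

Opening that URL in a browser (authenticated as your DTUser) shows the same XML the plugin reads before forwarding it to Splunk.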


Run the task

Your task is now ready to run. It may take some time depending on the number of events sent. If you encounter issues, please contact me, as the plugin is still being tested.

The Data in Splunk

Now the data is in Splunk for your use.

Each event sent will have a "DataType" field that tells you which dashlet the event came from.

An important thing to keep in mind:

PurePath and measure events are set to the time they originally occurred, whereas all other data is set to the time the plugin exported it to Splunk. This is simply because those other data types have no timestamp attributed to them in the XML.
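In other words, the timestamp choice presumably boils down to something like the following. This is a simplified sketch, not the plugin's actual code, and the timestamp format shown is a hypothetical example:

```python
from datetime import datetime, timezone
from typing import Optional

def event_timestamp(xml_timestamp: Optional[str], export_time: datetime) -> datetime:
    """Choose the Splunk timestamp for an exported event.

    PurePath and measure records carry their own timestamp in the
    dashboard XML; every other record type falls back to the time
    the plugin performed the export.
    """
    if xml_timestamp is not None:
        # Hypothetical format; the real XML attribute may differ.
        return datetime.fromisoformat(xml_timestamp)
    return export_time

export_time = datetime(2016, 5, 1, 12, 0, tzinfo=timezone.utc)
print(event_timestamp("2016-05-01T11:52:00+00:00", export_time))  # original time kept
print(event_timestamp(None, export_time))                         # falls back to export time
```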

From here you can manipulate the data however you want. If you come up with any good ideas for dashboards please feel free to share them.

Also let me know if you have any recommendations for changes to make to the plugin.



Hi Connor

Thanks for the plugin. I have some questions and hopefully this will help everyone in the community. 

  1. How is this "data-forwarder" different from the ability to pull data from dynaTrace using the Splunk App? This data-forwarder executes as a scheduled task on the dynaTrace server, whereas the Splunk App (Dashboard pull) executes as a scheduled app on the Splunk Server. Are we seeing a different set of measures/metrics? What would be the difference in data for someone to go either way?
  2. Can we use this "data-forwarder" to run on stored sessions? 
  3. What are the other/additional use cases you are addressing with this plugin? And if there are new use cases, can we combine this with the Splunk App so we have a single solution for our community?
  4. Do you have to configure anything on the Splunk server to consume this data? Or are you using Splunk Universal Forwarder?
  5. What is the data transfer speed/latency if you have large data sets to consume from the dashboards?
  6. What is the CPU footprint of the forwarder task on the dynaTrace Server? 

Thanks, Rajesh


Alright, here's a trial run of the plugin. It began at 11:52 and ended at noon. The plugin sent a total of 1 million events over to Splunk in this timeframe. Below are charts showing the effect of running the plugin on our DT server for that duration.



Both CPU and memory took a bit of a hit here, though this is expected when sending a million events. I'll try another run with many tasks pushing smaller amounts of data.

Here's the same graph after running 6 tasks concurrently, each sending ~10,000 events to Splunk. They began at 12:11.


1) There's no need for a Flume server. As for the measures and metrics you're able to get charting, database, and error data. I don't recall if the Splunk App was able to get this.

2) Yes. Just set the source of a dashboard to a stored session. That was one of my main requirements in making this plugin.

3) Primarily I wanted the ability to pick and choose the data to send over instead of just sending live data. With that I didn't want to rely on using business transactions for the majority of data. As for a single solution I imagine there's some way my peanut butter and your jelly could make a sandwich (big grin)

4) Nothing needs to be configured on the Splunk server. That is unless you change the index being used. Then you need to ensure that exists. The plugin is using the SDK for Java that Splunk provides.
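For anyone curious what "nothing to configure" looks like under the hood: the Splunk SDKs submit events through Splunk's REST interface on the management port. A rough sketch of the endpoint involved; the server name and sourcetype here are placeholder examples, and this only builds the URL rather than sending anything:

```python
from urllib.parse import urlencode

def submit_url(server: str, port: int = 8089, index: str = "dynatrace",
               sourcetype: str = "appmon") -> str:
    """Build the Splunk REST endpoint URL for a one-shot event submit.

    /services/receivers/simple is the endpoint the Splunk SDKs use for
    simple event submission; the index must already exist on the server.
    """
    params = urlencode({"index": index, "sourcetype": sourcetype})
    return f"https://{server}:{port}/services/receivers/simple?{params}"

# The event itself would go in the POST body, authenticated with the
# SplunkUsername/SplunkPassword credentials from the plugin settings.
print(submit_url("splunk.example.com"))
```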

5) I haven't timed the transfer rate yet, but it can take up to 10 minutes to transfer a couple million events. If you are only shipping 10k events, it takes less than a minute.

6) Nice question. Let me get back on this.

Thanks for the good questions. I hope some of these answers helped.

Hi Connor

There are three parts to the Splunk App.

  • One is the Real Time business feed using Flume / Bridge.
  • Second is the Dashboards export similar to what you have here, but configuring the Dashboard Name on the Splunk server and it runs as a Data Input Script.
  • And third is the Alerts Plugin. 

I would be interested in combining this (or replacing the Dashboards export with this), provided we don't see any significant performance impact on the Dynatrace server and this works on a variety of Dashboards/Dashlets. The current Dashboard Export relies on a specific XSL stylesheet, and hence there is a limitation on what you can export unless you change the stylesheet.

By default we use the index=dynatrace in all our data input mechanisms. 

Thanks, Rajesh

I like how this plugin lets us configure the dashboard/measure export from inside Dynatrace instead of having to go to the console of the Splunk server (many customers have a more difficult time getting console access). Since this can be a task run on a collector, I assume no more overhead on the dT server than pulling from Splunk (the only overhead is the REST calls, right?).

Also, the first question my customer had was "how do we specify the index used on the Splunk server?" This plugin makes this clearer and easier to set.

FYI, if the index is specified incorrectly in the task, we get a not very helpful log message on the collector.

You are correct. If you run the task on a collector the only load on the server is the REST calls.

I'm glad you've tried it out. If you come up with some unique dashboards in Splunk using data from the plugin feel free to share them. 

Apologies for the logs/errors the plugin creates. They aren't very specific or detailed as to the issue occurring. Those will be updated very soon.


Would your team also have an interest in being able to specify the source / sourcetype of the data being exported? There are many teams who are specific about the way they organize their data in Splunk and that functionality could be added.


Will this plugin work with AppMon 6.5?

Hello Steve,

I guess it should work because it is platform-independent; you can check more details in the following link.



Hmm, it doesn't list 6.5. I would assume it still works fine, though. Thanks for the link.