Splunk AppMon Integration

harin_jalajayes
Contributor

Hi,

I have a requirement to send AppMon data to Splunk. I am aware of the real-time streaming capability to Splunk; in that setup, Dynatrace sends data to the search head server where Flume is running.

We are looking for a solution where AppMon data can be posted to a different server, from where Splunk forwarders will do the rest of the job.

Any suggestions on this?

Thanks,

Harin

4 REPLIES

arihant_polavar
Dynatrace Pro

You can definitely do what you are asking; in fact, that is what we did at my client site. Here's what we did:

1. Install a Splunk Heavy Forwarder on a different server.
2. Install the Flume server and the Splunk app on the Heavy Forwarder.
3. Send the AppMon data to this Flume server.
4. Query the indexes on the Heavy Forwarder from the main Splunk infrastructure.
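For the last step, one common way to query the Heavy Forwarder's indexes from the main Splunk infrastructure is to add it as a search peer on the search head. A minimal distsearch.conf sketch; the host name is a placeholder, and 8089 is Splunk's default management port:

```ini
[distributedSearch]
servers = https://heavyforwarder.example.com:8089
```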

Let me know if you have any questions.

Thanks,

Ari

harin_jalajayes
Contributor

Thanks @Ari P. That helps. I believe installing the app takes care of Flume as well.

So you do a scheduled cleanup as well using Flume?

We won't be using the dashboards that come with the Splunk app. I believe you suggested the Heavy Forwarder so that I can install the app and get Flume along with it. What if I just have to write the Dynatrace data as log files on a different server and hand the job over to a Universal Forwarder?

Also, I heard about Splunk HEC (the HTTP Event Collector). Does that do a better job?

Thanks,

Harin

arihant_polavar
Dynatrace Pro

You are correct. Installing the app takes care of the Flume process as well.

Yes, we do have a scheduled cleanup of the AppMon data folders; ours runs weekly. Since the data is indexed into our actual Splunk infrastructure in near real time and stored there, you don't really need those folders to sit there and grow in size.
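That weekly cleanup can be a simple cron job. A minimal sketch, assuming hypothetical folder names for the BT feed output (adjust the paths and subfolder names to your setup):

```shell
#!/bin/sh
# Hypothetical weekly cleanup of AppMon BT feed output folders.
# The subfolder names below are assumptions; match them to your setup.
# Example cron entry (Sundays at 03:00):
#   0 3 * * 0 /opt/scripts/appmon_cleanup.sh
appmon_cleanup() {
    dir="$1"    # base folder the BT feed writes into
    days="$2"   # delete files older than this many days
    for sub in purepaths useractions visits alerts; do
        if [ -d "$dir/$sub" ]; then
            find "$dir/$sub" -type f -mtime +"$days" -delete
        fi
    done
}

# BTFEED_DIR is a placeholder location, overridable via environment.
appmon_cleanup "${BTFEED_DIR:-/opt/appmon/btfeed}" 7
```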

We don't really use the dashboards either. We mostly use our integration to send Business Transaction data from AppMon into Splunk. The BT Feed writes all the data into three separate folders (one for PurePaths, one for User Actions, and one for Visits); alerts get their own folder as well. You can essentially treat these folders as log files and hand them over to the Universal Forwarder.
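Handing those folders to a Universal Forwarder is then just a standard file monitor. A minimal inputs.conf sketch, assuming hypothetical folder paths, index, and sourcetype names:

```ini
[monitor:///opt/appmon/btfeed/purepaths]
index = appmon
sourcetype = appmon:purepath

[monitor:///opt/appmon/btfeed/alerts]
index = appmon
sourcetype = appmon:alert
```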

I personally do not have any experience with the Splunk HEC so I can't really speak to that.
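For what it's worth, HEC accepts events as JSON over HTTPS (default port 8088) with a token in the Authorization header. A sketch of the call shape, with placeholder host and token; the echo prints the command instead of sending it, so remove the echo (and supply real values) to post for real:

```shell
#!/bin/sh
# Sketch of a Splunk HTTP Event Collector (HEC) call.
# Host and token below are placeholders, not real values.
HEC_HOST="${HEC_HOST:-splunk.example.com}"
HEC_TOKEN="${HEC_TOKEN:-replace-with-your-hec-token}"
# HEC expects a JSON body with at least an "event" field.
payload='{"event":"appmon test event","sourcetype":"appmon:btfeed"}'
# echo prints the command instead of executing it.
echo curl -k "https://$HEC_HOST:8088/services/collector/event" \
  -H "Authorization: Splunk $HEC_TOKEN" -d "$payload"
```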

Hope this helps,

Ari

Thanks @Ari P.