
Rogue applications?

greg_birdwell
Newcomer

Hi,

I have several applications listing in my monitoring that I would not necessarily consider mine (like bing.com, google.com etc). How are those detected? Are they from links on a configured application?

9 REPLIES

michael_ghelli1
Inactive

@Greg Birdwell

Any chance you can attach a picture? I'm having a tough time envisioning what you are describing.

Thanks!

greg_birdwell
Newcomer

Yep, see attached. I had been disabling and ignoring them, but the last one showed up today and it looks like it might be a malware site, so I'm trying to understand how the detection works (to see if we have an issue!)

michael_ghelli1
Inactive

Thanks, Greg. We'll get you an answer. I'm sure this will benefit the whole community.

This might indicate a probe for an open proxy - e.g. Apache. What happens is that a script asks Apache to retrieve a different page for it (in these cases google / bing ...).

If that's the case, this would be visible in the webserver logs as failed requests to these sites.
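One way to spot such probes is to look for request lines that target a foreign absolute URI ("GET http://www.google.com/ ...") instead of a normal relative path. A minimal sketch, assuming an Apache-style access log (the sample lines are made up):

```python
import re

# Open-proxy probes typically put a full absolute URI in the request line;
# ordinary requests to your own server use a relative path like "/index.html".
PROBE_PATTERN = re.compile(r'"(?:GET|POST)\s+https?://[^/"]+')

def find_proxy_probes(log_lines):
    """Return the access-log lines whose request line targets a foreign absolute URI."""
    return [line for line in log_lines if PROBE_PATTERN.search(line)]

sample = [
    '1.2.3.4 - - [10/Oct/2015:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 2326',
    '5.6.7.8 - - [10/Oct/2015:13:55:37 +0000] "GET http://www.google.com/ HTTP/1.1" 404 512',
]
print(find_proxy_probes(sample))  # only the second line matches
```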

Not sure if / how that can be disabled - will need R&D here.

AlexanderSommer
Dynatrace Pro

Hi Greg,

Applications are detected either automatically by HTTP headers or via a detection rule the customer creates with a URL pattern. The URL pattern rules are always considered first.

Auto detection (current version): By default ruxit checks the host, the x-forwarded-for or the x-host header for the "real" domain name, which is then used as the application name. Other headers can be defined under Global settings -> Real user monitoring -> Applications; look for "Identify domain names using HTTP request headers".
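The header lookup Alex describes could be sketched roughly like this; the exact ordering and normalization are assumptions, not the actual ruxit logic:

```python
# Sketch of header-based domain detection: check the configured headers in
# order and take the first one that carries a value. The ordering here
# (proxy headers before host) is an assumption for illustration.
DOMAIN_HEADERS = ["x-host", "x-forwarded-for", "host"]

def detect_application_domain(headers):
    """Pick the application domain from the first matching request header."""
    normalized = {name.lower(): value for name, value in headers.items()}
    for name in DOMAIN_HEADERS:
        value = normalized.get(name)
        if value:
            return value.split(":")[0]  # strip a port like example.com:8080
    return None

print(detect_application_domain({"Host": "shop.example.com:8080"}))  # shop.example.com
```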

If ruxit picks up a new header value (new domain), it performs some checks before the application is treated as an application candidate; here are some examples:

  • header must be valid according to the specification
  • response code must be valid: 200
  • known robots and crawlers are ignored for new applications
  • and some more ...

When these checks are passed, the new domain becomes an application candidate and the JavaScript tag is injected. Only if a monitoring signal from the JavaScript tag for that application finds its way to ruxit does it become a real application and show up in the list.
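The checks listed above might look something like the following sketch; the actual rules and the bot list are internal to ruxit, so these values are stand-ins:

```python
import re

# Stand-in list of crawler markers; the real product's bot list is internal.
KNOWN_BOTS = ("googlebot", "bingbot", "crawler", "spider")

# Rough hostname validity check (letters, digits, dots, hyphens).
VALID_HOST = re.compile(r"^[a-z0-9]([a-z0-9.-]*[a-z0-9])?$", re.IGNORECASE)

def is_application_candidate(host_header, status_code, user_agent):
    """Apply the example checks: valid header, 200 response, no known bots."""
    if not host_header or not VALID_HOST.match(host_header):
        return False  # header must be valid according to the specification
    if status_code != 200:
        return False  # response code must be valid: 200
    ua = (user_agent or "").lower()
    if any(bot in ua for bot in KNOWN_BOTS):
        return False  # known robots and crawlers are ignored
    return True

print(is_application_candidate("shop.example.com", 200, "Mozilla/5.0"))    # True
print(is_application_candidate("shop.example.com", 200, "Googlebot/2.1"))  # False
```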

Previous versions: unfortunately we didn't have all these additional checks in previous versions, so most of the "www.google.com" applications exist because we did a bad job on the auto detection. These apps were created because someone sent an HTTP request to your IP with "www.google.com" in the host header. This is done to check whether there is an open proxy. More info: http://meatballwiki.org/wiki/OpenProxy
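On the wire, such a probe is just an ordinary request to *your* server's IP with a foreign domain in the Host header. A sketch that only builds the request text (it does not send anything; the IP is purely illustrative):

```python
# Build the raw HTTP text of an open-proxy probe. Note that the target IP
# is only the TCP destination; it never appears in the request itself -
# the spoofed Host header is all the server sees.
def build_probe(target_ip, spoofed_host="www.google.com"):
    return (
        "GET / HTTP/1.1\r\n"
        f"Host: {spoofed_host}\r\n"
        "Connection: close\r\n"
        "\r\n"
    )

print(build_probe("203.0.113.10"))
```

An open proxy would fetch www.google.com on the probe's behalf; a normal server answers with its own content (or an error), which is what shows up as a falsely detected application.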

With one of the next releases we will clean up the application lists and remove all these falsely detected applications. For now the best approach is simply to disable real user monitoring for them. If you still see newly detected "false" applications, please send me a note or open a support ticket so that we can take a closer look.

Alex

Hey Alex,

Will this cleanup also include merging www.domain-name.com and domain-name.com? I see that they are currently listed as two different applications.

No, this will not happen. If you don't have a redirect for either www.domain-name.com or domain-name.com (which should be set up to avoid the duplicate-content problem with Google), we will not merge these apps. If one of these hosts works with a redirect, we will not detect it as a separate application.

So without a redirect you can create a custom rule matching the domain pattern: domain-name.com.
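The effect of such a rule could be sketched as follows, where "domain-name.com" is just the placeholder used above: matching the bare domain and any of its subdomains (including www) into one application.

```python
# Sketch of a custom detection rule: one pattern that covers both
# domain-name.com and www.domain-name.com (and other subdomains),
# so they land in a single application.
def matches_rule(host, rule_domain="domain-name.com"):
    """True if the host is the rule domain or one of its subdomains."""
    host = host.lower().split(":")[0]  # ignore case and any port
    return host == rule_domain or host.endswith("." + rule_domain)

print(matches_rule("www.domain-name.com"))   # True
print(matches_rule("domain-name.com"))       # True
print(matches_rule("evil-domain-name.com"))  # False
```

The dot in the suffix check matters: without it, a lookalike host such as evil-domain-name.com would slip into the same application.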

greg_birdwell
Newcomer

It looks like the list is cleaned up now! Unfortunately it also looks like the cleanup might have been too aggressive, as a few of my real sites have also disappeared from the application list. What was the fix that was put in place for this? Thanks!

After 72 hours auto-detected applications will be hidden, but they will appear again immediately with the first user action.