29 Oct 2014 08:11 PM - last edited on 01 Sep 2022 11:06 AM by MaciejNeumann
Hi,
I have several applications listed in my monitoring that I would not necessarily consider mine (like bing.com, google.com, etc.). How are those detected? Are they from links on a configured application?
29 Oct 2014 08:24 PM
Any chance you can attach a picture? I'm having a tough time envisioning what you are describing.
Thanks!
29 Oct 2014 08:35 PM
Yep, see attached. I had been disabling and ignoring them, but the last one showed up today and it looks like it might be a malware site, so I'm trying to understand how the detection works (to see if we have an issue!).
29 Oct 2014 08:54 PM
Thanks, Greg. We'll get you an answer. I'm sure this will benefit the whole community.
29 Oct 2014 09:08 PM
This might indicate a probe for an open proxy, e.g. Apache. What happens is that a script asks Apache to retrieve a different page on its behalf (in these cases google / bing ...).
If that's the case, this would be visible in the webserver logs as failed requests to these sites.
Not sure if / how that can be disabled - will need R&D here.
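As a rough illustration of what such a probe looks like in an access log: a proxy-style request targets an absolute URL instead of a path. The log format and the sample lines below are assumptions for illustration, not taken from Greg's server:

```python
import re

# Request field of an Apache combined-log line: method, target, protocol,
# e.g. "GET /index.html HTTP/1.1"
REQUEST_RE = re.compile(r'"(?P<method>[A-Z]+) (?P<target>\S+) HTTP/[\d.]+"')

def is_proxy_probe(log_line):
    """Return True if the request target is an absolute URL, which is how
    open-proxy probes typically appear (a normal request targets a path)."""
    m = REQUEST_RE.search(log_line)
    if not m:
        return False
    return m.group("target").startswith(("http://", "https://"))

# Hypothetical sample lines:
normal = '1.2.3.4 - - [29/Oct/2014:20:00:00 +0000] "GET /index.html HTTP/1.1" 200 512'
probe  = '5.6.7.8 - - [29/Oct/2014:20:01:00 +0000] "GET http://www.google.com/ HTTP/1.1" 404 0'
```

Grepping a real access log for `"GET http` would surface the same pattern without any scripting.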
30 Oct 2014 08:36 AM
Hi Greg,
Applications are detected either automatically via HTTP headers, or via a detection rule with a URL pattern that the customer creates. The URL pattern rules are always considered first.
Auto detection (current version): By default, ruxit checks the host, x-forwarded-for, or x-host header for the "real" domain name, which is then used as the application name. Other headers can be defined under Settings -> Real user monitoring -> Applications; look for "Identify domain names using HTTP request headers".
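A minimal sketch of that header-based lookup, using the header names from this post; the precedence order, the normalization, and the function name are my assumptions, not ruxit's actual implementation:

```python
def detect_application_domain(headers, extra_headers=()):
    """Pick the domain used as the application name from request headers.

    `headers` is assumed to be a dict with lowercase header names.
    `extra_headers` stands in for the additional headers configurable in
    the settings; they are checked before the built-in ones here.
    """
    for name in tuple(extra_headers) + ("x-host", "x-forwarded-for", "host"):
        value = headers.get(name.lower())
        if value:
            # Take the first list entry and strip any port,
            # e.g. "shop.example.com:8080" -> "shop.example.com"
            return value.split(",")[0].strip().split(":")[0]
    return None
```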
If ruxit picks up a new header value (a new domain), it runs several checks before the domain is treated as an application candidate.
Once these checks are passed, the new domain becomes an application candidate and the JavaScript tag is injected. Only if a monitoring signal from the JavaScript tag for that application finds its way to ruxit does it become a real application and show up in the list.
Previous versions: unfortunately, we didn't have all these additional checks in previous versions, so most of the "www.google.com" applications exist because we did a bad job on the auto detection. These apps were created because someone sent an HTTP request to your IP with "www.google.com" in the host header. This is done to check whether there is an open proxy. More info: http://meatballwiki.org/wiki/OpenProxy
With one of the next releases we will clean up the application lists and remove all of these falsely detected applications. For now, the best workaround is to disable real user monitoring for them. If you still see new falsely detected applications, please send me a note or open a support ticket so that we can take a closer look.
Alex
10 Nov 2014 01:33 PM
Hey Alex,
Will this cleanup also include merging www.domain-name.com and domain-name.com? I see that they are listed as two different applications now.
10 Nov 2014 04:21 PM
No, this will not happen. If you don't have a redirect from either www.domain-name.com or domain-name.com (which you should set up anyway to avoid the duplicate-content problem with Google), we will not merge these apps. If one of these hosts responds with a redirect, we will not detect it as a separate application.
So without a redirect, you can create a custom rule matching the domain pattern domain-name.com.
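A minimal sketch of such a redirect as a WSGI app, assuming you want the bare domain to be canonical (domain-name.com is the placeholder from this thread; in practice this is usually done in the webserver config instead):

```python
def www_redirect_app(environ, start_response):
    """301-redirect www.<domain> to the bare domain so that only one
    application is detected. Falls through to a dummy response otherwise."""
    host = environ.get("HTTP_HOST", "")
    if host.startswith("www."):
        target = "http://" + host[len("www."):] + environ.get("PATH_INFO", "/")
        start_response("301 Moved Permanently", [("Location", target)])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"hello"]
```

With the redirect in place, monitoring only ever sees the canonical host, so no second application is created.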
20 Nov 2014 09:42 PM
It looks like the list is cleaned up now! Unfortunately, it also looks like the cleanup might have been too aggressive, as a few of my real sites have also disappeared from the application list. What was the fix that was put in place for this? Thanks!
21 Nov 2014 06:34 AM
After 72 hours, auto-detected applications are hidden, but they will appear again immediately with the first user action.