yep, see attached. I had been disabling them and ignoring, but the last one showed up today and it looks like it might be a malware site, so trying to understand how the detection works (to see if we have an issue!)
this might indicate a probe for an open proxy - e.g. Apache. What happens is that a script asks Apache to retrieve a different page on its behalf (in these cases google / bing ...)
If that's the case this would be visible in the webserver logs as failed requests to these sites.
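To spot those probes in the logs, a minimal sketch (assuming the Apache "combined" log format): open-proxy probes typically put an absolute URL in the request line instead of a plain path, so filtering on that is enough. The sample lines and the helper name are illustrative, not from any real log.

```python
import re

# Apache "combined" log format: the request line is the first quoted field.
REQUEST_RE = re.compile(r'"(?P<method>[A-Z]+) (?P<target>\S+) HTTP/[\d.]+"\s+(?P<status>\d{3})')

def find_proxy_probes(log_lines):
    """Return (target, status) pairs that look like open-proxy probes:
    requests whose target is an absolute URL (http://...) rather than a
    normal path, e.g. "GET http://www.google.com/ HTTP/1.1"."""
    probes = []
    for line in log_lines:
        m = REQUEST_RE.search(line)
        if m and m.group("target").startswith(("http://", "https://")):
            probes.append((m.group("target"), int(m.group("status"))))
    return probes

sample = [
    '1.2.3.4 - - [10/Oct/2015:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 2326 "-" "Mozilla/5.0"',
    '5.6.7.8 - - [10/Oct/2015:13:55:40 +0000] "GET http://www.google.com/ HTTP/1.1" 404 512 "-" "-"',
]
print(find_proxy_probes(sample))  # → [('http://www.google.com/', 404)]
```

Failed probes would show up with 4xx statuses, as described above.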
Not sure if / how that can be disabled - will need R&D here.
Applications are detected either automatically via HTTP headers or, if the customer creates a detection rule, by a URL pattern. The URL-pattern rules are always considered first.
Auto detection (current version): By default ruxit checks the host, the x-forwarded-for or the x-host header for the "real" domain name - which is then used as the application name. Other headers can be defined under global settings -> real user monitoring -> applications; look for "Identify domain names using HTTP request headers".
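The detection order described above can be sketched roughly as follows. This is purely illustrative - the function, rule format, and header list are assumptions, not Ruxit's actual implementation:

```python
# Illustrative sketch of the detection order: custom URL-pattern rules are
# always checked first, then header-based auto detection. Not real Ruxit code.
DETECTION_HEADERS = ["host", "x-forwarded-for", "x-host"]  # configurable list

def detect_application(headers, url, pattern_rules):
    """pattern_rules: list of (url_pattern, app_name) tuples, checked first."""
    for pattern, app_name in pattern_rules:
        if pattern in url:          # custom URL-pattern rules always win
            return app_name
    for name in DETECTION_HEADERS:  # fall back to header-based auto detection
        value = headers.get(name)
        if value:
            return value.split(":")[0]  # strip any port; domain becomes app name
    return None

rules = [("/shop/", "Webshop")]
print(detect_application({"host": "www.example.com"}, "/index.html", rules))  # → www.example.com
print(detect_application({"host": "www.example.com"}, "/shop/cart", rules))   # → Webshop
```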
If ruxit picks up a new header value (a new domain), it runs several checks before the domain is treated as an application candidate; here are some examples:
Previous versions: unfortunately we didn't have all these additional checks in previous versions, so most of the "www.google.com" applications exist because the auto detection did a bad job. These apps were created because someone sent an HTTP request to your IP with "www.google.com" in the host header. This is done to check whether there is an open proxy. More info: http://meatballwiki.org/wiki/OpenProxy
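The pattern behind those phantom applications can be shown in a few lines. This is purely illustrative (the domain list is a placeholder, and this is not Ruxit's detection logic): a request whose Host header names a domain your server does not actually serve is exactly the kind of open-proxy probe described above.

```python
# Purely illustrative: flag requests whose Host header names a domain this
# server does not serve -- the pattern behind the phantom "www.google.com" apps.
SERVED_DOMAINS = {"www.example.com", "example.com"}  # placeholder for your real domains

def is_probe(host_header):
    host = host_header.split(":")[0].lower()  # drop any :port suffix
    return host not in SERVED_DOMAINS

print(is_probe("www.google.com"))      # → True (likely an open-proxy probe)
print(is_probe("www.example.com:80"))  # → False (legitimate request)
```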
With one of the next releases we will clean up the application lists and remove all these falsely detected applications. For now the best approach is to just disable real user monitoring for them. If you still see newly detected "false" applications, please send me a note or open a support ticket so that we can take a closer look.
No, this will not happen. If you don't have a redirect between www.domain-name.com and domain-name.com (which you should set up anyway, to eliminate the duplicate-content problem with google), we will not merge these apps. If one of these hosts works with a redirect, we will not detect it as a separate application.
So without a redirect you can create a custom rule matching the domain pattern: domain-name.com.
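A hypothetical helper showing why a single rule on "domain-name.com" catches both hosts when no redirect is in place - matching the bare domain plus any subdomain of it. This is an assumption about how such a pattern behaves, not Ruxit's actual matching logic:

```python
# Hypothetical domain-pattern matcher: matches the bare domain and any
# subdomain (www., shop., ...), but not unrelated lookalike domains.
def matches_domain_pattern(host, pattern):
    host = host.lower().rstrip(".")
    return host == pattern or host.endswith("." + pattern)

print(matches_domain_pattern("domain-name.com", "domain-name.com"))       # → True
print(matches_domain_pattern("www.domain-name.com", "domain-name.com"))   # → True
print(matches_domain_pattern("evil-domain-name.com", "domain-name.com"))  # → False
```

The dot-prefixed suffix check is what keeps "evil-domain-name.com" from matching.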
it looks like the list is cleaned up now! unfortunately it also looks like the cleanup might have been too aggressive as a few of my real sites have also disappeared from the application list. what was the fix that was put in place for this? thanks!