<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Reduce/prevent Google Search crawling RUM beacon URLs in Real User Monitoring</title>
    <link>https://community.dynatrace.com/t5/Real-User-Monitoring/Reduce-prevent-Google-Search-crawling-RUM-beacon-URL-s/m-p/235437#M5663</link>
    <description>&lt;P&gt;Hi community,&lt;/P&gt;&lt;P&gt;Our SEO team is seeing that a large share of our Google crawl budget is being spent on Dynatrace RUM beacon URLs (/rb_XXXXX). This is a pity, as the budget should of course go to customer pages (and for some reason customers aren't interested in RUM beacon info.... &lt;span class="lia-unicode-emoji" title=":face_savoring_food:"&gt;😋&lt;/span&gt;)&lt;/P&gt;&lt;P&gt;I've discussed this matter with Dynatrace earlier, and they suggested either using robots.txt or moving the beacon URL to a different domain (Google allocates crawl budget per domain). There is currently no option in the settings to add a NOINDEX directive.&lt;/P&gt;&lt;P&gt;The robots.txt approach isn't optimal, as we sometimes still see Google crawling the URLs. I'd prefer to stay as close as possible to the Dynatrace out-of-the-box configuration, so changing the beacon URL isn't my preference either (I'm using the OneAgent); besides, the crawling issue would remain and simply be pushed to a different domain.&lt;/P&gt;&lt;P&gt;Any suggestions on how we can neatly prevent Google from crawling these useless URLs?&lt;/P&gt;&lt;P&gt;Kind regards,&lt;/P&gt;&lt;P&gt;Daan&lt;/P&gt;</description>
    <pubDate>Thu, 25 Jan 2024 07:04:06 GMT</pubDate>
    <dc:creator>Daniël</dc:creator>
    <dc:date>2024-01-25T07:04:06Z</dc:date>
    <item>
      <title>Reduce/prevent Google Search crawling RUM beacon URLs</title>
      <link>https://community.dynatrace.com/t5/Real-User-Monitoring/Reduce-prevent-Google-Search-crawling-RUM-beacon-URL-s/m-p/235437#M5663</link>
      <description>&lt;P&gt;Hi community,&lt;/P&gt;&lt;P&gt;Our SEO team is seeing that a large share of our Google crawl budget is being spent on Dynatrace RUM beacon URLs (/rb_XXXXX). This is a pity, as the budget should of course go to customer pages (and for some reason customers aren't interested in RUM beacon info.... &lt;span class="lia-unicode-emoji" title=":face_savoring_food:"&gt;😋&lt;/span&gt;)&lt;/P&gt;&lt;P&gt;I've discussed this matter with Dynatrace earlier, and they suggested either using robots.txt or moving the beacon URL to a different domain (Google allocates crawl budget per domain). There is currently no option in the settings to add a NOINDEX directive.&lt;/P&gt;&lt;P&gt;The robots.txt approach isn't optimal, as we sometimes still see Google crawling the URLs. I'd prefer to stay as close as possible to the Dynatrace out-of-the-box configuration, so changing the beacon URL isn't my preference either (I'm using the OneAgent); besides, the crawling issue would remain and simply be pushed to a different domain.&lt;/P&gt;&lt;P&gt;Any suggestions on how we can neatly prevent Google from crawling these useless URLs?&lt;/P&gt;&lt;P&gt;Kind regards,&lt;/P&gt;&lt;P&gt;Daan&lt;/P&gt;</description>
      <pubDate>Thu, 25 Jan 2024 07:04:06 GMT</pubDate>
      <guid>https://community.dynatrace.com/t5/Real-User-Monitoring/Reduce-prevent-Google-Search-crawling-RUM-beacon-URL-s/m-p/235437#M5663</guid>
      <dc:creator>Daniël</dc:creator>
      <dc:date>2024-01-25T07:04:06Z</dc:date>
    </item>
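The robots.txt option discussed in the question above can be sketched as follows. This is a minimal example, assuming the beacon path really does start with /rb_ as described in the post; Disallow rules match by URL-path prefix, so /rb_ covers /rb_XXXXX:

```
User-agent: *
Disallow: /rb_
```

Note that robots.txt only governs crawling, not indexing: a disallowed URL can still appear in the index if it is linked from elsewhere, and crawlers may take some time to pick up an updated robots.txt, which may explain the residual crawling the poster observed.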
    <item>
      <title>Re: Reduce/prevent Google Search crawling RUM beacon URL's</title>
      <link>https://community.dynatrace.com/t5/Real-User-Monitoring/Reduce-prevent-Google-Search-crawling-RUM-beacon-URL-s/m-p/241253#M5845</link>
      <description>&lt;P&gt;There isn't a one-size-fits-all solution. What you can do is block the recording of those IPs. It might be a never-ending battle adding new IPs/ranges, and it's configured on an app-by-app basis, so the API would come in handy for posting the rules to all apps.&amp;nbsp;&lt;/P&gt;&lt;P&gt;Just food for thought:&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="ChadTurner_0-1711562965951.png" style="width: 999px;"&gt;&lt;img src="https://community.dynatrace.com/t5/image/serverpage/image-id/18547iC4A70C57C6EA070E/image-size/large?v=v2&amp;amp;px=999" role="button" title="ChadTurner_0-1711562965951.png" alt="ChadTurner_0-1711562965951.png" /&gt;&lt;/span&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 27 Mar 2024 18:09:50 GMT</pubDate>
      <guid>https://community.dynatrace.com/t5/Real-User-Monitoring/Reduce-prevent-Google-Search-crawling-RUM-beacon-URL-s/m-p/241253#M5845</guid>
      <dc:creator>ChadTurner</dc:creator>
      <dc:date>2024-03-27T18:09:50Z</dc:date>
    </item>
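The reply above suggests using the API to push the same IP-blocking rule to every monitored application instead of clicking through each app. A minimal sketch of that fan-out idea follows; the tenant URL, endpoint path, and payload keys are illustrative placeholders only, not the actual Dynatrace API schema, which should be taken from the official API documentation:

```python
# Sketch: fan one IP-exclusion rule out to many apps via an HTTP API.
# BASE_URL, ENDPOINT_TEMPLATE, and the payload key below are hypothetical
# placeholders; consult the Dynatrace API docs for the real endpoint and
# JSON schema before sending anything.

import json

BASE_URL = "https://example.live.dynatrace.com"  # hypothetical tenant
ENDPOINT_TEMPLATE = "/api/config/v1/applications/web/{app_id}/exclusions"  # placeholder path


def build_requests(app_ids, excluded_ips):
    """Yield one (url, json_body) pair per monitored application."""
    for app_id in app_ids:
        url = BASE_URL + ENDPOINT_TEMPLATE.format(app_id=app_id)
        body = json.dumps({"excludedIpAddresses": excluded_ips})  # placeholder key
        yield url, body


# Each pair could then be sent with an HTTP PUT carrying an API token header.
# 66.249.64.0/19 is given as an example crawler range; verify current ranges
# against Google's published list of crawler IP addresses.
for url, body in build_requests(["APP-1", "APP-2"], ["66.249.64.0/19"]):
    print(url, body)
```

Building the requests separately from sending them keeps the rule definition in one place and makes the loop easy to dry-run before pointing it at a live environment.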
  </channel>
</rss>

