<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: How to avoid creating duplicate business events when using openpipeline? in Log Analytics</title>
    <link>https://community.dynatrace.com/t5/Log-Analytics/How-to-avoid-creating-duplicate-business-events-when-using/m-p/274497#M1262</link>
    <description>&lt;P&gt;Hi,&lt;BR /&gt;this thread should help you:&amp;nbsp;&lt;BR /&gt;&lt;A href="https://community.dynatrace.com/t5/Extensions/Log-Ingestion-From-a-Database-Table/td-p/231961" target="_blank"&gt;https://community.dynatrace.com/t5/Extensions/Log-Ingestion-From-a-Database-Table/td-p/231961&lt;/A&gt;&lt;BR /&gt;&lt;BR /&gt;TL;DR: adjust the ingesting system's fetching rules, or use the filtering options when reading the logs (for example, &lt;A href="https://docs.dynatrace.com/docs/shortlink/filtering-commands#dedup" target="_blank"&gt;dedup&lt;/A&gt;)&lt;/P&gt;</description>
    <pubDate>Sat, 05 Apr 2025 21:16:58 GMT</pubDate>
    <dc:creator>yanezza</dc:creator>
    <dc:date>2025-04-05T21:16:58Z</dc:date>
    <item>
      <title>How to avoid creating duplicate business events when using openpipeline?</title>
      <link>https://community.dynatrace.com/t5/Log-Analytics/How-to-avoid-creating-duplicate-business-events-when-using/m-p/272718#M1250</link>
      <description>&lt;P&gt;Hello, we have a system that runs an SQL query at intervals and writes the result to a log file. The log file is ingested into Dynatrace via OpenPipeline, and bizevents are extracted from the logs. Unfortunately, the SQL query often returns the same information (it does not tail new rows; rather, it reads the content of a table). This results in many duplicate log entries being created, as well as multiple duplicate business events. How can we avoid ingesting duplicate business events when using OpenPipeline? Is there a way to filter out an entry if it already exists? Each of our log entries contains an ID that uniquely identifies the entry, so it would be easy to filter out.&lt;/P&gt;&lt;P&gt;When attempting to use a lookup or fetch command in DQL within OpenPipeline, it seems this is not permitted.&lt;/P&gt;&lt;P&gt;Your help and recommendations are appreciated.&lt;/P&gt;</description>
      <pubDate>Mon, 17 Mar 2025 13:46:21 GMT</pubDate>
      <guid>https://community.dynatrace.com/t5/Log-Analytics/How-to-avoid-creating-duplicate-business-events-when-using/m-p/272718#M1250</guid>
      <dc:creator>strudeau</dc:creator>
      <dc:date>2025-03-17T13:46:21Z</dc:date>
    </item>
    <item>
      <title>Re: How to avoid creating duplicate business events when using openpipeline?</title>
      <link>https://community.dynatrace.com/t5/Log-Analytics/How-to-avoid-creating-duplicate-business-events-when-using/m-p/274497#M1262</link>
      <description>&lt;P&gt;Hi,&lt;BR /&gt;this thread should help you:&amp;nbsp;&lt;BR /&gt;&lt;A href="https://community.dynatrace.com/t5/Extensions/Log-Ingestion-From-a-Database-Table/td-p/231961" target="_blank"&gt;https://community.dynatrace.com/t5/Extensions/Log-Ingestion-From-a-Database-Table/td-p/231961&lt;/A&gt;&lt;BR /&gt;&lt;BR /&gt;TL;DR: adjust the ingesting system's fetching rules, or use the filtering options when reading the logs (for example, &lt;A href="https://docs.dynatrace.com/docs/shortlink/filtering-commands#dedup" target="_blank"&gt;dedup&lt;/A&gt;)&lt;/P&gt;</description>
      <pubDate>Sat, 05 Apr 2025 21:16:58 GMT</pubDate>
      <guid>https://community.dynatrace.com/t5/Log-Analytics/How-to-avoid-creating-duplicate-business-events-when-using/m-p/274497#M1262</guid>
      <dc:creator>yanezza</dc:creator>
      <dc:date>2025-04-05T21:16:58Z</dc:date>
    </item>
  </channel>
</rss>