<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Custom Extension Creator - SQL Extension - Ingesting large data into logs in Extensions</title>
    <link>https://community.dynatrace.com/t5/Extensions/Custom-Extension-Creator-SQL-Extension-Ingesting-large-data-into/m-p/292159#M6962</link>
    <description>&lt;P&gt;What happens if you hard-set it in the .py file? It's not the most beautiful thing, but I guess it could work?&lt;/P&gt;</description>
    <pubDate>Wed, 24 Dec 2025 15:28:24 GMT</pubDate>
    <dc:creator>michiel_otten</dc:creator>
    <dc:date>2025-12-24T15:28:24Z</dc:date>
    <item>
      <title>Custom Extension Creator - SQL Extension - Ingesting large data into logs</title>
      <link>https://community.dynatrace.com/t5/Extensions/Custom-Extension-Creator-SQL-Extension-Ingesting-large-data-into/m-p/292133#M6959</link>
      <description>&lt;P&gt;Hello,&lt;BR /&gt;&lt;BR /&gt;I have an initiative to ingest data from a SQL server into logs.&lt;BR /&gt;&lt;BR /&gt;For this, I checked the features provided by the Custom Extension Creator for selecting and extracting SQL data.&lt;BR /&gt;It worked quite well, but I don't think it handles large data very well, whether retrieved from the server at once or based on checkpoints such as iterative batches and the last ingested timestamp.&lt;BR /&gt;Considering this, I was thinking of using the Dynatrace Python Extension SDK, where I intend to use a field for the SQL query...&lt;BR /&gt;But the inconvenience with this approach is that there is no way to have a multiline field for more complex SQL queries, since the monitoring configuration is stored in a JSON structure rather than a YAML file that supports new lines. If a complex SQL query contains comments, whatever comes after the "--" comment marker will be ignored &lt;span class="lia-unicode-emoji" title=":thinking_face:"&gt;🤔&lt;/span&gt;&lt;BR /&gt;I will see how far I can get implementing the data ingestion from a SQL server using the Dynatrace Python Extension SDK.&lt;BR /&gt;&lt;BR /&gt;Please let me know if you have any other suggestions or workarounds.&lt;BR /&gt;&lt;BR /&gt;Thanks,&lt;BR /&gt;Chris&lt;/P&gt;</description>
      <pubDate>Wed, 24 Dec 2025 03:59:45 GMT</pubDate>
      <guid>https://community.dynatrace.com/t5/Extensions/Custom-Extension-Creator-SQL-Extension-Ingesting-large-data-into/m-p/292133#M6959</guid>
      <dc:creator>chris_cho</dc:creator>
      <dc:date>2025-12-24T03:59:45Z</dc:date>
    </item>
    <item>
      <title>Re: Custom Extension Creator - SQL Extension - Ingesting large data into logs</title>
      <link>https://community.dynatrace.com/t5/Extensions/Custom-Extension-Creator-SQL-Extension-Ingesting-large-data-into/m-p/292149#M6960</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;Do you want to ingest extension logs? Or which logs?&lt;/P&gt;&lt;P&gt;My first recommendation would be to use OneAgent, if you have OneAgent installed in infrastructure mode on the SQL host.&lt;/P&gt;&lt;P&gt;Best regards&lt;/P&gt;</description>
      <pubDate>Wed, 24 Dec 2025 11:32:18 GMT</pubDate>
      <guid>https://community.dynatrace.com/t5/Extensions/Custom-Extension-Creator-SQL-Extension-Ingesting-large-data-into/m-p/292149#M6960</guid>
      <dc:creator>AntonPineiro</dc:creator>
      <dc:date>2025-12-24T11:32:18Z</dc:date>
    </item>
    <item>
      <title>Re: Custom Extension Creator - SQL Extension - Ingesting large data into logs</title>
      <link>https://community.dynatrace.com/t5/Extensions/Custom-Extension-Creator-SQL-Extension-Ingesting-large-data-into/m-p/292158#M6961</link>
      <description>&lt;P&gt;Thank you for your reply.&lt;BR /&gt;It's not about SQL logs, but about specific data from particular tables.&lt;/P&gt;</description>
      <pubDate>Wed, 24 Dec 2025 15:08:19 GMT</pubDate>
      <guid>https://community.dynatrace.com/t5/Extensions/Custom-Extension-Creator-SQL-Extension-Ingesting-large-data-into/m-p/292158#M6961</guid>
      <dc:creator>chris_cho</dc:creator>
      <dc:date>2025-12-24T15:08:19Z</dc:date>
    </item>
    <item>
      <title>Re: Custom Extension Creator - SQL Extension - Ingesting large data into logs</title>
      <link>https://community.dynatrace.com/t5/Extensions/Custom-Extension-Creator-SQL-Extension-Ingesting-large-data-into/m-p/292159#M6962</link>
      <description>&lt;P&gt;What happens if you hard-set it in the .py file? It's not the most beautiful thing, but I guess it could work?&lt;/P&gt;</description>
      <pubDate>Wed, 24 Dec 2025 15:28:24 GMT</pubDate>
      <guid>https://community.dynatrace.com/t5/Extensions/Custom-Extension-Creator-SQL-Extension-Ingesting-large-data-into/m-p/292159#M6962</guid>
      <dc:creator>michiel_otten</dc:creator>
      <dc:date>2025-12-24T15:28:24Z</dc:date>
    </item>
    <item>
      <title>Re: Custom Extension Creator - SQL Extension - Ingesting large data into logs</title>
      <link>https://community.dynatrace.com/t5/Extensions/Custom-Extension-Creator-SQL-Extension-Ingesting-large-data-into/m-p/292160#M6963</link>
      <description>&lt;P&gt;Thank you for your reply.&lt;BR /&gt;&lt;BR /&gt;A hardcoded SQL statement would not be a viable or flexible option.&lt;BR /&gt;The main idea is to provide a &lt;STRONG&gt;generic implementation&lt;/STRONG&gt; for self-service, so anyone could use this Dynatrace extension for their needs by providing all the elements to extract the SQL data and ingest it into logs, events, or bizevents.&lt;BR /&gt;An alternative to a one-line field for the SQL query would be using external files stored in a specific location, such as a Nexus folder, which would be uploaded and executed by the Dynatrace extension. Work in progress...&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="chris_cho_0-1766602334205.png" style="width: 400px;"&gt;&lt;img src="https://community.dynatrace.com/t5/image/serverpage/image-id/31342i73249743116FAB03/image-size/medium?v=v2&amp;amp;px=400" role="button" title="chris_cho_0-1766602334205.png" alt="chris_cho_0-1766602334205.png" /&gt;&lt;/span&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 24 Dec 2025 18:52:29 GMT</pubDate>
      <guid>https://community.dynatrace.com/t5/Extensions/Custom-Extension-Creator-SQL-Extension-Ingesting-large-data-into/m-p/292160#M6963</guid>
      <dc:creator>chris_cho</dc:creator>
      <dc:date>2025-12-24T18:52:29Z</dc:date>
    </item>
    <item>
      <title>Re: Custom Extension Creator - SQL Extension - Ingesting large data into logs</title>
      <link>https://community.dynatrace.com/t5/Extensions/Custom-Extension-Creator-SQL-Extension-Ingesting-large-data-into/m-p/293572#M7019</link>
      <description>&lt;P&gt;Finally, I found the solution and implemented it successfully using a custom extension with the Python SDK.&lt;BR /&gt;Also, a multiline field should include&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;"type": "text",
"subType": "multiline",&lt;/LI-CODE&gt;&lt;P&gt;Note that the property keys are case-sensitive &lt;span class="lia-unicode-emoji" title=":winking_face:"&gt;😉&lt;/span&gt;&lt;/P&gt;</description>
      <pubDate>Fri, 23 Jan 2026 16:38:36 GMT</pubDate>
      <guid>https://community.dynatrace.com/t5/Extensions/Custom-Extension-Creator-SQL-Extension-Ingesting-large-data-into/m-p/293572#M7019</guid>
      <dc:creator>chris_cho</dc:creator>
      <dc:date>2026-01-23T16:38:36Z</dc:date>
    </item>
  </channel>
</rss>

