14 Dec 2023 07:40 PM
I have been researching this for a while and wanted to ask if anyone is doing something similar. My organization has a couple of applications that store logs in an MSSQL database table, and we would like to ingest them into Dynatrace as log data. I have been looking at several of the database extensions and they all seem to focus on metric data instead of log data, but I noticed that the "Custom database extension" says it "allows you to run queries against database and send the results back to Dynatrace as metrics and logs".
I went ahead and set up VS Code with the Dynatrace extension and was able to build a basic extension, but I'm struggling to find documentation for all the directives, and all the examples are for metric ingestion, not logs. I have the monitoring configuration file with the database configuration, and I found a key for extension.yaml called "sqlServer", but I can't figure out the syntax.
Is it possible to use Dynatrace to ingest a database table so that it shows up in Logs and Events? Does anyone have code examples of using an extension to ingest a database table as logs? We're using Dynatrace SaaS with agents updated to the latest version and logs powered by Grail enabled.
14 Dec 2023 11:17 PM
Yes, extensions are oriented more toward ingesting metrics, but there's no issue with ingesting logs; we are doing it for Oracle, MSSQL, and DB2.
The trick is to ingest through the Log API, and to send it through the local ActiveGate API so it's more efficient. At the moment all of our integrations are on EF1, but that will be unsupported after Oct 2024. If you have to do it now, though, you have to go through EF1, as EF2 support will only be available after the next Perform.
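For reference, this is roughly the payload our integrations POST to the local ActiveGate log ingest endpoint; a minimal sketch, assuming an Environment ActiveGate and an access token with the logs.ingest scope (the host, environment ID, and attribute values are placeholders):
POST https://{activegate-host}:9999/e/{environment-id}/api/v2/logs/ingest
Authorization: Api-Token {token}
Content-Type: application/json

[
  {
    "content": "ORA-00060: deadlock detected while waiting for resource",
    "timestamp": "2023-12-14T19:40:00.000Z",
    "severity": "error",
    "log.source": "oracle_alert_log"
  }
]
Each object in the array becomes one log record in Grail.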
15 Dec 2023 09:06 AM - edited 15 Dec 2023 09:08 AM
Hi,
This is already possible using the method you just tried; there's no need to wait for Perform or for coded extensions in the new framework.
Here's a working example of how to do it with MSSQL:
subgroups:
  - subgroup: longest_queries
    interval:
      minutes: 5
    featureSet: Queries
    ingest: log
    query: >
      SELECT TOP (100)
        UPPER(ISNULL(CONVERT(sysname, SERVERPROPERTY('InstanceName')), 'MSSQLSERVER')) AS instance_name,
        LEFT(
          CONVERT(sysname, SERVERPROPERTY('ServerName')),
          ISNULL(
            NULLIF(CHARINDEX('\', CONVERT(sysname, SERVERPROPERTY('ServerName'))) - 1, -1),
            LEN(CONVERT(sysname, SERVERPROPERTY('ServerName')))
          )
        ) AS server_name,
        p.query_id AS query_id,
        p.query_plan AS query_plan,
        q.object_id AS object_id,
        ISNULL(OBJECT_NAME(q.object_id), '') AS object_name,
        qt.query_sql_text AS query_sql_text,
        ROUND(CONVERT(float, SUM(rs.avg_duration * rs.count_executions)) * 0.001, 2) AS total_duration,
        ROUND(CONVERT(float, SUM(rs.avg_duration)) * 0.001, 2) AS avg_duration,
        SUM(rs.count_executions) AS num_executions,
        COUNT(DISTINCT p.plan_id) AS num_query_plans
      FROM sys.query_store_runtime_stats rs
        JOIN sys.query_store_plan p ON p.plan_id = rs.plan_id
        JOIN sys.query_store_query q ON q.query_id = p.query_id
        JOIN sys.query_store_query_text qt ON q.query_text_id = qt.query_text_id
      WHERE rs.first_execution_time <= CURRENT_TIMESTAMP
        AND rs.last_execution_time >= DATEADD(minute, -5, CURRENT_TIMESTAMP)
      GROUP BY p.query_id, qt.query_sql_text, q.object_id, p.query_plan
      HAVING COUNT(DISTINCT p.plan_id) >= 1
      ORDER BY total_duration DESC
    attributes:
      - key: extension
        value: const:sql_mssql
      - key: event.group
        value: const:longest_queries
      - key: instance
        value: col:instance_name
      - key: server
        value: col:server_name
      - key: content
        value: col:query_sql_text
      - key: query_id
        value: col:query_id
      - key: query_plan
        value: col:query_plan
      - key: object_id
        value: col:object_id
      - key: object_name
        value: col:object_name
      - key: total_duration
        value: col:total_duration
      - key: avg_duration
        value: col:avg_duration
      - key: num_executions
        value: col:num_executions
      - key: num_query_plans
        value: col:num_query_plans
This example is from our MSSQL extension, which is currently on the Hub. You can download the full extension here: https://www.dynatrace.com/hub/detail/microsoft-sql-server-2/#release-notes
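Once the monitoring configuration is running, a Notebook query along these lines should show the ingested entries (a sketch based on the attributes defined above):
fetch logs
| filter event.group == "longest_queries"
| fields timestamp, instance, server, content, total_duration, num_executions
| sort total_duration desc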
15 Dec 2023 06:37 PM
Thanks for the info, folks! I managed to get the extension deployed and the monitoring configuration status is "OK", but I don't see any database logs. Does anyone have any ideas why the logs aren't showing up in Logs and Events?
Here's my proof-of-concept configuration in extension.yaml:
15 Dec 2023 06:58 PM
There can be a few issues. If you look inside the extension log files on the ActiveGate, you might get a clue. You can also open a support case.
15 Dec 2023 08:10 PM
@EPrice First, I would recommend naming the columns explicitly, so in your case:
sqlServer:
  - group: test
    query: SELECT Message, TimeUtc FROM database_name.dbo.ELMAH_Error
    attributes:
      - key: content
        value: col:Message
      - key: time
        value: col:TimeUtc
    ingest: log
Check both the extension status and log files for the extensions on the ActiveGate.
10 Jan 2024 07:01 PM
"I've tried to find information on how to configure the Yaml, but I haven't found it. Could you assist me with this? I'm stuck at this point, and this is what I've got so far:"
11 Jan 2024 07:18 AM - edited 11 Jan 2024 07:20 AM
Hey @carlos_carreno ,
see a more comprehensive example below. In this case the extension will run the query every 5 minutes (the interval setting; optional) if the userlog feature set is enabled (feature sets show up as configuration options in the extension, so you can enable or disable this query per monitoring configuration; again optional).
The extension will then execute the query and ingest the output as log records (ingest: log), and you can map the result columns to log attributes. At a minimum you need to fill the content attribute; that's the log message.
name: custom:corona.extension
version: "0.0.1"
minDynatraceVersion: "1.282.0"
author:
  name: cjch1979
sqlServer:
  - group: ownmetrics
    interval:
      minutes: 5
    ingest: log
    featureSet: userlog
    query: >
      select count(*) as count, user as user,
      concat('User ', user, ' has ', count(*), ' logins') as content
      from users
      group by user
    attributes:
      - key: users
        value: col:user
      - key: count
        value: col:count
      - key: content
        value: col:content
Every row from the query will be ingested as a separate log entry.
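To verify the ingestion, you can list the generated entries in a Notebook; a sketch based on the attributes above:
fetch logs
| filter isNotNull(users)
| fields timestamp, content, users, count
| sort timestamp desc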
Hope this helps.
19 Jan 2024 09:38 PM
Another thing to be careful about is the version number: if you use the latest Dynatrace version in Julius' example, it won't work, because the version actually deployed in your environment is one release behind.
19 Jan 2024 07:59 PM
I'm still working out how best to accomplish this, but one of the challenges we identified is that logs ingested by this method don't show up under "log.source", which means you have to query the events directly in advanced mode. I ended up defining a custom column so we can filter on that field.
24 Jan 2024 04:03 PM
Thanks for everyone's help. I was able to speak with Mike and Stefan, who were very supportive. Here's what I've found so far as a minimal configuration in extension.yaml. This is a working solution that has been deployed and validated.
name: custom:application.ingestion
version: 0.0.5
minDynatraceVersion: "1.279.0"
author:
  name: Disco Stu
sqlServer:
  - group: instance
    interval:
      minutes: 10
    ingest: log
    query: SELECT 'APPLICATION_NAME' AS query, ErrorID, ErrorDescription, ErrorFunction, DateCreated FROM dbo.tblErrorLog WITH (NOLOCK) WHERE DateCreated >= DATEADD(minute, -15, GETDATE())
    attributes:
      - key: query
        value: col:query
      - key: ErrorID
        value: col:ErrorID
      - key: content
        value: col:ErrorDescription
      - key: Type
        value: col:ErrorFunction
      - key: Time
        value: col:DateCreated
This works with some caveats. Since the ingested logs don't show up under log.source, we query them directly in advanced mode:
fetch logs
| filter contains(query, "APPLICATION_NAME")
| fields Time, content, Type, ErrorID
| sort ErrorID desc
Presently we're facing an issue with duplicate events: the query captures the past 15 minutes of logs but runs every 10 minutes, so each run re-ingests roughly 5 minutes of overlap. I think we would lose some log entries if we set both values to the same interval, since rows written around a run boundary could be missed. Making the query return only distinct entries won't help either, because the extension simply re-runs the entire query on each interval.
Is there a way to prevent the ingestion of duplicate rows? I'm hoping there's something that can be added to extension.yaml, because there's a cost associated with ingestion. Alternatively, is there a way to structure a DQL query to filter out the duplicates?
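One option I'm looking at on the DQL side is the dedup command, which removes duplicate records; a sketch assuming ErrorID uniquely identifies a source row:
fetch logs
| filter contains(query, "APPLICATION_NAME")
| dedup ErrorID
| fields Time, content, Type, ErrorID
| sort ErrorID desc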
24 Jan 2024 04:41 PM
It looks like you can populate log.source by using it as a key in extension.yaml. I'm continuing to test.
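For anyone following along, this is the shape I'm testing; a sketch, not yet validated (the const value is a placeholder):
attributes:
  - key: log.source
    value: const:application_error_log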