19 Jun 2025 05:43 PM
Is it possible to automate the process of pulling data from files stored in SharePoint (such as .txt, .csv, or .xlsx) and ingesting that data into Dynatrace as logs for further comparison and analysis?
The goal is to run this process on a scheduled basis — for example, daily — so it checks the file at a specific time, extracts the relevant information, and sends it to Dynatrace using the Logs Ingestion API.
Thanks in Advance 😃.
19 Jun 2025 10:49 PM
The answer is yes, so long as there is a path for Dynatrace to connect to the SharePoint site housing the data.
It will take some custom coding, but in short, you could construct a Dynatrace workflow that consists of the following actions:
1. A code block that downloads the file from SharePoint via the Microsoft Graph API and stores the contents in a variable
2. A looping code block that pushes the data into the log ingest pipeline, one event at a time (a sketch of both steps follows below).
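
Here is a minimal TypeScript sketch of those two steps, roughly the shape a workflow code action could take. The site/drive IDs, file path, environment URL, and both tokens are placeholders you would supply yourself; the Microsoft Graph download path and the Dynatrace Logs Ingestion API (`POST /api/v2/logs/ingest`) are standard endpoints, but verify the required scopes and payload limits for your environment. Note that instead of looping one event at a time, this sketch batches all events into a single POST, since the ingest endpoint accepts a JSON array; adjust to per-event calls or chunked batches if your files are large.

```typescript
// Sketch only; placeholders below are assumptions, not values from the original answer.
const GRAPH_TOKEN = "<azure-ad-app-token>";      // app registration with Files.Read.All
const DT_URL = "https://<env-id>.live.dynatrace.com";
const DT_TOKEN = "<api-token>";                  // token with the logs.ingest scope

// Step 1: download the file content from SharePoint via Microsoft Graph
async function downloadFromSharePoint(siteId: string, driveId: string, filePath: string): Promise<string> {
  const url = `https://graph.microsoft.com/v1.0/sites/${siteId}/drives/${driveId}/root:/${filePath}:/content`;
  const res = await fetch(url, { headers: { Authorization: `Bearer ${GRAPH_TOKEN}` } });
  if (!res.ok) throw new Error(`Graph download failed: ${res.status}`);
  return res.text(); // works for .txt/.csv; .xlsx would need a spreadsheet parser instead
}

// Step 2: turn each line of the file into a log event and push it to the Logs Ingestion API
async function ingestAsLogs(content: string, sourceFile: string): Promise<void> {
  const events = content
    .split(/\r?\n/)
    .filter((line) => line.trim().length > 0)
    .map((line) => ({
      content: line,
      timestamp: new Date().toISOString(),
      "log.source": sourceFile, // custom attribute for later filtering/comparison
    }));

  const res = await fetch(`${DT_URL}/api/v2/logs/ingest`, {
    method: "POST",
    headers: {
      Authorization: `Api-Token ${DT_TOKEN}`,
      "Content-Type": "application/json; charset=utf-8",
    },
    body: JSON.stringify(events),
  });
  if (!res.ok) throw new Error(`Log ingest failed: ${res.status}`);
}

// Entry point: in a Dynatrace workflow this would be the code action's exported function
export default async function run(): Promise<void> {
  const content = await downloadFromSharePoint("<site-id>", "<drive-id>", "reports/daily.csv");
  await ingestAsLogs(content, "sharepoint:reports/daily.csv");
}
```

Scheduling the workflow with a daily time trigger then covers the "check the file at a specific time" requirement from the question.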