
This product reached the end of support date on March 31, 2021.

Question on customizing data to be written to an external database.



Our client wants CORRELATED performance metrics between the application and server host resource utilization, compared across periods. These can be recent or from past years. So we are trying to work out how to cater for such a
requirement and populate the correct performance metrics.

So our PM is asking the following:

  • To understand what type of performance
    data (session storage, PurePath data) is stored in which
    database / flat file for AppMon, DC RUM, and Synthetic.

  • How we can “selectively” have
    this data written to an external database, so it can serve query and capacity needs

Anyone have some ideas? Thanks

Best wishes, YH


DynaMight Leader

Hello Yee,

Session data is stored on the file system, not in the database.

A session is a set of diagnosis data related to a specific System Profile.

Examples of a session are:

  • PurePath Session – Combined information about PurePaths and Time Series within a given time period.
  • Memory Dump – Analysis data about the number, size, and class of allocated objects and their references.
  • Thread Dump – Collection of data for thread analysis, including CPU time information.
  • Sampling – Statistical data for all threads, which can help to find Rules and entry points.

There are three types of sessions:

  • Live Sessions
  • Stored Sessions
  • Offline Sessions

Use file-system or drive-level encryption for your session data if you need to protect it. Additionally, you can hide strings in PurePaths that should be kept confidential when the data is captured.

Every session — live, stored, or offline — can be exported to a .dts archive (Dynatrace session file). A stored session can be restored by importing a .dts archive. You can import and export sessions by using the Cockpit dashlet or the Session Browser dashlet.



Dynatrace Champion

Hi Yee,

Out of the box, we write a number of Measures to the Performance Warehouse (a relational database) that come from PurePath data (and we store the full PurePath in the Session Store). PureStack data (that is, any non-transactional/infrastructure data such as CPU and memory) is pulled every 10 seconds and is Measure data, meaning it is stored in the PWH. If you run monitors to pull additional infrastructure data, that is also stored directly as measure data in the PWH.

So we already selectively store correlated performance data to the PWH to be available for years (configurable, "forever" by default).

You can also create your own measures, and can "subscribe" to measures that we may not have "turned on" by default that you want to capture.

To see the "subscribed" measures, edit your System Profile, select Measures, and then select one measure on the right; press Ctrl-A to select all, then right-click and choose "Expand All". To see the full universe of measures available, click the "Create measure" button. You can then select one measure category at a time and expand it to see all the measures for that category.

The important thing to remember is all measures are stored in the PWH (Performance Warehouse), and that many are stored out of the box, and others can be selected by the user for long term storage.





Thanks for sharing the different suggestions. Our user team did suggest backing up the sessions folder, but this also has cons; as the documentation states, a live session is 5 GB. That is huge if we keep days or weeks of data. Exporting a .dts file can also be quite slow depending on the infrastructure; ours is quite big, so collecting even a few hours of data already takes a lot of time.

So can we try to read the data we need from the PWH? We are using Microsoft SQL Server.

Thanks & Best wishes,


Dynatrace Champion

Have you looked at the "Realtime Streaming" options? You can stream UEM data to ELK (Elastic and Kibana; we don't really need Logstash) using the "PureLytics Stream", and/or you can stream Business Transaction data to ELK or Splunk using the "Business Transaction Feed". These are the typical preferred integration techniques for getting at the "big data" we capture. Of course, you can access the PWH data, but since the schema is not public, it is subject to change from release to release, and you would have to figure out the relationships on your own, so I would never recommend it.
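As an illustration of where the streamed data ends up: both feeds emit JSON documents over HTTP, and for Elasticsearch they are typically written via the `_bulk` API. A minimal sketch in Python of assembling such a bulk payload — the document fields here (`visitId`, `durationMs`, `country`) are hypothetical placeholders, since the real field set depends on your AppMon configuration:

```python
import json

def build_bulk_payload(index, docs):
    """Build an Elasticsearch _bulk NDJSON body from a list of documents.

    Each document gets an "index" action line followed by the document
    itself, one JSON object per line.
    """
    lines = []
    for doc in docs:
        lines.append(json.dumps({"index": {"_index": index}}))
        lines.append(json.dumps(doc))
    return "\n".join(lines) + "\n"  # _bulk bodies must end with a newline

# Hypothetical UEM visit documents, for illustration only.
docs = [
    {"visitId": "v-1", "durationMs": 1200, "country": "SG"},
    {"visitId": "v-2", "durationMs": 300, "country": "MY"},
]
payload = build_bulk_payload("appmon-uem", docs)
```

Posting `payload` to `http://elastic-host:9200/_bulk` with a `Content-Type: application/x-ndjson` header would index the two documents; the point is simply that the streamed data arrives as newline-delimited JSON, which ELK consumes natively.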



Hi Dave,

Noted on the PWH data changing between versions and the complexity. Thank you for the suggestions on integration techniques for the data. I will feed back and discuss with the team.

Best wishes,

Yee Heng

There are numerous ways you can extract the relevant data for your capacity management estimation and get your requirements addressed. Actual PurePaths have a complex, proprietary storage structure, so reading them directly is infeasible and unnecessary for any analysis use case. Use one of the following:

1. Charting(Table+Graphs), Reports & Schedules

2. REST interface

3. Export Via Realtime Streaming

4. Export Visits to ElasticSearch

5. Custom Plugins (Such as for Incidents)

6. Dynatrace Sessions
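For options 1 and 2, charted measure data pulled over the REST interface typically arrives as XML. A minimal parsing sketch, where the element and attribute names (`measurements`, `measurement`, `measure`, `timestamp`, `avg`) are an assumed response shape for illustration — the actual schema depends on your AppMon version and the dashlet being exported:

```python
import xml.etree.ElementTree as ET

# Hypothetical example of a charted-measure export; treat this shape
# as an assumption, not the documented schema.
SAMPLE = """<measurements>
  <measurement measure="CPU Usage" timestamp="1490000000000" avg="42.5"/>
  <measurement measure="CPU Usage" timestamp="1490000010000" avg="47.0"/>
</measurements>"""

def parse_measurements(xml_text):
    """Flatten <measurement> elements into dicts for capacity analysis."""
    root = ET.fromstring(xml_text)
    return [
        {
            "measure": m.get("measure"),
            "timestamp": int(m.get("timestamp")),
            "avg": float(m.get("avg")),
        }
        for m in root.findall("measurement")
    ]

rows = parse_measurements(SAMPLE)
```

From here the rows can be loaded into Microsoft SQL Server or any external store, which keeps you on a supported interface instead of reading the PWH schema directly.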