Real User Monitoring
User session monitoring, key user actions - everything RUM.

Flutter User Action Auto-Instrumentation – Any Updates Since the Last Discussion?

AndreMaas
Observer

Hello everyone,

I hope you are doing well.

I’m opening this new thread because the previous discussion about Flutter Mobile App Instrumentation has already been closed (https://community.dynatrace.com/t5/Real-User-Monitoring/Flutter-Mobile-App-Instrumentation/m-p/26062...). I have carefully read that post and its responses, and I understand that at the time there was no auto-instrumentation for user actions (button taps, gestures) at the Flutter/Dart layer: only partial automatic capture for web requests, errors, and navigation, with manual instrumentation required for user interactions.

With that context in mind, I would like to ask whether there have been any updates, improvements, or roadmap changes since then regarding auto-instrumentation for Dart/Flutter applications, especially at the user action level.

If auto-instrumentation for Flutter user interactions is still not available, I would also appreciate guidance on current best practices recommended by Dynatrace for day-to-day observability in Flutter apps.

Scenario

A user taps on the “My balances” button, and the application displays all available balances for the user’s account.

Desired outcome

We would like to observe this interaction as a single, well-defined user action, for example:

  • User Action: Touch on “My balances”
  • Within this action, we would like to see:
    • The backend endpoint GET /v1/balances being called
    • A response time of 200 ms
    • An HTTP status code 200 – Success

In other words, the goal is to have the button tap (“Touch on My balances”) as the parent user action, with the related HTTP request fully correlated inside it, allowing clear end-to-end visibility of the user experience.

Questions

  1. Since the last closed discussion, are there any new updates or plans on the roadmap for automatic user action instrumentation in Flutter (e.g., button taps, gestures, UI interactions)?
  2. If this is still not supported, what is the recommended approach today to:
    • Manually instrument user interactions in Flutter
    • Correlate those interactions with HTTP requests
    • Keep the instrumentation effort manageable and consistent across the app?
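For context on question 2, the approach we have in mind today looks roughly like the following — a minimal sketch that assumes the `dynatrace_flutter_plugin` API (`Dynatrace().enterAction(...)` / `leaveAction()`) and a placeholder endpoint URL; the exact names and correlation behavior should be checked against the current plugin documentation:

```dart
import 'package:dynatrace_flutter_plugin/dynatrace_flutter_plugin.dart';
import 'package:http/http.dart' as http;

Future<void> onMyBalancesTapped() async {
  // Open a root action so the tap appears as a named user action.
  final action = Dynatrace().enterAction('Touch on "My balances"');
  try {
    // Web requests issued while the action is open should be correlated
    // with it when the plugin's web request monitoring is enabled.
    // (URL is a placeholder for illustration only.)
    final response =
        await http.get(Uri.parse('https://api.example.com/v1/balances'));
    if (response.statusCode == 200) {
      // ... render the balances from response.body ...
    }
  } finally {
    // Close the action so its duration and child requests are reported.
    action.leaveAction();
  }
}
```

The main pain point with this pattern is that every tappable widget needs its own wrapper like this, which is why automatic user action instrumentation would be so valuable.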

Any updates, best practices, or documentation references would be highly appreciated.

Thank you in advance for your support.


matthias_hochri
Dynatrace Pro

Hello,

With the release of 3.333.1 we added the following:

  • [New RUM experience] Added automatic user interaction tracking for touch events. When enabled server-side, tap interactions are captured with widget paths, labels, and responder information. The feature is controlled by the touchInteractionEnabled flag from the native agent configuration.

(We will likely still refine this changelog entry to make it more precise. touchInteractionEnabled is simply the setting in the new RUM on Grail experience that lets you turn user interactions off in the web platform, and that setting is reflected at runtime.)

So this now shows user interactions out of the box (what the user tapped and where). Additionally, user actions are on the roadmap through June: they will sit on top of user interactions and will then automatically support your use case (e.g., a web request linked to its user action) in the new RUM on Grail experience.
