Has anyone experienced this issue before? Is it client-centric, or does something need to be changed at the server level?
Also, will making the change cause any performance issues?
As far as I know, this only adjusts how much your client pulls at that specific time; it is something I have to select every time I go over the 10k limit for results. Someone from Dynatrace may need to speak to how this affects the Dynatrace Server. Depending on your result count, it may be possible to cause a memory issue on the Dynatrace Server, but I have never seen a problem with it.
This is directly from their article in the documentation:
The analysis is stopped by the Dynatrace Server if the PurePaths limit is reached. This limitation is not related to the number of PurePaths you are able to store on disk. This PurePaths limit ensures that the Dynatrace Client does not run out of memory when displaying the data. A blue information bar indicates that the analysis was canceled by the Server and shows the maximum number of analyzed PurePaths. The recommended response is to switch to a smaller timeframe for analysis. Use the arrow buttons in the toolbar to navigate through the PurePath …
However, this article also states the following:
The Result Limit is the maximum number of returned result values (e.g. invoked methods). Set a limit to shorten analysis time and save resources on the …
They do seem to give varying information that may depend on which dashboard/dashlet you are using. If you are concerned, it would be best to open a support ticket and then update this thread with what you hear back, if no one from Dynatrace responds here.
Here is a comment from Andi Grabner:
"With dynaTrace 4 we introduced this new feature so that Dashlets are not requesting too much data from the dynaTrace Server. In case you have, let's say, 1 mio PurePaths, it probably doesn't make sense to show them all in the PP Dashlet. It would take a while to request that data from the server, and it would take lots of UI resources in your client.
Therefore we added a Limit setting to the Dashlet Properties. It defaults to 10000 PurePaths but can be changed in the Dashlet Properties."
Basically, you can look at as many PurePaths or web requests as you like. You could see performance issues on your client if you are pulling, viewing, and analyzing too much data at once, but that really depends on what the hardware on your client is capable of. I could also see issues arising if too many clients request large amounts of data at once.
You will likely notice that the more PurePaths you request (i.e., the larger the timeframe), the longer it takes to load.
Note that increasing the limit does not only affect the dT client. All of that data is pulled through the server, which has to analyze the PurePaths to apply any conditions or filters, and that can be a lot of work. A general best practice is to avoid PurePath-based dashboards with broad timeframes. This is roughly analogous to writing a "select * from myTable..." query against a database and then scanning the huge number of returned rows looking for your answers. The idea is to use some other technique (focused BTs or measures, one of the various "hotspot" dashlets, and so on) to find the trouble spots, then dive into the PurePath details once you are in the general area.
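To make the database analogy above concrete, here is a minimal sketch (plain Python with sqlite3, not Dynatrace code; the table and column names are invented for illustration). Pulling every row and filtering on the client does the same work twice; pushing the filter to the server returns only the rows you actually need:

```python
import sqlite3

# Hypothetical table standing in for a large PurePath store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE purepaths (id INTEGER, duration_ms REAL)")
conn.executemany(
    "INSERT INTO purepaths VALUES (?, ?)",
    [(i, i * 1.5) for i in range(100_000)],
)

# Anti-pattern: "select *" pulls every row across the wire, then the
# client scans the whole result set looking for the slow paths.
all_rows = conn.execute("SELECT * FROM purepaths").fetchall()
slow_client_side = [r for r in all_rows if r[1] > 100_000]

# Better: let the server apply the condition and return only matches.
slow_server_side = conn.execute(
    "SELECT * FROM purepaths WHERE duration_ms > 100000"
).fetchall()

# Same answer either way, but the second query moves far less data.
assert slow_client_side == slow_server_side
```

The same reasoning applies to dashlets: a narrow timeframe or a focused measure plays the role of the WHERE clause, so the server ships only the PurePaths you care about.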