
Dynatrace Leak Analysis always runs out of memory


Hoping someone can shed some light on the cryptic error message I'm getting.

My DT Memory Analysis Server (DT 6.5) has a 32 GB heap. The memory dump shows 2.52 GB of used memory, which is the full size of the dumped heap, so I should have ample memory to run post-processing on this dump.

Yet no matter how much I increase the server heap (I started with 20 GB and am now at 32 GB), post-processing runs for a while and then fails with the following error:


Not enough memory to calculate all garbage collection sizes - Required: 1,949, Available: 1,638

What do those numbers mean? The heap size for the server is sufficient. Or is it referring to the DT client from which I am running the post-processing? That client has a 2 GB heap, so it should be fine...



Hello Roy,

Do you have a standalone Memory Analysis Server?

If you plan a standalone Memory Analysis Server, you should give it four cores, and its RAM should be around 25 percent larger than the largest memory dump you want to analyze.
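That sizing rule is simple arithmetic; here is a quick illustration (the helper name `recommended_ram_gb` is made up for this post, not anything from Dynatrace):

```python
def recommended_ram_gb(largest_dump_gb, headroom=0.25):
    """Rule of thumb: RAM should be ~25% larger than the largest dump you analyze."""
    return largest_dump_gb * (1 + headroom)

# For the 2.52 GB dump in this thread, ~3.15 GB of RAM would already satisfy the rule,
# so a 32 GB server heap is far more than enough for this particular dump.
print(round(recommended_ram_gb(2.52), 2))
```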

Have a look at the links below for setting up a Memory Analysis Server and for Memory Analysis Server best practices.




Hi Roy,

Adding to Babar's answer, you can quickly verify your assumptions about the heap allowance by going to Settings -> Dynatrace Server -> Services -> Dynatrace Memory Analysis Server and clicking "Test connection". This prints a message showing how much heap memory is actually dedicated to memory analysis.

You can then use the guides Babar shared and edit the dtanalysisserver.ini file on the component's file system to adjust the heap allocation.
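For orientation, the Dynatrace .ini files use Eclipse-launcher-style JVM arguments. A rough sketch of the relevant part, with an illustrative value only (check your own installation's file for the exact layout and adjust the -Xmx line it already contains):

```ini
; dtanalysisserver.ini - illustrative fragment, not a complete file
-vmargs
-Xmx32g
```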

Best regards,



Thanks to you both for your answers.

As it turns out, my Memory Analysis Server did have sufficient heap. The problem was that I was post-processing an offline session in my client, which, as I now know, does not use the Analysis Server configured in the system profile at all but instead processes the dump locally. The DT client performs that processing itself, so the solution was to increase the memory on my client instead, and that worked.
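For anyone hitting the same issue: the client's heap is raised in its own .ini file, in the same Eclipse-launcher style as the server files. A hedged sketch (the 4g value is just an example large enough for this dump; pick a value that fits yours):

```ini
; dtclient.ini - illustrative fragment, not a complete file
-vmargs
-Xmx4g
```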