We want to use Dynatrace for Java process monitoring on WebSphere 126.96.36.199 . After we inserted the parameters below into the Generic JVM arguments, instance startup time increased from 30 seconds to 4 minutes. There is no error in SystemOut.log.
Why is it taking so much longer to start? Let me know your comments, thanks.
If you have network latency, a connection issue, or an overloaded Collector, then agent startup can take more time, because during startup the agent sends Java class files to the Collector.
Have a look at the link below for the Collector best practices.
Thanks for your comments. There are no errors in the log files; only the following warning exists:
License = license exhausted; too many agents connected;
This Collector is also assigned to a different Tomcat server, but there is no latency at all when that JVM process starts.
Look in your agent log file and find the following log string:
Average transformation time for 1333 classes/modules was 5ms - maximum set to 60ms
This string tells you how long it takes to move class files back and forth between your instrumented JVM and the Collector. Compare the slow-starting JVM with a JVM that starts quickly. What are the values? The system always tests 1333 classes/modules, and the MAX value should always be the same unless you've modified it. We're interested in the second value, the average transformation time per class.
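To make the comparison concrete, the per-class average from that log line can be pulled out and turned into a rough startup-overhead estimate. This is just an illustrative sketch: the class name, regex, and the 40,000-class figure are assumptions for the arithmetic, not values from this thread.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class TransformTimeCheck {
    // Matches the agent log line quoted above
    private static final Pattern LINE = Pattern.compile(
        "Average transformation time for (\\d+) classes/modules was (\\d+)ms - maximum set to (\\d+)ms");

    /** Returns {classes sampled, avg ms per class, configured max ms}. */
    public static long[] parse(String line) {
        Matcher m = LINE.matcher(line);
        if (!m.find()) throw new IllegalArgumentException("not a transformation-time line");
        return new long[] { Long.parseLong(m.group(1)),
                            Long.parseLong(m.group(2)),
                            Long.parseLong(m.group(3)) };
    }

    public static void main(String[] args) {
        long[] v = parse("Average transformation time for 1333 classes/modules was 5ms"
                + " - maximum set to 60ms");
        System.out.println("avg=" + v[1] + "ms per class, max=" + v[2] + "ms");
        // A WebSphere JVM loads a very large number of classes, so even a few
        // ms per class adds minutes. 40,000 is an assumed count for illustration:
        long assumedClasses = 40_000;
        System.out.println("rough added startup: " + (assumedClasses * v[1] / 1000) + "s");
        // → rough added startup: 200s
    }
}
```

Comparing the parsed average between the slow WebSphere JVM and the fast Tomcat JVM tells you whether the Collector round-trip itself is the difference.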
Also keep in mind that a WebSphere JVM has LOTS of classes and thus will take longer to start than most other JVMs, even with the same "Average Transformation Time" values.
Locating a Collector closer to your WAS JVM and reducing your average transformation time will improve your startup time significantly. It might be worth experimenting by putting a Collector right on the same host as the WAS JVM. That's not necessarily a final configuration, but it will help confirm whether your issue is Collector transformation time or something else.
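For that experiment, the agent's Generic JVM argument would just point at the local Collector. A sketch assuming the usual agentpath syntax; the install path, agent name, and port here are assumptions, not values from this thread:

```
-agentpath:/opt/dynatrace/agent/lib64/libdtagent.so=name=WAS_Agent,server=localhost:9998
```

If startup drops back toward 30 seconds with a local Collector, network latency to the remote Collector is the likely cause; if it stays slow, look elsewhere (e.g. the license warning above).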
Let us know what you find.