
Application process unavailable incident

gopikrishnanr
Organizer

Hi,

I got an unexpected "Application Process Unavailable" incident and I don't know the reason for it. The collector log shows that the agent connection was lost, as below:

What could be the possible reason for this? I have multiple agents on the same host, and this incident was raised for only one of them.

2017-07-08 05:01:33 WARNING [Instrumentor] Connection to 'Prod_WCF_AWSWEB[awswebpool]@mliprd81:5500' closed (by SocketException: Connection reset)
2017-07-08 05:01:33 WARNING [EventCollector] agent connection 'Prod_WCF_AWSWEB[awswebpool]@mliprd81:5500' closed (prev event: 65, current event: -1).
2017-07-08 05:01:33 WARNING [EventCollector] Connection to agent 'Prod_WCF_AWSWEB[awswebpool]@mliprd81:5500' closed without receiving EOF event. Some data might be missing
2017-07-08 05:02:29 WARNING [Controller] Connection between controller and agent Prod_WCF_AWSWEB[awswebpool]@mliprd81:5500 closed.: com.dynatrace.diagnostics.collector.agentcenter.Controller a:711
java.net.SocketException: Connection reset by peer: socket write error
at java.net.SocketOutputStream.socketWrite0(Native Method)
at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:113)
at java.net.SocketOutputStream.write(SocketOutputStream.java:159)
at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
at java.io.DataOutputStream.flush(DataOutputStream.java:123)
at com.dynatrace.diagnostics.core.serialization.buffer.wrapper.DataOutputStreamWrapper.flush(SourceFile:86)
at com.dynatrace.diagnostics.collector.protocol.agent.ControllerProtocol.sendCommandToAgent(SourceFile:78)
at com.dynatrace.diagnostics.collector.protocol.agent.ControllerProtocol30.sendNOOPCommand(SourceFile:1835)
at com.dynatrace.diagnostics.collector.agentcenter.Controller.a(SourceFile:768)
at com.dynatrace.diagnostics.collector.agentcenter.Controller.sendNOOPCommand(SourceFile:793)
at com.dynatrace.diagnostics.collector.agentcenter.Controller.run(SourceFile:954)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:304)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:178)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
2017-07-08 05:02:29 INFO [Controller] Controller closed connection to agent Prod_WCF_AWSWEB[awswebpool]@mliprd81:5500
2017-07-08 05:02:36 INFO [InheritanceMapStorage] 58989 ClassInheritances written (94ms)
2017-07-08 05:02:36 INFO [AgentPeerPool] removed agent 'Prod_WCF_AWSWEB[awswebpool]@mliprd81:5500' from pool - number of agents: 4
2017-07-08 05:02:36 WARNING [AgentCenter] Agent "Prod_WCF_AWSWEB[awswebpool]@mliprd81:5500" (83142d22) has been disconnected for more than 60000ms. Removing agent...
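From the stack trace, the exception is thrown inside Controller.sendNOOPCommand, i.e. when the collector writes its periodic keep-alive and finds the other end already gone, but I don't know what reset the connection in the first place. For reference, here is a minimal, self-contained Java sketch (illustrative only, not Dynatrace code; host and ports are local stand-ins) that reproduces the same write-side "connection reset" pattern:

import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;

// Illustrative only: shows why "Connection reset by peer" appears on the
// writer's side only when it next sends data (like the collector's NOOP).
public class ConnectionResetDemo {
    public static void main(String[] args) throws Exception {
        ServerSocket server = new ServerSocket(0);                        // stand-in for the agent end
        Socket writer = new Socket("localhost", server.getLocalPort());   // stand-in for the collector end
        Socket peer = server.accept();

        // Peer dies abruptly: SO_LINGER=0 makes close() send a TCP RST,
        // similar to an agent process crash or a dropped connection.
        peer.setSoLinger(true, 0);
        peer.close();

        Thread.sleep(200); // give the RST time to arrive

        OutputStream out = writer.getOutputStream();
        try {
            // The failure surfaces only on this write/flush, just like the
            // collector's sendNOOPCommand in the stack trace above.
            out.write(0);
            out.flush();
            out.write(0); // a second write usually triggers the exception
            out.flush();
        } catch (java.net.SocketException e) {
            System.out.println("Got: " + e.getMessage()); // e.g. "Connection reset"
        } finally {
            writer.close();
            server.close();
        }
    }
}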

Regards,

Gopikrishnan

2 REPLIES

BabarQayyum
Leader

Hello Gopikrishnan,

Do you have a firewall in between agent and collector?
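Even if there is no dedicated firewall, anything in between (a load balancer, antivirus, or a flaky network path) can reset idle TCP sessions. A quick way to test basic reachability from the agent host toward the collector is a plain socket check like the sketch below (host and port are placeholders; substitute your collector's address and listen port):

import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

// Simple reachability check: open a TCP connection to the collector's
// listen port a few times and report success or the exact failure reason.
public class CollectorPortCheck {
    public static void main(String[] args) throws InterruptedException {
        String host = args.length > 0 ? args[0] : "collector.example.com"; // placeholder
        int port = args.length > 1 ? Integer.parseInt(args[1]) : 9998;     // placeholder

        for (int attempt = 1; attempt <= 5; attempt++) {
            try (Socket s = new Socket()) {
                s.connect(new InetSocketAddress(host, port), 5_000);
                System.out.println("attempt " + attempt + ": connected OK");
            } catch (IOException e) {
                System.out.println("attempt " + attempt + ": FAILED - " + e);
            }
            Thread.sleep(2_000); // space the attempts out a little
        }
    }
}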

2017-07-08 05:02:36 WARNING [AgentCenter] Agent "Prod_WCF_AWSWEB[awswebpool]@mliprd81:5500" (83142d22) has been disconnected for more than 60000ms. Removing agent...

According to your logs, the agent could not reconnect to the collector within 60000 ms, which is the default timeout.
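The timestamps line up with that: the connection was reset at 05:01:33 and the agent was removed at 05:02:36, roughly 63 seconds later, just past the 60000 ms grace period. Conceptually the removal behaves like the watchdog sketched below (an illustrative sketch only, not the actual AgentCenter implementation):

import java.time.Duration;
import java.time.Instant;

// Illustrative sketch of a disconnect grace period like the one in the log:
// an agent that stays disconnected longer than the limit gets removed.
public class DisconnectWatchdog {
    private static final Duration GRACE = Duration.ofMillis(60_000); // default per the log message

    private Instant disconnectedAt; // null while the agent is connected

    public void onDisconnect(Instant now) {
        disconnectedAt = now;
    }

    public void onReconnect() {
        disconnectedAt = null; // a reconnect within the grace period cancels removal
    }

    /** Returns true when the agent should be dropped from the pool. */
    public boolean shouldRemove(Instant now) {
        return disconnectedAt != null
                && Duration.between(disconnectedAt, now).compareTo(GRACE) > 0;
    }

    public static void main(String[] args) {
        DisconnectWatchdog w = new DisconnectWatchdog();
        Instant drop = Instant.parse("2017-07-08T05:01:33Z");   // connection reset
        Instant check = Instant.parse("2017-07-08T05:02:36Z");  // AgentCenter check
        w.onDisconnect(drop);
        System.out.println("remove agent? " + w.shouldRemove(check)); // true: 63 s > 60 s
    }
}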

What do you mean by "I have multiple agents on the same host and this incident was for one agent"?

Regards,

Babar

gopikrishnanr
Organizer

Hi Babar,

There is no firewall between agent and collector.

What do you mean by "I have multiple agents on the same host and this incident was for one agent"?

I mean that I am monitoring multiple processes on the same host, not just a single .NET process, and of those processes this incident was raised for only one.

Regards,

Gopikrishnan R.