Hi,
When availability monitoring is configured for the Spark Submit process group and two or more separate Spark processes/JVMs are running on the same server, no problem is created when one of them stops.
Which kind of process group availability monitoring did you use?
I used "If any process becomes unavailable"
Mike
OK, the issue here is that when multiple processes of a process group run on the same server, they are normally treated as worker processes of the same "meta process". As such, we do not alert when the number of workers goes from 2 to 1; the idea is akin to threads. If you need this to behave differently, you have to give the individual workers their own identity. Look at the help on how to customize process group detection: the Node ID concept can be used to have multiple independent processes within the same group on the same server.
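As a rough sketch of the Node ID approach described above: Dynatrace reads the DT_NODE_ID environment variable at process startup to split processes into separate nodes within a process group. The variable name is real, but the worker names and spark-submit arguments below are just placeholders for illustration:

```shell
# Launch each Spark driver with its own Node ID so Dynatrace
# treats them as independent processes rather than workers of
# one "meta process". Names like "spark-worker-1" and the
# job JARs are hypothetical examples.
DT_NODE_ID=spark-worker-1 spark-submit --class com.example.JobOne job-one.jar &
DT_NODE_ID=spark-worker-2 spark-submit --class com.example.JobTwo job-two.jar &
```

With distinct Node IDs, the "If any process becomes unavailable" setting should raise a problem when either process stops, because each one now counts as its own availability target.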
Thanks Michael, I made the changes and can now manage each process as its own worker.