I want to kill a Spark job programmatically. The scenario is as follows:
When I kill the Spark job using yarn application -kill <app_id>,
YARN reports it as killed, but if I then run ps -ef | grep <app_name>,
the Spark job's process still shows up. How do I make sure the process
is gone from ps -ef as well?
I need to do this programmatically, since I am running yarn application -kill
through code.
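To make the scenario concrete, here is a minimal sketch of the kind of thing I am trying to do. It assumes (as I understand it) that `yarn application -kill` only tears down YARN-managed containers, so a driver JVM running outside YARN (e.g. client deploy mode) can survive and still appear in `ps -ef`; the sketch therefore follows the YARN kill with an explicit kill of the leftover process. The names `my_spark_job` and `kill_spark_job` are just placeholders for illustration:

```python
import os
import signal
import subprocess

def find_pids(ps_output, app_name):
    """Parse `ps -ef` output and return PIDs of lines mentioning app_name.

    Skips the `grep` line itself so we don't match our own search.
    """
    pids = []
    for line in ps_output.splitlines():
        if app_name in line and "grep" not in line:
            parts = line.split()
            # In `ps -ef` output the second column is the PID.
            if len(parts) > 1 and parts[1].isdigit():
                pids.append(int(parts[1]))
    return pids

def kill_spark_job(app_id, app_name):
    # Step 1: ask YARN to kill the application (kills YARN-managed containers).
    subprocess.run(["yarn", "application", "-kill", app_id], check=True)
    # Step 2: a driver JVM running outside YARN can survive step 1,
    # which is why `ps -ef` still shows it. Find and terminate it explicitly.
    ps = subprocess.run(["ps", "-ef"], capture_output=True, text=True)
    for pid in find_pids(ps.stdout, app_name):
        os.kill(pid, signal.SIGTERM)
```

This is just to show the intent; I would call it as `kill_spark_job("application_1234_0001", "my_spark_job")` with my real application id and name.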
Any help regarding this is appreciated.
Thanks in advance.