
Stopping the standalone Spark master fails with the following message:

$ ./sbin/stop-master.sh 
no org.apache.spark.deploy.master.Master to stop

Why? There is one Spark Standalone master up and running.
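For context, stop-master.sh hands off to sbin/spark-daemon.sh, which looks up the master's PID file under SPARK_PID_DIR (by default /tmp) and prints "no org.apache.spark.deploy.master.Master to stop" when it cannot find or read a matching PID file. A minimal check, assuming a default installation (file names will vary with the user that started the master):

$ ls -l /tmp/*.pid    # look for a spark-<user>-...Master-1.pid owned by another user
$ jps -l              # confirm whether a Master JVM is actually running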

Jacek Laskowski

AravindR

2 Answers


The Spark master had been started under a different user, so the PID file

/tmp/Spark-ec2-user-org.apache.spark.deploy.master.Master-1.pid

was not accessible. I had to log in as the user who actually started the standalone cluster manager master and run stop-master.sh from there.
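A minimal sketch of the fix, assuming the master was started by ec2-user and the default /tmp PID directory (adjust the user name and paths to your setup):

ls -l /tmp/*Master*.pid                   # check which user owns the PID file
sudo -u ec2-user ./sbin/stop-master.sh    # stop the master as the user that started it

Alternatively, exporting SPARK_PID_DIR to a fixed directory before starting the master keeps PID files out of /tmp and makes it obvious where they live.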

AravindR

In my case, I was able to open the master WebUI page in a browser, which clearly showed that the Spark master was running on port 7077.

However, when trying to stop it with stop-all.sh, I kept getting no org.apache.spark.deploy.master.Master to stop. So I tried a different approach: finding which process was listening on port 7077 with the command below:

lsof -i :7077

The result showed a java process with a PID of 112099.

I then used the command below to kill that process:

kill 112099

After this, the WebUI was no longer reachable, confirming that the Spark master had been killed.
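If you prefer a one-liner, lsof can print just the PID with -t, so the lookup and the kill can be combined (assuming the master is the only process listening on 7077):

kill $(lsof -t -i :7077)    # -t prints only the PID(s) of the listeners on port 7077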

Sowjanya R Bhat