
I am trying to set up a Spark standalone cluster with 3 nodes. The Linux server configurations are below:

master node with 2 cores and 25 GB memory

worker node 1 with 4 cores and 21 GB memory

worker node 2 with 8 cores and 19 GB memory

I have started the master node successfully, and its URL is spark://IP:7077.

When I start either worker node with the command ./sbin/start-worker.sh spark://IP:7077, I get the error message below:

22/11/25 09:13:58 INFO Worker: Connecting to master IP:7077...
22/11/25 09:14:54 ERROR RpcOutboxMessage: Ask terminated before connecting successfully
22/11/25 09:14:54 WARN NettyRpcEnv: Ignored failure: java.io.IOException: Connecting to /IP:7077 timed out (120000 ms)
22/11/25 09:14:54 WARN Worker: Failed to connect to master IP:7077
org.apache.spark.SparkException: Exception thrown in awaitResult:
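In case the configuration matters: my understanding is that the master only accepts worker connections on the address it is bound to, which is set in conf/spark-env.sh. A minimal sketch of what I believe that file should contain on the master node (IP here is a placeholder for the master's actual external address; I am not certain this binding is correct):

```shell
# conf/spark-env.sh on the master node (sketch; IP is a placeholder)
# Bind the master to the externally reachable address rather than
# localhost, so that workers on other machines can connect to it.
SPARK_MASTER_HOST=IP      # placeholder: the master's external IP/hostname
SPARK_MASTER_PORT=7077    # default standalone master RPC port
```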

OpenJDK 11.0.17 is the Java version installed on all 3 nodes.

Any suggestions to resolve this issue would be helpful.

shee8
