
I have a Cloudera CDH 5.3 quickstart VM and I am having problems running Spark. I went through the steps at http://www.cloudera.com/content/cloudera/en/documentation/core/latest/topics/cdh_ig_spark_configure.... and ran the word count example, which worked. But when I open the master UI (quickstart.cloudera:18080) it shows no workers: cores = 0, memory = 0. When I open the worker UI (quickstart.cloudera:18081) there is a worker. My question is: how do I add workers, and what should I put in export STANDALONE_SPARK_MASTER_HOST?

This is the spark-env.sh:

#Change the following to specify a real cluster's Master host
export STANDALONE_SPARK_MASTER_HOST=worker-20150402201049-10.0.2.15-7078
export SPARK_MASTER_IP=$STANDALONE_SPARK_MASTER_HOST
### Let's run everything with JVM runtime, instead of Scala
export SPARK_LAUNCH_WITH_SCALA=0
export SPARK_LIBRARY_PATH=${SPARK_HOME}/lib
export SCALA_LIBRARY_PATH=${SPARK_HOME}/lib
export SPARK_MASTER_WEBUI_PORT=18080
export SPARK_MASTER_PORT=7077
export SPARK_WORKER_PORT=7078
export SPARK_WORKER_WEBUI_PORT=18081
export SPARK_WORKER_DIR=/var/run/spark/work
export SPARK_LOG_DIR=/var/log/spark
export SPARK_PID_DIR='/var/run/spark/'
if [ -n "$HADOOP_HOME" ]; then
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:${HADOOP_HOME}/lib/native
fi
export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-/etc/hadoop/conf}
### Comment above 2 lines and uncomment the following if
### you want to run with scala version, that is included with the package
#export SCALA_HOME=${SCALA_HOME:-/usr/lib/spark/scala}
#export PATH=$PATH:$SCALA_HOME/bin
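
I assume the worker log under the SPARK_LOG_DIR above shows whether the worker ever registered with a master. Something like the following should surface the relevant lines (the exact log file name is a guess, it may differ on your VM):

grep -i master /var/log/spark/*org.apache.spark.deploy.worker*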

Thank you

Amori
  • Is `worker-20150402201049-10.0.2.15-7078` really the master's host name? Sounds like the worker's name. – Marius Soutier Apr 04 '15 at 17:15
  • The hostname is quickstart.cloudera, and that is the worker's name. I also tried quickstart.cloudera as the master's name, but that didn't work either. – Amori Apr 04 '15 at 18:20
  • Yes, I tried putting spark://10.0.2.15:7077 and spark://10.0.2.15:18080, but I still get: URL: spark://10.0.2.15:7077, Workers: 0, Cores: 0 Total, 0 Used, Memory: 0.0 B Total, 0.0 B Used, Applications: 0 Running, 0 Completed, Drivers: 0 Running, 0 Completed, Status: ALIVE – Amori Apr 04 '15 at 18:38
  • The master is at spark://10.0.2.15:7077 and the worker is at spark://10.0.2.15:7078 – Amori Apr 05 '15 at 18:11
  • I hadn't tried that before, and it WORKED. Thank you very much, Marius – Amori Apr 05 '15 at 22:09

1 Answer


Add export STANDALONE_SPARK_MASTER_HOST=10.0.2.15 to your spark-env.sh so both master and worker agree on the same host address.
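
For example (assuming the VM's address really is 10.0.2.15, as the worker name worker-20150402201049-10.0.2.15-7078 suggests), the relevant lines in spark-env.sh would look like this:

# Point the standalone master, and the worker that registers with it, at the VM's address
export STANDALONE_SPARK_MASTER_HOST=10.0.2.15
export SPARK_MASTER_IP=$STANDALONE_SPARK_MASTER_HOST

Then restart both daemons so they pick up the change (these are the CDH service names; adjust if yours differ):

sudo service spark-master restart
sudo service spark-worker restart

After the restart, the master UI at quickstart.cloudera:18080 should list the worker with non-zero cores and memory, and you can submit jobs against spark://10.0.2.15:7077.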

Marius Soutier