
I installed the pre-built version of Spark on each node of my cluster (just downloaded and unzipped it).

Question 1:

Do I have to copy the slaves.template and spark-env.sh.template files in the conf directory and edit them to connect my machines to each other? If yes, how can I do it using only the command line?

Question 2:

I launched the master on one remote machine, and when I wanted to access the Spark web UI from my local machine using

 http://IPofRemoteMachine:8080

(i.e. IP_address:8080 or IP_address:4040), nothing was displayed in my browser. Why? What am I missing?

Question 3:

If I have 6 nodes in my cluster and I want to use only 4 of them, for example, do I have to launch the master and then launch workers only on the nodes I want to use?

hammad
1 Answer

Answer 1:

You need to rename the files by removing the .template suffix, giving slaves and spark-env.sh.

Suppose there are two machines, 10.1.1.11 (A) and 10.1.1.12 (B), and you want to run the Spark master on machine A and workers on both A and B. Then in slaves you should list all the IPs on which workers will run:

sample slaves file

10.1.1.11
10.1.1.12

sample spark-env.sh file

export SPARK_MASTER_MEMORY=1024M
export SPARK_DRIVER_MEMORY=1024M
export SPARK_WORKER_INSTANCES=1
export SPARK_EXECUTOR_INSTANCES=1
export SPARK_WORKER_MEMORY=1024M
export SPARK_EXECUTOR_MEMORY=1024M
export SPARK_WORKER_CORES=2
export SPARK_EXECUTOR_CORES=2
export SPARK_MASTER_IP=10.1.1.11
export SPARK_MASTER_WEBUI_PORT=8081

You can configure spark-env.sh (it is just a shell script) with more options, which are available here.
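Since the machine is reachable only over the command line, both files can also be created entirely from the shell. This is a minimal sketch; a temp directory stands in for $SPARK_HOME/conf here so the commands can be tried safely, and on the real machine you would cd into the actual conf directory instead:

```shell
# Temp dir simulates $SPARK_HOME/conf (substitute the real path on the cluster).
CONF=$(mktemp -d)
touch "$CONF/slaves.template" "$CONF/spark-env.sh.template"

# Copy the templates instead of renaming, keeping the originals for reference.
cp "$CONF/slaves.template" "$CONF/slaves"
cp "$CONF/spark-env.sh.template" "$CONF/spark-env.sh"

# Append one worker IP per line to slaves.
printf '10.1.1.11\n10.1.1.12\n' >> "$CONF/slaves"

# Append settings to spark-env.sh (note the closing quote before the >>).
echo 'export SPARK_MASTER_IP=10.1.1.11' >> "$CONF/spark-env.sh"
echo 'export SPARK_MASTER_WEBUI_PORT=8081' >> "$CONF/spark-env.sh"
```

Any editor available over SSH (nano, vi) works just as well; the echo/printf form is simply scriptable.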

Answer 2:

You can change the Spark web UI port by editing spark-env.sh to include SPARK_MASTER_WEBUI_PORT=8081.

Then you can access the Spark web UI at 10.1.1.11:8081.

If you get Could not resolve hostname, check my answer here.
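When the browser shows nothing, it helps to check reachability from the command line first. Below is a hedged sketch of such a check (the IP and port come from the answer above; substitute your own). A status of 000 usually means the port is unreachable, typically because of a firewall, the master not running, or the master bound to a different address:

```shell
# Print the HTTP status code for a host:port, or "000" if unreachable.
check_ui() {
  curl -s -o /dev/null --connect-timeout 3 -w '%{http_code}' "http://$1:$2/"
}

# Example (run from your local machine):
#   check_ui 10.1.1.11 8081
# On the master itself you can also verify something is listening:
#   ss -tln | grep 8081     (or netstat -tlnp on older systems)
```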

Answer 3:

You can change the nodes on which workers will run by editing the slaves file.
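For the 6-node example, a sketch of keeping only 4 workers, assuming a standalone deployment (a temp file stands in for $SPARK_HOME/conf/slaves; the six IPs are hypothetical). Spark's start scripts generally skip lines beginning with #, so commenting out an IP is enough to exclude that node:

```shell
# Simulated conf/slaves with six nodes.
SLAVES=$(mktemp)
printf '10.1.1.11\n10.1.1.12\n10.1.1.13\n10.1.1.14\n10.1.1.15\n10.1.1.16\n' > "$SLAVES"

# Comment out the two nodes you do NOT want workers on.
sed -i 's/^10\.1\.1\.15$/# &/; s/^10\.1\.1\.16$/# &/' "$SLAVES"

# Then restart the cluster so the change takes effect:
#   ./sbin/stop-all.sh && ./sbin/start-all.sh
```

Alternatively, you can skip the slaves file and start workers by hand on exactly the nodes you want, pointing each at the master URL (e.g. ./sbin/start-slave.sh spark://10.1.1.11:7077).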

Amit Kumar
  • How can I modify spark-env.sh by command? I have access to the remote machine only through the command line. – hammad Aug 29 '16 at 21:19
  • Am I able to use echo 'export SPARK_MASTER_IP=10.1.1.11' >> spark-env.sh? – hammad Aug 29 '16 at 21:21
  • I did what you said, as follows: http://stackoverflow.com/questions/39216303/failure-when-launching-spark-on-a-remote-machine-standalone — I could launch everything, but I still cannot access the web UI (port=8081). – hammad Aug 29 '16 at 23:54
  • What is in the logs? Is port 8081 free on the master machine? – Amit Kumar Aug 30 '16 at 05:59
  • Is this what you mean by log: ./sbin/start-master.sh starting org.apache.spark.deploy.master.Master, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.master.Master-1-ubuntu0.out – hammad Aug 30 '16 at 10:42
  • This is the detail of my problem: http://stackoverflow.com/questions/39226544/spark-web-ui-unreachable – hammad Aug 30 '16 at 11:48