
I have installed Spark 2.0.0 on 12 nodes (standalone cluster mode). When I launch it I get this:

./sbin/start-all.sh

starting org.apache.spark.deploy.master.Master, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.master.Master-1-ibnb25.out

localhost192.17.0.17: ssh: Could not resolve hostname localhost192.17.0.17: Name or service not known

192.17.0.20: starting org.apache.spark.deploy.worker.Worker, logging to /home/mbala/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb28.out

192.17.0.21: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb29.out

192.17.0.19: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb27.out

192.17.0.18: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb26.out

192.17.0.24: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb32.out

192.17.0.22: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb30.out

192.17.0.25: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb33.out

192.17.0.28: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb36.out

192.17.0.27: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb35.out

192.17.0.17: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb25.out

192.17.0.26: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb34.out

192.17.0.23: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb31.out

I have already set the master port to 8081 and its IP to 192.17.0.17, i.e. HOSTNAME=ibnb25. I launched the cluster from this host.
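For reference, these settings usually live in `conf/spark-env.sh` on the master. A minimal sketch, assuming the IP and port values stated above (the variable names are Spark's standard ones; the values are taken from this question):

```shell
# conf/spark-env.sh -- sketch, values assumed from the question above
# Bind the master to the cluster-facing address of ibnb25
SPARK_MASTER_HOST=192.17.0.17
# Port for the master's web UI (Spark's default is 8080)
SPARK_MASTER_WEBUI_PORT=8081
```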

From my local machine I use this command to access the cluster:

 ssh mName@xx.xx.xx.xx 

When I wanted to access the web UI from my local machine, I used the IP address of the master (host ibnb25):

192.17.0.17:8081

but it could not be displayed, so I tried the address I use to reach the cluster:

xx.xx.xx.xx:8081

but nothing displays in my browser. What is wrong? Please help.
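One common reason a cluster web UI is unreachable from outside is that the 192.17.0.x addresses are only routable inside the cluster network. In that case the UI can be reached by forwarding the port over the same SSH connection; a sketch, reusing the placeholder login address `xx.xx.xx.xx` from above:

```shell
# Forward local port 8081 to the master's web UI through the login host.
# xx.xx.xx.xx is the same placeholder login address used above.
ssh -L 8081:192.17.0.17:8081 mName@xx.xx.xx.xx
# While the tunnel is open, browse to http://localhost:8081 on the local machine.
```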

hammad

1 Answer


Your /etc/hosts file seems to be incorrectly set up.

You can get the hostname and IP with the following commands:

hostname
hostname -i

Make sure there is whitespace between the IP and the hostname.

A sample /etc/hosts file looks like:

192.17.0.17  <hostname>
192.17.0.17  localhost
<Other IP1>  <other hostname1>
.
.
.
<Other IP-n>  <other hostname-n>

Make sure every node's /etc/hosts file contains the IP/hostname entries for all nodes in the cluster.
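The resolution can be verified directly on each node; a sketch of the check (`getent hosts` consults the same resolver database that /etc/hosts feeds, so it shows what the hostname actually maps to):

```shell
# Print this machine's hostname
hostname

# Show how that hostname currently resolves (via /etc/hosts or DNS);
# on a correctly configured node this should print the cluster-facing IP
getent hosts "$(hostname)" || echo "hostname not mapped in /etc/hosts"

# localhost should resolve to a loopback address
getent hosts localhost
```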

For FQDN read this.

Amit Kumar
  • 2,685
  • 2
  • 37
  • 72
  • do u mean slaves file? – hammad Aug 30 '16 at 12:41
  • and is it ok that i use the brower from my local machine? – hammad Aug 30 '16 at 12:50
  • check /etc/hosts file. I have mentioned it clearly, did you even run hostname and hostname -i commands? its ok. – Amit Kumar Aug 30 '16 at 12:52
  • yup i did, hostname gave 'ibnb25' and hostname -i gave '10.0.10.45', – hammad Aug 30 '16 at 12:55
  • but when i use ifconfig it gives : inet adr:192.17.0.17 – hammad Aug 30 '16 at 12:57
  • may i ask you what is the difference between 'ifconfig' and 'hostname -i' and why they gave 2 different values for the same host – hammad Aug 30 '16 at 13:59
  • i used sudo vi /etc/hosts but it seems that i dont have all privelege, this is the error message "mName is not in the sudoers file. This incident will be reported." – hammad Aug 30 '16 at 14:05
  • 'ifconfig' lists all network adapters and their information including ip addresses, hostname -i gives what is in your /etc/hosts file resolved as per your hostname. though you must have googled that. – Amit Kumar Aug 30 '16 at 14:06
  • Let us [continue this discussion in chat](http://chat.stackoverflow.com/rooms/122209/discussion-between-amit-kumar-and-hammad). – Amit Kumar Aug 30 '16 at 14:57