
I need your help. I created two apps (one using the Spray framework, the other receiving messages from Kafka and sending them to Cassandra). Both run all the time and should never stop. I'm running Spark in standalone mode on the server, and my configuration is:

- In spark-env.sh:

SPARK_MASTER_IP=MYIP
SPARK_EXECUTOR_CORES=2
SPARK_MASTER_PORT=7077
SPARK_EXECUTOR_MEMORY=4g
#SPARK_WORKER_PORT=65000
MASTER=spark://${SPARK_MASTER_IP}:${SPARK_MASTER_PORT}
SPARK_LOCAL_IP=MYIP
SPARK_MASTER_WEBUI_PORT=8080

- In spark-defaults.conf:
spark.master                     spark://MYIPMASTER:7077
spark.eventLog.enabled           true
spark.eventLog.dir               /opt/spark-1.6.1-bin-hadoop2.6/spark-events
spark.history.fs.logDirectory    /opt/spark-1.6.1-bin-hadoop2.6/logs
spark.io.compression.codec       lzf
spark.cassandra.connection.host MYIPMASTER
spark.cassandra.auth.username   LOGIN
spark.cassandra.auth.password   PASSWORD

I can access both pages, MYIP:8080/ and MYIP:4040/, but on http://MYIP:8080/ I only see my workers; I can't see my running application.

When I submit, I use this:

/opt/spark-1.6.1-bin-hadoop2.6/bin/spark-submit --class MYCLASS --verbose --conf spark.eventLog.enable=true --conf spark.master.ui.port=8080 --master local[2] /opt/spark-1.6.1-bin-hadoop2.6/jars/MYJAR.jar

Why? Could you help me?

Thanks a lot :)

thomas poidevin

3 Answers


In your spark-submit command you are using --master local[2], which submits the application in local mode. If you want to run it on the standalone cluster that you are running, then you should pass the Spark master URL in the master option, i.e. --master spark://MYIPMASTER:7077
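For example, reusing the class and jar paths from the question, the corrected submit command would look like this (note that the event-log property is spelled spark.eventLog.enabled, not spark.eventLog.enable as in the question, and spark.master.ui.port is a master-side setting that has no effect on submit):

/opt/spark-1.6.1-bin-hadoop2.6/bin/spark-submit --class MYCLASS --verbose --conf spark.eventLog.enabled=true --master spark://MYIPMASTER:7077 /opt/spark-1.6.1-bin-hadoop2.6/jars/MYJAR.jar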

Hokam
  • Thanks for your answer. I tried --master spark://MYIPMASTER:7077 but it doesn't work; I get Caused by: java.net.ConnectException: Connection refused: /127.0.0.1:7077. What should I do? I tried with the external IP as well, but I have the same issue :( – thomas poidevin Sep 15 '16 at 14:24
  • Can you please try with the same URL that is visible on the Spark master UI? – Hokam Sep 15 '16 at 17:04
  • Thanks a lot, I fixed it, a typo... now it works perfectly – thomas poidevin Sep 16 '16 at 15:30
  • Can you please accept/vote for the answer if it is helpful? – Hokam Sep 16 '16 at 15:51

In terms of the master, spark-submit resolves the setting in the following order of precedence:

  1. The master URL set in your application code, i.e. SparkSession.builder().master("...") (see the sketch after this list)
  2. The --master parameter for the spark-submit command
  3. The default configuration in your spark-defaults.conf
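A minimal Scala sketch of point 1 (the object name and app name are placeholders; note that SparkSession is the Spark 2.x entry point, and on Spark 1.6.1, the version in the question, the equivalent is SparkConf.setMaster):

import org.apache.spark.{SparkConf, SparkContext}

object MyApp {
  def main(args: Array[String]): Unit = {
    // A master hard-coded here overrides both --master and spark-defaults.conf.
    // Omit .setMaster(...) if you want spark-submit's --master flag to take effect.
    val conf = new SparkConf()
      .setAppName("my-app") // placeholder application name
      .setMaster("spark://MYIPMASTER:7077")
    val sc = new SparkContext(conf)
    // ... application logic ...
    sc.stop()
  }
}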
Zhong Dai

Mode: Standalone cluster

1> bin/spark-submit --class com.deepak.spark.App ../spark-0.0.2-SNAPSHOT.jar --master spark://172.29.44.63:7077 was not working, because the master was specified after the jar

2> bin/spark-submit --class com.deepak.spark.App --master spark://172.29.44.63:7077 ../spark-0.0.2-SNAPSHOT.jar worked
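This is because spark-submit stops parsing its own options at the application jar: everything before the jar path is read as a spark-submit option, and everything after it is passed as arguments to the application's main method. In the first command, --master spark://172.29.44.63:7077 was therefore handed to com.deepak.spark.App as program arguments instead of being seen by spark-submit. The general form is:

bin/spark-submit [options] <app-jar> [app arguments]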