
I'm getting the following exception when I'm trying to submit a Spark application to a Mesos cluster:

17/01/31 17:04:21 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/01/31 17:04:22 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Could not parse Master URL: 'mesos://localhost:5050'
  at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2550)
  at org.apache.spark.SparkContext.<init>(SparkContext.scala:501)

Jacek Laskowski

1 Answer


You probably used the wrong command to build Spark, i.e., one missing the -Pmesos profile. Since Spark 2.1.0, Mesos support is an optional module, so you should build with ./build/mvn -Pmesos -DskipTests clean package.
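For reference, a build-and-submit sequence along these lines should work once the Mesos profile is enabled (the application class and jar path below are illustrative placeholders, not from the question):

```shell
# Build Spark 2.1.0+ with Mesos support (the -Pmesos profile);
# without it, SparkContext cannot parse mesos:// master URLs.
./build/mvn -Pmesos -DskipTests clean package

# Then submit against the Mesos master. The host/port match the
# question; the class name and jar path are assumptions -- adjust
# them to your application.
./bin/spark-submit \
  --master mesos://localhost:5050 \
  --class com.example.MyApp \
  path/to/my-app.jar
```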

zsxwing
  • Curious where this `-Pmesos` option is documented? I haven't been able to find any other reference to it anywhere. – zznq May 11 '17 at 16:55
  • See http://spark.apache.org/docs/2.1.0/building-spark.html#building-with-mesos-support – zsxwing May 11 '17 at 18:14