
I've built the Spark 2.1 source code successfully. However, when I run some of the examples (e.g., org.apache.spark.examples.mllib.BinaryClassification), I get the following error.

Exception in thread "main" java.lang.NoClassDefFoundError: scopt/OptionParser

I tried to run those examples using the Spark 2.1 pre-built version (examples/jars/spark-examples_2.11-2.1.0.jar), and I got the same error. The Spark 1.6 pre-built version works (lib/spark-examples-1.6.2-hadoop2.6.0.jar). There are posts related to this error, but they don't seem to be applicable because the Spark examples folder does not contain any .sbt file.

smz

1 Answer


I found the answer. To avoid the error, scopt_x.xx-x.x.x.jar must also be passed to spark-submit via --jars. When you build the Spark examples, scopt_x.xx-x.x.x.jar is built alongside spark-examples_x.xx-x.x.x.jar (in my case, in the same target folder, examples/target/scala-2.11/jars).
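As a sketch, you can locate the scopt jar after the build rather than hard-coding its version (the examples/target/scala-2.11/jars path follows the layout above; the exact jar name and version on your machine will differ):

```shell
# Sketch: find the scopt jar that the Spark examples build produced.
# The search path assumes the default Scala 2.11 build layout described above.
SCOPT_JAR=$(find examples/target/scala-2.11/jars -name 'scopt_*.jar' 2>/dev/null | head -n 1)

# Fall back to the placeholder name so the flag below stays illustrative
# even if the build output is not present.
SCOPT_JAR=${SCOPT_JAR:-examples/target/scala-2.11/jars/scopt_x.xx-x.x.x.jar}

# This value is what goes after --jars in the spark-submit command.
echo "--jars ${SCOPT_JAR}"
```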

Once you have the jar file, you can submit it with your applications:

./bin/spark-submit \
   --jars examples/target/scala-2.11/jars/scopt_x.xx-x.x.x.jar \
   --class org.apache.spark.examples.mllib.BinaryClassification \
   --master ...
Derlin
  • Man, this is crazy. I've been doing Spark - and even contributed to MLlib - for 4 years, but could not figure this out. `bin/run-example` **really** should have taken care of this – WestCoastProjects Oct 21 '17 at 02:12