
I'm running Spark's JavaPageRank example, but from a copy that I compiled separately with Maven into a new jar. I keep getting this error:

ERROR SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[Executor task launch worker-0,5,main] java.lang.NoClassDefFoundError: com/google/common/collect/Iterables

This happens even though Guava is listed as one of Spark's dependencies. I'm running Spark 1.6, which I downloaded pre-built from the Apache website.
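For context, the example uses Guava's Iterables directly to count each page's outgoing links. A minimal standalone version of that call (the class name here is just illustrative) looks like this:

```java
import com.google.common.collect.Iterables;

import java.util.Arrays;
import java.util.List;

public class IterablesCheck {
    public static void main(String[] args) {
        // The same Guava call JavaPageRank makes on each page's neighbor list;
        // it only works if a Guava jar is on the runtime classpath.
        List<String> links = Arrays.asList("pageA", "pageB", "pageC");
        int urlCount = Iterables.size(links);
        System.out.println("url count: " + urlCount);
    }
}
```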

Thanks!

user2839294

1 Answer


The error means that a jar containing the com.google.common.collect.Iterables class is not on the classpath, so your application cannot find the required class at runtime.
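To check quickly whether the class is visible at runtime, you can run a small probe like this (the class name is just illustrative):

```java
public class ClasspathProbe {
    public static void main(String[] args) {
        try {
            // Succeeds only if a Guava jar is actually on the runtime classpath.
            Class.forName("com.google.common.collect.Iterables");
            System.out.println("Guava's Iterables IS on the classpath");
        } catch (ClassNotFoundException e) {
            System.out.println("Guava's Iterables is NOT on the classpath");
        }
    }
}
```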

If you are using Maven/Gradle, try cleaning, rebuilding, and refreshing the project. Then check your build output and make sure the Guava jar is in the lib folder. Hope this will help.
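If cleaning and rebuilding is not enough, another option is to bundle Guava (and your other dependencies) into the application jar itself, for example with the maven-shade-plugin. A minimal sketch for your pom.xml, assuming Maven (adjust the plugin version to your setup):

```xml
<!-- Sketch: package dependencies, including Guava, into one "fat" jar -->
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>2.4.3</version>
      <executions>
        <execution>
          <!-- Run the shade goal when the jar is packaged -->
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```

With this in place, mvn clean package produces a jar that already contains the Guava classes, so the runtime no longer has to find them anywhere else.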

Good luck!

Lina
  • Hi! Thank you for replying so quickly. But I didn't compile the Spark source myself - should this problem persist if I use a pre-built version? – user2839294 Feb 10 '16 at 05:28
  • OK, if I understand you correctly, you are using the examples inside the Spark package itself. If so, then most probably not building the Spark package is the problem. You need to build the package itself before running or using anything in it. Run "mvn -DskipTests clean package" to build your Spark package. This is similar to the question asked here: http://stackoverflow.com/questions/27618843/why-does-spark-submit-and-spark-shell-fail-with-failed-to-find-spark-assembly-j/30047304#30047304 Hope this will help. – Lina Feb 10 '16 at 05:47
  • I'm only copying one example from Spark, with very minor changes, and compiling it into a separate external jar. I want to run this jar against the Spark that is already compiled. When I run the example directly from the compiled Spark, there is no error, but I do get this error when I use my external jar. – user2839294 Feb 10 '16 at 05:58
  • Sorry for my late response. Your example is using Spark libraries and APIs, right? When you use the compiled Spark, all of the sources and libraries that Spark needs are in place, so you do not see an error. When you run it from your external jar, you have to make sure that all of those needed dependencies are configured in your Maven/Gradle (or other) build. Do you have those dependencies in your pom.xml? Have you tried to build your external application? – Lina Feb 11 '16 at 06:41
  • I've tried listing both spark-core and guava as dependencies in Maven. Even when I do list Guava as a dependency, I get the same error. This is what my dependencies look like: `<dependency><groupId>org.apache.spark</groupId><artifactId>spark-core_2.10</artifactId><version>1.6.0</version></dependency> <dependency><groupId>com.google.guava</groupId><artifactId>guava</artifactId><version>19.0</version></dependency>` – user2839294 Feb 11 '16 at 22:58