I added some methods to the Breeze library, and I can see them in my IDE. I then tried to use this custom-built Breeze in a project based on Apache Spark. However, when I package my project with `sbt assembly` and run it on my cluster, it throws a "no such method xxx" error, which means the cluster is not actually running my Breeze build. Could anyone tell me how to make the cluster use the Breeze library I built myself?
1 Answer
My guess is that Spark bundles its own version of the Breeze libraries and prefers it over your custom jars in the assembly. You can try building Spark with your custom library: install the library into your local Maven repository, reference it in Apache Spark's pom.xml, and build your own Spark version.
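As a sketch of that approach, assuming the custom build was published to the local repository under Breeze's usual coordinates (the version string here is invented; use whatever you actually published, e.g. via `sbt publishM2`), the dependency in Spark's pom.xml would be pointed at it roughly like this:

```xml
<!-- Sketch only: groupId/artifactId follow Breeze's published coordinates
     for Scala 2.10; the version must match what you installed locally. -->
<dependency>
  <groupId>org.scalanlp</groupId>
  <artifactId>breeze_2.10</artifactId>
  <version>0.11.2-custom</version>
</dependency>
```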

Nikita
- I built Breeze, published it to my .m2 repository, and built Spark with `mvn package`. After that, I replaced spark-assembly-1.3.0-hadoop1.0.4.jar in the lib folder of Spark on my cluster with the newly built jar of the same name. I also put this jar in my project's lib folder and packaged the project with `sbt assembly`. When I submit my project to the cluster, it throws errors like: Lost task 126.2 in stage 0.0 (TID 449) on executor sr476: java.lang.NoClassDefFoundError (Could not initialize class breeze.linalg.DenseVector$) [duplicate 200] – Mark Apr 10 '15 at 13:56
- You are getting a `java.lang.NoClassDefFoundError`, not a `java.lang.ClassNotFoundException`. That means the class exists, but something fails during its initialization (possibly propagated from your custom code, or due to some incompatibility). Unfortunately, I can't say more than that :( – Nikita Apr 10 '15 at 15:55
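A minimal JVM sketch of the distinction Nikita describes (the class names here are invented, not from the question): a class that is present on the classpath but whose static initializer throws will surface as `ExceptionInInitializerError` on first use, and as `NoClassDefFoundError: Could not initialize class ...` on every later use, which matches the `DenseVector$` error above.

```java
public class InitErrorDemo {
    static class Boom {
        // Static initializer that always fails, standing in for an
        // initialization failure caused by a binary incompatibility.
        static final int VALUE = fail();
        static int fail() { throw new RuntimeException("init failed"); }
    }

    public static void main(String[] args) {
        try {
            System.out.println(Boom.VALUE);
        } catch (Throwable t) {
            // First access: the initializer's failure surfaces as
            // ExceptionInInitializerError.
            System.out.println(t.getClass().getSimpleName());
        }
        try {
            System.out.println(Boom.VALUE);
        } catch (Throwable t) {
            // Later accesses: the JVM has marked the class erroneous and
            // throws NoClassDefFoundError ("Could not initialize class ...").
            System.out.println(t.getClass().getSimpleName());
        }
    }
}
```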
- Thanks. I just found that I was using the wrong Scala version, 2.11.x instead of 2.10.4, which caused the errors. Problem solved! Thank you very much! – Mark Apr 11 '15 at 06:23
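For anyone hitting the same error: the fix Mark describes amounts to pinning the project's Scala version to the one Spark 1.3.0 was built against. A hedged build.sbt sketch (the Breeze version string is an assumption; use the coordinates you actually published):

```scala
// build.sbt — Spark 1.3.0 distributions target Scala 2.10, so the
// project must not compile against 2.11.x.
scalaVersion := "2.10.4"

// Hypothetical coordinates for the locally published custom Breeze build.
libraryDependencies += "org.scalanlp" %% "breeze" % "0.11.2-custom"
```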