
I am developing a SparkJob on jobserver (v0.6.2 spark 1.6.1) using spark graphx and I am running into the following exception when trying to launch my job on Spark JobServer:

    {
      "status": "JOB LOADING FAILED",
      "result": {
        "errorClass": "java.lang.NoClassDefFoundError",
        "cause": "org.apache.spark.graphx.VertexRDD",
        "stack": ["java.net.URLClassLoader.findClass(URLClassLoader.java:381)", "java.lang.ClassLoader.loadClass(ClassLoader.java:424)", "java.lang.ClassLoader.loadClass(ClassLoader.java:357)", "java.lang.Class.getDeclaredFields0(Native Method)", "java.lang.Class.privateGetDeclaredFields(Class.java:2583)", "java.lang.Class.getField0(Class.java:2975)", "java.lang.Class.getField(Class.java:1701)", "spark.jobserver.util.JarUtils$.loadObject(JarUtils.scala:61)", "spark.jobserver.util.JarUtils$.loadClassOrObject(JarUtils.scala:37)", "spark.jobserver.JobCache$$anonfun$getSparkJob$1.apply(JobCache.scala:46)", "spark.jobserver.JobCache$$anonfun$getSparkJob$1.apply(JobCache.scala:37)", "spark.jobserver.util.LRUCache.get(LRUCache.scala:35)", "spark.jobserver.JobCache.getSparkJob(JobCache.scala:37)", "spark.jobserver.JobManagerActor$$anonfun$startJobInternal$1.apply$mcV$sp(JobManagerActor.scala:216)", "scala.util.control.Breaks.breakable(Breaks.scala:37)", "spark.jobserver.JobManagerActor.startJobInternal(JobManagerActor.scala:192)", "spark.jobserver.JobManagerActor$$anonfun$wrappedReceive$1.applyOrElse(JobManagerActor.scala:144)", "scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)", "scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)", "scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)", "ooyala.common.akka.ActorStack$$anonfun$receive$1.applyOrElse(ActorStack.scala:33)", "scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)", "scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)", "scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)", "ooyala.common.akka.Slf4jLogging$$anonfun$receive$1$$anonfun$applyOrElse$1.apply$mcV$sp(Slf4jLogging.scala:26)", "ooyala.common.akka.Slf4jLogging$class.ooyala$common$akka$Slf4jLogging$$withAkkaSourceLogging(Slf4jLogging.scala:35)", "ooyala.common.akka.Slf4jLogging$$anonfun$receive$1.applyOrElse(Slf4jLogging.scala:25)", "scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)", "scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)", "scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)", "ooyala.common.akka.ActorMetrics$$anonfun$receive$1.applyOrElse(ActorMetrics.scala:24)", "akka.actor.Actor$class.aroundReceive(Actor.scala:467)", "ooyala.common.akka.InstrumentedActor.aroundReceive(InstrumentedActor.scala:8)", "akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)", "akka.actor.ActorCell.invoke(ActorCell.scala:487)", "akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)", "akka.dispatch.Mailbox.run(Mailbox.scala:220)", "akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)", "scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)", "scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)", "scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)", "scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)"],
        "causingClass": "java.lang.ClassNotFoundException",
        "message": "org/apache/spark/graphx/VertexRDD"
      }
    }

This happens although I've included the graphx dependency in both my build.sbt and in Dependencies.scala on the job server.
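
For reference, the GraphX dependency would normally be declared like this in build.sbt (a sketch; the Scala version and the `provided` scoping are assumptions, not taken from the question):

    // build.sbt -- sketch for Spark 1.6.1; Scala version is an assumption
    scalaVersion := "2.10.6"

    libraryDependencies ++= Seq(
      // "provided": supplied by the job server at runtime, kept out of the job jar
      "org.apache.spark" %% "spark-core"   % "1.6.1" % "provided",
      // spark-graphx is not on every job server classpath, so it must either be
      // bundled into the job jar (sbt-assembly) or shipped to the context at runtime
      "org.apache.spark" %% "spark-graphx" % "1.6.1"
    )

Note that declaring the dependency in build.sbt alone only affects compilation; if the jar that is uploaded to the job server does not actually contain (or otherwise load) the GraphX classes, a `NoClassDefFoundError` like the one above is the expected symptom.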

Any help?

zaki benz

1 Answer


Use the `dependent-jar-uris` context configuration parameter. The jars listed there are then loaded for every job that runs in that context.
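
As a sketch, the setting can go in the job server's HOCON configuration; the context name and jar path below are assumptions, adjust them to your deployment:

    # Hypothetical context definition in the job server config (HOCON).
    contexts {
      graphx-context {
        # URIs of extra jars to load for every job in this context
        dependent-jar-uris = ["file:///opt/spark/lib/spark-graphx_2.10-1.6.1.jar"]
      }
    }

The same parameter can also be supplied when creating a context through the job server's REST API; either way, every job submitted to that context gets the listed jars on its classpath.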

noorul
  • I added spark-graphx in Dependencies.scala, ran "project job-server-extras", and only then started the job server using reStart. What is the difference between the two solutions? – zaki benz Jan 18 '17 at 14:25
  • It might make a difference if you are running using reStart. How exactly are you starting job-server? – noorul Jan 19 '17 at 04:40
  • Yes I am using reStart – zaki benz Jan 20 '17 at 10:44