I have a very simple example Spark job that computes 2+2, compiled against Spark 1.6.
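For reference, the job is essentially the following (a minimal sketch; the object name and app name are illustrative, the real code is equally trivial):

import org.apache.spark.{SparkConf, SparkContext}

// Trivial job: compute 2 + 2 on the cluster.
object Main {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("spark-hello-world")
    val sc = new SparkContext(conf) // this is the call that fails (Main.scala:16 in the trace below)
    val sum = sc.parallelize(Seq(2, 2)).reduce(_ + _)
    println(s"2 + 2 = $sum")
    sc.stop()
  }
}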
I'm running spark-submit as follows:
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --executor-memory 2G \
  --driver-memory 1G \
  --conf spark.yarn.jar=hdfs:/user/bigdata-app-xxx-yyy/diy/lib/spark-assembly-1.6.0-hadoop2.6.0.jar \
  --queue root.xxxyyy \
  --num-executors 4 \
  --principal bigdata-app-xxx-yyy@kontosa.COM \
  --keytab /clf/hadoop/conf/keytabs/bigdata-app-xxx-yyy.keytab \
  --class com.vanilla.meir.Main \
  hdfs:/user/bigdata-app-xxx-yyy/xxx/lib/spark-hello-world.jar
The job is submitted, but it fails with the following exception:
19/12/08 07:15:37 INFO storage.MemoryStore: MemoryStore started with capacity 457.9 MB
19/12/08 07:15:37 INFO spark.SparkEnv: Registering OutputCommitCoordinator
19/12/08 07:15:37 INFO ui.JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
19/12/08 07:15:37 INFO util.Utils: Successfully started service 'SparkUI' on port 35371.
19/12/08 07:15:37 INFO ui.SparkUI: Started SparkUI at http://10.204.152.26:35371
19/12/08 07:15:37 INFO cluster.YarnClusterScheduler: Created YarnClusterScheduler
19/12/08 07:15:37 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 43674.
19/12/08 07:15:37 INFO netty.NettyBlockTransferService: Server created on 43674
19/12/08 07:15:37 INFO storage.BlockManager: external shuffle service port = 7337
19/12/08 07:15:37 INFO storage.BlockManagerMaster: Trying to register BlockManager
19/12/08 07:15:37 INFO storage.BlockManagerMasterEndpoint: Registering block manager 10.204.152.26:43674 with 457.9 MB RAM, BlockManagerId(driver, 10.204.152.26, 43674)
19/12/08 07:15:37 INFO storage.BlockManagerMaster: Registered BlockManager
19/12/08 07:15:37 INFO scheduler.EventLoggingListener: Logging events to hdfs://Titan/user/spark/applicationHistory/application_1564355610025_265304_1
19/12/08 07:15:37 WARN spark.SparkContext: Dynamic Allocation and num executors both set, thus dynamic allocation disabled.
19/12/08 07:15:37 INFO ui.SparkUI: Stopped Spark web UI at http://10.204.152.26:35371
19/12/08 07:15:37 INFO cluster.YarnClusterSchedulerBackend: Shutting down all executors
19/12/08 07:15:37 INFO cluster.YarnClusterSchedulerBackend: Asking each executor to shut down
19/12/08 07:15:38 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
19/12/08 07:15:38 INFO storage.MemoryStore: MemoryStore cleared
19/12/08 07:15:38 INFO storage.BlockManager: BlockManager stopped
19/12/08 07:15:38 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
19/12/08 07:15:38 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
19/12/08 07:15:38 INFO spark.SparkContext: Successfully stopped SparkContext
19/12/08 07:15:38 ERROR spark.SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Exception when registering SparkListener
at org.apache.spark.SparkContext.setupAndStartListenerBus(SparkContext.scala:2155)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:578)
at com.vanilla.meir.Main$.main(Main.scala:16)
at com.vanilla.meir.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:542)
Caused by: java.lang.ClassNotFoundException: com.cloudera.spark.lineage.ClouderaNavigatorListener
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.util.Utils$.classForName(Utils.scala:174)
at org.apache.spark.SparkContext$$anonfun$setupAndStartListenerBus$1.apply(SparkContext.scala:2123)
at org.apache.spark.SparkContext$$anonfun$setupAndStartListenerBus$1.apply(SparkContext.scala:2120)
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:34)
at org.apache.spark.SparkContext.setupAndStartListenerBus(SparkContext.scala:2120)
... 8 more
19/12/08 07:15:38 INFO spark.SparkContext: SparkContext already stopped.
19/12/08 07:15:38 INFO remote.RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
19/12/08 07:15:38 ERROR yarn.ApplicationMaster: User class threw exception: org.apache.spark.SparkException: Exception when registering SparkListener
org.apache.spark.SparkException: Exception when registering SparkListener
at org.apache.spark.SparkContext.setupAndStartListenerBus(SparkContext.scala:2155)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:578)
at com.vanilla.meir.Main$.main(Main.scala:16)
at com.vanilla.meir.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:542)
Caused by: java.lang.ClassNotFoundException: com.cloudera.spark.lineage.ClouderaNavigatorListener
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.util.Utils$.classForName(Utils.scala:174)
at org.apache.spark.SparkContext$$anonfun$setupAndStartListenerBus$1.apply(SparkContext.scala:2123)
at org.apache.spark.SparkContext$$anonfun$setupAndStartListenerBus$1.apply(SparkContext.scala:2120)
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:34)
at org.apache.spark.SparkContext.setupAndStartListenerBus(SparkContext.scala:2120)
... 8 more
19/12/08 07:15:38 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: org.apache.spark.SparkException: Exception when registering SparkListener)
19/12/08 07:15:38 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
19/12/08 07:15:38 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
19/12/08 07:15:46 ERROR yarn.ApplicationMaster: SparkContext did not initialize after waiting for 100000 ms. Please check earlier log output for errors. Failing the application.
19/12/08 07:15:46 INFO util.ShutdownHookManager: Shutdown hook called
The same job used to run fine on the previous release, Spark 1.5.2, but recompiling the code against the new Spark version produces this exception.
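The root cause is the ClassNotFoundException for com.cloudera.spark.lineage.ClouderaNavigatorListener, thrown while the SparkContext registers the listener classes named in spark.extraListeners. My assumption is that the cluster's spark-defaults.conf registers this Cloudera Navigator lineage listener, and the vanilla spark-assembly jar that spark.yarn.jar points at does not ship that class. If that assumption holds, one possible (untested) workaround would be to clear the setting explicitly when building the SparkConf, a minimal sketch:

import org.apache.spark.{SparkConf, SparkContext}

// Untested workaround sketch: clear spark.extraListeners so the Cloudera
// Navigator listener inherited from spark-defaults.conf is never loaded.
val conf = new SparkConf()
  .setAppName("spark-hello-world")
  .set("spark.extraListeners", "") // an explicit set overrides spark-defaults.conf

val sc = new SparkContext(conf)

The same override should also be possible on the command line via --conf spark.extraListeners= , but I haven't verified either variant.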
Can someone help?