
I'm getting the following exception. I have verified that the jars are present in that location, and I run the job with root permission.

ERROR [ExecutorRunner for app-20170509111035-0004/19686] 2017-05-09 11:11:19,267 SPARK-WORKER Logging.scala:95 - Error running executor
java.lang.IllegalStateException: No assemblies found in '/usr/apps/cassandra/dse/resources/spark/lib'.
        at org.apache.spark.launcher.CommandBuilderUtils.checkState(CommandBuilderUtils.java:249) ~[spark-launcher_2.10-1.6.3.3.jar:1.6.3.3]
        at org.apache.spark.launcher.AbstractCommandBuilder.findAssembly(AbstractCommandBuilder.java:342) ~[spark-launcher_2.10-1.6.3.3.jar:1.6.3.3]
        at org.apache.spark.launcher.AbstractCommandBuilder.buildClassPath(AbstractCommandBuilder.java:187) ~[spark-launcher_2.10-1.6.3.3.jar:1.6.3.3]
        at org.apache.spark.launcher.AbstractCommandBuilder.buildJavaCommand(AbstractCommandBuilder.java:119) ~[spark-launcher_2.10-1.6.3.3.jar:1.6.3.3]
        at org.apache.spark.launcher.WorkerCommandBuilder.buildCommand(WorkerCommandBuilder.scala:39) ~[spark-core_2.10-1.6.3.3.jar:1.6.3.3]
        at org.apache.spark.launcher.WorkerCommandBuilder.buildCommand(WorkerCommandBuilder.scala:48) ~[spark-core_2.10-1.6.3.3.jar:1.6.3.3]
        at org.apache.spark.deploy.worker.CommandUtils$.buildCommandSeq(CommandUtils.scala:63) ~[spark-core_2.10-1.6.3.3.jar:5.0.8]
        at org.apache.spark.deploy.worker.CommandUtils$.buildProcessBuilder(CommandUtils.scala:51) ~[spark-core_2.10-1.6.3.3.jar:5.0.8]
        at org.apache.spark.deploy.worker.ExecutorRunner.org$apache$spark$deploy$worker$ExecutorRunner$$fetchAndRunExecutor(ExecutorRunner.scala:143) ~[spark-core_2.10-1.6.3.3.jar:5.0.8]
        at org.apache.spark.deploy.worker.ExecutorRunner$$anon$1.run(ExecutorRunner.scala:71) [spark-core_2.10-1.6.3.3.jar:5.0.8]
INFO  [dispatcher-event-loop-0] 2017-05-09 11:11:19,268 SPARK-WORKER Logging.scala:58 - Executor app-20170509111035-0004/19686 finished with state FAILED message java.lang.IllegalStateException: No assemblies found in '/usr/apps/cassandra/dse/resources/spark/lib'.

Any help would be appreciated.

Balaji Reddy
  • http://stackoverflow.com/questions/43590062/remote-spark-job-fails-no-assemblies-found – Ashraful Islam May 09 '17 at 16:18
  • Yes, but it didn't help me. – Balaji Reddy May 09 '17 at 16:19
  • http://stackoverflow.com/questions/30478045/dse-4-6-to-dse-4-7-failed-to-find-spark-assembly – undefined_variable May 09 '17 at 18:46
  • Thank you for the response. I will check and keep you posted. – Balaji Reddy May 09 '17 at 18:49
  • How did you run Spark? In DSE you have to run Spark with `dse spark`, `dse spark-submit`, etc. DSE has a different dependency layout than OSS Spark, so if you run with the OSS Spark scripts you will likely have problems. – Jacek L. May 10 '17 at 14:51
  • The other problem here may be that you recently upgraded DSE from 4.8 to 5.0 but somehow it still uses some old scripts. You can check, for example, whether you have the following line in your `bin/dse` script: `export _SPARK_ASSEMBLY="$("$DSE_SCRIPT" spark-classpath)"` – Jacek L. May 10 '17 at 14:57
  • @JacekL. Thanks for the response. The issue is resolved. I will post the answer shortly. – Balaji Reddy May 10 '17 at 15:01

1 Answer


I found the root cause. I'm running a DSE Analytics cluster, and on one of my seed nodes the JRE version had accidentally been changed. After pointing the node back to the correct JRE, the issue disappeared.
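For anyone hitting the same error, here is a minimal sketch of how one might verify and correct the JRE a node resolves (the JRE path below is an example, not the one from my setup):

```
# Check which JRE the node actually resolves
java -version
readlink -f "$(which java)"

# If it points at the wrong JRE, set JAVA_HOME to the one DSE expects
# (example path; substitute the JRE required by your DSE version)
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export PATH="$JAVA_HOME/bin:$PATH"

# Restart DSE on the node so the Spark worker picks up the new JRE
sudo service dse restart
```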

Balaji Reddy