
I built a fat jar and am trying to run it with spark-submit, either on EMR or locally. Here is the command:

spark-submit \
--deploy-mode client  \
--class com.stash.data.omni.source.Runner myJar.jar  \
<arguments>

I keep getting an error related to akka configurations:

Exception in thread "main" com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'akka.version'

It seems like the jar cannot find Akka's reference.conf files at all. Has anyone dealt with this? I am able to run the application without spark-submit on my local machine.

collarblind

2 Answers


I think the issue is bundling the application into a single jar with all its dependencies, which causes problems with Akka, as described in the documentation:

Akka’s configuration approach relies heavily on the notion of every module/jar having its own reference.conf file. All of these will be discovered by the configuration and loaded. Unfortunately this also means that if you put/merge multiple jars into the same jar, you need to merge all the reference.conf files as well: otherwise all defaults will be lost.

You can follow that documentation to package your application so that the reference.conf resources are merged during bundling. It covers packaging with sbt, Maven, and Gradle.
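For sbt, a minimal sketch of such a merge strategy (assuming the sbt-assembly plugin is installed) might look like this in build.sbt, concatenating all reference.conf files instead of keeping only one:

```scala
// build.sbt -- sketch, assumes the sbt-assembly plugin
assembly / assemblyMergeStrategy := {
  // Concatenate every reference.conf so no module's Akka defaults are lost
  case "reference.conf" => MergeStrategy.concat
  // Fall back to the plugin's default behavior for everything else
  case x =>
    val oldStrategy = (assembly / assemblyMergeStrategy).value
    oldStrategy(x)
}
```

With this in place, the assembled jar contains a single reference.conf that is the concatenation of all of them, so keys like akka.version resolve at runtime.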

Let me know if it helps!

Anand Sai

It was my merge strategy. I had a catch-all case _ => MergeStrategy.first. I changed it to case x => MergeStrategy.defaultMergeStrategy(x) and it worked.
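A sketch of the change described above: the catch-all first strategy silently discards all but one reference.conf, while sbt-assembly's default strategy concatenates them.

```scala
// Before: keeps only the first reference.conf found, so Akka's
// defaults (including akka.version) from other jars are dropped
assembly / assemblyMergeStrategy := {
  case _ => MergeStrategy.first
}

// After: defer to sbt-assembly's defaults, which concatenate
// reference.conf files across all bundled jars
assembly / assemblyMergeStrategy := {
  case x => MergeStrategy.defaultMergeStrategy(x)
}
```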

collarblind