I want to run a Spark job locally for testing. If spark-submit and an assembled jar are used, it works just fine.
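For reference, the working path looks roughly like this (a sketch; the main class and jar name are assumptions, not taken from the actual project):

    sbt assembly
    spark-submit \
      --class com.example.Main \
      --master "local[*]" \
      target/scala-2.11/myproject-assembly-1.0.jar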
However, if sbt run is used instead, I get a very strange error: https://gist.github.com/geoHeil/946dd7706f44f338101c8332f4e13c1a
Trying to set Java options like

    javaOptions ++= Seq("-Xms512M", "-Xmx2048M", "-XX:MaxPermSize=2048M", "-XX:+CMSClassUnloadingEnabled")

did not help to solve the problem.
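For context, these options are wired into the build roughly as follows (a minimal build.sbt sketch; the project name and Spark version are assumptions). Note that sbt only applies javaOptions to forked JVMs, so run is forked here, and -XX:MaxPermSize is ignored on Java 8, where PermGen was removed:

    name := "spark-local-test"
    scalaVersion := "2.11.8"

    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0"

    // javaOptions only take effect when sbt forks a separate JVM for run
    fork := true
    javaOptions ++= Seq("-Xms512M", "-Xmx2048M", "-XX:MaxPermSize=2048M", "-XX:+CMSClassUnloadingEnabled")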
Trying to fiddle with memory settings in local[*] mode like

    .set("spark.executor.memory", "7g")
    .set("spark.driver.memory", "7g")

only spawned further ExecutorLostFailure problems.
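Those settings sit in a context roughly like this (a sketch; the object and app name are hypothetical):

    import org.apache.spark.{SparkConf, SparkContext}

    object LocalSparkTest {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("local-test")
          .setMaster("local[*]")
          // In local[*] mode the executors run inside the driver JVM, so
          // spark.executor.memory is effectively irrelevant here, and
          // spark.driver.memory must be set before the JVM starts (e.g. via
          // --driver-memory or javaOptions); setting it programmatically
          // after startup has no effect on the already-running JVM.
          .set("spark.executor.memory", "7g")
          .set("spark.driver.memory", "7g")

        val sc = new SparkContext(conf)
        // ... job logic under test ...
        sc.stop()
      }
    }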