Context: Windows 10 / Linux (Ubuntu 8 LTS)
Spark: 3.3.0 (spark-3.3.0-bin-hadoop3)
I'm running spark-submit with a fat JAR, but DependencyUtils warns that the local jar "does not exist", even though the JAR is in place. As a consequence, the main class fails to load.
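For context, the entry point inside the fat JAR is nothing special; a minimal sketch of what it is assumed to look like (the package name com.example stands in for <packagename>, and the body is just a placeholder):

package com.example  // placeholder for <packagename>

import org.apache.spark.sql.SparkSession

object Application {
  def main(args: Array[String]): Unit = {
    // --master passed to spark-submit takes precedence over anything set here
    val spark = SparkSession.builder().appName("ProjectName").getOrCreate()
    spark.range(10).show()  // trivial action just to confirm the session works
    spark.stop()
  }
}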
Command:
bin\spark-submit --verbose --master local --class <packagename>.Application --executor-memory 800m C:\<path to project>\target\scala-2.12\ProjectName.jar
The result is:
Using properties file: null
Parsed arguments:
master local
deployMode null
executorMemory 800m
executorCores null
totalExecutorCores null
propertiesFile null
driverMemory null
driverCores null
driverExtraClassPath null
driverExtraLibraryPath null
driverExtraJavaOptions null
supervise false
queue null
numExecutors null
files null
pyFiles null
archives null
mainClass <packagename>.Application
primaryResource file:/C:<path to project>target/scala-2.12/ProjectName.jar
name <packagename>.Application
childArgs []
jars null
packages null
packagesExclusions null
repositories null
verbose true
Spark properties used, including those specified through
--conf and those from the properties file null:
Main class:
<packagename>.Application
Arguments:
Spark config:
(spark.app.name,<packagename>.Application)
(spark.app.submitTime,1664225037825)
(spark.jars,file:/C:<path to project>/target/scala-2.12/ProjectName.jar)
(spark.master,local)
(spark.submit.deployMode,client)
(spark.submit.pyFiles,)
Classpath elements:
file:/C:<path to project>/target/scala-2.12/ProjectName.jar
22/09/26 22:43:57 WARN DependencyUtils: Local jar C:<path to project>\target\scala-2.12\ProjectName.jar does not exist, skipping.
Error: Failed to load class <packagename>.Application.
22/09/26 22:43:57 INFO ShutdownHookManager: Shutdown hook called
Thanks for any suggestions.
Lorenzo