On my current project, I am trying to deploy a Spark 2.2 application to a cluster where only Spark 2.1 is available. I looked in the Spark documentation for the way to deploy specific dependencies on a cluster, which led me to use the following spark-submit:
spark-submit --master yarn --class MainMethodSparkApp --conf spark.driver.extraClassPath=localPath-to-jar-with-dependencies
--conf spark.executor.extraClassPath=localPath-to-jar-with-dependencies --conf spark.jars=hdfsPath-jar-with-dependencies --queue queueName --deploy-mode cluster
--driver-memory xx --num-executors xx --executor-memory xx --executor-cores xx
The driver and executors are configured with the jar containing the dependencies. However, I am still getting a NoSuchMethodException. The Spark documentation is a bit unclear about which options to use to successfully deploy specific dependencies. What am I missing? Any suggestions are welcome. Thanks a lot!
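For context, here is a sketch of a variant I am considering, based on my understanding of the Spark documentation: `spark.driver.userClassPathFirst` and `spark.executor.userClassPathFirst` are supposed to make Spark prefer the classes in the user jar over the cluster's own versions. All paths, the queue name, and the resource sizes below are placeholders, not my actual values:

```shell
# Sketch only: prefer the classes bundled in my uber-jar over the
# Spark 2.1 jars already on the cluster's classpath.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class MainMethodSparkApp \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  --conf spark.jars=hdfsPath-jar-with-dependencies \
  --queue queueName \
  --driver-memory xx \
  --num-executors xx \
  --executor-memory xx \
  --executor-cores xx
```

I have not been able to confirm from the documentation whether these options can safely override core Spark classes, or whether they only apply to third-party dependencies, so I would appreciate guidance on that point too.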