
I am trying to run spark-submit locally:

spark-submit --master local --executor-cores 1 --queue default --deploy-mode client test.py

but I get this error:

py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext. : java.lang.NoSuchFieldError: JAVA_9

I am using Python 3.8 and PySpark 3.0.1; for Java I have OpenJDK 1.8.
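Before changing anything, it can help to confirm which versions the shell actually resolves; a minimal sanity check (the suggestion that a stale Spark install or jar on the classpath can cause this kind of NoSuchFieldError is an assumption, not something confirmed by the question):

```shell
# Print the Python, Java, and PySpark versions visible to this shell.
# A mismatch between the pip-installed PySpark and an older Spark
# install pointed to by SPARK_HOME is one plausible source of
# classpath errors like NoSuchFieldError (assumption, not confirmed).
python3 --version
java -version 2>&1 | head -n 1
pip show pyspark 2>/dev/null | grep -i '^version' || echo "pyspark not found via pip"
echo "SPARK_HOME=${SPARK_HOME:-<unset>}"
```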

user9040429

1 Answer


I ran into the same error when trying to call a local jar via sc._jvm. I managed to make it work by reverting to an earlier version of PySpark (2.4.3 in my case).

Note that I am using Python 3.6.8; with PySpark 3.0.1 on Python 3.8 I had a different error.
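If reverting is the route you take, the pin described above can be applied with pip; a sketch, assuming a pip-managed PySpark (2.4.3 is the version this answer reports working, and it was paired with Python 3.6, not 3.8):

```shell
# Pin PySpark to the version reported working in this answer (2.4.3).
# Doing this in a fresh virtual environment avoids mixing 2.x and 3.x jars.
pip install 'pyspark==2.4.3'
# Confirm the interpreter picks up the pinned version
python -c "import pyspark; print(pyspark.__version__)"
```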

sergiu