
I am on a Windows machine, running PySpark that was set up using the Databricks CLI.

Getting this error:

Python in worker has different version 3.8 than that in driver 3.9, PySpark cannot run with different minor versions. Please check environment variables PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON are correctly set.

It says the PySpark driver is running 3.9, but my PYSPARK_DRIVER_PYTHON environment variable points to a Python 3.8 installation.

Why is my PySpark driver still running 3.9?
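To double-check, here is a minimal diagnostic I can run in the same shell where I launch pyspark. It prints which interpreter is actually executing (the driver side) and what the two environment variables currently hold; the variable names come straight from the error message, everything else is just standard-library inspection:

```python
import os
import sys

# The interpreter actually running this code (this is the "driver" Python)
print("interpreter:", sys.executable, sys.version_info[:3])

# The variables PySpark consults when choosing driver/worker interpreters
for var in ("PYSPARK_PYTHON", "PYSPARK_DRIVER_PYTHON"):
    print(var, "=", os.environ.get(var))
```

If the printed interpreter path differs from what PYSPARK_DRIVER_PYTHON says, the variable is being set in a different scope (or after the shell started) than the one the driver was launched from.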

Thank you.

jrudd