
I need help changing the Python version of a Spark worker node to get rid of the following error message:

RuntimeError: Python in worker has different version 3.10 than that in driver 3.9, PySpark cannot run with different minor versions. Please check environment variables PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON are correctly set.

I am using a conda environment on my Mac.

I have reset the PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON environment variables:

export PYSPARK_PYTHON=/path/to/python3.9
export PYSPARK_DRIVER_PYTHON=/path/to/python3.9
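
My understanding is that these can also be pinned from inside the script itself, before the SparkSession is created. A minimal sketch, assuming the script is launched with the conda environment's Python so that sys.executable points at it (the master and app name are placeholders):

import os
import sys

# Point both the driver and the workers at the interpreter running this
# script, so the two minor versions cannot diverge. This must happen
# before the SparkSession/SparkContext is created.
os.environ["PYSPARK_PYTHON"] = sys.executable
os.environ["PYSPARK_DRIVER_PYTHON"] = sys.executable

from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .master("local[*]")        # placeholder; adjust for your cluster
    .appName("version-check")  # placeholder app name
    .getOrCreate()
)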

However, I still get the same error message. I saw in another post that I should check the SparkContext config for worker node information.
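
This is how I understood that inspection step; a sketch, assuming spark is an existing SparkSession (and, as far as I can tell, sc.pythonExec is the interpreter path PySpark hands to the workers):

# Dump the SparkContext config and the worker interpreter path.
sc = spark.sparkContext
for key, value in sc.getConf().getAll():
    print(key, "=", value)
print("worker interpreter:", sc.pythonExec)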

However, I still do not know how to change the worker node information. Can someone please help me out? (I saw other posts saying to change the worker node's Python version, but I do not know how to find the worker node, check its version, or change it, so I need more detail. My rough guess at how to check it is sketched below.)
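
From what I have pieced together, the worker's version can be checked by running a tiny job that reports sys.version_info from inside a task. A sketch I have not verified, again assuming spark is an existing SparkSession:

import sys

def report_version(_):
    # This runs inside a worker task, so it sees the worker's interpreter.
    import sys
    return ".".join(map(str, sys.version_info[:3]))

sc = spark.sparkContext
print("driver python:", ".".join(map(str, sys.version_info[:3])))
print("worker python:", sc.parallelize([0], 1).map(report_version).collect())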
