
I have read some posts about the error I am seeing when importing pyspark. Some suggest installing py4j, which I have already done, yet I am still seeing the error.

I am using a conda environment; here are the steps:
1. create a yml file that includes the needed packages (including py4j)
2. create an env based on the yml
3. create a kernel pointing to the env
4. start the kernel in Jupyter
5. run `import pyspark`, which throws: `ImportError: No module named py4j.protocol`
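
For reference, steps 1–3 look roughly like the sketch below. The environment name `pyspark_env`, the Python version, and the kernel display name are illustrative assumptions, not exact values from my setup:

    # environment.yml -- conda env with the packages the kernel needs
    name: pyspark_env
    dependencies:
      - python=2.7
      - ipykernel
      - py4j

    # create the env from the yml, then register a Jupyter kernel pointing to it
    conda env create -f environment.yml
    source activate pyspark_env        # or `conda activate pyspark_env` on newer conda
    python -m ipykernel install --user --name pyspark_env --display-name "PySpark (pyspark_env)"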

1 Answer


The issue was resolved by adding an `env` section to kernel.json and explicitly specifying the following variables:

 "env": {
  "HADOOP_CONF_DIR": "/etc/spark2/conf/yarn-conf",
  "PYSPARK_PYTHON":"/opt/cloudera/parcels/Anaconda/bin/python",
  "SPARK_HOME": "/opt/cloudera/parcels/SPARK2",
  "PYTHONPATH": "/opt/cloudera/parcels/SPARK2/lib/spark2/python/lib/py4j-0.10.7-src.zip:/opt/cloudera/parcels/SPARK2/lib/spark2/python/",
  "PYTHONSTARTUP": "/opt/cloudera/parcels/SPARK2/lib/spark2/python/pyspark/shell.py",
  "PYSPARK_SUBMIT_ARGS": " --master yarn --deploy-mode client pyspark-shell"
 }
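
For completeness, here is a sketch of how that `env` section sits inside the full kernel.json; the `argv` and `display_name` shown are typical ipykernel values and may differ in your setup:

    {
      "argv": ["python", "-m", "ipykernel_launcher", "-f", "{connection_file}"],
      "display_name": "PySpark (conda env)",
      "language": "python",
      "env": {
        "HADOOP_CONF_DIR": "/etc/spark2/conf/yarn-conf",
        "PYSPARK_PYTHON": "/opt/cloudera/parcels/Anaconda/bin/python",
        "SPARK_HOME": "/opt/cloudera/parcels/SPARK2",
        "PYTHONPATH": "/opt/cloudera/parcels/SPARK2/lib/spark2/python/lib/py4j-0.10.7-src.zip:/opt/cloudera/parcels/SPARK2/lib/spark2/python/",
        "PYTHONSTARTUP": "/opt/cloudera/parcels/SPARK2/lib/spark2/python/pyspark/shell.py",
        "PYSPARK_SUBMIT_ARGS": " --master yarn --deploy-mode client pyspark-shell"
      }
    }

After editing kernel.json, restart Jupyter (or at least the kernel) so the new environment variables are picked up. If your Spark installation ships a different py4j version, adjust the zip filename in `PYTHONPATH` to match what is actually under `$SPARK_HOME/python/lib`.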