I have a machine with JupyterHub (Python 2, Python 3, R and Bash kernels). I have Spark (Scala) and of course PySpark working. I can even use PySpark inside an interactive IPython notebook with a command like:
IPYTHON_OPTS="notebook" $path/to/bin/pyspark
(this opens a Jupyter notebook, and inside Python 2 I can use Spark)
BUT I can't get PySpark working inside JupyterHub.
The Spark kernel is more than what I really need.
I only need PySpark inside JupyterHub. Any suggestions?
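For reference, this is roughly what I'd like to be able to run in a regular Python notebook cell under JupyterHub (just a sketch; the paths and the py4j zip name are placeholders for my install):

    import os
    import sys

    # Point the notebook's Python at my existing Spark install
    # (paths below are placeholders, not my real locations)
    os.environ["SPARK_HOME"] = "/path/to/spark"
    sys.path.insert(0, "/path/to/spark/python")
    sys.path.insert(0, "/path/to/spark/python/lib/py4j-0.9-src.zip")  # py4j version depends on the Spark release

    from pyspark import SparkConf, SparkContext

    # Plain local SparkContext, just to confirm PySpark works in the cell
    conf = SparkConf().setAppName("jupyterhub-pyspark").setMaster("local[*]")
    sc = SparkContext(conf=conf)
    print(sc.parallelize(range(10)).sum())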
Thanks.