I am trying to import and use pyspark with Anaconda.
After installing Spark and setting the $SPARK_HOME
variable, I tried:
$ pip install pyspark
This doesn't work (of course), because I discovered that I need to tell Python to look for pyspark
under $SPARK_HOME/python/
. The problem is that to do that, I need to set $PYTHONPATH,
but Anaconda doesn't use that environment variable.
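In case it helps, here is roughly the workaround I was hoping to apply at the top of my scripts instead of setting $PYTHONPATH (the fallback path is just a placeholder for illustration):

```python
import os
import sys

# Assumes $SPARK_HOME is set; the fallback path is hypothetical.
spark_home = os.environ.get("SPARK_HOME", "/opt/spark")
pyspark_path = os.path.join(spark_home, "python")

# Prepend so this copy of pyspark is found before anything else on sys.path.
if pyspark_path not in sys.path:
    sys.path.insert(0, pyspark_path)
```

But having to paste this into every script seems wrong, which is why I am looking for a cleaner solution.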
I also tried copying the contents of $SPARK_HOME/python/
to ANACONDA_HOME/lib/python2.7/site-packages/,
but that didn't work either.
Is there any way to use pyspark with Anaconda?