
I am trying to install Apache Spark 1.6.1 in standalone mode, following the guide at https://github.com/KristianHolsheimer/pyspark-setup-guide. After executing

$ sbt/sbt assembly

I tried

$ ./bin/run-example SparkPi 10

but it gave this error:

./bin/run-example: line 26: /home/dasprasun/opt/spark/bin/load-spark-env.sh: No such file or directory
Failed to find Spark examples assembly in /home/dasprasun/opt/spark/lib or /home/dasprasun/opt/spark/examples/target
You need to build Spark before running this program
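The first message means run-example cannot even source bin/load-spark-env.sh, and the second means the examples assembly jar was never found, so both point at an incomplete build or install. A quick sanity check along these lines may help (a sketch; the scala-2.10 directory is an assumption based on Spark 1.6.1's default Scala version):

$ ls /home/dasprasun/opt/spark/bin/load-spark-env.sh
$ ls /home/dasprasun/opt/spark/examples/target/scala-2.10/spark-examples-*.jar

If either ls fails, the sbt/sbt assembly step did not finish successfully, or it ran in a different directory than the one run-example is inspecting.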

After completing all the steps, I ran the following command in IPython:

In [1]: from pyspark import SparkContext

it gave the following error:

ImportError                               Traceback (most recent call last)
<ipython-input-1-47c4965c5f0e> in <module>()
----> 1 from pyspark import SparkContext

ImportError: No module named pyspark
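Since Python cannot locate the module at all, one quick diagnostic (a sketch, not a step from the guide) is to print the interpreter's search path and check whether the Spark Python directory appears in it:

$ python -c "import sys; print('\n'.join(sys.path))"

If /home/dasprasun/opt/spark/python is not in the output, the PYTHONPATH setting from the guide has not taken effect in the shell that launches IPython.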

I do not understand what is happening. Please help me figure this out.

Sounak
  • Have you added the py4j and pyspark libraries to your PYTHONPATH environment variable? Zips for both libraries should be present in ~/opt/spark/python/lib (see the sketch after these comments). – dayman May 27 '16 at 15:10
  • I did everything you suggested, but the same error is coming. One thing I forgot to mention: I am installing it on a remote server. – Sounak May 30 '16 at 07:24
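For reference, a minimal sketch of the setup dayman describes, assuming Spark is installed at ~/opt/spark and that the Py4J zip in ~/opt/spark/python/lib is py4j-0.9-src.zip (the version bundled with Spark 1.6.x; adjust the name to whatever is actually in that directory):

$ export SPARK_HOME=~/opt/spark
$ export PYTHONPATH="$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.9-src.zip:$PYTHONPATH"
$ python -c "from pyspark import SparkContext; print('pyspark found')"

On a remote server, these exports have to be made in the session (or shell profile, e.g. ~/.bashrc) of the account that starts IPython.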
