I am trying to install Apache Spark 1.6.1 in standalone mode. I followed the guide at "https://github.com/KristianHolsheimer/pyspark-setup-guide". After executing
$ sbt/sbt assembly
I tried
$ ./bin/run-example SparkPi 10
but it failed with this error:
./bin/run-example: line 26: /home/dasprasun/opt/spark/bin/load-spark-env.sh: No such file or directory
Failed to find Spark examples assembly in /home/dasprasun/opt/spark/lib or /home/dasprasun/opt/spark/examples/target
You need to build Spark before running this program
After completing all the steps, I ran the following command in IPython:
In [1]: from pyspark import SparkContext
It gave the following error:
ImportError                               Traceback (most recent call last)
<ipython-input-1-47c4965c5f0e> in <module>()
----> 1 from pyspark import SparkContext

ImportError: No module named pyspark
I do not understand what is happening. Please help me to figure this out.
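For completeness, these are the environment variables I added to my ~/.bashrc following the guide (the paths reflect my local layout; I assumed the py4j zip name matches whatever ships under $SPARK_HOME/python/lib):

```shell
# Root of my Spark installation (local path)
export SPARK_HOME="$HOME/opt/spark"

# Put Spark's command-line tools on the PATH
export PATH="$SPARK_HOME/bin:$PATH"

# Make the pyspark package and the bundled py4j importable from Python/IPython
export PYTHONPATH="$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.9-src.zip:$PYTHONPATH"
```

Is there something wrong or missing in this setup?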