I downloaded Anaconda3, Spark 1.6.1, Java, Git, etc.,
and set up my environment like this:
$ nano .bashrc
export PATH="/home/moon/anaconda3/bin:$PATH"
export SCALA_HOME=/usr/local/src/scala/scala-2.12.1
export PATH=$SCALA_HOME/bin:$PATH
export SPARK_PATH=~/spark-1.6.1-bin-hadoop2.6
export PATH=$SPARK_PATH/bin:$PATH
export ANACONDA_ROOT=/home/moon/anaconda3
export PYSPARK_PYTHON=$ANACONDA_ROOT/bin/python3
export PYSPARK_DRIVER_PYTHON="jupyter"
export PYSPARK_DRIVER_PYTHON_OPTS="notebook"
$ . .bashrc
$ cd spark-1.6.1-bin-hadoop2.6
$ bin/pyspark
Then I can connect to the Jupyter notebook (Python 3),
but I can't run Spark in it, e.g. sc or sc.textFile.
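For example, this is roughly what I try in the first notebook cell (the file name is just a placeholder); sc isn't available:

import sys
print(sys.executable)            # check which Python the notebook kernel is actually using
rdd = sc.textFile("README.md")   # placeholder file; this is where it fails, since sc does not exist
rdd.count()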
How can I integrate Spark and Jupyter?
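If the automatic setup can't work this way, would manually creating the context in a notebook started with plain "jupyter notebook" be the right approach? A rough sketch of what I have in mind (paths match my install; the py4j zip name is my guess for Spark 1.6.1):

import os, sys

# point the session at my Spark install and make its Python libs importable
spark_home = os.path.expanduser("~/spark-1.6.1-bin-hadoop2.6")
os.environ["SPARK_HOME"] = spark_home
sys.path.insert(0, os.path.join(spark_home, "python"))
sys.path.insert(0, os.path.join(spark_home, "python/lib/py4j-0.9-src.zip"))  # zip name may differ

# create the SparkContext by hand instead of relying on bin/pyspark
from pyspark import SparkConf, SparkContext
conf = SparkConf().setMaster("local[*]").setAppName("jupyter")
sc = SparkContext(conf=conf)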