
Until recently, when I typed pyspark in my terminal,

it would print some startup information and then drop into the standard REPL prompt:

some information

>>>

But now it launches a Jupyter notebook automatically.

This happens with spark-3.0.0-preview2-bin-hadoop3.2.

I have used many versions of Spark.

Is this behavior due to an error in my configuration, or to a change in a newer Spark release?

Thanks for your help.

  • Does the environment variable PYSPARK_DRIVER_PYTHON=jupyter exist on your system? – E.ZY. Jul 28 '20 at 12:19
  • Yes, it does. How do I make pyspark start with `>>>`? Sorry for the inaccurate description. –  Jul 28 '20 at 12:44
  • Just unset that environment variable. I bet you set it somewhere in your bash or zsh config file; that's why. You can delete that line from your bash file, or just unset PYSPARK_DRIVER_PYTHON every time before you start Spark. – E.ZY. Jul 30 '20 at 07:07
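The fix suggested in the comments can be sketched in the shell like this (a minimal sketch, assuming a bash/zsh setup where PYSPARK_DRIVER_PYTHON was exported at some point, e.g. in ~/.bashrc or ~/.zshrc):

```shell
# Check whether the variable is set; if pyspark launches Jupyter,
# this will typically print "jupyter":
echo "${PYSPARK_DRIVER_PYTHON:-<not set>}"

# Unset it for the current session so `pyspark` falls back to the plain >>> REPL:
unset PYSPARK_DRIVER_PYTHON
unset PYSPARK_DRIVER_PYTHON_OPTS

# After unsetting, the variable is empty:
echo "${PYSPARK_DRIVER_PYTHON:-<not set>}"
```

To make the change permanent, remove (or comment out) lines like `export PYSPARK_DRIVER_PYTHON=jupyter` and `export PYSPARK_DRIVER_PYTHON_OPTS=notebook` from your shell startup file (~/.bashrc, ~/.zshrc, or similar), then open a new terminal.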
