Dear fellow members of the community. I have Spark 3.4.0 installed and set up. To run a PySpark shell I set the required environment variables, and that works fine. I have a Python microservice which receives Python code that should be run against PySpark. How can I start a PySpark session inside my running Python microservice and interact with it?
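A minimal sketch of what I mean, using only the standard library: the service receives a code string and executes it in a fresh interpreter, capturing the output. (In the real setup, the child interpreter would inherit the same environment variables that make `pyspark` importable, so the submitted code could create its own `SparkSession`; the `run_submitted_code` name is just mine for illustration.)

```python
import subprocess
import sys

def run_submitted_code(code: str, timeout: int = 60) -> tuple[str, str]:
    """Execute a submitted code string in a fresh Python interpreter.

    Returns (stdout, stderr). In a real deployment the child process
    would inherit the environment that makes `pyspark` importable, so
    the submitted code could build a SparkSession itself.
    """
    result = subprocess.run(
        [sys.executable, "-c", code],   # spawn a clean interpreter
        capture_output=True,            # collect stdout and stderr
        text=True,
        timeout=timeout,                # guard against runaway submissions
    )
    return result.stdout, result.stderr

out, err = run_submitted_code("print(21 * 2)")
print(out.strip())
```

The drawback of this approach is that each submission runs in its own process, so no state (and no SparkSession) is shared between calls, which is why I am asking about a persistent session below.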

Even if we take Spark out of the picture: how can I initialize a plain Python kernel? I have read about Jupyter kernels but can't get a clear picture of how to create one.
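To make concrete what I mean by a "kernel": a session that keeps one namespace alive across submissions, the way a Jupyter kernel does. A rough stdlib-only sketch of that idea (the `MiniKernel` class is just my illustration, not a real Jupyter kernel):

```python
import code
import io
from contextlib import redirect_stdout

class MiniKernel:
    """A kernel-like session: one namespace persists across submissions,
    so variables defined by one call are visible to the next."""

    def __init__(self):
        self.namespace = {}
        # InteractiveInterpreter compiles and runs source in the
        # given namespace, much like a REPL would.
        self.interp = code.InteractiveInterpreter(self.namespace)

    def execute(self, source: str) -> str:
        buf = io.StringIO()
        with redirect_stdout(buf):      # capture whatever the code prints
            self.interp.runsource(source, symbol="exec")
        return buf.getvalue()

k = MiniKernel()
k.execute("x = 10")                     # define state in the session
print(k.execute("print(x + 5)"))        # later call still sees x
```

A real Jupyter kernel adds ZeroMQ messaging and a wire protocol on top of essentially this loop, which is what the kernel-development guide below describes.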

I tried following this guide: https://ipython.org/ipython-doc/3/development/kernels.html
