
I am using sparkmagic to connect Jupyter notebooks to a remote spark cluster via Livy.

As it stands, I need to execute a notebook cell to bring up the %manage_spark user-interface widget, then manually select the language and click "Create Session" to establish the Spark context for the notebook.

[Screenshot: the %manage_spark session-creation widget]

Is there a way to create the session automatically when the cell is executed, instead of having to manually select the language and click "Create Session"?

Can one pass arguments to the sparkmagic somehow, for instance?

I'm imagining being able to do Kernel->Restart & Run All, and have the notebook execute completely.

conner.xyz

1 Answer


Kind of late, but here is how I pass the parameters for the Spark session in a cell:

%spark add -s session -l python -t None -u http://localhost:8998
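For context (based on my reading of the sparkmagic documentation, not stated in the answer): in a plain IPython kernel you first load the magics extension, and then `%spark add` creates the Livy session non-interactively, which makes Kernel -> Restart & Run All work end to end. The session name and endpoint URL below are placeholders; point `-u` at your own Livy server.

```
%load_ext sparkmagic.magics

# -s: session name, -l: language (python, scala, or r),
# -t: auth type (None, Basic_Access, or Kerberos),
# -u: URL of the Livy endpoint
%spark add -s mysession -l python -t None -u http://localhost:8998
```

After this cell runs, subsequent `%%spark` cells execute against the created session, so no interaction with the %manage_spark widget is needed.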
Ahmed
  • Been so long I don't have any way to test this in a reasonable amount of time. Looks like this could work. Figuring out how to point to Livy might take a minute, but this could work. – conner.xyz Jan 11 '21 at 22:15
  • Is there any solution on the notebook side? I'm trying to do the same thing, but by modifying jupyter_notebook_config.py with `c.NotebookApp.nbserver_extensions = {"sparkmagic.magics": True}`, but it doesn't work. I hope someone can continue exploring this issue. – Alain ux Jan 31 '23 at 10:14