
I can't find any way to start one Apache Spark session that is shared by all notebooks in a pipeline. Any ideas?

thebluephantom
Dev
  • If you just want to run another Jupyter notebook within a Jupyter notebook, you could use the `%run` magic. [ref](https://docs.qubole.com/en/latest/user-guide/notebooks-and-dashboards/notebooks/jupyter-notebooks/running-jupy-notebooks.html#running-a-jupyter-notebook-from-another-jupyter-notebook) – samkart Aug 23 '23 at 08:18
  • I use Synapse notebooks – Dev Aug 23 '23 at 08:26
  • @Dev, it's also valid for Synapse notebooks. See: https://stackoverflow.com/questions/68760592/synapse-notebook-reference-how-to-use-run – Ziya Mert Karakas Aug 24 '23 at 14:38

2 Answers


It is not possible to use one Spark session across multiple Synapse notebooks/users.

Synapse provides purpose-built engines for specific use cases. Spark for Synapse is designed as a job service, not a cluster model.

Use `%run` instead.

You can use the `%run` magic command to reference another notebook within the current notebook's context. All the variables defined in the referenced notebook are available in the current notebook. The `%run` magic command supports nested calls but not recursive calls; you will receive an exception if the statement depth is larger than five.
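As a minimal sketch of this pattern (the notebook name, variable name, and storage path below are hypothetical), one notebook defines the shared state and the others pull it in with `%run`:

```
# Notebook1: defines shared state.
# In Synapse, `spark` (the SparkSession) is pre-created by the service.
sales_df = spark.read.parquet("abfss://data@mystorage.dfs.core.windows.net/sales")
```

```
# Any other notebook in the pipeline: executes Notebook1 in this
# notebook's context, so `sales_df` (and anything else it defines)
# becomes available here.
%run Notebook1

display(sales_df)
```

Note that `%run` re-executes the referenced notebook's code inside the calling notebook's own session; it does not attach to a session started elsewhere.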

thebluephantom

The Spark pool cannot be shared across multiple notebooks, but a Spark session can be: create the Spark session in Notebook1 and use it in all the other notebooks with

%run NotebookName

then run the remaining code. Note: this works in Azure Synapse notebooks.