Currently, global temporary views are not shared across different Spark sessions or notebooks. Even when two notebooks attach to the same Spark pool (for example, by limiting the number of executors each notebook uses), each notebook runs in its own Spark session, so a global temporary view created in one notebook is not visible from the other.
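As a quick illustration, here is a minimal sketch (the DataFrame and the view name sample_view are made up for the example): a global temporary view is registered in the reserved global_temp database and lives only as long as the Spark application behind the session that created it.

df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])
df.createOrReplaceGlobalTempView("sample_view")

# Visible within this notebook's Spark session:
spark.sql("SELECT * FROM global_temp.sample_view").show()
# A second notebook on the same pool gets its own session/application,
# so the same query there fails with a "table or view not found" error.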
The workaround is to call one notebook from another, using the command below:
mssparkutils.notebook.run("notebook path", <timeoutSeconds>, <parameters>)
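For example (the notebook path, timeout, and parameter name here are placeholders, not values from the original scenario):

from notebookutils import mssparkutils

# Run "/Notebook 1" with a 90-second timeout and one parameter;
# the call blocks until the callee finishes and returns its exit value.
result = mssparkutils.notebook.run("/Notebook 1", 90, {"run_date": "2024-01-01"})
print(result)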
Global temporary views are scoped to the Spark session they are created in. Because a notebook invoked with mssparkutils.notebook.run executes in the calling notebook's Spark session, both notebooks share the same session, and the caller can access global temporary views created in the callee.
Code in Notebook2:
from notebookutils import mssparkutils

# Run Notebook1 and capture the view name it returns via notebook.exit
returned_GBview = mssparkutils.notebook.run("/Notebook 1")

# Global temporary views live in the reserved global_temp database
df2 = spark.sql(f"SELECT * FROM global_temp.{returned_GBview}")
df2.show()
Running Notebook2 displays the sample data that was read in Notebook1, accessed through the global temporary view.

Notebook1 code:
from notebookutils import mssparkutils

# Register df as a global temporary view and return its name to the caller
df.createOrReplaceGlobalTempView("globleview")
mssparkutils.notebook.exit("globleview")
Notebook1 passes the name of the global temporary view back through the exit function; the caller then uses that name to query the view and obtain the DataFrame.
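Putting the two pieces together, you could even let the caller choose the view name. This is only a sketch: the parameter name view_name, the timeout of 300 seconds, and "sales_view" are illustrative, and it assumes view_name is declared in a Notebook1 cell marked as a parameters cell so the passed value overrides it.

# Notebook1 (callee)
df.createOrReplaceGlobalTempView(view_name)
mssparkutils.notebook.exit(view_name)

# Notebook2 (caller)
from notebookutils import mssparkutils
name = mssparkutils.notebook.run("/Notebook 1", 300, {"view_name": "sales_view"})
spark.sql(f"SELECT * FROM global_temp.{name}").show()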
