I need to run a few notebooks sequentially in Databricks, pass the result as a DataFrame from one notebook to the next, and create a table from the final result in the last notebook using PySpark.
Is this feasible?
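Roughly what I have in mind for the orchestration (just a sketch; the notebook paths and timeout are placeholders, and I'm assuming a driver notebook that calls the others with `dbutils.notebook.run`):

```python
# Driver notebook (sketch): run the downstream notebooks in order on the same cluster.
# Paths and the 600-second timeout are placeholders, not my real setup.
dbutils.notebook.run("/Shared/step1_build_dataframe", 600)
dbutils.notebook.run("/Shared/step2_transform", 600)
dbutils.notebook.run("/Shared/step3_write_table", 600)
```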
I can't figure out how to pass the DataFrames between the notebooks.
I tried creating a global temp view, but I'm not sure whether the data in a global temp view can be updated from a later notebook.
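This is roughly what I tried (a sketch only; the view, column, and table names are made up, and each block would live in a separate notebook on the same cluster):

```python
# Notebook 1 (sketch): register the result as a global temp view so a later
# notebook on the same cluster can read it.
df = spark.range(100).withColumnRenamed("id", "value")   # placeholder for my real logic
df.createOrReplaceGlobalTempView("step1_result")

# Notebook 2 (sketch): read the view back, transform it, and register a new view.
df2 = spark.table("global_temp.step1_result")
df2 = df2.filter("value > 10")                           # placeholder transformation
df2.createOrReplaceGlobalTempView("step2_result")

# Last notebook (sketch): persist the final result as a table.
spark.table("global_temp.step2_result").write.mode("overwrite").saveAsTable("final_table")
```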