
I am using Databricks and running multiple notebooks on the same cluster. Each notebook represents a task that requires its own specific Spark settings.

Now, when I change the Spark settings in one notebook, the change seems to be applied at the cluster level, and the same change shows up in the Spark session of the other notebook.

But this is not what I want; I want to be able to provide different Spark settings for each notebook on the same cluster.

I have tried setting `spark.databricks.session.share` to `false`, but the same issue persists.
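Roughly what I tried (a minimal sketch; I set the flag from within the notebook via `spark.conf.set`, and the shuffle-partitions value below is just an example of a setting that still leaked to the other notebook):

```python
# `spark` is the SparkSession that Databricks provides in each notebook.

# Attempted to disable session sharing from within the notebook
# (this did not stop the change from appearing in the other notebook).
spark.conf.set("spark.databricks.session.share", "false")

# Example of a setting that still showed up in the other notebook's session.
spark.conf.set("spark.sql.shuffle.partitions", "64")
```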

Is this possible in Databricks?

  • Are you calling one notebook from another? Session isolation should be enabled by default, but if you call another notebook it will share the same session as its caller. – partlov Jun 28 '23 at 11:44
  • No, I have 2 independent notebooks attached to the same cluster. – Naveen Balachandran Jun 29 '23 at 10:47
  • So, as far as I know, all these sessions share the same `SparkContext`. You are able to have per-session configuration of `spark.sql` configs (see the sketch after these comments), but I'm not sure which exact config you want to change. – partlov Jun 29 '23 at 12:10
  • I want the number of executors, the number of partitions (shuffle partitions), and the executor memory configured differently for the 2 notebooks, which are running at the same time. – Naveen Balachandran Jun 30 '23 at 01:29
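A minimal sketch of the distinction raised in the comments (assuming the default Databricks session isolation): `spark.sql.*` settings are scoped to a notebook's own session and can be set per notebook with `spark.conf.set`, whereas executor memory and executor count belong to the shared `SparkContext` and are fixed in the cluster's Spark config. The values used here are placeholders, not recommendations.

```python
# `spark` is the SparkSession that Databricks provides in each notebook.

# SQL-level settings are scoped to the notebook's session, so each notebook
# attached to the same cluster can use its own value:
spark.conf.set("spark.sql.shuffle.partitions", "64")    # e.g. in notebook A
# spark.conf.set("spark.sql.shuffle.partitions", "512") # e.g. in notebook B

# Executor-level settings (executor memory, number of executors) are bound to
# the shared SparkContext and are set in the cluster's Spark config at cluster
# creation, e.g.:
#   spark.executor.memory 8g
#   spark.executor.instances 4
# As far as I understand, they cannot be changed per notebook at runtime on a
# shared cluster.
```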
