
When I run a Spark notebook in Azure Synapse Analytics (in debug mode), it doesn't seem to shut down as expected.

In the last cell I call: mssparkutils.notebook.exit("exiting notebook")

But then when I fire off another notebook (again in debug mode, same pool), I get this error:

AVAILABLE_COMPUTE_CAPACITY_EXCEEDED: Livy session has failed. Session state: Error. Error code: AVAILABLE_COMPUTE_CAPACITY_EXCEEDED. Your job requested 12 vcores. However, the pool only has 0 vcores available out of quota of 12 vcores. Try ending the running job(s) in the pool, reducing the numbers of vcores requested, increasing the pool maximum size or using another pool. Source: User.

So I go to Monitor => Apache Spark applications and I see the first notebook I ran still in a "Running" status, and I can stop it manually.

How do I automatically stop the notebook / Apache Spark application? I thought that was what the notebook.exit() call did, but apparently not...

  • Are you running the notebook inside a pipeline or in debug mode (inside a notebook)? – Volume999 Mar 16 '22 at 19:38
  • In debug mode, I think. Develop => + Notebook, then writing code. Maybe this is working as intended and I can only develop one notebook at a time? (Or allocate more cores, or specify "don't use all my cores" with a %%configure {} cell at the beginning?) I don't know what I'm doing, thanks for the help! – Dudeman3000 Mar 16 '22 at 21:57

1 Answer


In debug mode, the pool's vcores are allocated to the notebook's session for the entire duration of the debug session (that is, until one hour of inactivity or until you manually terminate it).

Thus, you have two options: work on one notebook at a time, ending the debug session before starting another,

OR

Configure the session to request fewer executors so that the Spark pool can provision multiple debug sessions at the same time (you might need to increase the pool's maximum size); see the sketch below.
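For example, a session-configuration cell run at the very top of the notebook might look like the following. This is a minimal sketch using the Synapse %%configure magic, which accepts Livy-style session settings as JSON; the values shown assume a Small node size (4 vcores / 28 GB) and are placeholders to tune for your pool:

    %%configure -f
    {
        "driverMemory": "28g",
        "driverCores": 4,
        "executorMemory": "28g",
        "executorCores": 4,
        "numExecutors": 1
    }

This session would request 8 vcores (one 4-core driver plus one 4-core executor) instead of 12. The -f flag forces the session to restart with the new settings if one is already active. Note that under the 12-vcore quota from the question, two such sessions still won't fit side by side, which is why increasing the pool's maximum size may also be necessary.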

  • That makes sense, but is there any way to stop the notebook, or do I have to actually close it? I would like to be able to see the output so far... – ntg Jun 16 '22 at 11:50
  • And I am checking performance, so ideally I would like to be running on the whole pool... :S – ntg Jun 16 '22 at 11:57