
Say I have a simple notebook orchestration:

Notebook A -> Notebook B

Notebook A finishes first, then triggers Notebook B.

I am wondering if there is an out-of-the-box method to allow Notebook A to terminate the entire job (without running Notebook B).

Putting `dbutils.notebook.exit` in Notebook A exits Notebook A, but Notebook B can still run.

P.S. Passing parameters between notebooks to terminate all the downstream notebooks one by one is an alternative solution, but not ideal. I want a solution that kills the job at the root notebook. And I do not want to raise exceptions in Notebook A and kill the job by running into an "error" status.
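For reference, the parameter-passing alternative mentioned above can be sketched in plain Python. This is a local simulation, not the Databricks API: `notebook_a`, `notebook_b`, and the `"SKIP_DOWNSTREAM"` status string are hypothetical stand-ins for what `dbutils.notebook.run` / `dbutils.notebook.exit` would carry in a real job.

```python
# Local simulation of the "pass a flag downstream" pattern (not the real API).
# All names here are hypothetical stand-ins.

def notebook_a():
    """Simulates Notebook A; returns the value it would pass via dbutils.notebook.exit."""
    data_is_ready = False  # e.g. the result of a validation check
    if not data_is_ready:
        return "SKIP_DOWNSTREAM"  # stand-in for dbutils.notebook.exit("SKIP_DOWNSTREAM")
    return "OK"

def notebook_b():
    """Simulates Notebook B's work."""
    return "B ran"

def orchestrate():
    """Simulates a root notebook calling A, then conditionally B."""
    status = notebook_a()  # stand-in for dbutils.notebook.run("NotebookA", 600)
    if status == "SKIP_DOWNSTREAM":
        return "job ended early"
    return notebook_b()

print(orchestrate())  # -> job ended early
```

The drawback the question points out applies here too: every downstream notebook (or the orchestrator above each one) has to check the flag, so the job is skipped step by step rather than killed at the root.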

QPeiran
  • It might not be possible to stop the job without raising exceptions in Databricks workflows. But you can use Azure Data Factory with the help of `dbutils.notebook.exit` to execute Notebook B only if a condition is satisfied. I can provide a solution for that if necessary. – Saideep Arikontham Aug 11 '22 at 05:34
  • @SaideepArikontham Thank you very much for your reply but I am still looking for a solution within Databricks. – QPeiran Aug 11 '22 at 08:05
  • You can now pass parameters between tasks. https://docs.databricks.com/workflows/jobs/share-task-context.html – Michael Gardner Mar 30 '23 at 15:20

1 Answer


As of now there is no way to stop at the current cell after a successful run in a "run all" scenario. However, you can always put an error in the cell where you want "run all" to stop, since errors halt "run all". You can also run all cells above or below a given cell. For example, if you want to run all cells up to cell X, go to cell X+1 and choose "Run all above".
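As a sketch of the error approach this answer describes (the guard function, its condition, and the message are all hypothetical; raising any exception in a cell is what actually stops "run all"):

```python
# Hypothetical guard cell: raising an exception stops "run all" at this cell,
# so later cells never execute. The check itself is a placeholder.

def guard(should_continue: bool) -> None:
    """Raise to halt downstream cells; no-op when the check passes."""
    if not should_continue:
        raise RuntimeError("Stopping 'run all' here: precondition failed")

guard(True)  # passes silently; guard(False) would halt execution at this cell
```

Note this is exactly what the asker wanted to avoid: the job (or notebook run) ends in an "error" status rather than a clean stop.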