I believe Databricks Jobs creates a separate instance, runs the workflow, and then discards the instance.
Unfortunately, this assumption appears to be correct: Databricks support gave me the following explanation (Feb 2023):
> For now, sharing a dashboard that can be regularly updated can be done only through its job run result. [...] Now, the challenge you have is to share a version of the notebook without any code and with interactive visualizations.
>
> If that is the case, you can create a dashboard based on the visualisations in the notebook, and in the job result page, in the output toggle, select [...] dashboard or results only options. You will get a link to that specific dashboard like /run/latestSuccess/dashboard/{dashboard_id}.
For my use case (sharing regularly updated dashboards with a non-coder audience), I've set up scheduled runs of the notebooks, added the users to the job run notifications (successful runs only), and supplied them with an illustrated guide on how to access the dashboard view from that email (subject like `[long number] Success--view run [number] of [name of notebook]`):
- Click the link "View run in Databricks".
- In the "Output" dropdown menu, select "Dashboard: <name of your dashboard>".
- Click the right arrow on the "Task run details" sidebar to hide it.
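If you'd rather not rely on the notification email, the link lookup can be scripted. Below is a minimal sketch that lists completed runs of a job via the Jobs API 2.1 `runs/list` endpoint and picks the newest successful run's page URL (from which the "Output" toggle leads to the dashboard, as in the steps above). The `DATABRICKS_HOST`/`DATABRICKS_TOKEN` environment variables and `job_id=123` are placeholders for your workspace; verify the behavior against your own jobs.

```python
import json
import os
import urllib.request
from typing import Optional


def latest_success_run_url(runs: list) -> Optional[str]:
    """Return the run page URL of the newest successful run, or None.

    `runs` is the "runs" array from a Jobs API 2.1 runs/list response,
    which is ordered newest-first.
    """
    for run in runs:
        if run.get("state", {}).get("result_state") == "SUCCESS":
            return run.get("run_page_url")
    return None


def fetch_completed_runs(host: str, token: str, job_id: int) -> list:
    """List completed runs of one job via the Jobs API 2.1."""
    url = f"{host}/api/2.1/jobs/runs/list?job_id={job_id}&completed_only=true"
    req = urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp).get("runs", [])


if __name__ == "__main__":
    # Placeholder credentials; use a personal access token in practice.
    host = os.environ["DATABRICKS_HOST"]
    token = os.environ["DATABRICKS_TOKEN"]
    print(latest_success_run_url(fetch_completed_runs(host, token, job_id=123)))
```

Whether `run_page_url` can be extended directly with the `/dashboard/{dashboard_id}` suffix quoted by support is an assumption I haven't verified; the run page itself, at least, always exposes the dashboard through the Output toggle.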
If you find a way to set up a dashboard with a single URL that is updated periodically, please let me know!