I'm currently working on a project where I have two distinct jobs on Databricks. The second job is dependent on the results of the first one.
Is there a way to automatically trigger the second job once the first one has completed successfully? Ideally, I would like to accomplish this directly within Databricks, without an external scheduling or orchestration tool. Has anyone implemented this kind of setup, or can you confirm whether it's possible?
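For context, below is a rough sketch of the kind of external glue script I'm trying to avoid: it starts job 1 via the Jobs 2.1 REST API, polls the run until it reaches a terminal state, and only then triggers job 2. The host, token, and job IDs are placeholders, not my real setup.

```python
import time
import requests

# Placeholders -- substitute a real workspace URL, a personal access token, and job IDs.
HOST = "https://<workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"
FIRST_JOB_ID = 111
SECOND_JOB_ID = 222

HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def run_job(job_id):
    """Kick off a job run via the Jobs 2.1 REST API and return its run_id."""
    resp = requests.post(f"{HOST}/api/2.1/jobs/run-now",
                         headers=HEADERS, json={"job_id": job_id})
    resp.raise_for_status()
    return resp.json()["run_id"]

def wait_for_run(run_id, poll_seconds=60):
    """Poll a run until it reaches a terminal state, then return its result_state."""
    while True:
        resp = requests.get(f"{HOST}/api/2.1/jobs/runs/get",
                            headers=HEADERS, params={"run_id": run_id})
        resp.raise_for_status()
        state = resp.json()["state"]
        if state["life_cycle_state"] in ("TERMINATED", "SKIPPED", "INTERNAL_ERROR"):
            return state.get("result_state")
        time.sleep(poll_seconds)

# Run job 1, and only trigger job 2 if it finished successfully.
first_run = run_job(FIRST_JOB_ID)
if wait_for_run(first_run) == "SUCCESS":
    run_job(SECOND_JOB_ID)
```

I'd rather not run and maintain something like this outside the platform, which is why I'm asking whether Databricks can handle the dependency natively.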