I am using Databricks Workflows. I have a job that consists of three tasks:
- Extract - references a regular Databricks notebook
- DLT - references a DLT pipeline
- PostExec - references a regular Databricks notebook
I pass a parameter into the first task using the task's parameters option. In the notebook, I register the parameter with the following code so that I can reference it in later tasks: dbutils.jobs.taskValues.set("parameter_1", parameter_value)
I can then read this value in the other tasks that reference notebooks with the following code: parameter_1 = dbutils.jobs.taskValues.get(taskKey="Extract", key="parameter_1")
But I cannot reference this value in the task that runs the DLT pipeline. When I run the same get call there, it produces the following error: TypeError: Must pass debugValue when calling get outside of a job context. debugValue cannot be None.
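To make the failure mode concrete, here is a runnable sketch of the set/get contract I am relying on. The real dbutils.jobs.taskValues object only exists inside Databricks, so this uses a dict-backed stand-in of my own (TaskValuesStub is not a real API) that mimics the behavior I see, including the debugValue error when get is called outside a job run:

```python
# Dict-backed stand-in for dbutils.jobs.taskValues, for illustration only;
# the real object is provided by Databricks inside a job run.
class TaskValuesStub:
    def __init__(self, in_job_context=True):
        self._store = {}
        self._in_job_context = in_job_context

    def set(self, key, value):
        # In a real job run, values are scoped to the task that sets them.
        self._store[key] = value

    def get(self, taskKey, key, debugValue=None):
        if not self._in_job_context:
            # Mirrors the error from the question: outside a job context,
            # get() requires a non-None debugValue and returns it.
            if debugValue is None:
                raise TypeError(
                    "Must pass debugValue when calling get outside of a job "
                    "context. debugValue cannot be None."
                )
            return debugValue
        return self._store[key]

# Inside a job: the Extract task sets the value, PostExec reads it back.
task_values = TaskValuesStub(in_job_context=True)
task_values.set("parameter_1", "2024-01-01")
print(task_values.get(taskKey="Extract", key="parameter_1"))  # 2024-01-01
```

The notebook tasks behave like the in_job_context=True path; the DLT task behaves like the out-of-context path, which is where the TypeError comes from.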
I know DLT pipelines take parameters through their configuration, but is it possible to persist a parameter set in the first task and pass it programmatically to the DLT task?
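For reference, this is how I would expect the DLT notebook to consume the value if it arrived through the pipeline's configuration. In a real pipeline this would be spark.conf.get("parameter_1", ...); the dict below is a stand-in so the sketch runs outside Databricks, and the key name "parameter_1" is my assumption:

```python
# Stand-in for the DLT pipeline's configuration settings; in a real
# pipeline the lookup would be spark.conf.get("parameter_1", "fallback").
pipeline_conf = {"parameter_1": "value-from-extract-task"}

def conf_get(key, default=None):
    # Mirrors the spark.conf.get(key, default) lookup pattern.
    return pipeline_conf.get(key, default)

parameter_1 = conf_get("parameter_1", "fallback")
print(parameter_1)  # value-from-extract-task
```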