
I am using Databricks Workflows. I have a job that consists of three tasks:

  1. Extract - references a regular Databricks notebook
  2. DLT - references a DLT pipeline
  3. PostExec - references a regular Databricks notebook

I pass a parameter into the first task using the task's parameters option. In the notebook, I register the parameter with the following code so that I can reference it in the following tasks: dbutils.jobs.taskValues.set("parameter_1", parameter_value)
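For context, a minimal sketch of what the Extract notebook might look like, assuming the job parameter is named parameter_1 and is read via a widget (the names are placeholders):

    # Extract notebook (sketch)
    # read the parameter that the job passes into this task
    parameter_value = dbutils.widgets.get("parameter_1")

    # persist it as a task value so downstream tasks in the same job run can read it
    dbutils.jobs.taskValues.set(key="parameter_1", value=parameter_value)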

I can then reference this parameter in the tasks that also reference notebooks with the following code: parameter_1 = dbutils.jobs.taskValues.get(taskKey="Extract", key="parameter_1")
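As a sketch, the downstream notebook task (e.g. PostExec) can also supply a debugValue so the cell still runs when executed interactively outside a job; the default value here is a placeholder:

    # PostExec notebook (sketch)
    parameter_1 = dbutils.jobs.taskValues.get(
        taskKey="Extract",
        key="parameter_1",
        debugValue="default-for-interactive-runs",  # only used outside a job run
    )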

But I cannot reference this value in the tasks that refer to DLT pipelines. When I run the above code, it produces the following error: TypeError: Must pass debugValue when calling get outside of a job context. debugValue cannot be None.

I know DLT uses configuration, but is it possible to persist a parameter in the first step to be passed programmatically to the DLT step?


1 Answer


Task values aren't supported for DLT pipelines yet. You can pass only configuration parameters that are defined in the pipeline's settings or Spark conf.
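For illustration, a minimal sketch of the configuration-based approach described above, assuming a key named my_pipeline.parameter_1 is defined in the DLT pipeline's configuration (the key, table, and column names are placeholders):

    # DLT pipeline notebook (sketch)
    import dlt
    from pyspark.sql import functions as F

    # read a value from the pipeline's configuration, e.g. the pipeline settings
    # contain {"my_pipeline.parameter_1": "some_value"}
    parameter_1 = spark.conf.get("my_pipeline.parameter_1")

    @dlt.table
    def filtered_events():
        # use the configuration value inside the pipeline, e.g. as a filter
        return spark.read.table("raw_events").where(F.col("category") == parameter_1)

Note that this value lives in the pipeline settings, so it is fixed per pipeline rather than set programmatically from the output of an upstream task; that is the limitation referred to above.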

– Alex Ott