
I need to execute a DLT pipeline from a Job, and I would like to know if there is any way of passing a dynamic input parameter to apply a filter in Databricks. I know about the configuration settings on the pipeline that we can read in the DLT notebook, but it seems we can only assign values to them when creating (or editing) the pipeline. What I would like is to specify a value each time the pipeline is executed. Is that possible?

I tried doing this through a widget, but widgets require entering values manually; I need to execute the DLT pipeline from a Job and pass the value through it.
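For context, the pipeline-settings approach mentioned above normally looks like the sketch below: a key/value pair set in the pipeline's Configuration section is read in the notebook with `spark.conf.get`. The key name `input.filter_value` is a made-up example, and since `spark` only exists inside a Databricks runtime, a plain-Python stand-in demonstrates the same lookup-with-default pattern:

```python
# Inside a DLT notebook, a value set under the pipeline's "Configuration"
# section (hypothetical key "input.filter_value") is typically read as:
#
#     filter_value = spark.conf.get("input.filter_value", "ALL")
#
# A plain-dict stand-in for the same lookup-with-default pattern,
# runnable outside Databricks:
def read_param(configuration: dict, key: str, default: str) -> str:
    """Mimic spark.conf.get(key, default) against a plain dict."""
    return configuration.get(key, default)

# Values fixed when the pipeline was created -- the limitation in question.
pipeline_conf = {"input.filter_value": "EU"}

print(read_param(pipeline_conf, "input.filter_value", "ALL"))    # EU
print(read_param(pipeline_conf, "input.table_name", "fallback"))  # fallback
```

The limitation the question describes is exactly that `pipeline_conf` above is fixed in the pipeline definition, not supplied per run.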

  • Why do you need to pass dynamic parameters to the same pipeline? Do you understand that the results of the computation won't be deterministic? – Alex Ott May 30 '23 at 10:55
  • We want to reuse the code for different table column values. Based on the input parameter passed to the function, the values will be calculated in Databricks. – abhishek kumar May 30 '23 at 11:15
  • The code reuse is a bit different from reusing the same pipeline. – Alex Ott May 30 '23 at 11:45
  • I am passing the transactional tables and their required column names through widgets as input parameters to a function, and using those parameter values dynamically to apply the logic. With widgets we have to enter the input values manually in Databricks, so I wanted to automate this and pass the values through a DLT input parameter, as is done in ADF. – abhishek kumar May 30 '23 at 12:14
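One workaround worth sketching (an assumption about the workflow, not a confirmed answer to the question): since configuration key/value pairs are part of the pipeline spec, a Job task could rewrite the spec through the Databricks Pipelines REST API (`GET`/`PUT /api/2.0/pipelines/{pipeline_id}`) with run-specific values before triggering an update (`POST /api/2.0/pipelines/{pipeline_id}/updates`). The merge step itself is plain Python; the HTTP calls are left as comments because the host, token, and pipeline id are placeholders:

```python
def merge_run_params(pipeline_spec: dict, run_params: dict) -> dict:
    """Return a copy of a pipeline spec with run-specific configuration
    values merged in (existing keys are overridden)."""
    spec = dict(pipeline_spec)
    spec["configuration"] = {**spec.get("configuration", {}), **run_params}
    return spec

# Sketch of the surrounding REST calls (placeholders, not executed here):
#   spec = GET  {host}/api/2.0/pipelines/{pipeline_id}
#   PUT  {host}/api/2.0/pipelines/{pipeline_id}  body=merge_run_params(spec, params)
#   POST {host}/api/2.0/pipelines/{pipeline_id}/updates   -> starts a run

spec = {"name": "my_pipeline", "configuration": {"input.filter_value": "ALL"}}
new_spec = merge_run_params(spec, {"input.filter_value": "EU"})
print(new_spec["configuration"]["input.filter_value"])  # EU
```

Note Alex Ott's caveat above still applies: mutating the configuration between runs means two runs of the same pipeline are no longer guaranteed to produce the same tables.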

0 Answers