
I have around 9 source systems (on-premises / cloud-based).

  1. Every source system has to land in the Raw zone of the Data Lake for both the initial and incremental loads.
  2. A few will then go through the Standardisation zone and on to Staging (SQL tables), which is ultimately consumed by Raw Vault (Data Vault 2.0) processing.
  3. The rest will go straight from the Raw zone to Staging (SQL tables).

We have all the pipelines built to move data from source => Raw zone => Standardisation zone => Staging.

Now I am looking into an ADF orchestration framework, where I should be able to dynamically execute a pipeline based on metadata / the next valid pipeline to run.

With the Execute Pipeline activity there is no option to specify dynamically which pipeline to execute.

What is the best option: a Logic App or an Azure Function?

I want to pass the Pipeline name dynamically.
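
A minimal sketch of the Azure Function option, in Python: the Data Factory REST API's createRun endpoint takes the pipeline name in the URL, so a single function can start any pipeline in the factory. The subscription, resource group, and factory names below are placeholders.

    import requests
    from azure.identity import DefaultAzureCredential

    # Placeholders - substitute your own values.
    SUBSCRIPTION_ID = "<subscription-id>"
    RESOURCE_GROUP = "<resource-group>"
    FACTORY_NAME = "<data-factory-name>"

    def run_pipeline(pipeline_name, parameters=None):
        """Start an ADF pipeline run by name and return the run ID."""
        # Acquire an AAD token for the Azure management plane.
        credential = DefaultAzureCredential()
        token = credential.get_token("https://management.azure.com/.default").token

        # createRun takes the pipeline name in the URL, so the caller
        # (e.g. a metadata-driven orchestrator) can pass any name at runtime.
        url = (
            f"https://management.azure.com/subscriptions/{SUBSCRIPTION_ID}"
            f"/resourceGroups/{RESOURCE_GROUP}"
            f"/providers/Microsoft.DataFactory/factories/{FACTORY_NAME}"
            f"/pipelines/{pipeline_name}/createRun?api-version=2018-06-01"
        )
        response = requests.post(
            url,
            headers={"Authorization": f"Bearer {token}"},
            json=parameters or {},  # pipeline parameters go in the body
        )
        response.raise_for_status()
        return response.json()["runId"]

Note that an ADF Web activity authenticating with a managed identity can call this same createRun endpoint directly, so a Function or Logic App is only strictly needed if you want logic that ADF expressions can't express.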

– Sreedhar
    If you need this to be truly variable, then you'll need to build a queue/management system. Here is another answer where I outline this concept: https://stackoverflow.com/questions/61564943/stop-running-azure-data-factory-pipeline-when-it-is-still-running/61597689#61597689 – Joel Cochran May 11 '20 at 14:41
  • @JoelCochran thanks, will look into it. – Sreedhar May 12 '20 at 10:27
  • Here is another of my answers with some more information: https://stackoverflow.com/questions/59085000/method-to-put-alerts-on-long-running-azure-data-factory-pipeline/59290603#59290603 – Joel Cochran May 13 '20 at 15:26

1 Answer


If you don't have too many pipelines to direct to, you can use the Switch activity in a parent pipeline to run different pipelines (via Execute Pipeline) for different cases based on your metadata, as in the sketch below.
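
A minimal sketch of such a Switch in ADF's pipeline-definition JSON, assuming the parent pipeline has a NextPipelineName parameter populated from the metadata; the case values and child pipeline names (PL_Standardisation, PL_RawToStaging) are placeholders:

    {
        "name": "SwitchOnMetadata",
        "type": "Switch",
        "typeProperties": {
            "on": {
                "value": "@pipeline().parameters.NextPipelineName",
                "type": "Expression"
            },
            "cases": [
                {
                    "value": "PL_Standardisation",
                    "activities": [
                        {
                            "name": "Run Standardisation",
                            "type": "ExecutePipeline",
                            "typeProperties": {
                                "pipeline": {
                                    "referenceName": "PL_Standardisation",
                                    "type": "PipelineReference"
                                },
                                "waitOnCompletion": true
                            }
                        }
                    ]
                },
                {
                    "value": "PL_RawToStaging",
                    "activities": [
                        {
                            "name": "Run Raw To Staging",
                            "type": "ExecutePipeline",
                            "typeProperties": {
                                "pipeline": {
                                    "referenceName": "PL_RawToStaging",
                                    "type": "PipelineReference"
                                },
                                "waitOnCompletion": true
                            }
                        }
                    ]
                }
            ]
        }
    }

The limitation is that every target pipeline still needs its own hard-coded case, which is why this pattern only fits when the set of pipelines is small and stable.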

– Cedersved