
Inside ADF, I'm trying to retrieve the ready-made contents of a GraphQL API query (for a Web activity block) stored in a JSON file somewhere in blob storage. Because of speed requirements, we can't afford to spin up Databricks every single time.

What can be done to get the content (not the metadata) of a JSON file and store it in an ADF variable that would parameterize further pipeline blocks? The path to the file is known and fixed, and the file is accessible via a linked service.

1 Answer


I would go with creating a metadata Azure SQL database (the Basic tier costs only about 5 USD per month). It can be connected to Azure Data Factory via a private link. This is the simplest and fastest way: you just save the data there and later fill dataflow (etc.) parameters with results from that database.
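As a sketch of that approach (the dataset name `AzureSqlMetadata`, the table `pipeline_queries`, its columns, and the variable `graphqlQuery` are all hypothetical names, not from the original answer), a Lookup activity can read the stored query text from the SQL database, and a Set Variable activity can pass it on to later blocks:

```json
{
    "activities": [
        {
            "name": "LookupQuery",
            "type": "Lookup",
            "typeProperties": {
                "source": {
                    "type": "AzureSqlSource",
                    "sqlReaderQuery": "SELECT query_body FROM pipeline_queries WHERE name = 'my_graphql_query'"
                },
                "dataset": {
                    "referenceName": "AzureSqlMetadata",
                    "type": "DatasetReference"
                },
                "firstRowOnly": true
            }
        },
        {
            "name": "SetGraphqlQuery",
            "type": "SetVariable",
            "dependsOn": [
                { "activity": "LookupQuery", "dependencyConditions": [ "Succeeded" ] }
            ],
            "typeProperties": {
                "variableName": "graphqlQuery",
                "value": "@activity('LookupQuery').output.firstRow.query_body"
            }
        }
    ]
}
```

The Web activity's body can then reference `@variables('graphqlQuery')` in its expression.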

Hubert Dudek