Every time a document is placed in a certain bucket, I want to start a DAG workflow to analyse that document. I need to trigger the DAG using a Cloud Function with a Cloud Storage trigger and the Finalize/Create event type.
1 Answer
Update - as of September 2019, there now exists a Python example in the docs, and the snippets in the docs can be found on GitHub.
Take a look at this doc, which shows you how to trigger a DAG using Google Cloud Functions when an object is finalized/created in a bucket of your choosing.
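For context, the trigger itself is wired up at deploy time. A minimal sketch of the deploy command, assuming a function entry point named `trigger_dag` and a placeholder bucket name:

```shell
# Deploy a Python 3.7 Cloud Function that fires on every object
# finalized/created in the given bucket. The function name, bucket
# name, and entry point are placeholders.
gcloud functions deploy trigger_dag \
    --runtime python37 \
    --trigger-resource my-document-bucket \
    --trigger-event google.storage.object.finalize
```

`google.storage.object.finalize` is the event raised whenever a new object (or a new generation of an existing object) is written to the bucket, which matches the "Finalize and create" event type in the console.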
[edited to add Python specific details]
Since you need Python and this doc is for Node.js, you'll have to do the adaptation on your own for now. Calls to Airflow's API in Composer go through the Identity-Aware Proxy (see this diagram for more detail), so your POST request to trigger the DAG will also have to go through the Identity-Aware Proxy.
Luckily, there is a great Python example for making a POST request to IAP in the python-docs-samples repo. If you copy that script into your repo and get your client ID using the same steps as for Node.js (this script), it should trigger the DAG just as the Node.js version did.
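To sketch what that adaptation might look like in Python: the snippet below builds the URL of Airflow's experimental API and makes an IAP-authenticated POST to it. `CLIENT_ID`, `WEBSERVER_ID`, and `DAG_NAME` are placeholders you must fill in (the client ID comes from the steps linked above), `fetch_id_token` needs google-auth >= 1.15, and this is a sketch of the approach rather than the exact sample from the repo:

```python
# Sketch of a Python 3.7 Cloud Function that triggers a Composer DAG
# through the Identity-Aware Proxy. CLIENT_ID, WEBSERVER_ID and
# DAG_NAME are placeholders.

CLIENT_ID = "YOUR_CLIENT_ID.apps.googleusercontent.com"  # IAP OAuth client ID
WEBSERVER_ID = "YOUR_WEBSERVER_ID"  # from the Airflow web UI hostname
DAG_NAME = "analyse_document"       # hypothetical name of the DAG to trigger


def dag_trigger_url(webserver_id, dag_name):
    """Endpoint of Airflow's experimental API that creates a DAG run."""
    return ("https://{}.appspot.com/api/experimental/dags/{}/dag_runs"
            .format(webserver_id, dag_name))


def trigger_dag(data, context=None):
    """Entry point for a GCS google.storage.object.finalize event."""
    # Third-party imports are kept inside the handler so the URL helper
    # above stays importable without requests/google-auth installed.
    import requests
    import google.auth.transport.requests
    import google.oauth2.id_token

    # Fetch an OIDC token whose audience is the IAP client ID, then
    # POST through the proxy with it as a Bearer token.
    auth_req = google.auth.transport.requests.Request()
    token = google.oauth2.id_token.fetch_id_token(auth_req, CLIENT_ID)
    resp = requests.post(
        dag_trigger_url(WEBSERVER_ID, DAG_NAME),
        headers={"Authorization": "Bearer {}".format(token)},
        # data["name"] is the uploaded object; hand it to the DAG as conf.
        json={"conf": {"file": data["name"]}},
    )
    resp.raise_for_status()
```

When this runs inside a Cloud Function, `fetch_id_token` gets its identity from the function's service account via the metadata server, so no key file is needed; that service account must be authorized on the IAP-protected Composer webserver.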

I know this document, but it is written in JS; I have tried to convert it to Python but I failed. – Raef ElSayed Aug 19 '19 at 23:32
Sorry about that - I was thinking Python 3.7 was for the underlying Composer environment, but now realize it was for GCF. I'll update my original post with some more resources. – LEC Aug 20 '19 at 15:49