I wanted to know whether Airflow tasks can be executed on demand, upon receiving a request over HTTP. I am not interested in the scheduling side of Airflow; I just want to use it as a substitute for Celery.
So an example operation would be something like this.
- User submits a form requesting some report.
- Backend receives the request and sends the user a notification that the request has been received.
- The backend then triggers a job in Airflow to run immediately.
- Airflow then executes the series of tasks associated with that DAG: for example, pull data from Redshift, pull data from MySQL, perform some operations on the two result sets, combine them, upload the results to Amazon S3, and send an email (see the sketch after this list).
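To make this concrete, here is a minimal sketch of the kind of DAG I have in mind. All of the names and task bodies are placeholders of mine, and I am assuming Airflow 1.x-style imports; `schedule_interval=None` is intended to keep the DAG from ever running on a schedule, so it only runs when triggered:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python_operator import PythonOperator


def pull_from_redshift():
    # Placeholder: query Redshift for the first result set.
    pass


def pull_from_mysql():
    # Placeholder: query MySQL for the second result set.
    pass


def combine_and_upload():
    # Placeholder: combine the two result sets and upload to Amazon S3.
    pass


def notify_user():
    # Placeholder: email the user that the report is ready.
    pass


dag = DAG(
    dag_id="generate_report",
    start_date=datetime(2017, 1, 1),
    schedule_interval=None,  # never run on a schedule; only when triggered
)

redshift = PythonOperator(task_id="pull_redshift", python_callable=pull_from_redshift, dag=dag)
mysql = PythonOperator(task_id="pull_mysql", python_callable=pull_from_mysql, dag=dag)
combine = PythonOperator(task_id="combine_and_upload", python_callable=combine_and_upload, dag=dag)
email = PythonOperator(task_id="notify_user", python_callable=notify_user, dag=dag)

# Both pulls must finish before combining; the email goes out last.
redshift >> combine
mysql >> combine
combine >> email
```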
From what I have read online, you can trigger Airflow jobs by executing airflow ... on the command line. I was wondering whether there is a Python API that can do the same thing.
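To make the question concrete, these are the two mechanisms I have found so far, and what I would like to call from the backend instead. The first just shells out to the `trigger_dag` CLI subcommand from the 1.x docs; the second POSTs to the experimental REST API that, as far as I can tell, ships with Airflow 1.10 (the dag_id, host, and port are placeholders of mine, and the endpoint may need to be enabled on the webserver):

```python
import subprocess

import requests

DAG_ID = "generate_report"  # placeholder; matches the DAG sketched above

# Option 1: shell out to the CLI (the trigger_dag subcommand is from the 1.x docs).
subprocess.check_call(["airflow", "trigger_dag", DAG_ID])

# Option 2: POST to the experimental REST API on the webserver, if it is enabled.
response = requests.post(
    "http://localhost:8080/api/experimental/dags/%s/dag_runs" % DAG_ID,
    json={"conf": {}},  # per-run parameters could presumably go here
)
response.raise_for_status()
```

Neither of these feels like a first-class Python API, which is why I am asking.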
Thanks.