Apache Airflow is a workflow management platform to programmatically author, schedule, and monitor workflows as directed acyclic graphs (DAGs) of tasks. Use this tag for questions about version 2+ of Airflow. Use the more generic [airflow] tag on all Airflow questions, and only add this one if your question is version-specific.
Questions tagged [airflow-2.x]
690 questions
0 votes, 1 answer
How to trigger a DAG task from another task in Airflow using Python, regardless of the success of a previous task?
Description
How can I run multiple ExternalPythonOperator tasks (I need different packages/versions for different DAG tasks) one after another, in series, without depending on the previous task's success (avoiding the upstream_failed state)?
So it should just execute task…

sogu · 2,738 · 5 · 31 · 90
0 votes, 1 answer
Airflow: how to run an operator for each item in a dynamically calculated list?
Is there a way to run an Airflow operator for each item in a list that isn't hard-coded? For each item in a dynamically calculated list (the result of a @task), I want to run a Docker image, passing the item in as an environment variable:
for item in…

Antoine Dahan · 574 · 2 · 9 · 23
0 votes, 0 answers
How to get parent DAG info before execution in Airflow
I am designing a reconciliation job flow in Airflow 2.3.3 where the parent DAGs can trigger the same child DAG using TriggerDagRunOperator.
The child DAG must somehow determine its parent's name and accordingly read the parameter values from a JSON file for…

Santanu Ghosh · 91 · 1 · 8
0 votes, 0 answers
Emulate an `expand` with a TaskGroup in Airflow
Objective
I would like to have the following structure in my DAG Graph:
                ---- sleeper_1 ---- stringer_1 ----
               /                                   \
list_generator ---------------- ... --------------- printer
…

GregoirePelegrin · 1,206 · 2 · 7 · 23
0 votes, 1 answer
Cannot Import Name 'BigQueryTableExistenceAsyncSensor' from 'airflow.providers.google.cloud.sensors.bigquery'
I want to import BigQueryTableExistenceAsyncSensor from airflow.providers.google.cloud.sensors.bigquery.
Here is my code:
from airflow import DAG
from util.dags_hourly import create_dag_write_append  # a class that I created; no issues with other…

Mohammad Iqbal · 29 · 1 · 10
0 votes, 1 answer
Saving the status of each task to XCom as soon as it completes in Airflow
I am writing a DAG and I want to save the status of each task, keyed by task id, in a dictionary in XCom; the status can be success, failed, or anything else.
For example, below are two tasks:
def task_18(ti):
    print("TASK 18 COMPLETE")
def task_19(ti):
…

NeedToCodeAgain · 129 · 2 · 9
0 votes, 1 answer
How to run a specific DAG first in Airflow?
I'm using Apache Airflow (2.3.1) to load data into a database. I have more than 150 DAGs and I need some of them to run first; how can I do this?
The DAGs all kick off at 3 am and start running in random order, standing in…

Никита Землин · 1 · 1
0 votes, 1 answer
Airflow DagRunAlreadyExists even after providing a custom run id and execution date
I am getting a DagRunAlreadyExists exception even after providing a custom run id and execution date.
This occurs when there are multiple requests within one second.
Here is the MWAA CLI call:
def get_unique_key():
    from datetime import datetime
…

Ankur Vyas · 118 · 9
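A sketch of one way to avoid the collision entirely (hypothetical helper name): build the `run_id` from the timestamp plus a UUID, so two triggers inside the same second still differ even when the timestamp component is truncated.

```python
import uuid
from datetime import datetime, timezone


def get_unique_run_id(prefix="manual"):
    # microsecond timestamp for readability, uuid4 for guaranteed uniqueness
    ts = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%S.%f")
    return f"{prefix}__{ts}__{uuid.uuid4().hex}"
```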
0 votes, 1 answer
How to trigger cloud data fusion from airflow with dynamic parameters
I am trying to create a DAG in Airflow 2+ that triggers multiple Data Fusion pipelines using the CloudDataFusionStartPipelineOperator, running them in parallel.
However, I want to assign the parameter values (like pipeline name, runtime…

Santanu Ghosh · 91 · 1 · 8
0 votes, 1 answer
Apache Airflow error: run id already exists
Getting this error even after passing a custom UUID as run_id to the Airflow DAG run:
run id already exists
To create multiple DagRuns within one second, one should set the parameter "replace_microseconds" to "false", according to the documentation.
But…
But…

Paresh · 3 · 2
0 votes, 0 answers
Moving files of very different sizes from one place to another - optimization in Airflow
I'm implementing a DAG in Airflow that moves files from an on-prem location to Azure Blob Storage. Reading files from the source and sending them to Azure is handled by a Data Access Layer outside of Airflow.
The thing is that files in the source can…

tomomomo · 157 · 2 · 13
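One sketch of evening out the work (file list, sizes, and the byte limit are assumed inputs): group files into batches of roughly equal total size, so each transfer task moves a comparable amount of data instead of one huge file holding up a task while others finish instantly.

```python
def batch_by_size(files, max_batch_bytes):
    """files: iterable of (name, size_bytes) pairs. Returns a list of name-batches."""
    batches, current, current_size = [], [], 0
    # largest first, so big files don't get appended to already-full batches
    for name, size in sorted(files, key=lambda f: f[1], reverse=True):
        if current and current_size + size > max_batch_bytes:
            batches.append(current)
            current, current_size = [], 0
        current.append(name)
        current_size += size
    if current:
        batches.append(current)
    return batches
```

Each batch can then be handed to one mapped transfer task (e.g. via `expand()`).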
0 votes, 0 answers
Airflow: using the run date as a date parameter
I have a DAG that fetches stock data after market hours (16:30 EST). I need the run's own date as the date parameter in my task, to fetch one year of stock data. I initially used {{ds}} and found out {{ds}} only passed the interval before the run…

NinjaWarrior · 25 · 1 · 2 · 9
0 votes, 2 answers
How to make a docker image/container on Google Cloud Composer and deploy it?
I have Airflow DAGs running on Google Cloud Composer that train machine learning models on some training data and store the model with the best accuracy. I want to build a Docker container/image that has the best model and deploy it directly to…

shubh gupta · 94 · 7
0 votes, 0 answers
How to integrate AWS SQS with Apache Airflow without AWS CLI configuration/Credentials file setup?
I don't want to use AWS credentials from the ~/.aws/credentials file to integrate AWS SQS with Airflow. Is there any other way to integrate directly from the Airflow UI connection menu, like the S3 integration where we can directly pass the access and secret…

user1992 · 169 · 1 · 15
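A sketch of the credentials-file-free approach (placeholder keys and a hypothetical connection name): define the AWS connection itself, either as an environment variable as below, or via the Airflow UI under Admin → Connections with connection type "Amazon Web Services" (login = access key id, password = secret key).

```python
import os

# Airflow picks up connections from AIRFLOW_CONN_<CONN_ID> environment
# variables; no ~/.aws/credentials file is involved.
os.environ["AIRFLOW_CONN_AWS_SQS"] = (
    "aws://AKIAEXAMPLEKEY:examplesecret@/?region_name=us-east-1"
)

# An SQS sensor/operator would then reference it with aws_conn_id="aws_sqs".
```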
0 votes, 1 answer
How to create a dynamic task sequence with a new Airflow version
I am trying to create a sequence of tasks like the one below using Airflow 2.3+:
START -> generate_files -> download_file -> STOP
But instead I am getting the flow below. The code is also given. Please advise.
from airflow import DAG
from airflow.decorators…

Santanu Ghosh · 91 · 1 · 8