Apache Airflow is a workflow management platform to programmatically author, schedule, and monitor workflows as directed acyclic graphs (DAGs) of tasks. Use this tag for questions about version 2+ of Airflow. Use the more generic [airflow] tag on all Airflow questions, and only add this one if your question is version-specific.
Questions tagged [airflow-2.x]
690 questions
0
votes
0 answers
How can I use Python's ABC library for nested abstractmethods to build an interface for Airflow 2.x?
I am using Airflow 2.0's TaskFlow API to generate DAGs in order to orchestrate ETL jobs.
Airflow 2.0 doesn't seem to provide a framework for generating DAGs according to the DRY principle. Basically, each DAG needs to be generated in a separate file and…

omoshiro
- 3
- 3
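The question above doesn't include code, but a minimal sketch of the kind of ABC-based interface being asked about (class and method names here are hypothetical, not from the question) might look like:

```python
from abc import ABC, abstractmethod


class EtlPipeline(ABC):
    """Hypothetical interface every ETL DAG module must implement."""

    @abstractmethod
    def extract(self) -> list:
        """Pull raw records from the source system."""

    @abstractmethod
    def transform(self, records: list) -> list:
        """Clean and reshape the extracted records."""

    @abstractmethod
    def load(self, records: list) -> int:
        """Write records to the target; return the row count."""


class SalesPipeline(EtlPipeline):
    """Concrete implementation used by one DAG file."""

    def extract(self) -> list:
        return [{"amount": 10}, {"amount": 20}]

    def transform(self, records: list) -> list:
        return [r for r in records if r["amount"] > 0]

    def load(self, records: list) -> int:
        return len(records)  # stand-in for a real write
```

The enforcement comes from the ABC machinery itself: a subclass that omits any `@abstractmethod` raises `TypeError` at instantiation time, which is what makes the interface checkable across separately generated DAG files.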
0
votes
1 answer
Use Smart Sensors and still get context variable
I am on Airflow 2.1.4 and I am trying to modify a custom sensor to act as a Smart Sensor.
Among other things, to allow a custom sensor to work as a Smart Sensor you need to give it a poke_context_fields class variable. This isn't very well…

Stephen
- 8,508
- 12
- 56
- 96
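For context, the `poke_context_fields` attribute mentioned in the question is a class-level tuple naming the `__init__` arguments that the smart-sensor service serializes and replays; a sketch of a sensor declaring it (field and class names are hypothetical) could look like:

```python
from airflow.sensors.base import BaseSensorOperator


class MyFileSensor(BaseSensorOperator):
    # Smart Sensors serialize these __init__ args and re-run poke() in a
    # centralized smart-sensor DAG, so every field poke() depends on must
    # be listed here.
    poke_context_fields = ("filepath", "conn_id")

    def __init__(self, filepath, conn_id, **kwargs):
        super().__init__(**kwargs)
        self.filepath = filepath
        self.conn_id = conn_id

    def poke(self, context):
        # NOTE: when executed by the smart-sensor service, the original
        # task's templated context is not available here, which is the
        # limitation the question runs into.
        return False  # stand-in for the real check
```

This is a configuration sketch only; Smart Sensors were an early-access feature in the 2.x line and were later deprecated.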
0
votes
1 answer
How to run an Airflow DAG every 10 minutes on working days between 9 AM and 4 PM
I have a DAG that needs to be scheduled to run on working days (Mon to Fri) between 9 AM and 4 PM, every 10 minutes. How do I do this in Airflow?

Amitjoc
- 83
- 4
- 9
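A window like this is usually expressed as a cron schedule. The sketch below pairs the cron string (which would go in a DAG's `schedule_interval`) with a plain-Python mirror of the same window for sanity checking; the constant name and helper are illustrative, not from the question:

```python
from datetime import datetime

# Every 10 minutes, hours 09-16, Monday-Friday.
# In an Airflow 2.x DAG: DAG(..., schedule_interval=SCHEDULE)
SCHEDULE = "*/10 9-16 * * 1-5"

def matches_window(ts: datetime) -> bool:
    """Plain-Python mirror of the cron window above."""
    return ts.weekday() < 5 and 9 <= ts.hour <= 16 and ts.minute % 10 == 0
```

Note that `9-16` fires through 16:50; if nothing should run after 4:00 PM exactly, a stricter reading needs `*/10 9-15 * * 1-5` plus a separate `0 16 * * 1-5` slot. Also remember Airflow triggers each run at the *end* of its schedule interval.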
0
votes
1 answer
Make Airflow load all keys in a Kubernetes secret without specifying the keys
I am using Google Cloud Composer 1.17.7 with Airflow 2.1.4.
I am mainly following these docs.
I created a Kubernetes secret that looks like this:
apiVersion: v1
data:
  KEY1: base64encodedvalue1
  KEY2: base64encodedvalue2
  KEY3:…

Giulia Savorgnan
- 63
- 6
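For `KubernetesPodOperator`, Airflow's `Secret` helper can expose every key of a Kubernetes secret as environment variables when `deploy_target` is left as `None`, so the keys do not need to be enumerated. A sketch, assuming Airflow 2.1's `airflow.kubernetes` package (the secret name is hypothetical):

```python
from airflow.kubernetes.secret import Secret

# deploy_target=None exposes ALL keys of the Kubernetes secret as
# environment variables, so KEY1, KEY2, ... need not be listed one by one.
all_keys = Secret(
    deploy_type="env",
    deploy_target=None,
    secret="my-secret",  # hypothetical Kubernetes secret name
)

# Then pass it to the operator:
#   KubernetesPodOperator(..., secrets=[all_keys])
```

This is a configuration sketch against the 2.1-era API; newer installs import `Secret` from the `cncf.kubernetes` provider instead.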
0
votes
2 answers
Passing a variable from a DAG to an external function
I have the following two files. One with the DAG and two tasks (DummyOperator and TaskGroup).
# example_dag.py
from datetime import timedelta
from airflow import DAG
from airflow.operators.dummy import DummyOperator
from airflow.operators.python…

tristobal
- 426
- 2
- 7
- 18
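A common way to get a value from the DAG file into an external builder is to pass it as an ordinary function argument rather than relying on module-level state. A sketch, with hypothetical module and argument names:

```python
# task_groups.py - hypothetical helper module imported by the DAG file
from airflow.operators.dummy import DummyOperator
from airflow.utils.task_group import TaskGroup


def build_group(group_id: str, my_value: str) -> TaskGroup:
    """Build a TaskGroup parameterised by a value supplied by the DAG file."""
    with TaskGroup(group_id=group_id) as group:
        DummyOperator(task_id=f"step_for_{my_value}")
    return group

# In example_dag.py, inside the `with DAG(...)` block:
#   group1 = build_group("group1", my_variable)
```

Because the builder runs while the DAG context manager is active, the tasks it creates attach to the calling DAG automatically.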
0
votes
1 answer
Which Airflow permissions allow triggering of a DAG Run through the API?
Using Airflow 2.0.2, I'm trying to use the airflow API to trigger DAG Runs. When I run a simple GET like
curl -X GET --user "fooUser:passw0rd" "${ENDPOINT_URL}/api/v1/pools"
I get expected results:
{
  "pools": [
    {
      "name":…

Mike S
- 1,451
- 1
- 16
- 34
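For reference, triggering a run goes through `POST /api/v1/dags/{dag_id}/dagRuns`, and the calling user's role needs at least the create permission on DAG runs (shown as "can create on DAG Runs" in the Airflow UI's role editor). A sketch of the call, reusing the question's credentials and a hypothetical `dag_id`:

```shell
# Requires the caller's role to include "can create on DAG Runs"
# (plus read access to the DAG itself).
curl -X POST --user "fooUser:passw0rd" \
  -H "Content-Type: application/json" \
  -d '{"conf": {}}' \
  "${ENDPOINT_URL}/api/v1/dags/my_dag/dagRuns"
```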
0
votes
1 answer
Airflow None schedule is not working. Tasks are still running automatically
I have an Airflow DAG that I don't want to schedule. I'm using schedule_interval: None in my DAG file, but the DAG still runs automatically once it is deployed.
Airflow version used: 2.1.0
Airflow Screenshot Attached.
I'm using the following…

Akash Srivastava
- 23
- 2
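One frequent cause of this symptom is placing `schedule_interval` in `default_args`, where it is silently ignored and the DAG falls back to its default daily schedule. A sketch of a correctly manual-only DAG (the `dag_id` is hypothetical):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.dummy import DummyOperator

# schedule_interval must be a DAG-level argument; inside default_args it
# is ignored, and the DAG then runs on the default daily schedule, which
# looks like it is "running automatically".
with DAG(
    dag_id="manual_only",            # hypothetical dag_id
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,          # here, not in default_args
    catchup=False,
) as dag:
    DummyOperator(task_id="noop")
```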
0
votes
0 answers
Airflow task succeeds but returns SIGTERM
I have a task in Airflow 2.1.2 which finishes with success status, but afterwards the log shows a SIGTERM:
[2021-12-07 06:11:45,031] {python.py:151} INFO - Done. Returned value was: None
[2021-12-07 06:11:45,224] {taskinstance.py:1204} INFO -…

mrc
- 2,845
- 8
- 39
- 73
0
votes
1 answer
Is there a way to run SnowflakeOperator, SnowflakeHook locally in a non-Airflow script?
I have some DAGs that use SnowflakeOperator and SnowflakeHook. Both make connections to Snowflake using the snowflake_connection input, which I have saved under Admin > Connections in Airflow.
SnowflakeHook(
…

Howard S
- 121
- 6
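Hooks are plain Python classes, so they can run outside a DAG as long as the connection can be resolved; one way is to supply it through an `AIRFLOW_CONN_<CONN_ID>` environment variable instead of the metadata database. A sketch (the connection URI below is illustrative, not a real account):

```python
import os

# Airflow resolves connections from AIRFLOW_CONN_<CONN_ID> environment
# variables, so a hook can run in a plain script with no scheduler or
# webserver involved. Illustrative URI - not a real account.
os.environ["AIRFLOW_CONN_SNOWFLAKE_CONNECTION"] = (
    "snowflake://user:password@/?account=my_account&warehouse=my_wh"
)

from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook

hook = SnowflakeHook(snowflake_conn_id="snowflake_connection")
# rows = hook.get_records("SELECT 1")
```

This is a configuration sketch; it assumes the `apache-airflow-providers-snowflake` package is installed in the local environment.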
0
votes
1 answer
Naming Airflow DAGs something other than the Python callable when using the TaskFlow API
I am trying to create multiple DAGs using the TaskFlow API that have a variable passed into them which can be used by tasks within the DAG.
For example I am trying to have this code
from airflow.decorators import dag, task
from datetime import…

Nicholas Stevens
- 505
- 4
- 8
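The `@dag` decorator accepts an explicit `dag_id`, which overrides the default name taken from the callable; combined with a factory function, this gives one named DAG per parameter value. A sketch with hypothetical names:

```python
from datetime import datetime

from airflow.decorators import dag, task


def make_dag(dag_id: str, source: str):
    # dag_id overrides the default name derived from the callable.
    @dag(
        dag_id=dag_id,
        start_date=datetime(2021, 1, 1),
        schedule_interval=None,
        catchup=False,
    )
    def _template():
        @task
        def hello():
            print(f"processing {source}")

        hello()

    return _template()


# Register one DAG per source under a distinct, module-level name so the
# DagBag can discover each of them.
for src in ("orders", "customers"):
    globals()[f"etl_{src}"] = make_dag(f"etl_{src}", src)
```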
0
votes
1 answer
Scheduler logs differ between Airflow 1 and Airflow 2
Because I can't use the Airflow CLI, I'm actually parsing scheduler logs with grep on Airflow 1 in order to retrieve some information, such as:
check if the dag is triggered or not / if it's successful or not / start timestamp with the pattern "INFO…
0
votes
2 answers
Change default XcomArg key in custom operators
I have a bunch of custom operators and I wanted to try to make use of XcomArg and using .output in my tasks.
For example, below I commented out the xcom_push to return the list:
def execute(self, context):
    #…

ldacey
- 518
- 8
- 16
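For background, `operator.output` is pinned to the default `return_value` XCom key; a value pushed under a custom key has to be referenced by constructing the `XComArg` explicitly. A sketch (the operator and key names in the usage comment are hypothetical):

```python
from airflow.models.xcom_arg import XComArg


def custom_output(operator, key: str) -> XComArg:
    """Reference a non-default XCom key; operator.output is equivalent to
    XComArg(operator) and always resolves the 'return_value' key."""
    return XComArg(operator, key=key)

# Usage inside a DAG (hypothetical operator and key):
#   files = custom_output(my_custom_op, "file_list")
#   process = PythonOperator(task_id="process", op_args=[files], ...)
```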
0
votes
1 answer
How do I trigger a backfill with TriggerDagRunOperator?
I have a requirement where I need the DAG triggered by TriggerDagRunOperator to execute a backfill, not just a run for the same execution date.
The TriggerDagRunOperator is set up as follows:
trigger1 = TriggerDagRunOperator(
task_id = 'trigger1',
…

Tessa Altman
- 33
- 4
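`TriggerDagRunOperator` starts exactly one run per task instance, so a backfill generally has to be spelled out as one trigger per logical date. A sketch, with a hypothetical target DAG and date range:

```python
from datetime import datetime, timedelta

from airflow.operators.trigger_dagrun import TriggerDagRunOperator

# One trigger task per logical date; dag ids and the date range are
# hypothetical. execution_date pins each created run to its date.
backfill_dates = [datetime(2021, 12, 1) + timedelta(days=i) for i in range(3)]

triggers = [
    TriggerDagRunOperator(
        task_id=f"trigger_{d:%Y%m%d}",
        trigger_dag_id="target_dag",
        execution_date=d.isoformat(),
        conf={"backfill": True},
    )
    for d in backfill_dates
]
```

An alternative, if the date range is large, is a single PythonOperator that calls the stable REST API's dagRuns endpoint in a loop.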
0
votes
1 answer
Trigger airflow task on sensor timeout
I currently have a PythonSensor which waits for files on an FTP server. Is it possible to have this sensor trigger a task on timeout? I am trying to create the following DAG:
[airflow sensor diagram]
I have taken a look at BranchPythonOperator but it…

rew0rk
- 1
- 1
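One common pattern for this is to let the sensor fail on timeout (the default, i.e. `soft_fail=False`) and give the fallback task `trigger_rule=ALL_FAILED`, so it runs only when the sensor fails. A sketch with hypothetical task ids and a stand-in callable:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.dummy import DummyOperator
from airflow.sensors.python import PythonSensor
from airflow.utils.trigger_rule import TriggerRule

with DAG(
    dag_id="ftp_watch",              # hypothetical dag_id
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    wait = PythonSensor(
        task_id="wait_for_files",
        python_callable=lambda: False,  # stand-in for the real FTP check
        timeout=60 * 10,                # fail (not skip) after 10 minutes
        mode="reschedule",
    )
    on_success = DummyOperator(task_id="process_files")
    on_timeout = DummyOperator(
        task_id="handle_timeout",
        trigger_rule=TriggerRule.ALL_FAILED,  # runs only if the sensor failed
    )
    wait >> [on_success, on_timeout]
```

The trade-off versus BranchPythonOperator is that the sensor task itself ends in a failed state, which may matter for alerting.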
0
votes
1 answer
Facing an issue integrating Airflow 2.2.2 with SQL Server as the metadata store
I'm facing an issue when trying to integrate Airflow 2.2.2 with SQL Server as the metadata store:
Microsoft SQL Server 2019
sqlalchemy.exc.IntegrityError: (pyodbc.IntegrityError) ('23000', "[23000] [Microsoft][ODBC Driver 17 for SQL Server][SQL…

zubair shahzad
- 1
- 3