
I am trying to run a simple DAG on Airflow running in Docker.

I've got two Python scripts: the first pulls in data via an API call, and the second pushes that data into Google Sheets. So I've used t1 to execute the first script and t2 to execute the second.

Here is my code:

from airflow import DAG
from datetime import datetime, timedelta
from airflow.operators.bash_operator import BashOperator
from airflow.utils.dates import days_ago

default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'email': ['airflow@airflow.com'],
    'email_on_failure': True,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=1),
}

with DAG("ETL_script",
         description="DAG for Data retrieval",
         default_args=default_args,
         start_date=datetime(2022, 1, 1),
         schedule_interval="@weekly",
         catchup=False,
         ) as dag:

    t1 = BashOperator(
        task_id='extract_data',
        bash_command='/Users/user.name/Coding Projects/Python_Apps/Docker_Applications/Airflow_Docker_Apps/Python_Scripts/API_extract.py',
    )

    t2 = BashOperator(
        task_id='Insert_into_google_Sheets',
        bash_command='python3 "/Users/user.name/Coding Projects/Python_Apps/Docker_Applications/Airflow_Docker_Apps/Python_Scripts/Google_Sheets_Connection.py"',
    )

    t1 >> t2  # Define the task dependency

When I run the same commands in zsh, both Python scripts run perfectly, but in Airflow I get this error:

*** Reading local file: /opt/airflow/logs/ETL_script/extract_data/2022-05-02T21:27:56.923195+00:00/1.log
[2022-05-02, 21:27:58 UTC] {taskinstance.py:1043} INFO - Dependencies all met for <TaskInstance: ETL_script.extract_data manual__2022-05-02T21:27:56.923195+00:00 [queued]>
[2022-05-02, 21:27:58 UTC] {taskinstance.py:1043} INFO - Dependencies all met for <TaskInstance: ETL_script.extract_data manual__2022-05-02T21:27:56.923195+00:00 [queued]>
[2022-05-02, 21:27:58 UTC] {taskinstance.py:1249} INFO - 
--------------------------------------------------------------------------------
[2022-05-02, 21:27:58 UTC] {taskinstance.py:1250} INFO - Starting attempt 1 of 2
[2022-05-02, 21:27:58 UTC] {taskinstance.py:1251} INFO - 
--------------------------------------------------------------------------------
[2022-05-02, 21:27:58 UTC] {taskinstance.py:1270} INFO - Executing <Task(BashOperator): extract_data> on 2022-05-02 21:27:56.923195+00:00
[2022-05-02, 21:27:58 UTC] {standard_task_runner.py:52} INFO - Started process 148 to run task
[2022-05-02, 21:27:58 UTC] {standard_task_runner.py:79} INFO - Running: ['***', 'tasks', 'run', 'ETL_script', 'extract_data', 'manual__2022-05-02T21:27:56.923195+00:00', '--job-id', '134', '--raw', '--subdir', 'DAGS_FOLDER/ETL_script.py', '--cfg-path', '/tmp/tmpeaprux6w', '--error-file', '/tmp/tmp1hhf5ge7']
[2022-05-02, 21:27:58 UTC] {standard_task_runner.py:80} INFO - Job 134: Subtask extract_data
[2022-05-02, 21:27:58 UTC] {logging_mixin.py:109} INFO - Running <TaskInstance: ETL_script.extract_data manual__2022-05-02T21:27:56.923195+00:00 [running]> on host 4c9d01f48299
[2022-05-02, 21:27:58 UTC] {taskinstance.py:1448} INFO - Exporting the following env vars:
AIRFLOW_CTX_DAG_EMAIL=***@***.com
AIRFLOW_CTX_DAG_OWNER=***
AIRFLOW_CTX_DAG_ID=ETL_script
AIRFLOW_CTX_TASK_ID=extract_data
AIRFLOW_CTX_EXECUTION_DATE=2022-05-02T21:27:56.923195+00:00
AIRFLOW_CTX_DAG_RUN_ID=manual__2022-05-02T21:27:56.923195+00:00
[2022-05-02, 21:27:58 UTC] {subprocess.py:62} INFO - Tmp dir root location: 
 /tmp
[2022-05-02, 21:27:58 UTC] {subprocess.py:74} INFO - Running command: ['bash', '-c', '/Users/user.name/Coding Projects/Python_Apps/Docker_Applications/Airflow_Docker_Apps/Python_Scripts/API_extract.py']
[2022-05-02, 21:27:58 UTC] {subprocess.py:85} INFO - Output:
[2022-05-02, 21:27:58 UTC] {subprocess.py:89} INFO - bash: /Users/user.name/Coding: No such file or directory
[2022-05-02, 21:27:58 UTC] {subprocess.py:93} INFO - Command exited with return code 127
[2022-05-02, 21:27:58 UTC] {taskinstance.py:1774} ERROR - Task failed with exception
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/operators/bash.py", line 188, in execute
    f'Bash command failed. The command returned a non-zero exit code {result.exit_code}.'
airflow.exceptions.AirflowException: Bash command failed. The command returned a non-zero exit code 127.
[2022-05-02, 21:27:58 UTC] {taskinstance.py:1288} INFO - Marking task as UP_FOR_RETRY. dag_id=ETL_script, task_id=extract_data, execution_date=20220502T212756, start_date=20220502T212758, end_date=20220502T212758
[2022-05-02, 21:27:58 UTC] {standard_task_runner.py:98} ERROR - Failed to execute job 134 for task extract_data (Bash command failed. The command returned a non-zero exit code 127.; 148)
[2022-05-02, 21:27:58 UTC] {local_task_job.py:154} INFO - Task exited with return code 1
[2022-05-02, 21:27:58 UTC] {local_task_job.py:264} INFO - 0 downstream tasks scheduled from follow-on schedule check


So, from my understanding, the first task t1 fails and produces this error. My first thought was that I hadn't defined $PATH correctly, but I don't know how to fix that. Looking at the log line bash: /Users/user.name/Coding: No such file or directory, it also looks like bash is splitting the unquoted path at the first space, and I'm not sure that /Users/... path exists inside the container at all.
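
If that's right, here is a sketch of what I think the corrected task might look like. This is just my guess: it assumes I mount the Python_Scripts folder into the container at /opt/airflow/Python_Scripts, which my compose file below does not do yet.

# inside the same `with DAG(...) as dag:` block as above
t1 = BashOperator(
    task_id='extract_data',
    # invoke python3 explicitly and quote the path so bash doesn't split it
    # at the spaces; /opt/airflow/Python_Scripts is an assumed container path
    # that would need a matching volume mount in docker-compose.yaml
    bash_command='python3 "/opt/airflow/Python_Scripts/API_extract.py"',
)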

Additional Information

Here is what my working directory looks like:

Docker_Applications
├───dags
│   └───ETL_script.py
├───logs
├───plugins
├───Python_Scripts
│   ├───API_extract.py
│   └───Google_Sheets_Connection.py
├───.env
└───docker-compose.yaml

My echo $PATH output on the host is:

Projects:/Library/Frameworks/Python.framework/Versions/3.10/bin:/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin:/Library/Frameworks/Python.framework/Versions/3.10/bin
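
I realize this is the PATH on my Mac, and I'm not sure it even applies inside the container. In case it helps, this is how I was planning to check the PATH the tasks actually see (assuming the stock service names from the compose file below):

docker-compose exec airflow-worker bash -c 'echo $PATH && which python3'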

And my docker-compose.yaml file looks like this:

# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements.  See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership.  The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License.  You may obtain a copy of the License at
#   http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied.  See the License for the
# specific language governing permissions and limitations
# under the License.
#
# Documentation: https://airflow.apache.org/docs/apache-airflow/stable/start/docker.html

# Basic Airflow cluster configuration for CeleryExecutor with Redis and PostgreSQL.
#
# WARNING: This configuration is for local development. Do not use it in a production deployment.
#
# This configuration supports basic configuration using environment variables or an .env file
# The following variables are supported:
#
# AIRFLOW_IMAGE_NAME           - Docker image name used to run Airflow.
#                                Default: apache/airflow:2.2.5
# AIRFLOW_UID                  - User ID in Airflow containers
#                                Default: 50000
# Those configurations are useful mostly in case of standalone testing/running Airflow in test/try-out mode
#
# _AIRFLOW_WWW_USER_USERNAME   - Username for the administrator account (if requested).
#                                Default: airflow
# _AIRFLOW_WWW_USER_PASSWORD   - Password for the administrator account (if requested).
#                                Default: airflow
# _PIP_ADDITIONAL_REQUIREMENTS - Additional PIP requirements to add when starting all containers.
#                                Default: ''
#
# Feel free to modify this file to suit your needs.
---
version: '3.7'
x-airflow-common:
  &airflow-common
  # In order to add custom dependencies or upgrade provider packages you can use your extended image.
  # Comment the image line, place your Dockerfile in the directory where you placed the docker-compose.yaml
  # and uncomment the "build" line below, then run `docker-compose build` to build the images.
  image: ${AIRFLOW_IMAGE_NAME:-apache/airflow:2.2.5}
  # build: .
  environment:
    &airflow-common-env
    AIRFLOW__CORE__EXECUTOR: CeleryExecutor
    AIRFLOW__CORE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:airflow@postgres/airflow
    AIRFLOW__CELERY__RESULT_BACKEND: db+postgresql://airflow:airflow@postgres/airflow
    AIRFLOW__CELERY__BROKER_URL: redis://:@redis:6379/0
    AIRFLOW__CORE__FERNET_KEY: ''
    AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION: 'true'
    AIRFLOW__CORE__LOAD_EXAMPLES: 'false'
    AIRFLOW__API__AUTH_BACKEND: 'airflow.api.auth.backend.basic_auth'
    _PIP_ADDITIONAL_REQUIREMENTS: ${_PIP_ADDITIONAL_REQUIREMENTS:-}
  volumes:
    - ./dags:/opt/airflow/dags
    - ./logs:/opt/airflow/logs
    - ./plugins:/opt/airflow/plugins
  user: "${AIRFLOW_UID:-50000}:0"
  depends_on:
    &airflow-common-depends-on
    redis:
      condition: service_healthy
    postgres:
      condition: service_healthy

services:
  postgres:
    image: postgres:13
    environment:
      POSTGRES_USER: airflow
      POSTGRES_PASSWORD: airflow
      POSTGRES_DB: airflow
    volumes:
      - postgres-db-volume:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD", "pg_isready", "-U", "airflow"]
      interval: 5s
      retries: 5
    restart: always

  redis:
    image: redis:latest
    expose:
      - 6379
    healthcheck:
      test: ["CMD", "redis-cli", "ping"]
      interval: 5s
      timeout: 30s
      retries: 50
    restart: always

  airflow-webserver:
    <<: *airflow-common
    command: webserver
    ports:
      - 8080:8080
    healthcheck:
      test: ["CMD", "curl", "--fail", "http://localhost:8080/health"]
      interval: 10s
      timeout: 10s
      retries: 5
    restart: always
    depends_on:
      <<: *airflow-common-depends-on
      airflow-init:
        condition: service_completed_successfully

  airflow-scheduler:
    <<: *airflow-common
    command: scheduler
    healthcheck:
      test: ["CMD-SHELL", 'airflow jobs check --job-type SchedulerJob --hostname "$${HOSTNAME}"']
      interval: 10s
      timeout: 10s
      retries: 5
    restart: always
    depends_on:
      <<: *airflow-common-depends-on
      airflow-init:
        condition: service_completed_successfully

  airflow-worker:
    <<: *airflow-common
    command: celery worker
    healthcheck:
      test:
        - "CMD-SHELL"
        - 'celery --app airflow.executors.celery_executor.app inspect ping -d "celery@$${HOSTNAME}"'
      interval: 10s
      timeout: 10s
      retries: 5
    environment:
      <<: *airflow-common-env
      # Required to handle warm shutdown of the celery workers properly
      # See https://airflow.apache.org/docs/docker-stack/entrypoint.html#signal-propagation
      DUMB_INIT_SETSID: "0"
    restart: always
    depends_on:
      <<: *airflow-common-depends-on
      airflow-init:
        condition: service_completed_successfully

  airflow-triggerer:
    <<: *airflow-common
    command: triggerer
    healthcheck:
      test: ["CMD-SHELL", 'airflow jobs check --job-type TriggererJob --hostname "$${HOSTNAME}"']
      interval: 10s
      timeout: 10s
      retries: 5
    restart: always
    depends_on:
      <<: *airflow-common-depends-on
      airflow-init:
        condition: service_completed_successfully

  airflow-init:
    <<: *airflow-common
    entrypoint: /bin/bash
    # yamllint disable rule:line-length
    command:
      - -c
      - |
        function ver() {
          printf "%04d%04d%04d%04d" $${1//./ }
        }
        airflow_version=$$(gosu airflow airflow version)
        airflow_version_comparable=$$(ver $${airflow_version})
        min_airflow_version=2.2.0
        min_airflow_version_comparable=$$(ver $${min_airflow_version})
        if (( airflow_version_comparable < min_airflow_version_comparable )); then
          echo
          echo -e "\033[1;31mERROR!!!: Too old Airflow version $${airflow_version}!\e[0m"
          echo "The minimum Airflow version supported: $${min_airflow_version}. Only use this or higher!"
          echo
          exit 1
        fi
        if [[ -z "${AIRFLOW_UID}" ]]; then
          echo
          echo -e "\033[1;33mWARNING!!!: AIRFLOW_UID not set!\e[0m"
          echo "If you are on Linux, you SHOULD follow the instructions below to set "
          echo "AIRFLOW_UID environment variable, otherwise files will be owned by root."
          echo "For other operating systems you can get rid of the warning with manually created .env file:"
          echo "    See: https://airflow.apache.org/docs/apache-airflow/stable/start/docker.html#setting-the-right-airflow-user"
          echo
        fi
        one_meg=1048576
        mem_available=$$(($$(getconf _PHYS_PAGES) * $$(getconf PAGE_SIZE) / one_meg))
        cpus_available=$$(grep -cE 'cpu[0-9]+' /proc/stat)
        disk_available=$$(df / | tail -1 | awk '{print $$4}')
        warning_resources="false"
        if (( mem_available < 4000 )) ; then
          echo
          echo -e "\033[1;33mWARNING!!!: Not enough memory available for Docker.\e[0m"
          echo "At least 4GB of memory required. You have $$(numfmt --to iec $$((mem_available * one_meg)))"
          echo
          warning_resources="true"
        fi
        if (( cpus_available < 2 )); then
          echo
          echo -e "\033[1;33mWARNING!!!: Not enough CPUS available for Docker.\e[0m"
          echo "At least 2 CPUs recommended. You have $${cpus_available}"
          echo
          warning_resources="true"
        fi
        if (( disk_available < one_meg * 10 )); then
          echo
          echo -e "\033[1;33mWARNING!!!: Not enough Disk space available for Docker.\e[0m"
          echo "At least 10 GBs recommended. You have $$(numfmt --to iec $$((disk_available * 1024 )))"
          echo
          warning_resources="true"
        fi
        if [[ $${warning_resources} == "true" ]]; then
          echo
          echo -e "\033[1;33mWARNING!!!: You have not enough resources to run Airflow (see above)!\e[0m"
          echo "Please follow the instructions to increase amount of resources available:"
          echo "   https://airflow.apache.org/docs/apache-airflow/stable/start/docker.html#before-you-begin"
          echo
        fi
        mkdir -p /sources/logs /sources/dags /sources/plugins
        chown -R "${AIRFLOW_UID}:0" /sources/{logs,dags,plugins}
        exec /entrypoint airflow version
    # yamllint enable rule:line-length
    environment:
      <<: *airflow-common-env
      _AIRFLOW_DB_UPGRADE: 'true'
      _AIRFLOW_WWW_USER_CREATE: 'true'
      _AIRFLOW_WWW_USER_USERNAME: ${_AIRFLOW_WWW_USER_USERNAME:-airflow}
      _AIRFLOW_WWW_USER_PASSWORD: ${_AIRFLOW_WWW_USER_PASSWORD:-airflow}
    user: "0:0"
    volumes:
      - .:/sources

  airflow-cli:
    <<: *airflow-common
    profiles:
      - debug
    environment:
      <<: *airflow-common-env
      CONNECTION_CHECK_MAX_COUNT: "0"
    # Workaround for entrypoint issue. See: https://github.com/apache/airflow/issues/16252
    command:
      - bash
      - -c
      - airflow

  flower:
    <<: *airflow-common
    command: celery flower
    ports:
      - 5555:5555
    healthcheck:
      test: ["CMD", "curl", "--fail", "http://localhost:5555/"]
      interval: 10s
      timeout: 10s
      retries: 5
    restart: always
    depends_on:
      <<: *airflow-common-depends-on
      airflow-init:
        condition: service_completed_successfully

volumes:
  postgres-db-volume:

Note: This YAML file is the same as the one from the tutorial on the Airflow webpage. Please suggest any changes or best practices, because I am still learning!
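
For example, since the volumes section only mounts ./dags, ./logs, and ./plugins, I'm wondering whether I also need to mount the Python_Scripts folder so the containers can actually see my scripts. Something like this, maybe (the /opt/airflow/Python_Scripts container path is just my assumption):

  volumes:
    - ./dags:/opt/airflow/dags
    - ./logs:/opt/airflow/logs
    - ./plugins:/opt/airflow/plugins
    - ./Python_Scripts:/opt/airflow/Python_Scripts  # assumed extra mount for my scripts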

Thank you so much!!!!
