
I'm working with the Airflow Helm chart, trying to spin up a local environment with Helm, Minikube and Docker.

When I run `helm install airflow` with their `example/custom-values.yaml`, things look good.

I would now like to add an extra container to that chart, with some environment variables defined for it. The idea is to be able to override the values of those env variables through `helm install ... --set`. Here's the amended file:

custom-values.yaml:

#
# NOTE:
# - This is intended to be a `custom-values.yaml` starting point for non-production deployment (like minikube)

# External Dependencies:
# - A PUBLIC git repo for DAGs: ssh://git@repo.example.com:my-airflow-dags.git
#

###################################
# Airflow - Common Configs
###################################
airflow:
  ## the airflow executor type to use
  ##
  executor: CeleryExecutor

  ## the fernet key used to encrypt the connections in the database
  ##
  fernetKey: ""

  ## environment variables for the web/scheduler/worker Pods (for airflow configs)
  ##
  config:
    # Security
    AIRFLOW__CORE__SECURE_MODE: "True"
    AIRFLOW__API__AUTH_BACKEND: "airflow.api.auth.backend.deny_all"
    AIRFLOW__WEBSERVER__EXPOSE_CONFIG: "False"
    AIRFLOW__WEBSERVER__RBAC: "False"

    # DAGS
    AIRFLOW__CORE__LOAD_EXAMPLES: "False"

    ## Disable noisy "Handling signal: ttou" Gunicorn log messages
    GUNICORN_CMD_ARGS: "--log-level WARNING"


  extraVolumes: 
    - name: synchronised-dags
      emptyDir: {}

  extraContainers:
    - name: s3-sync
      image: sidecar_python:latest
      imagePullPolicy: Never
      env:
        - name: BUCKET_NAME
          value: "my-s3-bucket"
        - name: PROJECT_NAME
          value: "my-project"
        - name: FEATURE_BRANCH
          value: "my-branch"
        - name: LOCAL_DIR
          value: "/dags"
      volumeMounts:
        - name: synchronised-dags
          mountPath: /dags

  extraVolumeMounts:
    - name: synchronised-dags
      mountPath: /opt/airflow/dags

###################################
# Airflow - Scheduler Configs
###################################
scheduler:

  ## custom airflow connections for the airflow scheduler
  ##
  connections:
    - id: my_aws
      type: aws
      extra: |
        {
          "aws_access_key_id": "XXXXXXXXXXXXXXXXXXX",
          "aws_secret_access_key": "XXXXXXXXXXXXXXX",
          "region_name":"eu-central-1"
        }

  ## custom airflow variables for the airflow scheduler
  ##
  variables: |
    { "environment": "dev" }

  ## custom airflow pools for the airflow scheduler
  ##
  pools: |
    {
      "example": {
        "description": "This is an example pool with 2 slots.",
        "slots": 2
      }
    }

###################################
# Airflow - WebUI Configs
###################################
web:
  ## configs for the Service of the web Pods
  ##
  service:
    type: NodePort

###################################
# Airflow - Worker Configs
###################################
workers:
  ## the number of workers Pods to run
  ##
  replicas: 1

###################################
# Airflow - DAGs Configs
###################################
dags:

  path: /opt/airflow/dags

###################################
# Database - PostgreSQL Chart
###################################
postgresql:
  enabled: true

###################################
# Database - Redis Chart
###################################
redis:
  enabled: false

I tried to override the env variables defined under `extraContainers.env` like this:

helm install airflow stable/airflow  --version "7.1.0" --namespace "airflow" -f ./custom-values.yaml --set extraContainers[0].env[0].name=BUCKET_NAME --set extraContainers[0].env[0].value=new-s3-bucket

I also tried debugging some of my attempts with `--dry-run --debug`, but I can't seem to override those values.

Is there anything I should modify in `custom-values.yaml` to be able to pass new values through `--set`?

AlessioG
  • Those values are underneath the `airflow` top-level key, so you need to include that in your `--set` path: `--set airflow.extraContainers[0]...`. It might be easier to use another tool to generate the `custom-values.yaml` file; anything that can manipulate and write out JSON files would work. – David Maze Jul 15 '20 at 11:25
  • Hi @DavidMaze your suggestion worked thanks. Is there any specific tool you can suggest? Thanks again for your help here. – AlessioG Jul 15 '20 at 12:02
  • I've successfully used Jenkins pipeline code to generate this; if you need to modify it in place tools like `jq` or `yq` might help too. – David Maze Jul 15 '20 at 12:32
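Building on the suggestion in the comments, here is a minimal sketch of generating an override values file programmatically, using only Python's standard library. Helm accepts JSON values files (JSON is valid YAML), so the generated file can be passed with an extra `-f` flag. The bucket name and output filename below are placeholders; note also that Helm replaces list-valued keys like `extraContainers` wholesale rather than merging them element-by-element, so the whole container entry is repeated, not just the changed env var:

```python
import json

# Build an override structure that mirrors the chart's values layout.
# Everything sits under the top-level `airflow` key, which is why a bare
# `--set extraContainers[0]...` has no effect.
overrides = {
    "airflow": {
        "extraContainers": [
            {
                "name": "s3-sync",
                "image": "sidecar_python:latest",
                "imagePullPolicy": "Never",
                "env": [
                    {"name": "BUCKET_NAME", "value": "new-s3-bucket"},
                    {"name": "PROJECT_NAME", "value": "my-project"},
                    {"name": "FEATURE_BRANCH", "value": "my-branch"},
                    {"name": "LOCAL_DIR", "value": "/dags"},
                ],
                "volumeMounts": [
                    {"name": "synchronised-dags", "mountPath": "/dags"}
                ],
            }
        ]
    }
}

# Pass it after the base values file so it takes precedence:
#   helm install airflow stable/airflow --version 7.1.0 --namespace airflow \
#     -f ./custom-values.yaml -f ./overrides.json
with open("overrides.json", "w") as f:
    json.dump(overrides, f, indent=2)
```

For one-off tweaks, the corrected `--set airflow.extraContainers[0].env[0].value=new-s3-bucket` path works too; a generated file is just easier to keep correct once the nesting gets deep.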
