I am trying to pass secret variables to my KubernetesPodOperator in Airflow.
Here is what I have done:
- Create a secret.yaml file that looks like the following:
apiVersion: v1
kind: Secret
metadata:
  name: my-secret
type: Opaque
data:
  SECRET_1: blabla
  SECRET_2: blibli
- Apply the secret:
kubectl apply -f ./secret.yaml
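For reference, the values under data: must be base64-encoded (plain strings would go under stringData:), so the values above are placeholders for encoded strings. A minimal sketch (illustrative values only) of how I produce them:

# Hypothetical helper, not part of the DAG: encode raw values for the
# data: section of secret.yaml (Kubernetes expects base64 there).
import base64

placeholders = {"SECRET_1": "blabla", "SECRET_2": "blibli"}  # illustrative values
for name, raw in placeholders.items():
    print(f"{name}: {base64.b64encode(raw.encode()).decode()}")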
- Reference the secret from my DAG file:
from airflow.contrib.kubernetes.secret import Secret
from airflow.contrib.operators.kubernetes_pod_operator import KubernetesPodOperator
from airflow.models import DAG
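# deploy_type="env" injects the secret as an environment variable in the pod:
# deploy_target is the variable name, secret is the Kubernetes Secret name,
# and key is the key inside that Secret.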
SECRET_1 = Secret(
deploy_type="env", deploy_target="SECRET_1", secret="ai-controller-object-storage", key="SECRET_1"
)
SECRET_2 = Secret(
deploy_type="env", deploy_target="SECRET_2", secret="ai-controller-object-storage", key="SECRET_2"
)
with DAG(...) as dag:
    KubernetesPodOperator(
        task_id=..,
        trigger_rule="all_success",
        namespace="default",
        image=IMAGE,
        startup_timeout_seconds=600,
        secrets=[SECRET_1, SECRET_2],
        ...)
So, as I understand it, SECRET_1 should now be available as an environment variable inside the container started by KubernetesPodOperator.
However, the first task, a Python script that reads os.environ["SECRET_1"], fails with an error indicating that this environment variable does not exist:
KeyError: 'SECRET_1'
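For reference, this is roughly what that script does (a minimal sketch of a hypothetical entrypoint, with a debug print added to show which variables the pod actually receives):

import os

# SECRET_1 is expected to be injected via the secrets=[...] argument of KubernetesPodOperator
secret_1 = os.environ.get("SECRET_1")

if secret_1 is None:
    # Debug aid: list the environment variables the container actually received
    print(sorted(os.environ.keys()))
    raise KeyError("SECRET_1")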
How can I access this variable from my Python script?