
I am creating an Airflow DAG for my Python job. The job creates JSON files in an S3 bucket with the current date. I am planning to pass the date as an environment variable. Is it possible to do so? What I am trying is below:

recommendations = KubernetesPodOperator(
    namespace='data',
    image = "quay.io/comp/data-rp-recommender:0.1.1",
    name = "data-rp-recommender-run",
    task_id = "data-rp-recommender-task",
    dag = dag,
    get_logs = True,
    in_cluster = True,
    is_delete_operator_pod = False,
    volumes = [
        Volume(
            name = 'etcctek',

        )
    ],
    volume_mounts = [
        VolumeMount(
            name = 'etcctek',
            mount_path = '/opt/docker/etc',
            sub_path = None,
            read_only = True
        )
    ],
    env_vars = {
        'DATE': '',  # want this to be the current date
        'USER_PRECEDING_STATS_S3': 's3a://ml/rp-recommendations/userPreceedingStats/int/DATE/json',
        'NUGGET_CATEGORICALS_S3': 's3a://ml/rp-recommendations/nuggetCategorials/int/DATE/json',
        'PREREQUISITE_STATS_S3': 's3a://ml/rp-recommendations/prerequisiteStats/int/DATE/json',
        'USER_CURRENT_STATS_S3': 's3a://ml/rp-recommendations/userCurrentStats/int/DATE/json'
    },
 
    resources = {
        'request_memory': '2048M',
        'request_cpu': '1.5'
    }
)

In the above example I want to set the `DATE` env var to the current date and then reuse that date in the other variables below it. This is my first time with Airflow and I need some help.

priyanka Dhiman
  • Refer to this: [Access execution date in Airflow](https://stackoverflow.com/questions/36730714/execution-date-in-airflow-need-to-access-as-a-variable) – AnkurSaxena Feb 01 '21 at 18:27

1 Answer


Assuming you want the task's execution start time:

var = '{{ ti.start_date }}'

More details on the task instance model are in the Airflow docs.
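Since `env_vars` is one of `KubernetesPodOperator`'s templated fields, Jinja macros such as `{{ ds }}` (the execution date as `YYYY-MM-DD`) are rendered at runtime, so one macro can be embedded in every path. A minimal sketch of the idea, assuming you want the execution date rather than the start time (`build_env_vars` is a hypothetical helper; the S3 paths are copied from your question):

```python
# Sketch: build env_vars so DATE and every S3 path share one date value.
# In the real DAG you would pass '{{ ds }}' and let Airflow render it,
# because env_vars is a templated field of KubernetesPodOperator.
def build_env_vars(date_template: str = "{{ ds }}") -> dict:
    base = "s3a://ml/rp-recommendations/{name}/int/{date}/json"
    names = {
        "USER_PRECEDING_STATS_S3": "userPreceedingStats",
        "NUGGET_CATEGORICALS_S3": "nuggetCategorials",
        "PREREQUISITE_STATS_S3": "prerequisiteStats",
        "USER_CURRENT_STATS_S3": "userCurrentStats",
    }
    env = {"DATE": date_template}
    env.update({k: base.format(name=v, date=date_template)
                for k, v in names.items()})
    return env

# Outside Airflow you can substitute a concrete date to inspect the result:
print(build_env_vars("2021-02-01")["USER_CURRENT_STATS_S3"])
```

Then pass `env_vars=build_env_vars()` to the operator; Airflow replaces `{{ ds }}` with the run's date when the pod is launched.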

Miguel