I am creating an Airflow DAG for my Python job. The job creates JSON files in an S3 bucket under the current date. I am planning to pass the date as an environment variable. Is it possible to do so? What I am trying is below:
recommendations = KubernetesPodOperator(
    namespace='data',
    image='quay.io/comp/data-rp-recommender:0.1.1',
    name='data-rp-recommender-run',
    task_id='data-rp-recommender-task',
    dag=dag,
    get_logs=True,
    in_cluster=True,
    is_delete_operator_pod=False,
    volumes=[
        Volume(
            name='etcctek',
        )
    ],
    volume_mounts=[
        VolumeMount(
            name='etcctek',
            mount_path='/opt/docker/etc',
            sub_path=None,
            read_only=True
        )
    ],
    env_vars={
        'DATE': ...,  # <- want this to be the current date
        'USER_PRECEDING_STATS_S3': 's3a://ml/rp-recommendations/userPreceedingStats/int/DATE/json',
        'NUGGET_CATEGORICALS_S3': 's3a://ml/rp-recommendations/nuggetCategorials/int/DATE/json',
        'PREREQUISITE_STATS_S3': 's3a://ml/rp-recommendations/prerequisiteStats/int/DATE/json',
        'USER_CURRENT_STATS_S3': 's3a://ml/rp-recommendations/userCurrentStats/int/DATE/json'
    },
    resources={
        'request_memory': '2048M',
        'request_cpu': '1.5'
    }
)
In the above example I want to set the DATE env var to the current date and then reuse that date inside the other variables below it. It's my first time with Airflow and I need some help.
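For context, here is a sketch of what I'm imagining, assuming env_vars is a templated field on KubernetesPodOperator so that an Airflow macro like {{ ds }} gets rendered to the execution date at run time (the build_env_vars helper is just my own illustration, not an existing API):

```python
def build_env_vars(date_token):
    """Build the env_vars dict, substituting date_token into each S3 path.

    date_token can be a literal date string or an Airflow macro such as
    '{{ ds }}' that the operator would render before the pod starts.
    """
    base = 's3a://ml/rp-recommendations/{name}/int/{date}/json'
    return {
        'DATE': date_token,
        'USER_PRECEDING_STATS_S3': base.format(name='userPreceedingStats', date=date_token),
        'NUGGET_CATEGORICALS_S3': base.format(name='nuggetCategorials', date=date_token),
        'PREREQUISITE_STATS_S3': base.format(name='prerequisiteStats', date=date_token),
        'USER_CURRENT_STATS_S3': base.format(name='userCurrentStats', date=date_token),
    }

# Pass the macro string so (if env_vars really is templated) rendering
# happens at task run time rather than at DAG parse time:
env_vars = build_env_vars('{{ ds }}')
```

I deliberately avoided something like datetime.today() at the top of the DAG file, since as far as I understand that would be evaluated when the DAG is parsed, not when the task runs. Is the templating approach above the right way to do this?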