I am new to dbt and have previously been using Airflow for data transformations. In Airflow there is a template variable {{ ds }}, which represents the logical date in the form YYYY-MM-DD, and {{ ds_nodash }}, which represents the same date in the form YYYYMMDD. I can then set up a task similar to this:
my_task = BigQueryOperator(
    task_id='t_my_task',
    sql=""" SELECT * FROM my_table WHERE my_date = "{{ ds }}" """,
    destination_dataset_table='my_project.my_dataset.my_table_new${{ ds_nodash }}',
    write_disposition='WRITE_TRUNCATE',
    dag=dag,
)
This runs the SQL query given in the sql argument and overwrites the destination table named in destination_dataset_table; the ${{ ds_nodash }} suffix targets a single date partition of that table. In the Airflow interface, if I rerun just the day 2022-01-11, it will automatically overwrite the partition for that date.
I am trying to figure out how to do the same in DBT.
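From the dbt docs, I believe the closest equivalent on BigQuery is an incremental model with the insert_overwrite strategy, but I am not sure this is right. Here is a minimal sketch of what I think it would look like; the model name, the source names, the my_date column, and the run_date variable are placeholders carried over from my example above, not anything dbt provides:

```sql
-- models/my_table_new.sql
{{
  config(
    materialized='incremental',
    incremental_strategy='insert_overwrite',
    partition_by={'field': 'my_date', 'data_type': 'date'}
  )
}}

select *
from {{ source('my_dataset', 'my_table') }}

{% if is_incremental() %}
  -- dbt has no built-in {{ ds }}; a variable passed at invocation
  -- time would have to stand in for Airflow's logical date
  where my_date = date('{{ var("run_date") }}')
{% endif %}
```

I understand I would then invoke it with something like dbt run --select my_table_new --vars '{run_date: 2022-01-11}', but I cannot tell whether that replays a single partition the same way clearing and rerunning a task does in Airflow.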