We have a DAG that used to work well. We later automated the process so that the same DAG file is triggered from different threads 4-8 times almost simultaneously, and Airflow now throws an error saying the run is a duplicate. How can we make each run unique so that all the DAG triggers succeed?
-
Why don't you just change the dag id and have 4 copies of it running simultaneously? – Jacob Celestine Dec 13 '21 at 05:09
-
We can't, as we read from a standard configuration. We resolved it by using a unique run_id. – Sujoy Dec 14 '21 at 14:09
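
As the last comment hints, the duplicate error comes from concurrent triggers reusing the same run_id, so the fix is to generate a distinct run_id per trigger. A minimal sketch (the `make_unique_run_id` helper is an assumption, not Airflow API; the CLI usage assumes Airflow 2.x, where `airflow dags trigger` accepts `--run-id`):

```python
import uuid
from datetime import datetime, timezone

def make_unique_run_id(prefix: str = "manual") -> str:
    """Build a run_id that stays unique even for near-simultaneous triggers.

    A timestamp alone can collide when several threads fire in the same
    second, so a random UUID suffix is appended as a tiebreaker.
    """
    ts = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%S.%f")
    return f"{prefix}__{ts}__{uuid.uuid4().hex[:8]}"

if __name__ == "__main__":
    # Each trigger gets its own id, e.g. manual__20211214T140900.123456__a1b2c3d4
    for _ in range(4):
        print(make_unique_run_id())
```

Each thread would then pass its own id when triggering, for example via the CLI: `airflow dags trigger my_dag --run-id "$(python make_run_id.py)"` (the script name here is hypothetical).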