I've created a pipeline that is executed by a trigger every time a blob is created. The problem is that there are scenarios where the process needs to upload multiple files at the same time; when that happens, the pipeline executes as many times as there are blobs, and the resulting data is wrong. I tried configuring a Copy Data activity in the main pipeline to copy every blob that was created, but since that activity lives inside the triggered pipeline, it also executes once per blob.
2 Answers
What you can do is filter the copy activity source based on the property Filter by last modified, where you can specify a start time and an end time in UTC.
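As a rough illustration only (not your exact setup), the source section of a copy activity over Azure Blob Storage with that filter might look like the following, assuming a delimited text dataset; the wildcard and timestamps are placeholders:

```json
{
    "type": "DelimitedTextSource",
    "storeSettings": {
        "type": "AzureBlobStorageReadSettings",
        "recursive": true,
        "wildcardFileName": "*",
        "modifiedDatetimeStart": "2021-06-01T00:00:00Z",
        "modifiedDatetimeEnd": "2021-06-01T01:00:00Z"
    }
}
```

modifiedDatetimeStart and modifiedDatetimeEnd are what the Filter by last modified fields in the UI map to.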
You can try this tutorial: Incrementally copy new and changed files based on LastModifiedDate by using the Copy Data tool.
OR...
For your scenario, you only need to set the Start time:
- This start time is simply the last time a triggered pipeline run was executed. You can get the triggered pipeline run details with a REST API call, Trigger Runs - Query By Factory (see the sketch after this list).
- Query the runs executed in the last x hours or, to be safe, in the last day, depending on how frequently files are created in Storage.
- From the result, collect only the triggerRunTimestamp values and append them to an array variable.
- Find the max (most recent) run time using functions, and set that time as the StartTime in UTC for the copy activity source filter explained at the start.
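To make the REST call concrete, here is a rough sketch of the Trigger Runs - Query By Factory request a Web activity could make; the subscription, resource group, and factory names are placeholders, and the time window is just an example:

```
POST https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DataFactory/factories/{factoryName}/queryTriggerRuns?api-version=2018-06-01

{
    "lastUpdatedAfter": "2021-06-01T00:00:00Z",
    "lastUpdatedBefore": "2021-06-02T00:00:00Z"
}
```

The response contains a value array of trigger runs, and each item includes the triggerRunTimestamp you would append to the array variable.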
If this is feasible, I can spin up an example pipeline.

KarthikBhyresh-MT
Any reason why you are mapping your event trigger to the original source path where all the files are being created and uploaded? Can't you create a dummy blob path, and drop a dummy file there at the end, so that a single final trigger fires once all files are uploaded? That would overcome this issue.
Note: this is how we manage it :) but unfortunately a redundant file is generated.
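If you go this route, a minimal sketch of a storage event trigger scoped to the dummy marker blob could look like the following; the container, folder, marker file name, scope, and pipeline name are all placeholders to adapt to your setup:

```json
{
    "name": "TriggerOnUploadComplete",
    "properties": {
        "type": "BlobEventsTrigger",
        "typeProperties": {
            "blobPathBeginsWith": "/mycontainer/blobs/uploads/",
            "blobPathEndsWith": "upload_complete.txt",
            "ignoreEmptyBlobs": true,
            "events": ["Microsoft.Storage.BlobCreated"],
            "scope": "/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Storage/storageAccounts/{storageAccountName}"
        },
        "pipelines": [
            {
                "pipelineReference": {
                    "referenceName": "CopyAllUploadedFiles",
                    "type": "PipelineReference"
                }
            }
        ]
    }
}
```

The upload process writes the marker file as its very last step, so only that single blob creation fires the pipeline, which can then copy everything else in the folder.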

Nandan