
I have submitted 3 jobs in parallel in AWS Batch, and I want to create a trigger that fires when all 3 of these jobs are completed.

Ideally, I should be able to specify the 3 job IDs and update a DB once all 3 jobs are done.

I could do this easily with long polling, but I want to do it in an event-driven way. I need your help with this.

Saurabh Sharma
  • Why not use a job queue and trigger a lambda function at the end? – Stargazer Mar 02 '20 at 17:37
    Step Functions would be a good option. Another might be to write completion events into DynamoDB (e.g. have one DynamoDB item associated with a given set of 3 jobs) and have each job append its ID to a list attribute on the DynamoDB item when complete, then have DynamoDB Streams invoke a Lambda each time an item is updated. That Lambda would count the number of completed event IDs and, if 3, trigger the next part of your workflow. – jarmod Mar 02 '20 at 17:58
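To illustrate the DynamoDB Streams idea from the comment above, here is a minimal sketch of a Lambda handler that counts the job IDs appended to a `completed` list attribute and signals when all three have reported in. The attribute name, the expected-job count, and the event shape are assumptions for illustration; only the DynamoDB Streams record layout (`dynamodb.NewImage` with typed attributes like `L` and `S`) follows the actual stream format.

```python
# Hypothetical Lambda for the DynamoDB Streams approach: each Batch job
# appends its ID to a 'completed' list attribute on one shared item, and
# this handler fires the next workflow step once all IDs are present.
EXPECTED_JOBS = 3  # assumed: the set of jobs being tracked

def count_completed(record):
    """Count job IDs in the item's 'completed' list attribute (stream format)."""
    image = record.get("dynamodb", {}).get("NewImage", {})
    return len(image.get("completed", {}).get("L", []))

def handler(event, context=None):
    """Invoked by DynamoDB Streams on each item update.

    Returns True when all expected job IDs have been appended, which is
    where you would trigger the next part of the workflow (e.g. update
    your DB).
    """
    for record in event.get("Records", []):
        if count_completed(record) >= EXPECTED_JOBS:
            return True
    return False
```

The Lambda itself stays idempotent: it only inspects the item's state, so duplicate stream deliveries do not cause double-triggering as long as the downstream action is guarded.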

1 Answer


The easiest option would be to create a fourth Batch job that specifies the other three jobs as dependencies. This job will sit in the PENDING state until the other three jobs have succeeded, and then it will run. Inside that job, you could update the DB or do whatever other actions you wanted.

One downside to this approach is that if one of the jobs fails, the pending job will automatically go into a FAILED state without running.
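A minimal sketch of the dependent fourth job, assuming the boto3 SDK: AWS Batch's `submit_job` accepts a `dependsOn` list of job IDs, and the job stays `PENDING` until every listed job succeeds. The job name, queue, and job definition below are placeholders, not values from the question.

```python
def build_trigger_job(job_ids,
                      job_queue="my-queue",          # placeholder queue name
                      job_definition="my-finalizer"): # placeholder job definition
    """Build submit_job parameters for a fourth Batch job that runs only
    after the given jobs succeed."""
    return {
        "jobName": "after-all-three",
        "jobQueue": job_queue,
        "jobDefinition": job_definition,
        # The job remains PENDING until every dependency succeeds; if any
        # dependency fails, this job moves to FAILED without running.
        "dependsOn": [{"jobId": jid} for jid in job_ids],
    }

# With boto3 installed, you would submit it like this:
#   boto3.client("batch").submit_job(**build_trigger_job(["id-1", "id-2", "id-3"]))
```

The container for this finalizer job then performs the DB update (or whatever completion action you need) as its only task.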

Nathan Collins