I wrote a basic streaming pipeline in Go that runs on Google Dataflow.
Basically it transforms Pub/Sub events into Elasticsearch documents and then updates those documents in bulk.
I need to find a way to limit the number of Bulk requests per second, because when my Pub/Sub subscription has accumulated a lot of messages and my Dataflow streaming job tries to "catch up", it's literally killing my Elasticsearch cluster.
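The only idea I have so far is a per-worker token bucket from `golang.org/x/time/rate` wrapped around the bulk call. Here's a minimal sketch of what I mean (`sendBulk` and `flushBatch` are hypothetical stand-ins for my actual bulk step, and the rate numbers are placeholders):

```go
package esbulk

import (
	"context"

	"golang.org/x/time/rate"
)

// Process-wide token bucket: at most 5 Bulk requests per second, burst of 1.
// (Placeholder numbers; they would need tuning for the cluster.)
var bulkLimiter = rate.NewLimiter(rate.Limit(5), 1)

// sendBulk is a stand-in for the real Elasticsearch _bulk call.
func sendBulk(ctx context.Context, docs [][]byte) error {
	// ... build and send the _bulk request here ...
	return nil
}

// flushBatch blocks until the limiter grants a token, then sends the batch,
// so a single worker never exceeds the configured request rate.
func flushBatch(ctx context.Context, docs [][]byte) error {
	if err := bulkLimiter.Wait(ctx); err != nil {
		return err
	}
	return sendBulk(ctx, docs)
}
```

The catch is that Dataflow autoscales, and each worker would get its own limiter, so the effective cluster-wide rate is the per-worker limit times the number of workers.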
How would you do it?