Context: a client is sending a file to a server, and the file has been split into chunks.
In a single-server setup, one option is to store the chunks on the server; when new chunks arrive, they get appended to the existing data.
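A minimal sketch of the single-server approach, assuming chunks arrive in order and each upload is identified by an `upload_id` I've introduced for illustration:

```python
import os

def append_chunk(upload_id: str, chunk: bytes, storage_dir: str) -> None:
    """Append an incoming chunk to the file being assembled for this upload.

    Assumes chunks arrive in order; out-of-order delivery would need
    per-chunk offsets or indices instead of blind appending.
    """
    os.makedirs(storage_dir, exist_ok=True)
    path = os.path.join(storage_dir, upload_id)
    with open(path, "ab") as f:  # append mode: new chunks extend existing data
        f.write(chunk)
```

This only works because the one server holds all earlier chunks on its local disk, which is exactly the assumption that breaks with multiple servers.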
With multiple servers ("microservices"), one request may be sent to server A and the next to server B. So when B goes to work with a new chunk, it fails to find the earlier chunks, which live on A.
What are the best practices for handling this pattern? What I have so far is:
A) Route all requests for a given file to the same server, OR
B) Store the chunks of the file on a shared service
The problem with A) is that it starts to defeat some of the load-balancing advantages of having multiple servers.
The problem with B) is that it seems to require a lot more file transfer back and forth.
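The extra transfer in option B can be reduced if each server only ever *writes* its chunk to the shared store and never reads the earlier ones; assembly happens once, after the final chunk. A sketch, with an in-memory dict standing in for the shared service (in practice this would be Redis, an object store, etc.):

```python
# In-memory dict stands in for a shared store keyed by (file_id, chunk_index).
shared_store: dict[tuple[str, int], bytes] = {}

def put_chunk(file_id: str, index: int, chunk: bytes) -> None:
    # Any server can handle any chunk: it only writes, never fetches old chunks.
    shared_store[(file_id, index)] = chunk

def assemble(file_id: str, total_chunks: int) -> bytes:
    # Done once, by whichever server receives the final chunk.
    return b"".join(shared_store[(file_id, i)] for i in range(total_chunks))
```

With this layout each chunk crosses the network to the store exactly once, so the back-and-forth is limited to the final assembly step.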
Is there a canonical / standard way to handle this?