I have the following algorithm that writes data to Azure blob storage:
private const long MaxChunkSize = 1024 * 1024 * 4; // 4 MB per chunk

private void UploadPagedDataToBlob(...)
{
    ...
    var tasks = new List<Task>();
    do
    {
        // Wrap the current chunk without copying the underlying buffer.
        var stream = new MemoryStream(data, index, (int)blockSize);
        // The upload starts immediately; the task is only awaited at the end.
        var task = _blob.WritePagesAsync(stream, startPosition, null);
        tasks.Add(task);
        ...
    }
    while (remainingDataLength > 0);
    Task.WaitAll(tasks.ToArray());
}
If my file is 628 MB, then tasks ends up holding 157 tasks (628 MB / MaxChunkSize). Usually my files are larger than 1 TB, so I don't want that many tasks running at once. How can I make this algorithm more efficient? What is the optimal number of concurrently running tasks? For example, no more than 200? Any recommendations?