I have an 8 GB file on a server, and I want to download 1.5 GB of it using HTTP multi-range requests via curl.
All requests cover ranges distributed uniformly across the file, except the first one, which contains a single large 500 MB range (there are 161 requests in total).
I discovered that the download time for this first request with the big range is ~40 s, while the total time is ~560 s. That means I download 500 MB in 40 seconds, but the remaining 1 GB takes 520 seconds, so I get a roughly 6x slowdown for the uniformly distributed requests. I also noticed that the download rate drops by a factor of ~6-8 while these uniformly distributed requests are being performed.
I don't understand why this happens. The ranges in each request are sorted by increasing offset, so I don't see why there should be such a slowdown. Could you explain what can cause it? And moreover, how can I improve performance for such sets of requests?
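To make the request structure concrete, here is a small sketch (in Python; the helper name and the example offsets are hypothetical, not from my actual request set) of how I build the `Range` header value for each multi-range request, with the ranges sorted by increasing offset as described above:

```python
def build_range_header(ranges):
    """Build an HTTP Range header value from (offset, length) pairs.

    The ranges are sorted by increasing offset, as in the requests
    described above. HTTP byte ranges are inclusive on both ends,
    hence the end position is offset + length - 1.
    """
    parts = sorted(ranges)  # sort by starting offset
    specs = ",".join(f"{off}-{off + length - 1}" for off, length in parts)
    return f"bytes={specs}"

# Hypothetical example: two 1 KiB ranges, one at offset 0 and one at 1 MiB.
print(build_range_header([(1048576, 1024), (0, 1024)]))
# bytes=0-1023,1048576-1049599
```

The resulting value is passed to curl as the range specification (the part after `bytes=` with `curl -r`, or the whole header via `-H "Range: ..."`), and the server answers with a `multipart/byteranges` response.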
I can provide the full set of requests and timings if needed.