My application pre-fetches a large number of video frames using asynchronous HttpWebRequest calls. If there are 100 frames, the prefetcher requests all 100 at once and processes each one as its response arrives; in other words, it makes 100 asynchronous calls simultaneously. This can saturate the network card, but that is fine: I want to maximize network bandwidth.
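For context, the prefetch currently looks roughly like the sketch below (the frame URL scheme and the response handling are simplified placeholders, not my real code):

```csharp
using System;
using System.Net;

class Prefetcher
{
    // Fire off one asynchronous request per frame, all at once.
    public void PrefetchAll(int frameCount)
    {
        for (int i = 1; i <= frameCount; i++)
        {
            var request = (HttpWebRequest)WebRequest.Create(
                "http://server/frames/" + i);        // hypothetical URL scheme
            request.BeginGetResponse(OnFrameResponse, request);
        }
    }

    // Completion callback: read the frame bytes and hand them off.
    private void OnFrameResponse(IAsyncResult ar)
    {
        var request = (HttpWebRequest)ar.AsyncState;
        using (var response = (HttpWebResponse)request.EndGetResponse(ar))
        using (var stream = response.GetResponseStream())
        {
            // ... decode/cache the frame here ...
        }
    }
}
```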
However, while this prefetch is happening, the user may want to view one of the frames, say frame 56. The problem is that frames 1-100 have already been requested and are in the pipe, so the request for frame 56 may take a long time to get a response.
What would be nice is some way of re-prioritizing the async requests after they have been made, so that the user's request can be pushed to the front of the queue.
If I can't do this, I suppose I will have to request the frames in batches, so that I can slip the user's request in between batches and avoid it timing out.
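One way I picture that idea (really a small worker pool with a priority queue rather than fixed batches) is sketched below. A handful of worker threads pull frame numbers from two queues and always serve the user's queue first; DownloadFrame, the worker count, and the frame numbering are placeholders:

```csharp
using System.Collections.Generic;
using System.Threading;

class FrameScheduler
{
    private readonly Queue<int> _prefetchQueue = new Queue<int>();
    private readonly Queue<int> _userQueue = new Queue<int>();
    private readonly object _lock = new object();

    public FrameScheduler(IEnumerable<int> frames, int workers)
    {
        foreach (var f in frames)
            _prefetchQueue.Enqueue(f);

        // Each worker downloads one frame at a time, so at most
        // 'workers' requests are ever in flight.
        for (int i = 0; i < workers; i++)
            new Thread(Worker) { IsBackground = true }.Start();
    }

    // Called when the user seeks: this frame jumps ahead of the prefetch.
    public void RequestNow(int frame)
    {
        lock (_lock)
            _userQueue.Enqueue(frame);
    }

    private void Worker()
    {
        while (true)
        {
            int frame = -1;
            lock (_lock)
            {
                if (_userQueue.Count > 0)
                    frame = _userQueue.Dequeue();
                else if (_prefetchQueue.Count > 0)
                    frame = _prefetchQueue.Dequeue();
            }

            if (frame < 0)
            {
                Thread.Sleep(50);   // nothing queued; poll again shortly
                continue;
            }

            DownloadFrame(frame);   // placeholder: one HttpWebRequest per call
        }
    }

    private void DownloadFrame(int frame)
    {
        // ... issue one HttpWebRequest per call and process the frame ...
    }
}
```

The worker count would be the knob here: more workers keeps the network busier, but the user's frame can still only jump ahead of the requests already in flight.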
Any ideas on how to design this properly would be very much appreciated.