I am writing an app in C# to measure and display download speed. I have the following code to download a 62MB file in chunks, which seems to work well for my purposes. I plan to extend this to measure the time required for each chunk, so it can be graphed.

Before doing so, I have a few questions to make sure this is actually doing what I think it is doing. Here is the code:

private void DownloadFile()
{
  string uri = ConfigurationManager.AppSettings["DownloadFile"].ToString();
  HttpWebRequest request = (HttpWebRequest)WebRequest.Create(new Uri(uri));

  int intChunkSize = 1048576; //  1 MB chunks

  using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
  {
    byte[] buffer = new byte[intChunkSize];
    int intStatusCode = (int)response.StatusCode;
    if (intStatusCode >= 200 && intStatusCode <= 299)   //  success
    {
      Stream sourceStream = response.GetResponseStream();
      MemoryStream memStream = new MemoryStream();
      int intBytesRead;
      bool finished = false;
      while (!finished)
      {
        intBytesRead = sourceStream.Read(buffer, 0, intChunkSize);
        if (intBytesRead > 0)
        {
          memStream.Write(buffer, 0, intBytesRead);
          //  gather timing info here
        }
        else
        { 
          finished = true;
        }
      } 
    }   
  } 
}   
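
For the planned per-chunk timing, my rough idea (just a sketch, assuming using System.Diagnostics and System.Collections.Generic are available; chunkTimings and stopwatch are placeholder names, not part of the code above) is to replace the inner read loop with something like this:

//  Sketch only: time each blocking Read call and keep the samples for graphing later.
List<Tuple<int, TimeSpan>> chunkTimings = new List<Tuple<int, TimeSpan>>();
Stopwatch stopwatch = new Stopwatch();

int intBytesRead;
bool finished = false;
while (!finished)
{
  stopwatch.Restart();   //  time only the Read call itself
  intBytesRead = sourceStream.Read(buffer, 0, intChunkSize);
  stopwatch.Stop();

  if (intBytesRead > 0)
  {
    memStream.Write(buffer, 0, intBytesRead);
    chunkTimings.Add(Tuple.Create(intBytesRead, stopwatch.Elapsed));   //  bytes and elapsed time per Read
  }
  else
  {
    finished = true;
  }
}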

The questions:

  1. Does response contain all the data when it is instantiated, or just the header info? response.ContentLength does reflect the correct value.

  2. Even though I am using a 1 MB chunk size, the number of bytes actually read (intBytesRead) in each iteration is much smaller, typically 16384 bytes (16 KB) and sometimes only 1024 (1 KB). Why is this?

  3. Is there any way to force it to actually read 1 MB chunks? (The only workaround I can think of is sketched after this list.)

  4. Does it serve any purpose here to actually write the data to the MemoryStream?
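
For question 3, the workaround I have in mind (again just a sketch, not tested; intOffset and intRead are placeholder names) is to keep calling Read with an offset until the 1 MB buffer is full or the stream ends, since a single Read can come back with fewer bytes than requested, as described in question 2:

//  Sketch only: fill the buffer before recording a timing sample.
int intOffset = 0;
while (intOffset < intChunkSize)
{
  int intRead = sourceStream.Read(buffer, intOffset, intChunkSize - intOffset);
  if (intRead == 0)
  {
    break;   //  end of stream before the buffer was filled
  }
  intOffset += intRead;
}
//  intOffset now holds the size of this chunk (up to 1 MB)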

Thanks.

Dan
