
In my download manager application, I'm using the code below to calculate the current transfer rate:

        TimeSpan interval = DateTime.Now - lastUpdateTime;

        downloadSpeed = (int)Math.Floor((double)(DownloadedSize + cachedSize - lastUpdateDownloadedSize) / interval.TotalSeconds);

        lastUpdateDownloadedSize = DownloadedSize + cachedSize;
        lastUpdateTime = DateTime.Now;

This mostly works the way I want (I'm updating the speed every 4 seconds or so), but the reported rate always has some crazy spikes. My average download speed is around 600 kB/s, yet sometimes it shows 10.25 MB/s or even negative values like -2093848 B/s. How could this be?

What is the best way to calculate real-time download rate? I'm not interested in the average rate (DownloadedSize / TimeElapsed.TotalSeconds), because it doesn't give realistic results.

marko
  • You should use DateTime.UtcNow instead of DateTime.Now, it doesn't change twice a year. – spender May 08 '12 at 23:22
  • Doesn't your calculation effectively cancel out to `DownloadedSize/interval.TotalSeconds`? – spender May 08 '12 at 23:24
  • Lookup instantaneous rate of change in a calculus book or online. – P.Brian.Mackey May 08 '12 at 23:25
  • You don't show us enough code to say why you might be getting a negative number. How is `DownloadSize` calculated? Is `cachedSize` zeroed after being used here? As @spender mentioned, you should change the way you calculate elapsed time, although `DateTime.UtcNow` isn't a much better choice than `DateTime.Now`. See http://blog.mischel.com/2012/05/08/dont-depend-on-what-you-dont-control/ for the reason why. – Jim Mischel May 08 '12 at 23:32
  • @spender - No, it doesn't cancel out like that. Jim Mischel - DownloadedSize is incremented with CachedSize when the download cache is written to the local file. The maximum cache size is 1MB. I think there is no problem in that area. And thanks for the suggestion, I'll consider using a Stopwatch instead of System Time based intervals, although calculating the rate every second instead of every few milliseconds seems to be the cure for those spikes. – marko May 09 '12 at 12:52
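
A minimal sketch of what the comments above suggest, measuring elapsed time with a Stopwatch instead of DateTime.Now, might look like the following. The class and method names are illustrative; only the field names mirror the question's code:

    // Illustrative sketch only: a Stopwatch gives a monotonic elapsed time,
    // so the interval can never go negative when the system clock changes.
    using System;
    using System.Diagnostics;

    class SpeedSampler
    {
        private readonly Stopwatch stopwatch = Stopwatch.StartNew();
        private TimeSpan lastUpdateElapsed = TimeSpan.Zero;
        private long lastUpdateDownloadedSize = 0;

        // Call roughly once per second with the current total byte count.
        public int Sample(long downloadedSize, long cachedSize)
        {
            TimeSpan elapsed = stopwatch.Elapsed;
            double seconds = (elapsed - lastUpdateElapsed).TotalSeconds;
            long sizeDiff = downloadedSize + cachedSize - lastUpdateDownloadedSize;

            lastUpdateElapsed = elapsed;
            lastUpdateDownloadedSize = downloadedSize + cachedSize;

            // Guard against a zero-length interval on back-to-back calls.
            return seconds > 0 ? (int)Math.Floor(sizeDiff / seconds) : 0;
        }
    }

Stopwatch.Elapsed is backed by a high-resolution monotonic counter, so it is not affected by DST or clock adjustments the way DateTime.Now is.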

1 Answer


Given that true "real-time" measurement is unachievable, you should approximate it by making the interval as small and precise as possible, calculating the average over that interval, and sanity-checking the values in the code. For instance:

    DateTime now = DateTime.Now;              // read the clock once
    TimeSpan interval = now - lastUpdateTime;
    double timeDiff = interval.TotalSeconds;
    long sizeDiff = DownloadedSize + cachedSize - lastUpdateDownloadedSize;
    int speed = (int)Math.Floor(sizeDiff / timeDiff);
    lastUpdateDownloadedSize = DownloadedSize + cachedSize;
    lastUpdateTime = now;                     // reuse the same reading

One difference from your code:

  1. Only calculate DateTime.Now once, and use that value for both the interval and lastUpdateTime.
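
The answer mentions "checking for sanity in the code" but the snippet above doesn't show it. A hedged sketch of what such a guard could look like, reusing the fields from the question; the half-second minimum interval is an illustrative assumption:

    // Sketch only: skip samples that are too close together or inconsistent,
    // instead of reporting a spiky or negative rate.
    private void UpdateSpeed()
    {
        DateTime now = DateTime.Now;
        double timeDiff = (now - lastUpdateTime).TotalSeconds;
        long sizeDiff = DownloadedSize + cachedSize - lastUpdateDownloadedSize;

        if (timeDiff < 0.5 || sizeDiff < 0)
            return; // keep the previous speed and wait for the next update

        downloadSpeed = (int)Math.Floor(sizeDiff / timeDiff);
        lastUpdateDownloadedSize = DownloadedSize + cachedSize;
        lastUpdateTime = now;
    }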
rewritten
    I don't agree with your reasoning for using TotalMilliseconds over TotalSeconds. Both are represented by double precision floating point numbers. There won't be any loss of precision preferring one over the other... you're simply adding more calculation. – spender May 08 '12 at 23:51
  • Thanks saverio, I think there was some problem with the time interval, because this speed calculation was made every few milliseconds. Now I'm calculating this every second or so and there are no huge spikes in the rate. – marko May 09 '12 at 12:44
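
As marko notes, sampling about once per second rather than every few milliseconds removes the spikes. A minimal sketch of that wiring with System.Timers.Timer, assuming an UpdateSpeed method like the one sketched under the answer:

    // Illustrative wiring only; UpdateSpeed is the assumed sampling method.
    var timer = new System.Timers.Timer(1000);       // fire roughly once per second
    timer.Elapsed += (sender, e) => UpdateSpeed();   // recalculate the rate on each tick
    timer.AutoReset = true;                          // keep firing until the download finishes
    timer.Start();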