My kid asked me a funny question yesterday:
Dad, does a computer have trouble adding / multiplying large numbers like I do? Does it take longer?
I laughed and answered that of course not: computers are equally fast with any numbers; they're that smart.
Later I started thinking about it and asked myself... am I actually right? I tested a few scenarios with doubles and integers and, indeed, the magnitude of the number doesn't seem to have any impact on the time the CPU takes to perform an operation (yeah, I'm bored).
The highly complicated test implementation was the following:
using System;
using System.Diagnostics;

class Program
{
    static void Main(string[] args)
    {
        Test(1, 0); // warm up: JIT-compile the test method
        var elapsed = Test(int.MaxValue, 0);
        Console.WriteLine("Testing with 0: {0} ms", elapsed);
        elapsed = Test(int.MaxValue, 1);
        Console.WriteLine("Testing with 1: {0} ms", elapsed);
        elapsed = Test(int.MaxValue, 1000000);
        Console.WriteLine("Testing with 1E6: {0} ms", elapsed);
        elapsed = Test(int.MaxValue, long.MaxValue / 2);
        Console.WriteLine("Testing with MaxValue/2: {0} ms", elapsed);
        Console.ReadKey();
    }

    private static long Test(int repetitions, long testedValue)
    {
        var stopwatch = new Stopwatch();
        stopwatch.Start();
        for (int i = 0; i < repetitions; ++i)
        {
            var dummy = testedValue + testedValue;
        }
        stopwatch.Stop();
        return stopwatch.ElapsedMilliseconds;
    }
}
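To be fair, the dummy result is never used, so the JIT could optimize the addition away entirely and the loop might then only be measuring loop overhead. Here is a minimal variant of the same idea (TestAccumulating is just my own name for it) that folds the sum into a checksum and prints it, so the addition has an observable effect and can't be treated as dead code:

// Variant of Test that keeps the result alive: the sum is folded into a
// checksum which is printed afterwards, so the addition cannot be
// discarded as dead code.
private static long TestAccumulating(int repetitions, long testedValue)
{
    var stopwatch = Stopwatch.StartNew();
    long checksum = 0;
    for (int i = 0; i < repetitions; ++i)
    {
        // overflow is expected and harmless here; we only care about timing
        checksum = unchecked(checksum + testedValue);
    }
    stopwatch.Stop();
    Console.WriteLine("checksum: {0}", checksum); // observable side effect
    return stopwatch.ElapsedMilliseconds;
}

Printing the checksum is just the simplest way to give the loop a side effect without pulling in a benchmarking library; it's called exactly like Test above.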
Still, the question keeps nagging at me. I'm no real expert on how arithmetic is actually performed in a modern CPU, so I'm quite interested in knowing why there is no difference.