
I've got code for a test with BenchmarkDotNet.

Then I found something that felt wrong.

My project versions are:

C# 11.0, .NET 7.0, NuGet package BenchmarkDotNet 0.13.5

    ReadOnlyMemory<float> readOnlyFloats;
    ReadOnlyMemory<double> readOnlyDoubles;

    [GlobalSetup]
    public void Setup()
    {
        var rand = new Random();
        var floats = Enumerable.Range(0, ArrayLength).Select(a => rand.NextSingle()).ToArray();
        var doubles = Enumerable.Range(0, ArrayLength).Select(a => rand.NextDouble()).ToArray();

        readOnlyFloats = floats;
        readOnlyDoubles = doubles;
    }

    [Benchmark]
    public double Calc_WaveStatistic_Sum()
    {
        return readOnlyFloats.Span.Sum();
    }

    [Benchmark]
    public double Calc_WaveStatistic_Sum_Double()
    {
        return readOnlyDoubles.Span.Sum();
    }

    [Benchmark]
    public double Calc_WaveStatistic_Avg()
    {
        return readOnlyFloats.Span.Avg();
    }

    [Benchmark]
    public double Calc_WaveStatistic_Avg_Double()
    {
        return readOnlyDoubles.Span.Avg();
    }

and the extension methods are:

public static double Sum(this ReadOnlySpan<float> values)
{
    double sum = 0;
    for (int i = 0; i < values.Length; i++)
        sum += values[i];
    return sum;
}

public static double Sum(this ReadOnlySpan<double> values)
{
    double sum = 0;
    for (int i = 0; i < values.Length; i++)
        sum += values[i];
    return sum;
}

public static double Avg(this ReadOnlySpan<float> values)
{
    return values.Sum() / values.Length;
}
    
public static double Avg(this ReadOnlySpan<double> values)
{    
    return values.Sum() / values.Length;
}
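These extensions mirror LINQ's Enumerable.Sum and Enumerable.Average but work directly on spans without allocating. As a quick sanity check, they can be compared against LINQ in a standalone program (the Program harness below is my own sketch; only the Sum/Avg bodies come from the question):

```csharp
using System;
using System.Linq;

static class SpanStats
{
    // Same bodies as in the question: accumulate floats in a double.
    public static double Sum(this ReadOnlySpan<float> values)
    {
        double sum = 0;
        for (int i = 0; i < values.Length; i++)
            sum += values[i];
        return sum;
    }

    public static double Avg(this ReadOnlySpan<float> values)
    {
        return values.Sum() / values.Length;
    }
}

class Program
{
    static void Main()
    {
        // 0..999 as floats: every value and partial sum is exactly
        // representable, so the results should match LINQ exactly.
        var floats = Enumerable.Range(0, 1000).Select(i => (float)i).ToArray();
        ReadOnlySpan<float> span = floats;

        Console.WriteLine(span.Sum()); // 499500
        Console.WriteLine(span.Avg()); // 499.5
        Console.WriteLine(floats.Select(f => (double)f).Sum()); // 499500
    }
}
```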

For this code, I expected the Sum() method to be faster than Avg(), but the result shows the opposite:

[benchmark results screenshot]

Avg() executes an extra calculation after Sum() (a division by the length).

So is my result right?

If it's right, why is it faster?

Palle Due
Ramga
1 Answer


It is not.

First, you're running a very small microbenchmark, so measurement error in the benchmarking harness is a non-negligible factor here. In the case of the float version, the (mean) measurement error reported by BenchmarkDotNet is 7.67 ns for Avg and 4.62 ns for Sum. The error alone essentially explains the difference in means (7.67 ns + 4.62 ns = 12.29 ns, and the difference in means is 12.4 ns).

In the case of double, the error reduces the mean difference to 43.27 ns. That appears more significant, but remember that performance analysis is, by necessity, statistical analysis. The standard deviation reported for Sum in this table is huge: 35.68 ns. In statistical terms, the result is too noisy to be considered significant – the difference is almost within one standard deviation. There's a good chance many of the smaller measurements are simply flukes.
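The arithmetic behind the error argument can be checked directly (the numbers below are the ones quoted from the BenchmarkDotNet table above; the program itself is just a sketch):

```csharp
using System;

class ErrorCheck
{
    static void Main()
    {
        // Mean measurement errors reported by BenchmarkDotNet (in ns).
        double avgError = 7.67;
        double sumError = 4.62;

        // Observed difference between the mean times of Avg and Sum (in ns).
        double meanDifference = 12.4;

        // The combined error band is about as wide as the observed gap,
        // so the gap carries no statistical significance.
        double combinedError = avgError + sumError;
        Console.WriteLine(combinedError);                       // ~12.29
        Console.WriteLine(combinedError >= meanDifference * 0.9); // True
    }
}
```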

To eliminate those errors, simply increase ArrayLength. You did not share its value, but I assume it is very small. Using an array that's 100x larger will push your results into the millisecond range, where deviations of a few nanoseconds won't heavily influence the results.
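One way to sweep array sizes is BenchmarkDotNet's standard [Params] attribute, which runs every benchmark once per listed value (a fragment, not a complete program; the class name here is hypothetical):

```csharp
using BenchmarkDotNet.Attributes;

public class WaveStatisticBenchmarks
{
    // Benchmark each method at several sizes; the larger sizes push
    // run times far above the harness's nanosecond-scale error.
    [Params(1_000, 100_000, 10_000_000)]
    public int ArrayLength;

    // ... [GlobalSetup] and [Benchmark] methods as in the question ...
}
```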

V0ldek