
If I use the ILMath.min() function with just two 1x1 double matrices (other data types not checked), it returns the maximum value instead of the minimum. Everything works fine if the matrices are larger than 1x1. Please confirm that the following code returns 1.0 and not 0.0 as expected. To me this looks like a bug, or is it a feature?

Console.WriteLine(ILMath.min(0.0,1.0));
Console.ReadKey();

Thanks in advance.

  • See CodeCaster's answer below. It confirms that this is a bug and shows how to fix/work around it. Alternatively, one can simply use Math.Min(0.0, 1.0) if working with system scalar values only (a short sketch follows below). – Haymo Kutschbach Jan 25 '14 at 13:19
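
A minimal sketch of that Math.Min alternative, assuming both operands are plain System.Double values rather than ILNumerics arrays (same snippet style as in the question):

// Math.Min from the BCL compares plain doubles and is not affected by the ILMath.min scalar issue
Console.WriteLine(Math.Min(0.0, 1.0)); // prints 0
// the call from the question goes through the buggy scalar path in ILMath.min
Console.WriteLine(ILMath.min(0.0, 1.0));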

2 Answers


I vote for a bug. There is a bug tracker at http://ilnumerics.net/mantis. You may consider filing an issue there.


This code, at line 4295 of Functions\BuiltIn\min.cs, is only used for scalar inputs (i.e. inputs with exactly one element):

return array<double>(
    new double[1] 
    {
        (A.GetValue(0) > B.GetValue(0)) ? A.GetValue(0) : B.GetValue(0) 
    }
);

That seems to return max, not min. Change > to < and it should work, but I can't find any relevant test cases in their download, so I don't know what this will break.
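
For reference, a sketch of the scalar branch with the comparison flipped as suggested above; everything besides the operator is kept as in the original source:

return array<double>(
    new double[1]
    {
        // pick the smaller of the two scalar values, as min should
        (A.GetValue(0) < B.GetValue(0)) ? A.GetValue(0) : B.GetValue(0)
    }
);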

CodeCaster
  • I've filed the bug in the bug tracker: http://ilnumerics.net/mantis/view.php?id=185. It will be fixed in the next version. The workaround proposed above can be used until the fix is released. – Haymo Kutschbach Jan 25 '14 at 12:38