
If I write a function in F# like so:

let max x y =
    if x > y then x
    else y

F# treats x and y as generics constrained to be comparable (the inferred signature carries an 'a : comparison constraint), so I can call it with ints, and later with floats, no problem.
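For instance, both of these calls compile against the same definition (a quick sketch; the binding names are just illustrative):

    let biggerInt = max 3 5        // 'a resolved to int here
    let biggerFloat = max 2.0 7.5  // ...and to float here, in the same file

But if I use an operator like addition: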

let add x y =
    x + y

F# initially assumes x and y are ints (if the function is never called, it defaults to int -> int -> int). If I call it first with ints, it can only be used with ints from then on; if I call it first with floats, type inference fixes it to floats instead. Why doesn't F# treat it generically, the way it does max?
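Here is a minimal sketch of the behavior I mean, all in a single script file (again, the binding names are just illustrative):

    let add x y = x + y

    let a = add 1.0 2.0   // first call fixes add : float -> float -> float
    // let b = add 1 2    // uncommenting this fails: expected float, given int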

jackmott