Swift can infer integer literals to be Doubles or Floats:
let x: Float = 3
It even works with arithmetic: Swift gives every literal in the expression the floating-point type before doing the math, so this is also 3:
let y: Float = 5/2 + 0.5
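For contrast, doing the division in an Int context first and only converting afterwards gives a different result (a quick sketch; the variable names are just for illustration):
let n = 5/2            // 2 – no floating-point context, so this is integer division
let z = Float(n) + 0.5 // 2.5 – converting after the division has already truncated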
But what are the actual rules for this? There are ambiguous situations, for example when the inference is driven by a parameter type:
func foobar(_ x: Int) -> Int {
    return x
}
func foobar(_ x: Float) -> Float {
    return x
}
foobar(1/2)
In this case Swift infers the literals as Int and the call returns 0, but if you delete the Int overload it switches to Float and returns 0.5.
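Forcing a Float context on the argument flips it to the other overload even with both functions present, which suggests the literal's default type is what decides (just an illustration, not the rule I'm asking about):
foobar(1/2)            // 0   – integer literals default to Int, so the Int overload wins
foobar((1/2) as Float) // 0.5 – the coercion makes the literals Floats, so the Float overload wins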
What are the rules? Where are they documented?
Even more annoying is when Swift could infer a Float but doesn't:
func foobar(_ x: Float) -> Float {
    return x
}
let x = 1/2
foobar(x) // Cannot convert value of type 'Int' to expected argument type 'Float'
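Adding an explicit annotation to the declaration makes the same expression compile and pass 0.5, which is what makes the default feel arbitrary (the name half is just for the example):
let half: Float = 1/2 // 0.5 – the annotation turns both literals into Floats
foobar(half)          // returns 0.5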