
Swift can infer int literals to be doubles or floats

let x: Float = 3

It even works with arithmetic. It will convert everything before doing the math, so this is also 3:

let y: Float = 5/2 + 0.5

But what are the actual rules for this? There are ambiguous situations, for example when the literal is passed as a function argument:

func foobar(_ x: Int) -> Int {
  return x
}

func foobar(_ x: Float) -> Float {
  return x
}

foobar(1/2)

In this case it infers the literals as Int and returns 0, but if you delete the first function it switches to Float and returns 0.5.

What are the rules? Where is it documented?

Even more annoying is when Swift could infer a Float but doesn't:

func foobar(_ x: Float) -> Float {
  return x
}

let x = 1/2
foobar(x) // Cannot convert value of type 'Int' to expected argument type 'Float'
– Max
    Related: [Strange Swift numbers type casting](https://stackoverflow.com/q/28813516/1187415): `1` and `2` can be both integer literals and floating point literals. – Martin R Sep 18 '18 at 18:17
  • Unfortunately I don't believe the ambiguity case is officially documented – but the rule is that the compiler prefers to use the default type for a literal where it can. The default type being `Int` for an integer literal and `Double` for a floating-point literal (glossing over `IntegerLiteralType`). When ranking solutions, the compiler assigns a score to each to work out which it should favour (higher score is worse), and "non-default literal" [is a component of that score](https://github.com/apple/swift/blob/82d07ad88d392bd45d87622cc71b7389e99db104/lib/Sema/ConstraintSystem.h#L450). – Hamish Sep 18 '18 at 18:44
  • Re: your last example – Swift doesn't infer types across multiple statements (the constraint solver operates on a single expression with a potential contextual type to convert the expression type to). – Hamish Sep 18 '18 at 18:46

3 Answers


There are two Swift behaviors at play here:

  1. Swift can infer an integer literal as a Float (or a Double) when the context requires it.
  2. Each literal has a default type that is used when inference alone can't decide. The default type for integer literals is Int.

With only one function, rule 1 applies. The compiler sees that a Float is needed, so it infers the division as Float division and the integer literals as Floats:

func foobar(_ x: Float) -> Float {
  return x
}
foobar(1/2) // 0.5

If you overload the function, rule 1 no longer works. The type is now ambiguous so it falls back to the default type of Int, which luckily matches one of the definitions:

func foobar(_ x: Int) -> Int {
  return x
}
func foobar(_ x: Float) -> Float {
  return x
}
foobar(1/2)  // 0

Now see what happens when the default type doesn't match either overload. Neither rule applies, so you get an error:

func foobar(_ x: Double) -> Double {
  return x
}
func foobar(_ x: Float) -> Float {
  return x
}
foobar(1/2)  // Ambiguous use of operator '/'
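
If you do hit that error, one way out (a sketch, not part of the original answer) is to give the literals an explicit type at the call site, using the two overloads above:

foobar(1/2 as Float)  // 0.5 – the coercion provides a contextual type of Float
foobar(Float(1) / 2)  // 0.5 – an explicit Float operand forces Float division
foobar(1.0/2)         // 0.5 – a floating-point literal defaults to Double, so the Double overload wins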
– Max
    For all practical purposes the default type of an integer literal is `Int`, but just for fun it's worth noting that it's actually defined by the (undocumented) top-level type `IntegerLiteralType`, which can be changed by the user. Try defining `typealias IntegerLiteralType = Double` and seeing what happens with your last example :) – Hamish Sep 18 '18 at 19:16
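
Following up on Hamish's comment above, here's a minimal sketch of that trick (assuming the typealias is declared at the top level of a file or playground, where it shadows the standard library's default):

typealias IntegerLiteralType = Double  // integer literals now default to Double

func foobar(_ x: Double) -> Double {
  return x
}
func foobar(_ x: Float) -> Float {
  return x
}

foobar(1/2)  // 0.5 – no longer ambiguous: the default type is now Double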

Literals don't have a type as such. The docs say:

> If there isn’t suitable type information available, Swift infers that the literal’s type is one of the default literal types defined in the Swift standard library. The default types are Int for integer literals, Double for floating-point literals, String for string literals, and Bool for Boolean literals.

So unless the parameter type explicitly specifies something other than Int, integer literals will be inferred as Int.
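
A minimal sketch of those defaults in action (the comments show the inferred types):

let i = 1         // no suitable type information: defaults to Int
let d = 1.5       // defaults to Double
let s = "hello"   // defaults to String
let b = true      // defaults to Bool
let f: Float = 1  // suitable type information is available, so Float wins over the default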


For more information, refer to Lexical Structure - Literals.

– Rakesha Shastri
  • Ah, the overloaded function makes its type ambiguous, so it picks the default type, so it finds the matching declaration. – Max Sep 18 '18 at 18:43

By default, passing 1/2 as your argument, you are performing a calculation on two Ints, which evaluates to a result of type Int, so the first function is used.

To get a Float, at least one of the operands has to be written as a floating-point literal, so either 1.0/2 or 1/2.0 or 1.0/2.0. A floating-point literal can't be an Int, so this causes the second function to run instead.

In let x = 1/2, x is inferred to be of type Int because, with no contextual type information, both 1 and 2 default to Int.

Swift will not infer a Float unless the context indicates one.
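
To make the question's last example compile, one fix (a sketch reusing the question's own foobar) is to annotate x so the literals get a contextual type within the same statement:

func foobar(_ x: Float) -> Float {
  return x
}

let x: Float = 1/2  // the annotation makes the literals – and the division – Float
foobar(x)           // 0.5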