
In this tutorial (https://developer.apple.com/library/archive/referencelibrary/GettingStarted/DevelopiOSAppsSwift/ImplementingACustomControl.html#//apple_ref/doc/uid/TP40015214-CH19-SW1), the width constraint is written as:

button.widthAnchor.constraint(equalToConstant: 44.0).isActive = true

The constant is written as "44.0", not "44".

Is there any difference between them?

I measured the execution time of the two approaches.

import CoreGraphics
import Foundation

func evaluateProblem(problemNumber: Int, problemBlock: () -> Void)
{
    print("Evaluating problem \(problemNumber)")

    let start = DispatchTime.now() // start time
    problemBlock()                 // run the block being measured
    let end = DispatchTime.now()   // end time

    let nanoTime = end.uptimeNanoseconds - start.uptimeNanoseconds // difference in nanoseconds (UInt64)

    print("Time to evaluate problem \(problemNumber): \(nanoTime)")
}

evaluateProblem(problemNumber: 2) {
    let b: CGFloat = 44
    print(b)
}

evaluateProblem(problemNumber: 1) {
    let a: CGFloat = 44.0
    print(a)
}

But which one comes out faster is not consistent between runs.
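
Here is a rough sketch of a more stable measurement, averaging many iterations instead of timing a single pass (the measureAverage helper name is just mine for illustration); in an optimized build the compiler may fold the trivial closures away, so the numbers mainly reflect call overhead:

import CoreGraphics
import Foundation

// Hypothetical helper: runs the closure many times and reports the average
// time per iteration in nanoseconds, which smooths out single-run noise.
func measureAverage(iterations: Int = 100_000, _ block: () -> Void) -> Double {
    let start = DispatchTime.now()
    for _ in 0..<iterations {
        block()
    }
    let end = DispatchTime.now()
    return Double(end.uptimeNanoseconds - start.uptimeNanoseconds) / Double(iterations)
}

let integerLiteralTime = measureAverage {
    let b: CGFloat = 44
    _ = b
}
let floatLiteralTime = measureAverage {
    let a: CGFloat = 44.0
    _ = a
}

print("44   average: \(integerLiteralTime) ns")
print("44.0 average: \(floatLiteralTime) ns")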

손지현
  • Your code doesn’t seem to actually run the block, or I’m missing something. Also checking just by running it once won’t give you any reasonable results. Also the compiler will convert the constant if needed, it’s just common to put floats/doubles as such and not integers in code. – Sami Kuhmonen Apr 12 '20 at 12:39
  • No difference, just a good habit to have. Makes for cleaner syntax. – Michael Wells Apr 12 '20 at 12:52
  • @DavidH Not so; it compiles to `CGFloat` due to what's in Rob's answer. Swift has no implicit conversions; if it looks like that's what's going on, it's actually an `ExpressibleBy` at work. –  Apr 12 '20 at 13:26

1 Answer


You can initialize Double, Float, CGFloat, Int, etc. with integer literals because all of the above conform to the ExpressibleByIntegerLiteral protocol. Behind the scenes, initialization with a literal simply calls the init(integerLiteral:) method of the conforming type.
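
For illustration, here is a minimal sketch using a hypothetical Distance type (the name and stored property are made up) that adopts ExpressibleByIntegerLiteral; writing a Distance as an integer literal is just sugar for calling its init(integerLiteral:):

// Hypothetical type, used only to show the mechanism.
struct Distance: ExpressibleByIntegerLiteral {
    let points: Double

    // Called behind the scenes whenever a Distance is written as an integer literal.
    init(integerLiteral value: Int) {
        points = Double(value)
    }
}

let d: Distance = 44   // the compiler inserts Distance(integerLiteral: 44)
print(d.points)        // 44.0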

Likewise, there is an ExpressibleByFloatLiteral protocol that handles initialization with floating-point literals; its init(floatLiteral:) requirement must also be implemented by conforming types.
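
Since CGFloat conforms to both protocols, 44 is routed through init(integerLiteral:) and 44.0 through init(floatLiteral:), and the resulting values are identical; a quick check:

import CoreGraphics

// 44 uses CGFloat's ExpressibleByIntegerLiteral conformance,
// 44.0 its ExpressibleByFloatLiteral conformance; the values are equal.
let fromIntegerLiteral: CGFloat = 44
let fromFloatLiteral: CGFloat = 44.0
print(fromIntegerLiteral == fromFloatLiteral)  // true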

As far as which to use, it's a matter of personal preference and style. Both ways of initialization are valid and unless you're doing thousands of initializations the performance difference would be negligible.

Rob C
  • I might add that IIRC, earlier versions of Swift (maybe through Swift 2.x?) didn't yet have the ability to infer the proper type without the decimal point. Maybe this is a result of what *ExpressibleByIntegerLiteral* technically accomplishes, but the Swift language has undergone such massive changes in the first 5 or so years, sometimes the best answer to a question like this is "you used to be forced to add 1.0 to infer a float over an integer but not anymore". –  Apr 12 '20 at 16:16
  • @dfd I think you are correct about initialization through Swift 2.x but your answer doesn't seem to address why people still use decimals at initialization. – Rob C Apr 12 '20 at 17:10
  • That's why I didn't answer, but commented. :-) I can think of two possible reasons, and there may be more. (1) Either force of habit, the OP is looking at outdated code, or the person coding has a background in other languages like GLSL that *does* require a decimal point for float numbers. (2) Readability. A decimal point implies some amount of precision over an `Int`. Just thought of a third - using the example above, suppose there are constraints - maybe on `button` but maybe a view on a different screen but in the same app - that actually have a constant more precise than 44.0? –  Apr 12 '20 at 18:53