
A playground containing only an extension that conforms NSDecimalNumber to ExpressibleByStringLiteral, plus a variable x that tries to use that extension, crashes the LLDB RPC server:

import Cocoa

extension NSDecimalNumber: ExpressibleByStringLiteral {
    init(stringLiteral value: Self.StringLiteralType) {
        self.init(string: value)
    }
}

let x: NSDecimalNumber = "1.2"

  • Swift 3
  • Xcode 8.2.1 (8C1002)
  • OS X El Capitan 10.11.6 (15G1421)

Filed on Apple Radar and OpenRadar: https://openradar.appspot.com/31556528

Ky

1 Answer


While this is a compiler bug that should be reported (the compiler should emit errors, not crash), your extension is wrong on several levels.

  1. ExpressibleByStringLiteral has two parent protocols (ExpressibleByExtendedGraphemeClusterLiteral and ExpressibleByUnicodeScalarLiteral) whose requirements have to be satisfied too.

  2. You are not declaring the typealias for the protocol associated type (StringLiteralType).

  3. You cannot use `Self.` outside a protocol declaration.

  4. The initializer would have to be public.

  5. You can declare only convenience initializers in an extension, not designated initializers.

  6. The only way to implement the initializer is using a required initializer inside the class definition.

In summary, you cannot declare conformance with this protocol in an extension.
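A minimal sketch of the workaround the points above imply, using a hypothetical subclass name (`StringDecimalNumber` is not part of Foundation). The conformance and initializer live in the class body, where a convenience initializer is allowed; marking the class `final` avoids the need for `required`:

```swift
import Foundation

// Hypothetical subclass: the conformance must be declared in the class
// body, because the initializer must be a convenience initializer and,
// on a non-final class, would also have to be `required`.
final class StringDecimalNumber: NSDecimalNumber, ExpressibleByStringLiteral {
    convenience init(stringLiteral value: String) {
        // Delegates to NSDecimalNumber's string parser.
        self.init(string: value)
    }
}

let x: StringDecimalNumber = "1.2"
```

The two parent protocols are covered by the standard library's default implementations once `init(stringLiteral:)` takes a `String`.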

Why this is a bad idea:

  1. The NSDecimalNumber(string:) initializer is locale-dependent. That means your code's behavior would change depending on the current locale.

  2. In Swift we should use Decimal instead of NSDecimalNumber.
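A sketch of the Decimal-based alternative: Decimal is a struct, so the conformance can live in an extension with no convenience/required rules to satisfy. `Decimal(string:)` is failable, so the sketch traps on a malformed literal (one possible policy, chosen here as an assumption):

```swift
import Foundation

// Decimal is a value type, so ExpressibleByStringLiteral conformance
// can be added directly in an extension.
extension Decimal: ExpressibleByStringLiteral {
    public init(stringLiteral value: String) {
        // Decimal(string:) is failable; trapping on a malformed literal
        // is a reasonable policy for a literal fixed at compile time.
        guard let parsed = Decimal(string: value) else {
            fatalError("Invalid decimal literal: \(value)")
        }
        self = parsed
    }
}

let y: Decimal = "1.2"
```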

Sulthan
  • Note that there's a difference between what the OP is proposing and what you're proposing. Primarily, an `NSDecimalNumber` or `Decimal` can exactly represent "1.2", whereas going through a `Double`, which cannot represent 1.2 exactly, can produce different results. – David Berry Apr 11 '17 at 17:28
  • @DavidBerry You are right, I was hoping for a better `Decimal` initialization but you are right it goes through floating point again. – Sulthan Apr 11 '17 at 17:37
  • Perhaps even more important `decimal = Decimal(string: "1.2")` yields different results from `decimal = Decimal(string: "1.20")` – David Berry Apr 11 '17 at 17:53
  • @DavidBerry In my testing both return `12e-1`. – Sulthan Apr 11 '17 at 17:56
  • But one of them returns 3 digits of precision and one returns 2 :) At least that's what I recall of the guts of `NSDecimalNumber`. Of course, the last time I used it was about a decade ago... – David Berry Apr 11 '17 at 18:02