64

I'm migrating a project from Swift 2.2 to Swift 3, and I'm trying to get rid of old Cocoa data types when possible.

My problem is here: migrating NSDecimalNumber to Decimal.

I used to bridge NSDecimalNumber to Double both ways in Swift 2.2:

let double = 3.14
let decimalNumber = NSDecimalNumber(value: double)

let doubleFromDecimal = decimalNumber.doubleValue

Now, switching to Swift 3:

let double = 3.14
let decimal = Decimal(double)

let doubleFromDecimal = ???

decimal.doubleValue does not exist, nor does Double(decimal), nor even decimal as Double... The only hack I've come up with is:

let doubleFromDecimal = (decimal as NSDecimalNumber).doubleValue

But it would be completely stupid to try to get rid of NSDecimalNumber and then still have to use it once in a while...

Well, either I missed something obvious, and I beg your pardon for wasting your time, or there's a gap here that needs to be addressed, in my opinion...

Thanks in advance for your help.

Edit: Nothing more on the subject in Swift 4.

Edit: Nothing more on the subject in Swift 5.

Zaphod
  • yep, what a bummer still in 2018... the only reason I'm hesitating whether to move to Decimal is all the math operations (including literal conversions, like `myDecimal + 1`) and comparisons which are much more straightforward than with NSDecimalNumber – Leonid Usov Mar 21 '18 at 13:35

10 Answers

37

NSDecimalNumber and Decimal are bridged

The Swift overlay to the Foundation framework provides the Decimal structure, which bridges to the NSDecimalNumber class. The Decimal value type offers the same functionality as the NSDecimalNumber reference type, and the two can be used interchangeably in Swift code that interacts with Objective-C APIs. This behavior is similar to how Swift bridges standard string, numeric, and collection types to their corresponding Foundation classes. Apple Docs

but, as with some other bridged types, certain pieces of functionality are missing.

To regain the functionality you could write an extension:

extension Decimal {
    var doubleValue: Double {
        return NSDecimalNumber(decimal: self).doubleValue
    }
}

// Usage
let d = Decimal(floatLiteral: 10.65)
d.doubleValue
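
Going the other way (Double to Decimal) still works with a plain initializer, so only the Decimal-to-Double direction needs the extension. A minimal round-trip sketch (not from the original answer), using the Decimal(Double) initializer together with the extension above:

let start = 3.14
let roundTripDecimal = Decimal(start)              // Double -> Decimal via the existing initializer
let roundTripDouble = roundTripDecimal.doubleValue // Decimal -> Double via the extension above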
sketchyTech
    The question is still the same, even if it is hidden by an extension. You still have to use a third type instead of having only 'Double` and `Decimal`... – Zaphod Feb 12 '18 at 10:20
    Voting for this to be the accepted answer in Swift 4. Clean, though not ideal solution, ideally the Decimal type should provide a double cast cleanly in the Swift language. – Peter Suwara Feb 18 '19 at 08:14
28

Another solution that works in Swift 3 is to cast the Decimal to NSNumber and create the Double from that.

let someDouble = Double(someDecimal as NSNumber)

As of Swift 4.2 you need:

let someDouble = Double(truncating: someDecimal as NSNumber)
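
A related option, if truncation isn't wanted: the Swift overlay also provides a failable Double(exactly:) initializer taking an NSNumber, which returns nil instead of rounding when the value has no exact Double representation. A small sketch, not part of the original answer:

let exactDouble = Double(exactly: someDecimal as NSNumber)  // Double?, nil if the decimal has no exact Double representation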
rmaddy
14

Solution that works in Swift 4

let double = 3.14
let decimal = Decimal(double)
let doubleFromDecimal = NSDecimalNumber(decimal: decimal).doubleValue
print(doubleFromDecimal)
Pranavan SP
  • Yes but you still have to cast to a third type... You can't stay with only `Double` and `Decimal`, you need either to use `NSDecimalNumber` or `NSNumber`... That's the point of this post... – Zaphod Jul 20 '18 at 09:21
  • @Zaphod Have you checked – Pranavan SP Jul 22 '18 at 17:26
    Indeed it works... But you still use a third type, here `String` using the `description` property. The whole idea behind this post is the question of efficiency and clarity of the code. The point is that the type `Decimal` designed to replace good old `NSDecimalNumber` can't be used transparently without using a hack (via `NSNumber`, `NSDecimalNumber`, `String`, pointers, or something else...) The question is more rhetorical than about development... And the missing method, is still missing in Swift 4. – Zaphod Jul 23 '18 at 13:20
    Don't ever rely on the .description property of any type. That's bound to go wrong at some point. – alexkent Jun 24 '20 at 19:05
9

Swift 5

let doubleValue = Double(truncating: decimalValue as NSNumber)
Rafał Sroka
    Indeed it works... But you still use a third type, here `NSNumber`. The whole idea behind this post is the question of efficiency and clarity of the code. The point is that the type `Decimal` designed to replace good old `NSDecimalNumber` can't be used transparently without using a hack (via `NSNumber`, `NSDecimalNumber`, `String`, pointers, or something else...) The question is more rhetorical than about development... And the missing method, is still missing in Swift 5. – Zaphod Jun 18 '19 at 07:05
3

Decimal in Swift 3 is not NSDecimalNumber. It's NSDecimal, a completely different type.

You should just keep using NSDecimalNumber as you did before.
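
For example, a minimal sketch of simply staying on the reference type, as in the question's Swift 2.2 code:

import Foundation

let decimalNumber = NSDecimalNumber(value: 3.14)
let doubleFromDecimal = decimalNumber.doubleValue  // back to Double, no Decimal involved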

user28434'mstep
    Actually, `Decimal` corresponds to `NSDecimalNumber`: `The Swift overlay to the Foundation framework provides the Decimal structure, which bridges to the NSDecimalNumber class. The Decimal value type offers the same functionality as the NSDecimalNumber reference type, and the two can be used interchangeably in Swift code that interacts with Objective-C APIs` https://developer.apple.com/reference/foundation/nsdecimalnumber – Alexander Oct 06 '16 at 08:32
  • @AlexanderMomchliov, except it has no methods of `NSDecimalNumber` available, so in order to use them you're still gonna need to cast `Decimal` to `NSDecimalNumber`. Source: just checked it in a playground. – user28434'mstep Oct 06 '16 at 08:45
    Yeah, I didn't claim it did. I just said that your assumption that `Decimal` corresponds to `NSDecimal`, is wrong. – Alexander Oct 06 '16 at 08:46
  • @AlexanderMomchliov, but *it is* `NSDecimal`; just compare `Decimal` in the generated headers of Swift 3 with `NSDecimal` in the Foundation C code. Just because Swift **bridges** a `Decimal` variable of old Foundation code to `NSDecimalNumber` when imported doesn't mean `Decimal` becomes `NSDecimalNumber`. – user28434'mstep Oct 06 '16 at 08:50
    Even though the documentation says that Decimal bridges to NSDecimalNumber, In my experience, it's safest to stick with NSDecimalNumber for now, and wait for the libraries to stabilize (tried converting over to Decimal, and it was messier than I expected, plus the resulting code crashed). – Jeff Lewis Jan 02 '17 at 19:32
3

You are supposed to use the as operator to cast a Swift type to its bridged underlying Objective-C type, so just use as like this.

let p = Decimal(1)
let q = (p as NSDecimalNumber).doubleValue

In Swift 4, Decimal is NSDecimalNumber. Here's the citation from Apple's official documentation in Xcode 10.

Important

The Swift overlay to the Foundation framework provides the Decimal structure, which bridges to the NSDecimalNumber class. For more information about value types, see Working with Cocoa Frameworks in Using Swift with Cocoa and Objective-C (Swift 4.1).

There's no NSDecimal anymore. There was a confusing NSDecimal type in Swift 3, but that seems to have been a bug. No more confusion.

Note

I see the OP is not interested in Swift 4, but I added this answer because mentioning only the (outdated) Swift 3 situation was confusing.

eonil
  • What was meant here was not the underlying structure but a question of syntax. I know you have to use the `as` operator to cast to a bridged type. My problem is just "what is the point of using `Decimal` instead of `NSDecimalNumber` if you still need to reference it once in a while?", for instance to get the double value... There is no way to change your code from `NSDecimalNumber` to `Decimal` transparently... That was just what I meant... And it is still the case in Swift 4 (as noted in the edit) – Zaphod Oct 08 '18 at 14:50
  • @Zaphod The point would be "using a proper value type". You should know that many of the `NS-` classes are actually designed to be value types, but implemented as classes due to the lack of a language construct in ObjC. IMO, Apple wants to provide a more "Swift-y" library to Swift users, but their transition to Swift is not complete yet. There are many "incomplete" parts like this in Apple libraries, and they're likely to be updated later. – eonil Oct 09 '18 at 05:37
  • IMO, once Apple has successfully moved to Swift, they're going to remove access to the ObjC parts one by one. – eonil Oct 09 '18 at 05:40
  • That was exactly my point... The transition is not completely over with Decimal... – Zaphod Oct 09 '18 at 11:58
  • Bridging to a type isn't the same as being that type. `String` bridges to `NSString` and `Dictionary` to `NSMutableDictionary`, but they're entirely separate types. Further reading: https://developer.apple.com/documentation/swift/imported_c_and_objective-c_apis/working_with_foundation_types – Ky - Oct 13 '19 at 03:09
3

For Swift 5, the conversion is

let doubleValue = Double(truncating: decimalValue as NSNumber)

The example below shows formatting the converted value to different numbers of decimal places.

let decimalValue: Decimal = 3.14159
let doubleValue = Double(truncating: decimalValue as NSNumber)
print(String(format: "%.3f", doubleValue))  // 3.142
print(String(format: "%.4f", doubleValue)) // 3.1416
print(String(format: "%.5f", doubleValue)) // 3.14159
print(String(format: "%.6f", doubleValue)) // 3.141590
print(String(format: "%.7f", doubleValue)) // 3.1415900
Zgpeace
    Indeed it works... But you still use a third type, here `NSNumber`. The whole idea behind this post is the question of efficiency and clarity of the code. The point is that the type `Decimal` designed to replace good old `NSDecimalNumber` can't be used transparently without using a hack (via `NSNumber`, `NSDecimalNumber`, `String`, pointers, or something else...) The question is more rhetorical than about development... And the missing method, is still missing in Swift 5. – Zaphod May 04 '21 at 13:25
2

In the Swift open source project, this conversion is actually implemented in Decimal.swift, but it is not public. You can re-use the code from there.

extension Double {
    @inlinable init(_ other: Decimal) {
        if other._length == 0 {
            self.init(other._isNegative == 1 ? Double.nan : 0)
            return
        }

        // Rebuild the mantissa from its 16-bit words, most significant first.
        var d: Double = 0.0
        for idx in (0..<min(other._length, 8)).reversed() {
            let m: Double
            switch idx {
            case 0: m = Double(other._mantissa.0)
            case 1: m = Double(other._mantissa.1)
            case 2: m = Double(other._mantissa.2)
            case 3: m = Double(other._mantissa.3)
            case 4: m = Double(other._mantissa.4)
            case 5: m = Double(other._mantissa.5)
            case 6: m = Double(other._mantissa.6)
            case 7: m = Double(other._mantissa.7)
            default: m = 0 // unreachable: idx is always in 0..<8
            }
            d = d * 65536 + m
        }

        // Apply the base-10 exponent.
        if other._exponent < 0 {
            for _ in other._exponent..<0 {
                d /= 10.0
            }
        } else {
            for _ in 0..<other._exponent {
                d *= 10.0
            }
        }
        self.init(other._isNegative != 0 ? -d : d)
    }
}
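
With the extension in place, the conversion stays within Double and Decimal at the call site. A quick usage sketch (the last digits are subject to the usual floating-point rounding):

let value = Decimal(string: "3.14159")!
let converted = Double(value)  // uses the extension above
print(converted)               // ≈ 3.14159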
Eugene Dudnyk
  • Yes indeed, but the real question remains, why `doubleValue` is declared as `internal` instead of `public` as it clearly should be... Maybe a pull request has to be done... But I don't know how to propose this... – Zaphod May 25 '20 at 09:12
  • You can create an evolution proposal on swift forums, and you'll get the feedback and reasoning behind the current implementation, and considerations regarding the proposal. – Eugene Dudnyk May 26 '20 at 05:34
  • Yep, I'll add that to my todo list... – Zaphod May 27 '20 at 07:43
0

If you import Swift Charts, you can use .primitivePlottable

import Charts

let x: Decimal = 3.14159
let y: Double = x.primitivePlottable // 3.14159

But that's definitely not the intended way of doing it.

Eric
-1

let decimalNumber = Decimal(10.5)
let doubleNumber = Double(decimalNumber)

print(doubleNumber) // Output: 10.5

Samir Rana
  • This code doesn't compile. And please make some effort to format your answer correctly. Either indent the code 4 spaces or wrap the lines with three backticks (```) – HangarRash Jun 25 '23 at 17:06