We found out that you cannot distinguish two Decimals by their hashValue if one is the negative of the other. We use a Decimal as a field in a struct, and that struct implements Hashable so it can be put in a Set. Our business logic requires all entries to be unique, so all fields are combined into the hashValue. That means two structs whose Decimal fields are each other's negative, and whose remaining fields are equal, end up with the same hashValue and are treated as the same entry, which is not what we want.
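As an illustration of that setup, here is a minimal sketch (the struct and field names are hypothetical, not our actual model):

import Foundation

// Hypothetical stand-in for the struct described above.
struct Entry: Hashable {
    let name: String
    let amount: Decimal
}

let a = Entry(name: "rent", amount: Decimal(1200))
let b = Entry(name: "rent", amount: Decimal(-1200))

// Decimal(1200) and Decimal(-1200) currently produce the same hashValue,
// so the two structs hash identically as well, and any logic that relies
// on hashValue alone to detect duplicates treats them as the same entry.
print(a.hashValue == b.hashValue) // true with the current Decimal hashing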
Playground code:
import Foundation

for _ in 0..<10 {
    let randomNumber: Int = Int.random(in: 0..<10_000_000)
    let lhs = Decimal(integerLiteral: randomNumber)
    let rhs = Decimal(integerLiteral: -randomNumber)
    // lhs and rhs differ only in sign, yet their hashValues match (unlike the Ints' below).
    print("Are \(lhs) and \(rhs)'s hashValues equal? \(lhs.hashValue == rhs.hashValue)")
    print("Are \(randomNumber) and \(-randomNumber)'s hashValues equal? \(randomNumber.hashValue == (-randomNumber).hashValue)\n")
}
The same happens when testing with a double literal (init(floatLiteral:)) instead of integerLiteral.
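For completeness, a minimal check of that path, following the same pattern as the loop above (assuming the float-literal initializer, whose parameter is a Double):

import Foundation

let lhs = Decimal(floatLiteral: 123.456)
let rhs = Decimal(floatLiteral: -123.456)
// Only the sign differs, yet the hashValues match here as well.
print(lhs.hashValue == rhs.hashValue)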
The workaround is to compare the Decimals directly (via ==), and optionally still include them in the hashValue if other parts of the code require it.
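A sketch of that workaround, reusing the hypothetical Entry from above: let the equality check decide uniqueness, and keep the Decimal in the hash only as a bucketing hint:

import Foundation

struct Entry: Hashable {
    let name: String
    let amount: Decimal

    // Uniqueness is decided by comparing the fields directly; Decimal's ==
    // does respect the sign, so 1200 and -1200 are correctly distinct here.
    static func == (lhs: Entry, rhs: Entry) -> Bool {
        lhs.name == rhs.name && lhs.amount == rhs.amount
    }

    // The Decimal can still feed the hash: a sign-blind hash only causes
    // extra collisions (a performance cost), not wrong "duplicate" results,
    // as long as equality is what ultimately decides uniqueness.
    func hash(into hasher: inout Hasher) {
        hasher.combine(name)
        hasher.combine(amount)
    }
}

This is effectively what the synthesized conformance does anyway; the point is that uniqueness checks must go through == rather than comparing hashValues.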
Is this behaviour intended? The mantissa is the same in both cases, so I guess the hashValues collide because the sign is not included in Decimal's hashValue computation?
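A quick way to see that only the sign differs, and that equality (unlike hashing) does take it into account:

import Foundation

let x = Decimal(42)
let y = Decimal(-42)
print(x.sign, y.sign)             // plus minus: only the sign differs
print(x == y)                     // false: == respects the sign
print(x.hashValue == y.hashValue) // true: the hash apparently does not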