I have an NSDecimalNumber and an NSInteger. I want to multiply the former by the latter, so I'll have to convert the integer first. This is what I came up with after some trial and error:
NSDecimalNumber *factor = (NSDecimalNumber*) [NSDecimalNumber numberWithInteger:myInteger];
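For context, here's roughly how that factor ends up being used; `price`, `myInteger`, and the literal values are just stand-ins for my real data:

#import <Foundation/Foundation.h>

NSDecimalNumber *price = [NSDecimalNumber decimalNumberWithString:@"19.99"]; // stand-in value
NSInteger myInteger = 3;

// the conversion from above
NSDecimalNumber *factor = (NSDecimalNumber*) [NSDecimalNumber numberWithInteger:myInteger];

// the actual multiplication: 19.99 * 3 = 59.97
NSDecimalNumber *total = [price decimalNumberByMultiplyingBy:factor];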
It feels like I'm really driving the point home:
- Hi there, I want an NSDecimalNumber, let's call it factor.
- By the way, just so you know, I want that to be an NSDecimalNumber, if that'd be possible.
- So could you give me an NSDecimalNumber? Here, I'll give you an integer to use as a source.
Is it supposed to be this verbose? Or am I messing up somewhere?