
I've got some code for a game I'm working on that converts double values to Int64. The problem area is as follows:

double xVal = 1.6;    

Int64 xValInt64 = (Int64)xVal;
Int32 xValInt32 = (Int32)xVal;

Testing on my PC works great.

However, testing on my mobile device (iPhone) results in:

xValInt64 = 4609884575999459329
xValInt32 = 1

I can't wrap my mind around why this is happening. Anybody dealt with this before? Why is my cast not working?
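Here's a minimal, self-contained version of what I'm running (the console wrapper below is just scaffolding for this post; in the real project the same two casts sit inside a Unity script):

using System;

class CastRepro
{
    static void Main()
    {
        double xVal = 1.6;

        Int64 xValInt64 = (Int64)xVal;   // expected: 1 (the cast truncates toward zero)
        Int32 xValInt32 = (Int32)xVal;   // expected: 1

        Console.WriteLine(xValInt64);    // prints 1 on my PC
        Console.WriteLine(xValInt32);    // prints 1 on my PC
    }
}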

Tyson
  • Can you show a *complete* (self-contained) example, with the double input, the Int64 output and the expected output? – Martin R Nov 15 '13 at 19:45
  • Your question might benefit from specifying what language/SDK you're working with. – ilya n. Nov 15 '13 at 19:53
  • Okay, I've amended the title and example code to better explain the problem. – Tyson Nov 15 '13 at 19:58
  • How are you using C# on an iPhone? – David H Nov 15 '13 at 20:01
  • I don't know the exact compilation process, but the C# code is given to the Unity Engine, which builds an Xcode project that is then executed on the device. – Tyson Nov 15 '13 at 20:03
  • How/where is `Int64` defined? – Martin R Nov 15 '13 at 21:06
  • The xValInt64 value, treated as a bit pattern and interpreted as double, is 1.5999994277954103782946049250313080847263336181640625. This is close to 1.6, but not close enough to be the value of a double 1.6 literal. – Patricia Shanahan Nov 15 '13 at 21:58
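
For reference, that last observation can be reproduced by reinterpreting the reported Int64 as the bit pattern of a double, e.g. with BitConverter.Int64BitsToDouble (a minimal sketch, assuming a standard desktop .NET runtime; this is not code from the original project):

using System;

class BitPatternCheck
{
    static void Main()
    {
        // Treat the value observed on the iPhone as the raw bits of a
        // double instead of converting it numerically.
        long observed = 4609884575999459329L;
        double asDouble = BitConverter.Int64BitsToDouble(observed);

        Console.WriteLine(asDouble.ToString("R"));
        // Prints a value very close to 1.6 (about 1.5999994...), matching
        // the comment above: the double's bits appear to have been stored
        // into the Int64 rather than the value being converted.
    }
}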

0 Answers