
Here's what I've tried:

    NSUserDefaults *prefs = [NSUserDefaults standardUserDefaults];
    [prefs setInteger:HighScore forKey:@"integerkey"];
    [prefs synchronize];

and then to read it back:

    NSUserDefaults *prefs = [NSUserDefaults standardUserDefaults];
    HighScore = [prefs integerForKey:@"integerKey"];

This gives the warning: "Implicit conversion loses integer precision: 'NSInteger' (aka 'long') to 'int'".

Lorenzo B
    I suspect HighScore is not an actual NSInteger, at least when fetching from the prefs...? – Joachim Isaksson Dec 28 '13 at 09:59
  • Well, since I'm trying to run it on a 64-bit device I suspected it would be appropriate to declare it as an `NSInteger` – user3023743 Dec 28 '13 at 10:03
  • `NSInteger` is probably a good choice here, however, somewhere you're trying to store the value from an `NSInteger` into an `int` which is not the same type. Which line is giving the warning/error message? – Joachim Isaksson Dec 28 '13 at 10:06

2 Answers


On 64-bit platforms, NSInteger is a 64-bit quantity, but int is only 32-bit.
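
A quick way to see the difference on a 64-bit target:

    // Illustrative check: logs the byte widths of the two types.
    // On a 64-bit device this prints 8 for NSInteger and 4 for int.
    NSLog(@"sizeof(NSInteger) = %zu, sizeof(int) = %zu",
          sizeof(NSInteger), sizeof(int));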

I assume that you have declared

int HighScore;

so you can change that to

NSInteger HighScore;

or add an explicit cast

HighScore = (int)[prefs integerForKey:@"integerKey"];

to solve the problem.
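
For completeness, a minimal sketch of the whole round trip with `HighScore` declared as an `NSInteger` (note that the snippets in the question write with the key `@"integerkey"` but read with `@"integerKey"`; the same string has to be used for both):

    // Minimal sketch; in the question HighScore is presumably an instance
    // variable, and the key name here is only illustrative.
    NSInteger HighScore = 0;

    // Save
    NSUserDefaults *prefs = [NSUserDefaults standardUserDefaults];
    [prefs setInteger:HighScore forKey:@"integerKey"];
    [prefs synchronize];

    // Load
    HighScore = [prefs integerForKey:@"integerKey"];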

Martin R

Convert it to an NSNumber, because NSInteger isn't an object, and save that object in the defaults.
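
A sketch of what that could look like, using boxing syntax to wrap the `NSInteger` in an `NSNumber` (the key name is only for illustration):

    // Wrap the NSInteger in an NSNumber and store it as an object.
    NSUserDefaults *prefs = [NSUserDefaults standardUserDefaults];
    [prefs setObject:@(HighScore) forKey:@"integerKey"];
    [prefs synchronize];

    // Read it back and unbox it into an NSInteger.
    NSInteger restored = [[prefs objectForKey:@"integerKey"] integerValue];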

HereTrix
    However, [setInteger](https://developer.apple.com/library/mac/documentation/Cocoa/Reference/Foundation/Classes/NSUserDefaults_Class/Reference/Reference.html#//apple_ref/occ/instm/NSUserDefaults/setInteger:forKey:) would seem to take an NSInteger as a parameter, not an NSNumber. – Joachim Isaksson Dec 28 '13 at 09:58
  • oh okay let me try that :] – user3023743 Dec 28 '13 at 09:58