
I am trying to divide integers but get 0 as the result. I just do not understand what I am doing wrong. I am using only ints in this example, but I get the same result when testing with float or double.

The code I use is:

int wrongAnswers = askedQuestions - playerResult;
int percentCorrect = (playerResult / askedQuestions) * 100;
int percentWrong = (wrongAnswers / askedQuestions) * 100;


NSLog(@"askedQuestions: %i", askedQuestions);
NSLog(@"playerResult: %i", playerResult);
NSLog(@"wrongAnswers: %i", wrongAnswers);
NSLog(@"percentCorrect: %i", percentCorrect);
NSLog(@"percentWrong: %i", percentWrong);
NSLog(@"calc: %i", (wrongAnswers + playerResult));
NSLog(@"wrong answers %: %i %%", ((wrongAnswers / askedQuestions) * 100));

The result I get is:

2011-01-09 16:45:53.411 XX[8296:207] askedQuestions: 5
2011-01-09 16:45:53.412 XX[8296:207] playerResult: 2
2011-01-09 16:45:53.412 XX[8296:207] wrongAnswers: 3
2011-01-09 16:45:53.413 XX[8296:207] percentCorrect: 0 %
2011-01-09 16:45:53.414 XX[8296:207] percentWrong: 0 %
2011-01-09 16:45:53.414 XX[8296:207] calc: 5
2011-01-09 16:45:53.415 XX[8296:207] wrong answers : 0 %

I would very much appreciate help :-)

PeterK

4 Answers


Because playerResult / askedQuestions is an integer division, you first get 0 (losing the value completely) and then multiply it. You need to multiply first, then divide.
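
For example, a minimal sketch of that reordering, using the variable names from the question:

int percentCorrect = (playerResult * 100) / askedQuestions; // 2 * 100 / 5 == 40
int percentWrong = (wrongAnswers * 100) / askedQuestions;   // 3 * 100 / 5 == 60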

Eugene Mayevski 'Callback

Try using the float data type; int division truncates the fractional part.
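
For example, a sketch with the question's variables, casting one operand so the division happens in floating point:

float percentCorrect = ((float) playerResult / askedQuestions) * 100; // 0.4 * 100 == 40.0
NSLog(@"percentCorrect: %.0f %%", percentCorrect);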

Vishwanath

Cast the operands of the division to a floating-point type. An int divided by an int is truncated toward zero, so 2 / 5 becomes 0. (In C and Objective-C there is no decimal type, so use double or float.)

percentCorrect = ((double) playerResult / (double) askedQuestions) * 100;

Well, you are using ints for the division and to store the result, which means that if the division would yield a value like 0.8, it is truncated and you get 0 as the answer.
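
A quick illustration of the truncation (a sketch, not part of the original answer):

NSLog(@"%i", 2 / 5);     // integer division: prints 0
NSLog(@"%f", 2.0 / 5.0); // floating-point division: prints 0.400000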

programmer