
I'm fairly new to programming.

So I was trying to figure something out.

Why does rand()%10+1 give us a number between 1 and 10, whereas 32767%10 is actually 7?


2 Answers


I think I see the source of your confusion.

The fact that you referred to rand()%10 in your title and rand()%10+1 in the body of your question made that difficult.

You asked:

Why does rand()%10+1 give us a number between 1 and 10, whereas 32767%10 is actually 7?

It's because the function N%10 is not monotonically increasing. As the value of N increases, the value of N%10 goes up and down.
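A quick sketch makes this visible: the loop below prints N%10 for increasing N, and you can watch the value climb to 9 and wrap back to 0.

#include <stdio.h>

int main(void)
{
    /* N%10 rises 0..9, then wraps back to 0 as N keeps growing */
    for (int n = 0; n < 25; n++)
        printf("%2d %% 10 = %d\n", n, n % 10);
    return 0;
}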

I think you're assuming:

  • that 32767 is the maximum value returned by rand() (which it can be, though on my system it's 2147483647; either way it doesn't affect the point -- see the snippet after this list); and
  • that if 32767 is the maximum value returned by rand(), then 32767%10, which is 7, must be the maximum value of rand()%10.
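On the first assumption: the maximum value rand() can return is given by the constant RAND_MAX in <stdlib.h> (the C standard only guarantees it's at least 32767), so one way to check your own system is a sketch like:

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    /* RAND_MAX is implementation-defined; C guarantees only >= 32767 */
    printf("RAND_MAX = %d\n", RAND_MAX);
    return 0;
}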

Your second assumption is wrong. For any value N (we'll ignore negative values), N%10 is the last digit of its decimal representation. If rand() returns 9, then rand()%10 will be 9 and rand()%10+1 will be 10 -- which is larger than the value of 32767%10+1.
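To spell out that last comparison with concrete numbers, a minimal check:

#include <stdio.h>

int main(void)
{
    /* 9 % 10 + 1 is 10, while 32767 % 10 + 1 is only 8 */
    printf("9 %% 10 + 1     = %d\n", 9 % 10 + 1);
    printf("32767 %% 10 + 1 = %d\n", 32767 % 10 + 1);
    return 0;
}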

Keith Thompson

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    int randomnumber;

    /* rand() % 10 is the remainder of dividing rand()'s result by 10,
       so randomnumber is always in the range 0..9 */
    randomnumber = rand() % 10;
    printf("%d\n", randomnumber);
    return 0;
}

When you run this code, rand() % 10 takes the remainder of division by 10, so it generates a number from 0 to 9 only, never 10. That's why you need a +1 at the end to get a random number from 1 to 10. That's all.
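For completeness, here is a sketch of the version with the +1 added (the srand call is an addition the snippet above doesn't have; without it, rand() produces the same sequence on every run):

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(void)
{
    /* seed once so successive runs produce different sequences */
    srand((unsigned) time(NULL));

    /* rand() % 10 is 0..9, so adding 1 shifts the range to 1..10 */
    int randomnumber = rand() % 10 + 1;
    printf("%d\n", randomnumber);
    return 0;
}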