
I need help with this.

I was asked: for an unsigned integer range of 1 to 1 billion, how many bits are needed?

How do we calculate this?

Thank you

UPDATE!

This is what I wanted to know, because the interviewer said 17.

hippietrail
daydreamer

3 Answers


Take the log base 2 of 1 billion and round up.

Alternatively, you may already know that a 32-bit unsigned integer can represent just over 4 billion values, so 2 billion values fit in 31 bits and 1 billion in 30 bits.

Another handy rule is that each additional 10 bits multiplies the number of values you can represent by just over 1000 (1024): 1000 needs 10 bits, 1 million needs 20 bits, and 1 billion needs 30 bits.
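
As a quick check (my addition, not part of the original answer), Python's built-in int.bit_length() reports the number of bits a value needs, and it matches the rule of thumb:

>>> (10**3).bit_length()
10
>>> (10**6).bit_length()
20
>>> (10**9).bit_length()
30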

Peter Alexander

Calculate log2(1000000000) and round it up. It works out to 30 bits.

For example, in Python you can calculate it like this:

>>> import math
>>> math.ceil(math.log(1000000000, 2))
30.0
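
One caveat worth adding (my note, not part of the original answer): math.log works in floating point, so near exact powers of two it can land slightly above or below the true value and make ceil round the wrong way. The integer method int.bit_length() avoids that entirely:

>>> (1000000000).bit_length()
30

(On Python 3, math.ceil also returns an int, so the first snippet prints 30 rather than 30.0.)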
Mark Byers

2^10 = 1024
2^10 * 2^10 = 2^20 = 1024 * 1024 = 1,048,576
2^10 * 2^10 * 2^10 = 2^30 = 1024^3 = 1,073,741,824 ~= 1,000,000,000

=> 30 bits
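
The same doubling argument written as a short Python loop (a sketch; the variable names are mine):

bits = 0
value = 1
while value < 1000000000:  # keep doubling until we cover 1 billion values
    value *= 2
    bits += 1
print(bits)  # 30, since 2**30 = 1,073,741,824 >= 1,000,000,000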

KingCrunch