I am trying to find the number of multiplications required when executing an algorithm which uses exponentiation by squaring, which I was reading about on Wikipedia. The section on computational complexity mentions that the algorithm requires at most floor(log_2 n) squarings and at most floor(log_2 n) multiplications. How could I go about proving this?
I have this pseudocode:
expo(a, n)
    if n == 0
        return 1
    if n == 1
        return a
    if n is even
        b = expo(a, n/2)
        return b * b
    return a * expo(a, n-1)
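As a sanity check, the pseudocode can be translated into Python with a multiplication counter added (a sketch; the name expo_count is my own):

```python
def expo_count(a, n):
    """Return (a**n, multiplication count), mirroring the pseudocode above."""
    if n == 0:
        return 1, 0
    if n == 1:
        return a, 0
    if n % 2 == 0:
        b, mults = expo_count(a, n // 2)
        return b * b, mults + 1   # the single multiplication b*b
    b, mults = expo_count(a, n - 1)
    return a * b, mults + 1       # the single multiplication a * expo(a, n-1)

# counts for n = 0..8: 0 0 1 2 2 3 3 4 3
print([expo_count(2, n)[1] for n in range(9)])
```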
With this, I also have the following recurrence for the number of multiplications, T(n):

T(n) = 0 if n < 2; T(n-1) + 1 if n is odd; T(n/2) + 1 if n is even

(the odd case computes a * a^(n-1) and the even case computes (a^(n/2))^2, each costing one additional multiplication).
I've attempted using the bit-string representing the exponent, n, and noting the binary operations which need to be completed, e.g. 5 = 101_2. Every 1-bit requires clearing it (n -> n-1) and then bit-shifting to the right; every 0-bit simply requires bit-shifting to the right. These operations then correspond to multiplications, as described by this chart I produced:
exponent n                  0  1  2  3  4  5  6  7  8
bits in n                   1  1  2  2  3  3  3  3  4
0-bits in n                 1  0  1  0  2  1  1  0  3
1-bits in n                 0  1  1  2  1  2  2  1  1
binary operations for a^n   0  0  1  2  2  3  3  4  3
multiplications for a^n     0  0  1  2  2  3  3  4  3
Edit
As pointed out by Henry in the comments below, the number of multiplications can be found using (# of bits in the binary representation - 1) + (# of 1-bits in the binary representation - 1). To prevent getting lost in the math, I will assume the number of 1-bits is given by some function b(n). Since the number of bits in n is floor(log_2 n) + 1, this gives T(n) = floor(log_2 n) + b(n) - 1.
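As a quick numerical sketch, this closed form can be checked against the recurrence (the helper names are my own; note that n.bit_length() - 1 equals floor(log_2 n) for n >= 1):

```python
def T(n):
    # multiplication-count recurrence from the pseudocode
    if n < 2:
        return 0
    return T(n // 2) + 1 if n % 2 == 0 else T(n - 1) + 1

def closed_form(n):
    # floor(log_2 n) + b(n) - 1, where b(n) = number of 1-bits in n
    return (n.bit_length() - 1) + bin(n).count("1") - 1

# the two agree on a large range of exponents
assert all(T(n) == closed_form(n) for n in range(1, 2000))
```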
Proving for n = 2:

2_10 = 10_2, so b(2) = 1, and T(2) = floor(log_2 2) + b(2) - 1 = 1 + 1 - 1 = 1.
This agrees with the observation table above.
Assume the formula holds for all values up to k (the recurrence reduces k+1 to k or to (k+1)/2, so strong induction seems necessary).

Prove for k+1:

T(k+1) = floor(log_2 (k+1)) + b(k+1) - 1

After writing this formula in terms of k+1, I am not sure how to proceed. I would appreciate any insight.