I'm having some trouble with an asymptotic analysis question. The problem asks for both the asymptotic worst-case running time and the asymptotic expected running time of a function. Random(n) generates a random number between 1 and n with uniform distribution (every integer between 1 and n is equally likely).
Func2(A, n)
/* A is an array of integers */
 1   s ← A[1];
 2   k ← Random(n);
 3   if (k < log2(n)) then
 4       for i ← 1 to n do
 5           j ← 1;
 6           while (j < n) do
 7               s ← s + A[i] * A[j];
 8               j ← 2 * j;
 9           end
10       end
11   end
12   return (s);
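In case it is useful, here is a small Python transcription I have been experimenting with (assuming Random(n) corresponds to random.randint(1, n), and shifting the 1-indexed A[1] of the pseudocode to Python's A[0]):

import math
import random

def func2(A, n):
    # A is a list of at least n integers; mirrors the pseudocode above.
    s = A[0]                    # s <- A[1] (1-indexed in the pseudocode)
    k = random.randint(1, n)    # Random(n): uniform over 1..n inclusive
    if k < math.log2(n):        # line 3: branch taken only for small k
        for i in range(n):      # lines 4-10: the nested loops
            j = 1
            while j < n:        # j doubles each pass, so about log2(n) iterations
                s += A[i] * A[j]
                j *= 2
    return s

For example, calling func2(list(range(1, 17)), 16) repeatedly exercises both branches, since k < log2(16) only happens for k in {1, 2, 3}.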
I was wondering how line 3 (if (k < log2(n)) then) affects the expected running time of the function. I believe lines 4-10 run in worst case c·n·log2(n) time, since the inner while loop doubles j and therefore iterates about log2(n) times for each of the n values of i, but I am unsure how to derive the expected running time due to the if statement. Thanks for any help!
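Here is my attempt so far, in case it helps pin down where I'm going wrong: Random(n) is uniform over {1, ..., n}, so the test k < log2(n) succeeds for roughly log2(n) of the n equally likely values of k, i.e. with probability about log2(n)/n. If the nested loops cost about c·n·log2(n) when the branch is taken and only constant work is done otherwise, then

    E[T(n)] ≈ (log2(n)/n) · c·n·log2(n) + (1 - log2(n)/n) · c = Θ((log2(n))^2).

Is that the right way to account for the if statement, or am I missing something?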
-Matt