
Hey guys, I'm working through some big-O problems from the Algorithms book by Dasgupta and am stuck on a few.

1) f(n) = n^0.1, g(n) = (log n)^10

According to the top answer on Asymptotic Complexity of Logarithms and Powers, "log(n)^a is always O(n^b), for any positive constants a, b." So for 1), f = omega(g).

2) f(n) = n^1.01, g(n) = n log^2 n

My guess is f = omega(g). Is this example correct, or is it a different case because the log is squared and multiplied by n?

Please explain the steps you take to solve these kinds of problems.

ryank

1 Answer


Your answer to the first question is correct, as is your application of that rule. Here's a proof that log(n) = O(n^a) for any a > 0 (which is clearly equivalent to said rule):

The derivative of n^a is a*(n^(a-1)).
The derivative of log(n) is 1/n.
For n > (1/a)^(1/a), we have a*(n^(a-1)) > 1/n, i.e., the derivative of n^a exceeds the derivative of log(n).
Therefore, beyond that point n^a - log(n) is increasing, so n^a - log(n) is bounded below by some constant C.
Hence log(n) <= n^a - C for large enough n, which gives log(n) = O(n^a).
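As a quick numerical sanity check (not a proof) for question 1: the crossover where n^0.1 overtakes (log n)^10 happens at astronomically large n, so comparing raw values would overflow floating point. Instead, writing n = 10^k, we can compare the logarithms of the two functions and watch their ratio tend to 0:

```python
import math

# Sanity check for f(n) = n^0.1 vs g(n) = (log n)^10.
# Write n = 10^k, so ln(n) = k * ln(10).  Then:
#   ln f(n) = 0.1 * ln(n)
#   ln g(n) = 10 * ln(ln(n))
# If g grows strictly slower than f, the ratio ln g / ln f -> 0.
for k in [10, 100, 1_000, 10_000, 100_000]:
    ln_n = k * math.log(10)
    ln_f = 0.1 * ln_n
    ln_g = 10 * math.log(ln_n)
    print(f"n = 10^{k}:  ln g / ln f = {ln_g / ln_f:.6f}")
```

Note that at n = 10^10 the ratio is still well above 1 (g is actually larger there); asymptotic statements say nothing about small n, only about the eventual trend.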

Your answer to the second question is correct. Here's a proof:

Dividing both sides by n, g(n) = O(f(n)) if and only if log^2(n) = O(n^0.01).
By the rule above, log(n) = O(n^0.005); squaring both sides gives log^2(n) = O(n^0.01).
Therefore g = O(f), and since n^0.01 / log^2(n) -> infinity, f = omega(g) as you guessed.
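The same numeric sanity check (again, not a proof) works for question 2: after cancelling the common factor of n, it suffices to see that n^0.01 eventually dominates log^2(n). Comparing in the log domain avoids overflow:

```python
import math

# Sanity check for n^0.01 vs (log n)^2, writing n = 10^k.
# The difference
#   ln(n^0.01) - ln(log^2 n) = 0.01 * ln(n) - 2 * ln(ln(n))
# should grow without bound if n^0.01 dominates, i.e., f = omega(g).
for k in [100, 1_000, 10_000, 100_000, 1_000_000]:
    ln_n = k * math.log(10)
    diff = 0.01 * ln_n - 2 * math.log(ln_n)
    print(f"n = 10^{k}:  ln(n^0.01) - ln(log^2 n) = {diff:.2f}")
```

At n = 10^100 the difference is still negative, so log^2(n) is larger there; the domination only kicks in at very large n, which is typical when a tiny polynomial exponent races a power of a logarithm.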
laurie