
In a previous problem, I showed (hopefully correctly) that f(n) = O(g(n)) implies lg(f(n)) = O(lg(g(n))) with sufficient conditions (e.g., lg(g(n)) >= 1, f(n) >= 1, and sufficiently large n).

Now, I need to prove OR disprove that f(n) = O(g(n)) implies 2^(f(n)) = O(2^(g(n))). Intuitively, this makes sense, so I figured I could prove it with help from the previous theorem. I noticed that f(n) can be rewritten as lg(2^(f(n))) and that g(n) is just lg(2^(g(n))), which got me excited...this is taking the log base 2 of both sides of what I want to prove, and it simplifies things a lot!

But I'm pretty sure this won't work. Just because lg(2^f(n)) = O(lg(2^g(n))) does not necessarily mean that 2^f(n) = O(2^g(n))...that's backwards from the previous theorem (which said "implies", not "if and only if").

Do I need to try this proof another way, or can I actually go off of what I have (at least as a starter)?

**Edit:** Speaking of other ways, maybe I could just argue that raising 2 to some g(n) that is "above" an f(n) will still keep it higher? It almost feels like a common-sense argument, but maybe I'm missing something important...

**Edit 2:** Oh, oops! I forgot to add that f(n) and g(n) are asymptotically positive. By our textbook definition, this means that they are "positive for all sufficiently large n."

Clock Slave
norman

3 Answers


Well, it's not even true to begin with.

Let's say algorithm A takes 2n steps, and algorithm B takes n steps. Then their ratio is a constant.

But the ratio of 2^(2n) and 2^n is not a constant, so what you said doesn't hold.
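To see the gap numerically, here is a quick sketch (my own illustration, not part of the original answer), using f(n) = 2n and g(n) = n:

```python
# f(n) = 2n and g(n) = n: the ratio f/g is the constant 2,
# so f(n) = O(g(n)). But 2^f(n) / 2^g(n) = 2^(2n) / 2^n = 2^n,
# which grows without bound, so 2^f(n) is NOT O(2^g(n)).
for n in [1, 5, 10, 20]:
    f, g = 2 * n, n
    print(n, f / g, 2**f // 2**g)  # f/g stays 2.0; 2^f / 2^g keeps growing
```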

user541686
  • I'm not sure I understand what you're saying here. How does the ratio of two algorithms' runtimes being a constant connect with whether/how one bounds the other? – norman Sep 11 '12 at 01:58
  • 2
    @nicole: When you say `f(n) = O(g(n))` [what you mean (by definition) is](http://en.wikipedia.org/wiki/Big_O_notation#Formal_definition) that the ratio f(n)/g(n) is bounded above by some finite constant *c* for all sufficiently large n. If the ratio grows without bound then (by definition) the statement isn't true. For your case, if you have f(n) = 2n and g(n) = n, then f(n) = O(g(n)), but 2^(2n) is not O(2^n) because the ratio 2^(2n)/2^n = 2^n is unbounded. – user541686 Sep 11 '12 at 02:00
  • 2
    Ah! We actually did go over that theorem in lecture, but I forgot about it. So, I guess we could say that 2^2n/2^n = 2^n, which does not give us a nice, finite limit. Thanks! – norman Sep 11 '12 at 02:09
  • @Mehrdad - but if algorithm A takes 2n steps and B takes n steps, is it true that A = O(B), which is an assumption given in the problem? My interpretation is A = O(B) implies that B grows faster than A, in which case it is intuitively obvious that this statement is true (since 2^x is monotonically increasing on x). But is my assumption an over-simplification? – drew moore Jan 20 '15 at 21:29
  • 1
    @drewmoore: There is no reason that B grows faster than A or that A grows faster than B, it's just as correct to say n = O(2n) as it is to say 2n = O(n). However, by convention the argument to O is stripped of constants, since 2n = O(n) is easier to understand. – user541686 Jan 20 '15 at 21:37
  • @Mehrdad: And this is simply because we ignore coefficients in asymptotic notation, correct? If, hypothetically, we did not, then it would NOT be true that 2n = O(n), because 2n grows faster than n. But, since we ignore the coefficient and simply look at 2n and n as 1-degree functions, the example you give is possible. Correct? – drew moore Jan 20 '15 at 21:53
  • 1
    @drewmoore: I'm not sure which example you're referring to, but yes, it's because we ignore constants. – user541686 Jan 20 '15 at 22:13
  • @Mehrdad - Thanks! Let me ask that a better way: If the assumption was that f(n) = o(g(n)) (little o rather than big O), then it WOULD be true that 2^f(n) = O(2^g(n)), because in that case we'd know the runtime of g(n) is strictly greater than that of f(n), not just >= - correct? – drew moore Jan 21 '15 at 01:16
  • @drewmoore: Well yes, if f(n)/g(n) -> 0 as n -> infinity then that implies f(n) < g(n) for large n which means 2^f(n) < 2^g(n) for large n so it is certainly also true that 2^f(n) <= 2^g(n), which implies 2^f(n) = O(2^g(n)). – user541686 Jan 21 '15 at 01:43
  • @Mehrdad Could I ask you something? How can we find the functions f(n) such that f(n)=O(f(n)^2) ? – Mary Star Feb 27 '15 at 18:19
  • 1
    @evinda: You want f(n)/f(n)^2 = c (some constant), that means 1/f(n) = c or f(n) = 1/c, so that means f(n) must be a constant. – user541686 Feb 27 '15 at 19:03
  • @Mehrdad I am looking at the exercise: Prove or disprove that f(n)=O(f(n)^2). Could I answer as follows? Let f(n)=1/n, for all n in N. Suppose that f(n)=O(f(n)^2). That means that there are c>0 and n_0 in N such that 1/n=f(n) <= c f(n)^2 = c(1/n^2), for all n >= n_0. 1/n <= c(1/n^2) => n^2/n <= c => n <= c, contradiction. – Mary Star Feb 27 '15 at 19:41
  • @Mehrdad We notice that f(n) in O(f(n)^2) iff there are c>0, n_0 in N such that for all n>=n_0: f(n) <= c f(n)^2 =>(f(n) !=0) 1 <=c f(n) => f(n) \geq 1/c:=C =>f(n) in Ω(1). – Mary Star Feb 27 '15 at 19:41
  • @evinda: Sorry, I'm not going to correct your homework for you. I just gave you a proof that the only f's that satisfy this condition are those that must be constant, so that should be enough. – user541686 Feb 27 '15 at 19:52
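The little-o direction raised in the comments above can also be checked numerically. A minimal sketch (my own, with the hypothetical choice f(n) = lg n, g(n) = n):

```python
import math

# If f(n) = o(g(n)), then f(n) < g(n) for all large n, hence
# 2^f(n) < 2^g(n), and so 2^f(n) = O(2^g(n)).
# Example: f(n) = lg n, g(n) = n.
for n in [4, 16, 256]:
    f, g = math.log2(n), n
    print(n, f / g, 2**f / 2**g)  # both ratios shrink toward 0
```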

If f(n) = O(g(n)), it does not follow that 2^(f(n)) = O(2^(g(n))).

Let f(n) = 2 lg n and g(n) = lg n
(where lg is log base 2).

Since 2 lg n <= c(lg n) for any c >= 2, we have f(n) = O(g(n)).

But:

2^(f(n)) = 2^(2 lg n) = 2^(lg n^2) = n^2
2^(g(n)) = 2^(lg n) = n

We know that
n^2 is not O(n).

Therefore, 2^(f(n)) is not O(2^(g(n))).
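A quick numeric check of this counterexample (my own sketch; it exponentiates floats, so values are rounded):

```python
import math

# f(n) = 2 lg n and g(n) = lg n, so f = 2g and f(n) = O(g(n)).
# But 2^f(n) = n^2 while 2^g(n) = n, and n^2 is not O(n).
for n in [2, 8, 64]:
    f = 2 * math.log2(n)
    g = math.log2(n)
    print(n, round(2**f), round(2**g))  # n^2 versus n
```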


For any f, g: N -> R*, if f(n) = O(g(n)) then 2^(f(n)) = O(2^(g(n))) (1)

We can disprove (1) by finding a counter-example.

Suppose (1) is true. Then, by the Big-O definition, there exist c > 0 and an integer m >= 0 such that:

2^(f(n)) <= c·2^(g(n)) , for all n >= m (2)

Select f(n) = 2n and g(n) = n; then f(n) = O(g(n)), so apply them to (2).

-> 2^(2n) <= c·2^n -> 2^n <= c (3)

This means: there exists c>0 and integer m >= 0 such that: 2^n <= c , for all n >= m.

There is no such c, because for any candidate c we can always find n > lg(c) that makes (3) false: 2^n > c for all n > lg(c).

Therefore, (1) cannot be true.
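The final step, that no constant c can satisfy (3), can be illustrated with a sketch of my own: for any candidate c, the first integer n above lg(c) already breaks the bound.

```python
import math

# For any proposed constant c > 0, choosing n just above lg(c)
# gives 2^n > c, so no c can witness 2^(2n) = O(2^n).
for c in [10.0, 1e6, 1e30]:
    n = math.floor(math.log2(c)) + 1
    print(c, n, 2**n <= c)  # the bound fails every time
```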

Tuan Le PN