
I'm a bit confused about O(log n). Given this code:

public static boolean isPalindrome(String s) {
    char[] chars = s.toCharArray();
    for (int i = 0; i < (chars.length / 2); i++) {
        if (chars[i] != chars[(chars.length - i - 1)]) {
            return false;
        }
    }
    return true;
}

I am looping n/2 times. So, as the length n increases, my running time increases at half the rate of n. I thought that was exactly what log n meant? But the person who wrote this code said it is still O(n).

In what kind of loop can something be O(log n)? For example, this code:

1. for (int i = 0; i < (n * .8); i++)

Is this log n? I'm looping over 80% of n.

What about this one?

2. for (int i = 1; i < n; i += (i * 1.2))

Is that log n? If so, why?

deHaar
JohnC1
  • The last thing is an *O(log n)*. (1) is just *O(n)*. – Willem Van Onsem Jan 14 '19 at 22:48
  • The time complexity typically refers to the size of the input: here, for a string of size n, it takes O(n) operations, so the complexity is O(n). What you're describing is more like the growth rate, which is different. The last one is log(n), since it takes roughly log_2.2(n) operations to go through the loop, which (provided the body of the loop has not too many operations) implies the complexity is O(log(n)) – Countingstuff Jan 14 '19 at 22:50
  • Are you familiar with logarithms? – ruakh Jan 14 '19 at 22:54
  • O(n/2) is the same as O(n), but a log is a different thing altogether. For example, a base-10 log tells you how many digits are in a number: log(1) = 0, log(10) = 1, log(100) = 2, log(10000000000) = 10. So you see that with an O(log(n)) algorithm, if you loop 100 times you can process a googol of items (10^100), which is very different from dividing. – Jerry Jeremiah Jan 14 '19 at 22:59
  • The complexity simply describes how the number of operations varies with the number of input elements. The first example is a linear relation (a line on a graph) between the number of characters and the number of loop iterations. In the last example the step interval of the `for` loop increases by 20% at each step, which means that over time it gets bigger, and therefore the number of remaining steps shrinks exponentially, hence a logarithmic complexity. So you could have two algorithms that are both O(n), but with one looping n/2 times and the other n/4 times. – Toady Jan 14 '19 at 23:01
  • A small correction though... your last loop example was an infinite loop, since the starting value of `i` was 0 and `0 + 0 * 1.2 = 0`, so there was no increment! So my previous comment is only valid for a non-zero starting value and an increment like `i = i * 1.2` – Toady Jan 14 '19 at 23:19
  • @user10472446 great catch, fixed. – JohnC1 Jan 14 '19 at 23:22
  • Possible duplicate of [How to find time complexity of an algorithm](https://stackoverflow.com/questions/11032015/how-to-find-time-complexity-of-an-algorithm) – Prune Jan 15 '19 at 00:10

2 Answers


1. for (int i = 0; i < (n * .8); i++)

In the first case, you can basically replace 0.8*n with another variable; let's call it m:

for (int i = 0; i < m; i++)

You're looping m times, increasing the value of i by one unit in each iteration. Since m = 0.8*n grows in direct proportion to n, the Big-O complexity of the above loop is O(n).
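To see the linear scaling concretely, here's a quick sketch (my illustration, not part of the original answer; the class and method names are made up for the demo) that counts the iterations of the 0.8*n loop:

```java
// Illustration: counting iterations of "for (int i = 0; i < n * 0.8; i++)".
public class LinearLoopDemo {
    // Returns how many times the loop body runs for a given n.
    static int iterations(int n) {
        int count = 0;
        for (int i = 0; i < (n * 0.8); i++) {
            count++;
        }
        return count;
    }

    public static void main(String[] args) {
        // Doubling n doubles the work: that is the definition of O(n).
        System.out.println(iterations(1000)); // 800
        System.out.println(iterations(2000)); // 1600
    }
}
```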

2. for (int i = 0; i < n; i += (i * 1.2))

In the second scenario you're not incrementing the value of i at all: i is always going to be 0. This is a classic case of an infinite loop.

What you're looking for is for (int i = 1; i <= n; i += (i * 1.2)). Here the value of i grows geometrically, so the number of iterations is logarithmic (though not to base 2).

Consider for (int i = 1; i <= n; i += i). The value of i doubles after every iteration: i is going to be 1, 2, 4, 8, 16, 32, 64, ... Say n is 64: the loop terminates after 7 iterations, which is log(64) to base 2, plus 1 (+1 because we start the loop from 1). Hence it is a logarithmic operation.
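As a sanity check (my sketch, not from the original answer; the names are made up for the demo), counting the iterations of the doubling loop confirms the log-base-2 relationship:

```java
// Illustration: counting iterations of the doubling loop "i += i".
public class DoublingLoopDemo {
    // Returns how many times the loop body runs for a given n.
    static int iterations(int n) {
        int count = 0;
        for (int i = 1; i <= n; i += i) { // i takes values 1, 2, 4, 8, ...
            count++;
        }
        return count;
    }

    public static void main(String[] args) {
        // log2(64) + 1 = 7 iterations for n = 64.
        System.out.println(iterations(64)); // 7
        // Doubling n adds just one more iteration.
        System.out.println(iterations(128)); // 8
    }
}
```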

2. for (int i = 1; i <= n; i += (i * 1.2))

In your case the solution is logarithmic as well, but not to base 2: since i += i * 1.2 is the same as i = 2.2 * i, the base of your logarithm is 2.2. In Big-O notation that still boils down to O(log(n)).

Vineeth Chitteti

I think you're missing what time complexity is and how Big-O notation works.

Big-O notation is used to describe the asymptotic behavior of an algorithm as the size of the problem grows (to infinity). Constant coefficients do not matter.

As a simple intuition: if, when you increase n by a factor of 2, the number of steps you need to perform also increases by about 2 times, it is linear time complexity, or what is called O(n).

So let's get back to your examples #1 and #2:

  1. yes, you do only chars.length/2 loop iterations, but if the length of s is doubled, you also double the number of iterations. This is exactly linear time complexity

  2. similarly to the previous case, you do 0.8*n iterations, but if n is doubled, you do twice as many iterations. Again, this is linear
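Case 1 can be checked directly. Here's a small sketch (my illustration, not from the answer; the names are made up) that counts the iterations of the palindrome check's n/2 loop for strings of different lengths:

```java
// Illustration: the palindrome check's n/2 loop still scales linearly with n.
public class HalfLoopDemo {
    // Returns how many times the comparison loop runs for a string s.
    static int iterations(String s) {
        int count = 0;
        for (int i = 0; i < s.length() / 2; i++) {
            count++;
        }
        return count;
    }

    public static void main(String[] args) {
        // Doubling the string length doubles the iteration count: O(n).
        System.out.println(iterations("a".repeat(100))); // 50
        System.out.println(iterations("a".repeat(200))); // 100
    }
}
```

(`String.repeat` requires Java 11 or later.)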

The last example is different. The coefficient 1.2 doesn't really matter. What matters is that you add i to itself. Let's rewrite that statement a bit:

i += (i * 1.2)

is the same as

i = i + (i * 1.2)

which is the same as

i = 2.2 * i

Now you can clearly see that on each iteration you more than double i. So if you double n, you'll need only one more iteration (or even the same number). This is the sign of a fundamentally sub-linear time complexity. And yes, this is an example of O(log(n)), because for a big n you need only about log(n, base=2.2) iterations, and it is true that

 log(n, base=a) = log(n, base=b) / log(a, base=b) = constant * log(n, base=b) 

where constant is 1/log(a, base=b).
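To make this concrete, here's a small sketch (my illustration, not part of the original answer; the names are made up for the demo) that counts the iterations of the 2.2-growth loop and compares the count against log(n, base=2.2):

```java
// Illustration: counting iterations of "i += i * 1.2", i.e. i = 2.2 * i
// (with int truncation at each step, which shifts the count slightly).
public class LogLoopDemo {
    // Returns how many times the loop body runs for a given n.
    static int iterations(int n) {
        int count = 0;
        for (int i = 1; i <= n; i += (i * 1.2)) {
            count++;
        }
        return count;
    }

    public static void main(String[] args) {
        // log(1_000_000, base=2.2) = ln(1e6) / ln(2.2), about 17.5.
        double predicted = Math.log(1_000_000) / Math.log(2.2);
        int actual = iterations(1_000_000);
        // The loop count tracks log_2.2(n) closely, while n itself
        // is a million: clearly sub-linear.
        System.out.println(predicted);
        System.out.println(actual);
    }
}
```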

SergGr