
I am currently studying basic algorithms for Big Oh. I was wondering if anyone could show me what O(n log n) code in Java would look like, or direct me to an SO page where one exists.

Since I am just a beginner, I can only imagine the code before I write it. So, theoretically (at least), it should contain one for loop that runs n times. Then, for the log n part, we could use a while loop, so the for loop is executed n times and the while loop is executed log base 2 of n times. At least that is how I am imagining it in my head, but seeing the code would clear things up.

hherklj kljkljklj
  • I'm not sure if I understand you correctly. Are you asking for an example of an algorithm with a time complexity in O(n log n)? – Carsten Sep 26 '13 at 06:46
  • Try to study any good sorting algorithm like merge sort. The following link may help you: http://stackoverflow.com/questions/1592649/examples-of-algorithms-which-has-o1-on-log-n-and-olog-n-complexities – Abhishek Bansal Sep 26 '13 at 06:46
  • Yes. I just want to see what the code would look like in a Java program. – hherklj kljkljklj Sep 26 '13 at 06:47
  • @hherkljkljkljklj There are many such algorithms, and they don't necessarily look/work the same. Two examples you will find are Quick Sort and Merge Sort. – Kevin A. Naudé Sep 26 '13 at 06:49
  • @AbhishekBansal Well, I do understand the algorithm theoretically (more or less), but I just want to see how the algorithm would be written in a Java program. – hherklj kljkljklj Sep 26 '13 at 06:49
  • If you just want sample code, try to Google it. I'm sure you'll find it. – Abhishek Bansal Sep 26 '13 at 06:51
  • @AbhishekBansal I have. I found examples of many other Big Oh complexities such as O(n^2), but not one for O(n log n). – hherklj kljkljklj Sep 26 '13 at 06:53
  • http://www.code2learn.com/2011/07/merge-sort-using-java.html – Abhishek Bansal Sep 26 '13 at 06:55
  • @hherkljkljkljklj You will probably need to learn about recursion before you can appreciate most O(n log n) algorithms. They don't typically comprise two nested explicit loops. – Kevin A. Naudé Sep 26 '13 at 06:56
  • It sounds like you may be a bit confused. O-notation isn't an algorithm, it's a metric that _describes_ algorithms. So, you don't implement O(n log n) -- you implement an algorithm, which may have guaranteed performance of O(n log n). To use a housing analogy, you don't build 20 feet, you build a house and then measure how many feet tall it is. – yshavit Sep 26 '13 at 07:04
  • +1 for studying "basic algorithms for Big Oh" – necromancer Sep 26 '13 at 07:35

4 Answers

int n = 100;
for(int i = 0; i < n; i++) //this loop is executed n times, so O(n)
{
    for(int j = n; j > 0; j/=2) //this loop is executed O(log n) times
    {

    }
}

Explanation: The outer for loop should be clear; it is executed n times. Now for the inner loop: there, you take n and repeatedly divide it by 2. So you ask yourself: how many times can I divide n by 2?

It turns out that this is O(log n). The base of the log is in fact 2, but in Big-O notation we drop the base, since changing the base only multiplies the log by a constant factor that we are not interested in.

So, you are executing a loop n times, and within that loop, you are executing another loop log(n) times. So, you have O(n) * O(log n) = O(n log n).
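
If it helps to see the growth concretely, here is a minimal, self-contained sketch (the class name NLogNDemo and the sample sizes are my own choices, not part of the original snippet) that counts how many times the inner loop body runs and compares it with n * log2(n):

public class NLogNDemo {
    public static void main(String[] args) {
        // Count how many times the inner loop body runs for a few values of n.
        for (int n = 1_000; n <= 1_000_000; n *= 10) {
            long count = 0;
            for (int i = 0; i < n; i++) {        // executed n times
                for (int j = n; j > 0; j /= 2) { // executed about log2(n) times
                    count++;
                }
            }
            double expected = n * (Math.log(n) / Math.log(2));
            System.out.printf("n = %,d  iterations = %,d  n*log2(n) ~ %,.0f%n",
                    n, count, expected);
        }
    }
}

The printed counts grow slightly faster than linearly, which is exactly the n log n shape described above.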

John R Perry
productioncoder
  • This would be O(n log n), but it isn't realistic. Almost *all* O(n log n) algorithms are recursive. Perhaps you could offer a more typical form. – Kevin A. Naudé Sep 26 '13 at 06:55
  • This example was intended to provide a very simple example of what O(n log n) is. You won't see that algorithm in reality. Yes, those algorithms are recursive, but explaining O(n log n) iteratively is easier to understand. The key is to see that you always divide n by two. This is a key feature in most algorithms like Mergesort, where you call the same algorithm for each half. I thought this would be appropriate since he talks about nested loops in his question. – productioncoder Sep 26 '13 at 07:02
  • @slashburn thanks, this is the simplest explanation..! – TheFlash Nov 14 '17 at 07:56
  • @KevinA.Naudé anything that can be written recursively can be written iteratively so saying it's unrealistic makes no sense. – John R Perry Dec 26 '18 at 23:57
  • Isn't that O(n*n)? – Ari Feb 15 '20 at 02:52
  • @Ari If we were to do j-- instead of j/=2 then it would be O(n^2), but since we divide by 2 for every iteration it is O(n * log(n)) – productioncoder Feb 15 '20 at 16:16

A very popular O(n log n) algorithm is merge sort. See http://en.wikipedia.org/wiki/Merge_sort for a description of the algorithm and pseudocode. The log n part comes from repeatedly breaking the problem down into smaller subproblems, so that the height of the recursion tree is log n.

A lot of sorting algorithms have a running time of O(n log n). Refer to http://en.wikipedia.org/wiki/Sorting_algorithm for more examples.
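
If you want to see what that looks like in Java, here is a plain top-down merge sort sketch (class and method names are my own; it follows the standard algorithm rather than any particular reference implementation):

import java.util.Arrays;

public class MergeSortExample {
    // O(n log n): the array is halved until single elements remain (log n levels),
    // and each level does O(n) work merging the halves back together.
    static void mergeSort(int[] a) {
        if (a.length < 2) {
            return; // arrays of length 0 or 1 are already sorted
        }
        int mid = a.length / 2;
        int[] left = Arrays.copyOfRange(a, 0, mid);
        int[] right = Arrays.copyOfRange(a, mid, a.length);
        mergeSort(left);
        mergeSort(right);
        merge(a, left, right);
    }

    // Merges two sorted halves back into a; O(n) per call.
    static void merge(int[] a, int[] left, int[] right) {
        int i = 0, j = 0, k = 0;
        while (i < left.length && j < right.length) {
            a[k++] = (left[i] <= right[j]) ? left[i++] : right[j++];
        }
        while (i < left.length) {
            a[k++] = left[i++];
        }
        while (j < right.length) {
            a[k++] = right[j++];
        }
    }

    public static void main(String[] args) {
        int[] data = {5, 2, 9, 1, 7, 3};
        mergeSort(data);
        System.out.println(Arrays.toString(data)); // prints [1, 2, 3, 5, 7, 9]
    }
}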

rcs

Algorithms whose O(.) time complexity involves a log n factor typically rely on some form of divide and conquer.

For example, in MergeSort the list is halved, each half is individually merge-sorted, and then the two sorted halves are merged together. Each recursive call halves its list again.

Whenever you have work being halved or reduced in size by some fixed factor, you'll usually end up with a log n component of the O(.).

In terms of code, take a look at the algorithm for MergeSort. The important feature of typical implementations is that they are recursive (note that TopDownSplitMerge calls itself twice in the code given on Wikipedia).

All good standard comparison-based sorting algorithms have O(n log n) time complexity, and it's not possible to do better in the worst case; see Comparison Sort.

To see what this looks like in Java code, just search! Here's one example.
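
To make the "log n levels" point concrete, here is a tiny sketch (not the Wikipedia implementation; RecursionDepthDemo and split are hypothetical names of mine) that performs only the halving and reports the resulting recursion depth:

public class RecursionDepthDemo {
    static int maxDepth = 0;

    // Skeleton of a top-down divide and conquer over the range [lo, hi):
    // the range is halved at every level, so the recursion depth is about log2(n).
    static void split(int lo, int hi, int depth) {
        maxDepth = Math.max(maxDepth, depth);
        if (hi - lo < 2) {
            return; // ranges of size 0 or 1 are not split further
        }
        int mid = (lo + hi) / 2;
        split(lo, mid, depth + 1);
        split(mid, hi, depth + 1);
        // a real merge sort would merge [lo, mid) and [mid, hi) here, doing O(n) work per level
    }

    public static void main(String[] args) {
        int n = 1 << 20; // 1,048,576 elements
        split(0, n, 0);
        System.out.println("n = " + n + ", recursion depth = " + maxDepth); // about 20 = log2(n)
    }
}

With log n levels of recursion and O(n) work per level, the total is O(n log n).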

Daniel Renshaw

http://en.wikipedia.org/wiki/Heapsort

A simple example is just like you described: execute an operation that takes log(n) time, n times. Balanced binary trees have log(n) height, so some tree algorithms will have this complexity.
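
As a sketch of the "n operations, each taking log(n)" idea, here is a small example using java.util.PriorityQueue, which is backed by a binary heap (the class name HeapSortSketch is my own; this shows the idea behind heapsort, not an in-place implementation of it):

import java.util.Arrays;
import java.util.PriorityQueue;

public class HeapSortSketch {
    // n offers and n polls, each O(log n) on a binary heap, so O(n log n) overall.
    static int[] heapSort(int[] input) {
        PriorityQueue<Integer> heap = new PriorityQueue<>();
        for (int x : input) {
            heap.offer(x);           // O(log n) per insertion
        }
        int[] sorted = new int[input.length];
        for (int i = 0; i < sorted.length; i++) {
            sorted[i] = heap.poll(); // O(log n) per removal, smallest element first
        }
        return sorted;
    }

    public static void main(String[] args) {
        int[] result = heapSort(new int[]{4, 1, 7, 3, 9, 2});
        System.out.println(Arrays.toString(result)); // prints [1, 2, 3, 4, 7, 9]
    }
}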

mabn