
I was solving the Majority Element problem on GFG. I solved the same problem with two approaches.

Method 1

import java.util.HashMap;

// Method 1: count occurrences in a HashMap while tracking the running maximum
static int majorityElement1(int a[], int size) {
    HashMap<Integer, Integer> mp = new HashMap<>();
    int count = 0;   // highest frequency seen so far
    int maxNum = 0;  // element holding that frequency
    for (int i = 0; i < a.length; i++) {
        if (mp.containsKey(a[i])) {
            if (count < mp.get(a[i]) + 1) {
                count = mp.get(a[i]) + 1;
                maxNum = a[i];
            }
            mp.replace(a[i], mp.get(a[i]) + 1);
        } else {
            if (count < 1) {
                count = 1;
                maxNum = a[i];
            }
            mp.put(a[i], 1);
        }
    }
    return (mp.get(maxNum) > size / 2) ? maxNum : -1;
}

Method 2

// Method 2: Boyer-Moore majority vote: find a candidate, then verify it
static int majorityElement(int a[], int size) {
    int count = 0;
    int maxNum = 0;  // current candidate
    // First pass: pair off differing elements; a true majority survives
    for (int i = 0; i < a.length; i++) {
        if (count == 0) {
            maxNum = a[i];  // no candidate left, adopt the current element
            count++;
        } else if (maxNum == a[i]) {
            count++;        // same as the candidate: vote for it
        } else {
            count--;        // different from the candidate: vote against it
        }
    }
    // Second pass: verify the candidate occurs more than size / 2 times
    count = 0;
    for (int i = 0; i < a.length; i++) {
        if (a[i] == maxNum) {
            count++;
        }
    }
    return (count > size / 2) ? maxNum : -1;
}

Even though Method 1 solves the problem in O(n) time, the GFG judge reports "Time Limit Exceeded" for it. But it reports successful execution when I submit Method 2, whose time complexity is O(2n). Can someone please help me understand why this is happening?

Yennefer
  • time complexity doesn't mean actual execution speed. It's just a way to compare algorithms as `n` grows larger and larger – Lino Mar 30 '21 at 08:00
  • `O(2n) == O(n)`! Constant factors are neglected. – Seelenvirtuose Mar 30 '21 at 08:00
  • and that is one problem of that notation: `100000n` is worse than `1n`, despite being the same complexity (using a `HashMap` and searching in it surely requires some additional time) – Mar 30 '21 at 08:45
  • How does the second algorithm work? – k314159 Mar 30 '21 at 10:59

1 Answer


Big O notation gives you an indication of how an algorithm's cost grows as some input size 'n' increases. This is necessary to correctly understand the algorithms that you are using.

Given that, you are not seeing the big picture here. Knowing that one algorithm is in O(n) and another is in O(2n) only tells you that both grow at a linear pace; the constant factors are absorbed, so both of them are in O(n). The notation alone can't tell you the exact rate at which each one runs, so you have to dig more.
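For instance, a crude way to see the hidden constant is to time the two methods from the question on the same input. This is a minimal sketch, assuming both methods are pasted into the same class; exact numbers depend on your machine and JVM warm-up, and a serious measurement would use a benchmarking harness such as JMH:

import java.util.Random;

public class Compare {
    public static void main(String[] args) {
        int n = 10_000_000;
        int[] a = new Random(42).ints(n, 0, 50).toArray();  // arbitrary test data

        long t0 = System.nanoTime();
        majorityElement1(a, n);  // Method 1: HashMap counting
        long t1 = System.nanoTime();
        majorityElement(a, n);   // Method 2: Boyer-Moore voting
        long t2 = System.nanoTime();

        System.out.printf("Method 1: %d ms, Method 2: %d ms%n",
                (t1 - t0) / 1_000_000, (t2 - t1) / 1_000_000);
    }

    // paste majorityElement1 and majorityElement from the question here
}

Both calls scan the same array once, so both are O(n); the difference is purely in how much work each iteration does.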

Digging means starting to consider additional details like the following (a sketch of such counting comes after the list):

  • how many times are we reading a value from the memory?
  • how many exchanges are we performing?
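Here is a hypothetical instrumented copy of Method 1 (the counter is mine, not part of the original code) that tallies every hash-table call the loop makes. Each of those calls hashes the key, probes buckets, and boxes or unboxes an Integer; by contrast, Method 2's first pass performs only one or two primitive int comparisons per element:

import java.util.HashMap;

class OpCounter {
    static long mapOps = 0;  // hypothetical counter for HashMap operations

    static int majorityElement1Instrumented(int[] a, int size) {
        HashMap<Integer, Integer> mp = new HashMap<>();
        int count = 0;
        int maxNum = 0;
        for (int i = 0; i < a.length; i++) {
            mapOps++;                               // containsKey: hash + probe
            if (mp.containsKey(a[i])) {
                mapOps++;                           // get for the comparison
                if (count < mp.get(a[i]) + 1) {
                    mapOps++;                       // get for the assignment
                    count = mp.get(a[i]) + 1;
                    maxNum = a[i];
                }
                mapOps += 2;                        // get + replace
                mp.replace(a[i], mp.get(a[i]) + 1);
            } else {
                if (count < 1) {
                    count = 1;
                    maxNum = a[i];
                }
                mapOps++;                           // put
                mp.put(a[i], 1);
            }
        }
        return (mp.get(maxNum) > size / 2) ? maxNum : -1;
    }

    public static void main(String[] args) {
        int[] a = {3, 1, 3, 3, 2};
        majorityElement1Instrumented(a, a.length);
        System.out.println("HashMap operations for 5 elements: " + mapOps);
        // Method 2's first pass would do at most 10 int comparisons here.
    }
}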

For example, consider this scenario: you have an algorithm that sorts an array by swapping entries. This could be inefficient, as you are performing a lot of reads and writes. You come up with a comparable algorithm which allocates another array to help minimize such swaps. The latter version is faster, but it also uses more memory. Memory wasn't part of the initial analysis.
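As a toy illustration of that tradeoff (a hypothetical example using the merge step of merge sort, to keep it short): merging two sorted halves in place forces elements to be shifted repeatedly, while an auxiliary buffer does one write per element at the price of O(n) extra memory, a cost that a time-only analysis never shows:

// Hypothetical sketch: merge two sorted halves a[0..mid) and a[mid..n)
// using an auxiliary buffer. Fewer writes than an in-place merge, but
// O(n) extra memory that the time-only analysis does not mention.
static void mergeWithBuffer(int[] a, int mid) {
    int[] buf = new int[a.length];  // the extra memory being traded
    int i = 0, j = mid, k = 0;
    while (i < mid && j < a.length) {
        buf[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
    }
    while (i < mid) buf[k++] = a[i++];
    while (j < a.length) buf[k++] = a[j++];
    System.arraycopy(buf, 0, a, 0, a.length);  // copy back in one pass
}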

Therefore, Big-O notation gives you an insight for rough comparisons; then you must analyze your actual inputs and outputs in order to understand the real cost: you can use an inefficient algorithm to sort 10 items, but you might not want to do the same with 100000.

Wikipedia has a very detailed and comprehensive article here. You might want to have a look at it; it could give you a better insight into the subject.

Yennefer