
The question is available here. My Python code is

def solution(A, B):
    if len(A) == 1:
        return [1]

    ways = [0] * (len(A) + 1)

    ways[1], ways[2] = 1, 2
    for i in xrange(3, len(ways)):
        ways[i] = ways[i-1] + ways[i-2]

    result = [1] * len(A)
    for i in xrange(len(A)):
        result[i] = ways[A[i]] & ((1<<B[i]) - 1)

    return result

The time complexity detected by the system is O(L^2), and I can't see why. Thank you in advance.

Tengerye

3 Answers


First, let's show that the runtime genuinely is O(L^2). I copied a section of your code, and ran it with increasing values of L:

import time
import matplotlib.pyplot as plt

def solution(L):
    if L == 0:
        return
    ways = [0] * (L+5)
    ways[1], ways[2] = 1, 2
    for i in xrange(3, len(ways)):
        ways[i] = ways[i-1] + ways[i-2]

points = []
for L in xrange(0, 100001, 10000):
    start = time.time()
    solution(L)
    points.append(time.time() - start)

plt.plot(points)
plt.show()

The resulting graph: [plot showing O(L^2) growth]

To understand why this is O(L^2) when the obvious "time complexity" calculation suggests O(L), note that "time complexity" is not a well-defined concept on its own: it depends on which basic operations you're counting. Normally the basic operations are taken for granted, but in some cases you need to be more careful. Here, if you count additions as a basic operation, the code is O(L). However, if you count bit (or byte) operations, the code is O(L^2). Here's the reason:

You're building an array of the first L Fibonacci numbers. The length (in digits) of the i'th Fibonacci number is Theta(i). So ways[i] = ways[i-1] + ways[i-2] adds two numbers with approximately i digits, which takes O(i) time if you count bit or byte operations.

This observation gives you an O(L^2) bit operation count for this loop:

for i in xrange(3, len(ways)):
    ways[i] = ways[i-1] + ways[i-2]

In the case of this program, it's quite reasonable to count bit operations: the numbers grow without bound as L increases, and adding two huge numbers takes time linear in their length rather than O(1).
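That linear growth in digit count is easy to check empirically. Here is a small Python 3 sketch (my addition, not part of the original answer) using int.bit_length as a proxy for size:

```python
def fib(n):
    # Same convention as the question's ways array: fib(1) = 1, fib(2) = 2.
    a, b = 1, 2
    for _ in range(n - 1):
        a, b = b, a + b
    return a

# Each Fibonacci number is about log2(phi) ~ 0.694 bits longer than the
# previous one, so the i'th number has Theta(i) bits.
for n in (1000, 2000, 4000):
    print(n, fib(n).bit_length())
```

Doubling n roughly doubles the bit length, which is exactly the linear digit growth the argument above relies on.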

You can fix the complexity of your code by computing the Fibonacci numbers mod 2^32 -- since 2^32 is a multiple of 2^B[i]. That will keep a finite bound on the numbers you're dealing with:

for i in xrange(3, len(ways)):
    ways[i] = (ways[i-1] + ways[i-2]) & ((1<<32) - 1)

There are some other issues with the code, but this will fix the slowness.

Paul Hankin
  • The analysis is great and the solution is perfect. Thank you so much. Would you please point out other issues? I want to be strict with my code.@Paul – Tengerye Feb 04 '17 at 17:33
  • Gave you my +1 b/c your comment reminded me of the distinction between uniform cost, which I used, and logarithmic cost, which you used, methods of analyses. – code_dredd Feb 04 '17 at 19:33

I've taken the relevant parts of the function:

def solution(A, B):
    for i in xrange(3, len(A) + 1):  # len(ways) replaced with len(A) + 1 for clarity
        # ...

    for i in xrange(len(A)):
        # ...

    return result

Observations:

  1. A is an iterable object (e.g. a list)
  2. You're iterating over the elements of A in sequence
  3. The behavior of your function depends on the number of elements in A, making it O(A)
  4. You're iterating over A twice, meaning 2 O(A) -> O(A)

On point 4, since 2 is a constant factor, 2 O(A) is still in O(A).

I think the page is not correct in its measurement. Had the loops been nested, then it would've been O(A²), but the loops are not nested.

This short sample is O(N²):

def process_list(my_list):
    for i in range(0, len(my_list)):
        for j in range(0, len(my_list)):
            # do something with my_list[i] and my_list[j]

I've not seen the code the page is using to 'detect' the time complexity of the code, but my guess is that the page is counting the number of loops you're using without understanding much of the actual structure of the code.

EDIT1:

Note that, based on this answer, the time complexity of the len function is actually O(1), not O(N), so the page is not mistakenly counting its use toward the time complexity. If it were, it would have claimed an even larger order of growth, since len is used 4 separate times.

EDIT2:

As @PaulHankin notes, asymptotic analysis also depends on what's considered a "basic operation". In my analysis, I've counted additions and assignments as "basic operations" by using the uniform cost method, not the logarithmic cost method, which I did not mention at first.

Simple arithmetic operations are most often treated as basic operations. This is what I see most commonly done, unless the algorithm being analysed is itself a basic operation (e.g. the time complexity of a multiplication function), which is not the case here.

The only reason why we have different results appears to be this distinction. I think we're both correct.

EDIT3:

While an algorithm in O(N) is also in O(N²), I think it's reasonable to state that the code is still in O(N) b/c, at the level of abstraction we're using, the computational steps that seem more relevant (i.e. are more influential) are in the loop as a function of the size of the input iterable A, not the number of bits being used to represent each value.

Consider the following algorithm to compute a^n:

def function(a, n):
    r = 1
    for i in range(0, n):
        r *= a
    return r

Under the uniform cost method, this is in O(N), because the loop is executed n times, but under logarithmic cost method, the algorithm above turns out to be in O(N²) instead due to the time complexity of the multiplication at line r *= a being in O(N), since the number of bits to represent each number is dependent on the size of the number itself.
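To see that bit growth concretely, here is a Python 3 sketch (a hypothetical helper of my own, not from the original discussion) that runs the same loop while recording the size of r:

```python
def power_bit_lengths(a, n):
    # Same loop as function(a, n), but record r.bit_length() at each step.
    r = 1
    lengths = []
    for _ in range(n):
        r *= a
        lengths.append(r.bit_length())
    return lengths

# Each multiplication adds roughly log2(a) bits, so the i'th multiplication
# works on Theta(i) bits and the whole loop does Theta(n^2) bit work.
print(power_bit_lengths(2, 8))  # → [2, 3, 4, 5, 6, 7, 8, 9]
```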

code_dredd
  • I think the test code uses time to 'detect' complexity. I have replaced the `len(A)` to `L`, but still the same [detection result](https://codility.com/demo/results/trainingSFM9P4-RZF/). Thank you very much for your kind help.@ray – Tengerye Feb 04 '17 at 11:20
  • @user3293726 I don't think I said the page was trying to count `len(A)` in my edit; it was actually the opposite. – code_dredd Feb 04 '17 at 11:23
  • Sorry for the confusion. I though you were right. I just replaced `len(A)` to prove your correctness. Do you think my code complexity is `O(L)` and it is the detection code that is wrong? Thank you again.@ray – Tengerye Feb 04 '17 at 11:28
  • @user3293726 Yes, that's what I said. I think the page is incorrect in its result. The function's order of growth is linear, not quadratic. – code_dredd Feb 04 '17 at 11:29
  • I've timed the code myself and it's genuinely O(L^2). In my answer I show the working, and why you need to be careful about what basic operations are. – Paul Hankin Feb 04 '17 at 14:45
  • @PaulHankin Yes, I did count additions as basic operations. I used the uniform cost method, not the logarithmic cost method, to perform my analysis. I'll clarify this in my post, and thanks for the note. – code_dredd Feb 04 '17 at 19:16

This Codility Ladder problem is super tricky.

First, compute the Fibonacci sequence for the first L+2 numbers. The first two numbers are used only as fillers, so the sequence has to be indexed as A[idx]+1 instead of A[idx]-1. The second step is to replace the modulo operation by removing all but the n lowest bits.
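As I read that description, the approach might be sketched like this (Python 3; this is my own reconstruction of the steps above, not verified against the grader):

```python
def solution(A, B):
    MASK = (1 << 32) - 1
    # The first two entries (0, 1) are fillers, which is why rung n is
    # looked up at index n + 1 rather than n - 1.
    fib = [0, 1]
    for _ in range(len(A) + 1):
        fib.append((fib[-1] + fib[-2]) & MASK)
    # Replace the modulo by removing all but the B[idx] lowest bits.
    return [fib[a + 1] & ((1 << b) - 1) for a, b in zip(A, B)]
```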

555