Time complexity of a recursive function with two calls

Submitted by China☆狼群 on 2019-12-24 00:42:13

Question


Consider this code:

def count_7(lst):
    if len(lst) == 1:
        if lst[0] == 7:
            return 1
        else:
            return 0

    return count_7(lst[:len(lst)//2]) + count_7(lst[len(lst)//2:])

Note: the slicing operations are to be considered O(1).

So, my intuition tells me it's O(n*log n), but I'm struggling to prove it rigorously.
I'd be glad for some help!


Answer 1:


Ok, mathematically (sort of ;) I get something like this:

T(n) = 2T(n/2) + c
T(1) = 1

Generalizing the equation:

T(n) = 2^k * T(n/2^k) + (2^k - 1) * c
T(1) = 1

n/2^k == 1 when k == logN so:

T(n) = 2^logN * T(1) + (2^logN - 1) * c

since T(1) = 1 and 2^logN = N (logs are base 2):

T(n) = n * 1 + (n-1) * c
T(n) = n + (n-1) * c
T(n) <= n * (1+c)
T(n) = O(n)

A clue that this is not O(n*logn) is that you don't have to combine the two subproblems. Unlike mergesort, where you have to combine the two sub arrays, this algorithm doesn't have to do anything with the recursive result, so its time can be expressed as constant c.
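One way to see the linearity concretely (this is my own instrumented copy of the question's code, not part of the original answer) is to count every recursive call: solving the recurrence for the call count gives 2n - 1 for n a power of two, which is linear rather than n log n.

```python
def count_7_counting(lst, counter):
    # Instrumented copy of the question's count_7 that also
    # counts every recursive call in counter[0].
    counter[0] += 1
    if len(lst) == 1:
        return 1 if lst[0] == 7 else 0
    mid = len(lst) // 2
    return count_7_counting(lst[:mid], counter) + count_7_counting(lst[mid:], counter)

for n in (2, 8, 64, 1024):
    counter = [0]
    count_7_counting([7] * n, counter)
    print(n, counter[0])  # 2n - 1 calls: linear, not n log n
```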

UPDATE: Intuition behind

This algorithm should be O(n) because you visit each element in the array exactly once. That may not seem obvious, because recursion rarely is.

For example, you divide the problem into two subproblems of half the size; each subproblem is then divided in half again, and so on until every subproblem has size 1. At the end you have n subproblems of size 1, which is n * O(1) = O(n).

The path from the original problem down to the n problems of size 1 is logarithmic, because each step halves the size. But at each step you do nothing with the results beyond a constant-time addition, so the depth adds no extra factor to the complexity.
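The two observations above, n leaves and logarithmic depth, can be checked with a hypothetical instrumented variant of the function (my own sketch, not from the answer):

```python
def count_7_depth(lst, depth, stats):
    # Instrumented copy of the question's count_7: records the deepest
    # recursion level reached and counts the size-1 subproblems (leaves).
    stats["max_depth"] = max(stats["max_depth"], depth)
    if len(lst) == 1:
        stats["leaves"] += 1
        return 1 if lst[0] == 7 else 0
    mid = len(lst) // 2
    return (count_7_depth(lst[:mid], depth + 1, stats)
            + count_7_depth(lst[mid:], depth + 1, stats))

n = 16
stats = {"max_depth": 0, "leaves": 0}
sevens = count_7_depth([7] * n, 1, stats)
print(sevens, stats)  # n leaves of size 1; log2(n) + 1 levels deep
```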

Hope this helps




Answer 2:


The easiest way is to assume, for simplicity, that n is a power of 2: n = 2^m

The time complexity of your algorithm is (c is a constant):

t(n) = 2 t(n/2) + c

And using recursion you get:

t(n) = 2^2 t(n/2^2) + 2c + c

     ...

     = 2^log(n) t(n/2^log(n)) + c(2^(log(n)-1) + ... + 2^2 + 2^1 + 2^0)

Which can be simplified by noticing that log(n) = m, and thus 2^log(n) = 2^m = n.

     = n + c(2^(log(n)-1) + ... + 2^2 + 2^1 + 2^0)

Finally, the sum above is a geometric series equal to 2^log(n) - 1 = n - 1, so:

 t(n) = n + c(n - 1) <= (1 + c) n

So your solution is O(n).
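As a sanity check, the recurrence can be evaluated directly and compared against the closed form t(n) = n + (n - 1)c (a small sketch of my own; the choice c = 3 is arbitrary):

```python
def t(n, c):
    # Direct evaluation of the recurrence t(n) = 2 t(n/2) + c, t(1) = 1,
    # for n a power of two.
    if n == 1:
        return 1
    return 2 * t(n // 2, c) + c

c = 3
for m in range(11):
    n = 2 ** m
    assert t(n, c) == n + (n - 1) * c  # closed form: linear in n
print(t(16, c))  # 61
```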




Answer 3:


You scan every element of the list exactly once, so that's O(n). The only difference from a simple linear scan is the recursive structure: the list is repeatedly split in half and each half is handled depth-first, but each element is still examined exactly once, so the complexity is the same.
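A quick trace (again my own modified copy of the question's code, not from the answer) confirms that each element is examined exactly once:

```python
def count_7_trace(lst, visited):
    # Traced copy of the question's count_7 that records each
    # size-1 element at the moment it is examined.
    if len(lst) == 1:
        visited.append(lst[0])
        return 1 if lst[0] == 7 else 0
    mid = len(lst) // 2
    return count_7_trace(lst[:mid], visited) + count_7_trace(lst[mid:], visited)

visited = []
count_7_trace(list(range(8)), visited)
print(visited)  # every element appears exactly once
```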



Source: https://stackoverflow.com/questions/20447402/time-complexity-of-a-recursive-function-with-two-calls
