Big O of Recursive Methods

Asked 2020-12-15 06:02

I'm having difficulty determining the big O of simple recursive methods. I can't wrap my head around what happens when a method is called multiple times. I would be more …

2 Answers
  • 2020-12-15 06:38

    You might want to refer to the master theorem for finding the big O of recursive methods. Here is the Wikipedia article: http://en.wikipedia.org/wiki/Master_theorem

    You want to think of a recursive problem like a tree. Then consider each level of the tree and the amount of work it requires. Problems generally fall into three categories: root-heavy (first level >> rest of the tree), balanced (each level does roughly the same amount of work), and leaf-heavy (last level >> rest of the tree).
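    As a rough illustration (these recurrences are my own examples, not from the original question), here is one typical recurrence per category and what it works out to:

    root heavy:   T(n) = T(n/2) + O(n)     ->  O(n)          (the first call dominates)
    balanced:     T(n) = 2*T(n/2) + O(n)   ->  O(n*log(n))   (merge sort, below)
    leaf heavy:   T(n) = 2*T(n/2) + O(1)   ->  O(n)          (the n leaves dominate)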

    Taking merge sort as an example:

    define mergeSort(list toSort):
        if(length of toSort <= 1):
            return toSort
        list left = toSort from [0, length of toSort/2)
        list right = toSort from [length of toSort/2, length of toSort)
        return merge(mergeSort(left), mergeSort(right))
    

    You can see that each call of mergeSort in turn makes two more mergeSort calls, each on half of the original list. We know that the merge procedure takes time proportional to the number of values being merged.
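    The answer leaves merge as a black box; here is a minimal Python sketch (the names merge_sort and merge are my own) that shows why the merge step is linear in the number of elements:

    def merge_sort(to_sort):
        # Base case: 0 or 1 elements are already sorted.
        if len(to_sort) <= 1:
            return to_sort
        mid = len(to_sort) // 2
        # Two recursive calls, each on half of the list.
        left = merge_sort(to_sort[:mid])
        right = merge_sort(to_sort[mid:])
        return merge(left, right)

    def merge(left, right):
        # Each element of left and right is looked at once, so the work is
        # proportional to len(left) + len(right), i.e. linear in the values merged.
        merged = []
        i = j = 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i])
                i += 1
            else:
                merged.append(right[j])
                j += 1
        merged.extend(left[i:])
        merged.extend(right[j:])
        return merged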

    The recurrence relation is then T(n) = 2*T(n/2) + O(n). The 2 comes from the two recursive calls, and the n/2 comes from each call receiving only half of the elements. At every level of the tree there are n elements in total that need to be merged, so the work done at each level is the same: O(n).

    We know the work is evenly distributed (O(n) at each depth) and the tree is log_2(n) levels deep, so the big O of the recursive function is O(n*log(n)).
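    As a sanity check (my own expansion, writing c*n for the merge cost at each level), unrolling the recurrence level by level gives the same result:

    T(n) = 2*T(n/2) + c*n
         = 4*T(n/4) + 2*c*n
         = 8*T(n/8) + 3*c*n
         ...
         = n*T(1) + c*n*log_2(n)   ->  O(n*log(n))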

  • 2020-12-15 06:42

    You can define the order recursively as well. For instance, say you have a function f and computing f(n) takes k steps. Now you want to compute f(n+1). If f(n+1) calls f(n) once and does only a constant amount of extra work, then f(n+1) takes k plus some constant number of steps. Each increase of n adds a constant number of steps, so this method is O(n).
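    For instance (my own example, not from the answer), a recursive factorial behaves exactly like this: one recursive call plus a constant amount of work per call, so the total number of steps grows linearly with n:

    def factorial(n):
        # Constant work per call (one comparison, one multiplication),
        # plus a single recursive call on n - 1: about n calls in total, so O(n).
        if n <= 1:
            return 1
        return n * factorial(n - 1)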

    Now look at another example. Let's say you implement Fibonacci naively by adding the two previous results:

    fib(n) = { if (n <= 1) return n; return fib(n-1) + fib(n-2) }
    

    Now let's say you can calculate both fib(n-2) and fib(n-1) in about k steps each. To calculate fib(n) you then need about k + k = 2*k steps. The same argument applies when you go from fib(n) to fib(n+1): every time n grows by one, the number of steps roughly doubles. So this seems to be O(2^n).
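    A quick way to see this growth (my own sketch, not part of the answer) is to count how many calls the naive version makes:

    def fib(n, counter):
        # Count every invocation so the blow-up is visible.
        counter[0] += 1
        if n <= 1:
            return n
        return fib(n - 1, counter) + fib(n - 2, counter)

    for n in range(10, 31, 5):
        counter = [0]
        fib(n, counter)
        print(n, counter[0])   # the call count grows exponentially in n

    More precisely, the naive version is O(phi^n) with phi ≈ 1.618 (the golden ratio), so O(2^n) is a safe upper bound rather than a tight one.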

    Admittedly, this is not very formal, but hopefully it gives you a bit of a feel for it.
