Question:
How do you calculate the Big-O of the binary search algorithm? I'm looking for the mathematical proof, not just the answer.
Answer 1:
In the worst case, the recurrence relation of binary search is
T(n) = T(n/2) + O(1)
Using the Master theorem for recurrences of the form T(n) = a·T(n/b) + f(n), where:

- n is the size of the problem.
- a is the number of subproblems in the recursion.
- n/b is the size of each subproblem. (Here it is assumed that all subproblems are essentially the same size.)
- f(n) is the cost of the work done outside the recursive calls, which includes the cost of dividing the problem and the cost of merging the solutions to the subproblems.
Here a = 1, b = 2 and f(n) = O(1) [Constant]
We have f(n) = O(1) = O(n^(log_b a)), since log_b a = log_2 1 = 0 and hence n^(log_b a) = n^0 = 1. This is case 2 of the Master theorem, so
⇒ T(n) = O(n^(log_b a) · log_2 n) = O(log_2 n) = O(log n)
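
To see where this recurrence comes from, here is a minimal recursive sketch of binary search (Python; the function and variable names are my own, not from the original answer). Each call does O(1) work (one comparison plus constant index arithmetic) and makes at most one recursive call on half of the range, which is exactly T(n) = T(n/2) + O(1) with a = 1 and b = 2:

```python
def binary_search(arr, target, lo=0, hi=None):
    """Return an index of target in the sorted list arr, or -1 if absent."""
    if hi is None:
        hi = len(arr) - 1
    if lo > hi:                # empty range: target is not present
        return -1
    mid = (lo + hi) // 2       # f(n) = O(1): one comparison, constant arithmetic
    if arr[mid] == target:
        return mid
    if arr[mid] < target:      # recurse on the right half: T(n/2)
        return binary_search(arr, target, mid + 1, hi)
    return binary_search(arr, target, lo, mid - 1)   # left half: T(n/2)
```

Note that only one of the two recursive calls is ever taken per invocation, which is why a = 1 rather than 2.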
Answer 2:
The proof is quite simple: with each recursion you halve the number of remaining items, assuming you have not already found the item you were looking for. Since you can halve a number n at most log2(n) times before only one item remains, this also bounds the number of recursions:
2 · 2 · … · 2 · 2 = 2^x ≤ n ⇒ log2(2^x) = x ≤ log2(n)
Here x is the number of recursions (halving steps). With a local cost of O(1) per step, the total cost is O(log n).
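
As a quick sanity check of this bound (a sketch I've added, not part of the original answer), the snippet below counts how many integer halvings it takes to shrink n items down to one; the count comes out to ⌊log2(n)⌋, consistent with x ≤ log2(n):

```python
import math

def halving_steps(n):
    """Count how many times n can be halved (integer division) before reaching 1."""
    steps = 0
    while n > 1:
        n //= 2        # each recursion discards half of the remaining items
        steps += 1
    return steps

for n in (1, 2, 10, 1000, 1_000_000):
    print(n, halving_steps(n), math.floor(math.log2(n)))
```

For each n the two printed counts match, so the number of halvings never exceeds log2(n).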
Source: https://stackoverflow.com/questions/6094864/how-do-you-calculate-the-big-oh-of-the-binary-search-algorithm