Are there any cases where you would prefer O(log n) time complexity to O(1) time complexity? Or O(n) to O(log n)?
A more general question is whether there are situations where one would prefer an O(f(n)) algorithm to an O(g(n)) algorithm even though g(n) << f(n) as n tends to infinity. As others have already mentioned, the answer is clearly "yes" in the case where f(n) = log(n) and g(n) = 1. It is sometimes yes even in the case that f(n) is polynomial but g(n) is exponential.
A famous and important example is the Simplex Algorithm for solving linear programming problems. In the 1970s it was shown to require on the order of 2^n iterations in the worst case, so its worst-case behavior is infeasible. But its average-case behavior is extremely good, even for practical problems with tens of thousands of variables and constraints. In the 1980s, polynomial-time algorithms for linear programming (such as Karmarkar's interior-point algorithm) were discovered, but 30 years later the simplex algorithm still seems to be the algorithm of choice (except for certain very large problems). This is for the obvious reason that average-case behavior is often more important than worst-case behavior, but also for the more subtle reason that the simplex algorithm is in some sense more informative (e.g., sensitivity information is easier to extract).
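To make this concrete, here is a minimal sketch (the problem data are made up, and it assumes SciPy >= 1.7, which exposes the HiGHS simplex and interior-point backends and their dual values): the same small LP is solved once with a dual-simplex method and once with an interior-point method, and the constraint duals that make sensitivity analysis cheap are read straight off the result.

```python
import numpy as np
from scipy.optimize import linprog

# Random but bounded LP: minimize c @ x  subject to  A_ub @ x <= b_ub,  0 <= x <= 1.
rng = np.random.default_rng(0)
m, n = 200, 100                        # constraints, variables (illustrative sizes)
A_ub = rng.standard_normal((m, n))
b_ub = rng.uniform(1.0, 2.0, size=m)   # b_ub > 0, so x = 0 is feasible
c = rng.standard_normal(n)

for method in ("highs-ds", "highs-ipm"):   # dual simplex vs. interior point
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, 1), method=method)
    print(f"{method:9s}  optimal value {res.fun:.4f}  iterations {res.nit}")
    # Dual values (shadow prices) of the inequality constraints: the kind of
    # sensitivity information mentioned above (newer SciPy versions expose
    # these for the HiGHS methods).
    print("  first few duals:", np.round(res.ineqlin.marginals[:3], 4))
```

On a toy instance like this the two methods are indistinguishable; the point is only that the method with exponential worst-case behavior is a perfectly reasonable, and often the preferred, default in practice.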