Constants in the formal definition of Big O

走了就别回头了, asked 2021-01-05 02:10

I'm revising the formal definitions of Big O and the other associated bounds, and something is tripping me up. In the book I'm reading (Skiena), Big O is defined as: "f(n) = O(g(n)) means c · g(n) is an upper bound on f(n). Thus there exists some constant c such that f(n) is always <= c · g(n), for large enough n (i.e., n >= n0 for some constant n0)."
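To make sure I'm reading the definition right, here is a minimal sketch (my own toy check, not from the book; the functions and constants are just examples) that samples a finite range of n and tests whether a chosen pair (c, n0) is consistent with f(n) <= c * g(n). A finite check like this can refute a candidate witness, but it can never prove the bound for all n:

    def is_witness(f, g, c, n0, n_max=10_000):
        # True if f(n) <= c * g(n) for every sampled n in [n0, n_max].
        return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

    f = lambda n: 3 * n + 5   # example cost function
    g = lambda n: n           # candidate upper-bound shape

    print(is_witness(f, g, c=4, n0=5))   # True:  3n + 5 <= 4n once n >= 5
    print(is_witness(f, g, c=3, n0=5))   # False: 3n + 5 > 3n for every n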

2 Answers

  •  感情败类, answered 2021-01-05 02:45

    "When choosing to classify an algorithm into a complexity class, is the general rule of thumb to just choose the lowest growth class that still holds according to the definition of Big O?"

    In terms of notation, just as we have big-O for upper bounds, we have big-Omega for lower bounds, and big-Theta for when you are able to show that the upper bound and the lower bound match.
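    For reference, the three notations can be written out formally in the same style as the big-O definition quoted above (this is my summary, not a quotation from Skiena or Knuth):

        f(n) = O(g(n))      \iff \exists\, c > 0,\ \exists\, n_0 : f(n) \le c\,g(n) \ \text{for all } n \ge n_0
        f(n) = \Omega(g(n)) \iff \exists\, c > 0,\ \exists\, n_0 : f(n) \ge c\,g(n) \ \text{for all } n \ge n_0
        f(n) = \Theta(g(n)) \iff f(n) = O(g(n)) \ \text{and}\ f(n) = \Omega(g(n))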

    https://en.wikipedia.org/wiki/Big_O_notation#The_Knuth_definition

    Assuming that the Knuth quote is correct, we can say you are not alone in thinking that results involving tight asymptotic bounds are more useful :) Sometimes people say big-O when they actually mean big-Theta, but other times they simply don't care, or haven't managed to find a matching lower bound.

    "It seems like I could choose a very large value for c and make the whole thing arbitrary by blowing out the size of smaller g(n) values."

    For functions with different asymptotic growth rates, the choice of c doesn't matter: no matter how large or how small you make c, there will be some n beyond which the faster-growing function catches up. The constant factor is there to let you ignore constant multipliers when two functions have the same growth rate. For example, as far as big-O is concerned, f(x) = 2x and g(x) = 3x have the same growth rate: each is big-O of the other.
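    To see that concretely, here is a small sketch (my own illustration, with made-up example functions) showing that for f(n) = n^2 against g(n) = n, picking a bigger c only pushes the crossover point further out; it never removes it:

        def first_crossover(c, n_max=10_000_000):
            # Smallest n (up to n_max) with n**2 > c * n, i.e. where n^2 escapes c*n.
            for n in range(1, n_max + 1):
                if n * n > c * n:
                    return n
            return None

        for c in (10, 1_000, 1_000_000):
            print(c, first_crossover(c))   # crossover is always c + 1, however large c is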
