Question
I don't really know how to express functions in big-O notation. I've seen several sources discuss it, but they only made me more uncertain. When I write in big-O, should I just ignore the constants?
Examples:
1. 0.02N³
2. 4N*log(2^N)
3. 24Nlog(N)
4. N²
5. N*sqrt(N)
This is what I mean by "ignoring the constants":
1. O(N³)
2. O( N*log(2^N) )
3. O( Nlog(N) )
4. O( N² )
5. O( N*sqrt(N) )
And how fast do O( N*log(2^N) ) and O( N*sqrt(N) ) grow compared to the other examples?
I really appreciate the help, so thanks in advance.
Answer 1:
Big-O notation characterizes the asymptotic behavior of a function. Informally, f(x) = O(g(x)) means that the ratio f(x)/g(x) stays bounded by some constant as x → ∞ (formally, lim sup (x→∞) |f(x)/g(x)| < ∞).
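As a worked instance (my own illustration, using example 1 from the question), the ratio of the function to the candidate bound tends to a finite constant, which is exactly why the constant factor can be dropped:

\[
  \lim_{N \to \infty} \frac{0.02\,N^{3}}{N^{3}} = 0.02 < \infty
  \quad\Longrightarrow\quad 0.02\,N^{3} = O(N^{3}).
\]

The reciprocal ratio is also bounded (it tends to 1/0.02 = 50), so the bound is in fact tight: 0.02 N³ = Θ(N³).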
Let's get some clarity. There are 5 common notations (Bachmann–Landau notations):
ω (small omega)
Ω (big omega)
Θ (theta)
Ο (big o)
ο (small o)
They work like the mathematical comparison operators listed below (a concrete reading of the correspondence follows after the list):
> (strictly greater)
>= (greater or equal)
= (equal)
<= (less or equal)
< (strictly less)
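As a sketch of that correspondence (my own illustration, with n² on the left-hand side, in the same order as the list above):

\begin{align*}
n^{2} &= \omega(n)     && \text{grows strictly faster than } n,     && \text{like } >\\
n^{2} &= \Omega(n)     && \text{grows at least as fast as } n,      && \text{like } \ge\\
n^{2} &= \Theta(n^{2}) && \text{grows at the same rate as } n^{2},  && \text{like } =\\
n^{2} &= O(n^{3})      && \text{grows no faster than } n^{3},       && \text{like } \le\\
n^{2} &= o(n^{3})      && \text{grows strictly slower than } n^{3}, && \text{like } <
\end{align*}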
Strictly speaking, big-O is only an upper bound, so you can't tell which function grows faster based on big-O notation alone.
For example, quick sort has worst-case complexity O(n²), but it's also correct to say that quick sort has worst-case complexity O(n⁸⁸⁹). It's just like saying x < 889 when we already know that x < 2.
Because of this limiting behavior, you can ignore constant factors and lower-order summands of your functions (they are "dominated" by the highest-order summand).
For example, if f(n) = 33n³ + n² + n + 3544, it's right to say that f(n) = O(n³). (Moreover, it's right to say f(n) = Θ(n³), which is much more informative; Θ is called a tight bound.)
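A minimal Python sketch (my own illustration, not part of the original answer) that makes the dominance visible numerically: the ratio f(n)/n³ approaches the leading constant 33 as n grows, which is the Θ(n³) claim above.

# Sketch: numerically check that f(n) = 33*n^3 + n^2 + n + 3544 is Theta(n^3)
# by watching f(n)/n^3 approach the leading constant 33 as n grows.

def f(n):
    return 33 * n**3 + n**2 + n + 3544

for n in (10, 100, 1_000, 10_000, 100_000):
    print(f"n = {n:>7}: f(n)/n^3 = {f(n) / n**3:.6f}")

# The printed ratios tend toward 33.0, i.e. the ratio stays a finite nonzero
# constant, so the lower-order summands stop mattering for large n.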
Answer 2:
Yes, you ignore the constants. Also, if you have a sum, e.g. 5n² + 2n, you keep only the term of the highest order (again without its constant factor), so here: O(n²).
You did those examples correctly.
To compare the growth rates, I suggest you use WolframAlpha or any tool that draws plots, and you will see how they compare.
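If you prefer a text-based check instead of plots, here is a small Python sketch (my own illustration, not part of the original answer) that simply evaluates the five expressions from the question at a few input sizes; natural log is assumed, since the log base only changes the constant factor:

import math

# Sketch: evaluate the five expressions from the question at a few sizes
# to see how fast each one grows. Note that log(2^N) = N*log(2), so the
# second expression is really on the order of N^2.
funcs = {
    "0.02*N^3":     lambda n: 0.02 * n**3,
    "4*N*log(2^N)": lambda n: 4 * n * (n * math.log(2)),
    "24*N*log(N)":  lambda n: 24 * n * math.log(n),
    "N^2":          lambda n: float(n**2),
    "N*sqrt(N)":    lambda n: n * math.sqrt(n),
}

for n in (10, 100, 1_000, 10_000):
    print(f"N = {n}")
    for name, fn in funcs.items():
        print(f"  {name:<14} = {fn(n):.3e}")

Asymptotically (constants ignored) the ordering from slowest to fastest growth is N*log(N), N*sqrt(N), N² (and 4N*log(2^N), which is the same order as N² since log(2^N) = N*log(2)), and finally N³; for small N the constant factors can still flip that order, which the printed values show.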
Source: https://stackoverflow.com/questions/14295113/how-do-i-write-big-o-notations