I am learning about Big O notation running times and amortized times. I understand the notion of O(n) linear time, meaning that the size of the input affects the growth of the algorithm proportionally.
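For example (a toy illustration of my own, not part of the question): a single pass over the input does one step per element, which is the O(n) pattern.

```python
# Illustrative only: one pass over the input, so the work grows
# proportionally with len(items) -- the O(n) pattern.
def total(items):
    s = 0
    for x in items:  # one step per element => n steps for n elements
        s += x
    return s

print(total([1, 2, 3]))  # 6
```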
In computer science it means that:
f(n) = O(g(n)) if there exist a constant C > 0 and an n0, both independent of n,
such that
0 <= f(n) <= C*g(n) holds for all n > n0.
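For example (an illustrative f and g of my own choosing, not from the definition above): take f(n) = 3n^2 + 5n and g(n) = n^2. Then C = 4 and n0 = 5 are suitable witnesses, because 5n <= n^2 whenever n >= 5. A small Python spot-check of the inequality:

```python
# Hypothetical example: check that f(n) = 3*n**2 + 5*n is O(g(n)) with
# g(n) = n**2, using the witnesses C = 4 and n0 = 5 described above.

def f(n):
    return 3 * n**2 + 5 * n

def g(n):
    return n**2

C, n0 = 4, 5

# Check 0 <= f(n) <= C * g(n) for a range of n > n0. This is a finite
# spot-check, not a proof; the algebra 5n <= n**2 for n >= 5 is the
# real argument.
assert all(0 <= f(n) <= C * g(n) for n in range(n0 + 1, 10_000))
print("0 <= f(n) <= 4*g(n) holds for all tested n > 5")
```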
And it seems that this notation was mostly taken over from mathematics.
In this article there is a quote from D.E. Knuth, "BIG OMICRON AND BIG OMEGA AND BIG THETA" (SIGACT News, 1976):
On the basis of the issues discussed here, I propose that members of SIGACT, and editors of computer science and mathematics journals, adopt notations as defined above, unless a better alternative can be found reasonably soon.
It is 2016 today, and we still use this notation.
In mathematical analysis it means that:
lim (f(n)/g(n)) = Constant as n goes to +infinity (more generally, it is enough that the ratio f(n)/g(n) stays bounded).
But even in mathematical analysis this symbol was sometimes used in the meaning "0 <= f(n) <= C*g(n)".
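With the same illustrative f and g as in the sketch above, lim (f(n)/g(n)) = lim (3n^2 + 5n)/n^2 = 3 as n goes to +infinity, so the analysis definition agrees with the computer-science one here. A quick numeric look at the ratio:

```python
# Hypothetical example continued: the ratio f(n)/g(n) = (3*n**2 + 5*n)/n**2
# approaches the constant 3 as n grows, matching the limit definition above.

def ratio(n):
    return (3 * n**2 + 5 * n) / n**2

for n in (10, 100, 1_000, 1_000_000):
    print(n, ratio(n))  # values tending to 3: 3.5, 3.05, 3.005, 3.000005
```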
As far as I know from university, the symbol was introduced by the German mathematician Edmund Landau (1877-1938), although it actually goes back to Paul Bachmann (1894), which is why it is also called Bachmann-Landau notation.