Why are constants ignored in asymptotic analysis?
If you are talking about this lecture:
http://www.cs.cornell.edu/courses/cs312/2004fa/lectures/lecture16.htm
When you analyze the running time (or some other aspect) of an algorithm and find that it is something like
    n^2 + k
Then, when you are figuring out the big-O running time, it makes no sense to look at k, because you want to know the behavior as n gets large. For large n, n^2 is so much larger than the constant k that you can safely ignore it: the k term contributes a vanishing fraction of the total.
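Here is a minimal sketch in Python (the constant k = 1000 is an arbitrary illustrative value, not anything from the lecture) that prints what fraction of n^2 + k is due to k as n grows:

    # For f(n) = n^2 + k, watch the constant term's share of the
    # total shrink as n grows.
    k = 1000  # hypothetical constant; any fixed value behaves the same

    for n in [10, 100, 1000, 10000, 100000]:
        total = n**2 + k
        share = k / total  # fraction of f(n) contributed by k
        print(f"n={n:>7}  n^2+k={total:>12}  k's share={share:.8f}")

By n = 100,000 the constant accounts for less than a millionth of the total, which is exactly why big-O notation drops it.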