What is meant by "Constant Amortized Time" when talking about the time complexity of an algorithm?
To develop an intuitive way of thinking about it, consider inserting elements into a dynamic array (for example std::vector in C++). Let's plot a graph showing how the total number of operations (Y) depends on the number of elements inserted (N):
The vertical jumps in the black graph correspond to memory reallocations that expand the array. As the plot shows, this dependency can be roughly represented as a straight line with the equation Y = C*N + b (C is a constant, and b = 0 in our case). Therefore we need about C*N operations in total to add N elements to the array, which works out to C operations per element on average (amortized constant time).