I work as a programmer, but have no computer science background, so recently I've been following along with the excellent MIT OpenCourseWare intro to Computer Science and Programming.
"Constant time" means that the operation will execute in an amount of time (or memory space - that's another thing often measured) independent of the input size. Usually you pick a variable (let's use n
) to indicate the input size.
O(1) - constant time - running time does not depend on n
O(n) - linear time - running time is linearly proportional to n
O(n^2) - quadratic time - running time is proportional to the square of n
These are just a few examples; the possibilities are endless. See the Wikipedia article on computational complexity for more.
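To get a feel for how differently these classes grow, compare rough operation counts (illustrative arithmetic, not measurements): for n = 10, an O(1) program does about 1 unit of work, an O(n) program about 10, and an O(n^2) program about 100. For n = 1,000, those counts become roughly 1, 1,000, and 1,000,000.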
Here are a few specific ways that a program composed of only the operations you mention could take various amounts of time:
n = 5  # some value

def doSomething():
    pass  # stands in for any constant-time operation

doSomething()
doSomething()
doSomething()
Note how it executes three somethings, independent of what n is: O(1).
n = 5  # some value

def f(n):
    if n == 0:
        return
    doSomething()  # doSomething as defined in the first example
    f(n - 1)

f(n)
Now we run one something for each value from 1 to n. If T(n) counts the somethings, then T(n) = T(n-1) + 1 with T(0) = 0, which gives T(n) = n (linear time, O(n)).
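The quadratic class from the list above doesn't show up in these examples, so here's a minimal sketch of O(n^2), reusing the doSomething from the first example (nested loops are just one common way to get quadratic time):

n = 5  # some value

# The inner loop runs n times for each of the n iterations of the
# outer loop, so doSomething executes n * n times in total: O(n^2)
for i in range(n):
    for j in range(n):
        doSomething()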
And we can have a bit of fun -
n = 5  # some value

def f(n):
    if n == 0:
        return
    doSomething()
    f(n - 1)
    f(n - 1)

f(n)
What's the running time here? (i.e. how many somethings do we execute?) :)
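If you want to check your answer empirically, here's a minimal sketch that counts the somethings instead of performing them (the counting function is my addition, not part of the pseudocode above):

def count_somethings(n):
    if n == 0:
        return 0
    # one something, plus whatever the two recursive calls do
    return 1 + count_somethings(n - 1) + count_somethings(n - 1)

for n in range(1, 6):
    print(n, count_somethings(n))  # notice how quickly this grows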