Time Complexity of a loop that integer divides the loop counter by a constant


I find it easier to do it the other way around in cases like this. What is the opposite of what you're doing (even approximately)? Something like:

for (a = 1; a <= n; a = a * 8)
{
    ...
}

Now we've changed the while to a for, which has a fixed number of steps, and the repeated division into repeated multiplication, which is easier to reason about.

We have:

1, 8, 8^2, ..., 8^k <= n

The loop runs once for each power of 8 that does not exceed n, so the last power reached satisfies:

8^k <= n | apply logarithm

log (8^k) <= log n

k log 8 <= log n

=> k = O(log n)

So your while loop executes O(log n) times. If you have something that is O(n^3) inside, then your entire sequence will be O(n^3 log n).

The key to this is that it's integer division, not Zeno's paradox. Repeated integer division takes O(log n) steps to reduce a to something that becomes zero on the next step.

Another way to look at integer division by a power of two is as a bit shift. Shifting a to the right by 3 bits will produce a zero after a number of steps depending on the position of the highest set bit in a, i.e. (sizeof(a)*CHAR_BIT - leading_zero_count(a)) / 3, rounded up. Bit position is the same thing as log base 2.
