Computational complexity of base conversion

Submitted by 人盡茶涼 on 2019-12-17 14:13:10

Question


What is the complexity of converting a very large n-bit number to a decimal representation?

My thought is that the elementary algorithm of repeated integer division, taking the remainder to get each digit, would have O(M(n) log n) complexity, where M(n) is the complexity of the multiplication algorithm; however, the division is not between two n-bit numbers but rather between one n-bit number and a small constant, so it seems to me the complexity could be smaller.


Answer 1:


Naive base-conversion as you described takes quadratic time; you do about n bigint-by-smallint divisions, most of which take time linear in the size of the n-bit bigint.
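
For concreteness, here is a minimal Python sketch of that quadratic method (the function name to_decimal_naive is purely illustrative; in practice you would just call Python's built-in str()):

    def to_decimal_naive(x):
        # Repeated bigint-by-smallint division: each divmod by 10 costs O(n)
        # on an n-bit x, and there are about n * log10(2) decimal digits,
        # so the whole loop is O(n^2).
        if x == 0:
            return "0"
        digits = []
        while x > 0:
            x, r = divmod(x, 10)              # one bigint-by-smallint step
            digits.append(chr(ord('0') + r))
        return ''.join(reversed(digits))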

You can do base conversion in O(M(n) log n) time, however, by picking a power of the target base that is roughly the square root of the number to be converted, doing a divide-and-remainder by it (which can be done in O(M(n)) time via Newton's method), and recursing on the two halves.
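
A sketch of that recursion, again in Python and assuming x >= 0 (the name to_decimal_dc and the digit-count estimate are my own choices; CPython's built-in divmod on huge ints is not the O(M(n)) Newton-based division the answer assumes, so this only illustrates the structure, not the asymptotic bound):

    def to_decimal_dc(x):
        # Base case: small enough that the built-in conversion is cheap.
        if x < 10**9:
            return str(x)
        # Estimate half the decimal digit count from the bit length
        # (log10(2) ~ 0.30103, so 0.3 * bit_length slightly underestimates it).
        k = int(x.bit_length() * 0.3) // 2
        p = 10 ** k                       # power of the target base, roughly sqrt(x)
        hi, lo = divmod(x, p)             # one divide-and-remainder step
        # Recurse on both halves; the low half must keep its leading zeros.
        return to_decimal_dc(hi) + to_decimal_dc(lo).zfill(k)

Each level of the recursion halves the digit count, giving O(log n) levels of O(M(n)) work overall when the underlying division is subquadratic.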



Source: https://stackoverflow.com/questions/28418332/computational-complexity-of-base-conversion
