I know this isn't strictly a programming question, but it is a computer science question, so I'm hoping someone can help me.
I've been working on my Algorithms
Your CPU will probably multiply 32-bit integers in constant time. But big-O doesn't care about "less than four billion", so maybe you have to look at how multiplication algorithms scale with the number of digits?
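To make that concrete, here is a minimal sketch of the "traditional" schoolbook multiplication on lists of base-10 digits (the function names and the digit-list representation are just my own illustration, not anything from the question); the two nested loops over the digits are where the quadratic cost comes from:

```python
# Schoolbook multiplication on digit lists, least significant digit first.
# The nested loops over the digits give the O(d^2) cost discussed above.

def schoolbook_multiply(a_digits, b_digits):
    result = [0] * (len(a_digits) + len(b_digits))
    for i, a in enumerate(a_digits):            # d iterations ...
        carry = 0
        for j, b in enumerate(b_digits):        # ... times d iterations = O(d^2)
            total = result[i + j] + a * b + carry
            result[i + j] = total % 10
            carry = total // 10
        result[i + len(b_digits)] += carry
    while len(result) > 1 and result[-1] == 0:  # strip leading zeros
        result.pop()
    return result

def to_digits(x):
    return [int(c) for c in str(x)][::-1]

def from_digits(digits):
    return int("".join(str(d) for d in reversed(digits)))

print(from_digits(schoolbook_multiply(to_digits(1234), to_digits(5678))))  # 7006652
```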
According to Wolfram, the "traditional" multiplication algorithm is O(d²), where d is the number of digits; a number of size n has O(log n) digits, so one squaring costs O(log² n), and squaring the integers 1..n takes O(n·log² n). Each addition to the running sum involves numbers of size at most n² (the squares) and roughly n³ (the running total), which is O(log n) digits either way, so all the additions together are O(n·log n). The squarings dominate, for an overall complexity of O(n·log² n).
So I can quite understand your confusion.
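If it helps to see that cost model in action, here is a rough sketch (the function name and the digit counting are my own illustration) that computes 1² + 2² + … + n² with Python's arbitrary-precision integers while charging digit-level work the way the schoolbook model does:

```python
# Square each of 1..n and keep a running sum, counting digit-level work:
# d*d digit products per squaring (d ~ log10 i) and one pass over the
# running total's digits per addition.

def sum_of_squares_with_cost(n):
    total = 0
    digit_ops = 0
    for i in range(1, n + 1):
        d = len(str(i))               # number of decimal digits of i
        digit_ops += d * d            # schoolbook squaring: O(log^2 n) per number
        total += i * i
        digit_ops += len(str(total))  # one addition over O(log n) digits
    return total, digit_ops

total, ops = sum_of_squares_with_cost(1000)
print(total)  # 333833500, i.e. 1000 * 1001 * 2001 // 6
print(ops)    # digit-operation count, growing roughly like n * log^2 n
```

The squarings dominate the count, which is why the total comes out to O(n·log² n) rather than O(n·log n).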