This is probably pretty basic, but to save me an hour or so of grief, can anyone tell me how to work out the number of bits required to represent a given positive integer?
My Java is a bit rusty, but the language-agnostic answer (if there is a "log2" function and a "floor" function available) would be:
numberOfBits = floor(log2(decimalNumber))+1
This assumes "decimalNumber" is greater than 0; if it is 0, you just need 1 bit.
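In Java that formula can be sketched like this (method names here are my own; the log-based version uses `Math.log` since Java has no built-in log2, and the second version avoids floating point entirely via `Integer.numberOfLeadingZeros`):

```java
public class BitCount {
    // floor(log2(n)) + 1, computed via natural logs: log2(x) = ln(x) / ln(2).
    static int bitsViaLog(int n) {
        return (int) Math.floor(Math.log(n) / Math.log(2)) + 1;
    }

    // Equivalent, without floating point: the highest set bit's position
    // determines the width. Exact for all positive ints.
    static int bitsViaLeadingZeros(int n) {
        return Integer.SIZE - Integer.numberOfLeadingZeros(n);
    }

    public static void main(String[] args) {
        System.out.println(bitsViaLog(255));          // 8
        System.out.println(bitsViaLeadingZeros(256)); // 9
    }
}
```

The `numberOfLeadingZeros` variant is worth preferring in practice: floating-point rounding in the log-based version can be off by one for large inputs, while the bit-twiddling version is exact.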