This is a homework question that I am stuck on.
Consider unsigned integer representation. How many bits will be required to store a decimal number with a given number of digits?
OK, to generalize: here is the technique for working out how many bits you need to represent a number. If the representation has R symbols and you want to know how many bits each symbol (digit) needs, solve the equation R = 2^n, i.e. n = log2(R), where n is the number of bits and R is the number of symbols in the representation.
For the decimal number system R = 10 (digits 0 through 9), so we solve 10 = 2^n, which gives n = log2(10) ≈ 3.32 bits per decimal digit. Thus a 3-digit number needs 3 × 3.32 ≈ 9.97 bits, rounded up to 10. A 1000-digit number needs 3322 bits.
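You can check this yourself; here is a minimal Python sketch (the function name bits_for_digits is just for illustration), which also cross-checks the result against the bit length of the largest d-digit number:

```python
from math import ceil, log2

def bits_for_digits(d, radix=10):
    """Bits needed to store any unsigned number with d digits in the given radix."""
    # Each digit carries log2(radix) bits of information; round the total up.
    return ceil(d * log2(radix))

print(log2(10))                 # 3.3219... bits per decimal digit
print(bits_for_digits(3))       # 10
print(bits_for_digits(1000))    # 3322

# Exact cross-check: the largest d-digit number is 10**d - 1,
# and .bit_length() gives the bits needed to hold it.
print((10**3 - 1).bit_length())     # 10   (999 fits in 10 bits)
print((10**1000 - 1).bit_length())  # 3322
```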