This is a homework question that I am stuck on.
Consider unsigned integer representation. How many bits will be required to store a decimal number containing a given number of digits?
A binary number of n digits (bits) can represent 2^n distinct values, so the maximum decimal value it can hold is
2^n - 1.
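For instance, a quick Python loop (just an illustration, not part of the question) confirms this for small n:

```python
# With n bits there are 2**n distinct values, 0 through 2**n - 1.
for n in range(1, 5):
    print(f"{n} bits -> {2**n} values, max = {2**n - 1}")
```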
Taking the case where you want three decimal digits (i.e., your case 1), the largest value to represent is 999, so the requirement is:
2^n - 1 >= 999
Adding 1 to both sides:
2^n >= 1000
Taking log base 2 of both sides:
n >= log2(1000) ≈ 9.966
Since the number of bits must be an integer, we take the ceiling of 9.966, giving n = 10.
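As a sanity check, here is a minimal Python sketch of the same calculation (the variable names are mine, not from the question). Python's `int.bit_length()` gives the same answer without any floating-point log:

```python
import math

d = 3                  # number of decimal digits (case 1)
max_value = 10**d - 1  # 999

# Same derivation as above: n >= log2(10**d).
print(math.ceil(math.log2(10**d)))  # 10

# Exact integer route, no floating point: bit length of the largest value.
print(max_value.bit_length())       # 10
```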