I want to create a buffer for sprintf-ing an integer (in this case an unsigned int). A simple and misguided approach would be:
char b
If the array should work on all real-world computers, then an int can be either 2 or 4 bytes. No other alternatives exist (*).
Meaning the maximum value an unsigned int can hold is either 65535 or 4294967295 (roughly 4.29*10^9), so your array needs to hold either 5 or 10 digits.
Which in turn means that the array could be declared as:
char buf [sizeof(int)/2 * 5 + 1];
which will expand to either 5+1 or 10+1, covering all known computers in the world.
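As a rough sketch of how such a buffer might then be used (the value 12345 here is just an arbitrary example, not taken from the question):

#include <stdio.h>

int main(void)
{
    char buf [sizeof(int)/2 * 5 + 1];   /* 6 bytes with 16-bit int, 11 bytes with 32-bit int */
    unsigned int x = 12345u;            /* arbitrary example value */
    sprintf(buf, "%u", x);              /* worst case always fits, including the null terminator */
    puts(buf);
    return 0;
}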
A better and more professional solution is to use the fixed-width types from stdint.h. Then you always know in advance exactly how many digits are needed, portably, and can therefore get rid of the above "magic numbers".
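A minimal sketch of that idea, assuming the value is stored in a uint32_t rather than a plain unsigned int:

#include <stdio.h>
#include <stdint.h>
#include <inttypes.h>

int main(void)
{
    uint32_t x = UINT32_MAX;      /* worst case: 4294967295, exactly 10 digits */
    char buf [10 + 1];            /* 10 digits + null terminator, known in advance */
    sprintf(buf, "%" PRIu32, x);
    puts(buf);
    return 0;
}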
(*) In C language standard theory, an int could be anything 2 bytes or larger. But since no such systems will ever exist in the real world, there is no point in making your code portable to them. The C language has already introduced long and long long for a reason.
People who are concerned about portability to wildly exotic, completely fictional systems are misguided; they are mostly C language lawyers who like posing. You should not let such theoretical nonsense affect how you write professional programs for real-world computers.
EDIT
The "C language-lawyer poser" version would look like this:
#include <stdio.h>
#include <limits.h>
/* STRINGIFY's argument is already macro-expanded by the time it passes through GET_SIZE */
#define STRINGIFY(s) #s
/* sizeof of the stringified constant: number of digits + 1 for the null terminator */
#define GET_SIZE(n) sizeof(STRINGIFY(n))
#define DIGITS(type) _Generic((type), unsigned int: GET_SIZE(INT_MAX) )
int main(void)
{
    unsigned int x;
    char buf [DIGITS(x)];       /* 11 bytes when int is 32 bits, 6 when int is 16 bits */
    printf("%zu", sizeof(buf));
    return 0;
}
Note that this assumes that INT_MAX expands to an integer constant and not to an expression. I got really strange results from GCC when using UINT_MAX, because that macro is internally defined as an expression inside limits.h, so the stringified text ends up much longer than the number of digits in the value.
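To illustrate the problem, here is a small sketch using a hypothetical macro MY_MAX that is defined as an expression, in the same manner as UINT_MAX:

#include <stdio.h>

#define STRINGIFY(s) #s
#define GET_SIZE(n) sizeof(STRINGIFY(n))

/* Hypothetical macro defined as an expression rather than a plain constant: */
#define MY_MAX (2147483647 * 2U + 1U)

int main(void)
{
    /* The whole expression gets stringified, not the value it evaluates to,
       so the result is far larger than the 11 bytes one would expect: */
    printf("%zu\n", GET_SIZE(MY_MAX));
    return 0;
}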