Is there any way to compute the width of an integer type at compile-time?
The size of an integer type (or any type) in units of char/bytes is easily computed as sizeof(type). A common idiom is to multiply by CHAR_BIT to find the number of bits occupied by the type, but on implementations with padding bits, this will not be equal to the width in value bits. Worse yet, code like x >> (CHAR_BIT*sizeof(type)-1) may actually have undefined behavior if CHAR_BIT*sizeof(type) is greater than the actual width of type. For simplicity, let's assume our types are unsigned.