Can unsigned integer incrementation lead to undefined behavior?
After reading the 32 bit unsigned multiply on 64 bit causing undefined behavior? question here on StackOverflow, I began to wonder whether typical arithmetic operations on small unsigned types could lead to undefined behavior according to the C99 standard.

For example, take the following code:

```c
#include <limits.h>
...
unsigned char x = UCHAR_MAX;
unsigned char y = x + 1;
```

The x variable is initialized to the maximum value representable in the unsigned char data type. The next line is the issue: the value of x + 1 is greater than UCHAR_MAX and cannot be represented in an unsigned char.
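For reference, here is a minimal complete program illustrating what actually happens in this case; the main wrapper and printf call are additions of mine to make the snippet runnable, and the comments assume the common case where int is wider than unsigned char:

```c
#include <stdio.h>
#include <limits.h>

int main(void)
{
    unsigned char x = UCHAR_MAX;   /* typically 255 */

    /* Integer promotion: assuming int can represent all values of
       unsigned char (true on typical platforms), x is promoted to int
       before the addition, so x + 1 is computed as an int (typically
       256) with no overflow. The conversion back to unsigned char is
       well defined in C99 (6.3.1.3p2): the result is reduced modulo
       UCHAR_MAX + 1, yielding 0. */
    unsigned char y = x + 1;

    printf("x = %u, x + 1 = %d, y = %u\n",
           (unsigned)x, x + 1, (unsigned)y);
    /* typical output: x = 255, x + 1 = 256, y = 0 */
    return 0;
}
```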