C++ - Bit-wise not of uchar produces int
I am surprised by C++'s behavior when applying bitwise NOT to an unsigned char. Take the binary value 01010101b, which is 0x55, or 85. Applying bitwise NOT to an eight-bit representation should yield 10101010b, which is 0xAA, or 170. However, I cannot reproduce this in C++. The following simple assertion fails:

    assert(static_cast<unsigned char>(0xAAu) == ~static_cast<unsigned char>(0x55u));

I printed the values of 0x55, 0xAA, and ~0x55 (as unsigned char) with the following code, and it reveals that bitwise NOT does not do what I expect:

    std::cout << "--> 0x55: " << 0x55u
              << ", 0xAA: " << 0xAAu
              << ", ~0x55: " << ~static_cast<unsigned char>(0x55u)
              << std::endl;