What's the point of negative ASCII values?
int a = '«'; // a = -85, but according to the ASCII table '«' should be 174
This is an artefact of your compiler's char type being a signed integer type and int being a wider signed integer type: the character constant is treated as a negative number and is sign-extended when converted to the wider type.
There is not much sense in it; it just happens. The C standard allows compiler implementations to choose whether char is signed or unsigned, and some compilers even have compile-time switches to change the default. If you want to be sure about the signedness of the char type, explicitly write signed char or unsigned char.
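For instance, here is a minimal sketch (assuming the byte value 0xAB, which is what -85 looks like as an unsigned 8-bit value) showing how the declared signedness changes what you read back:

```c
#include <stdio.h>

int main(void)
{
    signed char   sc = '\xAB';  /* typically holds -85 (implementation-defined) */
    unsigned char uc = '\xAB';  /* holds 171, regardless of the default signedness of char */

    printf("%d %d\n", sc, uc);  /* both promoted to int for printf: -85 171 */
    return 0;
}
```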
Use an unsigned char, which gets zero-extended to a non-negative int value, or open a whole new Pandora's box and enjoy wchar_t.
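As a sketch of that fix (again assuming the byte 0xAB from above), going through unsigned char before widening keeps the value non-negative:

```c
#include <stdio.h>

int main(void)
{
    char c = '\xAB';                   /* -85 on implementations where char is signed */
    int  as_signed   = c;              /* sign-extended: -85 */
    int  as_unsigned = (unsigned char)c; /* zero-extended: 171 (i.e. -85 + 256) */

    printf("%d %d\n", as_signed, as_unsigned);
    return 0;
}
```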