In C++, sizeof('a') == sizeof(char) == 1. This makes intuitive sense, since 'a' is a character literal, and sizeof(char) == 1 as defined by the standard.
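A minimal sketch that makes this observable by printing both sizes (in C++ both values are 1; the same file compiled as C would print sizeof(int) for the first, since C character literals have type int):

#include <cstdio>

int main() {
    // In C++, both values are 1. Compiled as C, the first would be sizeof(int).
    std::printf("%zu %zu\n", sizeof('a'), sizeof(char));
}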
I don't know the specific reasons why a character literal in C is of type int. But in C++, there is a good reason not to go that way. Consider this:
void print(int);
void print(char);

int main() {
    print('a');  // which overload is selected?
}
You would expect the call to print to select the second overload, the one taking a char. If character literals had type int, that would be impossible. Note that in C++, literals containing more than one character still have type int, although their value is implementation-defined. So 'ab' has type int, while 'a' has type char.
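Both points can be checked directly. Here is a self-contained sketch; the print definitions are hypothetical, added only so the selected overload is visible, and the terse static_assert and std::is_same_v forms assume a C++17 compiler:

#include <iostream>
#include <type_traits>

void print(int)  { std::cout << "print(int)\n"; }
void print(char) { std::cout << "print(char)\n"; }

int main() {
    print('a');  // selects print(char), since 'a' has type char in C++

    // A multicharacter literal still has type int; its value is implementation-defined.
    static_assert(std::is_same_v<decltype('a'), char>);
    static_assert(std::is_same_v<decltype('ab'), int>);
}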