In C, why is sizeof(char) 1, when 'a' is an int?

Posted by 假如想象 on 2019-11-26 18:54:37
Richard Pennington

In C, 'a' is an integer constant (!?!), so 4 is correct for your architecture. It is implicitly converted to char for the assignment. sizeof(char) is always 1 by definition. The standard measures sizeof in bytes, but a byte here is not necessarily an octet: it is CHAR_BIT bits, which must be at least 8.
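
A minimal sketch checking these claims, assuming a typical platform where int is 4 bytes (the value printed for sizeof 'a' is simply whatever sizeof(int) is on your machine):

    #include <stdio.h>

    int main(void)
    {
        char c = 'a';   /* the int constant 'a' is implicitly converted to char here */

        /* sizeof yields a size_t, so print it with %zu */
        printf("sizeof 'a'   = %zu\n", sizeof 'a');    /* same as sizeof(int), e.g. 4 */
        printf("sizeof(int)  = %zu\n", sizeof(int));
        printf("sizeof(char) = %zu\n", sizeof(char));  /* always 1 by definition */
        printf("sizeof c     = %zu\n", sizeof c);      /* c has type char, so 1 */
        return 0;
    }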

The C standard says that a character literal like 'a' has type int, not type char. It therefore has (on your platform) sizeof == 4. See this question for a fuller discussion.

It is the normal behavior of the sizeof operator (See Wikipedia):

  • For a datatype, sizeof returns the size of the datatype. For char, you get 1.
  • For an expression, sizeof returns the size of the type of the variable or expression. As a character literal is typed as int, you get 4.

This is covered in ISO C11 6.4.4.4 Character constants though it's largely unchanged from earlier standards. That states, in paragraph /10:

An integer character constant has type int. The value of an integer character constant containing a single character that maps to a single-byte execution character is the numerical value of the representation of the mapped character interpreted as an integer.
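
To see that value directly, you can print the constant with %d. This sketch assumes an ASCII-compatible execution character set, where 'a' maps to 97; the standard only guarantees some single-byte value:

    #include <stdio.h>

    int main(void)
    {
        /* 'a' already has type int, so %d is the matching conversion specifier */
        printf("'a' as an int: %d\n", 'a');       /* 97 with an ASCII execution character set */
        printf("'a' + 1      : %d\n", 'a' + 1);   /* 98, the ASCII code for 'b' */
        return 0;
    }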

t0mm13b

According to the C standard, a char is promoted to an int in contexts where integers are used; you used an integer format specifier in the printf, hence the different values. A char always occupies exactly 1 byte, but how many bits that byte contains is implementation-defined: it is given by CHAR_BIT in <limits.h>, which must be at least 8.
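
A short sketch of both points, using only standard headers: the char argument undergoes the default argument promotions when passed to the variadic printf, and <limits.h> exposes the implementation-defined number of bits per byte as CHAR_BIT:

    #include <limits.h>
    #include <stdio.h>

    int main(void)
    {
        char c = 'a';

        /* Passed to a variadic function, c is promoted to int, so %d is fine */
        printf("c printed with %%d: %d\n", c);

        /* sizeof(char) is exactly 1 byte; the bits per byte are implementation-defined */
        printf("sizeof c : %zu\n", sizeof c);
        printf("CHAR_BIT : %d\n", CHAR_BIT);
        return 0;
    }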
