Question
I have a hard time understanding sizeof's behaviour when given a ternary expression.
#include <stdio.h>

#define STRING "a string"

int main(int argc, char **argv)
{
    int a = sizeof(argc > 1 ? STRING : "");
    int b = sizeof(STRING);
    int c = sizeof("");
    printf("%d\n%d\n%d\n", a, b, c);
    return 0;
}
In this example (tested with gcc 4.4.3 and 4.7.2, compiled with -std=c99), b is 9 (8 characters + the implicit '\0') and c is 1 (just the implicit '\0'). a, for some reason, is 4. I would expect a to be either 9 or 1, depending on whether argc is greater than 1. I thought maybe the string literals get converted to pointers before being passed to sizeof, making a equal to sizeof(char*), i.e. 4.
I tried replacing STRING and "" with char arrays...

char x[] = "";
char y[] = "a string";
int a = sizeof(argc > 1 ? x : y);

...but I got the same results (a=4, b=9, c=1).
Then I tried to dive into the C99 spec, but I did not find any obvious explanation in it. Out of curiosity I also tried changing x and y to other types:
- one char and one long long int: a becomes 8
- both short or both char: a becomes 4
So there's definitely some sort of conversion going on, but I struggle to find any official explanation. I can sort of imagine that this would happen with arithmetic types (I'm vaguely aware there's plenty of promotions going on when those are involved), but I don't see why a string literal returned by a ternary expression would be converted to something of size 4.
NB: on this machine sizeof(int) == sizeof(foo*) == 4.
Follow-up
Thanks for the pointers, guys. Understanding how sizeof and ?: work actually led me to try a few more type mashups and see how the compiler reacted. I'm editing them in for completeness' sake:
foo* x = NULL; /* or foo x[] = {} */
int y = 0; /* or any integer type */
int a = sizeof(argc > 1 ? x : y);
Yields warning: pointer/integer type mismatch in conditional expression [enabled by default], and a == sizeof(foo*).
With foo x[], bar y[]; foo* x, bar* y; or foo* x, bar y[], the warning becomes pointer type mismatch. There is no warning when using a void*.
float x = 0; /* or any floating-point type */
int y = 0; /* or any integer type */
int a = sizeof(argc > 1 ? x : y);
Yields no warning, and a == sizeof(x) (that is, the size of the floating-point type).
float x = 0; /* or any floating-point type */
foo* y = NULL; /* or foo y[] = {} */
int a = sizeof(argc > 1 ? x : y);
Yields error: type mismatch in conditional expression.
If I ever read the spec completely I'll make sure to edit this question to point to the relevant parts.
Answer 1:
You have to understand expressions, which are the core component of the language.
Every expression has a type. For an expression e, sizeof e is the size of the type of the value of the expression e.
The expression a ? b : c has a type: the common type of the two operand expressions b and c.
In your example, the common type of char[9] and char[1] is char *, because both array-valued expressions decay to a pointer to their first element. (In C++ the rules for string literals are different, and there is a const everywhere.)
Answer 2:
You need to understand that sizeof is almost entirely a compile-time operator: with a variable-length array it can yield a value computed at run time, but otherwise it is a compile-time constant.
What matters is the type of its argument.
So in sizeof(argc > 1 ? STRING : "") the condition is not evaluated. The type of the operand decays to char * (not const char *, as it would in C++), and on your machine a pointer is 4 bytes.
You should instead code (argc > 1) ? sizeof(STRING) : 1.
Since STRING is macro-expanded to the "a string" literal, sizeof(STRING) is 9, much as if you had declared
const char STRING[] = {'a',' ','s','t','r','i','n','g','\0'};
Answer 3:
Both STRING and "" are array objects, of types char[9] and char[1] respectively. In the C language, when array objects are used in expressions, they are implicitly converted (decay) to pointer types in almost all contexts, with a few well-known exceptions.
One of those exceptions is the sizeof operator. When you use an array object as the immediate operand of sizeof, the array does not decay to a pointer, and you get the size of the entire array in bytes. This is why sizeof(STRING) is equivalent to sizeof(char[9]) and evaluates to 9, and sizeof("") is equivalent to sizeof(char[1]) and evaluates to 1.
But when you use array objects as operands of the ?: operator, the context is no longer exceptional. In the context of ?:, arrays immediately decay to pointers. This means that your sizeof(argc > 1 ? STRING : "") is equivalent to sizeof(argc > 1 ? (char *) STRING : (char *) ""), which in turn is equivalent to sizeof(char *). That evaluates to the pointer size on your platform, which happens to be 4.
Source: https://stackoverflow.com/questions/24932717/operator-sizeof-with-conditional-ternary-expression