I have a hard time understanding sizeof's behaviour when it is given a ternary expression.

    #define STRING "a string"

Why does sizeof(argc > 1 ? STRING : "") evaluate to 4 instead of 9 or 1?
Both STRING and "" are array objects, of types char[9] and char[1] respectively. In C, when array objects are used in expressions, they are implicitly converted (decay) to pointer types in almost all contexts, with a few well-known exceptions.
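You can see this decay in any ordinary expression context. A minimal sketch, assuming the STRING macro from the question (the variable p is made up for illustration):

    #include <stdio.h>

    #define STRING "a string"

    int main(void)
    {
        const char *p = STRING;        /* the char[9] array decays to a pointer to its first element */
        printf("%s\n", p);             /* prints "a string" via that pointer */
        printf("%c\n", *(STRING + 2)); /* decays here too: pointer arithmetic yields 's' */
        return 0;
    }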
One such exception is the sizeof operator. When you use an array object as the immediate operand of sizeof, the array does not decay to a pointer type, and you get the size of the entire array in bytes as the result. This is why sizeof(STRING) is equivalent to sizeof(char[9]) and evaluates to 9, while sizeof("") is equivalent to sizeof(char[1]) and evaluates to 1.
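A quick way to check both sizes, again assuming the STRING macro above:

    #include <stdio.h>

    #define STRING "a string"

    int main(void)
    {
        printf("%zu\n", sizeof(STRING)); /* 9: eight characters plus the terminating '\0' */
        printf("%zu\n", sizeof(""));     /* 1: just the terminating '\0' */
        return 0;
    }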
But when you use array objects as operands of the ?: operator, the context is no longer exceptional. In the context of ?:, arrays immediately decay to pointers. This means that your sizeof(argc > 1 ? STRING : "") is equivalent to sizeof(argc > 1 ? (char *) STRING : (char *) ""), which in turn is equivalent to sizeof(char *). This evaluates to the pointer size on your platform, which just happens to be 4.
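You can confirm the equivalence directly; in this sketch both lines print the same value, 4 on your platform and typically 8 on a 64-bit target:

    #include <stdio.h>

    #define STRING "a string"

    int main(int argc, char *argv[])
    {
        (void) argv;
        /* both branches decay to char *, so the result of ?: has type char * */
        printf("%zu\n", sizeof(argc > 1 ? STRING : ""));
        printf("%zu\n", sizeof(char *));
        return 0;
    }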