Question
This question was asked to me in a mock interview, and I was really surprised by the answer...
consider a macro:
#define SQR(x) (x*x)
Example 1:
SQR(2) // evaluates to 4
Example 2:
If SQR(1+1) is given, it doesn't first sum (1+1) to 2, but rather...
SQR(1+1) // evaluates to 3
Awkward right? What is the reason? How does this code work?
NOTE: I searched SO but couldn't find any relevant questions. If there are any, kindly share them!
Answer 1:
SQR(1+1)
expands to 1+1*1+1
which is 3, not 4, correct?
A correct definition of the macro would be
#define SQR(x) ((x)*(x))
which expands to (1+1)*(1+1)
and, more important, shows you one of the reasons you shouldn't use macros where they aren't needed. The following is better:
inline int SQR(int x)
{
return x*x;
}
Furthermore, SQR(i++) would be undefined behavior if SQR is a macro, but completely correct if SQR is a function.
Answer 2:
The problem is that macros perform textual substitution before compilation, so the macro expands to 1+1*1+1
Answer 3:
That is why you should always wrap macro arguments in parentheses:
#define SQR(x) ((x)*(x))
Source: https://stackoverflow.com/questions/17071504/confused-by-squaring-macro-sqr-in-c