When I am using a #define function-like macro, I observe something bizarre. In the code below, if I give i the value '10' from input, the result is not what I expect.
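The code itself isn't shown above; the following is only a minimal sketch of what it presumably looks like, assuming the macro is #define Double(x) x*x (the form the answers below refer to) and that i is read from standard input:

```c
#include <stdio.h>

/* Assumed macro from the question: no parentheses, argument used twice. */
#define Double(x) x*x

int main(void)
{
    int i;
    if (scanf("%d", &i) != 1)      /* e.g. enter 10 */
        return 1;

    printf("%d\n", Double(++i));   /* the surprising call */
    return 0;
}
```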
Macros have subtleties. What your macro does is:

Double(++i) -> ++i * ++i

In your case that would be 11*12 or 12*11, but strictly speaking what you have is undefined behaviour.
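As an illustration of those subtleties (my example, not from the question): adding the usual parentheses fixes operator-precedence problems but not the double evaluation, so the behaviour is still undefined when the argument has side effects.

```c
/* "Safer"-looking macro: parentheses fix precedence, e.g. Double(1+2) is 9
 * instead of 1+2*1+2 == 5 -- but the argument is still evaluated twice. */
#define Double(x) ((x) * (x))

/* Double(++i) still expands to ((++i) * (++i)): undefined behaviour. */
```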
Your Double(++i) is changed to ++i * ++i by the preprocessor when you compile your code.
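You can see the substitution for yourself by asking the compiler for the preprocessed output (for example, gcc -E file.c or clang -E file.c, where file.c stands in for whatever your source file is called); the macro call disappears and only the expanded text remains:

```c
#define Double(x) x*x

int f(int i)
{
    return Double(++i);
}

/* After running gcc -E on this file, the body of f becomes:
 *     return ++i*++i;
 * which is the text the compiler proper actually sees.
 */
```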
Double(++i) will expand to ++i * ++i. In this expression, i is modified twice without an intervening sequence point, which is undefined behavior.
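One way to avoid this whole class of problems, shown here only as a sketch of the usual alternative rather than anything from the thread above, is to use an inline function instead of a function-like macro; a function argument is evaluated exactly once, so passing ++i is well defined:

```c
#include <stdio.h>

/* Same computation as the macro (x*x, despite the name),
 * but the function argument is evaluated exactly once. */
static inline int Double(int x)
{
    return x * x;
}

int main(void)
{
    int i = 10;
    printf("%d\n", Double(++i));   /* i becomes 11; prints 121 */
    return 0;
}
```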
Read: So, what's wrong with using macros?