I've seen definitions in C
#define TRUE (1==1)
#define FALSE (!TRUE)
Is this necessary? What's the benefit over simply defining TRUE as 1 and FALSE as 0?
#define TRUE (1==1)
#define FALSE (!TRUE)
is equivalent to
#define TRUE 1
#define FALSE 0
in C.
The result of the relational operators is 0 or 1: 1==1 is guaranteed to evaluate to 1, and !(1==1) is guaranteed to evaluate to 0.
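As a quick check (a minimal sketch, not part of the original question), both macros print the same values either way they are defined:

#include <stdio.h>

#define TRUE  (1==1)
#define FALSE (!TRUE)

int main(void)
{
    /* Both expand to integer constant expressions with values 1 and 0. */
    printf("TRUE  = %d\n", TRUE);   /* prints 1 */
    printf("FALSE = %d\n", FALSE);  /* prints 0 */
    return 0;
}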
There is absolutely no reason to use the first form. Note, however, that the first form is not less efficient: on nearly all compilers a constant expression is evaluated at compile time rather than at run time. This is allowed by the following rule:
(C99, 6.6p2) "A constant expression can be evaluated during translation rather than runtime, and accordingly may be used in any place that a constant may be."
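To illustrate that rule (a sketch of my own, not from the original answer), the (1==1) form can appear anywhere an integer constant expression is required, such as an array size or a case label:

#define TRUE (1==1)

static int table[TRUE + 1];   /* array size must be a constant expression; here 2 */

int classify(int x)
{
    switch (x) {
    case TRUE:  return 1;     /* case labels must be constant expressions */
    default:    return 0;
    }
}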
PC-Lint will even issue a message (506, constant value boolean) if you don't use a literal for the TRUE and FALSE macros:
For C, TRUE should be defined to be 1. However, other languages use quantities other than 1, so some programmers feel that !0 is playing it safe.
Also, in C99 the stdbool.h definitions of the boolean macros true and false use literals directly:
#define true 1
#define false 0
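So with C99 and later you can just include that header and use bool together with those macros; a small sketch (variable name is mine):

#include <stdbool.h>
#include <stdio.h>

int main(void)
{
    bool done = false;      /* false expands to 0 */
    done = true;            /* true expands to 1 */
    printf("%d\n", done);   /* prints 1 */
    return 0;
}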