See this code snippet:

#include <stdio.h>

int main()
{
    unsigned int a = 1000;
    int b = -1;
    if (a > b) printf("A is BIG! %d\n", a - b);
    else printf("a is SMALL! %d\n", a - b);
    return 0;
}
Because one operand of a > b is unsigned, b is converted to unsigned int before the comparison. On a typical implementation where int is 32-bit, -1 converted to unsigned int is 4,294,967,295, which is indeed ≥ 1000, so the comparison is false and the else branch runs.
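To see the converted value directly, here is a minimal standalone sketch (assuming a 32-bit unsigned int, as above):

#include <stdio.h>

int main()
{
    int b = -1;
    /* Converting -1 to unsigned int yields UINT_MAX:
       4,294,967,295 when unsigned int is 32 bits wide. */
    printf("%u\n", (unsigned int)b);   /* prints 4294967295 */
    return 0;
}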
The subtraction a - b is likewise performed in the unsigned world: 1000 - 4,294,967,295 = -4,294,966,295, which wraps modulo 2^32 (add 4,294,967,296) to 1,001, and that is exactly the value that gets printed.
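To make the wrap-around explicit, here is a small sketch of the same arithmetic done with plainly unsigned operands (again assuming a 32-bit unsigned int):

#include <stdio.h>

int main()
{
    unsigned int a = 1000u;
    unsigned int b = 4294967295u;   /* the value -1 converts to with 32-bit int */
    /* Mathematically 1000 - 4294967295 is -4294966295,
       which wraps modulo 2^32 to 1001. */
    printf("%u\n", a - b);          /* prints 1001 */
    return 0;
}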
That's why gcc will spit out a warning when you compare unsigned with signed. (If you don't see the warning, pass the -Wsign-compare flag.)
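And if the mixed comparison really is what you want, one way to keep the warning quiet and make the intent explicit (just an illustrative sketch, not the only fix) is to handle the negative case yourself before comparing:

#include <stdio.h>

int main()
{
    unsigned int a = 1000;
    int b = -1;
    /* Treat every negative b as smaller than any unsigned value;
       only compare in the unsigned domain when b is non-negative. */
    if (b < 0 || a > (unsigned int)b)
        printf("A is BIG! %u\n", a);
    else
        printf("a is SMALL! %u\n", a);
    return 0;
}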