In C, there appear to be differences between various values of zero -- NULL, NUL and 0.
I know that the ASCII character '0' evaluates to 48, or 0x30.
All three represent zero, but in different contexts: NULL is the null pointer constant, NUL is the string-terminating character '\0', and '0' is the printable digit character. They also look different when you examine memory:
NULL - 0x00000000 or 0x0000000000000000 (a 32-bit vs a 64-bit pointer)
NUL  - 0x00 or 0x0000 (a plain char '\0' vs a 2-byte wide character such as UTF-16)
'0'  - 0x30 (in ASCII the digit zero has the value 48, not 0)
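Here is a minimal C sketch of the difference; the pointer size and the printed values assume an ASCII execution character set on a typical 32- or 64-bit platform:

    #include <stdio.h>
    #include <stddef.h>   /* NULL */

    int main(void)
    {
        void *p = NULL;    /* the null pointer constant */
        char nul = '\0';   /* the NUL character that terminates strings */
        char digit = '0';  /* the printable digit character */

        /* Sizes differ: a pointer occupies 4 or 8 bytes, a char exactly one. */
        printf("sizeof p     = %zu\n", sizeof p);
        printf("sizeof nul   = %zu\n", sizeof nul);
        printf("sizeof digit = %zu\n", sizeof digit);

        /* Values: NULL and '\0' compare equal to 0, but '0' does not. */
        printf("p == NULL    : %d\n", p == NULL);
        printf("nul == 0     : %d\n", nul == 0);
        printf("(int)digit   = %d (0x%X)\n", (int)digit, (unsigned)digit);

        return 0;
    }

On a typical 64-bit machine this prints sizeof p = 8, sizeof nul = 1, and (int)digit = 48 (0x30).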
I hope this clarifies it.