Why does the bitwise NOT operator (~ in most languages) convert values like these:
-2 -> 1
-1 -> 0
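For example, a minimal C sketch (assuming a typical two's-complement platform) shows the behaviour I mean:

```c
#include <stdio.h>

int main(void) {
    /* Observed behaviour of bitwise NOT on small negative values */
    printf("~(-2) = %d\n", ~(-2));  /* prints 1 */
    printf("~(-1) = %d\n", ~(-1));  /* prints 0 */
    return 0;
}
```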
It helps if you look at it in binary.
First of all, in two's complement representation, a negative number is stored as (the highest possible unsigned value plus 1) minus the number's absolute value. So -1 in a 16-bit integer, whose highest unsigned value is 65535, is stored as 65536 - 1 = 65535, i.e. 0xFFFF in hex, or 1111 1111 1111 1111 in binary.
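You can verify this with a short sketch (again assuming two's complement; the fixed-width types are just for illustration), which reinterprets the 16 bits of -1 as an unsigned value:

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    int16_t  x = -1;
    /* Reinterpret the same 16 bits as an unsigned value */
    uint16_t u = (uint16_t)x;
    printf("%u = 0x%04X\n", (unsigned)u, (unsigned)u);  /* prints 65535 = 0xFFFF */
    return 0;
}
```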
So:
1 in binary = 0000 0000 0000 0001
NOT flips every bit, giving 1111 1111 1111 1110. Read as unsigned, that is 65534, and since 65536 minus 65534 is 2, that bit pattern represents -2. Because NOT is its own inverse, applying it to -2 gives back 1; in general, ~x equals -(x + 1), which is exactly what your examples show: ~(-2) = 1 and ~(-1) = 0.
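Here is the whole round trip as a small sketch (one possible illustration, assuming a two's-complement machine and 16-bit fixed-width types):

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    uint16_t bits      = 0x0001;             /* 0000 0000 0000 0001 */
    uint16_t flipped   = (uint16_t)~bits;    /* 1111 1111 1111 1110 = 0xFFFE = 65534 */
    int16_t  as_signed = (int16_t)flipped;   /* same bits read as signed: -2 */

    printf("flipped = %u (0x%04X)\n", (unsigned)flipped, (unsigned)flipped);
    printf("signed  = %d\n", as_signed);

    /* The general rule behind the question: ~x == -(x + 1) */
    for (int v = -2; v <= 2; v++) {
        printf("~%d = %d\n", v, ~v);
    }
    return 0;
}
```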