For some reason, I am simply not understanding (or seeing) why this works:
UInt32 a = 0x000000FF;
a &= ~(UInt32)0x00000001;
but this does not:
UInt16 a = 0x00FF;
a &= ~(UInt16)0x0001;
Your expression is evaluated at compile time, and the resulting overflow (the constant no longer fits in a UInt16) is detected and blocks compilation.
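To make the compile-time part concrete, here is the failing line annotated (a sketch; the exact compiler message can vary by compiler version):
UInt16 a = 0x00FF;
// ~(UInt16)0x0001 is a constant expression: the UInt16 operand is promoted to int,
// the compiler folds it to the int constant -2, and -2 cannot be converted back
// to UInt16, so the line is rejected at compile time.
a &= ~(UInt16)0x0001;   // does not compile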
The sample below shows that, by default, the run time does not throw an exception:
UInt16 a = 0x00FF;
UInt16 b = 0x0001;
a &= (UInt16)~b;   // ~b is an int at run time; the cast simply truncates it back to UInt16
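What "by default" means here, as a sketch: if you opt into a checked context, the same narrowing conversion still fails, but at run time instead of compile time (this assumes a console program with using System;):
UInt16 a = 0x00FF;
UInt16 b = 0x0001;
try
{
    checked
    {
        a &= (UInt16)~b;   // ~b is -2; converting -2 to UInt16 overflows in a checked context
    }
}
catch (OverflowException)
{
    Console.WriteLine("checked context throws OverflowException at run time");
}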
Note that your code was also producing an int (the ~ operator promotes its UInt16 operand to int), so I moved the ~ so that it is applied before the cast back to UInt16.
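If you want to see that promotion directly, a minimal console snippet (assuming using System;) along these lines prints System.Int32 and -2:
UInt16 b = 0x0001;
var notB = ~b;                      // b is promoted to int before ~ is applied
Console.WriteLine(notB.GetType()); // System.Int32
Console.WriteLine(notB);           // -2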
Why: the C# compiler generally tries to prevent unintentional behavior where possible. Combine that with the fact that the bit-wise operators are only defined on int/uint/long/ulong (so smaller types are promoted) and with compile-time constant evaluation, and you get this compile-time error.
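If you do want the constant form to compile, here is a sketch of two common options, assuming that truncating the constant to 0xFFFE is what you intend:
UInt16 a = 0x00FF;
a &= unchecked((UInt16)~0x0001);   // unchecked lets the constant -2 be truncated to 0xFFFE
a = (UInt16)(a & ~0x0001);         // or cast the whole (non-constant) result back to UInt16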