I have some unsafe C# code that does pointer arithmetic on large blocks of memory of type byte*, running on a 64-bit machine. It works correctly most of the time, but produces wrong addresses when a large uint offset is added to a pointer.
I'm answering my own question as I have solved the problem, but would still be interested in reading comments about why the behavior changes with checked vs unchecked.
This code demonstrates the problem as well as the solution (always casting the offset to long before adding):
public static unsafe void Main(string[] args)
{
    // Dummy pointer, never dereferenced
    byte* testPtr = (byte*)0x00000008000000L;
    uint offset = uint.MaxValue;

    unchecked
    {
        // uint offset added directly: prints the wrong address
        Console.WriteLine("{0:x}", (long)(testPtr + offset));
    }
    checked
    {
        // Same expression, but correct under checked
        Console.WriteLine("{0:x}", (long)(testPtr + offset));
    }
    unchecked
    {
        // Offset cast to long first: correct even when unchecked
        Console.WriteLine("{0:x}", (long)(testPtr + (long)offset));
    }
    checked
    {
        // And correct here as well
        Console.WriteLine("{0:x}", (long)(testPtr + (long)offset));
    }
}
This will print (when run on a 64-bit machine):
7ffffff
107ffffff
107ffffff
107ffffff
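
The first (unchecked) result is exactly what you would get if the uint offset were sign-extended to a 64-bit value before the addition, so that uint.MaxValue behaves like -1. I have not verified this against the spec or the emitted IL, but the plain-integer sketch below (the addresses and variable names are mine) reproduces both values:

long baseAddr = 0x8000000L;                      // same address as testPtr
uint offset = uint.MaxValue;

long signExtended = unchecked((int)offset);      // -1: matches the unchecked pointer result
long zeroExtended = offset;                      // 0xffffffff: uint to long zero-extends

Console.WriteLine("{0:x}", baseAddr + signExtended); // 7ffffff
Console.WriteLine("{0:x}", baseAddr + zeroExtended); // 107ffffff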
(BTW, in my project I first wrote all the code as managed code without all this unsafe pointer arithmetic nastiness but found out it was using too much memory. This is just a hobby project; the only one that gets hurt if it blows up is me.)
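
In case anyone else trips over the same thing: a small wrapper keeps the long cast in one place, so no call site can accidentally take the sign-extending path. The name PtrAdd is mine, not anything from the framework:

public static unsafe byte* PtrAdd(byte* ptr, uint offset)
{
    // uint to long always zero-extends, so the addition is done
    // at full 64-bit width regardless of checked/unchecked context.
    return ptr + (long)offset;
}

// Usage: byte* p = PtrAdd(testPtr, offset);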