We noticed that lots of bugs in our software developed in C# (or Java) cause a NullReferenceException.
Is there a reason why "null" has even been included in the language?
Nullity is a natural consequence of reference types. If you have a reference, it has to refer to some object - or be null. If you were to prohibit nullity, you would always have to make sure that every variable was initialized with some non-null expression - and even then you'd have issues if variables were read during the initialization phase.
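To illustrate that last point, here is a minimal Java sketch (the `Base`/`Derived`/`Demo` names are hypothetical): the `Base` constructor runs before `Derived`'s field initializers, so even a field with a non-null initializer can be observed while it is still null.

```java
class Base {
    Base() {
        describe();                        // virtual call from a constructor
    }
    void describe() { }
}

class Derived extends Base {
    // Not yet assigned when Base() runs, despite the non-null initializer.
    private final String label = "ready";

    @Override
    void describe() {
        // label is still null here; calling a method on it would throw a
        // NullPointerException (the NullReferenceException equivalent in C#).
        System.out.println("label = " + label);
    }
}

public class Demo {
    public static void main(String[] args) {
        new Derived();                     // prints "label = null"
    }
}
```

So simply requiring an initializer on every variable would not be enough on its own; the language would also have to restrict what can happen during construction.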
How would you propose removing the concept of nullity?