I keep hearing people talk about how non-nullable reference types would solve so many bugs and make programming so much easier. Even the creator of null calls it his billion-dollar mistake.
I don't understand your example. If your "= new Class()" is just a placeholder standing in for not having null, then it's (to my eyes) obviously a bug. If it's not, then the real bug is that the "..." never set its contents correctly, which is exactly the same problem in both cases.
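To make that concrete, here's a minimal Kotlin sketch (the Customer class and function names are mine, not from your example): whether the "..." leaves c as null or as an empty placeholder object, the real bug, the code that never filled it in, is identical in both versions.

    // Hypothetical Customer class, just for illustration.
    class Customer(var name: String = "")

    fun withNull(): Customer {
        var c: Customer? = null
        // ... code that was supposed to assign a real Customer but didn't ...
        return c!!           // crashes here with a NullPointerException
    }

    fun withPlaceholder(): Customer {
        var c = Customer()   // the "= new Class()" style placeholder
        // ... code that was supposed to fill c in but didn't ...
        return c             // no crash, but c.name is still "" downstream
    }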
An exception showing that you forgot to initialize c will tell you where the uninitialized value was used, but not where it should have been initialized. Similarly, a silently skipped loop will (implicitly) tell you where the collection needed a nonzero .count, but not what should have been done about it or where. I don't see either one as being any easier on the programmer.
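For example (again just a sketch, assuming a plain list of names): the null version crashes at the point of use, while the no-null version silently does nothing; neither tells you where the data should have been supplied.

    fun greet(names: List<String>?) {
        // Null list: this line throws at the point of use, telling you
        // where the value was used uninitialized, not where it should
        // have been initialized.
        for (n in names!!) println("hello, $n")
    }

    fun greetNoNull(names: List<String>) {
        // Empty list: the loop body silently never runs; the missing
        // output only shows up later, and nothing points back to
        // whoever failed to populate the list.
        for (n in names) println("hello, $n")
    }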
I don't think the point of "no nulls" is to simply do a textual find-and-replace and make them all into empty instances. That's obviously useless. The point is to structure your code so your variables are never in a state where they point to useless/incorrect values, of which NULL is simply the most common.
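One sketch of what "structure your code" can mean in practice, here using a Kotlin sealed class (the names are mine, not from the thread): make the "value might be absent" case a type of its own, so the compiler forces every caller to handle it instead of letting a null or an empty placeholder slip through.

    // Sketch: represent "might be absent" as a type instead of
    // a null or an empty placeholder object.
    sealed class Lookup {
        data class Found(val name: String) : Lookup()
        object Missing : Lookup()
    }

    fun describe(result: Lookup): String = when (result) {
        is Lookup.Found -> "customer: ${result.name}"
        Lookup.Missing  -> "no such customer"  // omitting this branch is a compile error
    }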