See the JPL coding standards. Dynamic memory allocation leads to unpredictable execution. I have seen dynamic memory allocation cause problems in otherwise perfectly coded systems: over time the heap fragments, much like a hard disk. Allocating blocks from the heap takes longer and longer until a block of the requested size can no longer be found, at which point the allocator starts returning NULL pointers and the program crashes, because few programs, if any, test for out-of-memory conditions. Note that, on paper, you may have enough total memory available; it is the fragmentation of that memory that prevents the allocation.

The .NET CLR addresses this by using "handles" instead of raw pointers: the runtime's mark-and-sweep garbage collector can move memory around, compacting it during the sweep to prevent fragmentation and updating the handles to match. Raw pointers (memory addresses) cannot be updated that way. The downside is that garbage collection is no longer deterministic, although .NET has added mechanisms to make it more deterministic.

However, if you follow JPL's advice (section 2.5), you do not need a fancy garbage collector. Dynamically allocate everything you need at initialization, then reuse that memory and never free it: there is no fragmentation risk, and execution stays deterministic.
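A minimal sketch of that pattern in C (the pool size, the msg_buffer_t type, and the pool_init/pool_get names are illustrative, not taken from the JPL standard): everything is allocated once during initialization, the one NULL check happens where failing is still safe, and the steady-state code only reuses memory it already owns.

```c
#include <stddef.h>
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical message buffer, reused for the life of the program. */
#define MSG_CAPACITY 256u

typedef struct {
    size_t len;
    char   data[MSG_CAPACITY];
} msg_buffer_t;

static msg_buffer_t *g_buffers;      /* allocated once at startup */
static size_t        g_buffer_count;

/* Allocate everything the program will ever need, before the main loop starts.
 * This is the one place a NULL return is checked, while failing is still safe. */
static int pool_init(size_t count)
{
    g_buffers = calloc(count, sizeof *g_buffers);
    if (g_buffers == NULL) {
        fprintf(stderr, "pool_init: out of memory\n");
        return -1;
    }
    g_buffer_count = count;
    return 0;
}

/* Hand out a pre-allocated buffer. No malloc/free in the steady state, so the
 * cost is constant and the heap never fragments. */
static msg_buffer_t *pool_get(size_t index)
{
    return (index < g_buffer_count) ? &g_buffers[index] : NULL;
}

int main(void)
{
    if (pool_init(8) != 0)
        return EXIT_FAILURE;

    msg_buffer_t *buf = pool_get(0);
    buf->len = (size_t)snprintf(buf->data, MSG_CAPACITY, "telemetry frame %d", 1);
    printf("%s\n", buf->data);

    /* The buffers are never freed; they live for the lifetime of the process. */
    return EXIT_SUCCESS;
}
```

The only allocation that can fail is the one at startup, where the program can simply refuse to run; after that, memory behavior is constant-time and fragmentation-free.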