Denormals are known to perform severely worse than normals, on the order of 100x slower on some CPUs, and this frequently causes unexpected software slowdowns.
From a CPU architect's perspective: on many architectures the FPU hardware does not handle denormals at all, so handling falls back to software (typically a microcode assist or trap), which is where the large penalty comes from.
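To make the concept concrete, here is a short Python sketch (plain stdlib, no assumptions beyond IEEE 754 doubles, which CPython uses) showing where the denormal range begins and how "gradual underflow" behaves:

```python
import sys

# Smallest positive *normal* double: 2**-1022, about 2.2e-308.
smallest_normal = sys.float_info.min

# Dividing below that does NOT snap to zero: the result is a
# denormal (subnormal) value, still representable but with
# reduced precision. This is "gradual underflow".
denormal = smallest_normal / 2
print(denormal > 0)                # True
print(denormal < smallest_normal)  # True

# The smallest positive denormal is 2**-1074 (about 5e-324);
# halving it finally underflows to exactly 0.0.
tiniest = 2.0 ** -1074
print(tiniest > 0)                 # True
print(tiniest / 2 == 0.0)          # True
```

Any arithmetic whose operands or results land in that range between 2**-1074 and 2**-1022 is what triggers the slow software-assisted path on affected hardware.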
There's a good basic intro here, under "Performance issues": https://en.wikipedia.org/wiki/Denormal_number
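As a rough illustration of the software-side workaround that article discusses: when you can't (or don't want to) enable the hardware flush-to-zero/denormals-are-zero modes, a common mitigation (e.g. in audio DSP feedback loops) is to flush tiny values to zero manually. The function name and threshold below are illustrative, not from any particular library:

```python
# Manual flush-to-zero sketch. The threshold 1e-300 is an
# illustrative cutoff safely above the normal/denormal boundary
# (~2.2e-308): anything smaller is treated as exactly zero,
# trading a tiny amount of accuracy for predictable speed.
def flush_denormals(x: float, threshold: float = 1e-300) -> float:
    return 0.0 if abs(x) < threshold else x

print(flush_denormals(5e-324))  # 0.0 (denormal flushed)
print(flush_denormals(1.0))     # 1.0 (normal value untouched)
```

On hardware that supports it, setting the FTZ/DAZ control bits (e.g. in the x86 MXCSR register) achieves the same effect with no per-value cost, at the price of giving up gradual underflow.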