Once it's seen the first `.` in `0.1`, a subsequent `.` cannot be part of the number. It's all about ambiguity.
edit — section 7.8.3 of the spec explicitly insists on this:
The source character immediately following a NumericLiteral must not be an IdentifierStart or DecimalDigit.
I'm not sure exactly what that's trying to prevent, but the JavaScript lexer is pretty gnarly, mostly thanks to the regex literal grammar: the lexer can't tell on its own whether `/` begins a regex literal or is the division operator, so it needs feedback from the parser to decide.
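A quick sketch of the consequences (my own examples, not from the original): the greedy number rule plus the no-IdentifierStart restriction together explain which dotted forms parse.

```javascript
// Once "0.1" has been consumed as a NumericLiteral, the next "."
// can only start member access, so this parses fine:
console.log(0.1.toString());   // "0.1"

// "1." is itself a complete NumericLiteral, so in "1.." the
// second "." is member access:
console.log(1..toString());    // "1"

// But "1.toString()" is a SyntaxError: after the literal "1.",
// "t" is an IdentifierStart, which the spec rule above forbids.
try {
  eval("1.toString()");
} catch (e) {
  console.log(e instanceof SyntaxError);  // true
}
```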