I just read on MDN that one of the quirks of JS's handling of numbers, due to everything being "double-precision 64-bit format IEEE 754 values", is that when you do arithmetic with decimal fractions the result is not always exact. One way to work around this is:
Math.floor((0.1 + 0.2) * 1000) / 1000
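For convenience this can be wrapped in a small helper; the function name and the hard-coded factor of 1000 below are just illustrative, not from MDN:

// Truncate a value to 3 decimal places by scaling up, flooring, and scaling back down.
function floorToThousandths(value) {
  return Math.floor(value * 1000) / 1000;
}

floorToThousandths(0.1 + 0.2)   // 0.3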
This reduces the precision of the floating-point result, but it solves the problem as long as you are not working with very small values. For example:
0.1 + 0.2
// 0.30000000000000004

After the proposed operation you will get 0.3. But keep in mind that any value that is at least 0.3 and less than 0.301 (for example 0.30000000000000999) will also be considered 0.3, because Math.floor(x * 1000) maps the whole range [0.300, 0.301) to 300.
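A quick way to see this loss of precision (the sample values here are chosen just for illustration):

Math.floor(0.3005 * 1000) / 1000   // 0.3   - collapsed to 0.3
Math.floor(0.3009 * 1000) / 1000   // 0.3   - collapsed to 0.3
Math.floor(0.301 * 1000) / 1000    // 0.301 - first value that is not collapsed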