I wrote a function that behaves differently depending on the numeric type of its parameters: integer or float.
Using some code from the question How do I check that a number is float or integer?, it was easy to detect whether a value is a float, but then I stumbled upon the case where JavaScript casts 1.0 to 1, seemingly without cause, when you call a function with that number.
Example:
function dump(a, b) {
    console.log(a, typeof b, b);
}

dump('1', 1);
dump('1.0', 1.0);
dump('1.1', 1.1);
Output: Chrome, Firefox, IE, Opera and Safari all gave the same result:
1 number 1
1.0 number 1 "wrong"
1.1 number 1.1
I know that JavaScript only has the type number, but that forced cast seems to go way overboard. The only solution I came up with was to call the function with string values like '1.0', detect the dot, and then use parseFloat or parseInt.
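A minimal sketch of that string-based workaround (the helper name parseNumber is made up for illustration):

function parseNumber(str) {
    // Treat anything containing a dot as a float, everything else as an integer.
    if (str.indexOf('.') !== -1) {
        return { value: parseFloat(str), isFloat: true };
    }
    return { value: parseInt(str, 10), isFloat: false };
}

parseNumber('1');   // { value: 1, isFloat: false }
parseNumber('1.0'); // { value: 1, isFloat: true }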
Any suggestion on that?
You've acknowledged that JavaScript only has a single Number type. As such, 1 is identical to 1.0.
If you need this for display purposes, then you should use toFixed.
1..toFixed(1); // "1.0"
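For instance, a small sketch applying that to the dump example above (one decimal place assumed, dumpFixed is a made-up name):

function dumpFixed(b) {
    // toFixed returns a string, so the trailing .0 is preserved for display.
    console.log(typeof b, b.toFixed(1));
}

dumpFixed(1);   // number 1.0
dumpFixed(1.1); // number 1.1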
number % 1 === 0

If that condition is true, it's an integer; otherwise, it's a float.
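A minimal sketch wrapping that check in a helper (the name isWholeNumber is made up for illustration):

function isWholeNumber(n) {
    // A number with no fractional part leaves a remainder of 0 when divided by 1.
    return typeof n === 'number' && n % 1 === 0;
}

isWholeNumber(1);   // true
isWholeNumber(1.0); // true (1.0 and 1 are the same value)
isWholeNumber(1.1); // false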
Source: https://stackoverflow.com/questions/11798903/javascript-casts-floating-point-numbers-to-integers-without-cause