What can I use as a decimal type in JavaScript? It's not supported (0.1 + 0.2 !== 0.3), and I need it for representing exact values in a banking/financial application.
JavaScript does have floating-point support, but for financial records the simplest approach is to store your values as plain integers. You can either use a single integer representing the amount in cents, or two integers, one for dollars and one for cents.
So $18.57 becomes 1857 cents in the first technique, or 18 dollars and 57 cents in the second.
This has the added advantage of being exact: integers (up to 2^53 - 1 in JavaScript) have an exact binary representation, so there are no rounding errors.
It is often recommended1 to handle money as an integer representing the number of cents: 2572 cents instead of 25.72 dollars. This avoids the floating-point problems you mention. Fortunately, integer arithmetic in floating point is exact (for magnitudes up to 2^53 - 1), so decimal representation errors can be avoided by scaling.
1 Douglas Crockford: JavaScript: The Good Parts, Appendix A, Awful Parts (page 105).
It seems the following library implements a decimal type in JavaScript (Node and browser): https://npmjs.org/package/jsdecimal
Take a look at BigNumber and that post too.