Question
I'm working on a program where you can choose up to 3 things you want to divvy points amongst.
Say for example that an action gains you 4 points, and those 4 points are divvied amongst the 3 things you selected.
In this case, those 3 things each get 1.33333... points.
In my database, they are stored as 1.33.
However, when I read them back out, they tally up to 3.99.
Understandable.
But how can I avoid this without giving one of the things 1.34 points?
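For illustration, here is a minimal sketch of the arithmetic described above (Python just for illustration; the asker's own code and schema are not shown):

```python
# 4 points split across 3 choices, each share rounded to 2 decimal
# places before it is stored, as in the question.
share = round(4 / 3, 2)        # 1.33, the value that ends up in the database
total = share * 3              # about 3.99, not 4
print(share, f"{total:.2f}")   # 1.33 3.99
```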
Answer 1:
Store the full float/double in your database rather than truncating to 2 decimal places. The time to truncate is when displaying the value to the user, and even then, truncate only the displayed string, not the actual value.
Floating point values are the annoying drunk uncle of computing. Just let them be what they are, and then clean them up when presenting to the public eye.
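A minimal sketch of this approach, assuming the three stored values come back as ordinary floats (variable names here are hypothetical):

```python
# Store full precision; round only what the user sees.
shares = [4 / 3, 4 / 3, 4 / 3]       # keep these full-precision values as-is

for s in shares:
    print(f"share: {s:.2f}")         # each displays as 1.33
print(f"total: {sum(shares):.2f}")   # displays as 4.00, since the real sum is ~4
```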
Answer 2:
Floating point numbers will be lossy in this case. If you are dealing with integer numerators and denominators, why not store the numbers as fractions? You can make use of PEAR's Math Fraction library or write something yourself.
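The answer points at a PHP library; as an analogous sketch, Python's built-in fractions module shows the same idea of storing an exact numerator/denominator pair instead of a lossy decimal:

```python
from fractions import Fraction

share = Fraction(4, 3)          # store "4/3", e.g. as two integer columns
total = share * 3               # Fraction(4, 1): exact, nothing lost
print(share, total)             # 4/3 4
print(f"{float(share):.2f}")    # 1.33, converting to float only for display
```

The conversion to a float (or a formatted string) happens only at the display boundary, so the stored values never drift.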
Answer 3:
Use a third decimal place, not for display but only for tracking precision. If someone divides 4 points among three things, store each share as 1.333. When you calculate back, you get 3.999, which you round up to 4. On the other hand, if someone divides 3.99 among three objects, store each share as 1.33, so when you calculate back you get 3.99 (and not 3.999) and thus you know not to round up.
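A rough Python sketch of this scheme (the helper names are invented for illustration, not from the answer):

```python
def split(points, n):
    # Keep a third "tracking" decimal in storage.
    return [round(points / n, 3)] * n    # e.g. 4 / 3 -> 1.333

def tally(shares):
    # Round the reconstructed total back to the two decimals used elsewhere.
    return round(sum(shares), 2)         # 3.999 rounds up to 4.0

print(split(4, 3), tally(split(4, 3)))        # [1.333, 1.333, 1.333] 4.0
print(split(3.99, 3), tally(split(3.99, 3)))  # [1.33, 1.33, 1.33] 3.99
```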
Source: https://stackoverflow.com/questions/9425942/why-are-the-decimals-not-adding-up-correctly