Question
My function converts a string to a Decimal:
func getDecimalFromString(_ strValue: String) -> NSDecimalNumber {
    let formatter = NumberFormatter()
    formatter.maximumFractionDigits = 1
    formatter.generatesDecimalNumbers = true
    return formatter.number(from: strValue) as? NSDecimalNumber ?? 0
}
But it is not working as expected. Sometimes it returns values like
Optional(8.300000000000001)
Optional(8.199999999999999)
instead of 8.3 or 8.2. The string contains values like "8.3" or "8.2", but the converted decimal is not what I need. Any suggestion where I made a mistake?
Answer 1:
That seems to be a bug, compare
- NSDecimalNumbers from NSNumberFormatter are affected by binary approximation error
Even with generatesDecimalNumbers set to true, the formatter produces a binary floating point number internally, and that cannot represent decimal fractions like 8.2 precisely.
Note also that (contrary to the documentation) the maximumFractionDigits property has no effect when parsing a string into a number.
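A minimal sketch of that behavior, assuming a Foundation version that shows the issue; it also illustrates that the fraction-digit limit is ignored while parsing:

import Foundation

let formatter = NumberFormatter()
formatter.generatesDecimalNumbers = true
formatter.maximumFractionDigits = 1   // no effect when parsing a string

// On affected Foundation versions this prints something like
// Optional(8.199999999999999) instead of Optional(8.2), because the
// formatter goes through a binary Double internally.
let parsed = formatter.number(from: "8.2") as? NSDecimalNumber
print(parsed as Any)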
There is a simple solution: Use
NSDecimalNumber(string: strValue) // or
NSDecimalNumber(string: strValue, locale: Locale.current)
instead, depending on whether the string is localized or not.
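Applied to the function from the question, a minimal sketch (the NaN check with a fallback to zero is an assumption, chosen to mirror the original "?? 0" default):

import Foundation

// Parses the string directly as a decimal instead of going through
// NumberFormatter's binary floating point representation.
func getDecimalFromString(_ strValue: String) -> NSDecimalNumber {
    let result = NSDecimalNumber(string: strValue)
    // NSDecimalNumber(string:) yields NaN for unparseable input;
    // fall back to zero to keep the original function's default.
    return result.decimalValue.isNaN ? .zero : result
}

print(getDecimalFromString("8.2")) // 8.2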
Or with the Swift 3 Decimal type:
Decimal(string: strValue) // or
Decimal(string: strValue, locale: .current)
Example:
if let d = Decimal(string: "8.2") {
    print(d) // 8.2
}
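If the input string can be localized (for example, using a decimal comma), the locale parameter is what makes the parse exact; a small illustration, assuming a German-formatted input:

import Foundation

// "8,2" uses a decimal comma; with a German locale it parses to the
// exact decimal 8.2. Without the locale the comma would not be
// recognized as the decimal separator.
if let d = Decimal(string: "8,2", locale: Locale(identifier: "de_DE")) {
    print(d) // 8.2
}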
Source: https://stackoverflow.com/questions/43538661/generatesdecimalnumbers-for-numberformatter-does-not-work