Determine the decimal precision of an input number


Question


We have an interesting problem where we need to determine the decimal precision of a user's input (textbox). Essentially we need to know the number of decimal places entered and then return a precision number; this is best illustrated with examples:

4500 entered will yield a result of 1
4500.1 entered will yield a result of 0.1
4500.00 entered will yield a result of 0.01
4500.450 entered will yield a result of 0.001

We are thinking of working with the string, finding the decimal separator and then calculating the result. We are just wondering if there is an easier solution to this.


Answer 1:


I think you should just do what you suggested: use the position of the decimal point. An obvious drawback is that you have to handle internationalization (the culture-specific decimal separator) yourself.

var decimalSeparator = NumberFormatInfo.CurrentInfo.CurrencyDecimalSeparator;

var position = input.IndexOf(decimalSeparator);

var precision = (position == -1) ? 0 : input.Length - position - 1;

// This may be quite imprecise.
var result = Math.Pow(0.1, precision);

There is another thing you could try: the Decimal type stores an internal precision (scale) value. You could therefore use Decimal.TryParse() and inspect the returned value; the parsing algorithm may well maintain the precision of the input.

Finally, I would suggest not trying anything based on floating-point numbers. Just parsing the input will remove any information about trailing zeros, so you would have to add an artificial non-zero digit to preserve them, or do similar tricks. You might run into precision issues, and recovering the precision from a floating-point number is not simple either: I see some ugly math, or a loop multiplying by ten each iteration until there is no longer any fractional part, and that loop comes with new precision issues of its own.

UPDATE

Parsing into a decimal works. See Decimal.GetBits() for details.

var input = "123.4560";

var number = Decimal.Parse(input);

// Will be 4: the scale is stored in bits 16-23 of the fourth element returned by GetBits.
var precision = (Decimal.GetBits(number)[3] >> 16) & 0x000000FF;

From here, using Math.Pow(0.1, precision) is straightforward.
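
Putting the update together into a single helper might look like this (a minimal sketch; the name GetDecimalPrecision is made up for illustration, and unparsable input is simply treated as whole-number precision):

static double GetDecimalPrecision(string input)
{
    if (!Decimal.TryParse(input, out decimal number))
    {
        return 1; // assumption: fall back to whole-number precision for unparsable input
    }

    // The scale (number of decimal digits) sits in bits 16-23 of the fourth element.
    int scale = (Decimal.GetBits(number)[3] >> 16) & 0x000000FF;
    return Math.Pow(0.1, scale);
}

For the examples in the question, "4500" gives 1, "4500.1" gives 0.1 and "4500.00" gives 0.01, subject to the usual double rounding from Math.Pow.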




Answer 2:


Just wondering if there is an easier solution to this.

No.

Use the string:

string[] res = inputstring.Split('.');
// Guard against input with no decimal separator to avoid an IndexOutOfRangeException.
int precision = (res.Length > 1) ? res[1].Length : 0;
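
To turn that count into the precision value the question asks for (1, 0.1, 0.01, ...), one option is a power of ten, as in the other answers (a small sketch reusing precision from above):

double result = Math.Pow(0.1, precision); // 0 -> 1, 1 -> 0.1, 2 -> 0.01, ...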



Answer 3:


Since your last examples indicate that trailing zeroes are significant, I would rule out any numerical solution and go for the string operations.




Answer 4:


No, there is no easier solution; you have to examine the string. If you convert "4500" and "4500.00" to numbers, they both become the value 4500, so you can't tell how many trailing (non-significant) digits there were behind the decimal separator.
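
A small demonstration of this point (a sketch using double; decimal behaves differently, as the next answer shows):

double a = double.Parse("4500");
double b = double.Parse("4500.00");
Console.WriteLine(a == b); // True: after parsing, the two inputs are indistinguishable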




Answer 5:


As an interesting aside, the Decimal type tries to maintain the precision (scale) entered by the user. For example,

Console.WriteLine(5.0m);
Console.WriteLine(5.00m);
Console.WriteLine(Decimal.Parse("5.0"));
Console.WriteLine(Decimal.Parse("5.00"));

the output is:

5.0
5.00
5.0
5.00

If your motivation in tracking the precision of the input is purely for input and output reasons, this may be sufficient to address your problem.




Answer 6:


Working with the string is easy enough.

If there is no "." in the string, return 1.

Otherwise return "0." followed by n-1 "0" characters and one final "1", where n is the number of digits after the decimal point.
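
A minimal sketch of that string-building approach (the method name GetPrecisionString is made up for illustration, and the decimal separator is assumed to be '.'):

static string GetPrecisionString(string input)
{
    int dot = input.IndexOf('.');
    if (dot == -1)
    {
        return "1"; // no decimal point entered
    }

    int n = input.Length - dot - 1; // number of digits after the decimal point
    if (n == 0)
    {
        return "1"; // input like "4500." has no fractional digits
    }

    return "0." + new string('0', n - 1) + "1"; // e.g. "0.001" for three decimals
}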




Answer 7:


Here's a possible solution using strings:

static double GetPrecision(string s)
{
    // Assumes '.' as the decimal separator.
    string[] splitNumber = s.Split('.');
    if (splitNumber.Length > 1)
    {
        // e.g. two digits after the point -> 1 / 100 = 0.01
        return 1 / Math.Pow(10, splitNumber[1].Length);
    }
    else
    {
        return 1;
    }
}

There is a related question, Calculate System.Decimal Precision and Scale, which might be of interest if you wish to delve into this some more.



Source: https://stackoverflow.com/questions/3281865/determine-the-decimal-precision-of-an-input-number
