Why does Decimal.Divide(int, int) work, but not (int / int)?

Asked by 旧时难觅i, 2020-12-08 03:18

How come dividing two 32-bit int numbers as (int / int) returns 0, but if I use Decimal.Divide() I get the correct answer? I'm by no means
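
A minimal sketch of the behavior being asked about (an illustrative console program, not from the original post; the class name is made up):

    using System;

    class IntDivisionDemo
    {
        static void Main()
        {
            int a = 1, b = 3;

            // Both operands are int, so this is integer division,
            // which truncates the fractional part: 1 / 3 == 0.
            Console.WriteLine(a / b);                // 0

            // Decimal.Divide takes two decimal parameters; the int
            // arguments are implicitly converted, so this divides
            // 1.0m by 3.0m and keeps the fraction.
            Console.WriteLine(Decimal.Divide(a, b)); // 0.3333333333333333333333333333

            // Casting either operand forces floating-point division.
            Console.WriteLine((double)a / b);        // 0.333333333333333...
        }
    }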

8 answers
  •  -上瘾入骨i · 2020-12-08 04:00

    The accepted answer is very nearly there, but I think it is worth adding that there is a difference between using double and decimal.

    I could not do a better job explaining the concepts than Wikipedia does, so I will just provide the pointers:

    floating-point arithmetic

    decimal data type

    In financial systems, it is often a requirement that we can guarantee a certain number of (base-10) decimal places of accuracy. This is generally impossible if the input/source data is in base-10 but we perform the arithmetic in base-2, because whether a number has a finite expansion depends on the base: one third takes infinitely many digits to express in base-10 (0.333333...), but only one digit in base-3 (0.1).
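
    As an illustrative sketch of this point (not part of the original answer; the class name is made up), one tenth is exact in decimal but has no finite base-2 expansion, so repeated double additions drift:

        using System;

        class TenthsDemo
        {
            static void Main()
            {
                double d = 0.0;
                decimal m = 0.0m;

                // Each double addition of 0.1 rounds, because 0.1 is
                // not exactly representable in base-2; 0.1m is stored
                // exactly, so the decimal sum stays exact.
                for (int i = 0; i < 10; i++)
                {
                    d += 0.1;
                    m += 0.1m;
                }

                Console.WriteLine(d == 1.0);  // False: the double sum is slightly below 1
                Console.WriteLine(m == 1.0m); // True: exact base-10 arithmetic
            }
        }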

    Floating-point numbers are faster to work with (in terms of CPU time; programming-wise they are equally simple) and are preferred when speed matters more than exact base-10 results, as in scientific applications where the inputs are approximate measurements anyway.
