Why does Decimal.Divide(int, int) work, but not (int / int)?

旧时难觅i 2020-12-08 03:18

How come dividing two 32 bit int numbers as ( int / int ) returns to me 0, but if I use Decimal.Divide() I get the correct answer? I'm by no means …

8 Answers
  •  自闭症患者
    2020-12-08 04:17

    int is an integer type; dividing two ints performs an integer division, i.e. the fractional part is truncated since it can't be stored in the result type (also int!). decimal, by contrast, can represent a fractional part. By invoking Decimal.Divide, your int arguments get implicitly converted to decimal.
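
    For illustration, a minimal sketch of the difference (the class name DivideDemo is made up for this example):

    using System;

    class DivideDemo
    {
        static void Main()
        {
            // Integer division truncates: prints 2
            Console.WriteLine(7 / 2);
            // Both ints are implicitly converted to decimal: prints 3.5
            Console.WriteLine(Decimal.Divide(7, 2));
        }
    }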

    You can force non-integer division on int arguments by explicitly casting at least one of the arguments to a floating-point type, e.g.:

    int a = 42;
    int b = 23;
    double result = (double)a / b;   // 1.826..., not the 1 you'd get from a / b
    
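    If you want decimal precision rather than double, casting one operand to decimal has the same effect as calling Decimal.Divide:

    decimal exact = (decimal)a / b;   // same result as Decimal.Divide(a, b)
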
