Why does printf output random values for double and 0.000000 for int?

小蘑菇 2020-12-12 01:18

I know this is a simple question, but it came up while I was coding and I am wondering how it works. So, my first question is: when printf is given an integer like b

2 Answers
  • 2020-12-12 01:48

    You used the wrong format specifiers. It should be:

    int a = 2, b = 5, result = 0;
    result = b/a*a;           /* integer division truncates: 5/2 == 2, then 2*2 == 4 */
    
    printf("%d\n", result);   /* %d matches an int argument */
    
    ...
    
    double a = 2, b = 5, result = 0;
    result = b/a*a;           /* floating-point division: 5.0/2.0 == 2.5, then 2.5*2.0 == 5.0 */
    
    printf("%f\n", result);   /* %f matches a double argument */
    
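    Assuming the values above, the two corrected snippets can be combined into one self-contained program (the variables are renamed here only so both versions coexist):

    ```c
    #include <stdio.h>

    int main(void) {
        int ia = 2, ib = 5;
        int iresult = ib / ia * ia;     /* integer division truncates: 5/2 == 2, 2*2 == 4 */
        printf("%d\n", iresult);        /* prints 4 */

        double da = 2, db = 5;
        double dresult = db / da * da;  /* floating division: 5.0/2.0 == 2.5, 2.5*2.0 == 5.0 */
        printf("%f\n", dresult);        /* prints 5.000000 */

        return 0;
    }
    ```

    Note that with matching specifiers the two versions still print different numbers, because integer `/` discards the remainder before the multiplication.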
  • 2020-12-12 02:07

    Basically, it is because if you interpret the bits that make up a small integer as if they were a double, the result looks like the double value 0, whereas if you interpret the bits that represent a small double value as an integer, it looks like something more interesting. Here is a link to a page that describes how the bits are used to represent a double: http://en.m.wikipedia.org/wiki/IEEE_floating_point . With this link and a little patience, you can actually work out the integer value that a given double would be interpreted as.

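    That reinterpretation can be demonstrated without the undefined behavior of a mismatched printf call by copying the raw bytes with memcpy. A minimal sketch, assuming the platform's double is the usual IEEE 754 binary64 format:

    ```c
    #include <stdio.h>
    #include <string.h>
    #include <stdint.h>

    int main(void) {
        /* A small integer's bit pattern, read as a double, is a tiny
           subnormal value (5 * 2^-1074 here), which %f rounds to 0.000000. */
        uint64_t small = 5;
        double as_double;
        memcpy(&as_double, &small, sizeof as_double);
        printf("%f\n", as_double);                    /* prints 0.000000 */

        /* A small double's bit pattern, read as an integer, is huge,
           because the biased exponent occupies the high bits. */
        double d = 5.0;
        uint64_t as_int;
        memcpy(&as_int, &d, sizeof as_int);
        printf("%llx\n", (unsigned long long)as_int); /* 4014000000000000 on IEEE 754 systems */

        return 0;
    }
    ```

    (Strictly speaking, a wrong specifier in a real printf call is undefined behavior, so on some ABIs printf may even read an unrelated register rather than these bits; the bit view above is the intuition behind the typical output.)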