How disastrous is integer overflow in C++?


Question


I was just wondering how disastrous integer overflow really is. Take the following example program:

#include <iostream>

int main()
{
    int a = 46341;
    int b = a * a;
    std::cout << "hello world\n";
}

Since a * a overflows on 32-bit platforms, and integer overflow triggers undefined behavior, do I have any guarantees at all that hello world will actually appear on my screen?


I removed the "signed" part from my question based on the following standard quotes:

(§5/5 C++03, §5/4 C++11) If during the evaluation of an expression, the result is not mathematically defined or not in the range of representable values for its type, the behavior is undefined.

(§3.9.1/4) Unsigned integers, declared unsigned, shall obey the laws of arithmetic modulo 2^n where n is the number of bits in the value representation of that particular size of integer. This implies that unsigned arithmetic does not overflow because a result that cannot be represented by the resulting unsigned integer type is reduced modulo the number that is one greater than the largest value that can be represented by the resulting unsigned integer type.
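To make the unsigned case concrete, here is a minimal sketch (assuming a platform where unsigned int is 32 bits, which the standard does not guarantee): the same computation done with unsigned types is well-defined, because any result that does not fit is reduced modulo 2^32 instead of overflowing.

#include <iostream>

int main()
{
    // Assuming 32-bit unsigned int: 46341 * 46341 = 2147488281, which
    // still fits in 32 unsigned bits, so no reduction even happens here.
    unsigned int ua = 46341u;
    unsigned int ub = ua * ua;

    // When the mathematical result does not fit, unsigned arithmetic is
    // defined to wrap: 65536 * 65536 = 2^32, reduced modulo 2^32 to 0.
    unsigned int wrapped = 65536u * 65536u;

    std::cout << ub << ' ' << wrapped << '\n';   // prints "2147488281 0"
}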


Answer 1:


As pointed out by @Xeo in the comments (I actually brought it up in the C++ chat first):
Undefined behavior really means it, and it can hit you when you least expect it.

The best example of this is here: Why does integer overflow on x86 with GCC cause an infinite loop?

On x86 hardware, signed integer overflow is just a simple wrap-around, so you would normally expect the same thing to happen in C or C++. However, the compiler is allowed to intervene and use the undefined behavior as an opportunity to optimize.

In the example taken from that question:

#include <iostream>
using namespace std;

int main(){
    int i = 0x10000000;

    int c = 0;
    do{
        c++;
        i += i;              // doubling i eventually overflows a signed int: undefined behavior
        cout << i << endl;
    }while (i > 0);          // GCC may assume i never overflows, so this test is "always" true

    cout << c << endl;
    return 0;
}

When compiled with optimizations, GCC assumes that signed overflow cannot happen, optimizes away the loop test, and turns this into an infinite loop.
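For comparison, here is a hedged sketch of one way to write a similar loop without ever invoking undefined behavior: check against INT_MAX / 2 before doubling, so the compiler has nothing to "assume away". (Note this stops one iteration earlier than a wrap-around reading of the original would.)

#include <iostream>
#include <limits>

int main()
{
    int i = 0x10000000;
    int c = 0;

    // Double i only while doing so cannot exceed INT_MAX, so signed
    // overflow never occurs and the loop condition stays meaningful.
    while (i <= std::numeric_limits<int>::max() / 2) {
        ++c;
        i += i;
        std::cout << i << '\n';
    }

    std::cout << c << '\n';
    return 0;
}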




Answer 2:


You may trigger some hardware safety feature. So no, you don't have any guarantee.

Edit: note that GCC has the -ftrapv option (though it doesn't seem to work for me).
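If you want to detect the overflow yourself rather than trap on it, GCC and Clang also provide checked-arithmetic builtins (a compiler extension, not standard C++). A minimal sketch:

#include <iostream>

int main()
{
    int a = 46341;
    int b = 0;

    // __builtin_mul_overflow returns true if the mathematical product
    // of a and a does not fit in b; b is only meaningful otherwise.
    if (__builtin_mul_overflow(a, a, &b)) {
        std::cerr << "a * a would overflow int\n";
        return 1;
    }

    std::cout << b << '\n';
    return 0;
}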




Answer 3:


There are two views about undefined behavior. One view is that it exists to cater for strange hardware and other special cases, but that it should usually behave sanely. The other view is that anything can happen. Depending on the source of the UB, people hold different opinions.

While the UB around overflow was probably introduced to account for hardware that traps or saturates on overflow, and for the differing results across integer representations, so that one can argue for the first view in this case, the people writing optimizers hold very dearly to the view that if the standard doesn't guarantee something, then really anything can happen, and they use every bit of that liberty to generate machine code that runs faster, even if the result no longer makes sense.

So when you run into undefined behavior, assume that anything can happen, however reasonable a given behavior may seem.
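A classic illustration of that optimizer mindset (a sketch, not taken from the answer above): an after-the-fact overflow check that relies on wrap-around may simply be compiled away, whereas a check done before the arithmetic is portable.

#include <limits>

// Broken: since signed overflow is undefined behavior, the compiler may
// assume a + 1 never wraps and fold this whole test to "false".
bool overflow_check_broken(int a)
{
    return a + 1 < a;
}

// Portable: compare against the limit first, so no overflow ever happens.
bool overflow_check_ok(int a)
{
    return a == std::numeric_limits<int>::max();
}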



Source: https://stackoverflow.com/questions/9024826/how-disastrous-is-integer-overflow-in-c
