No segmentation fault when assignment increases beyond malloc size [duplicate]

坚强是说给别人听的谎言 · submitted 2020-06-28 00:02:32

Question


#include <stdio.h>
#include <stdlib.h>

int main()
{
    int *p=NULL;
    p=malloc(8);

    if (p==NULL)
        printf("Mem alloc failed\n");
    else
        printf("Passed\n");

    unsigned long int i=0;

    for (i=2;i<=10000;i++)
        p[i]=10;  // I thought writing beyond the 8-byte malloc'd block would cause a segmentation fault, but I was wrong.

    printf("p[10000]= %d\n",p[10000]);

    free(p);
    return 0;
}

What can be the reason behind this? When I increase the loop count to

pow(2,32) ( for(i=2;i<=((pow(2,32)-1));i++) )

I do get a segmentation fault.


Answer 1:


There is no guarantee a crash will actually happen. This kind of error is often silently ignored. It's even possible to write over the memory used to store the internal structures of the runtime's memory management system, and thus corrupt the heap. It's undefined behaviour, after all - nothing is guaranteed.

Anyway, preventing heap overflow in general is a subject of extensive research. Apparently it is possible at present, but it degrades performance significantly. Try googling for "heap overflow protection" if you find the topic interesting; you'll likely find a wide range of papers and technical descriptions of state-of-the-art techniques and current developments.
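Short of general heap-overflow protection, sanitizer tooling can already turn this silent overwrite into an immediate diagnostic. A sketch, assuming gcc (or clang) with AddressSanitizer support; the file name `overflow.c` is just an example:

```shell
# Write a trimmed version of the question's program to a file
cat > overflow.c <<'EOF'
#include <stdlib.h>
int main(void) {
    int *p = malloc(8);
    p[2] = 10;          /* out of bounds: only 8 bytes were allocated */
    free(p);
    return 0;
}
EOF

# Build with AddressSanitizer; the run then aborts with a
# heap-buffer-overflow report instead of silently succeeding
gcc -g -fsanitize=address overflow.c -o overflow
./overflow; echo "exit status: $?"
```

The sanitized binary points at the exact out-of-bounds write, which is far easier to debug than a crash that only appears once the loop runs far past the mapped pages.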




Answer 2:


What you describe here is undefined behaviour: it might raise an exception, might be silently ignored, or might do just about anything, precisely because the behaviour is undefined. The standard defines it as:

behavior, such as might arise upon use of an erroneous program construct or erroneous data, for which this International Standard imposes no requirements.

Undefined behavior may also be expected when this International Standard omits the description of any explicit definition of behavior.



Source: https://stackoverflow.com/questions/18734898/no-segmentation-fault-when-assignement-increases-beyond-malloc-size
