Alternate way of computing size of a type using pointer arithmetic

广开言路 2021-01-12 04:53

Is the following code 100% portable?

int a = 10;
size_t size_of_int = (char *)(&a + 1) - (char *)(&a); // No problem here?

std::cout << size_of_int; // expected to print the same value as sizeof(int)
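
For reference, here is a self-contained version of the idea (a sketch; the helper name ptr_size is mine, not from the question):

#include <cstddef>
#include <iostream>

// Sketch: compute a type's size by subtracting char pointers, then
// compare the result with sizeof. ptr_size is a hypothetical helper.
template <typename T>
std::size_t ptr_size()
{
    T a = T();
    return (char *)(&a + 1) - (char *)&a;
}

int main()
{
    std::cout << ptr_size<int>()    << " == " << sizeof(int)    << '\n';
    std::cout << ptr_size<double>() << " == " << sizeof(double) << '\n';
}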

        
9 Answers
  •  南方客 (OP)
     2021-01-12 05:40

    From ISO/IEC 14882:2003 (C++03), p. 87:

    "75) Another way to approach pointer arithmetic is first to convert the pointer(s) to character pointer(s): In this scheme the integral value of the expression added to or subtracted from the converted pointer is first multiplied by the size of the object originally pointed to, and the resulting pointer is converted back to the original type. For pointer subtraction, the result of the difference between the character pointers is similarly divided by the size of the object originally pointed to."

    This seems to suggest that the pointer difference equals the object size.
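
    As a small illustration of the scheme footnote 75 describes (the variable names are my own):

    #include <cassert>

    int main()
    {
        int arr[4];
        int* p = arr;
        // Addition via char pointers: the index is first multiplied by sizeof(int).
        int* q = (int*)((char*)p + 3 * sizeof(int));
        assert(q == p + 3);
        // Subtraction: the char-pointer difference divided by sizeof(int) gives back 3.
        assert(((char*)q - (char*)p) / sizeof(int) == 3);
    }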

    If we remove any doubt about UB from incrementing a pointer to the scalar a, and turn a into an array:

    int a[1];
    size_t size_of_int = (char*)(a + 1) - (char*)(a);

    std::cout << size_of_int; // again prints sizeof(int)

    Then this looks OK. The clauses about alignment requirements are consistent with the footnote, provided the size of an object is always a multiple of its alignment requirement (a quick check of this follows below).
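
    A quick way to check that relationship on a given implementation (this sketch uses C++11's alignof and static_assert, which postdate the C++03 text quoted above):

    #include <iostream>

    // Sketch: for any type usable as an array element, sizeof(T) must be
    // a multiple of alignof(T), so char-pointer subtraction yields sizeof(T).
    template <typename T>
    void check()
    {
        static_assert(sizeof(T) % alignof(T) == 0,
                      "size is a multiple of alignment");
        T arr[2];
        std::cout << (char*)(arr + 1) - (char*)arr  // == sizeof(T)
                  << " == " << sizeof(T) << '\n';
    }

    int main()
    {
        check<int>();
        check<double>();
    }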

    UPDATE: Interesting. As most of you probably know, GCC allows you to specify an explicit alignment for a type as an extension. But I can't break the OP's "sizeof" method with it, because GCC refuses to compile the following:

    #include <stdio.h>

    typedef int a8_int __attribute__((aligned(8)));

    int main()
    {
        a8_int v[2];

        /* The difference is a ptrdiff_t; cast to int to match %d. */
        printf("=>%d\n", (int)((char *)&v[1] - (char *)&v[0]));
    }
    

    The message is: "error: alignment of array elements is greater than element size".
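
    By contrast, GCC does accept an over-aligned struct: it pads the struct so that sizeof stays a multiple of the alignment, which keeps the pointer trick intact. A sketch under that assumption (compiled as GNU C++):

    #include <cstdio>

    // GCC pads this struct from 1 byte to 8 so that array elements stay aligned.
    struct s { char c; } __attribute__((aligned(8)));

    int main()
    {
        s v[2];
        // Both values print 8: sizeof already includes the alignment padding.
        std::printf("sizeof=%u diff=%d\n", (unsigned)sizeof(s),
                    (int)((char *)&v[1] - (char *)&v[0]));
    }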
