When to use different integer types?
Question: Programming languages (e.g. C, C++, and Java) usually have several types for integer arithmetic:

- signed and unsigned types
- types of different sizes: short, int, long, long long
- types of guaranteed and of non-guaranteed (i.e. implementation-dependent) size: e.g. int32_t vs int (and I know that int32_t is not part of the language)

How would you summarize when one should use each of them?

Answer 1:

The default integral type (int) gets a "first among equals" preferential treatment in pretty much all languages.
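A minimal C sketch of the three categories the question names. The sizes printed are platform-dependent by design; int32_t and the PRId32 printf macro come from the standard headers <stdint.h> and <inttypes.h>:

```c
#include <inttypes.h>  /* fixed-width types (int32_t) and the PRId32 macro */
#include <stdio.h>

int main(void) {
    /* Default integral type: only guaranteed to be at least 16 bits,
       typically the platform's natural word size. */
    int count = -42;

    /* Unsigned type: never negative; arithmetic wraps modulo 2^N. */
    unsigned int mask = 0xFFu;

    /* Sizes are minimums only: short >= 16, int >= 16,
       long >= 32, long long >= 64 bits. */
    long long big = 1LL << 40;

    /* Fixed-width type: exactly 32 bits wherever it is provided,
       useful for file formats and wire protocols. */
    int32_t exact = 100000;

    printf("sizeof(int) = %zu bytes on this platform\n", sizeof count);
    printf("count=%d mask=%u big=%lld exact=%" PRId32 "\n",
           count, mask, big, exact);
    return 0;
}
```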