Question:
I am using Ubuntu 10.10 (64 bit) with gcc and I wanted to use a 64 bit integer in my C++ program.
On my system the outputs of sizeof(long), sizeof(long long int) and sizeof(int64_t) are all 8 bytes (64 bits).
Which qualifier (long, long long, or int64_t) would you recommend for using 64 bit integers?
Answer 1:
int64_t -- This is because it is the most portable representation. The other two could be represented differently on other machines.
Answer 2:
int64_t. If you need 64 bits, declare it explicitly. The size of long and long long varies by machine.
Answer 3:
Do you need exactly 64 bits or at least 64 bits?
Use whichever of int64_t, int_least64_t, or int_fast64_t most clearly expresses your intent. (All three are almost certain to be the same type on current systems, but documenting your intent is valuable.)
All implementations must provide int_least64_t and int_fast64_t. It's at least theoretically possible that int64_t might not exist (say, if the compiler has a 128-bit type but no 64-bit type, or if signed integers aren't represented using 2's-complement).
(But in every C99-ish implementation I've ever seen, long long is exactly 64 bits, and int64_t exists.)
Answer 4:
Define a custom type for the 64-bit integer and use it throughout your code, with an #ifdef directive so the compiler can pick the right underlying type. An example that unifies several integer types:
#ifdef _MSC_VER
#include <basetsd.h>
/* Older MSVC versions lack <stdint.h>, so map the Windows
   fixed-width types onto the standard names. */
typedef INT8   int8_t;
typedef UINT8  uint8_t;
typedef INT16  int16_t;
typedef UINT16 uint16_t;
typedef INT32  int32_t;
typedef UINT32 uint32_t;
typedef INT64  int64_t;
typedef UINT64 uint64_t;
#else
#include <inttypes.h>
#endif
typedef uint8_t u8_t;
typedef int8_t s8_t;
typedef uint16_t u16_t;
typedef int16_t s16_t;
typedef uint32_t u32_t;
typedef int32_t s32_t;
typedef uint64_t u64_t;
typedef int64_t s64_t;
Source: https://stackoverflow.com/questions/8373783/representing-a-64-bit-integer-in-gnu-linux