What's the difference between “int” and “int_fast16_t”?
As I understand it, the C specification says that type int is supposed to be the most efficient type on the target platform that contains at least 16 bits. Isn't that exactly what the C99 definition of int_fast16_t is too? Maybe they put it in there just for consistency, since the other int_fastXX_t types are needed?

Update

To summarize the discussion below: my question was wrong in many ways. The C standard does not specify a bit width for int. It gives a range, [-32767, 32767], that int must be able to represent. I realize at first most people would say, "but that range implies at least 16 bits!" But C doesn't require two's-complement storage of signed integers, so specifying a bit count by itself would not pin down the range.
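For anyone who wants to see how the two types compare on their own platform, here is a small sketch (my own, not from the discussion) that prints the size and range of int and int_fast16_t. On many 64-bit Linux/glibc systems, for example, int_fast16_t comes out wider than int, which shows the two definitions are not interchangeable in practice:

#include <limits.h>
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    /* int: size and range as defined by this implementation */
    printf("int:          %zu bytes, range [%d, %d]\n",
           sizeof(int), INT_MIN, INT_MAX);

    /* int_fast16_t: the "fastest" type with at least 16 bits;
       cast to intmax_t so the limits print portably with %jd */
    printf("int_fast16_t: %zu bytes, range [%jd, %jd]\n",
           sizeof(int_fast16_t),
           (intmax_t)INT_FAST16_MIN, (intmax_t)INT_FAST16_MAX);
    return 0;
}

Whatever it prints on one machine is only that implementation's choice; the standard constrains the minimum range, not the width.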