I'm currently trying out some questions just to practice my programming skills (not taking it in school or anything yet, self-taught). I came across this problem which re
From what I know, the C/C++ standard only guarantees that int is at least 16 bits. If int really is 16 bits on your platform, a signed int tops out at 2^15 - 1 = 32767, so 1 million will not fit. Try declaring "a" as long (guaranteed at least 32 bits) or long long (at least 64 bits).
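For instance, a minimal sketch of that change (the variable name "a" comes from the question; everything else here is just illustrative):

```c++
#include <iostream>

int main() {
    // long is guaranteed to be at least 32 bits, long long at least 64,
    // so either one comfortably holds 1,000,000.
    long a = 1000000;          // "a" is the variable from the question
    long long big = 1000000LL; // hypothetical alternative if you need even more range
    std::cout << a << " " << big << '\n';
    return 0;
}
```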
The C standard requires that int is at least as large as short and at most as large as long; the minimum guaranteed widths are 16 bits for short and int, and 32 bits for long.
In practice, int is 4 bytes on most modern desktop platforms, so it can hold values between -2^31 and 2^31 - 1 (roughly ±2.1 billion), which is more than enough for 1 million.
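If you want to see what your own compiler actually gives you, here is a quick sketch using the constants from &lt;climits&gt;:

```c++
#include <climits>
#include <iostream>

int main() {
    // Print the width and maximum value of int and long on this platform.
    std::cout << "int:  " << sizeof(int) * CHAR_BIT << " bits, max " << INT_MAX << '\n';
    std::cout << "long: " << sizeof(long) * CHAR_BIT << " bits, max " << LONG_MAX << '\n';
    return 0;
}
```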