In which cases should you use primitive types (int) versus reference types (Integer)?
This question sparked my curiosity.
One case in which Integer might be preferred is when you are working with a database whose numeric columns are allowed to be NULL, since an int cannot represent a null value.
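As a minimal sketch of what I mean (the class name and the variable `ageFromDb` are made up for illustration):

```java
public class NullableColumn {
    public static void main(String[] args) {
        // A hypothetical value read from a nullable numeric column.
        Integer ageFromDb = null;   // Integer can hold "no value"
        // int age = null;          // would not compile: int cannot be null

        if (ageFromDb == null) {
            System.out.println("age is NULL in the database");
        } else {
            System.out.println("age = " + ageFromDb);
        }
    }
}
```

With JDBC, for instance, `ResultSet.getObject(column, Integer.class)` returns null for a SQL NULL, whereas `getInt(column)` silently returns 0 unless you also check `wasNull()`.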
But of course, if you're doing straight math, then int is the better choice, as others have mentioned, because it is more intuitive and avoids the overhead of autoboxing and object allocation.
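To give a rough feel for that overhead, here is a sketch that sums the same numbers with a boxed Integer accumulator and with a plain int. The class name and iteration count are arbitrary, the timings are only indicative (no JIT warmup; a serious comparison would use a harness like JMH):

```java
public class BoxingOverhead {
    public static void main(String[] args) {
        final int N = 1_000_000;

        // Boxed: each += unboxes, adds, then re-boxes, allocating
        // wrapper objects outside the small cached range (-128..127).
        long t0 = System.nanoTime();
        Integer boxedSum = 0;
        for (int i = 0; i < N; i++) {
            boxedSum += i;
        }
        long boxedNanos = System.nanoTime() - t0;

        // Primitive: plain int arithmetic, no allocation at all.
        long t1 = System.nanoTime();
        int primitiveSum = 0;
        for (int i = 0; i < N; i++) {
            primitiveSum += i;
        }
        long primitiveNanos = System.nanoTime() - t1;

        // Both sums overflow int, but identically, so the comparison
        // still holds; printing them also keeps the loops live.
        System.out.println("boxed:     " + boxedNanos / 1_000_000
                + " ms (sum=" + boxedSum + ")");
        System.out.println("primitive: " + primitiveNanos / 1_000_000
                + " ms (sum=" + primitiveSum + ")");
    }
}
```

The boxed loop has to unbox and re-box on every iteration, which is exactly the extra work the primitive version never does.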