This is one of those questions that really only matters in an academic sense. Ruby is optimized to treat ints, longs, etc. as primitives under the hood whenever possible; Java just makes that distinction explicit. If Java's primitives were objects, there would be IntPrimitive, LongPrimitive, etc. classes (by whatever name), which would most likely be final and without special methods (e.g., no IntPrimitive.factorial). That means for most purposes they would still behave like primitives.
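Java's existing wrapper classes already come close to this hypothetical design: java.lang.Integer is final, carries only a fixed set of methods, and autoboxing blurs the line with int. A minimal sketch (the IntPrimitive name above is hypothetical; Integer is the real analogue):

```java
public class Boxing {
    public static void main(String[] args) {
        int raw = 42;        // primitive: no methods, no identity
        Integer boxed = raw; // autoboxing wraps it in a final class

        // Integer is final, so you cannot subclass it to add
        // methods like factorial() -- just as the hypothetical
        // IntPrimitive above would forbid.
        System.out.println(boxed.compareTo(7)); // 1: methods exist...
        System.out.println(Integer.MAX_VALUE);  // ...but the set is fixed
    }
}
```

So whether you call them "primitives" or "final value classes", the practical difference is mostly in how the spec describes them, not in what you can do with them.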