Sometimes I see integer constants defined in hexadecimal instead of in decimal. Here is a small excerpt from the GL10 class:
public static final int GL_DEPTH_BUFFER_BIT = 0x0100;
public static final int GL_COLOR_BUFFER_BIT = 0x4000;
Would you rather write 0xFFFFFFFF or 4294967295?
The first much more clearly represents a 32-bit value with all bits set. Of course, many a seasoned programmer would recognize the decimal pattern and have a sneaking suspicion as to its true meaning, but even then the decimal form is far more prone to typing errors.
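
To make the comparison concrete, here is a minimal sketch (the class and flag names are my own, not taken from GL10). In hex the bit layout is visible at a glance, while the decimal equivalents hide it; in Java the decimal literal 4294967295 does not even fit in an int, whereas 0xFFFFFFFF does:

    // Illustrative only; these names and values are not quoted from GL10.
    public class BitMaskDemo {

        // In hex, the bit each flag occupies is obvious.
        public static final int FLAG_LOW = 0x00000001; // bit 0
        public static final int FLAG_MID = 0x00010000; // bit 16
        public static final int ALL_BITS = 0xFFFFFFFF; // all 32 bits set

        // The decimal equivalents (1, 65536, and -1 as a signed int) hide that structure.
        // Note: "int x = 4294967295;" would not even compile, because the decimal literal
        // overflows Java's signed 32-bit int, while the hex literal 0xFFFFFFFF is accepted.

        public static void main(String[] args) {
            System.out.println(Integer.toHexString(ALL_BITS));    // ffffffff
            System.out.println(Integer.toBinaryString(FLAG_MID)); // 10000000000000000
            System.out.println((FLAG_LOW | FLAG_MID) == 65537);   // true
        }
    }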