Sometimes I see integer constants defined in hexadecimal instead of decimal. This is a small excerpt I took from a GL10 class:
    public static final int GL_FOG_DENSITY = 0x0B62;  // one such constant; 0x0B62 == 2914
"It's obviously simpler to define 2914 instead of 0x0B62"
I don't know about that specific case, but quite often that is not true.
Out of the two questions:

A) What is 2914 in binary?
B) What is 0x0B62 in binary?

B will be answered correctly, and faster, by a lot more developers. (The same goes for similar questions.)
0x0B62 (it is 4 hex digits long, so it represents a 16-bit number)
->
0000 1011 0110 0010 (each hex digit maps to exactly one 4-bit group)
(I dare you to do the same with 2914.)
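If you want to check that for yourself, here is a quick sketch in plain Java (the class name is made up for illustration) showing that the two literals are the same number and that the binary form falls straight out of the hex digits:

    public class HexVsDecimal {
        public static void main(String[] args) {
            // Same number, two spellings:
            System.out.println(0x0B62 == 2914);                 // true
            System.out.println(Integer.toBinaryString(0x0B62)); // 101101100010

            // Each hex digit is exactly one 4-bit group (nibble):
            StringBuilder sb = new StringBuilder();
            for (int shift = 12; shift >= 0; shift -= 4) {
                int nibble = (0x0B62 >> shift) & 0xF;
                String bits = Integer.toBinaryString(nibble);
                sb.append("0000".substring(bits.length())).append(bits).append(' ');
            }
            System.out.println(sb.toString().trim()); // 0000 1011 0110 0010
        }
    }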
That is one reason for using the hex value; another is that the source of the value might use hex (the standard or specification it was taken from, for example).
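For example, the fog constants occupy a contiguous hex range in the standard OpenGL header, and keeping them in hex in the Java code makes them trivial to compare against the spec (the values below are the ones from the GL header):

    public static final int GL_FOG         = 0x0B60;
    public static final int GL_FOG_DENSITY = 0x0B62;  // == 2914
    public static final int GL_FOG_START   = 0x0B63;
    public static final int GL_FOG_END     = 0x0B64;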
Sometimes I just find it silly, as in:

    public static final int NUMBER_OF_TIMES_TO_ASK_FOR_CONFIRMATION = ...;

A constant like that would almost always be silly to write in hex, though I'm sure there are some cases where it wouldn't be.
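To make that contrast concrete (the value 3 here is made up purely for illustration):

    public static final int NUMBER_OF_TIMES_TO_ASK_FOR_CONFIRMATION = 3;
    // public static final int NUMBER_OF_TIMES_TO_ASK_FOR_CONFIRMATION = 0x3;  // same value, nothing gained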