Sometimes I see integer constants defined in hexadecimal instead of as decimal numbers. This is a small excerpt from the GL10 class:
public static final int GL_EXP  = 0x0800;
public static final int GL_EXP2 = 0x0801;
There is no performance gain: the notation only affects the source code, and the compiled value is identical either way.
However, when constants encode particular bits underneath, most programmers prefer hex (or even binary) notation because it makes the bit pattern clear and readable.
For example, from 0x0801 one can easily see that GL_EXP2 has two bits set: the 0x0001 bit and the 0x0800 bit (2048 in decimal). The equivalent decimal literal, 2049, hides that structure.
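
As a rough sketch of the idea (the class and flag names below are invented for illustration, not taken from GL10), hex literals make it obvious which bit each constant occupies and make mask tests easy to read:

// Hypothetical bit-flag constants, for illustration only.
public class FlagExample {
    static final int FLAG_A   = 0x0001;          // bit 0
    static final int FLAG_B   = 0x0800;          // bit 11 (2048 in decimal)
    static final int COMBINED = FLAG_A | FLAG_B; // 0x0801 == 2049

    public static void main(String[] args) {
        int value = COMBINED;
        // Testing a single bit is straightforward when the mask is written in hex.
        if ((value & FLAG_B) != 0) {
            System.out.println("0x0800 bit is set in 0x" + Integer.toHexString(value));
        }
        // Same number either way; only the notation in the source differs.
        System.out.println(value == 2049); // true
    }
}

Writing the same constants as 1, 2048 and 2049 would compile to exactly the same bytecode, but the reader would have to do the binary arithmetic in their head.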