Sometimes I see integer constants defined in hexadecimal instead of decimal. This is a small part I took from a GL10 class:
public static final int GL_TEXTURE_2D       = 0x0DE1;
public static final int GL_COLOR_BUFFER_BIT = 0x4000;

Is there a reason to define these constants in hexadecimal rather than decimal, for example a performance gain?
For big numbers, a hexadecimal representation is often more readable simply because it is more compact: the full range of a 32-bit int fits in eight hex digits, versus up to ten decimal digits.
It also matters when the binary representation is what you actually care about: each hexadecimal digit corresponds to exactly four bits, so a hexadecimal number can be converted to binary (and back) in your head. Many programmers prefer it for constants that are used in bit operations, such as flags and masks.
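As a rough sketch of why that helps (the flag names and values below are made up for illustration, they are not taken from GL10): each constant sets a single bit that is easy to read off the hex digits, and combining or testing flags with bitwise operators stays obvious.

public class FlagDemo {
    // Made-up flags; each hex digit is one nibble (4 bits), so the set bit is easy to see.
    static final int FLAG_A = 0x01; // binary 0000 0001
    static final int FLAG_B = 0x02; // binary 0000 0010
    static final int FLAG_C = 0x10; // binary 0001 0000

    public static void main(String[] args) {
        int flags = FLAG_A | FLAG_C;                       // combine flags: 0x11
        System.out.println(Integer.toBinaryString(flags)); // prints 10001
        System.out.println((flags & FLAG_C) != 0);         // test a flag: prints true
    }
}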
As for the performance gain: no, there is none. The notation only exists in the source code; the compiler stores exactly the same int value whether you write it in hexadecimal or decimal.
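If you want to convince yourself, here is a minimal standalone check (not related to GL10):

public class HexVsDecimal {
    public static void main(String[] args) {
        int a = 0x7FFFFFFF;          // hexadecimal literal
        int b = 2147483647;          // the same value written in decimal (Integer.MAX_VALUE)
        System.out.println(a == b);  // prints true: the compiled value is identical
    }
}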