What is the \"correct\" way of comparing a code-point to a Java character? For example:
int codepoint = str.codePointAt(0);  // str is some String
char token = '\n';
<
For a character that can be represented by a single char (16 bits, i.e. one in the basic multilingual plane), the code point is simply the char value cast to an int (as the question suggests), so there's no need for a special method to perform a conversion.
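For example, here is a minimal sketch (the string literal and variable names are just illustrative, to be dropped into a main method) showing that a BMP char and its code point are the same numeric value:

char token = '\n';
int codepoint = "line1\nline2".codePointAt(5);   // code point of the '\n'

System.out.println((int) token);                 // 10 (U+000A)
System.out.println(codepoint);                   // 10
System.out.println((int) token == codepoint);    // true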
If you're comparing a char to a codepoint, you don't need any special casing. Just compare the char to the int directly (as the question suggests). If the int represents a codepoint outside of the basic multilingual plane, the result will always be false.
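As an illustration (the string and variable names are hypothetical), here is a comparison of a char against two code points from the same string, one inside and one outside the basic multilingual plane:

String s = "a\uD83D\uDE00";           // 'a' followed by U+1F600, stored as a surrogate pair
char token = 'a';

int first = s.codePointAt(0);         // 97      (U+0061)
int second = s.codePointAt(1);        // 128512  (U+1F600)

System.out.println(token == first);   // true  - the char is promoted to int and compared
System.out.println(token == second);  // false - no single char can equal a supplementary code point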