Typecasting from int to char and ASCII values

Backend · unresolved · 3 answers · 459 views
滥情空心 asked on 2020-12-22 14:04
int a1 = 65535;
char ch2 = (char) a1;
System.out.println("ASCII value corresponding to 65535 after being typecasted : " + ch2); // prints ?
char ch3 = 65535;
System.out.println(ch3); // also prints ?
int a11 = 65536;
char ch22 = (char) a11; // no error -- why?
System.out.println(ch22);
3 Answers
  •  一生所求
    2020-12-22 14:30

    Okay, you have a couple of quite distinct questions there.

    The first question, I think, is:

    Why do you see ? when you output ch2 and ch3?

    Because you're outputting an invalid character. Java characters represent UTF-16 code points, not actual characters. Some Unicode characters, in UTF-16, require two Java chars for storage. More about UTF-16 here in the Unicode FAQ. In UTF-16, the value 0xFFFF (which is what your ch2 and ch3 contain) is not valid as a standalone value; even if it were, there is no Unicode U+FFFF character.
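    A minimal sketch of that first point (the class name is mine; the variable names follow the question). The cast simply keeps the low 16 bits, so the char really does hold the value 0xFFFF; the ? you see is just the console's substitute for a code point it cannot render:

    ```java
    public class CastDemo {
        public static void main(String[] args) {
            int a1 = 65535;
            char ch2 = (char) a1;           // narrowing cast keeps the low 16 bits: 0xFFFF
            System.out.println((int) ch2);  // prints 65535: the char holds U+FFFF
            // U+FFFF is a Unicode noncharacter, so fonts and output encodings
            // have nothing sensible to show for it; many consoles print ? instead.
        }
    }
    ```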

    Re the output of ch22: The reason you're seeing a little box is that you're outputting character 0 (the result of (char)65536 is 0, see below), which is a "control character" (all the characters below 32 — the normal space character — are various control characters). Character 0 is the "null" character, for which there's no generally-accepted glyph that I'm aware of.
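    You can verify the ch22 case numerically (again, class name mine). 65536 is 0x10000, so after discarding everything above the low 16 bits, nothing is left:

    ```java
    public class WrapDemo {
        public static void main(String[] args) {
            int a11 = 65536;                // 0x10000: one past the char range
            char ch22 = (char) a11;         // only the low 16 bits survive, which are all zero
            System.out.println((int) ch22); // prints 0: the "null" control character U+0000
        }
    }
    ```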

    Why no error when doing int a11 = 65536; char ch22 = (char) a11;?

    Because that's how Java's narrowing primitive conversions are defined. No error is thrown; instead, only the relevant bits are used:

    A narrowing conversion of a signed integer to an integral type T simply discards all but the n lowest order bits, where n is the number of bits used to represent type T. In addition to a possible loss of information about the magnitude of the numeric value, this may cause the sign of the resulting value to differ from the sign of the input value.
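    A few concrete instances of that rule, including the sign-change case the quote warns about (class name mine):

    ```java
    public class NarrowDemo {
        public static void main(String[] args) {
            // char is an unsigned 16-bit type: casting -1 keeps the low
            // 16 bits (all ones), which read back as 65535
            System.out.println((int) (char) -1);  // 65535
            // short is a signed 16-bit type: the same 0xFFFF bit pattern reads as -1
            System.out.println((short) 65535);    // -1
            // byte keeps the low 8 bits of 200 (0xC8), which is -56 as a signed byte
            System.out.println((byte) 200);       // -56
        }
    }
    ```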
