int a1 = 65535;
char ch2 = (char) a1;
System.out.println("ASCII value corresponding to 65535 after being typecasted : " + ch2); // prints?
char ch3 = 65535;
System.out.println(ch3); // prints?
Okay, you have a couple of quite distinct questions there.

The first question, I think, is: why do ch2 and ch3 print a strange symbol?

Because you're outputting an invalid character. Java characters represent UTF-16 code units, not actual characters. Some Unicode characters, in UTF-16, require two Java chars for storage. (More about UTF-16 in the Unicode FAQ.) In UTF-16, the value 0xFFFF (which is what your ch2 and ch3 contain) is not valid as a standalone value; even if it were, there is no Unicode U+FFFF character.
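To see the "two chars for one character" point in action, here's a minimal sketch; the code point U+1F600 is just an example of a supplementary character, not something from your code:

```java
public class Utf16Demo {
    public static void main(String[] args) {
        // U+1F600 is above 0xFFFF, so UTF-16 encodes it as a surrogate pair:
        // two Java chars for one Unicode character.
        char[] units = Character.toChars(0x1F600);
        System.out.println(units.length);                        // 2
        System.out.println(Character.isHighSurrogate(units[0])); // true
        System.out.println(Character.isLowSurrogate(units[1]));  // true
    }
}
```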
Re the output of ch22: The reason you're seeing a little box is that you're outputting character 0 (the result of (char) 65536 is 0; see below), which is a "control character" (all the characters below 32, the code of the normal space character, are various control characters). Character 0 is the "null" character, for which there's no generally-accepted glyph that I'm aware of.
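You can confirm programmatically that character 0 is a control character; a quick sketch:

```java
public class ControlCharDemo {
    public static void main(String[] args) {
        char ch = 0; // the "null" character, U+0000
        System.out.println(Character.isISOControl(ch));                 // true
        System.out.println(Character.getType(ch) == Character.CONTROL); // true
    }
}
```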
int a11 = 65536;
char ch22 = (char) a11;

Why doesn't that cast throw an error? Because that's how Java's narrowing primitive conversions are defined. No error is thrown; instead, only the relevant bits are used:
A narrowing conversion of a signed integer to an integral type T simply discards all but the n lowest order bits, where n is the number of bits used to represent type T. In addition to a possible loss of information about the magnitude of the numeric value, this may cause the sign of the resulting value to differ from the sign of the input value.
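For char, n is 16, so a cast from int keeps only the lowest 16 bits. A minimal sketch of what that discarding does to the values from the snippets above:

```java
public class NarrowingDemo {
    public static void main(String[] args) {
        int a11 = 65536;        // 0x10000: bit 16 set, lowest 16 bits all zero
        char ch22 = (char) a11; // narrowing keeps only the lowest 16 bits
        System.out.println((int) ch22);         // 0
        System.out.println(a11 & 0xFFFF);       // 0, same result as the cast
        System.out.println((int) (char) 65535); // 65535, all 16 low bits fit
    }
}
```

Masking with 0xFFFF produces the same result as the cast, which is exactly what "discards all but the n lowest order bits" means here.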