While practicing Java I randomly came up with this:
class test
{
    public static void main(String arg[])
    {
        char x = 'A';
        x = x + 1;   // error: incompatible types: possible lossy conversion from int to char
    }
}

Why does Java reject x = x + 1?
Here is an equivalent version of your program:
public class Tester {
    public static void main(String args[]) {
        char start = '\u0041';
        char next = '\u0041' + 1;
        System.out.println(next);
    }
}
But as you see, next = start + 1; will not compile. Java promotes char operands to int before doing arithmetic, so start + 1 is an int expression, and Java does not allow an int to be assigned back to a char without an explicit cast.
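For completeness, here is a small sketch (my own example, not from the original post; the class name CharIncrement is made up) of the usual ways to make the increment compile:

public class CharIncrement {
    public static void main(String[] args) {
        char start = 'A';
        char next = (char) (start + 1); // explicit narrowing cast back to char
        System.out.println(next);       // prints B

        start += 1;  // compound assignment applies the cast implicitly (JLS 15.26.2)
        start++;     // the increment operators also narrow back to char
        System.out.println(start);      // prints C
    }
}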
The reason could be that we might accidentally add an integer to start while thinking start is an int variable. Java is strict about minimizing that kind of logical error, so I think they designed it this way.
But when you write char next = '\u0041' + 1; the right-hand side is a compile-time constant expression. The compiler evaluates it to 66, sees that the value fits in a char, and permits the implicit narrowing (JLS 5.2). Since the value is known at compile time, no programmer mistake can slip through here, and maybe that is why they allowed it.
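To illustrate the constant-expression rule (again my own sketch, not from the post): marking the variable final turns it into a constant variable, and then the very assignment that failed above compiles:

public class ConstantChar {
    public static void main(String[] args) {
        final char start = 'A';    // constant variable: value known at compile time
        char next = start + 1;     // compiles: constant expression 66 fits in char
        System.out.println(next);  // prints B
    }
}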
char is 2 bytes (16 bits) in Java and holds a UTF-16 code unit, so it supports Unicode characters. When you add or subtract an integer offset, you get the character that many positions away in the Unicode table. Since B (U+0042) comes right after A (U+0041), you got B.
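As a quick sketch of that arithmetic (my own example): stepping a char through consecutive offsets walks the Unicode table, which is an easy way to print the alphabet:

public class Alphabet {
    public static void main(String[] args) {
        for (char c = 'A'; c <= 'Z'; c++) {  // c++ narrows back to char each step
            System.out.print(c);             // walks U+0041 .. U+005A
        }
        System.out.println();                // prints ABCDEFGHIJKLMNOPQRSTUVWXYZ
    }
}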