While practicing Java I randomly came up with this:
class Test
{
    public static void main(String[] args)
    {
        char x = 'A';
        x = x + 1; // compile error: incompatible types, possible lossy conversion from int to char
    }
}
Here is an equivalent version of your program:
public class Tester {
    public static void main(String[] args) {
        char start = '\u0041';
        char next = '\u0041' + 1;
        System.out.println(next);
    }
}
But as you can see, next = start + 1; will not compile. That is simply how Java handles it.
The reason could be that we might accidentally mix a char like start with the integer 1, thinking that start is an int variable. Since Java is strict about minimizing such logical errors, I think they designed it that way.
But when you write char next = '\u0041' + 1;, it is clear that '\u0041' is a character, and the whole right-hand side is a compile-time constant that fits in a char, so the compiler is allowed to narrow it implicitly. No mistake can really be made by the programmer here; maybe that is why it is allowed.
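To illustrate the difference (a small sketch; the variable names are mine): a constant expression that fits in a char compiles as-is, while the same arithmetic on a variable is an int at runtime and needs an explicit cast.

```java
public class ConstantNarrowing {
    public static void main(String[] args) {
        char ok = '\u0041' + 1;          // constant expression, fits in char: compiles
        char start = '\u0041';
        // char bad = start + 1;          // does NOT compile: start + 1 is an int
        char fixed = (char) (start + 1); // explicit cast works
        System.out.println(ok);          // prints B
        System.out.println(fixed);       // prints B
    }
}
```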
char is 2 bytes in Java and supports Unicode characters. When you add or subtract an integer offset to a char, you get the character at that offset in the Unicode table. Since B comes right after A, you got B.
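For example, you can walk through consecutive characters just by adding offsets (a small sketch):

```java
public class UnicodeOffsets {
    public static void main(String[] args) {
        // Print 'A' through 'E' by adding offsets 0..4 to 'A'
        for (int offset = 0; offset < 5; offset++) {
            System.out.print((char) ('A' + offset));
        }
        System.out.println(); // prints ABCDE
    }
}
```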
In case you want it to stay within just a-z:
private char getNextChar(char c) {
    return (char) ((c + 1 - 'a') % ('z' - 'a' + 1) + 'a');
}
This wraps around instead of overflowing past 'z'.
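A quick usage sketch (I made the method static here so it can be called from main):

```java
public class NextChar {
    // Wrap within 'a'..'z': after 'z' it goes back to 'a'
    private static char getNextChar(char c) {
        return (char) ((c + 1 - 'a') % ('z' - 'a' + 1) + 'a');
    }

    public static void main(String[] args) {
        System.out.println(getNextChar('a')); // prints b
        System.out.println(getNextChar('z')); // prints a, wrapped around
    }
}
```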
Each char has a character code. The computer sees a character as an unsigned number, so you can increment it.
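For instance, Java's char really is an unsigned 16-bit number, which you can see by casting its bounds to int:

```java
public class CharRange {
    public static void main(String[] args) {
        System.out.println((int) Character.MIN_VALUE); // prints 0
        System.out.println((int) Character.MAX_VALUE); // prints 65535: char is unsigned 16-bit
    }
}
```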
In Java, char is a numeric type. When you add 1 to a char, you get the next Unicode code point. In the case of 'A', the next code point is 'B':
char x = 'A';
x += 1;
System.out.println(x);
Note that you cannot write x = x + 1, because x + 1 is an int, and assigning it back to a char would be a narrowing conversion that requires an explicit cast. Use x++, x += 1, or x = (char) (x + 1) instead.
A char in Java is just an integer number, so it's ok to increment/decrement it. Each char number has a corresponding value, an interpretation of it as a character, by virtue of an encoding table: be it ASCII or, in Java's case, UTF-16.
Come to think of it, every piece of data in a computer - images, music, programs, etc. - is just numbers. A simple character is no exception; it's encoded as a number, too.
A char is in fact mapped to an int; look at the ASCII table.
For example: a capital A corresponds to the decimal number 65. When you add 1 to that char, you basically increment the decimal number by 1. So the number becomes 66, which corresponds to the capital B.
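You can see both sides of this mapping directly (a small sketch):

```java
public class AsciiIncrement {
    public static void main(String[] args) {
        char a = 'A';
        System.out.println((int) a);        // prints 65, the code for 'A'
        System.out.println((char) (a + 1)); // prints B, the character for 66
    }
}
```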