Incrementing Char Type In Java

鱼传尺愫 2020-12-03 05:08

While practicing Java I randomly came up with this:

class test
{
    public static void main(String arg[])
    {
        char x = 'A';
        x = x + 1;   // does not compile: x + 1 is promoted to int
    }
}
10 Answers
  •  醉话见心
    2020-12-03 05:11

    This is the equivalent of your program:

    public class Tester {
        public static void main(String args[]) {
            char start = '\u0041';
            char next = '\u0041' + 1;
            System.out.println(next);
        }
    }
    

    But as you can see, next = start + 1 will not compile. That is how Java handles it: start + 1 is promoted to int, and assigning an int back to a char requires an explicit cast.

    The reason could be that we might accidentally add an integer to start while thinking that start is an int variable. Java is strict about minimizing that kind of logical error, so I think that is why it was designed this way.

    But when you write char next = '\u0041' + 1;, the right-hand side is a compile-time constant expression whose value still fits into the 2-byte char range, so the implicit narrowing conversion is allowed. A programmer cannot make that kind of mistake there, and maybe that is why it is permitted.
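    As a minimal illustrative sketch of that constant-expression rule (the class name ConstantDemo is just a placeholder, not from the original question):

    public class ConstantDemo {
        public static void main(String[] args) {
            char start = 'A';
            // char bad = start + 1;         // does not compile: start + 1 has type int
            char next = 'A' + 1;             // compiles: constant expression, value 66 fits in char
            char cast = (char) (start + 1);  // compiles: explicit narrowing cast
            System.out.println(next);        // prints B
            System.out.println(cast);        // prints B
        }
    }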

    char is 2 bytes in Java and holds Unicode characters. When you add or subtract an integer offset to or from a char, you get the character at that offset in the Unicode table. Since B comes right after A, you get B.
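    For completeness, the compound assignment and increment operators carry an implicit cast back to char, so a char can also be advanced without writing the cast yourself (IncrementDemo is an assumed class name for this sketch):

    public class IncrementDemo {
        public static void main(String[] args) {
            char x = 'A';
            x++;       // compiles: ++ includes an implicit cast back to char
            x += 1;    // compiles: compound assignment also casts implicitly
            System.out.println(x);        // prints C ('A' -> 'B' -> 'C')
            System.out.println((int) x);  // prints 67, the code point of 'C'
        }
    }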
