Why does adding a '0' to an int digit allow conversion to a char?

Backend · Open · 6 answers · 1149 views
星月不相逢 2020-12-31 11:28

I've seen examples of this all over the place:

int i = 2;
char c = i + '0';
string s;
s += char(i + '0');

However, I have not yet seen

6 Answers
  •  攒了一身酷
    2020-12-31 12:19

    When ASCII encoding is used, the integer value of '0' is 48.

    '0' + 1 = 49 = '1'
    '0' + 2 = 50 = '2'
    
    ...
    
    '0' + 9 = 57 = '9'
    

So, if you want to convert a digit to its corresponding character, just add '0' to it.

    Even if the platform uses a non-ASCII encoding, the language still guarantees that the characters '0' through '9' are encoded contiguously and in order, so that:

    '1' - '0' = 1
    '2' - '0' = 2
    '3' - '0' = 3
    '4' - '0' = 4
    '5' - '0' = 5
    '6' - '0' = 6
    '7' - '0' = 7
    '8' - '0' = 8
    '9' - '0' = 9
    

    When ASCII encoding is used, that becomes:

    '1' - '0' = 49 - 48 = 1
    '2' - '0' = 50 - 48 = 2
    '3' - '0' = 51 - 48 = 3
    '4' - '0' = 52 - 48 = 4
    '5' - '0' = 53 - 48 = 5
    '6' - '0' = 54 - 48 = 6
    '7' - '0' = 55 - 48 = 7
    '8' - '0' = 56 - 48 = 8
    '9' - '0' = 57 - 48 = 9
    

    Hence, regardless of the character encoding used by a platform, the lines

    int i = 2;
    char c = i + '0';
    

    will always result in the value of c being equal to the character '2'.
