What is the idea behind ^= 32, that converts lowercase letters to upper and vice versa?

Asked by 孤城傲影 on 2020-11-28 03:02

I was solving some problem on Codeforces. Normally I first check whether the character is an upper- or lowercase English letter, then subtract or add 32 to convert it to the corresponding letter of the opposite case. But I saw a solution that simply did c ^= 32 to toggle the case. What is the idea behind that?
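
For illustration, here is a minimal C++ sketch contrasting the two approaches; the helper names are mine, not from any submitted solution:

    #include <cassert>

    // Branching approach: check the case, then add or subtract 32.
    char toggle_case_checked(char c) {           // hypothetical helper name
        if (c >= 'A' && c <= 'Z') return c + 32; // upper -> lower
        if (c >= 'a' && c <= 'z') return c - 32; // lower -> upper
        return c;                                // not a letter: unchanged
    }

    // Bit-trick approach: flip bit 5 (value 32) in a single XOR.
    // Only valid when c is known to be an ASCII letter.
    char toggle_case_xor(char c) { return c ^ 32; }

    int main() {
        assert(toggle_case_checked('q') == 'Q');
        assert(toggle_case_xor('q') == 'Q');
        assert(toggle_case_xor('Q') == 'q');
    }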

10 Answers
  •  感动是毒
     2020-11-28 03:26

    See the second table at http://www.catb.org/esr/faqs/things-every-hacker-once-knew/#_ascii and the notes that follow it, reproduced below:

    The Control modifier on your keyboard basically clears the top three bits of whatever character you type, leaving the bottom five and mapping it to the 0..31 range. So, for example, Ctrl-SPACE, Ctrl-@, and Ctrl-` all mean the same thing: NUL.
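
    In code, the Ctrl mapping is just a mask that keeps the bottom five bits (a small sketch of my own, not from the article):

        #include <cassert>

        int main() {
            // Clearing the top three bits maps any character into 0..31.
            assert(('@' & 0x1F) == 0);  // Ctrl-@     -> NUL
            assert(('`' & 0x1F) == 0);  // Ctrl-`     -> NUL
            assert((' ' & 0x1F) == 0);  // Ctrl-SPACE -> NUL
            assert(('H' & 0x1F) == 8);  // Ctrl-H     -> backspace (0x08)
        }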

    Very old keyboards used to do Shift just by toggling the 32 or 16 bit, depending on the key; this is why the relationship between small and capital letters in ASCII is so regular, and the relationship between numbers and symbols, and some pairs of symbols, is sort of regular if you squint at it. The ASR-33, which was an all-uppercase terminal, even let you generate some punctuation characters it didn’t have keys for by shifting the 16 bit; thus, for example, Shift-K (0x4B) became a [ (0x5B).
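
    Both relationships are easy to verify (again a sketch of my own):

        #include <cassert>

        int main() {
            assert(('a' ^ 32) == 'A');    // toggling the 32 bit flips letter case
            assert(('A' ^ 32) == 'a');
            assert(('K' ^ 0x10) == '[');  // ASR-33 trick: 0x4B ^ 0x10 == 0x5B
        }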

    ASCII was designed such that the Shift and Ctrl keyboard keys could be implemented without much logic (perhaps none at all for Ctrl) - Shift probably required only a few gates. It probably made at least as much sense to store text in the wire-protocol encoding as in any other character encoding (no software conversion required).
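
    That regularity also lets software force a letter to one case without any comparison, e.g. (a sketch that assumes the input is an ASCII letter; the helper names are mine):

        // Bit 5 (value 32) alone distinguishes the cases, so forcing a
        // known ASCII letter to one case is a single AND or OR.
        char to_upper_letter(char c) { return c & ~32; }  // clear bit 5
        char to_lower_letter(char c) { return c |  32; }  // set bit 5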

    The linked article also explains many strange hacker conventions, such as mock-erasing text with ^H (Ctrl-H is backspace), as in the old^H^H^H^H^H classic joke.
