Why does a character in Java take twice as much space to store as a character in C?
A Java `char` is a 16-bit UTF-16 code unit, so it always occupies two bytes. A C `char` is defined to be exactly one byte, which in most cases holds an ASCII character (or one byte of some other single-byte encoding). Note that a Java `char` is not quite a full Unicode code point: characters outside the Basic Multilingual Plane are represented by a surrogate pair of two `char` values.
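A small sketch illustrating both points: `Character.SIZE` reports the bit width of `char`, and a code point beyond the BMP (here U+1D11E, MUSICAL SYMBOL G CLEF, chosen as an example) takes two `char` values in a `String`:

```java
public class CharWidth {
    public static void main(String[] args) {
        // char in Java is a 16-bit UTF-16 code unit
        System.out.println(Character.SIZE);        // 16 bits, i.e. 2 bytes

        // U+1D11E lies outside the BMP, so it needs a surrogate pair
        String clef = "\uD834\uDD1E";
        System.out.println(clef.length());         // 2 (two char code units)
        System.out.println(clef.codePointCount(0, clef.length())); // 1 code point
    }
}
```

In C, by contrast, `sizeof(char)` is 1 by definition, so the same comparison is not even expressible with plain `char`.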