Why does a character in Java take twice as much space to store as a character in C?
Because Java's char is defined to hold a Unicode (UTF-16) code unit, which is 16 bits wide, whereas C's char is a single byte that traditionally holds ASCII.
There are various flavours of Unicode encoding, but Java uses UTF-16: every char is one 16-bit code unit, and characters outside the Basic Multilingual Plane are stored as a surrogate pair of two code units. ASCII needs only one byte per character, so a C char is a single byte.
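As a quick illustration (a minimal sketch, assuming Java 8+ for Character.BYTES), the following prints the size of a Java char and shows that a supplementary character occupies two chars:

```java
public class CharSize {
    public static void main(String[] args) {
        // A Java char is a 16-bit UTF-16 code unit: 2 bytes, 16 bits.
        System.out.println(Character.BYTES); // 2
        System.out.println(Character.SIZE);  // 16

        // A plain ASCII letter fits in one char...
        String ascii = "A";
        System.out.println(ascii.length());  // 1

        // ...but a character outside the Basic Multilingual Plane,
        // such as U+1D11E (musical G clef), needs a surrogate pair:
        // two chars, i.e. four bytes, for one code point.
        String clef = "\uD834\uDD1E";
        System.out.println(clef.length());                          // 2
        System.out.println(clef.codePointCount(0, clef.length()));  // 1
    }
}
```

In C, by contrast, sizeof(char) is 1 by definition, which is where the factor of two in the question comes from.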