From what I understand, a `char` is safe to house ASCII characters, whereas `char16_t` and `char32_t` are safe to house characters from Unicode.
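For illustration, a minimal sketch of the three literal forms (the specific characters are arbitrary examples):

```cpp
#include <iostream>

int main() {
    char     a = 'A';    // ASCII: always fits in a char
    char16_t b = u'ß';   // U+00DF: fits, since it is in the Basic Multilingual Plane
    char32_t c = U'𝄞';   // U+1D11E: needs char32_t; it would not fit in a char16_t

    std::cout << sizeof a << ' ' << sizeof b << ' ' << sizeof c << '\n'; // e.g. 1 2 4
}
```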
The type `wchar_t` was put into the standard when Unicode promised to create a 16-bit representation. Most vendors chose to make `wchar_t` 32 bits, but one large vendor chose to make it 16 bits. Since Unicode uses more than 16 bits (code points go up to U+10FFFF, which needs 21 bits), it was felt that we should have better character types.
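You can check what your implementation does with a trivial sketch (the output is platform-dependent; typically 4 for `wchar_t` on Unix-like systems and 2 on Windows):

```cpp
#include <iostream>

int main() {
    // wchar_t's width is implementation-defined; the others have guaranteed minimums.
    std::cout << "sizeof(wchar_t)  = " << sizeof(wchar_t)  << '\n';
    std::cout << "sizeof(char16_t) = " << sizeof(char16_t) << '\n'; // at least 16 bits
    std::cout << "sizeof(char32_t) = " << sizeof(char32_t) << '\n'; // at least 32 bits
}
```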
The intent is for `char16_t` to represent UTF-16 and for `char32_t` to represent Unicode characters directly. However, on systems using `wchar_t` as part of their fundamental interface, you'll be stuck with `wchar_t`.
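The practical difference shows up with characters outside the Basic Multilingual Plane: UTF-16 encodes them as a surrogate pair of two `char16_t` units, while one `char32_t` unit always suffices. A minimal sketch (U+1D11E, the musical G clef, is just an example):

```cpp
#include <iostream>

int main() {
    char16_t utf16[] = u"𝄞"; // U+1D11E: stored as a surrogate pair (2 units)
    char32_t utf32[] = U"𝄞"; // the same code point as a single unit

    // Subtract 1 for the terminating null in each array.
    std::cout << "UTF-16 code units: " << sizeof utf16 / sizeof *utf16 - 1 << '\n'; // 2
    std::cout << "UTF-32 code units: " << sizeof utf32 / sizeof *utf32 - 1 << '\n'; // 1
}
```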
If you are unconstrained, I would personally use `char` to represent Unicode using UTF-8. The problem with `char16_t` and `char32_t` is that they are not fully supported, not even in the standard C++ library: for example, there are no streams supporting these types directly, and it is more work than just instantiating the stream templates for these types.
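By contrast, UTF-8 in plain `char` goes through the existing `char`-based machinery unchanged (a sketch; it assumes a UTF-8 execution character set and a UTF-8 terminal):

```cpp
#include <iostream>
#include <string>

int main() {
    // UTF-8 text is just a sequence of chars, so the ordinary streams pass it through.
    std::string text = "Grüße, 世界"; // assumes UTF-8 source/execution encoding
    std::cout << text << '\n';
    std::cout << text.size() << " bytes\n"; // size() counts bytes, not characters

    // There is no std::cout analogue for char16_t/char32_t, and instantiating
    // std::basic_ostream<char16_t> yourself would also require matching facets.
}
```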