I want to understand the difference between `char` and `wchar_t`. I understand that `wchar_t` uses more bytes, but can I get a clear-cut example?
Never use `wchar_t`.

When possible, use (some kind of array of) `char`, such as `std::string`, and ensure that it is encoded in UTF-8.
When you must interface with APIs that don't speak UTF-8, use `char16_t` or `char32_t`. Never use them otherwise; they provide only illusory advantages and encourage faulty code.
Note that there are plenty of cases where more than one `char32_t` is required to represent a single user-visible character. OTOH, using UTF-8 with `char` forces you to handle variable width very early.