Struggling to convert vector<char> to wstring

别等时光非礼了梦想 · Submitted 2019-12-06 07:44:32

You should not use wide types at all in your case.

Assuming you can get a char * from your vector<char>, you can stick to bytes by using the following code:

const char * utf16_buffer = my_vector_of_chars.data();
const char * buffer_end = utf16_buffer + my_vector_of_chars.size();
std::string utf8_str = boost::locale::conv::between(utf16_buffer, buffer_end, "UTF-8", "UTF-16");

between operates on 8-bit characters and allows you to avoid conversion to 16-bit characters altogether.

It is necessary to use the between overload that takes a pointer to the end of the buffer: by default, between stops at the first '\0' byte, and with UTF-16 input that happens almost immediately, since one of the two bytes of every ASCII code unit is zero.
