Inserting Unicode Hyphen-minus into String Causes Error


Question


I am trying to insert a Unicode hyphen-minus character into a text string. I am seeing an "Invalid universal character" error with the following:

U+002D (hyphen-minus)

[textViewContent insertString:@"\u002D" atIndex:cursorPosition.location];

However, these work fine:

U+2212 (minus sign)

[textViewContent insertString:@"\u2212" atIndex:cursorPosition.location];

U+2010 (hyphen)

[textViewContent insertString:@"\u2010" atIndex:cursorPosition.location];

I've poked through several of the existing Unicode discussions here, but I haven't found one that explains why the first example produces an error while the other two work. Insight greatly appreciated.


Answer 1:


Universal character names (UCNs) have some restrictions on their use. In C99 and C++98 you are not allowed to use one that refers to a character in the basic character set (which includes U+002D).
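To make the rule concrete, here is how the three code points from the question fare under those rules (a minimal sketch; the exact diagnostic text varies by compiler):

// U+2212 and U+2010 are outside the basic character set, so these UCNs are accepted:
NSString *minusSign = @"\u2212";
NSString *hyphen    = @"\u2010";

// U+002D is in the basic character set, so this UCN is ill-formed under C99/C++98:
NSString *hyphenMinus = @"\u002D";   // error: invalid universal character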

C++11 relaxed this requirement: inside a string or character literal, you are allowed to use a UCN that refers to a basic character. Depending on the compiler version you're using, I would guess that compiling your code as Objective-C++ with C++11 would make it legal.
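If switching dialects isn't practical, hex escape sequences and runtime formatting are not restricted the way UCNs are, so either of the following should sidestep the error (a sketch reusing the textViewContent and cursorPosition names from the question):

// A hex escape is legal for any value, including basic characters:
[textViewContent insertString:@"\x2D" atIndex:cursorPosition.location];

// Or build the string from the code point at runtime; %C formats a unichar:
NSString *hyphenMinus = [NSString stringWithFormat:@"%C", (unichar)0x002D];
[textViewContent insertString:hyphenMinus atIndex:cursorPosition.location];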

That said, since this character is part of ASCII and the basic character set, why don't you just write it literally?

@"-"


Source: https://stackoverflow.com/questions/11144885/inserting-unicode-hyphen-minus-into-string-causes-error
