UnicodeString w/ String Literals vs Hex Values
Question: Is there any conceivable reason why I would see different results using Unicode string literals versus the actual hex value for the UChar?

UnicodeString s1(0x0040); // @ sign
UnicodeString s2("\u0040");

s1 isn't equivalent to s2. Why?

Answer 1: The \u escape sequence is, AFAIK, implementation-defined, so it's hard to say why the two are not equivalent without knowing the details of your particular compiler. That said, it's simply not a safe way of doing things. UnicodeString has a constructor taking a
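A minimal sketch of the distinction (assuming ICU4C and a C++11-or-later compiler; the UChar32 constructor, the const char* constructor, and UnicodeString::fromUTF8 are part of the public icu::UnicodeString API, while the exact compile/link command and the alternatives shown are my assumptions, not taken from the answer):

// compile with something like: g++ demo.cpp -licuuc   (typical ICU link flag; may vary per platform)
#include <unicode/unistr.h>
#include <iostream>

int main() {
    // Unambiguous: the code point is passed directly as a UChar32 value.
    icu::UnicodeString s1(static_cast<UChar32>(0x0040));  // '@'

    // Ambiguous: "\u0040" is first encoded into a narrow string using the
    // compiler's execution character set, then converted to UTF-16 by the
    // const char* constructor using the default codepage -- two
    // implementation-defined steps that need not round-trip to U+0040.
    icu::UnicodeString s2("\u0040");

    // Safer alternatives that avoid the codepage conversion entirely:
    icu::UnicodeString s3 = icu::UnicodeString::fromUTF8("\x40");  // explicit UTF-8 byte for '@'
    icu::UnicodeString s4(u"\u0040");  // UTF-16 literal (assumes an ICU build where UChar is char16_t)

    std::cout << (s1 == s2 ? "s1 == s2" : "s1 != s2") << '\n';
    std::cout << (s1 == s3 ? "s1 == s3" : "s1 != s3") << '\n';
    std::cout << (s1 == s4 ? "s1 == s4" : "s1 != s4") << '\n';
    return 0;
}

On a compiler whose execution character set and default codepage both map \u0040 to the byte 0x40, all three comparisons report equality; on other setups s2 is the one that can diverge.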