I've got an international character stored in a unichar variable. This character does not come from a file or URL. The variable itself only stores an unsigned short (0xce91).
The code above is the moral equivalent of unichar foo = 'ab';.
The problem is that 'Α' doesn't map to a single byte in the "execution character set" (which I'm assuming is UTF-8), so the value of the constant is "implementation-defined" per C99 §6.4.4.4 ¶10:
The value of an integer character constant containing more than one character (e.g.,
'ab'), or containing a character or escape sequence that does not map to a single-byte execution character, is implementation-defined.
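To make the quoted rule concrete, here is a minimal sketch (assuming clang and a UTF-8 source file; the exact value and warning text depend on the compiler) showing that 'Α' is really a two-byte, multi-character constant whose bytes get packed into a single int, exactly as 'ab' would be:

    #include <stdio.h>

    int main(void) {
        /* 'Α' is the two UTF-8 bytes 0xCE 0x91, so clang treats it as a
           multi-character constant (and warns about it). */
        int packed = 'Α';                  /* warning: multi-character character constant */
        int manual = (0xCE << 8) | 0x91;   /* the same packing spelled out by hand */
        printf("0x%04X 0x%04X\n", packed, manual);   /* prints 0xCE91 0xCE91 here */
        return 0;
    }

That 0xCE91 is exactly the value you're seeing in the unichar variable.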
One way is to make 'ab' equal to ('a'<<8) | 'b'. Some Mac/iOS system headers rely on this for things like OSType/FourCharCode/FourCC; the only example in iOS that comes to mind is CoreVideo pixel formats. This is, however, unportable.
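As a sketch of that convention (this is what Apple's compilers have historically done, not something the standard guarantees; pack_fourcc is just a hypothetical helper for illustration), a four-character constant such as 'BGRA' packs its bytes most-significant first, which is how CoreVideo's kCVPixelFormatType_32BGRA gets its numeric value:

    #include <stdio.h>
    #include <stdint.h>

    /* Hypothetical helper: packs four characters the way Apple's compilers
       evaluate a four-character constant (most significant byte first). */
    static uint32_t pack_fourcc(char a, char b, char c, char d) {
        return ((uint32_t)(unsigned char)a << 24) |
               ((uint32_t)(unsigned char)b << 16) |
               ((uint32_t)(unsigned char)c << 8)  |
                (uint32_t)(unsigned char)d;
    }

    int main(void) {
        uint32_t literal = 'BGRA';                    /* implementation-defined packing */
        uint32_t packed  = pack_fourcc('B', 'G', 'R', 'A');
        printf("0x%08X 0x%08X\n", literal, packed);   /* both 0x42475241 here */
        return 0;
    }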
If you really want a unichar literal, you can try L'Α' (technically it's a wchar_t literal, and wchar_t is a 32-bit type on OS X and iOS, but the value fits in a unichar as long as the character is inside the BMP). However, it's far simpler to just use @"Α" (which works as long as you set the source character encoding correctly) or @"\u0391" (which has worked since at least the iOS 3 SDK).
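A short sketch of those options (assuming a UTF-8 source file and Foundation; the variable names are just for illustration):

    #import <Foundation/Foundation.h>

    int main(void) {
        @autoreleasepool {
            /* Wide-character literal: the code point 0x0391 fits in a
               16-bit unichar because it is inside the BMP. */
            unichar alpha = L'Α';

            /* NSString literals: either rely on the source encoding... */
            NSString *direct  = @"Α";
            /* ...or spell out the code point so the source encoding no
               longer matters. */
            NSString *escaped = @"\u0391";

            NSLog(@"%04X %@ %@ %C", (unsigned)alpha, direct, escaped, alpha);
        }
        return 0;
    }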