Question
With the release of iOS 9.1, we got a lot of cool new emoji like taco (🌮)! I'm working on an application where I'd like to show the new emoji characters on devices where they are supported but keep them hidden on devices where they are not. Is there a way to determine if a given emoji character (contained in an NSString) can be rendered on the device the app is running on?
After quite a bit of digging on SO and experimenting I've come up with a solution which I will post as an answer below, but please let me know if there's a better way.
Answer 1:
#import <CoreText/CoreText.h>

// Determine if the emoji character provided in the string can be rendered on this OS version
+ (BOOL)isEmojiSupported:(NSString *)emoji
{
    // Extract the code point as a UTF-32 value
    NSData *data = [emoji dataUsingEncoding:NSUTF32LittleEndianStringEncoding];
    UTF32Char emojiValue;
    [data getBytes:&emojiValue length:sizeof(emojiValue)];

    // Convert UTF32Char to a UniChar surrogate pair.
    // Found here: http://stackoverflow.com/questions/13005091/how-to-tell-if-a-particular-font-has-a-specific-glyph-64k#
    UniChar characters[2] = { };
    CFIndex length = (CFStringGetSurrogatePairForLongCharacter(emojiValue, characters) ? 2 : 1);

    // Ask the emoji font for glyphs; if any character maps to no glyph,
    // CTFontGetGlyphsForCharacters returns false and the emoji is not supported
    CGGlyph glyphs[2] = { };
    CTFontRef ctFont = CTFontCreateWithName((CFStringRef)@"AppleColorEmoji", 0.0, NULL);
    BOOL ret = CTFontGetGlyphsForCharacters(ctFont, characters, glyphs, length);
    CFRelease(ctFont);
    return ret;
}
Answer 2:
In response to the Swift question above, here's a Swift version based largely on an answer to this question: Get surrogate pairs from an emoji
import CoreText

func isEmojiSupported(_ emoji: String) -> Bool {
    // Break the emoji into its UTF-16 code units (surrogate pairs included)
    let uniChars = Array(emoji.utf16)
    let font = CTFontCreateWithName("AppleColorEmoji" as CFString, 0.0, nil)
    // One glyph slot per UTF-16 code unit; sizing the buffer to match avoids
    // overflowing it on longer sequences (flags, ZWJ emoji, etc.)
    var glyphs = [CGGlyph](repeating: 0, count: uniChars.count)
    // Returns false if any code unit has no glyph in the emoji font
    return CTFontGetGlyphsForCharacters(font, uniChars, &glyphs, uniChars.count)
}
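As a quick usage sketch matching the question's goal of hiding unsupported emoji (the fallback string here is just an example, not from the original answer):

let taco = "🌮"
// Fall back to plain text on devices where the glyph is missing
let displayText = isEmojiSupported(taco) ? taco : "Taco"
print(displayText)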
Source: https://stackoverflow.com/questions/33701590/how-can-i-determine-if-a-specific-emoji-character-can-be-rendered-by-an-ios-devi