Why do these two lines give me different results?
var str = "Hello 😘" // the "square" is an emoji; it may render as a box in some editors
count(str) // returns 7
(str as NSString).length // returns 8
This is because Swift uses extended grapheme clusters: a Swift Character is one user-perceived character, however many Unicode scalars it takes. Swift therefore sees the emoji as a single Character, while NSString's length counts UTF-16 code units; 😘 (U+1F618) lies outside the Basic Multilingual Plane, so UTF-16 encodes it as a surrogate pair of two code units, even though the pair represents a single symbol.
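A minimal sketch of what each API is counting, written in modern Swift syntax (the Swift 1.x free function count(_:) has since become the count property):

import Foundation

let str = "Hello 😘"

// Swift's count walks extended grapheme clusters, so the emoji is one Character.
print(str.count)                 // 7

// NSString.length counts UTF-16 code units. U+1F618 is outside the Basic
// Multilingual Plane, so UTF-16 encodes it as a surrogate pair: two units.
print((str as NSString).length)  // 8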
I think the documentation says it best:
The character count returned by the count(_:) function is not always the same as the length property of an NSString that contains the same characters. The length of an NSString is based on the number of 16-bit code units within the string’s UTF-16 representation and not the number of Unicode extended grapheme clusters within the string. To reflect this fact, the length property from NSString is called utf16Count when it is accessed on a Swift String value.
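The utf16Count property mentioned there is Swift 1.x; in current Swift the same value is spelled str.utf16.count. A quick sketch of the different string views — the combining-accent string is my own example, added to show that a grapheme cluster can also group several scalars that each fit in a single UTF-16 code unit:

import Foundation

let str = "Hello 😘"
print(str.count)                  // 7 — extended grapheme clusters
print(str.utf16.count)            // 8 — UTF-16 code units (the old utf16Count)
print(str.unicodeScalars.count)   // 7 — Unicode scalar values

// A cluster built from two BMP scalars: "e" plus a combining acute accent.
let cafe = "cafe\u{301}"          // renders as "café"
print(cafe.count)                 // 4 — the accent joins the "e" into one Character
print(cafe.utf16.count)           // 5 — each scalar is one UTF-16 code unit here
print((cafe as NSString).length)  // 5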
Source: https://stackoverflow.com/questions/29832930/swift-string-count-vs-nsstring-length-not-equal