I need a way to convert a string containing the textual representation of a hexadecimal value into the Character corresponding to that hexadecimal value.
With Swift 5, you have to convert your string into an integer (using the init(_:radix:) initializer), create a Unicode scalar from this integer (using Unicode.Scalar's init(_:)), then create a Character from this Unicode scalar (using Character's init(_:)).
The Swift 5 Playground sample code below shows how to proceed:
let validHexString: String = "2C"
let validUnicodeScalarValue = Int(validHexString, radix: 16)! // step 1: hex string -> integer
let validUnicodeScalar = Unicode.Scalar(validUnicodeScalarValue)! // step 2: integer -> Unicode scalar
let character = Character(validUnicodeScalar) // step 3: Unicode scalar -> Character
print(character) // prints: ","
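If you'd rather avoid the force unwraps for a single value, the same three steps can be wrapped in a small helper that returns nil on invalid input. This is only a sketch: the name character(fromHex:) is illustrative, and it goes through UInt32 (which also has a failable init(_:radix:)) rather than Int.

func character(fromHex hexString: String) -> Character? {
    guard let value = UInt32(hexString, radix: 16),  // step 1: hex string -> integer
          let scalar = Unicode.Scalar(value) else {  // step 2: integer -> Unicode scalar
        return nil
    }
    return Character(scalar)                         // step 3: Unicode scalar -> Character
}

print(character(fromHex: "2C") as Any) // prints: Optional(",")
print(character(fromHex: "ZZ") as Any) // prints: nil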
If you want to perform this operation on every element of an array of hexadecimal strings, you can use the sample code below:
let hexArray = ["2F", "24", "40", "2A"]
let characterArray = hexArray.map({ (hexString) -> Character in
    let unicodeScalarValue = Int(hexString, radix: 16)!
    let validUnicodeScalar = Unicode.Scalar(unicodeScalarValue)!
    return Character(validUnicodeScalar)
})
print(characterArray) // prints: ["/", "$", "@", "*"]
Alternative with no force unwraps, using compactMap to drop any invalid hexadecimal strings:
let hexArray = ["2F", "24", "40", "2A"]
let characterArray = hexArray.compactMap({ (hexString) -> Character? in
    guard let unicodeScalarValue = Int(hexString, radix: 16),
          let unicodeScalar = Unicode.Scalar(unicodeScalarValue) else {
        return nil
    }
    return Character(unicodeScalar)
})
print(characterArray) // prints: ["/", "$", "@", "*"]
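If the end goal is a single String rather than an array of Characters, the resulting [Character] array can be passed straight to String's sequence initializer:

let decodedString = String(characterArray)
print(decodedString) // prints: "/$@*"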