How to encode 32-bit Unicode characters in a PowerShell string literal?


Question


This Stack Overflow question deals with 16-bit Unicode characters. I would like a similar solution that supports 32-bit characters, i.e. code points above U+FFFF. See this link for a listing of the various Unicode charts. For example, the Musical Symbols block (U+1D100–U+1D1FF) consists entirely of such characters.

The answer in the question linked above doesn't work here because it casts the System.Int32 value to System.Char, which is a 16-bit type.

Edit: Let me clarify that I don't particularly care about displaying the 32-bit Unicode character; I just want to store it in a string variable.

Edit #2: I wrote a PowerShell snippet that uses the info in the accepted answer and its comments. I would have put this in a comment, but comments can't span multiple lines.

$inputValue = '1D11E'                                              # code point U+1D11E (Musical Symbol G Clef)
$hexValue = [int]"0x$inputValue" - 0x10000                         # offset into the supplementary planes
$highSurrogate = [int][math]::Floor($hexValue / 0x400) + 0xD800    # top 10 bits (Floor, because a plain [int] cast rounds)
$lowSurrogate = $hexValue % 0x400 + 0xDC00                         # bottom 10 bits
$stringValue = [string][char]$highSurrogate + [char]$lowSurrogate
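As a quick sanity check, the hand-built string can be converted back to a code point (a minimal sketch; $stringValue is the variable from the snippet above, and [char]::ConvertToUtf32 is the standard .NET helper):

'{0:X}' -f [char]::ConvertToUtf32($stringValue, 0)   # 1D11E
$stringValue.Length                                  # 2 (two UTF-16 code units)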

Dour High Arch still deserves credit for the answer, which finally helped me understand surrogate pairs.


Answer 1:


PowerShell strings are .NET strings, which use UTF-16, so code points above U+FFFF are represented as surrogate pairs. For example, U+10000 is represented as:

0xD800 0xDC00

That is, two 16-bit code units: hex D800 and DC00.
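In PowerShell the same pair can be assembled by hand (a minimal sketch; the variable name $s is illustrative):

$s = [string][char]0xD800 + [char]0xDC00   # one character, two UTF-16 code units
$s.Length                                  # 2
[char]::IsHighSurrogate($s[0])             # True
[char]::IsLowSurrogate($s[1])              # True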

Good luck finding a font that can actually display those characters, though.




Answer 2:


IMHO, the most elegant way to use Unicode literals in PowerShell is

[char]::ConvertFromUtf32(0x1D11E)

See my blog post for more details.
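For instance, the returned string can be stored in a variable or embedded directly in a double-quoted string via a subexpression (a small usage sketch; the variable name $gClef is illustrative):

$gClef = [char]::ConvertFromUtf32(0x1D11E)
"Musical symbol G clef: $gClef"
"Or inline: $([char]::ConvertFromUtf32(0x1D11E))"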



Source: https://stackoverflow.com/questions/4834291/how-to-encode-32-bit-unicode-characters-in-a-powershell-string-literal
