For a payment provider, I need to calculate a hash-based message authentication code, using HMAC-SHA256. That is causing me quite a bit of trouble.
A SHA hash is calculated on a sequence of bytes. Bytes are a profoundly different datatype to characters. You should not use character Strings to store binary data such as hashes.
sb.Append(Convert.ToString(Convert.ToChar(Int32.Parse(hex.Substring(i, 2)...
This creates a character string by reading each encoded byte and turning it into a character with the same Unicode code point number. This is equivalent to decoding the bytes 0-255 using the ISO-8859-1 (Latin1) encoding, due to that encoding's property of matching the first 256 code points in Unicode.
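For instance, a minimal sketch (the bytes here are made-up placeholders) showing that a char-per-byte loop like yours and a straight ISO-8859-1 decode produce the same string:

using System;
using System.Text;

byte[] raw = { 0x57, 0x61, 0x7b, 0x93 };   // placeholder bytes

var sb = new StringBuilder();
foreach (byte b in raw)
    sb.Append((char)b);                    // one char per byte value, as your loop does

string latin1 = Encoding.GetEncoding("ISO-8859-1").GetString(raw);

Console.WriteLine(sb.ToString() == latin1);   // True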
var enc = Encoding.Default; [...] baSalt = enc.GetBytes(salt);
byte[] sha256Bytes = Encoding.Default.GetBytes(input);
These both convert the characters back to bytes using the system default encoding. This encoding varies between installs, but it will never be ISO-8859-1 - even the similar Western European code page 1252 has different characters in the range 0x80-0x9F.
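A small sketch of that difference (code page 1252 is only one possible value of the default encoding):

using System.Text;

byte[] raw = { 0x80, 0x93 };

// ISO-8859-1 maps these bytes straight to the code points U+0080 and U+0093
string viaLatin1 = Encoding.GetEncoding("ISO-8859-1").GetString(raw);

// Windows-1252 maps the same bytes to '€' (U+20AC) and '“' (U+201C) instead,
// so a round trip through it does not preserve the original byte values
// (the 1252 code page is built into .NET Framework; newer runtimes need it registered)
string via1252 = Encoding.GetEncoding(1252).GetString(raw);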
Consequently the byte array you are using doesn't contain the bytes implied by the example hex sequences. A cheap fix would be to use Encoding.GetEncoding("ISO-8859-1") instead of the default encoding, but really you should be using a byte array to store binary data in the first place instead of a string, eg:
byte[] key = new byte[] { 0x57, 0x61, 0x7b, 0x5d, 0x23, 0x49, ... };
and pass that directly into ComputeHash.
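For example, a minimal sketch of that approach (the key bytes and message string are placeholders rather than your provider's real values, and the UTF-8 message encoding is an assumption):

using System;
using System.Security.Cryptography;
using System.Text;

byte[] key = { 0x57, 0x61, 0x7b, 0x5d, 0x23, 0x49 };                 // placeholder key bytes
byte[] message = Encoding.UTF8.GetBytes("amount=100&currency=EUR");  // placeholder message; encoding is an assumption

using (var hmac = new HMACSHA256(key))
{
    byte[] mac = hmac.ComputeHash(message);
    Console.WriteLine(BitConverter.ToString(mac).Replace("-", "").ToLowerInvariant());
}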
If you must initialise data from a hex string, parse it directly into a byte array, eg:
private static byte[] HexDecode(string hex) {
    var bytes = new byte[hex.Length / 2];
    for (int i = 0; i < bytes.Length; i++)
        bytes[i] = Convert.ToByte(hex.Substring(i * 2, 2), 16);   // two hex digits per output byte
    return bytes;
}
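Used together it might look like this (a sketch only; the hex strings are placeholders for whatever your provider's examples actually give you):

byte[] key = HexDecode("57617b5d2349");     // placeholder key hex
byte[] message = HexDecode("48656c6c6f");   // placeholder message hex

byte[] mac;
using (var hmac = new System.Security.Cryptography.HMACSHA256(key))
    mac = hmac.ComputeHash(message);

// hex-encode the result to compare with the provider's expected digest
string macHex = BitConverter.ToString(mac).Replace("-", "").ToLowerInvariant();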