Question
I'm developing a game in Java that will be packaged as an applet, and I'm working on the networking aspect. I've designed a session flow that fits the request frequency and security needs without requiring SSL. The data transmission process is loosely based on the way Facebook signs the signed_token used in its OAuth process. Here's the simplified context:
- my PHP and Java implementations use hash_hmac and javax.crypto.Mac, respectively, to sign a varying JSON payload with a shared, secret, unique token
- both outputs have to match exactly, because they're part of a larger encode/decode compression scheme
- this signature is passed in the URL along with the payload and is used to verify the payload's validity and integrity (a sketch of the verify step follows this list)
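To make the intended flow concrete, here's a minimal sketch of that verify step in Java; verify() and its parameter names are placeholders for illustration, not my actual code:
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

// Recompute the HMAC over the received payload and compare it to the
// hex-encoded signature that arrived in the URL.
static boolean verify(String payload, String sigHex, String secret) throws Exception {
    Mac mac = Mac.getInstance("HmacSHA256");
    mac.init(new SecretKeySpec(secret.getBytes(StandardCharsets.UTF_8), "HmacSHA256"));
    byte[] expected = mac.doFinal(payload.getBytes(StandardCharsets.UTF_8));
    byte[] received = new byte[sigHex.length() / 2];
    for (int i = 0; i < received.length; i++)
        received[i] = (byte) Integer.parseInt(sigHex.substring(2 * i, 2 * i + 2), 16);
    // Constant-time comparison, so timing doesn't leak where the mismatch is.
    return MessageDigest.isEqual(expected, received);
}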
As you can infer, if they don't match, I end up with dropped data and errors from payloads being rejected as invalid. My issue is that, while the hex encoding of the result matches perfectly, the raw binary never seems to match. Below are the PHP and Java test cases I extracted:
Note: Because PHP associative arrays and Java HashMaps don't serialize to JSON identically, I'm using the value of the secret in place of the JSON payload so that the input is consistent between platforms.
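For what it's worth, one way both sides could eventually produce byte-identical JSON is a deterministic serializer; here's a rough sketch for a flat string map in Java (sorted keys via TreeMap; escaping kept deliberately minimal; the names are illustration only):
import java.util.Map;
import java.util.TreeMap;

// Emit {"key":"value",...} with keys in sorted order so PHP and Java can
// agree on the exact bytes being signed. Only quotes and backslashes are
// escaped here; real payloads would need full JSON string escaping.
static String canonicalJson(Map<String, String> data) {
    StringBuilder sb = new StringBuilder("{");
    boolean first = true;
    for (Map.Entry<String, String> e : new TreeMap<String, String>(data).entrySet()) {
        if (!first) sb.append(',');
        first = false;
        sb.append('"').append(escape(e.getKey())).append("\":\"")
          .append(escape(e.getValue())).append('"');
    }
    return sb.append('}').toString();
}

static String escape(String s) {
    return s.replace("\\", "\\\\").replace("\"", "\\\"");
}
The PHP side would need matching key order (e.g. ksort()) and matching escaping rules for the bytes to line up.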
PHP:
$secret = "922ec205d8e4d0ea06079d60a5336fffd9cf0aea";
$json = $secret; //json_encode($test_array);
$hmac_a = hash_hmac('sha256',$json,$secret);
$hmac_b = hash_hmac('sha256',$json,$secret,$raw=true);
echo(htmlentities($hmac_a)."<br/>\n");
echo(htmlentities($hmac_b)."<br/>\n");
In-browser output:
ff21a9e468ac49863e5e992324ac8bc92f239a08100b0f329b087be16f5ad382
ÿ!©äh¬I†>^™#$¬‹É/#š2›{áoZÓ‚
Java:
Mac hmac = Mac.getInstance("HmacSHA256");
// Key the MAC with the UTF-8 bytes of the shared secret.
SecretKeySpec secret_key = new SecretKeySpec(this.secret.getBytes("UTF-8"), "HmacSHA256");
hmac.init(secret_key);
byte[] digest = hmac.doFinal(this.secret.getBytes("UTF-8"));
System.out.println(hexify(digest));              // hex-encoded digest
System.out.println(new String(digest, "UTF-8")); // raw digest bytes decoded as UTF-8
Console output:
ff21a9e468ac49863e5e992324ac8bc92f239a08100b0f329b087be16f5ad382
�!��h�I�>^�#$���/#� 2� {�oZӂ
When that second string is copied into PHP and echoed, it looks like this:
:�!��h�I�>^�#$���/#���2{�oZӂ
Note that while the hex is identical, the binary differs, yet it contains the same ending ( oZÓ‚ ) when displayed from the same source. In fact, it contains all of the more common characters (!hI>^#$/#2{oZÓ,) in order. I've tried copying the console output into PHP and displaying it as a binary string, as a regular string, and as a utf8_encode'd binary/regular string, and I've also tried utf8_encode'ing $hmac_b. Nothing makes the raw versions match up.
I've run mb_detect_encoding on PHP's HMAC, and it reported UTF-8. I've also set everything around javax.crypto.Mac to UTF-8 and displayed the result as UTF-8, but no dice. I know Java's UTF-8 is no different from PHP's UTF-8; that would defeat the point of having standard character sets. What's going on here?
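To narrow it down, here's a self-contained sketch of the check I'd run next: if decoding the digest as UTF-8 is lossy, re-encoding it can't reproduce the original bytes (RoundTrip is just a scratch class name):
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class RoundTrip {
    public static void main(String[] args) throws Exception {
        String secret = "922ec205d8e4d0ea06079d60a5336fffd9cf0aea";
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(secret.getBytes(StandardCharsets.UTF_8), "HmacSHA256"));
        byte[] digest = mac.doFinal(secret.getBytes(StandardCharsets.UTF_8));

        // HMAC output is arbitrary bytes, not UTF-8 text. new String(..., UTF_8)
        // replaces any byte sequence that isn't valid UTF-8 (the leading 0xFF
        // here, for instance) with U+FFFD, so the re-encode below should fail
        // to reproduce the digest if that replacement is happening.
        String asText = new String(digest, StandardCharsets.UTF_8);
        byte[] roundTripped = asText.getBytes(StandardCharsets.UTF_8);
        System.out.println("lossless? " + Arrays.equals(digest, roundTripped));
    }
}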
Note: While I now prefer and am able to use the hex version for URL encoding, I'd still like to know what's going on with this character set nonsense, and possibly how to fix it.
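(Related, in case it shapes an answer: instead of hex, I could also use URL-safe Base64 to keep the signature shorter; the sketch below assumes Java 8's java.util.Base64, and the PHP side would need the usual strtr/rtrim treatment of base64_encode's output to match.)
import java.util.Base64;

// URL-safe Base64 uses '-' and '_' instead of '+' and '/', and dropping the
// '=' padding means the signature survives a query string un-percent-encoded.
static String urlSafeSig(byte[] digest) {
    return Base64.getUrlEncoder().withoutPadding().encodeToString(digest);
}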