When writing pixels to an HTML Canvas context using putImageData, I find that the pixel values are not exactly the same when I fetch them again with getImageData. I have put up a
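Roughly, this is what I am doing (a minimal sketch with made-up values; the exact numbers that come back vary by browser):

    const canvas = document.createElement('canvas');
    const ctx = canvas.getContext('2d');

    const written = ctx.createImageData(1, 1);
    written.data[0] = 128; // R
    written.data[1] = 64;  // G
    written.data[2] = 67;  // B
    written.data[3] = 16;  // A (semi-transparent)
    ctx.putImageData(written, 0, 0);

    const read = ctx.getImageData(0, 0, 1, 1);
    console.log(read.data); // e.g. [128, 64, 64, 16] -- not what was written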
The HTML5 specification encourages browser vendors to use something called Premultiplied Alpha. Pixels are stored as 32-bit integers, with an 8-bit value per channel. For performance reasons, browsers premultiply each color value by the alpha value before storing it.
Here's an example. Say a pixel's RGB values are 128, 64, 67. For the sake of higher performance, those color values will be premultiplied by the alpha value. So if the alpha value is 16, each color value gets multiplied by 16/256 (= 0.0625), and the resulting RGB values become 8, 4, and 4.1875, with the last rounded to 4 because pixel color values are not stored as floats.
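In code, the premultiplication step amounts to something like this hypothetical helper, following the 16/256 fraction used in the example above:

    // Hypothetical helper mirroring the arithmetic above.
    function premultiply(value, alpha) {
      return Math.round(value * (alpha / 256));
    }

    premultiply(128, 16); // 8
    premultiply(64, 16);  // 4
    premultiply(67, 16);  // 4 -- the 4.1875 from above, stored as an integer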
The problem shows up when you do exactly what you are doing here: setting color data with a specific alpha value and then pulling back the actual color values. The blue value of 4.1875 that got rounded to 4 will come back as 64 (4 × 256/16) instead of 67 when you call getImageData(), because the rounding loss cannot be undone.
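Sketching the inverse of the hypothetical helper above makes the loss visible:

    // Hypothetical inverse: recover the color value from the stored one.
    function unpremultiply(stored, alpha) {
      return Math.round(stored * (256 / alpha));
    }

    unpremultiply(4, 16); // 64 -- the original 67 cannot be recovered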
That is why you are seeing this, and it will not change unless the underlying browser implementation switches to a color representation that does not suffer from this loss.
ImageData is defined in HTML5 as being unpremultiplied, but most canvas implementations use a premultiplied backing buffer to speed up compositing, etc. This means that when data is written and then read from the backing buffer it can change.
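If you want to see whether a particular browser's round trip through the backing buffer is lossy, a quick (untested) check along these lines would do it:

    // Quick check: write a semi-transparent pixel and compare what comes back.
    function roundTripIsLossy(ctx) {
      const px = ctx.createImageData(1, 1);
      px.data.set([128, 64, 67, 16]);      // arbitrary semi-transparent color
      ctx.putImageData(px, 0, 0);
      const back = ctx.getImageData(0, 0, 1, 1).data;
      return !px.data.every((v, i) => v === back[i]);
    }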
I would assume that Chrome v8 picked up a buggy version of the [un]premultiplying code from webkit.org (it has been broken before, although I don't recall any recent occurrences, and that doesn't explain the Windows-only variance).
[edit: it could be worth checking a WebKit nightly on Windows, as the ImageData implementation doesn't have anything platform-specific; it's shared between all WebKit browsers and could simply be broken in MSVC-based builds]
Looks like a rounding issue to me...
64/255 = 0.2509... (rounded down to give 0.25)
0.25 * 255 = 63.75 (rounded down to give 63)
== OR ==
64/255 = 0.2509... (rounded up to give 0.26)
0.26 * 255 = 66.3 (rounded up to give 67)
Remember that 255 is the maximum value, not 256 ;)
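The same two rounding paths, written out (illustrative only; a real implementation rounds the stored 8-bit value rather than a two-decimal fraction):

    // Illustrative only: intermediate rounding pushes 64 to 63 or 67.
    const down = Math.floor(64 / 255 * 100) / 100;  // 0.25
    Math.floor(down * 255);                         // 63
    const up = Math.ceil(64 / 255 * 100) / 100;     // 0.26
    Math.ceil(up * 255);                            // 67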
EDIT: Of course, this wouldn't explain why the alpha channel is behaving...