opengl es2 premultiplied vs straight alpha + blending

Submitted by 空扰寡人 on 2019-12-03 08:47:44

Here's a great article on how to avoid dark fringes with straight alpha textures: http://www.realtimerendering.com/blog/gpus-prefer-premultiplication/

If you're using mip-maps, that might be why your straight alpha textures have dark fringes -- filtering a straight alpha image can cause that to happen, and the best solution is to pre-multiply the image before the mip-maps are created. There's a common hack fix as well described in that article, but seriously consider pre-multiplying instead.

Using straight alpha to create the textures is often necessary and preferred, but it's still better to pre-multiply them as part of a build step or during texture load than to keep them as straight alpha in memory. I'm not sure about OpenGL ES, but I know WebGL lets you pre-multiply textures on the fly during load by calling gl.pixelStorei with the gl.UNPACK_PREMULTIPLY_ALPHA_WEBGL argument.
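If you do the conversion yourself at load time, the per-pixel step is simple. Here's a minimal sketch in C, assuming 8-bit RGBA data in memory (the helper name premultiply_rgba8 is just illustrative, not from any particular library):

    #include <stddef.h>
    #include <stdint.h>

    /* Multiply RGB by alpha in place for 8-bit RGBA pixel data.
       Do this before glTexImage2D / glGenerateMipmap so the mip chain
       is built from premultiplied colors. */
    static void premultiply_rgba8(uint8_t *pixels, size_t pixelCount)
    {
        for (size_t i = 0; i < pixelCount; ++i) {
            uint8_t *p = pixels + i * 4;
            uint32_t a = p[3];
            /* +127 rounds to nearest instead of truncating */
            p[0] = (uint8_t)((p[0] * a + 127) / 255);
            p[1] = (uint8_t)((p[1] * a + 127) / 255);
            p[2] = (uint8_t)((p[2] * a + 127) / 255);
        }
    }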

Another possibility, if you're compositing many elements, is that your straight alpha blending function is incorrect. In order to do a correct "over" operator with a straight alpha image, you need to use this blending function; maybe this is what you were looking for:

gl.blendFuncSeparate(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA, gl.ONE, gl.ONE_MINUS_SRC_ALPHA);

The common straight alpha blending function you referred to (gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA)) does not correctly handle the destination alpha channel; you have to use separate blend functions for color and alpha when blending a straight alpha source if you intend to composite many layers with an "over" operator. (Think about how you probably don't want to interpolate the alpha channel: it should always end up more opaque than both the source and the destination.) And take special care when you blend, because the result of a straight-alpha blend is a premultiplied image! So if you use the result later, you still have to be prepared to do premultiplied blending. For a longer explanation, I wrote about this here: https://limnu.com/webgl-blending-youre-probably-wrong/
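To make that concrete, here's a hedged CPU-side sketch in C of the "over" math that the blendFuncSeparate call above configures, assuming a straight-alpha source over an already-premultiplied (or opaque) destination; the RGBA struct and function name are illustrative only:

    /* "Over" with a straight-alpha source (values in 0..1):
         color: src.rgb * src.a + dst.rgb * (1 - src.a)
         alpha: src.a   * 1     + dst.a   * (1 - src.a)
       The result's rgb is already multiplied by its alpha, i.e. the
       framebuffer now holds a premultiplied image. */
    typedef struct { float r, g, b, a; } RGBA;

    static RGBA over_straight_src(RGBA src, RGBA dstPremult)
    {
        RGBA out;
        out.r = src.r * src.a + dstPremult.r * (1.0f - src.a);
        out.g = src.g * src.a + dstPremult.g * (1.0f - src.a);
        out.b = src.b * src.a + dstPremult.b * (1.0f - src.a);
        out.a = src.a         + dstPremult.a * (1.0f - src.a);
        return out;
    }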

The nice thing about using premultiplied images and blending is that you don't have to use separate blend funcs for color and alpha, and you automatically avoid a lot of these issues. You can and should create straight alpha textures, but then pre-multiply them before or during load, and use premultiplied blending (glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA)) throughout your code.

Eonil

AFAIK, glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA); is for premultiplied alpha, and it should work well if you use colors as is. glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA); is for straight alpha.

Many texture loading frameworks implicitly convert images into premultiplied-alpha format. This is because many of them re-draw the image into a new bitmap, and CGBitmapContext doesn't support straight-alpha (non-premultiplied) images, so they usually end up generating a premultiplied-alpha image. So please look at your texture loading code and check whether it converts the image into premultiplied format.
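For illustration, this is roughly the kind of re-draw step such a loader performs, sketched with the CoreGraphics C API and assuming a CGImageRef decoded from your PNG (the helper name copy_rgba_pixels is made up). CGBitmapContextCreate rejects kCGImageAlphaLast for 32-bit RGBA, so loaders pass kCGImageAlphaPremultipliedLast and the copied pixels come out premultiplied:

    #include <CoreGraphics/CoreGraphics.h>
    #include <stdlib.h>

    /* Redraw a CGImage into a 32-bit RGBA buffer. CoreGraphics only accepts
       premultiplied alpha here, so the result is premultiplied even if the
       source PNG was straight alpha. Caller frees the returned buffer. */
    static void *copy_rgba_pixels(CGImageRef image)
    {
        size_t w = CGImageGetWidth(image);
        size_t h = CGImageGetHeight(image);
        void *pixels = calloc(w * h, 4);

        CGColorSpaceRef rgb = CGColorSpaceCreateDeviceRGB();
        CGContextRef ctx = CGBitmapContextCreate(
            pixels, w, h, 8, w * 4, rgb,
            kCGImageAlphaPremultipliedLast);   /* kCGImageAlphaLast would fail */
        if (ctx)
            CGContextDrawImage(ctx, CGRectMake(0, 0, (CGFloat)w, (CGFloat)h), image);

        CGContextRelease(ctx);
        CGColorSpaceRelease(rgb);
        return pixels;
    }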

Also, Photoshop implicitly erases the color information of fully transparent (alpha = 0) pixels when exporting to PNG. If you use linear or any other texture filtering, the GPU will sample neighboring pixels, and the colors of those transparent pixels will affect the pixels at the edges. But Photoshop has already erased that color information, so those pixels end up with essentially random color values.

Theoretically, this color bleeding can be fixed by keeping correct color values in the transparent pixels. But with Photoshop there's no practical way to export a PNG that keeps those color values, because Photoshop doesn't preserve invisible pixels. (It would take a dedicated PNG exporter plug-in for Photoshop to export them correctly; I couldn't find an existing one that supports this.)
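If you end up fixing the bleeding in code instead, a simple dilation pass over the straight-alpha bitmap handles the one-pixel border that bilinear filtering touches. Here's a minimal sketch in C on RGBA8 data (the helper name defringe_rgba8 is illustrative); note that mip-mapping samples much wider, so premultiplication as described above remains the better fix:

    #include <stdint.h>

    /* Give every fully transparent pixel the average color of its
       non-transparent neighbors, so filtering no longer pulls in
       arbitrary colors at the edges. Alpha values are left untouched. */
    static void defringe_rgba8(uint8_t *px, int w, int h)
    {
        for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            uint8_t *p = px + (y * w + x) * 4;
            if (p[3] != 0) continue;               /* only touch alpha == 0 */
            unsigned sum[3] = {0, 0, 0}, n = 0;
            for (int dy = -1; dy <= 1; ++dy)
            for (int dx = -1; dx <= 1; ++dx) {
                int nx = x + dx, ny = y + dy;
                if (nx < 0 || ny < 0 || nx >= w || ny >= h) continue;
                const uint8_t *q = px + (ny * w + nx) * 4;
                if (q[3] == 0) continue;
                sum[0] += q[0]; sum[1] += q[1]; sum[2] += q[2]; ++n;
            }
            if (n) { p[0] = sum[0] / n; p[1] = sum[1] / n; p[2] = sum[2] / n; }
        }
    }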

Premultiplied alpha is good enough just to display the image, but it won't work well if you do any shader color manipulation, because colors are stored in integer form and usually don't have enough precision to recover the original color values. If you need precise color work, use straight alpha and avoid Photoshop.


Update

Here's my test result with @SlippD.Thompson's test code on the iPhone simulator (64-bit, iOS 7.x):

 <Error>: CGBitmapContextCreate: unsupported parameter combination: 8 integer bits/component; 24 bits/pixel; 3-component color space; kCGImageAlphaNone; 2048 bytes/row.
    cgContext with CGImageAlphaInfo 0: (null)
    cgContext with CGImageAlphaInfo 1: <CGContext 0x1092301f0>
    cgContext with CGImageAlphaInfo 2: <CGContext 0x1092301f0>
 <Error>: CGBitmapContextCreate: unsupported parameter combination: 8 integer bits/component; 32 bits/pixel; 3-component color space; kCGImageAlphaLast; 2048 bytes/row.
    cgContext with CGImageAlphaInfo 3: (null)
 <Error>: CGBitmapContextCreate: unsupported parameter combination: 8 integer bits/component; 32 bits/pixel; 3-component color space; kCGImageAlphaFirst; 2048 bytes/row.
    cgContext with CGImageAlphaInfo 4: (null)
    cgContext with CGImageAlphaInfo 5: <CGContext 0x1092301f0>
    cgContext with CGImageAlphaInfo 6: <CGContext 0x1092301f0>
 <Error>: CGBitmapContextCreate: unsupported parameter combination: 8 integer bits/component; 24 bits/pixel; 0-component color space; kCGImageAlphaOnly; 2048 bytes/row.
    cgContext with CGImageAlphaInfo 7: (null)