Creating blur filter with a shader - access adjacent pixels from fragment shader?

Submitted by 孤人 on 2019-12-03 03:43:13

Elaborating a bit more on what Matias said:

  1. Yes. You render the image into a texture (best done using FBOs), and in the second (blur) pass you bind this texture and read from it. You cannot perform the render and blur passes in one step, because you cannot access the framebuffer you are currently rendering into. Doing so would introduce data dependencies: your neighbours need not have their final color yet, or worse, their color may depend on yours.

  2. You get the current pixel's coordinates from the special fragment shader variable gl_FragCoord and use these as texture coordinates into the texture containing the previously rendered image, and likewise gl_FragCoord.x +/- 1 and gl_FragCoord.y +/- 1 for the neighbours. But as Matias said, you need to divide these values by the width and height of the image respectively, since texture coordinates are in [0,1]. By using GL_CLAMP_TO_EDGE as the wrapping mode for the texture, the edge cases are handled automatically by the texturing hardware. So at an edge you still take 9 samples, but only 6 distinct ones (at a corner, only 4): the samples that actually fall outside the image are just duplicates of their inside neighbours. A sketch of such a blur shader follows below.
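As a rough illustration (not from the original answer), a second-pass fragment shader for a 3x3 box blur could look something like the following. The uniform names sceneTex and texSize are assumptions; it presumes you draw a full-screen quad in the blur pass with the first pass's FBO texture bound and its wrap mode set to GL_CLAMP_TO_EDGE:

    #version 120

    uniform sampler2D sceneTex;  // texture the scene was rendered into (first pass)
    uniform vec2 texSize;        // width/height of that texture in pixels

    void main()
    {
        // gl_FragCoord is in window (pixel) coordinates; dividing by the
        // texture size gives normalized [0,1] texture coordinates.
        vec2 texel = vec2(1.0) / texSize;
        vec2 uv = gl_FragCoord.xy * texel;

        // Average the pixel and its 8 neighbours. With GL_CLAMP_TO_EDGE,
        // samples outside the image repeat the edge texels, so no explicit
        // bounds check is needed.
        vec4 sum = vec4(0.0);
        for (int dy = -1; dy <= 1; ++dy)
            for (int dx = -1; dx <= 1; ++dx)
                sum += texture2D(sceneTex, uv + vec2(float(dx), float(dy)) * texel);

        gl_FragColor = sum / 9.0;
    }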

1) Yes, using an FBO is the way to go.

2) As for the math: if you are at pixel (x, y), then the neighbours are (x+1, y), (x, y+1), (x+1, y+1), (x-1, y), and so on. Edge cases are handled by the texture's wrap mode. Note that since GL_TEXTURE_2D uses normalized coordinates, the offsets aren't 1, but 1 / width and 1 / height of the texture.
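A minimal sketch of just this offset math, assuming the texture coordinate is interpolated from the vertex shader as a varying named tc and that tex and texSize are uniforms you set yourself (only four of the eight neighbours are shown):

    #version 120

    uniform sampler2D tex;   // previously rendered image
    uniform vec2 texSize;    // its dimensions in pixels
    varying vec2 tc;         // interpolated texture coordinate of this pixel

    void main()
    {
        // One pixel to the side is not an offset of 1 in normalized
        // coordinates, but 1/width horizontally and 1/height vertically.
        vec2 offset = vec2(1.0 / texSize.x, 1.0 / texSize.y);

        vec4 center = texture2D(tex, tc);
        vec4 right  = texture2D(tex, tc + vec2( offset.x, 0.0));
        vec4 left   = texture2D(tex, tc + vec2(-offset.x, 0.0));
        vec4 up     = texture2D(tex, tc + vec2(0.0,  offset.y));
        vec4 down   = texture2D(tex, tc + vec2(0.0, -offset.y));
        // ... the diagonal neighbours work analogously; edge cases are
        // handled by the texture's wrap mode (e.g. GL_CLAMP_TO_EDGE).

        gl_FragColor = (center + right + left + up + down) / 5.0;
    }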
