reconstructed world position from depth is wrong

你离开我真会死。 Submitted on 2019-12-10 09:27:16

Question


I'm trying to implement deferred shading/lighting. In order to reduce the number/size of the buffers I use I wanted to use the depth texture to reconstruct world position later on.

I do this by multiplying the pixel's coordinates by the inverse of the projection matrix and then the inverse of the camera matrix. This sort of works, but the position is a bit off compared with a world position sampled directly from a texture (I verified this by computing the absolute difference between the two).

For reference, this is the code I use in the second pass fragment shader:

vec2 screenPosition_texture = vec2(gl_FragCoord.x/WIDTH, gl_FragCoord.y/HEIGHT);
float pixelDepth = texture2D(depth, screenPosition_texture).x;

vec4 worldPosition = pMatInverse*vec4(VertexIn.position, pixelDepth, 1.0);
worldPosition = vec4(worldPosition.xyz/worldPosition.w, 1.0);
//worldPosition /= 1.85;
worldPosition = cMatInverse*worldPosition;

If I uncomment worldPosition /= 1.85, the position is reconstructed much more closely (for my geometry and range of depth values). I found this value by trial and error after comparing my output with what it should be (stored in a third texture).

I'm using a near plane of 0.1 and a far plane of 100.0, and my geometry is at most about 15 units away. I know there may be precision errors, but this seems too big an error for something this close to the camera. Did I miss anything here?


Answer 1:


As mentioned in a comment: I didn't convert the sampled depth value from the [0, 1] window-space range back to the [-1, 1] NDC range before unprojecting. I should have added this line:

pixelDepth = pixelDepth * 2.0 - 1.0;
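Putting the fix together, here is a minimal sketch of the corrected reconstruction. It reuses the names from the question (pMatInverse, cMatInverse, depth, WIDTH, HEIGHT) and derives the NDC xy from gl_FragCoord rather than from VertexIn.position; both approaches are equivalent as long as the result covers [-1, 1] across the screen:

```glsl
// Reconstruct world-space position from the depth buffer.
// Assumes: pMatInverse = inverse projection matrix, cMatInverse = inverse
// camera (view) matrix, depth = depth texture, WIDTH/HEIGHT = viewport size.
vec2 uv = gl_FragCoord.xy / vec2(WIDTH, HEIGHT);
float pixelDepth = texture2D(depth, uv).x;

// Window-space [0,1] -> NDC [-1,1], applied to all three components.
vec3 ndc = vec3(uv, pixelDepth) * 2.0 - 1.0;

// Unproject to view space (note the perspective divide), then to world space.
vec4 viewPos = pMatInverse * vec4(ndc, 1.0);
viewPos /= viewPos.w;
vec4 worldPosition = cMatInverse * viewPos;
```

Skipping the depth remap feeds the unprojection a depth that is compressed into half its expected range, which is why a hand-tuned fudge factor like 1.85 can appear to "almost" fix the result for one particular scene.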


Source: https://stackoverflow.com/questions/16246250/reconstructed-world-position-from-depth-is-wrong
