depth-buffer

How to use depth texture with stencil, OpenGL ES 3.0

Submitted by 为君一笑 on 2019-12-12 02:38:00
Question: I have an application where I want to render depth to a texture using a stencil mask. I tried GL_DEPTH_STENCIL_OES. Creating the texture: glGenFramebuffers(1, fbo); glBindFramebuffer(GL_FRAMEBUFFER, *fbo); glGenTextures(1, depthTexture); glBindTexture(GL_TEXTURE_2D, *depthTexture); // using combined format: depth + stencil glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_STENCIL_OES, w, h, 0, GL_DEPTH_STENCIL_OES, GL_UNSIGNED_INT_24_8_OES, NULL); glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR
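For reference, a minimal sketch of the same setup using the core OpenGL ES 3.0 path, where the sized internal format GL_DEPTH24_STENCIL8 replaces the ES 2.0 extension token GL_DEPTH_STENCIL_OES; w and h stand in for the texture size as in the question:

    GLuint fbo = 0, depthStencilTex = 0;
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);

    glGenTextures(1, &depthStencilTex);
    glBindTexture(GL_TEXTURE_2D, depthStencilTex);
    // Sized internal format; the external format/type stay GL_DEPTH_STENCIL / GL_UNSIGNED_INT_24_8.
    glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH24_STENCIL8, w, h, 0,
                 GL_DEPTH_STENCIL, GL_UNSIGNED_INT_24_8, NULL);
    // NEAREST is the safe filter for depth textures; LINEAR filtering of depth is not guaranteed.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

    // One attachment point covers both depth and stencil for a packed format.
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_STENCIL_ATTACHMENT,
                           GL_TEXTURE_2D, depthStencilTex, 0);
    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
        // handle incomplete framebuffer
    }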

Non-blocking glReadPixels of depth values with PBO

Submitted by ℡╲_俬逩灬. on 2019-12-11 15:08:19
Question: I am reading a single pixel's depth from the framebuffer to implement picking. Originally my glReadPixels() was taking a very long time (5 ms or so), and on nVidia it would even burn 100% CPU during that time. On Intel it was slow as well, but with an idle CPU. Since then, I have used the pixel buffer object (PBO) functionality to make the glReadPixels asynchronous, and also double-buffered it using this well-known example. This approach works well and lets me make a glReadPixels() call asynchronous, but
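A minimal sketch of the double-buffered PBO pattern described above; the names pbo, frameIndex, x and y are placeholders, and the read is assumed to target the currently bound read framebuffer's depth:

    // Two pixel-pack buffers, allocated once, each holding a single depth value.
    GLuint pbo[2];
    glGenBuffers(2, pbo);
    for (int i = 0; i < 2; ++i) {
        glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo[i]);
        glBufferData(GL_PIXEL_PACK_BUFFER, sizeof(GLfloat), NULL, GL_STREAM_READ);
    }

    // Each frame: kick off an asynchronous read into one PBO...
    int writeIdx = frameIndex % 2;
    int readIdx  = (frameIndex + 1) % 2;
    glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo[writeIdx]);
    glReadPixels(x, y, 1, 1, GL_DEPTH_COMPONENT, GL_FLOAT, 0); // 0 = offset into the bound PBO; returns immediately

    // ...and map the PBO filled on the previous frame, which should be ready by now.
    glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo[readIdx]);
    GLfloat* depth = (GLfloat*)glMapBufferRange(GL_PIXEL_PACK_BUFFER, 0, sizeof(GLfloat), GL_MAP_READ_BIT);
    if (depth) {
        // use *depth for picking (one frame of latency)
        glUnmapBuffer(GL_PIXEL_PACK_BUFFER);
    }
    glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);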

How to select a 2D node in a 3D scene?

Submitted by 痞子三分冷 on 2019-12-11 07:32:58
Question: Here is my code. You can copy-paste it and follow what I write below to see the problem yourself. public class MyApp extends Application { @Override public void start(Stage stage) throws Exception { Scene scene = new Scene(new MyView(), 100, 150); stage.setScene(scene); stage.show(); } private class MyView extends BorderPane { MyView() { GridPane board = new GridPane(); int size = 3; for (int i = 0; i < size*size; i++) { BorderPane pane = new BorderPane(); pane.setMinSize(30, 30); pane

JavaFX 8 3D Z-order: overlapping shape behaviour is wrong

Submitted by 给你一囗甜甜゛ on 2019-12-10 21:54:50
Question: I have a JavaFX 3D scene with a bunch of boxes and spheres added at random locations. It seems like the depth order is all wrong, and I'm not sure why. I have tried to use myNode.setDepthTest(DepthTest.ENABLE), but that doesn't seem to help. I've attached an application which should demonstrate the problem. Any idea what I might be doing wrong here? Any help much appreciated. import javafx.application.Application; import javafx.application.ConditionalFeature; import javafx.application.Platform;

reconstructed world position from depth is wrong

Submitted by 你离开我真会死。 on 2019-12-10 09:27:16
Question: I'm trying to implement deferred shading/lighting. In order to reduce the number/size of the buffers I use, I wanted to use the depth texture to reconstruct the world position later on. I do this by multiplying the pixel's coordinates by the inverse of the projection matrix and the inverse of the camera matrix. This sort of works, but the position is a bit off. Here's the absolute difference compared with a sampled world-position texture: For reference, this is the code I use in the second-pass fragment
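A minimal CPU-side sketch of that reconstruction using GLM, assuming uv is the pixel coordinate in [0,1], depth is the value sampled from the depth texture in [0,1], "camera matrix" means the view matrix, and the default OpenGL NDC depth range of [-1,1]:

    #include <glm/glm.hpp>

    glm::vec3 reconstructWorldPos(glm::vec2 uv, float depth,
                                  const glm::mat4& projection, const glm::mat4& view)
    {
        // Back to normalized device coordinates in [-1,1] on all three axes.
        glm::vec4 ndc(uv * 2.0f - 1.0f, depth * 2.0f - 1.0f, 1.0f);
        // Undo projection and view in one step, then apply the perspective divide.
        glm::vec4 world = glm::inverse(projection * view) * ndc;
        return glm::vec3(world) / world.w;
    }

Forgetting either the [0,1] to [-1,1] remapping or the final division by w is a common source of the kind of small, view-dependent error described in the question.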

How to use depth testing when rendering to an offscreen buffer then onto texture

Submitted by 柔情痞子 on 2019-12-09 16:26:21
Question: I'm rendering my scene to a texture. This works fine, except that depth testing does not work. How do I enable depth testing when rendering to an offscreen texture? I'm using the FrameBuffer class from http://www.opengl.org/news/comments/framebuffer_object_fbo_c_class_available_with_example_application/ glGetIntegerv(GL_DRAW_BUFFER, &drawBuffer); frameBuffer->Bind(); glDrawBuffer(GL_COLOR_ATTACHMENT0_EXT); rAngle += 0.3f; glUseProgram(0); drawSpinningTeapot(); FramebufferObject::Disable();
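The usual cause is an FBO that has only a color attachment. A sketch of adding a depth renderbuffer, written with the core FBO entry points (the EXT_framebuffer_object variants used in the question are the same calls with an EXT suffix); fbo, width and height are placeholders for the existing framebuffer and its size:

    GLuint depthRbo = 0;
    glGenRenderbuffers(1, &depthRbo);
    glBindRenderbuffer(GL_RENDERBUFFER, depthRbo);
    glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24, width, height);

    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, depthRbo);

    // Depth testing must also be enabled, and the depth attachment cleared every frame.
    glEnable(GL_DEPTH_TEST);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);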

Meaning and usage of the factor parameter in glPolygonOffset

Submitted by 不打扰是莪最后的温柔 on 2019-12-09 15:40:07
Question: I am having difficulty understanding the meaning of the first parameter of the glPolygonOffset function. void glPolygonOffset(GLfloat factor, GLfloat units); The official documentation says that factor "specifies a scale factor that is used to create a variable depth offset for each polygon," and that each fragment's depth value will be offset after it is interpolated from the depth values of the appropriate vertices. The value of the offset is factor × DZ + r × units, where DZ is a measurement of
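A hedged usage sketch of the canonical wireframe-over-fill case, which is where the distinction between the two parameters shows up (drawMeshFilled is a placeholder draw call):

    // Push the filled polygons back a little, then draw the outline at its
    // unmodified depth so it reliably passes the depth test.
    glEnable(GL_POLYGON_OFFSET_FILL);
    // offset = factor * DZ + r * units:
    //   'factor' scales with DZ, the polygon's depth slope (large at grazing angles),
    //   'units' adds multiples of r, the smallest resolvable depth difference.
    glPolygonOffset(1.0f, 1.0f);
    drawMeshFilled();                           // placeholder draw call
    glDisable(GL_POLYGON_OFFSET_FILL);

    glPolygonMode(GL_FRONT_AND_BACK, GL_LINE);
    drawMeshFilled();                           // same geometry as lines, no offset
    glPolygonMode(GL_FRONT_AND_BACK, GL_FILL);

The slope-scaled factor term is what keeps the offset large enough for polygons seen nearly edge-on, where a constant units offset alone would not resolve the z-fighting.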

Depth offset in OpenGL

Submitted by 你离开我真会死。 on 2019-12-09 03:34:49
Question: What would be the best way to offset depth in OpenGL? I currently have an index vertex attribute per polygon which I pass to the vertex shader in OpenGL. My goal is to offset the polygons in depth so that the highest index is always in front of a lower index. I currently have this simple approach modifying gl_Position.z: gl_Position.z += -index * 0.00001; Answer 1: The usual way to set an automatic offset for the depth is glPolygonOffset(GLfloat factor, GLfloat units) When GL_POLYGON
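A sketch of that glPolygonOffset route, assuming the coplanar polygons can be drawn grouped by index; drawPolygonsWithIndex and indexCount are placeholders:

    // Higher index => larger negative offset => pulled closer to the viewer,
    // so it wins the depth test against lower-indexed, coplanar polygons.
    glEnable(GL_POLYGON_OFFSET_FILL);
    for (int index = 0; index < indexCount; ++index) {
        glPolygonOffset(0.0f, -1.0f * index);   // constant offset only; no slope scaling needed here
        drawPolygonsWithIndex(index);           // placeholder draw call
    }
    glDisable(GL_POLYGON_OFFSET_FILL);

Unlike adding a constant to gl_Position.z in clip space, this offset is applied in window-space depth units, so it does not vary with the vertex's w and cannot accidentally push geometry outside the clip volume.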

Depth of Field shader for points/strokes in Processing

Submitted by 孤街浪徒 on 2019-12-07 23:53:21
Question: Recently I've been using the depth-of-field shader below (originally from the ofxPostProcessing library for openFrameworks) in my Processing sketches. depth.glsl uniform float maxDepth; void main() { float depth = gl_FragCoord.z / gl_FragCoord.w; gl_FragColor = vec4(vec3(1.0 - depth/maxDepth), 1.0); } dof.glsl uniform sampler2D texture; varying vec4 vertexture; varying vec4 vertTexCoord; uniform sampler2D tDepth; uniform float maxBlur; // max blur amount uniform float aperture; // aperture -

Rendering multiple depth information with FBOs

Submitted by 旧时模样 on 2019-12-07 15:05:20
Question: I am trying to implement a shader that computes light refraction through two surfaces: the back and the front of the object. To do so, I need to render the refractive geometry with the normal depth test (GL_LESS) and with the reversed depth test (GL_GREATER). That would allow me to compute the distance from the back face to the front face. Unfortunately, I only manage to render one of those at a time, and I can't figure out how to pass both sets of depth information to the shader as textures. The shader itself
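A sketch of capturing both depth layers into two depth textures over two passes; fboFront, fboBack, frontDepthTex, backDepthTex and drawRefractiveGeometry are placeholders, and each FBO is assumed to have only a depth texture attached:

    // Pass 1: front-most surface with the normal depth test.
    glBindFramebuffer(GL_FRAMEBUFFER, fboFront);
    glClearDepth(1.0);                                // clear to far plane so GL_LESS can pass
    glClear(GL_DEPTH_BUFFER_BIT);
    glDepthFunc(GL_LESS);
    drawRefractiveGeometry();                         // placeholder draw call

    // Pass 2: back-most surface with the reversed test.
    glBindFramebuffer(GL_FRAMEBUFFER, fboBack);
    glClearDepth(0.0);                                // clear to near plane so GL_GREATER can pass
    glClear(GL_DEPTH_BUFFER_BIT);
    glDepthFunc(GL_GREATER);
    drawRefractiveGeometry();

    // Restore defaults, then bind both depth textures as samplers for the refraction shader.
    glClearDepth(1.0);
    glDepthFunc(GL_LESS);
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, frontDepthTex);
    glActiveTexture(GL_TEXTURE1);
    glBindTexture(GL_TEXTURE_2D, backDepthTex);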