opengl-3

OpenGL & GLSL 3.3 on an HD Graphics 4000 under Ubuntu 12.04

Submitted by 三世轮回 on 2019-11-26 22:50:52
Question: I'm running this configuration: Ubuntu 12.04, Intel HD Graphics 4000. glxinfo gives me these parameters:

    OpenGL renderer string: Mesa X11
    OpenGL version string: 2.1 Mesa 8.0.4
    OpenGL shading language version string: 1.20
    OpenGL extensions:

My goal was to run OpenGL 3.3 (and so GLSL 3.3). While I'm comfortable with development issues, I'm lost when it comes to hardware and drivers, so does someone know a way to achieve that with my configuration?

Answer 1: Unfortunately, at this moment it looks like this is not possible,

OpenGL VAO best practices

Submitted by 馋奶兔 on 2019-11-26 11:48:13
Question: I'm facing an issue which I believe to be VAO-dependent, but I'm not sure. I am not sure about the correct usage of a VAO. What I used to do during GL initialization was a simple glGenVertexArrays(1, &vao) followed by glBindVertexArray(vao). Later, in my drawing pipeline, I just called glBindBuffer(), glVertexAttribPointer(), glEnableVertexAttribArray() and so on, without caring about the initially bound VAO. Is this a correct practice?

Answer 1: VAOs act similarly to VBOs and textures with
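The usual recommendation is that a VAO records vertex-attribute state, so the glVertexAttribPointer/glEnableVertexAttribArray calls belong in initialization, made while the VAO is bound, rather than repeated in the draw loop. A minimal sketch (not runnable on its own, since it needs a live GL 3.x context; `vertices`, `vertex_count`, and attribute location 0 are placeholders):

```c
/* Initialization: record the vertex layout into the VAO once. */
GLuint vao, vbo;
glGenVertexArrays(1, &vao);
glBindVertexArray(vao);

glGenBuffers(1, &vbo);
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);

/* These calls are captured by the currently bound VAO. */
glEnableVertexAttribArray(0);
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, (void *)0);

glBindVertexArray(0); /* optional: unbind so later calls can't alter it */

/* Draw loop: a single bind restores the whole recorded layout. */
glBindVertexArray(vao);
glDrawArrays(GL_TRIANGLES, 0, vertex_count);
```

With one VAO per mesh layout, the draw loop shrinks to bind-and-draw, and attribute setup errors stop depending on whatever happened to be bound previously.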

Depth value read by glReadPixels is always 1

Submitted by 我只是一个虾纸丫 on 2019-11-26 10:02:28
Question: I'm using glReadPixels to get the depth value of a selected pixel, but I always get 1. How can I solve it? Here is the code:

    glEnable(GL_DEPTH_TEST);
    ...
    glReadPixels(x, viewport[3] - y, 1, 1, GL_DEPTH_COMPONENT, GL_FLOAT, z);

Am I missing anything? My rendering part is shown below. I use different shaders to draw different parts of the scene, so how should I correctly read the depth value from the buffer?

    void onDisplay(void) {
        // Clear the window and the depth buffer
        glClear(GL_COLOR_BUFFER_BIT | GL