opengl

Why are glsl variables not working as expected?

好久不见. Submitted on 2020-06-28 06:07:48
Question: I am working on a 3D renderer which was working as expected, but now I am trying to batch every cube into a single draw call (my renderer can only draw cubes right now). Here is the GLSL program that runs for each batch: #type vertex #version 330 core layout(location = 0) in vec3 a_Position; layout(location = 1) in vec4 a_Color; layout(location = 2) in vec3 a_TexCoord; layout(location = 3) in int a_TexIndex; uniform mat4 u_ProjectionView; out vec4 v_Color; out vec3 v_TexCoord; out flat int v
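
A minimal sketch of the attribute setup that usually trips this up, not taken from the question (the Vertex struct, its layout, and a bound VAO/VBO are assumed): an integer attribute such as a_TexIndex has to be specified with glVertexAttribIPointer rather than glVertexAttribPointer, otherwise the shader receives converted floats and the flat int varying never carries the expected index.

```cpp
// Minimal sketch, assuming a bound VAO and VBO. The Vertex struct mirrors the
// shader inputs in the question; the "I" variant on attribute 3 is the key detail.
#include <GL/glew.h>
#include <cstddef>

struct Vertex {
    float position[3];
    float color[4];
    float texCoord[3];
    int   texIndex;      // matches "layout(location = 3) in int a_TexIndex"
};

void setupVertexLayout()
{
    const GLsizei stride = sizeof(Vertex);

    glEnableVertexAttribArray(0);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, stride,
                          (const void*)offsetof(Vertex, position));
    glEnableVertexAttribArray(1);
    glVertexAttribPointer(1, 4, GL_FLOAT, GL_FALSE, stride,
                          (const void*)offsetof(Vertex, color));
    glEnableVertexAttribArray(2);
    glVertexAttribPointer(2, 3, GL_FLOAT, GL_FALSE, stride,
                          (const void*)offsetof(Vertex, texCoord));

    // Integer attribute: note the "I" variant and the absence of a normalize flag.
    glEnableVertexAttribArray(3);
    glVertexAttribIPointer(3, 1, GL_INT, stride,
                           (const void*)offsetof(Vertex, texIndex));
}
```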

QOpenGLWidget shared context causing crash

六月ゝ 毕业季﹏ Submitted on 2020-06-28 03:32:50
Question: I would like to solve a problem I am still dealing with: rendering 2 QOpenGLWidgets at the same time in different top-level windows with shared shader programs etc. Why do I post it here and not on the Qt forums? I already did, but no one responded :/ My first attempt was to use one context; that wasn't working. The question: is it even possible currently with QOpenGLWidget? Or do I have to go back to the older QGLWidget? Or use something else? testAttribute for Qt::AA_ShareOpenGLContexts returns true so there is not
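
A minimal sketch of the shared-context setup, assuming Qt 5 with plain QOpenGLWidget instances standing in for the real widget subclass: the key detail is that Qt::AA_ShareOpenGLContexts must be set before the QApplication is constructed, otherwise the per-widget contexts do not share shader programs or textures.

```cpp
// Minimal sketch: two top-level QOpenGLWidget windows with resource sharing.
// Qt::AA_ShareOpenGLContexts has to be set *before* QApplication is created.
#include <QApplication>
#include <QOpenGLWidget>

int main(int argc, char *argv[])
{
    QCoreApplication::setAttribute(Qt::AA_ShareOpenGLContexts);
    QApplication app(argc, argv);

    // Two top-level widgets, each with its own context; shader programs and
    // textures created in one can then be used from the other.
    QOpenGLWidget windowA;
    QOpenGLWidget windowB;
    windowA.show();
    windowB.show();

    return app.exec();
}
```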

CPU to GPU normal mapping

大憨熊 Submitted on 2020-06-27 18:32:06
Question: I'm creating a terrain mesh and, following this SO answer, I'm trying to migrate my CPU-computed normals to a shader-based version, in order to improve performance by reducing my mesh resolution and using a normal map computed in the fragment shader. I'm using a MapBox height map for the terrain data. Tiles look like this: [tile image omitted] And elevation at each pixel is given by the following formula: const elevation = -10000.0 + ((red * 256.0 * 256.0 + green * 256.0 + blue) * 0.1); My original code first
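
A minimal sketch of how the quoted decode formula might be moved to the GPU, not the poster's code; the GLSL is held in a C++ string literal, and u_HeightMap, v_TexCoord and the 0..255 rescaling are assumptions:

```cpp
// Minimal sketch: the MapBox terrain-RGB decode from the question, evaluated in a
// fragment shader. The sampled channels are assumed to be normalized 0..1 values
// from an RGB8 texture, hence the * 255.0 before applying the formula.
const char* fragmentSrc = R"GLSL(
#version 330 core
uniform sampler2D u_HeightMap;
in vec2 v_TexCoord;
out vec4 FragColor;

float elevation(vec2 uv)
{
    vec3 rgb = texture(u_HeightMap, uv).rgb * 255.0;                   // back to 0..255
    return -10000.0 + (rgb.r * 256.0 * 256.0 + rgb.g * 256.0 + rgb.b) * 0.1;
}

void main()
{
    // Normals could then be derived from finite differences of elevation()
    // at neighbouring texels (not shown here).
    FragColor = vec4(vec3(elevation(v_TexCoord) / 9000.0), 1.0);
}
)GLSL";
```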

OpenGL, how to set up GLSL version?

我是研究僧i Submitted on 2020-06-27 08:42:11
Question: My system's default OpenGL and GLSL version using freeglut is 4.1; also, using GLEW there is no problem with its initialization, shader compilation and linking, and execution. I get this default version when I don't specify glutInitContextVersion, glutInitContextFlags or glutInitContextProfile, and then my shaders work correctly. Even though I have support for this version, I would like to provide a 3.3 alternative. When I use the glut context specifying the 3.3 version, the application starts
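
A minimal sketch of requesting the 3.3 core context explicitly with freeglut and GLEW (window title and error handling are placeholders); the shaders would then declare #version 330 core to match:

```cpp
// Minimal sketch: ask freeglut for an OpenGL 3.3 core context before the window
// is created, then initialize GLEW for that context.
#include <GL/glew.h>
#include <GL/freeglut.h>

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitContextVersion(3, 3);                       // request OpenGL 3.3
    glutInitContextProfile(GLUT_CORE_PROFILE);          // core profile
    glutInitContextFlags(GLUT_FORWARD_COMPATIBLE);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_DEPTH);
    glutCreateWindow("GL 3.3 context");

    glewExperimental = GL_TRUE;                         // needed for core profiles
    if (glewInit() != GLEW_OK) return 1;

    // glGetString(GL_VERSION) / glGetString(GL_SHADING_LANGUAGE_VERSION) can
    // confirm which context and GLSL version were actually obtained.
    return 0;
}
```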

Where do pixel gaps come from in OpenGL?

纵饮孤独 Submitted on 2020-06-27 06:58:07
Question: The problem I have is that there are some pixels in my rendered scene that seem to be missing/invisible and therefore have the same color as my clear color. Interestingly, this only happens if MSAA is turned off. My first thought was that it could have something to do with the fact that all the triangles are overlapping and somehow distorted by the projection matrix, but these artifacts only seem to occur on lines rather than edges. I read about just applying a scale of 1.00001 to
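
A minimal sketch, purely as a workaround rather than a fix for the underlying cause (typically shared edges whose vertices are not bit-identical, i.e. T-junctions): requesting a multisampled default framebuffer. freeglut is assumed here; the question does not say which windowing library is used.

```cpp
// Minimal sketch: create a window with a 4x MSAA default framebuffer so the
// single-pixel gaps along shared edges are averaged away.
#include <GL/glew.h>
#include <GL/freeglut.h>

void createMultisampledWindow(int* argc, char** argv)
{
    glutInit(argc, argv);
    glutSetOption(GLUT_MULTISAMPLE, 4);                                   // 4 samples
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_DEPTH | GLUT_MULTISAMPLE);
    glutCreateWindow("MSAA window");
    glEnable(GL_MULTISAMPLE);   // usually enabled by default on multisampled framebuffers
}
```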

Camera lens distortion in OpenGL

孤者浪人 Submitted on 2020-06-26 05:52:23
Question: I'm trying to simulate a lens distortion effect for my SLAM project. A scanned color 3D point cloud is already given and loaded in OpenGL. What I'm trying to do is render the 2D scene at a given pose and do some visual odometry between the real image from a fisheye camera and the rendered image. As the camera has severe lens distortion, it should be taken into account in the rendering stage too. The problem is that I have no idea where to put the lens distortion. Shaders? I've found some open code that
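
A minimal sketch of one common placement, not taken from the question: render the undistorted pinhole image to a texture, then warp it in a full-screen post-process pass. The GLSL below applies a simple radial (k1, k2) model; the actual fisheye model from the SLAM calibration would replace distort(), and u_Scene, u_K1, u_K2, v_Uv are assumed names.

```cpp
// Minimal sketch: fragment shader for the post-process pass, held as a C++ string.
// The first pass is assumed to have rendered the undistorted view into u_Scene.
const char* distortFrag = R"GLSL(
#version 330 core
uniform sampler2D u_Scene;   // undistorted render
uniform float u_K1;
uniform float u_K2;
in vec2 v_Uv;
out vec4 FragColor;

vec2 distort(vec2 uv)
{
    vec2 c = uv * 2.0 - 1.0;                  // to [-1, 1] around the image centre
    float r2 = dot(c, c);
    c *= 1.0 + u_K1 * r2 + u_K2 * r2 * r2;    // radial distortion term
    return c * 0.5 + 0.5;                     // back to [0, 1]
}

void main()
{
    FragColor = texture(u_Scene, distort(v_Uv));
}
)GLSL";
```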

How to run OpenCL + OpenGL inside a Docker container?

一笑奈何 Submitted on 2020-06-24 16:50:08
Question: The aim is to run an OpenCL/OpenGL (interop) app inside a Docker container, but I have not been successful yet. Intro: I have a laptop with an NVidia graphics card, so I thought leveraging the NVidia Dockerfiles [1,2] would be a good starting point. The following Dockerfile: # Dockerfile to run OpenGL app FROM nvidia/opengl:1.0-glvnd-runtime-ubuntu16.04 ENV NVIDIA_DRIVER_CAPABILITIES ${NVIDIA_DRIVER_CAPABILITIES},display RUN apt-get update && apt-get install -y --no-install-recommends \ mesa-utils