glsl

Deferred Shading and attenuation

拈花ヽ惹草 submitted on 2019-12-09 21:01:57
Question: Recently I added deferred shading support to my engine; however, I ran into some attenuation issues. As you can see, when rendering the light volume (sphere), it doesn't blend nicely with the ambient part of the image. Here is how I declare my point light:

PointLight pointlight;
pointlight.SetPosition(glm::vec3(0.0, 6.0, 0.0));
pointlight.SetIntensity(glm::vec3(1.0f, 1.0f, 1.0f));

And here is how I compute the light sphere radius:

Attenuation attenuation = pointLights[i].GetAttenuation();
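For context, a common way to derive the light-volume radius is to solve the standard constant/linear/quadratic attenuation equation for the distance at which the light's contribution drops below a visible threshold. A hedged sketch in GLSL (the parameter names are illustrative, not the engine's actual Attenuation members):

```glsl
// Solve: maxIntensity / (constant + linear*d + quadratic*d^2) = threshold
// for d, i.e. the distance where the light becomes darker than one 8-bit step.
float lightSphereRadius(float constant, float linear, float quadratic, float maxIntensity)
{
    float threshold = 5.0 / 256.0;                 // darkest step still visible in 8-bit output
    float c = constant - maxIntensity / threshold; // constant term of the quadratic
    // Positive root of: quadratic*d^2 + linear*d + c = 0
    return (-linear + sqrt(linear * linear - 4.0 * quadratic * c)) / (2.0 * quadratic);
}
```

Rendering the sphere at exactly this radius makes the hard edge of the volume coincide with the point where attenuation reaches (near) zero, which is usually what fixes the blending seam against the ambient pass.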

Rotate Normals in Shader

社会主义新天地 submitted on 2019-12-09 18:24:00
Question: I have a scene with several models, each with its own position and rotation. Given normals, the shaders apply simple bidirectional lighting to each pixel. This is my vertex shader:

#version 150
in vec3 position;
in vec3 normal;
in vec2 texcoord;
out vec3 f_normal;
out vec2 f_texcoord;
uniform mat4 model;
uniform mat4 view;
uniform mat4 proj;
void main() {
    mat4 mvp = proj * view * model;
    f_normal = normal;
    f_texcoord = texcoord;
    gl_Position = mvp * vec4(position, 1.0);
}

And here is the
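The shader above passes the normal through untransformed (f_normal = normal), so per-model rotations never reach the lighting calculation. The standard fix is to transform the normal by the inverse transpose of the model matrix, which handles rotation and stays correct under non-uniform scaling:

```glsl
// Sketch of the corrected normal path inside main():
// the inverse-transpose of the upper 3x3 of "model" is the normal matrix.
mat3 normalMatrix = mat3(transpose(inverse(model)));
f_normal = normalize(normalMatrix * normal);
```

In practice the inverse is usually computed once on the CPU and uploaded as its own uniform, since inverse() per vertex is wasteful.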

Textured points in OpenGL ES 2.0?

為{幸葍}努か submitted on 2019-12-09 15:46:21
Question: I'm trying to implement textured points (i.e. point sprites) in OpenGL ES 2.0 for a particle system. The problem I'm having is that the points all render as solid black squares rather than having the texture properly mapped. I have verified that gl_PointCoord is in fact returning x/y values from 0.0 to 1.0, which would map across the entire texture. The texture2D call always seems to return black, though. My vertex shader:

attribute vec4 aPosition;
attribute float aAlpha;
attribute float aSize;
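For reference, a minimal ES 2.0 point-sprite fragment shader looks like the sketch below (the uniform and varying names are assumed, not taken from the question). If a shader like this still outputs black, the usual culprit is sampling an incomplete texture, e.g. a mipmap-based minification filter on a texture that has no mipmaps, or the sampler uniform never being bound to the right texture unit:

```glsl
precision mediump float;

uniform sampler2D uTexture; // assumed sampler name; must be set to the bound unit
varying float vAlpha;       // per-particle alpha forwarded from the vertex shader

void main() {
    // gl_PointCoord spans (0,0)..(1,1) across the point sprite.
    vec4 texColor = texture2D(uTexture, gl_PointCoord);
    gl_FragColor = vec4(texColor.rgb, texColor.a * vAlpha);
}
```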

OpenGL glLinkProgram returns false but info log is empty; checked everything

不想你离开。 submitted on 2019-12-09 10:24:46
Question: I must admit this is my first time implementing shaders; previously I have only worked with the fixed-function pipeline. However, though I am certain that everything I did is correct, there must be an error. glLinkProgram(program) returns GL_FALSE when queried for GL_LINK_STATUS. In addition, the info log is empty (when I query the log length it is 1, which is the null terminator per the docs, so that checks out). So: linker errors, and no logs. In addition, I had just discovered that the linker

Dashed line in OpenGL3?

╄→尐↘猪︶ㄣ submitted on 2019-12-09 06:27:16
Question: I'm currently porting an old OpenGL 1.1 application, which makes use of wireframe models, to OpenGL 3.0. In 1.1 the following code is used to create a dashed line:

glPushAttrib(GL_ENABLE_BIT);
glLineStipple(1, 0x0F0F);
glEnable(GL_LINE_STIPPLE);

Here, as usual, the parameters are pushed to the attribute stack in order to influence all following drawing operations. My question: how is this done in OpenGL 3, where this stack is no longer used? How can I set up my lines to be dashed (probably before handing the
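One common core-profile replacement for glLineStipple is to discard fragments in the fragment shader based on a distance measured along the line. A minimal sketch, where the "dist" varying (written by the vertex or geometry stage) and the 10-unit period are illustrative; this gives a coarse on/off dash rather than reproducing the exact 0x0F0F bit pattern:

```glsl
#version 150

in float dist;        // assumed: accumulated distance along the line, in pixels
out vec4 fragColor;

void main() {
    // Kill the second half of every 10-pixel period to produce dashes.
    if (mod(dist, 10.0) > 5.0)
        discard;
    fragColor = vec4(1.0);
}
```

Computing a perspective-correct screen-space distance typically requires a geometry shader (or a per-segment start coordinate passed as a flat varying), since a plain varying is interpolated in clip space.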

What is, in simple terms, textureGrad()?

馋奶兔 submitted on 2019-12-09 06:07:37
Question: I read the Khronos wiki on this, but I don't really understand what it is saying. What exactly does textureGrad do? I think it samples multiple mipmap levels and computes some color mixing using the explicit derivative vectors given to it, but I am not sure.

Answer 1: When you sample a texture, you need specific texture coordinates to sample the texture data at. For the sake of simplicity, I'm going to assume a 2D texture, so the texture coordinates are a 2D vector (s,t). (The explanation is
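In short: textureGrad() is texture() with the implicit screen-space derivatives replaced by caller-supplied ones; the derivatives drive mipmap level (and anisotropy) selection, not color mixing per se. Inside uniform control flow, the two calls in this fragment-shader sketch should select the same mipmap level (sampler and coordinate names are illustrative):

```glsl
// Assumed declarations: uniform sampler2D tex; in vec2 uv;
// texture() uses dFdx/dFdy of the coordinate implicitly:
vec4 a = texture(tex, uv);
// textureGrad() makes those derivatives explicit:
vec4 b = textureGrad(tex, uv, dFdx(uv), dFdy(uv));
```

The explicit form matters when the implicit derivatives are unusable, e.g. inside non-uniform branches or when the coordinate is warped in a way the hardware's finite differences would mis-measure.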

Use of undeclared identifier 'gl_InstanceID'

て烟熏妆下的殇ゞ submitted on 2019-12-09 04:03:29
Hi everyone, I have been trying instanced drawing in OpenGL ES 2.0 on the iOS platform. My rendering code:

glEnableVertexAttribArray(...);
glVertexAttribPointer(...);
glDrawElementsInstancedEXT(GL_TRIANGLES, IndicesCount, GL_UNSIGNED_SHORT, 0, 5);

And my vertex shader:

attribute vec4 VertPosition;
uniform mat4 mvpMatrix[600];
void main() {
    gl_Position = (mvpMatrix[gl_InstanceID]) * VertPosition;
}

I'm getting ERROR: Use of undeclared identifier 'gl_InstanceID'. My GLSL version is 1.0; if the version is the issue, how can I upgrade? Is there any other way to use gl_InstanceID in GLSL?

BDL: gl_InstanceID is
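GLSL ES 1.00 has no gl_InstanceID built-in (it was only added in ES 3.00). However, the same EXT_draw_instanced extension that provides glDrawElementsInstancedEXT also exposes the instance index to ES 1.00 shaders under the name gl_InstanceIDEXT, once the shader requests the extension:

```glsl
#extension GL_EXT_draw_instanced : require

attribute vec4 VertPosition;
uniform mat4 mvpMatrix[600];

void main() {
    // gl_InstanceIDEXT is the per-instance counter supplied by
    // glDrawElementsInstancedEXT (0..instanceCount-1).
    gl_Position = mvpMatrix[gl_InstanceIDEXT] * VertPosition;
}
```

The alternative, if the extension is unavailable, is to pass the instance index yourself, e.g. as a per-instance vertex attribute or by issuing one draw call per instance with a uniform index.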

Depth offset in OpenGL

你离开我真会死。 submitted on 2019-12-09 03:34:49
Question: What would be the best way to offset depth in OpenGL? I currently have an index vertex attribute per polygon which I am passing to the vertex shader. My goal is to offset the polygons in depth such that the highest index is always in front of a lower index. I currently have this simple approach, modifying gl_Position.z:

gl_Position.z += -index * 0.00001;

Answer 1: The usual way to set an automatic offset for the depth is glPolygonOffset(GLfloat factor, GLfloat units). When GL_POLYGON
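One caveat with the raw gl_Position.z tweak in the question: gl_Position is in clip space, and the final depth is z/w, so a fixed clip-space offset shrinks in NDC as the vertex moves away from the camera. Scaling the offset by w keeps the bias roughly constant in depth-buffer terms; a sketch (mvp, position, and index as declared by the question's shader):

```glsl
gl_Position = mvp * vec4(position, 1.0);
// Offset in NDC units: after the perspective divide (z/w) this becomes
// a distance-independent bias of index * 1e-5, higher index closer.
gl_Position.z -= float(index) * 0.00001 * gl_Position.w;
```

glPolygonOffset, as the answer notes, is the fixed-function route; the shader route is useful precisely when the bias must depend on a per-primitive attribute such as this index.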

Precise control over texture bits in GLSL

自古美人都是妖i submitted on 2019-12-09 02:51:43
Question: I am trying to implement an octree traversal scheme using OpenGL and GLSL, and would like to keep the data in textures. While there is a big selection of formats to use for the texture data (floats and integers of different sizes), I am having trouble figuring out whether there is a way to get more precise control over the bits, and thus achieve greater efficiency and more compact storage. This might be a general problem, not applying only to OpenGL and GLSL. As a simple toy example, let's say that I
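One way to get bit-level control in modern GLSL (1.30 and later) is to store the data in an unsigned-integer texture format such as GL_R32UI, fetch whole 32-bit words with texelFetch, and unpack fields with shifts and masks. A sketch with an assumed node layout of a 24-bit child pointer plus 8 flag bits (the layout and names are illustrative, not from the question):

```glsl
#version 330

uniform usampler2D octree;   // hypothetical GL_R32UI node texture

// Unpack one octree node word into its assumed fields.
void decodeNode(ivec2 coord, out uint childPtr, out uint flags) {
    uint word = texelFetch(octree, coord, 0).r; // exact 32-bit word, no filtering
    childPtr = word >> 8;        // upper 24 bits: index of first child
    flags    = word & 0xFFu;     // lower 8 bits: leaf/occupancy flags
}
```

Integer textures are never filtered, so the bits come back exactly as uploaded, which is what makes this kind of packing reliable; with GLSL 4.00+ the same unpacking can also be written with bitfieldExtract.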

hazy artefact on OS X WebGL on sides of volume rendering

谁都会走 submitted on 2019-12-08 20:00:32
Does anyone know how to sort out this weird effect? The sides of the volume we're trying to render seem artificially hazy. I'm running it on a 2014 MacBook Pro, Intel Iris 1536 MB GPU, Yosemite v10.10.2 (14C1514). I've heard that this is only a problem on machines running OS X, and that it doesn't appear on Windows machines. I've also noticed it in some other places, e.g. Leberba khronos.org/bugzilla/show_bug.cgi?id=1337. Bug reported, so closing.

Source: https://stackoverflow.com/questions/29493673/hazy-artefact-on-os-x-webgl-on-sides-of-volume-rendering