glsl

Get old-style OpenGL code to work in GLSL

Posted by ℡╲_俬逩灬. on 2020-01-05 16:48:23
Question: I am trying to draw this pattern in OpenGL: To get this, I created the pattern like:

```cpp
vector< vector<DataPoint> > datas;
float Intensitytemp = 0;
float xPos = 0, yPos = 0, angleInRadians = 0;
for (float theta = 0.0f; theta < 4096; theta += 1.f)
{
    vector<DataPoint> temp;
    angleInRadians = 2 * M_PI * theta / 4096;
    for (float r = 0; r < 4096; r += 1.f)
    {
        xPos = cos(angleInRadians) * r / 4096;
        yPos = sin(angleInRadians) * r / 4096;
        Intensitytemp = ((float)((int)r % 256)) / 255;
        DataPoint dt;
        dt.x = xPos;
        dt.y
```
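Note: the polar sweep in the snippet above can be sketched in Python to see what it computes — for each angle step, points walk outward along the radius and the intensity cycles with `r mod 256`. The function name and the shrunken loop bounds are illustrative, not from the question:

```python
import math

def generate_pattern(steps=8, radii=8, size=4096):
    """Sketch of the question's polar sweep: for each angle theta,
    walk outward along radius r; intensity cycles with r mod 256.
    steps/radii are shrunk from 4096 for illustration."""
    points = []
    for theta in range(steps):
        angle = 2 * math.pi * theta / size
        for r in range(radii):
            x = math.cos(angle) * r / size
            y = math.sin(angle) * r / size
            intensity = (r % 256) / 255.0
            points.append((x, y, intensity))
    return points

pts = generate_pattern()
# The first radius step of every angle sits at the origin:
print(pts[0])  # (0.0, 0.0, 0.0)
```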

How to optimize a color gradient shader?

Posted by ε祈祈猫儿з on 2020-01-05 14:03:11
Question: I have created this simple fragment shader to achieve a vertical color-gradient effect, but I find it taxing for my mobile device in full screen. Is there any way to optimize this? Here is the link to the code: http://glsl.heroku.com/e#13541.0

Answer 1: You could do something like this instead.

```glsl
vec2 position = (gl_FragCoord.xy / resolution.xy);
vec4 top = vec4(1.0, 0.0, 1.0, 1.0);
vec4 bottom = vec4(1.0, 1.0, 0.0, 1.0);
gl_FragColor = vec4(mix(bottom, top, position.y));
```

Example You can
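Note: the whole answer hinges on GLSL's built-in `mix(x, y, a) = x*(1-a) + y*a`, applied per component. A small Python sketch of the same interpolation (the helper is hypothetical, written to mirror the shader):

```python
def mix(a, b, t):
    """GLSL-style linear interpolation, component-wise on color tuples:
    mix(a, b, t) = a*(1-t) + b*t."""
    return tuple(x * (1.0 - t) + y * t for x, y in zip(a, b))

bottom = (1.0, 1.0, 0.0, 1.0)  # yellow, as in the answer's shader
top    = (1.0, 0.0, 1.0, 1.0)  # magenta

print(mix(bottom, top, 0.0))  # bottom edge -> (1.0, 1.0, 0.0, 1.0)
print(mix(bottom, top, 1.0))  # top edge    -> (1.0, 0.0, 1.0, 1.0)
print(mix(bottom, top, 0.5))  # midpoint    -> (1.0, 0.5, 0.5, 1.0)
```

Because the gradient is a single `mix` per fragment, there is little arithmetic left to remove; that is why the answer's version is cheap compared with the linked original.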

GLSL Shader - Shadow between 2 textures on a plane

Posted by 我是研究僧i on 2020-01-05 10:36:17
Question: I'm writing a game with AGK (App Game Kit) and I wanted to make some shadows with shaders. (AGK only supports GLSL 1.20 at the moment.) In my game, I have a plane object with 2 textures. The first texture is the background texture, and the second is the foreground texture, with a transparent path through which we see the background texture (they are like the walls), and I have a pointer light. Here is an example (left is what I have, right is what I want): And here is the code: attribute

Draw a line segment in a fragment shader

Posted by 牧云@^-^@ on 2020-01-05 07:47:12
Question: I'm struggling to understand the following code; the idea is to draw a simple segment in a fragment shader. I tried to decompose it, but I still don't get the ??? line. It would be awesome to have a nice explanation. I couldn't find anything on SO or Google.

```glsl
float lineSegment(vec2 p, vec2 a, vec2 b) {
    float thickness = 1.0 / 100.0;
    vec2 pa = p - a;
    vec2 ba = b - a;
    float h = clamp(dot(pa, ba) / dot(ba, ba), 0.0, 1.0); // ????????
    float idk = length(pa - ba * h);
    return smoothstep(0.0, thickness, idk
```
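Note on the ??? line: `dot(pa, ba) / dot(ba, ba)` is the normalized projection of `p` onto the infinite line through `a` and `b`; clamping it to [0, 1] pins the projection to the segment, so `a + ba*h` is the closest point on the segment and `length(pa - ba*h)` is the distance to it. The same math in Python (function name is illustrative):

```python
def line_segment_distance(p, a, b):
    """Distance from point p to segment ab.
    h = clamp(dot(pa,ba)/dot(ba,ba), 0, 1) projects p onto the line
    through a and b, clamped so the closest point stays on the segment."""
    pa = (p[0] - a[0], p[1] - a[1])
    ba = (b[0] - a[0], b[1] - a[1])
    t = pa[0] * ba[0] + pa[1] * ba[1]
    h = max(0.0, min(1.0, t / (ba[0] * ba[0] + ba[1] * ba[1])))
    dx, dy = pa[0] - ba[0] * h, pa[1] - ba[1] * h
    return (dx * dx + dy * dy) ** 0.5

# Point above the middle of a horizontal unit segment:
print(line_segment_distance((0.5, 1.0), (0.0, 0.0), (1.0, 0.0)))  # 1.0
# Point past the end: the clamp measures to endpoint b, not the line:
print(line_segment_distance((2.0, 0.0), (0.0, 0.0), (1.0, 0.0)))  # 1.0
```

Without the clamp, the second case would return 0, because the unclamped projection lands on the line's extension; the clamp is exactly what makes this a *segment* distance.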

Linking and compiling shaders for OpenGL 3.3 and 2.1 using LWJGL

Posted by 徘徊边缘 on 2020-01-05 05:50:06
Question: I am using LWJGL to create an OpenGL context. I can get it running on my machine (OpenGL 4.2 compatible) and, with changes to the simple shaders, also on OpenGL 2.1. I have to write code (the shaders, or rather the linking and compilation of them, seem to be the problem here) that is compatible with OpenGL 2.1. I assumed it would be easy to just write:

The frag shader:

```glsl
#version 120

in vec4 pass_Color;
out vec4 out_Color;

void main(void) {
    out_Color = pass_Color;
}
```

The vert shader:

```glsl
#version 120
```
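Note: the shader above is likely the source of the compile failure — GLSL 1.20 (`#version 120`) predates the `in`/`out` interface qualifiers and user-declared fragment outputs. A hedged sketch of 1.20-compatible shaders would use `attribute`/`varying` and the built-in `gl_FragColor` instead (the attribute names here are assumptions, not from the question):

```glsl
// Vertex shader, #version 120: attribute/varying instead of in/out
#version 120
attribute vec4 in_Position;  // hypothetical attribute name
attribute vec4 in_Color;     // hypothetical attribute name
varying vec4 pass_Color;

void main(void) {
    gl_Position = gl_ModelViewProjectionMatrix * in_Position;
    pass_Color = in_Color;
}
```

```glsl
// Fragment shader, #version 120: write the built-in gl_FragColor,
// no user-declared out variable
#version 120
varying vec4 pass_Color;

void main(void) {
    gl_FragColor = pass_Color;
}
```

The `varying` in the vertex shader and the `varying` in the fragment shader must match by name for linking to succeed.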

GLSL vertex shader crashes computer

Posted by 让人想犯罪 __ on 2020-01-05 04:22:08
Question: I've been trying to pinpoint the exact cause of a GLSL shader that crashes my computer. I'm running Mac OS X 10.8.2 with an NVIDIA GeForce 9400M. The shader renders correctly but will occasionally crash my computer, drawing regions of black over the display (including outside of the rendering window) until the computer becomes unresponsive. I receive no errors from glGetError and no errors during shader compilation. It appears that the crash no longer occurs when I remove a uniform mat4 from

OpenGL: Blinn-Phong model implemented in shaders gives wrong result

Posted by £可爱£侵袭症+ on 2020-01-05 04:18:30
Question: First of all, I'm new to computer graphics and OpenGL and have only basic C++ coding knowledge. I have struggled with this OpenGL project for about a month and have come to the point where I have to implement shading with the Blinn-Phong model. I have implemented the calculations in the vertex and fragment shaders. There are probably some minor errors in the code, because without shading everything works perfectly, but after the shading parts are added to the shaders, nothing happens. I calculate surface normals in
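Note: the core of Blinn-Phong, and the usual place bugs hide, is the specular term — it uses the halfway vector H = normalize(L + V) rather than Phong's reflected light vector. A small Python sketch of that term (names are illustrative; the question's actual shader code is truncated above):

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def blinn_phong_specular(light_dir, view_dir, normal, shininess):
    """Blinn-Phong specular term: build the halfway vector
    H = normalize(L + V), then take max(dot(N, H), 0) ** shininess.
    All direction vectors must be normalized, or highlights vanish/blow up."""
    l = normalize(light_dir)
    v = normalize(view_dir)
    h = normalize(tuple(a + b for a, b in zip(l, v)))
    n = normalize(normal)
    n_dot_h = max(sum(a * b for a, b in zip(n, h)), 0.0)
    return n_dot_h ** shininess

# Light and viewer both along the surface normal: maximal highlight
print(blinn_phong_specular((0, 0, 1), (0, 0, 1), (0, 0, 1), 32))  # 1.0
```

A classic "nothing happens" cause is doing this math with L, V, and N in different coordinate spaces (e.g. N in eye space but L in world space); all three must live in the same space.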

Error in Geometry Shader from large array

Posted by 点点圈 on 2020-01-05 03:37:10
Question: When I try to link my geometry shader, it throws the following error:

```
0(76) : error C5041: cannot locate suitable resource to bind variable "triTable". Possibly large array.
```

in reference to this array declared in the shader:

```glsl
const int triTable[256][16] = {
    { -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 },
    { 0, 8, 3, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 },
    { 0, 1, 9, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 },
    ...
    ...
    ...
    { -1, -1, -1, -1, -1, -1, -1,
```
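Note: error C5041 means the driver cannot fit the 256×16 constant array into shader constant registers. A common workaround (an assumption here, not stated in the truncated thread) is to upload the table from the application as a 1D buffer — e.g. a texture buffer or uniform buffer — and index it as `flat[case * 16 + slot]`. The flattening index math, sketched in Python:

```python
# Sketch: flatten the 256x16 marching-cubes triTable into a 1D list
# so it can be uploaded as a buffer and indexed as flat[case*16 + slot].
tri_table = [[-1] * 16 for _ in range(256)]
tri_table[1][:3] = [0, 8, 3]  # example rows shown in the question
tri_table[2][:3] = [0, 1, 9]

flat = [v for row in tri_table for v in row]

def lookup(case, slot):
    """Equivalent of triTable[case][slot] on the flattened buffer."""
    return flat[case * 16 + slot]

print(lookup(1, 0), lookup(1, 1), lookup(1, 2))  # 0 8 3
print(lookup(0, 0))                              # -1
```

In the shader the same expression becomes a `texelFetch` on the buffer texture with index `case * 16 + slot`.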

Shader for counting number of pixels

Posted by 谁说我不能喝 on 2020-01-05 02:34:12
Question: I'm looking for a CG or HLSL shader that can count the number of red pixels, or pixels of any other color I want.

Answer 1: You could do this with atomic counters in a fragment shader. Just test the output color to see if it's within a certain tolerance of red, and if so, increment the counter. After the draw call you should be able to read the counter's value on the CPU and do whatever you like with it.

edit: added a very simple example fragment shader:

```glsl
// Atomic counters require 4.2 or higher according
```
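Note: the counting logic the answer describes — compare each pixel to the target color within a tolerance, increment on match — can be sketched on the CPU in Python (a plain loop stands in for the per-fragment atomic increment; the function and tolerance value are illustrative):

```python
def count_matching_pixels(pixels, target=(1.0, 0.0, 0.0), tolerance=0.1):
    """CPU analogue of the atomic-counter approach: a pixel counts as a
    match if every channel is within `tolerance` of the target color."""
    count = 0
    for p in pixels:
        if all(abs(c - t) <= tolerance for c, t in zip(p, target)):
            count += 1  # in the shader this is an atomicCounterIncrement
    return count

image = [
    (1.00, 0.00, 0.00),  # pure red        -> counted
    (0.95, 0.05, 0.00),  # near red        -> counted
    (0.00, 1.00, 0.00),  # green           -> not counted
    (1.00, 1.00, 1.00),  # white           -> not counted
]
print(count_matching_pixels(image))  # 2
```

On the GPU the same predicate runs once per fragment, and the accumulated counter is read back on the CPU after the draw call, as the answer says.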