shader

OpenGL: Blinn-Phong model implemented in shaders gives wrong result

£可爱£侵袭症+ submitted on 2020-01-05 04:18:30
Question: First of all, I'm new to computer graphics and OpenGL, and have basic knowledge of C++ coding. I have struggled with my OpenGL project for about a month and have come to the point where I have to implement shading with the Blinn-Phong model. I have implemented the calculations in the vertex and fragment shaders. There is probably some minor error in the code, because without shading everything works perfectly, but after the shading parts are added to the shaders nothing happens. I calculate surface normals in
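The core of the model is easy to check on the CPU before debugging it in GLSL. Below is a minimal Python sketch of the Blinn-Phong diffuse and specular terms (function and parameter names are my own, not from the question's shaders). A frequent cause of "nothing happens" is an unnormalized normal or light/view vector; the sketch normalizes everything explicitly.

```python
import math

def normalize(v):
    """Return v scaled to unit length."""
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def blinn_phong(normal, light_dir, view_dir, shininess=32.0):
    """Diffuse and specular terms of the Blinn-Phong model.

    All direction vectors must point *away* from the surface point and
    be normalized -- forgetting either step is a classic shader bug.
    """
    n = normalize(normal)
    l = normalize(light_dir)
    v = normalize(view_dir)
    h = normalize(tuple(a + b for a, b in zip(l, v)))  # halfway vector
    diffuse = max(dot(n, l), 0.0)
    # Gate the specular term so back-facing lights add no highlight.
    specular = max(dot(n, h), 0.0) ** shininess if diffuse > 0.0 else 0.0
    return diffuse, specular
```

With light and viewer both directly above the surface, both terms should be exactly 1; with the light behind the surface, both should be 0.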

Error in Geometry Shader from large array

点点圈 submitted on 2020-01-05 03:37:10
Question: When I try to link my Geometry Shader, it throws the following error: 0(76) : error C5041: cannot locate suitable resource to bind variable "triTable". Possibly large array. This refers to the following array declared in the shader: const int triTable[256][16] = { { -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, { 0, 8, 3, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, { 0, 1, 9, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, ... ... ... { -1, -1, -1, -1, -1, -1, -1,
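A common workaround for this driver limit is to move the table out of the shader entirely — into an integer texture or a uniform/shader storage buffer — and index it with flat arithmetic. A hedged Python sketch of that flattening (names are illustrative; only the two non-trivial rows quoted above are filled in):

```python
# Flatten a [256][16] table into the 1D layout you would upload to a
# buffer or integer texture, then look entries up with i * 16 + j --
# the same arithmetic a shader would use with texelFetch.
tri_table = [[-1] * 16 for _ in range(256)]
tri_table[1][:3] = [0, 8, 3]   # rows taken from the question's excerpt
tri_table[2][:3] = [0, 1, 9]

flat = [v for row in tri_table for v in row]

def lookup(case_index, edge_slot):
    """Fetch triTable[case_index][edge_slot] from the flat layout."""
    return flat[case_index * 16 + edge_slot]
```

The shader-side lookup is then a single fetch at the computed flat index, so no large const array ever has to be bound.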

ThreeJS: How can I use a shader to filter vertices by property?

南笙酒味 submitted on 2020-01-04 17:31:28
Question: I'm using custom shaders to allow slider filters on the X, Y, and Z coordinates of a particle system, following this GitHub issue. I'd like to expand this to allow filtering on non-positional properties, such as a cost associated with each vertex. The shader is a modified version of ThreeJS's particle_basic shader. Shader // snip // filter by co-ordinate "if(myWorldPosition.x < xMin) discard;", "if(myWorldPosition.x > xMax) discard;", "if(myWorldPosition.y < yMin) discard;", "if(myWorldPosition
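The discard logic generalizes directly: pass the extra property as a vertex attribute plus min/max uniforms, and discard when it is out of range. A CPU-side Python sketch of the same predicate (attribute and threshold names are assumptions, not from the linked shader):

```python
def visible(particle, x_min, x_max, cost_max):
    """Mirror of the shader's discard tests: a particle survives only
    if every filter passes (returning False here == discard)."""
    if particle["x"] < x_min or particle["x"] > x_max:
        return False
    if particle["cost"] > cost_max:   # the new, non-positional filter
        return False
    return True

particles = [
    {"x": 0.5, "cost": 10.0},   # in range, cheap  -> kept
    {"x": 2.0, "cost": 1.0},    # x out of range   -> discarded
    {"x": 0.2, "cost": 99.0},   # too expensive    -> discarded
]
kept = [p for p in particles if visible(p, 0.0, 1.0, 50.0)]
```

In the shader this becomes one more varying (fed by a custom attribute) and one more `if (... ) discard;` line alongside the positional ones.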

Chroma key Fragment Shader fails to find the color

↘锁芯ラ submitted on 2020-01-04 11:42:55
Question: I'm trying to write a fragment shader that functions as a chroma-key filter for a specific color (for example, making all pixels with a specific green transparent). The shader I'm writing is for use in WebGL through PIXI.js. JSFiddle: https://jsfiddle.net/IbeVanmeenen/hexec6eg/14/ So far, I have written this code for the shader, based on a shader I found here. varying vec2 vTextureCoord; uniform float thresholdSensitivity; uniform float smoothing; uniform vec3 colorToReplace; uniform sampler2D
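Chroma-key shaders in this family typically compute a distance between the pixel color and colorToReplace, then feed it through smoothstep(thresholdSensitivity, thresholdSensitivity + smoothing, d) to get the output alpha. A Python sketch of that math (using plain RGB distance for simplicity; the referenced shader compares chroma channels instead):

```python
import math

def smoothstep(edge0, edge1, x):
    """GLSL smoothstep: 0 below edge0, 1 above edge1, smooth in between."""
    t = max(0.0, min(1.0, (x - edge0) / (edge1 - edge0)))
    return t * t * (3.0 - 2.0 * t)

def chroma_key_alpha(pixel_rgb, key_rgb, threshold_sensitivity, smoothing):
    """Alpha for one pixel: 0 (fully keyed out) when the color is within
    threshold_sensitivity of the key color, ramping to 1 over the
    smoothing band."""
    dist = math.dist(pixel_rgb, key_rgb)
    return smoothstep(threshold_sensitivity,
                      threshold_sensitivity + smoothing, dist)
```

A pixel exactly matching the key color gets alpha 0; a color far from it gets alpha 1; colors inside the smoothing band fade between the two, which is what hides the hard keying edge.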

Offset gl_Position or gl_Vertex by pixels value

拜拜、爱过 submitted on 2020-01-04 06:45:47
Question: I have an attribute that contains pixel values, and I want to offset my gl_Vertex by this attribute's value. The problem is that my gl_Vertex is in world units while the offset attribute is in pixels. I can do it if I send the screen size as a uniform and then convert the pixels to -1..1 values and add that to the final gl_Position. But I don't want to manage screen-size events and send the size on every draw, in every shader I have. Is there any way to do it with some matrix play, or
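Whatever mechanism ends up delivering the viewport size, the conversion itself is fixed arithmetic: one pixel spans 2/viewport units in NDC, and multiplying the offset by gl_Position.w before the perspective divide keeps the on-screen shift pixel-exact. A Python sketch of exactly that arithmetic (function names are mine):

```python
def pixel_offset_to_ndc(dx_px, dy_px, viewport_w, viewport_h):
    """One pixel spans 2/viewport units in NDC (NDC runs from -1 to 1)."""
    return 2.0 * dx_px / viewport_w, 2.0 * dy_px / viewport_h

def offset_clip_position(clip, dx_px, dy_px, viewport_w, viewport_h):
    """Apply a pixel offset to a clip-space position (x, y, z, w).
    Scaling the NDC offset by w makes it survive the later perspective
    divide, so the shift stays exactly dx_px/dy_px on screen."""
    x, y, z, w = clip
    ndc_x, ndc_y = pixel_offset_to_ndc(dx_px, dy_px, viewport_w, viewport_h)
    return (x + ndc_x * w, y + ndc_y * w, z, w)
```

On an 800x600 viewport, a (400, 300)-pixel offset is exactly half the screen, i.e. a full NDC unit on each axis.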

How to get BitmapImage bytes after applying <Image.Effect>

∥☆過路亽.° submitted on 2020-01-04 01:52:08
Question: This one: BitmapSource originalImage; byte[] _originalPixels; _originalPixels = new byte[(int) originalImage.Width*(int) originalImage.Height*4]; originalImage.CopyPixels(_originalPixels, 4*(int) originalImage.Width, 0); copies the image bytes before applying a filter, which is no surprise. How do I get the bytes with an effect already applied? How do I apply a shader effect programmatically to a byte[] or some kind of low-level pixel structure array? Answer 1: How about rendering your image to a

C++ OpenGL shading version error - GLSL x is not supported [Ubuntu 16.04]

旧街凉风 submitted on 2020-01-03 13:36:17
Question: I am currently working on a project using OpenGL on Ubuntu 16.04 and have run into a major issue. At this point I have no idea what to do, as it feels like I have tried everything to fix this. For some reason my shader just won't compile and returns the following error: Failed to compile vertex shader! 0:1(10): error: GLSL 4.50 is not supported. Supported versions are: 1.10, 1.20, 1.30, 1.00 ES, 3.00 ES, 3.10 ES, and 3.20 ES I have adjusted the version in the shader file without any
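The "Supported versions" list shows the context is a legacy (pre-3.3, non-core) one, so `#version 450` can never compile against it; on Mesa the usual fix is to request a 3.3+ core profile before creating the context. As a stopgap, a loader can also pick the highest desktop version the driver actually reports, sketched here in Python (hypothetical helper, not from the question):

```python
# The driver's error message lists which GLSL versions the current
# context accepts. A defensive shader loader can derive the best
# "#version" directive from that list instead of hard-coding 450.
supported = ["1.10", "1.20", "1.30", "1.00 ES", "3.00 ES", "3.10 ES", "3.20 ES"]

def best_desktop_version(versions):
    """Pick the highest non-ES entry and format it as a GLSL
    #version directive (e.g. "1.30" -> "#version 130")."""
    desktop = [v for v in versions if not v.endswith("ES")]
    best = max(desktop, key=lambda v: tuple(int(p) for p in v.split(".")))
    major, minor = best.split(".")
    return f"#version {major}{minor}"
```

For the list in this error that yields `#version 130` — enough to compile, though the real cure is creating a core-profile context so the driver exposes modern GLSL.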