shader

GLSL's three variable types (uniform, attribute, and varying)

我只是一个虾纸丫 submitted on 2019-12-25 01:23:35
1. uniform variables
A uniform variable is passed to the (vertex and fragment) shaders from the outside: the application assigns it through the glUniform*() family of functions. Inside a vertex or fragment shader, a uniform behaves like a C const: the shader can read it but cannot modify it (the shader only uses it, never changes it). If a uniform is declared exactly the same way in both the vertex and fragment shaders, it is shared between them, effectively acting as a global variable visible to both stages. Uniforms are typically used for transformation matrices, material properties, lighting parameters, colors, and similar data. For example:
uniform mat4 viewProjMatrix; // projection * view matrix
uniform mat4 viewMatrix; // view matrix
uniform vec3 lightPosition; // light source position
uniform float lumaThreshold;
uniform float chromaThreshold;
uniform sampler2D SamplerY;
uniform sampler2D SamplerUV;
uniform mat3 colorConversionMatrix;
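
As a minimal sketch of how such uniforms are consumed (not part of the original post; the attribute and varying names aPosition and vLightDir are illustrative), a legacy-style GLSL vertex shader might look like this:

uniform mat4 viewProjMatrix;  // projection * view matrix, set by the application (e.g. via glUniformMatrix4fv)
uniform vec3 lightPosition;   // light position, set by the application (e.g. via glUniform3f)

attribute vec3 aPosition;     // per-vertex input from a vertex buffer (illustrative name)

varying vec3 vLightDir;       // written here, interpolated, and read by the fragment shader

void main()
{
    // Uniforms are read-only inside the shader: they are used, never assigned.
    vLightDir = normalize(lightPosition - aPosition);
    gl_Position = viewProjMatrix * vec4(aPosition, 1.0);
}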

Drawing a cube for each vertex

ぃ、小莉子 submitted on 2019-12-25 00:48:29
Question: I have a list of 3D vertices which I can easily render as a point cloud by passing the whole list to my vertex shader, setting gl_Position = pos, then setting FragColor = vec4(1.0, 1.0, 1.0, 1.0) and using GL_POINTS in the drawing call. I would now like to render an actual cube at each vertex position, with the vertex being the center of the cube and some given width. How can I achieve this in the easiest and most performant way? Looping through all vertices, loading a cube into a buffer and…
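
One common approach, sketched below (this is not the asker's code; uViewProj and uHalfWidth are assumed uniform names), is a geometry shader that expands each point into a cube on the GPU, so the original vertex buffer can keep being drawn as GL_POINTS:

#version 330 core
layout(points) in;
layout(triangle_strip, max_vertices = 24) out;

uniform mat4 uViewProj;    // assumed: combined view-projection matrix
uniform float uHalfWidth;  // assumed: half the cube edge length

void emitFace(vec3 center, vec3 u, vec3 v)
{
    // One cube face emitted as a 4-vertex triangle strip.
    gl_Position = uViewProj * vec4(center - u - v, 1.0); EmitVertex();
    gl_Position = uViewProj * vec4(center + u - v, 1.0); EmitVertex();
    gl_Position = uViewProj * vec4(center - u + v, 1.0); EmitVertex();
    gl_Position = uViewProj * vec4(center + u + v, 1.0); EmitVertex();
    EndPrimitive();
}

void main()
{
    // Assumes the vertex shader passes the untransformed point position through gl_Position.
    vec3 c = gl_in[0].gl_Position.xyz;
    float h = uHalfWidth;
    emitFace(c + vec3( h, 0.0, 0.0), vec3(0.0, h, 0.0), vec3(0.0, 0.0, h)); // +X
    emitFace(c + vec3(-h, 0.0, 0.0), vec3(0.0, 0.0, h), vec3(0.0, h, 0.0)); // -X
    emitFace(c + vec3(0.0,  h, 0.0), vec3(0.0, 0.0, h), vec3(h, 0.0, 0.0)); // +Y
    emitFace(c + vec3(0.0, -h, 0.0), vec3(h, 0.0, 0.0), vec3(0.0, 0.0, h)); // -Y
    emitFace(c + vec3(0.0, 0.0,  h), vec3(h, 0.0, 0.0), vec3(0.0, h, 0.0)); // +Z
    emitFace(c + vec3(0.0, 0.0, -h), vec3(0.0, h, 0.0), vec3(h, 0.0, 0.0)); // -Z
}

Per-face winding is not made consistent in this sketch, so either disable face culling or fix the winding order; when geometry shaders are unavailable or too slow, instanced rendering of a unit cube (e.g. glDrawArraysInstanced with a per-instance center attribute) is the usual alternative.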

detectMultiScale(…) internal principle? [closed]

我们两清 submitted on 2019-12-25 00:05:24
Question: This is an algorithm question about the detectMultiScale(...) function from the OpenCV library. I need help understanding what OpenCV's detectMultiScale() function exactly does. I have understood from reading the C++ code that the source image is scaled with several scales based on scaleFactor and size…

Why don't my shaders work for trapezoid polygons?

时光毁灭记忆、已成空白 submitted on 2019-12-24 20:30:27
Question: I need to draw parts of a texture on trapezoid polygons (an old-school fake-3D racing game: the road is made of these trapezoids and I want to apply a texture to them). But the texture comes out wrong, as if each of the two triangles forming a trapezoid were half of a parallelogram instead: each has a different horizontal skew rather than a single global perspective transformation. Searching for a solution, I see that this problem is common and the reason is that the two triangles are not equal…
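
A common fix, sketched below under the assumption that each trapezoid is still drawn as two triangles, is projective texture mapping: pre-multiply the UVs by a per-vertex weight q (for example proportional to the width of the trapezoid edge the vertex lies on), interpolate the triple, and divide by q in the fragment shader. The names uMVP, uTexture, aPosition, aTexCoord and aQ are illustrative:

// Vertex shader
uniform mat4 uMVP;
attribute vec3 aPosition;
attribute vec2 aTexCoord;   // ordinary 0..1 texture coordinates
attribute float aQ;         // per-vertex weight, e.g. proportional to the local edge width
varying vec3 vTexCoordQ;

void main()
{
    // Pre-multiplying by q makes the later divide act like a perspective correction.
    vTexCoordQ = vec3(aTexCoord * aQ, aQ);
    gl_Position = uMVP * vec4(aPosition, 1.0);
}

// Fragment shader
uniform sampler2D uTexture;
varying vec3 vTexCoordQ;

void main()
{
    vec2 uv = vTexCoordQ.xy / vTexCoordQ.z;   // dividing after interpolation restores one global mapping
    gl_FragColor = texture2D(uTexture, uv);
}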

How do I discard pixels based on vertex color and turn that on or off in a MonoBehaviour?

白昼怎懂夜的黑 submitted on 2019-12-24 19:05:32
Question: I've written a shader that uses a mesh's vertex colors and also has a function that clips all pixels whose interpolated vertex color has a blue channel greater than 0.5 (it discards all blue vertices). I'm trying to create a voice command that lets the user trigger that clipping when they are ready. However, Microsoft's Mixed Reality Toolkit Speech Input Handler only lets me call functions from components (Mesh Renderer, Mesh Filter, Mesh Collider, etc.) of the GameObject…
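
The clipping part is toolkit-independent; the idea is sketched below in plain GLSL (Unity would express the same logic in HLSL inside a ShaderLab pass), with a uniform float acting as the on/off switch. In Unity that switch would be a material float property, and a small MonoBehaviour exposing a public method that calls Material.SetFloat on it gives the Speech Input Handler something it can invoke. The names below are assumptions, not from the original shader:

uniform float uClipBlue;   // 0.0 = draw everything, 1.0 = clip blue areas; toggled by the application
varying vec4 vColor;       // interpolated vertex color from the vertex shader

void main()
{
    // When the toggle is on, discard fragments whose interpolated blue channel exceeds 0.5.
    if (uClipBlue > 0.5 && vColor.b > 0.5)
        discard;
    gl_FragColor = vColor;
}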

GLSL shader on a .png logo with swipe effect from left to right

寵の児 submitted on 2019-12-24 18:33:18
Question: I am a junior developer attempting to create a swipe effect from left to right, where the logo disappears/gets erased from left to right and then reappears from right to left. I'm new to GLSL and have never programmed in C++. All my attempts have produced varying results; the current result is that the logo itself gets compressed from left to right. So I would be very thankful if someone could point me to code examples with a similar effect or explain what I'm doing wrong here.
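
A compressed logo usually means the texture coordinates themselves are being remapped. One way to get a clean wipe, sketched below with assumed names (uTexture, uProgress, vTexCoord), is to sample at the untouched UVs and only decide per fragment whether the texel is visible, driving uProgress from 0 to 1 and back from the application:

uniform sampler2D uTexture;   // the .png logo
uniform float uProgress;      // animated 0..1 by the application each frame
varying vec2 vTexCoord;

void main()
{
    vec4 color = texture2D(uTexture, vTexCoord);  // UVs stay untouched, so the logo is never squashed
    // Everything left of the moving edge is erased; reverse the comparison to bring it back right-to-left.
    if (vTexCoord.x < uProgress)
        color.a = 0.0;
    gl_FragColor = color;
}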

OpenGL Stencil - Exclude transparent pixels

左心房为你撑大大i submitted on 2019-12-24 16:46:02
Question: I have a texture that I want to use as a mask via the stencil buffer. I then want to draw an image IMG on the screen which should appear only where the mask texture is visible (where there is color), excluding the transparent pixels above the top edge and below the gradient. The problem is that whenever I draw IMG on the screen, it appears everywhere the mask has been drawn, whether the mask's pixels are transparent or not. So I thought about using the ALPHA_TEST, but it's gone in the…
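
Since the fixed-function alpha test no longer exists in modern OpenGL, the usual replacement is to discard transparent fragments in the shader used for the mask pass; a discarded fragment writes neither color nor stencil, so only the colored part of the mask ends up in the stencil buffer. A minimal sketch (uMask and uAlphaCutoff are assumed names); the usual glStencilFunc/glStencilOp setup around the pass stays the same:

uniform sampler2D uMask;       // the mask texture being drawn into the stencil buffer
uniform float uAlphaCutoff;    // e.g. 0.1; fragments below this never mark the stencil
varying vec2 vTexCoord;

void main()
{
    vec4 texel = texture2D(uMask, vTexCoord);
    if (texel.a < uAlphaCutoff)
        discard;               // transparent pixels write neither color nor stencil
    gl_FragColor = texel;
}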

Passing data into different shaders

痞子三分冷 submitted on 2019-12-24 15:58:55
Question: Let's say my shader program contains a vertex and a fragment shader. How can I pass information straight to the fragment shader? When I use glUniform*() and specify the variable I want to address inside the fragment shader, it throws an error like this: "Using ShaderProgram: The fragment shader uses varying myVarHere, but previous shader does not write to it." Since I attach the vertex shader first, I found out that I need to "pass the information through" the vertex shader using in and out. I'm…
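
Two things are worth separating here: a uniform can be declared directly in the fragment shader and set from the application with glUniform* (no vertex-stage involvement), whereas a per-vertex in/out variable must be written by the vertex shader because it is interpolated on the way to the fragment stage. A minimal GLSL 330 sketch with assumed names:

// ---- Vertex shader ----
#version 330 core
in vec3 aPosition;
out vec3 vWorldPos;          // must be written here because the fragment shader declares a matching 'in'

uniform mat4 uMVP;

void main()
{
    vWorldPos = aPosition;
    gl_Position = uMVP * vec4(aPosition, 1.0);
}

// ---- Fragment shader ----
#version 330 core
in vec3 vWorldPos;           // shown only for the matching out/in pair; unused inputs are allowed
out vec4 fragColor;

uniform vec3 uMyColor;       // declared only here; set directly with glUniform3f after glUseProgram

void main()
{
    fragColor = vec4(uMyColor, 1.0);
}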

Failed to get socket connection from UnityShaderCompiler.exe shader compiler

扶醉桌前 submitted on 2019-12-24 12:43:50
Question: In Unity, each time I try to change or set a shader on a material (in code or in the editor), Unity hangs for a while and logs "Failed to get socket connection from UnityShaderCompiler.exe shader compiler!" in the console. I killed Unity, rebooted the PC, and deactivated the firewall, but nothing changed. I googled the error message with no luck. Any idea?
Answer 1: This specific issue was resolved by Jerry's comment to update to Unity 5.3.1p3; it was a Unity bug in an older version.

Force a sprite object to always be in front of skydome object in THREE.js

偶尔善良 submitted on 2019-12-24 12:33:44
Question: I currently have a custom shader drawing a gradient on a skydome and would like to have a sun/moon in front of the skydome (from the user's perspective). The easiest way to do this is to use sprites for the sun and moon, but the problem is that the sprites get lodged within the skydome (a sprite ends up partly in front of and partly behind the skydome). I have attempted to solve this with polygonOffset, but that doesn't seem to work on sprite objects. So my question is, how can I set up a sun…