glsl

LWJGL texture rendering/indexing

Posted by 三世轮回 on 2021-02-10 06:46:27
Question: I am having trouble rendering two textures onto two completely separate objects through a single vertex and fragment shader. The problem seems to lie in indexing and binding the two textures to their own objects: whichever texture has the smaller index always appears on both objects. Can someone help me, or at least point me in the right direction? Here is my code for the main class, the renderer, and the fragment shader. (feel free to
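
A frequent cause of this symptom is that both sampler uniforms end up pointing at texture unit 0, because glUniform1i was never called for the second sampler. A minimal sketch of the fragment-shader side of a two-sampler setup; uTexture0, uTexture1, uTextureIndex, and vTexCoord are hypothetical names, not taken from the asker's code:

```glsl
#version 330 core

in vec2 vTexCoord;          // interpolated UVs from the vertex shader
out vec4 fragColor;

// Each sampler must be pointed at its own texture unit from the host
// side (glUniform1i(loc0, 0) / glUniform1i(loc1, 1)); otherwise both
// default to unit 0 and the same texture appears on every object.
uniform sampler2D uTexture0;
uniform sampler2D uTexture1;
uniform int uTextureIndex;  // hypothetical per-object switch set before each draw

void main() {
    if (uTextureIndex == 0)
        fragColor = texture(uTexture0, vTexCoord);
    else
        fragColor = texture(uTexture1, vTexCoord);
}
```

On the host side this assumes the textures are bound to GL_TEXTURE0 and GL_TEXTURE1 before drawing, with each sampler location set once to its matching unit.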

Texture does not fit in the square - OpenGL

Posted by 老子叫甜甜 on 2021-02-08 08:32:51
Question: I am trying to create a square and put a red circle in the middle of it. As you can see in simple.vert, I map the (-1, 1) range to the (0, 1) range: coord = position.xy * 0.5 + vec2(0.5, 0.5); In simple.frag there is this line: if (abs(length(coord - vec2(0.5, 0.5))) < 0.3) so I expect a red circle at the middle of the square. However, this happens: the center of the circle is beyond (0.5, 0.5). If I change that line to if (abs(length(coord - vec2(0.0, 0.0))) < 0.3) the output is: The center
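
For reference, a minimal shader pair implementing the mapping described above; if the quad's vertex positions do not actually span (-1, 1), the remapped coord is offset, which matches the symptom. File and attribute names here are assumptions, not the asker's exact code:

```glsl
// simple.vert (sketch): remap clip-space position to [0,1] UV space
#version 330 core
layout(location = 0) in vec2 position;  // must span (-1,1) for a full square

out vec2 coord;

void main() {
    coord = position * 0.5 + vec2(0.5, 0.5);  // (-1,1) -> (0,1)
    gl_Position = vec4(position, 0.0, 1.0);
}

// simple.frag (sketch): red disc of radius 0.3 centered at (0.5, 0.5)
#version 330 core
in vec2 coord;
out vec4 fragColor;

void main() {
    // length() is already non-negative, so the abs() is redundant
    if (length(coord - vec2(0.5, 0.5)) < 0.3)
        fragColor = vec4(1.0, 0.0, 0.0, 1.0);  // inside the circle: red
    else
        fragColor = vec4(1.0, 1.0, 1.0, 1.0);  // outside: white
}
```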

How to create a numpy array to describe the vertices of a triangle?

Posted by 若如初见. on 2021-02-08 06:22:23
Question: I would like to use NumPy to create an array of vertices to pass into GLSL. Vertices will be a numpy array comprising the data of 3 vertices. Each vertex consists of: pos = (x, y), a 64-bit signed floating-point format that has a 32-bit R component in bytes 0..3 and a 32-bit G component in bytes 4..7; and color = (r, g, b), a 96-bit signed floating-point format that has a 32-bit R component in bytes 0..3, a 32-bit G component in bytes 4..7, and a 32-bit B component in bytes 8..11, i.e.
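
The layout described (a 32-bit float pair followed by a 32-bit float triple, 20 bytes per vertex) corresponds on the GLSL side to a vec2 and a vec3 attribute. A sketch of the receiving vertex shader, with hypothetical location numbers:

```glsl
#version 330 core

// pos: two 32-bit floats, bytes 0..7 of each 20-byte vertex record
layout(location = 0) in vec2 pos;
// color: three 32-bit floats, bytes 8..19 of each record
layout(location = 1) in vec3 color;

out vec3 vColor;

void main() {
    vColor = color;                     // pass color through to the fragment stage
    gl_Position = vec4(pos, 0.0, 1.0);  // lift the 2D position into clip space
}
```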

Syntax error 'varying' in GLSL

Posted by 馋奶兔 on 2021-02-07 19:02:55
Question: I'm using GLFW 3 and OpenGL 4 + GLSL 4 on a MacBook Pro. I get the following syntax error when starting my program: ERROR: 0:5: 'varying' : syntax error syntax error The shader code: #version 410 varying vec3 vertex; void main() { } Why am I not allowed to use varying variables? Answer 1: Why am I not allowed to use varying variables? Because they have been replaced by the more generic in/out variable concept since GLSL 1.30. That became necessary because, with GL3, the geometry shader was
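
A sketch of the same declaration ported to the in/out style that GLSL 1.30 and later (and therefore #version 410) require:

```glsl
// Vertex shader: 'in' replaces the old 'attribute', 'out' replaces 'varying'
#version 410
in vec3 position;   // per-vertex attribute
out vec3 vertex;    // interpolated output to the next stage

void main() {
    vertex = position;
    gl_Position = vec4(position, 1.0);
}

// Fragment shader: a matching 'in' receives the interpolated value
#version 410
in vec3 vertex;
out vec4 fragColor;

void main() {
    fragColor = vec4(vertex, 1.0);
}
```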

How to achieve vector swizzling in C++?

Posted by [亡魂溺海] on 2021-02-07 11:38:31
Question: struct vec2 { union { struct { float x, y; }; struct { float r, g; }; struct { float s, t; }; }; vec2() {} vec2(float a, float b) : x(a), y(b) {} }; struct vec3 { union { struct { float x, y, z; }; struct { float r, g, b; }; struct { float s, t, p; }; // Here is the problem with g++. struct { vec2 xy; float z; }; struct { float x; vec2 yz; }; }; vec3() {} vec3(float a, float b, float c) : x(a), y(b), z(c) {} }; The code above compiles and works as expected in Visual Studio, so I can use it
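
For reference, this is the built-in GLSL swizzling behavior that the union-based C++ structs above try to reproduce (a sketch, not the asker's code):

```glsl
#version 330 core

void main() {
    vec3 v = vec3(1.0, 2.0, 3.0);
    vec2 a = v.xy;   // (1.0, 2.0); also reachable as v.rg or v.st
    vec2 b = v.yz;   // (2.0, 3.0), the accessor the g++ anonymous-struct trick stumbles on
    vec3 c = v.zyx;  // arbitrary reordering is allowed: (3.0, 2.0, 1.0)
    v.xy = vec2(5.0, 6.0);  // swizzles also work as assignment targets
    gl_Position = vec4(v, 1.0);
}
```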

Pure WebGL Dashed Line

Posted by 三世轮回 on 2021-02-07 09:46:20
Question: I'm trying to create a dashed line using pure WebGL. I know there is already a question on this, and maybe I'm dumb, but I cannot figure out how to make it work. I understand the concept, but I do not know how to get the distance along the path in the shader. A previous answer had the following line: varying float LengthSoFar; // <-- passed in from the vertex shader So how would I get LengthSoFar? How can I calculate it in the vertex shader? Am I totally missing something? Can someone give
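
A sketch of one common approach: accumulate the distance along the polyline on the CPU by summing segment lengths, upload it as a per-vertex attribute, and let interpolation produce the continuous LengthSoFar; the fragment shader then discards on a repeating interval. Names and the dash/gap size of 10 units are illustrative:

```glsl
// Vertex shader: 'lengthSoFar' holds the cumulative distance from the
// start of the line for this vertex, precomputed on the CPU.
attribute vec2 position;
attribute float lengthSoFar;
varying float vLengthSoFar;  // interpolation makes this continuous along each segment

void main() {
    vLengthSoFar = lengthSoFar;
    gl_Position = vec4(position, 0.0, 1.0);
}

// Fragment shader: a dash of 10 units followed by a gap of 10 units,
// in whatever units the CPU-side accumulation used.
precision mediump float;
varying float vLengthSoFar;

void main() {
    if (mod(vLengthSoFar, 20.0) > 10.0)
        discard;  // inside the gap: draw nothing
    gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0);
}
```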

WebGL2 render Uint16Array to canvas as an image

Posted by 。_饼干妹妹 on 2021-02-07 08:40:24
Question: I am attempting to render a Uint16Array to an image in the browser using WebGL2 textures. I have a working example fiddle using a Uint8Array and am struggling with the upgrade to 16-bit, as WebGL has a steep learning curve. Working 8-bit fiddle (identical to snippet below): Uint8Array http://jsfiddle.net/njxvftc9/2/ Non-working 16-bit attempt: Uint16Array http://jsfiddle.net/njxvftc9/3/ // image data var w = 128; var h = 128; var size = w * h * 4; var img = new Uint8Array(size); // need
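
For the 16-bit case, WebGL2 typically wants an integer texture (for example internal format RGBA16UI with gl.UNSIGNED_SHORT data) sampled through a usampler2D in a #version 300 es shader, with the normalization done by hand. A sketch of the fragment side under those assumptions; uTex and vTexCoord are hypothetical names:

```glsl
#version 300 es
precision highp float;
precision highp usampler2D;  // integer sampler for RGBA16UI textures

in vec2 vTexCoord;
out vec4 outColor;

uniform usampler2D uTex;     // bound to the Uint16Array-backed texture

void main() {
    // texture() on a usampler2D returns unnormalized uvec4 values,
    // so divide by 65535.0 to recover the usual 0..1 range.
    uvec4 texel = texture(uTex, vTexCoord);
    outColor = vec4(texel) / 65535.0;
}
```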
