shader

Running normal maps / bump maps on Pixi.js

删除回忆录丶 submitted on 2020-07-16 09:41:23

Question: I would really like to use normal maps in Pixi.js but I don't know how. I came across a good example with THREE.js using WebGL shaders - clickOnDemo (sometimes it's necessary to click once on the image for it to work). So I really wanted to replicate it using Pixi. I gave it a shot and got a black screen - myDemo. I get no errors and I'm not sure what I'm doing wrong. Could the problem be in the way I'm creating the mesh or the shader?

var material = new PIXI.Shader.from(null, shaderCode,
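For reference, here is a minimal sketch of one way a normal-map shader can be wired up in Pixi.js v5 with PIXI.Geometry, PIXI.Shader.from and PIXI.Mesh. The texture file names, uniform names and light direction are illustrative assumptions, not taken from the demo; note also that PIXI.Shader.from is a static factory and is normally called without new.

// Minimal Pixi.js v5 normal-map sketch (assumed assets: diffuse.png, normal.png).
const app = new PIXI.Application({ width: 512, height: 512 });
document.body.appendChild(app.view);

const vertexSrc = `
    attribute vec2 aVertexPosition;
    attribute vec2 aUvs;
    uniform mat3 translationMatrix;  // supplied by Pixi for meshes
    uniform mat3 projectionMatrix;
    varying vec2 vUvs;
    void main() {
        vUvs = aUvs;
        gl_Position = vec4((projectionMatrix * translationMatrix * vec3(aVertexPosition, 1.0)).xy, 0.0, 1.0);
    }`;

const fragmentSrc = `
    precision mediump float;
    varying vec2 vUvs;
    uniform sampler2D uDiffuse;
    uniform sampler2D uNormal;
    uniform vec3 uLightDir;
    void main() {
        // Unpack the normal from [0,1] to [-1,1] and apply simple Lambert shading.
        vec3 n = normalize(texture2D(uNormal, vUvs).rgb * 2.0 - 1.0);
        float light = max(dot(n, normalize(uLightDir)), 0.0);
        gl_FragColor = texture2D(uDiffuse, vUvs) * light;
    }`;

const geometry = new PIXI.Geometry()
    .addAttribute('aVertexPosition', [0, 0, 512, 0, 512, 512, 0, 512], 2)
    .addAttribute('aUvs', [0, 0, 1, 0, 1, 1, 0, 1], 2)
    .addIndex([0, 1, 2, 0, 2, 3]);

const shader = PIXI.Shader.from(vertexSrc, fragmentSrc, {
    uDiffuse: PIXI.Texture.from('diffuse.png'),
    uNormal: PIXI.Texture.from('normal.png'),
    uLightDir: [0.5, 0.5, 1.0],
});

app.stage.addChild(new PIXI.Mesh(geometry, shader));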

OpenGL directional light shader

北战南征 submitted on 2020-07-08 13:33:11

Question: I want to add a directional light to my scene using OpenGL and GLSL. The problem is that the theoretically correct way to do it produces the wrong results. In the vertex shader I do the following: the direction of the light is given in world coordinates and transformed with the viewMatrix to camera coordinates; the normal of the vertex is transformed with the normal matrix to camera coordinates.

void main () {
    vary_textureCoord = attribute_textureCoord;
    vary_normal = mat3(normalMatrix) * attribute
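For comparison, here is a minimal sketch of a camera-space directional-light shader pair; the attribute and uniform names are assumptions and do not match the question's code. One frequent source of wrong results is the sign convention: the dot product needs the direction towards the light, so a direction pointing from the light must be negated.

// Vertex shader (sketch): move the normal into camera space.
#version 330 core
layout(location = 0) in vec3 a_position;
layout(location = 1) in vec3 a_normal;
uniform mat4 u_modelView;
uniform mat4 u_projection;
uniform mat3 u_normalMatrix;    // transpose(inverse(mat3(u_modelView)))
out vec3 v_normal;
void main() {
    v_normal = u_normalMatrix * a_normal;
    gl_Position = u_projection * u_modelView * vec4(a_position, 1.0);
}

// Fragment shader (sketch): Lambert term against a light direction that
// was transformed to camera space on the CPU, e.g.
// lightDirCamera = mat3(viewMatrix) * lightDirWorld.
#version 330 core
in vec3 v_normal;
uniform vec3 u_lightDirCamera;  // points towards the light, normalized
uniform vec3 u_diffuseColor;
out vec4 fragColor;
void main() {
    float ndotl = max(dot(normalize(v_normal), normalize(u_lightDirCamera)), 0.0);
    fragColor = vec4(u_diffuseColor * ndotl, 1.0);
}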

What does instancing do in WebGL

你离开我真会死。 submitted on 2020-07-07 05:07:32

Question: Is there a way to know how many times the vertex shader will be called in a draw call in WebGL? I want to understand what instancing really does: does it call the vertex shader for every shared vertex of each instance? If so, it would call the vertex shader very many times.

Answer 1: Instancing calls your vertex shader once per vertex per instance. The difference is that you can choose 1 or more attributes to advance only once per instance instead of once per vertex. Normally each attribute advances stride bytes for
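To make the per-instance attribute idea concrete, here is a minimal WebGL2 sketch (buffer contents and attribute names are illustrative assumptions). The position attribute keeps the default divisor of 0 and advances once per vertex; the offset attribute gets a divisor of 1 and advances once per instance.

// Assumes gl is a WebGL2RenderingContext and prog is a linked program with
// attributes a_position (per vertex) and a_offset (per instance).
const posLoc = gl.getAttribLocation(prog, 'a_position');
const offLoc = gl.getAttribLocation(prog, 'a_offset');

// Per-vertex data: one triangle (3 vertices).
gl.bindBuffer(gl.ARRAY_BUFFER, gl.createBuffer());
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array([0, 0, 0.1, 0, 0, 0.1]), gl.STATIC_DRAW);
gl.enableVertexAttribArray(posLoc);
gl.vertexAttribPointer(posLoc, 2, gl.FLOAT, false, 0, 0); // divisor 0 by default

// Per-instance data: three offsets, one per instance.
gl.bindBuffer(gl.ARRAY_BUFFER, gl.createBuffer());
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array([-0.5, 0, 0, 0, 0.5, 0]), gl.STATIC_DRAW);
gl.enableVertexAttribArray(offLoc);
gl.vertexAttribPointer(offLoc, 2, gl.FLOAT, false, 0, 0);
gl.vertexAttribDivisor(offLoc, 1); // advance once per instance

// 3 vertices x 3 instances: the vertex shader runs 9 times.
gl.drawArraysInstanced(gl.TRIANGLES, 0, 3, 3);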

Why are glsl variables not working as expected?

好久不见. submitted on 2020-06-28 06:07:48

Question: I am working on a 3D renderer which was working as expected, but now I am trying to batch every cube into a single draw call (my renderer can only draw cubes right now). Here is the GLSL program that runs for each batch:

#type vertex
#version 330 core
layout(location = 0) in vec3 a_Position;
layout(location = 1) in vec4 a_Color;
layout(location = 2) in vec3 a_TexCoord;
layout(location = 3) in int a_TexIndex;
uniform mat4 u_ProjectionView;
out vec4 v_Color;
out vec3 v_TexCoord;
out flat int v
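Two pitfalls are worth flagging around an integer attribute like a_TexIndex, though without the rest of the code it is only a guess whether either applies here. First, on the CPU side an int attribute must be uploaded with glVertexAttribIPointer (note the I); glVertexAttribPointer converts the data to float and the shader reads garbage. Second, the fragment stage has to receive the index as flat, and strict GLSL 3.30 only guarantees sampler-array indexing with constant expressions, even though many drivers accept a varying index. A sketch of the matching fragment stage:

#type fragment
#version 330 core
in vec4 v_Color;
in vec3 v_TexCoord;
flat in int v_TexIndex;         // integers cannot be interpolated
uniform sampler2D u_Textures[16];
out vec4 o_Color;
void main() {
    // Strictly portable GLSL 3.30 would switch on v_TexIndex instead of
    // indexing the sampler array with a non-constant expression.
    o_Color = texture(u_Textures[v_TexIndex], v_TexCoord.xy) * v_Color;
}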

CPU to GPU normal mapping

大憨熊 submitted on 2020-06-27 18:32:06

Question: I'm creating a terrain mesh, and following this SO answer I'm trying to migrate my CPU-computed normals to a shader-based version, in order to improve performance by reducing my mesh resolution and using a normal map computed in the fragment shader. I'm using the MapBox height map for the terrain data. Tiles look like this: [tile image] And elevation at each pixel is given by the following formula:

const elevation = -10000.0 + ((red * 256.0 * 256.0 + green * 256.0 + blue) * 0.1);

My original code first
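For context, here is a sketch of the same Terrain-RGB decode moved into a WebGL1-style fragment shader, with a normal derived from neighbouring height samples by central differences; the uniform names and the z-scale constant are assumptions.

precision highp float;
varying vec2 vUv;
uniform sampler2D uHeightMap;
uniform vec2 uTexelSize;        // 1.0 / heightmap resolution

// Mapbox Terrain-RGB decode; channels arrive in [0,1], so scale by 255.
float elevation(vec2 uv) {
    vec3 rgb = texture2D(uHeightMap, uv).rgb * 255.0;
    return -10000.0 + (rgb.r * 256.0 * 256.0 + rgb.g * 256.0 + rgb.b) * 0.1;
}

void main() {
    float hL = elevation(vUv - vec2(uTexelSize.x, 0.0));
    float hR = elevation(vUv + vec2(uTexelSize.x, 0.0));
    float hD = elevation(vUv - vec2(0.0, uTexelSize.y));
    float hU = elevation(vUv + vec2(0.0, uTexelSize.y));
    // The z component is a placeholder: properly it is twice the distance
    // one texel covers in metres, which depends on the tile's zoom level.
    vec3 n = normalize(vec3(hL - hR, hD - hU, 2.0));
    gl_FragColor = vec4(n * 0.5 + 0.5, 1.0); // encode the normal for display
}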

Camera lens distortion in OpenGL

孤者浪人 submitted on 2020-06-26 05:52:23

Question: I'm trying to simulate a lens distortion effect for my SLAM project. A scanned color 3D point cloud is already given and loaded in OpenGL. What I'm trying to do is render the 2D scene at a given pose and do some visual odometry between the real image from a fisheye camera and the rendered image. As the camera has severe lens distortion, it has to be taken into account in the rendering stage too. The problem is that I have no idea where to put the lens distortion. Shaders? I've found some open code that
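One common answer (a sketch, not necessarily the right fit for a severe fisheye): render the undistorted scene to a texture, then warp the sampling coordinates in a full-screen post-process fragment shader. The k1/k2 radial model below is a placeholder for the camera's actual calibration, and a screen-space warp can only squeeze pixels that were actually rendered, so the scene has to be rendered with a wider field of view than the target image.

#version 330 core
in vec2 v_uv;                  // 0..1 across a full-screen quad
uniform sampler2D u_scene;     // undistorted rendering of the point cloud
uniform vec2 u_center;         // principal point in uv, e.g. vec2(0.5)
uniform float u_k1;
uniform float u_k2;
out vec4 fragColor;
void main() {
    vec2 p = v_uv - u_center;  // centred coordinates
    float r2 = dot(p, p);
    float f = 1.0 + u_k1 * r2 + u_k2 * r2 * r2;
    vec2 uv = u_center + p * f;
    // Outside the rendered frame: output black so the odometry can mask it.
    if (any(lessThan(uv, vec2(0.0))) || any(greaterThan(uv, vec2(1.0))))
        fragColor = vec4(0.0);
    else
        fragColor = texture(u_scene, uv);
}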