glsles

GLSL ES fragment shader produces very different results on different devices

不羁岁月 submitted on 2020-01-14 18:44:31
Question: I am developing a game for Android using OpenGL ES 2.0 and have a problem with a fragment shader that draws stars in the background. I've got the following code:

```glsl
precision mediump float;

varying vec2 transformed_position;

float rand(vec2 co) {
    return fract(sin(dot(co.xy, vec2(12.9898, 78.233))) * 43758.5453);
}

void main(void) {
    float distance = 10.0;
    float quantization_strength = 4.0;
    vec3 background_color = vec3(0.09, 0.0, 0.288);
    vec2 zero = vec2(0.0, 0.0);
    vec2 distance_vec = vec2
```
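The usual culprit for device-to-device differences in this kind of sin-based hash is floating-point precision: at mediump, sin() of a large argument such as dot(...) * 43758.5453 loses most of its fractional accuracy, and different GPUs round it differently. Below is a minimal sketch of a more portable variant, assuming the goal is just a stable per-pixel pseudo-random value; the wrapping constant and the highp guard are illustrative, not from the original post.

```glsl
#ifdef GL_FRAGMENT_PRECISION_HIGH
precision highp float;   // use full precision where the GPU offers it
#else
precision mediump float; // ES 2.0 fragment shaders are not required to support highp
#endif

// Keeping the argument of sin() small reduces the precision loss that
// makes the classic one-liner diverge across devices.
float rand_stable(vec2 co) {
    float a = mod(dot(co, vec2(12.9898, 78.233)), 6.2831853); // wrap into [0, 2*pi)
    return fract(sin(a) * 43758.5453);
}
```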

Manual selection lod of mipmaps in a fragment shader using three.js

岁酱吖の submitted on 2020-01-02 05:12:42
Question: I'm writing a physically based shader in GLSL ES with three.js. For the specular global-illumination term I use a DDS cubemap texture with a mipmap chain inside (precalculated with CubeMapGen, as explained here). I need to access this texture in the fragment shader, and I would like to select the mipmap index manually. The correct function for doing this is vec4 textureCubeLod(samplerCube sampler, vec3 coord, float lod), but it is available only in the vertex shader. In my fragment shader I
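In WebGL 1 the usual workaround is the EXT_shader_texture_lod extension, which exposes explicit-LOD sampling in the fragment shader (it must also be enabled on the JavaScript side with gl.getExtension). A minimal fragment-shader sketch; the uniform and varying names are illustrative:

```glsl
#extension GL_EXT_shader_texture_lod : enable
precision mediump float;

uniform samplerCube u_envMap; // cubemap with a prefiltered mip chain
uniform float u_roughness;    // drives which mip level to sample
uniform float u_maxLod;       // mip count - 1

varying vec3 v_reflectDir;

void main() {
    // textureCubeLodEXT takes an explicit LOD, unlike textureCube.
    float lod = u_roughness * u_maxLod;
    gl_FragColor = textureCubeLodEXT(u_envMap, normalize(v_reflectDir), lod);
}
```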

How do you pack one 32-bit int into four 8-bit ints in GLSL / WebGL?

眉间皱痕 submitted on 2019-12-28 01:56:14
Question: I'm looking to parallelize some complex math, and WebGL looks like the perfect way to do it. The problem is that you can only read 8-bit integers from textures, while I would ideally like to get 32-bit numbers out. I had the idea of using the four color channels to get 32 bits per pixel instead of 4 × 8 bits. My problem: GLSL has no "%" operator and no bitwise operators! TL;DR: How do I convert a 32-bit number into four 8-bit numbers using the operators available in GLSL? Some extra info on
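Even without % or bitwise operators, GLSL ES 1.00 has floor() and mod(), which are enough to peel the value apart one byte at a time. A sketch of that idea, with one caveat: a highp float carries only a 24-bit mantissa, so integers above 2^24 cannot be represented exactly this way.

```glsl
// Split a non-negative integer (stored in a float) into four bytes,
// normalized to [0,1] so they can be written to an RGBA8 render target.
vec4 packToBytes(float value) {
    float b0 = mod(value, 256.0);                  // lowest byte
    float b1 = mod(floor(value / 256.0), 256.0);
    float b2 = mod(floor(value / 65536.0), 256.0);
    float b3 = floor(value / 16777216.0);          // highest byte
    return vec4(b0, b1, b2, b3) / 255.0;
}
```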

GLSL shader for texture cubic projection

拜拜、爱过 submitted on 2019-12-24 10:17:59
Question: I am trying to implement cubic texture projection inside my WebGL shader, like in the picture below: What I have tried so far: I pass the bounding box of my object (the box in the middle of the picture) as follows:

```glsl
uniform vec3 u_bbmin;
uniform vec3 u_bbmax;
```

...so the eight vertices of my projection box are:

```glsl
vec3 v1 = vec3(u_bbmin.x, u_bbmin.y, u_bbmin.z);
vec3 v2 = vec3(u_bbmax.x, u_bbmin.y, u_bbmin.z);
vec3 v3 = vec3(u_bbmin.x, u_bbmax.y, u_bbmin.z);
...other combinations
vec3 v8 = vec3
```
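For reference, box (cubic) projection is normally done without enumerating the corners: intersect the sampling direction with the box analytically and build the cube lookup vector from the hit point. A compact sketch of this parallax-corrected-cubemap approach, reusing u_bbmin/u_bbmax from the question; every other name is illustrative:

```glsl
uniform vec3 u_bbmin;
uniform vec3 u_bbmax;
uniform samplerCube u_cubeTex;

varying vec3 v_worldPos; // fragment position, in the box's coordinate space
varying vec3 v_dir;      // direction to project along (e.g. a reflection vector)

vec3 boxProject(vec3 pos, vec3 dir) {
    vec3 tMax = (u_bbmax - pos) / dir;       // distances to the three max planes
    vec3 tMin = (u_bbmin - pos) / dir;       // distances to the three min planes
    vec3 tFar = max(tMax, tMin);             // exit distance per axis
    float t = min(min(tFar.x, tFar.y), tFar.z);
    vec3 hit = pos + dir * t;                // point where the ray leaves the box
    return hit - 0.5 * (u_bbmin + u_bbmax);  // lookup vector from the box center
}

void main() {
    gl_FragColor = textureCube(u_cubeTex, boxProject(v_worldPos, normalize(v_dir)));
}
```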

Can I avoid texture gradient calculations in webgl?

孤街醉人 submitted on 2019-12-23 14:53:38
Question: We have a WebGL/three.js application that makes extensive use of texture buffers for passing data between passes and for storing arrays of data. None of these has any use for mipmaps. We can easily prevent mipmap generation: at the three.js level we set the min and mag filters to NearestFilter and set generateMipmaps to false. However, the shaders do not know at compile time that there is no mipmapping. When compiled through ANGLE we get a lot of warning messages: warning X4121: gradient
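The warnings come from implicit-LOD texture2D() calls, which force the compiler to emit gradient instructions even though no mip level will ever be chosen. In WebGL 1 one way to sidestep this is the EXT_shader_texture_lod extension, sampling with an explicit LOD of zero so no derivatives are required; a sketch with illustrative names:

```glsl
#extension GL_EXT_shader_texture_lod : enable
precision mediump float;

uniform sampler2D u_data; // data texture: NearestFilter, no mipmaps
varying vec2 v_uv;

void main() {
    // An explicit LOD means no derivative/gradient instructions are needed,
    // even when this lookup sits inside a loop or branch.
    gl_FragColor = texture2DLodEXT(u_data, v_uv, 0.0);
}
```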

Three.js/GLSL - Convert Pixel Coordinate to World Coordinate

流过昼夜 submitted on 2019-12-23 12:46:59
Question: I have a simple shader in my Three.js application that colors the screen red. However, I want to color all pixels to the right of a given world position a different color. I have seen some answers that suggest using varying vec4 worldCoord = gl_ModelViewMatrix * gl_Vertex;, but since WebGL uses GLSL ES, variables like gl_Vertex are not available to me. Vertex Shader:

```
<script type="x-shader/x-vertex" id="vertexshader">
#ifdef GL_ES
precision highp float;
#endif
void main() {
    gl_Position
```
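In a three.js ShaderMaterial the fixed-function inputs have direct replacements: position is the injected vertex attribute, and modelMatrix, viewMatrix, and projectionMatrix are injected uniforms. A sketch that carries the world position into the fragment shader and splits the color at an illustrative x threshold:

```glsl
// --- vertex shader (position, modelMatrix, viewMatrix, projectionMatrix
// --- are all injected by three.js) ---
varying vec3 vWorldPos;

void main() {
    vec4 worldPos = modelMatrix * vec4(position, 1.0);
    vWorldPos = worldPos.xyz; // hand the world position to the fragment stage
    gl_Position = projectionMatrix * viewMatrix * worldPos;
}

// --- fragment shader ---
precision highp float;
varying vec3 vWorldPos;

void main() {
    // Illustrative split: everything right of world x = 0 becomes blue.
    gl_FragColor = vWorldPos.x > 0.0 ? vec4(0.0, 0.0, 1.0, 1.0)
                                     : vec4(1.0, 0.0, 0.0, 1.0);
}
```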

OpenGL ES3 Shadow map problems

柔情痞子 submitted on 2019-12-22 01:36:10
Question: I am working on a C++ project for Android with OpenGL ES 3 and am trying to implement a shadow map with a directional light. I understand the theory well, but I never get it rendered successfully. First I create the framebuffer that contains the depth map:

```cpp
glGenFramebuffers(1, &depthMapFBO);
glBindFramebuffer(GL_FRAMEBUFFER, depthMapFBO);
glGenTextures(1, &depthMap);
glBindTexture(GL_TEXTURE_2D, depthMap);
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT, SHADOW_WIDTH, SHADOW_HEIGHT, 0, GL_DEPTH
```
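One thing worth checking in a setup like this is the depth texture's internal format/type pairing, which OpenGL ES 3 is strict about (a sized format such as GL_DEPTH_COMPONENT24 with GL_UNSIGNED_INT is the safe combination). On the shader side, here is a minimal GLSL ES 3.00 sketch of the depth comparison; all names and the bias value are illustrative:

```glsl
#version 300 es
precision highp float;

uniform sampler2D u_depthMap; // depth texture rendered from the light's view
in vec4 v_lightSpacePos;      // fragment position in light clip space
out vec4 fragColor;

float shadowFactor() {
    vec3 proj = v_lightSpacePos.xyz / v_lightSpacePos.w; // perspective divide
    proj = proj * 0.5 + 0.5;                             // NDC [-1,1] -> [0,1]
    float closest = texture(u_depthMap, proj.xy).r;      // nearest occluder depth
    float bias = 0.005;                                  // illustrative acne bias
    return proj.z - bias > closest ? 0.0 : 1.0;          // 0.0 = in shadow
}

void main() {
    fragColor = vec4(vec3(shadowFactor()), 1.0);
}
```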

When switching to GLSL 300, I get the following error

隐身守侯 submitted on 2019-12-21 04:42:06
Question: When I switch to OpenGL ES 3 with GLSL 300, I get the following error in my fragment shader: undeclared identifier gl_FragColor. When using GLSL 100, everything is fine.

Answer 1: Modern versions of GLSL handle fragment shader outputs simply by declaring them as out values, and gl_FragColor is no longer supported, hence your error. Try this:

```glsl
out vec4 fragColor;

void main() {
    fragColor = vec4(1.0, 0.0, 0.0, 1.0);
}
```

Note that gl_FragDepth hasn't changed and is still available. For more information see
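In GLSL ES specifically, the snippet above also needs a version directive and a default precision before it compiles; a complete minimal fragment shader for reference:

```glsl
#version 300 es
precision mediump float; // ES fragment shaders require a default float precision

out vec4 fragColor;      // replaces gl_FragColor in GLSL ES 3.00

void main() {
    fragColor = vec4(1.0, 0.0, 0.0, 1.0);
}
```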

Three.js - Using multiple textures in a single PointCloud

醉酒当歌 submitted on 2019-12-18 17:26:31
Question: I'm trying to use multiple textures in a single PointCloud using a ShaderMaterial. I'm passing a texture array to the shader along with per-point texture index attributes and selecting the appropriate texture to use in the fragment shader. Relevant setup code:

```javascript
var particleCount = 100;
var uniforms = {
    textures: { type: 'tv', value: this.getTextures() }
};
var attributes = {
    texIndex: { type: 'f', value: [] },
    color: { type: 'c', value: [] },
};
var material = new THREE.ShaderMaterial({
    uniforms:
```
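On the shader side, the catch is that GLSL ES 1.00 only allows indexing a sampler array with a constant expression, so a varying index has to be resolved through comparisons. A fragment-shader sketch for an illustrative three-texture array; the varying name follows the texIndex attribute above:

```glsl
precision mediump float;

uniform sampler2D textures[3]; // sampler arrays need constant indices in GLSL ES 1.00
varying float vTexIndex;       // per-point index forwarded from the vertex shader

void main() {
    vec4 texel;
    // Branch to a constant index instead of writing textures[int(vTexIndex)].
    if (vTexIndex < 0.5) {
        texel = texture2D(textures[0], gl_PointCoord);
    } else if (vTexIndex < 1.5) {
        texel = texture2D(textures[1], gl_PointCoord);
    } else {
        texel = texture2D(textures[2], gl_PointCoord);
    }
    gl_FragColor = texel;
}
```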
