glsl

Three.js - shader code for halo effect, normals need transformation

安稳与你 submitted on 2019-12-05 00:02:47
Question: I am attempting to create a shader to produce a glowing halo effect in Three.js. My current attempt is live here: http://stemkoski.github.io/Three.js/Shader-Halo.html The shader code is currently: <script id="vertexShader" type="x-shader/x-vertex"> varying vec3 vNormal; void main() { vNormal = normalize( normalMatrix * normal ); gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 ); } </script> <script id="fragmentShader" type="x-shader/x-vertex"> varying vec3 vNormal; void
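The excerpt is cut off before the fragment shader body. A minimal sketch of the kind of rim/halo fragment shader that pairs with the vertex shader above, assuming the usual approach of brightening fragments whose view-space normal turns away from the viewer; the bias, exponent, and glow colour below are illustrative, not the values from the linked demo:

```glsl
// Fragment shader sketch (WebGL1 / GLSL ES 1.00, same varying/gl_FragColor style as the question).
precision mediump float;

varying vec3 vNormal;

void main() {
    // vNormal was multiplied by normalMatrix in the vertex shader, so it is in
    // view space; the camera looks along -Z, so (0, 0, 1) points at the viewer.
    float rim = 1.0 - max(dot(normalize(vNormal), vec3(0.0, 0.0, 1.0)), 0.0);

    // Sharpen the falloff so only the silhouette glows.
    float intensity = pow(rim, 4.0);

    gl_FragColor = vec4(0.4, 0.7, 1.0, 1.0) * intensity;
}
```

On the Three.js side such a halo mesh is typically drawn with additive blending and transparent: true so the dark interior does not occlude the scene.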

WebGL and HTML shader-type

一笑奈何 submitted on 2019-12-04 23:06:11
I declare my GLSL ES shader program within an HTML file, using this code: <script id="shader-fs" type="x-shader/x-fragment">..shader-code..</script> as seen in the Learning WebGL examples. Everything works fine, but I don't understand why I should use the type attribute of the script tag. I want to know where the "x-shader/x-fragment" value is specified. Who defines that: the W3C, the Khronos Group, or the browser developers? Can anybody help me? Thank you. There is no official organization that specified that GLSL code should be put within a <script> tag of type "x-shader/x-fragment" . The only
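For context: browsers only execute a <script> element whose type is a JavaScript MIME type (or omitted), so any unrecognised value keeps the GLSL source inert in the page; the code is then read back with document.getElementById("shader-fs").textContent and handed to gl.shaderSource(). The "x-shader/x-fragment" value is simply the convention popularised by the Learning WebGL tutorials. A sketch of the pattern, with a placeholder shader body:

```html
<!-- The type value is arbitrary as long as the browser does not treat it as
     JavaScript; "x-shader/x-fragment" is only a tutorial convention. -->
<script id="shader-fs" type="x-shader/x-fragment">
    precision mediump float;
    void main() {
        gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); /* placeholder: solid red */
    }
</script>
```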

OpenGL: lines with shaders

霸气de小男生 submitted on 2019-12-04 22:49:39
How would I create a line (possibly colored) with shaders? I'm using the programmable pipeline and I'm a beginner with OpenGL. I can't find an example of how to draw lines with shaders. I suppose I have to load a VAO (vertex array object) into the shader, but then what? What functions should I use, and how? First, bind your shader program with glUseProgram. Then draw lines using glDrawArrays (or glDrawElements if your data is indexed) with mode = GL_LINES or one of the other line drawing modes. Here's a code example for 2D lines with a different color at each end. If the shading mode is set to smooth, OpenGL will interpolate
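The excerpt cuts off before the example. A sketch of the shader pair such a setup needs: per-endpoint position and colour in, interpolated colour out. The attribute names and locations are illustrative, and the host side would upload two vertices per line into a VBO bound to the VAO and call glDrawArrays(GL_LINES, 0, vertexCount):

```glsl
// Vertex shader: one 2D position and one colour per line endpoint.
#version 330 core
layout(location = 0) in vec2 a_position;
layout(location = 1) in vec3 a_color;

out vec3 v_color;

void main() {
    v_color = a_color;                        // interpolated along the line by the rasterizer
    gl_Position = vec4(a_position, 0.0, 1.0); // positions assumed to be in clip space already
}

// Fragment shader: receives the colour blended between the two endpoints.
#version 330 core
in vec3 v_color;
out vec4 fragColor;

void main() {
    fragColor = vec4(v_color, 1.0);
}
```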

GLSL Shader to convert six textures to Equirectangular projection

倖福魔咒の submitted on 2019-12-04 20:42:46
I want to create an equirectangular projection from six square textures, similar to converting a cubic projection image to an equirectangular image, but with the separate faces as textures instead of one texture in cubic projection. I'd like to do this on the graphics card for performance reasons, and therefore want to use a GLSL shader. I've found a shader that converts a cubic texture to an equirectangular one: link Step 1: Copy your six textures into a cube map texture. You can do this by binding the textures to FBOs and using glBlitFramebuffer(). Step 2: Run the following fragment
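The excerpt stops at step 2. A sketch of the kind of fragment shader that step typically runs over a full-screen quad: convert each output pixel's texture coordinate to longitude/latitude, build the matching direction vector, and sample the cube map with it (uniform and varying names here are illustrative, and the axis convention may need adjusting to match your cube map orientation):

```glsl
#version 330 core
in vec2 v_uv;                  // 0..1 across the equirectangular output
out vec4 fragColor;

uniform samplerCube u_cubemap; // the six faces copied into one cube map in step 1

const float PI = 3.14159265358979;

void main() {
    // u maps to longitude [-PI, PI], v to latitude [-PI/2, PI/2].
    float lon = (v_uv.x * 2.0 - 1.0) * PI;
    float lat = (v_uv.y - 0.5) * PI;

    // Direction on the unit sphere; cube maps are sampled with a direction vector.
    vec3 dir = vec3(cos(lat) * sin(lon), sin(lat), cos(lat) * cos(lon));

    fragColor = texture(u_cubemap, dir);
}
```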

GLSL <> operators on a vec4

北城余情 submitted on 2019-12-04 19:55:23
Question: I'm looking at some newer GLSL code that doesn't compile on my current version of OpenGL, and I'm wondering what the short form of the following means: vec4 base; if (base < 0.5) { result = (2.0 * base * blend); } Is this equivalent to: if (base.r < 0.5 && base.g < 0.5 && base.b < 0.5 && base.a < 0.5) { result.r = 2.0 * base.r * blend.r; result.g = 2.0 * base.g * blend.g; result.b = 2.0 * base.b * blend.b; result.a = 2.0 * base.a * blend.a; } Edit: Error: Fragment shader failed to compile with
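For reference, relational operators such as < are not defined for vectors in standard GLSL, which is why the shader fails to compile; component-wise comparison is spelled with the built-ins lessThan(), greaterThan(), and friends, which return a bvec4, and any()/all() collapse that to a single bool. A sketch of both readings of the snippet, assuming base, blend, and result are all vec4 as in the question (the hard-coded values are only there to make it compile stand-alone):

```glsl
#version 120
void main() {
    vec4 base   = vec4(0.25, 0.75, 0.40, 1.0); // illustrative; normally from a texture or varying
    vec4 blend  = vec4(0.5);
    vec4 result = base;

    // Reading 1: branch only when every component of base is below 0.5
    // (this matches the "&&" expansion in the question).
    if (all(lessThan(base, vec4(0.5)))) {
        result = 2.0 * base * blend;           // '*' is already component-wise in GLSL
    }

    // Reading 2: decide per component with no branch at all.
    // vec4(bvec4) converts each comparison to 0.0/1.0, and mix() picks
    // 2.0 * base * blend wherever that mask component is 1.0.
    vec4 mask = vec4(lessThan(base, vec4(0.5)));
    result = mix(result, 2.0 * base * blend, mask);

    gl_FragColor = result;
}
```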

Why am I getting a blank screen when using shader?

浪子不回头ぞ submitted on 2019-12-04 19:28:40
I'm using this tutorial to create and draw an outline for my game Sprites. However, all I get is a blank red screen. I'm very new to shaders so I'm not sure if I'm missing something very trivial. My vertex and fragment shaders were copy-pasted from the above tutorial. (Commenting on the tutorial doesn't seem to work, so I was unable to seek help there.) My code: float x = 0, y = 0, height = 256, width = 256, angle = 0, outlineSize = 1f; @Override public void create() { batch = new SpriteBatch(); img = new Texture("badlogic.jpg"); sprite = new Sprite(img); loadShader(); float w = Gdx.graphics
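With libGDX, a blank or solid-colour screen after swapping in a custom shader usually means the shader failed to compile/link or does not expose the attribute and uniform names SpriteBatch binds to; checking shader.isCompiled() and printing shader.getLog() reveals the actual error. As a baseline, a pass-through pair using SpriteBatch's expected names (a_position, a_color, a_texCoord0, u_projTrans, u_texture) is useful for confirming the plumbing before adding the outline logic. A sketch:

```glsl
// Vertex shader: attribute/uniform names are the ones libGDX's SpriteBatch binds.
attribute vec4 a_position;
attribute vec4 a_color;
attribute vec2 a_texCoord0;

uniform mat4 u_projTrans;

varying vec4 v_color;
varying vec2 v_texCoords;

void main() {
    v_color = a_color;
    v_texCoords = a_texCoord0;
    gl_Position = u_projTrans * a_position;
}

// Fragment shader: plain textured pass-through. If this already draws the
// sprite, the blank screen comes from the outline shader itself.
#ifdef GL_ES
precision mediump float;
#endif
varying vec4 v_color;
varying vec2 v_texCoords;
uniform sampler2D u_texture;

void main() {
    gl_FragColor = v_color * texture2D(u_texture, v_texCoords);
}
```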

gl_NormalMatrix [duplicate]

旧时模样 submitted on 2019-12-04 19:27:34
Question: This question already has answers here: Why transforming normals with the transpose of the inverse of the modelview matrix? (5 answers) Closed 4 years ago. I have found that "gl_NormalMatrix - 3x3 matrix representing the inverse transpose model-view matrix". Why does the matrix for normals have to be the inverse transpose of the model-view matrix? Why can't I just use the model-view matrix for this purpose? Answer 1: See here: This section was inspired by the excellent book by Eric Lengyel “Mathematics
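The short reason, since the answer excerpt is cut off: normals must stay perpendicular to the surface, and under a non-uniform scale the plain model-view matrix would tilt them off that direction; the inverse transpose preserves perpendicularity (for pure rotations and uniform scales the two matrices differ only by a scale factor, which normalize() removes). In modern GLSL, where gl_NormalMatrix no longer exists, the equivalent matrix can be built like this (uniform names are illustrative; in practice it is usually computed once on the CPU and passed as a uniform):

```glsl
#version 330 core
layout(location = 0) in vec3 in_position;
layout(location = 1) in vec3 in_normal;

uniform mat4 u_modelView;
uniform mat4 u_projection;

out vec3 v_normal;

void main() {
    // Equivalent of the old gl_NormalMatrix: the inverse transpose of the
    // upper-left 3x3 of the model-view matrix.
    mat3 normalMatrix = transpose(inverse(mat3(u_modelView)));

    v_normal = normalize(normalMatrix * in_normal);
    gl_Position = u_projection * u_modelView * vec4(in_position, 1.0);
}
```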

Using GL_INT_2_10_10_10_REV in glVertexAttribPointer()

你。 submitted on 2019-12-04 19:15:24
Can anybody tell me how exactly we use GL_INT_2_10_10_10_REV as the type parameter in glVertexAttribPointer()? I am trying to pass color values using this type. Also, what is the significance of the "REV" suffix in this type? Does it require any special treatment in the shaders? My code is as follows: GLuint red=1023,green=1023,blue=1023,alpha=3; GLuint val = 0; val = val | (alpha << 30); val = val | (blue << 20); val = val | (green << 10); val = val | (red << 0); GLuint test_data[]={val,val,val,val}; loadshaders(); glBindAttribLocation(ps,0,"tk_position"); glBindAttribLocation(ps,1,"color");
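For context: the REV suffix means the components are packed into the 32-bit word in reverse order, so the last component (alpha) sits in the top two bits and the first (red) in the lowest ten, exactly as the shifts above build it. No special treatment is needed in the shader: the attribute is declared as an ordinary vec4, and if normalized is passed as GL_TRUE to glVertexAttribPointer() the components arrive mapped to [-1, 1] for the signed GL_INT_2_10_10_10_REV type (or [0, 1] for the unsigned variant). A sketch of the receiving vertex shader, following the glBindAttribLocation calls in the question:

```glsl
#version 330 core
// Locations 0 and 1 match glBindAttribLocation(ps, 0, "tk_position") and
// glBindAttribLocation(ps, 1, "color") before linking.
in vec4 tk_position;
in vec4 color;   // OpenGL unpacks the 10/10/10/2 fields into .rgb and .a for us

out vec4 v_color;

void main() {
    v_color = color;            // already in normalized range when normalized = GL_TRUE
    gl_Position = tk_position;
}
```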

GLSL reusable/shared functions, shared constants (OpenGL ES 2.0)?

笑着哭i submitted on 2019-12-04 18:49:27
Question: Short version: Can I define a function that every shader can use, or do I have to define it per shader? The whole story: I want to create numerous shaders intended to colorize the input fragments with predefined gradient ramps (something like this - http://www.thinkboxsoftware.com/storage/krakatoa-support-images/krakatoa15_kcm_densitybyage_gradientrampmap.png). I want to define a gradient ramp constant for each shader (an array of vec4 color samples, where the alpha value holds the gradient position,
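For context: OpenGL ES 2.0 GLSL has no #include and a program may have only one shader of each stage attached, so the usual workaround is to keep the common code in one string and concatenate it (or pass it as an additional string to glShaderSource, which accepts several) in front of each shader's own source. A sketch of how the shared piece and a per-shader piece could be split, with the ramp supplied as a uniform array since GLSL ES 1.00 has no array initialisers (names and the blending scheme are illustrative):

```glsl
// ---- shared part: prepended to every fragment shader's source string ----
precision mediump float;

// Each sample's rgb is a colour, its alpha the position of that sample on the ramp.
const int RAMP_SIZE = 3;
uniform vec4 u_ramp[RAMP_SIZE];

vec3 gradientRamp(float t) {
    vec3 color = u_ramp[0].rgb;
    for (int i = 1; i < RAMP_SIZE; ++i) {
        float span = max(u_ramp[i].a - u_ramp[i - 1].a, 1e-5);
        float f    = clamp((t - u_ramp[i - 1].a) / span, 0.0, 1.0);
        color = mix(color, u_ramp[i].rgb, f);
    }
    return color;
}

// ---- per-shader part: only the input value and main() differ ----
uniform float u_density;

void main() {
    gl_FragColor = vec4(gradientRamp(u_density), 1.0);
}
```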

GLSL shader for texture 'smoke' effect

妖精的绣舞 submitted on 2019-12-04 18:48:43
I've looked around and haven't found anything relevant. I'm trying to create a shader to give a texture a smoke-effect animation like here: Not asking for a complete/full solution (although that would be awesome), but any pointers towards where I can get started to achieve this effect. Would I need to have the vertices for the drawing, or is this possible if I have the texture only? In the example pictured it appears as if they have the vertices. Possibly the "drawing" of the flower shape was recorded and then played back continuously. Then the effect hits the vertices based on a time offset from
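A common texture-only starting point (nothing beyond a textured quad) is to distort and scroll the texture lookup with animated noise so the image smears and drifts over time like smoke. A sketch of that idea, assuming a tileable noise texture is bound as a second sampler; all uniform names and constants are illustrative:

```glsl
precision mediump float;

varying vec2 v_uv;

uniform sampler2D u_image;   // the texture being "smoked"
uniform sampler2D u_noise;   // tileable noise used to perturb the lookup
uniform float u_time;

void main() {
    // Sample slowly drifting noise and recentre it around zero.
    vec2 drift  = vec2(0.0, u_time * 0.05);
    vec2 offset = texture2D(u_noise, v_uv * 3.0 + drift).rg - 0.5;

    // Displace the lookup more strongly higher up for a rising-smoke feel.
    vec2 uv = v_uv + offset * 0.05 * v_uv.y;

    vec4 color = texture2D(u_image, uv);

    // Fade with height so the smoke appears to dissipate.
    color.a *= clamp(1.0 - v_uv.y * 0.5, 0.0, 1.0);

    gl_FragColor = color;
}
```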