glsl

WebGL: alternative to writing to gl_FragDepth

℡╲_俬逩灬. submitted on 2019-12-07 02:21:54
Question: In WebGL, is it possible to write to the fragment's depth value or control the fragment's depth value in some other way? As far as I could find, gl_FragDepth is not present in WebGL 1.x, but I am wondering if there is any other way (extensions, browser-specific support, etc.) to do it. What I want to achieve is to have a ray-traced object play along with other elements drawn using the usual model, view, projection. Answer 1: There is the extension EXT_frag_depth. Because it's an extension it might
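A rough sketch of how EXT_frag_depth is typically used, assuming the JavaScript side has successfully called gl.getExtension("EXT_frag_depth"): the fragment shader declares the extension and writes to gl_FragDepthEXT. The GLSL ES source is held in a C++ string constant here; the uniform/varying names and the placeholder "hit point" are invented for the illustration.

```cpp
// WebGL 1 (GLSL ES 1.00) fragment shader that writes a per-fragment depth via
// EXT_frag_depth. Requires gl.getExtension("EXT_frag_depth") to have succeeded
// on the JavaScript side. Uniform/varying names and the "hit" are placeholders.
static const char* kRayTracedFragmentShader = R"(
#extension GL_EXT_frag_depth : require
precision highp float;

uniform mat4 uProjection;    // the same projection used for the rasterized scene
varying vec3 vRayOriginEye;  // ray setup interpolated from the vertex shader
varying vec3 vRayDirEye;

void main() {
    // ... ray-trace the object here; suppose the hit point in eye space is: ...
    vec3 hitEye = vRayOriginEye + 1.0 * normalize(vRayDirEye);   // placeholder

    // Convert the hit point to window-space depth so it depth-tests correctly
    // against normally rasterized geometry (assumes the default depth range [0,1]).
    vec4 clip = uProjection * vec4(hitEye, 1.0);
    gl_FragDepthEXT = (clip.z / clip.w) * 0.5 + 0.5;

    gl_FragColor = vec4(1.0, 0.5, 0.2, 1.0);
}
)";
```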

Shader's function parameters performance

我怕爱的太早我们不能终老 submitted on 2019-12-07 02:05:28
Question: I'm trying to understand how passing parameters is implemented in shader languages. I've read several articles and documentation, but I still have some doubts. In particular, I'm trying to understand the differences from a C++ function call, with a particular emphasis on performance. There are slight differences between HLSL, Cg and GLSL, but I guess the underlying implementation is quite similar. What I've understood so far: unless otherwise specified, a function parameter is always passed by
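For reference, GLSL function parameters use the in, out and inout qualifiers with copy-in/copy-out semantics (in is the default); there is no pass-by-reference, and small functions are typically inlined by the compiler, so the copies are usually optimized away. A minimal illustration, with the GLSL held in a C++ string constant and all names invented:

```cpp
// GLSL snippet illustrating the parameter qualifiers discussed in the question.
// 'in' copies the argument in on call (and is the default), 'out' copies the
// parameter back out on return, 'inout' does both; there are no references or
// pointers in GLSL.
static const char* kParameterExample = R"(
float attenuate(in float dist, in float radius) {   // 'in' could be omitted
    return clamp(1.0 - dist / radius, 0.0, 1.0);
}

void accumulate(in vec3 lightColor, in float weight, inout vec3 total) {
    total += lightColor * weight;   // written back into the caller's variable on return
}
)";
```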

Full setup of Transform Feedback (OpenGL)

 ̄綄美尐妖づ submitted on 2019-12-07 02:01:07
Question: GLSL 1.50, OpenGL 3.3. I've lately been trying to get my transform feedback working, but without success. I still receive an error after glBeginTransformFeedback(), and as I haven't found any full working code, I have pieced my knowledge together from some code I found and the documentation; it should be working by now, but I am missing something. So if anybody has full code (initializing of buffers, setting up, updating, rendering, reading back) it would definitely help, and if you don't but know
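A common cause of GL_INVALID_OPERATION at glBeginTransformFeedback() is that the active program has no feedback varyings recorded (glTransformFeedbackVaryings() must be called before the program is linked), or that no buffer is bound to the indexed GL_TRANSFORM_FEEDBACK_BUFFER binding point. A sketch of the GL-side flow under those assumptions; it presumes a current 3.3 core context, a loader such as GLEW or glad already initialized, and a vertex shader declaring `out vec3 outValue;`. All names are illustrative and error checking is omitted.

```cpp
#include <GL/glew.h>
#include <vector>
#include <cstdio>

GLuint buildFeedbackProgram(const char* vertexShaderSource) {
    GLuint vs = glCreateShader(GL_VERTEX_SHADER);
    glShaderSource(vs, 1, &vertexShaderSource, nullptr);
    glCompileShader(vs);

    GLuint program = glCreateProgram();
    glAttachShader(program, vs);   // a fragment shader is optional when rasterization is discarded

    // The varyings to capture MUST be declared before the program is linked.
    const char* varyings[] = { "outValue" };
    glTransformFeedbackVaryings(program, 1, varyings, GL_INTERLEAVED_ATTRIBS);

    glLinkProgram(program);        // link AFTER specifying the feedback varyings
    return program;
}

void runFeedbackPass(GLuint program, GLuint inputVao, GLsizei vertexCount, GLuint feedbackBuffer) {
    glUseProgram(program);
    glBindVertexArray(inputVao);

    // feedbackBuffer must already have enough storage (allocated with glBufferData).
    // Bind it to indexed binding point 0 of the transform feedback target.
    glBindBufferBase(GL_TRANSFORM_FEEDBACK_BUFFER, 0, feedbackBuffer);

    glEnable(GL_RASTERIZER_DISCARD);          // optional: skip rasterization entirely
    glBeginTransformFeedback(GL_POINTS);      // primitive mode must match the draw call
    glDrawArrays(GL_POINTS, 0, vertexCount);
    glEndTransformFeedback();
    glDisable(GL_RASTERIZER_DISCARD);

    // Read the captured data back (one vec3 per input vertex here).
    std::vector<float> captured(vertexCount * 3);
    glGetBufferSubData(GL_TRANSFORM_FEEDBACK_BUFFER, 0,
                       captured.size() * sizeof(float), captured.data());
    std::printf("first captured value: %f\n", captured.empty() ? 0.0f : captured[0]);
}
```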

Behavior of uniforms after glUseProgram() and speed

蓝咒 submitted on 2019-12-06 23:53:59
Question: How fast is glUseProgram()? Is there anything better (faster)? Here are my thoughts: (1) use one universal shader program, but with many input settings and attributes (settings for each graphics class); (2) use more than one shader program, one for each graphics class. What state are uniforms in after changing the shader program? Do they keep their values (for example, the values of matrices)? Here are what I consider the benefits of #1 to be: it doesn't use glUseProgram(). And the benefits of #2: no matrix changes (for example, if
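One detail a sketch can make concrete: uniform values are stored per program object, so they keep their values (matrices included) across glUseProgram() switches, and glUniform* calls always write into the program that is currently in use. An illustrative fragment, assuming two already-linked programs that both declare `uniform mat4 uModel;`; all names are made up for the example.

```cpp
#include <GL/glew.h>
#include <glm/glm.hpp>
#include <glm/gtc/type_ptr.hpp>

void drawWithTwoPrograms(GLuint programA, GLuint programB,
                         const glm::mat4& modelA, const glm::mat4& modelB) {
    GLint locA = glGetUniformLocation(programA, "uModel");
    GLint locB = glGetUniformLocation(programB, "uModel");

    // glUniform* writes into the program that is currently in use.
    glUseProgram(programA);
    glUniformMatrix4fv(locA, 1, GL_FALSE, glm::value_ptr(modelA));
    // ... draw objects that use programA ...

    glUseProgram(programB);
    glUniformMatrix4fv(locB, 1, GL_FALSE, glm::value_ptr(modelB));
    // ... draw objects that use programB ...

    // Switching back: programA still holds modelA, so there is no need to
    // upload it again unless the value itself has changed.
    glUseProgram(programA);
    // ... draw more objects with programA ...
}
```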

Incorrect order of matrix values in glm?

六月ゝ 毕业季﹏ submitted on 2019-12-06 22:57:00
Question: I started using the GLM library to do mathematics operations over OpenGL 3 and GLSL. I need an orthographic projection to draw 2D graphics, so I wrote this simple code: glm::mat4 projection(1.0); projection = glm::ortho(0.0f, 640.0f, 480.0f, 0.0f, 0.0f, 500.0f); Printing on screen the values that glm::ortho has created, I get:
0.00313 0.00000 0.00000 0.00000
0.00000 -0.00417 0.00000 0.00000
0.00000 0.00000 -0.00200 0.00000
-1.00000 1.00000 -1.00000 1.00000
As far as I know, this is not the correct order
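The likely explanation is GLM's column-major storage: projection[c] is column c, and glm::value_ptr() walks the matrix column by column, so printing the 16 floats in memory order lists the columns as if they were rows (which is why the translation terms show up in the last printed line). A small sketch that prints the matrix in conventional row order; only standard GLM calls are used.

```cpp
#include <cstdio>
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>  // glm::ortho
#include <glm/gtc/type_ptr.hpp>          // glm::value_ptr

int main() {
    glm::mat4 projection = glm::ortho(0.0f, 640.0f, 480.0f, 0.0f, 0.0f, 500.0f);

    // Mathematical element at row r, column c is projection[c][r] (column-major indexing).
    for (int r = 0; r < 4; ++r) {
        for (int c = 0; c < 4; ++c)
            std::printf("%9.5f ", projection[c][r]);
        std::printf("\n");
    }

    // The same layout is why the matrix can be handed to OpenGL without transposing:
    // glUniformMatrix4fv(location, 1, GL_FALSE, glm::value_ptr(projection));
    (void)glm::value_ptr(projection);
    return 0;
}
```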

Why don't people use tetrahedrons for skyboxes?

有些话、适合烂在心里 submitted on 2019-12-06 22:27:47
Question: When rendering a sky with a fixed texture in 3D games, people often create six textures in a cube map first, and then render a cube around the camera. In GLSL, you can sample those textures with a direction vector instead of a texture coordinate, and you can easily get this direction by normalizing the fragment's position relative to the camera. However, this process can be done with any shape that surrounds the camera, because normalizing each position will always project it onto a sphere.
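A sketch of the shader side, which shows why the enclosing shape does not matter to the lookup itself: only the normalized direction reaches the cube-map sampler. The GLSL 3.30 sources are held in C++ string constants; the names (uProj, uView, uSkybox) are illustrative.

```cpp
static const char* kSkyVertexShader = R"(
#version 330 core
layout(location = 0) in vec3 aPosition;   // vertices of whatever enclosing shape is used
uniform mat4 uProj;
uniform mat4 uView;                       // rotation only (translation stripped)
out vec3 vDirection;
void main() {
    vDirection = aPosition;               // interpolated per fragment
    vec4 pos = uProj * uView * vec4(aPosition, 1.0);
    gl_Position = pos.xyww;               // force the sky to the far plane
}
)";

static const char* kSkyFragmentShader = R"(
#version 330 core
in vec3 vDirection;
uniform samplerCube uSkybox;
out vec4 fragColor;
void main() {
    // Normalizing makes the lookup independent of the shape of the enclosing mesh.
    fragColor = texture(uSkybox, normalize(vDirection));
}
)";
```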

OpenGL GLSL 3.30 in Ubuntu 14.10 mesa 10.1.3

浪尽此生 submitted on 2019-12-06 21:05:46
When I try to compile a GLSL shader with OpenGL in Ubuntu I get the following error:
0:1(10): error: GLSL 3.30 is not supported. Supported versions are: 1.10, 1.20, 1.30, and 1.00 ES
But when I do a "glxinfo | grep OpenGL" it says:
OpenGL vendor string: X.Org
OpenGL renderer string: Gallium 0.4 on AMD JUNIPER
OpenGL core profile version string: 3.3 (Core Profile) Mesa 10.1.3
OpenGL core profile shading language version string: 3.30
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL core profile extensions:
OpenGL version string: 3.0 Mesa 10.1.3
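The glxinfo output suggests the usual cause: this Mesa driver exposes GL 3.3 / GLSL 3.30 only in a core profile context, while the default context is the 3.0 one whose GLSL stops at 1.30, so the application has to request a core profile explicitly. A minimal sketch using GLFW purely as an example windowing library (an assumption, not something the question mentions; SDL2 has equivalent SDL_GL_SetAttribute() calls):

```cpp
#include <GLFW/glfw3.h>
#include <cstdio>

int main() {
    if (!glfwInit()) return 1;

    // Ask for a 3.3 core profile context before creating the window.
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
    glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GLFW_TRUE);   // needed on macOS, harmless elsewhere

    GLFWwindow* window = glfwCreateWindow(640, 480, "core profile test", nullptr, nullptr);
    if (!window) {
        std::puts("3.3 core context not available");
        glfwTerminate();
        return 1;
    }

    glfwMakeContextCurrent(window);
    std::printf("GL_VERSION: %s\n", glGetString(GL_VERSION));
    std::printf("GLSL:       %s\n", glGetString(GL_SHADING_LANGUAGE_VERSION));

    glfwDestroyWindow(window);
    glfwTerminate();
    return 0;
}
```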

GLSL: accessing framebuffer to get RGB and change it

随声附和 submitted on 2019-12-06 19:34:33
I'd like to access the framebuffer to get the RGB values and change them for each pixel. This is because glReadPixels and glDrawPixels are too slow to use, so I should use shaders instead of them. Now, I have written code and succeeded in displaying a three-dimensional model using GLSL shaders. I drew two cubes as follows. .... glDrawArrays(GL_TRIANGLES, 0, 12*6); .... and the fragment shader: varying vec3 fragmentColor; void main() { gl_FragColor = vec4(fragmentColor, 1); } Then, how can I access the RGB values and change them? For example, if the pixel values at (u1, v1) on the window and at (u2, v2) are (0,0
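A fragment shader cannot read the pixels of the framebuffer it is currently writing to, so the usual approach is to render the scene into a texture through an FBO and then run a second full-screen pass that samples that texture and outputs modified colors. A sketch under those assumptions; it presumes a current GL context with FBO support, uses illustrative names, and elides the scene/quad drawing (a depth attachment would also be added for real 3D rendering).

```cpp
#include <GL/glew.h>

GLuint sceneTexture = 0, sceneFbo = 0;

void createSceneTarget(int width, int height) {
    glGenTextures(1, &sceneTexture);
    glBindTexture(GL_TEXTURE_2D, sceneTexture);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, nullptr);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

    glGenFramebuffers(1, &sceneFbo);
    glBindFramebuffer(GL_FRAMEBUFFER, sceneFbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, sceneTexture, 0);
    glBindFramebuffer(GL_FRAMEBUFFER, 0);
}

// Second-pass fragment shader (legacy varying/gl_FragColor style, matching the question).
static const char* kPostFragmentShader = R"(
uniform sampler2D uScene;
uniform vec2 uScreenSize;
void main() {
    vec2 uv = gl_FragCoord.xy / uScreenSize;   // where this pixel sits in the scene texture
    vec3 rgb = texture2D(uScene, uv).rgb;
    gl_FragColor = vec4(1.0 - rgb, 1.0);       // example modification: invert the colors
}
)";

void renderFrame(/* sceneProgram, postProgram, fullscreenQuadVao, ... */) {
    // Pass 1: draw the cubes into the texture instead of the window.
    glBindFramebuffer(GL_FRAMEBUFFER, sceneFbo);
    // ... glUseProgram(sceneProgram); glDrawArrays(GL_TRIANGLES, 0, 12*6); ...

    // Pass 2: draw a full-screen quad to the window, sampling and altering the texture.
    glBindFramebuffer(GL_FRAMEBUFFER, 0);
    glBindTexture(GL_TEXTURE_2D, sceneTexture);
    // ... glUseProgram(postProgram); set uScene and uScreenSize; draw the quad ...
}
```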

Why do I need to define a precision value in webgl shaders?

我与影子孤独终老i submitted on 2019-12-06 17:39:09
Question: I'm trying to get this tutorial to work, but I ran into two issues, one of which is the following. When I run the code as is, I get an error in the fragment shader saying: THREE.WebGLShader: gl.getShaderInfoLog() ERROR: 0:2: '' : No precision specified for (float). So what I did was specify a precision for every float/vector I define, like so: varying highp vec3 vNormal. This eliminates the error, but I don't get why. I can't find any other example where precision values are added to variable
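The reason, briefly: GLSL ES (which WebGL uses) requires every variable to have a precision, and while vertex shaders get a default highp for float, fragment shaders have no default float precision at all, so you either qualify each declaration (as above) or declare a default once at the top. A sketch of the latter, with the GLSL ES held in a C++ string constant; the varying name matches the question, everything else is illustrative.

```cpp
static const char* kFragmentShader = R"(
precision mediump float;   // default precision for all floats/vectors in this shader

varying vec3 vNormal;      // no per-variable highp/mediump qualifier needed any more

void main() {
    float light = max(dot(normalize(vNormal), vec3(0.0, 0.0, 1.0)), 0.0);
    gl_FragColor = vec4(vec3(light), 1.0);
}
)";
```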

OpenGL: lines with shaders

拜拜、爱过 submitted on 2019-12-06 17:10:46
Question: How would I create a line (possibly colored) with shaders? I'm using the programmable pipeline and I'm a beginner with OpenGL. I can't find an example of how to draw lines with shaders. I suppose I have to load a VAO (vertex array object) into the shader, but then what? What functions should I use, and how? Answer 1: First, use the shader program. Then draw lines using glDrawArrays (or glDrawElements if your data is indexed) with mode=GL_LINES or one of the other line drawing modes. Here's a code
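A sketch of the steps described above, assuming a current 3.3 core context with a loader already initialized; shader compilation/linking boilerplate is elided and all names are illustrative. The endpoints and their colors go into a VBO, a VAO describes them, and the draw call uses GL_LINES.

```cpp
#include <GL/glew.h>

static const char* kLineVertexShader = R"(
#version 330 core
layout(location = 0) in vec2 aPosition;
layout(location = 1) in vec3 aColor;
out vec3 vColor;
void main() { vColor = aColor; gl_Position = vec4(aPosition, 0.0, 1.0); }
)";

static const char* kLineFragmentShader = R"(
#version 330 core
in vec3 vColor;
out vec4 fragColor;
void main() { fragColor = vec4(vColor, 1.0); }
)";

GLuint createLineVao() {
    // Two vertices: x, y, r, g, b (one red endpoint, one green endpoint).
    const float vertices[] = {
        -0.5f, -0.5f,  1.0f, 0.0f, 0.0f,
         0.5f,  0.5f,  0.0f, 1.0f, 0.0f,
    };
    GLuint vao, vbo;
    glGenVertexArrays(1, &vao);
    glGenBuffers(1, &vbo);
    glBindVertexArray(vao);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);
    glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 5 * sizeof(float), (void*)0);
    glEnableVertexAttribArray(0);
    glVertexAttribPointer(1, 3, GL_FLOAT, GL_FALSE, 5 * sizeof(float), (void*)(2 * sizeof(float)));
    glEnableVertexAttribArray(1);
    glBindVertexArray(0);
    return vao;
}

void drawLine(GLuint program, GLuint vao) {
    glUseProgram(program);          // program built from the two shaders above
    glBindVertexArray(vao);
    glDrawArrays(GL_LINES, 0, 2);   // 2 vertices -> 1 line segment
}
```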