shader

Retrieving Vertex Data in THREE.js

Submitted by 試著忘記壹切 on 2019-11-28 07:48:16

I'm creating a mesh with a custom shader. Within the vertex shader I modify the original positions of the geometry's vertices. I then need to access these new vertex positions from outside the shader. How can I accomplish this? In lieu of transform feedback (which WebGL 1.0 does not support), you will have to use a passthrough fragment shader and a floating-point texture (this requires loading the OES_texture_float extension). That is the only way to generate a vertex buffer on the GPU in WebGL. WebGL does not support pixel buffer objects either, so reading the output data back is…
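The texture round trip hinges on a fixed mapping between vertex index and texel. A CPU-side sketch of that layout, assuming row-major packing (function names are my own, not part of the THREE.js API):

```python
# Row-major layout of per-vertex data in a w x h float texture, plus the
# UV a passthrough shader would use to sample it back. Function names are
# illustrative, not part of the THREE.js API.

def vertex_texel(index, width, height):
    """Texel (x, y) that stores vertex `index`."""
    x = index % width
    y = index // width
    assert y < height, "texture too small for this vertex count"
    return x, y

def texel_uv(x, y, width, height):
    """UV at the texel centre, so the shader samples it exactly."""
    return ((x + 0.5) / width, (y + 0.5) / height)

print(vertex_texel(5, 4, 4))   # (1, 1)
print(texel_uv(1, 1, 4, 4))    # (0.375, 0.375)
```

Sampling at the texel centre (the `+ 0.5`) matters: sampling at the corner invites filtering between neighbouring vertices' data.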

Depth as distance to camera plane in GLSL

Submitted by 丶灬走出姿态 on 2019-11-28 07:04:09

I have a pair of GLSL shaders that give me the depth map of the objects in my scene. What I get now is the distance from each pixel to the camera. What I need instead is the distance from each pixel to the camera plane. Let me illustrate with a little drawing:

       *          |--*
      /           |
     /            |
    C-----*       C-----*
     \            |
      \           |
       *          |--*

The three asterisks are pixels and the C is the camera. The lines from the asterisks are the "depth". In the first case, I get the distance from the pixel to the camera. In the second, I wish to get the distance from each pixel to the plane. There must be a way to do this using some projection…
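In eye space the fix is a component pick rather than a length: the planar depth is simply the negated z component of the eye-space position (in GLSL, something like float depth = -eyePos.z), while the current value is the full length of that vector. A quick numeric sketch:

```python
import math

# Eye-space position of a fragment: the camera sits at the origin and
# looks down the negative z axis.
p = (3.0, 0.0, -4.0)

dist_to_camera = math.sqrt(p[0] ** 2 + p[1] ** 2 + p[2] ** 2)  # ray length
dist_to_plane = -p[2]                                          # planar depth

# The two differ by the cosine of the angle between view ray and view axis:
cos_theta = dist_to_plane / dist_to_camera
print(dist_to_camera, dist_to_plane, round(cos_theta, 3))  # 5.0 4.0 0.8
```

For pixels straight ahead of the camera the two distances coincide; the farther a pixel sits from the view axis, the more the ray length overestimates the planar depth.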

Omnidirectional shadow mapping with depth cubemap

Submitted by 时光总嘲笑我的痴心妄想 on 2019-11-28 06:38:20

I'm working with omnidirectional point lights. I have already implemented shadow mapping using a cubemap texture as the color attachment of six framebuffers, encoding the light-to-fragment distance in each of its pixels. Now I would like, if possible, to change my implementation this way: 1) attach a depth cubemap texture to the depth buffer of my framebuffers, instead of colors; 2) render depth only, without writing color in this pass; 3) in the main pass, read the depth from the cubemap texture, convert it to a distance, and check whether the current fragment is occluded by the light or not. My…
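Step 3's occlusion check, sketched on the CPU under the assumption that the stored value has already been converted back to a light-to-occluder distance (raw depth from the depth attachment must be linearized first, since it is non-linear window-space depth):

```python
# Main-pass shadow test: a fragment is in shadow when its own distance
# to the light exceeds the distance stored in the cubemap (plus a small
# bias to avoid shadow acne). The bias value is an assumption.

def in_shadow(frag_pos, light_pos, stored_dist, bias=0.05):
    dx, dy, dz = (f - l for f, l in zip(frag_pos, light_pos))
    frag_dist = (dx * dx + dy * dy + dz * dz) ** 0.5
    return frag_dist - bias > stored_dist

light = (0.0, 0.0, 0.0)
print(in_shadow((5.0, 0.0, 0.0), light, stored_dist=2.0))  # True: occluder at 2
print(in_shadow((1.0, 0.0, 0.0), light, stored_dist=2.0))  # False: in front of it
```

Note that which cubemap face gets sampled is decided by the dominant axis of the fragment-to-light vector, so the "eye z" used when linearizing must be taken along that axis.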

OpenGL - How to access depth buffer values? - Or: gl_FragCoord.z vs. Rendering depth to texture

Submitted by 馋奶兔 on 2019-11-28 06:04:47

I want to access the depth-buffer value at the currently processed pixel in a pixel shader. How can we achieve this? Basically, there seem to be two options: Render depth to a texture. How can we do this, and what is the tradeoff? Use the value provided by gl_FragCoord.z. But is this the correct value? On question 1: you can't directly read from the depth buffer in the fragment shader (unless there are recent extensions I'm not familiar with). You need to render to a Frame Buffer Object (FBO). Typical steps: create and bind an FBO; look up calls like glGenFramebuffers and…
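On question 2: gl_FragCoord.z is the window-space depth, i.e. the same value the depth buffer holds for that fragment. A sketch of its derivation, assuming the default glDepthRange(0, 1):

```python
# gl_FragCoord.z: perspective-divide the clip-space z, then map the NDC
# range [-1, 1] into the depth range (default glDepthRange(0, 1)). This
# is also the value a depth attachment would store for the fragment.

def window_depth(z_clip, w_clip, range_near=0.0, range_far=1.0):
    z_ndc = z_clip / w_clip
    return range_near + (range_far - range_near) * (0.5 * z_ndc + 0.5)

print(window_depth(-1.0, 1.0))   # 0.0 -> fragment on the near plane
print(window_depth(1.0, 1.0))    # 1.0 -> fragment on the far plane
```

So gl_FragCoord.z is "correct" in the sense of matching the depth buffer; what it is not is linear in eye-space distance.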

OpenGL version

Submitted by 大城市里の小女人 on 2019-11-28 05:42:13

OpenGL vendor string: Intel
OpenGL renderer string: Intel(R) HD Graphics 630
OpenGL version string: 4.4.0 - Build 21.20.16.4664
OpenGL extensions (GL_): GL_3DFX_texture_compression_FXT1, GL_AMD_depth_clamp_separate, GL_AMD_vertex_shader_layer, GL_AMD_vertex_shader_viewport_index, GL_ARB_ES2_compatibility, GL_ARB_ES3_compatibility, GL_ARB_arrays_of_arrays, GL_ARB_base_instance, GL_ARB_bindless_texture, GL_ARB_blend_func_extended, GL_ARB_buffer_storage, GL_ARB_cl_event, GL_ARB_clear_buffer_object, GL_ARB_clear_texture, GL_ARB_clip_control, GL_ARB_color_buffer_float, GL_ARB_compatibility, GL_ARB…

GLSL gl_FragCoord.z Calculation and Setting gl_FragDepth

Submitted by 旧时模样 on 2019-11-28 04:24:47

So, I've got an imposter (the real geometry is a cube, possibly clipped, and the imposter geometry is a Menger sponge) and I need to calculate its depth. I can calculate the amount to offset in world space fairly easily. Unfortunately, I've spent hours failing to perturb the depth with it. The only correct result I can get is: gl_FragDepth = gl_FragCoord.z. Basically, I need to know how gl_FragCoord.z is calculated so that I can: take the inverse transformation from gl_FragCoord.z to eye space; add the depth perturbation; transform this perturbed depth back into the same space as the…
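The three steps can be sketched numerically, assuming a standard OpenGL perspective projection (near plane n, far plane f, camera looking down negative z) and the default glDepthRange(0, 1):

```python
# Invert gl_FragCoord.z to eye space, perturb there, and re-project.

def frag_z_to_eye_z(d, n, f):
    """Window depth d in [0, 1] -> (negative) eye-space z."""
    z_ndc = 2.0 * d - 1.0
    return -2.0 * n * f / (f + n - z_ndc * (f - n))

def eye_z_to_frag_z(z_eye, n, f):
    """(Negative) eye-space z -> window depth in [0, 1]."""
    z_ndc = (f + n + 2.0 * n * f / z_eye) / (f - n)
    return 0.5 * z_ndc + 0.5

n, f = 0.1, 100.0
d = 0.9                                            # incoming gl_FragCoord.z
z_eye = frag_z_to_eye_z(d, n, f)
perturbed = eye_z_to_frag_z(z_eye - 0.05, n, f)    # push 0.05 units farther
assert d < perturbed <= 1.0                        # farther => larger depth
```

The key subtlety is that the eye-to-window mapping is non-linear, so the perturbation must be applied in eye space, never added to gl_FragCoord.z directly.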

Why does my OpenGL Phong shader behave like a flat shader?

Submitted by 荒凉一梦 on 2019-11-28 03:06:16

I've been learning OpenGL for the past couple of weeks and I've run into some trouble implementing a Phong shader. It appears to do no interpolation between vertices despite my use of the smooth qualifier. Am I missing something here? To give credit where credit is due, the code for the vertex and fragment shaders cribs heavily from the OpenGL SuperBible, Fifth Edition. I would highly recommend this book! Vertex shader:

    #version 330
    in vec4 vVertex;
    in vec3 vNormal;
    uniform mat4 mvpMatrix;   // mvp = ModelViewProjection
    uniform mat4 mvMatrix;    // mv = ModelView
    uniform mat3 normalMatrix;
    uniform …
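For context, a small numeric illustration of what is at stake between per-vertex and per-fragment lighting (values chosen arbitrarily): at the midpoint of an edge, lighting the interpolated normal gives a different intensity than interpolating the per-vertex intensities, which is why Phong shading looks smooth where Gouraud can look faceted.

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def lambert(normal, light_dir):
    """Diffuse intensity; re-normalizes the (possibly interpolated) normal."""
    return max(0.0, sum(a * b for a, b in zip(normalize(normal), light_dir)))

L = (0.0, 0.0, 1.0)                 # direction towards the light
n1 = normalize((1.0, 0.0, 1.0))     # normal at vertex A
n2 = normalize((-1.0, 0.0, 1.0))    # normal at vertex B
mid = tuple(0.5 * (a + b) for a, b in zip(n1, n2))

gouraud_mid = 0.5 * (lambert(n1, L) + lambert(n2, L))  # light, then interpolate
phong_mid = lambert(mid, L)                            # interpolate, then light
print(round(gouraud_mid, 3), round(phong_mid, 3))      # 0.707 1.0
```

If lighting is computed entirely in the vertex shader, the smooth qualifier only interpolates the already-computed colours, which is Gouraud shading, not Phong.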

Finding which point is to the left of a line/point after spinning it

Submitted by 徘徊边缘 on 2019-11-28 02:20:11

I am currently trying to write a shader in Unity that draws a triangular pattern around countries in a Risk-style game if both countries are not owned by the same player (a visual aid to see your borders). Right now, I'm having an issue with making the shader assign the countries properly. It always sets country 0 to the left and country 1 to the right; country 0 and 1 are set programmatically. The line, a border, can be between 0 and 359 degrees. How I find countries 0 and 1 is I draw 3 points to the left and right of the midpoint of the line, one .01f, one .1f, and one 1f away from the…

Unity shader: removing a specified color

Submitted by 社会主义新天地 on 2019-11-28 00:50:35

    Shader "MyShader/PaintingBGTransparency" {
        Properties {
            _MainTex("Base (RGB)", 2D) = "white" {}
            _FilterfColor("Ridof (RGB)", Color) = (1,1,1,1)
        }
        SubShader {
            Tags { "RenderType" = "Opaque" }
            Blend SrcAlpha OneMinusSrcAlpha
            pass {
                CGPROGRAM
                #pragma vertex vertext_convert
                #pragma fragment fragment_convert
                #include "UnityCG.cginc"
                sampler2D _MainTex;
                sampler2D _MainTex1;
                float4 _FilterfColor;
                struct Inputvrite {
                    float4 vertex : POSITION;
                    float4 texcoord : TEXCOORD0;
                };
                struct Inputfragment {
                    float4 pos : SV_POSITION;
                    float4 uv : TEXCOORD0;
                };
                float ColorLerp(float3 tmp_nowcolor, float3 tmp…

The Beauty of Unity Shader's Built-in Matrices

Submitted by ≡放荡痞女 on 2019-11-28 00:47:59

The mul function. Z = mul(M, V) multiplies the matrix M with the vector V, yielding a vector Z, which is the value obtained by applying the matrix transform to V. Note in particular that when, for example, the normal is a float3, the matrix being multiplied must also be cast to float3x3:

    float3 normal = mul((float3x3)UNITY_MATRIX_IT_MV, v.normal);

Matrices. On the built-in matrices (float4x4):
1. A special note on UnityObjectToClipPos(v.vertex): the official documentation says that when writing shader scripts you should always use UnityObjectToClipPos(v.vertex) rather than mul(UNITY_MATRIX_MVP, v.vertex), because all the built-in matrix names are redefined in instanced shaders; using UNITY_MATRIX_MVP directly introduces an extra matrix multiplication. The UnityObjectToClipPos / UnityObjectToViewPos functions are therefore recommended, since they optimize that extra matrix multiplication into a vector-matrix multiplication.
2. UNITY_MATRIX_IT_MV is used specifically for transforming normals. But why are normals transformed differently from vertices? The reason a normal cannot be transformed directly with UNITY_MATRIX_MV is that a normal is a vector with a direction, and during a space transform…