shader

Metal Shader with SceneKit SCNProgram

Submitted by 时光怂恿深爱的人放手 on 2019-11-30 05:36:38
I'm looking for just a working Metal shader that works in SceneKit with SCNProgram. Can someone show me the correct method declarations and how to hook this up?

    let program = SCNProgram()
    program.vertexFunctionName = "myVertex"
    program.fragmentFunctionName = "myFragment"
    material.program = program

and then the shader:

    // MyShader.metal
    vertex something myVertex(something) { return something; }
    fragment float4 myFragment(something) { return something; }

I'm just looking for the most basic example, please. I clipped out all the 'unnecessary' stuff; this is about as basic as it gets and pretty much what …
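For reference, a minimal pair of Metal functions of the kind the question asks for might look like the following. The `scn_node` buffer name and index follow SceneKit's SCNProgram conventions as I understand them; treat this as a sketch rather than verified code:

```metal
#include <metal_stdlib>
using namespace metal;
#include <SceneKit/scn_metal>

// Per-node data SceneKit binds automatically when the parameter is named scn_node.
struct MyNodeBuffer {
    float4x4 modelViewProjectionTransform;
};

struct VertexIn {
    float3 position [[attribute(SCNVertexSemanticPosition)]];
};

struct VertexOut {
    float4 position [[position]];
};

vertex VertexOut myVertex(VertexIn in [[stage_in]],
                          constant MyNodeBuffer& scn_node [[buffer(1)]])
{
    VertexOut out;
    out.position = scn_node.modelViewProjectionTransform * float4(in.position, 1.0);
    return out;
}

fragment float4 myFragment(VertexOut in [[stage_in]])
{
    return float4(1.0, 0.0, 0.0, 1.0); // solid red, the simplest possible output
}
```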

GLSL Problem: Multiple shaders in one program

Submitted by 本秂侑毒 on 2019-11-30 04:59:39
I must have misunderstood something about shaders: I thought that, since you can attach multiple shaders to one program, you'd be able to simply attach more than one fragment shader. For example: a crate texture rendered with both color modulation and refraction. But apparently this is not the case, as you can have only one main function per program. How can I work around the main-function limit and allow for any dynamic combination of multiple fragment shaders in the same program, called one after the other?

You can have a set of entry points pre-defined. Suppose you have a limited number …
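One common workaround (a sketch of the general idea, not taken from the original answer; the uniform names are made up) is to keep each effect as a plain GLSL function and let a single `main` dispatch between them:

```glsl
#version 330 core
in vec2 uv;
out vec4 fragColor;

uniform sampler2D crateTex;
uniform bool useColorMod;    // hypothetical toggles set from the application
uniform bool useRefraction;
uniform vec4 modColor;

// Each former "shader" becomes an ordinary function; main is the one entry point.
vec2 refractUV(vec2 p) { return p + 0.02 * sin(p * 40.0); } // toy distortion
vec4 modulate(vec4 c)  { return c * modColor; }

void main() {
    vec2 p    = useRefraction ? refractUV(uv) : uv;
    vec4 c    = texture(crateTex, p);
    fragColor = useColorMod ? modulate(c) : c;
}
```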

How do I calculate pixel shader depth to render a circle drawn on a point sprite as a sphere that will intersect with other objects?

Submitted by 情到浓时终转凉″ on 2019-11-30 03:24:14
I am writing a shader to render spheres on point sprites by drawing shaded circles, and I need to write a depth component as well as colour so that spheres near each other will intersect correctly. I am using code similar to that written by Johna Holwerda:

    void PS_ShowDepth(VS_OUTPUT input, out float4 color : COLOR0, out float depth : DEPTH)
    {
        float dist = length(input.uv - float2(0.5f, 0.5f)); // get the distance from the center of the point sprite
        float alpha = saturate(sign(0.5f - dist));
        sphereDepth = cos(dist * 3.14159) * sphereThickness * particleSize; // calculate how thick the …
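For orientation, here is a hedged sketch of how a shader along those lines is usually completed; `sphereThickness`, `particleSize`, and the interpolated `centerDepth` are assumptions of mine, not values from the question:

```hlsl
void PS_SphereDepth(VS_OUTPUT input,
                    out float4 color : COLOR0,
                    out float  depth : DEPTH)
{
    // Distance of this pixel from the center of the point sprite (UVs run 0..1).
    float dist  = length(input.uv - float2(0.5f, 0.5f));
    float alpha = saturate(sign(0.5f - dist)); // 1 inside the circle, 0 outside
    clip(alpha - 0.01f);                       // discard pixels outside the circle

    // Fake a sphere by bulging the depth toward the viewer near the center.
    float sphereDepth = cos(dist * 3.14159f) * sphereThickness * particleSize;

    color = float4(input.color.rgb, alpha);
    depth = input.centerDepth - sphereDepth;   // centerDepth: sprite depth from the VS
}
```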

CocosCreator 2.1.2 Shader Component

Submitted by 假装没事ソ on 2019-11-30 01:46:25
Related reading for this article: the new ShaderHelper, now with Creator 2.1.2 support! A community expert unveils the mystery of the Creator 2.1.2 material system! Why choose TypeScript? Read it and you'll see! A preview of the Creator 2.0.x ShaderHelper component; a source-code analysis of Creator 2.0.x CustomMaterial.js.

01 Basic usage: Over the three-day Mid-Autumn Festival holiday, Shawn finally restored the ShaderHelper2 component interface to fully match the old ShaderHelper (see the screenshot below). The new ShaderHelper also supports passing parameters from the component into the fragment shader, with the effect shown below. Note that ShaderHelper's props parameters only set the initial values of the variables in the fragment code; when you adjust a props value in the editor, you can see the effect of the change in real time in the scene editor.

02 Dynamic effects: To make a shader animate, I followed the approach of "lxb229" from the earlier article "A community expert unveils the mystery of the Creator 2.1.2 material system!" and added a ShaderTime component dedicated to updating the time parameter in the shader code (see below). The ShaderTime component continuously updates the shader's time value in its update callback, accumulating from 0 to 65535 …
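Based on that description, a ShaderTime-style component might look roughly like the following Cocos Creator 2.x script; the material API calls (`getMaterial`, `setProperty`) and the uniform name `time` are assumptions drawn from the article, not verified code:

```javascript
// Hypothetical sketch of the ShaderTime component described above.
cc.Class({
    extends: cc.Component,

    properties: {
        maxTime: 65535, // wrap point so the accumulated value stays in float range
    },

    onLoad () {
        this._time = 0;
        // Assumes the node has a renderer (e.g. cc.Sprite) with a custom material.
        this._material = this.getComponent(cc.Sprite).getMaterial(0);
    },

    update (dt) {
        // Accumulate from 0 up to maxTime, then wrap, as the article describes.
        this._time = (this._time + dt) % this.maxTime;
        this._material.setProperty('time', this._time);
    },
});
```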

Unity shader: flowing-light streak effect for a logo

Submitted by 徘徊边缘 on 2019-11-30 00:51:23
    Shader "myPractices/LogoShader" {
        Properties {
            _MainTex ("Texture", 2D) = "white" {}
        }
        SubShader {
            Tags { "Queue"="Transparent" "IgnoreProjector"="True" "RenderType"="Transparent" }
            Blend SrcAlpha OneMinusSrcAlpha
            AlphaTest Greater 0.1
            Pass {
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"

                sampler2D _MainTex;
                float4 _MainTex_ST;

                struct v2f {
                    float4 pos : SV_POSITION;
                    float2 uv : TEXCOORD0;
                };

                // Vertex function
                v2f vert (appdata_base v) {
                    v2f o;
                    o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
                    o.uv = TRANSFORM_TEX(v.texcoord, _MainTex);
                    return o;
                }

                // Must appear before the frag function that uses it, or it won't be recognized.
                // Core: the calculation function …
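The excerpt cuts off right before the core calculation function. A sketch of how a sweeping-highlight ("flowing light") fragment function is often written in Unity CG; the helper name and constants are my own, not from the article:

```hlsl
// Hypothetical helper: a bright diagonal band that sweeps across the UVs over time.
fixed highlight(float2 uv, float t)
{
    float pos  = frac(t * 0.3);          // sweep position cycling through 0..1
    float band = uv.x + uv.y * 0.5;      // diagonal coordinate across the logo
    return saturate(1.0 - abs(band - pos * 1.5) * 8.0);
}

fixed4 frag (v2f i) : SV_Target
{
    fixed4 col = tex2D(_MainTex, i.uv);
    // Add the streak only where the logo is opaque, so the glow follows its shape.
    col.rgb += highlight(i.uv, _Time.y) * col.a;
    return col;
}
```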

How to implement a ground fog GLSL shader

Submitted by 本小妞迷上赌 on 2019-11-29 21:20:03
Question: I'm trying to implement a ground fog shader for my terrain rendering engine. The technique is described in this article: http://www.iquilezles.org/www/articles/fog/fog.htm — the idea is to consider the ray going from the camera to the fragment and integrate the fog density function along this ray. Here's my shader code:

    #version 330 core
    in vec2 UV;
    in vec3 posw;
    out vec3 color;
    uniform sampler2D tex;
    uniform vec3 ambientLightColor;
    uniform vec3 camPos;
    const vec3 FogBaseColor = vec3(1., 1., 1. …
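The linked article derives a closed-form integral for exponential height fog along the view ray. A sketch of that fog factor with my own variable names (`a` is the density scale, `b` the height falloff; the `0.02` and `0.1` in the usage line are illustrative values only):

```glsl
// Analytic integral of fog density d(y) = a * exp(-b * y) along the ray from
// the camera to the fragment, following the technique in the linked article.
float groundFogAmount(vec3 camPos, vec3 fragPos, float a, float b)
{
    vec3  ray  = fragPos - camPos;
    float dist = length(ray);
    vec3  rd   = ray / dist;  // normalized view direction
    // Note: for rd.y near zero this needs a small epsilon guard in practice.
    float fog  = (a / b) * exp(-camPos.y * b)
               * (1.0 - exp(-dist * rd.y * b)) / rd.y;
    return clamp(fog, 0.0, 1.0);
}

// Usage: color = mix(litColor, FogBaseColor, groundFogAmount(camPos, posw, 0.02, 0.1));
```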

What are Vertex and Pixel shaders?

Submitted by て烟熏妆下的殇ゞ on 2019-11-29 19:56:09
What are vertex and pixel shaders? What is the difference between them? Which one is best?

A pixel shader is a GPU (graphics processing unit) component that can be programmed to operate on a per-pixel basis and take care of things like lighting and bump mapping. A vertex shader is also a GPU component, and is likewise programmed in a specific assembly-like language, but it is oriented toward the scene geometry and can do things like adding cartoony silhouette edges to objects. Neither is better than the other; they each have their specific uses. Most modern graphics cards …
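The division of labor is easiest to see in a minimal GLSL pair: the vertex shader runs once per vertex and positions the geometry, while the fragment (pixel) shader runs once per covered pixel and produces a color.

```glsl
// Vertex shader: transforms each vertex from model space to clip space.
#version 330 core
layout(location = 0) in vec3 position;
uniform mat4 mvp;
void main() { gl_Position = mvp * vec4(position, 1.0); }
```

```glsl
// Fragment (pixel) shader: decides the color of each pixel the geometry covers.
#version 330 core
out vec4 fragColor;
void main() { fragColor = vec4(1.0, 0.5, 0.2, 1.0); }
```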

GLSL: multiple shader programs vs. uniform switches

Submitted by 人走茶凉 on 2019-11-29 19:52:06
I'm working on a shader manager architecture and I have several questions for more advanced people. My current choice is between two designs:

1. Per-material shader program: create one shader program per material used in the program.
Potential cons: considering every object might have its own material, it involves a lot of glUseProgram calls; it implies the creation of a lot of shader-program objects; it is a more complex architecture than #2.
Pros: shader code can be generated specifically for each "option" used in the material. If I'm not wrong, uniforms have to be set only one time (when the …
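Design #1 is essentially a program cache keyed by the material's feature set. A self-contained C++ sketch of that idea (the key format and `ShaderProgram` contents are placeholders; `compile()` stands in for the real glCreateShader/glLinkProgram work):

```cpp
#include <string>
#include <unordered_map>

// Placeholder for a linked GL program object.
struct ShaderProgram {
    unsigned glHandle = 0;
};

class ShaderManager {
public:
    // Key encodes the material's enabled options, e.g. "DIFFUSE|NORMAL_MAP".
    // Each distinct option set gets its own specialized program, compiled once.
    ShaderProgram& programFor(const std::string& featureKey) {
        auto it = cache_.find(featureKey);
        if (it == cache_.end())
            it = cache_.emplace(featureKey, compile(featureKey)).first;
        return it->second;
    }

private:
    // Real code would generate source for the options and link a GL program here.
    ShaderProgram compile(const std::string&) { return ShaderProgram{nextHandle_++}; }

    std::unordered_map<std::string, ShaderProgram> cache_;
    unsigned nextHandle_ = 1;
};
```

At draw time you would call glUseProgram on `programFor(key).glHandle` per material; the cache guarantees each feature combination is compiled only once.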

OpenGL ES Shaders and 64-bit iPhone 5S

Submitted by 帅比萌擦擦* on 2019-11-29 19:50:18
Question: I just started testing with the iPhone 5S and the 64-bit architecture in an OpenGL ES app. The problem I'm seeing is that CGFloat values are way off by the time they get to the shaders. I pass in 0.8 and it changes to -1.58819e-23 when I debug the shader. I am using glUniform4fv() to pass in the value. Do I need to use a different data type, or a different method to pass in the values? The value goes through fine when I test on 32-bit.

    CGFloat brushColor[4];
    brushColor[0] = 0.8;
    brushColor[1] = …

WebGL/GLSL - How does a ShaderToy work?

Submitted by 江枫思渺然 on 2019-11-29 19:40:35
I've been knocking around Shadertoy (https://www.shadertoy.com/) recently, in an effort to learn more about OpenGL, and GLSL in particular. From what I understand so far, the OpenGL user first has to prepare all the geometry to be used and configure the OpenGL server (number of lights allowed, texture storage, etc.). Once that's done, the user then has to provide at least one vertex shader program and one fragment shader program before an OpenGL program compiles. However, when I look at the code samples on Shadertoy, I only ever see one shader program, and most of the geometry used appears to …
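The resolution to the puzzle: Shadertoy supplies the geometry (a full-screen quad) and the vertex shader for you, so the code you see on the site is only the fragment stage. Its entry point is `mainImage`, and the default "new shader" template looks like this:

```glsl
// Shadertoy calls this once per pixel; iResolution and iTime are built-in uniforms.
void mainImage(out vec4 fragColor, in vec2 fragCoord)
{
    vec2 uv = fragCoord / iResolution.xy;  // normalized pixel coordinates, 0..1
    // Time-varying color gradient.
    vec3 col = 0.5 + 0.5 * cos(iTime + uv.xyx + vec3(0.0, 2.0, 4.0));
    fragColor = vec4(col, 1.0);
}
```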