shader

Per-Vertex Normals from Perlin noise?

假如想象 submitted on 2019-12-04 03:12:48
I'm generating terrain in an OpenGL geometry shader and am having trouble calculating normals for lighting. I'm generating the terrain dynamically each frame with a Perlin noise function implemented in the geometry shader. Because of this, I need an efficient way to calculate per-vertex normals based on the noise function (no texture or anything). I am able to take the cross product of two sides to get face normals, but since they are generated dynamically with the geometry I cannot then go back and smooth the face normals into vertex normals. How can I get vertex normals on the fly using just the noise function?
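A common way out, since the surface is a pure function of the noise: evaluate the height function at neighboring points and build the normal from the finite-difference gradient, right in the shader. A minimal GLSL sketch, assuming the same 2D noise function used for displacement is callable as snoise(vec2) (a hypothetical name) and the terrain is a height field y = h(x, z):

```glsl
// Approximate the vertex normal of the height field y = snoise(p)
// by central differences; eps is a small step in noise-space units.
vec3 terrainNormal(vec2 p, float eps) {
    float hL = snoise(p - vec2(eps, 0.0));
    float hR = snoise(p + vec2(eps, 0.0));
    float hD = snoise(p - vec2(0.0, eps));
    float hU = snoise(p + vec2(0.0, eps));
    // The normal of y = h(x, z) is proportional to (-dh/dx, 1, -dh/dz).
    return normalize(vec3(hL - hR, 2.0 * eps, hD - hU));
}
```

This costs four extra noise evaluations per vertex, but needs no neighbor information, which fits the one-pass nature of a geometry shader.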

Render an SCNGeometry as a wireframe

醉酒当歌 submitted on 2019-12-04 03:06:46
I'm using SceneKit on iOS and I have a geometry I want to render as a wireframe. So basically I want to draw only the lines, with no textures. I figured out that I could use the shaderModifiers property of the SCNMaterial in use to accomplish this. Example of a shader modifier: material.shaderModifiers = [ SCNShaderModifierEntryPointFragment: "_output.color.rgb = vec3(1.0) - _output.color.rgb;" ] This example simply inverts the output colors. I know nothing about the GLSL language I have to use for the shader fragment. Can anybody tell me what code I should use as the shader fragment to render only the wireframe?
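Two hedged pointers rather than a definitive answer. On newer SceneKit versions a wireframe needs no shader at all (material.fillMode = .lines, if that API is available on the deployment target). Doing it in a fragment shader is the classic barycentric-coordinate trick, which needs a per-corner attribute that a SceneKit shader modifier does not provide out of the box; in plain GLSL the technique looks roughly like this, assuming a varying vbc set to (1,0,0)/(0,1,0)/(0,0,1) on the three corners of each triangle:

```glsl
#extension GL_OES_standard_derivatives : enable
precision mediump float;

// Barycentric coordinate of the fragment; exactly one component
// goes to zero as the fragment approaches the opposite edge.
varying vec3 vbc;

void main() {
    // Screen-space width of one barycentric unit, for a line of
    // constant pixel thickness regardless of triangle size.
    vec3 d = fwidth(vbc);
    vec3 edgeFactor = smoothstep(vec3(0.0), d * 1.5, vbc);
    float line = 1.0 - min(min(edgeFactor.x, edgeFactor.y), edgeFactor.z);
    gl_FragColor = vec4(vec3(line), 1.0); // white lines on black fill
}
```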

Referencing texels in a data texture using indexes in the shader

こ雲淡風輕ζ submitted on 2019-12-04 02:38:32
Question: I have values in the texels of a DataTexture that I am trying to access using indexes in my shader. The indexes [0, 1, 2, 3, 4, 5, 6... 62, 63] are continuous, while the data texture has a height and width ( uTextureDimension ) of 8. After some research I wrote this function to take a particular index value and reference the corresponding texel: vec2 customUV = vec2( mod(aIndex, uTextureDimension) / uTextureDimension, floor(aIndex / uTextureDimension) / uTextureDimension ); vec4 texelValues
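One pitfall with a UV computed this way: it addresses the lower-left corner of the texel rather than its center, so the lookup can round onto a neighboring texel. The usual fix is a half-texel offset; a sketch, with uDataTexture standing in for the sampler name:

```glsl
// Map a linear index [0 .. N*N-1] to the UV of the texel CENTER of
// an N x N data texture; the +0.5 keeps the lookup away from texel
// boundaries, where rounding can land on the wrong texel.
vec2 indexToUV(float index, float textureDimension) {
    float col = mod(index, textureDimension);
    float row = floor(index / textureDimension);
    return (vec2(col, row) + 0.5) / textureDimension;
}

// usage:
// vec4 texelValues = texture2D(uDataTexture, indexToUV(aIndex, uTextureDimension));
```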

Do I have to use shared vertices in mesh in Unity?

若如初见. submitted on 2019-12-04 02:19:48
Question: I want to create procedurally generated landscape meshes with a flat-shaded look in Unity3D. I thought it would be best to create three unique vertices per triangle and use one calculated normal for all three vertices. Building the mesh this way would lead to redundant vertex position information. (Would it have an impact on render time?) Anyway... the problem is that I would like to use shading techniques, e.g. ambient occlusion, on this mesh. I don't want to mess up the mesh topology that Unity3D
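For what it's worth, a flat-shaded look does not strictly require tripled vertices: the face normal can be reconstructed per fragment from screen-space derivatives of the interpolated position, keeping the shared-vertex topology intact. A generic GLSL sketch of the idea (Unity shaders are written in HLSL, so this shows the technique, not drop-in Unity code; on GLES2 it needs the standard-derivatives extension):

```glsl
#extension GL_OES_standard_derivatives : enable
precision mediump float;

// View- or world-space position interpolated from the vertex stage.
varying vec3 vPosition;

void main() {
    // The cross product of the position's screen-space derivatives
    // is constant across a triangle: the unsmoothed face normal.
    vec3 faceNormal = normalize(cross(dFdx(vPosition), dFdy(vPosition)));
    float diffuse = max(dot(faceNormal, normalize(vec3(0.4, 1.0, 0.3))), 0.0);
    gl_FragColor = vec4(vec3(diffuse), 1.0);
}
```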

Passing own struct into opengl es 2.0 shader

不羁的心 submitted on 2019-12-04 01:58:56
I want to try a lighting example from the book OpenGL ES 2.0 Programming Guide. In the shader they have defined two structures: struct directional_light { vec3 direction; // normalized light direction in eye space vec3 halfplane; // normalized half-plane vector vec4 ambient_color; vec4 diffuse_color; vec4 specular_color; }; struct material_properties { vec4 ambient_color; vec4 diffuse_color; vec4 specular_color; float specular_exponent; }; They have also declared two uniforms based on these structures: uniform material_properties u_material_properties; uniform directional_light u_directional_light;
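The part the book leaves to the application side: a struct uniform has no location of its own; each member is located by its full name, e.g. glGetUniformLocation(program, "u_directional_light.direction"), and set with the matching glUniform* call. Inside the shader the structs are used directly; a sketch along the lines of the book's per-vertex lighting, assuming the declarations above are in scope:

```glsl
// Classic ambient + diffuse + specular sum in eye space, using the
// two struct uniforms declared above; normal must be normalized.
vec4 directionalLightColor(vec3 normal) {
    vec4 color = u_directional_light.ambient_color
               * u_material_properties.ambient_color;

    float ndotl = max(0.0, dot(normal, u_directional_light.direction));
    color += ndotl * u_directional_light.diffuse_color
                   * u_material_properties.diffuse_color;

    float ndoth = max(0.0, dot(normal, u_directional_light.halfplane));
    if (ndoth > 0.0) {
        color += pow(ndoth, u_material_properties.specular_exponent)
               * u_directional_light.specular_color
               * u_material_properties.specular_color;
    }
    return color;
}
```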

Enabling an extension on a Three.js shader

可紊 submitted on 2019-12-04 01:39:21
How can I enable an extension on a Three.js shader? My code so far. Getting the extension: var domElement = document.createElement( 'canvas' ); var gl = domElement.getContext('webgl') || domElement.getContext('experimental-webgl'); gl.getExtension('OES_standard_derivatives'); On my shader: fragmentShader: [ "#extension GL_OES_standard_derivatives : enable", "code..." ]... The console output: WARNING: 0:26: extension 'GL_OES_standard_derivatives' is not supported ERROR: 0:32: 'dFdx' : no matching overloaded function found ERROR: 0:32: '=' : cannot convert from 'const mediump float' to '2-component
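Two likely problems here. First, gl.getExtension() is being called on a context created from a brand-new canvas, not on the context the Three.js renderer actually draws with, so it enables nothing relevant. Second, Three.js prepends its own preamble to the fragment source, so a hand-written #extension line no longer sits at the top of the file where the compiler requires it. If memory serves, ShaderMaterial accepts an extensions option (extensions: { derivatives: true }) that makes Three.js inject the directive itself; the shader body then just uses the derivative functions:

```glsl
// Fragment shader body for a THREE.ShaderMaterial constructed with
// extensions: { derivatives: true }; Three.js is assumed to inject
// "#extension GL_OES_standard_derivatives : enable" on its own.
varying vec2 vUv;

void main() {
    // dFdx/dFdy/fwidth are only legal once the extension is enabled.
    float w = fwidth(vUv.x);
    gl_FragColor = vec4(vec3(w * 50.0), 1.0);
}
```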

Render “hard” edges using custom shader

我怕爱的太早我们不能终老 submitted on 2019-12-03 22:40:56
Question: I'd like to reproduce the effect created by using THREE.EdgesHelper (drawing a boundary on "hard" object edges), but using a custom shader rather than adding a separate THREE.Line object. Essentially I'd like to do what's done in this demo, but only for the "hard" boundaries, i.e. boundaries that are not between two coplanar faces. Approach: apply a similar routine to EdgesHelper, but mark vertices that are on hard edges with a custom attribute (e.g. isEdge); probably need to use
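A sketch of the shader half of that approach in GLSL, assuming a CPU pre-pass (in the spirit of EdgesHelper's dihedral-angle test) has already written a per-vertex isEdge attribute, 1.0 on hard-edge vertices and 0.0 elsewhere. The attribute and varying names are made up; position, modelViewMatrix and projectionMatrix are the built-ins Three.js supplies to a ShaderMaterial:

```glsl
// --- vertex shader ---
attribute float isEdge;   // 1.0 if this vertex lies on a hard edge
varying float vIsEdge;

void main() {
    vIsEdge = isEdge;
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}
```

```glsl
// --- fragment shader ---
varying float vIsEdge;

void main() {
    // The interpolated flag stays near 1.0 only close to hard-edge
    // vertices; darken those fragments. Interpolation is linear, so
    // the flag bleeds across the triangle; a hard threshold (step)
    // keeps the drawn band narrow.
    vec3 fill = vec3(0.8);
    vec3 edge = vec3(0.0);
    gl_FragColor = vec4(mix(fill, edge, step(0.9, vIsEdge)), 1.0);
}
```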

Camera frame yuv to rgb conversion using GL shader language

a 夏天 submitted on 2019-12-03 20:48:38
I am getting the camera frame from the Android camera preview callback as a byte array and passing it to JNI code. As we can't use byte directly in C++, I am converting it to an integer array as follows: JNIEXPORT void JNICALL Java_com_omobio_armadillo_Armadillo_onAndroidCameraFrameNative( JNIEnv* env, jobject, jbyteArray data, jint dataLen, jint width, jint height, jint bitsPerComponent) { Armadillo *armadillo = Armadillo::singleton(); jbyte *jArr = env->GetByteArrayElements(data, NULL); int dataChar[dataLen]; for (int i = 0; i < dataLen; i++) { dataChar[i] = (int) jArr[i]; } Then I am passing it to the
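For the shader half of this pipeline, the YUV-to-RGB step itself is small; a common GLSL fragment-shader sketch, assuming the luma and chroma planes were uploaded as two separate textures. Texture names and channel order are assumptions: Android preview frames are typically NV21, which stores chroma as interleaved VU, and the exact channel mapping depends on how the planes were uploaded.

```glsl
precision mediump float;

varying vec2 vTexCoord;
uniform sampler2D uTexY;   // luma plane (LUMINANCE texture)
uniform sampler2D uTexVU;  // interleaved chroma plane (LUMINANCE_ALPHA)

void main() {
    float y = texture2D(uTexY, vTexCoord).r;
    // Chroma is stored biased by 0.5 so it can represent signed values.
    float v = texture2D(uTexVU, vTexCoord).r - 0.5;
    float u = texture2D(uTexVU, vTexCoord).a - 0.5;

    // BT.601 YUV -> RGB conversion.
    vec3 rgb = vec3(
        y + 1.402 * v,
        y - 0.344 * u - 0.714 * v,
        y + 1.772 * u
    );
    gl_FragColor = vec4(rgb, 1.0);
}
```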

GPU Instancing

巧了我就是萌 submitted on 2019-12-03 20:10:43
Purpose: batch-render objects that share the same Mesh, in order to reduce the draw-call count. These objects can still have different per-instance parameters, such as color and scale.

How GPU Instancing differs from static and dynamic batching: with static or dynamic batching, every batched object must use identical material parameters, because changing a parameter through Renderer.sharedMaterial affects all objects, while Renderer.material creates a new material instance, which breaks batching. With GPU Instancing, objects of the same kind share one material object, yet different parameters can still be set per instance through the scripting API and the objects are still rendered as one batch.

Requirements:
- a compatible platform and graphics API
- the same Mesh and Material
- SkinnedMeshRenderer is not supported
- the shader must support GPU Instancing

Platforms that support GPU Instancing:
- DirectX 11 and DirectX 12 on Windows
- OpenGL Core 4.1+ / ES 3.0+ on Windows, macOS, Linux, iOS and Android
- Metal on macOS and iOS
- Vulkan on Windows and Android
- PlayStation 4 and Xbox One
- WebGL (requires the WebGL 2.0 API)

Shader Target Levels. Note: when using multiple
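As a generic illustration of what instancing means at the API level (plain GLSL ES 3.0 with gl_InstanceID, not Unity's ShaderLab macros): one instanced draw call submits the mesh once, and the vertex shader fetches this copy's color, scale, and offset by its instance index. All names below are illustrative:

```glsl
#version 300 es
// One glDrawElementsInstanced call draws up to MAX_INSTANCES copies
// of the same mesh; each copy picks its parameters by gl_InstanceID.
#define MAX_INSTANCES 128

layout(location = 0) in vec3 aPosition;

uniform mat4  uViewProj;
uniform vec4  uColor[MAX_INSTANCES];
uniform vec3  uOffset[MAX_INSTANCES];
uniform float uScale[MAX_INSTANCES];

out vec4 vColor;

void main() {
    vColor = uColor[gl_InstanceID];
    vec3 worldPos = aPosition * uScale[gl_InstanceID] + uOffset[gl_InstanceID];
    gl_Position = uViewProj * vec4(worldPos, 1.0);
}
```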

Programmable Rendering Pipeline and Shading Language

有些话、适合烂在心里 submitted on 2019-12-03 20:10:01
Programming pipeline & shading language

Hello everyone. Today I'd like to introduce the basics of the programmable rendering pipeline and shading languages, so that readers who want to get started with shader programming can quickly lift the veil on shader languages. Since time is limited, I've decided to cover only three main topics; there will inevitably be places where the treatment is not detailed, so please bear with me. Think of it as a simple primer to point you in the right direction.

This chapter is divided into three parts:
1. The 3D rendering pipeline workflow
2. The programmable pipeline
3. Shading languages

The 3D rendering pipeline is the foundation of the whole workflow and indispensable background knowledge, so some explanation is warranted. Since this is review material, however, we won't go into specifics such as how coordinate-system transforms or rasterization are performed. We only care about the overall workflow; in fact, not even about every detail of it, but just the major stages we must pay attention to, as shown in the figure below.

Data submission. Whenever we want to render something, submitting (filling in) the data is indispensable, so the first step of the workflow is handling the input data. Data submission is also where we touch the 3D rendering pipeline most directly, more precisely through that pile of "set data" APIs. It lets us provide the data we need: vertex data (positions, normals, colors, texture coordinates, etc.) and constants (the world matrix, view matrix, projection matrix, texture factors, and so on).

Transform & vertex lighting. At this stage, vertices go through the world transform, the view transform, and the projection transform. Usually, after a vertex has gone through the view transform
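The transform stage described above maps one-to-one onto a minimal vertex shader; a sketch in GLSL, with placeholder uniform names for the three matrices mentioned in the text:

```glsl
// Minimal vertex shader for the transform stage: world (model),
// view, and projection transforms applied in sequence.
attribute vec3 aPosition;
attribute vec3 aNormal;

uniform mat4 uWorld;
uniform mat4 uView;
uniform mat4 uProjection;

varying vec3 vNormal;

void main() {
    // The same order as in the pipeline description above.
    gl_Position = uProjection * uView * uWorld * vec4(aPosition, 1.0);
    // Pass the world-space normal on for per-vertex lighting
    // (assumes uWorld contains no non-uniform scaling).
    vNormal = (uWorld * vec4(aNormal, 0.0)).xyz;
}
```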