shader

Vertex shader error C5145: must write to gl_Position using QShaderProgram

Submitted by 本小妞迷上赌 on 2019-12-08 04:24:28
Question: I am translating a Visual Studio C++ OpenGL project to a Qt project in order to add a UI. I have translated all the code and am using Qt classes for the OpenGL part. The problem is that when I link the shader program it reports:

Vertex info (0) : error C5145: must write to gl_Position

I am using QOpenGLFunctions_4_1_Core, and I stepped through the compile function to check that the source was read correctly. All the code is read, and the compile function
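Error C5145 means the linked program contains a vertex stage that never assigns gl_Position, which usually indicates the vertex source never reached the compiler (for example, an empty file path or the wrong shader type passed to addShaderFromSourceFile). A minimal core-profile vertex shader that satisfies the linker might look like this; the attribute and uniform names are placeholders, not taken from the question:

```glsl
#version 410 core

layout(location = 0) in vec3 position;  // hypothetical attribute name
uniform mat4 mvp;                       // hypothetical model-view-projection matrix

void main()
{
    // C5145 is raised at link time when no code path writes gl_Position
    gl_Position = mvp * vec4(position, 1.0);
}
```

If linking still fails with this source, the problem is almost certainly in how the source reaches the QOpenGLShaderProgram rather than in the GLSL itself.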

How to add fog to texture in shader (THREE.JS R76)

Submitted by 故事扮演 on 2019-12-08 03:27:28
So firstly, I am aware of this post: ShaderMaterial fog parameter does not work. My question is a bit different: I am trying to apply the fog in my three.js scene to a shader that uses a TEXTURE, and I can't figure it out. My best guess as to what is supposed to go into the fragment shader was:

resultingColor = mix(texture2D(glowTexture, vUv), fogColor, fogFactor);

This works when the texture2D part is replaced by a plain color, but with the texture it doesn't render.

THREE.glowShader = { vertexShader: [ ` varying vec2 vUv; void main() { vUv = uv; gl_Position = projectionMatrix * modelViewMatrix * vec4(position,1
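One likely cause of the failure is a type mismatch: texture2D returns a vec4, while three.js supplies the fogColor uniform as a vec3, so mix() of the two is invalid GLSL. A hedged sketch of a fragment shader that mixes only the RGB channels; fogNear, fogFar, and vFogDepth follow three.js's fog conventions but are assumptions here, not code from the question:

```glsl
uniform sampler2D glowTexture;
uniform vec3 fogColor;   // three.js supplies fog color as a vec3
uniform float fogNear;
uniform float fogFar;
varying vec2 vUv;
varying float vFogDepth; // eye-space depth passed down from the vertex shader

void main() {
    vec4 texColor = texture2D(glowTexture, vUv);
    float fogFactor = smoothstep(fogNear, fogFar, vFogDepth);
    // mix vec3 with vec3, then restore the texture's alpha channel
    gl_FragColor = vec4(mix(texColor.rgb, fogColor, fogFactor), texColor.a);
}
```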

Understanding Shader Programming

Submitted by 三世轮回 on 2019-12-08 02:50:08
Question: I am trying to understand shader programming, but at this point the documentation won't help me further.

1) Do the data type and size of the buffers have to match? In tutorial 4 from the DirectX SDK, the C++ code declares:

struct SimpleVertex { XMFLOAT3 Pos; XMFLOAT4 Color; };

while the shader file defines:

struct VS_OUTPUT { float4 Pos : SV_POSITION; float4 Color : COLOR0; };

Pos is a vector of 3 in one file and 4 in the other. How is this correct? I thought the size of
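The mismatch is legal because the C++ struct describes the vertex buffer layout, while the HLSL struct shown is the vertex shader's output. The input layout (DXGI_FORMAT_R32G32B32_FLOAT for the position) feeds a 3-component position into the vertex stage, and the shader expands it to the 4-component clip-space position the rasterizer requires. A sketch of how the two sides typically connect; VS_INPUT and VSMain are illustrative names, not taken from the tutorial:

```hlsl
struct VS_INPUT  { float3 Pos : POSITION;    float4 Color : COLOR0; };
struct VS_OUTPUT { float4 Pos : SV_POSITION; float4 Color : COLOR0; };

VS_OUTPUT VSMain(VS_INPUT input)
{
    VS_OUTPUT output;
    // promote the 3-component buffer position to the 4-component
    // homogeneous position the pipeline expects (w = 1)
    output.Pos = float4(input.Pos, 1.0f);
    output.Color = input.Color;
    return output;
}
```

The input assembler also fills missing components with defaults (w = 1 for position-like data), which is why a float4 input semantic bound to a 3-float buffer can work without an explicit expansion.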

Shaders: How to draw 3D point verts without generating geometry?

Submitted by 最后都变了- on 2019-12-08 02:25:26
Question: I have a 3D WebGL scene. I am using Regl (http://regl.party/), which is a thin wrapper over WebGL, so I am essentially writing straight GLSL. This is a game project. I have an array of 3D positions [[x, y, z], …] which are bullets, or projectiles. I want to draw each bullet as a simple cube, sphere, or particle; there is no requirement on the appearance. How can I write shaders and a draw call for this without having to create a duplicated set of geometry for every bullet? I would prefer an answer with a vert and frag
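The standard approach here is instanced rendering: upload the cube geometry once, plus a second per-instance buffer of bullet positions; in regl this is an attribute declared with divisor: 1 (backed by ANGLE_instanced_arrays in WebGL 1). A hedged vertex-shader sketch; the attribute and uniform names are placeholders:

```glsl
attribute vec3 position;  // one shared cube mesh, uploaded once
attribute vec3 offset;    // per-bullet world position (instance divisor = 1)
uniform mat4 projection, view;
uniform float scale;      // bullet size

void main() {
    // every instance reuses the same cube vertices, shifted by its own offset
    gl_Position = projection * view * vec4(position * scale + offset, 1.0);
}
```

The draw call then sets instances to the bullet count, so one cube's worth of vertices is rasterized once per bullet with no geometry duplication on the CPU side.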

Opengl Simple Fragment Shader to overlay semi-transparent Triangle Strip over texture

Submitted by 自作多情 on 2019-12-08 00:13:39
Question: I have a textured triangle strip that forms a quad. When you click on it, I want the surrounding areas to get marked with semi-transparent quads so you can still see the textures underneath. I have the quads displaying correctly, but they are not transparent at all and completely cover whatever is underneath. I have a very simple fragment shader that I thought would work together with glEnable(GL_BLEND) and glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA):

#version 130
out vec4 flatColor;
void
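For the blend function to have any visible effect, the fragment shader must output an alpha below 1.0, GL_BLEND must be enabled at the moment the overlay quads are drawn, and the quads must be drawn after the textured strip (blending composites against what is already in the framebuffer). A minimal sketch in the question's #version 130 style; the color values are arbitrary:

```glsl
#version 130

out vec4 flatColor;

void main()
{
    // with glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA),
    // an alpha of 0.4 leaves 60% of the texture underneath visible
    flatColor = vec4(0.2, 0.4, 1.0, 0.4);
}
```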

sphere texture mapping error

Submitted by 人盡茶涼 on 2019-12-07 23:58:29
Question: I use the D3DXCreateSphere method to create a sphere mesh, and I want to apply an earth texture to it. I calculate the texture coordinates in the pixel shader with the following code:

sampler ShadeSampler = sampler_state {
    Texture = (ShadeTex);
    AddressU = Mirror;
    AddressV = Mirror;
    MinFilter = LINEAR;
    MagFilter = LINEAR;
    MipFilter = LINEAR;
};

PS_OUTPUT PSMain(VS_OUTPUT input){
    PS_OUTPUT output = (PS_OUTPUT)0;
    vector uv;
    uv.x = 0.5 + atan2(input.normal.z, input.normal.x) / piMul2;
    uv.y = 0.5f - asin(input
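For reference, the usual normal-to-spherical mapping divides the atan2 (longitude) term by 2π but the asin (latitude) term by π only; using the same divisor for both is a common cause of a squashed or doubled texture. A hedged sketch of the mapping as an HLSL helper (SphereUV and PI are illustrative names). Note that computing UVs per pixel from the normal also produces a visible seam where atan2 wraps from +π to -π, because mip selection sees a discontinuity there:

```hlsl
static const float PI = 3.14159265f;

float2 SphereUV(float3 n)
{
    float2 uv;
    uv.x = 0.5f + atan2(n.z, n.x) / (2.0f * PI); // longitude over 2*pi
    uv.y = 0.5f - asin(n.y) / PI;                // latitude over pi, not 2*pi
    return uv;
}
```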

Depth of Field shader for points/strokes in Processing

Submitted by 孤街浪徒 on 2019-12-07 23:53:21
Question: Recently I've been using the depth-of-field shader below (originally from the ofxPostProcessing library for openFrameworks) in my Processing sketches.

depth.glsl:

uniform float maxDepth;
void main() {
    float depth = gl_FragCoord.z / gl_FragCoord.w;
    gl_FragColor = vec4(vec3(1.0 - depth / maxDepth), 1.0);
}

dof.glsl:

uniform sampler2D texture;
varying vec4 vertexture;
varying vec4 vertTexCoord;
uniform sampler2D tDepth;
uniform float maxBlur; // max blur amount
uniform float aperture; // aperture -
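One small robustness point about the depth pass: gl_FragCoord.z / gl_FragCoord.w approximates eye-space depth, so for geometry farther away than maxDepth the expression 1.0 - depth/maxDepth goes negative. Clamping keeps the encoded depth in [0, 1]. A hedged variant of depth.glsl (same logic as the original, with the clamp added):

```glsl
uniform float maxDepth;

void main() {
    // approximately eye-space depth
    float depth = gl_FragCoord.z / gl_FragCoord.w;
    // clamp so fragments beyond maxDepth encode as 0 rather than a negative value
    gl_FragColor = vec4(vec3(1.0 - clamp(depth / maxDepth, 0.0, 1.0)), 1.0);
}
```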

SceneKit painting on texture with texture coordinates

Submitted by 跟風遠走 on 2019-12-07 21:24:25
Question: I have a Collada model that I load into SceneKit. When I perform a hit test on the model, I am able to retrieve the texture coordinates of the point that was hit. With these texture coordinates I should be able to replace the texel with a color, and this way I should be able to draw on the model. Correct me if I am wrong so far. I have read a lot of articles, but I just can't get my shaders right (though I did get some funky effects ;-). My vertex shader:

precision highp float;
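One way to "paint" at a hit-tested texture coordinate is a SceneKit fragment shader modifier that recolors every fragment whose diffuse texture coordinate lies within a brush radius of the hit. This is only a sketch: u_paintUV and u_radius are hypothetical uniforms you would set from the hit-test result, and the _surface/_output symbols are SceneKit shader-modifier conventions:

```glsl
// sketch for SCNShaderModifierEntryPointFragment (GLSL-style syntax)
uniform vec2 u_paintUV;   // texture coordinate taken from the SCNHitTestResult
uniform float u_radius;   // hypothetical brush radius in UV space

#pragma body
float d = distance(_surface.diffuseTexcoord, u_paintUV);
if (d < u_radius) {
    // blend a red brush mark over the shaded output color
    _output.color.rgb = mix(_output.color.rgb, vec3(1.0, 0.0, 0.0), 0.8);
}
```

A single uniform can only show one mark at a time; persistent drawing would mean accumulating strokes into a texture and sampling that instead.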

How to use shaders in MonoGame?

Submitted by 删除回忆录丶 on 2019-12-07 19:27:18
Question: Can someone tell me how to use shaders in MonoGame? I have the same error as described here: https://gamedev.stackexchange.com/questions/46994/has-anyone-got-the-krypton-lighting-engine-working-in-monogame-for-windows-8. I tried to use 2MGFX, but the tool reports: "The effect must contain at least one technique and pass." From what I can see of the myshader.fx file, it does. Here is my shader code:

sampler TextureSampler : register(s0);
float _valueAlpha = 1;
float _valueRGB = 1;
float4 main(float4 color : COLOR0,
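2MGFX's error is literal: an .fx file whose entry point is a bare function needs a technique/pass block wrapping it before the tool will compile it. A hedged completion of the shader above; the function body, MainPS name, and ps_2_0 profile are assumptions (MonoGame's DirectX targets may require a ps_4_0_level_9_1 profile instead):

```hlsl
sampler TextureSampler : register(s0);
float _valueAlpha = 1;
float _valueRGB = 1;

float4 MainPS(float4 color : COLOR0, float2 uv : TEXCOORD0) : COLOR0
{
    float4 tex = tex2D(TextureSampler, uv) * color;
    tex.rgb *= _valueRGB;   // scale the color channels
    tex.a   *= _valueAlpha; // scale the alpha channel
    return tex;
}

// the block 2MGFX is looking for: at least one technique with one pass
technique Tint
{
    pass P0
    {
        PixelShader = compile ps_2_0 MainPS();
    }
}
```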

Unity 3D double sided shader with alpha and depth masking

Submitted by 匆匆过客 on 2019-12-07 19:23:41
Question: I want to create a shader in Unity 3D that you can attach to a material and that has both depth and transparency. So far I have created a shader that appears in the 'scene' view with transparency, but when I preview it in my VR setup I cannot see the back faces of the mesh. This is a setup for Google Cardboard, so I need to have the camera inside the mesh. This is my code so far:

Shader "Custom/transparentDepth" {
    Properties {
        _Color ("Main Color", Color) = (1,1,1,1)
        _MainTex ("Base (RGB)
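The back faces disappear because Unity culls them by default; adding Cull Off renders both sides, and the usual transparent setup is the Transparent render queue with alpha blending and ZWrite Off. A hedged legacy-style sketch reusing the question's _Color/_MainTex properties; the fixed-function SetTexture pass is only illustrative, and a surface or vertex/fragment shader would normally replace it:

```shaderlab
Shader "Custom/transparentDepthSketch" {
    Properties {
        _Color ("Main Color", Color) = (1,1,1,1)
        _MainTex ("Base (RGB)", 2D) = "white" {}
    }
    SubShader {
        Tags { "Queue" = "Transparent" "RenderType" = "Transparent" }
        Cull Off                        // draw back faces too
        ZWrite Off                      // typical for transparent surfaces
        Blend SrcAlpha OneMinusSrcAlpha // standard alpha blending
        Pass {
            SetTexture [_MainTex] {
                constantColor [_Color]
                combine texture * constant
            }
        }
    }
}
```

With the camera inside the mesh, draw-order artifacts between the two faces are still possible; sorting or a two-pass (back faces first, then front faces) variant is the usual follow-up.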