glsl

GLSL performance - function return value/type

白昼怎懂夜的黑 submitted on 2019-12-11 16:24:49

Question: I'm using bicubic filtering to smooth my heightmap, and I implemented it in GLSL. Bicubic interpolation (see the interpolate() function below): float interpolateBicubic(sampler2D tex, vec2 t) { vec2 offBot = vec2(0,-1); vec2 offTop = vec2(0,1); vec2 offRight = vec2(1,0); vec2 offLeft = vec2(-1,0); vec2 f = fract(t.xy * 1025); vec2 bot0 = (floor(t.xy * 1025)+offBot+offLeft)/1025; vec2 bot1 = (floor(t.xy * 1025)+offBot)/1025; vec2 bot2 = (floor(t.xy * 1025)+offBot+offRight)/1025; vec2 bot3 = (floor
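For reference, a minimal sketch of the same technique, assuming a Catmull-Rom kernel and GLSL 3.30 (the asker's kernel and the rest of the function are cut off above). textureSize() and texelFetch() replace the hard-coded 1025 and the manual texel arithmetic, and the clamp guards border texels:

    // Catmull-Rom interpolation of four samples p0..p3 at fraction t in [0,1].
    vec4 catmullRom(vec4 p0, vec4 p1, vec4 p2, vec4 p3, float t)
    {
        return p1 + 0.5 * t * (p2 - p0
             + t * (2.0*p0 - 5.0*p1 + 4.0*p2 - p3
             + t * (3.0*(p1 - p2) + p3 - p0)));
    }

    // 16 texel reads, interpolated first along x, then along y.
    vec4 interpolateBicubic(sampler2D tex, vec2 uv)
    {
        ivec2 size = textureSize(tex, 0);      // no hard-coded texture size
        vec2 coord = uv * vec2(size) - 0.5;    // half-texel alignment
        vec2 f     = fract(coord);
        ivec2 base = ivec2(floor(coord));

        vec4 taps[4];
        for (int y = -1; y <= 2; ++y) {
            vec4 row[4];
            for (int x = -1; x <= 2; ++x) {
                ivec2 p = clamp(base + ivec2(x, y), ivec2(0), size - 1);
                row[x + 1] = texelFetch(tex, p, 0);
            }
            taps[y + 1] = catmullRom(row[0], row[1], row[2], row[3], f.x);
        }
        return catmullRom(taps[0], taps[1], taps[2], taps[3], f.y);
    }

On the performance angle in the title: returning a vec4 by value is not itself costly; the fetch count dominates, and a common optimization rewrites the 16 texelFetch calls as 4 bilinear texture() lookups with adjusted weights.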

Per Vertex Diffuse and Specular Shader

家住魔仙堡 submitted on 2019-12-11 15:46:28

Question: I am trying to implement specular lighting in the vertex shader. I'm generating a sphere with vertices and normals, and I set up the matrices before invoking the vertex shader as follows. Generate the model matrix: glm::mat4 ModelMatrix = glm::translate(glm::mat4(), glm::vec3(0.0f, 0.0f, -2.0f)); Generate the view matrix: glm::vec3 cameraPosition = glm::vec3(0.0f, 0.0f, 0.0f); glm::vec3 cameraTarget = glm::vec3(0.0f, 0.0f, -2.0f); glm::vec3 upVector = glm::vec3(0.0f, 1.0f, 0.0f); glm::mat4 ViewMatrix =
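A minimal per-vertex Blinn-Phong sketch under the setup above (camera at the world origin, as in the question); the attribute and uniform names are illustrative, not the asker's:

    #version 330 core
    layout(location = 0) in vec3 position;
    layout(location = 1) in vec3 normal;

    uniform mat4 model;
    uniform mat4 view;
    uniform mat4 projection;
    uniform vec3 lightPosWorld;   // hypothetical light position
    uniform float shininess;      // e.g. 32.0

    out vec3 lighting;            // diffuse + specular, interpolated per fragment

    void main()
    {
        vec3 worldPos = vec3(model * vec4(position, 1.0));
        // Normal matrix: required whenever the model matrix scales non-uniformly.
        vec3 n       = normalize(mat3(transpose(inverse(model))) * normal);
        vec3 toLight = normalize(lightPosWorld - worldPos);
        vec3 toEye   = normalize(-worldPos);   // camera sits at the world origin, per the question
        vec3 halfVec = normalize(toLight + toEye);

        float diffuse  = max(dot(n, toLight), 0.0);
        float specular = pow(max(dot(n, halfVec), 0.0), shininess);
        lighting = vec3(diffuse + specular);

        gl_Position = projection * view * vec4(worldPos, 1.0);
    }

Per-vertex specular tends to look blotchy on a coarsely tessellated sphere because the highlight is interpolated linearly between vertices; moving the same math to the fragment shader fixes that.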

Strange error in GLSL while assigning attribute to a varying in vertex shader

泪湿孤枕 submitted on 2019-12-11 13:53:21

Question: I am writing a GLSL program for texture mapping and I'm having a weird problem. From my vertex shader I am trying to pass the texture coordinate vec2 as a varying to the fragment shader. In a different shader in the same program I did the same thing and it worked, but for this texture it's not working. If I comment that line out, everything works. I have no clue why this is happening. This is the vertex shader: attribute vec4 position; attribute vec4 color1; attribute vec4 normal; attribute vec2 texCoord; uniform
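The error itself is cut off above, so this is only a guess at the usual culprit: while the varying is unused, the GLSL compiler strips texCoord as an inactive attribute, and every attribute location can shift; vertex-array setup that assumes fixed indices then breaks the moment the varying becomes active. Querying locations after linking sidesteps that. A hedged C++ sketch (Vertex is a hypothetical interleaved layout):

    #include <cstddef>   // offsetof

    struct Vertex { float pos[4], color[4], norm[4], uv[2]; };

    GLint loc = glGetAttribLocation(program, "texCoord");
    if (loc >= 0) {   // -1 means inactive: stripped by the compiler when unused
        glEnableVertexAttribArray((GLuint)loc);
        glVertexAttribPointer((GLuint)loc, 2, GL_FLOAT, GL_FALSE,
                              sizeof(Vertex),
                              (const void*)offsetof(Vertex, uv));
    }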

OpenGL GL_POLYGON Not Functioning Properly

假如想象 submitted on 2019-12-11 13:38:11

Question: I have an OpenGL-related issue. Whenever I attempt to draw a simple polygon using four vertices from a vertex buffer, nothing happens. However, it will draw the shape in GL_TRIANGLES or GL_TRIANGLE_STRIP mode, albeit distorted. Am I doing something wrong? Relevant code: Vertex array: http://i.imgur.com/nEcbw.png GL_POLYGON: http://i.imgur.com/idfFT.png GL_TRIANGLES: http://imgur.com/84ey3,idfFT,nEcbw#0 GL_TRIANGLE_STRIP: http://i.imgur.com/JU3Zl.png Answer 1: I'm using a forward-compatible 3.2
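The answer is truncated, but the visible fragment points at the explanation: GL_POLYGON was removed from core and forward-compatible contexts, so the draw call raises GL_INVALID_ENUM and renders nothing, and a quad drawn as a triangle strip is distorted unless its vertices are in "Z" order rather than ring order. A hedged C++ sketch:

    // Strip order: bottom-left, bottom-right, top-left, top-right (a "Z"),
    // not the ring order (BL, BR, TR, TL) a GL_POLYGON call would expect.
    const GLfloat quad[] = {
        -0.5f, -0.5f,   // 0: bottom-left
         0.5f, -0.5f,   // 1: bottom-right
        -0.5f,  0.5f,   // 2: top-left
         0.5f,  0.5f,   // 3: top-right
    };
    glBufferData(GL_ARRAY_BUFFER, sizeof(quad), quad, GL_STATIC_DRAW);
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);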

About converting YUV(YV12) to RGB with GLSL for iOS

落爺英雄遲暮 submitted on 2019-12-11 13:32:00

Question: I'm trying to convert YUV (YV12) to RGB with a GLSL shader, in the following steps: (1) read the raw YUV (YV12) data from an image file, (2) filter Y, Cb and Cr out of the raw data, (3) map them to textures, (4) send them to the fragment shader. But the resulting image is not the same as the raw data. The image below is the raw data: screenshot of raw image link (available for download). The image below is the converted data: screenshot of converted image link (available for download). And below is my source code. - (void) readYUVFile { ... NSData* fileData = [NSData
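A minimal fragment-shader sketch of the conversion (assuming BT.601 full-range coefficients and three single-channel planes uploaded as GL_LUMINANCE textures on ES 2.0; the sampler names are illustrative). One classic source of wrong colors with YV12 specifically: the file stores the V plane before the U plane, so swapping the two planes at upload time produces an image like the broken screenshot:

    precision mediump float;

    varying vec2 vTexCoord;
    uniform sampler2D yTex;   // full-resolution Y plane
    uniform sampler2D uTex;   // quarter-resolution U (Cb) plane
    uniform sampler2D vTex;   // quarter-resolution V (Cr) plane

    void main()
    {
        float y = texture2D(yTex, vTexCoord).r;
        float u = texture2D(uTex, vTexCoord).r - 0.5;
        float v = texture2D(vTex, vTexCoord).r - 0.5;

        // BT.601 full-range YCbCr -> RGB; video-range data instead needs
        // (y - 16/255) * 255/219 and rescaled chroma.
        gl_FragColor = vec4(y + 1.402 * v,
                            y - 0.344 * u - 0.714 * v,
                            y + 1.772 * u,
                            1.0);
    }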

Increase the intensity of texture in shader code - OpenGL

亡梦爱人 submitted on 2019-12-11 13:16:36

Question: How can I increase the intensity of an image using OpenGL shader code? The goal is for the resulting image to look brighter than the original. I found a related link here, but it is for Android. private void CreateShaders() { /***********Vert Shader********************/ vertShader = GL.CreateShader(ShaderType.VertexShader); GL.ShaderSource(vertShader, @" attribute vec3 a_position; varying vec2 vTexCoord; void main() { vTexCoord = (a_position.xy + 1) / 2; gl_Position = vec4(a_position
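A sketch of the simplest brightness boost in the fragment shader, pairing with the vertex shader quoted above (the sampler and uniform names are mine, not from the question):

    precision mediump float;

    varying vec2 vTexCoord;
    uniform sampler2D sTexture;
    uniform float brightness;   // 1.0 = unchanged, 1.5 = 50% brighter

    void main()
    {
        vec4 color = texture2D(sTexture, vTexCoord);
        // Multiplicative gain; values above 1.0 are clamped on write to an
        // 8-bit framebuffer. pow(color.rgb, vec3(1.0/gamma)) is a perceptually
        // smoother alternative that preserves highlights better.
        gl_FragColor = vec4(color.rgb * brightness, color.a);
    }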

How to render images in WebGL from ArrayBuffer

和自甴很熟 submitted on 2019-12-11 11:24:04

Question: I have an image that I read on the server side and push to the web browser via an AJAX call. I have a requirement to render it line by line using WebGL. For example: the image is 640×480, where 640 is the width and 480 the height, so the total number of pixels is 640*480 = 307200. I want to render the whole image in 640-pixel (one-row) increments in a loop using WebGL. As far as I know, WebGL has texture2D to do this, but I have no idea of where to start
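A hedged sketch of the usual approach, written here against desktop GL (WebGL 1 exposes the same call as gl.texSubImage2D with matching arguments): allocate the full texture once, then upload one scanline at a time from the received buffer and redraw a textured quad after each upload.

    // One-time allocation of an empty 640x480 RGBA texture.
    void allocateImage()
    {
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 640, 480, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, nullptr);
    }

    // Called once per scanline received from the server;
    // rowPixels points at 640 RGBA texels (2560 bytes) from the buffer.
    void uploadRow(int row, const unsigned char* rowPixels)
    {
        glTexSubImage2D(GL_TEXTURE_2D, 0,
                        0, row,      // x offset, y offset
                        640, 1,      // one full-width row, one texel tall
                        GL_RGBA, GL_UNSIGNED_BYTE, rowPixels);
        // ...then redraw the full-screen textured quad to show progress.
    }

Note that GL's texture row 0 is conventionally the bottom, so the image fills in bottom-up unless the texture coordinates are flipped.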

glGetUniformLocation() returning -1 even though used in vertex shader

纵然是瞬间 submitted on 2019-12-11 11:07:39

Question: I'm trying to render a simple cube with normals. I'm using the following code to initialize the shaders. void initShader(const char* vertexShaderPath, const char* fragmentShaderPath){ cout<<"Initializing Shaders"<<endl<<"==="<<endl; cout<<"Vertex shader "; vs = loadShader(vertexShaderPath, GL_VERTEX_SHADER); cout<<"Fragment shader "; fs = loadShader(fragmentShaderPath, GL_FRAGMENT_SHADER); cout<<"Creating program: "; program = glCreateProgram(); if(0==program) cout<< "Failed"<<endl; else cout
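The usual reasons glGetUniformLocation() returns -1 even for a name that appears in the vertex shader: the uniform is declared but never contributes to any shader output, so the linker strips it; the name is misspelled or actually a block member; or the query runs before a successful link. A hedged C++ sketch of the safe pattern ("normalMatrix" is an illustrative name):

    glAttachShader(program, vs);
    glAttachShader(program, fs);
    glLinkProgram(program);

    GLint linked = GL_FALSE;
    glGetProgramiv(program, GL_LINK_STATUS, &linked);   // query only after linking

    GLint loc = glGetUniformLocation(program, "normalMatrix");
    if (loc == -1) {
        // Not always an error: a uniform that never affects gl_Position or
        // the fragment output is optimized away and reports -1.
    }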

Problems compiling shader source within osg application

岁酱吖の submitted on 2019-12-11 10:55:09

Question: I have an OSG application in which I want to texture-map a full-screen quad in the finalDrawCallback, because everything in my scene needs to be rendered before the texturing is done. That is why I have to use raw OpenGL calls instead of the OSG wrappers for the program and shaders. Specifically, I seem to have an issue compiling both the vertex and fragment shaders: when I call glGetShaderiv(shader, GL_COMPILE_STATUS, &param), the param value doesn't change or is undefined. Which,
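An untouched param usually means the glGetShaderiv call itself failed, most often because no GL context is current on the calling thread (easy to hit inside an OSG callback) or the shader handle is invalid. A hedged C++ sketch of a complete compile check, assuming a loader header such as <GL/glew.h> and a current context:

    #include <iostream>
    #include <vector>

    void compileOrLog(GLuint shader, const char* src)
    {
        glShaderSource(shader, 1, &src, nullptr);  // src: null-terminated GLSL
        glCompileShader(shader);

        GLint status = GL_FALSE;
        glGetShaderiv(shader, GL_COMPILE_STATUS, &status);
        if (status != GL_TRUE) {
            GLint len = 0;
            glGetShaderiv(shader, GL_INFO_LOG_LENGTH, &len);
            std::vector<char> log(len > 0 ? len : 1);
            glGetShaderInfoLog(shader, (GLsizei)log.size(), nullptr, log.data());
            std::cerr << "compile failed:\n" << log.data() << '\n';
        }
    }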

Uniform block index in different shaders

泄露秘密 submitted on 2019-12-11 10:30:10

Question: Looking at the uniform_buffer_object spec, there is no guarantee that a uniform block defined the same way in multiple shader programs will have the same index returned by glGetUniformBlockIndex(). That means I have to call glBindBufferBase() to assign the UBO the relevant index every time I switch the shader program. However, from some testing, it seems like a uniform block does have the same index in different shader programs, even when the uniform blocks are declared in
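Even if the indices happen to match in testing, the portable pattern decouples the per-program block index from the buffer by routing both through a fixed binding point once at startup, so nothing needs rebinding on program switches. A sketch ("Matrices" is an illustrative block name):

    const GLuint MATRICES_BINDING = 0;   // application-chosen binding point

    // Once per program, at init time:
    GLuint idx = glGetUniformBlockIndex(program, "Matrices");
    if (idx != GL_INVALID_INDEX)
        glUniformBlockBinding(program, idx, MATRICES_BINDING);

    // Once for the buffer; stays valid across program switches:
    glBindBufferBase(GL_UNIFORM_BUFFER, MATRICES_BINDING, ubo);

With GLSL 4.20 or ARB_shading_language_420pack, layout(binding = 0) on the block declaration removes even the glUniformBlockBinding call.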