glsl

Mac OS 10.8 supports GLSL 3.30?

Submitted by 元气小坏坏 on 2019-12-03 11:43:10
Question: I'm following the arcsynthesis tutorials on OpenGL 3.3 using 10.8 Mountain Lion, and when building the project it compiles and runs the shaders using GLSL version 3.30. However, even in the core profile on Mac OS 10.8 I shouldn't have GLSL 3.30 support, only 1.50 (as highlighted in the picture). Is anybody able to explain how I have managed to achieve this black magic?

Answer 1: OS X 10.8 still only supports OpenGL 3.2, but with some 3.3 features such as specifying attribute location (#extension GL
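The extension being referred to is presumably GL_ARB_explicit_attrib_location, which backports the 3.30-style layout(location = N) qualifiers onto a 3.2 core context. A minimal sketch of a GLSL 1.50 vertex shader using it, assuming the Mac driver actually advertises that extension:

#version 150
#extension GL_ARB_explicit_attrib_location : require

// 3.30-style explicit locations, but still declared as a 1.50 shader
layout(location = 0) in vec4 position;
layout(location = 1) in vec4 color;

out vec4 vColor;

void main()
{
    vColor = color;
    gl_Position = position;
}

If the extension is not advertised, the locations have to be assigned from the application side with glBindAttribLocation before linking the program.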

What exactly does the mat3(a mat4 matrix) statement in GLSL do?

Submitted by て烟熏妆下的殇ゞ on 2019-12-03 11:07:49
Question: I'm doing per-fragment lighting, and when correcting the normal vector I have this code: vec3 f_normal = mat3(MVI) * normal; where MVI is mat4 MVI = transpose(inverse(ModelViewMatrix));. So what does the mat3(MVI) expression return?

Answer 1: mat3(MVI) * normal takes the upper 3x3 matrix of the 4x4 matrix and multiplies the normal by it. This matrix is called the 'normal matrix'. You use it to bring your normals from world space to eye space. The upper 3x3 portion of the matrix carries scale and rotation, while the rest is only for translation (and normals are never translated). To take a
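As a concrete illustration, here is a minimal vertex shader sketch (the uniform and attribute names are placeholders, not taken from the question) that builds the inverse-transpose and uses its upper 3x3 to move the normal into eye space:

#version 330 core

uniform mat4 ModelViewMatrix;
uniform mat4 ProjectionMatrix;

in vec3 position;
in vec3 normal;

out vec3 f_normal;

void main()
{
    // inverse-transpose of the model-view; mat3() then keeps only the
    // upper 3x3, which is what rotates/scales directions such as normals
    mat4 MVI = transpose(inverse(ModelViewMatrix));
    f_normal = normalize(mat3(MVI) * normal);
    gl_Position = ProjectionMatrix * ModelViewMatrix * vec4(position, 1.0);
}

In practice the inverse-transpose is usually computed once per draw call on the CPU and passed in as a mat3 uniform rather than recomputed per vertex.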

OpenGL GLSL 3.30 in Ubuntu 14.10 mesa 10.1.3

Submitted by Anonymous (unverified) on 2019-12-03 10:10:24
Question: When I try to compile a GLSL shader with OpenGL on Ubuntu I get the following error:

0:1(10): error: GLSL 3.30 is not supported. Supported versions are: 1.10, 1.20, 1.30, and 1.00 ES

But when I run "glxinfo | grep OpenGL" it says:

OpenGL vendor string: X.Org
OpenGL renderer string: Gallium 0.4 on AMD JUNIPER
OpenGL core profile version string: 3.3 (Core Profile) Mesa 10.1.3
OpenGL core profile shading language version string: 3.30
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL core profile

How to implement this tunnel like animation in WebGL? [closed]

Submitted by 最后都变了- on 2019-12-03 09:50:14
Question: How to implement this tunnel-like animation in WebGL? Source: http://dvdp.tumblr.com/ See also: How to implement this rotating spiral in WebGL? (The question was closed 8 years ago as ambiguous, vague, incomplete, overly broad, or rhetorical; see the help center for how to clarify it so that it can be reopened.)

Answer 1: Well, this was fun. :) A WebGL demo is available here: http:/
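One common way to get this kind of tunnel effect (an assumption about the technique, not necessarily what the linked demo does) is to map each pixel to polar coordinates, use the angle as one texture axis and the inverse radius as the other, and scroll over time. A WebGL 1 (GLSL ES 1.00) fragment shader sketch, assuming a power-of-two texture with REPEAT wrapping and made-up uniform names:

precision mediump float;

uniform sampler2D tex;     // tiling tunnel texture
uniform vec2 resolution;   // canvas size in pixels
uniform float time;        // seconds, drives the forward motion

void main() {
    // pixel position in -1..1, aspect-corrected
    vec2 p = (2.0 * gl_FragCoord.xy - resolution) / resolution.y;
    float angle = atan(p.y, p.x) / 3.14159265;   // position around the ring
    float radius = length(p);
    vec2 uv = vec2(angle, 1.0 / radius + time);  // depth scrolls with time
    gl_FragColor = texture2D(tex, uv) * radius;  // fade toward the centre
}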

How to write const array in GLSL ES

Submitted by Anonymous (unverified) on 2019-12-03 08:59:04
Question: I am trying to write a simple vertex shader for an OpenGL ES app on the iPhone, but my array constructor is causing me trouble.

attribute vec4 normal;
attribute vec4 position;

void main(void){
    const vec4 vertices[3] = vec4[](vec4(0.25, -0.25, 0.5, 1.0),
                                    vec4(-0.25, -0.25, 0.5, 1.0),
                                    vec4(0.25, 0.25, 0.5, 1.0));
    gl_Position = vertices[gl_VertexID];
}

When using this code the shader fails to compile, giving the error message:

ERROR: 0:13: '(' : syntax error: Array size must appear after variable name

Answer 1: The GLSL version used with ES
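The usual resolution (stated here as an assumption, not a quote of the answer) is that this construct only works from GLSL ES 3.00 onward, i.e. on an OpenGL ES 3.0 context (roughly iPhone 5s and later); under ES 1.00 there are neither array constructors nor gl_VertexID. A sketch of the same shader as ES 3.00:

#version 300 es
// ES 3.00 accepts array constructors, const arrays and gl_VertexID;
// note that at this version 'attribute' becomes 'in'.

void main(void) {
    const vec4 vertices[3] = vec4[3](vec4( 0.25, -0.25, 0.5, 1.0),
                                     vec4(-0.25, -0.25, 0.5, 1.0),
                                     vec4( 0.25,  0.25, 0.5, 1.0));
    gl_Position = vertices[gl_VertexID];
}

If only ES 2.0 is available, the positions have to be supplied as a regular vertex attribute instead of being indexed from a constant array.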

Why are dFdx/ddx and dFdy/ddy two-dimensional variables when querying a 2D texture?

Submitted by 谁说胖子不能爱 on 2019-12-03 08:41:37
Question: I cannot seem to understand this: shouldn't the derivative/change along the U or V coordinate in a 2D texture/array be a single-dimension variable, since we are checking it only along ddx (the U coordinate) or ddy (the V coordinate)?

Answer 1: There are 4 distinct partial derivatives here: du/dx, dv/dx, du/dy, and dv/dy. None of those four values need be zero, unless the texture image coordinates happen to be perfectly aligned to the display screen axes. In general the texture coordinate axes need not be aligned to the screen display axes. X and Y (display viewport axes) are not the same directions as U and V
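Put in code, each of the two built-ins already packs two of those four partial derivatives; a fragment shader sketch (GLSL 3.30, names made up):

#version 330 core

uniform sampler2D tex;
in vec2 uv;
out vec4 fragColor;

void main()
{
    vec2 dUVdx = dFdx(uv);   // (du/dx, dv/dx): change of uv per step along screen x
    vec2 dUVdy = dFdy(uv);   // (du/dy, dv/dy): change of uv per step along screen y
    // textureGrad takes the derivatives explicitly, which is useful when uv has
    // been warped and the implicit derivatives would pick the wrong mip level
    fragColor = textureGrad(tex, uv, dUVdx, dUVdy);
}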

Only GLSL shader version 120 works on Mac OS X

Submitted by Anonymous (unverified) on 2019-12-03 08:28:06
Question: I have a problem with the GLSL version on my Mac OS X 10.9.2. I'm writing a program in C++ with OpenGL and SDL2, and I can't upgrade from version 120 to any higher version. How can I upgrade? I compile with g++ and these flags: -framework SDL2 -lSDLmain -framework OpenGL -framework SDL2_image -framework cocoa

ERROR: 0:3: '' : version '330' is not supported

Answer 1: On OS X 10.9, to create an OpenGL 3.3/4.1 context you need to add the following snippet before SDL_CreateWindow. A full example is available here: https://gist.github

How to create billboard matrix in glm

Submitted by 耗尽温柔 on 2019-12-03 08:25:47
Question: How to create a billboard translation matrix from a point in space using glm?

Answer 1: Just set the upper left 3×3 submatrix of the transformation to identity.

Update: fixed-function OpenGL variant:

void makebillboard_mat4x4(double *BM, double const * const MV) {
    for(size_t i = 0; i < 3; i++) {
        for(size_t j = 0; j < 3; j++) {
            BM[4*i + j] = i==j ? 1 : 0;
        }
        BM[4*i + 3] = MV[4*i + 3];
    }
    for(size_t i = 0; i < 4; i++) {
        BM[12 + i] = MV[12 + i];
    }
}

void mygltoolMakeMVBillboard(void) {
    GLenum active_matrix;
    double MV[16];

    glGetIntegerv(GL_MATRIX_MODE, &active_matrix);
    glMatrixMode(GL_MODELVIEW);
    glGetDoublev
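The same construction can also be sketched in a GLSL vertex shader (shown here instead of glm, since this page's examples are GLSL; with glm it amounts to writing an identity basis into the upper-left 3×3 of the model-view matrix before multiplying). Uniform names are placeholders:

#version 330 core

uniform mat4 ModelView;
uniform mat4 Projection;

in vec3 position;

void main()
{
    mat4 BM = ModelView;
    // overwrite the rotation/scale part (upper-left 3x3) with identity,
    // leaving column 3 untouched so the translation survives
    BM[0].xyz = vec3(1.0, 0.0, 0.0);
    BM[1].xyz = vec3(0.0, 1.0, 0.0);
    BM[2].xyz = vec3(0.0, 0.0, 1.0);
    gl_Position = Projection * BM * vec4(position, 1.0);
}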

Dashed line in OpenGL3?

Submitted by 一笑奈何 on 2019-12-03 08:01:41
Question: I'm currently porting an old OpenGL 1.1 application that makes use of wireframe models over to OpenGL 3.0. In 1.1 the following code is used to create a dashed line:

glPushAttrib(GL_ENABLE_BIT);
glLineStipple(1, 0x0F0F);
glEnable(GL_LINE_STIPPLE);

Here, as usual, the parameters are pushed onto the attribute stack in order to influence all following drawing operations. My question: how is this done in OpenGL 3, where this stack is no longer used? How can I set up my lines to be dashed (probably before handing the coordinates over to glBufferData())?

Answer 1: For separate line segments, this is not very complicated at all. For
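One widely used core-profile replacement for glLineStipple (an assumption about the approach, not a quote of the answer) is to have the application or a geometry shader emit a per-vertex distance along the line, and let the fragment shader discard fragments according to the 16-bit pattern, e.g. 0x0F0F. A fragment shader sketch:

#version 330 core

in float dist;                      // distance along the line, in pixels (set up elsewhere)
uniform uint pattern = 0x0F0Fu;     // same meaning as glLineStipple's pattern
uniform float factor = 1.0;         // same meaning as glLineStipple's factor
out vec4 fragColor;

void main()
{
    uint bit = uint(dist / factor) & 15u;   // which of the 16 pattern bits applies here
    if ((pattern & (1u << bit)) == 0u)
        discard;                            // gap of the dash pattern
    fragColor = vec4(1.0);
}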

THREE.js blur the frame buffer

Submitted by 痞子三分冷 on 2019-12-03 08:00:21
Question: I need to blur the frame buffer and I don't know how to get the frame buffer using THREE.js. I want to blur the whole frame buffer rather than blur each texture in the scene, so I guess I should read the frame buffer and then blur it, rather than doing this in shaders. Here's what I have tried. Called at init:

var renderTarget = new THREE.WebGLRenderTarget(512, 512, {
    wrapS: THREE.RepeatWrapping,
    wrapT: THREE.RepeatWrapping,
    minFilter: THREE.NearestFilter,
    magFilter: THREE.NearestFilter,
    format
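Whichever way the frame is captured, the blur itself is normally a post-processing pass: render the scene into a render target, then draw a full-screen quad that samples it with a blur shader (in three.js this is typically wired up through THREE.EffectComposer from the examples). A minimal sketch of the horizontal half of a separable Gaussian in GLSL ES 1.00; the uniform and varying names are placeholders, and a second pass with vertical offsets completes the blur:

precision mediump float;

uniform sampler2D tDiffuse;   // the rendered frame
uniform vec2 texelSize;       // 1.0 / render target resolution
varying vec2 vUv;

void main() {
    // 5-tap Gaussian with weights 1/16, 4/16, 6/16, 4/16, 1/16
    vec4 sum = vec4(0.0);
    sum += texture2D(tDiffuse, vUv + vec2(-2.0, 0.0) * texelSize) * 0.0625;
    sum += texture2D(tDiffuse, vUv + vec2(-1.0, 0.0) * texelSize) * 0.25;
    sum += texture2D(tDiffuse, vUv)                               * 0.375;
    sum += texture2D(tDiffuse, vUv + vec2( 1.0, 0.0) * texelSize) * 0.25;
    sum += texture2D(tDiffuse, vUv + vec2( 2.0, 0.0) * texelSize) * 0.0625;
    gl_FragColor = sum;
}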