shader

How can I convert UV coordinates to world space?

偶尔善良 submitted on 2019-12-24 09:24:09
Question: I am trying to implement a shader to be used with Unity's LineRenderer. The shader should have noise that scrolls over time relative to the texture coordinates, for example parallel to the x axis of the texture's UV space. I have an implementation, but I don't know how to get that direction relative to the texture UV (taking the texture rotation into account) in the vert function. I only have world-space-relative scrolling. The main problem: how do I convert UV coordinates (for example (0, 0) or (1, 0)) to world space? Here is
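
As general background rather than the asker's Unity setup: inside a single triangle, the mapping from UV space to world space is affine, so the world-space directions of the +u and +v axes (tangent and bitangent) can be solved from the triangle's positions and UVs. A minimal C++ sketch of that math, using small hypothetical Vec2/Vec3 helpers:

    #include <cstdio>

    struct Vec2 { float x, y; };
    struct Vec3 { float x, y, z; };

    static Vec3 sub(Vec3 a, Vec3 b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
    static Vec3 add(Vec3 a, Vec3 b) { return { a.x + b.x, a.y + b.y, a.z + b.z }; }
    static Vec3 mul(Vec3 a, float s) { return { a.x * s, a.y * s, a.z * s }; }

    // Given one triangle (world positions p0..p2, UVs t0..t2), return the world
    // position corresponding to (u, v). T is the world-space direction of +u
    // (the tangent), B the direction of +v (the bitangent).
    Vec3 uvToWorld(Vec3 p0, Vec3 p1, Vec3 p2, Vec2 t0, Vec2 t1, Vec2 t2, float u, float v)
    {
        Vec3 e1 = sub(p1, p0), e2 = sub(p2, p0);          // world-space edges
        float du1 = t1.x - t0.x, dv1 = t1.y - t0.y;       // UV-space edges
        float du2 = t2.x - t0.x, dv2 = t2.y - t0.y;
        float r = 1.0f / (du1 * dv2 - du2 * dv1);         // inverse determinant
        Vec3 T = mul(sub(mul(e1, dv2), mul(e2, dv1)), r); // world delta per unit u
        Vec3 B = mul(sub(mul(e2, du1), mul(e1, du2)), r); // world delta per unit v
        return add(p0, add(mul(T, u - t0.x), mul(B, v - t0.y)));
    }

    int main()
    {
        // Triangle spanning a unit quad corner in the XZ plane, UVs matching XZ.
        Vec3 w = uvToWorld({0,0,0}, {1,0,0}, {0,0,1}, {0,0}, {1,0}, {0,1}, 0.5f, 0.0f);
        std::printf("(%.2f, %.2f, %.2f)\n", w.x, w.y, w.z); // prints (0.50, 0.00, 0.00)
    }

In a Unity vertex function the same +u direction is usually obtained more directly from the mesh tangent transformed to world space, but the linear algebra above is what that tangent encodes.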

Vertex shader / vertex program: write vertex attributes to VBO?

前提是你 submitted on 2019-12-24 09:01:56
Question: Is there a way to alter vertex attributes in a vertex shader / vertex program and save the changes back into the VBO?
Answer 1: Yes, that is called Transform Feedback in OpenGL (or Stream-Out in DirectX): http://www.opengl.org/registry/specs/EXT/transform_feedback.txt http://www.opengl.org/registry/doc/glspec42.core.20120119.pdf (page 158) http://msdn.microsoft.com/en-us/library/windows/desktop/bb205121.aspx
Source: https://stackoverflow.com/questions/9530387/vertexshader-vertexprogram-write-vertex
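
Since the answer only links the specifications, here is a hedged C++ sketch of the core transform-feedback calls it refers to. It assumes an existing OpenGL 3.x context, GLEW already initialized, and a vertex shader whose captured output is named outPosition (all placeholder names):

    #include <GL/glew.h>

    // Tell the linker which vertex-shader output to capture; this must happen
    // before glLinkProgram.
    GLuint buildFeedbackProgram(GLuint vertexShader)
    {
        GLuint program = glCreateProgram();
        glAttachShader(program, vertexShader);
        const char* varyings[] = { "outPosition" };   // placeholder varying name
        glTransformFeedbackVaryings(program, 1, varyings, GL_INTERLEAVED_ATTRIBS);
        glLinkProgram(program);
        return program;
    }

    // Run one capture pass: the transformed vertices land in captureVbo.
    void runFeedbackPass(GLuint program, GLuint sourceVao, GLuint captureVbo, GLsizei count)
    {
        glUseProgram(program);
        glBindVertexArray(sourceVao);
        glBindBufferBase(GL_TRANSFORM_FEEDBACK_BUFFER, 0, captureVbo);
        glEnable(GL_RASTERIZER_DISCARD);          // optional: skip rasterization
        glBeginTransformFeedback(GL_POINTS);
        glDrawArrays(GL_POINTS, 0, count);
        glEndTransformFeedback();
        glDisable(GL_RASTERIZER_DISCARD);
    }

After glEndTransformFeedback the captured vertices sit in captureVbo and can be read back with glGetBufferSubData or bound as the attribute source for a later draw.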

Runtime error with GLSL shaders: Inconsistency detected by ld.so

喜夏-厌秋 submitted on 2019-12-24 08:36:55
Question: I am writing some OpenGL code to draw a small dot in a window, but when I try to use my own shaders, I get an error message which I don't understand. So, here's my main function: int main(int argc, char** argv) { // Initialize some things glutInit(&argc, argv); glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA); glutInitWindowSize(100, 100); glutCreateWindow("OpenGL Test"); glutDisplayFunc(RenderScene); glewInit(); glClearColor(0.0f, 0.0f, 0.0f, 0.0f); // Make the vertex buffer Vector3f vertices[1]
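
The "Inconsistency detected by ld.so" message comes from the dynamic linker rather than from the GLSL itself (often a library/loader mismatch), but when debugging startup code like this it also helps to check shader compile status explicitly instead of failing silently. A small hedged sketch of that check, assuming the same GLEW/GLUT setup as above:

    #include <GL/glew.h>
    #include <cstdio>

    // Compile one shader stage and print the driver's info log on failure.
    // Assumes a current OpenGL context (e.g. the GLUT window created above).
    GLuint compileShader(GLenum type, const char* source)
    {
        GLuint shader = glCreateShader(type);
        glShaderSource(shader, 1, &source, nullptr);
        glCompileShader(shader);

        GLint ok = GL_FALSE;
        glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
        if (ok != GL_TRUE) {
            char log[1024];
            glGetShaderInfoLog(shader, sizeof(log), nullptr, log);
            std::fprintf(stderr, "Shader compile failed:\n%s\n", log);
        }
        return shader;
    }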

Custom shader material taking forever to initialize?

[亡魂溺海] submitted on 2019-12-24 07:15:10
Question: I've been working on a raymarched project in three.js for a little over a year now, and as the complexity has increased, so has the initialization time. It can now take over 40 seconds to load the project in the browser; however, once loaded it runs at 60+ fps. I've tracked down the culprit function through performance tests, and it seems to get hung up on the InitMaterial function within three's library. Does anyone have any idea as to what could be causing this hangup? Personally I believe it could be

Transitioning vertices between 3D models with three.js

喜夏-厌秋 submitted on 2019-12-24 07:01:13
Question: I am trying to achieve a polygon blowing-apart and reassembling effect similar to: http://na.leagueoflegends.com/en/featured/skins/project-2016 https://tkmh.me/ In both of these examples, you can see how they are morphing / transitioning the vertices from one 3D model to another, resulting in a pretty cool effect. I have something similar working, but I can't wrap my head around how they are transitioning the vertices with velocity offsets (please refer to the first link and see how the particles don
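
Neither site publishes its code, but the usual way to get that staggered look is to give every vertex its own start delay and ease each one from the source position to the target position inside its own time window. A C++ sketch of that idea on the CPU (in practice it would live in a vertex shader with the delay stored as an attribute); Vec3, buildMorph and evaluate are made-up names:

    #include <vector>
    #include <random>

    struct Vec3 { float x, y, z; };

    struct MorphVertex {
        Vec3  from;     // position on model A
        Vec3  to;       // position on model B
        float delay;    // per-vertex start offset in [0, 1)
    };

    // Assumes both models were resampled to the same vertex count beforehand.
    std::vector<MorphVertex> buildMorph(const std::vector<Vec3>& a, const std::vector<Vec3>& b)
    {
        std::mt19937 rng(42);
        std::uniform_real_distribution<float> dist(0.0f, 1.0f);
        std::vector<MorphVertex> out;
        for (size_t i = 0; i < a.size() && i < b.size(); ++i)
            out.push_back({ a[i], b[i], dist(rng) });
        return out;
    }

    // t runs from 0 to 1 over the whole transition; `spread` is how much of that
    // window is taken up by the per-vertex delays (0 = everything moves in lockstep).
    Vec3 evaluate(const MorphVertex& v, float t, float spread = 0.5f)
    {
        float local = (t - v.delay * spread) / (1.0f - spread); // shift + rescale per vertex
        if (local < 0.0f) local = 0.0f;
        if (local > 1.0f) local = 1.0f;
        float k = local * local * (3.0f - 2.0f * local);        // smoothstep easing
        return { v.from.x + (v.to.x - v.from.x) * k,
                 v.from.y + (v.to.y - v.from.y) * k,
                 v.from.z + (v.to.z - v.from.z) * k };
    }

Adding a per-vertex velocity or noise offset to the in-between positions, instead of pure interpolation, is what turns this into the "exploding particles" look.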

Pass array to shader

六月ゝ 毕业季﹏ submitted on 2019-12-24 06:31:04
Question: I create my array this.kernel: it has 48 elements and I want to pass it to my fragment shader. When I call gl.uniform3fv(gl.getUniformLocation(this.program, "kernel"), 16, this.kernel); kernel is defined in my shader: uniform vec3 kernel[16]; I get an error for not enough arguments. I already looked up the specification etc., but can't find my problem. void glUniform3fv( GLint location, GLsizei count, const GLfloat * value); Thanks for help. Edit: I converted this.kernel to a Float32Array but
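
For reference, in the C signature quoted above, count is the number of vec3 elements (16 for "uniform vec3 kernel[16];"), not the number of floats (48), and value points at the flat float data; the WebGL 1 wrapper differs in that gl.uniform3fv takes only the location and a flat Float32Array, with the count implied by the array length. A desktop-GL sketch of that count/length relationship (uploadKernel is a made-up helper):

    #include <GL/glew.h>

    // kernelLoc comes from glGetUniformLocation(program, "kernel");
    // kernelData holds 16 vec3s laid out flat, i.e. 48 floats.
    void uploadKernel(GLint kernelLoc, const GLfloat kernelData[48])
    {
        // count = number of array elements declared in the shader (vec3 kernel[16]),
        // not the number of floats; the pointer supplies count * 3 floats.
        glUniform3fv(kernelLoc, 16, kernelData);
    }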

Shader Materials and GL Framebuffers in THREE.js

荒凉一梦 submitted on 2019-12-24 05:49:50
Question: I'm trying to use an FBO in a material in THREE.js. I have a GPU-based fluid simulation which outputs its final visualisation to a framebuffer object, which I would like to use to texture a mesh. Here's my simple fragment shader: varying vec2 vUv; uniform sampler2D tDiffuse; void main() { gl_FragColor = texture2D( tDiffuse, vUv ); } I am then trying to use a simple THREE.ShaderMaterial: var material = new THREE.ShaderMaterial( { uniforms: { tDiffuse: { type: "t", value: outputFBO } }, //other

iPhone glShaderBinary

我与影子孤独终老i submitted on 2019-12-23 23:25:08
Question: Does anyone have an example of how to compile a shader, save the shader binary, and use glShaderBinary to later load the shader on iPhone/iOS (OpenGL ES 2.0)?
Answer 1: It's not possible to do this (at least with iOS 4 and below). iOS doesn't support any precompiled binary shaders. If you query OpenGL for the number of binary shader formats it supports, you get back zero. You're forced to compile the shaders every time you start the app. (Answered my own question.)
Answer 2: The best place to start
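
As a small illustration of the zero-formats check the answer describes (not code from the answer itself), this sketch queries GL_NUM_SHADER_BINARY_FORMATS on ES 2.0; checkBinaryShaderSupport is a made-up name and it assumes a current EAGL context:

    #include <OpenGLES/ES2/gl.h>
    #include <stdio.h>

    // Ask the driver how many binary shader formats it exposes before trying
    // glShaderBinary. On iOS (ES 2.0) this reportedly returns 0, which is why
    // shaders have to be compiled from GLSL source on every launch.
    void checkBinaryShaderSupport(void)
    {
        GLint formats = 0;
        glGetIntegerv(GL_NUM_SHADER_BINARY_FORMATS, &formats);
        if (formats == 0)
            printf("No binary shader formats: compile shaders from source.\n");
        else
            printf("%d binary shader format(s) available for glShaderBinary.\n", formats);
    }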

How to use mouse to change OpenGL camera

情到浓时终转凉″ submitted on 2019-12-23 20:01:02
Question: I'm trying to set up a camera in OpenGL to view some points in 3 dimensions. To achieve this, I don't want to use the old fixed-function style (glMatrixMode(), glTranslate, etc.) but rather set up the Model-View-Projection matrix myself and use it in my vertex shader. An orthographic projection is sufficient. A lot of tutorials on this seem to use the glm library, but since I'm completely new to OpenGL, I'd like to learn it the right way and afterwards use some third party
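
Not an answer about glm, just a hedged sketch of what the orthographic part of that matrix looks like when built by hand (column-major floats, ready for glUniformMatrix4fv with transpose set to GL_FALSE); orthoMatrix is a made-up helper:

    #include <array>

    // Column-major 4x4 orthographic projection (the same matrix the old glOrtho built).
    // Upload with: glUniformMatrix4fv(loc, 1, GL_FALSE, m.data());
    std::array<float, 16> orthoMatrix(float l, float r, float b, float t, float n, float f)
    {
        std::array<float, 16> m{};        // zero-initialized
        m[0]  =  2.0f / (r - l);          // x scale
        m[5]  =  2.0f / (t - b);          // y scale
        m[10] = -2.0f / (f - n);          // z scale, maps [n, f] to [-1, 1]
        m[12] = -(r + l) / (r - l);       // x translation
        m[13] = -(t + b) / (t - b);       // y translation
        m[14] = -(f + n) / (f - n);       // z translation
        m[15] =  1.0f;
        return m;
    }

Mouse input would then typically just shift or scale the left/right/bottom/top values each frame before re-uploading the matrix.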