shader

C++ OpenGL glGetUniformLocation for Sampler2D returns -1 on Raspberry Pi but works on Windows

青春壹個敷衍的年華 Submitted on 2019-12-23 19:17:14
Question: I'm making a cross-platform OpenGL program. However, I've encountered a problem where glGetUniformLocation, which should return the location of a uniform variable in my shader program, returns -1, and it only happens on Linux (the Raspbian distro, running on a Raspberry Pi); on Windows the same code works perfectly! Here's my code: Load shader program function: int shader, status; programID = glCreateProgram(); // Load vertex shader shader = LoadShaderFromString(GL_VERTEX_SHADER, Tools:
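A common cause of -1 on GLES-class drivers such as the Pi's is that the sampler is declared but never actually contributes to the shader's output, so the compiler strips it as inactive; the location must also be queried after a successful link. A minimal sketch of the usual query pattern (not the asker's code; the uniform name "u_texture" and the variable names are illustrative):

```cpp
#include <cstdio>
// Assumes an OpenGL loader/header is already set up, as in the original program.

glLinkProgram(programID);

GLint linked = GL_FALSE;
glGetProgramiv(programID, GL_LINK_STATUS, &linked);
if (linked != GL_TRUE) {
    char log[1024];
    glGetProgramInfoLog(programID, sizeof(log), nullptr, log);
    std::fprintf(stderr, "link failed: %s\n", log);
}

// -1 means "not an active uniform": either the name is misspelled or the
// GLSL compiler optimized the sampler away because it is never used.
GLint samplerLoc = glGetUniformLocation(programID, "u_texture");
if (samplerLoc == -1) {
    std::fprintf(stderr, "u_texture is not an active uniform\n");
} else {
    glUseProgram(programID);
    glUniform1i(samplerLoc, 0); // sample from texture unit 0
}
```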

How do I query the alignment/stride for an SSBO struct?

 ̄綄美尐妖づ Submitted on 2019-12-23 18:48:04
Question: I'm not sure which structure layout is best suited for my application: shared, packed, std140, or std430. I'm not asking for an explanation of each; that information is easy to find. It's just hard to figure out the impact each will have on vendor compatibility/performance. If shared is the default, I suspect that's a good starting point. From what I can gather, I have to query the alignment/offsets when using shared or packed, because they are implementation specific. What's the API for
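The query the question is reaching for is the program-interface-query API introduced alongside SSBOs in OpenGL 4.3. A sketch, assuming a linked `program` and a buffer variable named "MyBlock.myMember" (both names are placeholders):

```cpp
// Look up the buffer variable, then ask the driver for its layout properties.
GLuint idx = glGetProgramResourceIndex(program, GL_BUFFER_VARIABLE,
                                       "MyBlock.myMember");

const GLenum props[] = { GL_OFFSET, GL_ARRAY_STRIDE, GL_MATRIX_STRIDE,
                         GL_TOP_LEVEL_ARRAY_STRIDE };
GLint values[4] = {};
glGetProgramResourceiv(program, GL_BUFFER_VARIABLE, idx,
                       4, props,            // properties requested
                       4, nullptr, values); // output buffer

GLint offset         = values[0]; // byte offset of the member within the block
GLint arrayStride    = values[1]; // stride between array elements, if any
GLint matrixStride   = values[2]; // stride between matrix columns/rows
GLint topLevelStride = values[3]; // stride of the top-level array of structs
```

With shared or packed layouts these queried values are the only reliable way to match the CPU-side struct to what the driver chose.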

Shader for Beginners, Part 7

本秂侑毒 Submitted on 2019-12-23 17:57:20
Shader for Beginners, Part 7 — Preface: In the previous part we traced where shader variables come from and where they go; in this part we keep digging into vertex and fragment shaders, but first we need a rough picture of the rendering pipeline. The rendering pipeline: the graphics rendering pipeline is called a pipeline because it behaves much like a physical pipe. One end of the pipe connects to the screen we ultimately display on, and the other end is fed our raw assets. We push the material we want on screen into the input end, it travels through the pipe, and after a series of operations it is finally displayed on screen. If we want to change how that material ends up looking, we can insert our own processing steps inside the pipe so that, after the operations we define, the material comes out with the effect we want. Conceptually, the pipe breaks down into three broad stages from start to end. The Application Stage: this is where the raw assets are prepared, including models, textures, cameras and light sources; at the end of this stage everything is converted into rendering primitives and handed to the next stage (the geometry stage). The Geometry Stage: this stage mainly processes the per-vertex data handed over from the previous stage, including the various matrix transformations and vertex shading; when it finishes it outputs screen-space 2D vertex coordinates, vertex shading results and other data, and passes them on to the next stage (the rasterizer stage). The Rasterizer Stage: the output of the geometry stage is processed here at the pixel level, pixel by pixel, and the result is finally shown on screen. Structs
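To make the two programmable stages above concrete, here is a minimal vertex/fragment shader pair, held as C++ string constants; this is an illustrative sketch rather than code from the original article, and the names (aPos, uMVP, FragColor) are made up. The vertex shader runs in the geometry stage and transforms vertices into clip space; the fragment shader runs in the rasterizer stage and decides the color of each covered pixel.

```cpp
// Illustrative only: which stage each shader belongs to.
const char* vertexSrc = R"(#version 300 es
layout(location = 0) in vec3 aPos;  // raw vertex data from the application stage
uniform mat4 uMVP;                  // model-view-projection matrix
void main() {
    // Geometry stage: transform the vertex into clip space.
    gl_Position = uMVP * vec4(aPos, 1.0);
})";

const char* fragmentSrc = R"(#version 300 es
precision mediump float;
out vec4 FragColor;
void main() {
    // Rasterizer stage: this runs once per covered fragment.
    FragColor = vec4(1.0, 0.5, 0.2, 1.0);
})";
```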

Using glVertexAttribPointer and glDrawElements to draw from a packed vertex buffer

房东的猫 Submitted on 2019-12-23 16:50:23
Question: I have a packed vertex buffer containing position coordinates as well as color values for each vertex, in the format {X, Y, Z, R, G, B, A}. I am able to display the rectangle properly with a hardcoded color when I alter the fragment shader by taking out the a_Color attribute and hard-coding a vec4 value for gl_FragColor, but I am not able to pass the color vec4 attribute into the fragment shader (the rectangle won't display in that scenario). What is the correct way to use glVertexAttribPointer(...)
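For an interleaved {X, Y, Z, R, G, B, A} float layout the usual suspects are a wrong stride/offset or a forgotten glEnableVertexAttribArray. A sketch of the typical setup; the attribute locations 0/1, the buffer names and the index count are assumptions for illustration, not taken from the question:

```cpp
const GLsizei stride = 7 * sizeof(GLfloat);   // one whole vertex: 3 pos + 4 color floats

glBindBuffer(GL_ARRAY_BUFFER, vbo);

// a_Position: 3 floats starting at byte 0 of each vertex
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, stride, (const void*)0);
glEnableVertexAttribArray(0);

// a_Color: 4 floats starting right after the 3 position floats
glVertexAttribPointer(1, 4, GL_FLOAT, GL_FALSE, stride,
                      (const void*)(3 * sizeof(GLfloat)));
glEnableVertexAttribArray(1);

glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ibo);
glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_SHORT, (const void*)0);
```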

Writing the correct value in the depth buffer when using ray-casting

守給你的承諾、 Submitted on 2019-12-23 15:27:53
Question: I am doing ray-casting in a 3D texture until I hit a correct value. The ray-casting is done inside a cube, and the cube corners are already in world coordinates, so I don't have to multiply the vertices by the modelview matrix to get the correct position. Vertex shader: world_coordinate_ = gl_Vertex; Fragment shader: vec3 direction = (world_coordinate_.xyz - cameraPosition_); direction = normalize(direction); for (float k = 0.0; k < steps; k += 1.0) { .... pos += direction*delta_step; float
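The usual way to write a ray hit back into the depth buffer is to project the world-space hit point with the same view-projection matrix the rest of the geometry uses, do the perspective divide, and remap NDC z from [-1, 1] to the [0, 1] window range (assuming the default glDepthRange). A sketch of that step, held as a GLSL snippet in a C++ string; the uniform and function names are illustrative:

```cpp
const char* depthWriteSnippet = R"(
uniform mat4 viewProjection;   // must match the matrices used to draw the cube

void writeDepth(vec3 worldHitPos) {
    vec4 clip = viewProjection * vec4(worldHitPos, 1.0);
    float ndcDepth = clip.z / clip.w;        // perspective divide -> [-1, 1]
    gl_FragDepth = ndcDepth * 0.5 + 0.5;     // -> [0, 1] window-space depth
}
)";
```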

Fill pattern in Image in Android

牧云@^-^@ Submitted on 2019-12-23 10:18:16
Question: Two images are given below; I call the first image the frame image and the second one the pattern image. Here fst is my LinearLayout, and I set the frame image as its background image. Now I want to fill the pattern image into the frame image's white area. The outer area of the frame image is transparent and the inner area is white. How can I fill the pattern image into my frame image? I tried this code: private void patternFill(Bitmap tempBitmapColor) { Bitmap bmp = BitmapFactory.decodeResource(getResources(), R

Is glCompileShader optional?

我与影子孤独终老i Submitted on 2019-12-23 09:00:45
Question: While debugging my system, I found out that all the shaders I used were never compiled. All the GLSL programs were happily linked and working like a charm. I have searched the entire code base for calls to glCompileShader, but none were found. My question then is: Is this a specific behaviour of the implementation I am working with? Is shader compilation carried out implicitly when linking a program? Is it optional? If that is the case, what advantages are there in doing it explicitly, apart
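For reference, the conventional flow compiles each shader object explicitly and checks its status, which is what gives you a per-shader info log instead of relying on whatever the link step reports. A minimal sketch (error handling trimmed; the helper name is made up):

```cpp
#include <cstdio>

GLuint CompileStage(GLenum type, const char* source) {
    GLuint shader = glCreateShader(type);
    glShaderSource(shader, 1, &source, nullptr);
    glCompileShader(shader);

    GLint ok = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
    if (ok != GL_TRUE) {
        char log[1024];
        glGetShaderInfoLog(shader, sizeof(log), nullptr, log);
        std::fprintf(stderr, "compile error: %s\n", log);
    }
    return shader; // attach to a program and glLinkProgram as usual
}
```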

Implementing Fur with Shells technique in Unity

南笙酒味 Submitted on 2019-12-23 07:26:38
Question: I am trying to implement fur in Unity with the shells technique. The fins technique is purposely left out because I want this to run on low-end mobiles (mostly Android devices), and fins require OpenGL ES 3.0 and above while the shells technique only requires OpenGL ES 2.0. There is an example of the shells technique based on XNA, and I made an attempt to port it to Unity, but it failed to work. Here is the article with the XNA project. The XNA shader: float4x4 World; float4x4 View; float4x4
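Independent of XNA or Unity, the core of the shells idea is to draw the same mesh several times, pushing the vertices a little further out along their normals on each pass while the fur texture's alpha thins toward the outer shells. A rough GLES2-style sketch of the per-shell vertex offset, held in a C++ string; this is not the XNA shader from the article, and all names are illustrative:

```cpp
const char* shellVertexSrc = R"(
uniform mat4  uMVP;
uniform float uShellIndex;   // 0 .. numShells-1, set once per draw call
uniform float uShellStep;    // distance between consecutive shells
attribute vec3 aPosition;
attribute vec3 aNormal;

void main() {
    // Push this shell's copy of the vertex outward along the normal.
    vec3 displaced = aPosition + aNormal * (uShellIndex * uShellStep);
    gl_Position = uMVP * vec4(displaced, 1.0);
}
)";

// The application then draws the mesh once per shell, innermost first,
// updating uShellIndex before each draw.
```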

Using different push-constants in different shader stages

一个人想着一个人 Submitted on 2019-12-23 07:03:48
Question: I have a vertex shader with a push-constant block containing one float: layout(push_constant) uniform pushConstants { float test1; } u_pushConstants; And a fragment shader with another push-constant block containing a different float value: layout(push_constant) uniform pushConstants { float test2; } u_pushConstants; test1 and test2 are supposed to be different. The push-constant ranges for the pipeline layout are defined like this: std::array<vk::PushConstantRange,2> ranges = { vk:
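One way to set this up, sketched with Vulkan-Hpp: give each stage its own non-overlapping slice of the push-constant block, and give the fragment shader's member a matching explicit offset (e.g. `layout(offset = 4) float test2;`), since both stages view the same push-constant memory. The variable names below are illustrative:

```cpp
#include <array>
#include <vulkan/vulkan.hpp>

std::array<vk::PushConstantRange, 2> ranges = {
    vk::PushConstantRange{ vk::ShaderStageFlagBits::eVertex,   0,             sizeof(float) },
    vk::PushConstantRange{ vk::ShaderStageFlagBits::eFragment, sizeof(float), sizeof(float) },
};

// Each value is then pushed with the stage flags and offset of its own range,
// roughly like this (cmd/layout/test1/test2 assumed to exist):
// cmd.pushConstants(layout, vk::ShaderStageFlagBits::eVertex,   0,             sizeof(float), &test1);
// cmd.pushConstants(layout, vk::ShaderStageFlagBits::eFragment, sizeof(float), sizeof(float), &test2);
```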

Converting a Shadertoy into my own local Three.js sandbox

耗尽温柔 Submitted on 2019-12-23 05:05:57
Question: I'm making a local shader sandbox based on a Shadertoy by lennyjpg, with the help of these two SO Q&As (one, two). I'm converting the Shadertoy to use Three.js as a sandbox for a larger project that uses Three.js. However, while no errors are displayed, I'm not seeing the expected result; only the camera helper displays. What am I doing wrong here? (See the runnable snippet below.) Thanks in advance! var SCREEN_WIDTH = window.innerWidth; var SCREEN_HEIGHT = window.innerHeight; var