shader

Why am I getting a blank screen when using shader?

浪子不回头ぞ submitted on 2019-12-04 19:28:40
I'm using this tutorial to create and draw an outline for my game Sprites. However, all I get is a blank red screen. I'm very new to shaders, so I'm not sure if I'm missing something trivial. My vertex and fragment shaders were copy-pasted from the tutorial above. (Commenting on the tutorial doesn't seem to work, so I was unable to seek help there.) My code: float x = 0, y = 0, height = 256, width = 256, angle = 0, outlineSize = 1f; @Override public void create() { batch = new SpriteBatch(); img = new Texture("badlogic.jpg"); sprite = new Sprite(img); loadShader(); float w = Gdx.graphics
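A common way to outline a libGDX sprite is a fragment shader that samples the sprite's alpha at neighboring texels and draws an outline color wherever a transparent texel touches an opaque one. A minimal sketch, assuming uniform names (u_textureSize, u_outlineColor) that are not necessarily the tutorial's:

```glsl
// Fragment shader: draws an outline where a transparent texel
// has at least one opaque neighbor.
varying vec2 v_texCoords;
uniform sampler2D u_texture;     // sprite texture (bound by libGDX's SpriteBatch)
uniform vec2 u_textureSize;      // texture dimensions in pixels (assumed uniform)
uniform vec4 u_outlineColor;     // outline color (assumed uniform)

void main() {
    vec4 color = texture2D(u_texture, v_texCoords);
    vec2 px = 1.0 / u_textureSize;
    // Sum the alpha of the four direct neighbors.
    float a = texture2D(u_texture, v_texCoords + vec2( px.x, 0.0)).a
            + texture2D(u_texture, v_texCoords + vec2(-px.x, 0.0)).a
            + texture2D(u_texture, v_texCoords + vec2(0.0,  px.y)).a
            + texture2D(u_texture, v_texCoords + vec2(0.0, -px.y)).a;
    if (color.a < 0.5 && a > 0.0)
        gl_FragColor = u_outlineColor;   // boundary texel: draw the outline
    else
        gl_FragColor = color;            // otherwise pass the sprite through
}
```

A solid single-color screen often means the shader failed to compile and nothing was drawn at all; checking shader.isCompiled() and printing shader.getLog() right after constructing the ShaderProgram usually pinpoints the real error.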

Three js - Cloning a shader and changing uniform values

守給你的承諾、 submitted on 2019-12-04 17:59:31
Question: I'm working on creating a shader to generate terrain with shadows. My starting point is to clone the Lambert shader and use a ShaderMaterial, which I will eventually customise with my own script. The standard method works well: var material = new THREE.MeshLambertMaterial({map: THREE.ImageUtils.loadTexture('images/texture.jpg')}); var mesh = new THREE.Mesh(geometry, material); etc. The result looks correct. However, I'd like to use the Lambert material as a base and work on top of it, so I tried this: var lambertShader =
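When cloning a built-in shader, the uniforms need a deep copy via THREE.UniformsUtils.clone; assigning the uniforms object directly makes every material share (and fight over) the same values. A sketch against the older three.js revisions that expose THREE.ShaderLib['lambert'], as in this question's era:

```javascript
// Clone the built-in Lambert shader into a customisable ShaderMaterial.
var lambertShader = THREE.ShaderLib['lambert'];

// Deep-clone the uniforms so this material gets its own copies.
var uniforms = THREE.UniformsUtils.clone(lambertShader.uniforms);
uniforms['map'].value = THREE.ImageUtils.loadTexture('images/texture.jpg');

var material = new THREE.ShaderMaterial({
    vertexShader:   lambertShader.vertexShader,
    fragmentShader: lambertShader.fragmentShader,
    uniforms:       uniforms,
    lights:         true,   // wire the scene's lights into the light uniforms
    fog:            true
});
```

In some revisions of that era you also had to flag the map as present (e.g. material.map = true, or adding a USE_MAP define) so the shader chunks that sample the texture are compiled in; which is needed depends on the exact three.js version.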

Image2D in compute shader

馋奶兔 submitted on 2019-12-04 17:54:38
I want to use an image2D as 2D storage for vertices, which will be modified by a compute shader, but it doesn't work. Create the texture: glGenTextures(1, &HeightMap); glBindTexture(GL_TEXTURE_2D, HeightMap); glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F, 513, 513, 0, GL_RGBA32F, GL_UNSIGNED_BYTE, 0); Use and dispatch the compute shader: glUseProgram(ComputeProgram); glActiveTexture(GL_TEXTURE0); glBindImageTexture(0, HeightMap, 0, GL_FALSE, 0, GL_READ_WRITE, GL_RGBA32F); glDispatchCompute(1, 1, 1); glMemoryBarrier(GL_ALL_BARRIER_BITS); And the compute shader: #version 430 core layout( std430, binding=1 )
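One thing to check in the code above: glTexImage2D's format parameter (the seventh argument) takes an unsized pixel format such as GL_RGBA, never a sized internal format like GL_RGBA32F, and float texel data pairs with GL_FLOAT rather than GL_UNSIGNED_BYTE, so the usual spelling is glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F, 513, 513, 0, GL_RGBA, GL_FLOAT, 0). On the shader side, the image unit bound with glBindImageTexture must be declared with a matching binding and format qualifier. A sketch of the compute-shader side (names assumed):

```glsl
#version 430 core
layout(local_size_x = 16, local_size_y = 16) in;

// Must match the unit (0) and format (rgba32f) passed to glBindImageTexture.
layout(binding = 0, rgba32f) uniform image2D heightMap;

void main() {
    ivec2 p = ivec2(gl_GlobalInvocationID.xy);
    vec4 v = imageLoad(heightMap, p);      // read the stored vertex
    v.y += 1.0;                            // modify its height
    imageStore(heightMap, p, v);           // write it back
}
```

Note also that with a 16x16 local size, glDispatchCompute(1, 1, 1) only covers one 16x16 tile; covering the full 513x513 image needs ceil(513/16) = 33 work groups per axis.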

Constant float values in GLSL shaders - any reason to use uniforms?

只谈情不闲聊 submitted on 2019-12-04 17:02:18
Question: I'm looking at the source of an OpenGL application that uses shaders. One particular shader looks like this: uniform float someConstantValue; void main() { // Use someConstantValue } The uniform is set once from code and never changes throughout the application's run time. In what cases would I want to declare someConstantValue as a uniform and not as const float? Edit: just to clarify, the constant value is a physical constant. Answer 1: First off, the performance difference between using a
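For reference, the two declarations side by side: a const float is baked in at compile time, so changing it means editing and recompiling the shader, while a uniform can be set from the application (glUniform1f) without touching the GLSL source:

```glsl
// Compile-time constant: the compiler can fold it into other expressions,
// but changing it requires recompiling the shader.
const float gravity = 9.80665;

// Uniform: set once from application code and adjustable at run time,
// or shared across several shader programs, without a recompile.
uniform float someConstantValue;

void main() {
    // Both are read identically inside the shader body.
    gl_FragColor = vec4(vec3(someConstantValue * gravity), 1.0);
}
```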

Calculating screen texture coordinates in CG/HLSL

蓝咒 submitted on 2019-12-04 17:00:38
In OpenGL, when doing multi-pass rendering and post-processing, I sometimes need to apply texels to the primitive assembly's fragments as part of a full-screen texture composition. That is usually the case when the current pass samples an FBO texture to which a screen quad was rendered during the previous pass. To achieve this I calculate the object's UV coordinates in SCREEN SPACE. In GLSL I calculate it like this: vec2 texelSize = 1.0 / vec2(textureSize(TEXTURE, 0)); vec2 screenTexCoords = gl_FragCoord.xy * texelSize; Now I am experimenting with Unity3D, which uses CG/HLSL. The docs for
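In Unity's CG/HLSL the usual route is to pass the clip-space position through the ComputeScreenPos helper in the vertex shader and do the perspective divide in the fragment shader. A sketch using standard UnityCG.cginc helpers (on older Unity versions the clip-space transform is mul(UNITY_MATRIX_MVP, v.vertex) instead of UnityObjectToClipPos; _ScreenTex is an assumed name for the previous pass's render texture):

```hlsl
// Vertex-to-fragment structure carrying the screen position.
struct v2f {
    float4 pos       : SV_POSITION;
    float4 screenPos : TEXCOORD0;
};

v2f vert(appdata_base v) {
    v2f o;
    o.pos = UnityObjectToClipPos(v.vertex);
    o.screenPos = ComputeScreenPos(o.pos);  // UnityCG.cginc helper
    return o;
}

sampler2D _ScreenTex;   // render texture from the previous pass (assumed name)

fixed4 frag(v2f i) : SV_Target {
    // The perspective divide yields [0,1] screen-space UVs: the CG/HLSL
    // equivalent of gl_FragCoord.xy * texelSize in GLSL.
    float2 uv = i.screenPos.xy / i.screenPos.w;
    return tex2D(_ScreenTex, uv);
}
```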

How to use a shaderModifier to alter the color of specific triangles in a SCNGeometry

家住魔仙堡 submitted on 2019-12-04 16:46:38
First, before I go on, I have read through: SceneKit painting on texture with texture coordinates, which seems to suggest I'm on the right track. I have a complex SCNGeometry representing a hexasphere. It renders really well, at a full 60fps on all of my test devices. At the moment, all of the hexagons are rendered with a single material because, as I understand it, every SCNMaterial I add to my geometry adds another draw call, which I can't afford. Ultimately, I want to be able to color each of the almost 10,000 hexagons individually, so adding another material for each one is
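One workable pattern keeps the single material and single draw call: bake one texel per hexagon into a small lookup texture, point each hexagon's vertices at its texel via the geometry's texture-coordinate source, and let a surface-entry-point shader modifier swap in that color. A sketch of the modifier source (Metal syntax; the u_colorMap name and the per-hexagon texcoord layout are assumptions about how the geometry is built):

```glsl
// SCNShaderModifierEntryPointSurface modifier: replace the diffuse color
// with a per-hexagon color sampled from a lookup texture.
#pragma arguments
texture2d<float, access::sample> u_colorMap;

#pragma body
constexpr sampler hexSampler(coord::normalized, filter::nearest);
// The geometry's texture coordinates are assumed to map every vertex of a
// hexagon onto that hexagon's single texel in u_colorMap.
_surface.diffuse = u_colorMap.sample(hexSampler, _surface.diffuseTexcoord);
```

The modifier is attached with material.shaderModifiers = [.surface: modifierString], and the texture is bound with material.setValue(SCNMaterialProperty(contents: colorImage), forKey: "u_colorMap"); recoloring a hexagon then only means updating one texel of the image.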

Possible Retina issue with OpenGL ES on iPhone?

社会主义新天地 submitted on 2019-12-04 15:42:50
This is probably linked to another unsolved mystery of mine. I'm drawing orthographic 2D on iPhone, on both a real device and the simulator. I'm trying to color my pixels a given color depending on how far they are from an arbitrary point 'A' in pixel space, which I pass in (hard-coded). I'm doing everything at the Retina 960x640 resolution. I calculate the distance from A to gl_FragCoord, and I color based on lerping between 2 colors, with the 'max' being a 300px distance. On the simulator (with Retina display) I need to give a center point of "460" pixels for the screen midpoint X; for Y I give 160px, and I look for
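gl_FragCoord is measured in framebuffer pixels, so on a Retina device it depends on the layer's contentsScale: if the CAEAGLLayer was not given contentsScale = 2.0, the framebuffer is 480x320 and the midpoint really is 240/160, not 480/320. Passing the actual framebuffer size in as a uniform sidesteps the hard-coded numbers entirely; a sketch (u_resolution and the two color uniforms are assumed names):

```glsl
// Fragment shader: lerp between two colors by distance from screen center,
// independent of the device's scale factor.
precision mediump float;
uniform vec2 u_resolution;   // framebuffer size in pixels (assumed uniform)
uniform vec4 u_colorNear;
uniform vec4 u_colorFar;

void main() {
    vec2 center = u_resolution * 0.5;            // correct on any scale factor
    float d = distance(gl_FragCoord.xy, center);
    float t = clamp(d / 300.0, 0.0, 1.0);        // 300 px falloff, as in the question
    gl_FragColor = mix(u_colorNear, u_colorFar, t);
}
```

Note that a fixed 300 px falloff covers half the physical distance on a Retina framebuffer; scaling it by the content scale keeps the on-screen look identical across devices.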

OpenGL One (Triangle)

不羁的心 submitted on 2019-12-04 14:42:07
Preface: although C++ is a language with famously unwieldy syntax (being bad at it is the original sin), it really is the natural choice for getting started with OpenGL.
Tips
Vertex Array Object: VAO
Vertex Buffer Object: VBO
Element Buffer Object: EBO (also called Index Buffer Object, IBO)
A few basic concepts
Converting 3D coordinates to 2D coordinates is handled by OpenGL's graphics rendering pipeline. The pipeline can be divided into two main parts: the first converts 3D coordinates into 2D coordinates; the second turns those 2D coordinates into actual colored pixels. Note that 2D coordinates and pixels are not the same thing: a 2D coordinate is a precise representation of a point's position in 2D space, while a 2D pixel is an approximation of that point, limited by your screen's resolution. The pipeline accepts a set of 3D coordinates and turns them into colored 2D pixels on the screen. Rendering happens in many stages; each stage is highly specialized and handled by a dedicated program. Today's graphics cards have many cores and run a small program on the GPU for each stage of the pipeline, processing data quickly; these small programs are called shaders. OpenGL shaders are written in the shading language GLSL. (The blue parts of the pipeline diagram are the stages you can write your own shaders for.) The Vertex Shader and Fragment Shader must be written yourself. Vertex Shader: takes individual vertices as input. Fragment
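The two mandatory stages can be as small as this pair of GLSL programs, which is all a first triangle needs: a vertex shader that passes positions through unchanged, and a fragment shader that emits one fixed color:

```glsl
// Vertex shader: takes one vertex position as input, as described above.
#version 330 core
layout (location = 0) in vec3 aPos;
void main() {
    gl_Position = vec4(aPos, 1.0);  // positions are already in normalized device coordinates
}

// Fragment shader: decides the final color of every fragment the triangle covers.
#version 330 core
out vec4 FragColor;
void main() {
    FragColor = vec4(1.0, 0.5, 0.2, 1.0);  // a constant orange
}
```

Each source is compiled with glCompileShader and the two are linked into one program with glLinkProgram; the VAO/VBO from the Tips section then feed vertex data into aPos.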

SKShader performance slowdown

放肆的年华 submitted on 2019-12-04 14:41:58
Question: I need to implement a custom shader node using SpriteKit. In the simulator everything is OK. On a device (iPad 3rd gen), the shader animation is smooth for only the first ~30 seconds; after that the shader's fps gradually falls until it looks like a slideshow (1 fps or even less). It is worth noting that SpriteKit shows 60 fps, and so does Xcode. The CPU is ~75% busy, but the shader itself shows ~1 fps. I own only a 3rd-generation iPad and currently don't have an opportunity to test it on other devices. Shader
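A slowdown that sets in after a fixed number of seconds and worsens steadily is the classic signature of time-based precision loss: SKShader's built-in u_time grows without bound, and on older GPUs large values degrade whatever math depends on them. A common workaround is to wrap time before using it; a sketch, assuming the shader only uses time inside periodic functions (so wrapping at 2π changes nothing visually):

```glsl
// Inside the SKShader fragment source: avoid feeding an ever-growing
// u_time into precision-sensitive math.
void main() {
    // Wrap to one sine period: any expression periodic in t is unaffected,
    // but t now stays small enough for the device's float precision.
    float t = mod(u_time, 6.28318530718);
    float pulse = 0.5 + 0.5 * sin(t * 4.0);
    gl_FragColor = texture2D(u_texture, v_tex_coord) * vec4(vec3(pulse), 1.0);
}
```

If the effect uses time non-periodically, an alternative is to pass a custom, app-managed time uniform and reset it yourself instead of relying on u_time.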

Unity3D: Drawing particles after AA resolve for performance

試著忘記壹切 submitted on 2019-12-04 13:00:00
I'm trying to gauge the effect of MSAA on scenes with lots of particles in Unity. To do that, I'd need to:
1. Draw all non-particle objects in the scene with 8x MSAA.
2. Use the resolved depth buffer from the previous pass to render all the non-occluded particle systems onto a smaller render target.
3. Alpha-blend the color buffer of (2) with (1).
I'm looking at the post-processing effects to get a sense of what I might need to do, but none of them use the depth buffer from a previous pass to affect the depth test of other objects in the scene (in this case, the particles). They instead post-process the
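Step 2 can be emulated inside the particle shader itself: sample the resolved depth (exposed as _CameraDepthTexture, or a global texture set from a CommandBuffer) and reject fragments that the opaque pass occludes, exactly as Unity's soft-particle shaders do. A sketch of the fragment-side test in CG (assumes the vertex shader filled i.screenPos via ComputeScreenPos, and that _MainTex, i.uv, and i.color exist in the surrounding particle shader):

```hlsl
sampler2D_float _CameraDepthTexture;   // resolved depth from the opaque pass

fixed4 frag(v2f i) : SV_Target {
    // Eye-space depth of the opaque scene at this fragment's screen position.
    float sceneZ = LinearEyeDepth(
        SAMPLE_DEPTH_TEXTURE_PROJ(_CameraDepthTexture, UNITY_PROJ_COORD(i.screenPos)));
    // Eye-space depth of the particle fragment itself (clip-space w equals
    // view depth under a perspective projection).
    float partZ = i.screenPos.w;
    clip(sceneZ - partZ);              // discard fragments behind opaque geometry
    return tex2D(_MainTex, i.uv) * i.color;
}
```

Because the depth texture can be bound regardless of which render target the particles draw into, this works even when step 2 renders to a smaller, non-MSAA target that step 3 later composites over the resolved color buffer.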