render-to-texture

glFramebufferTexture2D fails on iPhone for certain texture sizes

Submitted by 时光总嘲笑我的痴心妄想 on 2019-12-12 19:16:27
Question: When I try to attach a texture to a framebuffer, glCheckFramebufferStatus reports GL_FRAMEBUFFER_UNSUPPORTED for certain texture sizes. I've tested on both a 2nd- and a 4th-generation iPod Touch, and the texture sizes that fail are not identical between the two models. Some interesting results: on the 2nd generation, 8x8 failed and 16x8 failed, but 8x16 succeeded; on the 4th generation, 8x8 succeeded and 8x16 succeeded, but 16x8 failed! Here's some code I used to test attaching textures of different sizes

When does XNA discard Render Target contents?

Submitted by 倖福魔咒の on 2019-12-12 14:53:36
Question: I understand that render targets in XNA are volatile, but how volatile are they? I can't find any documentation that says exactly when their contents are discarded. Is it just when you start drawing to them, or could it be at any time? I would like to simply draw to a render target once and then use it as a Texture2D indefinitely. Is this possible? Would I need to enable RenderTargetUsage.PreserveContents for this to work properly? I have read that PreserveContents is very slow on Xbox and

OpenGL render-to-texture-via-FBO — incorrect display vs. normal Texture

Submitted by 孤街醉人 on 2019-12-12 08:44:42
Question: Off-screen rendering to a texture-bound offscreen framebuffer object should be trivial, but I'm having a problem I cannot wrap my head around. My full sample program (2D only for now!) is here: http://pastebin.com/hSvXzhJT See below for some descriptions. I create a 512x512 RGBA texture object and bind it to an FBO. No depth or other render buffers are needed at this point; it's strictly 2D. The following extremely simple shaders render to this texture: Vertex shader: varying vec2 vPos;

Multiple Render Targets not saving data

Submitted by 折月煮酒 on 2019-12-12 08:13:19
Question: I'm using SlimDX, targeting DirectX 11 with shader model 4. I have a pixel shader, "preProc", which processes my vertices and saves three textures of data: one for per-pixel normals, one for per-pixel position data, and one for color and depth (color takes up RGB and depth takes the alpha channel). I later use these textures in a post-processing shader to implement screen-space ambient occlusion; however, it seems none of the data is getting saved in the first shader. Here's my pixel

OpenGL ES 2.0 Shader on Texture not working

Submitted by 隐身守侯 on 2019-12-12 04:25:19
Question: I copied the example from this site. It works well after fixing some minor things and extending the shader for my purpose. Now I want to move/translate the texture to the right side of the screen. For that, I have to add a ModelViewProjection matrix to my code, and to the shader as well, right? After doing this the texture no longer shows up :-( What am I doing wrong? The code works until I change gl_Position in my shader from gl_Position = a_position to gl

OpenGL - FBO and alpha blending

Submitted by 拥有回忆 on 2019-12-12 01:53:26
Question: I searched but could not find an answer to my problem. I have an FBO, and I can't get alpha blending and multisampling to work. The FBO renders the scene to a texture, which is then drawn to the default framebuffer as two textured triangles. Drawing directly to the default framebuffer is fine. Here is the difference between the default framebuffer (top) and my FBO (bottom). I use an FBO with 2x color attachments and 1x depth attachment. (Only GL_COLOR_ATTACHMENT0 is used; the second is for another purpose.) Depth test:

OpenGL ES: Render to Texture via Frame Buffer is rendering only one color

Submitted by 爱⌒轻易说出口 on 2019-12-11 18:37:14
Question: I am trying to implement a motion-blur effect in my Android game. After a lot of research I found that the best way to do it is to save the previous frame as a texture using a framebuffer object and render it on top of the current frame. So, following some nice tutorials on how to do something like that, I ended up with this code, which basically renders my scene to the texture and then draws the texture to the default framebuffer. But the texture has only one color, like when I have a green

MTKView blend issues when re-using currentDrawable.texture in draw() loop

Submitted by 最后都变了- on 2019-12-11 17:12:09
Question: I am working on a Metal-backed painting application in which I divide the drawing of a stroke into two steps: the first step draws the leading edge of a stroke to the screen and captures the entire frame to an MTLTexture via: metalTextureComposite = self.currentDrawable!.texture The second step draws an updated leading edge of the advancing stroke and composites it atop a polygon textured with the last saved metalTextureComposite. This method allows me to draw infinitely long strokes without sacrificing

MTKView vertex transparency is not getting picked up in “additive” blending mode

Submitted by 蓝咒 on 2019-12-11 16:22:16
Question: I am trying to implement a Metal-backed drawing application where brushstrokes are drawn on an MTKView by stamping a textured square repeatedly along a path. I vary the stamp's color/transparency at the vertex level as the brushstroke is drawn, so I can simulate ink effects such as color/transparency fading over time, etc. This seems to work OK when I am using classic "over"-type blending (which does not accumulate value over time), but when I use "additive" blending, vertex

Load Uniform Matrix 1104 GL_Invalid_Operation Android OpenGLES 2.0

Submitted by ﹥>﹥吖頭↗ on 2019-12-11 12:06:26
Question: I am working on implementing rendering to texture on Android 4.3+ (OpenGL ES 2.0). I am getting the following error in my drawFrame() method:

01-15 13:40:07.545: W/Adreno-ES20(23709): <__load_uniform_matrix:1104>: GL_INVALID_OPERATION
01-15 13:40:07.545: E/com.hpp.STextureRender(23709): glDrawArrays: glError 1282
01-15 13:40:07.545: D/io.hpp.CaptureManager(23709): Error encountered in drawFrame = glDrawArrays: glError 1282
01-15 13:40:07.545: W/System.err(23709): java.lang.RuntimeException: