framebuffer

How to get/set the width and height of the default framebuffer?

Submitted by 丶灬走出姿态 on 2019-12-06 04:04:33
Question: I want to know the dimensions of my default framebuffer. I read that setting the viewport to a particular value does not affect or set the dimensions of the framebuffer. Are there any GL calls for this? Answer 1: You can't set the size of the default framebuffer with OpenGL calls. It is the size of the window, which is controlled by the window system interface (e.g. EGL on Android). If you want to control it, this has to happen as part of the initial window/surface/context setup, where the details are platform-specific.
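A minimal sketch of the query side, assuming an EGL platform (e.g. Android): the surface dimensions, which are the default framebuffer dimensions, come from `eglQuerySurface`, not from a GL call.

```c
/* Sketch: reading the default framebuffer size on an EGL platform.
 * dpy/surf are whatever display and surface your setup code created. */
#include <EGL/egl.h>
#include <stdio.h>

void print_default_framebuffer_size(EGLDisplay dpy, EGLSurface surf)
{
    EGLint width = 0, height = 0;
    eglQuerySurface(dpy, surf, EGL_WIDTH, &width);
    eglQuerySurface(dpy, surf, EGL_HEIGHT, &height);
    printf("default framebuffer: %d x %d\n", width, height);
}
```

On non-EGL platforms the equivalent query belongs to the windowing toolkit (GLFW, SDL, Cocoa, etc.), since the default framebuffer simply tracks the window surface.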

framebuffer and using shaders in opengl

Submitted by 我与影子孤独终老i on 2019-12-06 03:41:36
Question: I'm quite confused about framebuffers. What I want to do is use a framebuffer with multiple textures attached, fill every texture, and then use a shader to combine (blend) all the textures to create a new output. Sounds easy? Yeah, that's what I thought too, but I don't understand it. How can I pass the currently bound texture to a shader? Answer 1: What you need is to put the texture in a specific slot, then use a sampler to read from it. In your app: GLuint frameBuffer; glGenFramebuffersEXT(1
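A sketch of the "slot" mechanism the answer describes (uniform names `uTex0`/`uTex1` are illustrative): each FBO color texture is bound to its own texture unit, and each sampler uniform in the combining shader is set to that unit's index.

```c
/* Sketch: after rendering into the FBO textures, bind them to texture
 * units and point the combining shader's samplers at those units. */
#include <GL/gl.h>

void bind_textures_for_combine(GLuint program, GLuint tex0, GLuint tex1)
{
    glUseProgram(program);

    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, tex0);
    glUniform1i(glGetUniformLocation(program, "uTex0"), 0); /* unit 0 */

    glActiveTexture(GL_TEXTURE1);
    glBindTexture(GL_TEXTURE_2D, tex1);
    glUniform1i(glGetUniformLocation(program, "uTex1"), 1); /* unit 1 */

    /* ...then draw a fullscreen quad; the fragment shader blends
       texture(uTex0, uv) with texture(uTex1, uv). */
}
```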

How do you know what you've displayed is completely drawn on screen?

Submitted by 不问归期 on 2019-12-06 03:31:29
Displaying images on a computer monitor involves a graphics API, which dispatches a series of asynchronous calls... and at some given time puts the wanted content on the computer screen. But what if you are interested in knowing the exact CPU time at the point where the required image is fully drawn (and visible to the user)? I really need to grab a CPU timestamp when everything is displayed, to relate this point in time to other measurements I take. Without taking the asynchronous behavior of the graphics stack into account, many things can cause the length of the graphics calls to jitter
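One common partial answer, sketched below: a GL fence sync tells you when the GPU has *finished the submitted commands*, which is not the same as the frame being on the panel (true scanout timing needs a present-timing mechanism such as GLX_OML_sync_control or a platform swap-completion callback).

```c
/* Sketch: timestamp after GPU completion of this frame's commands. */
#include <GL/gl.h>   /* requires GL 3.2+ / ARB_sync */

void timestamp_after_gpu_done(void)
{
    GLsync fence = glFenceSync(GL_SYNC_GPU_COMMANDS_COMPLETE, 0);

    /* platform swap-buffers call goes here */

    /* Block (up to 1 s) until the GPU reaches the fence. */
    glClientWaitSync(fence, GL_SYNC_FLUSH_COMMANDS_BIT,
                     1000000000ull /* ns */);
    /* Take the CPU timestamp HERE: rendering is complete and
       presentation has been queued. This still does not guarantee the
       pixels are visible to the user -- that needs vsync feedback. */
    glDeleteSync(fence);
}
```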

WebGL display framebuffer?

Submitted by 天涯浪子 on 2019-12-06 03:18:46
I used the WEBKIT_WEBGL_depth_texture extension and initialized the buffers below. But how am I able to draw this framebuffer? I'm totally stuck right now. function InitDepthtextures (){ var size = 256; // Create a color texture var colorTexture = gl.createTexture(); gl.bindTexture(gl.TEXTURE_2D, colorTexture); gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST); gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST); gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE); gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE); gl.texImage2D
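The usual answer, sketched here in desktop-GL C (the WebGL calls are direct analogues): you cannot "display" an FBO directly; you switch back to the default framebuffer and draw a fullscreen quad textured with the FBO's color attachment.

```c
/* Sketch (quadProgram/quadVao are assumed set up elsewhere): show the
 * FBO's color texture by drawing it onto the default framebuffer. */
#include <GL/gl.h>

void display_fbo_texture(GLuint colorTexture, GLuint quadProgram,
                         GLuint quadVao, int w, int h)
{
    glBindFramebuffer(GL_FRAMEBUFFER, 0);   /* back to the screen */
    glViewport(0, 0, w, h);

    glUseProgram(quadProgram);
    glBindVertexArray(quadVao);
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, colorTexture);

    glDrawArrays(GL_TRIANGLES, 0, 6);       /* fullscreen quad */
}
```

The depth texture from the extension can be visualized the same way, sampling it in the quad's fragment shader and mapping depth to grayscale.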

CGImageRef width doesn't agree with bytes-per-row

Submitted by ∥☆過路亽.° on 2019-12-06 02:26:11
Question: I'm trying to read pixels out of the screen buffer. I'm creating a CGImageRef with CGDisplayCreateImage, but the values from CGImageGetWidth and CGImageGetBytesPerRow don't make sense together: dividing the bytes per row by the bytes per pixel gives me 1376 pixels per row, but the width of the image is 1366. What's going on here? Is there some kind of padding in the image? How do I read the data I'm getting out of it safely, and with the correct results? Edit: The minimal code needed to

COLOR_ATTACHMENT's - How to render to multiple textures as color attachments inside a Framebuffer Object?

Submitted by 一世执手 on 2019-12-05 10:44:48
I am trying to render to multiple textures as COLOR_ATTACHMENTs, without success. All I get from displaying them is a black screen (with a red clear fill), meaning my texture is read but is 'empty'. My pseudocode is: attach 3 textures to an FBO with texture indexes 1, 2 and 3 and color attachments 0, 1 and 2 respectively. As a test case, I tried to render my scene to all 3 color attachments, so they are supposed to hold the same exact data. Then I read either of those textures in shader pass 2 (with a sampler2D) and display them on a quad. My original intent for those 2 extra color attachments
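A frequent cause of exactly this "attachments stay empty" symptom is a missing `glDrawBuffers` call: without it, only `COLOR_ATTACHMENT0` receives fragment output. A sketch of the enabling call:

```c
/* Sketch: enable writes to all three color attachments of the
 * currently bound FBO. Call this once while the FBO is bound. */
#include <GL/gl.h>

void enable_mrt_outputs(void)
{
    const GLenum bufs[3] = {
        GL_COLOR_ATTACHMENT0,
        GL_COLOR_ATTACHMENT1,
        GL_COLOR_ATTACHMENT2,
    };
    glDrawBuffers(3, bufs);
    /* The fragment shader then writes one output per attachment,
       e.g. layout(location = 0..2) out vec4 in GLSL 3.30+. */
}
```

It is also worth checking `glCheckFramebufferStatus` returns `GL_FRAMEBUFFER_COMPLETE` before drawing.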

using linux framebuffer for graphics but disabling console text

Submitted by 半城伤御伤魂 on 2019-12-05 02:46:28
Question: I have some C code that draws simple graphics on the Linux framebuffer console. I'm also using the Raspberry Pi and its composite video output. The OS is Raspbian, and I'm doing a low-level solution without using X. My graphics are working well, and I'm also able to read the USB keyboard and respond to key presses. Currently there is a tty terminal that my graphics are written over. The tty is still active and key presses are echoed to the screen. What I want to achieve is to disable all
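One standard approach, sketched below: put the console into graphics mode with the `KDSETMODE` ioctl so the kernel stops drawing text and the cursor over the framebuffer (echoed keystrokes are handled separately, e.g. by clearing `ECHO` with `tcsetattr`). Restoring `KD_TEXT` on exit is essential, or the console is left unusable.

```c
/* Sketch: switch a tty into graphics mode; returns the fd so the
 * caller can restore text mode later. tty_path e.g. "/dev/tty1". */
#include <fcntl.h>
#include <linux/kd.h>
#include <sys/ioctl.h>
#include <unistd.h>

int console_enter_graphics(const char *tty_path)
{
    int fd = open(tty_path, O_RDWR);
    if (fd < 0)
        return -1;
    if (ioctl(fd, KDSETMODE, KD_GRAPHICS) < 0) {
        close(fd);
        return -1;
    }
    return fd;   /* on exit: ioctl(fd, KDSETMODE, KD_TEXT); close(fd); */
}
```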

Writing to then reading from an offscreen FBO on iPhone; works on simulator but not on device?

Submitted by 此生再无相见时 on 2019-12-05 01:31:38
I'm trying to do some image manipulation on the iPhone, basing things on the GLImageProcessing example from Apple. Ultimately what I'd like to do is load an image into a texture, perform one or more of the operations in the example code (hue, saturation, brightness, etc.), then read the resulting image back out for later processing/saving. For the most part, this would never need to touch the screen, so I thought that FBOs might be the way to go. To start with, I've cobbled together a little example that creates an offscreen FBO, draws to it, then reads the data back out as an image. I was
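A sketch of the offscreen setup being described, using the ES 1.x-era OES entry points that GLImageProcessing relies on (names and structure are illustrative). Simulator-vs-device discrepancies with this pattern often come down to an incomplete framebuffer, so the status check matters:

```c
/* Sketch: offscreen FBO with a renderbuffer color target, read back
 * with glReadPixels. pixelsOut must hold w*h*4 bytes. */
#include <OpenGLES/ES1/gl.h>
#include <OpenGLES/ES1/glext.h>

int render_offscreen(int w, int h, void *pixelsOut)
{
    GLuint fbo, rb;
    glGenFramebuffersOES(1, &fbo);
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, fbo);

    glGenRenderbuffersOES(1, &rb);
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, rb);
    glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_RGBA8_OES, w, h);
    glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES,
                                 GL_COLOR_ATTACHMENT0_OES,
                                 GL_RENDERBUFFER_OES, rb);

    if (glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES)
            != GL_FRAMEBUFFER_COMPLETE_OES)
        return -1;   /* common cause of device-only failures */

    glViewport(0, 0, w, h);
    /* ... draw the textured quad with the image operation here ... */

    glReadPixels(0, 0, w, h, GL_RGBA, GL_UNSIGNED_BYTE, pixelsOut);
    return 0;
}
```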

OpenGL: Rendering to texture by using FBO and viewport offset problems

Submitted by 心已入冬 on 2019-12-04 21:57:03
I have noticed unexpected behavior of framebuffer objects (FBOs) when rendering to a texture. If we set the viewport in the following way: glViewport(0, 0, w, h); glMatrixMode(GL_PROJECTION); glLoadIdentity(); gluOrtho2D(0.0, 1.0, 1.0, 0.0); glMatrixMode(GL_MODELVIEW); glLoadIdentity(); (w and h don't need to match the window size) everything is rendered fine. So let's say we need to draw the bounding rectangle of the viewport: glBegin(GL_LINE_STRIP); glVertex2f(0.0, 0.0); glVertex2f(0.0, 1.0); glVertex2f(1.0, 1.0); glVertex2f(1.0, 0.0); glEnd(); If we do the same drawing on a texture then covering whole
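A likely explanation worth checking: with `gluOrtho2D(0, 1, 1, 0)` the rectangle's edges land exactly on pixel *boundaries*, and whether a boundary line rasterizes can differ between the window and an FBO texture. A common fix, sketched here, is to expand the projection by half a pixel so the border falls on pixel centers:

```c
/* Sketch: ortho projection adjusted so coordinates 0 and 1 map to
 * pixel centers of the first and last pixel, making edge lines
 * rasterize consistently on both the window and an FBO texture. */
#include <GL/gl.h>
#include <GL/glu.h>

void set_pixel_centered_ortho(int w, int h)
{
    double hx = 0.5 / (double)w;   /* half a pixel, in ortho units */
    double hy = 0.5 / (double)h;
    glViewport(0, 0, w, h);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluOrtho2D(0.0 - hx, 1.0 + hx, 1.0 + hy, 0.0 - hy);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
}
```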

Rendering Static and Dynamic Graphics OpenGL

Submitted by 坚强是说给别人听的谎言 on 2019-12-04 19:42:27
I am working on an iOS game using the OpenGL pipeline. I have been able to render the graphics I want to the screen; however, I am calling glDrawElements too many times and have some concerns about eventually running into performance issues. I have several static elements in my game that do not need to be rendered on every render cycle. Is there a way I can render static elements to one framebuffer and dynamic elements to another? Here's the code I have tried: static BOOL renderThisFrameBuffer = YES; if (renderThisFrameBuffer) { glBindFramebuffer(GL_FRAMEBUFFER, interFrameBuffer); glClearColor
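A sketch of the cache-and-blit pattern this code seems to be reaching for (function names are hypothetical): render the static elements into an intermediate FBO only when they change, then each frame copy that FBO to the screen and draw the dynamic elements on top. `glBlitFramebuffer` needs OpenGL ES 3.0; on ES 2.0 you would instead draw the static FBO's color texture as a fullscreen quad.

```c
/* Sketch: expensive static pass runs only when dirty; per-frame work
 * is one blit plus the dynamic draw calls. */
#include <OpenGLES/ES3/gl.h>

static GLboolean staticDirty = GL_TRUE;

void render_frame(GLuint staticFbo, GLuint screenFbo, int w, int h)
{
    if (staticDirty) {
        glBindFramebuffer(GL_FRAMEBUFFER, staticFbo);
        glClear(GL_COLOR_BUFFER_BIT);
        /* drawStaticElements();  -- hypothetical: many glDrawElements */
        staticDirty = GL_FALSE;
    }

    glBindFramebuffer(GL_READ_FRAMEBUFFER, staticFbo);
    glBindFramebuffer(GL_DRAW_FRAMEBUFFER, screenFbo);
    glBlitFramebuffer(0, 0, w, h, 0, 0, w, h,
                      GL_COLOR_BUFFER_BIT, GL_NEAREST);

    glBindFramebuffer(GL_FRAMEBUFFER, screenFbo);
    /* drawDynamicElements();  -- hypothetical: cheap, every frame */
}
```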