framebuffer

iOS GLKit and back to default framebuffer

Submitted by 喜夏-厌秋 on 2020-01-13 05:38:10
Question: I am running the boilerplate OpenGL example code that Xcode creates for an iOS OpenGL project. This sets up a simple ViewController and uses GLKit to handle the rest of the work. All the update/draw functionality of the application is in C++; it is cross-platform. There is a lot of framebuffer creation going on. The draw phase renders to a few framebuffers and then tries to set it back to the default framebuffer:

    glBindFramebuffer(GL_FRAMEBUFFER, 0);

This generates a GL_INVALID_ENUM error.
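
A common answer to this on iOS: GLKit renders into a framebuffer object it creates itself, so there is no window-system framebuffer 0 to return to. A portable pattern for cross-platform C++ code is to save whatever framebuffer was bound before the offscreen passes and restore that, rather than assuming the default is 0. A minimal sketch (the names offscreenFBO and prevFBO are illustrative; on the Objective-C side, GLKView's bindDrawable method does the equivalent rebind):

    // Save the framebuffer GLKit bound for us before rendering offscreen.
    GLint prevFBO = 0;
    glGetIntegerv(GL_FRAMEBUFFER_BINDING, &prevFBO);

    glBindFramebuffer(GL_FRAMEBUFFER, offscreenFBO);  // assumed offscreen target
    // ... render-to-texture passes ...

    // Restore the "default" framebuffer: on iOS this is GLKit's drawable FBO,
    // not 0, which is why hard-coding 0 misbehaves there.
    glBindFramebuffer(GL_FRAMEBUFFER, (GLuint)prevFBO);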

What does GL_COLOR_ATTACHMENT do?

Submitted by 青春壹個敷衍的年華 on 2020-01-13 03:14:49
Question: I'm learning about framebuffers right now and I just don't understand what the color attachment does. I understand framebuffers themselves. What is the point of the second parameter in:

    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, textureColorBuffer, 0);

Why doesn't anything draw to my framebuffer when I change it to GL_COLOR_ATTACHMENT1? How could I draw to the framebuffer with the texture set to color attachment 1? Why would using multiple color attachments be useful?
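
The second parameter names the attachment point on the framebuffer that the texture is plugged into. Fragment shader output goes to GL_COLOR_ATTACHMENT0 by default; to route output anywhere else you must call glDrawBuffers, which is why drawing appears to do nothing when only GL_COLOR_ATTACHMENT1 has a texture. Multiple attachments let one pass write several images at once (e.g. scene color plus a bloom mask, or a G-buffer for deferred shading). A minimal sketch, assuming desktop GL 3.0+ and a 512x512 target:

    GLuint fbo, colorTex[2];
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glGenTextures(2, colorTex);
    for (int i = 0; i < 2; ++i) {
        glBindTexture(GL_TEXTURE_2D, colorTex[i]);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 512, 512, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, nullptr);
        // Attach texture i to color attachment point i.
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0 + i,
                               GL_TEXTURE_2D, colorTex[i], 0);
    }
    // Route shader outputs: layout(location = 0) -> attachment 0,
    // layout(location = 1) -> attachment 1. Without this call, only
    // attachment 0 receives fragments.
    const GLenum bufs[2] = { GL_COLOR_ATTACHMENT0, GL_COLOR_ATTACHMENT1 };
    glDrawBuffers(2, bufs);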

Drawing into OpenGL ES framebuffer and getting UIImage from it on iPhone

Submitted by 老子叫甜甜 on 2020-01-12 03:53:11
Question: I'm trying to do offscreen rendering of some primitives in OpenGL ES on iOS. The code is as follows:

    // context and necessary buffers
    @interface RendererGL {
        EAGLContext* myContext;
        GLuint framebuffer;
        GLuint colorRenderbuffer;
        GLuint depthRenderbuffer;
    }

.m file:

    - (id) init {
        self = [super init];
        if (self) {
            // initializing context
            myContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
            [EAGLContext setCurrentContext:myContext];
            [self setupOpenGL]; // creating buffers
        }
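
Once the scene has been rendered into the offscreen framebuffer, the pixels are read back with glReadPixels; those raw bytes are what the UIImage is then built from. A minimal sketch of the readback step (framebuffer, width and height are assumed from the setup above; GL_RGBA with GL_UNSIGNED_BYTE is the readback format ES 2.0 always supports):

    #include <vector>

    glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);
    std::vector<unsigned char> pixels(width * height * 4);
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels.data());
    // On iOS these bytes would then be wrapped in a CGDataProvider/CGImage to
    // build the UIImage. Note that GL returns rows bottom-up, so the image
    // needs a vertical flip.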

How to draw image in Linux at same time on LCD and HDMI on Raspberry Pi with QT?

Submitted by 为君一笑 on 2020-01-06 14:28:22
Question: Does anyone know how to draw an image in Linux on a Raspberry Pi with Qt using two framebuffers at the same time? I mean I want to run my application on the LCD display and draw the image to HDMI at the same time.

Answer 1: I wrote this code after seeing a lot of questions on the Internet about how to display an image on the Linux framebuffer. I'll leave it here; maybe it will help someone. The code was tested on a Raspberry Pi 2 model B/B+ with Linux kernel 4.4.y and Qt 5.6.

File: fbdi.pro

    QT += core
    QT += gui
    QT += widgets
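
The usual approach on the Pi is to let Qt drive the primary display normally and write the second image straight into the other framebuffer device via mmap. A minimal sketch, assuming the HDMI output is exposed as /dev/fb1 and a 32 bpp mode (the blitToFramebuffer name is illustrative; the ioctls are the standard linux/fb.h interface):

    #include <fcntl.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <sys/mman.h>
    #include <linux/fb.h>
    #include <cstring>
    #include <QImage>

    bool blitToFramebuffer(const QImage& src, const char* dev = "/dev/fb1") {
        int fd = open(dev, O_RDWR);
        if (fd < 0) return false;
        fb_var_screeninfo vinfo;
        if (ioctl(fd, FBIOGET_VSCREENINFO, &vinfo) < 0 ||
            vinfo.bits_per_pixel != 32) { close(fd); return false; }
        const size_t size = vinfo.yres * vinfo.xres * 4;
        void* fb = mmap(nullptr, size, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
        if (fb == MAP_FAILED) { close(fd); return false; }
        // Match the framebuffer's size and 32 bpp layout, then copy.
        QImage img = src.convertToFormat(QImage::Format_RGB32)
                        .scaled(vinfo.xres, vinfo.yres);
        std::memcpy(fb, img.constBits(), size);
        munmap(fb, size);
        close(fd);
        return true;
    }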

How to get pixel colour from framebuffer on linux (Raspberry Pi)

Submitted by 杀马特。学长 韩版系。学妹 on 2020-01-05 08:22:19
Question: I am trying to make a small program that controls the colour of an RGB LED according to the colour of certain pixels on the screen. Since this is on a Raspberry Pi running Raspbmc, I can't use Xlib because everything is drawn through the framebuffer (not sure if this is true, but from what I read in the FAQ this appears to be the case). I tried using Xlib but couldn't get the display to be detected (which makes sense now). This is an example I found online. The problem is, it compiles
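
The framebuffer route is straightforward: mmap /dev/fb0 and index into it. A minimal sketch, assuming a 32 bpp XRGB mode and no row padding (the true stride is fb_fix_screeninfo.line_length, worth checking on real hardware):

    #include <fcntl.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <sys/mman.h>
    #include <linux/fb.h>
    #include <cstdint>
    #include <cstdio>

    int main() {
        int fd = open("/dev/fb0", O_RDONLY);
        if (fd < 0) return 1;
        fb_var_screeninfo v;
        ioctl(fd, FBIOGET_VSCREENINFO, &v);
        size_t size = v.xres * v.yres * 4;          // 32 bpp assumed
        auto* fb = static_cast<uint32_t*>(
            mmap(nullptr, size, PROT_READ, MAP_SHARED, fd, 0));
        if (fb == MAP_FAILED) return 1;
        unsigned x = 100, y = 100;                  // pixel of interest
        uint32_t px = fb[y * v.xres + x];
        std::printf("R=%u G=%u B=%u\n",
                    (px >> 16) & 0xFF, (px >> 8) & 0xFF, px & 0xFF);
        munmap(fb, size);
        close(fd);
        return 0;
    }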

Ambiguous results with Frame Buffers in libgdx

Submitted by 廉价感情. on 2020-01-04 09:25:30
Question: I am getting the following weird results with the FrameBuffer class in libgdx. Here is the code that is producing this result:

    // This is the rendering code
    @Override
    public void render(float delta) {
        Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT | GL20.GL_DEPTH_BUFFER_BIT);
        stage.act();
        stage.draw();

        fbo.begin();
        batch.begin();
        batch.draw(heart, 0, 0);
        batch.end();
        fbo.end();

        test = new Image(fbo.getColorBufferTexture());
        test.setPosition(256, 256);
        stage.addActor(test);
    }
    //This is the
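
One pattern worth checking, shown here as a raw-GL C++ sketch of what libgdx's FrameBuffer wraps (variable names illustrative): do the offscreen pass first, with its own clear, and build the Image that displays the FBO texture once at setup instead of adding a new actor every render() call. In the code above, the FBO is drawn into after the stage that shows it, so the displayed texture is always one frame stale and actors accumulate each frame.

    glBindFramebuffer(GL_FRAMEBUFFER, fbo);        // 1. offscreen pass
    glViewport(0, 0, fboWidth, fboHeight);
    glClear(GL_COLOR_BUFFER_BIT);                  // clear the FBO itself
    // ... draw the sprite into the FBO ...

    glBindFramebuffer(GL_FRAMEBUFFER, 0);          // 2. back to the screen
    glViewport(0, 0, screenWidth, screenHeight);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    // ... draw the scene that samples the FBO's color texture ...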

Blurring the depth buffer in OpenGL - how to access mipmap levels in a fragment shader?

Submitted by 别说谁变了你拦得住时间么 on 2020-01-03 19:10:10
Question: I'm trying to blur a depth texture by blurring and blending mipmap levels in a fragment shader. I have two framebuffer objects:

1) a color framebuffer with a depth renderbuffer attached;
2) a z framebuffer with a depth texture attached.

Once I render the scene to the color framebuffer object, I blit to the depth buffer object and can successfully render that (the output is a GL_LUMINANCE depth texture). I can successfully access any given mipmap level by selecting it prior to drawing the depth
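
In GLSL a specific level is sampled with textureLod (texture2DLod in older shading language versions), and the application side has to build the mip chain and use a mipmapped minification filter. A minimal sketch, assuming GL 3.0+ and that the depth values were already blitted into a samplable texture named depthTex as described (mipmap generation on true depth formats is patchy across drivers, which is another reason to blur a luminance/color copy):

    glBindTexture(GL_TEXTURE_2D, depthTex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                    GL_LINEAR_MIPMAP_LINEAR);   // sampling needs a mip filter
    glGenerateMipmap(GL_TEXTURE_2D);            // rebuild the chain each frame

    // Fragment shader: blend two mip levels; higher LOD = stronger blur.
    const char* blurFragSrc = R"(
        #version 130
        uniform sampler2D depthTex;
        in vec2 uv;
        out vec4 fragColor;
        void main() {
            float d = mix(textureLod(depthTex, uv, 2.0).r,
                          textureLod(depthTex, uv, 4.0).r, 0.5);
            fragColor = vec4(vec3(d), 1.0);
        }
    )";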