opengl-es

Can't run application on some Android devices

Submitted by 纵饮孤独 on 2019-12-11 19:00:53
Question: I made a QtQuick program and found that Qt uses the OpenGL ES 2.0 library. I am able to build and deploy my application to all devices I have, including an AVD. But on some devices I receive this error log:

W/Qt ( 1246): eglconvenience/qeglconvenience.cpp:289 (void* QEglConfigChooser::chooseConfig()): Cant find EGLConfig, returning null config
W/Qt ( 1246): eglconvenience/qeglconvenience.cpp:289 (void* QEglConfigChooser::chooseConfig()): Cant find EGLConfig, returning null config
W/Qt ( 1246): scenegraph
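This warning means the requested EGL attribute set (often depth, stencil, or multisampling) matched no config on that device. A common workaround on Android is a chooser that tries progressively less demanding attribute sets. A minimal sketch of that fallback logic follows; the candidate values and the `deviceAccepts` predicate standing in for the real `eglChooseConfig` query are assumptions for illustration:

```java
import java.util.Arrays;
import java.util.List;
import java.util.function.Predicate;

public class ConfigFallback {
    // Candidate attribute sets, most demanding first. The arrays here are
    // illustrative {r, g, b, a, depth, samples} tuples, not real EGL pairs.
    static final List<int[]> CANDIDATES = Arrays.asList(
            new int[]{8, 8, 8, 8, 24, 4},   // RGBA8888 + depth24 + 4x MSAA
            new int[]{8, 8, 8, 8, 24, 0},   // RGBA8888 + depth24, no MSAA
            new int[]{5, 6, 5, 0, 16, 0});  // RGB565  + depth16, no MSAA

    // Returns the first candidate the device accepts, or null -- the case
    // that produces the "Cant find EGLConfig" warning quoted above.
    static int[] choose(Predicate<int[]> deviceAccepts) {
        for (int[] attribs : CANDIDATES) {
            if (deviceAccepts.test(attribs)) return attribs;
        }
        return null;
    }
}
```

In a real app the predicate would call `eglChooseConfig` and check the returned config count for each candidate.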

How to determine UV texture coordinates for n-sided polygon

Submitted by 谁说我不能喝 on 2019-12-11 18:54:43
Question: I have generated an n-sided polygon using the code below:

public class Vertex {
    public FloatBuffer floatBuffer; // buffer holding the vertices
    public ShortBuffer indexBuffer;
    public int numVertices;
    public int numIndeces;

    public Vertex(float[] vertex) {
        this.setVertices(vertex);
    }

    public Vertex(float[] vertex, short[] indices) {
        this.setVertices(vertex);
        this.setIndices(indices);
    }

    private void setVertices(float vertex[]) {
        // a float has 4 bytes so we allocate for each coordinate 4 bytes
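For a regular n-gon drawn as a triangle fan, the usual answer is to reuse the same unit-circle angles that generated the vertices and remap them into [0, 1] texture space: the fan center maps to (0.5, 0.5) and rim vertex i maps to (0.5 + 0.5·cos θ, 0.5 − 0.5·sin θ), with V flipped because texture coordinates grow downward. A sketch (the `fanUVs` helper name is mine, not from the question's `Vertex` class):

```java
public class PolygonUV {
    // Returns interleaved (u, v) pairs: fan center first, then n rim
    // vertices, matching a triangle-fan vertex layout.
    static float[] fanUVs(int n) {
        float[] uv = new float[(n + 1) * 2];
        uv[0] = 0.5f;   // fan center maps to the texture center
        uv[1] = 0.5f;
        for (int i = 0; i < n; i++) {
            double theta = 2.0 * Math.PI * i / n;
            uv[2 * (i + 1)]     = 0.5f + 0.5f * (float) Math.cos(theta);
            uv[2 * (i + 1) + 1] = 0.5f - 0.5f * (float) Math.sin(theta); // flip V
        }
        return uv;
    }
}
```

These pairs would be loaded into a FloatBuffer the same way the vertex positions are.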

OpenGL ES glDrawTexfOES() 2D texture not rendering

Submitted by 蹲街弑〆低调 on 2019-12-11 18:53:09
Question: First of all, I know this is a duplicate question and I checked out other questions, but they didn't help! I'm trying to convert my Android game from the canvas system to GL10. After lots of googling I decided to do this for performance. Anyway, I tried to build a class that loads and draws textures. The problem is that the textures show up completely black over my blue screen. I'm not sure if I can draw a texture without creating a mesh or something like that, but this is what I'm trying to do. I'm only
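All-black textures on OpenGL ES 1.x are very often caused by non-power-of-two bitmap dimensions (or by the default mipmapping GL_TEXTURE_MIN_FILTER with no mipmaps uploaded), since ES 1.x requires power-of-two textures. A common fix is padding the bitmap up to power-of-two dimensions before glTexImage2D. A minimal sketch of that size calculation (the helper name is mine):

```java
public class TexturePot {
    // OpenGL ES 1.x requires power-of-two texture dimensions; uploading an
    // NPOT bitmap often yields an incomplete texture that samples as black.
    // Round a dimension up to the next power of two before padding.
    static int nextPowerOfTwo(int x) {
        int pot = 1;
        while (pot < x) pot <<= 1;
        return pot;
    }
}
```

The original bitmap would then be drawn into a `nextPowerOfTwo(w) × nextPowerOfTwo(h)` bitmap, and the UV coordinates scaled by w/potW and h/potH.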

OpenGL ES: Render to texture via frame buffer is rendering only one color

Submitted by 爱⌒轻易说出口 on 2019-12-11 18:37:14
Question: I am trying to implement a motion blur effect in my Android game. After a lot of research I found that the best way to do this is to save the previous frame as a texture using a Frame Buffer Object and render it on top of the current frame. After following some tutorials on how to do something like that, I ended up with code which basically renders my scene to the texture and then draws the texture to the default framebuffer. But the texture has only one color, like when I have a green
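Once the FBO renders correctly (a common cause of a single-color attachment is leaving GL_TEXTURE_MIN_FILTER at its mipmapping default on the FBO texture), the feedback blend this approach uses, accum = (1 − a)·current + a·accumPrev, gives each past frame an exponentially decaying contribution. A small sketch of that weight, useful for tuning the blend factor a; the helper is mine, not from the question:

```java
public class BlurWeights {
    // With accum = (1 - a) * current + a * accumPrev, the frame k steps in
    // the past contributes (1 - a) * a^k of the final blended color.
    static double weightOfFrame(double a, int k) {
        return (1.0 - a) * Math.pow(a, k);
    }
}
```

The weights over all past frames sum to 1, so the blend never brightens or darkens the image overall.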

Why is it necessary to bind a buffer more than once in succession? [duplicate]

Submitted by 笑着哭i on 2019-12-11 18:27:01
Question: This question already has answers here (closed 6 years ago). Possible duplicate: How does glBufferData know which VBO to work on? I've noticed in sample code (in an O'Reilly book) for both VBOs and render buffers that the binding is done more than once. What is the reasoning behind this? For example, you might have this at the top of an OpenGL routine:

glGenBuffers(1, &m_vertexBuffer);
glBindBuffer(GL_ARRAY_BUFFER, m_vertexBuffer);

And then before doing the drawing, you do it again:
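The underlying answer: OpenGL is a state machine. glBufferData takes no buffer id; it writes to whatever object is currently bound to the target, so code re-binds before drawing in case something else was bound in between. A toy model of that binding table (my own illustration, not real GL bindings):

```java
import java.util.HashMap;
import java.util.Map;

public class BindingModel {
    // GL is a state machine: glBufferData has no buffer argument, it acts
    // on whatever id is bound to the target at that moment.
    static final int GL_ARRAY_BUFFER = 0x8892;
    final Map<Integer, Integer> bound = new HashMap<>(); // target -> bound id
    final Map<Integer, byte[]> store = new HashMap<>();  // buffer id -> data

    void bindBuffer(int target, int id) { bound.put(target, id); }

    void bufferData(int target, byte[] data) {
        // Affects the *currently bound* buffer, explaining why binding order
        // matters and why samples bind again before drawing.
        store.put(bound.get(target), data);
    }
}
```

Binding buffer 1, uploading, then binding buffer 2 and uploading leaves each upload attached to the buffer that was bound at the time, exactly as in real GL.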

OpenGL iOS view does not paint

Submitted by 南楼画角 on 2019-12-11 18:13:17
Question: I am trying to use OpenGL to paint a view that is a subview of another view. I have created a view class for this purpose, and if I use this class in a simple test application it works fine. However, if I place an instance of this class on a particular page of my app, the OpenGL painting does not display anything. I am certain that the view is visible (I can set a background color, and that is displayed, and I can receive touch events). I can also trace through the OpenGL initialization and

gluProject not working… If the object has Z=-1.0f and is scaled to (0.01f, 0.01f, 0.0f), which parameters do I have to pass to gluProject?

Submitted by 痴心易碎 on 2019-12-11 18:02:06
Question: Until now, I worked with gluProject, a perspective projection, and a zoomable square centered on the screen with its lower-left vertex at (-1, -1, 0). I zoomed the square by adjusting the Z axis. For example, I zoomed the square to Z=-5 and called gluProject with the OpenGL object coordinates (-1, -1, 0) to find the window pixel X, Y position of that vertex of the square. It works fine. But now I have changed my architecture: I'm no longer using Z to zoom, I'm scaling to zoom. I have the square at Z=-1.0f,
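The key point for this situation: gluProject only knows what is in the matrices you pass it, so when zooming by scaling, the scale must be baked into the modelview matrix argument, not applied somewhere gluProject cannot see. A minimal sketch of the gluProject math (column-major 4×4 matrices, viewport {x, y, w, h}); the identity-matrix setup in the usage below is an assumption for illustration, not the question's perspective setup:

```java
public class Project {
    // Minimal gluProject: object coords -> window coords. Pass the SAME
    // modelview (including any scale used for zooming) that you render with.
    static double[] project(double[] obj, double[] mv, double[] p, int[] vp) {
        double[] eye = mul(mv, new double[]{obj[0], obj[1], obj[2], 1});
        double[] clip = mul(p, eye);
        double w = clip[3];
        double nx = clip[0] / w, ny = clip[1] / w; // normalized device coords
        return new double[]{vp[0] + vp[2] * (nx + 1) / 2,
                            vp[1] + vp[3] * (ny + 1) / 2};
    }

    static double[] mul(double[] m, double[] v) { // column-major 4x4 * vec4
        double[] r = new double[4];
        for (int row = 0; row < 4; row++)
            for (int col = 0; col < 4; col++)
                r[row] += m[col * 4 + row] * v[col];
        return r;
    }
}
```

With identity matrices and a 100×100 viewport, the vertex (-1, -1, 0) scaled by 0.01 becomes (-0.01, -0.01, 0) and projects to (49.5, 49.5), just below the viewport center, which matches the intuition that a heavily scaled-down square sits near the middle of the screen.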

Native WebGL particle system opacity issue

Submitted by 不想你离开。 on 2019-12-11 17:54:07
Question: I am trying to render textured particles and I have a problem: the transparent pixels of the texture do something weird to the render. It looks like particles that are behind the particles nearest the camera are not rendered at all. But not always; some of them render and look as expected. I tried to play around with the depth and blend options, but without result. Perhaps a solution can be found by modifying this part of the code:

gl.enable(gl.DEPTH_TEST);
gl.enable(gl.BLEND);
gl.blendFunc(gl.SRC_ALPHA
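This is the classic transparency-versus-depth-buffer conflict: with depth writes on, a near particle's transparent pixels still write depth and occlude everything behind them. The usual fix is to disable depth writes while drawing the particles (depthMask(false) in WebGL) and draw them back to front. A sketch of the sorting half, written in Java for consistency with the rest of this page (helper names are mine):

```java
import java.util.Arrays;
import java.util.Comparator;

public class ParticleSort {
    // Back-to-front ordering so blended (transparent) particles composite
    // correctly; typically paired with depthMask(false) while drawing them.
    static float[][] sortBackToFront(float[][] particles, float[] cam) {
        float[][] out = particles.clone();
        Arrays.sort(out, Comparator.comparingDouble(
                (float[] p) -> -dist2(p, cam))); // farthest first
        return out;
    }

    static double dist2(float[] p, float[] c) {
        double dx = p[0] - c[0], dy = p[1] - c[1], dz = p[2] - c[2];
        return dx * dx + dy * dy + dz * dz;
    }
}
```

For many additive particle effects, switching to additive blending removes the need to sort at all, since addition is order-independent.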

Works on Bionic 4.1.2 but not on Samsung 4.4.2

Submitted by ◇◆丶佛笑我妖孽 on 2019-12-11 17:52:05
Question: I have an app I'm working on that uses OpenGL ES 2.0 on an Android device. The app contains some buttons that are displayed using programmatic layout controls. This button pad is superimposed on an OpenGL SurfaceView. When I run the app on the tablet, the buttons appear AND the SurfaceView appears briefly, but then the SurfaceView is grey, the background color of the 3D model I've created. The app works on a Motorola Bionic (Android 4.1.2) but doesn't work on a Samsung Galaxy

Process every camera frame as Bitmap with OpenGL

Submitted by 偶尔善良 on 2019-12-11 17:45:03
Question: I have an app where I want to process every frame from the camera to do some ARCore stuff. So I have a class implementing GLSurfaceView.Renderer, and in this class I have the onDrawFrame(GL10 gl) method. In this method I want to work with an Android Bitmap, so I call this code to get a bitmap from the current frame:

private Bitmap getTargetImageBitmapOpenGL(int cx, int cy, int w, int h) {
    try {
        if (currentTargetImageBitmap == null) {
            currentTargetImageBitmap = Bitmap.createBitmap(w,
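One detail that trips up most glReadPixels-to-Bitmap code: GL returns pixel rows bottom-up, while Android's Bitmap expects them top-down, so the rows must be flipped before Bitmap.setPixels / copyPixelsFromBuffer. A sketch of that flip on a packed int[] pixel array (the helper name is mine, not from the question's code):

```java
public class PixelFlip {
    // glReadPixels returns rows bottom-up; Android Bitmap is top-down,
    // so swap row order before handing the pixels to the Bitmap.
    static int[] flipVertically(int[] pixels, int w, int h) {
        int[] out = new int[pixels.length];
        for (int row = 0; row < h; row++)
            System.arraycopy(pixels, row * w, out, (h - 1 - row) * w, w);
        return out;
    }
}
```

Note also that glReadPixels returns RGBA while Bitmap.Config.ARGB_8888 stores pixels differently, so a channel swizzle is usually needed as well; and for performance-sensitive per-frame capture, PixelCopy or an ImageReader-backed path avoids stalling the GL pipeline.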