opengl-es

Android FFmpeg OpenGL ES render movie

百般思念 submitted on 2019-12-18 11:37:16
Question: I am trying to render video via the NDK, to add some features that just aren't supported in the SDK. I am using FFmpeg to decode the video and can compile it via the NDK, and used this as a starting point. I have modified that example so that instead of using glDrawTexiOES to draw the texture, I set up some vertices and render the texture on top of them (the OpenGL ES way of rendering a quad). Below is what I am doing to render, but the glTexImage2D call is slow. I want to know if there…
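The usual cure for a slow per-frame glTexImage2D is to allocate the texture once and then update it with glTexSubImage2D. A related ES 1.x wrinkle is that textures commonly must have power-of-two dimensions, so a decoded frame is uploaded into the corner of a larger pow2 texture and the quad's texture coordinates are scaled down to match. A minimal sketch of that sizing arithmetic (helper names are ours, not from the question):

```java
// Sketch: a 640x360 video frame goes into a 1024x512 texture; the quad then
// samples only the covered fraction of each axis. Names are illustrative.
public class TexSize {
    // Smallest power of two >= n (assumes n >= 1).
    static int nextPow2(int n) {
        int p = 1;
        while (p < n) p <<= 1;
        return p;
    }

    // Maximum texture coordinate to use on the quad for one axis:
    // the fraction of the pow2 texture the frame actually covers.
    static float texCoordMax(int frameDim) {
        return (float) frameDim / nextPow2(frameDim);
    }
}
```

With a 640x360 frame this gives a 1024x512 texture and s/t maxima of 0.625 and 0.703125; the one-time glTexImage2D allocates 1024x512, and each frame is pushed with glTexSubImage2D into the 640x360 sub-rectangle.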

Android OpenGL ES Framebuffer objects - rendering depth-buffer to texture

。_饼干妹妹 submitted on 2019-12-18 10:44:53
Question: I am using an Android device running Froyo that supports OpenGL ES 1.1 and OpenGL ES 2.0. I want to render the depth buffer to a texture. Having seen a number of examples for OpenGL and OpenGL ES on other platforms (including iPhone), I have tried a number of FBO configurations. I seem to be able to get an FBO set up with a colour texture, but every time I attach a depth texture it fails. My current code is based on this example, but creates a colour texture as well instead of setting draw and read…
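A common cause of exactly this failure on ES 2.0 is that depth textures are not core: creating a GL_DEPTH_COMPONENT texture and attaching it to an FBO requires the GL_OES_depth_texture extension, and on hardware without it the attach fails while the colour attachment works fine. A small sketch of the capability check (the helper is ours; the extension name is real):

```java
// Sketch: scan the GL_EXTENSIONS string (as returned by glGetString) for
// GL_OES_depth_texture before attempting a depth-texture FBO attachment.
// Without the extension, fall back to a depth renderbuffer instead.
public class DepthSupport {
    static boolean hasDepthTexture(String glExtensions) {
        if (glExtensions == null) return false;
        for (String ext : glExtensions.split(" ")) {
            if (ext.equals("GL_OES_depth_texture")) return true;
        }
        return false;
    }
}
```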

Android game loop vs. updating in the rendering thread

放肆的年华 submitted on 2019-12-18 10:37:11
Question: I'm making an Android game and am currently not getting the performance I'd like. I have a game loop in its own thread which updates an object's position. The rendering thread then traverses these objects and draws them. The current behaviour is choppy/uneven movement. What I cannot explain is that before I put the update logic in its own thread, I had it in the onDrawFrame method, right before the GL calls. In that case the animation was perfectly smooth; it only became choppy…
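The choppiness typically comes from the update and render threads running at unsynchronised rates, so the renderer sometimes sees zero updates between frames and sometimes two. A standard remedy is a fixed-timestep update plus interpolation at draw time: the renderer blends each object's previous and current position by how far it is into the next update tick. A minimal sketch of the interpolation (names are illustrative):

```java
// Sketch: the renderer never draws the raw current position; it draws a
// blend of the last two simulated positions, where alpha in [0,1] is the
// fraction of the update interval elapsed at draw time.
public class Interpolator {
    static float lerp(float previous, float current, float alpha) {
        return previous + (current - previous) * alpha;
    }
}
```

In the update thread the object stores both `previous` and `current` each tick; onDrawFrame computes `alpha = timeSinceLastUpdate / updateInterval` and draws at the lerped position, which smooths out the mismatch between the two threads' rates.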

In OpenGL ES, how do I load a texture that has transparent pixels?

谁说胖子不能爱 submitted on 2019-12-18 10:35:07
Question: And then have it display correctly? An example would be having a round ball in a rectangle while being able to see another texture in the background. Edit: At the moment, when I load the texture, the transparent pixels from the source image are displayed as black. Answer 1: For iPhone and N95 this works: if you are loading the texture from raw data, set the internal and source format to GL_RGBA. glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, textureWidth, textureHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE, pointerToPixels)…
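Uploading the texture as GL_RGBA only preserves the alpha channel; for it to display correctly, blending must also be enabled with glEnable(GL_BLEND) and glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) before drawing. What that blend mode computes per channel can be written out in plain Java as a sketch:

```java
// Sketch: the per-channel arithmetic behind
// glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA).
// With blending disabled, the framebuffer just receives src, so fully
// transparent texels (srcAlpha = 0) show their raw RGB, often black.
public class Blend {
    // src/dst channel values and srcAlpha all in [0, 1].
    static float blendChannel(float src, float dst, float srcAlpha) {
        return src * srcAlpha + dst * (1f - srcAlpha);
    }
}
```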

OpenGL ES versus OpenGL

最后都变了- submitted on 2019-12-18 10:16:06
Question: What are the differences between OpenGL ES and OpenGL? Answer 1: Two of the more significant differences between OpenGL ES and OpenGL are the removal of the glBegin ... glEnd calling semantics for primitive rendering (in favor of vertex arrays) and the introduction of fixed-point data types for vertex coordinates and attributes, to better support the computational abilities of embedded processors, which often lack an FPU. Have a look here: OpenGL_ES. Answer 2: OpenGL ES is the OpenGL API for embedded…
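The fixed-point type mentioned above, GLfixed, is a 16.16 format: the value is scaled by 2^16 and stored in a 32-bit integer, so FPU-less devices can process vertex data with integer arithmetic. A sketch of the conversion:

```java
// Sketch: converting between float and the 16.16 GLfixed representation
// used by OpenGL ES 1.x entry points such as glTranslatex/glColor4x.
public class Fixed {
    static int toFixed(float f) { return (int) (f * 65536.0f); }   // * 2^16
    static float toFloat(int x) { return x / 65536.0f; }           // / 2^16
}
```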

Why FloatBuffer instead of float[]?

旧时模样 submitted on 2019-12-18 10:05:46
Question: I've been using FloatBuffers in my Android code for a while (copied from some OpenGL ES tutorial), but I cannot understand exactly what this construct is and why it is needed. For example, this code (or similar) appears in many people's code and Android tutorials: float[] vertices = ...some array...; ByteBuffer vbb = ByteBuffer.allocateDirect(vertices.length * 4); vbb.order(ByteOrder.nativeOrder()); // use the device hardware's native byte order FloatBuffer fb = vbb.asFloatBuffer(); //…
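The reason for the construct is that GL reads vertex data through native memory: a plain `float[]` lives on the Java heap, where the garbage collector may move it, so it is copied into a *direct* ByteBuffer (allocated outside the Java heap) in the device's native byte order, and `4` is simply `sizeof(float)`. The snippet from the question, completed and wrapped in a helper:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

// The idiom from the question, made whole: copy a Java float[] into a
// direct, native-order FloatBuffer that GL can read from stable memory.
public class VertexBuffers {
    static FloatBuffer asDirectFloatBuffer(float[] vertices) {
        ByteBuffer vbb = ByteBuffer.allocateDirect(vertices.length * 4);
        vbb.order(ByteOrder.nativeOrder()); // device's native byte order
        FloatBuffer fb = vbb.asFloatBuffer();
        fb.put(vertices);
        fb.position(0); // rewind so GL reads from the start
        return fb;
    }
}
```

The returned buffer is what gets handed to glVertexPointer / glVertexAttribPointer.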

Android Video Player Using NDK, OpenGL ES, and FFmpeg

不羁岁月 submitted on 2019-12-18 09:54:51
Question: OK, so here is what I have so far. I have built FFmpeg on Android and am able to use it fine. I have been able to load a video into FFmpeg after passing the chosen filename from the Java side. To save on performance, I am writing the video player in the NDK rather than passing frames from FFmpeg to Java through JNI. I want to send frames from the video to an OpenGL surface. I am having trouble figuring out how to get each frame of video and render it onto the OpenGL surface. I have been stuck…
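One detail that trips up exactly this pipeline: after sws_scale converts an AVFrame to RGBA, each row occupies `linesize[0]` bytes, which may be larger than `width * 4` because of alignment padding, while glTexImage2D by default expects tightly packed rows. So each row is copied without the padding before upload. A plain-Java model of that packing step (the real code would do this in C; names are ours):

```java
// Sketch: repack linesize-padded RGBA rows into a tight width*4*height
// buffer suitable for a default-aligned glTexImage2D upload.
public class FramePacker {
    static byte[] packRows(byte[] src, int linesize, int width, int height) {
        int rowBytes = width * 4; // RGBA, 4 bytes per pixel
        byte[] dst = new byte[rowBytes * height];
        for (int y = 0; y < height; y++) {
            // skip the (linesize - rowBytes) padding bytes at each row end
            System.arraycopy(src, y * linesize, dst, y * rowBytes, rowBytes);
        }
        return dst;
    }
}
```

In C the same effect can be had without copying by setting GL_UNPACK_ALIGNMENT appropriately or asking sws_scale for an unpadded destination.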

How to use Blend to make a polygon transparent?

↘锁芯ラ submitted on 2019-12-18 09:54:07
Question: I have an app that shows two polygons. I need to make one of the polygons progressively invisible, while the other must remain visible. I'm working with OpenGL ES 1.1. I'm developing for Android, but I think other platforms would use the same code with some minor changes. How can I do that? I know that I must do it with these functions: glEnable(GL_BLEND); glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA); But I don't know where I have to put them and how to use them to make my polygon…
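The placement is: enable blending and set the blend func once during GL setup (or before drawing the translucent polygon), then each frame draw the fading polygon with glColor4f(r, g, b, alpha) where alpha shrinks from 1 toward 0 over time; the other polygon is drawn with alpha 1. A sketch of the per-frame alpha computation (helper names are ours):

```java
// Sketch: per-frame alpha for a linear fade-out. The GL side stays fixed
// (glEnable(GL_BLEND) + glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA));
// only the alpha passed to glColor4f changes between frames.
public class Fade {
    // Full opacity at elapsed=0, invisible once elapsed >= duration.
    static float alphaAt(float elapsed, float duration) {
        if (elapsed <= 0f) return 1f;
        if (elapsed >= duration) return 0f;
        return 1f - elapsed / duration;
    }
}
```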

WebGL/Javascript: Object transformations with multiple objects

故事扮演 submitted on 2019-12-18 09:24:36
Question: I want to draw several objects and then transform them by selecting a specific one with a keyboard index, say 1-5. I loaded the canvas. I initialized the WebGL context. I defined vertex/fragment shaders and bound them to a program, which I "used" (gl.useProgram(program)). Then I initialized a vertex buffer (in its own function). There I defined the vertices for a cube and bound that buffer. In the same function I defined my cone vertices and bound them to a different buffer.
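The usual structure for this is one model transform per object plus a "selected" index that the keyboard handler updates; each draw call binds that object's buffer, uploads its own transform as a uniform, and draws. A language-agnostic sketch of the bookkeeping, with transforms reduced to 2D offsets to stay small (in the real app each entry would be a 4x4 model matrix passed to gl.uniformMatrix4fv before that object's drawArrays):

```java
// Sketch: per-object transform state with keyboard selection.
// Keys 1..5 map to indices 0..4; transforms apply only to the selection.
public class Scene {
    final float[][] offsets; // one (x, y) offset per object
    int selected = 0;

    Scene(int objectCount) { offsets = new float[objectCount][2]; }

    void select(int index) { selected = index; }

    void translate(float dx, float dy) {
        offsets[selected][0] += dx;
        offsets[selected][1] += dy;
    }
}
```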

How to get frame by frame from MP4? (MediaCodec)

淺唱寂寞╮ submitted on 2019-12-18 09:09:08
Question: I am working with OpenGL, and I would like to put all my textures into an MP4 in order to compress them. Then I need to get them back from the MP4 on my Android device, so I need to somehow decode the MP4 and get frames by request. I found MediaCodec https://developer.android.com/reference/android/media/MediaCodec and MediaMetadataRetriever https://developer.android.com/reference/android/media/MediaMetadataRetriever, but I did not see an approach for requesting frame by frame... If there is someone who…
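The simplest "by request" route is MediaMetadataRetriever.getFrameAtTime, but it takes a timestamp in microseconds rather than a frame index, so the index must first be mapped through the video's frame rate (which the real app would read from the track's MediaFormat). A sketch of that mapping (helper names are ours):

```java
// Sketch: frame index -> presentation time in microseconds, the unit
// MediaMetadataRetriever.getFrameAtTime expects. The fps value would come
// from the MP4's track metadata in the real app.
public class FrameTime {
    static long frameIndexToTimeUs(int frameIndex, double fps) {
        return Math.round(frameIndex * 1_000_000.0 / fps);
    }
}
```

Note that getFrameAtTime decodes one whole frame per call and is slow; for sequential extraction of many frames, MediaCodec fed by MediaExtractor is the faster path.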