MediaCodec simultaneous encoding and decoding

Submitted by 倖福魔咒の on 2019-12-04 12:45:20

Sharing textures between OpenGL ES contexts requires some care. The way it's implemented in Grafika's "show + capture camera" Activity is broken; see this issue for details. The basic problem is that you essentially need to issue memory barriers when the texture is updated; in practical terms that means issuing glFinish() on the producer side, re-binding the texture on the consumer side, and doing all of this in synchronized blocks.
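To make that concrete, here's a rough sketch of the synchronization the shared-texture approach demands. These are illustrative fragments (the lock object and texture ID names are hypothetical, not from Grafika), intended only to show why this gets awkward:

```java
import android.opengl.GLES20;

// Shared between the two threads that own the two GLES contexts.
final Object textureLock = new Object();

// Producer thread (context A), after rendering into the shared texture:
synchronized (textureLock) {
    // Acts as a memory barrier: block until every write has actually landed.
    GLES20.glFinish();
}

// Consumer thread (context B), before sampling from the texture:
synchronized (textureLock) {
    // Re-binding forces this context to pick up the latest texture contents.
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, sharedTextureId);
}
```

The glFinish() stalls the producer's pipeline completely, which is part of why the single-thread approach below is both simpler and more efficient.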

Your life will be simpler (and more efficient) if you can do all of the GLES work on a single thread. In my experience, having more than one GLES context active at a time is unwise, and you'll save yourself some pain by finding an alternative.

You probably want something more like this:

  • Thread #1 reads the file and feeds frames into a MediaCodec decoder. The decoder sends the output to a SurfaceTexture Surface.
  • Thread #2 has the GLES context. It created the SurfaceTexture that thread #1 is sending the output to. It processes the images and renders the output on the Surface of a MediaCodec encoder.
  • Thread #3, which created the MediaCodec encoder, sits waiting for the encoded output. As output is received it's written to disk. Note that the use of MediaMuxer can stall; see this blog post for more.
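The wiring for that pipeline can be sketched roughly as follows. This is a simplified, hypothetical fragment: EGL setup, format negotiation, the render loop, and error handling are all omitted, and names like `glTextureId` and `inFormat` are assumed to exist.

```java
import android.graphics.SurfaceTexture;
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.view.Surface;

// Thread #3's codec: the encoder, configured to take input from a Surface.
MediaFormat outFormat = MediaFormat.createVideoFormat("video/avc", width, height);
outFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
outFormat.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000);
outFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
outFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
encoder.configure(outFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
Surface encoderInputSurface = encoder.createInputSurface();  // thread #2 renders here

// Thread #2 (the GLES thread): SurfaceTexture backed by a GL texture it owns.
SurfaceTexture st = new SurfaceTexture(glTextureId);
Surface decoderOutputSurface = new Surface(st);  // hand only this to thread #1

// Thread #1: the decoder renders straight into the SurfaceTexture's Surface.
MediaCodec decoder = MediaCodec.createDecoderByType("video/avc");
decoder.configure(inFormat, decoderOutputSurface, null, 0);
```

Note that each codec and the SurfaceTexture stay on the thread that created them; only the `Surface` endpoints cross thread boundaries, which is exactly the point made below.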

In all cases, the only communication between threads (and, under the hood, processes) is done through Surface. The SurfaceTexture and MediaCodec instances are created and used from a single thread; only the producer endpoint (the Surface) is passed around.

One potential trouble point is with flow control -- SurfaceTextures will drop frames if you feed them too quickly. Combining threads #1 and #2 might make sense depending on circumstances.
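One way to handle the flow control is to make the hand-off lock-step: the GLES thread blocks until the frame-available callback fires, then latches the frame with updateTexImage() before the decoder is allowed to render another. A minimal sketch, with illustrative field names (this mirrors the general shape of Grafika's awaitNewImage() helper):

```java
import android.graphics.SurfaceTexture;

// Set once, via surfaceTexture.setOnFrameAvailableListener(...).
private final Object frameLock = new Object();
private boolean frameAvailable;

// Called on an arbitrary thread when the SurfaceTexture has a new frame.
@Override
public void onFrameAvailable(SurfaceTexture st) {
    synchronized (frameLock) {
        frameAvailable = true;
        frameLock.notifyAll();
    }
}

// GLES thread: block until a frame arrives, then latch it.
void awaitNewImage() {
    synchronized (frameLock) {
        while (!frameAvailable) {
            try {
                frameLock.wait(2500);  // time out rather than hang forever
                if (!frameAvailable) {
                    throw new RuntimeException("frame wait timed out");
                }
            } catch (InterruptedException ie) {
                throw new RuntimeException(ie);
            }
        }
        frameAvailable = false;
    }
    surfaceTexture.updateTexImage();  // must run on the GLES thread
}
```

Because the decoder's releaseOutputBuffer() call won't be issued until the GLES thread has consumed the previous frame, nothing gets dropped, at the cost of serializing the two stages.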
