mediacodec

How to solve the Android MediaCodec error after setting “csd-0” and “csd-1”?

Submitted by Anonymous (unverified) on 2019-12-03 01:42:02
Question: I use MediaCodec to encode a video stream, and I need to set "csd-0" and "csd-1" as follows, according to here: byte[] sps = { 0, 0, 0, 1, 103, 100, 0, 40, -84, 52, -59, 1, -32, 17, 31, 120, 11, 80, 16, 16, 31, 0, 0, 3, 3, -23, 0, 0, -22, 96, -108 }; byte[] pps = { 0, 0, 0, 1, 104, -18, 60, -128 }; MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height); format.setByteBuffer("csd-0", ByteBuffer.wrap(sps)); format.setByteBuffer("csd-1", ByteBuffer.wrap(pps)); ... But I got an error: format: {frame-rate=15, height
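Before blaming configure(), it can help to sanity-check the csd buffers themselves. The following is a plain-Java sketch (no Android classes; the helper name is mine): it verifies that each buffer starts with an Annex-B start code and carries the expected NAL type — 7 (SPS) for "csd-0" and 8 (PPS) for "csd-1". A mismatch here is a common reason a format with codec-specific data is rejected.

```java
public class CsdCheck {
    // Returns the NAL unit type of an Annex-B framed buffer (start code + NAL header).
    static int nalType(byte[] csd) {
        if (csd.length < 5 || csd[0] != 0 || csd[1] != 0 || csd[2] != 0 || csd[3] != 1) {
            throw new IllegalArgumentException("missing 00 00 00 01 start code");
        }
        return csd[4] & 0x1F; // low 5 bits of the NAL header are the type
    }

    public static void main(String[] args) {
        byte[] sps = { 0, 0, 0, 1, 103, 100, 0, 40, -84, 52, -59, 1, -32, 17,
                       31, 120, 11, 80, 16, 16, 31, 0, 0, 3, 3, -23, 0, 0, -22, 96, -108 };
        byte[] pps = { 0, 0, 0, 1, 104, -18, 60, -128 };
        System.out.println("csd-0 NAL type = " + nalType(sps)); // 7 -> SPS
        System.out.println("csd-1 NAL type = " + nalType(pps)); // 8 -> PPS
        // On Android you would then attach them to the format:
        // format.setByteBuffer("csd-0", ByteBuffer.wrap(sps));
        // format.setByteBuffer("csd-1", ByteBuffer.wrap(pps));
    }
}
```

For the arrays in the question, 103 (0x67) and 104 (0x68) do decode to types 7 and 8, so the error more likely comes from the other format keys than from the csd buffers.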

Android MediaCodec appears to buffer H264 frames

Submitted by Anonymous (unverified) on 2019-12-03 01:36:02
Question: I'm manually reading an RTP/H264 stream and passing the H264 frames to the Android MediaCodec, using the "markerBit" as a frame boundary. The MediaCodec is tied to an OpenGL texture (SurfaceTexture). In general everything works fine, but the decoder appears to buffer frames: if I put a frame into the decoder, it is not rendered to the texture immediately; only after I put 2-3 more frames into the decoder is the first frame rendered to the texture. I'm implementing against Android 4.4.4. private static final int INFINITE_TIMEOUT = -1; private static
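The observed behavior matches how hardware decoders commonly work: they keep a small internal pipeline, so output lags input by the pipeline depth. The toy model below (plain Java, not Android code; the depth of 3 is an illustrative assumption, not a documented value) reproduces the symptom — the first frame only emerges after several more have been queued.

```java
import java.util.ArrayDeque;

public class PipelineModel {
    private final ArrayDeque<String> pipeline = new ArrayDeque<>();
    private final int depth; // how many frames the decoder holds internally

    public PipelineModel(int depth) { this.depth = depth; }

    /** Feed one frame; returns a decoded frame once the pipeline is full, else null. */
    public String queueInput(String frame) {
        pipeline.addLast(frame);
        return pipeline.size() > depth ? pipeline.removeFirst() : null;
    }

    public static void main(String[] args) {
        PipelineModel decoder = new PipelineModel(3);
        System.out.println(decoder.queueInput("frame0")); // null - buffered
        System.out.println(decoder.queueInput("frame1")); // null
        System.out.println(decoder.queueInput("frame2")); // null
        System.out.println(decoder.queueInput("frame3")); // frame0 finally emerges
    }
}
```

If this latency is unacceptable for a live stream, the usual levers are codec- and vendor-specific (e.g. low-latency stream flags in the H264 bitstream itself); the pipeline depth is not something the MediaCodec API lets you set directly on Android 4.4.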

android-ndk crash in android::MediaCodec?

Submitted by Anonymous (unverified) on 2019-12-03 01:26:01
Question: Can someone help me figure out what the following crash is about? Thanks. I/DEBUG ( 3007): Build fingerprint: 'samsung/zerofltetmo/zerofltetmo:5.1.1/LMY47X/G920TUVU2COF8:user/release-keys' I/DEBUG ( 3007): Revision: '11' I/DEBUG ( 3007): ABI: 'arm' I/DEBUG ( 3007): pid: 19656, tid: 21303, name: MediaCodec_loop >>> com.******.**** <<< I/DEBUG ( 3007): signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x14 I/DEBUG ( 3007): r0 f4a23e40 r1 00000003 r2 ca10ab30 r3 00000000 I/DEBUG ( 3007): r4 ca10ab30 r5 ca10ab60 r6 f4b14ea0 r7 000003f5 I

SurfaceTexture updateTexImage to shared 2 EGLContexts - Problems on Android 4.4

Submitted by 老子叫甜甜 on 2019-12-03 00:50:59
I am referring to this excellent example of how to encode the camera's preview frames directly into an mp4 file: http://bigflake.com/mediacodec/CameraToMpegTest.java.txt I have adapted the code so that I also render the preview image on the screen. Therefore I have something like a GLTextureView with its own EGLContext. This context is then used as the shared EGLContext when I create the EGLContext for the encoder rendering: mEGLContext = EGL14.eglCreateContext(mEGLDisplay, configs[0], sharedContext == null ? EGL14.EGL_NO_CONTEXT : sharedContext, attrib_list, 0); In my

In Android, how to pass a predefined Surface to MediaCodec for encoding?

Submitted by 早过忘川 on 2019-12-02 23:45:15
I have an app that manages its own GLSurfaceView, and now I want to use Android 4.3's new MediaCodec feature that takes a Surface as input. In all the examples I've seen, the Surface is created using MediaCodec.createInputSurface(), and the GL context is then created for this Surface. This feels monolithic and incredibly disruptive to retrofit into a code base that is already stable. Is it possible to use MediaCodec.configure(format, a_predefined_Surface, null, MediaCodec.CONFIGURE_FLAG_ENCODE) instead? That would let me use MediaCodec in a plug-and-play, on-demand way. The fact that MediaCodec

Unable to mux both audio and video

Submitted by 删除回忆录丶 on 2019-12-02 23:41:43
I'm writing an app that records screen capture and audio using MediaCodec. I use MediaMuxer to mux the video and audio into an mp4 file. I successfully managed to write video and audio separately; however, when I try muxing them together live, the result is unexpected: either the audio plays without video, or the video plays right after the audio. My guess is that I'm doing something wrong with the timestamps, but I can't figure out what exactly. I already looked at these examples: https://github.com/OnlyInAmerica/HWEncoderExperiments/tree/audiotest/HWEncoderExperiments/src/main/java/net/openwatch
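A frequent cause of exactly this symptom is stamping the audio and video buffers from two different clocks, so one track's timeline starts long before the other's. A minimal plain-Java sketch of the usual fix (class and method names are mine, not from the question): derive every presentation timestamp for both tracks from one shared base time.

```java
public class SharedClock {
    private final long baseNanos; // captured once, before either encoder starts

    public SharedClock(long baseNanos) { this.baseNanos = baseNanos; }

    /** Microsecond timestamp relative to the shared base - use for BOTH tracks. */
    public long ptsUs(long nowNanos) {
        return (nowNanos - baseNanos) / 1_000L;
    }

    public static void main(String[] args) {
        SharedClock clock = new SharedClock(0L);
        // A video frame captured 33 ms in and an audio chunk captured 23 ms in:
        System.out.println(clock.ptsUs(33_000_000L)); // 33000 us
        System.out.println(clock.ptsUs(23_000_000L)); // 23000 us
        // On Android, nowNanos would typically be System.nanoTime() taken when the
        // buffer was produced, and the result would go into
        // MediaCodec.BufferInfo.presentationTimeUs before MediaMuxer.writeSampleData().
    }
}
```

With a shared base, the muxer sees two tracks whose timestamps interleave on one timeline, instead of one track offset from the other.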

Android Video Circular Buffer with Sound

Submitted by 冷暖自知 on 2019-12-02 19:43:09
I am using Google's open-source example Grafika, specifically its ContinuousCaptureActivity.java. The circular buffer's implementation is demonstrated in this activity, but there is no audio included in the resulting video file. I want to add audio-recording functionality within this activity and add the recorded audio to the video in the same circular-buffered fashion. To achieve this I have explored the MediaCodec library, introduced in Android 4.3+. I have also used MediaMuxer to capture video and audio streams and mux them into a single video. But I am not sure about

Muxing AAC audio with Android's MediaCodec and MediaMuxer

Submitted by 时间秒杀一切 on 2019-12-02 17:41:48
I'm modifying an Android Framework example to package the elementary AAC streams produced by MediaCodec into a standalone .mp4 file. I'm using a single MediaMuxer instance containing one AAC track generated by a MediaCodec instance. However, I always eventually get an error message on a call to mMediaMuxer.writeSampleData(trackIndex, encodedData, bufferInfo): E/MPEG4Writer: timestampUs 0 < lastTimestampUs XXXXX for Audio track When I queue the raw input data in mCodec.queueInputBuffer(...) I provide 0 as the timestamp value, per the framework example (I've also tried using monotonically
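The MPEG4Writer error says audio timestamps must be strictly increasing, so passing 0 for every input buffer fails as soon as the muxer compares consecutive samples. A common fix, sketched below in plain Java (the 44100 Hz sample rate is an assumption), is to derive the timestamp from the frame index: each AAC frame carries 1024 PCM samples, which fixes its duration.

```java
public class AacPts {
    static final int SAMPLES_PER_FRAME = 1024; // PCM samples per AAC frame

    /** Presentation time in microseconds for the n-th AAC frame. */
    static long ptsUs(long frameIndex, int sampleRate) {
        return frameIndex * SAMPLES_PER_FRAME * 1_000_000L / sampleRate;
    }

    public static void main(String[] args) {
        for (long i = 0; i < 4; i++) {
            System.out.println("frame " + i + " -> " + ptsUs(i, 44_100) + " us");
        }
        // frame 0 -> 0, frame 1 -> 23219, frame 2 -> 46439, ... strictly increasing,
        // so writeSampleData() never sees timestampUs < lastTimestampUs.
    }
}
```

Index-derived timestamps are also immune to the jitter you get from stamping buffers with a wall-clock read at enqueue time.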

What governs playback speed when encoding with Android's MediaCodec + mp4parser?

Submitted by 自古美人都是妖i on 2019-12-02 16:46:50
Question: I'm trying to record, encode, and finally create a short movie on Android (using API 16) with a combination of MediaCodec and Mp4Parser (to encapsulate into .mp4). Everything works just fine, except for the duration of the .mp4: it's always 3 seconds long, and runs at about twice the 'right' speed. The input to the encoder is 84 frames (taken 100 ms apart). The last frame sets the 'end of stream' flag. I set the presentation time for each frame in queueInputBuffer. I've tried to tweak every
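In an mp4, wall-clock playback speed is governed entirely by the presentation timestamps attached to the frames, not by how fast they were captured or encoded. A plain-Java sketch of what the timestamps should look like for this case (the 100 ms spacing is taken from the question; the class name is mine): 84 frames 100 ms apart should span 8.3 seconds.

```java
public class PtsPlan {
    static final long FRAME_SPACING_US = 100_000L; // frames taken 100 ms apart

    static long ptsUs(int frameIndex) {
        return frameIndex * FRAME_SPACING_US;
    }

    public static void main(String[] args) {
        int frames = 84;
        long lastPts = ptsUs(frames - 1);
        System.out.println("last pts = " + lastPts + " us");          // 8300000 us
        System.out.println("duration ~= " + lastPts / 1_000_000.0 + " s"); // ~8.3 s
        // If the movie instead comes out ~3 s long and roughly twice too fast,
        // the timestamps actually reaching the container are smaller or rescaled
        // (e.g. milliseconds passed where microseconds are expected, or a track
        // timescale mismatch in the mp4 wrapper) - log the values at the muxing
        // step to see what the container really records.
    }
}
```

So the debugging question is not "why is playback fast?" but "which component rewrote or misinterpreted my pts between queueInputBuffer and the mp4 track?".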

Android Audio/Video In Depth, Part 19: Building a video player with ijkplayer (updated 2018-3-12: added a small floating window)

Submitted by 喜欢而已 on 2019-12-02 14:43:56
Project repo: https://github.com/979451341/Myijkplayer A while back I found building a video player with FFmpeg quite hard: playback itself worked, but pausing and seeking by dragging the progress bar, even once I got them working, were laggy. So now I'm studying ijkplayer, currently the best-known video player framework on mobile. It is built on FFmpeg, SDL, and native Android APIs such as MediaCodec. It ships without a playback UI, which we have to build ourselves, so here I'll build a video player on top of ijkplayer and briefly touch on its source code; a dedicated post on the ijkplayer source will follow later. 1. First, how to use ijkplayer. I pull ijkplayer in by adding dependencies:
implementation 'tv.danmaku.ijk.media:ijkplayer-java:0.8.8'
implementation 'tv.danmaku.ijk.media:ijkplayer-armv7a:0.8.8'
implementation 'tv.danmaku.ijk.media:ijkplayer-armv5:0.8.8'
implementation 'tv.danmaku.ijk.media:ijkplayer-arm64:0.8.8'
implementation 'tv.danmaku