mediacodec

Is there a way to get the total number of video frames on Android?

匆匆过客 Submitted on 2020-01-04 19:49:30
Question: I am currently extracting, decoding, editing and re-encoding a video on Android using MediaCodec and MediaExtractor. In the course of this process I would like to give users some information on the progress. I am already counting how many frames have been extracted, decoded and encoded. In order to compute a percentage and show users how far along the process is, I need the total number of frames in the original stream. However, I cannot find a method for this on MediaExtractor.
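MediaExtractor itself has no frame-count query, but the video track's MediaFormat often carries a duration and a frame rate, which is enough for a progress estimate. A minimal sketch, assuming those optional keys are present in the container (they are not guaranteed to be):

    import android.media.MediaExtractor;
    import android.media.MediaFormat;
    import java.io.IOException;

    public class FrameCountEstimator {
        // Rough estimate: duration (us) * frame rate. Both keys are optional,
        // so the caller must handle the -1 "unknown" result.
        public static long estimateFrameCount(String path) throws IOException {
            MediaExtractor extractor = new MediaExtractor();
            extractor.setDataSource(path);
            try {
                for (int i = 0; i < extractor.getTrackCount(); i++) {
                    MediaFormat format = extractor.getTrackFormat(i);
                    String mime = format.getString(MediaFormat.KEY_MIME);
                    if (mime != null && mime.startsWith("video/")
                            && format.containsKey(MediaFormat.KEY_DURATION)
                            && format.containsKey(MediaFormat.KEY_FRAME_RATE)) {
                        long durationUs = format.getLong(MediaFormat.KEY_DURATION);
                        int frameRate = format.getInteger(MediaFormat.KEY_FRAME_RATE);
                        return durationUs * frameRate / 1_000_000L;
                    }
                }
            } finally {
                extractor.release();
            }
            return -1; // keys not present in this container
        }
    }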

Android MediaCodec API not working in Child Thread

試著忘記壹切 Submitted on 2020-01-02 19:32:10
Question: In Android, using the MediaCodec API (MediaCodecList.getCodecCount()) on the main thread seems to work fine with no issues. But when I call the same API from a child thread, the application crashes. The crash log is shown below: A/libc(18571): Fatal signal 11 (SIGSEGV), code 1, fault addr 0xe0 in tid 18600 The above scenario was tested on an Android 5.0.2 device. Why does this happen, and is there any way to resolve it? Source: https://stackoverflow.com/questions/32884208/android
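One commonly suggested workaround (unverified here, and an assumption rather than a confirmed fix) is that the crash disappears when the worker thread owns a Looper, since parts of the media framework post callbacks to the calling thread. A sketch using a HandlerThread instead of a bare Thread:

    import android.media.MediaCodecList;
    import android.os.Handler;
    import android.os.HandlerThread;
    import android.util.Log;

    public class CodecQuery {
        // Sketch: query codec info from a worker thread that owns a Looper.
        public static void queryCodecsAsync() {
            HandlerThread worker = new HandlerThread("codec-query");
            worker.start();
            new Handler(worker.getLooper()).post(new Runnable() {
                @Override
                public void run() {
                    // getCodecCount() is deprecated since API 21 but still callable
                    int count = MediaCodecList.getCodecCount();
                    Log.d("CodecQuery", "codec count = " + count);
                    worker.quitSafely();
                }
            });
        }
    }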

MediaMuxer.nativeWriteSampleData periodically blocks for about one second during video recording

隐身守侯 Submitted on 2020-01-02 16:53:24
Question: I am doing Android video recording using MediaCodec + MediaMuxer, and I can now record video and generate an MP4 file that can be played. The problem is that the recorded video freezes for about one second from time to time. So I launched traceview and found that MediaMuxer.nativeWriteSampleData() causes the problem. Sometimes this function is very fast and returns within a few microseconds, but sometimes it is very slow and takes about one second or so, and the video blocks
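A typical mitigation is to keep writeSampleData() off the encoder drain thread, so an occasional slow write does not back up the codec. A hypothetical sketch (class and method names are illustrative, not from the question), where the drain loop copies each encoded buffer into a queue and a dedicated thread feeds the MediaMuxer:

    import android.media.MediaCodec;
    import android.media.MediaMuxer;
    import java.nio.ByteBuffer;
    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;

    class MuxerWriterThread extends Thread {
        private static class Sample {
            final ByteBuffer data;
            final MediaCodec.BufferInfo info;
            Sample(ByteBuffer data, MediaCodec.BufferInfo info) { this.data = data; this.info = info; }
        }

        private final MediaMuxer muxer;
        private final int trackIndex;
        private final BlockingQueue<Sample> queue = new LinkedBlockingQueue<>();
        private volatile boolean running = true;

        MuxerWriterThread(MediaMuxer muxer, int trackIndex) {
            this.muxer = muxer;
            this.trackIndex = trackIndex;
        }

        // Called from the encoder drain loop: copy the encoded bytes, release the
        // codec buffer immediately, and let this thread absorb any muxer stall.
        void enqueue(ByteBuffer encoded, MediaCodec.BufferInfo info) {
            ByteBuffer copy = ByteBuffer.allocate(info.size);
            encoded.position(info.offset);
            encoded.limit(info.offset + info.size);
            copy.put(encoded);
            copy.flip();
            MediaCodec.BufferInfo infoCopy = new MediaCodec.BufferInfo();
            infoCopy.set(0, info.size, info.presentationTimeUs, info.flags);
            queue.offer(new Sample(copy, infoCopy));
        }

        @Override
        public void run() {
            try {
                while (running || !queue.isEmpty()) {
                    Sample s = queue.take();
                    muxer.writeSampleData(trackIndex, s.data, s.info);
                }
            } catch (InterruptedException ignored) {
                // a real implementation would flush any remaining queued samples here
            }
        }

        void shutdown() {
            running = false;
            interrupt();
        }
    }

The trade-off is extra buffer copies and memory use while the muxer catches up, in exchange for a drain loop that never waits on disk I/O.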

How to stream data from MediaCodec to AudioTrack with Xamarin for Android

可紊 Submitted on 2020-01-02 05:38:07
Question: I'm trying to decode an MP3 file and stream it to AudioTrack. It all works fine but causes a lot of GC on the Java side. I've made sure not to allocate memory in my play/stream loop, and I suspect the ByteBuffer.Get(byte[], int, int) binding of allocating a temporary Java array. Can anyone confirm and/or show a better way of feeding data from MediaCodec to AudioTrack? (I know API 21 introduced AudioTrack.write(ByteBuffer, ...)) Thanks. Here is what I do: byte[] audioBuffer = new byte[...]; ... ByteBuffer
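For reference, this is what the underlying Android call looks like in plain Java on API 21+, where the decoder's output ByteBuffer is handed directly to AudioTrack.write(ByteBuffer, int, int) with no intermediate byte[]; the Xamarin binding mirrors this API. A sketch, assuming the codec and AudioTrack are already configured with matching PCM parameters:

    import android.media.AudioTrack;
    import android.media.MediaCodec;
    import java.nio.ByteBuffer;

    public class DecoderToAudioTrack {
        // Sketch (API 21+): pass the codec's output ByteBuffer straight to
        // AudioTrack.write(ByteBuffer, ...) so no per-buffer byte[] is allocated.
        static void drainToAudioTrack(MediaCodec codec, AudioTrack audioTrack,
                                      MediaCodec.BufferInfo info) {
            int index = codec.dequeueOutputBuffer(info, 10_000 /* timeout, us */);
            if (index >= 0) {
                ByteBuffer pcm = codec.getOutputBuffer(index); // API 21+
                if (pcm != null && info.size > 0) {
                    pcm.position(info.offset);
                    pcm.limit(info.offset + info.size);
                    audioTrack.write(pcm, info.size, AudioTrack.WRITE_BLOCKING);
                }
                codec.releaseOutputBuffer(index, false /* not rendered to a Surface */);
            }
        }
    }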

MediaCodec createInputSurface

做~自己de王妃 Submitted on 2020-01-02 02:02:13
Question: I want to use MediaCodec to encode a Surface into H.264. With API 18, there is a way to encode content from a surface by calling createInputSurface() and then drawing on that surface. I get an IllegalStateException from createInputSurface(). Here is additional logcat output: D/H264Encoder(17570): MediaFormat: {frame-rate=25, bitrate=1000000, height=600, mime=video/avc, color-format=19, i-frame-interval=75, width=800} D/NvOsDebugPrintf( 125): NvMMLiteOpen : Block : BlockType = 4 D/NvOsDebugPrintf(
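One detail that stands out in the logged format is color-format=19 (YUV420Planar). For surface input the encoder is normally configured with COLOR_FormatSurface, and createInputSurface() must be called after configure() but before start(). A sketch of that configuration order, reusing the other parameters from the log (whether this resolves the exception on this particular device is an assumption):

    import android.media.MediaCodec;
    import android.media.MediaCodecInfo;
    import android.media.MediaFormat;
    import android.view.Surface;

    public class SurfaceEncoderSetup {
        // Sketch: configure() with COLOR_FormatSurface, then createInputSurface(),
        // then start(). Calling createInputSurface() in any other state throws
        // IllegalStateException.
        static Surface configureSurfaceEncoder(MediaCodec encoder, int width, int height) {
            MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
            format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                    MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
            format.setInteger(MediaFormat.KEY_BIT_RATE, 1_000_000);
            format.setInteger(MediaFormat.KEY_FRAME_RATE, 25);
            format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 75);
            encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            Surface inputSurface = encoder.createInputSurface(); // valid only between configure() and start()
            encoder.start();
            return inputSurface; // draw into this with EGL, then drain the encoder
        }
    }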

Recoding one H.264 video to another using OpenGL surfaces is very slow on my Android device

喜你入骨 Submitted on 2020-01-01 19:58:53
Question: I'm developing a feature that transforms one video into another with additional effects applied to each frame. I decided to use OpenGL ES for applying the effects. My input and output videos are MP4 files using the H.264 codec. I use the MediaCodec API (Android API 18+) to decode H.264 into an OpenGL texture, then draw onto the surface using this texture with my shader. I thought that using MediaCodec with H.264 would give hardware decoding on Android and would be fast, but it turns out that it is not.
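For what it's worth, the fast path generally requires the decoder to output directly to a Surface backed by a SurfaceTexture, so decoded frames stay on the GPU as an external OES texture instead of being copied through ByteBuffers. A sketch of that setup (the OES texture id is assumed to be created beforehand on the caller's GL thread):

    import android.graphics.SurfaceTexture;
    import android.media.MediaCodec;
    import android.media.MediaFormat;
    import android.view.Surface;
    import java.io.IOException;

    public class DecoderToTexture {
        // Sketch: decode straight into a SurfaceTexture so the decoded frame stays
        // on the GPU as a GL_TEXTURE_EXTERNAL_OES texture; no CPU pixel copy.
        // oesTextureId is assumed to come from glGenTextures on the GL thread.
        static MediaCodec createDecoder(MediaFormat videoFormat, int oesTextureId)
                throws IOException {
            SurfaceTexture surfaceTexture = new SurfaceTexture(oesTextureId);
            Surface outputSurface = new Surface(surfaceTexture);
            String mime = videoFormat.getString(MediaFormat.KEY_MIME);
            MediaCodec decoder = MediaCodec.createDecoderByType(mime);
            decoder.configure(videoFormat, outputSurface, null, 0);
            decoder.start();
            // The caller must keep surfaceTexture and outputSurface alive, release
            // output buffers with render=true, and call updateTexImage() on the GL
            // thread before sampling the texture in the shader.
            return decoder;
        }
    }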

MediaCodec video streaming from camera: wrong orientation & color

早过忘川 Submitted on 2020-01-01 03:44:47
Question: I'm trying to stream video captured directly from the camera on Android devices. So far I have been able to capture each frame from the Android camera's onPreviewFrame(byte[] data, Camera camera) callback, encode the data, and then successfully decode the data and show it on the surface. I used Android's MediaCodec for the encoding and decoding. But the color and the orientation of the video are not correct [rotated 90 degrees]. After searching for a while I found this YV12toYUV420PackedSemiPlanar function
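The conversion the question refers to is usually written like the sketch below: YV12 stores the Y plane followed by the Cr (V) and Cb (U) planes, while the packed semi-planar layout interleaves Cb/Cr after the Y plane. This version assumes the preview buffer has no row padding (stride == width), which is not true on every device:

    public class YuvConvert {
        // Sketch: YV12 (Y, then V, then U planes) -> YUV420 semi-planar
        // (Y, then interleaved U/V). Assumes stride == width.
        static byte[] yv12ToYuv420SemiPlanar(byte[] input, int width, int height) {
            int frameSize = width * height;
            int quarter = frameSize / 4;
            byte[] output = new byte[input.length];
            // Y plane copies straight through
            System.arraycopy(input, 0, output, 0, frameSize);
            for (int i = 0; i < quarter; i++) {
                output[frameSize + 2 * i]     = input[frameSize + quarter + i]; // Cb (U)
                output[frameSize + 2 * i + 1] = input[frameSize + i];           // Cr (V)
            }
            return output;
        }
    }

Note that this only fixes the chroma layout; the 90-degree rotation is a separate issue, usually handled either by rotating the pixels before encoding or by signalling the rotation in the container/stream.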

Android MediaCodec decode h264 raw frame

匆匆过客 Submitted on 2019-12-31 22:24:05
Question: I am using the Android MediaCodec API to decode H.264 frames. I can decode and render the frames on the view. My problem is that the decoder misses lots of frames, especially the first few frames. DecodeMediaCodec.dequeueOutputBuffer() returns -1. Out of about 150 H.264 frames, only 43 were decoded. I cannot find where the problem is. Here is my code: /** * init decoder */ private void initDecodeMediaCodec() { mDecodeMediaCodec = MediaCodec.createDecoderByType(MIME_TYPE); MediaFormat format = MediaFormat
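Note that -1 from dequeueOutputBuffer() is MediaCodec.INFO_TRY_AGAIN_LATER, which only means no output is ready yet, not that the frame was lost; codecs typically need several input buffers queued before the first output appears. A sketch of a drain loop that treats -1 as "come back later" rather than a dropped frame:

    import android.media.MediaCodec;
    import android.util.Log;

    public class DecoderDrain {
        private static final long TIMEOUT_US = 10_000;

        // Sketch: keep draining until the codec reports INFO_TRY_AGAIN_LATER,
        // then go feed more input and call this again.
        static void drainDecoder(MediaCodec decoder, MediaCodec.BufferInfo info) {
            while (true) {
                int index = decoder.dequeueOutputBuffer(info, TIMEOUT_US);
                if (index == MediaCodec.INFO_TRY_AGAIN_LATER) {
                    break; // nothing ready yet; not an error and not a lost frame
                } else if (index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                    Log.d("DecoderDrain", "new format: " + decoder.getOutputFormat());
                } else if (index >= 0) {
                    // render to the Surface passed to configure(), if any
                    decoder.releaseOutputBuffer(index, true /* render */);
                }
            }
        }
    }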

MediaCodec with Surface Input: Recording in background

我是研究僧i Submitted on 2019-12-31 22:19:55
Question: I'm working on a video encoding application which I want to keep from stopping when the hosting Activity enters the background or the screen cycles off/on. The architecture of my encoder is derived from the excellent CameraToMpegTest example, with the addition of displaying camera frames on a GLSurfaceView (see the GitHub links below). I'm currently performing background recording with a two-state solution: when the hosting Activity is in the foreground, encode one video frame on each call to

Decoding a video and encoding it again with MediaCodec produces a corrupted file

三世轮回 Submitted on 2019-12-29 06:28:58
Question: I am trying to implement https://android.googlesource.com/platform/cts/+/jb-mr2-release/tests/tests/media/src/android/media/cts/DecodeEditEncodeTest.java but modifying the source to use an MP4 video file. The MIME type is video/avc, the bitrate is 288 kbps, the I-frame interval is 100, the width is 176, and the height is 144. The file size is 6 MB. When I decode the video and render the frame to the output surface, I can save the frame to a bitmap and the frame looks great. But at the end, after encoding (with the same parameters
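As a rough sketch of the modification, the extractor's own track format (including the csd-0/csd-1 codec-specific data) should be passed unchanged to the decoder, while the encoder is configured separately with the desired output parameters. Hypothetical setup code, not taken from DecodeEditEncodeTest itself:

    import android.media.MediaCodec;
    import android.media.MediaExtractor;
    import android.media.MediaFormat;
    import android.view.Surface;
    import java.io.IOException;

    public class Mp4DecoderSetup {
        // Sketch: select the video track of an MP4 and configure the decoder with
        // the extractor's format verbatim. The caller keeps the same extractor and
        // drives readSampleData()/advance() into the decoder's input buffers.
        static MediaCodec createDecoderForTrack(MediaExtractor extractor, Surface outputSurface)
                throws IOException {
            for (int i = 0; i < extractor.getTrackCount(); i++) {
                MediaFormat format = extractor.getTrackFormat(i);
                String mime = format.getString(MediaFormat.KEY_MIME);
                if (mime != null && mime.startsWith("video/")) {
                    extractor.selectTrack(i);
                    MediaCodec decoder = MediaCodec.createDecoderByType(mime);
                    decoder.configure(format, outputSurface, null, 0);
                    decoder.start();
                    return decoder;
                }
            }
            throw new IOException("no video track found");
        }
    }

The extractor would be created and given the MP4 path by the caller (extractor.setDataSource(path)) before this is invoked; corrupted output after re-encoding is then usually a question of the encoder's sample timestamps, flags, and the format fed to MediaMuxer, not of this decode path.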