MediaCodec

Recoding one H.264 video to another using OpenGL surfaces is very slow on my Android

回眸只為那壹抹淺笑 submitted on 2019-12-04 20:44:02
I'm developing a feature that translates one video into another, applying additional effects to each frame. I decided to use OpenGL ES to apply the effects. My input and output videos are MP4 files using the H.264 codec. I use the MediaCodec API (Android API 18+) to decode H.264 into an OpenGL texture, then draw onto a surface using this texture with my shader. I assumed that using MediaCodec with H.264 would give me hardware decoding on Android and would be fast. It appears that it is not: re-encoding a small 432x240, 15-second video took 28 seconds of total time! Please take a look at my code…
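A common cause of this kind of slowdown is pulling decoded frames back through ByteBuffers instead of keeping them on the GPU. As a hedged configuration sketch (not the asker's code; class and method names here are hypothetical), the fast path configures the decoder with a SurfaceTexture-backed output Surface so each frame lands directly in an OES texture:

```java
// Sketch: decode directly to a SurfaceTexture-backed Surface (API 18+).
// textureId is assumed to be a GL_TEXTURE_EXTERNAL_OES texture created on
// the EGL context that will run the effect shader.
import android.graphics.SurfaceTexture;
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;

public class SurfaceDecoderSketch {
    public MediaCodec createDecoder(MediaFormat inputFormat, int textureId) throws Exception {
        SurfaceTexture surfaceTexture = new SurfaceTexture(textureId);
        Surface outputSurface = new Surface(surfaceTexture);
        MediaCodec decoder = MediaCodec.createDecoderByType(
                inputFormat.getString(MediaFormat.KEY_MIME));
        // Passing a Surface here keeps frames in GPU memory; requesting
        // ByteBuffer output instead forces a slow copy through the CPU.
        decoder.configure(inputFormat, outputSurface, null, 0);
        decoder.start();
        return decoder;
    }
}
```

With this setup, releaseOutputBuffer(index, true) renders straight into the texture and no pixel data ever crosses back to the CPU.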

Inconsistent video rotation when using MediaCodec

筅森魡賤 submitted on 2019-12-04 19:22:54
I have two devices, a Nexus 7 (Android 5) and a Galaxy S3 (Android 4.3). On both devices I recorded a video in portrait mode and saved it with a rotation hint of 90 degrees. This is the correct orientation hint, because when the video is played with the default media player the orientation is fine on both devices. I can even copy the video from the Nexus to the Galaxy and the orientation is still fine when I play the file. However, when I decode the video using the MediaCodec API I get problems with the rotation. When I display the video data I get from MediaCodec, the video on the Nexus is correct…
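MediaCodec generally does not apply the container's rotation hint itself; the decoded frames come out unrotated, and the app is expected to read the hint (e.g. via MediaMetadataRetriever.METADATA_KEY_VIDEO_ROTATION, or MediaFormat.KEY_ROTATION on API 23+) and rotate at display time, which would explain device-to-device differences. A minimal pure-Java helper (hypothetical names) for the size bookkeeping:

```java
// Pure-Java helper: given the container's rotation hint, report the
// dimensions the decoded frames should be displayed at. MediaCodec
// outputs frames unrotated; the app applies the hint itself (e.g. as a
// texture transform) when rendering. Assumes hint is 0/90/180/270, as
// MediaMetadataRetriever.METADATA_KEY_VIDEO_ROTATION reports it.
public class RotationHint {
    public static int[] displaySize(int codedWidth, int codedHeight, int rotationDegrees) {
        if (rotationDegrees == 90 || rotationDegrees == 270) {
            return new int[] { codedHeight, codedWidth }; // swap for portrait
        }
        return new int[] { codedWidth, codedHeight };
    }
}
```

For example, a 1920x1080 stream with a 90-degree hint should be displayed at 1080x1920.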

Android MediaCodec for HEVC

爱⌒轻易说出口 submitted on 2019-12-04 15:14:09
I'm looking into using the Android MediaCodec class to decode HEVC. Are there any examples of projects that do this? At present I configure the decoder with the following: AMEDIAFORMAT_KEY_MIME: "video/hevc"; AMEDIAFORMAT_KEY_MAX_HEIGHT: 4320; AMEDIAFORMAT_KEY_MAX_WIDTH: 8192; AMEDIAFORMAT_KEY_HEIGHT: 1600; AMEDIAFORMAT_KEY_WIDTH: 3840; AMEDIAFORMAT_KEY_FRAME_RATE: 24; AMEDIAFORMAT_KEY_PUSH_BLANK_BUFFERS_ON_STOP: 1. With this setup I get no video output. I've also tried setting csd-0, csd-1 and csd-2 to the VPS, SPS and PPS respectively, but have had no luck. I get the following error in the…
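One detail worth checking: per the MediaCodec documentation, "video/hevc" expects a single codec-specific-data buffer, with VPS, SPS and PPS all concatenated into csd-0, rather than the per-parameter-set split used for AVC (csd-0 = SPS, csd-1 = PPS), so spreading them across three keys may itself be the problem. A configuration sketch of the equivalent Java-side setup (the NDK AMediaFormat calls mirror it; names other than the API's are hypothetical):

```java
// Sketch: Java-side HEVC decoder setup. For "video/hevc" the VPS, SPS
// and PPS are all concatenated into a single "csd-0" buffer (unlike
// "video/avc", which splits SPS into csd-0 and PPS into csd-1).
import android.media.MediaCodec;
import android.media.MediaFormat;
import java.nio.ByteBuffer;

public class HevcDecoderSketch {
    public MediaCodec create(byte[] vpsSpsPps, int width, int height) throws Exception {
        MediaFormat format = MediaFormat.createVideoFormat(
                MediaFormat.MIMETYPE_VIDEO_HEVC, width, height);
        // vpsSpsPps: VPS + SPS + PPS NAL units, start codes included, in order.
        format.setByteBuffer("csd-0", ByteBuffer.wrap(vpsSpsPps));
        MediaCodec decoder = MediaCodec.createDecoderByType(
                MediaFormat.MIMETYPE_VIDEO_HEVC);
        decoder.configure(format, null, null, 0); // or pass an output Surface
        decoder.start();
        return decoder;
    }
}
```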

Converting specialized NV12 video frames to RGB

夙愿已清 submitted on 2019-12-04 14:02:34
I have an H.264 stream that's decoded using an Android MediaCodec. When I query the output MediaFormat, the color format is 2141391875 (0x7FA30C03). Apparently, that's a specialized NV12 variant known as HAL_PIXEL_FORMAT_NV12_ADRENO_TILED. This is on a Nexus 7 (2013). I want to take this data and convert it to RGB so I can create a Bitmap. I've found Stack Overflow posts for converting other formats to RGB, but not this one. I've tried the code from those other posts, but the result is just streaks of color. (To view the Bitmap, I draw on the Canvas associated with a Surface, and also write it out as a JPEG -- it…
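Generic NV12-to-RGB code will indeed produce streaks here, because the Adreno variant stores the planes in a vendor-specific tiled order that must be untiled first; the color math itself, though, is ordinary NV12. A pure-Java sketch of that final conversion step (hypothetical helper; assumes the data has already been de-tiled, and assumes BT.601 coefficients, which may not match the stream):

```java
// Pure-Java NV12 -> ARGB_8888 conversion using BT.601 coefficients.
// NV12 layout: width*height Y bytes, then interleaved UV pairs at half
// resolution. This assumes plain NV12 -- the Adreno-tiled variant
// (0x7FA30C03) must be de-tiled into this layout first.
public class Nv12ToRgb {
    public static int[] convert(byte[] nv12, int width, int height) {
        int[] argb = new int[width * height];
        int uvBase = width * height;
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                int yv = nv12[y * width + x] & 0xFF;
                int uvOffset = uvBase + (y / 2) * width + (x / 2) * 2;
                int u = (nv12[uvOffset] & 0xFF) - 128;     // Cb comes first in NV12
                int v = (nv12[uvOffset + 1] & 0xFF) - 128; // then Cr
                int r = clamp(yv + (int) (1.402f * v));
                int g = clamp(yv - (int) (0.344f * u + 0.714f * v));
                int b = clamp(yv + (int) (1.772f * u));
                argb[y * width + x] = 0xFF000000 | (r << 16) | (g << 8) | b;
            }
        }
        return argb;
    }
    private static int clamp(int c) { return c < 0 ? 0 : (c > 255 ? 255 : c); }
}
```

The resulting int[] can be handed to Bitmap.createBitmap(pixels, width, height, Bitmap.Config.ARGB_8888).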

Android MediaCodec slower in async-mode than in synchronous mode?

∥☆過路亽.° submitted on 2019-12-04 13:14:06
Question: again a question of mine regarding Android's MediaCodec class. I have successfully managed to decode raw H.264 content and display the result in two TextureViews. The H.264 stream comes from a server that is running an OpenGL scene. The scene has a camera and is therefore responsive to user input. To further reduce the latency between an input on the server and the actual result on the smartphone, I was thinking about using MediaCodec's async mode. Here is how I set up both variants, synchronous…
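For reference, switching to async mode is mostly a matter of installing a MediaCodec.Callback before configure(); whether it actually reduces latency varies by device. A hedged setup sketch (handler/threading details omitted; class name hypothetical):

```java
// Sketch: MediaCodec async mode (API 21+). setCallback() must be
// installed before configure(); buffers are then delivered via
// callbacks instead of being polled with dequeue*Buffer().
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;

public class AsyncDecoderSketch {
    public void setup(MediaCodec decoder, MediaFormat format, Surface surface) {
        decoder.setCallback(new MediaCodec.Callback() {
            @Override public void onInputBufferAvailable(MediaCodec codec, int index) {
                // fill codec.getInputBuffer(index) with the next H.264
                // access unit, then codec.queueInputBuffer(...)
            }
            @Override public void onOutputBufferAvailable(MediaCodec codec, int index,
                    MediaCodec.BufferInfo info) {
                codec.releaseOutputBuffer(index, true); // render to the surface
            }
            @Override public void onOutputFormatChanged(MediaCodec codec, MediaFormat fmt) { }
            @Override public void onError(MediaCodec codec, MediaCodec.CodecException e) { }
        });
        decoder.configure(format, surface, null, 0);
        decoder.start();
    }
}
```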

MediaCodec simultaneous encoding and decoding

倖福魔咒の submitted on 2019-12-04 12:45:20
I am trying to apply effects to the frames of a video using the GPU and then re-encode those frames into a new result video. In the interest of performance I have implemented the following flow: there are three different threads, each with its own OpenGL context. These contexts are set up in such a way that they share textures between them. Thread 1 extracts frames from the video and holds them in GPU memory as textures, similar to this example. Thread 2 processes the textures using a modified version of GPUImage that also outputs textures in GPU memory. Finally, thread 3 writes the…
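The texture sharing the three threads depend on comes down to creating each extra EGL context with the first one as its share context. A minimal setup sketch using EGL14 (error checking omitted; `primary` is assumed to be the first thread's context):

```java
// Sketch: creating a second EGL context that shares texture objects
// with an existing one (EGL14, API 17+). Each worker thread makes its
// own shared context current; texture names created on one context are
// then usable on the others (with appropriate synchronization).
import android.opengl.EGL14;
import android.opengl.EGLConfig;
import android.opengl.EGLContext;
import android.opengl.EGLDisplay;

public class SharedContextSketch {
    public static EGLContext createShared(EGLDisplay display, EGLConfig config,
                                          EGLContext primary) {
        int[] attribs = { EGL14.EGL_CONTEXT_CLIENT_VERSION, 2, EGL14.EGL_NONE };
        // Passing the primary context (instead of EGL_NO_CONTEXT) puts the
        // new context in the same share group.
        return EGL14.eglCreateContext(display, config, primary, attribs, 0);
    }
}
```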

Raw H.264 stream output by MediaCodec not playable

杀马特。学长 韩版系。学妹 submitted on 2019-12-04 11:47:00
I am creating a raw H.264 stream output by MediaCodec. The problem is that the output file is not playable in the default Android player (API 16). How can it be that Android exports a file that is playable only in VLC on a PC, not in its own player? Maybe something is wrong with my code? My video is 384x288. public class AvcEncoder { private MediaCodec mediaCodec; private BufferedOutputStream outputStream; private File f; public AvcEncoder(int w, int h, String file_name) { f = new File(file_name + ".mp4"); try { outputStream = new BufferedOutputStream(new FileOutputStream(f)); } catch (Exception e){ e…
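The likely explanation: MediaCodec's raw output is an Annex-B elementary stream, and giving the file a .mp4 extension does not wrap it in an MP4 container. The stock player needs a container (which is what MediaMuxer produces), while VLC can sniff and play the bare stream. A pure-Java check (hypothetical helper) that distinguishes the two:

```java
// Pure-Java sanity check: a raw H.264 elementary stream starts with an
// Annex-B start code (00 00 00 01 or 00 00 01), not an MP4 box header.
// Such a stream must be wrapped (e.g. with MediaMuxer) before the stock
// Android player can open it.
public class StreamCheck {
    public static boolean looksLikeRawAnnexB(byte[] head) {
        if (head.length >= 4 && head[0] == 0 && head[1] == 0
                && head[2] == 0 && head[3] == 1) return true;
        return head.length >= 3 && head[0] == 0 && head[1] == 0 && head[2] == 1;
    }
}
```

A real MP4, by contrast, begins with a box size followed by "ftyp" -- which is what the default player's sniffer looks for.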

MediaCodec and 24 bit PCM

时间秒杀一切 submitted on 2019-12-04 11:24:33
Question: I am successfully using MediaCodec to decode audio; however, when I load a file with 24-bit samples, I have no way of knowing that this has occurred. Since the application was assuming 16-bit samples, it fails. When I print the MediaFormat, I see {mime=audio/raw, durationUs=239000000, bits-format=6, channel-count=2, channel-mask=0, sample-rate=96000}. I assume that "bits-format" would be a hint, but this key is not declared in the API, and is not actually emitted when the output format…
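Since "bits-format" is undocumented (the official MediaFormat.KEY_PCM_ENCODING key only arrived in API 24), one pragmatic workaround once 24-bit data is detected is to down-convert it before handing it to a 16-bit pipeline. A pure-Java sketch of the conversion (hypothetical helper; assumes packed little-endian samples):

```java
// Pure-Java down-conversion of packed little-endian 24-bit PCM samples
// to 16-bit, by keeping the top 16 bits of each sample.
public class Pcm24To16 {
    public static short[] convert(byte[] in) {
        short[] out = new short[in.length / 3];
        for (int i = 0; i < out.length; i++) {
            int v = (in[i * 3] & 0xFF)
                  | ((in[i * 3 + 1] & 0xFF) << 8)
                  | ((in[i * 3 + 2] & 0xFF) << 16);
            v = (v << 8) >> 8;          // sign-extend the 24-bit value
            out[i] = (short) (v >> 8);  // keep the high 16 bits
        }
        return out;
    }
}
```

For example, the 24-bit sample 0x123456 (stored little-endian as 56 34 12) becomes the 16-bit sample 0x1234.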

MediaExtractor.setDataSource throws IOException “failed to instantiate extractor”

主宰稳场 submitted on 2019-12-04 07:46:17
I'm on Android 4.2 and calling MediaExtractor.setDataSource, and it sometimes throws an IOException with the message "failed to instantiate extractor". I've found where this is thrown in the C++ implementation, but that hasn't helped. Other people with the same problem, with either no answer or an answer that doesn't help me: "android.media.MediaExtractor. Anyone got this beast to work?"; ""Failed to instantiate extractor" exception"; "media extractor show "failed to instantiate extractor""; "Failed to instantiate mediaextractor when using setDataSource()". In a desperate attempt to figure this out I've written the…
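"Failed to instantiate extractor" is usually thrown when none of the platform's format sniffers recognizes the data, so it tends to point at the file (or at a file descriptor with the wrong offset/length) rather than at the API call itself. A cheap pure-Java pre-check for the common MP4 case (hypothetical helper):

```java
// Pure-Java pre-check: an MP4/ISO-BMFF file carries the four bytes
// "ftyp" at offset 4 (right after the 32-bit size of the first box).
// If this is absent, "failed to instantiate extractor" is likely
// complaining about the container, not about how setDataSource was called.
public class Mp4Sniff {
    public static boolean hasFtypBox(byte[] head) {
        return head.length >= 8
                && head[4] == 'f' && head[5] == 't'
                && head[6] == 'y' && head[7] == 'p';
    }
}
```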

Audio file captured by MediaRecorder is broken after it is sent to server using Retrofit 2

只愿长相守 submitted on 2019-12-04 07:39:24
My app records an audio clip and sends it to the server using Retrofit 2 after the recording is completed. The file is received on the server, but it is broken; what I mean by broken is that it cannot be played. I use the following URL (example: mydomain.co/audio/myaudio.mp4) to play the audio clip. I tried the same URL with another audio file uploaded using Postman, and that file plays successfully. Besides, even downloading the audio clip captured by Android via FileZilla gives the same broken file. This is how I record the audio: private void startRecordingAudio() { Log.d("audiorecording",…
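One frequent cause of exactly this symptom is uploading the file before MediaRecorder.stop() has finalized it: stop() writes the MP4's moov box (the sample index), and a file uploaded without it is unplayable everywhere. A pure-Java pre-upload sanity check (hypothetical helper; a naive scan, sufficient for short clips):

```java
// Pure-Java pre-upload check: an MP4 finalized by MediaRecorder.stop()
// contains a "moov" box (the sample index). Uploading before stop()
// completes yields a file without it -- which plays nowhere.
public class MoovCheck {
    public static boolean containsMoov(byte[] data) {
        for (int i = 0; i + 4 <= data.length; i++) {
            if (data[i] == 'm' && data[i + 1] == 'o'
                    && data[i + 2] == 'o' && data[i + 3] == 'v') {
                return true;
            }
        }
        return false;
    }
}
```

If the check fails on the device before upload, the bug is in the recording/stop sequence rather than in the Retrofit call.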