android-mediacodec

Multiple videos on one Surface

╄→尐↘猪︶ㄣ submitted on 2020-08-03 09:17:06
Question: I have a single, fullscreen SurfaceView, and I have multiple network streams of H.264 video which I can decode using MediaCodec. Is it possible to specify the coordinates of the Surface to which each video will be rendered, so that I can create a kind of video mosaic? Answer 1: No, that's not possible. You'll need to use multiple SurfaceTextures instead, one per video decoder, and render all the textures into one view using OpenGL. See https://source.android.com/devices/graphics/architecture.html for …
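A minimal sketch of the SurfaceTexture-per-decoder setup the answer describes, assuming an existing GL context (for example a GLSurfaceView renderer thread); the class name and sizes are illustrative, not from the answer:

import android.graphics.SurfaceTexture;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import android.view.Surface;

// One of these per network stream: an external OES texture, a SurfaceTexture
// wrapping it, and the Surface that is handed to MediaCodec.configure().
final class MosaicInput {
    final int oesTextureId;
    final SurfaceTexture surfaceTexture;
    final Surface decoderSurface;

    MosaicInput(int width, int height) {
        int[] tex = new int[1];
        GLES20.glGenTextures(1, tex, 0);                                 // must run on the GL thread
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0]);
        oesTextureId = tex[0];
        surfaceTexture = new SurfaceTexture(oesTextureId);
        surfaceTexture.setDefaultBufferSize(width, height);
        decoderSurface = new Surface(surfaceTexture);                    // pass this to MediaCodec.configure()
    }
}

In the renderer, each frame you would call surfaceTexture.updateTexImage() on every input and then draw each OES texture into its own tile of the view, for example by setting glViewport to that tile before drawing a textured quad.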

Problems in making Video from Lottie JSON File and Overlay it with original video using FFMPEG

谁说我不能喝 submitted on 2020-08-03 04:01:10
Question: There is a somewhat unusual workflow here combining Lottie animations, the FFmpeg video-processing library, and MediaCodec. I want to render a video from a Lottie animation and overlay that video onto another, original video. The problem is that I'm unable to produce a video with a transparent background from the Lottie animation. So I made a simple video from the Lottie animation using MediaCodec and MediaMuxer: it takes frames one by one from the LottieDrawable and writes them into the video (the Lottie video). Here is the …
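A hedged sketch of that frame loop: render each Lottie frame onto the encoder's input Surface with a Canvas. It assumes encoderInputSurface came from MediaCodec.createInputSurface() on a configured "video/avc" encoder, lottieDrawable is an initialized com.airbnb.lottie.LottieDrawable with bounds already set to the video size, and both names are placeholders. Note that a Surface-fed H.264 encoder carries no alpha channel, which is why the transparent background is lost; true transparency needs either a format that supports alpha or a chroma-key/overlay filter on the FFmpeg side.

import android.graphics.Canvas;
import android.graphics.Color;
import android.view.Surface;
import com.airbnb.lottie.LottieDrawable;

void renderLottieFrames(LottieDrawable lottieDrawable, Surface encoderInputSurface, int frameCount) {
    for (int i = 0; i < frameCount; i++) {
        // lockHardwareCanvas() (API 23+) is used because a software-locked
        // canvas is often incompatible with an encoder input surface.
        Canvas canvas = encoderInputSurface.lockHardwareCanvas();
        try {
            // Solid background instead of transparency: the encoded video
            // cannot carry alpha, so key this color out later in FFmpeg.
            canvas.drawColor(Color.GREEN);
            lottieDrawable.setProgress(i / (float) Math.max(1, frameCount - 1));
            lottieDrawable.draw(canvas);
        } finally {
            encoderInputSurface.unlockCanvasAndPost(canvas);
        }
    }
}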

Adding images to video with help of this MediaCodec Api

岁酱吖の submitted on 2020-05-17 08:33:25
Question: I am trying to add a watermark to a video using this library, https://github.com/MasayukiSuda/Mp4Composer-android , but I couldn't get the desired result shown in the library's examples. Can anyone point out how to solve this? Thanks. What I tried: mp4Composer = new Mp4Composer(videoItem.getPath(), videoPath) // .rotation(Rotation.ROTATION_270) .size(720, 720) .fillMode(FillMode.PRESERVE_ASPECT_FIT) .filter(new GlWatermarkFilter(BitmapFactory.decodeResource(context.getResources(), R…
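For comparison, a hedged sketch of the complete call chain as shown in that library's README; the library package names, the Position enum value, and the drawable resource name are assumptions, and .start() must be called or nothing is composed (the snippet above is cut off, so it is unclear whether it was):

import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import com.daasuu.mp4compose.FillMode;                     // package names assumed from the README
import com.daasuu.mp4compose.composer.Mp4Composer;
import com.daasuu.mp4compose.filter.GlWatermarkFilter;

Bitmap watermark = BitmapFactory.decodeResource(context.getResources(), R.drawable.watermark);  // hypothetical resource

new Mp4Composer(videoItem.getPath(), videoPath)
        .size(720, 720)
        .fillMode(FillMode.PRESERVE_ASPECT_FIT)
        .filter(new GlWatermarkFilter(watermark, GlWatermarkFilter.Position.RIGHT_BOTTOM))  // position enum assumed from the README
        .start();                                          // without start() no output is produced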

Not able to display live streamed data on android

[亡魂溺海] submitted on 2020-02-02 13:00:59
Question: I am trying to display raw H.264 data from a camera device in my Android app. I am able to receive the data (it shows up in a TextView) but I am not able to display it on a TextureView. I am a beginner on Android and no expert in decoding raw data, so a suggested solution would be appreciated. Please find the code below. Code for receiving the data: public class myVideoReceiver extends Thread { public boolean bKeepRunning2 = true; public String lastMessage2 = ""; public void run() { String message2; …
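A hedged sketch of the decode path for such a raw stream, assuming the Surface is built from the TextureView in onSurfaceTextureAvailable via new Surface(textureView.getSurfaceTexture()), that SPS/PPS NAL units arrive in-band, and that 1280x720 stands in for the camera's real resolution:

import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;
import java.nio.ByteBuffer;

MediaCodec startDecoder(Surface surface) throws Exception {
    MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, 1280, 720);
    MediaCodec decoder = MediaCodec.createDecoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
    decoder.configure(format, surface, null, 0);   // decoded frames render straight to the TextureView
    decoder.start();
    return decoder;
}

void feedNalUnit(MediaCodec decoder, byte[] nalUnit, long ptsUs) {
    int inIndex = decoder.dequeueInputBuffer(10_000);
    if (inIndex >= 0) {
        ByteBuffer in = decoder.getInputBuffer(inIndex);
        in.clear();
        in.put(nalUnit);
        decoder.queueInputBuffer(inIndex, 0, nalUnit.length, ptsUs, 0);
    }
    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    int outIndex = decoder.dequeueOutputBuffer(info, 10_000);
    if (outIndex >= 0) {
        decoder.releaseOutputBuffer(outIndex, true);   // true = render to the Surface
    }
}

One common pitfall consistent with the symptom described: if the received bytes are converted to a String (as for the TextView), the binary H.264 data is corrupted; the decoder must be fed the raw bytes, complete NAL units with their start codes.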

How to trim video with MediaCodec

…衆ロ難τιáo~ submitted on 2019-12-30 05:28:27
Question: I'm trying to record the screen with the MediaProjection API, and I want to trim the video that was recorded by the media projection. Is there a way to do that without any third-party dependency? Answer 1: After a lot of digging, I found this snippet: /** * @param srcPath the path of the source video file. * @param dstPath the path of the destination video file. * @param startMs starting time in milliseconds for trimming. Set to * negative to start from the beginning. * @param endMs end time for trimming in …
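A hedged, condensed sketch of what that snippet does: copy samples between startMs and endMs with MediaExtractor and MediaMuxer, with no re-encoding (so the cut lands on the nearest sync frame):

import android.media.MediaCodec;
import android.media.MediaExtractor;
import android.media.MediaMuxer;
import java.nio.ByteBuffer;

void trim(String srcPath, String dstPath, long startMs, long endMs) throws Exception {
    MediaExtractor extractor = new MediaExtractor();
    extractor.setDataSource(srcPath);
    MediaMuxer muxer = new MediaMuxer(dstPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);

    int trackCount = extractor.getTrackCount();
    int[] dstTrack = new int[trackCount];
    for (int i = 0; i < trackCount; i++) {
        extractor.selectTrack(i);
        dstTrack[i] = muxer.addTrack(extractor.getTrackFormat(i));  // copy each track's format as-is
    }
    muxer.start();

    if (startMs > 0) extractor.seekTo(startMs * 1000, MediaExtractor.SEEK_TO_CLOSEST_SYNC);

    ByteBuffer buffer = ByteBuffer.allocate(1 << 20);
    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    while (true) {
        info.size = extractor.readSampleData(buffer, 0);
        if (info.size < 0) break;                                        // end of stream
        info.presentationTimeUs = extractor.getSampleTime();
        if (endMs > 0 && info.presentationTimeUs > endMs * 1000) break;  // past the trim window
        info.flags = extractor.getSampleFlags();
        muxer.writeSampleData(dstTrack[extractor.getSampleTrackIndex()], buffer, info);
        extractor.advance();
    }
    muxer.stop();
    muxer.release();
    extractor.release();
}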

getting fatal exception when trying to play decoded audio using audioTrack

送分小仙女□ submitted on 2019-12-25 03:16:39
Question: I'm using MediaCodec for video decoding and playing the result on a surface texture. That all works fine, and the audio decodes successfully too, but when I try to play the decoded audio using AudioTrack I get the following error: com.**** I/OMXClient: Using client-side OMX mux. com.**** I/OMXClient: Using client-side OMX mux. com.**** E/ACodec: [OMX.Intel.VideoDecoder.AVC] storeMetaDataInBuffers failed w/ err -2147483648 com.**** A/libc: Fatal signal 11 (SIGSEGV) at 0x0029004c (code=1 …
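A hedged sketch of an AudioTrack setup for the decoder's PCM output. The sample rate and channel count should come from the decoder's output MediaFormat rather than being hard-coded; a mismatch there, or writing from a buffer whose offset/size ignore BufferInfo, is one possible source of crashes like the one above.

import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;
import android.media.MediaCodec;
import java.nio.ByteBuffer;

AudioTrack createTrack(int sampleRate, int channelCount) {
    int channelMask = channelCount == 1
            ? AudioFormat.CHANNEL_OUT_MONO : AudioFormat.CHANNEL_OUT_STEREO;
    int minBuf = AudioTrack.getMinBufferSize(sampleRate, channelMask, AudioFormat.ENCODING_PCM_16BIT);
    // The stream-type constructor is deprecated on newer SDKs; AudioTrack.Builder is the modern path.
    AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate, channelMask,
            AudioFormat.ENCODING_PCM_16BIT, minBuf * 2, AudioTrack.MODE_STREAM);
    track.play();
    return track;
}

void drainToTrack(MediaCodec audioDecoder, AudioTrack track, int outIndex, MediaCodec.BufferInfo info) {
    ByteBuffer out = audioDecoder.getOutputBuffer(outIndex);
    byte[] pcm = new byte[info.size];
    out.position(info.offset);
    out.get(pcm);
    track.write(pcm, 0, pcm.length);
    audioDecoder.releaseOutputBuffer(outIndex, false);  // false: audio has no Surface to render to
}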

How to get frame by frame from MP4? (MediaCodec)

淺唱寂寞╮ submitted on 2019-12-18 09:09:08
Question: I am working with OpenGL and I would like to put all my textures into an MP4 in order to compress them. Then I need to get them back out of the MP4 on my Android device, so I need to somehow decode the MP4 and fetch frames one by one, on request. I found MediaCodec, https://developer.android.com/reference/android/media/MediaCodec , and MediaMetadataRetriever, https://developer.android.com/reference/android/media/MediaMetadataRetriever , but I did not see an approach for requesting frames one by one... If there is someone who …
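A hedged sketch of the simpler of the two APIs mentioned, MediaMetadataRetriever, which returns one Bitmap per requested timestamp; it re-decodes on every call, so for many sequential frames a MediaExtractor plus MediaCodec loop is usually the faster route. The fps parameter is an assumption about the source video:

import android.graphics.Bitmap;
import android.media.MediaMetadataRetriever;

Bitmap frameAt(String mp4Path, long frameIndex, int fps) throws Exception {
    MediaMetadataRetriever retriever = new MediaMetadataRetriever();
    try {
        retriever.setDataSource(mp4Path);
        long timeUs = frameIndex * 1_000_000L / fps;
        // OPTION_CLOSEST decodes the exact frame, not just the nearest keyframe.
        return retriever.getFrameAtTime(timeUs, MediaMetadataRetriever.OPTION_CLOSEST);
    } finally {
        retriever.release();
    }
}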

MediaCodec.createInputSurface() throws IllegalStateException in Android emulator (Error -38)

倖福魔咒の submitted on 2019-12-13 15:27:00
Question: I have a MediaMuxer: MediaMuxer mMediaMuxer = new MediaMuxer(new File(Environment.getExternalStorageDirectory(), "video.mp4").getPath(), MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4); And code which records the video: MediaFormat mMediaFormat = MediaFormat.createVideoFormat("video/avc", width, height); mMediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, CodecCapabilities.COLOR_FormatSurface); mMediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 128000); mMediaFormat.setInteger(MediaFormat.KEY_FRAME…
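For reference, a hedged sketch of the call order the API requires: createInputSurface() is only valid after configure(..., CONFIGURE_FLAG_ENCODE) and before start(). On the emulator, error -38 frequently just means the software encoder cannot supply an input surface at all, so the same code may need to be verified on a real device. The bitrate, frame rate, and I-frame interval below are placeholders:

import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.view.Surface;

final class SurfaceEncoder {
    final MediaCodec encoder;
    final Surface inputSurface;

    SurfaceEncoder(int width, int height) throws Exception {
        MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

        encoder = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        inputSurface = encoder.createInputSurface();   // only valid after configure()...
        encoder.start();                               // ...and before start()
    }
}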

MediaCodec from another app causing issues

夙愿已清 submitted on 2019-12-11 19:13:02
Question: I'm aware that I can check the maximum supported instances of MediaCodec using getMaxSupportedInstances. The problem I have is: if another application is using MediaCodec and I open my application, my application fails. In other words, if another developer does not handle the release of MediaCodec correctly, my application "suffers" from it. I doubt there is a way to release current MediaCodec instances which my application did not create? What I currently do is: I use a …
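A hedged sketch of the defensive side of this: query the advertised per-codec instance limit (available since API 23) and treat codec creation or configuration failures as recoverable, since another app may be holding the hardware instances. The fallback policy is an assumption, not something the platform prescribes.

import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;
import android.media.MediaFormat;

int maxInstancesFor(String mimeType) {
    MediaCodecList list = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
    for (MediaCodecInfo info : list.getCodecInfos()) {
        if (info.isEncoder()) continue;
        for (String type : info.getSupportedTypes()) {
            if (type.equalsIgnoreCase(mimeType)) {
                return info.getCapabilitiesForType(type).getMaxSupportedInstances();  // API 23+
            }
        }
    }
    return -1;  // no decoder found for this MIME type
}

MediaCodec createDecoderOrNull(MediaFormat format) {
    try {
        MediaCodec codec = MediaCodec.createDecoderByType(format.getString(MediaFormat.KEY_MIME));
        codec.configure(format, null, null, 0);
        return codec;
    } catch (Exception e) {
        // Creation or configuration can fail when other apps hold all instances;
        // fall back (e.g. a software decoder, or retry later) instead of crashing.
        return null;
    }
}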