mediacodec

Not able to display live streamed data on android

[亡魂溺海] Submitted on 2020-02-02 13:00:59
Question: I am trying to display raw H.264 data from a camera device in my Android app. I am able to receive the data in a TextView, but not able to display it on a TextureView. I am a beginner in Android and no expert in decoding raw data; any suggested solution would be appreciated. Please find the code below. Code for getting the data: public class myVideoReceiver extends Thread { public boolean bKeepRunning2 = true; public String lastMessage2 = ""; public void run() { String message2;
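A typical missing step in this situation is framing: MediaCodec expects whole NAL units, not arbitrary chunks of the raw byte stream. A minimal sketch (not the asker's code) of locating Annex-B start codes, which is pure Java and independent of any Android API:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: before raw H.264 bytes can be queued into MediaCodec, the Annex-B
// stream is usually split into NAL units at the 0x00 0x00 0x01 /
// 0x00 0x00 0x00 0x01 start codes.
public class NalSplitter {
    // Returns the byte offsets at which each start code begins.
    public static List<Integer> findStartCodes(byte[] data) {
        List<Integer> offsets = new ArrayList<>();
        for (int i = 0; i + 2 < data.length; i++) {
            if (data[i] == 0 && data[i + 1] == 0 && data[i + 2] == 1) {
                offsets.add(i);          // three-byte start code
                i += 2;
            } else if (i + 3 < data.length && data[i] == 0 && data[i + 1] == 0
                       && data[i + 2] == 0 && data[i + 3] == 1) {
                offsets.add(i);          // four-byte start code
                i += 3;
            }
        }
        return offsets;
    }
}
```

Each slice between consecutive offsets would then be queued into a decoder input buffer; the decoder itself must be configured with a Surface taken from the TextureView.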

Mediacodec, decode byte packet from server and render it on surface

房东的猫 Submitted on 2020-01-21 10:20:30
Question: I have some issues with MediaCodec. I have three components: a Decoder, a Downloader, and a Render, plus a simple FragmentStreamVideo that initializes the 'SurfaceView' and the 'Downloader'. The other components, the Render and the Decoder, are initialized in the SurfaceView. Synchronization between the Decoder and the Downloader is implemented by a BlockingQueue<String> queue, where String = filename (each frame has its own file). Another synchronization between the Decoder and the Render is done by the standard
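The Downloader-to-Decoder handoff described above can be sketched as a standard producer/consumer over a BlockingQueue<String> of filenames. The "EOF" sentinel is an assumption added here to terminate the consumer cleanly; it is not part of the question:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Sketch of the Downloader -> Decoder handoff, assuming each frame lives in
// its own file and the queue carries filenames. POISON is a made-up sentinel
// signalling end of stream.
public class FrameQueueDemo {
    static final String POISON = "EOF";

    public static List<String> runOnce(List<String> files) throws InterruptedException {
        BlockingQueue<String> queue = new ArrayBlockingQueue<>(4);
        List<String> decoded = new ArrayList<>();

        Thread downloader = new Thread(() -> {
            try {
                for (String f : files) queue.put(f);  // blocks when the queue is full
                queue.put(POISON);
            } catch (InterruptedException ignored) { }
        });
        Thread decoder = new Thread(() -> {
            try {
                while (true) {
                    String f = queue.take();          // blocks when the queue is empty
                    if (f.equals(POISON)) break;
                    decoded.add(f);                   // stand-in for decode(f)
                }
            } catch (InterruptedException ignored) { }
        });
        downloader.start();
        decoder.start();
        downloader.join();
        decoder.join();
        return decoded;
    }
}
```

The bounded queue gives back-pressure for free: a slow decoder simply stalls the downloader instead of letting it race ahead.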

Android video decoder not drawing to gles surface on lollipop only

不问归期 Submitted on 2020-01-15 12:17:11
Question: Briefly, I'm combining two open-source apps into a new VR app, so this only runs on the Note 4 and S6 using the GearVR headset. My app works on KitKat, but the video is black on Lollipop. The two source apps both work fine on Lollipop. I have a surface created from a GL texture: glGenTextures( 1, &textureId ); glBindTexture(GL_TEXTURE_EXTERNAL_OES, textureId); glTexParameterf(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_MIN_FILTER, GL_LINEAR); glTexParameterf(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_MAG
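With an external-OES texture fed by a decoder, a black frame on some OS versions is often a threading issue: updateTexImage() must run on the GL thread, after onFrameAvailable() has fired. A sketch of the usual wait/notify handoff between those two threads (names are illustrative, pure Java):

```java
// Sketch: handoff between SurfaceTexture.onFrameAvailable (decoder side) and
// the GL thread that must call updateTexImage(). A missed notify, or
// updateTexImage() on the wrong thread, commonly shows up as a black texture.
public class FrameWaiter {
    private final Object lock = new Object();
    private boolean frameAvailable = false;

    // Called from the onFrameAvailable callback.
    public void signalFrame() {
        synchronized (lock) {
            frameAvailable = true;
            lock.notifyAll();
        }
    }

    // Called on the GL thread before updateTexImage(); false on timeout.
    public boolean awaitFrame(long timeoutMs) throws InterruptedException {
        synchronized (lock) {
            long deadline = System.currentTimeMillis() + timeoutMs;
            while (!frameAvailable) {
                long remaining = deadline - System.currentTimeMillis();
                if (remaining <= 0) return false;
                lock.wait(remaining);
            }
            frameAvailable = false;
            return true;
        }
    }
}
```

Timing changes between KitKat and Lollipop can expose a race that this explicit wait avoids.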

Inconsistent video rotation when using MediaCodec

左心房为你撑大大i Submitted on 2020-01-13 06:46:45
Question: I have two devices, a Nexus 7 (Android 5) and a Galaxy S3 (4.3). On both devices I recorded a video in portrait mode and saved it with a rotation hint of 90 degrees. This is the correct orientation hint, because when the video is played using the default media player the orientation is fine on both devices. I can even copy the video from the Nexus to the Galaxy and the orientation is still fine when I play the video file. However, when I decode the video using the MediaCodec API I get some problems with the video
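The underlying gotcha is that MediaCodec decodes the frames as stored and leaves the container's rotation hint (which MediaPlayer applies automatically) to the caller. A small sketch of the bookkeeping the renderer then has to do itself:

```java
// Sketch: when decoding with MediaCodec you must read the rotation hint from
// the format/metadata yourself and apply it at display time.
public class RotationHint {
    // Normalize any rotation value into [0, 360).
    public static int normalize(int degrees) {
        int d = degrees % 360;
        return d < 0 ? d + 360 : d;
    }

    // 90/270-degree rotations swap display width and height.
    public static boolean swapsDimensions(int degrees) {
        int d = normalize(degrees);
        return d == 90 || d == 270;
    }
}
```

Whether the hint is applied by the decoder or left to the app has historically varied by device and OS version, which would explain inconsistent results between a 4.3 and a 5.0 device.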

Which Android devices/decoders support adaptive video playback

可紊 Submitted on 2020-01-13 05:48:28
Question: I've tested on a Nexus 5 that codecInfo.isFeatureSupported(MediaCodecInfo.CodecCapabilities.FEATURE_AdaptivePlayback) returns false. Does anyone know which chipsets/software codecs support the feature? https://developer.android.com/reference/android/media/MediaCodecInfo.CodecCapabilities.html#FEATURE_AdaptivePlayback Thanks Answer 1: This is supported on most Nexus devices from KitKat MR1 onward. Note that it applies to hardware video decoders only. Nexus 5 (KK MR1): // Qualcomm Snapdragon 800 OMX.qcom.video.decoder.avc
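Since the answer notes that only hardware decoders advertise the feature, it helps to tell hardware and software codecs apart when iterating the codec list. A common name-based heuristic (a sketch, not an official API; it predates the isSoftwareOnly() method added in API 29):

```java
// Sketch: software decoder implementations on Android have conventionally
// used the "OMX.google." prefix (and "c2.android." for newer Codec2 ones);
// vendor hardware decoders use prefixes like "OMX.qcom." or "OMX.Exynos.".
public class CodecNames {
    public static boolean looksLikeSoftwareDecoder(String name) {
        return name.startsWith("OMX.google.") || name.startsWith("c2.android.");
    }
}
```

In a real app one would enumerate MediaCodecList, skip names matching this heuristic, and query FEATURE_AdaptivePlayback only on the rest.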

Use MediaCodec and MediaExtractor to decode and code video

妖精的绣舞 Submitted on 2020-01-12 03:57:06
Question: I need to decode a video into a sequence of bitmaps so that I can modify them, and then compress them back into a video file on Android. I plan to manage this by using getFrameAtTime and saving the frames as an image sequence. Then I can modify the images in the sequence and encode it back into a movie. But I have two problems with this: First, as I read it, getFrameAtTime is meant for creating thumbnails and is not guaranteed to return the correct frame. This makes the video laggy. Secondly, saving
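The first problem is real: getFrameAtTime() seeks toward sync frames, so exact per-frame extraction generally needs MediaExtractor + MediaCodec instead. Either way, the mapping between a frame index and its presentation timestamp at a constant frame rate is simple arithmetic, sketched here:

```java
// Sketch: frame-index <-> presentation-timestamp conversion for a
// constant-frame-rate stream. MediaCodec timestamps are in microseconds.
public class FrameTiming {
    public static long ptsForFrameUs(int frameIndex, int fps) {
        return frameIndex * 1_000_000L / fps;
    }

    public static int frameForPts(long ptsUs, int fps) {
        return (int) (ptsUs * fps / 1_000_000L);
    }
}
```

These timestamps are what you compare against MediaCodec.BufferInfo.presentationTimeUs to decide which decoded frame is "the" frame you asked for.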

Nexus 7 2013 mediacodec video encoder garbled output

不羁的心 Submitted on 2020-01-11 06:57:08
Question: I'm working on an app which encodes a movie using the h.264 encoder to the gallery and other targets. This app supports variable aspect ratios at output (1:1, 2:3, 3:2, 16:9, 3:4, 4:3). I'm using surface inputs for input/output from the 4.3 API. The app works fine on many devices (tested on the S3, Motorola G, Nexus 7 2012, Motorola X); however, I've hit a wall when running it on the Nexus 7 2013. Basically, some output resolutions work, some do not. 3:4 (720x960), 2:3 (720x1080) and 16:9 (1280x720) work as
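Garbled output at only some resolutions on one device is the classic signature of a stride/alignment constraint in that device's hardware encoder; many encoders want dimensions that are multiples of 16 (some even stricter). A small defensive sketch for checking or padding dimensions before configuring the encoder (the exact alignment a given encoder needs is an assumption to verify per device):

```java
// Sketch: round encoder dimensions up to the alignment the hardware wants.
public class EncoderSize {
    public static int alignUp(int value, int alignment) {
        return (value + alignment - 1) / alignment * alignment;
    }

    public static boolean isAligned(int w, int h, int alignment) {
        return w % alignment == 0 && h % alignment == 0;
    }
}
```

When a target aspect ratio does not survive padding, the usual workaround is to encode at the padded size and record the intended display size as a crop, or scale to the nearest aligned resolution.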

Is it possible to feed MediaCodec Bytearray received by Server, and show them on SurfaceView

三世轮回 Submitted on 2020-01-07 04:51:08
Question: Everyone, how can I do this? I take the video stream frame by frame from a server; I have the PPS and SPS and configure MediaCodec with these parameters, and I also have the width and height. I'd be glad of any help. This example shows how I did it: Mediacodec, decode byte packet from server and render it on surface Source: https://stackoverflow.com/questions/30229871/is-it-possible-to-feed-mediacodec-bytearray-received-by-server-and-show-them-on
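Configuring the decoder from out-of-band SPS/PPS means packing each into the MediaFormat's "csd-0" and "csd-1" buffers with an Annex-B start code prefix, e.g. format.setByteBuffer("csd-0", withStartCode(sps)). A sketch of the packing step (the SPS/PPS payloads themselves come from the server, as in the question):

```java
import java.nio.ByteBuffer;

// Sketch: prefix an SPS or PPS NAL payload with the 0x00 0x00 0x00 0x01
// Annex-B start code, as expected in MediaFormat's csd-0 / csd-1 buffers.
public class CsdBuilder {
    public static ByteBuffer withStartCode(byte[] nal) {
        ByteBuffer b = ByteBuffer.allocate(nal.length + 4);
        b.put(new byte[]{0, 0, 0, 1});
        b.put(nal);
        b.flip();   // prepare for reading: position 0, limit = length
        return b;
    }
}
```

With csd-0 (SPS), csd-1 (PPS), width, and height set, the decoder can be configured against a Surface from the SurfaceView and fed the per-frame byte arrays.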

How to drop frames while recording with MediaCodec and InputSurface?

…衆ロ難τιáo~ Submitted on 2020-01-06 19:27:55
Question: In my Android app I want to record a time-lapse video. I have an InputSurface -> MediaCodec (encoder) -> MediaMuxer. But if I want to speed up the video (for example: x3), the resulting video has a very high frame rate. For example: at normal speed I get a 30 fps video; if I speed it up (x3), I get a 90 fps video. Since the frame rate of the video is so high, my phone's video player cannot play the video normally (a computer's video player plays it without any problem). So I
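One way to speed up x3 without tripling the frame rate is to keep only every third frame and compress the kept frames' timestamps by the speed factor, so the output stays on the original 30 fps grid. A sketch of that retiming over a list of input presentation timestamps (a simplification of dropping frames at the encoder input):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: keep every speedFactor-th frame and divide its presentation
// timestamp (microseconds) by the speed factor, so the output frame rate
// matches the input's instead of multiplying it.
public class TimelapsePts {
    public static List<Long> retime(List<Long> inputPtsUs, int speedFactor) {
        List<Long> out = new ArrayList<>();
        for (int i = 0; i < inputPtsUs.size(); i += speedFactor) {
            out.add(inputPtsUs.get(i) / speedFactor);
        }
        return out;
    }
}
```

With an InputSurface pipeline the "drop" amounts to simply not rendering the skipped frames to the encoder's input surface, and setting the kept frames' timestamps as above.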

MediaCodec Async Callback is not occurring if other thread has blocking wait

北城余情 Submitted on 2020-01-06 08:57:29
Question: I am using the MediaCodec OnAsyncInputAvailable and OnAsyncOutputAvailable callbacks introduced in API 28. void OnAsyncInputAvailable( AMediaCodec *codec, void *userdata, int32_t index) { CallbackData* callbackData = (CallbackData *) (userdata); callbackData->addInputBufferId(index); } void OnAsyncOutputAvailable( AMediaCodec *codec, void *userdata, int32_t index, AMediaCodecBufferInfo *bufferInfo) { CallbackData* callbackData = (CallbackData *) (userdata); callbackData->addOutputBuffer(index,
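The symptom in the title, callbacks stopping while another thread blocks, matches a starvation pattern: if the blocking wait ends up on the same thread that delivers the callbacks, the callbacks can never run. A pure-Java stand-in (no NDK; a single-thread executor plays the role of the callback-delivery thread):

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// Sketch of the starvation: a blocking wait queued on the callback thread
// sits ahead of the callback itself, so the callback is never delivered.
public class CallbackStarvation {
    public static boolean callbackRan(boolean blockOnCallbackThread) throws Exception {
        ExecutorService callbackThread = Executors.newSingleThreadExecutor();
        CountDownLatch done = new CountDownLatch(1);
        if (blockOnCallbackThread) {
            callbackThread.submit(() -> {
                try { done.await(); }          // blocking wait on the callback thread
                catch (InterruptedException ignored) { }
            });
        }
        callbackThread.submit(done::countDown); // the "callback"
        boolean ran = done.await(500, TimeUnit.MILLISECONDS);
        callbackThread.shutdownNow();
        return ran;
    }
}
```

The fix is to keep the blocking wait off the callback-owning thread, e.g. have the callbacks push buffer indices into a thread-safe queue (as addInputBufferId above appears to do) and block on that queue from a separate worker thread.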