stagefright

Why is AVCodecContext extradata NULL?

Submitted by 拜拜、爱过 on 2021-02-11 18:20:02
Question: I am trying to decode H.264 video using ffmpeg and the stagefright library. I'm using this example. The example shows how to decode MP4 files, but I want to decode only raw H.264 video. Here is a piece of my code:

    AVFormatSource::AVFormatSource(const char *videoPath) {
        av_register_all();
        mDataSource = avformat_alloc_context();
        avformat_open_input(&mDataSource, videoPath, NULL, NULL);
        for (int i = 0; i < mDataSource->nb_streams; i++) {
            if (mDataSource->streams[i]->codec->codec_type == AVMEDIA_TYPE_VIDEO)

MediaPlayer Framework on Gingerbread and Apple's HTTP Live Streaming Support

Submitted by 只愿长相守 on 2020-01-11 20:22:33
Question: According to the release notes, Stagefright replaces the OpenCore framework in the Gingerbread release. There have been numerous discussions saying that Apple's HTTP Live Streaming is supported by default since Android 2.3; even Wikipedia mentions this. However, when I try to run the test stream provided by Apple using the MediaPlayerDemo_Video.java bundled with the API Demos, I get the following exceptions: setDataSource('http://devimages.apple.com/iphone/samples/bipbop/gear1/
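
A minimal sketch (not from the question) of exercising an HLS URL with the framework MediaPlayer while logging the raw error codes it reports, so a failure on a given Android build can be compared against the platform sources. The stream URL is a placeholder, not Apple's test stream:

    import java.io.IOException;
    import android.media.MediaPlayer;
    import android.util.Log;
    import android.view.SurfaceHolder;

    public class HlsProbe {
        public void play(SurfaceHolder holder) throws IOException {
            MediaPlayer mp = new MediaPlayer();
            mp.setDisplay(holder);
            mp.setOnErrorListener(new MediaPlayer.OnErrorListener() {
                public boolean onError(MediaPlayer p, int what, int extra) {
                    Log.e("HLS", "onError what=" + what + " extra=" + extra);
                    return true; // consume the error instead of aborting playback
                }
            });
            mp.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
                public void onPrepared(MediaPlayer p) {
                    p.start();
                }
            });
            mp.setDataSource("http://example.com/stream.m3u8"); // placeholder URL
            mp.prepareAsync(); // prepare() can block for seconds on a network stream
        }
    }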

Android: How to integrate a decoder to multimedia framework

Submitted by ☆樱花仙子☆ on 2020-01-08 19:41:14
Question: I recently ported a video decoder to Android successfully. I also dumped the output to a SurfaceView and checked it using the native APIs. Now the next task is to implement play, pause, streaming, etc., i.e. the other features of a media player. Doing this from scratch would be rework, as all these functionalities are already provided by the Android multimedia framework. I heard that we can make our decoder a plug-in and integrate it into Android's multimedia framework. Although I googled
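
A hedged sketch of one piece of such an integration: on Jelly Bean, for example, stagefright discovers available codecs through the device's media_codecs.xml, so a ported decoder wrapped as an OMX component would be advertised there. The component name below is hypothetical:

    <!-- Hypothetical decoder entry in device/<vendor>/<product>/media_codecs.xml;
         the OMX component itself must also be registered with the OMX core. -->
    <MediaCodecs>
        <Decoders>
            <MediaCodec name="OMX.vendor.video.decoder.avc" type="video/avc" />
        </Decoders>
    </MediaCodecs>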

Enabling Hardware Encoder in Jelly Bean 4.1.1 rowboat DM3730

Submitted by 雨燕双飞 on 2019-12-24 23:39:06
Question: Kindly excuse me for the rather long description of the problem. I have a custom board with a DM3730 processor, and I am building Android rowboat from http://code.google.com/p/rowboat/wiki/JellybeanOnBeagleboard_WithSGX

OBJECTIVE: ENABLING THE HARDWARE DECODER.

2.1) For that, I need the OMX-IL interface, so I looked at the source code downloaded from TI. But I do not find an omap3/ directory under hardware/ti/, which would hold the OMX implementation.
2.2) Hence I downloaded the AOSP Jelly Bean code by: git

Android: How to build and replace modified AOSP code

Submitted by 浪尽此生 on 2019-12-24 03:32:06
Question: I am starting to work with the stagefright framework to implement hardware decoding, on Android versions prior to Jelly Bean, in my video-conferencing application. I have downloaded and built the Android source code on a Mac. I am not clear on the whole idea of working with the AOSP. My questions (with respect to the stagefright framework) are: Where can I find libstagefright.so after the AOSP build? If I use the OMX codec in my class for decoding, how should I link libstagefright.so to native code
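
For the first question, a sketch of the usual location and the push cycle; the paths assume a standard AOSP build, and "generic" stands in for the actual lunch target's product name:

    # After a successful build the library lands in the product out/ tree:
    ls out/target/product/generic/system/lib/libstagefright.so

    # Replacing it on a device whose /system can be remounted read-write:
    adb root
    adb remount
    adb push out/target/product/generic/system/lib/libstagefright.so /system/lib/
    adb reboot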

Play a .ts video file on Android?

Submitted by 回眸只為那壹抹淺笑 on 2019-12-23 09:59:44
Question: I am pretty new to streaming video, so please bear with me. :) I am trying to port an m3u8 stream over from iPhone to Android. Looking in the m3u8 feed, I found some .ts files. From what I can tell, .ts (MPEG transport stream) files are themselves wrappers that contain the video elementary stream. Is it possible to play a .ts file on Android? (The docs only list 3GP and MP4 as supported formats.) Is there a way to extract the elementary stream and just process the video feed? If that is in 3gp or mp4, I
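
If the transport stream carries codecs Android already supports (H.264 video, AAC audio), one server-side option is to rewrap the elementary streams into MP4 without re-encoding. A sketch, assuming ffmpeg is available:

    # Copies the H.264/AAC elementary streams into an MP4 container; the
    # bitstream filter converts the AAC from ADTS framing to the MP4 form.
    ffmpeg -i segment.ts -c copy -bsf:a aac_adtstoasc output.mp4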

How does MediaCodec find the codec inside the Android framework?

Submitted by 别来无恙 on 2019-12-20 23:25:29
Question: I am trying to understand how MediaCodec is used for hardware decoding. My knowledge of Android internals is very limited. Here are my findings: There is an XML file which describes the codec details in the Android system, device/ti/omap3evm/media_codecs.xml for example. This means that if we create a codec from a Java application with

    MediaCodec codec = MediaCodec.createDecoderByType(type);

it should find the respective decoder with the help of the XML file. What am I
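
A sketch (API 16+) of listing what the framework parsed from media_codecs.xml; roughly, createDecoderByType() walks this list and returns the first decoder whose supported types include the requested mime type:

    import android.media.MediaCodecInfo;
    import android.media.MediaCodecList;
    import android.util.Log;

    public class CodecDump {
        public static void dumpDecoders() {
            for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
                MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
                if (info.isEncoder()) continue;           // decoders only
                for (String type : info.getSupportedTypes()) {
                    Log.d("Codecs", info.getName() + " -> " + type);
                }
            }
        }
    }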

Why am I getting “Unsupported format” errors, reading H.264 encoded rtsp streams with the Android MediaPlayer?

Submitted by 拈花ヽ惹草 on 2019-12-19 05:51:57
Question: I am trying to show H.264-encoded RTSP video on an Android device. The stream comes from a Raspberry Pi, using vlc to encode /dev/video1, which is a "Pi NoIR Camera Board":

    vlc-wrapper -vvv v4l2:///dev/video1 --v4l2-width $WIDTH --v4l2-height $HEIGHT \
        --v4l2-fps ${FPS}.0 --v4l2-chroma h264 --no-audio --no-osd \
        --sout "#rtp{sdp=rtsp://:8000/pi.sdp}" :demux=h264 > /tmp/vlc-wrapper.log 2>&1

I am using very minimal Android code right now:

    final MediaPlayer mediaPlayer = new MediaPlayer();
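
A sketch of surfacing the raw error behind "Unsupported format"; the host name is illustrative, and the path matches the vlc --sout above:

    import java.io.IOException;
    import android.media.MediaPlayer;
    import android.util.Log;

    public class RtspProbe {
        public void play() throws IOException {
            final MediaPlayer mediaPlayer = new MediaPlayer();
            mediaPlayer.setOnErrorListener(new MediaPlayer.OnErrorListener() {
                public boolean onError(MediaPlayer mp, int what, int extra) {
                    // e.g. extra == -1010 is MEDIA_ERROR_UNSUPPORTED on API 17+
                    Log.e("RTSP", "what=" + what + " extra=" + extra);
                    return false; // fall through to the completion callback
                }
            });
            mediaPlayer.setDataSource("rtsp://raspberrypi:8000/pi.sdp"); // illustrative host
            mediaPlayer.prepareAsync();
        }
    }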

Custom Wrapper Codec Integration into Android

Submitted by 爷,独闯天下 on 2019-12-18 13:37:11
Question: I need to develop a custom 'wrapper' video codec and integrate it into Android (JB for now, ICS later). We want to use some custom decryption keys from the SIM (don't ask!). The best method (one that would allow it to work alongside other, non-encrypted media and with the standard media player or others) seems to be to define our own mime type, link that to a custom wrapper codec that performs the custom decryption, and then pass the data on to a real codec. (Let's say the filetype is .mp4 for
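
A hedged sketch of where such a mapping could live: a private mime type bound to a wrapper component in media_codecs.xml. All names are hypothetical, and the extractor would also have to be taught to report the private mime type for these tracks:

    <!-- Hypothetical: binds a private mime type to a wrapper OMX component
         that decrypts and hands the payload on to the real AVC decoder. -->
    <MediaCodecs>
        <Decoders>
            <MediaCodec name="OMX.vendor.decrypt.wrapper.avc"
                        type="video/x-vendor-encrypted-avc" />
        </Decoders>
    </MediaCodecs>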