H.264

H.264 encoded MP4 presented in HTML5 plays on Safari but not iOS devices

杀马特。学长 韩版系。学妹 · Submitted on 2019-11-28 08:25:05
I'm using Adobe Media Encoder CS5 to encode an FLV file to H.264 for presentation on the web via HTML5. The video plays just fine in Safari on OS X (and in Firefox when encoded to OGG), but on any iOS device (iPad, iPhone) I get the play icon with a slash through it. Has anyone encountered this before, and if so, any ideas as to why? Thanks. We had this problem and found that encoding the files in accordance with the iPhone webview's standards produced files that played fine. Not all H.264-encoded MP4 files are supported by the iPhone (or Chrome, for that matter), and slight differences in the
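One practical sanity check for this class of problem is to read profile_idc from the stream's SPS, since older iOS devices were strict about profile support. The following is a minimal sketch, not a full SPS parser; the class and method names are hypothetical, and it assumes an Annex B buffer with 3-byte start codes.

```java
// Minimal sketch: read profile_idc from the first SPS NAL unit (type 7)
// in an Annex B H.264 buffer. Hypothetical helper, not a full parser.
public class SpsProfileCheck {
    // Maps profile_idc to a human-readable name, or "unknown".
    public static String profileName(int profileIdc) {
        switch (profileIdc) {
            case 66:  return "Baseline"; // the safest choice for older iOS
            case 77:  return "Main";
            case 100: return "High";
            default:  return "unknown";
        }
    }

    // Finds the first SPS after a 00 00 01 start code and returns its
    // profile_idc (the byte right after the NAL header), or -1 if absent.
    public static int firstSpsProfileIdc(byte[] annexB) {
        for (int i = 0; i + 3 < annexB.length; i++) {
            if (annexB[i] == 0 && annexB[i + 1] == 0 && annexB[i + 2] == 1) {
                int nalType = annexB[i + 3] & 0x1F;
                if (nalType == 7 && i + 4 < annexB.length) {
                    return annexB[i + 4] & 0xFF;
                }
            }
        }
        return -1;
    }
}
```

Applied to the SPS bytes quoted in a later question on this page (67 42 00 20 …), this returns 0x42 = 66, i.e. Baseline profile.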

MediaCodec crash on high quality stream

≯℡__Kan透↙ · Submitted on 2019-11-28 06:40:16
Question: I am decoding an H.264 video stream with the following code (original guide): public void configure(Surface surface, int width, int height, ByteBuffer csd0) { String VIDEO_FORMAT = "video/avc"; if (mConfigured) { throw new IllegalStateException("Decoder is already configured"); } MediaFormat format = MediaFormat.createVideoFormat(VIDEO_FORMAT, width, height); // little tricky here, csd-0 is required in order to configure the codec properly // it is basically the first sample from encoder with
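The "csd-0" buffer mentioned in the excerpt is codec-specific data; for "video/avc" it is conventionally the SPS (with the PPS going into "csd-1"). As a hedged sketch of how one might carve those out of an Annex B header block before configuring the codec (the class name is hypothetical; it assumes 3- or 4-byte start codes and standard NAL framing):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Sketch: split an Annex B buffer into individual NAL units so the SPS
// (type 7) and PPS (type 8) can be handed to MediaCodec as csd-0/csd-1.
public class CsdExtractor {
    // Returns each NAL unit (without its start code) in stream order.
    public static List<byte[]> splitNalUnits(byte[] annexB) {
        List<byte[]> nals = new ArrayList<>();
        int start = -1;
        for (int i = 0; i + 2 < annexB.length; i++) {
            if (annexB[i] == 0 && annexB[i + 1] == 0 && annexB[i + 2] == 1) {
                if (start >= 0) {
                    // Trim the extra zero of a 4-byte start code, if any.
                    int end = (i > 0 && annexB[i - 1] == 0) ? i - 1 : i;
                    nals.add(Arrays.copyOfRange(annexB, start, end));
                }
                start = i + 3;
                i += 2;
            }
        }
        if (start >= 0) nals.add(Arrays.copyOfRange(annexB, start, annexB.length));
        return nals;
    }

    // NAL type of a unit returned by splitNalUnits (7 = SPS, 8 = PPS).
    public static int nalType(byte[] nal) {
        return nal[0] & 0x1F;
    }
}
```

Once split, the SPS and PPS units (re-prefixed with a start code) would go into the MediaFormat via setByteBuffer("csd-0", ...) and "csd-1".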

Encoding FFMPEG to MPEG-DASH – or WebM with Keyframe Clusters – for MediaSource API

柔情痞子 · Submitted on 2019-11-28 04:34:30
I'm currently sending a video stream to Chrome, to play via the MediaSource API. As I understand it, MediaSource only supports MP4 files encoded with MPEG-DASH, or WebM files that have clusters beginning with keyframes (otherwise it raises the error: Media segment did not begin with keyframe). Is there any way to encode in MPEG-DASH or keyframed WebM formats with FFMPEG in real-time? Edit: I just tried it with ffmpeg ... -f webm -vcodec vp8 -g 1 so that every frame is a keyframe. Not the ideal solution. It does work with MediaStream now though. Any way to sync up the segments with the

What does this H264 NAL Header Mean?

旧城冷巷雨未停 · Submitted on 2019-11-28 03:24:14
0000 0109 1000 0001 6742 0020 e900 800c 3200 0001 68ce 3c80 0000 0001 6588 801a As far as I know, 0000 01 is the start prefix code that identifies a NAL unit. What does "09 ...." mean? Is it the header type byte? 0x000001 is the NAL start prefix code (it can also be 0x00000001, depending on the encoder implementation). 0x09 is 0b00001001, which means F=0, NRI=0, and type is 0b01001. That particular type is an access unit delimiter. Notice that it is immediately followed by another NAL unit defined by 0x67, which is NAL type 7, the sequence parameter set. There's also the picture
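The bit arithmetic in the answer can be made concrete with a tiny helper that splits the one-byte NAL header into its three fields (a hedged sketch; the class name is made up for illustration):

```java
// Sketch: decode the three fields of a one-byte H.264 NAL unit header:
// forbidden_zero_bit (1 bit), nal_ref_idc (2 bits), nal_unit_type (5 bits).
public class NalHeader {
    public static int forbiddenBit(int b) { return (b >> 7) & 0x01; }
    public static int nalRefIdc(int b)    { return (b >> 5) & 0x03; }
    public static int nalUnitType(int b)  { return b & 0x1F; }
}
```

For the bytes in the question: 0x09 gives type 9 (access unit delimiter) with NRI 0, 0x67 gives type 7 (SPS) with NRI 3, and 0x68 gives type 8 (PPS).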

Video rendering is broken MediaCodec H.264 stream

天涯浪子 · Submitted on 2019-11-27 21:42:19
I am implementing a decoder using the MediaCodec Java API to decode a live remote H.264 stream. I receive H.264-encoded data from the native layer via a callback ( void OnRecvEncodedData(byte[] encodedData) ), decode it, and render it on the Surface of a TextureView . My implementation is complete (retrieving the encoded stream via the callback, decoding, rendering, etc.). Here is my decoder class: public class MediaCodecDecoder extends Thread implements MyFrameAvailableListener { private static final boolean VERBOSE = true; private static final String LOG_TAG = MediaCodecDecoder.class.getSimpleName(); private

Converting images to video

对着背影说爱祢 · Submitted on 2019-11-27 19:12:15
How can I convert images to video without using FFmpeg or JCodec, only with the Android MediaCodec API? The source images are bitmaps that can be ARGB888 or YUV420 (my choice). The most important thing is that the video has to be playable on Android devices, and the maximum API level is 16. I know all about the API 18 MediaMuxer and I cannot use it. Please help me, I have been stuck on this for many days. (JCodec is too slow, and FFmpeg is very complicated to use.) fadden: There is no simple way to do this in API 16 that works across all devices. You will encounter problems with buffer alignment , color spaces , and
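The "color spaces" problem fadden mentions is that an ARGB bitmap must be hand-converted to the codec's Y'CbCr input layout. As a hedged sketch of the per-pixel step (using a common BT.601 integer approximation; the class name is illustrative, and real devices may want NV12, NV21, or vendor-specific plane layouts rather than planar I420):

```java
// Sketch: convert one ARGB_8888 pixel's RGB to Y'CbCr using a widely
// used BT.601 video-range integer approximation, as a starting point
// for filling a YUV420 MediaCodec input buffer by hand.
public class ArgbToYuv {
    public static int[] rgbToYuv(int r, int g, int b) {
        int y = (( 66 * r + 129 * g +  25 * b + 128) >> 8) + 16;
        int u = ((-38 * r -  74 * g + 112 * b + 128) >> 8) + 128;
        int v = ((112 * r -  94 * g -  18 * b + 128) >> 8) + 128;
        return new int[]{clamp(y), clamp(u), clamp(v)};
    }
    private static int clamp(int x) { return Math.max(0, Math.min(255, x)); }
}
```

For example, black (0,0,0) maps to Y=16, Cb=Cr=128, and white (255,255,255) to Y=235, Cb=Cr=128, the expected video-range extremes. Chroma would then be subsampled 2x2 when writing the U/V planes.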

H264 with multiple PPS and SPS

巧了我就是萌 · Submitted on 2019-11-27 18:52:25
I have a card that produces an H.264 stream with an SPS (Sequence Parameter Set) and a PPS (Picture Parameter Set), in that order, directly before each I-frame. I see that most H.264 streams contain a PPS and SPS only before the first I-frame. Is this recommended? Do decoders/muxers typically support multiple PPS and SPS? H.264 comes in a variety of stream formats. One variation is called "Annex B". (AUD)(SPS)(PPS)(I-Slice)(PPS)(P-Slice)(PPS)(P-Slice) ... (AUD)(SPS)(PPS)(I-Slice). Typically you see SPS/PPS before each I frame and PPS before other slices. Most decoders/muxers are happy with "Annex B" and
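One way to confirm what a given stream actually emits is to dump the sequence of NAL unit types and compare it against the (AUD)(SPS)(PPS)(slice)... pattern in the answer. A minimal sketch, assuming Annex B framing with 3- or 4-byte start codes (the class name is made up):

```java
// Sketch: list the NAL unit types of an Annex B stream in order,
// e.g. "9 7 8 5" for (AUD)(SPS)(PPS)(IDR slice).
public class AnnexBTypes {
    public static String typeSequence(byte[] s) {
        StringBuilder out = new StringBuilder();
        for (int i = 0; i + 3 < s.length; i++) {
            if (s[i] == 0 && s[i + 1] == 0 && s[i + 2] == 1) {
                if (out.length() > 0) out.append(' ');
                out.append(s[i + 3] & 0x1F); // low 5 bits = nal_unit_type
                i += 3;
            }
        }
        return out.toString();
    }
}
```

Seeing "7 8 5" repeat before every I-frame, as the card in the question produces, is legal Annex B and generally unproblematic for decoders.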

FFmpeg can't decode H264 stream/frame data

六月ゝ 毕业季﹏ · Submitted on 2019-11-27 18:50:45
Recently I had the chance to work with two devices that stream H.264 over RTSP, and I ran into a problem trying to decompress the stream using the FFmpeg library. Every time " avcodec_decode_video2 " is called, FFmpeg just says something like: [h264 @ 00339220] no frame! My raw H264 stream I frame data starts like this: " 65 88 84 21 3F F8 F8 0D..." (as far as I understand, this 0x65 indicates that it's an IDR frame?) Other frames for one device start like: " 41 9A 22 07 F3 4E 48 CC...." and for the other device like this: " 61 9A 25 C1 1C 45 62 39...." Am I missing some frame
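The bytes quoted are bare slice NALs (0x65 is indeed an IDR slice; 0x41 and 0x61 are both non-IDR slices, differing only in nal_ref_idc), and "no frame!" from FFmpeg commonly means the decoder has never seen the SPS/PPS. With RTSP those usually arrive out of band in the SDP's sprop-parameter-sets, so one common fix is to prepend them, with Annex B start codes, ahead of the first IDR. A hedged sketch (the class name is hypothetical; sps/pps stand for the decoded out-of-band bytes):

```java
// Sketch: build an Annex B buffer of (start)(SPS)(start)(PPS)(start)(IDR)
// so the decoder sees the parameter sets before the first IDR slice.
public class PrependParameterSets {
    private static final byte[] START = {0, 0, 0, 1};

    public static byte[] withParameterSets(byte[] sps, byte[] pps, byte[] idr) {
        byte[] out = new byte[3 * START.length + sps.length + pps.length + idr.length];
        int o = 0;
        for (byte[] part : new byte[][]{START, sps, START, pps, START, idr}) {
            System.arraycopy(part, 0, out, o, part.length);
            o += part.length;
        }
        return out;
    }
}
```

The resulting buffer would then be fed to avcodec_decode_video2 as a single packet.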

How to create an h264 video with an alpha channel for use with HTML5 Canvas?

拥有回忆 · Submitted on 2019-11-27 18:03:08
I've been interested in this demo: http://jakearchibald.com/scratch/alphavid/ I also saw this question on here: Can I have a video with transparent background using HTML5 video tag? But I can't seem to figure out: How do you create h264, ogg and webm video files with alpha channels? Phrogz If you open this video from your demo with QuickTime you will see a video with separate RGB and A regions: If you then look at the code for the demo, you will see that it is using one offscreen HTML5 Canvas to draw each video frame to, reading the alpha pixels from the second half of that canvas to set

How to use AVSampleBufferDisplayLayer in iOS 8 for RTP H264 Streams with GStreamer?

大憨熊 · Submitted on 2019-11-27 17:54:48
After learning that the hardware H.264 decoder became available to programmers in iOS 8, I want to use it now. There is a nice introduction, 'Direct Access to Video Encoding and Decoding', from WWDC 2014. You can take a look here . Based on Case 1 there, I started to develop an application that should be able to receive an H264-RTP-UDP stream from GStreamer, sink it into an 'appsink' element to get direct access to the NAL units, and do the conversion to create CMSampleBuffers, which my AVSampleBufferDisplayLayer can then display. The interesting piece of code doing all that is the
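A key step in that conversion, per the WWDC session, is reframing each NAL unit: AVSampleBufferDisplayLayer consumes AVCC-style samples (a 4-byte big-endian length prefix per NAL) rather than the raw payloads an RTP depacketizer hands out. As a language-neutral sketch of the framing step (illustrative class name; in the actual app this would be done in Objective-C/Swift before wrapping the bytes in a CMBlockBuffer):

```java
import java.nio.ByteBuffer;

// Sketch: convert one raw NAL unit to AVCC framing by prepending a
// 4-byte big-endian length, the layout expected inside a CMSampleBuffer.
public class AvccFraming {
    public static byte[] lengthPrefixed(byte[] nal) {
        ByteBuffer out = ByteBuffer.allocate(4 + nal.length);
        out.putInt(nal.length); // ByteBuffer is big-endian by default
        out.put(nal);
        return out.array();
    }
}
```

The SPS and PPS, by contrast, are not length-prefixed into the sample but go into the CMVideoFormatDescription that accompanies it.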