H.264

Use MediaCodec for H264 streaming

倾然丶 夕夏残阳落幕 submitted on 2019-12-03 03:53:42
I'm currently trying to use Android as a Skype endpoint. At this stage, I need to encode video into H.264 (since it's the only format supported by Skype) and encapsulate it with RTP in order to make the streaming work. Apparently MediaRecorder is not well suited for this, for various reasons. One is that it adds the MP4 or 3GP headers only after it's finished. Another is that, in order to reduce latency to a minimum, hardware acceleration may come in handy. That's why I would like to make use of the recent low-level additions to the framework, namely MediaCodec, MediaExtractor, etc. At the
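The RTP encapsulation part of this can be sketched independently of MediaCodec. Below is a minimal, hypothetical Java sketch of an RFC 3550 packet header carrying one H.264 NAL unit (RFC 6184 single NAL unit mode); the class and parameter names are illustrative and not part of any Android API.

```java
import java.nio.ByteBuffer;

// Minimal sketch of RTP packetization for a single H.264 NAL unit
// (RFC 3550 header + RFC 6184 single NAL unit mode). Names are illustrative.
class RtpPacketizer {
    static final int PT_H264 = 96; // dynamic payload type, negotiated via SDP

    // Builds one RTP packet: 12-byte header followed by the NAL unit
    // (without its Annex B start code, which is not used over RTP).
    static byte[] packetize(byte[] nal, int seq, long timestamp90kHz,
                            int ssrc, boolean lastOfFrame) {
        ByteBuffer buf = ByteBuffer.allocate(12 + nal.length);
        buf.put((byte) 0x80);                                  // V=2, P=0, X=0, CC=0
        buf.put((byte) ((lastOfFrame ? 0x80 : 0) | PT_H264));  // marker bit + payload type
        buf.putShort((short) seq);                             // sequence number
        buf.putInt((int) timestamp90kHz);                      // 90 kHz media timestamp
        buf.putInt(ssrc);                                      // synchronization source id
        buf.put(nal);                                          // NAL payload
        return buf.array();
    }
}
```

NAL units larger than the network MTU would additionally need FU-A fragmentation, which this sketch does not cover.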

How can I perform hardware-accelerated H.264 encoding and decoding for streaming? [closed]

自作多情 submitted on 2019-12-03 03:25:40
Question: It's difficult to tell what is being asked here. This question is ambiguous, vague, incomplete, overly broad, or rhetorical and cannot be reasonably answered in its current form. For help clarifying this question so that it can be reopened, visit the help center. Closed 7 years ago. I am able to get the RGBA frame data from the camera, and I want to encode it in H.264 format. I've used FFmpeg to encode and decode H.264 video, but at a frame size of 640x480 it's too slow for my needs. I'd
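Whatever encoder is used, RGBA frames usually have to be converted to the planar YUV layout the codec expects first. A CPU-side sketch using BT.601 integer coefficients follows; the class and method names are ours, not part of FFmpeg or any Android API, and a production path would do this on the GPU or with a library such as libyuv.

```java
// Sketch of a CPU RGBA -> I420 (planar YUV 4:2:0) converter, e.g. for feeding
// camera frames to an encoder that expects a planar YUV 4:2:0 input format.
// BT.601 integer math; hypothetical helper, shown for illustration only.
class RgbaToI420 {
    static byte[] convert(byte[] rgba, int w, int h) {
        byte[] out = new byte[w * h * 3 / 2];
        int uOff = w * h;                      // U plane follows Y plane
        int vOff = uOff + (w / 2) * (h / 2);   // V plane follows U plane
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                int i = (y * w + x) * 4;
                int r = rgba[i] & 0xFF, g = rgba[i + 1] & 0xFF, b = rgba[i + 2] & 0xFF;
                out[y * w + x] = (byte) ((66 * r + 129 * g + 25 * b + 128 >> 8) + 16);
                if ((y & 1) == 0 && (x & 1) == 0) { // chroma subsampled 2x2
                    int c = (y / 2) * (w / 2) + x / 2;
                    out[uOff + c] = (byte) ((-38 * r - 74 * g + 112 * b + 128 >> 8) + 128);
                    out[vOff + c] = (byte) ((112 * r - 94 * g - 18 * b + 128 >> 8) + 128);
                }
            }
        }
        return out;
    }
}
```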

How can I play H.264 RTSP video in Windows 8 Metro C# XAML app?

心已入冬 submitted on 2019-12-03 03:16:52
I have a device that provides an H.264 video stream from a URL like: rtsp://192.168.0.10:554/videoservice Since this is live video I don't need to be able to control it (pause, rewind, etc.), just play it. Is this supported by MediaElement or another standard class, do I need something like the Smooth Streaming Client SDK, or is this a lot more complicated than I thought? Update: I downloaded Microsoft's Player Framework, but this doesn't play the stream either, and I can't find anything in the examples about RTSP. Update: I used Wireshark to compare the packets that VLC Media Player (which works) sends

Planar YUV420 data layout

本秂侑毒 submitted on 2019-12-03 03:10:00
In my project I use the OpenH264 codec, which is said to output data in the YUV 4:2:0 planar format. After decoding I get one array with width * height * 1.5 elements, which, when displayed, looks like this image: http://o3d.googlecode.com/svn/trunk/samples_webgl/assets/shaving_cream.png Why are there four areas below the main one (which contains the Y, i.e. grayscale, elements), instead of two, as in my second picture? Does that mean the format is different, or am I wrong and my world just collapsed? I thought that the result should have looked like this: It is exactly the way you
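For reference, the layout the question describes can be captured in a few arithmetic helpers: an I420 frame is one full-resolution Y plane followed by two quarter-resolution chroma planes, which is where the width * height * 1.5 total comes from. The helper names below are ours, purely for illustration.

```java
// Expected memory layout of an I420 (planar YUV 4:2:0) frame:
// [ Y: w*h bytes ][ U: (w/2)*(h/2) bytes ][ V: (w/2)*(h/2) bytes ]
// Hypothetical helpers, not part of any codec API.
class I420Layout {
    static int ySize(int w, int h)   { return w * h; }
    static int uOffset(int w, int h) { return w * h; }
    static int vOffset(int w, int h) { return w * h + (w / 2) * (h / 2); }
    static int total(int w, int h)   { return w * h * 3 / 2; }
}
```

Viewing such a buffer as a single grayscale image of width w therefore shows the U and V data squeezed into extra bands below the Y picture, which can explain the unexpected areas in the screenshot.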

Streaming RTP/RTSP: sync/timestamp problems

拈花ヽ惹草 submitted on 2019-12-03 02:14:59
I'm having some trouble streaming H.264 video over RTSP. The goal is to live-stream a camera image to an RTSP client (ideally a browser plugin in the end). This has been working pretty well so far, except for one problem: the video lags on startup, stutters every few seconds, and has a ~4-second delay. This is bad. Our setup is to encode with x264 (with zerolatency and ultrafast) and package it into RTSP/RTP with libavformat from ffmpeg 0.6.5. For testing, I'm receiving the stream with a GStreamer pipeline via gst-launch when connecting to an RTSP server. However, I've been able to reproduce the
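One thing worth checking in this kind of setup is the RTP timestamping: H.264 over RTP uses a 90 kHz media clock, and timestamps derived from the wall clock (rather than stepped evenly per frame) are a common cause of receiver-side stutter. A minimal sketch of the expected timestamp progression for a constant frame rate, with illustrative names:

```java
// H.264 over RTP uses a 90 kHz clock (RFC 6184). For a constant frame rate,
// the timestamp should advance by exactly 90000 / fps per frame; jitter in
// this step shows up as stutter on the receiver. Hypothetical helper.
class RtpClock {
    static long timestampForFrame(long frameIndex, int fps) {
        return frameIndex * 90000L / fps;
    }
}
```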

H.264 conversion with FFmpeg (from a RTP stream)

二次信任 submitted on 2019-12-03 02:02:53
Question: Environment: I have an IP camera which is capable of streaming its data over RTP in an H.264 encoded format. This raw stream is recorded from the Ethernet, and that is the data I have to work with. Goal: In the end I want to have a *.mp4 file which I can play with common media players (like VLC or Windows Media Player). What have I done so far: I take the raw stream data I have and parse it. Since the data has been transmitted via RTP, I need to take care of the NAL bytes, SPS and PPS. 1. Write a raw file First
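The "take care of the NAL bytes" step can be sketched as follows: RTP carries NAL units without start codes, so to produce a raw Annex B stream that tools like ffmpeg can then wrap into MP4, each depacketized NAL must be prefixed with 0x00000001, and the SPS/PPS (typically delivered in the SDP's sprop-parameter-sets) must appear before the first IDR slice. The class below is a hypothetical illustration, not the asker's code.

```java
import java.io.ByteArrayOutputStream;

// Sketch: turning depacketized RTP NAL units into an Annex B byte stream.
// Callers should pass SPS and PPS first, then the slice NALs, so that
// decoders encountering the file cold can initialize. Illustrative helper.
class AnnexBWriter {
    static byte[] wrap(byte[]... nals) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] startCode = {0, 0, 0, 1};
        for (byte[] nal : nals) {
            out.write(startCode, 0, 4);     // Annex B start code prefix
            out.write(nal, 0, nal.length);  // NAL unit from the RTP payload
        }
        return out.toByteArray();
    }
}
```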

Errors when decoding H.264 frames using ffmpeg

只谈情不闲聊 submitted on 2019-12-03 01:56:14
Question: I am getting the following errors when decoding H.264 frames received from the remote end of an H.264-based SIP video call. I'd appreciate any help in understanding these errors. non-existing PPS 0 referenced decode_slice_header error non-existing PPS 0 referenced decode_slice_header error no frame! non-existing PPS 0 referenced decode_slice_header error non-existing PPS 0 referenced decode_slice_header error no frame! Answer 1: That just means that ffmpeg has not seen a keyframe yet, which carries SPS
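A quick way to confirm the answer's diagnosis is to inspect what the remote end is actually sending: the H.264 NAL unit type lives in the low 5 bits of each NAL header byte, so you can check whether SPS (type 7) and PPS (type 8) ever arrive before the slices. The helper below is a sketch with names of our choosing.

```java
// The decoder needs SPS (type 7) and PPS (type 8) before it can decode a
// slice; until they arrive, ffmpeg logs "non-existing PPS 0 referenced".
// The NAL unit type is the low 5 bits of the first byte after the start code.
class NalType {
    static int typeOf(byte nalHeader) { return nalHeader & 0x1F; }
    static boolean isSps(byte b) { return typeOf(b) == 7; }
    static boolean isPps(byte b) { return typeOf(b) == 8; }
    static boolean isIdr(byte b) { return typeOf(b) == 5; } // keyframe slice
}
```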

usage of start code for H264 video

你说的曾经没有我的故事 submitted on 2019-12-03 00:48:07
I have a general question about the usage of the start code (0x00 0x00 0x00 0x01) for H.264 video. I am not clear about the usage of this start code, as there is no reference to it in the RTP RFCs related to H.264 video. But I do see a lot of references on the net, and particularly on Stack Overflow. I am confused because I see that one client doesn't have this start code while another client uses it. So, I am looking for a specific answer on where this start code should be used and where it shouldn't. KMurali Markus Schumann There are two H.264 stream formats and they are sometimes called Annex
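The two stream formats the answer refers to differ only in how NAL units are delimited: Annex B prefixes each NAL with the 0x00000001 start code, while the length-prefixed format (often called AVCC, used inside MP4) stores a 4-byte big-endian NAL size instead, and RTP uses neither because packet boundaries delimit the NALs. Converting between the first two is just swapping the prefix, as this hypothetical sketch shows:

```java
import java.io.ByteArrayOutputStream;
import java.nio.ByteBuffer;

// Sketch: convert a length-prefixed ("AVCC"/MP4-style) H.264 buffer, where
// each NAL is preceded by a 4-byte big-endian size, into an Annex B stream,
// where each NAL is preceded by the 0x00000001 start code. Illustrative only.
class AvccToAnnexB {
    static byte[] convert(byte[] avcc) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        ByteBuffer in = ByteBuffer.wrap(avcc);
        while (in.remaining() > 4) {
            int len = in.getInt();                           // NAL length prefix
            out.write(0); out.write(0); out.write(0); out.write(1); // start code
            byte[] nal = new byte[len];
            in.get(nal);
            out.write(nal, 0, len);                          // NAL payload unchanged
        }
        return out.toByteArray();
    }
}
```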

Recommendation on the best quality/performance H264 encoder for video encoding?

岁酱吖の submitted on 2019-12-03 00:39:41
I am looking for a video encoder that is fast, requires little CPU power, and produces very good quality MP4 video. The input videos can be in any format and are uploaded by users. The only thing I know is the FFmpeg library. Is there anything else that is better? The program must have a batch utility (exe), which is what I am interested in. I would appreciate it if you would kindly share your knowledge. Thanks. llogan Use x264. It's fast and flexible enough to suit your needs. Other H.264 video encoders are junk compared to it, and this isn't just my opinion. You can use it directly or via ffmpeg. You can get recent

Unable to mux both audio and video

删除回忆录丶 submitted on 2019-12-02 23:41:43
I'm writing an app that records screen capture and audio using MediaCodec. I use MediaMuxer to mux the video and audio to create an mp4 file. I successfully managed to write video and audio separately; however, when I try muxing them together live, the result is unexpected: either the audio is played without video, or the video is played right after the audio. My guess is that I'm doing something wrong with timestamps, but I can't figure out what exactly. I already looked at these examples: https://github.com/OnlyInAmerica/HWEncoderExperiments/tree/audiotest/HWEncoderExperiments/src/main/java/net/openwatch
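One frequent cause of exactly this symptom is deriving audio presentation timestamps from the wall clock instead of from the number of PCM samples actually delivered, which lets the audio track drift or jump relative to video. A sketch of the sample-count approach, with names of our own choosing rather than anything from the asker's code:

```java
// Deriving the audio presentationTimeUs from the cumulative PCM sample count
// keeps the audio track monotonic and in step with video, independent of how
// the encoder callbacks are scheduled. Hypothetical helper for illustration.
class AudioPts {
    static long ptsMicros(long totalSamplesWritten, int sampleRateHz) {
        return totalSamplesWritten * 1_000_000L / sampleRateHz;
    }
}
```

Both tracks must then share a common time base (e.g. both starting at 0 when recording begins) for the muxed file to stay in sync.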