H.264

Get PTS from raw H264 mdat generated by iOS AVAssetWriter

Submitted by 一世执手 on 2019-12-03 13:24:56
Question: I'm trying to simultaneously read and write an H.264 MOV file written by AVAssetWriter. I managed to extract individual NAL units, pack them into ffmpeg's AVPackets, and write them into another video format using ffmpeg. It works, and the resulting file plays well, except that the playback speed is not right. How do I calculate the correct PTS/DTS values from raw H.264 data? Or maybe there exists some other way to get them? Here's what I've tried: limit the capture min/max frame rate to 30 and assume that
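
A minimal sketch of that last idea, assuming a constant 30 fps capture rate and the 90 kHz MPEG time base (both assumptions here, not values read from the stream; the helper name is illustrative). With Baseline profile there are no B-frames, so DTS can simply equal PTS:

#include <libavcodec/avcodec.h>
#include <stdint.h>

/* Hypothetical timestamping for a fixed-rate Baseline stream (no B-frames). */
static void stamp_packet(AVPacket *pkt, int64_t frame_index)
{
    const int64_t time_base_hz = 90000;                 /* 90 kHz time base */
    const int64_t fps = 30;                             /* assumed capture rate */
    const int64_t frame_duration = time_base_hz / fps;  /* 3000 ticks per frame */

    pkt->pts = frame_index * frame_duration;
    pkt->dts = pkt->pts;          /* valid only when there are no B-frames */
    pkt->duration = frame_duration;
}

If the output stream's time base differs from 1/90000, rescale the values with av_rescale_q() before writing the packet.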

How to implement video DRM in iOS

Submitted by 允我心安 on 2019-12-03 13:17:25
Question: I'm looking to implement DRM in an iOS video player, but I'm not sure how to approach it. In order to implement video DRM (while still using Apple's hardware-accelerated H.264 decode), I need a way to feed the decrypted H.264 stream into the standard iOS video playback APIs. According to this question, it was not possible to implement third-party DRM in September 2010. There's a thread in the Apple Developer Forums that goes nowhere. However, as of today a number of third-party DRM libraries

FFmpeg: Read profile level information from mp4

Submitted by 佐手、 on 2019-12-03 13:09:51
I have an MP4 file and need its profile level. FFmpeg says it has Baseline profile, which is what I need, but I also need the level. Here is what I get from FFmpeg:

ffmpeg version 0.8, Copyright (c) 2000-2011 the FFmpeg developers
built on Jul 20 2011 13:32:19 with gcc 4.4.3
configuration: --enable-gpl --enable-version3 --enable-nonfree --enable-postproc --enable-libfaac --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libtheora --enable-libvorbis --enable-libx264
libavutil    51.  9. 1 / 51.  9. 1
libavcodec   53.  7. 0 / 53.  7. 0
libavformat  53.  4. 0 / 53.  4. 0
libavdevice  53.  1
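
The level is also exposed programmatically. A short sketch using the current libavformat API (note this is newer than the 0.8 build above, which still used AVCodecContext rather than AVCodecParameters):

#include <libavformat/avformat.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    AVFormatContext *fmt = NULL;
    if (argc < 2 || avformat_open_input(&fmt, argv[1], NULL, NULL) < 0)
        return 1;
    avformat_find_stream_info(fmt, NULL);

    for (unsigned i = 0; i < fmt->nb_streams; i++) {
        AVCodecParameters *par = fmt->streams[i]->codecpar;
        if (par->codec_type == AVMEDIA_TYPE_VIDEO)
            /* level is an integer, e.g. 30 for H.264 level 3.0 */
            printf("profile=%d level=%d\n", par->profile, par->level);
    }
    avformat_close_input(&fmt);
    return 0;
}

From the shell, ffprobe -show_streams <file> also prints profile and level for each stream.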

How can we get H.264 encoded video stream from iPhone Camera?

Submitted by 强颜欢笑 on 2019-12-03 13:06:04
Question: I am using the following to get the video sample buffer:

- (void) writeSampleBufferStream:(CMSampleBufferRef)sampleBuffer ofType:(NSString *)mediaType

Now my question is: how can I get H.264-encoded NSData from the sampleBuffer above? Please suggest.

Answer 1: Update for 2017: you can now stream video and audio by using the VideoToolbox API. Read the documentation here: VTCompressionSession. Original answer (from 2013): Short answer: you can't; the sample buffer you receive is uncompressed. Methods to get
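
A rough sketch of the VideoToolbox route mentioned in the update (C API; error handling omitted, and the function names makeSession/encodeFrame/onEncoded are illustrative, not part of any framework): create a compression session once, then feed it the CVPixelBuffer pulled out of each captured sample buffer.

#include <VideoToolbox/VideoToolbox.h>

static void onEncoded(void *refcon, void *frameRefcon, OSStatus status,
                      VTEncodeInfoFlags flags, CMSampleBufferRef sample)
{
    /* 'sample' now holds compressed H.264 data; extract NAL units here. */
}

static VTCompressionSessionRef makeSession(int32_t w, int32_t h)
{
    VTCompressionSessionRef session = NULL;
    VTCompressionSessionCreate(kCFAllocatorDefault, w, h,
                               kCMVideoCodecType_H264,
                               NULL, NULL, NULL,
                               onEncoded, NULL, &session);
    return session;
}

/* Called per captured frame, e.g. from the capture output callback. */
static void encodeFrame(VTCompressionSessionRef session, CMSampleBufferRef sampleBuffer)
{
    CVImageBufferRef pixels = CMSampleBufferGetImageBuffer(sampleBuffer);
    CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    VTCompressionSessionEncodeFrame(session, pixels, pts, kCMTimeInvalid,
                                    NULL, NULL, NULL);
}

To turn the callback's output into an Annex B byte stream, the SPS/PPS can be read from the sample buffer's format description via CMVideoFormatDescriptionGetH264ParameterSetAtIndex.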

Planar YUV420 data layout

Submitted by 不打扰是莪最后的温柔 on 2019-12-03 13:00:20
Question: In my project I use the OpenH264 codec, which is said to output data in the YUV 4:2:0 planar format. After decoding I get one array with width * height * 1.5 elements which, when displayed, looks like this image: http://o3d.googlecode.com/svn/trunk/samples_webgl/assets/shaving_cream.png Why are there four areas below the main one (which contains the Y, i.e. grayscale, elements) instead of two, as in my second picture? Does that mean the format is different, or am I wrong and my
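
For reference, the expected I420 layout is three consecutive planes: Y (w*h bytes), then U (w/2 * h/2), then V (w/2 * h/2), totalling w*h*3/2. A sketch of the index arithmetic, assuming no row padding (the sample_* helper names are mine):

#include <stdint.h>

static uint8_t sample_y(const uint8_t *buf, int w, int x, int y)
{
    return buf[y * w + x];
}

static uint8_t sample_u(const uint8_t *buf, int w, int h, int x, int y)
{
    const uint8_t *u = buf + w * h;                    /* U follows Y */
    return u[(y / 2) * (w / 2) + (x / 2)];             /* 2x2 subsampled */
}

static uint8_t sample_v(const uint8_t *buf, int w, int h, int x, int y)
{
    const uint8_t *v = buf + w * h + (w / 2) * (h / 2); /* V follows U */
    return v[(y / 2) * (w / 2) + (x / 2)];
}

If the chroma region splits into extra bands or shears, a common culprit is ignoring the decoder's per-plane stride (linesize), which is often larger than the nominal width.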

How can I play H.264 RTSP video in Windows 8 Metro C# XAML app?

Submitted by 喜你入骨 on 2019-12-03 12:50:55
Question: I have a device that provides an H.264 video stream from a URL like: rtsp://192.168.0.10:554/videoservice Since this is live video, I don't need to be able to control it (pause, rewind, etc.), just play it. Is this supported by MediaElement or another standard class, do I need something like the Smooth Streaming Client SDK, or is this a lot more complicated than I thought? Update: I downloaded Microsoft's Player Framework, but this doesn't play the stream either. I can't find anything in the examples

MP4 container writer in Java

Submitted by 梦想与她 on 2019-12-03 12:49:46
Question: I would like to find a FREE MP4 (container) writer for Java. I do not need an encoder, only something that can write the correct atoms given their expected values. Bonus points for a library that can also write "valid" F4V. I would prefer a pure Java solution rather than something using JNI or external executables. Answer 1: Even though my answer comes very late, you could have a look at my MP4 Parser/Unparser on GitHub. You can parse MP4 files, modify them, and write the result. You can even start
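
At the byte level the container is simple: every atom/box is a 32-bit big-endian size (including the 8-byte header) followed by a 4-character type and the payload. A minimal illustration in C writing just a trivial ftyp box (a real writer must of course also emit moov/trak/mdat with correct values; helper names are mine):

#include <stdio.h>
#include <stdint.h>
#include <string.h>

static void write_u32_be(FILE *f, uint32_t v)
{
    uint8_t b[4] = { (uint8_t)(v >> 24), (uint8_t)(v >> 16),
                     (uint8_t)(v >> 8),  (uint8_t)v };
    fwrite(b, 1, 4, f);
}

static void write_box(FILE *f, const char type[4],
                      const void *payload, uint32_t payload_len)
{
    write_u32_be(f, 8 + payload_len);   /* size includes the 8-byte header */
    fwrite(type, 1, 4, f);
    fwrite(payload, 1, payload_len, f);
}

int main(void)
{
    FILE *f = fopen("out.mp4", "wb");
    if (!f) return 1;
    /* ftyp payload: major brand, minor version, compatible brands */
    uint8_t ftyp[12];
    memcpy(ftyp, "isom", 4);
    memset(ftyp + 4, 0, 4);             /* minor_version = 0 */
    memcpy(ftyp + 8, "isom", 4);
    write_box(f, "ftyp", ftyp, sizeof ftyp);
    fclose(f);
    return 0;
}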

ffmpeg copyts to preserve timestamp

Submitted by 梦想与她 on 2019-12-03 12:25:49
I am trying to modify an HLS segment transport stream and preserve its start time with ffmpeg. However, the output does not preserve the input file's start_time value, even when -copyts is specified. Here's my command line:

ffmpeg -i fileSequence1.ts -i x.png -filter_complex '[0:v][1:v]overlay[out]' -map '[out]' -map 0:1 -acodec copy -vsync 0 -vcodec libx264 -streamid 0:257 -streamid 1:258 -copyts -profile:v baseline -level 3 output.ts

The start_time value is consistently delayed by about 2 seconds.

/Users/macadmin/>ffmpeg -y -v verbose -i fileSequence0.ts -map 0:0 -vcodec libx264 -copyts -vsync 0
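
One thing worth checking before blaming -copyts: the MPEG-TS muxer inserts a buffering delay of its own by default, which shifts start_time even when timestamps are copied through. Zeroing it with the standard -muxdelay and -muxpreload options is a common fix, e.g. (same command, two flags added):

ffmpeg -i fileSequence1.ts -i x.png -filter_complex '[0:v][1:v]overlay[out]' -map '[out]' -map 0:1 -acodec copy -vsync 0 -vcodec libx264 -streamid 0:257 -streamid 1:258 -copyts -muxdelay 0 -muxpreload 0 -profile:v baseline -level 3 output.ts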

Streaming RTP/RTSP: sync/timestamp problems

Submitted by 人走茶凉 on 2019-12-03 11:47:22
Question: I'm having some trouble streaming H.264 video over RTSP. The goal is to live-stream a camera image to an RTSP client (ideally a browser plugin in the end). This has been working pretty well so far, except for one problem: the video lags on startup, stutters every few seconds, and has a ~4-second delay. This is bad. Our setup is to encode with x264 (with zerolatency & ultrafast) and packetize into RTSP/RTP with libavformat from ffmpeg 0.6.5. For testing, I'm receiving the stream with a GStreamer
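
For reference, a sketch of how those encoder settings are typically passed through libavcodec (option names as understood by the libx264 wrapper; this uses the current API, newer than the 0.6.5 build mentioned above, and the function name is mine):

#include <libavcodec/avcodec.h>
#include <libavutil/opt.h>

static AVCodecContext *open_low_latency_h264(int w, int h)
{
    const AVCodec *codec = avcodec_find_encoder_by_name("libx264");
    AVCodecContext *ctx = avcodec_alloc_context3(codec);

    ctx->width = w;
    ctx->height = h;
    ctx->time_base = (AVRational){1, 30};
    ctx->pix_fmt = AV_PIX_FMT_YUV420P;
    ctx->max_b_frames = 0;              /* B-frames add reordering latency */

    AVDictionary *opts = NULL;
    av_dict_set(&opts, "preset", "ultrafast", 0);
    av_dict_set(&opts, "tune", "zerolatency", 0);
    avcodec_open2(ctx, codec, &opts);
    av_dict_free(&opts);
    return ctx;
}

Note that part of the fixed delay may be receiver-side: GStreamer's rtspsrc maintains its own jitter buffer (the latency property), which by itself can account for a couple of seconds.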

Decoding h264 frames from RTP stream

Submitted by 孤街醉人 on 2019-12-03 08:48:46
I am using the live555 and ffmpeg libraries to get and decode an RTP H.264 stream from a server. The video stream was encoded by ffmpeg, using the Baseline profile and x264_param_default_preset(m_params, "veryfast", "zerolatency"). I read this topic and add SPS and PPS data to every frame that I receive from the network:

void ClientSink::NewFrameHandler(unsigned frameSize, unsigned numTruncatedBytes,
                                 timeval presentationTime, unsigned durationInMicroseconds)
{
    ...
    EncodedFrame tmp;
    tmp.m_frame = std::vector<unsigned char>(m_tempBuffer.data(),
                                             m_tempBuffer.data() + frameSize);
    tmp.m_duration =
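
The prepending step itself is just Annex B framing: each parameter set and the frame's NAL unit get a 4-byte start code in front before the buffer goes to the decoder. A sketch in C (build_annexb is an illustrative helper; out must be sized for the three payloads plus 12 start-code bytes, and sps/pps are assumed to come from the SDP's sprop-parameter-sets):

#include <stdint.h>
#include <string.h>

/* Build an Annex B buffer: 00 00 00 01 SPS, 00 00 00 01 PPS, 00 00 00 01 NAL. */
static size_t build_annexb(uint8_t *out,
                           const uint8_t *sps, size_t sps_len,
                           const uint8_t *pps, size_t pps_len,
                           const uint8_t *nal, size_t nal_len)
{
    static const uint8_t start[4] = {0, 0, 0, 1};
    uint8_t *p = out;
    memcpy(p, start, 4); p += 4; memcpy(p, sps, sps_len); p += sps_len;
    memcpy(p, start, 4); p += 4; memcpy(p, pps, pps_len); p += pps_len;
    memcpy(p, start, 4); p += 4; memcpy(p, nal, nal_len); p += nal_len;
    return (size_t)(p - out);
}

With this framing, ffmpeg's H.264 parser can pick up the parameter sets in-band, so the decoder does not need separate extradata.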