video-toolbox

Decode h264 video stream to get image buffer

佐手、 submitted on 2019-12-06 02:23:17
I followed this post to decode my H.264 video stream frames. My data frames are as below. My code:

NSString * const naluTypesStrings[] =
{
    @"0: Unspecified (non-VCL)",
    @"1: Coded slice of a non-IDR picture (VCL)",    // P frame
    @"2: Coded slice data partition A (VCL)",
    @"3: Coded slice data partition B (VCL)",
    @"4: Coded slice data partition C (VCL)",
    @"5: Coded slice of an IDR picture (VCL)",       // I frame
    @"6: Supplemental enhancement information (SEI) (non-VCL)",
    @"7: Sequence parameter set (non-VCL)",          // SPS parameter
    @"8: Picture parameter set (non-VCL)",           // PPS parameter
    @"9: Access unit delimiter
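For context, here is a minimal sketch of how such a lookup table is typically indexed (my illustration, not the poster's code: it assumes Annex B framing with a 4-byte 00 00 00 01 start code, and logNALUType is a hypothetical helper):

#import <Foundation/Foundation.h>

extern NSString * const naluTypesStrings[];   // the lookup table above

// The NAL unit type is the low 5 bits of the first byte after the start
// code; this assumes a 4-byte Annex B start code (00 00 00 01).
static void logNALUType(const uint8_t *frame, size_t frameSize)
{
    if (frameSize <= 4) return;               // too short to hold a NAL header
    int naluType = frame[4] & 0x1F;           // values 0-31
    if (naluType < 10) {                      // only the entries listed above
        NSLog(@"Received NALU type: %@", naluTypesStrings[naluType]);
    }
}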

Why does AVSampleBufferDisplayLayer fail with Operation Interrupted (-11847)?

旧街凉风 submitted on 2019-12-05 10:25:42
I'm using an AVSampleBufferDisplayLayer to decode and display H.264 video streamed from a server. When my app goes into the background and then returns to the foreground, the decoding process gets screwed up and the AVSampleBufferDisplayLayer fails. The error I'm seeing is:

H.264 decoding layer has failed: Error Domain=AVFoundationErrorDomain Code=-11847 "Operation Interrupted" UserInfo=0x17426c500 {NSUnderlyingError=0x17805fe90 "The operation couldn’t be completed. (OSStatus error -12084.)", NSLocalizedRecoverySuggestion=Stop other operations and try again., NSLocalizedDescription=Operation
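One recovery pattern for this situation (my sketch, not from the question: videoLayer is a hypothetical AVSampleBufferDisplayLayer property, and depending on the failure the layer and its decompression session may need to be recreated rather than merely flushed) is to check the layer's status when the app becomes active again:

#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>

- (void)observeForegroundTransitions
{
    __weak typeof(self) weakSelf = self;
    [[NSNotificationCenter defaultCenter]
        addObserverForName:UIApplicationDidBecomeActiveNotification
                    object:nil
                     queue:[NSOperationQueue mainQueue]
                usingBlock:^(NSNotification *note) {
        // Sample buffers queued before the interruption are invalid;
        // discard them so new frames can be enqueued cleanly.
        if (weakSelf.videoLayer.status == AVQueuedSampleBufferRenderingStatusFailed) {
            [weakSelf.videoLayer flushAndRemoveImage];
        }
    }];
}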

How to extract motion vectors from H.264 AVC CMBlockBufferRef after VTCompressionSessionEncodeFrame

隐身守侯 submitted on 2019-12-04 19:20:31
I'm trying to read and understand the CMBlockBufferRef representation of an H.264 AVC 1/30s frame. The buffer and the encapsulating CMSampleBufferRef are created using a VTCompressionSessionRef. https://gist.github.com/petershine/de5e3d8487f4cfca0a1d The H.264 data is represented as an AVC memory buffer, a CMBlockBufferRef obtained from the compressed sample. Without fully decompressing again, I'm trying to extract motion vectors or predictions from this CMBlockBufferRef. I believe that for the fastest performance, byte-by-byte reading from the data buffer using CMBlockBufferGetDataPointer() should be necessary. However
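It's worth noting that motion vectors live inside the entropy-coded slice data, so recovering them would require a CAVLC/CABAC parse of the bitstream; VideoToolbox does not expose them directly. What CMBlockBufferGetDataPointer() does make cheap is walking the raw NAL units. A sketch of that walk (my illustration, assuming the common 4-byte big-endian AVCC length prefix, i.e. the NALUnitHeaderLength reported by the format description):

#import <Foundation/Foundation.h>
#import <CoreMedia/CoreMedia.h>

// Walk the AVCC-formatted NAL units inside a CMBlockBufferRef without
// copying: each unit is preceded by a 4-byte big-endian length field.
static void walkNALUnits(CMBlockBufferRef blockBuffer)
{
    size_t totalLength = 0;
    char *data = NULL;
    OSStatus status = CMBlockBufferGetDataPointer(blockBuffer, 0, NULL,
                                                  &totalLength, &data);
    if (status != kCMBlockBufferNoErr) return;

    size_t offset = 0;
    while (offset + 4 <= totalLength) {
        uint32_t naluLength = 0;
        memcpy(&naluLength, data + offset, 4);        // unaligned-safe read
        naluLength = CFSwapInt32BigToHost(naluLength);
        offset += 4;
        if (naluLength > totalLength - offset) break; // malformed length
        int naluType = data[offset] & 0x1F;           // low 5 bits of NAL header
        NSLog(@"NALU type %d, length %u bytes", naluType, naluLength);
        offset += naluLength;
    }
}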

How to use VideoToolbox to decompress H.264 video stream

陌路散爱 submitted on 2019-11-25 20:16:04
I had a lot of trouble figuring out how to use Apple's hardware-accelerated video framework to decompress an H.264 video stream. After a few weeks I figured it out and wanted to share an extensive example, since I couldn't find one. My goal is to give a thorough, instructive example of Video Toolbox, introduced in WWDC '14 session 513. My code will not compile or run since it needs to be integrated with an elementary H.264 stream (like a video read from a file or streamed from online, etc.) and needs to be tweaked depending on the specific case. I should mention that I have very little experience
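For reference, the core of such a decoder (my sketch under stated assumptions, not the poster's code: sps/pps point to parameter-set NALUs with their start codes already stripped, and the output callback is supplied by the caller) is creating a format description from the SPS/PPS and then a decompression session from it:

#import <VideoToolbox/VideoToolbox.h>

// Create a VTDecompressionSession from raw SPS/PPS parameter sets.
// Returns NULL on failure; the caller owns the returned session.
static VTDecompressionSessionRef
CreateDecompressionSession(const uint8_t *sps, size_t spsSize,
                           const uint8_t *pps, size_t ppsSize,
                           VTDecompressionOutputCallback outputCallback)
{
    CMVideoFormatDescriptionRef formatDesc = NULL;
    const uint8_t *parameterSets[2] = { sps, pps };
    const size_t parameterSetSizes[2] = { spsSize, ppsSize };

    OSStatus status = CMVideoFormatDescriptionCreateFromH264ParameterSets(
        kCFAllocatorDefault, 2, parameterSets, parameterSetSizes,
        4 /* AVCC length-prefix size */, &formatDesc);
    if (status != noErr) return NULL;

    VTDecompressionOutputCallbackRecord callbackRecord = { outputCallback, NULL };
    VTDecompressionSessionRef session = NULL;
    status = VTDecompressionSessionCreate(kCFAllocatorDefault, formatDesc,
                                          NULL,   // default decoder
                                          NULL,   // default pixel buffer attrs
                                          &callbackRecord, &session);
    CFRelease(formatDesc);
    return (status == noErr) ? session : NULL;
}

Each incoming Annex B NALU then has its start code replaced by a 4-byte length before being wrapped in a CMSampleBuffer and handed to VTDecompressionSessionDecodeFrame.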