CMSampleBufferRef

How to convert CMSampleBufferRef to NSData

Submitted by 家住魔仙堡 on 2020-01-10 10:29:09
Question: How do you convert a CMSampleBufferRef to NSData? I've managed to get the data for an MPMediaItem by following Erik Aigner's answer on this thread, but the data is of type CMSampleBufferRef. I know CMSampleBufferRef is a struct, defined in the CMSampleBuffer Reference in the iOS Dev Library, but I don't think I fully understand what it is. None of the CMSampleBuffer functions seems to be an obvious solution.
Answer 1: Here you go; this works for an audio sample buffer, which is what you are …
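The truncated answer evidently extracts the raw audio bytes through Core Media's AudioBufferList route. A minimal Objective-C sketch along those lines, assuming an audio sample buffer (the helper name is illustrative):

    #import <Foundation/Foundation.h>
    #import <CoreMedia/CoreMedia.h>

    // Copy the raw audio payload of an audio CMSampleBuffer into NSData.
    static NSData *DataFromAudioSampleBuffer(CMSampleBufferRef sampleBuffer) {
        AudioBufferList audioBufferList;
        CMBlockBufferRef blockBuffer = NULL;
        // Fills audioBufferList with pointers into blockBuffer's storage;
        // the block buffer is retained so the pointers stay valid below.
        CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
            sampleBuffer, NULL, &audioBufferList, sizeof(audioBufferList),
            NULL, NULL, 0, &blockBuffer);

        NSMutableData *data = [NSMutableData data];
        for (UInt32 i = 0; i < audioBufferList.mNumberBuffers; i++) {
            AudioBuffer buffer = audioBufferList.mBuffers[i];
            [data appendBytes:buffer.mData length:buffer.mDataByteSize];
        }
        CFRelease(blockBuffer);
        return data;
    }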

Resizing CMSampleBufferRef provided by captureStillImageBracketAsynchronouslyFromConnection:withSettingsArray:completionHandler:

Submitted by China☆狼群 on 2019-12-24 11:35:46
Question: In the app I'm working on, we're capturing photos that need to have a 4:3 aspect ratio in order to maximize the field of view we capture. Up until now we were using the AVCaptureSessionPreset640x480 preset, but now we need a larger resolution. As far as I've figured, the only other two 4:3 formats are 2592x1936 and 3264x2448. Since these are too large for our use case, I need a way to downsize them. I looked into a bunch of options but did not find a way (preferably without copying the …
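One route that never touches the raw bytes directly is to render the buffer's image through Core Image into a smaller CVPixelBuffer. A sketch under that assumption; the helper name and the BGRA output format are illustrative choices, not from the question:

    #import <AVFoundation/AVFoundation.h>
    #import <CoreImage/CoreImage.h>

    // Render the sample buffer's pixels, scaled, into a new CVPixelBuffer.
    static CVPixelBufferRef CreateScaledPixelBuffer(CMSampleBufferRef sampleBuffer,
                                                    size_t width, size_t height) {
        CVImageBufferRef src = CMSampleBufferGetImageBuffer(sampleBuffer);
        if (src == NULL) return NULL;

        CVPixelBufferRef dst = NULL;
        CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                            kCVPixelFormatType_32BGRA, NULL, &dst);
        if (dst == NULL) return NULL;

        // Scale to the target size with an affine transform, then render.
        CIImage *image = [CIImage imageWithCVPixelBuffer:src];
        CGFloat sx = (CGFloat)width / CVPixelBufferGetWidth(src);
        CGFloat sy = (CGFloat)height / CVPixelBufferGetHeight(src);
        image = [image imageByApplyingTransform:CGAffineTransformMakeScale(sx, sy)];

        CIContext *context = [CIContext contextWithOptions:nil]; // cache in real code
        [context render:image toCVPixelBuffer:dst];
        return dst; // caller releases with CVPixelBufferRelease
    }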

Capture still UIImage without compression (from CMSampleBufferRef)?

Submitted by 女生的网名这么多〃 on 2019-12-21 05:39:32
Question: I need to obtain a UIImage from uncompressed image data from a CMSampleBufferRef. I'm using the code:

    [captureStillImageOutput captureStillImageAsynchronouslyFromConnection:connection
        completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
            // that famous function from Apple docs found on a lot of websites
            // does NOT work for still images
            UIImage *capturedImage = [self imageFromSampleBuffer:imageSampleBuffer];
    }];

http://developer.apple.com/library/ios/#qa/qa1702/_index.html …
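A common reason the QA1702 imageFromSampleBuffer: function fails for stills is that AVCaptureStillImageOutput delivers JPEG-compressed buffers by default, so CMSampleBufferGetImageBuffer returns NULL. Requesting uncompressed BGRA output first is one likely fix (a sketch, not confirmed by the truncated question):

    // Ask the still image output for uncompressed BGRA pixel buffers so the
    // delivered sample buffer carries a CVPixelBuffer that QA1702-style
    // conversion code can read.
    captureStillImageOutput.outputSettings =
        @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };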

Why AVSampleBufferDisplayLayer stops showing CMSampleBuffers taken from AVCaptureVideoDataOutput's delegate?

Submitted by 血红的双手。 on 2019-12-20 14:45:07
Question: I want to display some CMSampleBuffers with an AVSampleBufferDisplayLayer, but it freezes after showing the first sample. I get the sample buffers from the AVCaptureVideoDataOutput sample buffer delegate:

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
        didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
               fromConnection:(AVCaptureConnection *)connection {
        CFRetain(sampleBuffer);
        [self imageToBuffer:sampleBuffer];
        CFRelease(sampleBuffer);
    }

and put them into a vector:

    - (void)imageToBuffer: …
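Buffers straight from the camera carry presentation timestamps on the capture clock, and a display layer whose timebase never reaches those times will show the first frame and then wait forever. One common workaround is to mark each buffer for immediate display before enqueueing it; a sketch (displayLayer is an assumed property name):

    // Tell the layer to render this buffer as soon as it is enqueued,
    // ignoring its presentation timestamp.
    CFArrayRef attachments = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, true);
    CFMutableDictionaryRef attachment =
        (CFMutableDictionaryRef)CFArrayGetValueAtIndex(attachments, 0);
    CFDictionarySetValue(attachment, kCMSampleAttachmentKey_DisplayImmediately,
                         kCFBooleanTrue);
    [self.displayLayer enqueueSampleBuffer:sampleBuffer];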

Set rate at which AVSampleBufferDisplayLayer renders sample buffers

Submitted by 做~自己de王妃 on 2019-12-19 04:23:43
Question: I am using an AVSampleBufferDisplayLayer to display CMSampleBuffers which are coming over a network connection in the H.264 format. Video playback is smooth and working correctly, but I cannot seem to control the frame rate. Specifically, if I enqueue 60 frames per second into the AVSampleBufferDisplayLayer, it displays those 60 frames, even though the video is being recorded at 30 FPS. When creating sample buffers, it is possible to set the presentation time stamp by passing a timing info …
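The layer paces rendering against its controlTimebase; without one (or with a mismatched rate) it renders buffers as fast as they are enqueued. A sketch that attaches a host-clock timebase so PTS-stamped buffers play at normal speed (firstBufferPTS and displayLayer are illustrative names):

    // Drive the layer from an explicit timebase so enqueued buffers are
    // rendered according to their presentation timestamps.
    CMTimebaseRef timebase = NULL;
    CMTimebaseCreateWithMasterClock(kCFAllocatorDefault,
                                    CMClockGetHostTimeClock(), &timebase);
    CMTimebaseSetTime(timebase, firstBufferPTS); // start at the first frame's PTS
    CMTimebaseSetRate(timebase, 1.0);            // 1.0 = real time, 0.5 = half speed
    displayLayer.controlTimebase = timebase;
    CFRelease(timebase); // the layer retains it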

Deep Copy of Audio CMSampleBuffer

Submitted by 喜你入骨 on 2019-12-18 11:55:55
Question: I am trying to create a copy of a CMSampleBuffer as returned by captureOutput in an AVCaptureAudioDataOutputSampleBufferDelegate. The problem I am having is that frames coming from the delegate method captureOutput:didOutputSampleBuffer:fromConnection: are dropped after I retain them in a CFArray for a long time. Obviously, I need to create deep copies of the incoming buffers for further processing. I also know that CMSampleBufferCreateCopy only creates shallow copies. There are a few related …
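A deep copy means pulling the audio bytes into memory you own and rebuilding a sample buffer around them, so the original can go back to the capture pool. A sketch, assuming a single interleaved audio buffer (mNumberBuffers == 1); the helper name is illustrative:

    #import <CoreMedia/CoreMedia.h>

    // Deep-copy an audio CMSampleBuffer: copy its payload into freshly
    // allocated memory, then rebuild a sample buffer that owns it.
    static CMSampleBufferRef CreateAudioDeepCopy(CMSampleBufferRef sbuf) {
        AudioBufferList abl;
        CMBlockBufferRef srcBlock = NULL;
        CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
            sbuf, NULL, &abl, sizeof(abl), NULL, NULL, 0, &srcBlock);

        // New block buffer backed by its own allocation, then copy the bytes
        // so nothing references the capture pool any more.
        size_t size = abl.mBuffers[0].mDataByteSize;
        CMBlockBufferRef dstBlock = NULL;
        CMBlockBufferCreateWithMemoryBlock(kCFAllocatorDefault, NULL, size,
                                           kCFAllocatorDefault, NULL, 0, size,
                                           kCMBlockBufferAssureMemoryNowFlag,
                                           &dstBlock);
        CMBlockBufferReplaceDataBytes(abl.mBuffers[0].mData, dstBlock, 0, size);

        // Reuse the source's (immutable) format description and timing.
        CMSampleBufferRef copy = NULL;
        CMAudioSampleBufferCreateReadyWithPacketDescriptions(
            kCFAllocatorDefault, dstBlock,
            CMSampleBufferGetFormatDescription(sbuf),
            CMSampleBufferGetNumSamples(sbuf),
            CMSampleBufferGetPresentationTimeStamp(sbuf),
            NULL, &copy);
        CFRelease(srcBlock);
        CFRelease(dstBlock);
        return copy; // caller releases with CFRelease
    }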

CMSampleBufferRef kCMSampleBufferAttachmentKey_TrimDurationAtStart crash

Submitted by 天大地大妈咪最大 on 2019-12-12 13:43:03
Question: This has been bothering me for a while. I have a video converter that converts videos into the .mp4 format, but there is a crash that happens on some videos, not all. Here is the crash log:

    *** Terminating app due to uncaught exception 'NSInvalidArgumentException',
    reason: '*** -[AVAssetWriterInput appendSampleBuffer:] Cannot append sample
    buffer: First input buffer must have an appropriate
    kCMSampleBufferAttachmentKey_TrimDurationAtStart since the codec has
    encoder delay'

Here is my code:

    NSURL …
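AVAssetWriterInput raises this for codecs such as AAC whose first packets are priming (encoder-delay) frames: the writer needs to know how much leading audio to trim. One commonly cited fix is to attach a trim duration to the first audio buffer before appending; a sketch, where the 1024-frame priming count and 44100 Hz rate are typical AAC values assumed here, not taken from the question:

    // Attach the encoder-delay trim to the very first audio sample buffer.
    CFDictionaryRef trim = CMTimeCopyAsDictionary(CMTimeMake(1024, 44100),
                                                  kCFAllocatorDefault);
    CMSetAttachment(sampleBuffer,
                    kCMSampleBufferAttachmentKey_TrimDurationAtStart,
                    trim, kCMAttachmentMode_ShouldPropagate);
    CFRelease(trim);
    [assetWriterInput appendSampleBuffer:sampleBuffer];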

How to get the timestamp of each video frame in iOS while decoding a video.mp4

Submitted by 孤者浪人 on 2019-12-11 06:02:01
Question: Scenario: I am writing an iOS app to try to decode a videoFile.mp4. I am using AVAssetReaderTrackOutput with AVAssetReader to decode frames from the video file. This works very well; I get each and every frame from videoFile.mp4, basically using the following logic at the core. Code:

    AVAssetReader *videoFileReader;
    AVAssetReaderTrackOutput *assetReaderOutput =
        [videoFileReader.outputs objectAtIndex:0];
    CMSampleBufferRef sampleBuffer = [assetReaderOutput copyNextSampleBuffer];

sampleBuffer is the …
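Each decoded buffer already carries its timing; CMSampleBufferGetPresentationTimeStamp returns the time at which the frame should be shown. A sketch of the read loop with the timestamp logged:

    CMSampleBufferRef sampleBuffer = [assetReaderOutput copyNextSampleBuffer];
    while (sampleBuffer != NULL) {
        // Presentation time of this frame, converted to seconds for logging.
        CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        NSLog(@"frame at %.3f s", CMTimeGetSeconds(pts));
        CFRelease(sampleBuffer); // copyNextSampleBuffer follows the Create rule
        sampleBuffer = [assetReaderOutput copyNextSampleBuffer];
    }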

Convert a CMSampleBuffer into a UIImage

Submitted by 痞子三分冷 on 2019-12-10 16:27:43
Question: Here's a function (code from Apple documentation) that converts a CMSampleBuffer into a UIImage:

    func imageFromSampleBuffer(sampleBuffer: CMSampleBuffer) -> UIImage {
        // Get a CMSampleBuffer's Core Video image buffer for the media data
        var imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
        // Lock the base address of the pixel buffer
        CVPixelBufferLockBaseAddress(imageBuffer, 0)
        // Get the number of bytes per row for the pixel buffer
        var baseAddress = CVPixelBufferGetBaseAddress …
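For uncompressed video frames a shorter route than the bitmap-context version is Core Image, sketched here in Objective-C like the rest of this page (it assumes CMSampleBufferGetImageBuffer returns a non-NULL pixel buffer):

    // Convert a video CMSampleBuffer to a UIImage via Core Image.
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    CIContext *context = [CIContext contextWithOptions:nil]; // cache in real code
    CGImageRef cgImage = [context createCGImage:ciImage fromRect:ciImage.extent];
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);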

error converting AudioBufferList to CMBlockBufferRef

Submitted by て烟熏妆下的殇ゞ on 2019-12-08 15:48:55
Question: I am trying to take a video file, read it in using AVAssetReader, and pass the audio off to Core Audio for processing (adding effects and stuff) before saving it back out to disk using AVAssetWriter. I would like to point out that if I set the componentSubType on the AudioComponentDescription of my output node to RemoteIO, things play correctly through the speakers. This makes me confident that my AUGraph is properly set up, as I can hear things working. I am setting the subType to GenericOutput, though …
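The reverse direction (a processed AudioBufferList back into a CMSampleBuffer that AVAssetWriter can append) can go through CMSampleBufferSetDataBufferFromAudioBufferList. A sketch assuming interleaved LPCM described by asbd; every name here is illustrative:

    #import <CoreMedia/CoreMedia.h>

    // Wrap a processed AudioBufferList into a CMSampleBuffer for writing.
    static CMSampleBufferRef CreateSampleBufferFromABL(
            const AudioBufferList *abl,
            const AudioStreamBasicDescription *asbd,
            CMItemCount numFrames, CMTime pts) {
        CMAudioFormatDescriptionRef format = NULL;
        CMAudioFormatDescriptionCreate(kCFAllocatorDefault, asbd,
                                       0, NULL, 0, NULL, NULL, &format);

        // One timing entry covers all frames: each lasts 1/sampleRate seconds.
        CMSampleTimingInfo timing = {
            CMTimeMake(1, (int32_t)asbd->mSampleRate), pts, kCMTimeInvalid };
        CMSampleBufferRef sbuf = NULL;
        CMSampleBufferCreate(kCFAllocatorDefault, NULL, false, NULL, NULL,
                             format, numFrames, 1, &timing, 0, NULL, &sbuf);

        // Copies the list's bytes into a block buffer owned by sbuf.
        CMSampleBufferSetDataBufferFromAudioBufferList(
            sbuf, kCFAllocatorDefault, kCFAllocatorDefault, 0, abl);
        CFRelease(format);
        return sbuf; // caller releases with CFRelease
    }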