CMSampleBufferRef

Memory Leak in CMSampleBufferGetImageBuffer

Submitted by 我与影子孤独终老i on 2019-12-01 08:35:53
Question: I'm getting a UIImage from a CMSampleBufferRef video buffer every N video frames, like this:

- (void)imageFromVideoBuffer:(void (^)(UIImage *image))completion {
    CMSampleBufferRef sampleBuffer = _myLastSampleBuffer;
    if (sampleBuffer != nil) {
        CFRetain(sampleBuffer);
        CIImage *ciImage = [CIImage imageWithCVPixelBuffer:CMSampleBufferGetImageBuffer(sampleBuffer)];
        _lastAppendedVideoBuffer.sampleBuffer = nil;
        if (_context == nil) {
            _context = [CIContext contextWithOptions:nil];
        }
        CVPixelBufferRef buffer =
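
Since the excerpt cuts off here, the following is only a hedged sketch of how the ownership bookkeeping is usually balanced in this kind of method: every CFRetain gets a matching CFRelease, the CGImageRef created by CIContext (a Create-rule object) is released, and an @autoreleasepool drains Core Image temporaries between frames. The _myLastSampleBuffer and _context ivars come from the excerpt; everything else is an assumption, not the asker's actual fix.

    // Sketch: convert one retained sample buffer to UIImage without leaking.
    - (void)imageFromVideoBuffer:(void (^)(UIImage *image))completion {
        CMSampleBufferRef sampleBuffer = _myLastSampleBuffer;   // ivar from the excerpt
        if (sampleBuffer == NULL) {
            return;
        }

        CFRetain(sampleBuffer);
        @autoreleasepool {
            CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
            CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];

            if (_context == nil) {
                _context = [CIContext contextWithOptions:nil];
            }

            // createCGImage:fromRect: follows the Create rule, so release the result.
            CGImageRef cgImage = [_context createCGImage:ciImage fromRect:ciImage.extent];
            UIImage *image = [UIImage imageWithCGImage:cgImage];
            CGImageRelease(cgImage);

            if (completion) {
                completion(image);
            }
        }
        CFRelease(sampleBuffer);   // balance the CFRetain above
    }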

Split CMSampleBufferRef containing Audio

Submitted by 做~自己de王妃 on 2019-12-01 00:59:26
I'm splitting the recording into different files while recording... The problem is that the video and audio sample buffers delivered to captureOutput don't correspond 1:1 (which is logical):

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection

AUDIO START: 36796.833236847 | DURATION: 0.02321995464852608 | END: 36796.856456802
VIDEO START: 36796.842089239 | DURATION: nan | END: nan
AUDIO START: 36796.856456805 | DURATION: 0.02321995464852608 | END: 36796.87967676
AUDIO START: 36796.879676764 | DURATION: 0
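
Since the question is truncated, here is only a hedged sketch of one common way to handle the mismatch: compute each buffer's presentation time range with CMSampleBufferGetPresentationTimeStamp / CMSampleBufferGetDuration and only cut the file on an audio buffer boundary, so audio samples are never split mid-buffer. self.audioOutput, self.splitTime, and -rotateWritersAtTime: are hypothetical names used for illustration, not APIs or the asker's code.

    // Sketch: decide where to cut based on the audio buffer's time range.
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        CMTime start = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        CMTime duration = CMSampleBufferGetDuration(sampleBuffer);

        // Video buffers often report an invalid duration (printed as "nan");
        // treat it as zero when computing the end time.
        CMTime end = CMTIME_IS_VALID(duration) ? CMTimeAdd(start, duration) : start;

        BOOL isAudio = (captureOutput == self.audioOutput);   // hypothetical output property
        if (isAudio && CMTimeCompare(end, self.splitTime) >= 0) {
            // Close the current segment at this audio boundary and start the next one here,
            // so interleaved video frames with later timestamps go to the new file.
            [self rotateWritersAtTime:end];                    // hypothetical helper
        }

        // ...append sampleBuffer to whichever writer input is currently active...
    }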

How to convert CMSampleBufferRef to NSData

Submitted by 这一生的挚爱 on 2019-11-30 07:01:48
How do you convert a CMSampleBufferRef to NSData? I've managed to get the data for an MPMediaItem by following Erik Aigner's answer on this thread; however, the data is of type CMSampleBufferRef. I know CMSampleBufferRef is a struct and is defined in the CMSampleBuffer Reference in the iOS Dev Library, but I don't think I fully understand what it is. None of the CMSampleBuffer functions seem to be an obvious solution.

Daniel: Here you go. This works for an audio sample buffer, which is what you are looking at, and if you want to look at the whole process (getting all audio data from an MPMediaItem into
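
The answer is cut off above, so the following is a minimal sketch along the same lines for audio sample buffers: pull the CMBlockBuffer out of the sample buffer and copy its bytes into an NSData. The helper name is mine, not from the thread.

    #import <CoreMedia/CoreMedia.h>

    // Sketch: copy the raw bytes of an audio sample buffer into NSData.
    static NSData *DataFromAudioSampleBuffer(CMSampleBufferRef sampleBuffer) {
        CMBlockBufferRef blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer);
        if (blockBuffer == NULL) {
            // Uncompressed video buffers keep their pixels in a CVImageBuffer instead.
            return nil;
        }

        size_t length = CMBlockBufferGetDataLength(blockBuffer);
        NSMutableData *data = [NSMutableData dataWithLength:length];
        OSStatus status = CMBlockBufferCopyDataBytes(blockBuffer, 0, length, data.mutableBytes);
        return (status == kCMBlockBufferNoErr) ? data : nil;
    }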

iOS - Scale and crop CMSampleBufferRef/CVImageBufferRef

Submitted by 落爺英雄遲暮 on 2019-11-26 22:00:47
I am using AVFoundation and getting the sample buffer from AVCaptureVideoDataOutput; I can write it directly to videoWriter by using:

- (void)writeBufferFrame:(CMSampleBufferRef)sampleBuffer {
    CMTime lastSampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    if (self.videoWriter.status != AVAssetWriterStatusWriting) {
        [self.videoWriter startWriting];
        [self.videoWriter startSessionAtSourceTime:lastSampleTime];
    }
    [self.videoWriterInput appendSampleBuffer:sampleBuffer];
}

What I want to do now is to crop and scale the image inside the CMSampleBufferRef without converting it into
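
The question is truncated here, so the following is only a hedged sketch of one common route: crop and scale through CIImage, render the result into a pixel buffer drawn from an AVAssetWriterInputPixelBufferAdaptor's pool, and append that buffer instead of the raw sample buffer. self.adaptor, self.ciContext, and the crop/scale values are assumptions for illustration.

    // Sketch: crop + scale a captured frame before handing it to the asset writer.
    - (void)writeCroppedBufferFrame:(CMSampleBufferRef)sampleBuffer {
        CMTime time = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        CVPixelBufferRef source = CMSampleBufferGetImageBuffer(sampleBuffer);

        CIImage *image = [CIImage imageWithCVPixelBuffer:source];
        image = [image imageByCroppingToRect:CGRectMake(0, 0, 640, 480)];              // example crop
        image = [image imageByApplyingTransform:CGAffineTransformMakeScale(0.5, 0.5)]; // example scale

        // The adaptor's pool is only available after the writer has started writing.
        CVPixelBufferRef output = NULL;
        CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, self.adaptor.pixelBufferPool, &output);
        if (output == NULL) {
            return;
        }

        [self.ciContext render:image toCVPixelBuffer:output];
        [self.adaptor appendPixelBuffer:output withPresentationTime:time];
        CVPixelBufferRelease(output);
    }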