core-media

Creating copy of CMSampleBuffer in Swift returns OSStatus -12743 (Invalid Media Format)

╄→гoц情女王★ submitted on 2020-01-24 05:46:05
Question: I am attempting to perform a deep clone of CMSampleBuffer to store the output of an AVCaptureSession . I am receiving the error kCMSampleBufferError_InvalidMediaFormat (OSStatus -12743) when I run the function CMSampleBufferCreateForImageBuffer . I don't see how I've mismatched the CVImageBuffer and the CMSampleBuffer format description. Anyone know where I've gone wrong? Here is my test code: func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer …
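
A -12743 here usually means the format description handed to CMSampleBufferCreateForImageBuffer does not describe the image buffer that was actually passed in, for example a description created from the original buffer paired with a freshly created copy. A minimal sketch (modern Swift API names; assumes a non-planar pixel buffer whose copy ends up with the same bytes-per-row, which is not guaranteed) that builds the description from the copied buffer:

```swift
import CoreMedia
import CoreVideo

func copySampleBuffer(_ sampleBuffer: CMSampleBuffer) -> CMSampleBuffer? {
    guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }

    // Deep-copy the pixel data. Simplified: assumes a single-plane buffer
    // and identical bytes-per-row between original and copy.
    CVPixelBufferLockBaseAddress(imageBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(imageBuffer, .readOnly) }

    var copyOut: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault,
                        CVPixelBufferGetWidth(imageBuffer),
                        CVPixelBufferGetHeight(imageBuffer),
                        CVPixelBufferGetPixelFormatType(imageBuffer),
                        nil, &copyOut)
    guard let pixelCopy = copyOut else { return nil }
    CVPixelBufferLockBaseAddress(pixelCopy, [])
    memcpy(CVPixelBufferGetBaseAddress(pixelCopy),
           CVPixelBufferGetBaseAddress(imageBuffer),
           CVPixelBufferGetDataSize(imageBuffer))
    CVPixelBufferUnlockBaseAddress(pixelCopy, [])

    // Build the format description from the COPY, not the original;
    // mismatched descriptions are a typical cause of -12743.
    var formatDesc: CMFormatDescription?
    CMVideoFormatDescriptionCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                                 imageBuffer: pixelCopy,
                                                 formatDescriptionOut: &formatDesc)
    guard let desc = formatDesc else { return nil }

    var timing = CMSampleTimingInfo()
    CMSampleBufferGetSampleTimingInfo(sampleBuffer, at: 0, timingInfoOut: &timing)

    var result: CMSampleBuffer?
    CMSampleBufferCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                       imageBuffer: pixelCopy,
                                       dataReady: true,
                                       makeDataReadyCallback: nil,
                                       refcon: nil,
                                       formatDescription: desc,
                                       sampleTiming: &timing,
                                       sampleBufferOut: &result)
    return result
}
```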

CMBlockBufferCreate memory management

别说谁变了你拦得住时间么 submitted on 2020-01-12 17:35:28
Question: I have some code that creates CMBlockBuffers and then creates a CMSampleBuffer and passes it to an AVAssetWriterInput. What's the deal with memory management here? According to the Apple documentation, anything you create with 'Create' in the name should be released with CFRelease. However, if I use CFRelease, my app aborts with 'malloc: *** error for object 0xblahblah: pointer being freed was not allocated'. CMBlockBufferRef tmp_bbuf = NULL; CMBlockBufferRef bbuf = NULL; CMSampleBufferRef sbuf = …
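
That malloc abort typically means the block buffer tried to free memory it did not own: if the memory block was not malloc'd, the blockAllocator should be kCFAllocatorNull so no free is attempted on release. A hedged sketch in Swift (where CM types are ARC-managed and CFRelease is neither needed nor allowed; in C/Obj-C you would still release each Create'd object exactly once):

```swift
import CoreMedia

// Wrap caller-owned bytes without transferring ownership.
// Passing kCFAllocatorDefault as blockAllocator for memory you did not
// malloc() is a classic cause of "pointer being freed was not allocated".
func makeBlockBuffer(bytes: UnsafeMutableRawPointer, length: Int) -> CMBlockBuffer? {
    var blockBuffer: CMBlockBuffer?
    let status = CMBlockBufferCreateWithMemoryBlock(
        allocator: kCFAllocatorDefault,
        memoryBlock: bytes,               // caller-owned memory
        blockLength: length,
        blockAllocator: kCFAllocatorNull, // do NOT free on release
        customBlockSource: nil,
        offsetToData: 0,
        dataLength: length,
        flags: 0,
        blockBufferOut: &blockBuffer)
    return status == kCMBlockBufferNoErr ? blockBuffer : nil
}
```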

How to convert CMSampleBuffer to Data in Swift?

只谈情不闲聊 submitted on 2019-12-17 20:37:16
Question: I need to convert CMSampleBuffer to Data format. I am using a third-party framework for audio-related tasks. That framework gives me streaming (i.e. real-time) audio as CMSampleBuffer objects, like this: func didAudioStreaming(audioSample: CMSampleBuffer!) { //Here I need to convert this to Data format. //Because I am using the GRPC framework for audio recognition, } Please provide the steps to convert the CMSampleBuffer to Data . FYI let formatDesc:CMFormatDescription? = …
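
For audio, the usual approach is to copy the bytes out of the sample buffer's backing CMBlockBuffer. A minimal sketch, assuming the buffer's data is ready and contiguous enough for a straight copy:

```swift
import CoreMedia
import Foundation

func data(from sampleBuffer: CMSampleBuffer) -> Data? {
    // The block buffer holds the raw (e.g. LPCM) audio bytes.
    guard let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) else { return nil }
    let length = CMBlockBufferGetDataLength(blockBuffer)
    var bytes = [UInt8](repeating: 0, count: length)
    // Copies out even when the block buffer is non-contiguous internally.
    let status = CMBlockBufferCopyDataBytes(blockBuffer,
                                            atOffset: 0,
                                            dataLength: length,
                                            destination: &bytes)
    return status == kCMBlockBufferNoErr ? Data(bytes) : nil
}
```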

How do I draw onto a CVPixelBufferRef that is planar/ycbcr/420f/yuv/NV12/not rgb?

空扰寡人 submitted on 2019-12-06 04:15:10
Question: I have received a CMSampleBufferRef from a system API that contains CVPixelBufferRefs that are not RGBA (linear pixels). The buffer contains planar pixels (such as 420f, aka kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange, aka YCbCr, aka YUV). I would like to do some manipulation of this video data before sending it off to VideoToolbox to be encoded to H.264 (drawing some text, overlaying a logo, rotating the image, etc.), but I'd like it to be efficient and real-time. But planar …
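
One way to avoid hand-written YUV math is Core Image, which can read from and render back into bi-planar 420f pixel buffers directly. A hedged sketch (overlayImage and the reused CIContext are assumed to come from the caller):

```swift
import CoreImage
import CoreVideo

// Composite an overlay onto a 420f pixel buffer in place.
// CIContext handles the YCbCr <-> RGB conversion internally and can be
// GPU-backed, which keeps this viable for real-time pipelines.
func overlay(_ overlayImage: CIImage,
             onto pixelBuffer: CVPixelBuffer,
             using context: CIContext) {
    let base = CIImage(cvPixelBuffer: pixelBuffer)
    let composited = overlayImage.composited(over: base)
    // Renders back into the same planar buffer.
    context.render(composited, to: pixelBuffer)
}
```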

Create a copy of CMSampleBuffer in Swift 2.0

☆樱花仙子☆ submitted on 2019-12-06 03:49:26
Question: This has been asked before, but something must have changed in Swift since it was asked. I am trying to store CMSampleBuffer objects returned from an AVCaptureSession to be processed later. After some experimentation I discovered that AVCaptureSession must be reusing its CMSampleBuffer references; when I try to keep more than 15, the session hangs. So I thought I would make copies of the sample buffers, but I can't seem to get it to work. Here is what I have written: var allocator: Unmanaged …
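
The simplest copy is CMSampleBufferCreateCopy, sketched below with modern Swift names. Note the caveat: this copy is shallow and still references the capture session's underlying pixel buffer, so it does not by itself return buffers to the session's pool; retaining many buffers long-term still requires deep-copying the image data.

```swift
import CoreMedia

// Shallow copy: new CMSampleBuffer, same underlying CVPixelBuffer.
func shallowCopy(_ sampleBuffer: CMSampleBuffer) -> CMSampleBuffer? {
    var copy: CMSampleBuffer?
    CMSampleBufferCreateCopy(allocator: kCFAllocatorDefault,
                             sampleBuffer: sampleBuffer,
                             sampleBufferOut: &copy)
    return copy
}
```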

How to set timestamp of CMSampleBuffer for AVWriter writing

一笑奈何 submitted on 2019-12-03 13:04:33
Question: I'm working with AVFoundation to capture and record audio. There are some issues I don't quite understand. Basically I want to capture audio from AVCaptureSession and write it using AVWriter; however, I need some shift in the timestamp of the CMSampleBuffer I get from AVCaptureSession. Reading the documentation of CMSampleBuffer, I see two different timestamp terms: 'presentation timestamp' and 'output presentation timestamp'. What is the difference between the two? Let's say I get a CMSampleBuffer …
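
In broad terms, the presentation timestamp is the buffer's own PTS, while the output presentation timestamp additionally reflects any trims or edits applied to the buffer (see CMSampleBufferGetOutputPresentationTimeStamp). To shift timestamps before handing a buffer to the writer, the usual tool is CMSampleBufferCreateCopyWithNewTiming; a hedged sketch:

```swift
import CoreMedia

// Return a copy of the sample buffer with all timestamps shifted by `offset`.
func shifted(_ sampleBuffer: CMSampleBuffer, by offset: CMTime) -> CMSampleBuffer? {
    // First call asks how many timing entries the buffer has.
    var count: CMItemCount = 0
    CMSampleBufferGetSampleTimingInfoArray(sampleBuffer, entryCount: 0,
                                           arrayToFill: nil, entriesNeededOut: &count)
    var timing = [CMSampleTimingInfo](repeating: CMSampleTimingInfo(), count: count)
    CMSampleBufferGetSampleTimingInfoArray(sampleBuffer, entryCount: count,
                                           arrayToFill: &timing, entriesNeededOut: &count)
    // Shift both presentation and decode timestamps.
    for i in 0..<count {
        timing[i].presentationTimeStamp = CMTimeAdd(timing[i].presentationTimeStamp, offset)
        timing[i].decodeTimeStamp = CMTimeAdd(timing[i].decodeTimeStamp, offset)
    }
    var out: CMSampleBuffer?
    CMSampleBufferCreateCopyWithNewTiming(allocator: kCFAllocatorDefault,
                                          sampleBuffer: sampleBuffer,
                                          sampleTimingEntryCount: count,
                                          sampleTimingArray: &timing,
                                          sampleBufferOut: &out)
    return out
}
```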