core-media

Create a CMSampleBuffer from a CVPixelBuffer

丶灬走出姿态 submitted on 2019-12-01 06:25:35
I get a CVPixelBuffer from ARSessionDelegate:

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        frame.capturedImage // CVPixelBufferRef
    }

But another part of my app (that I can't change) uses a CMSampleBuffer. CMSampleBuffer is a container of CVPixelBuffer. In order to create a CMSampleBuffer I can use this function:

    func CMSampleBufferCreateReadyWithImageBuffer(
        _ allocator: CFAllocator?,
        _ imageBuffer: CVImageBuffer,
        _ formatDescription: CMVideoFormatDescription,
        _ sampleTiming: UnsafePointer<CMSampleTimingInfo>,
        _ sBufOut: UnsafeMutablePointer<CMSampleBuffer?>
    ) -> OSStatus

The …
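
A minimal sketch of how this might be wired up (not from the original question): create a CMVideoFormatDescription for the captured pixel buffer, fill in a CMSampleTimingInfo, and wrap both with CMSampleBufferCreateReadyWithImageBuffer. The argument labels below are the ones in the current CoreMedia Swift overlay; older SDKs expose the same calls with unlabeled parameters, as in the signature quoted above. Deriving the presentation timestamp from frame.timestamp is an assumption, and makeSampleBuffer(from:) is an illustrative helper name.

    import ARKit
    import CoreMedia

    // Sketch: wrap an ARFrame's captured CVPixelBuffer in a CMSampleBuffer.
    // Duration and decode time are left invalid, which is common for a
    // single video frame; the presentation time comes from the ARFrame.
    func makeSampleBuffer(from frame: ARFrame) -> CMSampleBuffer? {
        let pixelBuffer = frame.capturedImage

        // Describe the pixel buffer (dimensions, pixel format, and so on).
        var formatDescription: CMVideoFormatDescription?
        guard CMVideoFormatDescriptionCreateForImageBuffer(
                allocator: kCFAllocatorDefault,
                imageBuffer: pixelBuffer,
                formatDescriptionOut: &formatDescription) == noErr,
              let format = formatDescription else { return nil }

        // Timing: ARFrame.timestamp is a TimeInterval on the session clock.
        var timing = CMSampleTimingInfo(
            duration: .invalid,
            presentationTimeStamp: CMTime(seconds: frame.timestamp,
                                          preferredTimescale: 1_000_000_000),
            decodeTimeStamp: .invalid)

        // Wrap the pixel buffer and its description into a ready sample buffer.
        var sampleBuffer: CMSampleBuffer?
        guard CMSampleBufferCreateReadyWithImageBuffer(
                allocator: kCFAllocatorDefault,
                imageBuffer: pixelBuffer,
                formatDescription: format,
                sampleTiming: &timing,
                sampleBufferOut: &sampleBuffer) == noErr else { return nil }

        return sampleBuffer
    }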

How do I convert a CGImage to CMSampleBufferRef?

天大地大妈咪最大 submitted on 2019-11-28 19:18:16
I’d like to convert a CGImage to CMSampleBufferRef and append it to an AVAssetWriterInput using the appendSampleBuffer: method. I’ve managed to get the CMSampleBufferRef using the following code, but appendSampleBuffer: simply returns NO when I supply the resulting CMSampleBufferRef. What am I doing wrong?

    - (void) appendCGImage: (CGImageRef) frame {
        const int width = CGImageGetWidth(frame);
        const int height = CGImageGetHeight(frame);

        // Create a dummy pixel buffer to try the encoding
        // …
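
The Objective-C snippet is cut off above, so the asker's exact pixel-buffer setup is unknown. A common alternative (not necessarily what the asker needs) is to skip the hand-built CMSampleBuffer entirely: render the CGImage into a CVPixelBuffer and append that through an AVAssetWriterInputPixelBufferAdaptor with append(_:withPresentationTime:). Below is a minimal Swift sketch, assuming a 32-bit BGRA buffer; pixelBuffer(from:) is an illustrative helper, not code from the post.

    import AVFoundation
    import CoreGraphics
    import CoreVideo

    // Sketch: draw a CGImage into a freshly created BGRA CVPixelBuffer.
    // The resulting buffer can be appended via an
    // AVAssetWriterInputPixelBufferAdaptor instead of a CMSampleBuffer.
    func pixelBuffer(from image: CGImage) -> CVPixelBuffer? {
        let width = image.width
        let height = image.height
        let attributes: [CFString: Any] = [
            kCVPixelBufferCGImageCompatibilityKey: true,
            kCVPixelBufferCGBitmapContextCompatibilityKey: true
        ]

        var buffer: CVPixelBuffer?
        guard CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                                  kCVPixelFormatType_32BGRA,
                                  attributes as CFDictionary,
                                  &buffer) == kCVReturnSuccess,
              let pixelBuffer = buffer else { return nil }

        CVPixelBufferLockBaseAddress(pixelBuffer, [])
        defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }

        // BGRA corresponds to premultiplied-first alpha with little-endian
        // 32-bit byte order in Core Graphics.
        guard let context = CGContext(
                data: CVPixelBufferGetBaseAddress(pixelBuffer),
                width: width,
                height: height,
                bitsPerComponent: 8,
                bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
                space: CGColorSpaceCreateDeviceRGB(),
                bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue
                    | CGBitmapInfo.byteOrder32Little.rawValue) else { return nil }

        context.draw(image, in: CGRect(x: 0, y: 0, width: width, height: height))
        return pixelBuffer
    }

As a general point rather than a diagnosis of the truncated code, appendSampleBuffer: returning NO is also worth checking against the writer's state: startWriting and startSession(atSourceTime:) must have been called before appending, and the sample buffer's timing must be valid.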

How to convert CMSampleBuffer to Data in Swift?

霸气de小男生 submitted on 2019-11-28 12:42:16
I need to convert a CMSampleBuffer to the Data format. I am using a third-party framework for an audio-related task. That framework gives me streaming (i.e. real-time) audio as a CMSampleBuffer object, like this:

    func didAudioStreaming(audioSample: CMSampleBuffer!) {
        // Here I need to convert this to the Data format,
        // because I am using the GRPC framework for audio recognition.
    }

Please provide the steps to convert the CMSampleBuffer to Data.

FYI:

    let formatDesc: CMFormatDescription? = CMSampleBufferGetFormatDescription(audioSample)

    <CMAudioFormatDescription 0x17010d890 [0x1b453ebb8]> {
        mediaType: …
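
One common approach (a sketch, not something stated in the post) is to copy the raw audio bytes out of the sample buffer's CMBlockBuffer. Whether that Data is acceptable to the downstream GRPC-based recognizer depends on the audio format described by the CMFormatDescription above (sample rate, channels, PCM vs. compressed), which this sketch does not convert; data(from:) is an illustrative helper name.

    import CoreMedia
    import Foundation

    // Sketch: copy the bytes held by an audio CMSampleBuffer into Data.
    func data(from sampleBuffer: CMSampleBuffer) -> Data? {
        // The block buffer holds the sample data (possibly non-contiguously).
        guard let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) else {
            return nil
        }

        let length = CMBlockBufferGetDataLength(blockBuffer)
        var bytes = [UInt8](repeating: 0, count: length)

        // Copying handles non-contiguous block buffers, unlike reading the
        // base pointer directly with CMBlockBufferGetDataPointer.
        guard CMBlockBufferCopyDataBytes(blockBuffer,
                                         atOffset: 0,
                                         dataLength: length,
                                         destination: &bytes) == kCMBlockBufferNoErr else {
            return nil
        }

        return Data(bytes)
    }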