I want to use Core Image for processing a bunch of CGImage objects and turning them into a QuickTime movie on macOS. The following code demonstrates the problem.
After speaking with Apple Developer Technical Support, it appears that:

Core Image defers rendering until the client requests access to the frame buffer, i.e. calls CVPixelBufferLockBaseAddress.

So the solution is simply to call CVPixelBufferLockBaseAddress after CIContext.render, as shown below:
for frameNumber in 0 ..< frameCount {
    // Draw a pixel buffer from the adaptor's pool.
    var pixelBuffer: CVPixelBuffer?
    guard let pixelBufferPool: CVPixelBufferPool = pixelBufferAdaptor.pixelBufferPool else { preconditionFailure() }
    precondition(CVPixelBufferPoolCreatePixelBuffer(nil, pixelBufferPool, &pixelBuffer) == kCVReturnSuccess)

    // Render the frame into the pixel buffer…
    let ciImage = CIImage(cgImage: frameImage)
    context.render(ciImage, to: pixelBuffer!)

    // …then lock the base address. This is what actually forces Core Image
    // to perform the deferred rendering.
    precondition(CVPixelBufferLockBaseAddress(pixelBuffer!, []) == kCVReturnSuccess)
    defer { precondition(CVPixelBufferUnlockBaseAddress(pixelBuffer!, []) == kCVReturnSuccess) }

    // Sanity check: the buffer is no longer all zeros, i.e. the frame is not black.
    let bytes = UnsafeBufferPointer(start: CVPixelBufferGetBaseAddress(pixelBuffer!)!.assumingMemoryBound(to: UInt8.self), count: CVPixelBufferGetDataSize(pixelBuffer!))
    precondition(bytes.contains(where: { $0 != 0 }))

    // Wait (10 ms at a time) until the input can accept more data, then append the frame.
    // Note that frameRate here is the frame duration in seconds, not frames per second.
    while !input.isReadyForMoreMediaData { Thread.sleep(forTimeInterval: 10 / 1000) }
    precondition(pixelBufferAdaptor.append(pixelBuffer!, withPresentationTime: CMTime(seconds: Double(frameNumber) * frameRate, preferredTimescale: 600)))
}
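
For completeness, here is a minimal sketch of the setup the loop assumes. It is not from the original answer; names and values such as url, width, height, frameCount, and frameRate (as a frame duration in seconds) are assumptions:

import AVFoundation
import CoreImage

// Hypothetical setup values, not from the original code.
let url = URL(fileURLWithPath: "/tmp/output.mov")
let width = 1920
let height = 1080
let frameCount = 60
let frameRate = 1.0 / 30.0  // frame duration in seconds, matching the CMTime math above

let writer = try AVAssetWriter(outputURL: url, fileType: .mov)
let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
    AVVideoCodecKey: AVVideoCodecType.h264,
    AVVideoWidthKey: width,
    AVVideoHeightKey: height,
])
let pixelBufferAdaptor = AVAssetWriterInputPixelBufferAdaptor(
    assetWriterInput: input,
    sourcePixelBufferAttributes: [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32ARGB,
        kCVPixelBufferWidthKey as String: width,
        kCVPixelBufferHeightKey as String: height,
    ])
writer.add(input)
precondition(writer.startWriting())
writer.startSession(atSourceTime: .zero)

let context = CIContext()
// `frameImage` would be the CGImage for the current frame inside the loop.

// … run the frame loop from above here …

input.markAsFinished()
writer.finishWriting { print("Done: \(url.path)") }

Note that pixelBufferAdaptor.pixelBufferPool is nil until startWriting has been called, which is why the writer setup must happen before the loop starts drawing buffers from the pool.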