Question:
I have to process frames captured by the iPhone camera using my C++ functions. I use the startRunning()
function to start the flow of data, but how can I process each frame?
Answer 1:
Yes, it is pretty straightforward. You need to:
- Create an AVCaptureVideoDataOutput object to produce video frames
- Implement a delegate for the AVCaptureVideoDataOutput object to process video frames
- In the delegate class, implement the method (captureOutput:didOutputSampleBuffer:fromConnection:), which is called every time a sample buffer is written.
For more information you can read this part of the AVFoundation Programming Guide. The code samples are in Objective-C rather than Swift, but I think you get the idea.
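The steps above can be sketched in Swift roughly as follows. This is a minimal, illustrative outline, not the guide's own sample: the class name `FrameProcessor` and the commented-out `myCppProcess` bridge call are assumptions, and the delegate method is shown with its modern Swift signature, `captureOutput(_:didOutput:from:)`.

```swift
import AVFoundation

// Illustrative sketch: FrameProcessor is a hypothetical name.
class FrameProcessor: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()

    func configure() {
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera) else { return }
        session.addInput(input)

        // 1. Create an AVCaptureVideoDataOutput to produce video frames.
        let output = AVCaptureVideoDataOutput()
        output.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String:
                                kCVPixelFormatType_32BGRA]

        // 2. Register this object as the delegate, on a background queue.
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "frames"))
        session.addOutput(output)
        session.startRunning()
    }

    // 3. Called once for every captured frame.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

        // Lock the buffer before reading its raw pixel data.
        CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
        let baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer)
        let width = CVPixelBufferGetWidth(pixelBuffer)
        let height = CVPixelBufferGetHeight(pixelBuffer)

        // The raw pointer could be handed to a bridged C++ function here:
        // myCppProcess(baseAddress, width, height)  // hypothetical

        CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly)
        _ = (baseAddress, width, height)
    }
}
```

Note that the delegate queue must be a serial queue, and any heavy per-frame work should stay off the main thread so the preview does not stall.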
Answer 2:
I created a class that may help you out: CameraCaptureHelper wraps up all the AVFoundation setup and invokes a method on its delegate, passing it a CIImage for each frame.
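The shape of such a helper might look roughly like the sketch below. This is a hypothetical outline of the pattern, not the actual CameraCaptureHelper source; the protocol and method names here are assumptions.

```swift
import AVFoundation
import CoreImage

// Hypothetical delegate protocol: the real CameraCaptureHelper's
// API may differ.
protocol CameraCaptureHelperDelegate: AnyObject {
    // Called once per captured frame with the converted image.
    func newCameraImage(_ image: CIImage)
}

final class CameraCaptureHelperSketch: NSObject,
        AVCaptureVideoDataOutputSampleBufferDelegate {
    weak var delegate: CameraCaptureHelperDelegate?

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // Wrap the pixel buffer in a CIImage and hand it to the delegate.
        delegate?.newCameraImage(CIImage(cvPixelBuffer: pixelBuffer))
    }
}
```

Wrapping the CVPixelBuffer in a CIImage is cheap (no pixel copy), which makes this a convenient hand-off point before any further processing.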
Source: https://stackoverflow.com/questions/35255580/is-there-any-way-to-get-frame-by-frame-using-avcapturesession-object-in-swift