Question
I have an AVCaptureSession that displays live video in a UIView, and I want to save a frame of the video stream as a UIImage. I've been dissecting the code I keep seeing around the internet, but I'm having trouble with the first line:
if let stillOutput = self.stillImageOutput {
    // Establish an AVCaptureConnection and capture a still image from it.
}
This gives me the error 'Camera' does not have a member named 'stillImageOutput'. The code depends on being able to get the video connection from the output.
I can post the full code block if that'd be helpful. Thanks!
Answer 1:
Once you have a stillImageOutput, you can use the following to capture an image:
stillImageOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
if captureSession.canAddOutput(stillImageOutput) {
    captureSession.addOutput(stillImageOutput)
}

// I had to add a short timer delay, otherwise the image quality was degraded on iPad
var timer = NSTimer.scheduledTimerWithTimeInterval(0.4, target: self, selector: Selector("getImage"), userInfo: nil, repeats: false)
Then my function to get the image:
func getImage() {
    if let videoConnection = stillImageOutput.connectionWithMediaType(AVMediaTypeVideo) {
        stillImageOutput.captureStillImageAsynchronouslyFromConnection(videoConnection) {
            (imageDataSampleBuffer, error) -> Void in
            // Extract the JPEG data from the sample buffer.
            let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer)
            // Convert to UIImage, then use it or save it to the photo album:
            // if let image = UIImage(data: imageData) {
            //     UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
            // }
            self.stopSession()
        }
    }
}
And to stop the session and remove the preview layer:
func stopSession() {
    self.captureSession.stopRunning()
    self.previewLayer?.removeFromSuperlayer()
}
Source: https://stackoverflow.com/questions/26529787/capture-still-image-from-avcapturesession-in-swift