Taking a photo with a custom camera in Swift 3


You can use AVCapturePhotoOutput like this in Swift 3:

You need to implement AVCapturePhotoCaptureDelegate, which hands you back the captured CMSampleBuffer.

You can also get a preview image if you set a previewPhotoFormat on the AVCapturePhotoSettings:

import AVFoundation
import UIKit

class CameraCaptureOutput: NSObject, AVCapturePhotoCaptureDelegate {

    let cameraOutput = AVCapturePhotoOutput()

    func capturePhoto() {
        let settings = AVCapturePhotoSettings()
        // Ask for a small preview image alongside the full-size photo.
        let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
        let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
                             kCVPixelBufferWidthKey as String: 160,
                             kCVPixelBufferHeightKey as String: 160]
        settings.previewPhotoFormat = previewFormat
        self.cameraOutput.capturePhoto(with: settings, delegate: self)
    }

    // Called by AVFoundation once the photo has been processed.
    // Note: in Swift 3 the delegate signature takes Error?, not NSError?.
    func capture(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?, previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {

        if let error = error {
            print(error.localizedDescription)
            return
        }

        if let sampleBuffer = photoSampleBuffer, let previewBuffer = previewPhotoSampleBuffer,
            let dataImage = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer),
            let image = UIImage(data: dataImage) {
            print(image.size)
        }
    }
}
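Note that cameraOutput only produces anything once it is attached to a running AVCaptureSession, which the answer doesn't show. Below is a minimal sketch of that wiring, assuming the default back camera; the makeSession function and its error handling are illustrative, not part of the original answer:

import AVFoundation

func makeSession(with output: AVCapturePhotoOutput) -> AVCaptureSession? {
    let session = AVCaptureSession()
    session.sessionPreset = AVCaptureSessionPresetPhoto

    // Default back camera; on iOS 10 you could also use
    // AVCaptureDeviceDiscoverySession for finer-grained device selection.
    guard let device = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo),
        let input = try? AVCaptureDeviceInput(device: device),
        session.canAddInput(input),
        session.canAddOutput(output) else {
        return nil
    }

    session.addInput(input)
    session.addOutput(output)
    session.startRunning()
    return session
}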

Thanks to Sharpkits, I found my solution (this code works for me):

func capture(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?, previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {

    if let error = error {
        print(error.localizedDescription)
        return
    }

    if let sampleBuffer = photoSampleBuffer,
        let imageData = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: nil),
        let dataProvider = CGDataProvider(data: imageData as CFData),
        let cgImageRef = CGImage(jpegDataProviderSource: dataProvider, decode: nil, shouldInterpolate: true, intent: .absoluteColorimetric) {

        let image = UIImage(cgImage: cgImageRef, scale: 1.0, orientation: .right)

        // cropToSquare(image:) and scaleImageWith(_:and:) are helpers defined elsewhere in my code.
        let croppedImage = self.cropToSquare(image: image)
        let newImage = self.scaleImageWith(croppedImage, and: CGSize(width: 600, height: 600))

        print(UIScreen.main.bounds.width)

        self.tempImageView.image = newImage
        self.tempImageView.isHidden = false
    }
}
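The helpers cropToSquare(image:) and scaleImageWith(_:and:) aren't shown in the answer. Here is one plausible, hypothetical implementation of each, assuming the intent is a centered square crop followed by a plain redraw at the target size:

func cropToSquare(image: UIImage) -> UIImage {
    guard let cgImage = image.cgImage else { return image }

    // Center a square whose side is the shorter image dimension, in pixels.
    let width = CGFloat(cgImage.width)
    let height = CGFloat(cgImage.height)
    let side = min(width, height)
    let rect = CGRect(x: (width - side) / 2.0, y: (height - side) / 2.0, width: side, height: side)

    guard let cropped = cgImage.cropping(to: rect) else { return image }
    return UIImage(cgImage: cropped, scale: image.scale, orientation: image.imageOrientation)
}

func scaleImageWith(_ image: UIImage, and size: CGSize) -> UIImage {
    UIGraphicsBeginImageContextWithOptions(size, false, 0.0)
    image.draw(in: CGRect(origin: .zero, size: size))
    let scaled = UIGraphicsGetImageFromCurrentImageContext() ?? image
    UIGraphicsEndImageContext()
    return scaled
}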

Great code. Thanks a ton for the help and examples.

Just to clarify for those of slower mental faculties like myself: the capture(_ ...etc) method is called behind the scenes when you call self.cameraOutput.capturePhoto(with: settings, delegate: self) inside your takePhoto (or whatever you name it) method. You never call the capture method yourself directly; it happens automatically.
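In other words, the flow looks something like this (the button action and its name are illustrative):

let captureOutput = CameraCaptureOutput()

@IBAction func takePhoto(_ sender: UIButton) {
    // This only triggers the capture; AVFoundation later calls
    // capture(_:didFinishProcessingPhotoSampleBuffer:...) on the delegate for you.
    captureOutput.capturePhoto()
}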
