AVCaptureStillImageOutput & UIImagePNGRepresentation

Submitted by 北城余情 on 2019-12-13 07:07:56

Question


I am having a hard time with something I think shouldn't be so difficult, so I presume I must be looking at the problem from the wrong angle. To understand how AVCaptureStillImageOutput and the camera work, I made a tiny app.

This app is able to take a picture and save it as a PNG file (I do not want JPEG). The next time the app is launched, it checks if a file is present and if it is, the image stored inside the file is used as the background view of the app. The idea is rather simple.

The problem is that it does not work. If someone can tell me what I am doing wrong, that would be very helpful.

I would like the picture to appear as the background exactly the way it looked on the display when it was taken, but instead it comes out rotated, at the wrong scale, and so on.

Here is the relevant code (I can provide more information if ever needed).

The viewDidLoad method:

override func viewDidLoad() {
    super.viewDidLoad()

    // For the photo capture:
    captureSession.sessionPreset = AVCaptureSessionPresetHigh

    // Select the appropriate capture devices:
    for device in AVCaptureDevice.devices() {
        // Make sure this particular device supports video.
        if (device.hasMediaType(AVMediaTypeVideo)) {
            // Finally check the position and confirm we've got the back camera.
            if(device.position == AVCaptureDevicePosition.Back) {
                captureDevice = device as? AVCaptureDevice
            }
        }
    }

    tapGesture = UITapGestureRecognizer(target: self, action: Selector("takePhoto"))
    self.view.addGestureRecognizer(tapGesture)

    let filePath = self.toolBox.getDocumentsDirectory().stringByAppendingPathComponent("BackGroundImage@2x.png")

    if !NSFileManager.defaultManager().fileExistsAtPath(filePath) {return}

    let bgImage = UIImage(contentsOfFile: filePath),
    bgView = UIImageView(image: bgImage)
    self.view.addSubview(bgView)
}
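
The code above calls self.toolBox.getDocumentsDirectory(), which the question does not show. Assuming toolBox is the asker's own utility object, a typical Swift 2-era implementation of that helper might look like this (NSString is returned so that stringByAppendingPathComponent remains available to the caller):

func getDocumentsDirectory() -> NSString {
    // Return the app's Documents directory; NSString is used so that
    // stringByAppendingPathComponent can be called on the result.
    let paths = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)
    return paths[0] as NSString
}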

The method to handle the picture taking:

func takePhoto() {
    if !captureSession.running {
        beginPhotoCaptureSession()
        return
    }

    if let videoConnection = stillImageOutput.connectionWithMediaType(AVMediaTypeVideo) {
        stillImageOutput.captureStillImageAsynchronouslyFromConnection(videoConnection) {
            (imageDataSampleBuffer, error) -> Void in
            if error == nil {
                var localImage = UIImage(fromSampleBuffer: imageDataSampleBuffer)
                UIGraphicsBeginImageContext(localImage!.size)
                CGContextRotateCTM (UIGraphicsGetCurrentContext(), CGFloat(M_PI_2))
                //localImage!.drawAtPoint(CGPointZero)
                localImage!.drawAtPoint(CGPoint(x: -localImage!.size.height, y: -localImage!.size.width))
                //localImage!.drawAtPoint(CGPoint(x: -localImage!.size.width, y: -localImage!.size.height))
                localImage = UIGraphicsGetImageFromCurrentImageContext()
                UIGraphicsEndImageContext()
                localImage = resizeImage(localImage!, toSize: self.view.frame.size)

                if let data = UIImagePNGRepresentation(localImage!) {
                    let bitMapName = "BackGroundImage@2x"
                    let filename = self.toolBox.getDocumentsDirectory().stringByAppendingPathComponent("\(bitMapName).png")
                    data.writeToFile(filename, atomically: true)
                    print("Picture saved: \(bitMapName)\n\(filename)")
                }
            } else {print("Error on taking a picture:\n\(error)")}
        }
    }

    captureSession.stopRunning()
    previewLayer!.removeFromSuperlayer()
}
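
takePhoto relies on a UIImage(fromSampleBuffer:) initializer that is not part of UIKit and is not shown in the question. A minimal sketch of such an extension, assuming the 32BGRA pixel format configured in beginPhotoCaptureSession below, could be:

import AVFoundation
import CoreImage
import UIKit

extension UIImage {
    // Hypothetical reconstruction of the asker's helper: wrap the sample
    // buffer's pixel data in a CIImage, render it to a CGImage, and build
    // a UIImage from that. No imageOrientation is applied here, which may
    // be why the caller rotates the image by hand afterwards.
    convenience init?(fromSampleBuffer sampleBuffer: CMSampleBuffer!) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
        let ciImage = CIImage(CVPixelBuffer: pixelBuffer)
        let context = CIContext(options: nil)
        let cgImage = context.createCGImage(ciImage, fromRect: ciImage.extent)
        self.init(CGImage: cgImage)
    }
}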

The method to start the AVCaptureSession:

func beginPhotoCaptureSession() {
    do {let input = try AVCaptureDeviceInput(device: captureDevice)
        captureSession.addInput(input)
    } catch let error as NSError {
        // Handle any errors:
        print(error)
    }

    previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    previewLayer?.frame = self.view.layer.frame
    self.view.layer.addSublayer(previewLayer!)
    captureSession.startRunning()

    stillImageOutput.outputSettings = [kCVPixelBufferPixelFormatTypeKey:Int(kCVPixelFormatType_32BGRA)]
    if captureSession.canAddOutput(stillImageOutput) {
        captureSession.addOutput(stillImageOutput)
    }
}
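
The resizeImage(_:toSize:) helper called from takePhoto (and from the fix in Answer 2 below) is also not included in the question. A plausible sketch, assuming a plain redraw into a new image context at the requested size, would be:

func resizeImage(image: UIImage, toSize size: CGSize) -> UIImage {
    // Redraw the image into a context of the requested size; a scale of 1.0
    // keeps the resulting pixel dimensions equal to `size`.
    UIGraphicsBeginImageContextWithOptions(size, false, 1.0)
    image.drawInRect(CGRect(origin: CGPointZero, size: size))
    let resized = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return resized
}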

As an example here is an image of a picture taken with the app:

Now here is what I get as the background of the app when it is relaunched:

If it were working correctly, the two pictures would look the same.


Answer 1:


I can't see any rotation in the screenshot, but the scale is a problem, and it is related to the way your code redraws the image in takePhoto. You can try:

func takePhoto() {
    if !captureSession.running {
        beginPhotoCaptureSession()
        return
    }
    if let videoConnection = stillImageOutput.connectionWithMediaType(AVMediaTypeVideo) {
        stillImageOutput.captureStillImageAsynchronouslyFromConnection(videoConnection) {
            (imageDataSampleBuffer, error) -> Void in
            if error == nil {
               if let data = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer) {
                    let bitMapName = "BackGroundImage@2x"
                    let filename = self.toolBox.getDocumentsDirectory().stringByAppendingPathComponent("\(bitMapName).png")
                    data.writeToFile(filename, atomically: true)
                    print("Picture saved: \(bitMapName)\n\(filename)")
                }
            } else {print("Error on taking a picture:\n\(error)")}
        }
    }

    captureSession.stopRunning()
    previewLayer!.removeFromSuperlayer()
}
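
Note that jpegStillImageNSDataRepresentation returns JPEG-encoded bytes, so the snippet above writes JPEG data into a file that merely has a .png name. If an actual PNG file is required, as the question states, the data could be re-encoded first; one possible variant of the inner block (not from the original answer) is:

if let jpegData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer),
    image = UIImage(data: jpegData),
    pngData = UIImagePNGRepresentation(image) {
    // Re-encode as PNG so the file contents match the .png extension.
    let filename = self.toolBox.getDocumentsDirectory().stringByAppendingPathComponent("BackGroundImage@2x.png")
    pngData.writeToFile(filename, atomically: true)
}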



Answer 2:


For those who might hit the same issue at some point, I post below what I fixed in the code to make it work. There may still be some improvement needed to support all possible orientations, but it's OK for a start.

    if let videoConnection = stillImageOutput.connectionWithMediaType(AVMediaTypeVideo) {
        stillImageOutput.captureStillImageAsynchronouslyFromConnection(videoConnection) {
            (imageDataSampleBuffer, error) -> Void in
            if error == nil {
                var localImage = UIImage(fromSampleBuffer: imageDataSampleBuffer)
                var imageSize = CGSize(width: UIScreen.mainScreen().bounds.height * UIScreen.mainScreen().scale,
                    height: UIScreen.mainScreen().bounds.width * UIScreen.mainScreen().scale)
                localImage = resizeImage(localImage!, toSize: imageSize)

                imageSize = CGSize(width: imageSize.height, height: imageSize.width)
                UIGraphicsBeginImageContext(imageSize)
                CGContextRotateCTM (UIGraphicsGetCurrentContext(), CGFloat(M_PI_2))
                localImage!.drawAtPoint(CGPoint(x: 0.0, y: -imageSize.width))
                localImage = UIGraphicsGetImageFromCurrentImageContext()
                UIGraphicsEndImageContext()

                if let data = UIImagePNGRepresentation(localImage!) {
                    data.writeToFile(self.bmpFilePath, atomically: true)
                }
            } else {print("Error on taking a picture:\n\(error)")}
        }
    }
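
As a follow-up, here is a minimal sketch of how the saved file could be put back on screen at the next launch. This is not part of the original answer; it assumes the same self.bmpFilePath used above and runs inside the view controller (for example in viewDidLoad), with an explicit frame so the image fills the view:

if NSFileManager.defaultManager().fileExistsAtPath(self.bmpFilePath) {
    let bgView = UIImageView(image: UIImage(contentsOfFile: self.bmpFilePath))
    // Pin the image view to the full view so the picture is shown
    // at the size it was prepared for.
    bgView.frame = self.view.bounds
    bgView.contentMode = .ScaleAspectFill
    self.view.insertSubview(bgView, atIndex: 0)
}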


Source: https://stackoverflow.com/questions/34646982/avcapturestillimageoutput-uiimagepngrepresentation
