How can I get Camera Calibration Data on iOS? aka AVCameraCalibrationData

Submitted by a 夏天 on 2019-12-31 04:20:25

Question


As I understand it, AVCameraCalibrationData is only available over AVCaptureDepthDataOutput. Is that correct?

AVCaptureDepthDataOutput on the other hand is only accessible with iPhone X front cam or iPhone Plus back cam, or am I mistaken?

What I am trying to do is get the FOV of an AVCaptureVideoDataOutput sample buffer. In particular, it should match the selected preset (full HD, Photo, etc.).


Answer 1:


You can get AVCameraCalibrationData only from depth data output or photo output.

However, if all you need is FOV, you need only part of the info that class offers — the camera intrinsics matrix — and you can get that by itself from AVCaptureVideoDataOutput.

  1. Set cameraIntrinsicMatrixDeliveryEnabled on the AVCaptureConnection connecting your camera device to the capture session. (Note you should check cameraIntrinsicMatrixDeliverySupported first; not all capture formats support intrinsics.)

  2. When the video output vends sample buffers, check each sample buffer's attachments for the kCMSampleBufferAttachmentKey_CameraIntrinsicMatrix key. As noted in CMSampleBuffer.h (someone should file a radar about getting this info into the online documentation), the value for that attachment is a CFData encoding a matrix_float3x3, and the (0,0) and (1,1) elements of that matrix are the horizontal and vertical focal length in pixels.
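Once you have the intrinsic matrix, the FOV follows from the focal length in pixels and the buffer dimensions. A minimal sketch (the function name and signature are my own, not from any Apple API; it assumes the standard pinhole-camera relation FOV = 2·atan(size / 2f)):

```swift
import Foundation
import simd

/// Compute horizontal and vertical field of view (in degrees) from a
/// camera intrinsic matrix and the pixel dimensions of the buffer it
/// describes. The (0,0) and (1,1) elements are the focal lengths in pixels.
func fieldOfView(intrinsics: matrix_float3x3,
                 width: Float, height: Float) -> (horizontal: Float, vertical: Float) {
    let fx = intrinsics.columns.0.x   // horizontal focal length in pixels
    let fy = intrinsics.columns.1.y   // vertical focal length in pixels
    let hFOV = 2 * atan(width  / (2 * fx)) * 180 / .pi
    let vFOV = 2 * atan(height / (2 * fy)) * 180 / .pi
    return (hFOV, vFOV)
}
```

Note that the FOV you get this way reflects the active capture format, so it will change with the selected session preset.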




Answer 2:


Background: A lot of these Stack Overflow responses reference intrinsic data when asked about camera calibration, including the accepted answer for this post, but calibration data typically includes intrinsic data, extrinsic data, lens distortion, and more. It's all listed here in the iOS documentation. The author mentioned they were just looking for FOV, which is in the sample buffer, not in the camera calibration data. So ultimately, I think their question was answered. BUT if you found this question looking for actual camera calibration data, this will throw you off. And as the accepted answer says, you can only get calibration data under specific conditions, which I outline more below.

Before I answer the rest, I would just say that the accepted answer here is great if you ARE looking for JUST the intrinsic matrix; it can be obtained much more easily (i.e. in a less stringent environment) than the rest of these values through the approach outlined above. If you are using it for computer vision, which is what I am using it for, that is sometimes all that is needed. But for really cool stuff, you'll want it all! So I will proceed to explain how to get it:

I am going to assume you have the general camera app code in place. In that code, when a picture is taken, you are probably going to get a call to the photoOutput delegate function, which looks something like this:

func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {...

The output parameter has a property called isCameraCalibrationDataDeliverySupported that you can check to see whether camera calibration is supported. For example, to print it out, use something like this:

print("isCameraCalibrationDataDeliverySupported: \(output.isCameraCalibrationDataDeliverySupported)")

Note in the documentation I linked to, it is only supported in specific scenarios:

"This property's value can be true only when the isDualCameraDualPhotoDeliveryEnabled property is true. To enable camera calibration delivery, set the isCameraCalibrationDataDeliveryEnabled property in a photo settings object."

So that's important, pay attention to that to avoid unnecessary stress. Use the actual value to debug and make sure you have the proper environment enabled.
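Putting that quote into practice, a minimal sketch of the capture call (the function name and the assumption that `photoOutput` is an AVCapturePhotoOutput already attached to a running dual-camera session are mine, not from the original answer):

```swift
import AVFoundation

// Hypothetical helper: request calibration data delivery for one capture.
// Assumes `photoOutput` is already added to a configured, running session
// whose device supports calibration delivery (e.g. a dual-camera device).
func capturePhotoWithCalibration(photoOutput: AVCapturePhotoOutput,
                                 delegate: AVCapturePhotoCaptureDelegate) {
    let settings = AVCapturePhotoSettings()
    // Only enable the flag when the output reports support, otherwise
    // setting it raises an exception.
    if photoOutput.isCameraCalibrationDataDeliverySupported {
        settings.isCameraCalibrationDataDeliveryEnabled = true
    }
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}
```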

With all that in place, you should get the actual camera calibration data from:

photo.cameraCalibrationData

Just pull out of that object to get specific values you are looking for, such as:

photo.cameraCalibrationData?.extrinsicMatrix
photo.cameraCalibrationData?.intrinsicMatrix
photo.cameraCalibrationData?.lensDistortionCenter
etc.

Basically everything that is listed in the documentation that I linked to above.
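For example, the lens distortion lookup table arrives as a Data blob of Float32 magnification values that you have to decode yourself. A sketch of that decoding (the helper name is mine; the layout of the table as packed Float32 values is per the AVCameraCalibrationData documentation):

```swift
import AVFoundation

// Hypothetical helper: decode the lens distortion lookup table into [Float].
// The table holds radial magnification values sampled from the distortion
// center out to the farthest image corner.
func distortionTable(from calibration: AVCameraCalibrationData) -> [Float] {
    let data = calibration.lensDistortionLookupTable ?? Data()
    return data.withUnsafeBytes { raw in
        Array(raw.bindMemory(to: Float32.self))
    }
}
```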




Answer 3:


Here is a more complete/updated code example in Swift 5, put together from the previous answers. This gets you the camera intrinsic matrix for an iPhone.

based on:

  • https://stackoverflow.com/a/48159895/67166
  • https://stackoverflow.com/a/48565639/6716

// session setup
captureSession = AVCaptureSession()

let captureVideoDataOutput = AVCaptureVideoDataOutput()

captureSession?.addOutput(captureVideoDataOutput)

// enable the flag
if #available(iOS 11.0, *) {
    captureVideoDataOutput.connection(with: .video)?.isCameraIntrinsicMatrixDeliveryEnabled = true
} else {
    // ...
}

// `isCameraIntrinsicMatrixDeliveryEnabled` should be set before this
captureSession?.startRunning()

and now inside AVCaptureVideoDataOutputSampleBufferDelegate.captureOutput(...)

if #available(iOS 11.0, *) {
    if let camData = CMGetAttachment(sampleBuffer, key: kCMSampleBufferAttachmentKey_CameraIntrinsicMatrix, attachmentModeOut: nil) as? Data {
        let matrix: matrix_float3x3 = camData.withUnsafeBytes { $0.load(as: matrix_float3x3.self) }
        print(matrix)
        // > simd_float3x3(columns: (SIMD3<Float>(1599.8231, 0.0, 0.0), SIMD3<Float>(0.0, 1599.8231, 0.0), SIMD3<Float>(539.5, 959.5, 1.0)))
    }
} else {
    // ...
}


Source: https://stackoverflow.com/questions/48093509/how-can-i-get-camera-calibration-data-on-ios-aka-avcameracalibrationdata
