How to capture depth data from camera in iOS 11 and Swift 4?


I'm trying to get depth data from the camera in iOS 11 with AVDepthData, but when I set up a photoOutput with the AVCapturePhotoCaptureDelegate, photo.depthData is nil.
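
For context, here is a minimal sketch of the kind of photoOutput setup described above (the class name DepthCaptureController is made up for illustration, not from the original post). photo.depthData stays nil unless depth data delivery is enabled on both the AVCapturePhotoOutput (after it has been added to the session) and the AVCapturePhotoSettings used for the capture, and unless the device has a depth-capable camera (dual or TrueDepth camera).

    import AVFoundation

    // Illustrative sketch only: enable depth data delivery on the output AND the settings.
    final class DepthCaptureController: NSObject, AVCapturePhotoCaptureDelegate {

        let session = AVCaptureSession()
        let photoOutput = AVCapturePhotoOutput()

        func configureSession() {
            session.beginConfiguration()
            session.sessionPreset = .photo

            // A depth-capable camera is required (e.g. the dual camera on an iPhone 7 Plus).
            guard let device = AVCaptureDevice.default(.builtInDualCamera, for: .video, position: .back),
                  let input = try? AVCaptureDeviceInput(device: device),
                  session.canAddInput(input), session.canAddOutput(photoOutput) else {
                session.commitConfiguration()
                return
            }
            session.addInput(input)
            session.addOutput(photoOutput)

            // Only valid after the output has been added to the session.
            photoOutput.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliverySupported
            session.commitConfiguration()
            session.startRunning()
        }

        func capture() {
            // Depth must also be requested per capture, in the photo settings.
            let settings = AVCapturePhotoSettings()
            settings.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliverySupported
            photoOutput.capturePhoto(with: settings, delegate: self)
        }

        func photoOutput(_ output: AVCapturePhotoOutput,
                         didFinishProcessingPhoto photo: AVCapturePhoto,
                         error: Error?) {
            // With the flags above set, photo.depthData should no longer be nil
            // on a supported device.
            print(photo.depthData as Any)
        }
    }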

4 Answers

    To give more details on @klinger's answer, here is what you need to do to get the depth data for each pixel. I added some comments; hope it helps!

    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {

        //## Convert Disparity to Depth ##

        // photo.depthData is optional; bail out if depth delivery was not enabled
        guard let depthData = photo.depthData?.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32) else { return }
        let depthDataMap = depthData.depthDataMap // AVDepthData -> CVPixelBuffer

        //## Data Analysis ##

        // Useful data
        let width = CVPixelBufferGetWidth(depthDataMap)   // 768 on an iPhone 7+
        let height = CVPixelBufferGetHeight(depthDataMap) // 576 on an iPhone 7+

        CVPixelBufferLockBaseAddress(depthDataMap, CVPixelBufferLockFlags(rawValue: 0))
        defer { CVPixelBufferUnlockBaseAddress(depthDataMap, CVPixelBufferLockFlags(rawValue: 0)) }

        // Reinterpret the base address as a pointer to 32-bit floats
        let floatBuffer = unsafeBitCast(CVPixelBufferGetBaseAddress(depthDataMap), to: UnsafeMutablePointer<Float32>.self)

        // Read the data: one Float32 per pixel, stored row by row
        // Valid indices: 0 ..< width * height (768 * 576 values)
        let x = 0, y = 0 // the pixel you want to sample
        let distanceAtXYPoint = floatBuffer[y * width + x]
        print(distanceAtXYPoint)
    }
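
    As a side note, the rows of a CVPixelBuffer can be padded, so for anything beyond a quick test it is safer to index via the buffer's bytes-per-row instead of y * width + x. Below is a small illustrative helper (the name depthAt(column:row:in:) is made up here, not part of the answer); it assumes the map has already been converted to kCVPixelFormatType_DepthFloat32 as above, in which case the values are distances in meters.

    import CoreVideo

    // Illustrative helper (not from the original answer): reads one value from a
    // kCVPixelFormatType_DepthFloat32 buffer, clamping coordinates to the map size.
    func depthAt(column: Int, row: Int, in depthDataMap: CVPixelBuffer) -> Float32 {
        let width = CVPixelBufferGetWidth(depthDataMap)
        let height = CVPixelBufferGetHeight(depthDataMap)
        let x = min(max(column, 0), width - 1)
        let y = min(max(row, 0), height - 1)

        CVPixelBufferLockBaseAddress(depthDataMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthDataMap, .readOnly) }

        // Rows may be padded, so advance by bytes-per-row rather than by width
        let rowStart = CVPixelBufferGetBaseAddress(depthDataMap)! + y * CVPixelBufferGetBytesPerRow(depthDataMap)
        return rowStart.assumingMemoryBound(to: Float32.self)[x]
    }

    // Usage: distance (in meters) roughly at the centre of the depth map
    // let centre = depthAt(column: width / 2, row: height / 2, in: depthDataMap)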
    
