Most efficient/realtime way to get pixel values from iOS camera feed in Swift


You don't want an AVCaptureVideoPreviewLayer; that class is only for displaying the video on screen. Instead, you want a different output: AVCaptureVideoDataOutput:

https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVCaptureVideoDataOutput_Class/index.html#//apple_ref/occ/cl/AVCaptureVideoDataOutput

This gives you direct access to the stream of sample buffers, from which you can extract the raw pixel data.
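As a rough sketch (assuming a Swift 4+ target; the `PixelReader` class and queue name are my own, illustrative choices), the setup looks something like this: add an `AVCaptureVideoDataOutput` to the session, set a sample buffer delegate, and read the pixel bytes out of each frame's `CVPixelBuffer`:

```swift
import AVFoundation

// Hypothetical helper type; the class and queue names are illustrative.
final class PixelReader: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let sampleQueue = DispatchQueue(label: "camera.sample.queue")

    func configure() {
        session.beginConfiguration()
        defer { session.commitConfiguration() }

        guard let device = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input) else { return }
        session.addInput(input)

        let output = AVCaptureVideoDataOutput()
        // Ask for BGRA so each pixel is four contiguous bytes in memory.
        output.videoSettings =
            [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
        // Drop frames rather than queueing them if processing falls behind.
        output.alwaysDiscardsLateVideoFrames = true
        output.setSampleBufferDelegate(self, queue: sampleQueue)
        guard session.canAddOutput(output) else { return }
        session.addOutput(output)
    }

    // Called once per captured frame on `sampleQueue`.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

        // Lock the buffer before touching its memory; unlock when done.
        CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

        guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return }
        let width = CVPixelBufferGetWidth(pixelBuffer)
        let height = CVPixelBufferGetHeight(pixelBuffer)
        let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)

        // Example: read the BGRA bytes of the centre pixel.
        let bytes = base.assumingMemoryBound(to: UInt8.self)
        let offset = (height / 2) * bytesPerRow + (width / 2) * 4
        let b = bytes[offset], g = bytes[offset + 1], r = bytes[offset + 2]
        print("centre pixel RGB: (\(r), \(g), \(b))")
    }
}
```

To use it, call `configure()` and then `session.startRunning()` (preferably off the main thread, since it blocks); the app also needs an `NSCameraUsageDescription` entry in Info.plist to get camera access.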

Just a note: I don't know what the throughput of current devices is, but I was unable to get a live stream at the highest quality from the iPhone 4S because the GPU-to-CPU pipeline was too slow.
