What sensors does ARCore use?

Submitted by 时光毁灭记忆、已成空白 on 2019-12-21 02:35:45

Question


What sensors does ARCore use: single camera, dual-camera, IMU, etc. in a compatible phone?

Also, is ARCore dynamic enough to still work if a sensor is not available by switching to a less accurate version of itself?


Answer 1:


UPDATED November 24, 2019.

Google's ARCore, as well as Apple's ARKit, uses the same types of sensors to track a real-world scene. ARCore uses a single RGB camera along with an IMU, which is a combination of an accelerometer, a magnetometer and a gyroscope. Your phone can run ARCore's 3D tracking at 30 fps or 60 fps today, with 120 fps coming in the near future. By the way, the Inertial Measurement Unit itself can operate at up to 1000 Hz.
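For a sense of the IMU side on Android, here is a minimal Kotlin sketch (logImuSensors is a hypothetical helper of mine, not part of ARCore's API) that lists the accelerometer, gyroscope and magnetometer together with their maximum sampling rates, using the standard SensorManager API:

    import android.content.Context
    import android.hardware.Sensor
    import android.hardware.SensorManager

    // Hypothetical helper: list the IMU sensors ARCore relies on and their
    // maximum sampling rates. `context` is assumed to be a valid Context.
    fun logImuSensors(context: Context) {
        val sensorManager = context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
        val imuTypes = listOf(
            Sensor.TYPE_ACCELEROMETER,
            Sensor.TYPE_GYROSCOPE,
            Sensor.TYPE_MAGNETIC_FIELD
        )
        for (type in imuTypes) {
            val sensor = sensorManager.getDefaultSensor(type) ?: continue
            // minDelay is the shortest interval between events, in microseconds,
            // so 1_000_000 / minDelay is the maximum sampling rate in Hz.
            val maxRateHz = if (sensor.minDelay > 0) 1_000_000 / sensor.minDelay else 0
            println("${sensor.name}: up to $maxRateHz Hz")
        }
    }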

Read what Google says about the COM method, built on camera + IMU:

Concurrent Odometry and Mapping – An electronic device tracks its motion in an environment while building a three-dimensional visual representation of the environment that is used to correct drift in the tracked motion.
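In practice, the output of this COM pipeline surfaces as a per-frame camera pose. Here's a minimal Kotlin sketch, assuming a configured and resumed ARCore Session that is updated once per rendered frame (logCameraPose is a hypothetical helper):

    import com.google.ar.core.Session
    import com.google.ar.core.TrackingState

    // Hypothetical helper: read the pose that COM-based motion tracking produces.
    // `session` is assumed to be a configured, resumed ARCore Session, and this
    // function is assumed to be called once per rendered frame.
    fun logCameraPose(session: Session) {
        val frame = session.update()      // latest camera image fused with IMU data
        val camera = frame.camera
        if (camera.trackingState == TrackingState.TRACKING) {
            val pose = camera.pose        // device pose in world space, drift-corrected by mapping
            println("translation = ${pose.translation.joinToString()}")
            println("rotation    = ${pose.rotationQuaternion.joinToString()}")
        }
    }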

Although in previous years Google tended towards a multi-camera + depth-camera setup (in the Tango system), it now tends towards a single-camera + IMU system.

Here's Google's patent US15595617: System and method for concurrent odometry and mapping.

We all know that the biggest problem for Android devices is calibration. iOS devices don't have this issue (because Apple controls its own hardware and software). Poor calibration leads to errors in 3D tracking, so all your virtual 3D objects may "float" in a poorly tracked scene. There's no miraculous button against bad tracking, and you can't switch to a less accurate version of tracking. The only solution in such a situation is to re-track your scene from scratch.
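ARCore does at least report why tracking failed, so an app can prompt the user to re-track. Here's a minimal Kotlin sketch, assuming the Camera object from frame.camera and ARCore 1.12+ (describeTrackingProblem and its message strings are my own hypothetical choices):

    import com.google.ar.core.Camera
    import com.google.ar.core.TrackingFailureReason
    import com.google.ar.core.TrackingState

    // Hypothetical helper: since there is no "fix bad tracking" switch, the
    // practical options are to tell the user why tracking is paused and, in
    // the worst case, start the scene over.
    fun describeTrackingProblem(camera: Camera): String? {
        if (camera.trackingState == TrackingState.TRACKING) return null
        return when (camera.trackingFailureReason) {
            TrackingFailureReason.INSUFFICIENT_LIGHT    -> "Too dark - move to a well lit area"
            TrackingFailureReason.EXCESSIVE_MOTION      -> "Moving too fast - slow down"
            TrackingFailureReason.INSUFFICIENT_FEATURES -> "Point at a textured surface"
            TrackingFailureReason.BAD_STATE             -> "Tracking lost - restart the session"
            else -> null
        }
    }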

Four main rules for good tracking results are:

  1. Track your scene not too fast, not too slow

  2. Track appropriate surfaces and objects

  3. Use a well lit environment when tracking (see the sketch after this list)

  4. Don't track reflective or refractive objects
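For rule 3, ARCore's ambient light estimation can be used to warn the user when the scene is too dark. A minimal Kotlin sketch; the 0.3 intensity threshold is an arbitrary assumption of mine, not an official value:

    import com.google.ar.core.Config
    import com.google.ar.core.Frame
    import com.google.ar.core.LightEstimate
    import com.google.ar.core.Session

    // Enable ambient light estimation on the session (call while configuring).
    fun enableLightEstimation(session: Session) {
        val config = Config(session)
        config.lightEstimationMode = Config.LightEstimationMode.AMBIENT_INTENSITY
        session.configure(config)
    }

    // Hypothetical check: pixelIntensity is in [0, 1]; values near 0 mean a
    // very dark camera image. The 0.3 threshold is an assumption, not official.
    fun isSceneTooDark(frame: Frame): Boolean {
        val estimate = frame.lightEstimate
        if (estimate.state != LightEstimate.State.VALID) return false
        return estimate.pixelIntensity < 0.3f
    }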

About a dual-camera setup for ARCore.

One of the biggest problems of ARCore (and of ARKit, too) is energy impact. We understand that the higher the frame rate is, the better the tracking results are. But the energy impact at 30 fps is HIGH, and at 60 fps it's VERY HIGH. Such an energy impact quickly drains a smartphone's battery (due to the enormous burden on the CPU/GPU). So just imagine using two cameras for ARCore: your phone would have to process two high-resolution image sequences at 60 fps in parallel, while also processing and storing feature points and AR anchors, and at the same time rendering animated 3D graphics with hi-res textures at 60 fps. That's too much for a CPU/GPU. In such a case the battery would be dead in 15-20 minutes and as hot as the Sun. Users wouldn't like that, because it makes for a bad AR experience. So it seems that Google uses only one rear camera for running ARCore apps.
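If battery life matters more than tracking smoothness, ARCore does let you pick a 30 fps camera config instead of 60 fps. A minimal Kotlin sketch, assuming it's called while the session is paused (preferLowPowerCameraConfig is a hypothetical helper; the configs actually on offer depend on the device):

    import com.google.ar.core.CameraConfig
    import com.google.ar.core.CameraConfigFilter
    import com.google.ar.core.Session
    import java.util.EnumSet

    // Hypothetical helper: trade some tracking smoothness for battery life by
    // asking ARCore for a 30 fps camera config. Call before session.resume().
    fun preferLowPowerCameraConfig(session: Session) {
        val filter = CameraConfigFilter(session)
            .setTargetFps(EnumSet.of(CameraConfig.TargetFps.TARGET_FPS_30))
        val configs = session.getSupportedCameraConfigs(filter)
        if (configs.isNotEmpty()) {
            session.cameraConfig = configs[0]
        }
    }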



Source: https://stackoverflow.com/questions/54356589/what-sensors-does-arcore-use
