ARKit set ARAnchor transform based on touch location

Submitted by 白昼怎懂夜的黑 on 2019-12-04 12:57:20

I assume you're referring to Apple's ARKitExample project "Placing Virtual Objects in Augmented Reality".

Have a look at the method VirtualObject.translateBasedOnScreenPos(_:instantly:infinitePlane:), which is called when you move an (already placed) model on the screen; it has to solve essentially the same problem you describe.

You'll find that this in turn calls ViewController.worldPositionFromScreenPosition(_:objectPos:infinitePlane:).

Extracted from this method, their approach is:

  1. Always do a hit test against existing plane anchors first (only if such anchors exist, and only within their extents).

  2. Collect more information about the environment by hit testing against the feature point cloud, but do not return the result yet.

  3. If desired or necessary (no good feature hit test result): Hit test against an infinite, horizontal plane (ignoring the real world).

  4. If available, return the result of the hit test against high quality features if the hit tests against infinite planes were skipped or no infinite plane was hit.

  5. As a last resort, perform a second, unfiltered hit test against features. If there are no features in the scene, the result returned here will be nil.
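The strategy above can be sketched with the standard ARSCNView hit-test API. This is a simplified illustration, not the sample's exact code: names like worldPosition(fromScreenPosition:in:useInfinitePlane:) are my own, and the sample additionally distinguishes "high quality" feature results via custom helpers, which this sketch approximates with plain .featurePoint and .estimatedHorizontalPlane hit tests.

```swift
import ARKit

/// Illustrative sketch of the sample's fallback chain (names are hypothetical).
func worldPosition(fromScreenPosition point: CGPoint,
                   in sceneView: ARSCNView,
                   useInfinitePlane: Bool = false) -> SIMD3<Float>? {
    // 1. Hit test against existing plane anchors, within their extents.
    if let result = sceneView.hitTest(point, types: .existingPlaneUsingExtent).first {
        let t = result.worldTransform.columns.3
        return SIMD3<Float>(t.x, t.y, t.z)
    }

    // 2. Collect a feature-point hit test result, but don't return it yet.
    let featureResult = sceneView.hitTest(point, types: .featurePoint).first

    // 3. If desired, hit test against an (estimated) horizontal plane,
    //    ignoring detected geometry.
    if useInfinitePlane,
       let result = sceneView.hitTest(point, types: .estimatedHorizontalPlane).first {
        let t = result.worldTransform.columns.3
        return SIMD3<Float>(t.x, t.y, t.z)
    }

    // 4./5. Fall back to the feature result; nil if no features were hit.
    if let result = featureResult {
        let t = result.worldTransform.columns.3
        return SIMD3<Float>(t.x, t.y, t.z)
    }
    return nil
}
```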

As you can see, they consider various aspects that may or may not apply to your use case. Consider studying and re-using (parts of) their approach.
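To tie this back to the original question (setting an ARAnchor's transform from a touch location), a minimal touch handler might look like the following. This is a hedged sketch assuming your view controller owns an ARSCNView named sceneView; it uses the standard hit-test API rather than the sample project's helpers.

```swift
import ARKit
import UIKit

class PlacementViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        let point = touch.location(in: sceneView)

        // Prefer detected planes, fall back to feature points;
        // results are sorted by distance from the camera.
        let types: ARHitTestResult.ResultType = [.existingPlaneUsingExtent, .featurePoint]
        guard let result = sceneView.hitTest(point, types: types).first else { return }

        // Create an anchor whose transform matches the hit location
        // and add it to the session.
        let anchor = ARAnchor(transform: result.worldTransform)
        sceneView.session.add(anchor: anchor)
    }
}
```

Once the anchor is added, ARSCNView's delegate callbacks (e.g. renderer(_:didAdd:for:)) give you a node positioned at that transform to attach your content to.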
