ARKit: How to tell if user's face is parallel to camera
Question: In my Swift / ARKit / SceneKit project, I need to tell whether the user's face in the front-facing camera is parallel to the camera. I was able to detect horizontal parallelism by comparing the distances of the left and right eyes from the camera (using faceAnchor.leftEyeTransform and the worldPosition property), but I am stuck on vertical parallelism. Any ideas how to achieve that?

Answer 1: Assuming you are using ARFaceTrackingConfiguration in your app, you can actually retrieve the transforms of both the ARFaceAnchor and the camera, and compare their orientations to see how the face is angled relative to the camera plane.
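A minimal sketch of that idea, assuming an ARSession running ARFaceTrackingConfiguration. The function name and the tolerance parameter are illustrative, not part of any ARKit API; only faceAnchor.transform, camera.transform, and the simd helpers are real framework calls:

```swift
import ARKit
import simd

/// Returns true when the face plane is (nearly) parallel to the camera plane.
/// `toleranceDegrees` is a hypothetical threshold you would tune for your app.
func isFaceParallel(to camera: ARCamera,
                    faceAnchor: ARFaceAnchor,
                    toleranceDegrees: Float = 5) -> Bool {
    // The third column of a 4x4 transform is that object's local +Z axis
    // expressed in world space. For an ARFaceAnchor, +Z points outward from
    // the face; for the camera, +Z points back out of the lens toward the
    // user. When the user looks straight at the front camera, the two axes
    // point the same way.
    let faceZ = simd_normalize(simd_make_float3(faceAnchor.transform.columns.2))
    let cameraZ = simd_normalize(simd_make_float3(camera.transform.columns.2))

    // Angle between the two forward axes; near zero means parallel both
    // horizontally and vertically.
    let angle = acos(simd_clamp(simd_dot(faceZ, cameraZ), -1, 1))
    return angle < toleranceDegrees * .pi / 180
}
```

You could call this from ARSessionDelegate's session(_:didUpdate:) using frame.camera and the current ARFaceAnchor. Because this compares full orientations rather than per-eye distances, it covers the vertical (pitch) case the question asks about as well as the horizontal one; if you need the two components separately, you could instead express faceZ in the camera's coordinate space and inspect its x (yaw) and y (pitch) components individually.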