Question:
In my Swift / ARKit / SceneKit project, I need to tell whether the user's face in the front-facing camera is parallel to the camera.

I was able to determine horizontal parallelism by comparing the distances of the left and right eyes from the camera (using faceAnchor.leftEyeTransform and the worldPosition property). But I am stuck on vertical parallelism. Any ideas on how to achieve that?
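For context, the horizontal check described in the question could be sketched roughly as follows. The helper name `isHorizontallyParallel` and the tolerance value are illustrative choices, not part of the question; the eye transforms are given relative to the face anchor, so they are converted to world space through the anchor's node first.

```swift
import ARKit
import SceneKit
import simd

// Sketch: the face is roughly parallel to the camera horizontally when the
// left and right eyes are (nearly) the same distance from the camera.
// `faceNode` is the SCNNode that ARSCNView attached to the ARFaceAnchor,
// and `cameraNode` is typically sceneView.pointOfView.
func isHorizontallyParallel(faceAnchor: ARFaceAnchor,
                            faceNode: SCNNode,
                            cameraNode: SCNNode,
                            tolerance: Float = 0.005) -> Bool {
    // Eye transforms are in face-anchor space; lift them into world space.
    let leftWorld  = simd_mul(faceNode.simdWorldTransform, faceAnchor.leftEyeTransform)
    let rightWorld = simd_mul(faceNode.simdWorldTransform, faceAnchor.rightEyeTransform)

    let cameraPosition = cameraNode.simdWorldPosition
    let leftDistance  = simd_distance(cameraPosition, simd_make_float3(leftWorld.columns.3))
    let rightDistance = simd_distance(cameraPosition, simd_make_float3(rightWorld.columns.3))

    // Equal distances (within tolerance) mean no horizontal (yaw) tilt.
    return abs(leftDistance - rightDistance) < tolerance
}
```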
Answer 1:
Assuming you are using ARFaceTrackingConfiguration in your app, you can retrieve the transforms of both the ARFaceAnchor and the camera to determine their orientations. You can get a simd_float4x4 matrix of the head's orientation in world space from the ARFaceAnchor.transform property. Similarly, you can get the transform of the SCNCamera or ARCamera of your scene.

To compare the camera's and the face's orientations relative to each other in a SceneKit app (there are similar functions on the ARKit side of things), I get the world orientation of the node attached to each of them: call them faceNode, attached to the ARFaceAnchor, and cameraNode, representing the ARSCNView.pointOfView. To find the angle between the camera and your face, you could do something like this:
let faceOrientation: simd_quatf = faceNode.simdWorldOrientation
let cameraOrientation: simd_quatf = cameraNode.simdWorldOrientation
let deltaOrientation: simd_quatf = faceOrientation.inverse * cameraOrientation
By looking at deltaOrientation.angle and deltaOrientation.axis you can determine the relative angles on each axis between the face and the camera. If you compute deltaOrientation.axis * deltaOrientation.angle, you get a simd_float3 vector giving you a sense of the pitch, yaw and roll (in radians) of the head relative to the camera.
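Putting that together, a hedged sketch of a "is the face parallel?" check might look like the following. The helper names and the 5-degree threshold are my own choices, not anything defined by ARKit; note that `simd_quatf.axis` is undefined for a near-zero rotation, so the parallel test below works from the total angle instead of the per-axis vector.

```swift
import SceneKit
import simd

// Relative orientation of the face with respect to the camera, expressed as
// an axis-angle vector whose components approximate pitch, yaw and roll in
// radians. Beware: `axis` is undefined (NaN) when the rotation is ~zero.
func faceCameraDelta(faceNode: SCNNode, cameraNode: SCNNode) -> simd_float3 {
    let delta = faceNode.simdWorldOrientation.inverse * cameraNode.simdWorldOrientation
    return delta.axis * delta.angle
}

// Treat the face as "parallel" to the camera when the total relative rotation
// is within ~5 degrees (an arbitrary tolerance). Using the total angle avoids
// the undefined-axis case, and folding the angle into [0, pi] handles the
// quaternion double-cover (q and -q describe the same rotation).
func isFaceParallel(faceNode: SCNNode, cameraNode: SCNNode,
                    toleranceRadians: Float = 5 * .pi / 180) -> Bool {
    let delta = faceNode.simdWorldOrientation.inverse * cameraNode.simdWorldOrientation
    let angle = delta.angle
    let effectiveAngle = min(angle, 2 * .pi - angle)
    return effectiveAngle < toleranceRadians
}
```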
There are a number of ways you can do this using the face anchor and camera transforms, but this simd quaternion method works quite well for me. Hope this helps!
Source: https://stackoverflow.com/questions/53027049/arkit-how-to-tell-if-users-face-is-parallel-to-camera