I am using ARKit's ARFaceTrackingConfiguration with ARConfiguration.WorldAlignment.camera alignment, but I found that the documentation (seemingly) does not reflect the face anchor's actual behaviour.
Although I still don't know why the face anchor does not behave as described in the documentation, I can at least answer how to convert its left-handed coordinate system into the Metal- and SceneKit-friendly right-handed system (X axis to the right, Y axis up, Z axis from the screen towards the user):
import simd

func faceAnchorPoseToRHS(_ mat: simd_float4x4) -> simd_float4x4 {
    // Negate the Z component of the translation to flip handedness.
    let correctedPos = SIMD4<Float>(x: mat.columns.3.x, y: mat.columns.3.y, z: -mat.columns.3.z, w: 1)
    // Mirror the rotation: negate the angle and the Z component of the rotation axis.
    let quat = simd_quatf(mat)
    let newQuat = simd_quatf(angle: -quat.angle, axis: SIMD3<Float>(quat.axis.x, quat.axis.y, -quat.axis.z))
    var newPose = simd_float4x4(newQuat)
    newPose.columns.3 = correctedPos
    return newPose
}
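For context, here is a minimal sketch of how such a conversion could be applied whenever a face anchor updates. The ARSessionDelegate wiring and the faceNode are assumptions for illustration, not part of the original question:

import ARKit

// Hypothetical ARSessionDelegate method: apply the conversion to each updated face anchor.
func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    for case let faceAnchor as ARFaceAnchor in anchors {
        let rhsPose = faceAnchorPoseToRHS(faceAnchor.transform)
        faceNode.simdTransform = rhsPose  // faceNode: an SCNNode you manage yourself (assumption)
    }
}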
It seems quite obvious:
When the ARSession is running and the ARCamera begins tracking the environment, it places the world origin axes in front of your face at (x: 0, y: 0, z: 0). You can check this using:
sceneView.debugOptions = [.showWorldOrigin]
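As a point of reference, here is a minimal setup sketch that matches the debug line above; the ARSCNView outlet named sceneView and the session options are assumptions:

import ARKit

// Run face tracking with camera world alignment and visualize the world origin axes.
let configuration = ARFaceTrackingConfiguration()
configuration.worldAlignment = .camera
sceneView.debugOptions = [.showWorldOrigin]
sceneView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])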
So your face's position is now on the negative part of the Z axis in world coordinates.
Thus, the ARFaceAnchor will be placed on the negative part of the Z axis as well.
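A quick way to observe where the face anchor ends up is to print its translation column in a delegate callback; this is a sketch only, and the ARSCNViewDelegate wiring is assumed:

import ARKit

// Print the face anchor's world-space translation whenever it updates.
func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard let faceAnchor = anchor as? ARFaceAnchor else { return }
    // Per the explanation above, the Z component is expected to come out negative here.
    print("Face anchor position:", faceAnchor.transform.columns.3)
}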
And when you compare ARFaceTrackingConfiguration with ARWorldTrackingConfiguration, there are two things to consider:
The rear camera moves towards people along the negative Z axis (the positive X axis is on the right).
The TrueDepth camera moves towards faces along the positive Z axis (the positive X axis is on the left).
Hence, when you are "looking" through the TrueDepth camera, the 4x4 matrix is mirrored.
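One way to express that mirroring concretely is to conjugate the pose by a reflection along Z. This is only a sketch of an equivalent formulation of the function in the first answer, and it assumes the pose is a pure rotation plus translation (no scale or shear):

import simd

// Reflection along Z: diag(1, 1, -1, 1).
let mirrorZ = simd_float4x4(diagonal: SIMD4<Float>(1, 1, -1, 1))

// M_rhs = S * M_lhs * S flips the handedness of both the rotation and the translation,
// which matches what faceAnchorPoseToRHS does via the quaternion above.
func mirroredAlongZ(_ pose: simd_float4x4) -> simd_float4x4 {
    return mirrorZ * pose * mirrorZ
}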