Question
I'm trying to track facial expressions such as eyebrow raises, smiles, winks, etc. In ARKit I could use blendShapes (https://developer.apple.com/documentation/arkit/arfaceanchor/2928251-blendshapes) to detect the movement of different parts of the face, but ARCore doesn't offer an equivalent yet.
I've tried accessing the mesh vertices, which are given relative to the center transform of the face, but they still change significantly as the face rotates.
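For reference, this is roughly how I read the vertices (a minimal sketch; the `Session`/`Frame` setup is omitted, and it uses the standard ARCore Augmented Faces API):

```java
import com.google.ar.core.AugmentedFace;
import com.google.ar.core.Frame;
import com.google.ar.core.TrackingState;
import java.nio.FloatBuffer;

class FaceReader {
    // Called once per frame after session.update().
    void readFaceVertices(Frame frame) {
        for (AugmentedFace face : frame.getUpdatedTrackables(AugmentedFace.class)) {
            if (face.getTrackingState() != TrackingState.TRACKING) continue;

            // x, y, z triplets, relative to the face's center pose.
            FloatBuffer vertices = face.getMeshVertices();
            float x = vertices.get(0);
            float y = vertices.get(1);
            float z = vertices.get(2);
            // ...these coordinates still vary noticeably as the head rotates.
        }
    }
}
```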
Is there a way to normalize each face landmark/vertex to a 0 to 1 range, where 0 is the neutral expression and 1 is the maximum facial expression? It doesn't need to be as accurate as ARKit's blendShapes.
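To make concrete what I mean, something like the sketch below is what I'm after: measure the distance between two landmark vertices in the face-local frame and rescale it against calibrated neutral/maximum values. The vertex indices here are hypothetical placeholders (ARCore's canonical face mesh has 468 vertices, but I don't know which indices correspond to which feature), and the calibration values would have to be captured per user:

```java
import java.nio.FloatBuffer;

class ExpressionEstimator {
    // Hypothetical indices into the 468-vertex face mesh; the real indices
    // would have to be found by inspecting the canonical mesh asset.
    static final int UPPER_LIP = 13; // assumption, not a documented constant
    static final int LOWER_LIP = 14; // assumption, not a documented constant

    // Euclidean distance between two mesh vertices (stored as x, y, z
    // triplets) in the face-local coordinate frame.
    static float vertexDistance(FloatBuffer v, int a, int b) {
        float dx = v.get(3 * a) - v.get(3 * b);
        float dy = v.get(3 * a + 1) - v.get(3 * b + 1);
        float dz = v.get(3 * a + 2) - v.get(3 * b + 2);
        return (float) Math.sqrt(dx * dx + dy * dy + dz * dz);
    }

    // Map a distance onto [0, 1] given per-user calibration values captured
    // once at a neutral face and once at the maximum expression.
    static float normalize(float current, float neutral, float max) {
        float t = (current - neutral) / (max - neutral);
        return Math.min(1f, Math.max(0f, t));
    }
}
```

A "mouth open" value would then be something like `normalize(vertexDistance(face.getMeshVertices(), UPPER_LIP, LOWER_LIP), neutralDist, maxDist)`. But I'm not sure this is robust against rotation, which is the core of my question.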
Source: https://stackoverflow.com/questions/58226299/tracking-face-mesh-vertices-of-augmented-faces-arcore-regardless-of-rotation