Question
I've recently started playing with Android's Camera API, specifically face detection and its setFaceDetectionListener callback. I'm having trouble understanding how to convert faces[0].rect.centerX() and faces[0].rect.centerY() into something I can use on screen, e.g. to move an image around so it stays centered on the detected face.
Can anyone help me understand how to convert from the coordinate system the camera reports to something I can use to position elements in my activity?
Answer 1:
From the docs:
"(-1000, -1000) represents the top-left of the camera field of view, and (1000, 1000) represents the bottom-right of the field of view."
So to map into screen coordinates (pseudocode):
screenX = (coord.x + 1000) * screen.width / 2000
screenY = (coord.y + 1000) * screen.height / 2000
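As a concrete sketch, that mapping can be written as a small plain-Java helper (the class and method names here are mine, not part of the Camera API; this also ignores display rotation and front-camera mirroring, which may additionally matter on a real device):

```java
// Maps a camera-space coordinate in [-1000, 1000] to a pixel position
// on a view of the given size, per the formula above.
public class FaceCoords {
    public static int toScreenX(int cameraX, int viewWidth) {
        return (cameraX + 1000) * viewWidth / 2000;
    }

    public static int toScreenY(int cameraY, int viewHeight) {
        return (cameraY + 1000) * viewHeight / 2000;
    }
}
```

For example, on a 1080x1920 preview, the camera-space center (0, 0) maps to (540, 960), and (-1000, -1000) maps to (0, 0).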
Answer 2:
Use this code, assuming `width` and `height` hold the image view's dimensions and the image sits in a RelativeLayout (note that centerX()/centerY() are in the camera's [-1000, 1000] space, so map them to screen coordinates first, as in Answer 1):
RelativeLayout.LayoutParams params = new RelativeLayout.LayoutParams(
        LayoutParams.WRAP_CONTENT, LayoutParams.WRAP_CONTENT);
params.setMargins(faces[0].rect.centerX() - width / 2,
        faces[0].rect.centerY() - height / 2, 0, 0);
image.setLayoutParams(params);
...and to get the width and height (views are not measured until layout has happened, so read them in onWindowFocusChanged rather than in onCreate):
@Override
public void onWindowFocusChanged(boolean hasFocus) {
    super.onWindowFocusChanged(hasFocus);
    width = image.getWidth();
    height = image.getHeight();
}
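Putting the two answers together, the margin arithmetic (convert the face center to screen space, then offset by half the image size so the image is centered on it) can be sketched without any Android classes; the names below are illustrative, not from the API:

```java
// Computes the {left, top} margins that center an image of the given size
// on a face center given in camera space ([-1000, 1000] on both axes).
public class FaceMargins {
    public static int[] marginsFor(int faceCenterX, int faceCenterY,
                                   int screenW, int screenH,
                                   int imageW, int imageH) {
        int screenX = (faceCenterX + 1000) * screenW / 2000;
        int screenY = (faceCenterY + 1000) * screenH / 2000;
        return new int[] { screenX - imageW / 2, screenY - imageH / 2 };
    }
}
```

On a 1080x1920 screen, a 200x100 image centered on the camera-space point (0, 0) would get margins of (440, 910).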
Source: https://stackoverflow.com/questions/22102503/camera-face-detection-coords-to-screen-coords