Camera face detection coords to screen coords

Submitted by 一笑奈何 on 2019-12-11 08:17:03

Question


I've recently started playing with Android's camera API, and I'm using face detection with its setFaceDetectionListener callback. I'm having trouble understanding how to convert faces[0].rect.centerX() and faces[0].rect.centerY() into something I can use on screen, for example to move an image so it stays centred on that point over the face.

Can anyone help me understand how to convert from the coordinate system the camera gives me to something I can use to position elements in my activity?


Answer 1:


From the docs:

(-1000, -1000) represents the top-left of the camera field of view, and (1000, 1000) represents the bottom-right of the field of view.

So to map into screen coordinates (pseudocode):

screenX = (coord.x + 1000) * screen.width / 2000
screenY = (coord.y + 1000) * screen.height / 2000
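
As a concrete sketch of the same mapping in Java (assuming the camera preview fills the view, and ignoring display rotation and front-camera mirroring; the FaceCoordMapper class and method names here are only illustrative):

import android.graphics.PointF;
import android.graphics.Rect;

public final class FaceCoordMapper {

    // Maps the centre of a face rect from camera face space (-1000..1000 on both axes)
    // to pixel coordinates inside a view of the given size.
    public static PointF faceCenterToView(Rect faceRect, int viewWidth, int viewHeight) {
        float x = (faceRect.centerX() + 1000f) * viewWidth / 2000f;
        float y = (faceRect.centerY() + 1000f) * viewHeight / 2000f;
        return new PointF(x, y);
    }
}

Calling faceCenterToView(faces[0].rect, preview.getWidth(), preview.getHeight()) (where preview is whatever view shows your camera feed) then gives a point in that view's pixel space.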



Answer 2:


Use this code (assuming you have integer width and height variables for the image, and that the image sits in a RelativeLayout, since setMargins() needs margin-capable LayoutParams):

RelativeLayout.LayoutParams params = new RelativeLayout.LayoutParams(RelativeLayout.LayoutParams.WRAP_CONTENT, RelativeLayout.LayoutParams.WRAP_CONTENT);
params.setMargins(faces[0].rect.centerX() - width / 2, faces[0].rect.centerY() - height / 2, 0, 0);
image.setLayoutParams(params);

...and to get the width and height:

@Override
public void onWindowFocusChanged(boolean hasFocus) {
    super.onWindowFocusChanged(hasFocus);
    width = image.getWidth();
    height = image.getHeight();
}
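
Note that faces[0].rect.centerX()/centerY() above are still in the camera's -1000..1000 face space. If the image does not end up over the face, one option is to push them through the mapping from the first answer before setting the margins; a rough sketch, reusing the params object from above (previewView and image are placeholder names):

// Convert the face centre to preview pixels first, then offset to centre the image on it.
float cx = (faces[0].rect.centerX() + 1000f) * previewView.getWidth() / 2000f;
float cy = (faces[0].rect.centerY() + 1000f) * previewView.getHeight() / 2000f;
params.setMargins((int) cx - width / 2, (int) cy - height / 2, 0, 0);
image.setLayoutParams(params);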


Source: https://stackoverflow.com/questions/22102503/camera-face-detection-coords-to-screen-coords
