Finger/Hand Gesture Recognition using Kinect

刺人心 2020-12-13 16:49

Let me explain my need before I explain the problem. I am working toward a hand-controlled application: navigation using the palm, and clicks using a grab/fist gesture.

Curren

8 Answers
  • 2020-12-13 16:59

    I've had quite a bit of success with the middleware library provided by http://www.threegear.com/. It provides several gestures (including grabbing, pinching and pointing) and 6-DOF hand tracking.

  • 2020-12-13 17:04

    You might be interested in this paper & open-source code:

    Robust Articulated-ICP for Real-Time Hand Tracking

    Code: https://github.com/OpenGP/htrack

    Screenshot: http://lgg.epfl.ch/img/codedata/htrack_icp.png

    YouTube Video: https://youtu.be/rm3YnClSmIQ

    Paper PDF: http://infoscience.epfl.ch/record/206951/files/htrack.pdf

  • 2020-12-13 17:05

    1) If there are a lot of false detections, you could extend the classifier's negative sample set and train it again. The extended negative image set should contain images in which the fist was falsely detected. This may yield a better classifier.
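
    The hard-negative mining loop described above can be sketched as follows. This is a minimal pure-Python illustration, not a specific library API: `detector` stands in for whatever callable returns candidate fist regions, and the image/ground-truth representation is hypothetical.

```python
def mine_hard_negatives(detector, images, ground_truth):
    """Collect false detections to extend the classifier's negative set.

    Regions the current fist detector reports on images that contain
    no fist are 'hard negatives'; adding them to the negative sample
    set and retraining should reduce false detections.
    `detector` is any callable returning candidate regions for an
    image (hypothetical interface, not a specific library API).
    """
    hard_negatives = []
    for img, has_fist in zip(images, ground_truth):
        for region in detector(img):
            if not has_fist:  # any detection on a fist-free image is false
                hard_negatives.append((img, region))
    return hard_negatives

# Toy usage: a detector that "fires" on any image name containing "skin".
negatives = mine_hard_negatives(
    lambda img: [(0, 0, 8, 8)] if "skin" in img else [],
    ["skin_wall", "empty_room", "skin_arm"],
    [False, True, False],  # only the middle image actually shows a fist
)
```

    In a real pipeline the collected regions would be cropped out and appended to the negative training images before retraining the cascade.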

  • 2020-12-13 17:12

    You don't need to train your first algorithm, since training will complicate things. Don't use color either, since it is unreliable (it mixes with the background and changes unpredictably depending on lighting and viewpoint).

    1. Assuming that your hand is the closest object, you can simply segment it out with a depth threshold. You can set the threshold manually, use the closest region of the depth histogram, or perform connected components on the depth map to break it into meaningful parts first (and then select your object based not only on its depth but also on its dimensions, motion, user input, etc.). (Images in the original answer: depth image, connected-components output, and the hand mask improved with GrabCut.)
    2. Apply convexity defects from the OpenCV library to find the fingers.

    3. Track fingers rather than rediscovering them in 3D; this increases stability. I successfully implemented such finger detection about 3 years ago.
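
    Step 1 above can be sketched in pure Python. This is an illustrative toy, assuming the depth map is a 2D list of millimetre values with 0 marking invalid pixels; a real system would run cv2.threshold / cv2.connectedComponents (or GrabCut) on the actual Kinect frame instead.

```python
from collections import deque

def segment_nearest_object(depth, margin=150, invalid=0):
    """Segment the closest object in a depth map.

    Keeps pixels within `margin` mm of the minimum valid depth, then
    returns the largest 4-connected component of those pixels as a
    binary mask (so stray near pixels elsewhere are discarded).
    """
    h, w = len(depth), len(depth[0])
    valid = [d for row in depth for d in row if d != invalid]
    if not valid:
        return [[0] * w for _ in range(h)]
    near = min(valid)
    cand = [[1 if depth[y][x] != invalid and depth[y][x] <= near + margin else 0
             for x in range(w)] for y in range(h)]

    # Flood fill to find the largest candidate component.
    seen = [[False] * w for _ in range(h)]
    best = []
    for y in range(h):
        for x in range(w):
            if cand[y][x] and not seen[y][x]:
                comp, queue = [], deque([(y, x)])
                seen[y][x] = True
                while queue:
                    cy, cx = queue.popleft()
                    comp.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and cand[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                if len(comp) > len(best):
                    best = comp
    mask = [[0] * w for _ in range(h)]
    for y, x in best:
        mask[y][x] = 1
    return mask
```

    Selecting the largest component is the simplest of the selection criteria mentioned above; dimensions, motion or user input can be layered on top.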

  • 2020-12-13 17:15

    Read my paper :) http://robau.files.wordpress.com/2010/06/final_report_00012.pdf

    I have done research on gesture recognition for hands, and evaluated several approaches that are robust to scale, rotation etc. You have depth information which is very valuable, as the hardest problem for me was to actually segment the hand out of the image.

    My most successful approach was to trace the contour of the hand and, for each point on the contour, take its distance to the centroid of the hand. This gives a set of points that can be used as input for many training algorithms.
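
    The contour-to-centroid signature described above can be sketched as follows (pure Python; the contour is assumed to be a list of (x, y) points already extracted from the segmented hand). Normalizing by the maximum distance is one way to make the signature scale-invariant, in the spirit of the robustness the answer mentions.

```python
import math

def contour_signature(points):
    """Distance from each contour point to the hand centroid.

    Trails the contour and records each point's distance to the
    centroid, normalized by the peak distance so the signature does
    not depend on hand size. Extended fingertips show up as peaks
    near 1.0; a fist produces a flat signature.
    """
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    dists = [math.hypot(x - cx, y - cy) for x, y in points]
    peak = max(dists)
    return [d / peak for d in dists]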

    I use the image moments of the segmented hand to determine its rotation, so there is a good starting point on the hand's contour. It is very easy to distinguish a fist from a stretched-out hand and to count the number of extended fingers.

    Note that while it works fine, your arm tends to get tired from pointing into the air.

  • 2020-12-13 17:15

    If you only need detection of a fist/grab state, you should give Microsoft a chance. Microsoft.Kinect.Toolkit.Interaction contains methods and events that detect the grip / grip-release state of a hand. Take a look at the HandEventType of InteractionHandPointer. It works quite well for fist/grab detection, but it does not detect or report the positions of individual fingers.

    The next Kinect (Kinect One) detects 3 joints per hand (Wrist, Hand, Thumb) and has 3 hand-based gestures: open, closed (grip/fist) and lasso (pointer). If that is enough for you, you should consider the Microsoft libraries.
