Finger/Hand Gesture Recognition using Kinect

刺人心 2020-12-13 16:49

Let me explain my need before I explain the problem. I am looking to build a hand-controlled application: navigation using the palm and clicks using a grab/fist gesture.

Curren

8 Answers
  •  悲哀的现实
    2020-12-13 17:17

    The short answer is: yes, you can train your own gesture detector using depth data. It is fairly straightforward, but the right approach depends on the type of gesture.

    Suppose you want to detect a hand movement:

    1. Detect the hand position (x, y, z). Using OpenNI this is straightforward, as there is a dedicated tracking node for the hand.
    2. Execute the gesture and collect ALL the positions of the hand during it (a minimal collection sketch follows this list).
    3. With the list of positions, train an HMM (Hidden Markov Model). You can do this in Matlab, C, or Python, for example.
    4. Test the trained model on new executions of your gestures and use it to detect them.
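
    For steps 1 and 2, the collection loop is simply "append one (x, y, z) sample per frame while the gesture is being performed". Below is a minimal Python sketch of that idea; `get_hand_position` is a hypothetical callable standing in for whatever tracker you use (the OpenNI hand node, for instance):

    import numpy as np

    def record_gesture(get_hand_position, n_frames=60):
        # get_hand_position is hypothetical: it should wrap your tracker
        # (e.g. the OpenNI hand node) and return one (x, y, z) tuple in mm.
        positions = [get_hand_position() for _ in range(n_frames)]
        return np.asarray(positions)  # shape (n_frames, 3)

    # Record several repetitions of the same gesture for training:
    # train_seqs = [record_gesture(get_hand_position) for _ in range(20)]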

    Here you can find a nice tutorial and code (in Matlab); the code (test.m) is pretty easy to follow. Here is a snippet:

    %Load the collected training and test data
    training = get_xyz_data('data/train',train_gesture);
    testing = get_xyz_data('data/test',test_gesture);

    %Get clusters: vector-quantize the (x,y,z) points into N symbols
    [centroids N] = get_point_centroids(training,N,D);
    ATrainBinned = get_point_clusters(training,centroids,D);
    ATestBinned = get_point_clusters(testing,centroids,D);

    % Set priors (left-right transition matrix over M hidden states):
    pP = prior_transition_matrix(M,LR);

    % Train the model:
    cyc = 50;   % maximum number of training iterations
    [E,P,Pi,LL] = dhmm_numeric(ATrainBinned,pP,[1:N]',M,cyc,.00001);
    

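    For the Python option mentioned in step 3, the same pipeline, k-means quantization of the (x, y, z) stream followed by a discrete HMM, can be sketched with scikit-learn and hmmlearn. This is an illustration of the idea rather than the tutorial's code; note that hmmlearn calls the discrete model CategoricalHMM (MultinomialHMM in versions before 0.3):

    import numpy as np
    from sklearn.cluster import KMeans
    from hmmlearn import hmm

    def train_gesture_model(train_seqs, n_symbols=8, n_states=12):
        # train_seqs: list of (n_frames, 3) arrays of hand positions.
        # Build the codebook (analogue of get_point_centroids).
        kmeans = KMeans(n_clusters=n_symbols, n_init=10).fit(np.vstack(train_seqs))
        # Quantize each sequence into discrete symbols (get_point_clusters).
        symbols = [kmeans.predict(s).reshape(-1, 1) for s in train_seqs]
        # Train a discrete HMM on the symbols (analogue of dhmm_numeric).
        # The Matlab code also imposes a left-right transition prior; you
        # could approximate that by initializing model.transmat_ by hand.
        model = hmm.CategoricalHMM(n_components=n_states, n_iter=50)
        model.fit(np.vstack(symbols), lengths=[len(s) for s in symbols])
        return kmeans, model

    def score_sequence(kmeans, model, seq):
        # Log-likelihood of a new sequence; threshold it to detect the gesture.
        return model.score(kmeans.predict(seq).reshape(-1, 1))
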
    Dealing with fingers is pretty much the same, but instead of detecting the hand you need to detect the fingers. Since the Kinect skeleton does not provide finger joints, you need specific code to detect them (using segmentation or contour tracking). Some examples using OpenCV can be found here and here, but the most promising one is the ROS library, which has a finger node (see the example here).
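
    To give a flavour of the segmentation/contour-tracking approach, here is a minimal OpenCV sketch (Python, OpenCV >= 4) that slices the depth image around the tracked hand and counts fingers from convexity defects. The depth band and the defect-depth threshold are assumptions you would tune for your sensor:

    import cv2
    import numpy as np

    def count_fingers(depth_mm, hand_z_mm, band=100):
        # depth_mm: 2-D uint16 depth image in millimetres.
        # hand_z_mm: depth of the tracked hand (e.g. from the hand node).
        # band: half-width of the depth slice around the hand (assumed).
        mask = cv2.inRange(depth_mm, hand_z_mm - band, hand_z_mm + band)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return 0
        hand = max(contours, key=cv2.contourArea)  # assume largest blob is the hand
        hull = cv2.convexHull(hand, returnPoints=False)
        defects = cv2.convexityDefects(hand, hull)
        if defects is None:
            return 0
        # Each sufficiently deep convexity defect is a valley between fingers.
        valleys = sum(1 for i in range(defects.shape[0])
                      if defects[i, 0, 3] / 256.0 > 20)  # depth in px, tuned
        return valleys + 1 if valleys else 0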
