Kinect 3D gesture recognition based on skeleton movements - What libraries exist?


Question


What gesture recognition libraries (if any) exist for the Kinect? Right now I'm using OpenNI to record skeleton movements but am not sure how to go from that to triggering discrete actions.

My problem might be as simple as pose detection, but it could also be as complicated as time-based movements (i.e. detecting when someone moves their hand in a circle), depending on how difficult that is. The examples I've seen for pose detection have been very ad hoc - is this because a generic algorithm is difficult to get right?


Answer 1:


The NITE library (on top of OpenNI) has classes for detecting swipes and other gestures, but personally I've had trouble using the base OpenNI and NITE libraries together in C# (I keep running into AccessViolationExceptions). If you're writing managed code, XnVNITE.net.dll is what has the swipe detection. It's found under the PrimeSense/NITE folder after you install NITE.

If you can do without the skeleton and user recognition, there is also ManagedNite.dll, a somewhat redundant library shipped with the PrimeSense NITE install. It also has hand/gesture recognition but no skeleton/user detection.

Otherwise, you can certainly detect your own time-based swipe gesture, as you suggested. You should be able to detect if a series of hand points travels in a straight line with a function like this:

static bool DetectSwipe(Point3D[] points)
{
    int LineSize = 10;   // number of points in the array to look at
    int MinXDelta = 300; // required horizontal distance
    int MaxYDelta = 100; // max amount of vertical variation

    if (points == null || points.Length < LineSize)
        return false;

    int last = LineSize - 1;
    float x1 = points[0].X;
    float y1 = points[0].Y;
    float x2 = points[last].X;
    float y2 = points[last].Y;

    // The hand must travel far enough horizontally...
    if (Math.Abs(x1 - x2) < MinXDelta)
        return false;

    // ...without drifting too far vertically between the first and last point.
    if (Math.Abs(y1 - y2) > MaxYDelta)
        return false;

    // Length of the line from the first to the last point, used to turn the
    // cross-product term below into a perpendicular distance.
    float lineLength = (float)Math.Sqrt(
        (x2 - x1) * (x2 - x1) + (y2 - y1) * (y2 - y1));

    for (int i = 1; i < last; i++)
    {
        // Intermediate points must stay near the starting height...
        if (Math.Abs(points[i].Y - y1) > MaxYDelta)
            return false;

        // ...and near the straight line from the first to the last point.
        float distance = Math.Abs(
            (y2 - y1) * points[i].X -
            (x2 - x1) * points[i].Y +
            x2 * y1 - y2 * x1) / lineLength;

        if (distance > MaxYDelta)
            return false;
    }
    return true;
}

You could enhance this code to distinguish right from left swipes. I also did not include time computation in the example above: you would need to look at the timestamps of the first and last points and check that the swipe was completed within a certain amount of time. A minimal sketch of both ideas follows.
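Here is one way that enhancement could look, assuming you buffer each hand point together with a timestamp. The TimedPoint struct, the SwipeDirection enum, the 500 ms threshold, and the buffer size are illustrative assumptions, not part of the original answer:

struct TimedPoint
{
    public Point3D Position;
    public DateTime Time;
}

enum SwipeDirection { None, Left, Right }

static SwipeDirection DetectTimedSwipe(TimedPoint[] buffer)
{
    const int LineSize = 10; // must match DetectSwipe above
    TimeSpan maxDuration = TimeSpan.FromMilliseconds(500); // assumed limit

    if (buffer == null || buffer.Length < LineSize)
        return SwipeDirection.None;

    // Reject swipes that took too long from the first to the last sample.
    if (buffer[LineSize - 1].Time - buffer[0].Time > maxDuration)
        return SwipeDirection.None;

    // Reuse the straight-line test from DetectSwipe on the raw positions.
    Point3D[] points = new Point3D[LineSize];
    for (int i = 0; i < LineSize; i++)
        points[i] = buffer[i].Position;

    if (!DetectSwipe(points))
        return SwipeDirection.None;

    // The sign of the horizontal displacement gives the direction.
    return points[LineSize - 1].X > points[0].X
        ? SwipeDirection.Right
        : SwipeDirection.Left;
}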




Answer 2:


Check this out: http://kinectrecognizer.codeplex.com/

It supports 3D tracking and fine-tuning of the recognition, and should be easy to reuse as well.




Answer 3:


Softkinetic looks promising, but the SDK is not freely available just yet.




Answer 4:


I am working on standalone skeleton-detection code for the Kinect: http://code42tiger.blogspot.com

I am planning to release it for free, but it is still far from perfect. If your requirement is only hand-position tracking, you can write it yourself without even using OpenNI or any other library. If you need a simple tip, read on.

1) Background removal (explained in my blog).
2) Blob detection, to choose which person to track (also explained in the blog).
3) Hand tracking: once the user's pixels are isolated, you can easily find the hand by taking the farthest point from the body (a rough sketch of this idea appears below).
4) Track the hand position over time to detect gestures: some calculation that follows the hand every few frames will give you the geometry of the movement.

This should work (if not perfectly) about 75% of the time. Unless the user deliberately tries to break the algorithm, it should work for normal users.
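For step 3 above, here is a rough, self-contained sketch of the farthest-point idea. It assumes that background removal and blob detection have already produced a row-major boolean mask of the tracked user's pixels; the mask layout and the method name are illustrative assumptions, not part of the original answer:

static bool FindHandCandidate(bool[] userMask, int width, int height,
                              out int handX, out int handY)
{
    handX = 0;
    handY = 0;

    long sumX = 0, sumY = 0, count = 0;

    // Centroid of the user blob.
    for (int y = 0; y < height; y++)
        for (int x = 0; x < width; x++)
            if (userMask[y * width + x]) { sumX += x; sumY += y; count++; }

    if (count == 0)
        return false;

    float cx = (float)sumX / count;
    float cy = (float)sumY / count;

    // The user pixel farthest from the centroid is the hand candidate.
    // This works best when the arm is extended away from the body.
    float bestDist = -1f;
    for (int y = 0; y < height; y++)
    {
        for (int x = 0; x < width; x++)
        {
            if (!userMask[y * width + x])
                continue;

            float dx = x - cx;
            float dy = y - cy;
            float dist = dx * dx + dy * dy;
            if (dist > bestDist)
            {
                bestDist = dist;
                handX = x;
                handY = y;
            }
        }
    }
    return true;
}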



Source: https://stackoverflow.com/questions/5234094/kinect-3d-gesture-recognition-based-on-skeleton-movements-what-libraries-exis
