Gesture detection?

  • Question

  • One thing I'm not seeing so far in the SDK is any kind of gesture detection. I'm assuming it might not be that bad to do the math myself and try to recognize basic gestures. However, this seems like a common need, so I thought I'd first check whether I'm missing something: is there a "gesture helper" either built in or that someone has created?

     

    While the SDK samples are cool and fun, it would be far more practical for me to have a sample of how you might recognize, say, the Kinect "wave", or left, right, up, and down hand swipes.

    Saturday, June 18, 2011 9:08 PM

Answers

All replies

  • If it helps you: I kept the last OpenNI coordinates and compared the difference between the current and previous coordinates to check whether there was a gesture. As far as I can see, this SDK works the same way. I am developing a basic class to raise these as events; I hope to have it finished this weekend so people have it easier.
    Saturday, June 18, 2011 9:32 PM
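A minimal sketch of that frame-to-frame comparison might look like the following. The class name, threshold value, and gesture labels are all hypothetical; 0.15 m per window is just a placeholder that would need tuning.

```csharp
// Hypothetical sketch: detect a horizontal swipe by comparing the
// current hand X position against the one from the previous frame.
public class SwipeDetector
{
    private float? lastX;
    private const float Threshold = 0.15f; // meters; tune for your setup

    // Call once per skeleton frame with the tracked hand's X position.
    // Returns a gesture label, or null if no gesture was detected.
    public string Update(float currentX)
    {
        string gesture = null;
        if (lastX.HasValue)
        {
            float delta = currentX - lastX.Value;
            if (delta > Threshold) gesture = "SwipeRight";
            else if (delta < -Threshold) gesture = "SwipeLeft";
        }
        lastX = currentX;
        return gesture;
    }
}
```

In practice you would compare positions several frames apart rather than adjacent frames, since per-frame deltas are tiny and noisy.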
  • Thanks, that'd be great. I'm working on the audio side of Kinect currently but hope to come back to this soon.
    Saturday, June 18, 2011 9:52 PM
  • If it helps you: I kept the last OpenNI coordinates and compared the difference between the current and previous coordinates to check whether there was a gesture. As far as I can see, this SDK works the same way. I am developing a basic class to raise these as events; I hope to have it finished this weekend so people have it easier.
    Do you have any idea where I can get those coordinates?
    Monday, June 20, 2011 1:52 PM
  • Tareq,

    The joint coordinates are available as part of the SkeletonData.Joints collection. You'll find SkeletonData elements in the SkeletonFrame structure. Look at the SkeletalViewer sample and the corresponding walkthrough (http://research.microsoft.com/en-us/um/redmond/projects/kinectsdk/docs/SkeletalViewer_Walkthrough.pdf) for examples of how to retrieve this data.

    Hope this helps,
    Eddy


    I'm here to help
    Wednesday, June 22, 2011 2:37 AM
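As a rough sketch of what Eddy describes, assuming the beta SDK's skeletal API as used in the SkeletalViewer sample (event, type, and enum names may differ in later SDK versions):

```csharp
// Sketch based on the beta Kinect SDK (Microsoft.Research.Kinect.Nui),
// following the SkeletalViewer sample; names may differ in later SDKs.
void nui_SkeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
{
    foreach (SkeletonData data in e.SkeletonFrame.Skeletons)
    {
        // Only fully tracked skeletons have reliable joint positions.
        if (data.TrackingState != SkeletonTrackingState.Tracked)
            continue;

        Joint rightHand = data.Joints[JointID.HandRight];
        Console.WriteLine("Right hand: {0}, {1}, {2}",
            rightHand.Position.X, rightHand.Position.Y, rightHand.Position.Z);
    }
}
```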
  • That would be awesome. Please put up some sample code once it's complete!

    Friday, June 24, 2011 11:30 PM
  • Right now I'm working on detecting the gesture of closing your hand into a fist, to use for clicking. I will definitely post the code once I'm satisfied with it. Maybe someone can expand on it and make it better.

    Saturday, June 25, 2011 3:52 AM
  • If you're any good at your chosen language, it's not too hard; I rolled my own gesture detection system in half a day. One approach is to build a List<List<Joint>> and use it as a buffer: in your skeleton-frame-available event handler, append a new List<Joint> containing each current joint.

    I went the other way: my inner List<Joint> is a list of positions for one specific joint, and my outer List<List<Joint>> contains those lists as its items.

    Then trim the earliest indices off the list. I keep mine at 30 frames, which is about a second. The handler then calls other methods that detect gestures by analyzing the changes in that list.

    Here's a rough example from a NUI project I'm working on. While the Windows Magnifier is zoomed in, it detects a change in the position of the hip and scrolls the magnified desktop horizontally based on where the person is along the X (horizontal) axis. On every skeletal frame it runs a series of gesture detectors like this one; CPU usage stays fairly low, and it runs fairly fast.

     

     List<List<Joint>> JointHistory = new List<List<Joint>>();

     // (code to populate buffer from skeleton data)

     // History of the hip-center joint; the outer list is indexed by joint ID.
     List<Joint> joints = JointHistory[(int)JointID.HipCenter];
     double delta = joints.Last().Position.X - joints.First().Position.X;

     // Ignore small movements (skeleton positions are in meters).
     if (delta > -0.1 && delta < 0.1)
         return false;

     onMessage("Hip delta: " + delta.ToString());

     // SendKeys acts weird: you have to send it strings. Here we hold down
     // alt (%) and ctrl (^) while pressing an arrow key a number of times.
     string keys = "%^(";

     if (delta < 0)
     {
         onMessage("Scroll screen left");
         for (int i = 0; i < ScrollsPer10cmH * 100; i++)
             keys += "{LEFT}";
     }
     else
     {
         onMessage("Scroll screen right");
         for (int i = 0; i < ScrollsPer10cmH * 100; i++)
             keys += "{RIGHT}";
     }

     SendKeys.SendWait(keys + ")");

     


    Saturday, June 25, 2011 8:40 AM
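The buffer layout described above (outer list indexed by joint ID, inner list holding that joint's recent positions, trimmed to 30 frames) might be maintained roughly like this. The method and constant names are illustrative, and the `Joint.ID` property is assumed from the beta SDK:

```csharp
// Illustrative sketch of the rolling joint-history buffer described above.
const int MaxFrames = 30; // roughly one second at 30 fps

List<List<Joint>> JointHistory = new List<List<Joint>>();

void OnSkeletonFrame(SkeletonData data)
{
    foreach (Joint joint in data.Joints)
    {
        int id = (int)joint.ID; // beta SDK exposes the JointID via Joint.ID

        // Grow the outer list until the joint ID is a valid index.
        while (JointHistory.Count <= id)
            JointHistory.Add(new List<Joint>());

        List<Joint> history = JointHistory[id];
        history.Add(joint);
        if (history.Count > MaxFrames)
            history.RemoveAt(0); // drop the oldest frame
    }
}
```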
  • Just as a side note, we'll probably start seeing lots of gesture libraries popping up on CodePlex over the next few weeks. Implementation isn't too hard.

    The really tricky part is deciding which gestures we care about. Gestures for the Kinect are a convention and rarely truly natural. Any semi-official gesture library also risks setting expectations for what our Kinect conventions should be, and, as anyone with an Xbox has noticed, some of the current conventions can be terribly annoying.

    For instance, I can live with the wave gesture as a way to bring up a menu. The hover-and-hold select common to most Xbox games is actually a pain. Hover-and-palm-pump is just plain silly. Hover-and-swipe (from Dance Central) is brilliant for menus, but not for all selection scenarios. Hover-and-grasp is interesting, but hand-pose recognition is tricky and requires the hands to be very close to the camera. The new gesture used in the Paint program on Coding4Fun (you raise a second arm to signal a "down" gesture while the first arm draws) is questionable as well as fatiguing.

    The hard part of a good gesture library isn't really the programming so much as the user-design issues involved.


    James Ashley - Presentation Layer Architect at Razorfish Emerging Experiences
    jamesashley@imaginativeuniversal.com
    www.imaginativeuniversal.com www.emergingexperiences.com
    Sunday, June 26, 2011 6:59 PM
  • Here's some sample code for gesture and position recognition:

    http://aswathkrishnan.tumblr.com/post/7175233975/geekalert-kinecting-2-position-and-gesture

    Sunday, July 3, 2011 1:48 AM
  • I also posted a DTW (dynamic time warping) gesture recognizer here:

    http://social.msdn.microsoft.com/Forums/en-US/kinectsdknuiapi/thread/4a428391-82df-445a-a867-557f284bd4b1

    Sunday, July 3, 2011 8:51 AM
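For reference, the core of a DTW recognizer is a small dynamic-programming distance function. This is a minimal sketch over 1-D sequences (the method name is illustrative; a real gesture recognizer would compare sequences of joint-position vectors rather than scalars):

```csharp
// Minimal dynamic time warping distance between two 1-D sequences.
// Smaller results mean the sequences are more similar after warping.
static double DtwDistance(double[] a, double[] b)
{
    int n = a.Length, m = b.Length;
    var d = new double[n + 1, m + 1];
    for (int i = 0; i <= n; i++)
        for (int j = 0; j <= m; j++)
            d[i, j] = double.PositiveInfinity;
    d[0, 0] = 0.0;

    for (int i = 1; i <= n; i++)
    {
        for (int j = 1; j <= m; j++)
        {
            double cost = Math.Abs(a[i - 1] - b[j - 1]);
            d[i, j] = cost + Math.Min(d[i - 1, j],        // insertion
                          Math.Min(d[i, j - 1],           // deletion
                                   d[i - 1, j - 1]));     // match
        }
    }
    return d[n, m];
}
```

To recognize a gesture, you would compute this distance between the live joint-history buffer and each recorded gesture template, then pick the template with the smallest distance below some threshold.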
  • Thanks, all. I've been using the Kinect Toolkit for a while and it has given me a good basis for doing gestures, though I haven't had time to work on my Kinect project much lately.

     

    http://blogs.msdn.com/b/eternalcoding/archive/2011/07/04/gestures-and-tools-for-kinect.aspx


    Geoff
    • Marked as answer by thnk2wn Tuesday, August 16, 2011 1:19 AM
    Tuesday, August 16, 2011 1:19 AM