Kinect Gesture NUI -- Missing RSS feed

  • General discussion

  • One thing that I think is missing from the SDK release today is detection of gestures. I am referring to those that have become standard on the Xbox, e.g. wave, swipe, run, and so on.

    Gestures are very natural, and as part of the SDK they would be a perfect transposition of the Windows 7 touch gestures. Even though we could mimic these gestures by controlling the mouse, there is no good equivalent to click-and-hold with the Kinect.

    Does anyone have ideas on how to go about implementing gestures in program UIs?

    Thursday, June 16, 2011 9:04 PM

All replies

  • Hey Martin,

    Gestures are something personal to a developer, so depending on your requirements you could either code them yourself or use a third-party middleware platform. Using the skeletal data provided by the API, you can calculate basic gestures with math, or more advanced gestures with machine learning. If you want to get started quickly, a middleware platform such as PrimeSense's NITE may be more suitable: you can feed it the skeletal data, and it will run the algorithms for you and simply raise events for the 30 or so gestures it knows.

    I would guess, however, that you will start to see open-source and community contributions in this space over the next few days and weeks, so keep your eyes open!
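To illustrate the "basic gestures using math" idea above: a swipe can be detected by tracking a hand joint's horizontal position over a short time window and firing when it travels far enough, fast enough. The sketch below is illustrative only (Python rather than .NET, with made-up class and parameter names); it assumes the application feeds in per-frame hand x positions in metres, as the Kinect skeletal stream would provide.

```python
from collections import deque

class SwipeDetector:
    """Minimal math-based swipe detector sketch (hypothetical API).

    Feed it one (timestamp, hand_x) sample per frame; it reports a
    swipe when the hand travels min_distance metres within
    max_duration seconds. Thresholds are illustrative guesses.
    """

    def __init__(self, min_distance=0.4, max_duration=0.5):
        self.min_distance = min_distance  # metres the hand must travel
        self.max_duration = max_duration  # seconds the travel may take
        self.history = deque()            # recent (timestamp, x) samples

    def update(self, timestamp, hand_x):
        """Process one frame; return 'left', 'right', or None."""
        self.history.append((timestamp, hand_x))
        # Drop samples that fell out of the detection window.
        while timestamp - self.history[0][0] > self.max_duration:
            self.history.popleft()
        # Net horizontal displacement across the window.
        dx = hand_x - self.history[0][1]
        if dx >= self.min_distance:
            self.history.clear()          # reset to avoid re-triggering
            return "right"
        if dx <= -self.min_distance:
            self.history.clear()
            return "left"
        return None

# Simulated 30 fps frames: the hand moves right 5 cm per frame.
detector = SwipeDetector()
events = [detector.update(i / 30.0, i * 0.05) for i in range(10)]
# A 'right' swipe is reported once the hand has covered 0.4 m.
```

More complex gestures (e.g. wave) would need more state, which is where the machine-learning or middleware route mentioned above starts to pay off.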


    Follow me on Twitter: @LewisBenge or check out my blog:
    Friday, June 17, 2011 1:04 AM
  • Hi,

    Are there any libraries available in .NET for applying machine learning to skeletal data to recognize complex gestures?


    Martin Menezes

    Monday, November 7, 2011 3:02 AM