ControlsBasics sample issues

  • Question

  • I've been having some issues with both the WPF and XAML versions of the ControlsBasics sample.

    Each one builds fine and runs. The issue I'm having is that it never brings up the Kinect cursor for me to navigate with my hand. I've held my hand still for about 30 seconds waiting.

    Here's the setup:

    Kinect v2 sensor on a tripod about 5 feet off the ground.  

    x64 system

    16GB RAM

    Win 8.1 Enterprise

    Most of the samples I've tried have worked fine. 

    On a semi-related note, I've been unsuccessful with KinectRegion in my own code. My application won't build because it can't find KinectRegion in the referenced kinect/2014 schema.
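
    For reference, here is roughly the markup I'm attempting, modeled on the samples (a sketch; it assumes the Microsoft.Kinect.Wpf.Controls assembly from SDK 2.0 is referenced, since as far as I can tell the kinect/2014 namespace only resolves once that assembly reference is in place):

```xml
<!-- Sketch only: prefixes other than "k" are the stock WPF ones. -->
<Window x:Class="MyApp.MainWindow"
        xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
        xmlns:k="http://schemas.microsoft.com/kinect/2014"
        Title="KinectRegion test" Width="1280" Height="720">
    <k:KinectRegion x:Name="kinectRegion">
        <Grid>
            <Button Content="Press Me" Width="250" Height="100"/>
        </Grid>
    </k:KinectRegion>
</Window>
```

    If I understand the samples right, the region also needs a sensor assigned in code-behind (e.g. kinectRegion.KinectSensor = KinectSensor.GetDefault();), but my build error happens before that ever runs, which makes me think the assembly reference itself is the problem.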

    The main reason my company decided to buy a Kinect was to play with the interactivity in WPF applications and so far, we haven't been able to see it work.

    Thanks for any help!


    Eat.Sleep.Code.

    Monday, July 28, 2014 3:43 PM

All replies

  • Thursday, July 31, 2014 6:52 PM
  • 1) We're improving the engagement detector to be a little less stringent.

    2) The engagement detector:

    • is looking for an open hand, held up towards the screen, that is still.
    • Putting the sensor centered with the screen (ideally above it at about 6', but OK below it at ~2') is key. We pay attention to your body being oriented (shoulder-wise) towards the sensor.
    • If you don't successfully engage in the first few seconds of having your hand up, we stop considering you (this is to combat skeletons that body tracking picks up that aren't real people; we don't want a phantom person to make a cursor show up and start pressing things). To be considered again, wave your hand and then hold it still again, or lower it to your side for 1-2 seconds and lift it up again.
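
    Those same cues are visible through the body-frame API if you want to see what the detector is reacting to. Here's a rough, illustrative check (this is not our actual detector implementation, and the thresholds are made up; it assumes the Microsoft.Kinect assembly from SDK 2.0):

```csharp
using System;
using Microsoft.Kinect;

static class EngagementSketch
{
    // Rough approximation of the cues above: an open hand, raised,
    // on a tracked body whose shoulders face the sensor.
    public static bool LooksEngaged(Body body)
    {
        if (body == null || !body.IsTracked)
            return false;

        // Open hand, with decent tracking confidence.
        bool handOpen = body.HandRightState == HandState.Open
                        && body.HandRightConfidence == TrackingConfidence.High;

        // Hand held up: right hand above the right elbow.
        Joint hand = body.Joints[JointType.HandRight];
        Joint elbow = body.Joints[JointType.ElbowRight];
        bool handRaised = hand.TrackingState != TrackingState.NotTracked
                          && hand.Position.Y > elbow.Position.Y;

        // Shoulders roughly facing the sensor: both at similar depth (Z).
        Joint left = body.Joints[JointType.ShoulderLeft];
        Joint right = body.Joints[JointType.ShoulderRight];
        bool facingSensor =
            Math.Abs(left.Position.Z - right.Position.Z) < 0.15f;

        return handOpen && handRaised && facingSensor;
    }
}
```

    (The real detector also tracks stillness over time and the give-up/re-engage behavior described above, which a per-frame check like this can't capture.)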

    Hope this helps.

    Thanks, Rob Relyea

    PM, Kinect for Windows Interactions


    Rob Relyea [MSFT]

    Friday, August 1, 2014 4:44 PM
  • Hi Rob,

    Following this question:
    http://social.msdn.microsoft.com/Forums/en-US/31aaaa83-ec2c-4879-b45e-4ecb0397e912/best-way-to-handle-engagement-in-kinect-v2-?forum=k4wv2devpreview

    Where are the API features to control engagement, or to get notified about engaged people? If I iterate over the bodies, can I tell which skeletons are currently engaged?

    How can I mash this up with speech and face tracking? I mean, if speech is recognized, can I know which skeleton is speaking?
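
    For the speech part, the closest thing I've found so far is correlating audio beams to bodies myself. A sketch of what I mean, assuming SDK 2.0 (if I read the docs right, AudioBeamSubFrame.AudioBodyCorrelations links audio to a body's TrackingId):

```csharp
using System;
using Microsoft.Kinect;

// Sketch (inside app initialization): map the audio beam back to
// the speaking body via TrackingId correlation.
KinectSensor sensor = KinectSensor.GetDefault();
sensor.Open();

Body[] bodies = new Body[sensor.BodyFrameSource.BodyCount];

BodyFrameReader bodyReader = sensor.BodyFrameSource.OpenReader();
bodyReader.FrameArrived += (s, e) =>
{
    using (BodyFrame frame = e.FrameReference.AcquireFrame())
    {
        if (frame != null)
            frame.GetAndRefreshBodyData(bodies);
    }
};

AudioBeamFrameReader audioReader = sensor.AudioSource.OpenReader();
audioReader.FrameArrived += (s, e) =>
{
    using (AudioBeamFrameList frames = e.FrameReference.AcquireBeamFrames())
    {
        if (frames == null) return;
        foreach (AudioBeamSubFrame sub in frames[0].SubFrames)
        {
            foreach (AudioBodyCorrelation corr in sub.AudioBodyCorrelations)
            {
                foreach (Body body in bodies)
                {
                    if (body != null && body.IsTracked
                        && body.TrackingId == corr.BodyTrackingId)
                    {
                        Console.WriteLine("Body {0} is speaking",
                                          body.TrackingId);
                    }
                }
            }
        }
    }
};
```

    But I don't see how to tie that back to whoever KinectRegion considers engaged, which is why I'm asking.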

    Wednesday, August 6, 2014 8:41 AM