Unity Hand Control - Kinect SDK 2

  • Question

  • Hello,

    I want to implement hand control like the "Controls Basics" demo in the Kinect SDK v2 sample projects. However, the WPF/XAML sample relies on functionality that does not appear to have been ported to the Unity plugin provided by Microsoft, for example KinectCoreWindow.SetKinectPersonManualEngagement, BodyHandPairs, etc. Not all namespaces appear to be available; all I can see is Windows.Kinect.

    Watching the Channel 9 videos on programming Kinect for Windows v2 (http://channel9.msdn.com/Series/Programming-Kinect-for-Windows-v2), I was given the impression that this functionality was provided. Am I missing something, or is the hand-control functionality provided by SetKinectPersonManualEngagement not currently available, so that I would need to implement it myself using lower-level API functionality?
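
    For reference, the engagement setup I mean is what the WPF ControlsBasics sample does, roughly like this (written from memory; I believe the overload is SetKinectOnePersonManualEngagement taking a BodyHandPair, but the exact names should be checked against the sample):

        // Sketch of the manual-engagement call from the WPF ControlsBasics sample.
        // These types live in the Kinect input assemblies (Microsoft.Kinect.Input /
        // Microsoft.Kinect.Toolkit.Input), none of which the Unity plugin exposes.
        using Microsoft.Kinect.Input;

        void EngageBody(ulong bodyTrackingId)
        {
            // Pair one tracked body's right hand with the Kinect cursor.
            KinectCoreWindow.SetKinectOnePersonManualEngagement(
                new BodyHandPair(bodyTrackingId, HandType.RIGHT));
        }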

    Thanks for your time

    Thursday, August 21, 2014 12:19 AM

All replies

  • None of the "Pay for Play" APIs, such as Interactions, Face/FaceHD, and Fusion, have been ported to Unity yet. There is some work underway to bring the Fusion and Face APIs to Unity, but I don't know the state of that yet.

    Interactions is a bit more difficult, since Unity hasn't had a standard UI layer; they just recently announced one as a beta for v4. The Kinect Interactions framework is designed to be extended so it can be enabled for other libraries.

    They would be built on top of the types in the Microsoft.Kinect.Toolkit.Input namespace that ships in Microsoft.Kinect.WPF.Controls.dll. This would have to be wrapped for Unity itself, since Unity can't consume these .NET libraries directly.

    The job of this controls layer is to listen to pointer events (see my answer above), hit test in a UI framework window given that location, and create a HitTestResult.

    It should create PressableModels (for pressable things) and ManipulatableModels (for scrollable things), and set the appropriate KinectGestureRecognizerSettings on the KinectGestureRecognizer that those models hold.

    It would create an InputPointerManager instance and call HandlePointerAsCursor with the appropriate parameters, including the HitTestResult referenced above.

    That function will then route the pointer point to the right gesture recognizers, based on both hit testing and captured pressable/manipulatable controls.
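
    To make that flow concrete, here is a rough sketch of such a controls layer. The type names are the ones above; the event name (PointerMoved), the HandlePointerAsCursor parameter list, and the HitTestLocation helper are assumptions, not verified against the shipped assembly:

        // Rough sketch of a controls layer over Microsoft.Kinect.Toolkit.Input.
        // Signatures are illustrative; check Microsoft.Kinect.WPF.Controls.dll.
        using System;
        using Microsoft.Kinect;                // PointF
        using Microsoft.Kinect.Input;          // KinectCoreWindow, KinectPointerPoint
        using Microsoft.Kinect.Toolkit.Input;  // InputPointerManager, HitTestResult, ...

        class CustomControlsLayer
        {
            private readonly InputPointerManager pointerManager = new InputPointerManager();

            public void Start()
            {
                // 1. Listen to pointer events from the core window.
                KinectCoreWindow.GetForCurrentThread().PointerMoved += this.OnPointerMoved;
            }

            private void OnPointerMoved(object sender, KinectPointerEventArgs e)
            {
                KinectPointerPoint point = e.CurrentPoint;

                // 2. Hit test the pointer location in the hosting UI framework.
                HitTestResult hit = this.HitTestLocation(point.Position);

                // 3. Route the pointer to the right gesture recognizers, based on
                //    the hit test and any captured pressable/manipulatable controls.
                this.pointerManager.HandlePointerAsCursor(point, hit);
            }

            private HitTestResult HitTestLocation(PointF position)
            {
                // Framework-specific: find the control under 'position' and return
                // a HitTestResult wrapping its PressableModel or ManipulatableModel.
                throw new NotImplementedException();
            }
        }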

    You can review this thread for more insight, but understand that any framework using this would need to support Microsoft .NET:

    http://social.msdn.microsoft.com/Forums/en-US/6b8d6251-c59a-46c7-9da8-b912cb16dfab/kinectregionaddhandpointerhandler?forum=kinectv2sdk


    Carmine Sirignano - MSFT


    Monday, August 25, 2014 5:43 PM
  • Hi Carmine

    Is there a roadmap for full Unity integration? I have begun writing some components that mimic the Microsoft.Kinect.Toolkit.Input functionality, but that will have been a waste of time once this is available natively.

    As a Unity developer, it would be quite simple to convert hit tests into a usable result, along the lines of the sketch below.
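
    For example (a minimal sketch; UnityHitTestResult is a hypothetical stand-in for whatever result type a wrapped toolkit would expect):

        // Minimal Unity sketch: convert a screen-space cursor position into a
        // hit-test result with a physics raycast. UnityHitTestResult is a
        // hypothetical stand-in for the toolkit's HitTestResult.
        using UnityEngine;

        public struct UnityHitTestResult
        {
            public GameObject Target;  // control under the cursor, or null
            public Vector3 Point;      // world-space hit point
        }

        public class KinectCursorHitTester : MonoBehaviour
        {
            public UnityHitTestResult HitTest(Vector2 screenPosition)
            {
                var result = new UnityHitTestResult();
                Ray ray = Camera.main.ScreenPointToRay(screenPosition);
                RaycastHit hit;

                if (Physics.Raycast(ray, out hit))
                {
                    result.Target = hit.collider.gameObject;
                    result.Point = hit.point;
                }

                return result;
            }
        }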

    Would it be possible to use P/Invoke to access Microsoft.Kinect.Toolkit.dll?

    Any additional advice the community can offer would be greatly appreciated.

    Tuesday, September 30, 2014 1:09 AM