Touch is not working on Kinect + Surface

  • Question

  • When I run the "ControlBasics-WPF" Kinect C# sample on the Surface (PixelSense), it recognizes the click event (the "KinectTileButtonClick" event) for hand, touch, and mouse clicks. But when I try the same thing in a Surface app, the button responds only to hand and mouse clicks, not to touch. After some more investigation I found that the click event is recognized only if you touch twice (like a double click). However, if I attach a separate handler, the "TouchDown" event does respond to a single touch. Can I have any suggestions on this issue? The sample app is Here

    Thursday, July 25, 2013 12:05 PM

All replies

  • The Kinect controls are independent of the PixelSense SDK controls. If you are trying to mix the two, they may share the WPF framework control structure, but their event systems may conflict with each other. You will have to develop your own controls that can use both SDKs in some way.

    Since you have the code for the controls themselves, you can certainly modify those to meet your needs.
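Since the asker reports that a plain "TouchDown" handler does fire on a single touch while the Kinect click event does not, one possible workaround is to bridge the WPF touch event to the same logic that the Kinect click handler runs. This is only a sketch under assumptions: it assumes a `KinectTileButton` named `kinectTileButton` declared in XAML with its `Click` handler wired up, and the handler names (`OnTileButtonTouchDown`, `HandleTileActivated`) are hypothetical.

```csharp
// Hypothetical bridge: route a single WPF touch press on a KinectTileButton
// to the same logic the Kinect hand-pointer click uses, since the Kinect
// interaction pipeline may swallow single-touch presses on PixelSense.
using System.Windows;
using System.Windows.Input;
using Microsoft.Kinect.Toolkit.Controls;

public partial class MainWindow : Window
{
    public MainWindow()
    {
        InitializeComponent();

        // kinectTileButton is assumed to be defined in the window's XAML.
        kinectTileButton.TouchDown += OnTileButtonTouchDown;
    }

    // Fires on a single touch even when the Kinect click event does not.
    private void OnTileButtonTouchDown(object sender, TouchEventArgs e)
    {
        // Mark the touch as handled so it is not re-processed elsewhere,
        // then run the shared activation logic.
        e.Handled = true;
        HandleTileActivated();
    }

    // Kinect hand-pointer / mouse click path (wired in XAML via
    // KinectTileButtonClick or Click, depending on the control version).
    private void OnTileButtonClick(object sender, RoutedEventArgs e)
    {
        HandleTileActivated();
    }

    // Single place for the button's behavior, shared by all input paths.
    private void HandleTileActivated()
    {
        // ... application-specific response to the tile being activated ...
    }
}
```

Consolidating the behavior into one shared method keeps hand, mouse, and touch paths consistent, rather than duplicating logic per input type.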

    Carmine Sirignano - MSFT

    Thursday, July 25, 2013 6:04 PM