Help with gesture detection

  • Question

  • Hi,

    1. Short version of my question:

    Is there any way to detect basic user gestures (press, pan, zoom) without the Controls library, and when the app is not the focused window?

    2. Detailed version:

    My idea is to "recover" part of the old InteractionStream, because I am interested in sending these events to a browser (in a port of the webserver v1 app, or in a new app), but it could be useful for other applications too.

    I have looked into Microsoft.Kinect.Input, but the pointer events there are based on a KinectCoreWindow instance and are only raised if the app is the focused window. There is a GestureRecognizer class with the events I am interested in (pressStart, pressComplete, ...), but I cannot figure out how to make it work, nor when these events are raised.

    I think that gestures from VGB can be detected through a source reader (the same way body or color frames are read), which does not require the window to be focused or even visible. I would like to know whether something similar is possible with the predefined gestures; something like the sketch below.

    Any help would be appreciated :)
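
    A concrete sketch of the reader-based approach from point 2, using only body frames and hand open/closed states as a crude stand-in for "press" (class and handler names are just illustrative):

    ```csharp
    using System;
    using Microsoft.Kinect;

    // Sketch: a plain frame reader runs in whatever process owns the sensor,
    // regardless of which window has focus. Hand open/closed states give a
    // rough "press"-like signal without the Controls library.
    class HandStateSketch
    {
        private static Body[] bodies;

        static void Main()
        {
            KinectSensor sensor = KinectSensor.GetDefault();
            sensor.Open();

            BodyFrameReader reader = sensor.BodyFrameSource.OpenReader();
            reader.FrameArrived += OnBodyFrameArrived;

            Console.ReadLine();   // keep the (non-focused, even windowless) process alive
            reader.Dispose();
            sensor.Close();
        }

        private static void OnBodyFrameArrived(object sender, BodyFrameArrivedEventArgs e)
        {
            using (BodyFrame frame = e.FrameReference.AcquireFrame())
            {
                if (frame == null) return;

                if (bodies == null) bodies = new Body[frame.BodyCount];
                frame.GetAndRefreshBodyData(bodies);

                foreach (Body body in bodies)
                {
                    if (body == null || !body.IsTracked) continue;

                    // HandState (Open, Closed, Lasso, ...) keeps arriving whether or
                    // not this app is the focused window.
                    Console.WriteLine("Body {0}: right hand {1}", body.TrackingId, body.HandRightState);
                }
            }
        }
    }
    ```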

    Thursday, February 5, 2015 4:39 PM

Answers

  • There are no gestures natively included in the core SDK; the extensions come from the libraries you mention. You can use VGB to record your own gestures and bubble those up as a system-level component. Keep in mind, you cannot run Kinect as a service.

    There are lower levels of the controls that you can access; see this thread for more detail:

    https://social.msdn.microsoft.com/Forums/en-US/6b8d6251-c59a-46c7-9da8-b912cb16dfab/kinectregionaddhandpointerhandler?forum=kinectv2sdk


    Carmine Sirignano - MSFT

    Thursday, February 5, 2015 7:24 PM
  • Have a look at the ControlsBasicsDX sample to get an idea of how to use the GestureRecognizer. At the application level, you can handle the engagement events that provide you the PointerPoint (ControlsBasicsDX::HandleEngagement) to determine whether the interaction is occurring in an interactive region of your application. At the lower level, since there is no understanding of which UI framework you are using, you need to handle that yourself.

    Carmine Sirignano - MSFT

    • Marked as answer by jmmroldan Friday, February 6, 2015 8:15 PM
    Friday, February 6, 2015 6:56 PM

All replies

  • There are no gestures natively included in the core SDK; the extensions come from the libraries you mention. You can use VGB to record your own gestures and bubble those up as a system-level component. Keep in mind, you cannot run Kinect as a service.

    There are lower levels of the controls that you can access; see this thread for more detail:

    https://social.msdn.microsoft.com/Forums/en-US/6b8d6251-c59a-46c7-9da8-b912cb16dfab/kinectregionaddhandpointerhandler?forum=kinectv2sdk
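
    A minimal sketch of that VGB route, reading discrete gesture results through a frame reader like any other source (so it does not depend on a focused window or the Controls library); the "MyGestures.gbd" file name and the tracking-id wiring are placeholders:

    ```csharp
    using System;
    using Microsoft.Kinect;
    using Microsoft.Kinect.VisualGestureBuilder;

    class VgbSketch
    {
        static void Main()
        {
            KinectSensor sensor = KinectSensor.GetDefault();
            sensor.Open();

            // One VGB source/reader pair tracks one body at a time.
            var vgbSource = new VisualGestureBuilderFrameSource(sensor, 0);
            var vgbReader = vgbSource.OpenReader();
            vgbReader.IsPaused = true;   // nothing to track yet

            // Load the gestures recorded and trained in Visual Gesture Builder.
            using (var db = new VisualGestureBuilderDatabase(@"MyGestures.gbd"))
            {
                vgbSource.AddGestures(db.AvailableGestures);
            }

            vgbReader.FrameArrived += (s, e) =>
            {
                using (var frame = e.FrameReference.AcquireFrame())
                {
                    if (frame == null || frame.DiscreteGestureResults == null) return;

                    foreach (var pair in frame.DiscreteGestureResults)
                    {
                        if (pair.Value.Detected)
                        {
                            Console.WriteLine("{0} detected ({1:0.00})", pair.Key.Name, pair.Value.Confidence);
                        }
                    }
                }
            };

            // Elsewhere, from a BodyFrameReader, hand the source a tracked body:
            //   vgbSource.TrackingId = body.TrackingId;
            //   vgbReader.IsPaused = false;

            Console.ReadLine();
        }
    }
    ```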


    Carmine Sirignano - MSFT

    Thursday, February 5, 2015 7:24 PM
  • Thanks Carmine.

    I do not intend to run Kinect as a service, just a "background application" (as the old webserver was). For example, I can count the people in front of the Kinect (from a BodyFrameReader) and send a message to a server each time anybody comes in or goes out; a sketch of that counter is at the end of this reply. This can be done from a Kinect app that is not the focused window (the focused window could be a game in a separate app, not related to Kinect). I would like to do the same with gestures; it is needed for porting webserver, which made the old InteractionStream available from JavaScript.

    Then... I know there are some videos about Visual Gesture Builder and gesture recognition (I'll have a look at them again), but apart from that, what is the purpose of the Microsoft.Kinect.Input.GestureRecognizer class and how can it be used? I have seen that there are "processEvents" methods and that you can pass them a list of Kinect PointerPoints, but I don't know how. Is there any code sample using this class? Are the WPF/XAML controls built on top of it? Could I use it to build my own controls for WPF?

    Thanks again for your help.
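
    A minimal sketch of the people-counter idea from the first paragraph (the endpoint URL is a placeholder for whatever the server expects):

    ```csharp
    using System;
    using System.Net.Http;
    using Microsoft.Kinect;

    // Sketch of the "background application": no UI, no focused window required.
    class PeopleCounterSketch
    {
        private static readonly HttpClient http = new HttpClient();
        private static Body[] bodies;
        private static int lastCount = -1;

        static void Main()
        {
            KinectSensor sensor = KinectSensor.GetDefault();
            sensor.Open();

            BodyFrameReader reader = sensor.BodyFrameSource.OpenReader();
            reader.FrameArrived += (s, e) =>
            {
                using (BodyFrame frame = e.FrameReference.AcquireFrame())
                {
                    if (frame == null) return;

                    if (bodies == null) bodies = new Body[frame.BodyCount];
                    frame.GetAndRefreshBodyData(bodies);

                    int count = 0;
                    foreach (Body body in bodies)
                    {
                        if (body != null && body.IsTracked) count++;
                    }

                    if (count != lastCount)
                    {
                        lastCount = count;
                        // Fire-and-forget notification; placeholder endpoint.
                        http.PostAsync("http://localhost:8181/people", new StringContent(count.ToString()));
                    }
                }
            };

            Console.ReadLine();
        }
    }
    ```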


    • Edited by jmmroldan Friday, February 6, 2015 8:00 AM Clarification
    Thursday, February 5, 2015 11:10 PM
  • Have a look at the ControlsBasicsDX sample to get an idea of how to use the GestureRecognizer. At the application level, you can handle the engagement events that provide you the PointerPoint (ControlsBasicsDX::HandleEngagement) to determine whether the interaction is occurring in an interactive region of your application. At the lower level, since there is no understanding of which UI framework you are using, you need to handle that yourself.
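
    For the application level, a rough sketch of listening to the engagement-filtered pointer stream; the hit-test against your own interactive regions is framework-specific and left as a comment, and (as noted earlier in the thread) these pointer events only arrive while the window is focused:

    ```csharp
    using System;
    using Microsoft.Kinect.Input;

    class KinectPointerSketch
    {
        // Call from the UI thread of the window that should receive input,
        // e.g. a WPF Window constructor.
        public void Hook()
        {
            KinectCoreWindow coreWindow = KinectCoreWindow.GetForCurrentThread();
            coreWindow.PointerMoved += OnKinectPointerMoved;
        }

        private void OnKinectPointerMoved(object sender, KinectPointerEventArgs args)
        {
            KinectPointerPoint point = args.CurrentPoint;

            // Only react to hands the engagement model considers engaged.
            if (!point.Properties.IsEngaged) return;

            // Position is normalized; map it onto your UI and hit-test it against
            // whatever "interactive region" your framework defines.
            Console.WriteLine("Engaged {0} hand of body {1} at ({2:0.00}, {3:0.00})",
                point.Properties.HandType, point.Properties.BodyTrackingId,
                point.Position.X, point.Position.Y);
        }
    }
    ```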

    Carmine Sirignano - MSFT

    • Marked as answer by jmmroldan Friday, February 6, 2015 8:15 PM
    Friday, February 6, 2015 6:56 PM
  • OK, I'll have a look at it (I thought that sample only used the "high level" controls, but I didn't check properly).

    Thanks again!

    Friday, February 6, 2015 8:16 PM