Recognizing single-finger vs multi-finger gestures - GestureRecognizer

    Question

  • Hi all,

    I am working on a DirectX/C++ Windows Store app and I've hit a problem.

    I want to be able to scale/zoom using the pinch-to-zoom gesture, so I have set up an OnPointerMoved() event handler that passes the intermediate points to a GestureRecognizer, whose ManipulationStarted and ManipulationUpdated events I handle in OnManipulationStarted() and OnManipulationUpdated() - so far so good.

    Users can also interact with elements using a single-finger drag. For this I work directly with the raw position data coming from OnPointerMoved(). My problem is that OnPointerMoved() always fires before the GestureRecognizer recognizes the scale/zoom gesture, so when a user goes to zoom in on an element, my single-finger code runs before ManipulationUpdated() does: the user zooms, but also ends up interacting with the on-screen elements that respond to single-finger dragging.
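
    Roughly, the setup looks like this (a C++/CX sketch; the class, member, and helper names are just placeholders for my real code):

        // C++/CX sketch of the setup described above.
        using namespace Windows::Foundation;
        using namespace Windows::UI::Core;
        using namespace Windows::UI::Input;

        void App::SetUpInput(CoreWindow^ window)
        {
            m_gestureRecognizer = ref new GestureRecognizer();
            m_gestureRecognizer->GestureSettings = GestureSettings::ManipulationScale;

            m_gestureRecognizer->ManipulationStarted +=
                ref new TypedEventHandler<GestureRecognizer^, ManipulationStartedEventArgs^>(
                    this, &App::OnManipulationStarted);
            m_gestureRecognizer->ManipulationUpdated +=
                ref new TypedEventHandler<GestureRecognizer^, ManipulationUpdatedEventArgs^>(
                    this, &App::OnManipulationUpdated);

            window->PointerMoved +=
                ref new TypedEventHandler<CoreWindow^, PointerEventArgs^>(
                    this, &App::OnPointerMoved);
            // PointerPressed/PointerReleased feed ProcessDownEvent/ProcessUpEvent
            // to the recognizer in the same way (omitted here).
        }

        void App::OnPointerMoved(CoreWindow^ sender, PointerEventArgs^ args)
        {
            // Single-finger interaction works directly off the raw position...
            DragElementTo(args->CurrentPoint->Position); // hypothetical helper

            // ...and the same points are forwarded so the recognizer can spot a pinch.
            m_gestureRecognizer->ProcessMoveEvents(args->GetIntermediatePoints());
        }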

    I thought a solution might be to move all the interaction handling into ManipulationUpdated() and decide there whether the input is a single-finger drag or the start of a zoom, but ManipulationUpdated() doesn't seem to fire for a single finger (I'm assuming this is because my GestureSettings only include ManipulationScale).

    Does anyone know the best way to solve this? Even being able to differentiate between single-finger and multi-finger input would be enough, I guess (but I can't find a way to do that either).

    Thank you for your time.

    Wednesday, July 24, 2013 10:22 AM

All replies

  • Hi ponchoshh,

    I don't think differentiating between single-finger and multi-finger input is practicable in ManipulationUpdated().

    Here is the documentation for ManipulationUpdatedEventArgs:

    http://msdn.microsoft.com/en-us/library/windows/apps/windows.ui.input.manipulationupdatedeventargs.aspx

    Nothing in there lets you put all the interaction into ManipulationUpdated() and differentiate the two cases. Instead, I think it is better to handle this in your business logic, e.g. by deciding in OnPointerPressed how many contacts are down, as sketched below.
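
    A minimal sketch of that idea, assuming the app keeps its own contact counter (m_activePointerCount and HandleSingleFingerDrag are illustrative names, not part of any API):

        // C++/CX sketch: count the contacts that are currently down.
        void App::OnPointerPressed(CoreWindow^ sender, PointerEventArgs^ args)
        {
            ++m_activePointerCount; // one more contact down
            m_gestureRecognizer->ProcessDownEvent(args->CurrentPoint);
        }

        void App::OnPointerMoved(CoreWindow^ sender, PointerEventArgs^ args)
        {
            if (m_activePointerCount == 1)
            {
                // Exactly one contact: run the single-finger drag logic. Note this
                // still fires for the first finger before a second one touches down.
                HandleSingleFingerDrag(args->CurrentPoint->Position); // hypothetical helper
            }
            m_gestureRecognizer->ProcessMoveEvents(args->GetIntermediatePoints());
        }

        void App::OnPointerReleased(CoreWindow^ sender, PointerEventArgs^ args)
        {
            --m_activePointerCount; // one contact lifted
            m_gestureRecognizer->ProcessUpEvent(args->CurrentPoint);
        }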

    Thursday, July 25, 2013 9:02 AM
    Moderator
  • You can detect movement with the GestureRecognizer. Listen for the translation events as well as the zoom and scale events.
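
    A minimal sketch of that configuration, reusing the recognizer from the question:

        // Enable the translate manipulations alongside scale so that a one-finger
        // drag is reported through ManipulationUpdated just like a pinch (C++/CX).
        m_gestureRecognizer->GestureSettings =
            GestureSettings::ManipulationTranslateX |
            GestureSettings::ManipulationTranslateY |
            GestureSettings::ManipulationScale;
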
    Thursday, July 25, 2013 2:21 PM
    Owner
  • Thanks for the reply.

    Yeah, that was my worry: because OnPointerMoved() fires before any of the GestureRecognizer events, the single-finger OnPointerMoved() functionality runs before the GestureRecognizer can tell the app that this is actually a two-finger gesture and that it should stop.

    I guess there is no way around it - I really don't understand why the Drag gesture of GestureRecognizer only supports pens and mice; it seems perfect for what I want...

    Friday, July 26, 2013 8:53 AM
  • For what you describe, the Translate gestures are what you want, not Drag. The Translate manipulations are designed for moving and rearranging objects by touch.
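
    A minimal sketch of a single handler built on that advice (MoveSelectedElement and ZoomAroundPoint are hypothetical helpers):

        // With translate and scale both enabled, drags and pinches arrive here and
        // the recognizer has already decided which is which (C++/CX sketch).
        void App::OnManipulationUpdated(GestureRecognizer^ sender, ManipulationUpdatedEventArgs^ args)
        {
            // Delta.Translation is the movement since the previous update.
            MoveSelectedElement(args->Delta.Translation.X, args->Delta.Translation.Y);

            // Delta.Scale stays at 1.0f until a second finger starts a pinch/stretch.
            if (args->Delta.Scale != 1.0f)
            {
                ZoomAroundPoint(args->Position, args->Delta.Scale);
            }
        }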

    --Rob

    Friday, July 26, 2013 2:07 PM
    Owner
  • Ah okay, thank you - I'll try this.

    Monday, July 29, 2013 8:51 AM