Tuesday, February 21, 2012 2:53 AM
In the simplest terms, I want to write a WPF touch application that drives mouse input. Like the Touch Pointer provided in Control Panel > Pen and Touch > Touch > Touch Pointer.
I have tried using SendInput, but it steals focus. This prevents my app from receiving any further touch data unless I lift a finger and place it back on my app. I've also tried using PostMessage/SendMessage, but have only been able to send mouse move events to a test WPF app. No errors are reported; the calls claim to have completed successfully. I have the uiAccess=true flag set in my app.manifest.
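For reference, here's roughly the kind of SendInput call I'm making (a minimal sketch; the MoveTo helper name is just for illustration):

using System;
using System.Runtime.InteropServices;

static class MouseDriver
{
    // SendInput constants from winuser.h
    const uint INPUT_MOUSE = 0;
    const uint MOUSEEVENTF_MOVE = 0x0001;
    const uint MOUSEEVENTF_ABSOLUTE = 0x8000;

    [StructLayout(LayoutKind.Sequential)]
    struct MOUSEINPUT
    {
        public int dx, dy;
        public uint mouseData, dwFlags, time;
        public IntPtr dwExtraInfo;
    }

    // The INPUT union is collapsed to its largest member (MOUSEINPUT),
    // which keeps the marshaled size correct for mouse input.
    [StructLayout(LayoutKind.Sequential)]
    struct INPUT
    {
        public uint type;
        public MOUSEINPUT mi;
    }

    [DllImport("user32.dll", SetLastError = true)]
    static extern uint SendInput(uint nInputs, INPUT[] pInputs, int cbSize);

    // Move the cursor to (x, y); absolute coordinates are normalized
    // to the 0..65535 range across the screen.
    public static void MoveTo(int x, int y, int screenWidth, int screenHeight)
    {
        var input = new INPUT
        {
            type = INPUT_MOUSE,
            mi = new MOUSEINPUT
            {
                dx = x * 65535 / screenWidth,
                dy = y * 65535 / screenHeight,
                dwFlags = MOUSEEVENTF_MOVE | MOUSEEVENTF_ABSOLUTE
            }
        };
        SendInput(1, new[] { input }, Marshal.SizeOf(typeof(INPUT)));
    }
}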
Input is greatly appreciated (pun intended)
Wednesday, March 07, 2012 7:05 AM
I'm now attempting to do the same with the Windows 8 Consumer Preview. It looks like there is someone at Microsoft who is interested in doing the right thing.
My application now receives touch input correctly even though it's not in focus. This is a step forward!
However there are still a couple of issues.
1. The mouse cursor can move while I'm interacting with my application through touch. I'm using a low-level mouse hook and filtering out touch input being promoted to mouse input, but there is significant flicker in the mouse movement.
2. The cursor gets confused between the first touch input and the actual mouse position during mouse moves. I think this is what causes the flicker described in (1) while touch input takes place.
Does anyone know whether the new Windows 8 pointer APIs, such as GetPointerInfo (http://msdn.microsoft.com/en-us/library/windows/desktop/hh454885(v=vs.85).aspx), can be used alongside WPF input techniques?
Edit: I am able to get the behaviour I require under a very specific set of conditions. Place one finger down; Windows sees this as the 'first touch' and injects a mouse event for that touch. My low-level hook catches and processes it, and all is fine. Now I place a second finger down and raise the first, so there is no 'first touch' registered. As long as I have a finger down on my app, the mouse can be moved around the screen without any issues. This is what I want! The moment all fingers are raised, I have to go through this process again for the mouse to stop flickering between the first touch and the actual mouse position. I want this behaviour without always having to go through this set of conditions.
Tuesday, March 20, 2012 10:44 PM
Results from yet another attempt.
As a possible workaround, I used the InjectTouchInput API to place two inputs at (0,0), then raise the first input while holding the second.
This works great. I can interact with a window without it being the 'first/primary' touch! ... except with applications developed on the .NET Framework (Paint, the Touch Pack, my apps).
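For reference, a minimal sketch of the injection sequence (Windows 8 only; the constants and struct layouts are taken from winuser.h, and the held contact has to be refreshed with periodic update frames or the injected sequence gets cancelled):

using System;
using System.Runtime.InteropServices;
using System.Threading;

static class TouchInjector
{
    // Constants from winuser.h (Windows 8 SDK)
    const uint PT_TOUCH = 2;
    const uint POINTER_FLAG_INRANGE   = 0x00000002;
    const uint POINTER_FLAG_INCONTACT = 0x00000004;
    const uint POINTER_FLAG_DOWN      = 0x00010000;
    const uint POINTER_FLAG_UPDATE    = 0x00020000;
    const uint POINTER_FLAG_UP        = 0x00040000;
    const uint TOUCH_FEEDBACK_DEFAULT = 1;

    [StructLayout(LayoutKind.Sequential)]
    struct POINT { public int x, y; }

    [StructLayout(LayoutKind.Sequential)]
    struct RECT { public int left, top, right, bottom; }

    [StructLayout(LayoutKind.Sequential)]
    struct POINTER_INFO
    {
        public uint pointerType, pointerId, frameId, pointerFlags;
        public IntPtr sourceDevice, hwndTarget;
        public POINT ptPixelLocation, ptHimetricLocation;
        public POINT ptPixelLocationRaw, ptHimetricLocationRaw;
        public uint dwTime, historyCount;
        public int inputData;
        public uint dwKeyStates;
        public ulong performanceCount;
        public uint buttonChangeType;
    }

    [StructLayout(LayoutKind.Sequential)]
    struct POINTER_TOUCH_INFO
    {
        public POINTER_INFO pointerInfo;
        public uint touchFlags, touchMask;
        public RECT rcContact, rcContactRaw;
        public uint orientation, pressure;
    }

    [DllImport("user32.dll", SetLastError = true)]
    static extern bool InitializeTouchInjection(uint maxCount, uint dwMode);

    [DllImport("user32.dll", SetLastError = true)]
    static extern bool InjectTouchInput(uint count, POINTER_TOUCH_INFO[] contacts);

    static POINTER_TOUCH_INFO Contact(uint id, int x, int y, uint flags)
    {
        var c = new POINTER_TOUCH_INFO();
        c.pointerInfo.pointerType = PT_TOUCH;
        c.pointerInfo.pointerId = id;
        c.pointerInfo.pointerFlags = flags;
        c.pointerInfo.ptPixelLocation = new POINT { x = x, y = y };
        return c;
    }

    static void Main()
    {
        InitializeTouchInjection(2, TOUCH_FEEDBACK_DEFAULT);
        uint down = POINTER_FLAG_DOWN | POINTER_FLAG_INRANGE | POINTER_FLAG_INCONTACT;
        uint hold = POINTER_FLAG_UPDATE | POINTER_FLAG_INRANGE | POINTER_FLAG_INCONTACT;

        // Frame 1: both contacts go down at (0,0).
        InjectTouchInput(2, new[] { Contact(0, 0, 0, down), Contact(1, 0, 0, down) });
        Thread.Sleep(50);

        // Frame 2: raise the first contact while holding the second, so no
        // remaining contact is treated as the primary ('first') touch.
        InjectTouchInput(2, new[] { Contact(0, 0, 0, POINTER_FLAG_UP), Contact(1, 0, 0, hold) });

        // From here, keep re-injecting the held contact with 'hold' frames
        // for as long as the mouse should stay decoupled from touch.
    }
}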
On a side note, every .NET application refuses to receive touch input if it arrives as the second touch. Chrome, Windows Explorer, etc. can scroll around when the first finger is placed on another application and the second on them.
It looks like I'm missing some small, ill-documented feature that will allow me to do this.
Friday, April 06, 2012 6:48 AM
Yeah, I'm Chinese, and I'm doing the same thing as you. I'm having some trouble developing it.
Saturday, April 07, 2012 12:44 AM Moderator
Sashi - Can you elaborate on what you're trying to accomplish in your app? The previous posts are somewhat unclear.
Saturday, April 07, 2012 1:23 AM
I'm attempting to create new multitouch interaction techniques that enable the user to interact with a widget (my 'application') with his/her fingers and indirectly perform actions within existing applications (browsers, Paint, etc.). My widget is a visual that detects multiple fingers and analyses their movements. It presents a menu, and also allows changing slider values, clicking on checkboxes, or selecting from a listbox within an arbitrary application.
My current approach attempts to use the mouse as the bridge between touch input and actions in an application. For this, I require the following:
1. My widget, a visual overlay like the touch pointer, should continuously receive touch input.
2. The widget should be able to programmatically move the mouse cursor on the screen while receiving touch input.
The previous posts try to describe the problems I'm having while doing this.
The issue is with how Windows promotes the first touch to a mouse cursor.
This MSDN post suggests using GetMessageExtraInfo() to disambiguate mouse messages injected by wisptis. I tried using a low-level mouse hook to filter such events (using this gist), but it seems to filter only some of the messages. With this hook in place, I am able to move the mouse around the screen while a finger is placed on the touch screen; however, since the filter doesn't block all mouse messages generated by touch, the cursor flickers between the actual mouse position and the finger position.
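The hook is essentially the following (a minimal sketch; 0xFF515700 is the signature from that MSDN post, and the static delegate reference is there so the garbage collector can't collect the callback while the hook is installed):

using System;
using System.Runtime.InteropServices;
using System.Windows.Forms; // only for Application.Run, which pumps messages for the hook

static class TouchMouseFilter
{
    const int WH_MOUSE_LL = 14;
    const uint MI_WP_SIGNATURE = 0xFF515700; // marker wisptis puts in dwExtraInfo
    const uint SIGNATURE_MASK  = 0xFFFFFF00;

    delegate IntPtr LowLevelMouseProc(int nCode, IntPtr wParam, IntPtr lParam);

    [StructLayout(LayoutKind.Sequential)]
    struct POINT { public int x, y; }

    [StructLayout(LayoutKind.Sequential)]
    struct MSLLHOOKSTRUCT
    {
        public POINT pt;
        public uint mouseData, flags, time;
        public IntPtr dwExtraInfo;
    }

    [DllImport("user32.dll", SetLastError = true)]
    static extern IntPtr SetWindowsHookEx(int idHook, LowLevelMouseProc lpfn, IntPtr hMod, uint dwThreadId);

    [DllImport("user32.dll")]
    static extern IntPtr CallNextHookEx(IntPtr hhk, int nCode, IntPtr wParam, IntPtr lParam);

    [DllImport("user32.dll")]
    static extern bool UnhookWindowsHookEx(IntPtr hhk);

    [DllImport("kernel32.dll", CharSet = CharSet.Auto)]
    static extern IntPtr GetModuleHandle(string lpModuleName);

    // Keep a reference so the GC can't collect the hook delegate.
    static readonly LowLevelMouseProc Proc = HookProc;

    static IntPtr HookProc(int nCode, IntPtr wParam, IntPtr lParam)
    {
        if (nCode >= 0)
        {
            var info = (MSLLHOOKSTRUCT)Marshal.PtrToStructure(lParam, typeof(MSLLHOOKSTRUCT));
            uint extra = (uint)info.dwExtraInfo.ToInt64();
            if ((extra & SIGNATURE_MASK) == MI_WP_SIGNATURE)
                return (IntPtr)1; // swallow mouse messages promoted from touch/pen
        }
        return CallNextHookEx(IntPtr.Zero, nCode, wParam, lParam);
    }

    static void Main()
    {
        IntPtr hook = SetWindowsHookEx(WH_MOUSE_LL, Proc, GetModuleHandle(null), 0);
        Application.Run(); // low-level hooks need a message loop on the installing thread
        UnhookWindowsHookEx(hook);
    }
}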
I presumed this could be a touch driver issue and have contacted the manufacturer of my touch screen (3M); I am still awaiting a response. However, testing with the InjectTouchInput API in Windows 8 shows that the issue persists even with simulated touch input, which leads me to believe the problem is above the driver layer.
Note that I can get the behavior I desire if there is no first touch on the screen. I can do this by sequentially placing two fingers on the screen and raising the first. From there, as long as there is a finger on the screen at all times, I can move the mouse and receive touch input on my widget. None of the existing touch points are considered the first touch and hence do not interfere with the mouse in any way.
I'd like to point out that Windows 8 is definitely an improvement over 7 in this regard. Windows 7 blocks all touch input once the mouse is moved, whereas Windows 8 continues to pump touch input to my application.
Thank you for asking me to clarify this issue. I hope this description helps you understand the motivation and purpose of my application and the problems I'm facing.
Monday, April 09, 2012 10:28 PM Moderator
Hi Sashi -
It sounds like you're passing WM_POINTER messages to DefWindowProc. Handing a message to DefWindowProc informs the system that you'd like it to provide the default behavior for that message. For pointer messages, that includes things like kicking off support for legacy applications, which covers (legacy) mouse promotion of touch and other inputs.
Try not handing WM_POINTER messages to DefWindowProc. You should no longer see system-generated mouse messages in response to touch input.
Tuesday, April 10, 2012 12:50 AM
My application is written in WPF. I'm not dealing with WM_POINTER messages directly, or with DefWindowProc; all input is handled by the WPF framework, and I'd like to continue utilizing the functionality it provides.
From what I can see, there's no way to override DefWindowProc for a Window created with WPF.
Is there a workaround for this via WPF? Should I be cross posting this to a WPF-specific forum?
Tuesday, April 10, 2012 11:24 PM Moderator
You should follow up with someone more familiar with WPF on how to ensure WM_POINTER messages are not passed to DefWindowProc in WPF.
(I'm not a WPF expert, but a quick Bing search turns up some blog posts and other content describing how to hook the wndproc. That may lead to a workable approach.)
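(Untested, but something along these lines might be what those posts describe; the WM_POINTER values come from the Windows 8 SDK headers:)

using System;
using System.Windows;
using System.Windows.Interop;

public partial class MainWindow : Window
{
    // WM_POINTER messages, from winuser.h in the Windows 8 SDK.
    const int WM_POINTERUPDATE = 0x0245;
    const int WM_POINTERDOWN   = 0x0246;
    const int WM_POINTERUP     = 0x0247;

    protected override void OnSourceInitialized(EventArgs e)
    {
        base.OnSourceInitialized(e);
        // The HWND exists by now, so we can attach a hook to the window's wndproc.
        var source = (HwndSource)PresentationSource.FromVisual(this);
        source.AddHook(PointerHook);
    }

    IntPtr PointerHook(IntPtr hwnd, int msg, IntPtr wParam, IntPtr lParam, ref bool handled)
    {
        if (msg == WM_POINTERUPDATE || msg == WM_POINTERDOWN || msg == WM_POINTERUP)
        {
            // Marking the message handled stops HwndSource from passing it to
            // DefWindowProc, which is what kicks off legacy mouse promotion.
            handled = true;
        }
        return IntPtr.Zero;
    }
}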
Wednesday, April 11, 2012 7:23 PM
Instead of trying to synthesize mouse or touch input to interact with the target application, have you considered using UI Automation? UIA usually provides much better automation results than synthesizing input.
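For example, toggling a checkbox in another application takes only a few lines (a rough sketch; the window title is hypothetical, and you need references to UIAutomationClient and UIAutomationTypes):

using System.Windows.Automation;

static class UiaExample
{
    static void Main()
    {
        // Find the target top-level window by its title (hypothetical name).
        var window = AutomationElement.RootElement.FindFirst(
            TreeScope.Children,
            new PropertyCondition(AutomationElement.NameProperty, "Target App"));
        if (window == null) return;

        // Find the first checkbox in the window and toggle it via its pattern,
        // with no synthesized mouse or touch input involved.
        var checkbox = window.FindFirst(
            TreeScope.Descendants,
            new PropertyCondition(AutomationElement.ControlTypeProperty, ControlType.CheckBox));

        object pattern;
        if (checkbox != null && checkbox.TryGetCurrentPattern(TogglePattern.Pattern, out pattern))
            ((TogglePattern)pattern).Toggle();
    }
}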
Also be aware that low-level hooks cannot be used reliably from managed code. They must return almost immediately, or the input manager will disconnect the hook to prevent it from causing system-wide responsiveness problems. Since there is no way to prevent the garbage collector from freezing the hook's thread, there is no way to ensure that it won't time out.