Thursday, April 12, 2012 12:39 PM
The only way I know of to make the touch keyboard appear is when there is an existing textbox control and the user taps into it. In my project it is a bit different: if a user taps somewhere on the screen, a textbox must be created at that point where the user can enter some text. I can create the textbox control through code, but the touch keyboard won't appear, even if I set the focus on the textbox. If the user then taps into the just-created textbox, the touch keyboard will appear. Of course I want to eliminate this extra tap for the user and show the touch keyboard immediately, as soon as the textbox is created.
Is it possible to show the touch keyboard through code? Maybe through a method on the textbox or otherwise? Or is there another way of doing the above?
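For reference, here is a minimal C# sketch of the scenario described above (the `canvas` field and handler names are assumptions, not from the original post). On Windows 8, the programmatic focus at the end succeeds, but it does not bring up the touch keyboard:

```csharp
using Windows.UI.Xaml.Controls;
using Windows.UI.Xaml.Input;

// Tap handler on a Canvas that fills the page.
private void Canvas_Tapped(object sender, TappedRoutedEventArgs e)
{
    var point = e.GetPosition(canvas);   // 'canvas' is an assumed Canvas element in the page's XAML

    // Create a textbox at the tap position.
    var textBox = new TextBox { Width = 200 };
    Canvas.SetLeft(textBox, point.X);
    Canvas.SetTop(textBox, point.Y);
    canvas.Children.Add(textBox);

    // Programmatic focus succeeds, but the touch keyboard does NOT appear;
    // it only appears after the user taps the control themselves.
    textBox.Focus(FocusState.Programmatic);
}
```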
Thursday, April 12, 2012 4:15 PM (Moderator)
There is no direct way to control the touch keyboard programmatically. Requiring the user to set the focus rather than the program is a deliberate design decision to prevent UI churn.
See Input Hosting Manager and the Touch Keyboard for more information on how and why this works. The User-driven invocation section of that document explains the specific behavior you are asking about.
Friday, April 13, 2012 3:09 AM
I have a similar question with a small difference. We want to do what the existing Metro RDP app does: the touch keyboard can be brought up by a button in the AppBar, and the touch keyboard events can then be passed on to another page.
The first part, "The touch keyboard can be brought up by a button in the AppBar", is our question. How does the Metro RDP app do it?
Friday, April 13, 2012 9:44 AM
I read the documentation you mentioned. I don't agree with the sentence 'Users indicate to the system that they want to input text by tapping on an input control instead of having an application make that decision on their behalf.'
I think there are valid scenarios where you want the keyboard to be triggered in an alternative way, instead of by a predefined textbox somewhere on the screen. It should be up to the app, not the Metro environment, to make that decision for the user. I hope Microsoft will still make this possible somehow.
But thanks for the answer anyway!
Friday, April 13, 2012 1:45 PM
I created a workaround for my problem. As in the 'Touch keyboard sample', I created a user control (containing just a grid) that triggers the touch keyboard when it is tapped. When my app is in the mode where the user can put text anywhere on the screen, this control overlays the whole screen. When the user then taps somewhere, the keyboard appears and I know where to create the textbox that will display the text. The text, by the way, is taken from the CoreWindow.GetForCurrentThread().CharacterReceived event.
As said, this is a bit of a hack to get what I want. I also realized that the desktop mode of Windows 8 has a taskbar button that shows the keyboard, so Microsoft itself found the current situation too limited for its own purposes.
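A hedged sketch of the character-capture half of this workaround (class and handler names are hypothetical; the tap-to-invoke overlay itself must report as a text control through automation, as in the 'Touch keyboard sample', which is not shown here):

```csharp
using Windows.UI.Core;

public sealed partial class OverlayPage
{
    public OverlayPage()
    {
        // Subscribe once; CharacterReceived fires for every character the
        // touch keyboard (or a hardware keyboard) produces for this window.
        CoreWindow.GetForCurrentThread().CharacterReceived += OnCharacterReceived;
    }

    private void OnCharacterReceived(CoreWindow sender, CharacterReceivedEventArgs args)
    {
        // args.KeyCode holds the code point of the typed character.
        char typed = (char)args.KeyCode;

        // Append 'typed' to the textbox created at the tap position...
    }
}
```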
- Edited by Ronald Eekelder Friday, April 13, 2012 1:49 PM
Friday, June 15, 2012 6:41 PM
Another case for toggling the software keyboard is games that do not use XAML UIs. We have no TextBox control here; we are just listening for keyboard events. Without the ability to bring up the software keyboard programmatically, games have to resort to implementing their own software keyboards, like an arcade machine.
Friday, June 15, 2012 7:04 PM (Moderator)
You don't need to use a XAML UI to bring up the soft keyboard; you just need to hook up your text fields so they correctly report themselves as such through the accessibility system.
See Input Hosting Manager and the Touch Keyboard for more details.
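The thread contains no sample code for this, but a hedged sketch of what "reporting through the accessibility system" can look like for a custom XAML control follows (class names are hypothetical; per the linked documentation the control also needs to expose the Text pattern via ITextProvider, which is omitted here for brevity):

```csharp
using Windows.UI.Xaml.Automation.Peers;
using Windows.UI.Xaml.Automation.Provider;
using Windows.UI.Xaml.Controls;

public class CustomEditBox : Control
{
    // The framework asks the control for its automation peer; returning a
    // custom peer is how the control describes itself to the input host.
    protected override AutomationPeer OnCreateAutomationPeer()
    {
        return new CustomEditBoxAutomationPeer(this);
    }
}

public class CustomEditBoxAutomationPeer : FrameworkElementAutomationPeer, IValueProvider
{
    public CustomEditBoxAutomationPeer(CustomEditBox owner) : base(owner) { }

    // Report the control as an editable text field.
    protected override AutomationControlType GetAutomationControlTypeCore()
    {
        return AutomationControlType.Edit;
    }

    protected override object GetPatternCore(PatternInterface patternInterface)
    {
        if (patternInterface == PatternInterface.Value)
        {
            return this;
        }
        return base.GetPatternCore(patternInterface);
    }

    // IValueProvider members; wiring them to the control's actual text
    // storage is left out of this sketch.
    public bool IsReadOnly { get { return false; } }
    public string Value { get { return string.Empty; } }
    public void SetValue(string value) { /* update the control's text */ }
}
```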
Friday, June 15, 2012 11:24 PM
Is there any documentation or samples on how to do this? All I can find so far is how to hook up AutomationPeer objects for custom XAML/WPF controls. I see nothing on how to do this without some sort of XAML in your app.
Friday, June 15, 2012 11:35 PM
Could this have anything to do with CoreWindow.AutomationHostProvider?