Is it possible to invoke the touch keyboard in a Metro style app without XAML?

All replies

  • The keyboard will automatically be shown when the user sets focus to a text control. See the touch keyboard documentation for information on the logic used. If you are writing a custom text control (whether in Xaml or not) you will need to make sure it supports the UI Automation TextPattern and ValuePattern and raises focus changed events.

    See the Input: Touch keyboard sample for an example of how to do this.
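
    Purely as an illustration (this is not code from the sample; the class name is invented and most of the methods are stubbed), the server-side provider for such a custom text control could look roughly like this in C++, advertising the patterns and properties the touch keyboard looks for:

        #include <windows.h>
        #include <wrl.h>
        #include <UIAutomation.h>

        using namespace Microsoft::WRL;

        // Hypothetical provider for a game-engine text control. It advertises the
        // Text and Value patterns and reports itself as a focusable edit control.
        class CustomTextBoxProvider :
            public RuntimeClass<RuntimeClassFlags<ClassicCom>,
                                IRawElementProviderSimple, IValueProvider, ITextProvider>
        {
        public:
            // IRawElementProviderSimple
            IFACEMETHODIMP get_ProviderOptions(ProviderOptions* options)
            {
                *options = ProviderOptions_ServerSideProvider | ProviderOptions_UseComThreading;
                return S_OK;
            }

            IFACEMETHODIMP GetPatternProvider(PATTERNID patternId, IUnknown** provider)
            {
                *provider = nullptr;
                if (patternId == UIA_TextPatternId || patternId == UIA_ValuePatternId)
                {
                    *provider = static_cast<IRawElementProviderSimple*>(this);
                    AddRef();
                }
                return S_OK;
            }

            IFACEMETHODIMP GetPropertyValue(PROPERTYID propertyId, VARIANT* value)
            {
                VariantInit(value);   // VT_EMPTY for anything unhandled: UIA uses defaults
                if (propertyId == UIA_ControlTypePropertyId)
                {
                    value->vt = VT_I4;
                    value->lVal = UIA_EditControlTypeId;     // report "edit control"
                }
                else if (propertyId == UIA_IsKeyboardFocusablePropertyId ||
                         propertyId == UIA_HasKeyboardFocusPropertyId)
                {
                    value->vt = VT_BOOL;
                    value->boolVal = VARIANT_TRUE;           // focusable / focused
                }
                return S_OK;
            }

            IFACEMETHODIMP get_HostRawElementProvider(IRawElementProviderSimple** provider)
            {
                *provider = nullptr;
                return S_OK;
            }

            // IValueProvider (ValuePattern) stubs: a real control reads and writes
            // its own text buffer here.
            IFACEMETHODIMP SetValue(LPCWSTR)        { return E_NOTIMPL; }
            IFACEMETHODIMP get_Value(BSTR*)         { return E_NOTIMPL; }
            IFACEMETHODIMP get_IsReadOnly(BOOL* ro) { *ro = FALSE; return S_OK; }

            // ITextProvider (TextPattern) stubs: a real control returns ranges
            // backed by its actual text and selection.
            IFACEMETHODIMP GetSelection(SAFEARRAY**)                                        { return E_NOTIMPL; }
            IFACEMETHODIMP GetVisibleRanges(SAFEARRAY**)                                    { return E_NOTIMPL; }
            IFACEMETHODIMP RangeFromChild(IRawElementProviderSimple*, ITextRangeProvider**) { return E_NOTIMPL; }
            IFACEMETHODIMP RangeFromPoint(UiaPoint, ITextRangeProvider**)                   { return E_NOTIMPL; }
            IFACEMETHODIMP get_DocumentRange(ITextRangeProvider**)                          { return E_NOTIMPL; }
            IFACEMETHODIMP get_SupportedTextSelection(SupportedTextSelection* s)
            {
                *s = SupportedTextSelection_Single;
                return S_OK;
            }
        };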

    --Rob

    Monday, August 13, 2012 4:21 PM
    Owner
  • Sorry, maybe my questions are a little stupid. But I already understood very well that the touch keyboard is invoked only when the user taps the text control with a finger, to prevent unexpected touch keyboard appearance, and that there is no way to invoke the keyboard programmatically. I also completely understand the logic of using the touch keyboard. I have read all the links you posted (except this one, http://msdn.microsoft.com/en-us/library/windows/desktop/ee671596(v=vs.85).aspx, which I'm reading now) and looked at all the samples.

    Now I see two ways to implement the touch keyboard in our game engine:

    1. Add WPF\XAML to our game engine application (http://msdn.microsoft.com/en-us/library/windows/apps/hh825871.aspx).

    or

    2. Create our own custom touch keyboard, as a part of our game engine gui.

    Just one more question: did I understand correctly that there is no way to use the Metro touch keyboard without Windows Presentation Foundation?

    Tuesday, August 14, 2012 7:38 AM
  • You misunderstood.

    The touch keyboard is invoked when an appropriate control receives focus. This is determined by the control's UI Automation attributes, and such a control can be written in any framework.

    This does not require using Windows.UI.Xaml. The Windows.UI.Xaml controls already implement this, so using them is easier than implementing a custom control. That said, the UI Automation interfaces are designed and documented specifically so developers can create accessible and automatable custom controls.

    See UI Automation Provider Programmer's Guide for more details on UI Automation in general and Implementing a Server-Side UI Automation Provider for information specifically on how to create the server-side interfaces for your custom control.
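
    As a small sketch of the focus piece (the surrounding function and its call site are hypothetical), the server-side provider would raise the UIA focus-changed event when the custom control gains focus, which is the trigger the touch keyboard listens for:

        #include <UIAutomation.h>   // UiaRaiseAutomationEvent, UiaClientsAreListening

        // Hypothetical call site inside the game engine's focus handling.
        void OnCustomTextControlGotFocus(IRawElementProviderSimple* provider)
        {
            // Only raise the event when a UIA client (such as the touch keyboard) is listening.
            if (UiaClientsAreListening())
            {
                UiaRaiseAutomationEvent(provider, UIA_AutomationFocusChangedEventId);
            }
        }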

    All that said, I would go with option 1 and leverage DirectX and Xaml interop to use the Windows.UI.Xaml.TextBox. In addition to keyboard support, it is already wired up with Text Services Framework (TSF) support so it can receive IME input, stylus input, speech input, etc.
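
    For what it's worth, a rough C++/CX sketch of that option (the GamePage class and the m_swapChainPanel member are illustrative; it assumes the root element is a SwapChainBackgroundPanel that DirectX renders into, per the DirectX and XAML interop documentation):

        using namespace Windows::UI::Xaml;
        using namespace Windows::UI::Xaml::Controls;

        void GamePage::ShowTextEntry()
        {
            // Overlay a standard XAML TextBox on top of the Direct3D content.
            auto textBox = ref new TextBox();
            textBox->Width = 400;
            m_swapChainPanel->Children->Append(textBox);

            // Note: the touch keyboard appears when the user taps the TextBox;
            // programmatic focus alone does not invoke it (by design).
            textBox->Focus(FocusState::Programmatic);
        }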

    WPF is unrelated to this discussion. Like the Windows.UI.Xaml controls, WPF controls implement UI Automation; however, they aren't something you could use in your app.

    --Rob

    • Marked as answer by hellobody Wednesday, August 15, 2012 6:07 AM
    Tuesday, August 14, 2012 5:15 PM
    Owner
  • Wrong! I am in Windows 8 right now trying to use FileZilla and I can't get a touch keyboard to come up. I plan to persevere just long enough to test my software and then I will be wiping my new tablet and reformatting to Ubuntu. It is comical how bad MS is at trying to ape Apple and Google. Sell your shares now.

    I realise that this might seem like trolling, and I appreciate that this is a coding forum; whilst I am a coder, I am posting as a user. I was immediately at home with my first Android device and every update since has been intuitive. I code for Linux as well and use MintLinux rather than Ubuntu because I like to have a start menu. However, MS has made a huge mistake with Windows 8. Buying hardware is not the same as voluntarily joining your ecosystem. The fact that MS thinks that everyone loves MS just because they want to buy hardware is quite funny, but they might get away with it if their new OS worked well, if I could see all my running apps, if the touch interface just worked, and if it were truly device agnostic; none of those things are true. MS is not a pioneer here. It is playing catch-up, and to do that you must at least match what is already old hat. Windows 8 is a huge fail, and the reviewers I read that said otherwise are shills.

    Disappointed does not even begin to cover it. Incredulous. How can you be this bad when you have thousands of employees? It is clearly not down to a lack of resources. It is a failure of leadership and/or culture.

    Friday, February 22, 2013 4:39 AM