Embedding XAML UI in a DirectX application/game

  • Question

  • I'm still in the concept phase here, so I don't have any example code. Is there any means of using the existing XAML framework to host a user interface within a DirectX game or application? Because this is still at a conceptual level, I'm considering 3 different approaches with varying levels of complexity. My question is this: which (if any) of these 3 approaches are possible, and where is a good starting point for initializing the interoperation between the 2 environments?

    1) A state-based solution where the application enters a UI mode - The XAML framework is given full control over the CoreWindow while in this state and operates as a full-screen application with a background image captured from the DirectX mode to provide the illusion that the game still exists in the background. I suspect this approach will be the simplest of the 3 and provide the highest level of isolation between the 2 environments. This solution would require some mechanism to dynamically enter and leave the XAML UI state.

    2) Embedding DirectX within a XAML application - In this approach, the XAML framework would retain ownership of the CoreWindow and allow the DirectX application to render to a single element within the XAML document. This solution would either require the DirectX application to operate in an event-driven environment or require a persistent thread to manage the update/render loop of the embedded DirectX game environment (and likely introduce thread concurrency hazards). This solution would require a means of obtaining a DirectX context that renders to the screen (or a portion thereof) in a way that does not conflict with the XAML framework's own rendering.

    3) Embedding a XAML UI within a DirectX environment - This approach is by far the most complicated, but is capable of producing the most impressive results. In this approach, the DirectX environment would be responsible for providing a wrapper implementation of ICoreWindow (I know that CoreWindow is sealed, but is it possible to provide an alternate implementation of ICoreWindow?) to translate the coordinates of UI interactions into the projected space of the UI surface. The XAML framework would render to either a surface or buffer that would then be presented in the DirectX application as a texture in 3-space, or the XAML framework would render directly into the 3-space world by means of a global transform mechanism. The DirectX environment would also be responsible for invoking a message pump / event dispatcher between update cycles. This solution would require an unimaginable level of environment simulation and wrapping to host a XAML document where it was likely never intended to be. If there is a chance that this approach could be made possible, please surprise me with it!

    NOTE - In all 3 of these cases, only the XAML framework would be a black box. Both the DirectX application and the XAML-based UI would be aware of this hybrid environment and any limitations or hazards it may present.
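    Approach 1 is essentially a two-state machine with a frame capture on the transition into UI mode. A minimal sketch of that shape in plain C++ (no actual DirectX or XAML calls; the type and member names here are invented for illustration, and a string stands in for the captured back buffer):

    ```cpp
    #include <cassert>
    #include <string>

    // Hypothetical sketch: the app toggles between a Game state and a Ui
    // state. Entering Ui mode captures the last rendered frame so the XAML
    // layer can show it as a static backdrop behind the full-screen UI.
    enum class AppMode { Game, Ui };

    class ModeSwitcher {
    public:
        AppMode mode() const { return mode_; }
        const std::string& backdrop() const { return backdrop_; }

        // Called when the game decides to show the XAML UI.
        void enterUiMode(const std::string& capturedFrame) {
            backdrop_ = capturedFrame;  // snapshot of the DirectX back buffer
            mode_ = AppMode::Ui;        // XAML now owns the CoreWindow
        }

        // Called when the UI is dismissed; the game loop resumes.
        void leaveUiMode() {
            backdrop_.clear();
            mode_ = AppMode::Game;
        }

    private:
        AppMode mode_ = AppMode::Game;
        std::string backdrop_;
    };

    int main() {
        ModeSwitcher app;
        app.enterUiMode("frame_1024");
        assert(app.mode() == AppMode::Ui && app.backdrop() == "frame_1024");
        app.leaveUiMode();
        assert(app.mode() == AppMode::Game);
        return 0;
    }
    ```

    The real work would live in `enterUiMode` (capturing the back buffer and activating the XAML view), but the isolation between the 2 environments falls out of the state split itself.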
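    The main hazard in approach 2 is the persistent render thread sharing state with the XAML-owned UI thread. A minimal sketch of that threading shape in standard C++, assuming nothing about real DirectX/XAML APIs (all names here are hypothetical, and a counter stands in for the update/render/present cycle):

    ```cpp
    #include <atomic>
    #include <cassert>
    #include <mutex>
    #include <thread>

    // State crossing the UI/render boundary is guarded by a mutex to avoid
    // the concurrency hazards the embedded loop introduces.
    struct SharedGameState {
        std::mutex m;
        int frameCount = 0;   // written by render thread, read by UI thread
        bool paused = false;  // written by UI thread, read by render thread
    };

    // The persistent thread that drives the embedded game while XAML owns
    // the CoreWindow.
    void renderLoop(SharedGameState& s, std::atomic<bool>& running) {
        while (running.load()) {
            std::lock_guard<std::mutex> lock(s.m);
            if (!s.paused)
                ++s.frameCount;  // stands in for Update() + Render() + Present()
        }
    }

    int main() {
        SharedGameState state;
        std::atomic<bool> running{true};
        std::thread renderer(renderLoop, std::ref(state), std::ref(running));

        // UI thread pauses the embedded game, e.g. while a XAML menu is open.
        { std::lock_guard<std::mutex> lock(state.m); state.paused = true; }
        // ... later, resume and shut down.
        { std::lock_guard<std::mutex> lock(state.m); state.paused = false; }
        running = false;
        renderer.join();

        assert(state.frameCount >= 0);
        return 0;
    }
    ```

    Locking per frame is deliberately coarse here; a real implementation would likely hand immutable snapshots or messages across the boundary instead of sharing mutable state.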
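    For approach 3, the coordinate-translation piece is at least well-understood math: intersect the pointer's pick ray with the 3-space quad that displays the XAML texture, and convert the hit point to the (u, v) texture coordinates that would be forwarded to the hosted UI as a 2D pointer position. A hedged sketch with invented names, assuming a quad with perpendicular edge vectors for simplicity:

    ```cpp
    #include <cassert>
    #include <cmath>

    struct Vec3 { float x, y, z; };

    static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
    static Vec3 cross(Vec3 a, Vec3 b) {
        return {a.y * b.z - a.z * b.y,
                a.z * b.x - a.x * b.z,
                a.x * b.y - a.y * b.x};
    }

    // Quad: corner `corner`, edges `edgeU` and `edgeV` (assumed perpendicular).
    // Returns true and writes (u, v) in [0,1] if the ray hits the quad.
    bool rayToUv(Vec3 rayOrigin, Vec3 rayDir,
                 Vec3 corner, Vec3 edgeU, Vec3 edgeV,
                 float& u, float& v) {
        Vec3 n = cross(edgeU, edgeV);                // quad normal
        float denom = dot(n, rayDir);
        if (std::fabs(denom) < 1e-6f) return false;  // ray parallel to quad
        float t = dot(n, sub(corner, rayOrigin)) / denom;
        if (t < 0.0f) return false;                  // quad is behind the ray
        Vec3 hit = {rayOrigin.x + t * rayDir.x,
                    rayOrigin.y + t * rayDir.y,
                    rayOrigin.z + t * rayDir.z};
        Vec3 local = sub(hit, corner);               // hit point in quad space
        u = dot(local, edgeU) / dot(edgeU, edgeU);
        v = dot(local, edgeV) / dot(edgeV, edgeV);
        return u >= 0.0f && u <= 1.0f && v >= 0.0f && v <= 1.0f;
    }

    int main() {
        // Unit quad in the z = 5 plane; ray fired straight along +z from a
        // point over the quad's center.
        float u = 0, v = 0;
        bool hit = rayToUv({0.5f, 0.5f, 0.0f}, {0, 0, 1},
                           {0, 0, 5}, {1, 0, 0}, {0, 1, 0}, u, v);
        assert(hit);
        assert(std::fabs(u - 0.5f) < 1e-4f && std::fabs(v - 0.5f) < 1e-4f);
        return 0;
    }
    ```

    This covers only input translation; the much harder parts (an alternate ICoreWindow implementation, rendering XAML to an off-screen surface, and pumping its dispatcher between update cycles) would sit on top of it.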

    Tuesday, October 11, 2011 1:07 PM


  • Mixing XAML and DirectX is not possible in the Developer Preview. It's unclear whether this will make it into the final version.


    See this thread:



    Tuesday, October 11, 2011 1:33 PM
