Why are the maximum X and Y touch coordinates on the Surface Pro different from the native display resolution?

  • Question

  • I have noticed that the Surface Pro (and, I believe, the Sony Vaio Duo 11) reports maximum touch coordinates of 1366x768, which is surprising to me since its native display resolution is 1920x1080.

    Does anyone know of a way to find out at runtime what the maximum touch coordinates are? I'm running a DirectX app underneath the XAML, so I have to scale the touch coordinates into my own world coordinates and I cannot do this without knowing what the scale factor is.

    Here is the code that I'm running that looks at the touch coordinates:

    // From DirectXPage.xaml
    <Grid PointerPressed="OnPointerPressed"></Grid>

    // From DirectXPage.xaml.cpp
    void DirectXPage::OnPointerPressed(Platform::Object^ sender, Windows::UI::Xaml::Input::PointerRoutedEventArgs^ args)
    {
        auto pointerPoint = args->GetCurrentPoint(nullptr);
        // the x value ranges between 0 and 1366
        auto x = pointerPoint->Position.X;
        // the y value ranges between 0 and 768
        auto y = pointerPoint->Position.Y;
    }

    Also, here is a sample project setup that can demonstrate this issue if run on a Surface Pro:


    Wednesday, February 27, 2013 7:09 PM


  • It sounds like you have your Surface set up to magnify text and other items (see the Display control panel). This is typical for high DPI systems. Your app will get touch coordinates in the logical (magnified) coordinate system, but text and images will draw with the full resolution (assuming high-res image files are provided).

    You can get the bounds of the CoreWindow in device-independent pixels (DIPs) from CoreWindow.Bounds, or you can query DisplayProperties.ResolutionScale to get the scale factor for the screen.
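    Once you have the resolution scale, converting a DIP coordinate to physical pixels is just a multiplication. Here is a minimal sketch of that conversion as plain C++; the `DipsToPixels` helper name is hypothetical, and the 140% value assumes the Surface Pro's documented scale setting (verify the actual value on your device via DisplayProperties.ResolutionScale):

    ```cpp
    #include <cassert>

    // Convert a coordinate in device-independent pixels (DIPs) to physical
    // pixels, given the display's resolution scale as a percentage
    // (DisplayProperties::ResolutionScale reports e.g. 140 for 140%).
    static float DipsToPixels(float dips, unsigned int scalePercent)
    {
        return dips * scalePercent / 100.0f;
    }

    int main()
    {
        // At 140% scaling, a touch reported at logical x = 683 maps to
        // physical x = 956.2 on the panel.
        float px = DipsToPixels(683.0f, 140);
        assert(px > 956.1f && px < 956.3f);
        return 0;
    }
    ```

    In your DirectX app you would apply the same factor to PointerPoint->Position before mapping into world coordinates.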

    See Guidelines for scaling to pixel density for more information.


    Wednesday, February 27, 2013 10:38 PM