Texture sampling with DXGI_FORMAT_R8G8B8A8_UNORM on ARM devices

  • Question

  • I'm using DXGI_FORMAT_R8G8B8A8_UNORM textures in a 2D game to store, in every pixel, the distance from a light source. The precision is not great, but it's good enough when I run on my PC or in the simulator. When I run on my Surface RT (ARM) device, though, I see much worse behavior, which makes me wonder whether I'm really getting only 8-bit precision on x64. Any guess what the problem might be?
    Friday, August 30, 2013 9:01 PM

Answers

  • All Feature Level 10.0 and later devices in the Direct3D 11 API are required by the spec to support 32 bits of precision in pixel and vertex shaders. That requirement does not hold for Direct3D 9-era devices exposed through Feature Level 9.1, 9.2, or 9.3.

    In particular, the Microsoft Surface RT uses an NVIDIA Tegra 3 GPU, which has only 20 bits of precision in the pixel shader and only 16 bits of precision in the depth buffer.

    Saturday, August 31, 2013 6:05 AM

All replies

  • Hello,

    Can you be more specific about what you mean by "the precision is not great"? It sounds like you might have an interpolation problem. Can you offer some greater detail? What feature level are you targeting?

    Thanks,

    James


    Windows SDK Technologies - Microsoft Developer Services - http://blogs.msdn.com/mediasdkstuff/

    Friday, August 30, 2013 11:58 PM
    Moderator
  • I am targeting feature level 9_1 and higher. With feature level >= 10_1 everything works fine, but when I run my game on a Surface RT at feature level 9_1 there are rendering artifacts, which I believe are due to the 8-bit precision I get from DXGI_FORMAT_R8G8B8A8_UNORM. I'm confused, though, because I use this same format for my textures at all feature levels, and it works fine at 10_1 and above. I just want to know why the behavior changes when I run on my ARM device. Thanks,

    David

    Saturday, August 31, 2013 12:33 AM
  • By the way, I am using a point sampler for all of these textures, so that rules out an interpolation problem, right?
    Saturday, August 31, 2013 12:36 AM
  • This is a helpful answer, but I'm still confused: if I use DXGI_FORMAT_R8G8B8A8_UNORM for all of my textures, shader resource views, and render target views, shouldn't each color channel be stored using only 8 bits regardless of the feature level?

    What I am doing is storing distance values in 8-bit color textures using only the R and G channels and sampling them later. Are you saying my problem is not due to the low texture color precision, but probably due to lower-precision calculations happening in my pixel shaders? Thanks!

    Saturday, August 31, 2013 2:53 PM
  • Correct. It's likely precision issues in your shaders, not in the texture itself.
    Tuesday, September 3, 2013 7:00 PM