8-bit vs 10-bit Bit Depth and GetDeviceGammaRamp()

  • Question

  • Hope this is the right place to ask my question....

    When I execute a call to GetDeviceGammaRamp(), I get three 256-entry arrays (indices 0 to 255), one for each RGB channel. That's indicative of an 8-bit "display", as you can see from the code fragment below:

    How would the values change in the case of a 10-bit display pipeline?
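    For reference, the GDI gamma ramp buffer is documented as three 256-element arrays of 16-bit WORD values (0 to 65535) per RGB channel, and that shape is fixed regardless of the pipeline's bit depth. Below is a hedged, portable sketch (no Windows headers, so `WORD_T` and `identity_entry` are illustrative stand-ins, not GDI names) that builds an identity ramp in that same layout:

    ```c
    #include <stdint.h>
    #include <stdio.h>

    /* On Windows the call is typically:
     *   WORD ramp[3][256];
     *   GetDeviceGammaRamp(hdc, ramp);
     * The array length is always 256 per channel; the 16-bit entry
     * values are what carry the extra precision a 10-bit pipeline
     * could use.
     */
    typedef uint16_t WORD_T; /* stand-in for the Windows WORD type */

    /* Identity mapping: 257 == 65535 / 255, so 0 -> 0 and 255 -> 65535 */
    static WORD_T identity_entry(int i)
    {
        return (WORD_T)(i * 257);
    }

    int main(void)
    {
        WORD_T ramp[3][256]; /* same shape GetDeviceGammaRamp() fills */
        for (int ch = 0; ch < 3; ch++)
            for (int i = 0; i < 256; i++)
                ramp[ch][i] = identity_entry(i);

        printf("entries per channel: %d\n", 256);
        printf("ramp[0][255] = %u\n", (unsigned)ramp[0][255]);
        return 0;
    }
    ```

    In other words, even with a 10-bit pipeline active, this API would not be expected to hand back 1024-entry arrays; the 256 x 16-bit layout is part of its contract.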

    I updated to the latest NVIDIA video driver, touted as the first generation to finally support 10-bit output on GeForce video cards (I have a 1070), and have activated the 10-bit option in the NVIDIA Control Panel, as follows:

    A quick look at my system through Monitor Asset Manager shows my "video hardware" to have 10-bit capabilities, as shown below:

    So my question is: why don't I see 1024-entry arrays (Length=1024) returned from my call to GetDeviceGammaRamp()?

    Friday, December 27, 2019 10:07 PM