Topic-Free Mega Thread - v 1.11.2020

Okay…

 
Basically, the first HDR option uses a 10:10:10:2 framebuffer

    10-bit red / green / blue, 2-bit alpha (not important)

The second HDR option (scRGB) uses a 16:16:16:16 framebuffer

    16-bit red / green / blue / alpha
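
In DXGI terms, those two options map onto these standard back-buffer formats. This is just a reference sketch using the stock enum names, not Special K's actual swapchain code:

```cpp
#include <dxgiformat.h>

// Option 1 ("HDR10 PQ"): 10:10:10:2 unsigned-normalized integer, 32 bits per pixel
constexpr DXGI_FORMAT hdr10_format = DXGI_FORMAT_R10G10B10A2_UNORM;

// Option 2 ("HDR10 scRGB"): 16:16:16:16 half-precision float, 64 bits per pixel
constexpr DXGI_FORMAT scrgb_format = DXGI_FORMAT_R16G16B16A16_FLOAT;
```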


 HDR10 scRGB  (16-bit)

  • is higher quality; 16 is larger than 10 :slight_smile:
  • is floating-point, so it suffers from less visible banding
  • is based on Rec709 / sRGB color primaries, so it is well suited for consumer displays

 HDR10 PQ  (10-bit)

  • uses half as much memory as scRGB
  • … can’t think of anything else good about it :stuck_out_tongue:
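
If you are wondering how an app tells DXGI which flavor it is using, here is a minimal sketch with the standard IDXGISwapChain3 calls. The helper name is made up and error handling is trimmed; this is not Special K's actual code:

```cpp
#include <dxgi1_4.h>

// Tag the swapchain with the color space matching its format.
// scRGB : linear gamma (G10) + Rec.709 primaries + R16G16B16A16_FLOAT
// HDR10 : PQ gamma (G2084)   + Rec.2020 primaries + R10G10B10A2_UNORM
void SetHDRColorSpace (IDXGISwapChain3* pSwapChain, bool use_scRGB)
{
  DXGI_COLOR_SPACE_TYPE csp = use_scRGB ?
    DXGI_COLOR_SPACE_RGB_FULL_G10_NONE_P709 :
    DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P2020;

  UINT support = 0;

  if ( SUCCEEDED (pSwapChain->CheckColorSpaceSupport (csp, &support)) &&
       (support & DXGI_SWAP_CHAIN_COLOR_SPACE_SUPPORT_FLAG_PRESENT) )
  {
    pSwapChain->SetColorSpace1 (csp);
  }
}
```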

The PQ in this mode refers to Perceptual Quantization; it is a form of gamma used in HDR (there is a quick code sketch of the encode after the list below).

Both of these formats eventually have PQ applied to them:

  1.    “HDR10 PQ” pre-encodes the image using PQ gamma
  2.    “HDR10 scRGB” renders the image without gamma and the driver applies PQ
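
For the curious, the PQ encode itself looks roughly like this. The constants come straight out of SMPTE ST 2084, though real shader / driver implementations differ in detail:

```cpp
#include <cmath>

// PQ (SMPTE ST 2084) encode: maps linear light, normalized so that
// 1.0 = 10,000 nits, to a perceptually uniform 0..1 signal.
float LinearToPQ (float Y)
{
  const float m1 = 2610.0f / 16384.0f;          // 0.1593017578125
  const float m2 = 2523.0f /  4096.0f * 128.0f; // 78.84375
  const float c1 = 3424.0f /  4096.0f;          // 0.8359375
  const float c2 = 2413.0f /  4096.0f *  32.0f; // 18.8515625
  const float c3 = 2392.0f /  4096.0f *  32.0f; // 18.6875

  Y = fmaxf (Y, 0.0f); // simplification: clamp out-of-range input

  float Ym1 = powf (Y, m1);

  return powf ((c1 + c2 * Ym1) / (1.0f + c3 * Ym1), m2);
}
```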

In the end, the 16-bit scRGB HDR mode should be your go-to for higher quality. It might incur a couple percent more GPU load than the inferior 10-bit mode.

Technically, scRGB gets converted to HDR10 before your display ever receives a signal, but the results are better when that conversion steps down from 16-bit color to 10-bit than when everything is rendered at 10-bit to begin with.
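
A rough sketch of what that conversion involves is below. The struct and function names are illustrative, the real thing also dithers when quantizing down to 10-bit (not shown), and the exact pipeline is up to the driver / OS:

```cpp
struct Color { float r, g, b; };

// scRGB (linear, Rec.709 primaries, 1.0 = 80 nits) ->
// HDR10 (PQ-encoded, Rec.2020 primaries, 1.0 = 10,000 nits)
Color scRGBtoHDR10 (Color c)
{
  // Re-scale from scRGB reference white (80 nits) to PQ's 10,000-nit range
  const float scale = 80.0f / 10000.0f;
  c.r *= scale;  c.g *= scale;  c.b *= scale;

  // Rec.709 -> Rec.2020 primaries (standard conversion matrix)
  Color o = {
    0.6274f * c.r + 0.3293f * c.g + 0.0433f * c.b,
    0.0691f * c.r + 0.9195f * c.g + 0.0114f * c.b,
    0.0164f * c.r + 0.0880f * c.g + 0.8956f * c.b
  };

  // PQ-encode each channel (LinearToPQ from the sketch above)
  o.r = LinearToPQ (o.r);
  o.g = LinearToPQ (o.g);
  o.b = LinearToPQ (o.b);

  return o;
}
```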


Probably shouldn’t have asked me… I just spit even more technical jargon at you.

     :rofl:

D3D12 Version of Special K Tested For This Game:

   dxgi.7z (7.6 MB)
