Bug in 0.11.1: Wrong HDR monitor used for reference

Tested the version provided. Special K now detects the display as “SNY Samsung”, and since the signal goes to my Sony STR-DN1080 receiver and then on to a Samsung Q9FN television, that’s more correct, I suppose? It seems odd that the Sony would be mentioned at all, since Windows and everything else only mention the Samsung display and the receiver is invisible, but alright.

So um, the yellowing is still there with the version provided. It’s extremely yellow still. Well… No effect, I guess.

And since I am playing an Unreal Engine game (Outer Worlds), I also tried moving the monitor in the Windows display settings so it was the leftmost display. It had no effect. I flipped it around to the rightmost position just in case, and it was the same. Just as yellow as ever.

So yea, everything is still yellow.

You learned the funny way that passing HDMI through an A/V receiver modifies the EDID. It’s never recommended for anything graphically important. The A/V receiver might strip out bits of the EDID that are newer than it knows about. Hell, it might even change your display’s reported gamut.
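For reference, the three-letter prefix Special K shows (“SNY”) comes straight from the EDID’s manufacturer field. A minimal sketch (not Special K’s code; the helper name is mine) of how that field decodes:

```python
def decode_pnp_id(edid: bytes) -> str:
    """Decode the 3-letter manufacturer PNP ID from an EDID block.
    Bytes 8-9 pack three 5-bit letters (A=1 .. Z=26) into a
    big-endian 16-bit value."""
    v = (edid[8] << 8) | edid[9]
    return "".join(chr(((v >> s) & 0x1F) + ord("A") - 1) for s in (10, 5, 0))

# A receiver in the HDMI chain can substitute its own EDID, so the
# PC may see "SNY" (Sony) instead of the panel's "SAM" (Samsung).
header = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])
print(decode_pnp_id(header + bytes([0x4D, 0xD9])))  # SNY
print(decode_pnp_id(header + bytes([0x4C, 0x2D])))  # SAM
```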

It’s best to just run a second HDMI cable to the receiver and run the receiver as a standalone display. NV’s drivers don’t have a notion of audio-only HDMI devices.

Turn down color saturation. Saturation increases equally in all directions; your TV just seems to have a bit of a deficiency producing blue.

The ACES tonemap will keep saturation in check so that you don’t notice your display’s blue problems as much. Passthrough doesn’t allow muting saturation.
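For the curious, here is a common single-curve approximation of the ACES filmic tonemap (Krzysztof Narkowicz’s fit; not necessarily the exact curve Special K uses). Because it compresses values toward 1.0 as they grow, bright saturated colors get pulled back, which is why passthrough looks more garish:

```python
def aces_approx(x: float) -> float:
    """Narkowicz's single-curve approximation of the ACES filmic
    tonemap. Compresses highlights toward 1.0, which also reins in
    saturation at high luminance (unlike raw passthrough)."""
    a, b, c, d, e = 2.51, 0.03, 2.43, 0.59, 0.14
    y = (x * (a * x + b)) / (x * (c * x + d) + e)
    return min(max(y, 0.0), 1.0)

print(aces_approx(0.0))   # shadows pass through near-linearly
print(aces_approx(10.0))  # bright values are compressed to ~1.0
```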


Also, paper white defines a point where saturation begins to increase as luminance does. If you set it to 80 nits, that’ll mute the scene a little bit. Technically that is the correct behavior, but a lot of users were upset that saturation wasn’t as pronounced for red and blue, so setting paper white higher increases saturation.
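In an scRGB HDR swapchain, paper white is commonly implemented as a simple luminance scale relative to the 80-nit scRGB reference (1.0 = 80 nits). A rough illustration of that idea only, not Special K’s actual math:

```python
SCRGB_REF_NITS = 80.0  # scRGB defines 1.0 as 80 nits

def apply_paper_white(rgb, paper_white_nits):
    """Scale an SDR color so 1.0 maps to the chosen paper-white
    luminance instead of the 80-nit scRGB reference. 80 nits leaves
    the image untouched (the 'muted' but technically correct look);
    higher values brighten the scene, making saturated reds and
    blues pop more."""
    s = paper_white_nits / SCRGB_REF_NITS
    return tuple(c * s for c in rgb)
```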

I don’t understand this choice of theirs… It’s really annoying in these sorts of situations when one /only/ wants the HDMI cable for the audio alone.

First of all: If I don’t connect any monitor to the AV-Receiver and if I don’t activate that monitor on the Windows Desktop, I can’t use the HDMI audio channel. Which is a serious limitation!

And as I’ve explained, the secondary monitor, which I don’t use for gaming, is connected via the AV receiver. The primary monitor, the one used for gaming and supporting HDR1000, is connected through DP.

The HDR monitor is actually on the far left. That is, Display 1, the left monitor, is the primary one connected using DP; Display 2, the right monitor, is connected via the AV receiver.

I haven’t looked in this thread for a couple of days because, in the end, the coloring looked fine in 0.11.1 in HDR after I deleted any old profile data. So your assumption that the display parameters are read correctly, and that just the name might be incorrect, seems to be true.

And Special K is amazing. Just wanted to say that :slight_smile:

How can you set a resolution for the receiver? It’s not even listed in Windows unless I connect a monitor to the receiver. Maybe our receivers work differently?

I imagine they mean in Windows’ Display settings.

Connect the monitor to the GPU and receiver, enable it in Windows, set it to a resolution of 1080p to allow HD audio formats to be enabled, and then reposition the virtual monitor in Windows’ display settings above another monitor?

Of course, I can add a dummy monitor to the receiver just to make the audio work, but why, if I can just attach an actual monitor I use to extend my desktop?

With the fix Kaldaien made, the name is now also shown correctly.

Ah, I sorta confused myself for a minute there and seem to have lost the thread. Sorry 'bout that. I now realize what your previous comment was asking specifically.

You’re lucky.

My A/V receiver shows up as a 1920x1080 display whether anything’s attached to it or not :slight_smile:

I’ve been annoyed over that for years. Windows gets lost on the non-display that’s only there for audio, lol. Thankfully, HDMI eARC lets me pass audio through the TV to the receiver, and the stupid computer never sees the receiver. It only took like 15 years :stuck_out_tongue:

Could be. My Denon AVR-X4400H shows up as a display in Windows. Inside the Nvidia Control Panel, I navigate to Display > Change resolution and select the receiver under “1. Select the display you would like to change”.

Both my TV (LG C9) and AVR support eARC, but there’s an incompatibility issue with that Denon’s specific model. I was using the workaround I mentioned until I got one of these. Now, I’m finally able to run only one HDMI cable from my GPU directly to the TV, and one HDMI cable from the TV to the AVR.