Bug in 0.11.1: Wrong HDR monitor used for reference

Hi @Kaldaien

Because of the DF video, I checked out Special K, and I’m mad at myself for not having known about it before. I’m a real fan of HDR, and using SK to get HDR working in older games is absolutely fantastic.

However, I’ve played around and found a bug in 0.11.1 which is almost a show stopper (or I’m missing something).

First, my config, which might be a bit special:

I have two monitors:

  • Acer Predator X27 (an HDR1000 monitor with FALD) connected over DP
  • LG 27UK650 (an HDR400 which I just use to extend the desktop, it’s never used for HDR)

The HDMI connection from my graphics card (a 2080 Ti) actually goes to an AV receiver (a Pioneer VSX-534D), which then connects to the LG monitor. This way I can make use of full digital 5.1-channel audio over HDMI.

On the Windows desktop, HDR is enabled for the Acer only. The HDR image on the LG looks awful, which is why I’d never enable it there.

Now, when I use SK, I can see that it always uses the values from the LG monitor as its reference. In the HDR widget, it shows the name “ACR AVR Receiver” on the right, i.e. it sees the HDMI connection to the AV receiver (hence the name) and uses that instead of my primary monitor, the Acer X27.

Consequently, since the actual image is shown on my Acer, the result is totally off.

The only way I was able to fix this was to completely disconnect the HDMI cable to the AV receiver from the graphics card (even turning off the LG and disabling it on the Windows desktop was not enough!). Then, after deleting the Special K config in My Mods, it started to work correctly: now my Acer is shown in the HDR widget, and - oh wondrous miracle - the HDR image shown on the Acer is simply

perfect

So I hope this can be fixed, so I can use Special K and play Code Vein (the game I’m targeting right now) without having to unplug my AV receiver (and losing 5.1 sound because of it).

If there’s any workaround, I’d be glad to learn about it, too!

I don’t have a multi-monitor setup, so I’m not sure, but my random guess is that the LG display is 1 in your settings and the Acer is 2. If so, can you swap the numbers so the Acer is display 1 and primary?

Also, I have the 27UK650 as well; how big is the difference compared to the X27?

No, that’s not it; the Acer is listed as monitor 1 in the Windows display settings.

Anyway, I’ve come across a post by Kaldaien in the other thread where he explains that the name shown in the HDR widget is of no consequence and can be faulty. However, the HDR output in Code Vein was totally off (WAY too dark) before, and now it’s correct.

About your question on the difference between the X27 and the LG: that would be quite off-topic here. If you’re interested, I’d be happy to go into more detail in a separate thread.

Let’s just say the X27 is a monitor that can actually be used for HDR, while the LG simply cannot. But the price difference is also staggering. I bought the X27 almost two years ago and haven’t regretted it, but justifying the cost of that monitor is really hard.

Hmm, I noticed in my setup that the MG248 is listed as the monitor source, so maybe that’s why my values are a bit off? It does seem to detect the correct HDR luminance and such for my actual HDR monitor, judging by the reviews I’ve read of it. Would it make a difference if that name were correct? The MG248 isn’t an HDR monitor.

My HDR monitor is a Samsung C27HG70, set as primary and so on. I’ve also noticed something interesting with monitors in Windows 10: Display settings shows my monitor as 1, but every so often Color Management shows it as Display 2, even though the main Display settings in Windows 10 say it’s 1. I’ve had to redo my drivers to get these to match. You might want to make sure that Display 1 is the same monitor in both the regular settings and the Color Management tab.

Yes, that seems to be a long standing bug in the display settings dialog.

I usually switch back and forth between the monitors before I change anything. That seems to fix this.

Here’s the kicker: Horizon Zero Dawn uses the Color Management display rather than the regular Windows 10 display setting, which is why I got some weird, dull colors in my game. That was something I found out by simply disabling my secondary monitor.

Really? I didn’t notice that. But working with HDR can give you all kinds of weird effects, even without any injection like SK.

Yup, and this was without SK, which was weird because it was fairly easy to replicate. I have AMD, though, so HDR in that game is a bit off; it’s still not working right like it does on Nvidia for the most part. Also, Watch Dogs Legion uses NVAPI for HDR, which is why HDR looks so bad for me and I can’t get it to work properly even with SK.

I wasn’t aware that there’s that much of a difference between AMD and Nvidia in HDR. It never came up in any of the forums where they love to point out every kind of (dis)advantage… :slight_smile:

Look at the chat between Kaldaien and me about Watch Dogs Legion HDR; he points out why it’s doing badly. I get a massive film-grain effect, and oddly enough HDR only works in WDL if you use exclusive fullscreen mode. That’s actually kind of good for me: if I want SK to work, I can simply skip turning on WDL’s in-game HDR. But it still looks bad.

Interestingly enough, though, WDL can turn on HDR in-game without having to turn on HDR mode in Windows 10.

You can use the topic-free megathread.

Why? A 2080 Ti should have at least 3 ports across DisplayPort and HDMI. I would never pass anything through an A/V receiver if it can be helped. The stupid NVIDIA driver is always going to consider the A/V receiver a display device of its own, but you can run it at 640x480 and put it off to the side… (top or right side preferably)


As for the data about your display, the only thing that might be inaccurate is the name. I get the color gamut and luminance levels directly from DXGI for the “containing output” (display device that the game’s window is connected to). Unless you move the game’s window after starting it, that should all be accurate.
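For reference, that query looks roughly like the following. The struct here is a trimmed, hypothetical stand-in for the real `DXGI_OUTPUT_DESC1` from `dxgi1_6.h` (so this sketch compiles anywhere); on Windows, code in this style would call `IDXGISwapChain::GetContainingOutput (...)`, `QueryInterface` to `IDXGIOutput6`, then `GetDesc1 (...)` to fill these fields for whichever display the game’s window sits on:

```cpp
#include <cstdio>
#include <string>

// Trimmed stand-in for DXGI_OUTPUT_DESC1 (the real struct lives in dxgi1_6.h
// and has more fields, including color primaries and the colorspace).
struct OutputDesc1 {
  float MinLuminance;          // nits; lowest luminance the panel reproduces
  float MaxLuminance;          // nits; peak luminance for small highlights
  float MaxFullFrameLuminance; // nits; sustained peak across the whole screen
};

// Summarize the luminance range a tonemapper would use as its reference.
std::string describe_hdr_caps (const OutputDesc1& d)
{
  char buf [96];
  std::snprintf ( buf, sizeof buf, "%.2f - %.1f nits (full-frame %.1f)",
                    d.MinLuminance, d.MaxLuminance, d.MaxFullFrameLuminance );
  return buf;
}
```

If the game’s window sits on the Acer, those values describe the Acer, regardless of what name the widget prints; the mismatch in this thread was only in the name lookup.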

EDIT:

Come to think of it, Unreal Engine games actually do that… they start the game on your middle monitor and then move it to the far-left monitor. This could potentially be a problem for them and the only way around it (for now) would be to arrange your HDR display to be far-left :-\

To add to this, @Grestorn, I’d also connect the GPU directly to the display whenever possible, and run a second HDMI cable through the AVR for audio. However, you need to set a resolution of 1080p@60Hz for the receiver in order for HD audio formats to be enabled.

Do that and position the second “monitor” at the top right corner of your main display in the Nvidia Control Panel, as Kal suggested.

I actually have this exact same setup, lol. Mine is on my left (facing it) and set as the primary monitor, lol.

This has also happened to me. Since I have a Sony receiver in between my PC and my telly, Special K incorrectly thinks that display is a “SNY LG ULTRAFINE” (possibly related to the LG UltraFine displays that are also connected?). My telly is a Samsung Q9FN, so, um, definitely wrong.

Likely because of this, the colours are waaaay off in Special K’s HDR tonemapping. Everything is extremely yellow!

I really don’t think that second part is correct.

I get the color gamut and luminance range of the display device directly from DXGI. DXGI does not supply device names, however, so I have to use a different method to get the name of the monitor. There’s not a reliable connection between DXGI and EDID display names, so the name of the display might be the wrong one but all the rest of the data is correct.

I may just stop reporting the display name altogether and do what most games do (\\DISPLAY1\…). That sucks as a name for a monitor, but it’s accurate. The EDID method is accurate only when there’s a single display :-\

That’s almost certainly a problem with your display. I use D65 white + Rec 709 color primaries, a properly calibrated HDR display should have no problem rendering what are SDR colors correctly.
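For background: mapping those SDR (Rec 709, D65) colors into an HDR10 signal goes through the SMPTE ST 2084 “PQ” curve. Below is a minimal sketch of the standard PQ encoder straight from the spec’s constants - not Special K’s actual shader code - just to show that SDR-range luminance is well within what any HDR display must reproduce:

```cpp
#include <cmath>

// SMPTE ST 2084 (PQ) opto-electronic transfer function.
// Input: absolute luminance in nits (0..10000); output: PQ code value 0..1.
double pq_encode (double nits)
{
  const double m1 = 2610.0 / 16384.0;        // ~0.1593
  const double m2 = 2523.0 / 4096.0 * 128.0; // 78.84375
  const double c1 = 3424.0 / 4096.0;         // 0.8359375
  const double c2 = 2413.0 / 4096.0 * 32.0;  // 18.8515625
  const double c3 = 2392.0 / 4096.0 * 32.0;  // 18.6875

  double y  = nits / 10000.0;   // normalize to the 10,000-nit PQ range
  double yp = std::pow (y, m1);

  return std::pow ((c1 + c2 * yp) / (1.0 + c3 * yp), m2);
}
```

SDR reference white (80-100 nits) encodes to roughly 0.49-0.51 on the PQ scale, so a correctly calibrated HDR panel has enormous headroom to show SDR colors accurately; a strong yellow cast points at the display’s processing or profile, not the signal.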

No other SDR or HDR content behaves like this, either from the computer or from other sources, including other colour spaces. This has only ever occurred when using Special K’s HDR tonemapping.

It’s VERY obvious, plus we’ve had this display for a while, so I’m quite used to its quirks at this point!

EDIT: I am primarily testing an Unreal Engine game, so it might be that bug…

Well, I fixed the general display name bug, but I’m 100% positive that has nothing to do with whatever problem you are having because:

  1. DXGI reports the containing output’s colorspace (just not its name)
  2. SK uses Rec 709 colors and your display damn well better be able to render those accurately
  3. It was only the device name that was problematic
+ Changed HDR Display Name Query to use NvAPI on compatible systems to
    correctly map DXGI Display Handles to EDID Display Names

SpecialK64.7z (7.5 MB)
SpecialK32.7z (6.3 MB)


Is that new version the best one to use, with all the other HDR fixes and game fixes you did?