Topic-Free Mega Thread - v 1.11.2020

Yeah, I meant Path of Exile. Curiously, it worked earlier today, before I did a DDU and downloaded the latest NVIDIA driver.

Hmm, I am getting another interesting glitch. HDR seems to switch whenever I change characters, as if it suddenly goes from dark, proper HDR to bright as broad daylight.

I’d not even bother with SK in that engine. Until they stop using sRGB gamma in the SwapChain, it’s never going to benefit from the frame pacing and input latency work that SK does.

Engine’s poorly designed for windowed-mode rendering.
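For background, this is a hard DXGI restriction rather than a style preference: flip-model swap chains (the thing SK's frame pacing and latency work builds on) cannot be created with an *_SRGB back buffer format at all. A minimal sketch of the standard workaround, assuming an existing ID3D11Device and HWND (error handling trimmed):

```cpp
#include <windows.h>
#include <d3d11.h>
#include <dxgi1_2.h>

// Sketch: create a flip-model swap chain with a UNORM back buffer, then
// recover sRGB gamma at the render target view level instead of baking it
// into the swap chain format (which flip model would reject).
HRESULT CreateFlipModelSwapChain ( ID3D11Device*            device,
                                   HWND                     hWnd,
                                   IDXGISwapChain1**        ppSwapChain,
                                   ID3D11RenderTargetView** ppRTV )
{
  DXGI_SWAP_CHAIN_DESC1 desc = { };
  desc.Format      = DXGI_FORMAT_R8G8B8A8_UNORM;    // NOT ..._UNORM_SRGB; flip model refuses it
  desc.SampleDesc  = { 1, 0 };
  desc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
  desc.BufferCount = 2;                             // flip model needs >= 2 buffers
  desc.SwapEffect  = DXGI_SWAP_EFFECT_FLIP_DISCARD; // what the pacing / latency work relies on

  IDXGIDevice*   dxgiDevice = nullptr;
  IDXGIAdapter*  adapter    = nullptr;
  IDXGIFactory2* factory    = nullptr;

  device->QueryInterface (IID_PPV_ARGS (&dxgiDevice));
  dxgiDevice->GetAdapter  (&adapter);
  adapter->GetParent      (IID_PPV_ARGS (&factory));

  HRESULT hr =
    factory->CreateSwapChainForHwnd (device, hWnd, &desc, nullptr, nullptr, ppSwapChain);

  if (SUCCEEDED (hr))
  {
    ID3D11Texture2D* backbuffer = nullptr;
    (*ppSwapChain)->GetBuffer (0, IID_PPV_ARGS (&backbuffer));

    // sRGB gamma comes back here, at the view level:
    D3D11_RENDER_TARGET_VIEW_DESC rtvDesc = { };
    rtvDesc.Format        = DXGI_FORMAT_R8G8B8A8_UNORM_SRGB;
    rtvDesc.ViewDimension = D3D11_RTV_DIMENSION_TEXTURE2D;

    hr = device->CreateRenderTargetView (backbuffer, &rtvDesc, ppRTV);
    backbuffer->Release ();
  }

  factory->Release (); adapter->Release (); dxgiDevice->Release ();
  return hr;
}
```

An engine that insists on an sRGB swap chain format instead is stuck on the old blt-model presentation path, which is exactly why SK's pacing work can't help it.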


This all drives me mad. Windowed-mode performance and latency don’t have to totally suck; we just need engines in latency-sensitive games such as that one to stop doing everything backwards :stuck_out_tongue:

I don’t know if they can even change from sRGB gamma. It’s when alt-tabbing from fullscreen mode that the error occurs, though.

Avoid turning on Special K’s flip model settings and that should just go away.

They don’t benefit anything in a game that has to run in fullscreen exclusive anyway. I only support Flip + Fullscreen as a compromise so that graphics engines don’t leave you stranded in the configuration menu after changing resolution settings.

I am of half a mind to jump on the offer below:

If you do, let me know if the Tobii features in SK still work :slight_smile:

I haven’t used them in a while now because, well, I don’t use computer monitors anymore. They can’t track my eyes when I’m playing my games on a 77" 4K OLED from a few feet away, lol.

Third-person eye tracking, hmm, well, why not.
Guessing it can also help with UI interaction and such, depending on how it’s implemented.

EDIT: What does it do?

Hmm. :slight_smile:

EDIT:

If perhaps not without potential problems. Considering it’s a Ubisoft open-world game, the obligatory item pick-up sound is going to be playing constantly just from scanning the surroundings near the player.

Neat usage though.

Guessing it gets similar use in Legion too.

EDIT:
And in that one, what does it actually do?

Well, “hack the world” certainly applies.

Neat with the fading UI, and for both of these also the light adaptation and some other ideas. :slight_smile:

The error occurs regardless of whether flip model is enabled :confused: it’s getting triggered just by having SK injected.

I dunno if this happened because I clean-installed my NVIDIA driver or because of a game update, but reverting to the previous driver or SK version did not fix it, so I think I can put the blame on the game instead.

EDIT: This is weird; somehow changing from D3D11 to Vulkan and back again fixed the error.

BTW, is it possible to apply the framerate limiter only with Vulkan, without using the OSD?

That’s more or less how Tobii is used in all software, to be honest. Even Special K uses it for that purpose. I wanted to do some advanced HDR stuff with eye tracking, but then I got an HDR OLED (… and another one, and another one – @#$% these things are awesome) and now I don’t waste my time with computer monitors anymore :stuck_out_tongue:

Seriously, LG… just shrink these down to computer monitor scale already. 48" is a nice step towards OLED on the desktop, but it’s still too big. No idea why they haven’t gotten the memo yet: LCD sucks and we need OLED in between mobile phones and giant TVs ASAP.

Do you also feel that glossy OLED with an anti-reflective coating contributes more to perceived clarity/sharpness than 27/32" 4K monitors that supposedly have higher PPI?

I’ve had plenty of glossy LCD displays. Matte vs. glossy finish doesn’t really impact perceived resolution, so much as it does black contrast.

OLED already invalidates a lot of traditional methods for defining contrast :wink:


All of my rooms are well light-controlled, so I get no benefit from matte diffusion. I’ve been using plasma / OLED TVs exclusively for the past 15 years, so I am allergic to external light sources, lol.

Hmm, idk; in my opinion, when I compare my 4K 32-inch to my C9 55-inch, I feel that I see detail and sharpness way better on the OLED. I think matte coatings on high-PPI 4K displays actually reduce perceived clarity: the pixels are so small that the coating ends up obscuring them.

Yes, if that’s what you were getting at, then I agree.

But this is not about the OLED having a glossy finish (I don’t think it’s even possible to mfg. an OLED or plasma panel with a diffuse coating; they would be too damn dim to see anything :stuck_out_tongue:).

The reason that an LCD panel might choose glossy vs. matte is usually because it wants to boost contrast. LCD already works by blocking the backlight underneath, so further restricting the light with that diffuse layer tends not to dim the image unacceptably.

It’s an option to go with matte or glossy on LCD; it’s not one on OLED.

Sadly, glossy is not really an option for most LCD displays, especially high-refresh-rate ones. The only glossy ones I have been able to find are the Apple ones, or displays that use Apple panels, but those are 5K and don’t have traditional DP/HDMI inputs, using either Thunderbolt or USB-C instead.

Honestly, I can’t wait for MicroLED in 27-32 inch sizes, but I believe that’s around 5-10 years away.


Yeah, that’s true…

I was surprised when I bought my PG27UQ ( https://www.amazon.com/Swift-PG27UQ-G-SYNC-Gaming-Monitor/dp/B07F1VGGLK ), that it was matte. It would reduce haloing on the local dimming if it were glossy instead. For what that monitor cost, people could afford to put curtains in their room to control glare :stuck_out_tongue:

Just realized that NVIDIA’s LDAT is somewhat of a flawed measurement device for high-end displays that are not sample-and-hold.

My OLEDs all run in BFI mode because motion looks so damn sexy with pixels that aren’t full-bright until the next image is scanned. The fact that I am not using a full persistence display means a 6% change in luminance is going to happen whether there’s a new image refreshed or not.

LDAT can only be used in full-persistence display modes (e.g. G-Sync). Attempting to use it with ‘OLED Motion’ enabled will give completely useless data, and using it to test latency on a CRT or plasma is completely impossible.
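To make the failure mode concrete, here’s a toy model of threshold-based transition detection. The names, samples, and the way the threshold is applied are illustrative (not LDAT’s actual firmware logic); only the 6% figure comes from the post above:

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

// Naive detector: flag a "response" whenever luminance changes by >= 6%
// relative to the previous sample, roughly what a photodiode-based tool
// can get away with on a sample-and-hold display.
std::vector<size_t> DetectTransitions (const std::vector<double>& lum,
                                       double threshold = 0.06)
{
  std::vector<size_t> hits;
  for (size_t i = 1; i < lum.size (); ++i)
    if (std::fabs (lum [i] - lum [i - 1]) >= threshold * lum [i - 1])
      hits.push_back (i);
  return hits;
}

int main (void)
{
  // Sample-and-hold: luminance stays flat until the real frame change.
  std::vector<double> holdType = { 1.0, 1.0, 1.0, 1.0, 1.0, 1.5, 1.5, 1.5 };

  // BFI / 'OLED Motion': pixels dip toward black every refresh, so large
  // luminance swings occur whether or not a new frame was presented.
  std::vector<double> bfiType  = { 1.0, 0.1, 1.0, 0.1, 1.0, 0.1, 1.5, 0.15 };

  printf ("sample-and-hold transitions: %zu\n",
          DetectTransitions (holdType).size ()); // 1 (the real frame change)
  printf ("BFI transitions:             %zu\n",
          DetectTransitions (bfiType).size ());  // fires on every refresh
  return 0;
}
```

On the BFI trace the detector fires on every single refresh, so any latency number derived from it is meaningless.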

PC gaming’s too closely tied to LCD display technology, IMO. When OLED shrinks down to computer monitor scale, there’s going to be a rude awakening for many people: brute-forcing your way to clear motion through higher refresh rates is something only LCD requires.


Current high-end displays and upcoming models already use OLED, do they not? Although under 30" it looks like variants of IPS LCD technology, plus some VA models, are more the norm, with various drawbacks and trade-offs between features and the total cost of the unit.

Mainstream though yeah probably at least another decade if not more.

EDIT: Nice, ReShade 4.8.0 is available for download. It could still be updated a bit before the full release, though. :slight_smile:

Those spectral images are spooky :slight_smile:


Technically, scRGB can hit 15 million nits :sunglasses: It’s not a signal format though, so luckily that’s not a thing anyone has to worry about.

For Dolby Vision / HDR10, the currently defined standard SMPTE 2084 tonemap uses 10K nits as its peak.
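For reference, that 10K-nit ceiling falls straight out of the curve’s definition. A sketch of the published ST 2084 inverse EOTF (PQ encode), which normalizes luminance against 10,000 nits before applying the curve:

```cpp
#include <cmath>

// SMPTE ST 2084 inverse EOTF: maps absolute luminance (nits) to a PQ
// code value in [0, 1]. Constants are the published standard values.
double PQ_InverseEOTF (double nits)
{
  const double m1 = 2610.0 / 16384.0;
  const double m2 = 2523.0 /  4096.0 * 128.0;
  const double c1 = 3424.0 /  4096.0;
  const double c2 = 2413.0 /  4096.0 *  32.0;
  const double c3 = 2392.0 /  4096.0 *  32.0;

  double Y  = nits / 10000.0;  // 10,000 nits maps to code value 1.0
  double Ym = std::pow (Y, m1);

  return std::pow ((c1 + c2 * Ym) / (1.0 + c3 * Ym), m2);
}
// PQ_InverseEOTF (10000.0) == 1.0, the very top of the signal range;
// nothing brighter can be described by the format.
```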


I’m not sure why you’re surprised to see part of the signal hitting 10K nits?

Special K’s “Peak White” target does not mean “Don’t get brighter than Y nits,” it means “normalize the SDR image range to Y nits.”

If Special K merely rescaled the SDR image, it would probably work more like a brightness limit. But if SK did only that, there would be no HDR effect.

SK does 2 things to achieve SDR → HDR:

  1. It redefines the brightness value of (1.0, 1.0, 1.0)
  2. It allows a graphics engine to accumulate values beyond (1.0, 1.0, 1.0).

Multi-pass rendering that an engine doesn’t tonemap to SDR will slip past the (1.0, 1.0, 1.0) mark before SK applies any curves, simply because I changed the framebuffer to store HDR. Typically that includes things like a game’s HUD.
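Here’s a toy model of those two steps under the scRGB convention (1.0 = 80 nits). This is only a sketch to illustrate the idea; the names are made up and it is not SK’s actual shader code:

```cpp
// Toy model of the two-step SDR -> HDR promotion described above.
struct RGB { float r, g, b; };

// scRGB convention: 1.0 == 80 nits, and values beyond 1.0 are legal
// once the framebuffer stores HDR (FP16).
RGB PromoteSDRtoHDR (RGB c, float peakWhiteNits)
{
  // Step 1: redefine what (1.0, 1.0, 1.0) means; SDR white is
  // normalized to peakWhiteNits instead of 80 nits.
  const float scale = peakWhiteNits / 80.0f;

  // Step 2: no clamp. Anything the engine accumulated past 1.0
  // (HUDs, un-tonemapped multi-pass output) scales right along and
  // slips past the old SDR ceiling, producing the HDR effect.
  return { c.r * scale, c.g * scale, c.b * scale };
}
```

That is also why a “Peak White” of, say, 200 nits can still produce highlights far above 200 nits: the setting normalizes SDR white, it does not cap the signal.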

Now I need to know what the sun is rated at, when viewed from Earth :eyes:

It was surprising that HDR recording did indeed work, but the footage was clipped as if it were SDR, on every player I checked. So to me it feels like NVIDIA recorded HDR content in SDR and tonemapped it back to HDR, if that makes sense?

But like I said before, I still need to run more tests sometime.