Topic-Free Mega Thread - v 1.11.2020

I kinda want to say ‘duh’ here…

DwmFlush (…) is not for the purpose of VSYNC; it is there to kick-start compositing, which is a different beast altogether. The DWM coordinates rendering between multiple applications, so it’s not really a good idea to call DwmFlush and expect it to only affect / be affected by your application.

There’s a correct way to wait for VBLANK if that behavior is desired, and that’s not it :slight_smile:
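
For illustration only: a minimal sketch of one such correct approach, waiting on the actual output’s vertical blank through DXGI. It assumes you already hold the game’s IDXGISwapChain; the helper name is made up for the example and this is not SK’s actual code.

#include <dxgi.h>

void WaitForVBlankOnce (IDXGISwapChain *pSwapChain)
{
  IDXGIOutput *pOutput = nullptr;

  // Ask DXGI which output (monitor) the swap chain is currently presented on
  if (SUCCEEDED (pSwapChain->GetContainingOutput (&pOutput)))
  {
    // Block this thread until that output's next vertical blank
    pOutput->WaitForVBlank ();
    pOutput->Release       ();
  }
}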


I like how the programmer refused the explanation and kept treating it as a Windows OS issue. Then, when pressed and compared to Chrome (where this is done correctly), the solution was to write code to hide the problem rather than actually fixing it, while still believing it’s a Windows OS issue, and over time it’s gotten worse.

Firefox has declined a lot in quality over the last two years as they’ve lost much of their market share to Chromium and then Edge (and now Edge is on Chromium too). It might be time soon enough to just hop over to Edgium (Edge with Chrome, as I keep calling it); it’s pretty good after some setting up, from what I’ve tested so far.

Funny reading, for sure…

But it did get me thinking, and my lazy ass has been defaulting the scale of the framepacing widget to 60 FPS for no good reason for all this time. It’s very easy to get the active refresh rate from the DWM, and not only that, I can get an extremely accurate measure of it too…


That will be going into the next version of SK :slight_smile: If you don’t set a limit, it gets the refresh rate from the DWM.

Added benefit: when you click the Framerate Limit button, a very precise limit matching your refresh rate is selected instead of one that a user would naively enter (believing their refresh rate to be exactly 120).


DWM’s just this awesome thing that doesn’t get enough love :slight_smile: You know how difficult it is to get the refresh rate using any other API? Way more difficult than it should be, and even when you do get the refresh rate, it’s nowhere near as accurate as what the DWM can give you in one function call.
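
Roughly, that one call looks like this (illustrative sketch, not SK’s actual code; the function name is made up). The nice part is that rateRefresh comes back as an exact numerator / denominator pair rather than a rounded integer:

#include <Windows.h>
#include <dwmapi.h>
#pragma comment (lib, "dwmapi.lib")

double GetDwmRefreshRate (void)
{
  DWM_TIMING_INFO timing = { };
  timing.cbSize          = sizeof (DWM_TIMING_INFO);

  // On Windows 8.1 and later the HWND parameter must be NULL
  if ( SUCCEEDED (DwmGetCompositionTimingInfo (nullptr, &timing)) &&
       timing.rateRefresh.uiDenominator != 0 )
  {
    return static_cast <double> (timing.rateRefresh.uiNumerator) /
           static_cast <double> (timing.rateRefresh.uiDenominator);
  }

  return 0.0; // Composition timing unavailable
}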


Weird, I can’t seem to get Sekiro working with Special K HDR anymore. I disabled in-game HDR and tried different render passes. Going to look through older builds of SK.

Boss rush mode for the game is releasing today, I’m looking forward to it :smiley:

Yeah, it doesn’t work anymore. I had to stop supporting it because of its driver-based HDR. It will lock your entire system up.

I’m seriously starting to HATE NvAPI-based HDR. I really get annoyed whenever a game ships using it. Said game never works long-term without issue.

Setting DXVK’s NVAPI option to false and going at it that way through GeForce Experience, then?
No idea whether the SpecialK NVAPI / NVHDR overrides work; I can’t test that on my end.

The game’s getting an enhanced edition update with a few new outfits, I think one to three new boss fights, and a boss rush mode as a free patch soon.

EDIT: I suppose SpecialK could technically report the GPU as another device ID, lifting that same trick too, but that’s a hacky solution probably best left alone. NVAPI is also used by some games or game engines beyond just extra features, same as AMD ADL, though you don’t see that as often.
(Here and there AMD_AGS.dll for FreeSync2 HDR and a few other bits.)

EDIT: NVAPI and NVHDR in SpecialK are these bits.

[NvAPI.HDR]
hdrBpc=NV_BPC_10
hdrColorFormat=NV_COLOR_FORMAT_RGB
hdrDynamicRange=NV_DYNAMIC_RANGE_AUTO

[NVIDIA.API]
Disable=
DisableHDR=

Where’d you find that, BTW? :slight_smile:

I can only vaguely remember how to set those things, lol. I know they are settings, but I have not really explained them to anyone or made them available through SK’s control panel.

https://gitlab.special-k.info/Kaldaien/SpecialK/-/blob/0.11.liberated/src/nvapi.cpp#L629-695

Good question; they’ve been in there for some time.

I think I found those in config.cpp, but how they link into the other source bits from there I’m not entirely certain.
https://gitlab.special-k.info/Kaldaien/SpecialK/-/blob/0.11.liberated/src/config.cpp

ConfigEntry (nvidia.api.disable,                     L"Disable NvAPI",                                             dll_ini,         L"NVIDIA.API",            L"Disable"),
ConfigEntry (nvidia.api.disable_hdr,                 L"Prevent Game from Using NvAPI HDR Features",                dll_ini,         L"NVIDIA.API",            L"DisableHDR"),
ConfigEntry (nvidia.bugs.snuffed_ansel,              L"By default, Special K disables Ansel at first launch, but"
                                                     L" users have an option under 'Help|..' to turn it back on.", dll_ini,         L"NVIDIA.Bugs",           L"AnselSleepsWithFishes"),
ConfigEntry (nvidia.sli.compatibility,               L"SLI Compatibility Bits",                                    dll_ini,         L"NVIDIA.SLI",            L"CompatibilityBits"),
ConfigEntry (nvidia.sli.num_gpus,                    L"SLI GPU Count",                                             dll_ini,         L"NVIDIA.SLI",            L"NumberOfGPUs"),
ConfigEntry (nvidia.sli.mode,                        L"SLI Mode",                                                  dll_ini,         L"NVIDIA.SLI",            L"Mode"),
ConfigEntry (nvidia.sli.override,                    L"Override Driver Defaults",                                  dll_ini,         L"NVIDIA.SLI",            L"Override"),

ConfigEntry (amd.adl.disable,                        L"Disable AMD's ADL library",                                 dll_ini,         L"AMD.ADL",               L"Disable"),

EDIT: There’s been a lot of code shuffling and cleanup after all, and some of these are pretty old. I would think it’s one of the D3D11 source files, but I don’t think I’ve ever dug up exactly where these are from.

EDIT: Well, if the function is named disable_hdr, I can use that as a search term and go through the results.
Yeah, it’s actually in NVAPI.cpp, and then there’s a small mention in Sekiro.cpp, which is in the plugins folder. :slight_smile:

EDIT: Sekiro just seems to be a plugin-specific checkbox, maybe that’s where this originated from, but I don’t think anything is set by default, so there’s no Sekiro deadlocking from trying to force-disable NVAPI or NVAPI’s HDR mode.

I’m not much of a coder, but the main functionality seems to be in NVAPI.cpp.

I should go through these more thoroughly at some point; I occasionally use them to do config cleanup without making a completely new blank preset.
(Though starting from a blank preset is a good idea once in a while for getting new optimal default values for some of these settings.)

And as for config parameters, I believe that’s been primarily from developments in the framerate limiter through the last few releases of SpecialK, which have overhauled it quite a bit compared to previous versions.
(Limiter tolerance, for example, is one removed setting, and the two sleepless options now defaulting to false is an example of a changed default.)

EDIT: I should get an HDR display while at it; at some point I’ll have some fun with this too. :smiley:

Looking forward to the day when this actually works:

          { L"NV_BPC_16",      NV_BPC_16      } };

NVIDIA has the ability to check for / set 16-bpc color, but there’s no signal standard that supports that right now :stuck_out_tongue: I think the only devices in existence that support 16 bpc are printers, and last I checked NVIDIA wasn’t in that market.

Nonetheless, 281 trillion color mode… it’s gonna happen someday.

Looks like I know the reason why Ori’s official HDR is among the worst I have ever seen.

This is how the game handles SDR rendering: it simply blows past the limits of the image format until large portions of the screen are brighter than the signal can transport. In other words, the artists saw nothing wrong with crushing really bright details into a solid full-bright color.

Things do not improve when you turn on the game’s official HDR; in fact, saturation goes so far out of whack you would think the game was set on Mars.


Hmm, guessing the SpecialK options could help balance it in a bit; interesting to hear it affects SDR too.
Wonder if Trine would have been similar, but those games are mostly a bit too old for HDR to have been a thing, maybe the fourth game if anything.

Liking these histograms and charts too; they give a really good look into these things, highlighting and showing exactly what’s going on and helping with properly calibrating the image, using bright and dark scenes to dial in more optimal overrides.

Color balancing has been a thing with ReShade for a while too, including using it or similar shader tuning for media playback, but that’s a bit different. Plus, ReShade doesn’t quite do 10-bit properly, although it’s improved a bit.

Much of that is for getting the color channels in sync, however, not reining in extreme brightness where everything goes white, or what would happen if something carelessly tries to master against 10K nits.

Wonder if it’s the same on console; Boris’s HDR reports on some games were pretty fascinating for what some games attempted or targeted and how that then scaled or got calibrated.

And now the new Xbox UI and functionality has tools that also attempt to handle HDR, though I don’t know if one can re-calibrate the software side of the implementation or if it’s just intended as an attempt at HDR support plus HDR TV calibration helpers.

I don’t support 10-bit either… I mean from a technical standpoint I do, but from my perspective it’s not appropriate to render HDR in 10-bit, ever.

An HDR pipeline should be 16-bit with the downgrade to 10-bit saved for the final tonemap. ReShade cannot possibly hope to do any kind of post-processing on an HDR game in the 10-bit colorspace it uses for final output. That 10-bit image most games produce isn’t suitable for post-processing, too much precision has been lost at that point and the only reason games get away with crushing stuff down to 10-bits like that is because the image is tailored precisely to the attached display device’s luminance capabilities.
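
To make that concrete, here is a generic DXGI sketch (not SK’s or ReShade’s code, and the helper name is invented) of where the 16-bit vs. 10-bit choice actually lands: the swap chain’s format and color space. The back buffer has to be created with the matching format, R16G16B16A16_FLOAT for scRGB or R10G10B10A2_UNORM for HDR10; everything before that final buffer should stay in full-precision render targets.

#include <dxgi1_4.h>

void SetHdrOutputColorSpace (IDXGISwapChain3 *pSwapChain, bool use_scRGB)
{
  // scRGB : 16-bpc float back buffer, linear gamma, Rec. 709 primaries
  // HDR10 : 10-bpc UNORM back buffer, ST 2084 (PQ) gamma, Rec. 2020 primaries
  DXGI_COLOR_SPACE_TYPE csp =
    use_scRGB ? DXGI_COLOR_SPACE_RGB_FULL_G10_NONE_P709
              : DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P2020;

  UINT support = 0;

  if ( SUCCEEDED (pSwapChain->CheckColorSpaceSupport (csp, &support)) &&
       (support & DXGI_SWAP_CHAIN_COLOR_SPACE_SUPPORT_FLAG_PRESENT) )
  {
    pSwapChain->SetColorSpace1 (csp);
  }
}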


Hmm, guess Microsoft is trying to make this better, although I was under the impression 10-bit was the current go-to standard. Hopefully, as hardware improves and bandwidth becomes less of a concern with HDMI 2.1 and whatever’s next for DisplayPort, this can be resolved given more time.

I would expect NVIDIA’s and AMD’s HDR modes to be at least ready for this too, but I wouldn’t be entirely sure how developers would leverage and implement these capabilities, which I suppose is where much of this goes wrong, even with improvements on consoles and PC and better documentation and recommended best practices.

Sigh, well, D3D12_3 HD HDR perhaps then, HDR16 or something. Joking aside, though, I think that’s already covered in current D3D12 documentation and supported without any real problems.

Still a fairly early tech after all, but LG and others are pushing to improve the situation, and with new hardware and better standards the software side should follow. :slight_smile:
(After a lot of time, but that’s a constant, recurring issue.)

EDIT: Hmm, actually scRGB and full 16-bit support would be separate issues, as demonstrated by what I assume is still a pending NVAPI implementation.

Though with the OS now supporting HDR, these separate vendor-specific HDR attempts should be slowly diminishing, or so one could hope.

Guess there’s a reason AMD has their FreeSync2 HDR thing too, and likely something similar holds for NVIDIA’s NVAPI HDR functionality, even with Windows HDR in the picture now.

EDIT: I think these have their own limitations and restrictions too, so from a developer standpoint, going by what’s standard should be the common implementation, but I don’t really know. Obviously NVAPI HDR and AMD ADL FreeSync2 HDR are still around, even with the current builds of Windows 10 supporting HDR across D3D12 and D3D11, plus extensions for Vulkan offering HDR functionality too.

EDIT: Well, I suppose the bits per channel has that limit with alpha too, if that’s a bigger concern.
Going from 24 (8-8-8) to 30 (10-10-10) to 48 (16-16-16) here.

And then alpha was what, 2 bits or something.

Lots of different formats and whatnot, not something I’m very good at.
Millions of colors, billions of colors, and the overall precision too, then potentially also hitting those banding issues I’m guessing.
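
For reference, my own back-of-the-envelope math on those bit depths (this is also where the 281 trillion color figure above comes from; in the packed 32-bit formats the leftover bits go to alpha, i.e. 8-8-8-8 and 10-10-10-2):

8 bpc (24-bit RGB): 2^24 = 16,777,216 (~16.7 million colors)
10 bpc (30-bit RGB): 2^30 = 1,073,741,824 (~1.07 billion colors)
16 bpc (48-bit RGB): 2^48 = 281,474,976,710,656 (~281 trillion colors)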

Hmm, wonder if bandwidth is going to be a concern after all, thinking of it; ray-tracing wants a lot of that, and it’s going to be next up along with 4K for this coming generation of hardware. :smiley:
(But that might be more on the VRAM and overall memory demand, though still potentially an issue even as this gets faster and newer hardware has better total availability or faster - wider? - buses.)

10-bit is totally the display standard, but that’s quite a bit different than the rendering standard.

Dolby Vision is a 12-bit format, and most displays that use it are only 10-bit. The reason it’s 12-bit is because it needs a signal with more precision than the display has because it’s going to do display-side processing to try and reshape the image’s waveform to something the display can produce that closely matches the reference image.

Rendering HDR in games works the same way, except the end goal is to have NO PROCESSING done on the display. HDR rendering is usually at 16-bit, with the same kind of stuff that Dolby Vision does applied at the engine level. This is how a game can transfer a 10-bit signal with the same range as Dolby Vision’s 12-bit; neither technology can do its processing in 10-bit, though.
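
As a rough illustration of that final step (my own sketch, not how any particular game or SK does it; the function name is made up), the ST 2084 / PQ curve maps absolute luminance into the signal, and the quantization to 10 bits only happens once, at the very end of an otherwise full-precision pipeline:

#include <algorithm>
#include <cmath>
#include <cstdint>

uint16_t EncodePQ10 (float nits) // linear luminance in, 10-bit code value out
{
  // SMPTE ST 2084 (PQ) constants
  const double m1 = 2610.0 / 16384.0;
  const double m2 = 2523.0 /  4096.0 * 128.0;
  const double c1 = 3424.0 /  4096.0;
  const double c2 = 2413.0 /  4096.0 *  32.0;
  const double c3 = 2392.0 /  4096.0 *  32.0;

  double Y  = std::clamp (static_cast <double> (nits) / 10000.0, 0.0, 1.0);
  double Ym = std::pow   (Y, m1);
  double N  = std::pow   ((c1 + c2 * Ym) / (1.0 + c3 * Ym), m2);

  return static_cast <uint16_t> (N * 1023.0 + 0.5); // the only 10-bit step
}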


Maybe you can take a look at the source and see what the issue is?

Or maybe you could make a thread in their forums for this?

Maybe. I don’t have any experience working on the other side of these popular engines, lol. I can tell you about all sorts of weird quirks in Unity and Unreal, but since I don’t know how their official tools work, most of the time I can just comment on how weird something is, write a workaround for it, and then call it done. To go beyond that would require me to learn an engine I have no conceivable use for, and that’s not a great way to spend my time.

Fair enough.

If I manage to put something interesting together to process LDR → HDR with similar levels of success as Microsoft is doing with the Xbox Series X Auto-HDR system, this will surely be discussed in any write-up.

For now, the system I’ve got is not worth writing about. It’s definitely worth using, but there’s nothing amazing that needs to be shared with other developers (yet?) :stuck_out_tongue:

Soo AMD announced an answer to the 3090, in the form of the 6900 XT. We’ll have to see if it actually matches the performance whenever reviews are up, but it’s already better from a value standpoint, and with a TDP of 300 W.

Not to mention CHEAPER than the 3090 by a comfortable margin at $999 USD. The only thing that gives me pause is that my monitor is G-Sync only, no FreeSync support, and it is a nice-ass monitor (Asus PG27QU).