D3D12 Missing Features

I really wish D3D11On12 didn’t exist. It’s @#$%ing confusing trying to figure out a game’s render API on-the-fly when there’s D3D11 noise coming from RTSS and Steam :stuck_out_tongue:

That’s why the overlay behaves strangely / crashes sometimes when changing graphics settings. SK has to make a split-second decision whether the game’s D3D12 or D3D11, and those overlays are not making it easy.

It’s much easier when you put dxgi.dll in the game’s directory; the game will load it before those non-native D3D12 overlays start mucking things up. I have absolutely no idea why Steam’s overlay isn’t D3D12 native, it’s not even rendered inside the game – the only thing the overlay does is transfer images rendered in another application and stretch them over the game. That’s trivial to make D3D12 native.

I put dxgi.dll in the Control (game) folder; when I start in DX11 it works, but in DX12 it doesn’t.

I would want to test it with Forza Horizon 4 (Microsoft Store game), but only way to do that I know is to use SKIF. How can I make SKIF use this DLL for DX12 games, so I can test MS Store games?

Make sure you use proper settings.

I only put dxgi.dll in the Control folder; when I launch in DX11, SK shows up, but when I launch in DX12 it doesn’t. What proper settings should I be using? Do I set them before starting the game, or in SK’s configuration afterwards?

Special K works for me in Control DX12.

Maybe you could try starting from scratch: delete dxgi.ini and let Special K generate a new one.

It might take a couple tries, though. For me, the game closes down the first time it’s launched with Special K and no dxgi.ini.
But on the second launch, after dxgi.ini has been generated, the game runs properly with Special K.


OK. I deleted “dxgi.ini” and dragged “dxgi.dll” into the Control folder. Only “dxgi.dll”. I launched the game in DX12, and SK doesn’t show up. I tried several times. (If I launch in DX11, SK does pop up.)
Do I need other files to properly install SK? I’ve never seen SK work via global injection, only local.

While shaking my head about this subject, I realized something curious that might interest everyone: I have a Samsung Q80. Its behavior has always been this: if a game has HDR, the TV activates HDR on its own, and if I turn off HDR in the game, the TV switches back to SDR. Fine so far.

I don’t know if it was a TV or an Nvidia update, but if I enable HDR in Windows and then run Control, the TV or Nvidia (I don’t know which) forces HDR in a game that doesn’t have HDR, understand?

I tested several games that do not have HDR, old enough to ensure that HDR does not exist there (Limbo, Inside, Control, RE Remake). And in all cases HDR was forced onto the game, changing the color gamma, contrast, and peak brightness.

Before, the TV’s behavior was that even if I turned on HDR in Windows and then ran a game, if that game didn’t have HDR the TV would simply switch to SDR on its own.

Curious…

Maybe the TV received an update, or Nvidia, or I’m going crazy.

It has to do with Windows, its automatic SDR->HDR conversion, as well as the game itself and whether it makes use of NVAPI or not to enable HDR.

A few key things to note before I get into the gritty details:

  • Games running in borderless/window mode will not override the display mode set in Windows.

  • Games running in exclusive fullscreen mode can override the display mode set in Windows.

  • Windows 10 v1803 and later includes automatic SDR->HDR conversion to prevent SDR content from looking dim and washed out when HDR is enabled for a display.

  • Nvidia’s NVAPI allows games to force HDR mode on a display without having to enable it in Windows itself.

What the above means is the following:

  • If HDR is enabled for a display in Windows 10, and the game is running in borderless/window mode, the operating system will automatically convert SDR to HDR output to prevent the game from looking grey/washed out. This is what’s typically known as “fake HDR”, as there’s no additional gain or brightness to be had. It’s similar to e.g. resizing a 720p image to 4K – you will get a larger image, but the details will still be limited by what was present in the 720p image.

  • When HDR is enabled for a display in Windows 10, if a SDR game is running in exclusive fullscreen mode it will tell Windows to switch over to a SDR display mode, thereby disabling HDR.

  • When HDR is disabled for a display in Windows 10, games making use of the NVAPI to enable HDR can forcefully enable HDR mode for the display – in both borderless/window and exclusive fullscreen mode.

So the various results you’re seeing are based on the above factors.


If we assume HDR was disabled for the display in Windows, this means the game that you’re seeing this behavior in makes use of Nvidia’s NVAPI to forcefully enable HDR for the display.


Both of the above scenarios are Windows 10’s SDR->HDR conversion at play. It’s basically a simple rescaling – you don’t actually get anything more out of the experience. It’s solely there to prevent SDR from looking grey/washed out, as it did in earlier versions of Windows 10 (before v1803). So-called “fake HDR” in a nutshell.

Special K’s retrofitting of HDR actually enables games to make use of the expanded dynamic range that HDR has – and is basically a more “proper” form of HDR.

Edit:

Note that beyond the above, the TV might also be configured to a “Vivid” or “Enhanced” mode/preset which further expands and manipulates the colors and brightness of the image. This is basically where the TV takes the “HDR” output from Windows and then artificially manipulates it to create more contrast where none existed previously. This doesn’t mean the game is actually using HDR natively, though – it just means the TV takes the SDR->HDR image sent to it and throws some additional post-processing effects on top of it to create a (technically) fake but more “vivid” HDR experience.

SDR (game) → HDR (Windows) → Vivid HDR (TV) does not mean the game itself makes use of HDR natively.


This means the game is running in exclusive fullscreen mode, which allows it to override the display mode to a SDR one.


Turn off RivaTuner Statistics Server, then it works. Also, launch the DX12 executable directly; don’t use the launcher.

Great…
Here I was studying servers, only to stumble upon complete greatness… Can’t wait to dabble with what you’ve given us in this early D3D12 build!!

Thank you & will keep ya posted~!


Does that mean this release should only be used for Control, or that you implemented code that detects Control and makes game-specific changes for it?

Also, as an aside, are any of the game-specific releases of SpecialK still recommended over the latest SpecialK? If so, which ones should still be used (or, if it’s easier to answer, which ones are obsolete)?

Yes.

If the latest version(s) of Special K don’t work, use the game-specific releases.

Nobody really has time to track game compatibility for every version of Special K, so it’s anyone’s guess whether a specific game will work or not. Typically, newer versions of Special K don’t see as much time spent on guaranteeing compatibility with older titles, so some features or tweaks of game-specific versions might be broken in newer versions of Special K as a result of who knows what change over the years since.

@Kaldaien BFI and HDR are like water and oil :). Unfortunately, LG OLED is about as good as it gets in terms of motion handling, HDR, and high refresh rates. BFI does make things pretty clear in motion, but to be honest, sapping all the brightness out of the image completely ruins it for me. I didn’t buy an LG OLED and an Acer X35 (one of the only computer monitors that can do proper HDR) to have a dim, low-contrast image :slight_smile:

OLEDs are plenty bright. Even with BFI turned on, it’s a good 3 or 4x brighter than my plasmas.

HDR on OLED is not about being really bright, it’s about having detail near black and true local highlights (e.g. not backlight zones). You don’t need a super bright display for that :slight_smile:

And that pure black is just :melogold:

I friggin love pure blacks! More than I love the full local dimming array of my Acer monitor.

Works quite well with CP77. The only thing I have problems with is vsync/2 and sawtooth-like latency.

Running 75 Hz with a 37.5 FPS limit.

PresentationInterval (more accurately any non-zero value) is effectively a no-operation until you turn off “Drop Late Frames”.

If you are trying to accomplish 1/2 refresh V-Sync, you need to enable buffer sequencing. SK’s default setup disables sequencing and always displays the newest frame.


So I can run regular vsync at 64–77 FPS with ~72 ms average latency, vsync/2 (after disabling Drop Late Frames) at 102 ms, or no vsync + tearing at 63 ms in my test. Thanks!

I don’t think Max Device Latency 0 does anything meaningful. There’s a minimum value of 1, the only reason the UI lets you select 0 is because -1 is reserved for application-default.

I should probably fix that :slight_smile:


If you right-click the Waitable SwapChain button, there are a few super secret values you can tune that control latency bias :male_detective:

Setting both to 1.0 will aggressively reduce the render queue depth, 0.5 is a good compromise between stutter reduction and latency. These values aren’t saved, and will probably be a slider of some sort if they stick around.

I haven’t documented that stuff yet, it’ll be explained in more detail when I do LDAT-based testing later this month.