Topic-Free Mega Thread - v 1.11.2020

Tested again: I lose controller rumble when running SK with native input, without Steam.

You can disable SK’s use of the gamepad input APIs:

In this case it’s probably related to the Raw Input bug that Cyberpunk 2077 had. The Cyberpunk Engine Tweaks mod included a fix for that, which was necessary before CDPR fixed it properly themselves (in the v1.05 patch, I believe).

The Raw Input bug broke all forms of virtual input, including those of Steam Remote Play, Parsec, and many others.

Edit:
Here’s the original code with the bug:
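In plain Win32 C++, the buggy pattern looks roughly like this; a reconstruction of my own based on the analysis quoted further down, not the game’s actual decompiled code:

```cpp
#include <windows.h>

// Sketch of the bug: the game fetches the raw input packet, then asks
// the *device* for its type instead of reading the packet's own header.
void HandleRawInput_Buggy (LPARAM lParam)  // called from WM_INPUT
{
    RAWINPUT raw  = { };
    UINT     size = sizeof (raw);

    if (GetRawInputData ((HRAWINPUT)lParam, RID_INPUT, &raw, &size,
                         sizeof (RAWINPUTHEADER)) == (UINT)-1)
        return;

    RID_DEVICE_INFO info = { };
    info.cbSize          = sizeof (info);
    UINT infoSize        = sizeof (info);

    // BUG: virtual/injected input (Steam Remote Play, Parsec, etc.) has no
    // real device behind it, so raw.header.hDevice is NULL, this call
    // fails, and the game bails out without ever processing the event.
    if (GetRawInputDeviceInfoW (raw.header.hDevice, RIDI_DEVICEINFO,
                                &info, &infoSize) == (UINT)-1)
        return;

    if      (info.dwType == RIM_TYPEMOUSE)    { /* process mouse input    */ }
    else if (info.dwType == RIM_TYPEKEYBOARD) { /* process keyboard input */ }
}
```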

Did anyone manage to get Kingdom Come: Deliverance to work with Special K? As soon as I try launching it with SKIF enabled, the game just crashes back to the desktop :frowning:

Try a local install instead.

20.12.29
========
+ Added support for switching between D3D11/D3D12 on-the-fly in Serious Sam 4

 >> Should assist with D3D11On12 and Ansel compatibility in normal games that
      don't support this bizarre render API hotswap feature

+ Fixed framerate uncap in NieR: Automata

20.12.28
========
+ Disabled auto-update checks on all game-specific plug-ins built into SK

SpecialK64.7z (7.7 MB) SpecialK32.7z (6.5 MB)


Lovely as that engine is, I am curious whether the API was ever designed to work that way, but somehow it apparently does. :smiley:

Guessing there are some downsides, though, or there would be less of this dual-binaries approach, or exiting and starting the program again to switch, which I presume is a lot cleaner.
(And presumably makes other programs go ??? a bit less as to what just happened. :smiley: )

EDIT:
D3D11On12 is what I expected most current D3D12 titles to be relying on, unless coded explicitly for D3D12 only with no D3D11 or interop code (or whatever it’s called), so that should help overall compatibility from the sounds of things. :slight_smile:

Overlays and whatnot too, as another improvement.
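For reference, the interop layer is created through D3D11On12CreateDevice, which wraps an existing D3D12 device and command queue in a D3D11 device. A minimal sketch of my own of that call (not code from SK or any game):

```cpp
#include <d3d12.h>
#include <d3d11on12.h>

// Wraps an existing D3D12 device/queue in a D3D11 device so D3D11-based
// code (overlays, UI, capture tools) can interoperate with a D3D12 game.
HRESULT CreateInteropDevice (ID3D12Device         *d3d12Device,
                             ID3D12CommandQueue   *d3d12Queue,
                             ID3D11Device        **d3d11Device,
                             ID3D11DeviceContext **d3d11Context)
{
    IUnknown *queues [] = { d3d12Queue };

    return D3D11On12CreateDevice (
        d3d12Device, D3D11_CREATE_DEVICE_BGRA_SUPPORT,
        nullptr, 0,     // default feature levels
        queues,  1, 0,  // the D3D12 queue(s) to share, node mask 0
        d3d11Device, d3d11Context, nullptr );
}
```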

I stare at a lot of disassembly, and all I can say is: WTF is any of that? :stuck_out_tongue: Those pseudo-variable names, comments on branches about variables I don’t see declared or used anywhere. That’s meaningless trash :slight_smile:

More details:

I had a feeling it was something to do with raw input, so this looked promising. Loaded the game in IDA Pro, which is a very popular disassembler/decompiler and is what’s seen in my screenshot. Checked if the game used GetRawInputData and it did - once. I looked around the code where it was being used with the decompiler so I didn’t have to read assembly. Saw another call to GetRawInputDeviceInfoW nearby and googled it - GetRawInputDeviceInfoW function (winuser.h) - Win32 apps | Microsoft Learn

The game seemed to be using this function to determine whether the current raw input event being processed from WM_INPUT was keyboard or mouse. Used x64dbg again to break on this and noticed the function was returning an error the game wasn’t handling, so the game never determined whether it was keyboard or mouse and stopped handling the event. I still don’t know exactly why, but the HANDLE GetRawInputData was returning, which was being passed to GetRawInputDeviceInfoW, is zero / invalid - causing the bug. Then in the MSDN docs I noticed the RAWINPUTHEADER returned from GetRawInputData already contains the info to determine keyboard/mouse - RIM_TYPEMOUSE (0) / RIM_TYPEKEYBOARD (1).

So they never needed to use GetRawInputDeviceInfoW in the first place. The hex change just makes the game ignore what was returned from GetRawInputDeviceInfoW and instead use the info already obtained from GetRawInputData - doable with pretty basic x86 assembly knowledge.

From https://www.reddit.com/r/cyberpunkgame/comments/kb73fr/fix_for_virtual_input_not_working/gfht16e/?utm_source=reddit&utm_medium=web2x&context=3
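In code terms, what the quoted fix describes boils down to something like this; a sketch of my own, since the real fix was a small byte patch rather than a recompile:

```cpp
#include <windows.h>

// The RAWINPUTHEADER that GetRawInputData fills in already identifies the
// input type, so the failing GetRawInputDeviceInfoW call can be dropped.
void HandleRawInput_Fixed (LPARAM lParam)  // called from WM_INPUT
{
    RAWINPUT raw  = { };
    UINT     size = sizeof (raw);

    if (GetRawInputData ((HRAWINPUT)lParam, RID_INPUT, &raw, &size,
                         sizeof (RAWINPUTHEADER)) == (UINT)-1)
        return;

    switch (raw.header.dwType)
    {
        case RIM_TYPEMOUSE:    /* process raw.data.mouse    */ break;  // 0
        case RIM_TYPEKEYBOARD: /* process raw.data.keyboard */ break;  // 1
    }
}
```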


Hahahah, yeah. The whole thing is so silly. You just look at the header to know where the data comes from, not the freaking device that generated it, lol. Talk about inefficient.

That little ‘Raw Input’ red/green light that blips when there’s input on kbd / mouse would have very high overhead if it were doing anything more than just looking at the header as the game processes RawInput.



In any event, this game doesn’t use SteamInput, so none of this matters to me :stuck_out_tongue: There’s still no way to block input to SteamInput (since it’s not even hosted in the same process as the game is), so games that use it will always register mouse clicks and keyboard input and other obnoxious things when SK’s control panel is in use.

Making matters worse, Valve disables XInput when a game turns on SteamInput. So I can’t use my gamepad to interact with SK.

I want to murder SteamInput.


Is there a page we can visit to always get the latest SpecialK64.dll, or do we just need to wait for your posts?

Wait for the post. Usually those uploaded DLLs are for testing and not intended for the public.


For now it’s either this topic or the D3D12 status topic, but hopefully it won’t be too long until 0.11.1 is available directly as a download over the existing version.

Alternatively, new builds are also posted in the Discord channel and its discussions.

EDIT: Ah, and yeah, it’s kind of a beta or work in progress, so there are a few builds here and there.
As far as I know it’s always compiled against the latest code and changes, so there are no separate branches or specific variants for compatibility or for testing some specific issue or game - just the latest code overall, rolled out less tested and more experimental. :smiley:

EDIT:

Isn’t Proton or something like it open, such that one could look into how the Steam Input API is routed, should there be a need to hook it and set some options?

Or would that have to go through the Steam client, Steam overlay and such? Hmm, I guess it would be outside of the game process at least, as mentioned, and have nothing to do with Windows OS functionality like XInput, since it’s custom.

EDIT: Speaking of gamepads, I wonder what Special K does with AMD ReLive in that regard. It presents a custom XInput Xbox gamepad, I think, but I don’t use ReLive, so its little emulator thingy is not something I’ve investigated.

It’s one of those things added in 20.5.1, so it’s been there for a while.

(That, and the also forcibly installed OpenVR driver and feature set, tend to be mysteriously removed after driver updates. ~ )

ReLive and the overlay are something I tend to stay away from; I preferred the older modular approach, but at least it can be handled with some tweaking.
(Same as with NVIDIA, though: don’t remove or clean out something that another, actually useful or important part of the driver, or something the user wants to utilize, outright requires, or things break.)

EDIT: Should check on that widget too, come to think of it.

GCN
RDNA1
RDNA2

All differ a bit in ADL, given AMD’s lack of documentation and its habit of not providing info early enough, or to smaller partners and such.

Guessing general sensor data should work, but specifics like the fans won’t be picked up, if there’s any monitoring for those in the GPU widget and the other widgets.

I’m sorry, but what does that mean exactly?

Installing Special K for the game alone, rather than using global injection. See the page below on the wiki:


Has anyone had experience with a game that appears visibly jittery even though its fps/frame-time is reported as stable?
This is the first time I have had this issue. It doesn’t really affect gameplay, but I’m curious whether there’s a way to get rid of it.

If you look at the above video and focus on the background hills/mountains, you can see that there are some small stutters.
However, the game’s framerate is a stable 60 fps with no frame drops.

This issue happens regardless of the frame limiter, so it’s not caused by Special K.

Both my CPU and GPU usage are ~50%-60%.

Is DLSS in use?
From what I am reading, the effect causes some dithering issues in the mountain areas, so I wonder if that’s what is interfering and showing up here; it might look like it’s jittering, like a bad implementation of temporal AA being overly noisy or “shaky”, for lack of better wording.

https://preview.redd.it/733bn6pvtxl51.jpg?width=3264&format=pjpg&auto=webp&s=bc736f4b770252592e705bea3696eed00d0312d5

Hard to find a good example of it; it’s probably best seen in motion, but it’s a pattern effect.

EDIT: This is something a recent update introduced; something in the implementation got worse.
There are other smaller glitches and some artifacts, and now there’s this on the distant mountain terrain.

If it’s wobbling or moving, it’s going to look jittery. The best I can compare it to is very poor TAA techniques with visible graphical glitches, the worst being when the object appears to be moving or shifting, almost like the 3D geometry itself is stuttering or, well yeah, jittering.
(Vegetation in the recent Assassin’s Creed games comes close to the TAA ghosting issue, like SMAA T2X at low framerates.)

EDIT: Guessing the limiter in Special K can also be tuned, if these are very brief, short stutters or delays, by having it update even faster so you can catch, what to call it, sub-1ms micro (macro?) stutters perhaps.

That is, if this is entirely unrelated to graphical effects or glitches giving the impression of stuttering. :slight_smile:
I think that used to be more of an inherent multi-GPU issue with SLI/CrossFire and the frequent choice of alternate frame rendering as a way to optimize performance across the two or more cards, with some drawbacks in turn.

It’s not quite the same, but a similar feeling of stuttering or micro-stutters going on, which might not show up on the FPS limiter depending on how frequently it’s updating.


When I take a closer look, the issue is actually more like what you described; only distant background objects, like mountains, appear jittery/shaky. The main character does not jitter at all.
So these jitters/stutters are actually graphical glitches and not related to fps or frametime.
Though I tried disabling DLSS and anti-aliasing, the issue still persists. I guess this is probably just how the game was made, and there is nothing the end user can do to get rid of it.

Modern games are complex… Now I don’t know if this is the case with Death Stranding, but you also have Variable Rate Shading on the RTX 20 series and later, which can dynamically adjust the shading quality of the game based on the game world being rendered: basically decreasing shading quality dynamically for objects in motion or in the distance, while using the “full shading quality” for objects in focus or clear view.
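As a rough illustration of the idea, a minimal sketch of per-draw VRS on D3D12 (my own example, not from Death Stranding or any particular game):

```cpp
#include <d3d12.h>

// Per-draw Variable Rate Shading (Tier 1); assumes the device reports
// support via D3D12_FEATURE_DATA_D3D12_OPTIONS6::VariableShadingRateTier.
void RecordSceneWithVRS (ID3D12GraphicsCommandList5 *cmdList)
{
    // Quarter-rate shading (one shader invocation per 2x2 pixel block)
    // for distant or fast-moving geometry...
    cmdList->RSSetShadingRate (D3D12_SHADING_RATE_2X2, nullptr);
    // ... record draw calls for distant objects here ...

    // ...then back to full-rate shading for objects in focus.
    cmdList->RSSetShadingRate (D3D12_SHADING_RATE_1X1, nullptr);
    // ... record draw calls for foreground objects here ...
}
```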

Similar optimization techniques might also be used in the typical post-processing effects, especially when combined with temporal optimizations that take multiple frames into account.

So yeah, modern games are complex…

Since it only seems to affect distant objects, I would assume it’s due to something that is either temporal or dynamic in nature and causes those jitters as a result of its processing.

Yeah, another one would be screen-space effects and solutions that might be flawed in how they come across visually, or however I should describe it.

Lower-precision shader effects or lower-resolution effects can also be inherently noisy, and with temporal anti-aliasing and effects re-using previous frames there can be some of what I tend to call dithering: noisiness or a lack of detail.

Some games even upscale data from a lower resolution, though on PC the actual framebuffer resolution scaling tends to be a toggleable option.

It doesn’t come through too well in static screenshots, but temporal AA is what I have the most trouble with: vegetation and finer details almost look like they’re shifting or vibrating, even at higher framerates or when trying to offset the effect with resolution supersampling, to limited success.

Graininess and ghosting are other issues. Overall it differs a bit from title to title how bad it looks and how heavy some of these optimizations are, weighed against the reduced visual quality.

NieR: Automata has some of this, come to think of it, along with the discussion of its screen-space effects; namely a fixed-resolution ambient occlusion effect, from memory, and especially the sun rays in some scenes.

Anyway, the sum of it is that a lot of effects together make for a very rough final image, depending on how sensitive the user is to this. :slight_smile:

And with all the shader effects modern game engines throw in, plus optimizations for performance and all that, it’s not always easy to resolve; even brute-forcing high-resolution supersampling tends to have limited success, especially weighed against the massive performance decrease.
(Full integer scaling is preferred, so 2x2 or higher, which already turns 1920x1080 into 3840x2160 - four times the pixels - and a far heavier GPU workload.)