Topic-Free Mega Thread - v 1.11.2020

Uh…

10/29/2020 14:03:02.750: C:\Program Files (x86)\Ubisoft\Ubisoft Game Launcher\games\Watch Dogs Legion\bin\DuniaDemo_clang_64_dx11.dll:  In memory integrity check failed. This may be caused by debug breakpoints or DLL relocation.
10/29/2020 14:03:04.061: C:\Program Files (x86)\Ubisoft\Ubisoft Game Launcher\games\Watch Dogs Legion\bin\DuniaDemo_clang_64_dx11.dll:  Crypto++ DLL integrity check failed. Actual MAC is: 7649EC17B995B48BC8CA10AC705FF48E63BBBD4D

Can we get rid of this stupid DRM please? It’s breaking without me needing to do anything.


Additionally, the uPlay overlay is ripping itself a new one:

10/29/2020 14:06:45.945: [Input Mgr.] [!] > First Call:         XInputGetState9_1_0_Detour
10/29/2020 14:06:45.946: [DLL Loader]   ( overlay64.dll                ) loaded '                                                                                                       XInput1_4.dll' <LoadLibraryExW> { '           Initialize' }
10/29/2020 14:06:45.946: [  Input   ]   >> Hooking XInput 1.4
10/29/2020 14:06:45.987: [Input Mgr.] [!] > First Call:           XInputGetState1_4_Detour
10/29/2020 14:06:45.988: [Input Mgr.] WARNING: Third-party module 'overlay64.dll' uses different XInput interface version (XInput1_4.dll) than the game (XInput9_1_0.dll); input reassignment software may not work correctly.
10/29/2020 14:06:45.988: [Input Mgr.] [!] > First Call:         XInputSetState9_1_0_Detour
10/29/2020 14:06:45.988: [  Input   ] WARNING: Recursive haptic feedback loop detected on XInput controller 0!
10/29/2020 14:06:45.988: [  Input   ] WARNING: Recursive haptic feedback loop detected on XInput controller 1!
10/29/2020 14:06:45.988: [  Input   ] WARNING: Recursive haptic feedback loop detected on XInput controller 2!
10/29/2020 14:06:45.988: [  Input   ] WARNING: Recursive haptic feedback loop detected on XInput controller 3!

^^^ This happens because the damn overlay does the following:

10/29/2020 14:03:04.336: C:\Program Files (x86)\Ubisoft\Ubisoft Game Launcher\overlay64.dll      :  D:\JenkinsWorkspace\workspace\client_build_installer\client\products\overlay\mg\overlay\win\PlatformHooks.cpp(132) : 'xinput9_1_0.dll' hooked
10/29/2020 14:03:15.454: C:\Program Files (x86)\Ubisoft\Ubisoft Game Launcher\overlay64.dll      :  D:\JenkinsWorkspace\workspace\client_build_installer\client\products\overlay\mg\overlay\win\PlatformHooks.cpp(132) : 'xinput1_4.dll' hooked

Pick one please, not both (!!)


Wow… but kaldaien, I thought you tried to brick people's OSes and magically fry their routers and the electrical circuits in their entire house with your DRM-supported code in Special K!!!

The text above is a clear satirical joke. I unfortunately have to spell out that it is satire, because people are stupid.

Yeah, I have nothing against DRM itself, but the implementation methods over the years have been seriously frustrating and boggle the mind.

The more I learn, the more I feel I should ask how the game manages to work at all. Pah, and then Ubisoft caused that problem on the Steam forum for AC Odyssey, when Special K was at least somewhat resolving some of this, to an extent.

Sigh, it couldn't just be BattlEye, which from what I'm now reading can be resolved on its own via the command line. It had to be a bunch of client, overlay, and game-specific compatibility issues, plus problematic resolutions to backwards compatibility and who knows what else.

At least Koei Tecmo are just generally inconsistent, although I suppose a fairer comparison would be to the stuff Capcom jammed into Monster Hunter World over time, plus a bunch of additional issues on top of the debugger stuff and whatnot.

That at least explains the one line during the prologue mission about the annoyance of anti-tamper; some cheeky developer snuck that one right in, ha ha. :smiley:

Question, @Kaldaien: are you using DX11 mode and Global Injection mode for SK with Watch Dogs Legion?

On a side note, I know people yesterday had their game unlocked and playable around 4 PM Eastern time. I was like, wtf. Mine didn't unlock until this morning. I do know a few who bought the collector's edition too, lol.

Edit: Noticed the game has HDR support. Does it work in DX11 mode?

Edit: Lovely. The game doesn't detect the HDR mode I have turned on. Does it only work for Nvidia?

When you say that FreeSync/G-Sync has little potential to improve smoothness, what exactly do you mean by that? Wouldn't you need to always hit the specified refresh rate to avoid screen tearing or weird micro-stutters?

No, you don't get tearing just by virtue of having V-Sync on, lol. Micro-stutter is eliminated by frame pacing: basically, a functional framerate limiter.


“Drop Frames” does not mean display an old frame twice, it means remove queued frames from the render queue. If you don’t turn that mode on, then you eventually start to accumulate 3+ frames worth of undisplayed frames in the render queue – input latency gets out of control.
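To make the distinction concrete, here is a toy model of that behavior (purely illustrative; `RenderQueueModel` and its members are invented for this sketch and are not Special K's actual code):

```cpp
#include <cstddef>
#include <deque>

// Toy model: the CPU submits frames faster than the display consumes
// them. With dropLate off, queued-but-undisplayed frames accumulate and
// every queued frame is one extra frame of input latency. With dropLate
// on, old queued frames are removed from the queue -- not shown twice,
// they simply never reach the display.
struct RenderQueueModel {
    std::deque<int> queued;   // frame IDs waiting to be displayed
    bool dropLate;

    explicit RenderQueueModel(bool drop) : dropLate(drop) {}

    void submitFrame(int id) {
        queued.push_back(id);
        if (dropLate) {
            // Keep only the newest frame; older ones are dropped.
            while (queued.size() > 1)
                queued.pop_front();
        }
    }

    // One vblank: display the oldest queued frame, if any.
    int presentOne() {
        if (queued.empty()) return -1;
        int id = queued.front();
        queued.pop_front();
        return id;
    }

    // Frames of input latency currently sitting in the queue.
    std::size_t backlog() const { return queued.size(); }
};
```

Submitting two frames per frame displayed, the backlog (and so the latency) grows without dropping, but stays pinned near zero with it on.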

So supposedly, if I don't use G-Sync, I should leave the framerate limiter at the decimal value, e.g. 59.995/119.95 Hz, and have Drop Late Frames ticked?

I still get microstutter whenever the FPS doesn't match the refresh rate, though, but that's normal, right?

Also, with this setup, is the reported render latency of 1 frame correct?

You want to match the 1% low value in most games to eliminate stutter. For example, in WDL, if I set the limiter to something like 144 I get microstutter. If I set it to 30, it removes all stutters pretty much 95% of the time. Almost flawless. You need to use the lowest FPS value your game reaches to get flawless frame pacing. And even at 30 FPS my game still runs like it's at 60 or so, with the settings Kaldaien is telling you to activate.

Microstutter can also happen from the SSD/HDD queueing up data, CPU spikes, or anything of the sort. In WDL I had to disable KB/M in SK, and almost all stutters were gone because of that as well. Even smoother. I use a PS4 gamepad for that game.

I could not tell you. The number of times my framerate is not exactly my refresh rate is too small to measure this ;-\

I’m always delivering frames within about 0.1 ms of VBLANK, which is consistent enough to turn on Black Frame Insertion and never think twice about image artifacts.

Hmm, I don't know. If I set my game to a 60 FPS frame limiter, which it always hits, I get consistent jerking when moving the camera/character.

Disable KB/M in SK as well, if you are flawlessly hitting 60 FPS. I use gamepad only; that got rid of a ton of microstutter. It's the same with FFXV in terms of interrupted input lag.

Actually, could it be a refresh rate/FPS mismatch? I don't seem to get it if I enable G-Sync, but that has the near-black gamma bug on LG. Sadly, I'm too sensitive to use BFI on the LG OLED, as the C9 only has the 60 Hz mode.

EDIT: Interesting. Targeting an even multiple of my current refresh rate with the frame rate limiter when G-Sync is disabled eliminates the weird hitching, but of course actual latency/motion smearing is much, much worse.

It is? It should be the other way around. G-Sync is basically a piece-wise thing:

  1. Present Rate < (Refresh Rate - Render Queue Limit) → Variable Refresh
  2. Present Rate >= (Refresh Rate - Render Queue Limit) → VSYNC

You can turn off the behavior in 2, but you will get screen tearing. Latency reduction comes via scenario 1, latency increase is observed in 2.

The latency added by VSYNC capping your framerate is much higher than the latency removed when G-Sync is letting you send frames to your display at sporadic intervals. Worse still, Special K can use a latency waitable swapchain to remove the latency penalty of VSYNC in a fixed-refresh screen mode, but G-Sync doesn’t work with latency waiting. That means your render queue’s going to immediately fill up completely (3-5 frames) as soon as you hit refresh rate and VSYNC caps your framerate.

Latency will come back down as soon as you start running into performance problems that cause you to render < Refresh Rate, but this swing of 3-5 frames worth of input latency as you move around is very unpleasant :stuck_out_tongue:

You either always want to be below V-Sync performance targets when using G-Sync, or find a better solution (i.e. FastSync + Framerate Limiter).
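The piece-wise behavior described above can be sketched as a tiny function (illustrative only: the names and the threshold expression simply mirror the two cases listed earlier, not NVIDIA's actual implementation):

```cpp
// Hedged sketch of the two G-Sync regimes described above.
enum class SyncMode { VariableRefresh, VSync };

SyncMode gsyncMode(double presentRate, double refreshRate,
                   double renderQueueLimit) {
    // 1. Below (refresh - queue limit): true variable refresh; frames
    //    scan out as soon as they are presented (latency reduction).
    // 2. At or above it: behaves like classic VSYNC; the render queue
    //    fills and latency climbs.
    return (presentRate < refreshRate - renderQueueLimit)
               ? SyncMode::VariableRefresh
               : SyncMode::VSync;
}
```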

Basically my go to solution since forever. I don’t need 90+ FPS — just give me ultra-smooth framepacing at 60 FPS (VSYNC cap at 120) and I. Am. GOLDEN!

Well, I did say with G-Sync deactivated and FPS set to an even multiple, so of course latency/motion smearing would be much worse when it's only updating at 60 frames.

Non-even multiples of the refresh rate in the framerate limiter, without G-Sync, cause hitching that I believe SK captures: each time it happens, the grey vertical bar shows up in the frametime graph widget at an even interval. I wonder if that's what consoles do: use a framerate limiter set to an even multiple with VSync on. When I set the framerate limiter to 30 FPS, it feels very "console-like" if I enable motion blur.
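The even-multiple idea can be expressed as a small check (illustrative helpers, not part of Special K): a limit avoids this kind of hitching when the refresh rate divides by it evenly, so every frame persists on screen for a whole number of refresh cycles.

```cpp
#include <cmath>
#include <vector>

// True when each limited frame covers a whole number of refresh cycles,
// e.g. 30 FPS on a 60 Hz display (2 cycles per frame).
bool isEvenDivisorTarget(double refreshHz, double limitFps) {
    double cycles = refreshHz / limitFps;   // refreshes per frame
    return std::fabs(cycles - std::round(cycles)) < 1e-6;
}

// All whole-number FPS targets that divide the refresh rate evenly.
std::vector<int> evenTargets(int refreshHz) {
    std::vector<int> targets;
    for (int fps = refreshHz; fps >= 1; --fps)
        if (refreshHz % fps == 0)
            targets.push_back(fps);
    return targets;
}
```

On a 60 Hz panel this yields 60, 30, 20, 15, and so on; 45 FPS, by contrast, forces alternating 1- and 2-cycle frame persistence, which is the visible hitching.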

I assume you are referring to the latency histogram when you’re talking about gray vertical bars.

Enable a waitable swapchain to keep that under control, just be aware it doesn’t work with G-Sync.
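For intuition, the effect of a latency-waitable swapchain can be modeled with a simple counter (conceptual only; the real mechanism is the `DXGI_SWAP_CHAIN_FLAG_FRAME_LATENCY_WAITABLE_OBJECT` creation flag plus waiting on the handle from `IDXGISwapChain2::GetFrameLatencyWaitableObject`, which requires Windows):

```cpp
// Conceptual stand-in for the frame-latency waitable object: the CPU may
// only begin a new frame when fewer than maxLatency frames are in flight,
// which is what keeps the render queue from filling up under VSYNC.
struct WaitableSwapChainModel {
    int maxLatency;           // cf. IDXGISwapChain2::SetMaximumFrameLatency
    int framesInFlight = 0;

    explicit WaitableSwapChainModel(int latency) : maxLatency(latency) {}

    // Analogue of waiting on the frame-latency handle before rendering:
    // returns true when the CPU is allowed to start a frame (a real app
    // would block here instead of polling).
    bool tryBeginFrame() {
        if (framesInFlight >= maxLatency) return false;  // would block
        ++framesInFlight;
        return true;
    }

    // Analogue of the display finishing scanout of a presented frame,
    // which signals the waitable handle and frees a slot.
    void frameDisplayed() {
        if (framesInFlight > 0) --framesInFlight;
    }
};
```

With a maximum latency of 1, the CPU can never run more than one frame ahead of the display, instead of the 3-5 queued frames described earlier.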

Kaldaien, you can enable the fullscreen borderless upscale without any latency if the game already starts its window at desktop resolution, yes?
And if it doesn’t, you can untick the option and then set the resolution yourself with centering if needed?

No, any upscale operation done using windowed mode prevents you from drawing directly to the DWM. That will add at least 1 frame of latency while the DWM compositor activates and rescales the window’s image.


Ubisoft’s DRM / Anti-Debug actually breaks DXGI Waitable Swapchains :frowning:

One of their anti-debug tricks is to replace calls to CloseHandle (…) with a function that crashes when you call it. They then monitor all calls to CloseHandle (…) to see whether the exception is caught or uncaught. The exception is caught when a debugger is attached; when no debugger is attached, it goes uncaught and would completely crash the software.
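A rough portable simulation of that trick (everything here is illustrative: the real scheme patches the Win32 CloseHandle and relies on SEH and hardware faults, not C++ exceptions):

```cpp
#include <stdexcept>

// Simulated detour installed over CloseHandle: it always raises.
void closeHandleDetour() { throw std::runtime_error("deliberate fault"); }

// What happens when protected code calls CloseHandle. With a "debugger"
// (modeled as an exception handler) attached, the fault is absorbed and
// the DRM learns it is being debugged; without one, the fault propagates
// and the process dies.
bool callCloseHandle(bool debuggerAttached) {
    if (!debuggerAttached) {
        closeHandleDetour();   // unhandled -> process would crash here
        return false;          // never reached
    }
    try {
        closeHandleDetour();
    } catch (const std::runtime_error&) {
        // Debugger swallowed the fault -> detected, but no crash.
    }
    return true;
}
```

The problem described in this thread follows directly: any legitimate caller of CloseHandle, such as code closing a swapchain's wait handle, hits the same booby trap.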

I need to close the wait handle on the SwapChain, or, well, this happens:

When the software defects you deliberately introduced to make debugging a nuisance can do this to my framerate limiter, it's time to admit that the scheme has overstepped its bounds and is harming customers. I am certain that their tricks are breaking other software too, not just mine.

Regarding that, would it even introduce 1 frame of latency if SK could set the window client region and swapchain buffer up in such a way that the GPU scanout hardware performed the actual scaling?

If I understand it correctly, that would prevent DWM from scaling it and introducing latency?

  1. DirectFlip with panel fitters: Your window client region covers the screen, and your swapchain buffers are within some hardware-dependent scaling factor (e.g. 0.25x to 4x) of the screen. The GPU scanout hardware is used to scale your buffer while sending it to the display.

With windowed flip model, the application can query hardware support for different DirectFlip scenarios and implement different types of dynamic scaling via use of IDXGIOutput6::CheckHardwareCompositionSupport. One caveat to keep in mind is that if panel fitters are utilized, it’s possible for the cursor to suffer stretching side effects, which is indicated via DXGI_HARDWARE_COMPOSITION_SUPPORT_FLAG_CURSOR_STRETCHED.

Furthermore, applications should leverage IDXGIOutput6::CheckHardwareCompositionSupport to determine if they need to scale the content before presenting it, or if they should let the hardware do the scaling for them.

(from For best performance, use DXGI flip model - DirectX Developer Blog)
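The decision the quoted post describes can be sketched like this (the flag values mirror DXGI's `DXGI_HARDWARE_COMPOSITION_SUPPORT_FLAGS`; actually querying them requires `IDXGIOutput6::CheckHardwareCompositionSupport` on Windows, so this only models the resulting choice):

```cpp
#include <cstdint>

// Mirrors DXGI_HARDWARE_COMPOSITION_SUPPORT_FLAGS from dxgi1_6.h.
enum HwCompositionFlags : std::uint32_t {
    FullscreenSupported = 0x1,  // ..._FLAG_FULLSCREEN
    WindowedSupported   = 0x2,  // ..._FLAG_WINDOWED
    CursorStretched     = 0x4,  // ..._FLAG_CURSOR_STRETCHED
};

// Let the scanout hardware scale for us only when hardware composition
// is supported for the current presentation mode; otherwise the app
// should scale the content itself before presenting.
bool shouldLetHardwareScale(std::uint32_t flags, bool windowed) {
    return windowed ? (flags & WindowedSupported) != 0
                    : (flags & FullscreenSupported) != 0;
}

// Separate caveat from the quote: with panel fitters, the cursor may be
// stretched even when hardware scaling is otherwise usable.
bool cursorMayStretch(std::uint32_t flags) {
    return (flags & CursorStretched) != 0;
}
```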