How to limit the FPS in DX12 games using SK?

Is there a catch-all set of parameters to enable just the frame rate limiter in DX12 games? Something that I can just copy and paste to the INI for testing?

I know SK isn’t compatible with most titles running in D3D12, but I’d still like to try to cap the FPS using that solution first, before moving on to NVCP and RTSS.

I’ve just tried the Horizon INI file with Dirt 5 (which is D3D12), but the game wouldn’t even start, and Special K’s MGS alert sound popped up every time. Ghostrunner in D3D12 (with Horizon’s settings) starts, but behaves as if no changes were made at all.

These are the settings I was using on Horizon: SpecialK.ini (513 Bytes)
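Something along these lines is what I’m after, if the syntax is anything like this (a sketch on my part; the `[Render.FrameRate]` section name is my guess at SK’s config layout, so please verify it against a current build’s documentation, and `TargetFPS` is the limiter parameter I’ve seen referenced):

```ini
; Hypothetical limiter-only SpecialK.ini fragment -- section/key names
; are assumptions, not confirmed against a current SK build.
[Render.FrameRate]
TargetFPS=60.0
```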

I actually do want to know the status of Dx12 with SK. New games are preferring it to the now-ancient Dx11, and Vulkan so far seems to be a really hard sell to game devs.

And while I can understand Kaldaien’s position on Dx10, I personally would have liked to see it included in SK, given how a number of popular-ish games are DX10-exclusive or have obvious advantages running in it over Dx9 (and lack a Dx11 mode).

I don’t want to mess with D3D12; that has nowhere to go but making already unstable D3D12 games even more unstable. As soon as I see some improvement in the stability of D3D12, I might change my mind. Otherwise, I play all games in D3D11 :slight_smile:

As for D3D10… it’s more or less an identical API to D3D11, except for some weird reference-counting changes. I don’t have time to memorize those changes (nobody does), so it goes ignored.


Hmm, interesting remark. Other than the first wave of what felt like rushed D3D12 support across the initial range of games to get the API, like Deus Ex: Mankind Divided, I can’t recall any general instability.

Games themselves can be buggy and have all kinds of issues, and there are limitations and problems with how Vulkan or D3D12 has been implemented, but in terms of stability it’s been solid, though these APIs demand that the user’s system hardware and software be robust and stable.

Going to find out real fast in games like Battlefield or Tomb Raider and the new Watch_Dogs whether the system is holding up or not. :stuck_out_tongue:

Horizon Zero Dawn managed to keep a decent VRAM overclock before hitting framerate drawbacks (error correction). Watch_Dogs Legion in turn only has a 25 / 50 MHz (double data rate) range, and once above that it will crash, and fast.
(This GPU is not a model of quality when it comes to VRAM, but then the AMD Navi / 5000 series has its quirks here, and a few of them at that.)

Pretty good for stability testing in that way, though I am curious what they’re doing with some of these games that makes them so sensitive to otherwise recoverable errors.
(Newest PC patch also resolved some Ampere and RTX specific software issues though the errors for that are a bit different.)

Nothing really wrong with D3D11, though. D3D12 can help with performance, but it doesn’t feel like it’s quite at the level yet where that’s fully leveraged. On capable hardware there’s now D3D12 Ultimate functionality and the newer D3D12_2 feature level, for the moment seen in features like DXR, currently through RTX, for ray-tracing capabilities.
(Eventually sampler feedback, variable rate shading, and mesh shaders should be a thing, and then there’s however PC DirectStorage turns out, too.)


It may well have improved by now; I just had such a bad experience with all of Capcom’s games, and all the others that were experimenting with D3D12, that I wanted nothing to do with it after that.

I’ve always had a top-tier CPU and framerate targets between 60 and 100 FPS, so I never really needed D3D12 for anything anyway, and I enjoyed the increased stability and greater modding potential that came with D3D11. Unfortunately, DXR is not going to work with D3D11, so when ray tracing stops being a gimmick and becomes something I really want in my games, I’m going to be forced to start using D3D12.

RT right now is just too flashy for my tastes. They’re using it the same way 3D was used when that was popular in films: rather than producing something natural, it’s all about showcasing what’s possible, even if what’s possible isn’t realistic (i.e. everything’s absurdly shiny).

Mmm, the early ray-tracing implementations will, I expect, follow NVIDIA’s model: focusing on particular effects, mixing rasterization and cube-mapping with ray tracing of (as far as I can tell) very low quality but real-time capable, then running it through a de-noising routine for a decent final result.

Early support from AMD and NVIDIA for the APIs themselves, with D3D12 and Vulkan, is also still catching up to the newer SDKs and extensions while mixing in bug fixes. It feels like AMD’s Windows drivers have improved but aren’t on NVIDIA’s level, though both have open unresolved issues, unsupported features, and recommended best practices for how to handle certain tasks here.

AMD’s RDNA1 / Navi10 driver support was something, alright; running almost everything through DXVK became a standard and can still be a big performance and compatibility improvement. While it took much longer than it should have, the driver situation now is passable enough that I could almost recommend looking into AMD as a GPU alternative, especially if RDNA2 / Navi20 (6000 series) is capable of matching NVIDIA in performance, or coming close, at a lower cost.

CPU-wise, AGESA for Zen3 seems early too, with issues and known problems; memory stability even on Patch C variants is also iffy, including on Zen2 CPUs, which is really good to know for XMP compatibility.
(Fixes are already confirmed to be coming, and more features too, in newer AGESA code and later BIOS updates using that code.)

JRPG fan here, so I’m not exactly unfamiliar with some very rough and strange behavior, and some of the PC ports hit issues I don’t think were ever intended to be possible.

Monster Hunter World itself went through a couple of major revisions, and it took until Iceborne for the content to sync up with the console version; even then a few more updates were needed to smooth out the issues and quirks with the PC port, in D3D11 and the later-introduced D3D12. I think they even revised the initial HDR implementation. And then there’s this stuff developers or the publisher insist on, with anti-debugging and anti-tracing code and custom attempts, which in that game’s case… throw it out and be rid of it. :stuck_out_tongue:
(Seriously, it lowers CPU temps, decreases CPU utilization, and improves CPU performance with zero drawbacks once that checking routine is properly disabled.)

Some of the studios from SEGA and others on D3D12 or Vulkan are going to be something, alright; Koei too, but bringing them up is kinda cheating, since Square, Capcom, and others might be inconsistent but they’re… unique. :smiley:

EDIT: Ooh, that was a bit lengthy.
It’s a big topic, though, with a lot of upcoming changes and unknowns, and maybe this next form of support could be really rough at first, despite developers having now had a few years to get the hang of how this is supposed to work, plus improvements to both APIs over time.

I suppose that’s what it comes down to: the low-level APIs allow more, so you can never really know the results and how it’ll end up. D3D11 kinda handled or resolved things as it should, and it’s mostly when wrapped or otherwise handled in a less standard way that that stuff gets exposed.

Of course, discussing the intricacies of the graphics API sorta falls apart when there are problems with less complex, more standardized stuff, like the input API and how often that breaks or is done wrong in one way or several. So yeah… it might not be entirely unwarranted to be a bit concerned about how things are going to be handled, ha ha.

“Your mouse is now an emulated gamepad.” So many things that can go wrong, and so many other issues as well.

No wonder AMD and NVIDIA probably have a ton of back end stuff for D3D12 and Vulkan API support just in case. :smiley:

The Steam API, I guess, could be used as another example, and there’s probably other stuff too that should be simpler but breaks in the strangest ways.

EDIT: At least the way things currently are, full native D3D12-only support isn’t a big thing yet, which for backwards compatibility is both good and bad, compatibility- and performance-wise. :slight_smile:
(And D3D11 support isn’t going anywhere anytime soon.)

EDIT: Getting back to the framerate limiter itself, though: that brings a host of other issues, like timing and uptime considerations for the clock cycle and counter being used, problems with overshooting or undershooting the target framerate, and then latency and input-delay issues on top of it all.
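The overshoot/undershoot problem mostly comes down to how the limiter waits out the remainder of each frame. A rough sketch of one common approach (this is not SK’s actual implementation): a coarse sleep followed by a spin-wait on a high-resolution counter, with each frame scheduled against an absolute deadline so small timing errors don’t accumulate into drift:

```python
import time

def frame_limiter(target_fps: float):
    """Generator that paces iterations to target_fps using a hybrid
    sleep + spin-wait (a sketch, not Special K's implementation)."""
    frame_time = 1.0 / target_fps
    next_deadline = time.perf_counter() + frame_time
    while True:
        yield
        # Coarse sleep: leave ~2 ms of margin, since OS sleeps can overshoot.
        remaining = next_deadline - time.perf_counter()
        if remaining > 0.002:
            time.sleep(remaining - 0.002)
        # Fine spin-wait on a high-resolution counter for the last stretch.
        while time.perf_counter() < next_deadline:
            pass
        # Schedule against the previous deadline, not "now", so per-frame
        # errors don't accumulate into long-term drift.
        next_deadline += frame_time

# Usage: call next(pacer) once per frame, after rendering.
pacer = frame_limiter(60.0)
for _ in range(3):
    next(pacer)  # render the frame here
```

The deadline-based scheduling is the important part: sleeping a fixed amount "from now" each frame lets jitter pile up, while targeting absolute deadlines keeps the long-run average locked to the target framerate.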

There’s always something.

That’s a full 150 games that you’re leaving out!

The majority of modern DX12 implementations are extremely good.

At most, it’s only 20.

Did you reply to the wrong post? Discourse has your reply flagged as one directed to my DX10 post, but I feel your message is more applicable to the DX12 topic at hand?

Nope, it was the right post. You posted a bunch of multi-API games and then claimed that 150 of them wouldn’t work in Special K because they were D3D10, ignoring the fact that the game supports D3D9 and D3D11 as well :stuck_out_tongue:

Ah, my bad, that went completely over my head. Sadly, the backend that PCGW uses to create our lists doesn’t allow us to set up a “NOT EQUAL TO x” condition, so a list of exclusively-D3D10 games isn’t possible.

But yes, almost all of the games also support another API.

It does not change the point that many of the games that are Dx10-exclusive, or have Dx10 as their latest API, are games that people want to play, at least from time to time. I have counted at least 35 games in that category. As relegated as Dx10 appears, it is not an API that nobody in the mainstream gaming sphere has bothered with.

I’m interested in this too. I run Borderlands 3 in DX12, and without the layout I just configured TargetFPS=60.0.
Also, how can I make the game start with Special K, like using the feature from the layout that installs the DLL for a specific game?