Topic-Free Mega Thread - v 1.11.2020

But yeah, my own experimentation with this was back in the earlier days. AMD had a helper software called Hydra I think, and NVIDIA had Triple-Head or something (I've entirely forgotten its name). It eventually stabilized to 2 or 3 main displays through a combination of HDMI or DisplayPort connectivity in various modes.

Unlike the earlier days of 10-20 displays all split up into a giant single image, though completely divided by the larger bezel and border areas.

EDIT: What I mean is it's been a long time since I used tech like this myself.
So my own contributions come down to some general advice and suggestions.

I was referring to https://www.reddit.com/r/pcgaming/comments/b1r9g5/using_a_gamepad_on_one_screen_and_a_mouse_on/
where somebody says that it’s possible to lock input methods per screen.


Interesting!

I wonder if there's a way to lock the input to a specific screen. I know the mouse options support this, but then there are also keyboard and gamepad options.

EDIT: The way I'm thinking, if this could be solved it would keep the selected input on that display's window.

Thanks for answering so fast anyway! ;))
Would you know if Kaldaien could help me with this? Or is there anyone / anywhere else to look for info?

Yeah, it's a good read! And now I cannot stop trying to figure out how to do this with a G29. If it's doable with controllers (both Xbox and PS4) I guess it shouldn't be much crazier to do it (lock the input to a specific screen) with a steering wheel, no?

Yeah, I wish I had an actual answer, but I hope Kaldaien has a more conclusive one for when he is active again on the forums. :slight_smile:

Slime Rancher from the Reddit URL is also on Unity, which uses a form of borderless display mode, so if you end up outside of what's considered the window borders you can end up minimizing or losing focus on the window.

Part of the issue is how various games or game engines deal with input: whether they accept it only while focused or register it in general, how it's all handled, the different ways this gets implemented, and then mouse and keyboard input on top of the various approaches to gamepad input management.

I see from the example that this also builds on some of the existing SpecialK options, such as whether input stays active while the process is in the background and how the XInput controller slots are arranged.

There’s also these options.

[Input.Keyboard]
DisabledToGame=false

[Input.Mouse]
DisabledToGame=false

[Input.Gamepad]
DisabledToGame=false

You can deny input from reaching the game process entirely, though then you would likely want a convenient way to re-enable one or more of these again. Opening the SpecialK OSD to toggle input on and off could also work, but that is a bit more involved, since you have to go through the on-screen display UI and various drop-down menus for these toggles each time.

If I knew the macro functionality a bit better I'd wonder if hotkey-binding these could be an alternative, enabling and disabling them from a button combination at will, or as a toggle.
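In the meantime, a small external script bound to a system-wide hotkey (through AutoHotkey or similar) could flip these values directly in the ini. This is just a sketch under assumptions: the section and key names come from the snippet above, but everything else here is hypothetical and this is not SpecialK's own macro system:

```python
def toggle_disabled_to_game(ini_text,
                            sections=("Input.Keyboard", "Input.Mouse", "Input.Gamepad")):
    """Flip DisabledToGame=true/false inside the named [sections] of ini-style text."""
    out, current = [], None
    for line in ini_text.splitlines():
        stripped = line.strip()
        if stripped.startswith("[") and stripped.endswith("]"):
            current = stripped[1:-1]  # entering a new [Section]
        elif current in sections and stripped.startswith("DisabledToGame="):
            value = stripped.split("=", 1)[1].strip().lower()
            line = "DisabledToGame=" + ("false" if value == "true" else "true")
        out.append(line)
    return "\n".join(out)

if __name__ == "__main__":
    # Hypothetical usage; the actual ini file path depends on your SpecialK install.
    sample = "[Input.Gamepad]\nDisabledToGame=false\n\n[Input.Mouse]\nDisabledToGame=true"
    print(toggle_disabled_to_game(sample))
```

Running it a second time flips everything back, so bound to a hotkey it would act as the on/off toggle described above.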

For XInput, the slot reassignment should match the Reddit post for selecting the gamepad device, though I am a bit lost about the background-input part.

There's the window option for whether a game continues running or pauses when unfocused, but that's a bit different, I think.

This is generally as simple as

Turn on “Continue Rendering,” and then you can have keyboard/mouse input focused in another application while the game uses XInput or DirectInput for the gamepad. This doesn't work for SteamInput games, which is a large part of the reason I hate SteamInput.


It should be added that there is a performance penalty for this unless you are using the DXGI Flip Model. SK has overrides for that, since 99% of games don't use it.
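For reference, that override lives in SpecialK's config file; I'm quoting the key from memory, so treat the exact spelling as an assumption and check the SpecialK wiki:

```ini
[Render.DXGI]
; assumed key name, forces flip-model presentation when the game doesn't use it
UseFlipDiscard=true
```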


What is this program shown in the pictures?

That's the SpecialK UI; then you have the various tabs and menus for configuring it.

Bringing that up (Ctrl+Shift+Backspace, I think, is the default binding) should show the general UI with the menus all folded up but their names visible, so expanding the Window Management one should give the options seen in the first image. :slight_smile:

EDIT: Before I forget, there should also be a larger article on PC Gaming Wiki covering many of these settings and how to set up and use SpecialK.

Thanks a lot! I will check all of that out tonight or tomorrow!
Will let you know how it goes! ;)) o/

https://www.guru3d.com/news_story/displayport_2_monitors_delayed_due_to_corona_crisis.html

So it's going to be DisplayPort 2.0 then, not 1.5, but the initial monitors are delayed, citing the corona situation as the cause.

Impressive bandwidth too at 80 Gbps (HDMI 2.1 is at 48 Gbps), although other than high-refresh-rate displays I don't think much is going to come close to saturating that.
(Seems there's some compression in use currently too (DSC), which is how 1.4 manages higher refresh rates at 3840x2160 and higher display resolutions.)
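To put rough numbers on that saturation point, here's a back-of-the-envelope sketch; the ~20% blanking overhead and the usable payload figures are approximations I'm assuming, not spec-exact values:

```python
def video_gbps(width, height, refresh_hz, bits_per_pixel=30, blanking_overhead=1.2):
    """Rough uncompressed bandwidth for a video mode, in Gbit/s (10-bit RGB default)."""
    return width * height * refresh_hz * bits_per_pixel * blanking_overhead / 1e9

# Approximate usable payload rates after line coding (not the raw link rates):
DP_1_4 = 25.92    # HBR3 x4 lanes after 8b/10b
DP_2_0 = 77.37    # UHBR20 x4 lanes after 128b/132b
HDMI_2_1 = 42.6   # FRL 48G after 16b/18b

uhd_120 = video_gbps(3840, 2160, 120)  # 4K 120 Hz, 10-bit
print(round(uhd_120, 1))  # ~35.8, over DP 1.4 but well under DP 2.0
```

Which lines up with DP 1.4 needing DSC for modes like this, while 2.0 has plenty of headroom left over.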

Hmm, and both can now also transmit through USB-C as well.
Interesting that the first displays for this were initially going to be out at the end of 2020; there are no GPUs that support it as far as I know, so they'd be running in 1.4 mode.
(Wonder if you still don't have to worry about the cable beyond its build quality, so no versioning, it's just going to work sending these data bits or packets.)


Oh, wow. I hadn't been paying attention to DisplayPort for a while… back it goes to making a complete joke of HDMI's bandwidth :slight_smile: We finally got HDMI 2.1 GPUs, and TVs that support some (40 / 48 Gb/s) of the HDMI 2.1 bandwidth (lol).


USB-C and DisplayPort have interoperated for quite some time now, actually. I think that all goes back to Thunderbolt and MHL. USB-C is awesome; I wish Micro USB would go and die already. USB-C connectors are only a few cents more, but the cables last so much longer… manufacturers have no excuse for not using USB-C.

Yeah, driving up to 16K 60 Hz, and then a variety of modes and resolutions at 120 Hz or higher without requiring compression, well, once capable displays and GPUs are available at least, plus there's this delay now before the display hardware becomes available.

Hmm, thinking about the USB side as well (guessing it also goes for HDMI 2.1): both AMD and NVIDIA now put a USB-C connector on their reference GPUs, though the display engine internals might still limit what feature level the connection runs at and, in addition to which protocol is used, the maximum speed supported.

And yeah, that's a thing too now that you mention it, the Thunderbolt protocol, although from memory I think I've mostly heard about it around Apple devices.

Don't think there are any immediate downsides to USB-C either; it's just the slower adoption and the number of ports available on current PC motherboards, and I think even consoles, until USB 4.0 cleans up all these variations of 3.0 a bit. :stuck_out_tongue:

Pretty sure the newer Xbox controllers also utilize this now over the mini or micro cables, or well, the others I guess: A is the usual flat-sided one and B is the boxy one often seen on displays.
(LG, for example, the proud supporters of what has to be the slowest firmware update system on displays ever.)

Just curious, but is there any way to get SDR games to use a WCG DCI-P3 gamut so games look more accurate in color, without resorting to a 3DLUT?

I kinda wish that more developers had an option for toggling WCG in their game even if it's not HDR; you don't really need HDR to have the wider color gamut, right? Something like SDR 2020.

https://www.guru3d.com/news_story/did_nvidia_silently_downgrade_g_sync_ultimate_hdr_specification.html

Seems NVIDIA changed the wording for the G-Sync criteria and specs.

Going from a more detailed list of requirements to a less defined one.

Doesn't mean too much on its own, though, besides “Lifelike HDR” being super vague as worded, but they are still testing, and there are criteria that must be met to get the certification. :slight_smile:

EDIT: I don't know about the recommended nits, but stuff like local dimming and backlight control seems like it should be somewhat of a standard, though to my knowledge the lower HDR specs don't even require that. So yeah, part of the problem is how vague the actual HDR specifications themselves are, to where a display basically just has to be a little bit brighter without the rest of the effect.

Someone that is set on getting a really good HDR experience… would look at a TV purchase… err, would probably look up details on what's available and some of the tech behind it, along with having a very high budget for the displays on offer.

Both the ASUS and the Acer one go for 2,500 USD / EUR or more, so yeah, quite a cost.

Those are the top-end current displays, though, but that is close to an entire fairly high-end computer system on its own, ha ha.

“Lifelike HDR” sounds weird…

That said, this isn't the first time we've seen high-end certifications change in the HDR space. Specifically, it often seems to happen to account for OLED displays capable of producing the same or better contrast than 1000+ nit LED displays.

I hope we’re seeing some high-end 27’’ 4K Ultimate G-Sync 120 Hz monitors soon myself :smiley:

Yeah, OLED and its many variations seem to be really good for this, but it's often been IPS or VA for computer displays, whereas OLED-type panels mostly find themselves in TVs. That changes things up a bit; from my understanding it's meant to be really good, though I am kinda lost as to all these different vendor variants of the tech and the experimental upcoming tech. :smiley:

So many factors though: the OS, display drivers, and then the absolute mess of a situation on the game side of the implementation. Support has gotten better, but there are still three different formats, and then a number of ways these can be implemented (NVIDIA, AMD and Microsoft HDR10), different color spaces, and how it all gets mapped out.

In-game configuration and setup have improved too, I believe, but it varies a bit how adjustable some of the parameters really are.

For me the price barrier is the bigger hindrance, although it's probably a really good upgrade whether you go for the more high-end tech or something a bit more balanced, fairly good quality without being super expensive, ha ha.

Could improve too, but I guess that for now these tariffs will [beep] up hardware prices for a while, and who knows what level of normalized prices it will then somewhat settle on afterwards.
(Already feels like hardware is close to 2x the cost it's supposed to be at; VAT's a thing, but this is worse than that little extra.)

EDIT: Another batch of 6800’s and 6900’s got stocked, neat.
Pricing is up though as expected.

Nitro+ and Red Devil 6800s (non-XT) for almost 900 Euro.
Red Devil 6900 XT for 1400 Euro.

Ridiculous; interesting to see stock coming in, but at this pricing it's impossible to recommend these cards.
(2 to 4 GPUs per model for availability too, very limited quantities still.)

You mean less accurate. Nothing about what you just said improves accuracy :stuck_out_tongue: The game's content is all still mastered in Rec 709, monitors don't cover 100% of DCI-P3, and gamut mapping doesn't exist.
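Just to illustrate what gamut mapping would actually involve if it existed in that path, here's a minimal sketch deriving the linear-RGB matrix from Rec.709 primaries into Display-P3 primaries (both D65) via XYZ. The chromaticity coordinates are the published ones; this works in linear light only, so a real pipeline would also need to linearize and re-encode around it:

```python
def det3(m):
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def solve3(m, v):
    """Solve the 3x3 system m @ x = v with Cramer's rule."""
    d = det3(m)
    out = []
    for j in range(3):
        mj = [row[:] for row in m]
        for i in range(3):
            mj[i][j] = v[i]
        out.append(det3(mj) / d)
    return out

def inv3(m):
    cols = [solve3(m, [1.0 if i == k else 0.0 for i in range(3)]) for k in range(3)]
    return [[cols[j][i] for j in range(3)] for i in range(3)]

def matmul3(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)] for i in range(3)]

def apply3(m, v):
    return [sum(m[i][k] * v[k] for k in range(3)) for i in range(3)]

def rgb_to_xyz(primaries, white):
    """Linear RGB -> XYZ matrix from xy chromaticities of R, G, B and the white point."""
    cols = [(x / y, 1.0, (1 - x - y) / y) for x, y in primaries]  # xyY -> XYZ, Y = 1
    wx, wy = white
    w = (wx / wy, 1.0, (1 - wx - wy) / wy)
    m = [[cols[j][i] for j in range(3)] for i in range(3)]
    s = solve3(m, w)  # per-channel scales so RGB (1,1,1) lands on the white point
    return [[m[i][j] * s[j] for j in range(3)] for i in range(3)]

D65 = (0.3127, 0.3290)
REC709 = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]
DISPLAY_P3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]

rec709_to_p3 = matmul3(inv3(rgb_to_xyz(DISPLAY_P3, D65)), rgb_to_xyz(REC709, D65))
```

Feeding 709 values straight to a P3 panel without a matrix like this is exactly what oversaturates colors; with it, white stays white and 709 red maps inside the P3 gamut instead of onto the more saturated P3 red primary.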

Isn't that Win10 setting supposed to be able to solve that problem?

[screenshot of the Windows 10 “Variable refresh rate” graphics setting]

Nope, sadly that toggle is sorta misleading.

This is only for UWP games that don't natively handle variable refresh rate. It will NOT work with any regular DirectX application, just with UWP applications in fullscreen borderless mode that do not already handle the DXGI_PRESENT_ALLOW_TEARING flag.

Edit: That particular flag, DXGI_PRESENT_ALLOW_TEARING, also only applies to the flip-based presentation model, meaning games have to use the flip model to begin with for it to be effective.

NVIDIA's DWM driver hack is used to enable VRR in windowed games that use the legacy BitBlt presentation model.


They're really vague about that UWP stuff even on the blog post about it; I was actually thinking it was D3D11 but for exclusive fullscreen mode.

…That goes for a few other settings though: what they do, and what Microsoft supports and how. :smiley:

Various DPI overrides, removal of the 260-character max path limit, VRR here, and the various Game Mode and Fullscreen Optimization features, which had a more important effect early on and then shrank down.
(Game Mode I think now has the awesome single feature of pausing Windows Update in the background when a fullscreen 3D application is running, or something of that nature.)

EDIT: Upcoming features are being ping-ponged back and forth too; seems like 21H1 / Vibranium 2 (or whatever other names it has) might not get a newer WDDM version, and Microsoft is jumping past 2.8 up to 2.9 with 21H2 instead.

So much to catch up on and try to follow. :stuck_out_tongue:
