Topic-Free Mega Thread - v 1.11.2020

Man, I really do not like how the sRGB gamut was mapped, because if it wasn’t for the need to map nicely to sRGB, I would have preferred a slightly bigger version of AWG-RGB to be honest. I guess it was based on the limitations of micro-LEDs at the time, but could they not have tilted the triangle a few degrees counter-clockwise?

sRGB is a CRT-derived colorspace. It was designed in 1996/97 (sources disagree on the exact year) to standardize color on the Internet, basically, so it was built around the display device every computer at the time used: the CRT. Additionally, they threw in a weird piece-wise gamma function, because a linear segment near black looks better under overhead fluorescent lights glaring off of a CRT.
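For anyone curious, that piece-wise curve is simple enough to write down. Here is a minimal sketch of the standard sRGB decode/encode pair (constants are the published spec values; this is illustrative, not Special K's actual code):

#include <cmath>

// sRGB piece-wise EOTF (decode): gamma-encoded value -> linear light.
// The linear segment below ~0.04045 replaces a pure power curve near black.
float srgb_to_linear (float c)
{
    return (c <= 0.04045f) ? c / 12.92f
                           : std::pow ((c + 0.055f) / 1.055f, 2.4f);
}

// Inverse (encode): linear light -> gamma-encoded sRGB.
float linear_to_srgb (float c)
{
    return (c <= 0.0031308f) ? c * 12.92f
                             : 1.055f * std::pow (c, 1.0f / 2.4f) - 0.055f;
}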

In terms of future-proof HDR, would it not be easier to store colour values as CIE coordinates plus intensity, and then have a real-time translator derive the correct values for whatever gamut the display in question needs? When it gets a number beyond the scope of the current space, it can either snap to the closest maximum and/or compress the intensity of the surrounding areas.

Store them in CIE? No.

But CIE XYZ and xyY are colorspaces that get used during Special K’s HDR processing. There are a number of colorspace transforms for the various components of the tonemapper: some need to be in LogC, others need CIE XYZ, and still others operate in PQ (HDR10 / Dolby Vision) gamma.

It’s not really fair to pigeonhole SK’s HDR processing down to a single format; there’s a lot of work going on in several colorspaces. scRGB just happens to be the colorspace chosen for the final composite (game scene + alpha-blended UI). After the final image is produced in scRGB, there’s an extra step to convert to signal standards.
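For context, the PQ encode that a "convert to signal standard" step performs for HDR10 is just the ST 2084 curve. A minimal sketch (standard published constants, not a copy of SK's shader code):

#include <cmath>

// SMPTE ST 2084 (PQ) encode: linear luminance normalized so that
// 1.0 == 10,000 nits; returns the non-linear signal value in [0,1].
float pq_encode (float Y)
{
    const float m1 = 0.1593017578125f, m2 = 78.84375f;
    const float c1 = 0.8359375f, c2 = 18.8515625f, c3 = 18.6875f;

    float Ym1 = std::pow (Y, m1);
    return std::pow ((c1 + c2 * Ym1) / (1.0f + c3 * Ym1), m2);
}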


What you’re describing already exists, BTW. SK does its tonemapping tailored specifically to the attached display device.

Oh, sorry, I wasn’t talking about SK specifically. I was talking more about monitor output. For example, instead of a pixel in an image stored as (255, 255, 255), have it stored in the computer as (0.3127, 0.32902, 1) (CIE xyY). Then that set of values gets sent to the monitor, which does all of the remapping to the correct power levels in relation to whatever gamut the monitor is set/calibrated to.
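Roughly what that display-side remap would have to do, as a minimal sketch (the struct and function names here are just for illustration; the matrix shown is the standard XYZ-to-linear-sRGB one for a D65 white point, and a wide-gamut monitor would simply substitute its own matrix):

#include <cmath>

struct XYZ { float X, Y, Z; };
struct RGB { float r, g, b; };

// CIE xyY -> XYZ (Y is the luminance the signal carries).
XYZ xyY_to_XYZ (float x, float y, float Y)
{
    if (y <= 0.0f) return { 0.0f, 0.0f, 0.0f };
    return { x * Y / y, Y, (1.0f - x - y) * Y / y };
}

// XYZ -> linear RGB with sRGB / Rec.709 primaries and D65 white.
// A display targeting a different gamut would swap this matrix.
RGB XYZ_to_linear_sRGB (const XYZ& c)
{
    return {  3.2406f * c.X - 1.5372f * c.Y - 0.4986f * c.Z,
             -0.9689f * c.X + 1.8758f * c.Y + 0.0415f * c.Z,
              0.0557f * c.X - 0.2040f * c.Y + 1.0570f * c.Z };
}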

100%. There is zero reason why it shouldn’t work. I mean, I have a 2080 Ti and an RX 580 in my system powering my 6 screens (yeah, 6: 3D modeling, a shitton of other windows open, video editing, etc.). The only thing I can think of is maybe needing a shitton of power from your PSU, but then again… 1600 W PSUs are a thing…

About having to write software to get a specific GPU powering a specific game: Windows 10 got ya covered! Right-click your desktop > Display Settings > Graphics Settings > at the bottom pick “Desktop App” or “Microsoft Store App” (basically Desktop App for anything that isn’t from the MS Store) > click “Browse” and point it at the game’s exe > OK > Options > select the GPU you want that game to use. Windows will then pass the signal from that card through the card connected to your display (costs about 3-5% performance in games). This way you can even output 4K60 4:4:4 (if the GPU connected to your display supports it) through a “retro” card, e.g. a GeForce 9800 GTX.

I use this option to run old games that don’t need much GPU-power on my iGPU.


Oh, wow. That’s awesome. Here I thought I was going to need two separate cables and inputs on my display, lol.

Yeah, the PSU is going to be an issue… but not completely unreasonable since I don’t intend to run both cards at full steam simultaneously.

Wonder what will happen when Intel enters the arena and I then want an Intel, AMD and NV GPU in the same system? Does the Universe collapse in on itself? :stuck_out_tongue:

I’m running that config already, basically… iGPU/2080ti/RX580…

Does anyone have a step-by-step to get the frame limiter working with Assassin’s Creed Valhalla on Ubisoft Connect? I’ve tried a bunch of different ways but cannot get it to work.

Thanks in advance.

It should be just drop-and-go. The DLL and INI file need to be in the game’s directory, and I’ve pre-configured everything based on my own testing over dozens of runs of the benchmark.

You cannot use SK’s OSD to configure stuff, but it should just work out of the box.


If you need verification that it’s working, one thing you can do is create a couple of keyboard macros to control the limiter…

Add the following to the INI file:

[Macro.FPS]
Ctrl+Shift+0=TargetFPS 60.0
Ctrl+Shift+1=TargetFPS 30.0
Ctrl+Shift+2=TargetFPS 0.0
Ctrl+Shift+3=PresentationInterval 0
Ctrl+Shift+4=PresentationInterval 1

You can use any combination of Ctrl, Shift, Alt and a regular key for these macros.

TargetFPS 0.0 means unlimited
PresentationInterval is VSYNC (0 = Off, 1 = On)

Should work just fine, Kal. Windows has been able to load two distinct graphics KMDs since Windows 7; I believe it’s a WDDM 1.1 feature.

I received my RTX 3090 a few days ago (currently tied up with studies, so not much time to play) and it’s rather… delightful. Such an amazing graphics card, in every sense of the word. Also got an AC Valhalla key from AMD Rewards for my 3900XT purchase in July. Might give it a shot soon, if I find the time.

Anyone who bought an ASUS RTX 3000 series card ought to visit ASUS’s website and download the BIOS update for idle fan stop; I did it with my TUF OC 3090 and it’s pretty great. The acoustics and performance just leave me absolutely blown away, being used to Radeon Vega cards these past few years.

It isn’t Windows I am worried about. It’s the myriad of PC ports out there that already fail on hybrid laptop GPU solutions, even though it’s pretty easy to explicitly choose a GPU to run on in DXGI. Some of those nightmares cannot even be overridden and will paradoxically always pick the wrong GPU when the user defines a preferred GPU at the OS level :slight_smile:
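For reference, picking the GPU explicitly in DXGI really is only a few lines. A minimal sketch (assumes IDXGIFactory6, i.e. Windows 10 1803+; this is illustrative and not taken from any particular game or from SK):

#include <dxgi1_6.h>
#include <wrl/client.h>
#pragma comment (lib, "dxgi")

using Microsoft::WRL::ComPtr;

// Explicitly pick the highest-performance adapter instead of blindly
// taking adapter 0 (which on hybrid laptops is often the iGPU).
ComPtr <IDXGIAdapter1> PickHighPerformanceAdapter (void)
{
    ComPtr <IDXGIFactory6> factory;
    ComPtr <IDXGIAdapter1> adapter;

    if (SUCCEEDED (CreateDXGIFactory1 (IID_PPV_ARGS (&factory))))
    {
        // Adapters are returned ordered by the requested preference;
        // index 0 is the GPU Windows considers the most powerful.
        factory->EnumAdapterByGpuPreference (
            0, DXGI_GPU_PREFERENCE_HIGH_PERFORMANCE,
            IID_PPV_ARGS (&adapter));
    }

    return adapter; // Hand this to D3D11CreateDevice / D3D12CreateDevice.
}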

Hopefully when Intel enters the market in a serious capacity, they will have a driver with similar D3D11 / D3D12 feature tier support as the real GPU vendors do… cause I don’t have a lot of faith in games to pick the right GPU (for rendering or even just querying hardware caps).

Oh, yeah. About that. My experience is that most games tend to boot on the GPU that the primary display is connected to… but there might be a quirk every now and then. I always wondered if there was an application where one could explicitly choose which GPU renders the game, other than Windows’ own preferred GPU setting, but if one exists… I don’t know of it :confused:

Say… if you end up getting an RX 6900 XT, I might be able to get you an invite to the Radeon Vanguard beta testing program, if you can tolerate using Discord to talk to folks every now and then. I have a hunch you’d get along well with their HDR and display guys, and it’d certainly be refreshing to have an actual graphics programmer’s input there. Techie stuff, hahah :smiley:

I can probably stomach Discord. It’s not really my preferred communication style, but it does seem I miss out on a lot by not using it. Same goes for Twitter, it surprises me how many situations exist where contacting someone in a professional capacity is easier via Twitter than formal channels like E-Mail.

I feel ya, man. Modern world! I’m fairly sure that they’re going to invite a new wave of folks to the program when these new cards launch; I’ll be sure to bring it up and let you know if I succeed. You’d basically have access to prerelease drivers, a hotline to request bug fixes, and direct access to a considerable number of AMD’s driver engineers, which should be a pretty productive thing in its own right. I learned a lot with them over the past year I’ve been there, but I was kind of forced to get the RTX 3090 because, unfortunately, I strongly feel that their DX11 driver quality just isn’t up to par (driver support for command lists basically doesn’t exist) and I consider that far too important to skimp on.

TBH I have never had an issue running a game on a specific card with the Windows 10 setting… it really seems to work flawlessly. And I have over 1,000 games, so if there are any you want me to test, LMK.

I also have a laptop with NVIDIA Optimus (Intel HD 4600 + GTX 870M 6GB). It is hell to get that working correctly every time, with games blatantly ignoring the setting to use the high-performance card more often than not (my laptop has an LED that shows when the 870M is active), and the Windows 10 setting doesn’t work for Optimus-enabled laptops (or at least, I haven’t been able to get it to work correctly).

Up to you, but personally after setting up my own server, I had to drop it a few months later. Discord is too emotional and full of pointless debates. And as I’ve mentioned before, random people get too comfortable in any Discord and start awkwardly spilling their personal life. I thought I could stomach it myself, but ultimately couldn’t.

I quite literally kicked everyone out of my own server without warning. Didn’t delete it since some stuff needed to be archived.

They’re called hangouts, I think, or however it works, so I can imagine them being a bit different from stuff like forums, maybe closer to the old bulletin board style of posting.

Reminds me of this one time on the bus when an elderly man sat down and got really talkative about his work, relating all the deaths and horrific injuries he had seen operating heavy machinery.

EDIT: Suppose it’s about standard, though, for those working with old people; there must be a lot of stories going around.

At least for schools and such you’d only have to hear a lot of tall tales.

Heh, yeah, I get that it can quickly grow into a full-time job if you want to keep it clean and professional… given the explosive messenger-like yet forum-like experience, it gets super hectic and you’ll definitely need a full team of moderators to that end.

But ultimately, this one server I mentioned to him is alright. It’s very private and you have to get an invite from the project maintainer, so things tend to be a lot quieter, as folks in there are very tightly knit. It leans more professional than social, and I feel it strikes the right blend.

Yeah, Vanguard is working pretty well from what I’m hearing. It took a while to get to that point, but it seems to be functioning nicely now, together with the new “Crash Defender”, as AMD calls the reporting service they added to the drivers.

EDIT: It catches some really interesting bug reports and resolves them, like one screensaver causing a GPU driver crash while running, and it’s actually making a dent in the Navi 10 issues such as the memory instability and several of the black-screen GPU shut-down causes.

(There are still several outstanding issues and unlisted problems that AMD is hopefully aware of internally, but it’s getting there.)