Topic-Free Mega Thread - v 1.11.2020

Ah, so that’s a bit on how this works for AMD and Smart Access Memory.

https://www.amd.com/en/technologies/smart-access-memory

Plus it mentions a second November driver, nice.

And then there’s this from Reddit on what it actually does, beyond the tech talk and marketing hype:

https://www.reddit.com/r/Amd/comments/jm2r2j/smart_acces_memory_and_pcie_30/

Could be good, maybe even more so for DirectStorage come to think of it, since that still needs to move more data.
Though whether this is a larger gain or not much of one would depend on the amount of data and the potential bottlenecks here.

Explains why AMD puts it at more like 3 - 6% performance rather than 1 - 3%; it’s actually a bit more meaningful, but situational.

EDIT: Reviews for Zen 3 and the 5000 series of desktop processors should be going up on Thursday the 5th too, in time for what seems to be retail availability and sales on the 6th, or at midnight between the two.

Not quite worth the cost for the estimated performance here, but it’d be nice to have if cost weren’t as much of a factor. :smiley:

EDIT: Love the little laser the CPU is firing at the GPU VRAM in that SAM comparison.
All the data beams!
(20x the throughput, totally accurate comparison. :stuck_out_tongue: )

And next up Crysis news.

Remastered alien jelly blobs are a thing!

Kinda liked the more aquatic design over the bit-of-gum-in-a-metal-casing look the sequels went with.

EDIT: They make a really satisfying pop with the microwave gun at least.
(Terrible things done in video games all the way from the 1970’s → present day.)

EDIT: Also, that’s a really tight schedule. A year would be the minimum, I would think, and they’re covering a lot of platforms, so one and a half would be more like it. Stuff like this takes time to do right, and that involves a lot of testing and such within this time frame too.

It would be costly though, probably quite a bit at that.

EDIT:
Let’s see: it’s PlayStation 4, Xbox One and possibly Pro and One X updates, then Switch, and also PC, so yeah, that’s some work alright.

Added a new visualization mode to show off quantization standards in HDR.

HDR10 is (surprise, surprise) 10-bit :wink:

Turns out image fidelity is much higher if you render 16-bit and then quantize down to 10-bit yourself.

The last thing you want to do is what Ubisoft is trying to do, and render at 10-bit to avoid quantization altogether. That actually produces banding artifacts and near-black noise; I think they thought those could be avoided by rendering at a lower bit-depth and simply applying the SMPTE 2084 EOTF rather than re-quantizing the image.
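For reference, the final encode-and-quantize step looks roughly like this; a minimal sketch, where the constants come straight from the ST 2084 spec but the function name and structure are illustrative (a real pipeline runs this per channel, ideally with dithering):

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>

// SMPTE ST 2084 (PQ) inverse EOTF: encode linear luminance in nits
// (0..10000) to a non-linear [0,1] signal, then quantize to 10 bits.
uint16_t EncodePQ10 (double nits)
{
  const double m1 = 2610.0 / 16384.0;        // 0.1593017578125
  const double m2 = 2523.0 / 4096.0 * 128.0; // 78.84375
  const double c1 = 3424.0 / 4096.0;         // 0.8359375
  const double c2 = 2413.0 / 4096.0 * 32.0;  // 18.8515625
  const double c3 = 2392.0 / 4096.0 * 32.0;  // 18.6875

  double Y  = std::clamp (nits / 10000.0, 0.0, 1.0);
  double Ym = std::pow (Y, m1);
  double N  = std::pow ((c1 + c2 * Ym) / (1.0 + c3 * Ym), m2);

  // Quantize the encoded signal to a 10-bit code value (0..1023)
  return static_cast <uint16_t> (std::lround (N * 1023.0));
}
```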

Quantization’s not a dirty word, but HDR10 is.

This is like the difference between EGA and VGA for those unfortunately old enough to remember those things :slight_smile:

For those not old enough (lucky you), here’s a taste of graphics from 30 years ago.


Is this available to use yet? The HDR10 mode?

Question… which settings are recommended for AC Odyssey for performance? I was trying DXVK and got a huge performance improvement, but I really want SK’s HDR.

Okay, Watch Dogs: Legion is spamming this in the logs.


```
11/02/2020 14:59:56.528: [WIN-SCHED] NtSetTimerResolution (1.000000 ms : Set) issued by KERNEL32.DLL
```
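For context, that’s the kernel-side cousin of timeBeginPeriod; presumably the game (or some library it loads) keeps issuing something like this sketch every frame, though once would be enough. The prototype below is the commonly reverse-engineered one, since ntdll doesn’t document it:

```cpp
#include <windows.h>

// NtSetTimerResolution is an undocumented ntdll export; this prototype
// is the commonly reverse-engineered one, so treat it as an assumption.
// Resolution is in 100 ns units, so 10000 == 1.000000 ms.
typedef LONG (NTAPI *NtSetTimerResolution_pfn)
  (ULONG DesiredResolution, BOOLEAN SetResolution, PULONG CurrentResolution);

void RequestOneMsTimer (void)
{
  auto NtSetTimerResolution =
    (NtSetTimerResolution_pfn)GetProcAddress (
      GetModuleHandleW (L"ntdll.dll"), "NtSetTimerResolution");

  ULONG previous = 0;

  if (NtSetTimerResolution != nullptr)
    NtSetTimerResolution (10000, TRUE, &previous); // ask for 1 ms ticks
}
```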

Though I am trying DX12 mode to see if the game behaves a bit better.

No, you already know that.

It doesn’t work on AMD drivers.

AMD goes through AMD_AGS.dll, which would need support in Special K, and Kaldaien would need an AMD GPU for testing and all that.

For whatever reason it’s AMD GPU Services (AGS) for AMD and the NVIDIA API (NvAPI) for HDR instead of just going with Windows 10. Come to think of it, that may allow HDR on Win7 for backwards compatibility, but it has its quirks, plus whatever criteria or limitations each vendor’s HDR implementation works under, or however to put it. :smiley:

Early on, what with Shadow Warrior 2, some of this made more sense, but other than some specific G-Sync or FreeSync 2 bits, I don’t think there’s any pressing reason to go this route now that Windows 10 has HDR support natively.

Unless backwards compatibility really is still that important, though for a modernized game engine with D3D12 support, Win7 functionality shouldn’t be that big of a priority, especially now that it’s EoL.

EDIT: This is about the same as the above reply, really, just expanding on the point that there are seemingly two paths, and implementing support in software for each needs the hardware for proper testing, to see what works and what doesn’t, possibly even a range of hardware for compatibility with older stuff at that.
(Generally not quite as much of a concern, but it can be for a few things.)

EDIT: Plus probably more. The WD Legion topic has a little write-up at the start, and I doubt the current patch made any major improvements; maybe the control issues are less problematic, perhaps it’s a bit faster in some situations, but the next update is coming November 9th, and hopefully that can start resolving a few of the actual issues.

It isn’t going to do much for design decisions and choices, though, or at least usually not, with some exceptions. :slight_smile:

EDIT: And if a certain command-line parameter were to disappear, well, that’d be a pain.
(So far, though, Ubisoft has only really removed one, for skipping the logo videos, back with Watch_Dogs 1 and Far Cry 4 as I recall.)

Technically, Ubisoft goes through AMD_AGS.dll :stuck_out_tongue: That’s the real problem here… they decided to be dumb and write vendor-specific code for HDR.

As much as I usually have bad things to say about Unity engine, its HDR support, in the handful of games that use it, goes through DXGI and not NvAPI or AGS. Special K’s HDR10 features work in that engine regardless of who makes your GPU.

Every other engine on the planet uses the vendor specific APIs, which I have half a mind to intercept and automatically translate to DXGI since games that use NvAPI and AGS for HDR inevitably stop working after a few years.

In fact, that’s effectively what I’ve done for NvAPI. I now translate its HDR10 mode straight to DXGI and process it as scRGB. As soon as I acquire an AMD GPU, I’ll do the same for AMD.
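Translating those vendor calls mostly boils down to driving the DXGI equivalents. A minimal sketch of the DXGI-native HDR10 path, assuming a flip-model swap chain already created with 10-bit (R10G10B10A2) buffers; an scRGB setup would instead use R16G16B16A16_FLOAT and DXGI_COLOR_SPACE_RGB_FULL_G10_NONE_P709:

```cpp
#include <dxgi1_6.h>

// Vendor-neutral HDR10 setup through DXGI; works on any GPU,
// no NvAPI or AGS required.
HRESULT EnableHDR10 (IDXGISwapChain3 *pSwapChain)
{
  const DXGI_COLOR_SPACE_TYPE hdr10 =
    DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P2020; // PQ (ST 2084) / BT.2020

  // Verify the swap chain can actually present in this color space...
  UINT support = 0;
  HRESULT hr =
    pSwapChain->CheckColorSpaceSupport (hdr10, &support);

  // ...then switch to it.
  if (SUCCEEDED (hr) &&
      (support & DXGI_SWAP_CHAIN_COLOR_SPACE_SUPPORT_FLAG_PRESENT))
    hr = pSwapChain->SetColorSpace1 (hdr10);

  return hr;
}
```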


The demo’s up for Dragon Quest XI S: Definitive Edition.

(Via Meta Council)

Not going to replay it anytime soon, but nice of them to make a demo version. :slight_smile:
A few visual reductions in exchange for a long list of additional gameplay and content updates.

EDIT: Wonder how it switches to 2D mode; playing through the whole thing in that perspective might be a thing for next year, perhaps.

Uuuhhh… wasn’t that game just released the other year or something…?

Only if you owned a Switch.

How is this version compared to the vanilla one, being a port of the Switch version?

Looks like this is the reason for the ridiculous 30 FPS limit in the menus in Watch Dogs Legion…

That’s an absurd amount of latency. I don’t think a 30 FPS limit is called for, but a limit nonetheless would be nice. 8 frames of latency at 20 ms per frame = WTF?

With their 30 FPS limit turned back on, 0 frames of latency.

An application limit to VSYNC rate would be just as effective, 30 FPS is not needed.
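Something along these lines, called once per frame while the menu is up, would do the job; a sketch assuming a 60 Hz display (a real implementation would query the actual refresh rate rather than hardcode it, and spin-wait the last fraction of a millisecond for accuracy):

```cpp
#include <chrono>
#include <thread>

// Minimal frame limiter pinned to an assumed 60 Hz refresh rate.
void LimitToRefresh (void)
{
  using clock = std::chrono::steady_clock;

  static clock::time_point next = clock::now ();

  const auto frame =
    std::chrono::duration_cast <clock::duration> (
      std::chrono::duration <double> (1.0 / 60.0)); // one refresh interval

  next += frame;
  std::this_thread::sleep_until (next); // coarse sleep to the next deadline

  if (clock::now () > next + frame)     // fell badly behind (e.g. alt-tab):
    next = clock::now ();               //   resync instead of fast-forwarding
}
```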


Haven’t tested it myself, but I hope it’s similar to the original PC version; being an entirely separate build, though, there’s a chance for anything from a framerate lock to other issues.

Geometry, asset detail and world detailing are down, among other things like texture detail and shader complexity, but there’s a lot of updated content, new additions and tweaks, plus the 2D mode and bonus content, the orchestral soundtrack inclusion and more.

It’s a newer build of Unreal Engine 4, but it’s not so much what it’s using as how it’s used and ported. Unless they really bungled input or framerate, I would hope this should be less of a problem, but they are two separate builds, and I don’t know which studio is behind the port of this one, or whether it’s outsourced or not.

Ideally it should be the same, minus a few graphical details and plus a number of gameplay elements, but since it’s not a patch to the existing code, that’s hard to say.

Wonder if Square will do an update like that for the PS5 and Xbox Series X, a best of both, so to speak, though porting this version forward to those platforms seems more like how it’d be done.
(Potentially as part of the backwards-compatibility program, perhaps.)

Odd latency bit there too with Legion; I thought the thing was a video, and those are 30 FPS, so the main menu gets 30 FPS, but I guess it’s not that simple and it’s a problem with something else. Hmm.

EDIT: AGESA 1.1.0.0 C and an update to the BIOS from Gigabyte again, so that’s where those kernel events in some demanding games come from: it’s a memory-controller stability issue, Gigabyte and their tweaking and tuning.

And downgrading. :smiley:
(Revision 1.0 to 1.1 adds a Thunderbolt connector to a part of the motherboard that already had the traces for it; 1.2, after finally finding info on it, strips off a number of caps and such for the RAM and CPU, which has about the effect you’d expect. Nice to see that crap continues on.)

Well, it’s a WIP BIOS, but they really keep tuning that memory profiling, although with how sensitive AMD systems can be to XMP, and how lacking XMP profiles can be, even good memory chips and decent profiles can be very touchy.

I didn’t play the vanilla version, but after downloading the demo, I’m very pleased. There are videos comparing the Switch version against the PS4/Steam one, and there is clearly a difference. But playing the demo, it doesn’t feel like anything was cut, and the orchestral music is amazing.

– edit –
There are some shots here (How does the game look in the demo ? :: DRAGON QUEST XI S: Echoes of an Elusive Age – Definitive Edition General Discussions), but I really don’t know what to think.


Exposure Stops (> middle-gray)

Luminance (vs display max level)

These tell a slightly different story :wink: The exposure visualization is probably better for gauging the dynamic range of a game, whereas the display level visualization is for calibrating your display.

They almost look the same, but pay close attention to the pink part of the exposure map. The display luminance image would lead you to believe you could visually distinguish the change in brightness there, but the fact of the matter is that it is so bright it’s not visually distinct.
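Conceptually, the two maps plot something like this (a simplified sketch of the idea; the 0.18 middle-gray reference is the usual photographic convention, and the function names are illustrative):

```cpp
#include <cmath>

// Exposure stops relative to middle gray: +1 stop = double the light.
// 0.18 is the conventional photographic middle-gray reference.
double ExposureStops (double luminance, double middleGray = 0.18)
{
  return std::log2 (luminance / middleGray);
}

// Display level: pixel luminance as a fraction of the display's
// maximum luminance (both in nits); useful for calibration.
double DisplayLevel (double nits, double displayMaxNits)
{
  return nits / displayMaxNits;
}
```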

Wonder if the game has the same skewed(?) tonemapping problem the prior game had, or maybe it’s more about the game engine, and then made worse through HDR.

Possibly compressed or TV range too, so 16-235 instead of 0-255; shifting that down gives more like 0-219 levels, if I got that correct. I’m not that great at this, mostly following some read-ups and info posted about the problems back then.

Could be the same here, but now also with high-dynamic-range hardware support and the further complication of resolving that too, I suppose?
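For reference, expanding TV/limited range back to full range is just a linear remap; a sketch of the 8-bit case, which is where that 219 figure comes from (235 - 16 = 219 usable code values):

```cpp
#include <algorithm>
#include <cstdint>

// Expand 8-bit TV/limited range (16-235) back to full range (0-255).
uint8_t LimitedToFull (uint8_t v)
{
  int full = ((int)v - 16) * 255 / 219; // 16 -> 0, 235 -> 255

  return (uint8_t)std::clamp (full, 0, 255);
}
```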

EDIT: Watch_Dogs 2 does have a tint or tonemap effect over the image, though maybe Legion is a bit more neutral, since it’s not sunny San Francisco, it’s incredibly gray London. :smiley:

Night-time isn’t too terrible, but the sun lighting ranges from good to bland to almost bad looking, though it might be a combination of settings and sacrificing a few things for performance reasons, or a balance between performance and image quality.

Plus there’s a real-time dynamic time-of-day cycle and various weather patterns, unlike something static or baked, good as Assassin’s Creed Unity could look at times, even if it too was often very stylized. :smiley:

Bahahahah, that was awesome. I read that title, and then, surprise, surprise… Voodooman turned out to be the author :slight_smile: He rants as much as I do about this stuff, lol.

I’d actually try and approach Ubisoft directly with my findings, but the last time that happened, it didn’t end well for me.