I’ll pick up the 20 GB variant myself, I guess. Got to have that to get some glorious speed up in AI Dungeon.
As long as it’s reasonably priced, though; with the 3090 at double the cost there’s a bit of leeway here for a 3080 20 GB, plus it’s not bad for productivity and memory-heavy workloads.
Availability could be a problem, though that goes for computer hardware in general right now, and I expect the AMD hardware and the console hardware will run into the same issue.
You’re aware that Special K has a VRAM tracker, right?
I don’t simply mean that it can measure usage in real-time (though it can); it actually spits out a summary of the game’s VRAM requirements at game exit. The entire reason this project exists was Batman: Arkham Knight’s insane texture streaming.
10/10/2020 14:24:32.056: --------------------
10/10/2020 14:24:32.058: Shutdown Statistics:
10/10/2020 14:24:32.059: --------------------
10/10/2020 14:24:32.060: Memory Budget Changed 2 times
10/10/2020 14:24:32.061: GPU0: Min Budget: 10281 MiB
10/10/2020 14:24:32.062: Max Budget: 10281 MiB
10/10/2020 14:24:32.064: Min Usage: 00005 MiB
10/10/2020 14:24:32.069: Max Usage: 02131 MiB
10/10/2020 14:24:32.070: ------------------------------------
10/10/2020 14:24:32.071: Minimum Over Budget: 00000 MiB
10/10/2020 14:24:32.073: Maximum Over Budget: 00000 MiB
10/10/2020 14:24:32.074: ------------------------------------
dxgi_budget.log is a definitive measure of how much VRAM your game actually needs. Most of the VRAM budget is dedicated to RenderTargets (they cannot be shuffled between System RAM and VRAM), and thus for properly functioning games, resolution / HDR contributes the most to VRAM requirements.
If you want to play games in 8K + HDR, this is when you start needing ludicrous amounts of VRAM.
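For reference, the numbers in that log come straight from DXGI’s video memory budget query. A minimal sketch of reading the same thing yourself (not SK’s actual code; error handling stripped):

#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment (lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main ()
{
  ComPtr <IDXGIFactory4> factory;
  CreateDXGIFactory1 (IID_PPV_ARGS (&factory));

  ComPtr <IDXGIAdapter1> adapter;
  factory->EnumAdapters1 (0, &adapter);              // GPU0

  ComPtr <IDXGIAdapter3> adapter3;
  adapter.As (&adapter3);

  DXGI_QUERY_VIDEO_MEMORY_INFO vmem = { };
  adapter3->QueryVideoMemoryInfo (0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &vmem);

  // Budget = what the OS thinks you may use, CurrentUsage = what you do use
  printf ("Budget: %llu MiB, Usage: %llu MiB\n",
          vmem.Budget       / (1024ull * 1024ull),
          vmem.CurrentUsage / (1024ull * 1024ull));
}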
Guessing that all works together, although with resolution as the primary VRAM drain?
Then HDR for bandwidth and a bit more.
Then the flip model, or rather the number of back buffers for it, since it seems 3 - 5 is a good number for high refresh rate display support, if I got that right?
Plus whatever modern DisplayPort or HDMI can put out, though HDMI 2.1 should remove that concern, and then whatever the next DisplayPort revision will be.
(1.4 manages well compared to HDMI 2.0 but it has its limitations too.)
EDIT: Hmm, actually maybe HDR is not strictly just a speed thing but uses a bit more memory too. Could be mistaken about how that works here.
Guess the bit depth is a factor too, but with scRGB Special K already goes up to 16 bits, above the usual 10 and however 12 is managed for current high-end displays. (Dithered down?)
Kal can you check your PMs please?
Yeah, that’s where stuff gets really expensive. The actual backbuffer images are minuscule, but for each additional backbuffer in a swapchain, you have to allocate storage for the dozens of render passes needed to produce the final image.
Technically, you don’t have to allocate that in D3D11 … the driver figures it all out. D3D12 / Vulkan requires the engine to micromanage allocation per-backbuffer. That’s the reason I cannot increase backbuffer count in Horizon: Zero Dawn and that game’s stuck double-buffered.
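To illustrate, in D3D11 changing the backbuffer count is a single DXGI call and the runtime sorts everything else out. This is only a rough sketch of that idea (a hypothetical helper, not how SK actually hooks it); all outstanding references to the old backbuffers have to be released first, and Flags should match whatever the chain was created with:

#include <dxgi.h>

// Hypothetical helper: grow an existing swapchain to 'count' backbuffers.
// Width/Height of 0 keeps the current client-area size,
// DXGI_FORMAT_UNKNOWN keeps the current format.
HRESULT SetBackbufferCount (IDXGISwapChain* swapchain, UINT count)
{
  return
    swapchain->ResizeBuffers (count, 0, 0, DXGI_FORMAT_UNKNOWN, 0);
}

In D3D12 / Vulkan there is no equivalent shortcut, because every per-frame resource (command allocators, fences, intermediate targets) belongs to the engine.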
Actually, stuff gets interesting here. The storage requirements for an FP16 image are fixed by the format, but delta color compression means that the amount of time spent reading / writing the image is not. So bandwidth requirements are somewhere less than double, but storage requirements are exactly double.
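Quick back-of-the-envelope for a single 3840x2160 backbuffer, assuming plain uncompressed storage math (DCC only changes bandwidth, not the allocation size):

// 3840 x 2160 = 8,294,400 pixels
constexpr unsigned long long pixels = 3840ull * 2160ull;

constexpr double mib_rgba8  = pixels * 4.0 / (1024.0 * 1024.0); // ~31.6 MiB, 8-bit RGBA
constexpr double mib_rgba16 = pixels * 8.0 / (1024.0 * 1024.0); // ~63.3 MiB, FP16 RGBA (scRGB)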
This is how Fallen Order looks with DRS + HDR enabled.
10/10/2020 01:52:07.291: Memory Budget Changed 348 times
10/10/2020 01:52:07.291: GPU0: Min Budget: 09223 MiB
10/10/2020 01:52:07.292: Max Budget: 10359 MiB
10/10/2020 01:52:07.292: Min Usage: 00002 MiB
10/10/2020 01:52:07.293: Max Usage: 10168 MiB
10/10/2020 01:52:07.293: ------------------------------------
10/10/2020 01:52:07.294: Minimum Over Budget: 00499 MiB
10/10/2020 01:52:07.294: Maximum Over Budget: 00627 MiB
10/10/2020 01:52:07.295: ------------------------------------
I suppose that’s settled then. You’d benefit from another 1 GiB of VRAM at least.
I should probably change the “Budget Changed” statistic to indicate how long the game was running.
The more frequently those budget changes happen, the more the driver is struggling to figure out how to evict resources from VRAM and the more urgently you need to either turn your settings down or go buy a GPU with more VRAM.
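For anyone curious, DXGI will signal an event every time the OS revises the budget, so a counter like that is easy to drive. A hedged sketch (not SK’s actual implementation; the caller supplies a valid IDXGIAdapter3):

#include <dxgi1_4.h>
#include <windows.h>

// Counts how often the OS revises this process's VRAM budget over a
// fixed window. Divide by elapsed minutes for an eviction-pressure metric.
UINT64 CountBudgetChanges (IDXGIAdapter3* adapter3, DWORD window_ms)
{
  DWORD  cookie = 0;
  HANDLE evt    = CreateEvent (nullptr, FALSE, FALSE, nullptr);

  adapter3->RegisterVideoMemoryBudgetChangeNotificationEvent (evt, &cookie);

  UINT64 changes = 0;
  DWORD  start   = GetTickCount ();

  while (GetTickCount () - start < window_ms)
    if (WaitForSingleObject (evt, 250) == WAIT_OBJECT_0)
      ++changes;

  adapter3->UnregisterVideoMemoryBudgetChangeNotification (cookie);
  CloseHandle (evt);

  return changes;
}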
I’d definitely have to test more games to be sure. Seems odd to me that Fallen Order would use so much VRAM when the resolution is probably closer to 1440p than 4K.
I friggin’ love the new render latency graph! If I am not mistaken, it properly showcases the additional latency that’s incurred when the G-Sync upper limit is hit and where V-Sync kicks in when two buffers are used.
Just a comparison on my monitor with the usual optimal ‘G-Sync’ settings (V-Sync forced globally):
- Unlocked frame rate – FPS capped at 119.6 FPS with 3 frames of delay.
- Capped to 119.5 FPS – 0 frames of delay.
As Kal mentioned earlier, it illustrates that the golden 3-FPS-below-refresh-rate rule seems to have been more about the FPS limiters of the time having huge variation (ca 2017 – before RTSS’ and Nvidia’s own improvements in that area) than an actual requirement.
I do not think there was ever anything particularly magical about that -3 offset.
It should be stupid simple to keep G-Sync active 100% of the time; there’s no requirement for frametime consistency for G-Sync to work. In fact, the worse you are at timing, the more likely G-Sync is going to remain active. Just deliver a bunch of sloppily timed frames at a rate under refresh Hz and G-Sync stays active.
SK’s designed to solve way more interesting timing problems at the complete opposite end of the spectrum: G-Sync OFF, Black Frame Insertion ON. That’s an application where the more consistently spaced your frames are, the less blurry things get.
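To illustrate what “consistently spaced” means in practice, here’s a toy sleep-then-spin pacing function (nothing like SK’s actual limiter, just the general idea):

#include <windows.h>

// Waits until the next multiple of the target frame interval before the
// caller presents. Coarse Sleep() for most of the wait, spin for precision.
void WaitForNextFrame (double target_fps)
{
  static LARGE_INTEGER freq = { }, next = { };

  if (freq.QuadPart == 0)
  {
    QueryPerformanceFrequency (&freq);
    QueryPerformanceCounter   (&next);
  }

  next.QuadPart += (LONGLONG)(freq.QuadPart / target_fps);

  LARGE_INTEGER now;
  QueryPerformanceCounter (&now);

  while (next.QuadPart - now.QuadPart > freq.QuadPart / 1000) // > 1 ms left
  {
    Sleep (1);
    QueryPerformanceCounter (&now);
  }

  while (now.QuadPart < next.QuadPart)                        // final spin
    QueryPerformanceCounter (&now);
}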
So that’s where that recommended 2 - 3 FPS margin came from, interesting.
Never really understood why the G-Sync / FreeSync recommendations used it; figured it was down to engines like Unreal being popular and using a similar (although above-the-limit) guideline for smoothing, or something of that nature.
Something to experiment a bit with.
EDIT: Quite a bit of interesting info really: Microsoft and the flip model with flip-discard support, V-Sync and DWM tearing, and developers perhaps still targeting Windows 7 or Windows 8, or game engines simply not being designed around Windows 10 and seemingly ignoring (missing?) this bit, so support has been slow.
Then there are all the newer developments around input latency and framerate limiting, like NVIDIA Reflex starting this up again, along with driver-level smoothing and latency-reduction attempts and more.
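Since the flip model came up, here’s a rough sketch of the Windows 10-native setup being referred to, i.e. FLIP_DISCARD plus the tearing flag (the helper name is just a placeholder; a real implementation should first confirm tearing support via IDXGIFactory5::CheckFeatureSupport, and error handling is omitted):

#include <dxgi1_5.h>

// Creates a flip-model swapchain for 'hwnd'; Width/Height of 0 sizes it to
// the window's client area.
IDXGISwapChain1* CreateFlipSwapChain (IDXGIFactory5* factory,
                                      IUnknown*      device,
                                      HWND           hwnd)
{
  DXGI_SWAP_CHAIN_DESC1 desc = { };
  desc.Format           = DXGI_FORMAT_R8G8B8A8_UNORM;
  desc.SampleDesc.Count = 1;
  desc.BufferUsage      = DXGI_USAGE_RENDER_TARGET_OUTPUT;
  desc.BufferCount      = 3;                              // 3 - 5 suits high refresh
  desc.SwapEffect       = DXGI_SWAP_EFFECT_FLIP_DISCARD;  // Windows 10 flip model
  desc.Flags            = DXGI_SWAP_CHAIN_FLAG_ALLOW_TEARING;

  IDXGISwapChain1* swapchain = nullptr;
  factory->CreateSwapChainForHwnd (device, hwnd, &desc,
                                   nullptr, nullptr, &swapchain);

  // Present with tearing when the framerate is unlocked:
  //   swapchain->Present (0, DXGI_PRESENT_ALLOW_TEARING);
  return swapchain;
}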
I’m actually a little baffled as to why latency suddenly became such a big topic.
I thought that with all the work Carmack, etc. were doing on VR head tracking latency, THAT’S when we would see interesting measures of latency showing up in performance overlays. In fact, the reason I keep going back to ‘Trover Saves the Universe’ for benchmarking is because it is a VR game.
Come to find out, Unreal Engine’s not really doing anything special to manage latency. I totally expected to see it doing interesting things that I could learn from; the only thing I learned is that Unreal is not nearly as advanced as I thought it was.
Disappointingly, almost nothing ever turns out to be cutting-edge when I peek under the hood of game engines. “By the time a game makes it to market, its engine is obsolete” seems to be the reality.
I honestly feel more games should take advantage of WCG. Right now it’s either WCG+HDR or standard SDR; SDR Rec. 2020 is still a spec, but almost nobody seems to be using it. With most monitors not being able to display HDR properly without proper backlight dimming, this would be a good compromise.
Though it does seem like Windows is currently bundling HDR with WCG, with no separate option for WCG only (aka SDR2020). It’s probably something GPU manufacturers need to collaborate on and add to their drivers; I know the NVIDIA Shield has an option for that colorspace, but desktop GPUs don’t.
I accidentally put the DWM into WCG mode somehow using NvAPI not too long ago. I think it was using one of the 6bpc formats.
This reminds me, I need to build a standalone tool to change the DWM color format. Currently you have to go into a game, put it in fullscreen exclusive mode and then select one of those color override options. NvAPI applies that override to not only the current game, but also to the DWM.
I don’t know of any other way to explicitly control the DWM pixel format.
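As far as I know you can at least read back what the output is currently doing through DXGI, even if setting it is another matter. A minimal sketch (error handling omitted; the caller supplies an IDXGIOutput6):

#include <dxgi1_6.h>
#include <cstdio>

// Prints the current bit depth, color space and peak luminance of an output.
void PrintOutputColorInfo (IDXGIOutput6* output)
{
  DXGI_OUTPUT_DESC1 desc = { };
  output->GetDesc1 (&desc);

  printf ("BitsPerColor: %u, ColorSpace: %d, MaxLuminance: %.1f nits\n",
          desc.BitsPerColor, (int)desc.ColorSpace, desc.MaxLuminance);
}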
@JonasBeckman
I find it amazing how GOG games usually don’t show up in the AMD Games Panel. I usually have to add them to the list of games manually. Weird.
Yeah, I would think Galaxy at least would be part of the “engines” AMD supports, but I don’t think they do; it’s Steam, Origin and Uplay, with (partial?) Windows Store support added as well.
Epic Games Store support came as of the newer driver update, and I’m unsure if Blizzard/Activision and their launcher are detected at all.
EDIT: Would expect them to support GOG or at least Galaxy in time for Cyberpunk 2077’s release.
(Would be pretty important after all.)
You are right, all those platforms are detected, and even Uplay and Origin games too.