I agree that 8GB is not enough for 4K, but 10GB is 25% more memory. I had a 2080 Ti before I sold it off two weeks ago (thank god I sold when I did, bagged $1200). I never found a single game that would truly use 10GB (actual usage, not allocation). Many games will cache as much memory as is available: offer them a 16GB card and they will cache away, filling up with data that may or may not ever be relevant. But you can then run the same game on an 8GB GPU and have zero issues.
Improved texture compression on the GPU might also help, along with better usage and streaming methods in game engines. Caching around 75-80% of total VRAM is probably a fine idea too, but 8GB will probably do well for a while yet.
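To make that caching idea concrete, here's a minimal sketch of a texture cache that budgets ~80% of total VRAM and evicts least-recently-used entries when it would overflow. The class, sizes, and asset names are made up for illustration, not from any real engine:

```python
# Minimal sketch of the caching idea above: budget ~75-80% of total VRAM
# for cached textures and evict least-recently-used entries when a new
# asset would exceed it. All names and sizes here are hypothetical.
from collections import OrderedDict

class TextureCache:
    def __init__(self, total_vram_mb, budget_fraction=0.8):
        self.budget_mb = total_vram_mb * budget_fraction
        self.used_mb = 0.0
        self.entries = OrderedDict()  # asset name -> size in MB

    def request(self, name, size_mb):
        if name in self.entries:            # cache hit: mark as recently used
            self.entries.move_to_end(name)
            return
        while self.used_mb + size_mb > self.budget_mb and self.entries:
            _, evicted_mb = self.entries.popitem(last=False)  # evict LRU
            self.used_mb -= evicted_mb
        self.entries[name] = size_mb
        self.used_mb += size_mb

cache = TextureCache(total_vram_mb=10_240)  # a 10GB card
cache.request("rock_albedo_4k", 21.3)
print(f"{cache.used_mb:.1f} MB used of {cache.budget_mb:.0f} MB budget")
```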
The only really memory-limited card in recent memory was AMD's Fury with its 4GB, which launched just as that amount was starting to be exceeded and 5-6GB utilization was becoming more common at higher settings. Even then, the card got a year and a half or so before it became a more noticeable problem.
Yeah, game engine improvements and working within a cached budget should keep actual resource usage and memory allocation nice and efficient, with a few games or game engines possibly standing out a bit more.
I expect 10GB to last until the next GPU models arrive, and probably well after, before it becomes truly limiting.
EDIT: Especially as consoles, as far as I know of the designs, still use a single shared memory pool, so being smart with what's available will see this sort of design improve too. They do have more memory now, but developers will still have to use it well and not exhaust or waste what's available.
(Utilizing the new storage capacity and IO speeds could also be a boon for streaming, keeping the required memory usage lower.)
EDIT: DLSS too, but it's still bound to the game engine (UE4) or to specific per-game implementations, plus it's handled by NVIDIA and isn't too open. It does work well for what it does, though, so if it could be separated off and made stand-alone, even as a slightly worse quality model, that would really change things up.
(And open up entirely new and strange debates, hah: 4K versus 4K scaled through AI.)
Certainly would cut memory usage by a bit though.
EDIT: Less of an overall brute-force approach than just having the hardware, although the highest settings could still keep a bit of a skew on actual image improvements versus hardware demand. But eh, that's nothing new.
Suppose the low-level API focus on consoles might also see developers utilizing the hardware and getting to know the API well, improving game engine efficiency and optimization, although whether this translates to PC or not... well, perhaps not entirely.
There, that should be everything. It's going to be interesting to see how things change up once the new console hardware is out on the market, but it won't happen anytime soon on PC from that alone.
(Lowest common denominator after all; D3D12_2 / DirectX 12 Ultimate support isn't landing on PC until Windows 10 21H1 either.)
Yes, the 8GB was shared; in fact, we know the Xbox One reserves 3GB for the OS, leaving 5GB for games. Using that number, the XSX's 13.5GB is a 2.7x VRAM increase.
Nvidia went from 3GB on the 780 to 10GB on the 3080 within that time: a 3.33x increase.
Remember, PS3 to PS4 was a 16x generational jump, compared to this 2.7x generational jump.
Screen resolution increased from 720p to 1080p, a 2.25x jump, and next gen is going from 1080p to 2160p, a 4x jump. More VRAM is being used due to the massive resolution increase, not texture quality. You should compare apples to apples: see how much VRAM those games that hit 8GB at 4K consume when set to 1080p with the same textures.
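For anyone who wants those multipliers and pixel counts spelled out, a quick back-of-the-envelope script (all numbers taken from the posts above):

```python
# Generational VRAM multipliers and raw pixel-count jumps between
# target resolutions, as cited in the posts above.
gens = {
    "PS3 -> PS4 (512MB -> 8GB)": 8 * 1024 / 512,
    "XB1 game pool -> XSX (5GB -> 13.5GB)": 13.5 / 5,
    "GTX 780 -> RTX 3080 (3GB -> 10GB)": 10 / 3,
}
for label, mult in gens.items():
    print(f"{label}: {mult:.2f}x")

res = {"720p": (1280, 720), "1080p": (1920, 1080), "2160p": (3840, 2160)}
px = {name: w * h for name, (w, h) in res.items()}
print(f"720p -> 1080p: {px['1080p'] / px['720p']:.2f}x pixels")   # 2.25x
print(f"1080p -> 2160p: {px['2160p'] / px['1080p']:.2f}x pixels")  # 4.00x
```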
I am extremely comfortable betting that you will still be able to play any game with max Ultra textures at 4K on a 3080 in two years when Nvidia Hopper comes out, if not all the way to 2024 when the GPUs after that arrive.
This also doesn't touch on the fact that system RAM is shared as additional VRAM in many cases; as long as the bandwidth and needs are low enough, shuffling resources between RAM and VRAM can in some cases be done without a noticeable effect on performance.
On another note, one of the reasons I want 24GB of VRAM is to speed up the execution of that text-based AI story builder game. It uses something like 20GB of VRAM, and while it is perfectly playable on my 1080 Ti, the delay between interactions is a bit much.
Throwing more VRAM at it will help the neural network decide on its responses more quickly as the story progresses, mostly by keeping the whole model resident instead of spilling into slower system RAM.
Other similar fun neural network concepts will likely love the additional VRAM provided by the 3090.
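For a rough sense of why these models eat VRAM: weight storage alone scales with parameter count times bytes per parameter, before activations and framework overhead. The model sizes below are illustrative guesses, not the actual figures for that game:

```python
# Rough sketch: a neural network's weight footprint is approximately
# parameter_count * bytes_per_parameter. Parameter counts here are
# illustrative examples only, not the game's real model.
def weights_gb(params, bytes_per_param):
    return params * bytes_per_param / 1024**3

for name, params in [("1.5B-parameter model", 1.5e9),
                     ("6B-parameter model", 6e9)]:
    print(f"{name}: {weights_gb(params, 4):.1f} GB in fp32, "
          f"{weights_gb(params, 2):.1f} GB in fp16")
```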
That's a huge bet to make. Traditionally, there are always games that push texture quality further than seen on consoles. You're basically saying all developers will limit themselves to the console spec for next gen. For console-equivalent textures, yeah, I expect the 3080 to handle that just fine. For the higher-res, PC-exclusive textures that I expect to exist, and that should exist if the past has anything to say about it? No, I don't.
And before you bring up the Xbox One X as covering those higher-res PC textures: there were already quite a few games with higher-res textures on PC for Microsoft to refer to, and their engineers would know how much RAM to allow for significantly higher-res textures as part of the push towards better graphics. You could remove the Xbox One X from the equation and you'd still get higher-res textures on PC versus the 8GB consoles, because some developers like to push their games and let the work of their texture artists shine.
I've mentioned increased world complexity and the SSD situation too. We already have games hitting around the 7-8GB VRAM mark, as I mentioned earlier. You're basically saying that even with next-gen consoles pushing visuals and game complexity further, VRAM usage will barely increase beyond those current-gen games, just because the XSX determines what devs will stick to as their max texture res.
The 3090 is a thing (soon enough), leaning into 8K gaming. Some games will probably increase texture res for 8K gaming, which will also prove beneficial for 4K gaming, should you be able to run it. I can tell the difference at 1080p in any existing game between max textures and console textures.
We have yet to see the RDNA 2 cards as well; many are expecting a 16GB flagship, so the 3090 may not be the only card capable of significantly higher-res textures for next-gen games.
Yeah, the VRAM requirements shoot up when textures are pushed further than the base consoles are capable of. If you stick to console-equivalent textures, 3-4GB of VRAM is fine at 1080p and the 8GB doesn't matter; 4GB can even handle 4K gaming with console-equivalent textures.
Ultimately, I think the misunderstanding lies in the fact that you don't expect PC to have higher-res texture options versus consoles, and I do. I've mentioned increased game complexity; the processing power is there in next-gen consoles to push graphics much further, so VRAM usage should naturally increase on PC, unless texture LODs are lowered versus existing games, and I doubt that sort of regression will occur. I expect devs to lower texture LODs and overall settings on consoles to fit the 16GB total RAM spec, while sometimes making those higher-res versions available on PC. This is the way it's always been, or at least has been for the last couple of generations.
The thing about 8K gaming is quite funny. You know how they're doing it, right? It's DLSS upscaling the image to 8K. Hope you guys know that.
In the RTX 3090’s 8K-gaming demo, Nvidia ran the games at 1440p and then used DLSS to upscale them to 4320p. This is crucial because it frees up the video card to focus on rendering frames quickly and with lots of graphical effects.
Meanwhile, the RTX 3090 is what Nvidia calls a “BFGPU,” a massive $1,499 graphics card with 24GB of GDDR6X memory at 19.5Gbps. Jensen proudly boasted that the RTX 3090 can play games at 8K at 60 FPS. This isn't rendered natively, of course, but instead uses Nvidia's supersampling A.I. technology, known as DLSS 2.0.
Of course, if anyone thought you were going to be getting 8K60 without DLSS, I have a bridge to sell ya.
33 million pixels is not a joke.
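To spell out the pixel math behind that demo (resolutions as described above, the rest is simple arithmetic):

```python
# Pixel counts behind the "33 million pixels" remark: native 8K versus
# the 1440p internal resolution DLSS upscales from in Nvidia's demo.
resolutions = {"1440p (DLSS input)": (2560, 1440),
               "2160p / 4K": (3840, 2160),
               "4320p / 8K": (7680, 4320)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, p in pixels.items():
    print(f"{name}: {p / 1e6:.1f}M pixels")

ratio = pixels["4320p / 8K"] / pixels["1440p (DLSS input)"]
print(f"Native 8K would shade {ratio:.0f}x more pixels than the 1440p input")
```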
@GPUnity I actually do think we will still see Ultra texture packs on PC that push further than consoles, but establishing where the consoles will be is critical to understanding what pushing further really means.
And I know that you know this, you make wonderful texture packs, but I just want to make it clear that we all agree that texture resolution has nothing to do with screen resolution.
If RDNA2 does have 16GB cards, they will unfortunately be bandwidth-limited compared to the 30 series. The only way to get 16GB without split speeds (which would be a terrible idea on PC) is a 256-bit or 512-bit bus. 512-bit will not happen; it is very expensive and power-hungry. And at 256-bit, even if they sourced the same G6X memory as Nvidia, they would be at 624 GB/s instead of 760 GB/s.
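The bandwidth figures follow directly from bus width times data rate; a quick sanity check of the numbers above:

```python
# Peak memory bandwidth = (bus width in bits / 8) * data rate in Gbps,
# giving GB/s. These match the 760 vs 624 GB/s comparison above.
def bandwidth_gbs(bus_bits, gbps):
    return bus_bits / 8 * gbps

print(f"RTX 3080 (320-bit @ 19 Gbps G6X):   {bandwidth_gbs(320, 19.0):.0f} GB/s")
print(f"Hypothetical 256-bit @ 19.5 Gbps:   {bandwidth_gbs(256, 19.5):.0f} GB/s")
```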
Anyway, back to the main point: no, I don't disagree that PCs will have higher available settings than consoles. We always have, of course, even before we look at what the modding community puts out.
Still isn't a limitation. 10GB of VRAM in 2021 is not the same as 10GB of VRAM in 2020.
This is thanks to the new streaming technologies; it's a straight-up magnitude decrease in the VRAM needed.
And this keeps the beast fed, allowing streaming with smaller committed VRAM sizes.
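As a purely conceptual sketch of how streaming shrinks the committed footprint: keep only the mip levels actually being sampled resident instead of whole mip chains. The sizes and mip selection below are invented for illustration, not any real engine or API:

```python
# Toy illustration of "smaller committed VRAM": instead of keeping a
# full mip chain resident, commit only the mip levels the renderer
# actually sampled. Purely conceptual; numbers are made up.
def mip_size_mb(base_mb, mip):
    return base_mb / (4 ** mip)  # each mip level is 1/4 the previous size

BASE_MB = 64                       # hypothetical 4K texture, mip 0
full_chain = sum(mip_size_mb(BASE_MB, m) for m in range(6))
sampled_mips = [2, 3, 4]           # distant object: only small mips used
committed = sum(mip_size_mb(BASE_MB, m) for m in sampled_mips)
print(f"Full chain resident: {full_chain:.1f} MB; "
      f"sampled mips only: {committed:.2f} MB")
```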
This all circles back to the fact that, yes, I do trust Nvidia. They know what the future of PC gaming looks like (they help establish it every time), and they have never deliberately held back their flagship; that is why they have held the performance crown for the last 10 years. Nvidia now holds 80% of the discrete GPU market. PC developers know this, and many of them work directly with Nvidia to optimize their games.
For the people who use VRAM for things other than gaming, or who are worried about keeping their GPU beyond 4 years, they've got a product for you: the 3080 20GB or the 3090.
For 99% of PC gamers (the majority of whom are gaming not at 4K but at 1440p or 1080p), the 3080 10GB will not be holding them back.
Ouch… it seems the prices for the 30 series have started to be revealed in Sweden, and… I'll probably stick with a third-party OC'd 3080, since it costs around $1000 here whereas the 3090 costs around $2100.
Yeah, saw that yesterday through Webhallen; VAT's doing its thing.
20,000 SEK for the 3090, so about 80% of what this entire system cost, ha ha.
And around 10,000 SEK for the 3080, but no pricing or even listings for the 3070 yet, which I expect will land a bit closer to the usual GPU pricing.
Remove the 25% VAT and that's roughly the ~1500 and ~800 MSRPs respectively.
(EUR or USD, it's not too big a difference while the 1 USD = 1 EUR thing is still going.)
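For reference, the ex-VAT math (currency conversion deliberately left out, since the exchange rate moves):

```python
# Quick ex-VAT math on the Swedish listings: Sweden's VAT is 25%, so the
# pre-tax price is the listed price divided by 1.25.
def ex_vat(listed_sek, vat=0.25):
    return listed_sek / (1 + vat)

for card, listed in [("RTX 3090", 20_000), ("RTX 3080", 10_000)]:
    print(f"{card}: {listed:,} SEK listed -> {ex_vat(listed):,.0f} SEK ex VAT")
```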
Very Additional Tax yes.
EDIT: Well, the usual GPU pricing I like to keep to is 3,000, maybe 4,000 SEK, whereas I expect the 3070 to be more like 6,000, plus low availability, limited stock, and an immediate price hike as the initial shipment starts selling out.
EDIT: I suppose that means AMD's GPU offerings might scale up to around 10,000 SEK, and there's a chance NVIDIA jams in 3080 Ti and 3070 Ti variants at perhaps 15,000 and 8,000 SEK.
Give or take a few hundred here and there, depending on how AMD's performance scales compared to NVIDIA's; rumors suggest somewhere in between the 3070 and 3080, but that remains to be seen and tested.
Not to mention the driver situation and NVIDIA's tech and features, which are going to be very tough to compete against.
(The 5700 was priced really low even for a mid-range card, around 3,000-4,000 SEK even for some of the custom designs. I don't think it'll be like that this time, though pricing below NVIDIA's recommended could happen.)
Why only 10 GB of memory for RTX 3080? How was that determined to be a sufficient number, when it is stagnant from the previous generation?
[Justin Walker] We're constantly analyzing memory requirements of the latest games and regularly review with game developers to understand their memory needs for current and upcoming games. The goal of the 3080 is to give you great performance at up to 4K resolution with all the settings maxed out at the best possible price.
In order to do this, you need a very powerful GPU with high speed memory and enough memory to meet the needs of the games. A few examples - if you look at Shadow of the Tomb Raider, Assassin’s Creed Odyssey, Metro Exodus, Wolfenstein Youngblood, Gears of War 5, Borderlands 3 and Red Dead Redemption 2 running on a 3080 at 4k with Max settings (including any applicable high res texture packs) and RTX On, when the game supports it, you get in the range of 60-100fps and use anywhere from 4GB to 6GB of memory.
Extra memory is always nice to have but it would increase the price of the graphics card, so we need to find the right balance.
I'm interested in a replay, but I'd want to replay it fully, and getting back into the first game again is a bit rough, though not impossible.
A full moderniz
EDIT: Posted?
(Yes, it seems hitting Enter sometimes both makes a new line and submits a new post. Good to know.)
So… a full modernized remake would be nice, but it's probably not happening after the previous Xbox updated version got cancelled.
Ray tracing, so likely D3D12, possibly Vulkan, but most likely D3D12 with RTX.
Loading-time updates could be anything, even a larger game engine overhaul or changes to the underlying systems, which could also include scripting and other assets (mod compatibility issues will ensue; well, that's probably unavoidable). It's been announced, so the usual trailer and promotional screenshots showing more of what exactly they're doing with the game can't be too far away.