
10GB VRAM enough for the 3080? Discuss..

Associate
Joined
27 Sep 2020
Posts
33
There was no mention of adding a 3080 to consoles. Where did you get that from? The discussion is as per the title: 10GB VRAM enough for the 3080? Discuss..
What I did say was that the next-gen consoles will only be using 10GB of their total 16GB of RAM for graphics. So if the consoles can do 4K in 10GB, or more often under, why can't the PC?

I mean they can, but that doesn't mean they couldn't benefit from more if it's available. The console argument is more that consoles tend to get by with less than PCs, so if the consoles have this much to get better results, the PC might need more.

PCs can manage 4K with less than 10GB; that doesn't mean they aren't taking a hit to visuals and/or performance to do it. Consoles do too, but tend to have more work done on tuning things to their fixed hardware.
 
Permabanned
Joined
31 Aug 2013
Posts
3,364
Location
Scotland
I mean they can, but that doesn't mean they couldn't benefit from more if it's available. The console argument is more that consoles tend to get by with less than PCs, so if the consoles have this much to get better results, the PC might need more.

PCs can manage 4K with less than 10GB; that doesn't mean they aren't taking a hit to visuals and/or performance to do it. Consoles do too, but tend to have more work done on tuning things to their fixed hardware.

Well, we already know that you can turn up AA to improve visuals and that can eat up VRAM, but that has the downside of needing more GPU power. Which brings us back to 'is 10GB enough?', and the answer being yes, as you will need a better GPU before you need more VRAM.

If my 1080Ti had 24GB of VRAM, would it be good enough to run 4K+ with RT today? No, of course not, because the GPU doesn't have the feature set or the grunt. What about the 2080Ti then, as it is more modern? The answer is still no; it didn't have the RT performance that was promised to begin with. The 2080Ti is interesting though, as what you can do there is render at a much lower resolution and then upscale with DLSS. Rendering at the lower resolution, by coincidence, takes less VRAM. So what we are seeing is less usage of VRAM on older cards trying to play modern titles. In two years' time we will see the same thing with the 3080.
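To put rough numbers on that lower-resolution point, here's a back-of-the-envelope sketch in Python. It assumes a single RGBA16F render target (8 bytes per pixel) and a 1440p internal resolution for DLSS; both are assumptions for illustration, and real engines keep many more screen-sized buffers:

```python
# Rough framebuffer maths: rendering internally at 1440p and upscaling to 4K
# shrinks every screen-sized buffer. One RGBA16F target assumed; real engines
# hold many more (G-buffer, depth, TAA history), so treat this as a sketch.

def target_mb(width: int, height: int, bytes_per_pixel: int = 8) -> float:
    """Memory for one render target of the given size, in MB."""
    return width * height * bytes_per_pixel / 1024**2

native_4k = target_mb(3840, 2160)  # ~63 MB per target at native 4K
internal = target_mb(2560, 1440)   # ~28 MB per target at a 1440p internal res

print(f"4K native:      {native_4k:6.1f} MB per render target")
print(f"1440p internal: {internal:6.1f} MB per render target")
print(f"Saving:         {1 - internal / native_4k:.0%} per screen-sized buffer")
```

That per-buffer saving, multiplied across every intermediate target, is where the lower VRAM usage under DLSS comes from.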
 
Associate
Joined
27 Sep 2020
Posts
33
Well, we already know that you can turn up AA to improve visuals and that can eat up VRAM, but that has the downside of needing more GPU power. Which brings us back to 'is 10GB enough?', and the answer being yes, as you will need a better GPU before you need more VRAM.

If my 1080Ti had 24GB of VRAM, would it be good enough to run 4K+ with RT today? No, of course not, because the GPU doesn't have the feature set or the grunt. What about the 2080Ti then, as it is more modern? The answer is still no; it didn't have the RT performance that was promised to begin with. The 2080Ti is interesting though, as what you can do there is render at a much lower resolution and then upscale with DLSS. Rendering at the lower resolution, by coincidence, takes less VRAM. So what we are seeing is less usage of VRAM on older cards trying to play modern titles. In two years' time we will see the same thing with the 3080.

Not saying it isn't enough, but if the consoles are able to use 10GB, shouldn't a 3080 have at least a chance of being able to? Sure, the 2080Ti can use DLSS to make up for not having the grunt to run a certain resolution, but that can also be used to let it turn on other VRAM-heavy techniques that may have more visual impact. VRAM usage and the power needed to use that VRAM isn't a simple relationship. Plenty of non-gaming GPU-accelerated tasks can benefit from the extra VRAM, so it's likely that part of the reason gaming engines don't scale to over 10GB is that there's been no reason for them to, as gaming cards haven't had it.
 
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
Well, we already know that you can turn up AA to improve visuals and that can eat up VRAM, but that has the downside of needing more GPU power. Which brings us back to 'is 10GB enough?', and the answer being yes, as you will need a better GPU before you need more VRAM.

Bingo.

vRAM isn't a fixed demand per game; the amount of vRAM you need increases as you improve graphical fidelity (texture size, amount of AA, shadow resolution, etc.). You can literally see this in some games' video options, as the developers will tell you the estimated vRAM budget for each feature, and every video setting you alter has an impact on the vRAM usage.
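As a rough illustration of why each notch on the texture slider eats so much more vRAM, here's a quick Python sketch; the ~1 byte per texel (BC7-style block compression) and ~33% mipmap overhead are assumed ballpark figures, not numbers from any particular game:

```python
# Illustrative texture memory per quality tier. Assumes BC7-style block
# compression (~1 byte per texel) plus ~33% overhead for the full mip chain.

MIP_OVERHEAD = 4 / 3  # a full mip chain adds roughly a third again

def texture_mb(resolution: int, bytes_per_texel: float = 1.0) -> float:
    """Approximate VRAM cost of one texture at the given resolution, in MB."""
    return resolution**2 * bytes_per_texel * MIP_OVERHEAD / 1024**2

for tier, res in [("Low", 1024), ("Medium", 2048), ("High", 4096), ("Ultra", 8192)]:
    print(f"{tier:<6} ({res}x{res}): {texture_mb(res):6.1f} MB per texture")
```

Each tier quadruples the per-texture cost, which is why one step on that slider can swing a whole scene's budget by gigabytes.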

But those video options also increase demand on the GPU. What some people have been doing is speculating that future games will demand more vRAM (true), but then ignoring the impact that will have on the frame rate. The people playing games like FS2020 who max it out (4K Ultra) are getting something like 20fps, which is unplayable even by the most forgiving standards, and it's still not filling 10GB of vRAM (when measured by actual usage). I think someone managed to get Avengers at 4K Ultra up to about 9GB of vRAM usage, but that was at 17fps.

Not saying it isn't enough, but if the consoles are able to use 10GB, shouldn't a 3080 have at least a chance of being able to? Sure, the 2080Ti can use DLSS to make up for not having the grunt to run a certain resolution, but that can also be used to let it turn on other VRAM-heavy techniques that may have more visual impact. VRAM usage and the power needed to use that VRAM isn't a simple relationship. Plenty of non-gaming GPU-accelerated tasks can benefit from the extra VRAM, so it's likely that part of the reason gaming engines don't scale to over 10GB is that there's been no reason for them to, as gaming cards haven't had it.

True, and that's why there are prosumer video cards like the 3090, which come with around 10% more GPU power but over 2x the vRAM, and a hefty price premium to boot. In previous generations it was the Titans, and of course the Quadro professional line, that all had way more vRAM.
 
Associate
Joined
3 Jul 2012
Posts
425

Resident Evil 3 remake with 200% scaling is playable on a 3090. Used 21GB of VRAM at one point. Great show-off game in 8K.

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,191
Location
Greater London
Resident Evil 3 remake with 200% scaling is playable on a 3090. Used 21GB of VRAM at one point. Great show-off game in 8K.
By the time 8K becomes a thing we will be rocking a 5080 that will be at least 2x the performance of a 3090 :p

I rushed to 4K and have been using it since 2014, but I will not be rushing to 8K. Machine learning tech getting better and eventually becoming standard will likely solve the anti-aliasing issue. The only way I will go 8K early is if DLSS-type tech becomes standard in big, demanding triple-A games; that way I still benefit from all the extra pixels even though the game is running at 4K internally.


So even 20GB isn't enough...
He is likely just reporting the cache like everyone else does. It would be interesting to see what it really is if he used the software that displays true usage.
 
Associate
Joined
3 Jul 2012
Posts
425
He is likely just reporting the cache like everyone else does. It would be interesting to see what it really is if he used the software that displays true usage.

If RE3 is anything like RE2 it's incredibly well optimised (from footage of how RE2 works under the hood you can see the insane amount of occlusion culling they used), but then again it could simply be cached. It would be interesting to see what happens if you tried the same test on a 3080.
 
Soldato
Joined
27 Nov 2005
Posts
24,559
Location
Guernsey
There was no mention of adding a 3080 to consoles. Where did you get that from? The discussion is as per the title: 10GB VRAM enough for the 3080? Discuss..
What I did say was that the next-gen consoles will only be using 10GB of their total 16GB of RAM for graphics. So if the consoles can do 4K in 10GB, or more often under, why can't the PC?
PC games can have much higher graphics settings than the same game on a console, so you can't really compare the video memory usage between them.
 
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
Resident Evil 3 remake with 200% scaling is playable on a 3090. Used 21GB of VRAM at one point. Great show-off game in 8K.

Uh huh. Now upgrade to the new MSI Afterburner beta that shows real vRAM usage, do the same test again, and post the results along with the average (or representative) frame rate.

Not only that, but vRAM is widely reported as badly measured in Resident Evil 3, with the menu telling you that you may run into issues because it needs 12GB of vRAM, yet cards with only 4GB max it out without a problem. They even speculate in this article that the different vRAM configs for textures are simply reserved for streaming and not what is actually in use: https://www.pcgamer.com/uk/resident-evil-3-best-settings/

Basically we need real numbers, and MSI Afterburner will tell you that. I'd do it myself, except I don't own the game.
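For anyone wanting to log numbers themselves, here's a minimal Python sketch using NVML (install with pip install nvidia-ml-py). One caveat: NVML reports device-wide allocated memory, so like the old overlays it counts cached and reserved data too; the per-process counter in the Afterburner beta remains the closer measure of real usage:

```python
# Poll VRAM through NVML once a second while a game runs. Note that
# nvmlDeviceGetMemoryInfo reports device-wide *allocated* memory, so it is
# a ceiling on usage, not the true working set.
import time

import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)
        print(f"VRAM allocated: {mem.used / 1024**3:5.2f} / {mem.total / 1024**3:5.2f} GiB")
        time.sleep(1.0)  # sample once a second, then tab back into the game
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```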
 
Permabanned
Joined
31 Aug 2013
Posts
3,364
Location
Scotland
PC games can have much higher graphics settings than the same game on a console, so you can't really compare the video memory usage between them.

Yes, you can turn up the settings on PC, but then the GPU can't keep up. Consoles can also have much higher settings, but then their GPU won't keep up either.
 
Soldato
Joined
27 Nov 2005
Posts
24,559
Location
Guernsey
Yes, you can turn up the settings on PC, but then the GPU can't keep up. Consoles can also have much higher settings, but then their GPU won't keep up either.
But is the amount of VRAM a game uses always tied to how many FPS you get?

For example:

Does a game that uses only 6GB of VRAM always give more FPS than a game that uses 8GB of VRAM?
 
Man of Honour
Joined
13 Oct 2006
Posts
90,823
"16K x 16K landscape textures"

This must be a professional app, or he's making games for Quadro users only, which raises the question of why he doesn't have one already...

It's not clear from their post whether that is 16K x 16K tiles with streaming, or 16K x 16K images being streamed into the virtual tiles. Either way, that is far beyond what is useful even in a super-high-res game; you don't really gain anything beyond 8K tiles and 4K assets for gaming purposes, even when people have higher-than-UHD displays (if done properly).
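For a sense of scale, some rough arithmetic in Python on what a single 16K x 16K texture costs; the 4 bytes per texel uncompressed, ~1 byte per texel block-compressed (BC7), and ~33% mip-chain figures are assumptions for illustration:

```python
# Approximate size of one landscape texture at various settings.

MIP = 4 / 3  # full mip chain adds roughly a third again

def texture_gib(resolution: int, bytes_per_texel: float) -> float:
    """Approximate VRAM cost of one texture, in GiB."""
    return resolution**2 * bytes_per_texel * MIP / 1024**3

print(f"16K uncompressed: {texture_gib(16384, 4):.2f} GiB")  # ~1.33 GiB each
print(f"16K BC7:          {texture_gib(16384, 1):.2f} GiB")  # ~0.33 GiB each
print(f"8K BC7:           {texture_gib(8192, 1):.2f} GiB")   # ~0.08 GiB each
```

At a third of a GiB per compressed texture, even a handful of 16K landscape tiles resident at once would dwarf a typical game's texture pool, which is the 'beyond useful' point above.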
 
Permabanned
Joined
12 Sep 2013
Posts
9,221
Location
Knowhere
I know it's a long shot, but I feel it will easily last long enough for me. I will be upgrading from integrated graphics; I game at 800*600, sometimes at 30fps. I have low standards.
So I think I will upgrade once my machine can no longer do 60+ fps at medium/high settings at 1440p in the majority of games.

Also, I am just 17; I don't think I will have the money to afford a $700+ GPU again after two years :p

It's not that big a long shot. If you don't consider ultra graphics and high resolutions a must, you'll be more than happy with a 3080 for many years. I'm in the same boat in a way: I'm back to running a 3440x1440 monitor with an RX 480 while I wait on Big Navi.
My 480 is still doing okay all things considered, although I've had to drop the settings a little more than I'd like to maintain good frame rates in games like Dirt Rally 2.0, so the car now looks like it's floating above the textureless ground. :rolleyes:
 