10GB vram enough for the 3080? Discuss..

Status
Not open for further replies.
Soldato
Joined
15 Oct 2019
Posts
11,694
Location
UK
Honestly, unless your card is ****, I don't know why anyone would buy a 3080. You're just going to get mugged off when the 16GB or 20GB 3080s come out.
10GB of VRAM is fine if you play at 1440p; 20GB would just be overkill and a waste, especially if it ends up costing you more cash.
 
Last edited:

TNA

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,570
Location
Greater London
Honestly, unless your card is ****, I don't know why anyone would buy a 3080. You're just going to get mugged off when the 16GB or 20GB 3080s come out.
Get a 3090 then; we will see who gets mugged off when the 4080 comes out and the value of the cards plummets. Lol.

A 20GB 3080 will end up costing a lot more; the extra 10GB won't be free. So I can't see why anyone would be mugged off.

The only way I see people getting mugged off is if AMD bring much better price-to-performance and force Nvidia to respond by dropping prices.
 
Soldato
Joined
30 Dec 2011
Posts
5,447
Location
Belfast
Highly relevant article about Nvidia cards getting more RAM, not because they need it, but because AMD is reportedly using 12 & 16GB and Nvidia doesn't want to look second best.

Is 10GB enough? Not if you want to posture with marketing, it isn't!

https://www.techradar.com/uk/news/a...uld-both-have-more-vram-than-nvidias-rtx-3080

Having lower VRAM has never stopped Nvidia before. The reason the RX 6900 has 16GB is that it has a 256-bit bus, so going with less VRAM would have meant an 8GB card aimed at 4K, and that is not enough.

The bus width dictates the number of VRAM chips that MUST be mounted on the PCB. A GPU with a 256-bit bus has 8 memory slots, since each memory chip has a 32-bit wide bus. You can double the memory amount on a 256-bit bus and have 16 memory chips (front and back of the PCB), but you couldn't, for example, have 10 chips. VRAM chips are usually 1GB or 2GB.

192-bit bus = 6 modules (hence the 12GB Navi 22 rumours). Clearly 6GB of VRAM for this level of GPU would be a serious problem for 4K gaming.
256-bit bus = 8 modules (so either 8GB or 16GB versions). Going with 8GB would not be an option if this GPU is aimed at 4K.
320-bit bus = 10 modules (now you see why we have 10GB or 20GB versions).
384-bit bus = 12 modules (12x2GB). It would be possible for Nvidia to release a 12GB 3090 as a cut-down, cheaper model that is still decent for 4K gaming. This could be an option if the RX 6900 XT is close to 3090 speed for half the price: Nvidia release a cheaper 12GB 3090 and they have an answer to AMD.

So Nvidia and AMD went with the VRAM capacities they did due to design and practicality constraints. Now we need to see if the rumoured 128MB AMD Infinity Cache will actually make up for the lower bus widths on AMD Navi GPUs.
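
To make the arithmetic above concrete, here is a minimal sketch (my own, not the poster's) that enumerates the capacities a given bus width allows, assuming one memory chip per 32-bit channel and 1GB or 2GB chip densities; the 2GB case also stands in for a clamshell layout of 1GB chips on both sides of the PCB.

```python
# Minimal sketch, assuming one memory chip per 32-bit channel and 1GB/2GB chip
# densities (the 2GB figure also covers a two-sided clamshell of 1GB chips).
def possible_vram_configs(bus_width_bits: int, densities_gb=(1, 2)) -> list[int]:
    chips = bus_width_bits // 32            # chip count is fixed by the bus width
    return sorted({chips * d for d in densities_gb})

for bus in (192, 256, 320, 384):
    print(f"{bus}-bit bus -> {possible_vram_configs(bus)} GB")
# 192-bit bus -> [6, 12] GB
# 256-bit bus -> [8, 16] GB
# 320-bit bus -> [10, 20] GB
# 384-bit bus -> [12, 24] GB
```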

As for the question of whether 10GB is enough for 4K gaming, well, I would prefer not to take that chance, and if I wanted a 3080 I would wait until the 20GB versions come out. I have a 2080 with 8GB and have been fine at 4K, but that's because I need to compromise on settings in most games as the grunt just isn't there. So ironically I don't see the VRAM issue, because I have to lower settings to maintain FPS in modern games, and that also reduces VRAM requirements. Add 60-70% extra performance and I can turn up the graphics settings, and VRAM capacity is suddenly a problem. So I suspect there will be occasions when the 10GB on a 3080 may become a problem at 4K. The grunt may be there but the VRAM isn't.
 
Last edited:
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
I'm surprised more people haven't mentioned this. VRAM isn't the only thing that can stop a new game from reaching xx settings and xx frame rates. MSFS 2020 is here, now, and it can't run at 60fps on anything, with any amount of VRAM.

Do people really think the 3080 would be able to max every new game over the next two years... if only it had more VRAM?

I tried pointing this out in the Ampere 8nm thread for about 30-40 pages of back and forth prior to launch, but it didn't really go anywhere. There's a lot of old-school thinking that VRAM is somehow just a big cache where all of the assets in the game sit in case they're needed. And that was true 15+ years ago. But engines have moved on since then to dynamic, just-in-time streaming of assets, especially in open-world games. So VRAM is much less about holding everything the game could need: GTA V, my go-to example, is a 90GB install for me, but I have 8GB of VRAM, and it streams in all those assets just fine as you zip about the island in a heli or whatever.

VRAM today is way less about being a dumb cache and more about keeping just what is needed to render the next frame. From an architecture point of view this makes sense: the powerhouse in a video card is the GPU, it's literally what does the work to calculate the next output frame based on all the inputs, and the VRAM's only job is to feed the GPU the assets it needs to perform those calculations. That's it. Using it as a dumb cache is expensive, because the VRAM modules are expensive.

So there's this intrinsic connection between VRAM size (not just bandwidth) and GPU speed, and vice versa. You can think about it like this: every asset you're putting into VRAM is just more stuff the GPU will be doing work on, and as you demand more work from your GPU the frame rate goes down. In modern game implementations, if you're putting a crap-ton of stuff into VRAM then you're also putting more load on the GPU. And all attempts so far to find a game you can load up with 10GB+ of VRAM and go "AH HAH! 10GB is not enough!!" on the 3080 have resulted in unplayable frame rates. And lo and behold, you drop all those settings to make it playable and your VRAM usage is back below 10GB.

It's also why we get all these erroneous arguments about future-proofing, about the games in 2+ years that will use more VRAM. Sure they will. But a 3080 won't be playable maxed out by then; you'll have long since run into that GPU bottleneck. And that's why you shouldn't think about VRAM in terms of future-proofing, or game performance, or what you feel you deserve, or what looks nice, or what is the biggest number, but rather "How much can this particular GPU realistically use and still maintain playable frame rates?", which on some rough aggregate is pretty much game agnostic.
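
To illustrate the streaming model rather than the dumb-cache model, here is a toy sketch (mine, not the poster's; the asset names and sizes are made up): a fixed VRAM budget that streams in only what the upcoming frame needs and evicts the least-recently-used assets when it fills up.

```python
# Toy sketch of just-in-time asset streaming into a fixed VRAM budget.
# Hypothetical asset names and sizes; real engines are far more sophisticated.
from collections import OrderedDict

class VramPool:
    def __init__(self, budget_gb: float):
        self.budget_gb = budget_gb
        self.resident = OrderedDict()   # asset name -> size in GB, least recently used first

    def used_gb(self) -> float:
        return sum(self.resident.values())

    def request(self, asset: str, size_gb: float) -> None:
        """Make an asset resident just in time for the next frame."""
        if asset in self.resident:
            self.resident.move_to_end(asset)                    # already streamed in, mark it hot
            return
        while self.resident and self.used_gb() + size_gb > self.budget_gb:
            evicted, freed = self.resident.popitem(last=False)  # drop the coldest asset
            print(f"evicting {evicted} ({freed} GB)")
        self.resident[asset] = size_gb

pool = VramPool(budget_gb=10.0)
for frame_assets in (["city_block_a", "car_lod0"], ["city_block_b", "car_lod0"]):
    for name in frame_assets:
        pool.request(name, size_gb=3.5)
    print("resident after frame:", list(pool.resident), f"({pool.used_gb()} GB)")
```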
 
Soldato
Joined
23 Apr 2010
Posts
11,896
Location
West Sussex
Nah, honestly, that 10GB is plenty. You'll run out of grunt before you run out of VRAM.

Particularly getting a 3090 is like inviting the burglars into your house and then waving them off and wishing them well as they take off with your precious money.

Don't agree at all. DLSS will make sure you have plenty of grunt.

Look, I'm not being funny, mate, but I was one of the people stupid enough to fall for all of this crap when the Fury X came out. "It's not about memory amount, it's about bandwidth and the ability to switch textures before the VRAM runs out. You'll run out of grunt before you run out of VRAM. The memory is so fast it doesn't matter, it's all about allocation, blah blah blah."

Within three months of me buying the cards (note: plural, over a grand) COD BLOPS II came out. At first it had "Extra" settings, yet for some reason every time I got to a certain level the game crashed. Not just crashed, but hard-locked my rig. I figured, oh, it's the game, it'll get a patch. It got a patch alright, one that sniffed for a Fury card and disabled all of the Extra settings. So OK, I found a hack to put them back. Back to an Alaskan rig: freezing all the time.

In the end it indeed turned out to be VRAM. And the "fix" was for AMD to let it use your paging file. Which tanked FPS down to about 8.

So if you "honestly" believe what you write you need to form a little bit more scepticism. You haven't even seen a next gen game yet. No one has.

BTW Dave, I totally agree. It's enough for 1080p and 1440p, yet IMO it's a 4K card. It's almost wasted at lower resolutions.
 
Associate
Joined
11 Jun 2013
Posts
1,087
Location
Nottingham
DLSS will make sure you have plenty of grunt.

Thing is, DLSS has to be explicitly implemented, and I'm sure I run plenty of things where it hasn't (at least, not yet) been implemented. MSFS and X-Plane, for example.

I'm with you on the Fury X nonsense regarding VRAM, never believed all that for a second. However, I do think 10GB is enough to drive 4K just fine (some benchmarks show that having more VRAM just boosts performance). Maybe it would not be enough in 5 years from now, but by that time I would have upgraded anyway, and if I went for the option that's at least twice as expensive, that upgrade might not even be feasible as quickly.

Yeah I would've liked a little bit more (12GB would've been good) but I'd rather turn down a setting or two than pay twice the price.
 
Soldato
Joined
27 Jul 2004
Posts
3,522
Location
Yancashire
10GB is clearly going to be skating on the edge for 4K gaming over the next few years. I get why they’ve gone with that to keep the costs down for the 3080, but...

I can’t help thinking that the logical config lineup should have simply been:

3070 8GB (aimed at 1080p high FPS and 1440p gaming, a bit of 4K gaming, ie most people)
3080 16GB (aimed at 4K and more enthusiast gamers into modding etc)
3090 24GB (aimed at workstation type people, and small appendiged people who just need to have the ‘best’)

The anomaly in that above is the 3080.

Also, good call above on the Fury X, was trying to remember which one it was. Wasn't it the 4GB of HBM memory that was supposed to have magic properties? All the other cards were releasing with 6GB+ and it all turned out to be *********, did it not?
 
Last edited:
Associate
Joined
9 Jul 2009
Posts
1,008
10GB is clearly going to be skating on the edge for 4K gaming over the next few years. I get why they’ve gone with that to keep the costs down for the 3080, but...

I can’t help thinking that the logical config lineup should have simply been:

3070 8GB (aimed at 1080p and 1440p gaming, a spot of light 4K gaming, ie most people)
3080 16GB (aimed at 4K and more enthusiast / super high FPS gamers)
3090 24GB (aimed at workstation type people, and small appendiged people who just need to have the ‘best’)

The anomaly in that above is the 3080.

Also, good call above on the Fury X, was trying to remember which one it was. Wasn't it the 4GB of HBM memory that was supposed to have magic properties? All the other cards were releasing with 6GB+ and it all turned out to be *********, did it not?

You can't have a 16GB 3080 though. It can only be 10GB or 20GB because of the 320-bit bus.
 
Soldato
Joined
18 Feb 2015
Posts
6,484
As soon as the 16GB & 20GB cards hit, all the people defending the 10GB models will go silent, because then they won't have to rationalise to themselves that 10GB is enough; they'll actually have the option to get more VRAM without stretching all the way to a 3090. It's gonna be all "oh, having extra VRAM is so nice, peace of mind" etc. Happens every time the consoles change, exactly the same.

This is why I stopped arguing with people about it, it's so stupid & pointless. The new cards will automagically change their minds without any of the facts themselves actually changing.

:o
 
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
You can't have a 16GB 3080 though. It can only be 10GB or 20GB because of the 320-bit bus.

Right, and 20GB is way, way overkill. Even if you could make an argument that some really specific types of game or application, or some outliers, somehow need more than 10GB while still getting playable frame rates, it wouldn't be much more than 10GB, that's for sure. Basically, if you buy a 20GB 3080 you're paying for 10GB of super-fast GDDR6X VRAM that you're not using outside of those exceptions. That's a lot of money for zero return.
 
Soldato
Joined
23 Apr 2010
Posts
11,896
Location
West Sussex
Thing is, DLSS has to be explicitly implemented, and I'm sure I run plenty of things where it hasn't (at least, not yet) been implemented. MSFS and X-Plane, for example.

I'm with you on the Fury X nonsense regarding VRAM, never believed all that for a second. However, I do think 10GB is enough to drive 4K just fine (some benchmarks show that having more VRAM just boosts performance). Maybe it would not be enough in 5 years from now, but by that time I would have upgraded anyway, and if I went for the option that's at least twice as expensive, that upgrade might not even be feasible as quickly.

Yeah I would've liked a little bit more (12GB would've been good) but I'd rather turn down a setting or two than pay twice the price.

DLSS is quickly becoming a standard, dude. Quite simply because it's brilliant. It's coming to Flight Sim soon, apparently.

I doubt it will last five years. The 8GB on the 2080, for example, didn't even last two.
 

TrM

TrM

Associate
Joined
3 Jul 2019
Posts
744
As soon as the 16GB & 20GB cards hit, all the people defending the 10GB models will go silent, because then they won't have to rationalise to themselves that 10GB is enough; they'll actually have the option to get more VRAM without stretching all the way to a 3090. It's gonna be all "oh, having extra VRAM is so nice, peace of mind" etc. Happens every time the consoles change, exactly the same.

This is why I stopped arguing with people about it, it's so stupid & pointless. The new cards will automagically change their minds without any of the facts themselves actually changing.

:o

Here is the thing this time, though.

There are two reasons why Nvidia would bring out a 16/20GB model:

1. AMD bring out something that pushes the 3080, maybe even beats it, and Nvidia need the 20GB model to compete.
2. Nvidia decide they want to make a card that costs more money to buy. At £649.99 for the 3080 and £1,399.99 for the 3090 there is a lot of room to slot a 20GB model in, and the old price point of the 2080 Ti is available; Nvidia could price it at £999.99.

Now, the first reason would give anyone in the thread a good excuse to change their mind and say yes, it's worthwhile. But if it's the second reason, Nvidia can choose their price with no competition, and with such a gap between the 3080 and the 3090 they could make even better profit off it and people would still buy it.
 
Man of Honour
Joined
21 May 2012
Posts
31,940
Location
Dalek flagship
Thing is, DLSS has to be explicitly implemented, and I'm sure I run plenty of things where it hasn't (at least, not yet) been implemented. MSFS and X-Plane, for example.

I'm with you on the Fury X nonsense regarding VRAM, never believed all that for a second. However, I do think 10GB is enough to drive 4K just fine (some benchmarks show that having more VRAM just boosts performance). Maybe it would not be enough in 5 years from now, but by that time I would have upgraded anyway, and if I went for the option that's at least twice as expensive, that upgrade might not even be feasible as quickly.

Yeah I would've liked a little bit more (12GB would've been good) but I'd rather turn down a setting or two than pay twice the price.
10GB is clearly going to be skating on the edge for 4K gaming over the next few years. I get why they’ve gone with that to keep the costs down for the 3080, but...

I can’t help thinking that the logical config lineup should have simply been:

3070 8GB (aimed at 1080p and 1440p gaming, a spot of light 4K gaming, ie most people)
3080 16GB (aimed at 4K and more enthusiast gamers into modding etc)
3090 24GB (aimed at workstation type people, and small appendiged people who just need to have the ‘best’)

The anomaly in that above is the 3080.

Also, good call above on the Fury X, was trying to remember which one it was. Wasn't it the 4GB of HBM memory that was supposed to have magic properties? All the other cards were releasing with 6GB+ and it all turned out to be *********, did it not?

The Fury X was the worst card I have owned, because of memory-type problems and running out of memory.

In the end I threw all my Fury Xs in the dustbin and felt a whole lot better for it.

A single Fury X on its own was OK, but using more than one was awful.

I am not anti-AMD either; ATM I am typing this on a Ryzen-based 3950X system, which is the best CPU setup for the games I play.
 
Soldato
Joined
8 Jun 2018
Posts
2,827
I tried pointing this out in the Ampere 8nm thread for about 30-40 pages of back and forth prior to launch, but it didn't really go anywhere. There's a lot of old-school thinking that VRAM is somehow just a big cache where all of the assets in the game sit in case they're needed. And that was true 15+ years ago. But engines have moved on since then to dynamic, just-in-time streaming of assets, especially in open-world games. So VRAM is much less about holding everything the game could need: GTA V, my go-to example, is a 90GB install for me, but I have 8GB of VRAM, and it streams in all those assets just fine as you zip about the island in a heli or whatever.

VRAM today is way less about being a dumb cache and more about keeping just what is needed to render the next frame. From an architecture point of view this makes sense: the powerhouse in a video card is the GPU, it's literally what does the work to calculate the next output frame based on all the inputs, and the VRAM's only job is to feed the GPU the assets it needs to perform those calculations. That's it. Using it as a dumb cache is expensive, because the VRAM modules are expensive.

So there's this intrinsic connection between VRAM size (not just bandwidth) and GPU speed, and vice versa. You can think about it like this: every asset you're putting into VRAM is just more stuff the GPU will be doing work on, and as you demand more work from your GPU the frame rate goes down. In modern game implementations, if you're putting a crap-ton of stuff into VRAM then you're also putting more load on the GPU. And all attempts so far to find a game you can load up with 10GB+ of VRAM and go "AH HAH! 10GB is not enough!!" on the 3080 have resulted in unplayable frame rates. And lo and behold, you drop all those settings to make it playable and your VRAM usage is back below 10GB.

It's also why we get all these erroneous arguments about future-proofing, about the games in 2+ years that will use more VRAM. Sure they will. But a 3080 won't be playable maxed out by then; you'll have long since run into that GPU bottleneck. And that's why you shouldn't think about VRAM in terms of future-proofing, or game performance, or what you feel you deserve, or what looks nice, or what is the biggest number, but rather "How much can this particular GPU realistically use and still maintain playable frame rates?", which on some rough aggregate is pretty much game agnostic.
Thanks for the explanation. All of this is sound reasoning. Unfortunately, IMO it won't sink in for some until Nvidia releases those 16/20GB cards.
 