
10GB VRAM enough for the 3080? Discuss...

Status
Not open for further replies.
Soldato
Joined
18 Feb 2015
Posts
6,484
So I am going to jump from an Intel HD 5500 to an RTX 3080. It's going to be amazing, but do y'all think that if I'm keeping the card for 4-5 years, 10GB will be good enough for high settings with no AA at 1440p? I have no experience; I have never owned a video card before. I will only buy the 20GB card if it's only $100 more; any more than that and it loses my money. Hope AMD can compete, I'm interested in seeing what RDNA2 is going to do this time.

I am just worried 10GB might turn out to be like the 1060 3GB, which aged like milk...
I think given the choices you will be fine with the 3080 10GB. If you go with AMD, they will have more VRAM but you'll give up other software benefits (like DLSS). If you go with the 3070 16GB (when that releases) then you'll trade some performance for more VRAM. How it all shakes out is hard to tell, but it will come down to subjective preference: if you're OK with turning texture quality down a step, then the 3080 10GB is guaranteed to be the best choice. Otherwise you'll have to weigh texture quality against more FPS. Tbh you can't go too wrong with any of the choices atm, though at the same time you can't even buy the cards, so it's all a moot point. At 1440p, though, I would say you shouldn't worry about VRAM.

Watch this comparison as an example of what you might have to turn down (more relevant for 4K):
 
Associate
Joined
13 Mar 2009
Posts
704
So I am going to jump from an Intel HD 5500 to an RTX 3080. It's going to be amazing, but do y'all think that if I'm keeping the card for 4-5 years, 10GB will be good enough for high settings with no AA at 1440p? I have no experience; I have never owned a video card before. I will only buy the 20GB card if it's only $100 more; any more than that and it loses my money. Hope AMD can compete, I'm interested in seeing what RDNA2 is going to do this time.

I am just worried 10GB might turn out to be like the 1060 3GB, which aged like milk...

At 2K the 3080 will be fine for ages; I've been running things at 2K on a 1080 with 8GB of VRAM and haven't really hit any issues yet. At 4K you might have questions, but at 2K you will be fine, and tbh unless you want to run things at silly levels at 4K it's probably going to be fine.
 
Associate
Joined
25 Sep 2020
Posts
128
I think given the choices you will be fine with the 3080 10GB. If you go with AMD, they will have more VRAM but you'll give up other software benefits (like DLSS). If you go with the 3070 16GB (when that releases) then you'll trade some performance for more VRAM. How it all shakes out is hard to tell, but it will come down to subjective preference: if you're OK with turning texture quality down a step, then the 3080 10GB is guaranteed to be the best choice. Otherwise you'll have to weigh texture quality against more FPS. Tbh you can't go too wrong with any of the choices atm, though at the same time you can't even buy the cards, so it's all a moot point. At 1440p, though, I would say you shouldn't worry about VRAM.

Watch this comparison as an example of what you might have to turn down (more relevant for 4K):
Thanks a lot for the answer, I really appreciate it!
Oh okay, I think I would like to get more FPS, and I would be fine going Ultra -> High if I ran into limitations two years or so down the line.

Wasn't AMD planning on launching their own open-source DLSS, DirectStorage and ray tracing?
 
Associate
Joined
25 Sep 2020
Posts
128
At 2K the 3080 will be fine for ages; I've been running things at 2K on a 1080 with 8GB of VRAM and haven't really hit any issues yet. At 4K you might have questions, but at 2K you will be fine, and tbh unless you want to run things at silly levels at 4K it's probably going to be fine.
Aight
 
Soldato
Joined
18 Feb 2015
Posts
6,484
Thanks a lot for the answer, I really appreciate it!
Oh okay, I think I would like to get more FPS, and I would be fine going Ultra -> High if I ran into limitations two years or so down the line.

Wasn't AMD planning on launching their own open-source DLSS, DirectStorage and ray tracing?
No problem. AMD-wise, we don't really know anything about something "DLSS-like", but we'll definitely get hardware-accelerated ray tracing (though who knows whether it will be faster or slower than Nvidia's implementation), and DirectStorage is vendor agnostic and will also be present on AMD, but it will require that developers implement it in their games in the first place. For the first few years I'd say it's not going to be very relevant at all, until people start catching up and fully switching to the new consoles.
 
Associate
Joined
25 Sep 2020
Posts
128
No problem. AMD-wise, we don't really know anything about something "DLSS-like", but we'll definitely get hardware-accelerated ray tracing (though who knows whether it will be faster or slower than Nvidia's implementation), and DirectStorage is vendor agnostic and will also be present on AMD, but it will require that developers implement it in their games in the first place. For the first few years I'd say it's not going to be very relevant at all, until people start catching up and fully switching to the new consoles.
Yeah, I'm hoping they come up with something DLSS-like, otherwise it won't make sense to buy AMD.
Thanks for the answer!
 
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
My personal view is that when buying an already expensive electronics product, it's sometimes worth spending extra to maximise the longevity and add something to the resale value that offsets the additional investment. We have yet to see how much a 20GB 3080 will cost, but even if it's £900-£950 it will certainly be far better value than the 3090, while preparing you for the possibility that the VRAM may come in handy 12-24 months down the line. :)

I'd agree with this IF it did actually future-proof it more, but that's not typically the case. Future games have larger demands on vRAM for sure, but they also have greater demands on the GPU. What you'll find is that in two years, if you're playing games and trying to max out settings at 4K, your GPU will be suffering trying to run the game. And when that happens, people go straight to the video settings; they lower all the new effects which are dragging the frame rate down, and as you lower those settings it lowers the demand on vRAM.

The usage of vRAM and the load on the GPU are disconnected in that way; you cannot just load more stuff into vRAM in future games and not have that stuff have an impact on the GPU and the frame rate it can put out.
 
Caporegime
Joined
8 Sep 2005
Posts
27,421
Location
Utopia
I'd agree with this IF it did actually future-proof it more, but that's not typically the case. Future games have larger demands on vRAM for sure, but they also have greater demands on the GPU. What you'll find is that in two years, if you're playing games and trying to max out settings at 4K, your GPU will be suffering trying to run the game. And when that happens, people go straight to the video settings; they lower all the new effects which are dragging the frame rate down, and as you lower those settings it lowers the demand on vRAM.

I wish you would stop talking in definitives when what you are saying is actually speculative. You are making a reasoned assumption, an educated guess. You do not know beyond doubt that a 10GB 3080 will not run into VRAM limitations that hold back the available GPU horsepower in future, especially when we have seen similar happen in past generations some years back. The journey from 3-4GB VRAM being an actual limitation to 10GB VRAM being a potential limitation has been a hell of a long one, but we are now at the point where it could happen within this generation.

The usage of vRAM and the load on the GPU are disconnected in that way; you cannot just load more stuff into vRAM in future games and not have that stuff have an impact on the GPU and the frame rate it can put out.

Do you mean connected? We have seen in the past, with older generations, that video cards can be VRAM-limited but not horsepower-limited. It's not a given that, if a card doesn't have enough VRAM to smoothly handle a game, it also lacks the horsepower to run the game it needs the additional VRAM for.

Anyway, 10GB is currently not a limitation, that's clear. However, even without knowing whether VRAM will actually be a limitation within this generation, I would still prefer to invest a bit extra (20% or so would seem reasonable to me) in a 20GB 3080 (or a 16GB 3090XT if they cut the mustard, as that should be cheaper than a 3080) to remove this as a potential concern on an expensive GPU purchase, for all the reasons I stated in my above posts. I'm more than happy to agree to disagree on the rest, and I guess we now just have to revisit this thread later when we know more. :)
 
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
I wish you would stop talking in definitives when what you are saying is actually speculative. You are making a reasoned assumption, an educated guess. You do not know beyond doubt that a 10GB 3080 will not run into VRAM limitations that hold back the available GPU horsepower in future, especially when we have seen similar happen in past generations some years back. The journey from 3-4GB VRAM being an actual limitation to 10GB VRAM being a potential limitation has been a hell of a long one, but we are now at the point where it could happen within this generation.



Do you mean connected? We have seen in the past, with older generations, that video cards can be VRAM-limited but not horsepower-limited. It's not a given that, if a card doesn't have enough VRAM to smoothly handle a game, it also lacks the horsepower to run the game it needs the additional VRAM for.

Anyway, 10GB is currently not a limitation, that's clear. However, even without knowing whether VRAM will actually be a limitation within this generation, I would still prefer to invest a bit extra (20% or so would seem reasonable to me) in a 20GB 3080 (or a 16GB 3090XT if they cut the mustard, as that should be cheaper than a 3080) to remove this as a potential concern on an expensive GPU purchase, for all the reasons I stated in my above posts. I'm more than happy to agree to disagree on the rest, and I guess we now just have to revisit this thread later when we know more. :)

Going by the evidence we have now, in games that get anywhere close to 10GB of vRAM usage we're seeing aggressive GPU limitations; that's based on the available evidence we have right now. Is it definitive? No, nothing is, and there are always random exceptions where some developer makes a badly optimised game that wastes memory or does something stupid; edge cases always exist. I'm not trying to say it's black or white, but I'm explaining a general principle. You cannot load up vRAM with useful assets and not have that have an impact on the GPU; the entire point of vRAM is to supply the GPU with the data it needs to do the calculations to produce the next frame.

Past issues with vRAM have mostly been around things like the 3/6GB variants we've seen, and they have been rare. I've offered an explanation for this before: I personally think it's because architectural limitations with memory bus width, plus the available memory module sizes, mean you can only get certain memory configurations out of a card; in the past that was 3GB and 6GB. If that GPU really needed 4GB, then 3GB would have been an under-provision and 6GB an over-provision. When it's kind of in the middle like that, what do you do? Well, you can release both cards and let people pick, but you pay a premium for the 6GB card, and whether that's worth it or not really just depends.

Sorry, I meant to say that they "aren't disconnected in that way". What I'm trying to say is that vRAM amount and GPU speed are intrinsically connected: you cannot load tons and tons of new assets into a larger bank of vRAM and have the GPU spit out frames at the same speed. If those assets are going into the game in the form of higher texture detail, more detailed meshes, new effects or whatever, it all takes longer to compute and thus lowers frame rates. Which is why I've advocated that a better way of thinking about vRAM is not "what the game needs" but "how much vRAM does the GPU need to not be bottlenecked"; the latter is game-agnostic. Games use WILDLY varying amounts of vRAM depending on the settings you use, and what settings you can use is limited by your GPU speed.
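
To put some very rough numbers on the asset side (purely illustrative, assuming uncompressed RGBA8 textures with full mip chains; real games use block compression and streaming, so actual figures are lower): each step up in texture resolution roughly quadruples the memory footprint, and those extra texels also have to be fetched and filtered whenever they are on screen.

```python
# Purely illustrative: uncompressed RGBA8 texture footprints per resolution.
# Real games use block compression (e.g. BC7) and streaming, so the absolute
# numbers are lower, but the roughly-4x-per-step scaling is the point.

def texture_mib(width, height, bytes_per_texel=4, mipmaps=True):
    base = width * height * bytes_per_texel
    # A full mip chain adds roughly one third on top of the base level.
    return base * (4 / 3 if mipmaps else 1) / 2**20

for res in (2048, 4096, 8192):
    print(f"{res}x{res}: ~{texture_mib(res, res):.0f} MiB")
# 2048x2048: ~21 MiB
# 4096x4096: ~85 MiB
# 8192x8192: ~341 MiB
```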

I'm just frustrated that people keep alluding to future-proofing via vRAM because new games will use more vRAM, while completely ignoring the fact that they'll also be way more brutal on the GPU. What really matters is the ratio of component speed/capacity, such that you have as little bottleneck as possible.

As for generational leaps: it's clear that the average game up to today has used about ~6GB of vRAM, and that's only measuring allocation, not real usage. Scarcely any games have used anything close to 8GB, and again that's an overestimate due to measuring what was allocated. People who did this future-proofing thing with the 11GB cards were ultimately burned, because nothing used anything close to that.
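
(For what it's worth, the figures people usually quote come from tools that report allocation. As a quick, hedged illustration, on an NVIDIA card you can watch the total framebuffer memory currently reserved, assuming nvidia-smi is installed; note it counts everything allocated on the GPU by all processes, not the working set a game actually touches each frame.)

```python
import subprocess

# Query how much framebuffer memory is currently allocated on the GPU.
# This is memory *reserved* (by the game, compositor, browser, etc.),
# not what a game actively needs per frame.
out = subprocess.run(
    ["nvidia-smi", "--query-gpu=memory.used,memory.total",
     "--format=csv,noheader,nounits"],
    capture_output=True, text=True, check=True,
).stdout.strip()

# One line per GPU; take the first GPU's figures.
used_mib, total_mib = (int(x) for x in out.splitlines()[0].split(","))
print(f"VRAM allocated: {used_mib} / {total_mib} MiB")
```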

I'm not against people wanting to avoid taking a risk and paying for, say, an extra 6-10GB of vRAM; I mean, that's gonna cost you, because it's not cheap. And ultimately that might turn out to be a pointless upgrade, but that's your risk to take; it could go the other way and I could be wrong. Mostly what bugs me is that people discuss this topic one-sidedly, as if it has been decided that 10GB is not enough, when so far the evidence kinda points the other way. I fundamentally see the argument that future games will demand more vRAM as a one-sided take on the situation; they will also demand more from the CPU/RAM/GPU/HDD/SSD etc. It's not that clear cut.
 
Caporegime
Joined
8 Sep 2005
Posts
27,421
Location
Utopia
I'm not against people wanting to avoid taking a risk and paying for, say, an extra 6-10GB of vRAM; I mean, that's gonna cost you, because it's not cheap. And ultimately that might turn out to be a pointless upgrade, but that's your risk to take; it could go the other way and I could be wrong. Mostly what bugs me is that people discuss this topic one-sidedly, as if it has been decided that 10GB is not enough, when so far the evidence kinda points the other way. I fundamentally see the argument that future games will demand more vRAM as a one-sided take on the situation; they will also demand more from the CPU/RAM/GPU/HDD/SSD etc. It's not that clear cut.
Pretty much any modern AMD or Intel CPU with 16GB of RAM and an SSD will do the job of powering a modern GPU, and most people already have that in some form. Indeed, I would say that the CPU hasn't really been a limiting factor for a good while. The GPU is the limiting factor in the vast majority of games at higher resolutions nowadays.

I suggest you stop worrying so much about members giving a black-and-white view on what will or will not happen with VRAM (like saying it's 'definitely' not enough), because most people have questionable judgement at the best of times. We can give educated guesses based on what we know and on our own experiences, and in the context of this particular discussion there is no definitive right and wrong. 10GB is enough for now, and may or may not be a limitation later within this generation if any specific games are released that go nuts with graphical fidelity. If, could, maybe, etc., etc.

Having worked in IT risk for years, I always prefer to mitigate any potential downsides as reasonably as I can, even if the risk ultimately never materialises. It's a weighed and conscious decision that has worked out well for me so far during my long electronics-buying and PC-building career, and I encourage everyone to make practical purchasing decisions that give them the most peace of mind while still staying sensible. :)
 
Associate
Joined
11 Apr 2015
Posts
272
Don't be silly, 20GB won't be enough. Just pay a little bit more again and get the 3090 :p

I think it was Shadow of War that offered an optional high res texture pack, which I downloaded and tried at the time of release. I don't know what I expected, but you really couldn't tell the difference while playing. Complete waste of download time and storage space.
Actually, you could; you were just running at a resolution where you would not see it. I remember it made a difference on a 4K screen. I don't think it was ever meant for lower resolutions, though.
 
Associate
Joined
11 Apr 2015
Posts
272
I think it's because 8K is now a thing, like how 4K was a thing when it first got all hyped up. It means that for ultra textures it's going to be 8K instead of 4K, so that's where the 10GB will not be enough. But for just 4K it will be.
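
As a rough, hypothetical budget (my own numbers, assuming 300 unique material textures and BC7-style compression at roughly 1 byte per texel plus mip chains), a wholesale move of a scene's textures from 4K to 8K source resolution would indeed blow well past 10GB:

```python
# Hypothetical scene budget, purely illustrative: 300 unique material
# textures, block-compressed at ~1 byte per texel, with full mip chains.
def scene_gib(num_textures, res, bytes_per_texel=1):
    per_texture = res * res * bytes_per_texel * 4 / 3   # base level + mips
    return num_textures * per_texture / 2**30

for res in (4096, 8192):
    print(f"300 textures at {res}x{res}: ~{scene_gib(300, res):.1f} GiB")
# 300 textures at 4096x4096: ~6.2 GiB
# 300 textures at 8192x8192: ~25.0 GiB
```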
 
Caporegime
Joined
8 Sep 2005
Posts
27,421
Location
Utopia
I think it's because 8K is now a thing, like how 4K was a thing when it first got all hyped up. It means that for ultra textures it's going to be 8K instead of 4K, so that's where the 10GB will not be enough. But for just 4K it will be.
You can already get 8K textures for 4K games that increase VRAM requirements, usually user-created mods. 8K gaming isn't yet "a thing"; Nvidia just shoehorned it into their marketing. 8K is four times the number of pixels of 4K... and current-gen GPUs can only just now run 4K reliably. 8K gaming is at least a couple of generations away yet.
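
The four-times figure is just pixel arithmetic (using the standard UHD resolutions):

```python
# 8K UHD has exactly four times the pixels of 4K UHD.
four_k = 3840 * 2160     # 8,294,400 pixels
eight_k = 7680 * 4320    # 33,177,600 pixels
print(eight_k / four_k)  # 4.0
```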
 
Associate
Joined
25 Sep 2020
Posts
128
I mean, if some games do use 10GB of VRAM in, let's say, 3 years or so, we can counter that by reducing the graphical fidelity, right? Like going from Ultra/High -> High/Medium?

I don't really wanna pay a premium for 10 more GB of GDDR6X on the 3080 20GB lol, it is likely to be $900+.

Didn't the 780 Ti fare fine for the next 2-3 years after the PS4 with 8GB of shared RAM was released? I think the 3080 10GB should be fine, except maybe some rare game with THAT setting that makes it unplayable at 10GB...
 
Associate
Joined
11 Apr 2015
Posts
272
You can already get 8K textures for 4K games that increase VRAM requirements, usually user-created mods. 8K gaming isn't yet "a thing"; Nvidia just shoehorned it into their marketing. 8K is four times the number of pixels of 4K... and current-gen GPUs can only just now run 4K reliably. 8K gaming is at least a couple of generations away yet.


It is a thing; you can run a game at 30fps at 8K. I remember when 4K became a thing and it was the same back then. That was my point about 8K textures and ultra texture settings in future games; we're talking about the actual game here, not mods.
 
Caporegime
Joined
8 Sep 2005
Posts
27,421
Location
Utopia
It is a thing; you can run a game at 30fps at 8K. I remember when 4K became a thing and it was the same back then. That was my point about 8K textures and ultra texture settings in future games; we're talking about the actual game here, not mods.
It's possible, but it's not a thing. The minimum frame rate is also under 30fps. But sure, if you want to call it a thing then go ahead and call it a thing. *shrugs*
 
Associate
Joined
24 May 2015
Posts
499
No one with an ounce of sense is considering gaming at 8K.

Also, given the small difference in performance between the 3080 and the 3090, and the huge difference in price, how can you possibly insert a good-value card between the two? A 20GB card is going to be at least several hundred pounds more expensive, with, at best, a tiny improvement in performance.
 