10GB vram enough for the 3080? Discuss..

Status: Not open for further replies.
Soldato · Joined 21 Jul 2005 · Posts 20,045 · Location: Officially least sunny location -Ronskistats
This is not true at all..

Nobody as far as I remember said 10gb would be enough this generation. Not with any real conviction or any kind of source to explain that opinion..

I would say there were some hill fighters on this. I would not call him a liar, put it that way. I guess you could play it safe and say some just sat on the fence over it, but then if everyone did that there wouldn't be much of a discussion, would there?

Post 2 and 3 said it's enough, then there were posters like (I think) princessfrosty spamming with useless info that 8GB/10GB is enough over this thread and the 70 thread, with other posters spamming back that it wasn't.

But you need to read the thread objectively, not just with your own opinion (which everyone can be guilty of), to notice it's still a split opinion.

Nice timing ^ there you have it, some examples. :)
 
Soldato · Joined 15 Oct 2019 · Posts 11,694 · Location: UK
They’re like cavemen trying to explain graphics cards. “Big number look like other big number so must be same same”.

As you said, Titan is an AI card with separate drivers. The Titan is also $1000 more expensive.
Yeah, people are just being hoodwinked by Nvidia into thinking the 2,000 quid they just shelled out is a good deal because a Titan is normally £3,000! It wouldn't have looked like a very good deal had Nvidia called it the 3080 Ti, but by calling it the 3090 instead it was a clever way to raise the price and at the same time make people think it's cheap.

Jensen even said it in his keynote thing, 3080 was the flagship. Why do people even argue over this?

He also said it was twice as fast as a 2080 when in reality it was more like 60%.

A salesman says a lot of things but that doesn't mean they are all true.
 
Soldato · Joined 17 Aug 2003 · Posts 20,158 · Location: Woburn Sand Dunes
I would say there were some hill fighters on this. I would not call him a liar, put it that way. I guess you could play it safe and say some just sat on the fence over it, but then if everyone did that there wouldn't be much of a discussion, would there?

Probably, but... who? Anybody who has taken more than a passing shot at the thread? Is Richdog taking an argument that some run-and-gunner made, and using that to blanket paint any participant who isn't thoroughly going for Nvidia's throat?

Th0nt said:
Post 2 and 3 said it's enough [snip]
Nice timing ^ There you have it, some examples :)

Right, here's post 2:
Basically, yes, it's fine. But you don't know what new games will need.
For me it's a bit of a shame it doesn't at least match the 2080 Ti. But it's not far off TBH!
https://www.youtube.com/watch?v=mKfGz6-Gzw0

Does that support Richdog's argument [people said 10gb will always be enough this gen] or does it not? I don't think it does in any way support his argument.

tommybhoy said:
[continued]...then it kind of does what it does with multiple posting, for example there were posters like (I think) princessfrosty spamming with useless info that 8GB/10GB is enough over this thread and the 70 thread, with other posters spamming back that it wasn't.

10GB looks fine. Yes, future games will demand upwards of 10GB when maxed, but they'll also be so demanding on the GPU that you won't be able to run them at those settings, and as you dial back the settings you dial back the space needed in VRAM.

That was the end of the first post he made in the thread. He did not say 10GB would always be enough. People don't listen. princessFrosty spent a lot of time looking at existing games which people were saying were already causing problems, and all he got was **** for it. How many times was the VRAM buffer option in Doom brought up despite it being proven it makes no positive difference at all? Again and again and again.

tommybhoy said:
But you need to read the thread objectively, not just with your own opinion (which everyone can be guilty of), to notice it's still a split opinion.

Do you think 10GB's enough?

I don't, as my 3070 runs out at 1440p, so I don't imagine 10GB is enough for 4K, especially not over the GPU's life cycle.

I agree, objectivity would be splendid. Do I think it's enough? It is right now, and that's all I've ever said. This is no black-and-white question; it's not a case of always will be or never will be, no matter how hard people try to twist and control the narrative to win empty, baseless arguments. No, I believe 10GB is enough, right now, for the vast majority of people. There are going to be weird exceptions and niche use cases of course, but by and large, yes, it's enough. And no, one poorly written AMD-optimised title doesn't count. We all knew Godfall's VRAM requirements were nuts. I don't know why this has come up again recently.
 
Soldato · Joined 27 Jul 2004 · Posts 3,522 · Location: Yancashire
Circular discussions about nomenclature will go on forever. That’s what a circle does.

10GB is still stingy and right on the edge for a newly released ‘high end’ card (is that ok ffs?) that is great at and touted for 4K gaming.

Now, would I have bought a 3080 at msrp £650 if I could get one? Hell yes.

Have I bought a 3090FE at msrp instead because I could get one? Yes.

Does that make me a ******* idiot? Probably yes.
 
Soldato · Joined 7 Dec 2010 · Posts 8,251 · Location: Leeds
Yeah, people are just being hoodwinked by Nvidia into thinking the 2,000 quid they just shelled out is a good deal because a Titan is normally £3,000! It wouldn't have looked like a very good deal had Nvidia called it the 3080 Ti, but by calling it the 3090 instead it was a clever way to raise the price and at the same time make people think it's cheap.



He also said it was twice as fast as a 2080 when in reality it was more like 60%.

A salesman says a lot of things but that doesn't mean they are all true.


I don't think anyone who is the target audience for a 3090 feels they have been hoodwinked; £1,400 for a 3090 with 24GB of VRAM is a bargain to the right user, and even £2k is. Titan drivers or not, most people don't need Titan drivers; the Studio drivers are enough, along with CUDA and OptiX. I purchased two 3090s for an SLI/NVLink setup for work, the same setup would have cost over double with Titan cards, and the Titan drivers are useless to me.

Also, customers were told the 3090 is not really aimed at gaming, but it is a more powerful card than a 3080 and you can buy it for gaming too if you like; the recommendation, though, was to buy a 3080.

I liked the 3090 so much I purchased two more for another SLI system, but decided to sell one to a workmate who needed a 3090 for work and kept the other in the HTPC as a spare for my work rig (SLI/NVLink 3090s in that), in case I needed to RMA one of them. So, as the target customer for such a card, one I had been waiting for, I don't feel hoodwinked, as it helps me make a living.

Buying a 3090 to play games on and then complaining it was too expensive is the user's mistake for making a purchase that was not really aimed at them. There are people gaming on Quadros and Titans because they think it will make their gaming better and don't care how they blow their money, and now I'm even seeing people buying the A6000, a £5.5k-£6k card, to game on with no other use for it, because they don't want to wait for a 3090 or some other GPU.


Also, a 10GB card is useless to me for work (though a 3080 for gaming is amazing at the FE price; at the £1k-£1.2k they sell at now, it's basically what I thought of the 2080 Ti: overpriced for what it is intended for, gaming. Sadly, £1k+ gaming cards are the new norm). 3090s with 24GB and the ability to pool VRAM, making use of a total of 48GB and 2x the GPU power, is amazing to me for the price. The A6000 has 48GB but one GPU and costs £6k. I now have the same memory and twice the GPU power for less than half the price if you buy 3090 FE cards. If you buy 3090 AIB cards at £2k each, that's still a saving of £2k over an A6000, and you have a more powerful setup than the A6000 if you don't need the A6000 drivers. See how it works now for people using it for work?
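To make the "pool VRAM over NVLink" point concrete, here's a minimal sketch, purely my own illustration, assuming Python with PyTorch installed and two CUDA cards visible. It reports each GPU's memory and whether peer-to-peer access between the two is available; whether an application really behaves as if it has one 48GB pool depends on its own NVLink/multi-GPU support, as most software splits data across the cards rather than exposing a single unified pool.

# Sketch: report per-device VRAM and peer-to-peer access between two GPUs.
# Assumes PyTorch with CUDA; actual "pooling" of 2x 24GB is up to the
# application (e.g. renderers or frameworks with NVLink-aware multi-GPU paths).
import torch

assert torch.cuda.is_available(), "No CUDA devices found"
count = torch.cuda.device_count()

for i in range(count):
    props = torch.cuda.get_device_properties(i)
    print(f"GPU {i}: {props.name}, {props.total_memory / 2**30:.0f} GiB")

if count >= 2:
    # Peer access lets one GPU read/write the other's memory directly
    # (over NVLink if bridged, otherwise PCIe).
    print("P2P between GPU 0 and GPU 1:", torch.cuda.can_device_access_peer(0, 1))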


Example in the video of another happy user who thinks the 3090 is a bargain, and many more examples if you search YouTube of people actually using them for work: [embedded video]

This guy has 8 x 3090s: [embedded video]

A6000: [embedded video]
 
Soldato · Joined 28 May 2007 · Posts 10,070
But you will buy a card that can NOT run max settings in all games and then...turn down settings anyway?

Considering I bought this card close to three years ago, back when it mostly had the power for me not to worry, I would set everything to ultra and turn off all the blur crap, which was usually enough for 60fps. All I had to worry about was GPU power when setting up. Why would I ever want to run a game and have to think about two things when I can just dial in my settings based on GPU performance? In one game I can't max out because there's not enough GPU power; in the next I have the performance but not the memory. Too much compromising for a top-end product. I never had to do it with a £450 Vega 64, so why start with a £650-plus 3080? I also hate thinking of a new product as flawed before spending money on it. Anything I buy has to tick the boxes for the money. The 3080 pretty much ticks them all, bar the 10GB and no G-Sync on my monitor.

It has not stopped me recommending it to people lately, as it can be had more easily than a 6800 XT. I always tell these people that I would not be happy with the 10GB, but if they want that level of performance at the right price, the 3080 is the best bet atm.
 
Soldato · Joined 30 Mar 2010 · Posts 13,058 · Location: Under The Stairs!
james.miller said:
Who? Anybody who has taken more than a passing shot at the thread? Is Richdog taking an argument that some run-and-gunner made, and using that to blanket paint any participant who isn't thoroughly going for Nvidia's throat?

wait, what?

james.miller said:
Right, here's post 2: [snip]
Is that statement saying 10GB will always be enough [Richdog's argument], or is it a statement saying it is right now but who knows? (What I've been saying...)

That was the end of the first post he made in the thread. He did not say 10GB would always be enough. [snip]

Yes, objectivity would be splendid. Do I think it's enough? It is right now. That's all I've ever said. [snip]

That was maybe princessfrosty's first post, but he made many other posts when there were comments like 'so now that we know 10gb is enough' (multiple other posters making that claim too), when that ultimate claim of 'it's enough', specifically over the future VRAM requirements of the 3080, can't get any dumber.

The games where my 3070 is running out, which I've played through (not just tried for five minutes and went 'it's fine'), are Nvidia titles and CoD games. I don't have it, but WDL reverses a lead at 1080p over the slower 2080 Ti when you up the resolution.

Do you think the 10GB is enough over the lifetime of the GPU?

Asking as I think the 6800 XT will reverse its slower 4K performance and overtake the 3080 in VRAM-intensive titles over the four-year lifetime of the GPU.

I'll try and get the next 70- or 80-class GPU, or the AMD equivalent, when they launch, but plenty don't upgrade every release.
 
Soldato · Joined 17 Aug 2003 · Posts 20,158 · Location: Woburn Sand Dunes
Do you think the 10GB is enough over the lifetime of the GPU?
I have answered this already. I believe it's enough for now. Lifetime? Depends how long you keep it. Will it matter when 10GB starts noticeably hamstringing performance all over the place? Depends if the GPUs are still fast enough to do the job they were bought for. They proooooobably won't be? I don't know. We'll see.
 

TNA · Caporegime · Joined 13 Mar 2008 · Posts 27,576 · Location: Greater London
I played Call of Duty Cold War and saw the VRAM hit nearly 14 gig; what would happen with the 10 gig cards, lower detail?
I saw my Titan use nearly the full 12GB a few years ago when playing Final Fantasy 15. Does that mean it ran poorly or at lower detail on a 1080 or 1080 Ti? Nope. Allocated memory and used memory are different. Most have not heard about this yet, so they get all confused thinking they need loads of VRAM for such games. You need the latest version of MSI Afterburner, and I believe there is a way of showing you allocated versus what is actually being used.

Guess which card Final Fantasy 15 runs better on: a 3080 with 10GB or a Titan with 12GB. Hell, even a Titan RTX with 24GB. I am sure the Titan RTX will show it allocates more than even 12GB, but it would still get smashed by a 3080. All the while the 3080 costs about a quarter of the price of a Titan RTX. Hence why I think 10GB is fine for now. By the time it ain't, we will be on next-gen cards anyways.
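For anyone who wants to eyeball the allocated-versus-used gap outside of Afterburner, here's a minimal sketch, my own illustration, assuming Python with the pynvml (nvidia-ml-py) package. It prints what the card reports as memory in use overall and the per-process breakdown; note this still shows what each process has committed, not what a game strictly needs to run well.

# Sketch: total VRAM in use on the card vs. the per-process breakdown,
# via NVML (pip install nvidia-ml-py). Illustrative only.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"Card total: {mem.total / 2**30:.1f} GiB, in use: {mem.used / 2**30:.1f} GiB")

# Per-process view (graphics and compute contexts)
procs = (pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle)
         + pynvml.nvmlDeviceGetComputeRunningProcesses(handle))
for p in procs:
    # usedGpuMemory can be None on some platforms/driver versions
    used = "n/a" if p.usedGpuMemory is None else f"{p.usedGpuMemory / 2**30:.1f} GiB"
    print(f"PID {p.pid}: {used}")

pynvml.nvmlShutdown()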
 
Permabanned · Joined 31 Aug 2013 · Posts 3,364 · Location: Scotland
Post 2 and 3 said it's enough, then it kind of does what it does with multiple posting, for example there were posters like (I think) princessfrosty spamming with useless info that 8GB/10GB is enough over this thread and the 70 thread, with other posters spamming back that it wasn't.

But you need to read the thread objectively, not just with your own opinion (which everyone can be guilty of), to notice it's still a split opinion.

Do you think 10GB's enough?

I don't, as my 3070 runs out at 1440p, so I don't imagine 10GB is enough for 4K, especially not over the GPU's life cycle.

8GB is a bit low when the consoles will target 10GB. Remember though that at 4K with DLSS the source is 1440p. Just out of interest, what titles are having issues with 8GB at 1440p?

So far at 4K the only issue appears to be an AMD-sponsored title, Godfall, requiring 12GB at maximum settings. The sad thing is that Godfall doesn't even look like a next-gen title, so it would appear to be AMD just making VRAM an issue as they are so far behind in ray tracing and AI. How much VRAM would Godfall have used if it were based on the latest Unreal Engine, while also using DLSS? Maybe 6-7GB at 4K?
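To put a number on the "at 4K with DLSS the source is 1440p" point, here's a tiny sketch, my own illustration, using the commonly cited per-axis DLSS scale factors (roughly 2/3 for Quality, 0.58 for Balanced, 0.5 for Performance; exact values can vary by title and DLSS version):

# Rough internal render resolutions for DLSS at a 3840x2160 output.
modes = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

out_w, out_h = 3840, 2160
for name, scale in modes.items():
    print(f"{name:>11}: {round(out_w * scale)} x {round(out_h * scale)}")

# Quality:     2560 x 1440  (the "source is 1440p" case)
# Balanced:    2227 x 1253  (approx.)
# Performance: 1920 x 1080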
 
Soldato · Joined 21 Jul 2005 · Posts 20,045 · Location: Officially least sunny location -Ronskistats
I agree, objectivity would be splendid. Do I think it's enough? It is right now, and that's all I've ever said... No, I believe 10GB is enough, right now, for the vast majority of people. There are going to be weird exceptions and niche use cases of course, but by and large, yes, it's enough. And no, one poorly written AMD-optimised title doesn't count. We all knew Godfall's VRAM requirements were nuts. I don't know why this has come up again recently.

I have said this for a while now, and this is my stance on it: now that SAM/Resizable BAR is a thing, the case for needing more VRAM is much better than when it was just <princess investigates and finds games don't need 10GB>. If games benefit even by 2% from having this feature unlocked, surely you can see that any GPU with some VRAM left to breathe could get you better performance?

I think what people were struggling to swallow was that older-gen cards like the 1080 Ti had more VRAM, and AMD were releasing 16GB as standard.

Circular discussions about nomenclature will go on forever. That’s what a circle does.

10GB is still stingy and right on the edge for a newly released ‘high end’ card (is that ok ffs?) that is great at and touted for 4K gaming.

Now, would I have bought a 3080 at msrp £650 if I could get one? Hell yes.

Have I bought a 3090FE at msrp instead because I could get one? Yes.

Does that make me a ******* idiot? Probably yes.

I agree. For the record @james.miller, I never said princessfrosty or the others from that moment were wrong; I just came to my own conclusion that Nvidia were just being stingy and should have offered the next leg up (which would mean this discussion would be eradicated, the end).
 
Soldato · Joined 12 May 2014 · Posts 5,236
8GB is a bit low when the consoles will target 10GB. Remember though that at 4K with DLSS the source is 1440p. Just out of interest, what titles are having issues with 8GB at 1440p?

So far at 4K the only issue appears to be an AMD-sponsored title, Godfall, requiring 12GB at maximum settings. The sad thing is that Godfall doesn't even look like a next-gen title, so it would appear to be AMD just making VRAM an issue as they are so far behind in ray tracing and AI. How much VRAM would Godfall have used if it were based on the latest Unreal Engine, while also using DLSS? Maybe 6-7GB at 4K?
The RTX 3000 series was announced in September; Godfall came out in November. Are you saying that in two months they planned and executed a (huge?) increase in VRAM usage to push it above 10GB?
 
Permabanned · Joined 31 Aug 2013 · Posts 3,364 · Location: Scotland
The RTX 3000 series was announced in September; Godfall came out in November. Are you saying that in two months they planned and executed a (huge?) increase in VRAM usage to push it above 10GB?

I'm saying that I don't see how Godfall is managing to demand 12GB of VRAM given how it looks. The specs of the 3080 were known several months before launch.
 
Soldato · Joined 12 May 2014 · Posts 5,236
I'm saying that I don't see how Godfall is managing to demand 12GB of VRAM given how it looks. The specs of the 3080 were known several months before launch.
You think that AMD and Gearbox put in extra work (read: money) to screw over Nvidia based on rumours? And a rumour that could very easily change, since it would be **** easy for Nvidia to put 20GB on those cards if they wanted.

Looks are subjective and have nothing to do with my post.
 
Permabanned · Joined 31 Aug 2013 · Posts 3,364 · Location: Scotland
You think that AMD and Gearbox put in extra work (read: money) to screw over Nvidia based on rumours? And a rumour that could very easily change, since it would be **** easy for Nvidia to put 20GB on those cards if they wanted.

Looks are subjective.

It's certainly not easy to double the VRAM to 20GB while remaining competitive. Remember, Nvidia is using GDDR6X, as opposed to GDDR6. It's not just Nvidia that can play dirty.
 