Is 8GB of VRAM enough for the 3070?

Soldato
Joined
17 Aug 2003
Posts
20,158
Location
Woburn Sand Dunes
Yeah you did, you spent all day making out black was white, or green in this case. Me grow up? Maybe it's time you grew out of being lovesick for a corp that sells you GPUs.

No I didn't, proof you can't read for toffee, and you need to stop with the baseless accusations. Everyone can read the first page and see what I said.

I never said 8GB was enough,
but apparently I believe it is?
Sorry, where did I say it was OK for Nvidia to lie? I didn't.
but apparently I think it is?
james.miller said:
NVIDIA GOT WHAT THEY DESERVED.
but apparently I'm defending them?
They misadvertised the number of ROPs and the size of the L2 cache. There's no getting away from that.
but apparently I don't think Nvidia lied?

I wonder why you can't just have a mature discussion. Is it because you lack any maturity?
 
Soldato
Joined
15 Oct 2019
Posts
11,687
Location
Uk
AMD showcase their cards tomorrow, so people can have a look and decide whether they want to drop cash on a 3070 or wait for Navi. At least people will know their options.
 
Soldato
Joined
13 Jun 2009
Posts
5,884
Location
In the asylum
Whilst I'll agree the 3080 has more grunt, not everyone can afford the difference in price, so for those who can't, the 3070 will do them just fine. Oh, and I'm not an Nvidia fanboy; I've had AMD cards as well as Nvidia.
 
Soldato
Joined
21 Jul 2005
Posts
20,018
Location
Officially least sunny location -Ronskistats
..oh, and I'm not an Nvidia fanboy; I've had AMD cards as well as Nvidia

[reaction GIF]
 
Soldato
Joined
6 Oct 2007
Posts
22,281
Location
North West
It's £500+ for 8GB of VRAM; that is not acceptable in 2020 when the PS5 has 16GB of the same memory for £450. Come 2021 and console ports to PC, that 8GB is gonna choke like ******. My suggestion is to go RDNA2 if you want to survive the next few years with the latest games.
 
Soldato
Joined
15 Oct 2019
Posts
11,687
Location
Uk
Whilst I'll agree the 3080 has more grunt, not everyone can afford the difference in price, so for those who can't, the 3070 will do them just fine. Oh, and I'm not an Nvidia fanboy; I've had AMD cards as well as Nvidia.
Yeah, these are £200 cheaper than a 3080, and there's maybe the chance to get one on Thursday rather than waiting till next year for a 3080.

It's £500+ for 8GB of VRAM; that is not acceptable in 2020 when the PS5 has 16GB of the same memory for £450. Come 2021 and console ports to PC, that 8GB is gonna choke like ******. My suggestion is to go RDNA2 if you want to survive the next few years with the latest games.

The PS5 is a 4K 60 console, though, whereas 70-class GPUs are more for 1080p/1440p high-refresh gaming.

Also consider that most people who play PC games don't even have a GPU with more than 6GB of VRAM, or GPU performance on par with the high end of four years ago.
 
Associate
Joined
2 Feb 2018
Posts
237
Location
Exeter
From NVIDIA's Q&A regarding VRAM.

[Screenshot of NVIDIA's Q&A answer on VRAM]

Between 4 and 6GB is needed for 4K gaming in current and upcoming games, which means that when you see more, it is just VRAM caching making use of available memory.

Which is borne out in my post on the previous page, testing some games with a 6GB GPU at 4K maxed settings and only peaking at 5.8GB of VRAM usage.
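
If anyone wants to check this on their own card, here's a rough Python sketch using the nvidia-ml-py bindings (my assumption: you've done a pip install nvidia-ml-py and have an Nvidia card with current drivers; device index 0 is whichever GPU enumerates first). Bear in mind NVML reports memory allocated on the device, not what a game actually touches, which is the whole allocation-vs-usage point:

[CODE]
# Rough sketch: read VRAM figures with NVML (pip install nvidia-ml-py).
# NVML reports memory ALLOCATED on the device, which is why overlays can
# show "usage" well above what a game genuinely needs.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)      # first GPU in the system
info = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"total:     {info.total / 2**30:.1f} GiB")
print(f"allocated: {info.used / 2**30:.1f} GiB")   # allocation, not working set
print(f"free:      {info.free / 2**30:.1f} GiB")
pynvml.nvmlShutdown()
[/CODE]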
 
Soldato
Joined
13 Jun 2009
Posts
5,884
Location
In the asylum
You are a brave man for such honesty... one does not simply mention AMD cards in an Nvidia thread. ;)
Lol, thing is I go with what best suits me at the time. I don't see the point in being loyal to either, because they ain't loyal to me; they don't say "you've had an Nvidia card for so long, you can have a discount". So I go with what I feel suits my needs at the time, and I won't be buying at these inflated prices. I shall wait till they come down.
 
Soldato
Joined
17 Aug 2003
Posts
20,158
Location
Woburn Sand Dunes
Nope, but what they are saying is "it's enough", totally forgetting that most of us don't buy a card every time Nvidia farts.

It's not enough. Come back in a year and tell me it was enough.

It would be easier if we treated the games we have access to now and the games that might arrive in the future as two separate things. I.e., is 8GB of VRAM enough right now, and will it be enough in the future?

Now, Dave2150 took issue with Doom Eternal, suggesting that people are dismissing it and apparently saying 'nobody plays it'. Nobody, to my knowledge, has said this. Nobody could logically dismiss the game. What they (myself included) are doing is sorting the facts from the bull. We aren't dismissing the game; we're saying that whole texture pool setting is a load of overblown nonsense. And it is. And the 3070 benchmarks PROVE that the game runs just fine maxed out with an 8GB buffer, which should put an end to people posting charts of the VRAM allocation on a 2080 Ti exceeding 8GB. But I guess it's gonna take some people a few days to catch up....

Then there's future games, and the answer is: who knows? Certainly, I've never said 8GB is enough for games that'll come in the future, because I don't know. I can speculate; the consoles don't have 8GB of VRAM, let alone 10. DirectStorage will be a thing, which means future games should start leaning more on texture streaming and less on trying to keep as much in VRAM as possible. Could VRAM requirements actually decrease? It's possible. I don't know for sure, though. What I do know is that I have never said 8 or 10GB will definitely be 'enough'; 'enough' is such a vague word that I wouldn't bother anyway. But you watch: people will continue to accuse me of saying otherwise.
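
To show what I mean by leaning on streaming rather than hoarding: engines keep a texture pool on a fixed VRAM budget and evict whatever was used least recently when something new streams in. A toy sketch of the idea, purely illustrative (made-up asset names and sizes, not any real engine's code):

[CODE]
# Toy illustration: a streamed texture pool with a fixed VRAM budget.
# When a new texture is needed, least-recently-used ones are evicted to
# stay under budget (a real engine would stream them back from disk later).
from collections import OrderedDict

class TexturePool:
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()   # name -> size in MB, kept in LRU order

    def request(self, name, size_mb):
        if name in self.resident:       # already resident: mark recently used
            self.resident.move_to_end(name)
            return
        while sum(self.resident.values()) + size_mb > self.budget_mb:
            evicted, _ = self.resident.popitem(last=False)  # drop LRU texture
            print(f"evicting {evicted}")
        self.resident[name] = size_mb   # a real engine would upload to VRAM here

pool = TexturePool(budget_mb=6144)      # e.g. ~6GB of an 8GB card for textures
pool.request("rock_albedo_4k", 85)      # hypothetical assets
pool.request("gun_normal_4k", 85)
[/CODE]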
 
Soldato
Joined
23 Apr 2010
Posts
11,896
Location
West Sussex
It would be easier if we treated the games we have access to now and the games that might arrive in the future as two separate things. I.e., is 8GB of VRAM enough right now, and will it be enough in the future?

Now, Dave2150 took issue with Doom Eternal, suggesting that people are dismissing it and apparently saying 'nobody plays it'. Nobody, to my knowledge, has said this. Nobody could logically dismiss the game. What they (myself included) are doing is sorting the facts from the bull. We aren't dismissing the game; we're saying that whole texture pool setting is a load of overblown nonsense. And it is. And the 3070 benchmarks PROVE that the game runs just fine maxed out with an 8GB buffer, which should put an end to people posting charts of the VRAM allocation on a 2080 Ti exceeding 8GB. But I guess it's gonna take some people a few days to catch up....

Then there's future games, and the answer is: who knows? Certainly, I've never said 8GB is enough for games that'll come in the future, because I don't know. I can speculate; the consoles don't have 8GB of VRAM, let alone 10. DirectStorage will be a thing, which means future games should start leaning more on texture streaming and less on trying to keep as much in VRAM as possible. Could VRAM requirements actually decrease? It's possible. I don't know for sure, though. What I do know is that I have never said 8 or 10GB will definitely be 'enough'; 'enough' is such a vague word that I wouldn't bother anyway. But you watch: people will continue to accuse me of saying otherwise.

So far we have established that the 3070 can be up to 38% slower than the 3080. We've also discovered that it's only 27% slower when not hampered by VRAM at 4K.

Three games have been shown to USE (remember, I said use, not cache or pool or anything else) more than 8GB, and pretty much the entire tech press have voiced their concern.
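
Just to put those percentages into frames, since "slower" always needs a baseline (the 100fps figure here is purely hypothetical):

[CODE]
# What "38% slower" means in frames, against a hypothetical 3080 at 100fps.
fps_3080 = 100.0
print(fps_3080 * (1 - 0.38))   # 62.0 fps where the 3070 is VRAM-hampered
print(fps_3080 * (1 - 0.27))   # 73.0 fps where it isn't
[/CODE]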

And that still isn't enough for people? We even had someone here thinking that what Nvidia said was gospel; you couldn't make it up.

When I first got (note, first got) my Fury X cards, plural, 4GB was enough. Then about two months later a game came out that totally refused to even run. In fact, it would black-screen and freeze my rig. At first, settings were disabled, yet you could hack them back into life. And so it went, and it only got worse.

What I would like to see now, though, is more press on this. I.e., in the games it absolutely does run short in (and starts streaming from system memory), what exactly is happening to the metrics of the game? Obviously FPS is lower, which is why overall the 2080 Ti is quite far ahead of it, even though it's dead even at 1440p. But what is happening to frame times and latency?
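
For anyone wanting to dig into that themselves, the number to watch is the "1% lows": the frame rate implied by your slowest 1% of frames, which is where streaming stutter shows up even when the average looks fine. A rough sketch of the maths, assuming you've logged per-frame times in milliseconds from whatever capture tool you use:

[CODE]
# Rough sketch: average fps and "1% low" fps from logged frame times (ms).
# Streaming stutter barely moves the average but tanks the 1% low.
def fps_metrics(frame_times_ms):
    n = len(frame_times_ms)
    avg_fps = 1000 * n / sum(frame_times_ms)
    slowest = sorted(frame_times_ms)[int(n * 0.99):]   # worst 1% of frames
    low_1pct_fps = 1000 * len(slowest) / sum(slowest)
    return avg_fps, low_1pct_fps

# e.g. mostly 10ms frames with the odd 40ms hitch as textures stream in
times = [10.0] * 990 + [40.0] * 10
print(fps_metrics(times))   # ~97 fps average, but only 25 fps 1% lows
[/CODE]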

If it were any other time, I would think Nvidia was doing us all a big favour this time around. However, it's blatantly clear they are not. The two new consoles are coming AND AMD have what appears to be something resembling real competition. I almost posted earlier "Yeah yeah, 8GB is enough blah blah... until tomorrow".

I know that many in this day and age see £500 as small change. Thus? They won't care. However, £500 to many is an awful lot of money for something that could be pretty screwed within a year. I, however, think that £500 for a 1440p card is steep. I bought my 2070S at launch for £428, IIRC.

Like I said, time will tell. You may get a year before the dev cycle hits full swing; maybe you won't. BTW, when I said earlier about my VRAM usage, I was agreeing with you: the 9.5-10.5GB it was using was obviously not what it needed. And I know what you are talking about with caching etc., i.e. it will use as much as it feels like. I, however, am talking about what happens when there is not enough in the first place. So far today, three titles have been shown to use more than that at 4K. The rest? Some of them use far less. However, you cannot make absolute claims about things you either don't know about or have yet to see.

A good point was raised: "People don't realise these new consoles have twice as much VRAM as the ones now". Maybe not a good point without clarity, but with clarity I will say this. Everyone here knows that PC games are often terribly optimised, and quite often pretty bloody lazy efforts. I have seen games in the past that were so terrible that cards to run them at full settings did not even exist (Crysis and GTA 4, to name two). This was sloppy and it was lazy. They got the games to run on the consoles they knew they would make the most money on, then got them running on a PC (badly). If you don't think that will hold true? FFS, devs struggle to even finish games these days. We went through about three solid years where games didn't even work, and in the case of Batman AA it never properly worked and is still being fixed by coders today.

If all of a sudden Microsoft and Sony say "Here, have double the VRAM", do you think the devs are going to think to themselves, "Well, what about all of the folk using PCs? What is going to happen if we double our VRAM use, and should we put in more effort so the games are slightly better on a PC?", or do you think they are going to say, "Sod that, that will take time and money; just give it to them as it is"?

Nvidia know this too. That is why, when they were approached about it, they just did what they usually do and spoke a load of poo. They know full well how little they can get away with; that has been their game for bloody years. If that wasn't their mindset, they would have gone with TSMC. Fact is, they are in it for the money (shock horror), and thus you can't believe a single word they utter. If you need any proof of that, just skim back over this entire fiasco of a launch they have just done. Just lots and lots of lies.

Edit: I'll tell you what else I think. If Nvidia had more VRAM on their cards than AMD, nearly everyone here would be saying "It needs more VRAM!" about the AMD card.
 
Soldato
Joined
17 Aug 2003
Posts
20,158
Location
Woburn Sand Dunes
So far we have established that the 3070 can be up to 38% slower than the 3080. We've also discovered that it's only 27% slower when not hampered by VRAM at 4K.

Three games have been shown to USE (remember, I said use, not cache or pool or anything else) more than 8GB, and pretty much the entire tech press have voiced their concern.

And that still isn't enough for people? We even had someone here thinking that what Nvidia said was gospel; you couldn't make it up.
Bandwidth: 448GB/s vs 760GB/s. There's a massive difference in memory bandwidth between the 3070 and the 3080. That's the reason the performance drops off even more against the 3080 than it does against the 2080 Ti. It's not the size of the buffer; it's the width of the bus.
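
Those two numbers fall straight out of bus width times effective memory data rate (the 3070 is 256-bit GDDR6 at 14Gbps, the 3080 is 320-bit GDDR6X at 19Gbps):

[CODE]
# Memory bandwidth (GB/s) = bus width in bits / 8 * effective data rate (Gbps)
def bandwidth_gb_s(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gb_s(256, 14))   # 3070: 256-bit GDDR6  @ 14Gbps -> 448.0 GB/s
print(bandwidth_gb_s(320, 19))   # 3080: 320-bit GDDR6X @ 19Gbps -> 760.0 GB/s
[/CODE]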
 
Associate
Joined
25 Sep 2020
Posts
128
You can't tell him; he doesn't care.

He will just bang on and on and on about how it's perfectly fine.

This is exactly what I predicted yesterday. Nvidia wanted to focus on 1440p because they knew it matches a 2080 Ti, which is what they had said in their marketing. But this guy knows more than Nvidia. Guys like that? Don't waste your time on them.

Yeah, I was one of those guys saying that allocation and usage are different, but after seeing the actual results I now don't think 8GB will cut it at 4K for long. Sure, it will play any game out right now, including MSFS 2020 and Doom Eternal at high texture settings, but next-gen games? Definitely not maxed out, but at high/medium settings? Sure.

10GB is also cutting it close, but it looks like it will be enough for two years or so at 4K...

I am planning to buy for 1440p 144Hz. What are your thoughts about the 3080 10GB at 1440p 144Hz? Do you think 10GB is enough for 1440p? I have heard people saying it's kind of on the low end going forward at 4K, but I'm kinda curious to see what y'all think about 1440p...

I am going to wait until February 2021. If they release a 3080 Ti with 12GB of GDDR6X (at competitive prices) I might grab that one; that is the latest GPU being leaked right now, along with a 3070 Ti with 10GB of GDDR6X.
 