Is 8GB of VRAM enough for the 3070?

Soldato
Joined
17 Aug 2003
Posts
20,158
Location
Woburn Sand Dunes
Frosty, I know you'd do the same work and have the same answers if the tables were turned and it was AMD shipping the smaller VRAM buffers. There's nothing there to suggest Nvidia really has anything to do with your view on the matter. People need to recognise the difference between what you are disputing (and the lengths you've gone to in providing information and numbers to back your argument up) and what you are leaving up to opinion. But I guess it's easier to call you a shill, in order to dismiss your whole argument in one wallop, than to actually read and comprehend what you're writing. Nothing unusual though, and nothing unique to this forum either *shrug*
 
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
Also, Guru3d benchmarks show WD:L performance at 4K Ultra with a 3070 neck and neck with a 2080Ti, which has 11Gb of vRAM, so whatever is happening in their benchmarks the vRAM isn't an obvious performance hindrance; if there were major-league stuttering due to running out of vRAM, it would show up in those results.

https://www.guru3d.com/articles_pag..._graphics_performance_benchmark_review,7.html

They also specifically discuss vRAM usage at the end of the article here, where even completely maxed out it stays under 8Gb of vRAM.

https://www.guru3d.com/articles_pag..._graphics_performance_benchmark_review,9.html

One final note is that WD:L's DRM actually prevents tools like MSI Afterburner from using the D3D API to fetch real vRAM usage, as noted in the resetera forum thread regarding real vRAM usage. Hopefully a cracked copy will allow inspection of the real vRAM usage, and again my guess is that it'll be lower than 8Gb.

A lot of the people testing this and seeing 9 to 9.5Gb of vRAM being used are using 10Gb 3080s, and that's indicative of games that simply allocate as much memory as they can, but not necessarily of what they're using. The lack of any penalty when moving to <8Gb configs in this kind of testing indicates this is the case.
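As a side note on how a per-process figure differs from the adapter-wide "allocated" number: on Windows a process can query its own dedicated-VRAM budget and current usage through DXGI. The sketch below is my own minimal illustration of that query (it is not how Afterburner/RTSS are actually implemented, and a standalone program like this will report near-zero usage because it allocates nothing itself):

```cpp
// Minimal sketch: ask DXGI for this process's dedicated-VRAM budget and usage.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1; // GPU 0

    ComPtr<IDXGIAdapter3> adapter3; // exposes QueryVideoMemoryInfo (Windows 10+)
    if (FAILED(adapter.As(&adapter3))) return 1;

    // "Local" segment group = dedicated VRAM on a discrete card. The values
    // returned are for the calling process, which is why they differ from a
    // global "memory allocated" readout covering every process on the GPU.
    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    if (FAILED(adapter3->QueryVideoMemoryInfo(
            0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info))) return 1;

    const double gib = 1024.0 * 1024.0 * 1024.0;
    printf("Budget for this process:        %.2f GB\n", info.Budget / gib);
    printf("Currently used by this process: %.2f GB\n", info.CurrentUsage / gib);
    return 0;
}
```

Even this per-process number is still what the process has committed, not a frame-by-frame working set, which is why the allocated-vs-actually-needed distinction matters so much in these arguments.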

And actually this kinda proves my point, because in these benchmarks the vRAM is being pushed fairly high (we don't know exactly how high) but we're also seeing the 3070 start to struggle. 44fps is not exactly ideal; playable for some people, but certainly not everyone. As I have maintained all along, these games also stress the GPU, and which gives out first is what matters, and this card in this game at these settings is getting borderline unplayable.
 
Soldato
Joined
6 Oct 2007
Posts
22,281
Location
North West
I have spent a few hours with WD:L today and I have to say - 4K quality assets are finally here! The game is absolutely gorgeous and the textures hold up incredibly well. This is the kind of thing we have more of to look forward to going forward, thanks to the bigger VRAM in consoles. :)

Not mine, but they speak for themselves:

Yeah, PS5/XBX is going to push that VRam big time with texture quality, check the guy out below crying over the vram use at 1080p in a current gen game.

 
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
Frosty, I know you'd do the same work and have the same answers if the tables were turned and it was AMD shipping the smaller VRAM buffers. There's nothing there to suggest Nvidia really has anything to do with your view on the matter. People need to recognise the difference between what you are disputing (and the lengths you've gone to in providing information and numbers to back your argument up) and what you are leaving up to opinion. But I guess it's easier to call you a shill, in order to dismiss your whole argument in one wallop, than to actually read and comprehend what you're writing. Nothing unusual though, and nothing unique to this forum either *shrug*

Yeah, it's purely a technical question, and it's one we can objectively measure with the tools we have available. Most of the people who blow right past the arguments on their merits and go straight to accusations of shilling are pivoting away from the original discussion; it's a pretty common tactic.

Accusations of shilling were only a matter of time with him though. I've read carefully, and thought carefully, about a lot of his posts about how Nvidia are "short changing" people on vRAM, as if by putting less on the card they benefit and pocket the difference. He's quick to attribute things like this to malice, when the reality is way more boring: vRAM costs money, and if Nvidia buy more to put on the card they have to charge consumers more. It's extremely basic business/economics. Anyway, it's not worth engaging with people who act in bad faith like that, so I've put him on ignore. Accusations of shilling are just completely uncalled for.
 
Soldato
Joined
6 Feb 2019
Posts
17,556
Yeah, PS5/XBX is going to push that VRam big time with texture quality, check the guy out below crying over the vram use at 1080p in a current gen game.


Yep, like I said, most games today are shipping with downgraded, low-quality assets because most GPUs don't have much VRAM, but that's changing very quickly.
 
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
Yeah, PS5/XBX is going to push that VRam big time with texture quality, check the guy out below crying over the vram use at 1080p in a current gen game.


Pretty uncharitable take, to be honest. He mentions vRAM being 8.6Gb of the 10Gb but doesn't "cry" about it; in fact he makes no judgement at all about vRAM usage. The only thing he's "crying" about is performance at 1080p. And quite frankly that's probably because ray tracing is trashing his performance: he has ray-traced reflections set to Ultra, which will be a performance hog, as we all know RT is demanding.

Again, this kinda confirms what I've been saying. He's playing on a 3080 that has 10Gb of vRAM and he's got a completely trash frame rate at 1080p, yet he's well within his vRAM budget. We don't know how far inside that vRAM budget he is, because he can't measure vRAM in use, only what is allocated, and we know those two things differ. My bet is that it's sub-8Gb, like every single other game I've personally tested that people claimed was >8Gb.

So it's like... yeah, what I said all along: newer games are more demanding on vRAM, but they're also more demanding on the GPU. He's inside his vRAM budget but his frame rate is garbage. Evidence that the amount of vRAM is appropriate for the card; the GPU struggles before it becomes vRAM constrained.

I can't wait to test this, because now I know these vRAM measurements are allocations rather than memory in use, I can be reasonably sure that real memory usage will be a lot lower and that this whole 8Gb thing is another nothing burger, like the last few attempts that ignored the difference between these two metrics. Hopefully the DRM crack will enable inspection of the memory to give us real usage and we can see what the fuss is all about :D

Anyone want to place some bets?
 
Soldato
Joined
8 Jun 2018
Posts
2,827
Yeah, PS5/XBX is going to push that VRam big time with texture quality, check the guy out below crying over the vram use at 1080p in a current gen game.

Wow, he's at 8GB at just 1080p? LOL

Here is 4K

I'm noticing slight stutters here and there even though he says it's smooth. It plays like any other open world game that is between 30-45 FPS.
And his vram usage is right at 10GB.
 
Soldato
Joined
23 Apr 2010
Posts
11,896
Location
West Sussex
FWIW I think 10gb *is* enough. I think the 3080 is a decent card, minus all of the shenanigans, and the bait-and-switch pricing etc.

Nothing will convince me 8gb is enough though. It may cling to life at 1440p for a while, but not for anything I could consider close to the lifetime of the card.

I had the Titan XP I bought used for 3.5 years, and if I still had it, it would still be a fantastic card, more than good enough for 1440p. I suppose if you are happy replacing it (3070) in a year or two? Fair enough, it's your money after all. GPUs these days seem to last a lot longer than ten years ago, that's for sure. I used to upgrade every year or two back then, as I hated playing new games looking like 'doo. Now? I am happy to just leave it be until it becomes an issue, which, oddly enough, in the 3.5 years I had my Titan XP it never was. I mean yeah, it went from being a 4K card to a 1440p card, so I bought a 1440p monitor (which cost me way less than a new GPU). TBH I am happy with 1440p anyway for the most part, but I do render some games at 4K using DSR as I feel it improves them.

I mean, look, this is the same old crap that went on when Nvidia released the 6GB 1060 and the 3GB one. "Buy the 3GB! It's not worth the extra, it's more than capable, blah blah." Only, if you bought the 3GB? Congratulations, you now have a paperweight. It's the same reason why the 780 and 780Ti are now pretty much useless, even if you were hanging on at 1080p, and why the 970 is still enough (because it has .5GB more lmao).
 
Soldato
Joined
19 May 2012
Posts
3,633
Well, the extra vRAM offers no benefit but costs the consumer: the price of those additional memory modules is passed on by making the card more expensive than it otherwise needs to be.

RTX is a trade-off, better visuals at the cost of frame rate, which DLSS is designed to help mitigate. If you don't think the trade-off is worth it then I have no argument with that; extra visuals are always a trade-off with performance, and each is a personal choice we make based on our subjective preferences. I can only say that I personally want to play games like Cyberpunk 2077 in all their ray-traced glory, and I'm happy to mitigate the performance loss with DLSS because the quality of DLSS 2.0 and 2.1 is extremely good.

The extra VRAM surely comes in useful in games such as modded Skyrim and the newly released Watch Dogs: Legion, which seems to go over the 8GB VRAM limit in some instances. Or do we just ignore these games?

I can say that in my experience, RTX has caused FPS to plummet considerably, sometimes to sub-60fps, which is just unacceptable, and this is with DLSS enabled. As games get more complex, I think this is going to be common even on NVIDIA's highest-end cards, as evidenced by Watch Dogs: Legion.
 
Soldato
Joined
19 May 2012
Posts
3,633
I'm quietly confident that in 24 months, 6800XT owners will be better off than 3080 owners in terms of the longevity of their GPUs.
I still might go 3080 to get G-Sync with the LG OLED CX, which seems to work perfectly: 4K 10-bit 120Hz output available via the control panel (rather than 8/12-bit) and 2x HDMI 2.1 ports for my projector/AVR and TV.
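For anyone who wants to sanity-check the 4K 10-bit 120Hz claim, here's a rough back-of-envelope sketch (my own illustration, assuming standard CTA-861 blanking, RGB 4:4:4 and no DSC; exact encoding overheads vary) of why that signal needs HDMI 2.1 rather than 2.0:

```cpp
// Back-of-envelope: does 4K 120Hz 10-bit RGB fit in HDMI 2.0 or 2.1?
#include <cstdio>

int main() {
    // Assumed CTA-861 timing: 3840x2160 active inside a 4400x2250 total raster.
    const double totalPixelsPerFrame = 4400.0 * 2250.0;
    const double refreshHz = 120.0;
    const double bitsPerPixel = 3 * 10; // RGB 4:4:4, 10 bits per channel

    const double neededGbps = totalPixelsPerFrame * refreshHz * bitsPerPixel / 1e9;

    const double hdmi20Gbps = 14.4; // 18 Gbps TMDS minus 8b/10b overhead
    const double hdmi21Gbps = 42.7; // 48 Gbps FRL minus 16b/18b overhead

    printf("4K 120Hz 10-bit RGB needs ~%.1f Gbps\n", neededGbps); // ~35.6
    printf("HDMI 2.0: %.1f Gbps -> %s\n", hdmi20Gbps,
           neededGbps <= hdmi20Gbps ? "fits" : "does not fit");
    printf("HDMI 2.1: %.1f Gbps -> %s\n", hdmi21Gbps,
           neededGbps <= hdmi21Gbps ? "fits" : "does not fit");
    return 0;
}
```

So roughly 35.6 Gbps is needed, which is well beyond HDMI 2.0's ~14.4 Gbps of usable bandwidth but within HDMI 2.1's budget, hence the 2.1 requirement.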
 
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
The extra VRAM surely comes in useful in games such as modded Skyrim and the newly released Watch Dogs: Legion, which seems to go over the 8GB VRAM limit in some instances. Or do we just ignore these games?

I can say that in my experience, RTX has caused FPS to plummet considerably, sometimes to sub-60fps, which is just unacceptable, and this is with DLSS enabled. As games get more complex, I think this is going to be common even on NVIDIA's highest-end cards, as evidenced by Watch Dogs: Legion.

We don't know how much Watch Dogs: Legion uses, because we can only measure the memory allocated to it; we cannot see the memory actually used by the GPU. It almost certainly uses less than 8Gb, and the second I can get my hands on it for testing I'll do just that and post the results. I'm not ignoring it, I've addressed it in a number of my posts already, specifically showing Guru3d benchmarks that indicate an 8Gb 3070 doesn't seem to suffer at all for its vRAM limitation vs 11Gb cards like the 2080Ti, a good marker that it's not vRAM constrained.
 
Soldato
Joined
19 May 2012
Posts
3,633
We don't know how much Watch Dogs: Legion uses, because we can only measure the memory allocated to it; we cannot see the memory actually used by the GPU. It almost certainly uses less than 8Gb, and the second I can get my hands on it for testing I'll do just that and post the results. I'm not ignoring it, I've addressed it in a number of my posts already, specifically showing Guru3d benchmarks that indicate an 8Gb 3070 doesn't seem to suffer at all for its vRAM limitation vs 11Gb cards like the 2080Ti, a good marker that it's not vRAM constrained.


My modded Skyrim VR also destroys FPS once it goes over 8GB of VRAM. TBH it's just easier to revisit this topic in 12-24 months and see who was right and who was wrong.

We have one side that believes video game development will not progress with a new generation of consoles from a VRAM point of view, and we have the other side which thinks it will (because historically it has with every single console generation).

Time will tell.

If NVIDIA are clever and pro-active, they'll work with studios to add DLSS to any games which do play up on their cards. But I don't have much faith in that: with my 2080 I ran into many, many recently released games which made it cry at 4K, and it would have been great to have a DLSS option, which is kind of why I don't care for DLSS. In the time I've owned my 2080, DLSS has helped me twice in the games I like to play. That's pretty pathetic.
 
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
We have one side that believes video game development will not progress with a new generation of consoles from a VRAM point of view, and we have the other side which thinks it will (because historically it has with every single console generation).

No, this is a complete straw man of what I've repeatedly said.

I've explicitly agreed that future games will need more vRAM, BUT they will also need faster GPUs, and that it's reasonable to expect the GPUs in modern cards to struggle and become the limiting factor before the vRAM budget does. I've been super, ultra, mega clear on this point. Watch Dogs, as one of the most modern games with all the new bells and whistles, demonstrates my point clearly.
 
Soldato
Joined
16 Aug 2009
Posts
7,739
I'm quietly confident that in 24 months, 6800XT owners will be better off than 3080 owners in terms of the longevity of their GPUs.
I still might go 3080 to get G-Sync with the LG OLED CX, which seems to work perfectly: 4K 10-bit 120Hz output available via the control panel (rather than 8/12-bit) and 2x HDMI 2.1 ports for my projector/AVR and TV.

Nvidia aren't about longevity; they're only interested in this upgrade cycle, maybe only this year. Their drivers are the same: once the new cards come along they don't bother optimizing for the older ones. Longevity may be good from the consumer's point of view, but it's anathema to Nvidia's product and profit cycle.
 
Soldato
Joined
19 May 2012
Posts
3,633
No, this is a complete straw man of what I've repeatedly said.

I've explicitly agreed that future games will need more vRAM, BUT they will also need faster GPUs, and that it's reasonable to expect the GPUs in modern cards to struggle and become the limiting factor before the vRAM budget does. I've been super, ultra, mega clear on this point. Watch Dogs, as one of the most modern games with all the new bells and whistles, demonstrates my point clearly.


So, 8GB of VRAM is perfectly matched to the 3070's GPU power.
Does this mean the 2080Ti, 1080Ti and 1080 had TOO much VRAM, since they're obviously equal to or nowhere near as fast as the 3070?

NVIDIA at that point in time were just giving away free VRAM for no reason... doesn't sound like them...
 
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
So, 8GB of VRAM is perfectly matched to the 3070's GPU power.
Does this mean the 2080Ti, 1080Ti and 1080 had TOO much VRAM, since they're obviously equal to or nowhere near as fast as the 3070?

NVIDIA at that point in time were just giving away free VRAM for no reason... doesn't sound like them...

Is it perfect? No, I never said that, but I believe it's close enough. Seeing something like Watch Dogs: Legion release as a new game and not be vRAM constrained, but be right on the very edge of being GPU constrained, shows that it's been paired very intelligently: the vRAM is getting near its limits right as the GPU nears its limit. Same with something like the 3080. If you look at games that genuinely demand somewhere close to 10Gb (not just allocate that much vRAM but actually use it all), like FS2020, you see a similar thing: around 9.5Gb of vRAM usage and the 3080 is on its knees, the GPU starting to choke just before the vRAM limit is reached. That's why I think it's very likely a 3080 Ti variant would run with a 12Gb vRAM config as sufficient for its supposedly slightly better GPU.

When measured properly (again, measuring actual real vRAM usage), yes, I believe the 1080Ti and 2080Ti at 11Gb likely never used more than 8Gb and saw zero benefit over the 8Gb models. Now that we can measure real vRAM usage, people have gone back and looked at loads of the more modern/recent games, and the overwhelming outcome is that even at 4K Ultra most of those games do not exceed 6Gb of vRAM usage, unless you're that fabled person running 1000+ Skyrim mods. I think the decision to go with 11Gb was purely down to limitations of the architecture: the bus width of those cards was wider than their 8Gb counterparts', which constrained the allowable memory configs to something awkward like 11Gb or 5.5Gb, and it was decided 5.5Gb was too low, so they went with 11 to avoid bottlenecks. Probably a similar reason to why AMD have rounded up to 16Gb this round.
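For what it's worth, the awkward-capacities point is just arithmetic: each GDDR chip hangs off a 32-bit slice of the memory bus, so the bus width fixes the chip count and the chip density fixes the total. A quick illustration (my own numbers for a 352-bit bus; the densities are the commonly available 0.5GB and 1GB parts of that era):

```cpp
// How memory bus width constrains VRAM capacity options.
#include <cstdio>

int main() {
    const int busWidthBits = 352;        // 1080Ti / 2080Ti class memory bus
    const int chips = busWidthBits / 32; // one GDDR chip per 32-bit channel -> 11

    const double chipDensitiesGB[] = {0.5, 1.0}; // common per-chip capacities
    for (double d : chipDensitiesGB)
        printf("%d chips x %.1f GB = %4.1f GB total\n", chips, d, chips * d);

    // Prints 5.5 GB and 11 GB: the only sensible options on a 352-bit bus.
    // A 256-bit bus gives 8 chips, i.e. 8 GB or 16 GB, which is where the
    // 3070 and the 16GB AMD cards land.
    return 0;
}
```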

No, Nvidia weren't giving away free RAM. They purchase the RAM from suppliers to put onto the boards, that purchase costs them money, and to continue to make a profit they have to pass those memory costs on to the consumer (us) through higher prices for that card. That makes the cards less appealing to customers, and they want to remain competitive, so they keep the vRAM to only as much as is useful and no more.
 
Associate
Joined
25 Sep 2020
Posts
128
I'm quietly confident that in 24 months, 6800XT owners will be better off than 3080 owners in terms of the longevity of their GPUs.
I still might go 3080 to get G-Sync with the LG OLED CX, which seems to work perfectly: 4K 10-bit 120Hz output available via the control panel (rather than 8/12-bit) and 2x HDMI 2.1 ports for my projector/AVR and TV.

At 4K, yeah, next-gen games will ask for around 10-12GB for ultra textures, I'm pretty sure.

What do you think about 1440p? I think next gen is going to be fine on a 3080, which I'm planning to buy for 1440p.
 
Soldato
Joined
18 May 2010
Posts
22,370
Location
London
Wow, he's at 8GB at just 1080p? LOL

Here is 4K

I'm noticing slight stutters here and there even though he says it's smooth. It plays like any other open world game that is between 30-45 FPS.
And his vram usage is right at 10GB.

We need to see what kind of vRAM usage 3090 owners get.

Thing is, we are in a bit of a pickle. Stick with Nvidia and the 3080 and potentially have vRAM issues. Go AMD for the 16GB of vRAM, but then the RT performance will be less than the RTX 3000 series.

We can't win either way.
 