Is 8GB of VRAM enough for the 3070?

Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
For 1440p, 8GB is OK right now. Doom Eternal will use over 7GB on Nightmare, but at 4K it will overflow the memory buffer. I don't judge cards on what they can do; what they can't do matters more to me. For example, if a card does not have enough VRAM for even one game at 4K, then it is not a 4K card. Why do I look at it like this? Because of the price of these cards compared to other gaming equipment.

I think that standard is calibrated a bit too sensitively. Some games can be arbitrarily demanding because they've not been optimised, or because they have "future proofed" settings, which you see with games like the original Crysis and now of course Crysis Remastered, and with games like FS2020. These games can't run at 4K ultra even on a 3090, and calling the 3090 not a 4K card would be pretty controversial, I feel. The same goes for what I consider generally badly optimised games; Ark: Survival Evolved comes to mind, which maxed out barely maintains 60fps at 1440p on a 3080. This standard is far too sensitive to outliers, especially when the outliers might not be games you even play.

Doom Eternal was used because it's a game that likes more VRAM than the 2080 they were comparing against has.

A cursory search turns up benchmarks that conflict with that claim:

here https://www.techspot.com/article/1999-doom-eternal-benchmarks/
here https://www.techpowerup.com/review/doom-eternal-benchmark-test-performance-analysis/4.html
here https://www.kitguru.net/components/graphic-cards/dominic-moass/doom-eternal-pc-performance-analysis/
here https://www.eurogamer.net/articles/digitalfoundry-2020-nvidia-geforce-rtx-3080-review?page=2

All show the 4K ultra/nightmare preset with 8GB cards like the 2080 beating 11GB cards like the 1080 Ti.

So the question should not simply be "Is 8GB on the 3070 enough?" but "Which resolutions is it enough for?". Hint: not 4K.

Right, but by your own standard no card is a 4K card. You're taking a single dubious example, one with a lot of evidence against it, and using that singular exception to say a card isn't 4K. By that standard, no card qualifies.

And I'm going to get really pedantic about this when it comes to Doom. It uses the id Tech 7 engine, an evolution of id Tech 6, and I investigated claims of 10GB not being enough in the 3080 thread using another game on that engine family, Wolfenstein II. The claim people made was that if you ran the game at what they were calling 8K or 16K textures, you could reach 18-20GB of vRAM usage. After spending some time investigating, it turned out the config value they were editing was NOT texture resolution (as the terms "8K textures" and "16K textures" would suggest) but the "is_poolSize" value, which is the memory pool in vRAM, measured in MB, allocated by the engine for texture streaming. So when they set this to 16,000, thinking it meant 16K textures, they were actually setting the reserved vRAM pool to 16GB, which accounted for the jump to 18+GB of vRAM usage. Note that the game pre-allocated all of this vRAM before a level was even loaded, so that usage figure was not a measure of vRAM being put to good use.

From my own testing, the texture settings in the Wolfenstein II menu were altering this same config value; the presets set something like 1GB, 2GB or 4GB as the reserved pool. It didn't have a direct impact on texture quality; it simply gave the engine a bigger memory pool for texture swapping, which in some areas, under some circumstances, stopped texture pop-in. And the people reviewing the so-called "16K textures" were doing side-by-side comparisons on YouTube and seeing that, indeed, there's no visible difference in game.
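To put numbers on that misreading, here's a minimal sketch (illustrative Python, not the game's actual code) of what that edit really did:

```python
# What "16k textures" actually meant in Wolfenstein II: is_poolSize is a
# streaming pool size in MB, not a texture resolution. The value below is
# the one from the claims described above.
edited_value_mb = 16_000          # typed into the cfg, believed to mean "16K textures"

pool_gb = edited_value_mb / 1024  # MB -> GB
print(f"is_poolSize {edited_value_mb} reserves ~{pool_gb:.1f}GB of vRAM")
# -> reserves ~15.6GB at launch, before a single level has loaded, which
#    explains the jump to 18+GB of total reported usage.
```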

I feel this is relevant to Doom because these texture presets are doing the same thing as in Wolfenstein: they pre-set a pool of reserved vRAM which the game then uses for texture swapping as you move through the level, first painting low quality textures on surfaces that are far away or out of sight, then swapping them to full resolution as you get near them, and flushing old unused textures out of memory once you're far away from the surfaces that use them.
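As a rough illustration of that streaming scheme, the pool behaves something like an LRU cache over a fixed vRAM budget. This is a toy Python sketch of the idea, not id Tech's actual logic:

```python
from collections import OrderedDict

class TexturePool:
    """Toy model: a fixed vRAM budget with least-recently-used eviction."""

    def __init__(self, budget_mb: int):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()          # texture name -> size in MB

    def used_mb(self) -> int:
        return sum(self.resident.values())

    def request(self, name: str, size_mb: int) -> None:
        if name in self.resident:              # already resident at full res:
            self.resident.move_to_end(name)    # just mark it recently used
            return
        while self.resident and self.used_mb() + size_mb > self.budget_mb:
            self.resident.popitem(last=False)  # flush the stalest texture
        self.resident[name] = size_mb          # stream in the full-res texture

pool = TexturePool(budget_mb=1024)             # a "Low"-sized pool
for tex in ("floor", "wall", "demon", "sky"):
    pool.request(tex, 300)                     # 4 x 300MB can't all fit
print(pool.used_mb(), list(pool.resident))     # 900 ['wall', 'demon', 'sky']
```

The point being: a bigger budget only means fewer evictions and refetches. The textures that are resident look exactly the same either way.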

So one question would be: does this even change the visual quality in game? Has anyone even checked? Has anyone checked memory usage with the new MSI Afterburner, which measures per process rather than just memory allocated? I don't own the game, but I might acquire it just to double-check some of these claims, because after a lot of research into similar claims for Wolf II I'm now familiar with how this all works, and I'm highly skeptical, especially considering the inconsistent benchmark reviews for this game showing 8GB cards on Nightmare beating 11GB cards just fine.

The problem as I see it? The XBSX is £450 and runs 4K. The cheapo Xbox model runs 1440p and is considerably cheaper than a 3070 on its own, never mind the extra £500 in parts to make a complete PC, and then of course a monitor, keyboard, mouse and desk.

Euugh, no, not really. It's marketed as 4K, but let's face it, the games have their visual settings dumbed down until 4K is playable; the APUs in the consoles are middle of the road compared to a PC, which is why they're so cheap. To jackknife so seamlessly from expecting ultra super duper mode in Doom on the PC to the middle-of-the-road equivalent settings the consoles will use is really... weird.
 
Soldato
Joined
23 Apr 2010
Posts
11,896
Location
West Sussex
I think when the facts actually emerge, the 2080Ti will slot somewhere between the 3070 and 3080 price-wise. Even though I wouldn't consider the 2080Ti a great 4K card (it's pretty pants without DLSS at 4K), it does have more VRAM, which will be very handy to anyone rendering or whatnot. Its uses other than gaming will bolster the price a little.

You need to consider that most people already have a TV. It might not be 4K, but the console will still supersample anyway, making the image look better than a native 1080p image. My TV is only 1080p, but I noticed a nice difference in visual clarity when I switched my PS4 for a PS4 Pro (playing TLOU Remastered). It was a much nicer experience with the extra grunt and clarity.

Grim - they have basically made a rod for their own back, IMO. They exploited the weaknesses of the 2080 in some dumbass show of strength for the 3080, whilst simultaneously exposing what will be the issues with the 3070. I mean, if the 2080 brick-walls in Doom with 8GB of VRAM, how is the 3070 going to be any better?

They really didn't need to do that. The facts would have sufficed, you know? "Here's the 3080: 30% faster than the 2080Ti at 4K, about 25% faster at 1440p, and only £650" would have done perfectly well. Instead, their own trash talk will now catch them out and expose the weaknesses of what will probably be their biggest-selling card lol.
 
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
So this is the first level of Doom Eternal on the rig in my sig, although I'm on a single MSI GTX 1080 right now because I gave the second one to my friend.

The settings are 4K screen resolution, no vsync, and the Ultra Nightmare preset, which sets every video option to Ultra Nightmare. The FOV is default at 90; film grain and chromatic aberration are at their defaults; and I'm not using any colourblind tweaks. The only things that aren't on are HDR (I don't have an HDR monitor) and resolution scaling.

I'm also running the MSI Afterburner beta, which you can find, along with instructions for listing vRAM more accurately, here: https://www.resetera.com/threads/msi-afterburner-can-now-display-per-process-vram.291986/. In my screenshot below, the OSD shows the typical vRAM figure Afterburner would report if you were just using it normally, which I've labelled "vRAM Allocated", and the counter called "GPU Dedicated Memory Usage \ Process", which I've labelled "vRAM used"; the latter is what people have been using to measure so-called "real" usage.

Warning: the full-size image is 17MB.

Doom-Eternal-Screenshot-2020-10-26-22-27-16-53.png


The graphics menu lists my settings as using 7561 MiB
Memory allocated, as measured by Afterburner, is 7786 MB
Memory used, as measured by Afterburner's per-process counter, is 6945 MB
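As a quick sanity check on how those three figures relate (a sketch; it assumes the in-game menu counts binary MiB and Afterburner decimal MB, which neither tool actually documents):

```python
MIB = 1024 ** 2   # assumption: the graphics menu reports binary MiB
MB  = 1000 ** 2   # assumption: Afterburner reports decimal MB

menu_mib, allocated_mb, used_mb = 7561, 7786, 6945

print(f"menu estimate : {menu_mib * MIB / MB:.0f} MB")   # ~7928 MB
print(f"used/allocated: {used_mb / allocated_mb:.0%}")   # ~89%
# The menu slightly over-estimates what actually gets allocated, and about
# a tenth of what is allocated never shows up as per-process usage.
```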

Final note on this: as with Wolfenstein II, which uses the prior iteration of the id Tech engine, the "Texture pool size" graphics option assigns the entire pool at game launch. That means the "vRAM used" measurement here, while the lowest and closest to the true value we can get with external tools, is probably still not the real-real usage: we're assuming the entire reserved pool actually fills up 100%, rather than, say, only 70%. The only way to know for sure is if the game has developer tools that let us inspect that pool and see its usage.

If anyone knows of console commands or mods that can display that kind of info, please let me know. Also, unlike Wolf II, Doom doesn't dump the graphics options into the config file that normally sits at Users\%userprofilename%\Saved Games\id Software\DOOMEternal\base\DOOMEternalConfig.cfg, which means I can't see what pool sizes the preset options actually set. Is this stored somewhere else now?
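If the setting is still written out as plain text somewhere, a quick sweep like this would find it (the folder layout is an assumption carried over from Wolf II; adjust to taste):

```python
from pathlib import Path

# Hypothetical hunt for wherever the pool size now lives: walk Saved Games
# and grep every cfg for a poolSize-style entry.
root = Path.home() / "Saved Games" / "id Software"
for cfg in root.rglob("*.cfg"):
    for line in cfg.read_text(errors="ignore").splitlines():
        if "poolsize" in line.lower():
            print(f"{cfg}: {line.strip()}")
```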


*edit*

Also worth noting that dropping from the Nightmare texture pool size down to High in that starting area and taking side-by-side screenshots from the exact same position shows zero difference in texture quality, which I sort of suspected would be the case. Afterburner confirms it drops the vRAM usage substantially, yet it has no visual impact, which suggests to me that whatever extreme pool size the Nightmare setting reserves simply isn't being filled up. It's less about the quality of the textures and more about how far ahead it can stream them in.
 
Soldato
Joined
17 Aug 2003
Posts
20,158
Location
Woburn Sand Dunes
Also worth noting that dropping from the Nightmare texture pool size down to High in that starting area and taking side-by-side screenshots from the exact same position shows zero difference in texture quality, which I sort of suspected would be the case. Afterburner confirms it drops the vRAM usage substantially, yet it has no visual impact, which suggests to me that whatever extreme pool size the Nightmare setting reserves simply isn't being filled up. It's less about the quality of the textures and more about how far ahead it can stream them in.

It's not like I haven't been saying this, but according to some people, dropping the texture pool setting down to something appropriate is 'gimping the settings' and absolutely not acceptable :o
 
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
It's not like I haven't been saying this, but according to some people, dropping the texture pool setting down to something appropriate is 'gimping the settings' and absolutely not acceptable :o

Yeah, it's kind of annoying that we don't seem to have a way to inspect that memory pool to see what's really going on. I've been looking through thousands of command-line CVARs to see if anything will dump memory usage information, but I can't see anything obvious. It means we have to infer what's going on in there with some testing.

First off, if you type is_poolsize into the console and hit Tab, it will autocomplete and tell you the current allocation, which is based on your Texture Pool Size setting.

At "Low" this is 1024MB (1GB)
At "Ultra Nightmare" this is 4608MB (4.5GB)
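To relate those pool sizes back to the question in the thread title, here's what each preset reserves out of an 8GB card (simple arithmetic on the console values above):

```python
presets_mb = {"Low": 1024, "Ultra Nightmare": 4608}   # from the console above
card_mb = 8 * 1024                                    # e.g. a 3070's 8GB

for name, size in presets_mb.items():
    print(f"{name:>15}: {size}MB = {size / card_mb:.0%} of an 8GB card")
#             Low: 1024MB = 12% of an 8GB card
# Ultra Nightmare: 4608MB = 56% of an 8GB card
```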

I've just done more testing, moving into later areas of the game, including large open spaces with distant objects, and toggling between the "Ultra Nightmare" and "Low" memory pools while standing exactly still between menu transitions to get accurate comparison shots. Again, Afterburner confirms the memory usage difference in real time; again, the side-by-side screenshots look identical, and the frame rate is identical too. This tells us something very important: if visual fidelity doesn't drop when you shrink the pool to 1GB, then all the assets for the area you're in must fit into 1GB of vRAM. And when you push the pool up to 4.5GB, visual fidelity doesn't improve; all you get is more headroom for texture swapping, to avoid possible texture pop-in during area transitions.

And you have to think that's about right, because the game is only a 40GB install on disk, and I believe the textures are compressed into the streamdb files, which make up 22GB of that. You're not stuffing a fifth of the entire game's textures into 4.5GB of vRAM in any one scene; that would be MENTAL. I'm playing on "Low" right now, moving through areas and not actually seeing any texture pop-in myself, and I suspect that might be down to my SSD setup: I have 2x Samsung 960 Pros in RAID 0, which maxes out the 4GB/sec of PCI-e bandwidth on that disk. I'll keep going on Low, see if I can notice any texture pop-in, and keep an eye on Task Manager for disk usage.
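Some back-of-envelope numbers on that (using the figures in this thread; this ignores decompression overhead and assumes the full disk bandwidth is actually reachable):

```python
streamdb_gb = 22     # texture data on disk, from the install breakdown
pool_un_gb  = 4.5    # "Ultra Nightmare" pool
pool_low_gb = 1.0    # "Low" pool
disk_gb_s   = 4.0    # claimed RAID 0 / PCI-e ceiling

print(f"UN pool vs all texture data: {pool_un_gb / streamdb_gb:.0%}")  # ~20%
print(f"full refill of the Low pool: {pool_low_gb / disk_gb_s:.2f}s")  # 0.25s
# Even refilling the entire 1GB "Low" pool from disk takes about a quarter
# of a second at 4GB/sec, which would explain seeing no pop-in on a fast SSD.
```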
 
Associate
Joined
25 Sep 2020
Posts
128
Also worth noting that dropping from the Nightmare texture pool size down to High in that starting area and taking side-by-side screenshots from the exact same position shows zero difference in texture quality, which I sort of suspected would be the case. Afterburner confirms it drops the vRAM usage substantially, yet it has no visual impact, which suggests to me that whatever extreme pool size the Nightmare setting reserves simply isn't being filled up. It's less about the quality of the textures and more about how far ahead it can stream them in.

So can you do a comparison across a few games, if possible, to see how much VRAM usage drops when you go from the highest texture setting down to High?

Like maybe 2-3 games? I would really appreciate that, as I don't have a good rig so I can't test it myself, but I really want to know, and I can't find a similar benchmark online anywhere.

Also, do you think the 3070's 8GB should be good for at least 2-3 years at 1440p? Looking at real usage, games at 1440p seem to use around 4-6GB, which leaves a lot of headroom even if next-gen games are demanding. And if they really are that demanding, I reckon I'll run out of GPU grunt before I run out of VRAM, so I can just drop textures or some other VRAM-hungry setting down a notch in the games that are extremely demanding.
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,587
Location
Greater London
It's not like I haven't been saying this, but according to some people, dropping the texture pool setting down to something appropriate is 'gimping the settings' and absolutely not acceptable :o
Yep. I have been saying the same.

Oh and by the way we are watching TNG at the moment and watched the episode in your sig last night. There are four lights!!! :p
 
Soldato
Joined
28 Oct 2011
Posts
8,407
I've spent a while thinking about this point of view and trying to work out how anyone could possibly hold it, because to me it just seems obviously wrong, and I think I've narrowed it down. Maybe that can close some of the gap in understanding between people, if we can get at the root cause.

There's this underlying expectation, as you said, of what you "SHOULD" get, and it seems entirely based on taking the history of what we've got in the past and extrapolating it into the future. But you have to understand that this is a fundamentally flawed way of thinking; it presumes things can carry on indefinitely, and they cannot. Technology is a sector that has had a lot of exponential growth, largely thanks to Moore's law: in computing, we get most of our additional computational power and memory capacity by packing more transistors into a small space. But that cannot go on forever. We know Moore's law will come to an end as we run into the limits of physical systems, and we're starting to see the effects of that now; Jensen said this explicitly at CES in 2019.

You reference people on a forum laughing off someone who told them we'd not see a doubling of performance in future, but that is not an authority for anything. Those people, if they did such a thing, would be WRONG to do it, and using it as a baseline for judging what you SHOULD get in future is also wrong. This is a harsh truth you will have to swallow, either now or pretty soon.

In reality, when you set aside expectations about what you should get, or what you feel you deserve, products are priced by adding up the costs of producing the product, of which memory is one, and then adding a profit margin on top, normally some percentage you're aiming for. This means that if you add more memory to your product, your manufacturing cost goes up and you have to raise the price for the consumer. Because people don't like paying a lot of money, you don't want any more memory on your video card than it needs, because otherwise you have to charge more. You're saying your argument is not about what is needed but about what ought to be; but then all you're advocating is putting more memory onto a card than it needs and increasing its cost with no benefit.

You've talked about the vRAM being "gimped", people being "short changed", people being "stuck @ 8GB", but you're completely ignoring the fact that all the testing done so far shows no game coming anywhere close to this limit today, and some games are GPU-limited well before reaching 10GB of usage, which is a solid indicator that even future games won't be able to make use of 10GB of vRAM; the GPU will simply be too much of a bottleneck.

Anyway, I hope that is somewhat of an olive branch to people who feel they're being short changed. You need to be really careful about your expectations of what you should get. The cost of building this technology is rising and will continue to rise unless there's a serious paradigm shift. Plenty of people have warned about the end of Moore's law; we've been on a gravy train of doubling power for a good 50 years, but it is coming to an end, and as it does, your expectations will have to change to stay in line with reality. Video card prices going up is not Nvidia short changing you and pocketing the money like some kind of weird evil monopoly guy; it legitimately costs more to produce faster, higher-capacity components. You compared 8GB now to 8GB four years ago, and a cursory Google shows 8GB of GDDR5 vRAM four years ago was way cheaper ($6.50 per GB) than GDDR6 ($11.69 per GB), and we know GDDR6X prices are higher still, though there are no good sources on exactly how much.


1. It's not obviously wrong at all, just because you say so. There are conflicting views on VRAM usage; you do not have the last word on this subject. Usage is case-specific and people have different needs, e.g. heavily modded games or 4K.

2. It's not a fundamentally flawed way of thinking. Accepting 2016's low-range VRAM in 2020/21 on 70 and 80 cards, and maybe on whatever AMD come up with, is simply defending big tech companies short-changing you on VRAM.

3. Nope, I didn't say that at all; I never said anything about "doubling performance". Go back and read what I actually posted re: "someone being laughed off the forums".

4. Yes, NV are really struggling financially as card prices have sky-rocketed; look at the obscenity that was Turing pricing.

5. No games come anywhere close? That's not true; people have posted examples and you have simply ignored them. But YET AGAIN, whether the level of VRAM was "enough" or "sufficient" was never my argument. I've said that a million times...

6. Don't tell me what I should expect to get as a consumer; maybe you should have higher expectations of what you're getting for your money, rather than defending huge corps making obscene profits. Why are you pro-NV (in this case) and not pro-consumer? Again, this speaks to the phenomenon of "white knighting" big business.


As for "olive branches", maybe it is time you accepted that others have different opinions of what is and isn't value, and different requirements? Plenty of others can see that 8GB/10GB is not value for money on the 70 and 80 cards.

Here's another thing: why were NV lining up a 20GB 3080 and a 16GB 3070 if that memory isn't needed, and if it's so difficult to increase VRAM as "we run into the limits of physical systems"? That clearly proves there was no limit on these cards from a tech perspective.

NV tried to get away with selling minimum-viable-VRAM cards initially; then suddenly here's an 80 card with 2.5x the VRAM and a 70 card with 2x the RAM...

Wake up dude.
 
Soldato
Joined
17 Aug 2003
Posts
20,158
Location
Woburn Sand Dunes
Yeah, it's kind of annoying that we don't seem to have a way to inspect that memory pool to see what's really going on. I've been looking through thousands of command-line CVARs to see if anything will dump memory usage information, but I can't see anything obvious. It means we have to infer what's going on in there with some testing.

First off, if you type is_poolsize into the console and hit Tab, it will autocomplete and tell you the current allocation, which is based on your Texture Pool Size setting.

At "Low" this is 1024MB (1GB)
At "Ultra Nightmare" this is 4608MB (4.5GB)

I've just done more testing, moving into later areas of the game, including large open spaces with distant objects, and toggling between the "Ultra Nightmare" and "Low" memory pools while standing exactly still between menu transitions to get accurate comparison shots. Again, Afterburner confirms the memory usage difference in real time; again, the side-by-side screenshots look identical, and the frame rate is identical too. This tells us something very important: if visual fidelity doesn't drop when you shrink the pool to 1GB, then all the assets for the area you're in must fit into 1GB of vRAM. And when you push the pool up to 4.5GB, visual fidelity doesn't improve; all you get is more headroom for texture swapping, to avoid possible texture pop-in during area transitions.

And you have to think that's about right, because the game is only a 40GB install on disk, and I believe the textures are compressed into the streamdb files, which make up 22GB of that. You're not stuffing a fifth of the entire game's textures into 4.5GB of vRAM in any one scene; that would be MENTAL. I'm playing on "Low" right now, moving through areas and not actually seeing any texture pop-in myself, and I suspect that might be down to my SSD setup: I have 2x Samsung 960 Pros in RAID 0, which maxes out the 4GB/sec of PCI-e bandwidth on that disk. I'll keep going on Low, see if I can notice any texture pop-in, and keep an eye on Task Manager for disk usage.

I'm confident you won't find anything, although Poneros insists he can tell the difference - might be worth asking him? *shrug*

I can definitely notice streaming & textures differences.

Hilarious you talking about 'good faith' by the way, Poneros. You couldn't be less objective if you tried.

Yep. I have been saying the same.

Oh and by the way we are watching TNG at the moment and watched the episode in your sig last night. There are four lights!!! :p

Ahh, I wanted to get Sisko in there SHOUTING. AND PAUSING ODDLY. FOR. NO REASON. But it wouldn't fit :p
 
Soldato
Joined
26 May 2006
Posts
6,058
Location
Edinburgh
As for "olive branches", maybe it is time you accepted that others have different opinions of what is and isn't value, and different requirements? Plenty of others can see that 8GB/10GB is not value for money on the 70 and 80 cards.


Wake up dude.

Name a card from the start of 2019 that was value for money.

Name a card, buying new today, that is value for money.
 
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
1. It's not obviously wrong at all, just because you say so. There are conflicting views on VRAM usage; you do not have the last word on this subject. Usage is case-specific and people have different needs, e.g. heavily modded games or 4K.

2. It's not a fundamentally flawed way of thinking. Accepting 2016's low-range VRAM in 2020/21 on 70 and 80 cards, and maybe on whatever AMD come up with, is simply defending big tech companies short-changing you on VRAM.

3. Nope, I didn't say that at all; I never said anything about "doubling performance". Go back and read what I actually posted re: "someone being laughed off the forums".

4. Yes, NV are really struggling financially as card prices have sky-rocketed; look at the obscenity that was Turing pricing.

5. No games come anywhere close? That's not true; people have posted examples and you have simply ignored them. But YET AGAIN, whether the level of VRAM was "enough" or "sufficient" was never my argument. I've said that a million times...

6. Don't tell me what I should expect to get as a consumer; maybe you should have higher expectations of what you're getting for your money, rather than defending huge corps making obscene profits. Why are you pro-NV (in this case) and not pro-consumer? Again, this speaks to the phenomenon of "white knighting" big business.


As for "olive branches", maybe it is time you accepted that others have different opinions of what is and isn't value, and different requirements? Plenty of others can see that 8GB/10GB is not value for money on the 70 and 80 cards.

Here's another thing: why were NV lining up a 20GB 3080 and a 16GB 3070 if that memory isn't needed, and if it's so difficult to increase VRAM as "we run into the limits of physical systems"? That clearly proves there was no limit on these cards from a tech perspective.

NV tried to get away with selling minimum-viable-VRAM cards initially; then suddenly here's an 80 card with 2.5x the VRAM and a 70 card with 2x the RAM...

Wake up dude.

There are conflicting opinions on vRAM usage, but actually measuring usage in games is a pretty objective thing. So far, in both the 10GB and 8GB threads, we've seen FUD spread about vRAM usage, and I've addressed every example where people claimed to have found the one killer app. But that's not even the issue here; as you've literally said yourself, you care less about what people need than about what they're owed.

You're saying people are being short changed on vRAM. Can you explain what amount of vRAM would not "short change" people, and why specifically that amount would be appropriate for the card? What actual criteria do you use to pick the number you're picking? Be specific.

It's hilarious that you claim I've ignored examples. I've gone to pretty long lengths in both threads to actually download and install these games, test them with new and improved tools that gather more accurate metrics, and even investigate how the games use their vRAM and what difference those settings make, demonstrating with evidence that these claims are wrong.

I pointed out that as vRAM gets faster and more compact, the price per GB goes up, and I quoted real-world prices showing the new cards' memory is nearly twice as expensive per GB as the previous generation's, and probably more still for GDDR6X. It is harder to produce these new RAM modules; that's why they cost Nvidia and the AIBs more per GB to buy, and part of why the cards cost more. I'm sorry, but if you don't know about the physical limits of shrinking electronics, I'd suggest you read something like this: https://www.technologyreview.com/2020/02/24/905789/were-not-prepared-for-the-end-of-moores-law/. What you're going to have to accept is that your expectations are wrong, not because I say so, but because they don't accord with reality. It's not a controversial fact that Moore's law is coming to an end; all the big tech leaders have said it, and the physics that governs electronics at small scales is well known and well studied. As long as you ignore or reject this, you're going to keep expecting components to get faster at the same rate as previous generations for no additional cost, and that is fundamentally a bad expectation.
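To put the per-GB figures quoted earlier in the thread into a quick bill-of-materials sketch (list prices as cited; real contract pricing will differ):

```python
gddr5_usd_per_gb = 6.50    # quoted price from ~4 years ago
gddr6_usd_per_gb = 11.69   # quoted current price; GDDR6X costs more still

for size_gb in (8, 16):
    g6 = size_gb * gddr6_usd_per_gb
    g5 = size_gb * gddr5_usd_per_gb
    print(f"{size_gb}GB: ${g6:.2f} in GDDR6 vs ${g5:.2f} in GDDR5")
# 8GB : $93.52 vs $52.00   -> ~1.8x the per-GB cost, generation on generation
# 16GB: $187.04 vs $104.00 -> doubling capacity adds ~$94 of memory cost
#       before any margin, board or cooling changes
```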

Nvidia will sell you whatever you'll buy. If you want a 20GB card, be my guest, but you're going to pay more for that memory, whether it comes from Nvidia or AMD; every additional GB chip they put on the card is an additional cost to them, which increases the price for you. It's really obvious that you have a chip on your shoulder about big business and getting ripped off, and it's leading you to all sorts of odd conclusions.
 