
10GB VRAM enough for the 3080? Discuss..

Associate
Joined
9 May 2007
Posts
1,284
Textures can make a pretty drastic difference.

RDR2 ultra vs high for example.

I think even subtle use of RT can look fantastic (e.g. Spiderman PS5).

Spider-Man on PS5 has all the RT features of Control, with lots of performance tweaks so the developers can hit the required frame time budget.
 
Soldato
Joined
17 Aug 2003
Posts
20,158
Location
Woburn Sand Dunes
It's not really sensible to use an 8GB card and proclaim that a new and untested metric (VRAM allocation) is the only way to measure the usage and performance cost of VRAM saturation.

People aren't just using 8GB cards though, are they? And what size buffer would you suggest is big enough to use this new metric that *is* more accurate than the old metric that was accepted for years?
 
Associate
Joined
8 Nov 2020
Posts
75
Location
Sarajevo
Interesting claim here. No idea how we could properly test this.


Edit: @TNA Don't worry about tweaking the texture settings yourself, apparently the game engine will turn down the textures for you:p

I think this might be the truth. I have tried Death Stranding on a 3080 at 8K, and in some demanding DLSS modes the textures either don't load or load at very low quality, but it kinda works.
 

TNA

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,559
Location
Greater London
I think this might be the truth. I have tried Death Stranding on a 3080 at 8K, and in some demanding DLSS modes the textures either don't load or load at very low quality, but it kinda works.
Best sell that gimped card and buy a 3090 then. Even the 6900XT will be gimped. 24GB should be enough though :p


People aren't just using 8GB cards though, are they? And what size buffer would you suggest is big enough to use this new metric that *is* more accurate than the old metric that was accepted for years?

16GB obviously :p
 
Permabanned
Joined
31 Aug 2013
Posts
3,364
Location
Scotland
Games have been doing this for decades. This is why we have faster VRAM than before, lossless Tensor compression, and tech such as RTX IO / DirectStorage on the way. It's also worth noting that Microsoft is looking into supplying lower texture quality and using AI to upscale, though I imagine this will require the cloud, while Nvidia will make more use of Tensor cores.
 
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
Modern games are in the ~100GB range on disk and a lot of that is texture data; it's even more when uncompressed in vRAM. If you want open world games whose texture data exceeds the amount of vRAM you have, then you need texture streaming. We've had this for more than a decade now. Saying that it's "cheating" is profoundly stupid. What you can say fairly, IMHO, is that if you're seeing texture pop-in or persistently low-resolution textures, then you don't have enough vRAM to support the texture pool size that specific game is demanding.

The people arguing that this is why you need 16GB of vRAM are clutching at straws; it's a bad argument for a load of different reasons. The first is that you can no more cram all the assets of a modern AAA game into 16GB than you can into a 10GB card; you'd need something in the range of 64-128GB+ of vRAM to pre-load all world assets. GDDR6 is about $12 per GB, so do the math.
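
For anyone who does want to do the math, a quick sketch using the post's own ~$12/GB figure (their number, not an independently checked price):

```python
# Rough cost of enough GDDR6 to pre-load all world assets, using the ~$12/GB
# figure quoted above (the post's number, not an independently checked price).
price_per_gb_usd = 12
for capacity_gb in (10, 16, 64, 128):
    print(f"{capacity_gb:>3} GB of GDDR6 ~= ${capacity_gb * price_per_gb_usd}")
# -> 10 GB ~= $120, 16 GB ~= $192, 64 GB ~= $768, 128 GB ~= $1536
```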

Second, we've seen in side-by-side gameplay what happens in modern games that have sophisticated texture streaming. The likes of Doom Eternal and Wolfenstein, which use the id Tech engine, let you explicitly set the is_poolsize value in the console/config, and that controls how many MB of vRAM are reserved for texture streaming. When you set "High" in the settings and compare it to "Ultra Nightmare" you're comparing 2.5GB to 4.5GB, yet there's no visual difference; that extra pool size is doing nothing for you. Reviewers have commented on this as well, but you can just go try it and see for yourself.

Third, games that basically don't flush their texture pool unless they need to (when it's full) aren't using that extra memory to their benefit anyway. The card isn't pre-loading loads of extra game assets when the engine initializes; it's still filling that vRAM in real time as and when it needs the assets, it can just hold them for longer before needing to discard unused stuff. LtMatt demonstrated this when he showed vRAM usage in FC5, which creeps up slowly over time. It's not as though FC5 sees you have 16GB available, allocates it all and then pre-populates it all; that's not the behaviour. I also showed the opposite effect: you can play at 4K with less vRAM and the game swaps data in and out just fine, even if you obsess over it and enable frame time graphs to check for micro stutter that you can't naturally see or feel. I posted my gameplay of that to YouTube to demonstrate real vRAM usage and could zip around the entire open world map in a heli, streaming in textures just fine. The benefit of measuring the vRAM used rather than the vRAM allocated is that you can see what the game is actually using in real time, and you can see the moments it flushes a bunch of old data it doesn't need any more: the usage drops 1-2GB and then it rapidly streams in another couple of GB in about a second. It's constantly doing this as you zip about the map, and it works just fine.
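
If you'd rather watch that streaming behaviour yourself than argue about overlay numbers, a minimal monitoring sketch along these lines does the job, assuming an NVIDIA card and the pynvml package (per-process figures can come back empty or None on Windows without elevated rights):

```python
# Poll board-wide and per-process GPU memory once a second, so you can watch a
# game's texture pool creep up, flush a GB or two, and refill as you move about.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # whole-board total/used
        procs = pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle)
        per_proc = ", ".join(
            f"pid {p.pid}: {p.usedGpuMemory / 2**20:.0f} MiB"
            for p in procs
            if p.usedGpuMemory is not None  # None when the driver withholds it
        )
        print(f"board {mem.used / 2**30:.1f}/{mem.total / 2**30:.1f} GiB | {per_proc}")
        time.sleep(1.0)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```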
 
Soldato
Joined
21 Jul 2005
Posts
20,031
Location
Officially least sunny location -Ronskistats
You too with your 3090 (if DPD ever deliver it). At least you won't have to listen to the "it is gimped and not enough VRAM" rubbish over and over :p

Tell me about it. DPD should be a meme.

Best sell that gimped card and buy a 3090 then. Even the 6900XT will be gimped. 24GB should be enough though :p

[image]
 
Soldato
Joined
12 May 2014
Posts
5,236
Indeed.

He can poke fun all he likes. I shall be enjoying Cyberpunk 2077 in 4K (well, 720p for playable framerates) with RT on release :D;)

You too with your 3090 (if DPD ever deliver it). At least you won't have to listen to the "it is gimped and not enough VRAM" rubbish over and over :p
You only need to put up with me poking fun at your gimped card till you get a 4080

with 11GB of VRAM :p
 
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
Best sell that gimped card and buy a 3090 then. Even the 6900XT will be gimped. 24GB should be enough though :p

This is obviously the real kicker for those with AMD cards who are trying to clutch at straws to justify 16GB of memory: the competition has a card with way more vRAM. And any weird argument that justifies 16GB over 10GB would also justify 24GB over 16GB.

It's also super amusing to me that in CoD Cold War's video settings menu you can literally set how much vRAM to allocate. You can pick 70%, 80% or 90%, which is the first time I ever recall seeing this level of control over vRAM. So if it wasn't obvious enough to people yet that vRAM allocated is a dumb metric, this should hopefully shatter that illusion. You can put that bad boy on 90% with a 3080 and get ~9GB allocated, but then put all your visual settings to low and see the very same menu tell you that you're only using about 2.5GB of vRAM based on your options. The old way of measuring vRAM (allocated) would tell you "hey, you're right near your 10GB limit!!!!!11oneone", when in fact the vRAM used value just confirms what the menu advises you: that you're really just using a few GB. Looking at other modern games, how much they allocate is often based on rules of thumb which are little better than the 90% option, which is why you end up with some games allocating 22GB of a 3090's vRAM.

I also see people with 3090s measuring vRAM allocated in the ~20GB range in YouTube videos of Cold War, when we know maxing the in-game settings doesn't go above about 8GB of real measured in-game usage, which is about on par with what the menu advises your settings will use.

"Hey u guise, Cold War uses 20GB of memories, time to upgrade from your cruddy old measly 16GB AMD cards to a real 24GB king that can manage all 20GB of usage, rite u guise ¯\_(ツ)_/¯"

:D
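
As a toy illustration of the gap that menu exposes, using the rough numbers from the post above rather than anything newly measured:

```python
# Allocation is a fixed slice of the card; usage depends on your actual settings.
vram_gb = 10                       # e.g. an RTX 3080
allocation_option = 0.90           # the in-game "90%" allocation setting
allocated_gb = allocation_option * vram_gb  # ~9 GB reserved regardless of settings
used_gb_at_low = 2.5                        # what the same menu reports at low settings

print(f"allocated: {allocated_gb:.1f} GB, in use: {used_gb_at_low} GB")
print(f"headroom the 'allocated' number hides: {allocated_gb - used_gb_at_low:.1f} GB")
```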
 

TNA

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,559
Location
Greater London

:D:D:D

You only need to put up with me poking fun at your gimped card till you get a 4080

with 11GB of VRAM :p

Haha :D

You never know, it may be a 7800XT. All depends on what's what at the time. If I have an OLED monitor by then there is a good chance it could be a 7800XT. I never stick with one company and always go with whatever suits me best at the time.


This is obviously the real kicker for those with AMD cards who are trying to clutch at straws to justify 16GB of memory: the competition has a card with way more vRAM. And any weird argument that justifies 16GB over 10GB would also justify 24GB over 16GB.

It's also super amusing to me that in CoD Cold War's video settings menu you can literally set how much vRAM to allocate. You can pick 70%, 80% or 90%, which is the first time I ever recall seeing this level of control over vRAM. So if it wasn't obvious enough to people yet that vRAM allocated is a dumb metric, this should hopefully shatter that illusion. You can put that bad boy on 90% with a 3080 and get ~9GB allocated, but then put all your visual settings to low and see the very same menu tell you that you're only using about 2.5GB of vRAM based on your options. The old way of measuring vRAM (allocated) would tell you "hey, you're right near your 10GB limit!!!!!11oneone", when in fact the vRAM used value just confirms what the menu advises you: that you're really just using a few GB. Looking at other modern games, how much they allocate is often based on rules of thumb which are little better than the 90% option, which is why you end up with some games allocating 22GB of a 3090's vRAM.

I also see people with 3090s measuring vRAM allocated in the ~20GB range in YouTube videos of Cold War, when we know maxing the in-game settings doesn't go above about 8GB of real measured in-game usage, which is about on par with what the menu advises your settings will use.

"Hey u guise, Cold War uses 20GB of memories, time to upgrade from your cruddy old measly 16GB AMD cards to a real 24GB king that can manage all 20GB of usage, rite u guise ¯\_(ツ)_/¯"

:D
Yep. If they carry on with this nonsense I might start posting that rubbish back at them. Your 6800/6900XT is gimped mate, you need to get a 3090. Here, look at this COD video... :p

I could also start saying their cards are gimped in regards to RT performance. Haha. But nah, it is all just silly. Both the 3080 and 6800XT are great cards imo.
 
Caporegime
Joined
12 Jul 2007
Posts
40,543
Location
United Kingdom
People aren't just using 8GB cards though, are they? And what size buffer would you suggest is big enough to use this new metric that *is* more accurate than the old metric that was accepted for years?
8GB was just an example as that's what's been mentioned before in this thread.

Other metrics (such as 1% and 0.1% lows) should be used in conjunction to give an overall picture of what is required. Sure, you can get away with less video memory, but the experience might not be optimal, and other metrics can show this. Having two GPUs with different memory sizes that offer similar performance, like 8GB and 16GB (e.g. 5700 XT and Radeon VII), is a useful way to test this, as it's running on the same system and the only change is switching to a GPU with more video memory.

The VRAM allocation metric is all nice and dandy, but it doesn't tell the full picture (about what is required for an optimal experience). A GPU cannot use more video memory for an application than what is physically available, so in theory an 8GB GPU will always look sufficient (according to the VRAM allocation metric), as you'll never get an app to show allocation higher than 8192MB (and that's assuming no video memory is required for the OS or any background apps/extra displays etc.).

My opinion on this has not changed from the start, and that's why I stopped posting in here; you can only say the same thing so many times before you start going mad. :p
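
For anyone wanting to put numbers on those companion metrics, one common way of computing 1% / 0.1% lows from a frame-time log looks roughly like this (a sketch; capture tools differ, and some use the 99th/99.9th percentile frame time instead of an average of the slowest frames):

```python
# Average FPS over the slowest fraction of frames: one common definition of the
# "1% low" / "0.1% low" (other tools use the 99th/99.9th percentile frame time).
def percentile_low_fps(frame_times_ms, fraction):
    worst = sorted(frame_times_ms, reverse=True)  # slowest frames first
    n = max(1, int(len(worst) * fraction))
    return 1000.0 / (sum(worst[:n]) / n)

# Toy data: mostly ~60 fps with a couple of stutters (the kind of thing VRAM
# saturation would show up as, even when the average looks fine).
frame_times = [16.7, 16.9, 17.1, 16.8, 45.0, 16.6, 17.0, 60.2, 16.7, 16.9]
print(f"average:  {1000.0 * len(frame_times) / sum(frame_times):.1f} fps")
print(f"1% low:   {percentile_low_fps(frame_times, 0.01):.1f} fps")
print(f"0.1% low: {percentile_low_fps(frame_times, 0.001):.1f} fps")
```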
 

TNA

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,559
Location
Greater London
8GB was just an example as that's what's been mentioned before in this thread.

Other metrics (such as 1% and 0.1% lows) should be used in conjunction to give an overall picture of what is required. Sure, you can get away with less video memory, but the experience might not be optimal, and other metrics can show this. Having two GPUs with different memory sizes that offer similar performance, like 8GB and 16GB (e.g. 5700 XT and Radeon VII), is a useful way to test this, as it's running on the same system and the only change is switching to a GPU with more video memory.

The VRAM allocation metric is all nice and dandy, but it doesn't tell the full picture (about what is required for an optimal experience). A GPU cannot use more video memory for an application than what is physically available, so in theory an 8GB GPU will always look sufficient (according to the VRAM allocation metric), as you'll never get an app to show allocation higher than 8192MB (and that's assuming no video memory is required for the OS or any background apps/extra displays etc.).

My opinion on this has not changed from the start, and that's why I stopped posting in here; you can only say the same thing so many times before you start going mad. :p
So 16GB is not optimal (some may even say gimped); one should get 24GB to be sure, yes? :D

Just kidding mate :p;)
 

TNA

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,559
Location
Greater London
Yeah, way to miss the point TNA. That's why I gave up on this thread (and the other Navi thread) a long time ago. :p
Mate I got the point. Just on a wind up like some other AMD boys recently :D

Hence why I put a :p

You know my position on this, I think the 6800XT is a very good card and AMD have done well to compete so well ;)
 
Soldato
Joined
17 Aug 2003
Posts
20,158
Location
Woburn Sand Dunes
A GPU cannot use more video memory for an application than what is physically available, so in theory an 8GB GPU will always look sufficient (according to the VRAM allocation metric), as you'll never get an app to show allocation higher than 8192MB (and that's assuming no video memory is required for the OS or any background apps/extra displays etc.).

So your shared GPU memory pool is entirely pointless, then?

GPUs in the task manager | DirectX Developer Blog (microsoft.com)
Shared memory represents normal system memory that can be used by either the GPU or the CPU. This memory is flexible and can be used in either way, and can even switch back and forth as needed by the user workload. Both discrete and integrated GPUs can make use of shared memory.

Windows has a policy whereby the GPU is only allowed to use half of physical memory at any given instant. This is to ensure that the rest of the system has enough memory to continue operating properly. On a 16GB system the GPU is allowed to use up to 8GB of that DRAM at any instant. It is possible for applications to allocate much more video memory than this. As a matter of fact, video memory is fully virtualized on Windows and is only limited by the total system commit limit (i.e. total DRAM installed + size of the page file on disk). VidMm will ensure that the GPU doesn't go over its half of DRAM budget by locking and releasing DRAM pages dynamically. Similarly, when surfaces aren't in use, VidMm will release memory pages back to Mm over time, such that they may be repurposed if necessary. The amount of shared memory consumed under the performance tab essentially represents the amount of such shared system memory the GPU is currently consuming against this limit.

[screenshot]
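
To make the quoted policy concrete, here's a rough sketch of the budget it describes, assuming the "half of physical memory" rule from the blog post holds; the dedicated figure is a placeholder and psutil is only used to read installed RAM:

```python
# Rough Task Manager-style budget: dedicated VRAM plus up to half of system DRAM
# as "shared GPU memory", per the Windows policy described in the quoted blog post.
import psutil

dedicated_vram_gb = 10.0                         # placeholder, e.g. an RTX 3080
dram_gb = psutil.virtual_memory().total / 2**30  # installed system memory
shared_budget_gb = dram_gb / 2                   # GPU may use up to half of DRAM

print(f"dedicated: {dedicated_vram_gb:.0f} GB")
print(f"shared budget: {shared_budget_gb:.1f} GB (half of {dram_gb:.1f} GB DRAM)")
print(f"total budget: {dedicated_vram_gb + shared_budget_gb:.1f} GB")
```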
 
Caporegime
Joined
12 Jul 2007
Posts
40,543
Location
United Kingdom
So your shared GPU memory pool is entirely pointless, then?

GPUs in the task manager | DirectX Developer Blog (microsoft.com)


[screenshot]

Lol, I like the picture. Time for an upgrade James, come on. :D :p

Shared GPU memory is not true video memory, though; it's much slower system memory acting as video memory. Still, it's a nice feature to stop the game from outright crashing if saturation occurs.

High Bandwidth Cache - I used it on a Vega 64 8GB and I also had a Vega Frontier Edition 16GB. Same GPUs, same everything bar video memory capacity. You could enable HBCC on the Vega 64, which allowed you to use more than 8GB, but the FPS was lower than on the Vega Frontier with its 16GB of native video memory, in the same game at the same settings. System memory is just not as fast as video memory.

This is why it's important to use other metrics (in my opinion) to gauge VRAM requirements to achieve the optimal experience.
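
As a back-of-envelope comparison of why spilling into system memory hurts, with approximate round figures (my ballpark numbers, not measurements from the posts above):

```python
# Approximate peak bandwidths: anything that spills out of local VRAM also has to
# cross the PCIe link, which is far narrower than the HBM2 on the card itself.
hbm2_vega_gb_s      = 484.0  # Vega 64 / Frontier Edition HBM2 (approx.)
ddr4_3200_dual_gb_s = 51.2   # dual-channel DDR4-3200 system memory (approx.)
pcie3_x16_gb_s      = 15.8   # PCIe 3.0 x16, one direction, theoretical

print(f"HBM2 vs system DRAM:  ~{hbm2_vega_gb_s / ddr4_3200_dual_gb_s:.0f}x faster")
print(f"HBM2 vs PCIe 3.0 x16: ~{hbm2_vega_gb_s / pcie3_x16_gb_s:.0f}x faster")
```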
 