10GB VRAM enough for the 3080? Discuss..

Status
Not open for further replies.
Caporegime
Joined
8 Sep 2005
Posts
27,421
Location
Utopia
Whilst 4K will be pushed along a lot by consoles, and more importantly 4K TVs are cheap enough now that parents like myself will easily buy one for the bedroom, PC gaming has also seen high refresh rates take off in a big way, and even TVs are doing the same on their top models. Therein lies the problem: 4K at 120Hz, 144Hz or 165Hz, depending on the monitor maker, is still out of reach for all of us. Monitors at that spec are still expensive, and even the mighty 3000 series will struggle at those refresh rates.

It's still the same position we've had for quite a while: 4K 60fps or 1440p 144Hz. The choice hasn't changed :(
Why do people need 4K 144Hz? Yes, of course it's nice, but 4K at 60-75Hz is also a good gaming experience as long as you aren't a twitch multiplayer gamer (I am a 99% single-player gamer). I can't play much more than RTS/strategy and older RPG games at 4K on my 1660 Super gap-filler card, but for those I can, 60-75Hz is a smooth and playable framerate with Freesync/G-Sync.
 

TrM
Associate
Joined
3 Jul 2019
Posts
744
Why do people need 4K 144Hz? Yes it's nice, but 4K at 60-75Hz is also a good gaming experience as long as you aren't a twitch multiplayer gamer. I can't play much more than RTS/strategy and older RPG games at 4K on my 1660 Super gap-filler card, but 60-75Hz is also smooth and playable with Freesync/G-Sync.

That's personal choice though. Look, I'm not saying you're wrong in the slightest, but that is based on your personal choice. I personally play at 100fps plus in every game I play, from CoD to The Witcher 3, at 1440p, and for me to move to 4K I would want the same experience there too. It's how I've played for a few years now.

Lots of people will feel the same as you, and lots of people will feel the same as me. This is a no-win discussion as both sides have merit, but that's why 4K won't become the standard on PC. On console I believe 4K will become the standard, but even there a lot of people will really want to max out their 120Hz OLED TVs.
 
Caporegime
Joined
8 Sep 2005
Posts
27,421
Location
Utopia
That's personal choice though. Look, I'm not saying you're wrong in the slightest, but that is based on your personal choice. I personally play at 100fps plus in every game I play, from CoD to The Witcher 3, at 1440p, and for me to move to 4K I would want the same experience there too. It's how I've played for a few years now.

Lots of people will feel the same as you, and lots of people will feel the same as me. This is a no-win discussion as both sides have merit, but that's why 4K won't become the standard on PC. On console I believe 4K will become the standard, but even there a lot of people will really want to max out their 120Hz OLED TVs.
I have a 144Hz 1440p monitor so yes, I know it's nice... I am just saying that you can also have a very good experience on a 60-75Hz monitor... gamers have been doing it for decades. Realistically speaking, the 3080 will not power all demanding games at 4K above 120fps anyway without lowering graphical fidelity. :)
 

TNA
Caporegime
Joined
13 Mar 2008
Posts
27,525
Location
Greater London
That's personal choice though. Look, I'm not saying you're wrong in the slightest, but that is based on your personal choice. I personally play at 100fps plus in every game I play, from CoD to The Witcher 3, at 1440p, and for me to move to 4K I would want the same experience there too. It's how I've played for a few years now.

Lots of people will feel the same as you, and lots of people will feel the same as me. This is a no-win discussion as both sides have merit, but that's why 4K won't become the standard on PC. On console I believe 4K will become the standard, but even there a lot of people will really want to max out their 120Hz OLED TVs.
Agreed. But it does not fit with how he feels so...
 
Associate
Joined
13 Mar 2009
Posts
704
4K is now becoming mainstream this generation; the consoles will ensure this, in addition to the new Nvidia and AMD GPUs. Anyone saying things like "it's pointless talking about 4K" deserves no attention in these discussions, regardless of the point they are "trying" to make.


Lol, no it's not. 1080p is still the mainstream, and if anything 2K will become the next thing worldwide because it's the next cheapest level. I mean, look up what resolution is the most used in the world. 4K gamers probably make up less than 5% of all PC gamers, if that; I would suspect it's even lower.
 
Man of Honour
Joined
13 Oct 2006
Posts
91,063
I think it is fair to say 4K is entering mainstream this generation - it sure ain't replacing 1080p or even 1440p any time soon, if ever (short of monitor manufacturers abandoning any lower res).
 
Caporegime
Joined
8 Sep 2005
Posts
27,421
Location
Utopia
I think it is fair to say 4K is entering mainstream this generation - it sure ain't replacing 1080p or even 1440p any time soon, if ever (short of monitor manufacturers abandoning any lower res).
Yup, agreed. Some people seem to take "entering the mainstream" to mean "being the most popular".
 

TNA
Caporegime
Joined
13 Mar 2008
Posts
27,525
Location
Greater London
I think it is fair to say 4K is entering mainstream this generation - it sure ain't replacing 1080p or even 1440p any time soon, if ever (short of monitor manufacturers abandoning any lower res).
Entering the mainstream I can accept; becoming mainstream, which is what he said, not so much. Big difference imo. 4K is nowhere near mainstream in PC gaming, simple as.
 
Soldato
Joined
31 Oct 2002
Posts
9,860
The new consoles are the reason 4K is becoming mainstream. There's always been a need for the PC to one-up the consoles, and that can't continue if the PC is stuck on low-resolution 1080p, or even 1440p.

This is a paradigm shift, one that will be resisted by the few and welcomed by the many. ~6 months after the new console generation releases, more will have accepted this IMO. They'll pass a shop floor, or visit a mate's house, and see 4K AAA games loading instantly on a PS5 that look better than anything their PC/monitor has shown them, for a far cheaper price.

1080p will have its place only for top-level esports, where ££ is on the line in tournaments, on 240Hz or 360Hz 1080p monitors etc., much like pros clung to their CRTs for years.
 

TNA
Caporegime
Joined
13 Mar 2008
Posts
27,525
Location
Greater London
They'll pass a shop floor, or visit a mate's house, and see 4K AAA games loading instantly on a PS5 that look better than anything their PC/monitor has shown them, for a far cheaper price.
Then they will notice it is 30fps... lol.

Which is perfectly fine for me, but many don't like it as they are into twitch gaming. I personally enjoyed picking up a PS4 Pro last Black Friday and finally playing all the exclusives on my OLED. I sold it a while ago and will now wait a couple of years for some exclusives to build up on the PS5 and for the price to come down, then pair it with my OLED :D
 

TNA
Caporegime
Joined
13 Mar 2008
Posts
27,525
Location
Greater London
It’s just a shame the choice of “4K” monitors is bloody awful. Hopefully we’ll see more progress here in the next couple of years.
True. I have had my G-Sync 4K monitor for over 3 years and there is nothing out there worth upgrading to for my needs. I am waiting and hoping LG bring out a 40" OLED next year, and that is what I will likely upgrade to.
 
Permabanned
Joined
22 Oct 2018
Posts
2,451
I must confess that (having read around on the topic) it does seem a curious choice to put 10GB on the 3080. It's certainly cutting it close considering the 3080 is touted as a 4K card. I wonder, though, to what extent NVIDIA are trying to push the 3090, and/or are deliberately putting off some discerning buyers until the arrival of the 3080 Ti? I don't know. I am still undecided which card I will get. It does very much seem to me that there is a large hole there that would be filled by a 3080 Ti / 3080 Super.
 
Soldato
Joined
12 May 2014
Posts
5,236
Memory allocation is purely based on the developer's implementation of the game engine; they're free to request more or less whatever they want from the GPU and driver stack, and then have their own internal, complex rules about how that memory is used. Yes, you can dynamically scale memory allocation up and down based on demand, but it's better that this isn't done very frequently, mostly because other apps/processes can use VRAM, and if you're designing a game engine that intends to use most of what is available, you don't want to faff about increasing and decreasing allocated memory; there's a risk that any memory released could be assigned somewhere else. It also forces the GPU to do a bunch of memory management, which is best done infrequently.

In the case of, say, two hypothetical identical GPUs, one with 10GB and one with 13GB, you'd find the performance is the same as long as the "bloat" is basically memory that has been allocated but not filled. Two things can happen in that circumstance: the developer's rules around memory allocation can simply notice there's less available VRAM on the 10GB card and be more conservative about allocation, i.e. not over-provision as much. Or, because we now have unified memory, they can attempt to allocate that much memory anyway and let the GPU itself handle swapping assets from disk to VRAM. As long as that over-provisioned memory isn't filled with useful stuff, there's no performance penalty there either.
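To put the "notice there's less available VRAM and be more conservative" idea into something concrete, here is a minimal sketch of how an engine might size its streaming budget from whatever device-local memory the card reports. It assumes a Vulkan-based engine, and the 80% reservation rule is completely made up for illustration:

```cpp
// Minimal sketch: query device-local VRAM with Vulkan and derive a
// conservative streaming budget from it. The 80% rule is hypothetical.
#include <vulkan/vulkan.h>
#include <cstdio>

int main() {
    VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
    app.apiVersion = VK_API_VERSION_1_1;

    VkInstanceCreateInfo ici{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    ici.pApplicationInfo = &app;

    VkInstance inst;
    if (vkCreateInstance(&ici, nullptr, &inst) != VK_SUCCESS) return 1;

    uint32_t count = 1;
    VkPhysicalDevice dev = VK_NULL_HANDLE;
    vkEnumeratePhysicalDevices(inst, &count, &dev);   // first GPU is enough here
    if (dev == VK_NULL_HANDLE) return 1;

    VkPhysicalDeviceMemoryProperties mem{};
    vkGetPhysicalDeviceMemoryProperties(dev, &mem);

    // Sum the heaps flagged as device-local, i.e. the VRAM on the card.
    VkDeviceSize vram = 0;
    for (uint32_t i = 0; i < mem.memoryHeapCount; ++i)
        if (mem.memoryHeaps[i].flags & VK_MEMORY_HEAP_DEVICE_LOCAL_BIT)
            vram += mem.memoryHeaps[i].size;

    // Hypothetical engine rule: keep ~80% for the asset/streaming pool and
    // leave the rest for the driver, the OS compositor and other apps.
    VkDeviceSize budget = vram * 8 / 10;

    std::printf("device-local VRAM: %llu MiB, streaming budget: %llu MiB\n",
                (unsigned long long)(vram >> 20),
                (unsigned long long)(budget >> 20));

    vkDestroyInstance(inst, nullptr);
    return 0;
}
```

With that made-up rule a 10GB card would end up with roughly an 8GB asset pool and a 13GB card with roughly 10.5GB, which is the kind of scaling the quote above is describing.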

This is interesting because, going by (at the time of writing) 57 pages of discussion with people shouting "It's allocation, not actual usage", you would think that all that extra VRAM has no purpose. However, from your post it seems that allocation isn't just devs going "LOL, let's just allocate stuff for ***** and giggles", but a prediction of the maximum amount (plus a buffer) they will need depending on the section of the game, since, as we already know, stuff is streamed in and out depending on when it is needed, and different parts of the game have different requirements. Would you call this a fair summary?

I wonder how many people who needed to read those paragraphs actually read them.



In most cases game engines are so sophisticated that a team of engineers works on them, and the engine is often just licensed to game development studios, who are mostly artists implementing those tools. Memory management is largely abstracted away from them; it's not as efficient in terms of possible performance, but it means games are cheaper, because not everyone needs to write their own engine. So, for example, an engine like Unreal Engine deals with memory allocation and then provides mapping tools to devs to make use of it, and a mapper can add transition/loading zones to areas, in which the new area loads in with all its assets while the old one is dumped out of memory. The devs don't deal with that process themselves; it's just too complex, so they get a set of tools. The hardware/drivers/API/engine/game/design are all so complex these days that you need massive amounts of abstraction to allow them to all work together seamlessly, and that has performance implications.
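As a purely illustrative sketch of the transition/loading-zone idea in the quote above (the Zone type, asset names and load/unload prints are all hypothetical placeholders, not any real engine's API), the engine-side logic can be as simple as swapping one resident asset set for another when the player crosses a trigger:

```cpp
// Hypothetical streaming-zone manager: each zone lists the assets it needs;
// entering a zone loads its assets and evicts anything no longer referenced.
#include <iostream>
#include <set>
#include <string>
#include <vector>

struct Zone {
    std::string name;
    std::vector<std::string> assets;
};

class StreamingManager {
public:
    void EnterZone(const Zone& zone) {
        std::set<std::string> wanted(zone.assets.begin(), zone.assets.end());

        // Evict assets the new zone does not reference.
        for (auto it = resident_.begin(); it != resident_.end();) {
            if (!wanted.count(*it)) {
                std::cout << "unload " << *it << "\n";  // would free VRAM here
                it = resident_.erase(it);
            } else {
                ++it;
            }
        }
        // Load anything the new zone needs that isn't already resident.
        for (const auto& a : zone.assets) {
            if (resident_.insert(a).second)
                std::cout << "load " << a << "\n";      // would stream from disk here
        }
    }

private:
    std::set<std::string> resident_;
};

int main() {
    Zone village{"village", {"terrain_v", "houses", "npc_pack_a"}};
    Zone forest{"forest", {"terrain_f", "trees", "npc_pack_a"}};
    StreamingManager sm;
    sm.EnterZone(village);  // player spawns in the village
    sm.EnterZone(forest);   // crosses the transition zone: shared assets stay resident
}
```

The point is that the mapper only places the zones; whatever owns the resident set (here the toy StreamingManager) is the thing actually churning VRAM, not the gameplay devs.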

That sounds like something indie devs and smaller studios would do. Most AAA games tend to have their own engines; surely at that point the engine and gameplay teams will be talking to each other. I just can't see how games development can get so far with the throw-it-over-the-fence-and-forget-about-it approach.
Also, how would that work for console development? Surely they need at least one person in the studio who understands the engine (to a certain extent) to ensure it functions properly on the consoles.
 
Associate
Joined
11 Jun 2013
Posts
1,087
Location
Nottingham
This is interesting because, going by (at the time of writing) 57 pages of discussion with people shouting "It's allocation, not actual usage", you would think that all that extra VRAM has no purpose. However, from your post it seems that allocation isn't just devs going "LOL, let's just allocate stuff for ***** and giggles", but a prediction of the maximum amount (plus a buffer) they will need depending on the section of the game, since, as we already know, stuff is streamed in and out depending on when it is needed, and different parts of the game have different requirements. Would you call this a fair summary?

It'll be the drivers. They'll do allocations beyond what the application itself is requesting. The lower-level the API, the more they'll be allocating to do the work the application is requesting, and they might be allocating more if more VRAM is available.
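For anyone who wants to see that gap on their own machine, Windows exposes a per-process view of local video memory through DXGI. The sketch below (Windows-only, adapter 0, minimal error handling) just prints the budget the OS grants the process against what is currently in use; it won't break down which part came from the driver, but it does show "in use" measured against a budget rather than against the card's total VRAM:

```cpp
// Windows-only sketch: print this process's local video memory budget and
// current usage via DXGI (adapter 0, minimal error handling).
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1;   // first adapter

    ComPtr<IDXGIAdapter3> adapter3;                              // exposes the memory query
    if (FAILED(adapter.As(&adapter3))) return 1;

    DXGI_QUERY_VIDEO_MEMORY_INFO info{};
    if (FAILED(adapter3->QueryVideoMemoryInfo(
            0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info))) return 1;

    std::printf("budget:        %llu MiB\n", (unsigned long long)(info.Budget >> 20));
    std::printf("current usage: %llu MiB\n", (unsigned long long)(info.CurrentUsage >> 20));
    return 0;
}
```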
 
Soldato
Joined
12 May 2014
Posts
5,236
It'll be the drivers. They'll do allocations beyond what the application itself is requesting. The lower-level the API, the more they'll be allocating to do the work the application is requesting, and they might be allocating more if more VRAM is available.
How much more? 1%, 5%, 15%, 30%?
I'm assuming it is a range; do we have a figure for how much is being "wasted"?
If it is the drivers, are there differences between AMD and Nvidia?
Does GPU architecture affect the amount of "waste"?

For the record, so we are all on the same page: anything greater than the maximum VRAM a program needs for performance is what I am classifying as waste/bloat.
 
Associate
Joined
11 Jun 2013
Posts
1,087
Location
Nottingham
Sorry, I don't have those kinds of answers for you. I just know the drivers do a lot of work behind the scenes (especially for OpenGL; I'm not so familiar with DirectX), for which they will need to be doing their own VRAM allocations.
 