
10GB vram enough for the 3080? Discuss..

Soldato
Joined
12 May 2014
Posts
5,236
Isn't the answer "It depends"? If all it takes is a random texture pack to eat up an entire VRAM buffer, no amount of VRAM will ever be "enough".

The answer to what question? Should I assume this is in reference to my statement on the longevity of the graphics card?
It depends on what?
Enough to achieve what? Increased longevity of a graphics card? Of course there is an amount of VRAM that will help to extend the longevity of a graphics card past a single graphics generation.


I think the "10gb is not enough" crowd have made their point in a way that also makes their point moot.

If it's that easy to wreck a given vram buffer, then there's no fixed target to shoot for beyond "More is better."
Expand on the bolded part, please. I really hope that your explanation isn't just the preceding sentence that I've left in.

I'm saying that we don't know how fast VRAM requirements will grow,

Why did you cut out the bits from my statement that touch on this point? Also, do you think that 10GB will be enough to see us through high-end gaming until the RTX 5000 series rolls around?

that the consoles really only have around 10GB of VRAM available, so they aren't really relevant.

See.
Next-gen engines will be built around the consoles and their limitations. AAA games will be built around the consoles and their limitations. If we are lucky, we might get better textures when games come over to PC. Hence they are relevant to the discussion.


I'd like some real-world evidence that a) we're going to hit this limit in an appreciable number of games (i.e. not outliers like flight simulator),

Well, you're not going to get it until the next generation of games is released, because the next generation of consoles that will drive AAA game development isn't out yet. Do you plan on waiting until then before you buy a GPU?


and that it matters if we do (game engines can likely get more intelligent about swapping data in and out ahead of need)

Game engines already do this, but there is a limit. If swapping stuff out were some sort of magical cheat code, we wouldn't need more than 3GB of VRAM; that's what the 780 came out with when the current consoles were released.


Now you're just making up stuff you imagine I might say. I feel no need to respond any further. The other posters have covered this - there's a lot of hot air and posturing in this thread. Let's wait until we have some data.

No, I am making assumptions about your position because of a lack of data. So what is your position, and what are you arguing?
Do you think 10GB is enough on a flagship GPU until the RTX 4000 series rolls around?
Do you think 10GB is enough on a flagship GPU until the RTX 5000 series rolls around?
Do you think 10GB is enough on a flagship GPU until the RTX 6000 series rolls around?
Do you think 10GB is enough on a flagship GPU until the PS6 and Xbox Series something roll around?
 
Associate
Joined
4 Nov 2015
Posts
250
From what little graphics programming I've done, my recommendation is to get as much GPU memory on your card as possible.

I say this because the two big problems involved in binding a texture and rendering it to a buffer are transferring the data to GPU memory and keeping it there. Transferring it requires as much bandwidth as possible (hello, PCIe 4.0) and can be accelerated with technologies like RTX IO, but it is still subject to the fundamental limits of the API you're using. For example, OpenGL pages data to GPU memory lazily, meaning just-in-time, which is probably the main cause of hitches or blips in framerate. Vulkan allows asynchronous paging operations, meaning you can transfer data into GPU memory while you're rendering, and bind it when it's ready. Great stuff, but more complicated for the average developer. There's an interesting article on this, written by an X-Plane dev: https://developer.x-plane.com/2020/01/all-your-vram-is-belonging-to-us-and-plugins/
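To make that lazy-versus-asynchronous distinction concrete, here's a minimal, API-free C++ sketch. It's purely illustrative, not real OpenGL or Vulkan code: the "upload" is just a worker thread and a sleep, and every name is invented. The point is only the structure: the render loop never waits on the transfer, it just binds the full texture once the background copy signals completion, whereas the lazy (just-in-time) model would hitch on the first frame the texture is used.

```cpp
// Minimal, API-free sketch of the asynchronous upload model described above.
// No real graphics calls: the "transfer" is simulated on a worker thread.
#include <algorithm>
#include <atomic>
#include <chrono>
#include <cstddef>
#include <cstdio>
#include <thread>
#include <vector>

int main() {
    std::vector<std::byte> stagingCopy(64 * 1024 * 1024); // pretend 64 MB texture
    std::atomic<bool> textureReady{false};

    // Worker thread: simulates copying the texture into GPU memory while the
    // render loop keeps running (the Vulkan-style asynchronous approach).
    std::thread uploader([&] {
        std::fill(stagingCopy.begin(), stagingCopy.end(), std::byte{0}); // fake copy work
        std::this_thread::sleep_for(std::chrono::milliseconds(50));      // fake PCIe transfer
        textureReady.store(true, std::memory_order_release);
    });

    // Render loop: never stalls on the transfer; it binds the full-resolution
    // texture only once the background upload has completed.
    for (int frame = 0; frame < 10; ++frame) {
        if (textureReady.load(std::memory_order_acquire))
            std::printf("frame %d: full-res texture bound\n", frame);
        else
            std::printf("frame %d: rendering with placeholder mip\n", frame);
        std::this_thread::sleep_for(std::chrono::milliseconds(16)); // ~60 fps frame time
    }
    uploader.join();
    return 0;
}
```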

So basically, for games that have the budget to manage GPU memory within whatever amount is available, you'll be fine buying any card. The engine will ideally adjust quality on the fly to avoid allocation failures, hooray. But for all other games that rely on the driver to manage memory, or take a different approach, you'll want as much GPU memory as you can get to avoid costly paging operations. In a few years I can see 8GB and even 10GB being borderline for the Skyrims of tomorrow.
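For the case where the engine manages its own budget, a rough sketch of the bookkeeping might look like the following. This is my own illustration with made-up names (VramBudgetCache and so on), not any real engine or driver API: keep a running total of resident texture memory against a budget and evict the least-recently-used textures when a new one won't fit. A real engine would also issue the actual GPU uploads and frees, and would try to prefetch ahead of need rather than paying the upload cost on first use.

```cpp
// Hypothetical engine-side residency manager: stays under a fixed VRAM budget
// by evicting least-recently-used textures. Illustration only; real uploads
// and frees would go through the graphics API.
#include <cstddef>
#include <cstdint>
#include <cstdio>
#include <list>
#include <unordered_map>

using TextureId = std::uint32_t;

class VramBudgetCache {
public:
    explicit VramBudgetCache(std::size_t budgetBytes) : budget_(budgetBytes) {}

    // Mark a texture as needed this frame. Returns true if it had to be
    // "uploaded" (i.e. it was not already resident) -- the expensive case.
    bool touch(TextureId id, std::size_t sizeBytes) {
        auto it = index_.find(id);
        if (it != index_.end()) {
            lru_.splice(lru_.begin(), lru_, it->second); // already resident: bump to front
            return false;
        }
        // Evict least-recently-used textures until the new one fits the budget.
        while (used_ + sizeBytes > budget_ && !lru_.empty()) {
            const Entry& victim = lru_.back();
            used_ -= victim.size;
            index_.erase(victim.id);
            lru_.pop_back();
        }
        lru_.push_front({id, sizeBytes});
        index_[id] = lru_.begin();
        used_ += sizeBytes;
        return true; // caller would schedule the real upload here
    }

    std::size_t usedBytes() const { return used_; }

private:
    struct Entry { TextureId id; std::size_t size; };
    std::size_t budget_;
    std::size_t used_ = 0;
    std::list<Entry> lru_;                                          // front = most recent
    std::unordered_map<TextureId, std::list<Entry>::iterator> index_;
};

int main() {
    // Pretend we have a 10GB card but reserve headroom for framebuffers etc.
    VramBudgetCache cache(8ull * 1024 * 1024 * 1024);
    for (TextureId id = 0; id < 2000; ++id) {
        const bool uploaded = cache.touch(id, 16ull * 1024 * 1024); // 16 MB per texture
        if (uploaded && id % 500 == 0)
            std::printf("texture %u uploaded, %zu bytes resident\n",
                        static_cast<unsigned>(id), cache.usedBytes());
    }
    return 0;
}
```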
 
Soldato
Joined
20 Aug 2019
Posts
3,030
Location
SW Florida
The answer to what question? Should I assume this is in reference to my statement on the longevity of the graphics card?
It depends on what?
Enough to achieve what? Increased longevity of a graphics card? Of course there is an amount of VRAM that will help to extend the longevity of a graphics card past a single graphics generation.

"Enough" to ensure that no game is able to blow the vram buffer over x time period.

No amount of vram will be enough to ensure *that*^ because all it takes is a mod, or a texture pack, or a developer that wants their game to be the new "Crysis" standard. vram buffers are apparently very easy to overload because developers can code vram needs to infinity and beyond if they so choose.


Expand on the bolded part, please. I really hope that your explanation isn't just the preceding sentence that I've left in.
See above.



Why did you cut out the bits from my statement that touch on this point? Also, do you think that 10GB will be enough to see us through high-end gaming until the RTX 5000 series rolls around?

Nope. 10GB isn't enough, because texture packs. Because mods. Because developers can do whatever they want. But then, for the same reasons, neither is 16GB... or 20GB... or whatever.




If consoles will protect us from too much vram usage, how then are people making the argument that we are *already* running out of vram buffers that are larger than what consoles have...now?

There is no real limit to games' ability to crush our hardware. There are monetary/business considerations, of course, but no hard limit.
 
Soldato
Joined
12 May 2014
Posts
5,236
"Enough" to ensure that no game is able to blow the vram buffer over x time period.

No amount of vram will be enough to ensure *that*^ because all it takes is a mod, or a texture pack, or a developer that wants their game to be the new "Crysis" standard. vram buffers are apparently very easy to overload because developers can code vram needs to infinity and beyond if they so choose.

Ah yes, the classic: take your idea to the extreme, and because it doesn't work in that extreme scenario, it must be a bad idea. *Slow clap*

If consoles will protect us from too much vram usage, how then are people making the argument that we are *already* running out of vram buffers that are larger than what consoles have...now?

Nobody said that consoles will protect us from too much VRAM usage. Nice strawman.
 
Caporegime
Joined
17 Feb 2006
Posts
29,263
Location
Cornwall
But everything is done for cost reasons. The 3080 doesn't have two of the 3090's chips pre-SLI'd on one card, for cost reasons too.

I think the onus really is on the complainers here to show that it is too low, and that it's actually affecting something.
That'll be quite difficult until next-gen games start arriving over the next 12 months.

I don't believe anyone has said that 8/10GB is not enough for 3-5 year old games, such as you might typically find in card reviewers' go-to benchmark suites.

And we've also said the people who upgrade every cycle have little to worry about.

So with that in mind, we're concerned for the people who ideally want to keep their cards a couple of years whilst not being VRAM-limited in future titles.
 
Caporegime
Joined
17 Feb 2006
Posts
29,263
Location
Cornwall
But then isn't raw GPU power expected to limit performance too by then?
There isn't a "yes" or "no" answer to that one.

There could be some title for which the answer is "yes" and some for which the answer is "no".

Let me ask this: if RTX 4080 has "only" 10GB GDDR6XXX, will we be seeing the same arguments? If not, why not?

Why wouldn't you also be able to claim that 10GB is "enough" for a 4080, and that you'd need 5080 levels of GPU grunt to justify more VRAM?
 
Soldato
Joined
8 Jun 2018
Posts
2,827
It should, but if the miners strike, it's unlikely.
Miners are using ASICs now, so that's a non-issue.

The bigger issue is how Nvidia's Ampere will compete against consoles costing $499, with no delay in playing the games we like to play.

For $500 you get a 3070 with only 8GB of VRAM.
For well over $500 you get a 3080 with just 10GB of VRAM.

For the cost of a console you could use the money left over to get COD BO Cold War and a USB hub so you can switch your keyboard and mouse between console and PC.
 
Soldato
Joined
20 Aug 2019
Posts
3,030
Location
SW Florida
There isn't a "yes" or "no" answer to that one.

There could be some title for which the answer is "yes" and some for which the answer is "no".

Let me ask this: if RTX 4080 has "only" 10GB GDDR6XXX, will we be seeing the same arguments? If not, why not?

Why wouldn't you also be able to claim that 10GB is "enough" for a 4080, and that you'd need 5080 levels of GPU grunt to justify more VRAM?

I think VRAM should be tied to a GPU's ability to use it. We have gone from 8GB to 10GB, and next gen will probably go to 12 or 16GB as they should have even stronger GPUs.

I have owned my 1080 Ti for years without ever using even 8GB. When I maxed out Project Cars 2 (the sim I spend most of my time in) to see how much VRAM it would actually fill, it *still* didn't get to 8GB. Even more importantly, it was freaking unplayable on my Reverb. The 1080 Ti couldn't hold 30fps.

So, for the stuff I have used my GPU for, I have hit the GPU performance wall waaaay before I hit the VRAM wall.
 
Soldato
Joined
8 Jun 2018
Posts
2,827
As it stands now, the Xbox Series X has derailed the Ampere hype train.
Now Nvidia has to scramble to decide if they want to lower prices or simply provide a higher-VRAM SKU.
Because as it stands, a 10GB 3080 is simply a silly purchase: it costs a third more than a console (and even more with exotic cooling), while the console offers more games.
 
Caporegime
Joined
21 Jun 2006
Posts
38,372
Nvidia have made an official statement saying it's a non-issue.

I currently have a 2070 Super and it's got less, so I'll believe what the experts say.

If some sweaty nerd doesn't think that's the case, then go buy the 3090.
 
Soldato
Joined
8 Jun 2018
Posts
2,827
Nvidia have made an official statement saying it's a non-issue.

I currently have a 2070 Super and it's got less, so I'll believe what the experts say.

If some sweaty nerd doesn't think that's the case, then go buy the 3090.
Experts? At what, exactly? Nvidia only knows their own products, not the gaming market. They are competing against next-gen consoles with a target price of $499, which will offer more games than the one game Nvidia provides, Cyberpunk 2077. So I don't see how they're experts at anything other than marketing hype.

As it stands now, the 3080 was stillborn before the reviews even came out.


I am willing to believe there were more people watching (and who will buy) the live BO Cold War trailer than people who will actually buy a 3080 once it's released. Guarantee!!
:D
 