
10GB vram enough for the 3080? Discuss..

Status
Not open for further replies.

TNA

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,583
Location
Greater London
Microsoft DirectStorage "preview" not available on Windows until 2021, it seems. So games supporting it late in the year, or 2022? Not quite the saviour then?
By then I will have a 4070/80 :p

In the meantime, worst case, use DLSS or drop one setting on textures in the odd game that needs more than 10GB to work properly. Either that, or sacrifice 30% or more grunt, save £100 and go 3070 16GB.
 
Soldato
Joined
28 Dec 2003
Posts
16,080
Well I was planning on going 3080 FE on 17th but this thread has me thinking.

Given that I have a 1080Ti which is doing fine really, reckon it's worth sitting this launch out for a little while and seeing what transpires?
 

TNA

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,583
Location
Greater London
Well I was planning on going 3080 FE on 17th but this thread has me thinking.

Given that I have a 1080Ti which is doing fine really, reckon it's worth sitting this launch out for a little while and seeing what transpires?
If you feel that way then yes. 1080Ti is no slouch, I am sure it will be fine for another couple of months.

If you are planning to upgrade again next gen then I am sure it is fine, especially if you go in knowing some games may need you to lower textures one notch from max. That’s like the worst thing/consequence of not going for more VRAM. Not exactly a huge deal for something that may occur a handful of times in the next 2 years.

Not ideal, but the alternative is to wait and pay an extra £150 for the 20GB version. That has zero appeal to me personally.
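To put rough numbers on what "one notch of textures" actually costs, here's a back-of-the-envelope sketch in Python. All figures are illustrative assumptions (BC7-style block compression at 1 byte per pixel, a full mip chain adding about a third, a made-up count of 200 resident textures), not measurements from any game:

```python
# Why dropping texture resolution one notch saves so much VRAM: halving the
# side length quarters the pixel count. Assumed: 1 byte/pixel (BC7-style
# block compression) and a full mip chain adding roughly one third.

def texture_mb(side_px, bytes_per_px=1.0, mips=True):
    """Approximate VRAM footprint of one square texture, in MiB."""
    size = side_px * side_px * bytes_per_px
    if mips:
        size *= 4 / 3  # full mip chain adds ~33%
    return size / (1024 * 1024)

for side in (4096, 2048):
    one = texture_mb(side)
    print(f"{side}x{side}: ~{one:.1f} MiB each, "
          f"~{one * 200 / 1024:.1f} GiB for 200 textures")
# 4096x4096: ~21.3 MiB each, ~4.2 GiB for 200 textures
# 2048x2048: ~5.3 MiB each, ~1.0 GiB for 200 textures
```

So one notch down roughly quarters the texture pool, which is why it's such an effective lever if 10GB ever gets tight.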
 
Soldato
Joined
28 Dec 2003
Posts
16,080
If you feel that way then yes. 1080Ti is no slouch, I am sure it will be fine for another couple of months.

If you are planning to upgrade again next gen then I am sure it is fine, especially if you go in knowing some games may need you to lower textures one notch from max. That’s like the worst thing/consequence of not going for more VRAM. Not exactly a huge deal for something that may occur a handful of times in the next 2 years.

Not ideal, but the alternative is to wait and pay an extra £150 for the 20GB version. That has zero appeal to me personally.

I'm running at 3440x1440 so not full 4K and I do have to turn off a few shinies on some games but the 1080Ti can still manage 90fps+ in the majority of stuff without significant quality reductions.

I deliberately skipped Turing as it was appalling value for money but will definitely be upgrading to Ampere, it's just a question of what and when.

Cyberpunk in November may be the first game that really makes the 1080Ti struggle to the point I feel I have to pull the trigger.
 

TNA

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,583
Location
Greater London
I'm running at 3440x1440 so not full 4K and I do have to turn off a few shinies on some games but the 1080Ti can still manage 90fps+ in the majority of stuff without significant quality reductions.

I deliberately skipped Turing as it was appalling value for money but will definitely be upgrading to Ampere, it's just a question of what and when.

Cyberpunk in November may be the first game that really makes the 1080Ti struggle to the point I feel I have to pull the trigger.
Yes, so as it stands your problem is grunt, not VRAM. I am likely buying a 3080 and I am on 4K :D
 
Soldato
Joined
28 Dec 2003
Posts
16,080
Yes, so as it stands your problem is grunt, not VRAM. I am likely buying a 3080 and I am on 4K :D

Yeah I doubt the 10GB VRAM is going to be an issue now but I'm not sure about over the next year or two.

Of course, if a "higher" 3080 variant does come at some point, then it could merely be a 3080 with more memory or it could be a "Ti" or "Super" with better processor specs too.

Basically I'll be getting a 3080, I just have three options:

1. Buy an FE (like the look of them) blind on launch day before seeing reviews
2. Don't buy on launch and wait a week or two for the dust to settle and reviews to come out then decide what card to buy
3. Wait for longer (months) to see what RDNA2 looks like and whether there are any rumours of a better 3080 model coming
 

TNA

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,583
Location
Greater London
Yeah I doubt the 10GB VRAM is going to be an issue now but I'm not sure about over the next year or two.

Of course, if a "higher" 3080 variant does come at some point, then it could merely be a 3080 with more memory or it could be a "Ti" or "Super" with better processor specs too.

Basically I'll be getting a 3080, I just have three options:

1. Buy an FE (like the look of them) blind on launch day before seeing reviews
2. Don't buy on launch and wait a week or two for the dust to settle and reviews to come out then decide what card to buy
3. Wait for longer (months) to see what RDNA2 looks like and whether there are any rumours of a better 3080 model coming

I will be doing number 1 this time around myself. Feels like it has been ages since I picked up something on release. There is a certain excitement and fun about it, so I am willing to take a hit if it turns out not to be the optimal decision. Besides, the reviews will be out at the same time as release or earlier, I would imagine; if the cards turn out to be rubbish or not as promised, I can just cancel the order the same day. No biggie.

Gonna be fun running some benchmarks and smashing those 2080Ti results with a GPU costing half the price of what people paid a week ago :p:D
 
Soldato
Joined
5 Jan 2009
Posts
4,759
I'm personally not that bothered about RDNA2, as I'm 100% a team green fanboi and am committed to a G-Sync display. The only reason to wait for RDNA2 before purchasing a 30x0 card will be if Nvidia suddenly pulls a 3080 Ti with more VRAM, or prices are slashed to keep the lead.

I'm very much in the 'not sure what to do' camp. On paper 10GB of VRAM is enough, especially if this new IO is utilised by devs, but the sceptic in me feels it will be slow. This card has to last me a long time, so it has to provide some safe future-proofing.
 

TNA

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,583
Location
Greater London
I'm personally not that bothered about RDNA2, as I'm 100% a team green fanboi and am committed to a G-Sync display. The only reason to wait for RDNA2 before purchasing a 30x0 card will be if Nvidia suddenly pulls a 3080 Ti with more VRAM, or prices are slashed to keep the lead.

I'm very much in the 'not sure what to do' camp. On paper 10GB of VRAM is enough, especially if this new IO is utilised by devs, but the sceptic in me feels it will be slow. This card has to last me a long time, so it has to provide some safe future-proofing.
Yeah, defo wait and see what’s what in November in that case. I would go no less than 16GB in your position.
 
Soldato
Joined
22 Oct 2008
Posts
11,493
Location
Lisburn, Northern Ireland
Seen quite a bit of back-and-forth talk online about 10GB cutting it a bit short, with others saying it's fine. Usage at 1440p is less than at 4K, but there are still a few games where usage goes just above 10GB.

So do we think the rumours of extra cards with 20GB will be there to pre-order with the other cards? Or are they waiting to see what AMD does?

A 3080Ti would fill that gap with extra VRAM over the normal 3080
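For anyone wanting to check their own numbers rather than rely on screenshots: nvidia-smi can report per-GPU memory use in CSV form. A minimal Python sketch, with the sample line hardcoded for illustration (on a real system you'd capture the command's output via subprocess):

```python
# Real nvidia-smi invocation this parses:
#   nvidia-smi --query-gpu=memory.used,memory.total --format=csv,noheader,nounits
# The sample line below is made up so the sketch runs without a GPU.

def parse_mem_csv(line):
    """Parse one 'used, total' CSV line (values in MiB) into ints."""
    used, total = (int(x.strip()) for x in line.split(","))
    return used, total

sample = "9734, 10240"  # hypothetical 10GB card close to its limit
used, total = parse_mem_csv(sample)
print(f"{used} / {total} MiB ({100 * used / total:.0f}% used)")
# 9734 / 10240 MiB (95% used)
```

Worth remembering that games often allocate more than they actually need, so a near-full reading alone doesn't prove a card is VRAM-limited.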
 
Soldato
Joined
12 May 2014
Posts
5,236
Where did this image come from? Just curious, as I was intrigued to see if any other cards had been benchmarked in Fortnite with RTX enabled, to see how the 3080 compares. But I couldn't find anything, including this 3080 benchmark. Also couldn't find any discussion about it anywhere.
The image comes from a leaked Nvidia press deck.
 
Soldato
Joined
18 Feb 2015
Posts
6,484
Microsoft DirectStorage "preview" not available on Windows until 2021, it seems. So games supporting it late in the year, or 2022? Not quite the saviour then?

Of course not, and it requires per-game implementation on top of that as well. That's why Nvidia isn't relying on it as anything more than a marketing point versus consoles, and for actual VRAM it's going to deliver cards with more of it (at a higher price). Win/win... for them.

Where did this image come from? Just curious, as I was intrigued to see if any other cards had been benchmarked in Fortnite with RTX enabled, to see how the 3080 compares. But I couldn't find anything, including this 3080 benchmark. Also couldn't find any discussion about it anywhere.
It's from slides Nvidia has just released.
https://www.hardwareluxx.de/index.p...pere-und-gtx-30-series-deep-dive.html?start=3

3080 doing 28fps won't be due to a VRAM limit.
Indeed
https://www.overclockers.co.uk/forums/posts/33902299/
 
Soldato
Joined
17 Apr 2009
Posts
7,591

Thanks. Interesting insight.

Minecraft RTX could bring the last generation cards to their knees. It looks like Fortnite can do the same thing this time out, without DLSS. I wonder how far the RTX implementation can go in more demanding games like Cyberpunk? Maybe we still don't have the hardware. 4000-series?

I didn't notice a VRAM usage bar in that slide?

I would expect to see a slide showing VRAM at (or near) 100% with DLSS off. That would be proof.

You know why town centres erect those signs saying "Don't feed the pigeons"? It's because they don't want the pigeons to keep coming back.

Same principle applies here.
 
Associate
Joined
12 Jul 2020
Posts
288
Seems a little silly for Nvidia to only use 10GB. But I suppose this was an intentional downgrade to entice buyers to the inevitable 16GB version.
 
Soldato
Joined
5 Jan 2009
Posts
4,759
Yeah, defo wait and see what’s what in November in that case. I would go no less than 16gb in your position.
At this point, the stupid, impatient, greedy part of me is thinking sod it, I'll just get the 3090 and be done with it. I'd not be buying anything for at least half a decade after that...

I just don't trust that game devs are going to get immediately on board with this new SSD PCI-E streaming tech, and I tend to play older titles like Mass Effect and Skyrim, though of course I have TES VI and Mass Effect remastered in mind too. FS 2020, I want 40+ fps on high/ultimate everywhere. It may be that I have to get the KY jelly out and pay up for that 3090
 