
3080 or 3090 for ultra wide?

Man of Honour
Joined
24 Sep 2005
Posts
35,487
Hello

For the first time, I’m dabbling with the idea of building a PC for my ‘man cave’ in my new home.

I have a fairly large budget and I’m quite taken with some of the 49” super-wide monitors I’ve seen. I like the way they can be partitioned into separate screens, and I can foresee that being really useful for work. The videos I’ve seen suggest they are epic for gaming.

I like the idea of some meaty gaming rig that can eat through anything, but I’m also conscious of diminishing returns.

From what I can gather, the 3080 will absolutely demolish most games and the 3090 gives a ~10% performance boost, with the biggest difference at higher resolutions.

The price difference between the cards is massive.

If I’m getting a large monitor like that, is it wise to go for a 3090? I’m reluctant to throw money up the wall just for peace of mind / the simple childish notion of having ‘the best’, but I could buy one if I wanted to.

This is all somewhat hypothetical, as they are obviously out of stock. I’m looking to build in Q1 of next year, hopefully.

Cheers.
 
Associate
Joined
19 Jun 2017
Posts
1,029
We have been following the Big Navi rumours. It seems the flagship Navi will beat the 3080 in traditional rasterization at 4K. Since there's not much of a performance gap between the 3090 and the 3080, Nvidia may offer the 3090 with less RAM at a much sweeter price by the time you upgrade.

Also, there are rumours of Nvidia shifting Ampere to TSMC.

So nothing is set in stone as of now.
 
Soldato
Joined
22 Apr 2016
Posts
3,425
Yes, the performance difference between the two cards is minimal. If the 3080 can’t run something well then a 3090 won’t either. But if you have to have the best then...
 
Man of Honour
OP
Joined
24 Sep 2005
Posts
35,487
Interesting - I’ll keep an eye on the Navi. Thanks for the information :)

I feel that “having to have the best” is such a terrible and childish reason to buy anything. It leads to such waste. I find myself trying to exert more temperance these days. Otherwise there is always another ‘the best’ and you end up in this loop of ‘want’.

I suppose I was trying to suss out whether a monitor like that would make it less wasteful and more prudent.

This is the worst ‘I can afford a 3090’ thread.
Ahh, you got me! :)
 
Caporegime
Joined
13 May 2003
Posts
33,939
Location
Warwickshire
3090 seems like peeing money up the wall unless you have some kind of 'prosumer' use case for the VRAM.

Even for the wealthy there's obviously always an opportunity cost to peeing money up the wall.

Look at a 3080 or Big Navi and be dull and stick the rest in a SIPP or mortgage :p.
 
Associate
Joined
29 Nov 2015
Posts
446
Location
East
I think there is almost certainly going to be a 3080 Ti to plug the gap between the £700 and £1,600 price points. Logically you want a card with more VRAM at that resolution, so the 3080 is a bit iffy imo.

AMD's offering may be an excellent opportunity to get middle-of-the-pack value. Although if RTX is your desire then you're kind of stuck at present.

I was going to get a 3090 and even moved the internals of my case around so it would fit, as price is not an issue. In a similar fashion to you, I looked at the performance per £ and it's simply a total rip-off. Although I am only on a 34" 3440x1440 monitor. :D
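To put the performance-per-£ point in rough numbers, a quick sketch using the illustrative prices above (£700 for a 3080, £1,600 for a 3090) and the ~10% uplift mentioned earlier in the thread; real street prices will vary:

```python
# Rough performance-per-pound comparison using the thread's illustrative
# figures (GBP 700 for a 3080, GBP 1,600 for a 3090, ~10% uplift).
price_3080, price_3090 = 700, 1600
perf_3080, perf_3090 = 1.00, 1.10   # normalised relative performance

cost_ratio = price_3090 / price_3080
value_ratio = (perf_3090 / price_3090) / (perf_3080 / price_3080)

print(f"3090 costs {cost_ratio:.2f}x as much")                        # 2.29x
print(f"3090 offers {value_ratio:.0%} of the 3080's perf per pound")  # 48%
```

So on these numbers you pay roughly 2.3x the money for under half the value per pound.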
 
Man of Honour
OP
Joined
24 Sep 2005
Posts
35,487
Thanks for the post. Yes, no prosumer use here - it’s more for gaming and a wish to dabble in video editing. But nothing professional.

I’ve gathered there is a lot of huffing about the VRAM of the 3080, but it seems to be mostly doom-mongering about future-proofing, with a whole bunch of people saying ‘it’s fine’. What is the relevance of the VRAM, and is it an issue likely to be exacerbated by resolution?

Sorry for the silly questions, I’m very new to this and have never built a PC myself, so I'm eager to learn!
 
Man of Honour
OP
Joined
24 Sep 2005
Posts
35,487
Hmm interesting, is the VRAM material then?

I’m in no rush to buy per se, so I could wait until Q2. The one thing that is fixed is a whopping monitor. I'm not 100% fixed on size, but I find anything other than three monitors horrible for paperless working... so one larger screen seems a much more aesthetically pleasing solution. I was really impressed with the settings on those monitors that let you split them into effectively 3/6 separate monitors and snap documents to windows. Sounds amazing.

I just want to be able to drive it for gaming.
 
Associate
Joined
19 Jun 2017
Posts
1,029

This may be relevant to video editing:
https://www.pugetsystems.com/labs/a...VIDIAGeForceRTX3080&3090performinPremierePro?

Doesn't seem to be much of a benefit. Though I have never done video editing, only seen folks doing it... a painstaking activity, so maybe every percentage point counts.

For gaming, 10GB shouldn't be much of a problem... but I think you'd have more options by the time you upgrade. There's something coming from Intel as well in that time frame.
 
Man of Honour
Joined
15 Jan 2006
Posts
32,369
Location
Tosche Station
The answer to this thread is simple: the 3090 is only a very small improvement over the 3080, it's a waste of money, and I'd totally disregard it.

In fact, I'd disregard Nvidia entirely for the moment. If you're not already in the queue for a card then you're in a position where they might as well have not even released yet.

Tune in to the AMD announcement on Wednesday and then start thinking about where your hard earned goes. Fingers crossed AMD do a real Ryzen on the GPU market.
 
Man of Honour
OP
Joined
24 Sep 2005
Posts
35,487
Cool, will check it out, thanks.

Seems to be a good time to invest in a decent PC. Lots of products are coming out that push things forward significantly. Unrelated to GPUs, but from my limited reading, I’m not very impressed with Intel’s awkwardness around CPU socket and motherboard compatibility... it seems very unfriendly to those who bought in previously and want to upgrade. Puts me off! I much prefer the neater AMD route. I guess you’re still at risk of them doing something similar, though.
 
Caporegime
Joined
13 May 2003
Posts
33,939
Location
Warwickshire
Re. VRAM, it's been discussed to death, but people always seem to conflate VRAM usage with VRAM requirement.

It's the same with RAM... games will allocate more if it's there, but what matters is the frame-rate delta in current and near-term games, for which 10GB is plenty, even at 4K.
 
Associate
Joined
19 Jun 2017
Posts
1,029
With the Bitcoin price going back up (think GPU mining), you may well find a competitive 6900 is even harder to find than a 3080.

What kind of ETH hash rates per watt are you estimating for the 6800/6900? It seems the 3080 is going to have the teraflops horsepower this time.

Though, realistically I believe AMD will match or charge higher prices, as they might also have to cover demand from unfulfilled GeForce orders... a 6900 at $1,449 (discounted for less RAM) might turn out to be as eye-watering as the 3090 :)
 
Associate
Joined
3 Jul 2012
Posts
425

It's not so much about raw power as the power-per-hash ratio; that's what made the 5000 series popular with miners.

A 5700 XT draws 140 W for 50 MH/s (TDP is 225 W): 2.8 W per MH/s.

In theory a 6900 XT would draw 153 W for 75 MH/s (TDP is 255 W, presuming the 6900 XT is 50% better than the 5700 XT): ~2 W per MH/s.

Edit: the 3080 is 300 W for 90 MH/s: ~3.33 W per MH/s.

I wouldn't be surprised if the 6900 ends up more expensive than the 3080.
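As a quick sanity check of the figures above (note the 6900 XT numbers are this post's own estimates, not confirmed specs):

```python
# Watts drawn per MH/s of ETH hash rate (lower is better for miners).
# The 6900 XT figures are estimates from this post, not confirmed specs.
cards = {
    "5700 XT":            (140, 50),   # (watts, MH/s)
    "6900 XT (estimate)": (153, 75),
    "3080":               (300, 90),
}

for name, (watts, mhs) in cards.items():
    print(f"{name}: {watts / mhs:.2f} W per MH/s")
# 5700 XT: 2.80, 6900 XT (estimate): 2.04, 3080: 3.33
```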
 