3080 + 3090 - Most power hungry cards in recent memory

Associate
Joined
8 May 2014
Posts
96
I think you've completely missed the point of the criticism, which was more to do with the insane power and heat for the size of the die and the performance. Let's not get too fanboyish here now.

I don't think anyone should really care about power consumption provided the performance is good enough. I mean, you can run the 3080 on a 650W power supply, which isn't an issue for most people at all - unless there's an underlying agenda, which seems to be the case.

I'm giving up. This is bizarre. Power consumption = Heat. There's nothing more to say.
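
To put a number on that equivalence - essentially every watt a card draws leaves the case as heat - here's a minimal sketch using the 3080's 320W rated board power:

```python
# Practically all electrical power a GPU draws is dissipated as heat.
gpu_power_w = 320                                 # RTX 3080 rated board power
megajoules_per_hour = gpu_power_w * 3600 / 1e6    # 1 W = 1 J/s
btu_per_hour = gpu_power_w * 3.412                # 1 W ~ 3.412 BTU/h
print(f"{gpu_power_w} W -> {megajoules_per_hour:.2f} MJ/h "
      f"(~{btu_per_hour:.0f} BTU/h)")
```

That's roughly a fifth of a typical 1500W space heater, running continuously into the case.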

I think we just wait and see what the actual mainstream users think... not after a day, but after a month or two. And I predict they won't be overjoyed.

BTW - a 650W power supply had better not be some cheap ****** clone, and should be at least Bronze rated, preferably higher. Even then people are going to experience problems - especially if they have the latest Intel processors.
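
A rough budget shows why the quality of that 650W unit matters; the CPU and "rest of system" figures below are assumptions for illustration:

```python
# Rough steady-state budget on a 650 W PSU. CPU and "rest" figures are
# illustrative assumptions, not measurements.
gpu_w = 320    # RTX 3080 rated board power
cpu_w = 250    # assumed: a recent high-end Intel chip at full boost
rest_w = 75    # assumed: motherboard, RAM, drives, fans

total_w = gpu_w + cpu_w + rest_w
psu_w = 650
print(f"{total_w} W load on a {psu_w} W PSU -> {psu_w - total_w} W headroom")
# Almost nothing left for transient spikes, which is exactly where
# cheap units fall over.
```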
 
Man of Honour
Joined
13 Oct 2006
Posts
91,158
Are you sure about that? It is my understanding that the team that works on one node isn't allowed to work on a competitor's node, for legal reasons and because of signed legal documents, which makes sense since TSMC doesn't want to share node information with another fab and vice versa.

I've never encountered anything like that, aside from some very specific stuff, e.g. when working with Micron on developing memory.

AFAIK stuff like their failure analysis labs etc. don't have separate teams based on where the cores are being made, with several of the more experienced people working across a range of products.

EDIT: In general though there isn't that level of secrecy - I've looked at having custom ASICs made at TSMC before (shuttle runs) and was somewhat involved in that in a previous job, and aside from stuff like risk production there isn't much in the way of that kind of legal stuff involved - just your usual confidential information handling terms/clauses.
 
Permabanned
Joined
15 Oct 2011
Posts
6,311
Location
Nottingham Carlton
I think you've completely missed the point of the criticism, which was more to do with the insane power and heat for the size of the die and the performance. Let's not get too fanboyish here now.

I don't think anyone should really care about power consumption provided the performance is good enough. I mean, you can run the 3080 on a 650W power supply, which isn't an issue for most people at all - unless there's an underlying agenda, which seems to be the case.
Power consumption could be 500W - I've got cooling power. As long as there is processing power, I DON'T CARE.
 
Soldato
Joined
10 Oct 2012
Posts
4,421
Location
Denmark
I've never encountered anything like that, aside from some very specific stuff, e.g. when working with Micron on developing memory.

AFAIK stuff like their failure analysis labs etc. don't have separate teams based on where the cores are being made, with several of the more experienced people working across a range of products.

EDIT: In general though there isn't that level of secrecy - I've looked at having custom ASICs made at TSMC before (shuttle runs) and was somewhat involved in that in a previous job, and aside from stuff like risk production there isn't much in the way of that kind of legal stuff involved - just your usual confidential information handling terms/clauses.

I see, but could that be different if we are talking about work on bleeding-edge nodes/technology?
 
Soldato
Joined
22 Nov 2018
Posts
2,715
Power requirements are relative. You could easily overclock the most power-efficient GPU in history until it becomes power hungry.

Nvidia have clearly just clocked these higher than usual, that's all. If they hadn't done that, enthusiasts would have overclocked them anyway.

Nvidia haven't actually created a power-hungry architecture.
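
The reason an overclock eats power so quickly is the usual dynamic-power approximation, P ≈ C·V²·f: clockspeed enters once, but the voltage needed to hold that clockspeed enters squared. A minimal sketch with illustrative numbers:

```python
# Dynamic power scales roughly as P ~ C * V^2 * f, so a modest clock bump
# that needs extra voltage costs disproportionate power.
def relative_power(clock_ratio, voltage_ratio):
    return clock_ratio * voltage_ratio ** 2

# Assumed: +10% clock near the top of the curve needs ~+7% voltage.
print(f"x{relative_power(1.10, 1.07):.2f} power")  # ~x1.26 for +10% clock
```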
 
Associate
Joined
8 May 2014
Posts
96
Power requirements are relative. You could easily overclock the most power-efficient GPU in history until it becomes power hungry.

Nvidia have clearly just clocked these higher than usual, that's all. If they hadn't done that, enthusiasts would have overclocked them anyway.

Nvidia haven't actually created a power-hungry architecture.

Delusional.
 
Soldato
Joined
18 Oct 2002
Posts
10,951
Location
Bristol
For me the high power consumption makes this release a bit of a failure, engineering-wise. It seems like most of the performance increase has simply come from burning more power. That's not progress. Progress needs to come from architecture and silicon, not from designing a better cooler!

CPUs have managed to live within ~100W for 20+ years, while Nvidia are mucking around with 320W and 350W parts. I'm really hoping AMD can be competitive with the 3080 at 220-250W.
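
One way to make that complaint precise: a generational speedup factors into an efficiency gain times a power increase, so you can see how much of it is real architectural progress. A sketch using rated board powers (250W and 320W) and the roughly 32% 4K uplift cited later in this thread:

```python
# speedup = (perf-per-watt ratio) * (power ratio)
old_power_w, new_power_w = 250, 320   # rated board power: 2080 Ti, 3080
speedup = 1.32                        # ~32% faster at 4K

power_ratio = new_power_w / old_power_w       # x1.28 the power
efficiency_ratio = speedup / power_ratio      # ~x1.03 perf per watt
print(f"power x{power_ratio:.2f}, perf/W x{efficiency_ratio:.2f}")
# On rated numbers, nearly all of the gain is just extra power.
```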
 
Associate
Joined
21 Oct 2013
Posts
2,061
Location
Ild
It's just that Nvidia have cranked the clockspeed to a stupidly high level, which naturally consumes a lot of power and generates a lot of heat.
Because they had to in order to get a performance increase. The card has no headroom and operates at the power limit by default. It screams of Vega, and yes, I have one too.

You get the impression Nvidia has something else lined up to replace these within 12 months. I can't see them creating a mobile GPU out of these either (without moving from Samsung).
 
Associate
Joined
2 Jun 2016
Posts
2,382
Location
UK
30W higher power consumption compared to a 2080 Ti for a 32% performance increase at 4K seems reasonable.
[Chart: power consumption - gaming average]

Cooler seems to do a decent job as well.

[Chart: performance per watt - 3840x2160]
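
Taking the figures above at face value - 30W extra for a 32% uplift - the implied perf-per-watt gain is healthy; the ~270W gaming-average baseline for the 2080 Ti is an assumption here:

```python
# Perf/W implied by the quoted numbers; the 270 W baseline is assumed.
base_power_w = 270            # assumed 2080 Ti gaming-average draw
extra_w = 30                  # from the post above
speedup = 1.32                # +32% at 4K, from the post above

power_ratio = (base_power_w + extra_w) / base_power_w
print(f"power x{power_ratio:.2f}, perf/W x{speedup / power_ratio:.2f}")
# ~x1.11 power for ~x1.19 perf/W on these numbers.
```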


 
Soldato
Joined
18 Oct 2002
Posts
10,951
Location
Bristol
30W higher power consumption compared to a 2080 Ti for a 32% performance increase at 4K seems reasonable.
No, I don't see that as reasonable at all. Where does that thinking take us? 400W cards next year, then 500W? If performance had scaled with power over the last decade we'd be pumping hundreds of kW into our systems by now. Raw performance needs to come with comparable efficiency improvements.
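
A minimal sketch of that reductio, assuming an illustrative ~1.4x performance gain per generation and five generations in a decade, with every gain coming from power alone:

```python
# If performance only came from power, board power would compound by the
# per-generation gain. 1.4x per generation is an illustrative assumption.
power_w = 250.0
for gen in range(1, 6):
    power_w *= 1.4
    print(f"gen {gen}: {power_w:,.0f} W")
# gen 5: ~1,345 W; ten generations at this rate would pass 7 kW.
```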
 
Associate
Joined
2 Jun 2016
Posts
2,382
Location
UK
No, I don't see that as reasonable at all. Where does that thinking take us? 400W cards next year, then 500W? If performance had scaled with power over the last decade we'd be pumping hundreds of kW into our systems by now. Raw performance needs to come with comparable efficiency improvements.
It's only 30W! For 4K gaming right now it's a fair trade-off, imo. Looking at the list it doesn't seem too bad...
Btw, did you look at the Performance per Watt chart?

The rumour is that AMD's cards are going to be quite power hungry as well. The revealed cards, both the three-fan and two-fan ones, are 2x 8-pin - 375W max.
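
That 375W ceiling falls out of the PCIe power-delivery limits: 75W from the slot plus 150W per 8-pin connector:

```python
# PCIe power budget: 75 W from the slot, 150 W per 8-pin connector.
slot_w, eight_pin_w = 75, 150
print(slot_w + 2 * eight_pin_w)  # 375
```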
 