Couldn't 2x 8-pin power consumption be anywhere between 226 W and 375 W? AMD's RDNA2 is rumored to have a 60% perf/watt increase over RDNA1, exceeding their 50% target, which is mostly from architecture changes. If it has better perf/watt than Ampere and can get close to the performance of the 3080, and I suspect it will, I'll probably be switching to an AMD GPU. Nvidia really screwed up trying to squeeze TSMC on pricing, but I have no doubt they will get away with it.

It's only 30 W! For 4K gaming it's a fair trade-off now, imo. Looking at the list it doesn't seem too bad...
Btw, did you look at the Performance per Watt chart?
The rumour is that AMD cards are going to be quite power hungry as well. The cards revealed so far, both the three-fan and two-fan models, have 2x 8-pin connectors, so 375 W max.
But that's my point: if RDNA2 is so efficient, why are there 2x 8-pin connectors instead of, say, 1x 6-pin and 1x 8-pin on the smaller card? It just seems odd.
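For anyone curious, the connector arithmetic behind those numbers comes straight from the PCIe spec: the slot supplies up to 75 W, a 6-pin connector up to 75 W, and an 8-pin up to 150 W. A rough Python sketch (the helper name is mine, the per-connector limits are the spec values):

```python
# Per the PCIe spec: slot = 75 W, 6-pin = 75 W, 8-pin = 150 W.
SLOT_W = 75
CONNECTOR_W = {"6-pin": 75, "8-pin": 150}

def board_power_limit(connectors):
    """Maximum board power for a given set of supplemental connectors."""
    return SLOT_W + sum(CONNECTOR_W[c] for c in connectors)

print(board_power_limit(["8-pin", "8-pin"]))  # 375 W ceiling
print(board_power_limit(["6-pin", "8-pin"]))  # 300 W ceiling
```

Worth remembering that the connectors only set a ceiling, not the actual draw: a 2x 8-pin card can sit anywhere up to 375 W, which is presumably where the "anywhere between 226 W and 375 W" range comes from (anything over what a single 8-pin plus the slot could supply, up to the full budget).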
This probably doesn't matter to most people, but the power consumption is actually a deal breaker for me, which is a great shame as the price/performance looks excellent by modern standards. As the Mrs and I both game in a small room, TDP is really important, and we can't afford to add another 200-300 watts (two cards) of heat into our office. We've tried - and suffered - previously, and even moving from overclocked 1080 Tis to 2070 Supers at stock made a huge improvement.
Looks like we'll be waiting to see how the 3070 fares performance-wise.
Edit: cooling the cards is one thing, but a lot of people don't seem to understand that the heat is still dissipated into the room. All good cooling does is remove the heat from the GPU more quickly and efficiently.
Both are important, and with the 3080 Ampere fails at both. Yes, it is slightly more power efficient than the previous high-end cards, but it fails because that gain isn't enough for the node shrink. If every generation's efficiency gains fell this far short of its performance jump, we would have cards drawing kilowatts by now. What's important is performance per watt, not simply the number of watts used.
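To put some made-up numbers on that compounding argument: if each generation delivers 2x the performance but only 1.3x the performance per watt, power draw has to grow by 2/1.3 (roughly 1.54x) every generation, and that snowballs quickly (all figures below are hypothetical, just to show the shape of the curve):

```python
def projected_power(start_watts, perf_gain, perf_per_watt_gain, generations):
    """Power draw needed after N generations if performance grows
    faster than efficiency (hypothetical figures throughout)."""
    growth_per_gen = perf_gain / perf_per_watt_gain
    return start_watts * growth_per_gen ** generations

# A 250 W card, 2x performance but only 1.3x perf/W each generation:
print(projected_power(250, 2.0, 1.3, 6))  # over 3 kW within six generations
```

If efficiency kept pace with performance (perf_per_watt_gain equal to perf_gain), power draw would stay flat, which is exactly the perf/watt point being made.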