
NVIDIA ‘Ampere’ 8nm Graphics Cards

Associate · Joined 1 Sep 2018 · Posts 188
Without knowing enough to get into the specifics, is the move from 12nm to 8nm not meant to result in less energy consumption and less heat? Then why are we seeing the opposite in the rumours?

If you took the Turing architecture from the 2000 series, which I understood was itself a derivative of Pascal, and simply shrank it down so you could run more current through it for higher clocks, wouldn't that account for the speculated performance increases we are hearing about (and in turn the increased power consumption and heat generation)? Are the rumoured clock speeds noticeably different, and if so, is this just an overclocked Turing GPU?

Again, I don't know enough about the technical side of things to understand (hence the question), but it just seems to me that if this is a more efficient architecture and/or fabrication process, then we shouldn't see such energy and cooling demands.
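For what it's worth, the intuition behind the question can be sketched with the standard dynamic-power relation. This is a minimal illustration with made-up ratios, not Ampere's actual numbers:

```python
# Dynamic power scales roughly as P ~ C * V^2 * f (capacitance x voltage
# squared x clock frequency). A node shrink lowers C, but if the vendor
# spends that headroom on higher clocks (f) and a bit more voltage (V),
# total power can still go UP. All ratios below are illustrative.
def dynamic_power(c, v, f):
    """Relative dynamic power for capacitance c, voltage v, frequency f."""
    return c * v ** 2 * f

baseline = dynamic_power(c=1.00, v=1.00, f=1.00)  # old process, old clocks
shrunk = dynamic_power(c=0.70, v=1.05, f=1.35)    # smaller node, pushed harder

print(f"Relative power after the shrink: {shrunk / baseline:.2f}x")
```

With those hypothetical ratios the shrunk part draws about 1.04x the baseline, so "more efficient process" and "higher power draw" are not contradictory once clocks are pushed hard enough.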
 
Soldato · Joined 23 Apr 2010 · Posts 11,896 · Location West Sussex
Why does everyone get so obsessed with power consumption? What makes 350W unreasonable? Versus what exactly?

If that's how much power it takes to shift that many pixels around at that speed, then fair enough. If AMD managed to produce a card with the same performance but far lower wattage then maybe we'd have a frame of reference, but until then, it is what it is.

You don't see people pointing at 11kW electric showers and saying "OMG that's ridiculous".

hahaha, you really don't expect it to use no more than 350W, do you?

It specifically states that is at stock clocks with NO overclocking. So, in other words, don't even install anything that will artificially boost those clocks if you ever expect it to use 350W or less.

Want me to put it into context? The 2080Ti with its big old honker 754mm² die uses 250W. So that is 40% more, thereabouts, just at 1500MHz. More perspective needed? Hint: apparently it can easily boost to well over 2GHz.

My 2080Ti Kingpin can consume over 550W @ 2.4GHz or higher under LN2. The 12-pin socket they invented is capable of 600W.

Power-frugal these Samsung dies ain't. For that they will need to go back to TSMC.

That is why the cooler, the design, the 12-pin, etc.
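Sketching the arithmetic behind that comparison (the 350W and 250W figures are the ones quoted above):

```python
# Rumoured Ampere stock board power vs the 2080 Ti's reference 250 W TDP.
ampere_watts = 350  # rumoured stock board power
turing_watts = 250  # 2080 Ti reference TDP

increase = (ampere_watts - turing_watts) / turing_watts
print(f"Roughly {increase:.0%} more board power than a stock 2080 Ti")
# prints "Roughly 40% more board power than a stock 2080 Ti"
```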
 
Caporegime · Joined 17 Mar 2012 · Posts 47,559 · Location ARC-L1, Stanton System
Soldato · Joined 23 Apr 2010 · Posts 11,896 · Location West Sussex
Yeah, to me it looks like Ampere has not gone to plan architecture-wise, or the process from Samsung is not so good. I just can't see Nvidia at the planning stage wanting their GPUs to be over 300W. It's looking a little like Fermi, but it's going to release on time.

And everyone accepted Fermi once they put in artificial limiters and lowered the heat, because they were fast. No one cares about just one of the three being off: heat, power use, speed. If all three are crap? You have a turkey. If two of the above are OK? Then yeah, no one will care.

BTW, as for the comment about kettles and showers: you run your shower for what, 30 mins? The kettle for 5? You can game as long as you like ;)
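The shower comparison really comes down to energy (power × time), not peak power. A rough sketch, with illustrative durations assumed:

```python
# Energy = power (kW) x time (h). Peak wattage alone says little; how long
# the load runs matters. All durations here are illustrative assumptions.
shower_kwh = 11.0 * 0.5       # 11 kW electric shower for 30 minutes
kettle_kwh = 3.0 * (5 / 60)   # 3 kW kettle for 5 minutes
gaming_kwh = 0.35 * 5.0       # 350 W card flat out for a 5-hour session

print(f"Shower {shower_kwh:.2f} kWh, kettle {kettle_kwh:.2f} kWh, "
      f"gaming {gaming_kwh:.2f} kWh")
```

So a long gaming session at full board power can still use less energy than one 30-minute shower; the difference is that the GPU can run for as many hours as you care to game.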
 
Soldato · Joined 26 Aug 2004 · Posts 5,032 · Location South Wales
If that's true, it looks like the 3080 is a direct replacement for the 2080Ti with 19Gbps memory, probably about 15-20% faster.

The 3090 has 25% more shaders and higher total bandwidth from the same 19Gbps memory; expect that to be 40 to 50% faster.
The 3090 is definitely not worth the extra cost if that's the case, unless of course they are gimping VRAM at first.

Those running 4K who want the highest fps possible might stump up for it, though.
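A naive sanity check on that estimate, assuming performance scales at best linearly with shader count at equal clocks (real games rarely scale even that well):

```python
# Upper-bound estimate: perf ~ shaders x clock, ignoring memory limits.
# "25% more shaders" is the figure from the post above; the clock ratio
# is an assumption (unknown at the time of these rumours).
shader_ratio = 1.25  # 3090 vs 3080 shader count, per the post
clock_ratio = 1.00   # assume equal boost clocks

naive_uplift = shader_ratio * clock_ratio - 1
print(f"Linear-scaling ceiling from shaders alone: ~{naive_uplift:.0%}")
```

On that assumption the ceiling is about 25%, so a 40-50% gap would need higher clocks, more memory bandwidth, or both, on top of the extra shaders.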
 
Soldato · Joined 28 May 2007 · Posts 10,061
And everyone accepted Fermi once they put in artificial limiters and lowered the heat, because they were fast. No one cares about just one of the three being off: heat, power use, speed. If all three are crap? You have a turkey. If two of the above are OK? Then yeah, no one will care.

BTW, as for the comment about kettles and showers: you run your shower for what, 30 mins? The kettle for 5? You can game as long as you like ;)

Ohh, I get that. Fermi's main problem was that it was late, and Ampere is on time. As long as it can back up those power figures with top performance, it's not going to be a problem for Nvidia.
 
Soldato · Joined 23 Apr 2010 · Posts 11,896 · Location West Sussex
Ohh, I get that. Fermi's main problem was that it was late, and Ampere is on time. As long as it can back up those power figures with top performance, it's not going to be a problem for Nvidia.

Fermi was 11 months late, too hot, too loud and guzzled power. Its worst crime was that it was barely any faster than the relatively tiny 5870. Like I said, the equation only counts if all three are a match.

The refined Fermi was faster, better, quieter, etc. It still ate power, but no one cared.
 
Soldato · Joined 9 Nov 2009 · Posts 24,824 · Location Planet Earth
The specifications of the RTX3080 hint at yields being very poor. Despite being massively cut down, and having 40% of the VRAM, it only has a 30W lower board power. I suspect the RTX3090 has 24GB of VRAM to justify a much higher price. What is the likelihood that RTX3090 stocks will be much lower than the RTX3080?
 
Soldato · Joined 15 Feb 2011 · Posts 3,099
Got to wonder if the memory restrictions on the mid-range reference cards are there to throw a bone to the AIBs, giving them some way to upsell their 'special' versions, if the MSI trademarking of 25+ versions is true.
 
Caporegime · Joined 17 Mar 2012 · Posts 47,559 · Location ARC-L1, Stanton System
The specifications of the RTX3080 hint at yields being very poor. Despite being massively cut down, and having 40% of the VRAM, it only has a 30W lower board power. I suspect the RTX3090 has 24GB of VRAM to justify a much higher price. What is the likelihood that RTX3090 stocks will be much lower than the RTX3080?

They are the same GA102 die, TSMC 7nm; I think the 3070 will be on Samsung.
 