
NVIDIA ‘Ampere’ 8nm Graphics Cards

Associate
Joined
14 Apr 2014
Posts
598
I think this coming generation will be the last of a painful run off the back of Turing, the worst generation in the history of accelerated graphics. My rationale is simple: once we're over the 4K hump with decent framerates, you can add all the bells and whistles you like, but demand drops right off for high-end, overpriced products. Or to put it another way, you'll be able to comfortably sit on a product for 4yrs+ and I think you'll see more people do that. I personally look to upgrade when I see double the performance on offer at a price I'm willing to spend. I would have bought a 2080Ti at £700 2yrs ago, but that was not to be; it is in the realm of the possible later this year though.

You have to realise 4K is still extremely far off from being the standard, and demand will still be high for high-end products, because by the time 4K is the standard, the people who are currently buying 2080Tis and 3080Tis will want to be gaming at 8-16K.
 
Caporegime
Joined
17 Mar 2012
Posts
47,640
Location
ARC-L1, Stanton System
The rumours have been pretty consistent on 400W when OC'd. For reference, my 2080Ti OC'd can pretty much pull that too (caps around 396W).

Effectively, you'll have similar power draw as Turing for more performance.

400 Watts? Are you sure? What are you doing to it to get it that high?
As for Ampere, yes, IMO prepare for higher power than you're used to in recent years. When you need to reach a level of performance to compete and you can't get there with IPC or core count, you have to push clocks instead.
Ask AMD about that, or Intel these days.
 
Soldato
Joined
26 Sep 2013
Posts
10,711
Location
West End, Southampton
Upgraded the 6950X to a 10980XE in my 24/7 PC the other day.

What was I doing in the middle of this heatwave, getting a stable overclock on the power hungry beast running Cinebench 20 lol.

Lol Kaap, I've not even turned my PC on in a week, far too hot, roll on Autumn. A 400W Ampere should keep many houses nice and toasty come winter time though :p
 
Associate
Joined
13 Jan 2018
Posts
1,223
or to put it another way you'll be able to comfortably sit on a product for 4yrs+ and I think you'll see more people do that.
This is my current plan: 1080Ti > 3080Ti, then sit the next generation out, provided the price of the new Ti doesn't go up again by 25%. Currently running 3440x1440, so this should be fine.
For me, I would want at least a 50% perf improvement to upgrade.
 
Associate
Joined
28 Sep 2018
Posts
2,267
400 Watts? Are you sure? What are you doing to it to get it that high?
As for Ampere, yes, IMO prepare for higher power than you're used to in recent years. When you need to reach a level of performance to compete and you can't get there with IPC or core count, you have to push clocks instead.
Ask AMD about that, or Intel these days.

OC BIOS with voltage, TDP, core and mem tuned.

Driving a higher transistor density will also boost TDP if the next node isn't scaling power efficiency to match, not just clocks. Example: if your transistor budget goes up 50% but your efficiency gain is only 30% on the new node, you'll need more wattage to make up the difference.
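The arithmetic above can be sketched in a few lines. This is a toy back-of-the-envelope model, not real chip data; the baseline 250W figure and the scaling factors are purely illustrative.

```python
# Toy model: if transistor count grows faster than per-transistor
# efficiency improves, power draw must rise even at the same clocks.
# All numbers below are illustrative, not real GPU specs.

def estimated_tdp(base_tdp_w, transistor_scale, efficiency_gain):
    """Scale a baseline TDP by transistor growth, then discount it
    by the new node's power-efficiency improvement."""
    return base_tdp_w * transistor_scale / (1 + efficiency_gain)

# 50% more transistors, but only a 30% efficiency gain on the new node:
print(round(estimated_tdp(250, 1.5, 0.30)))  # ~288 W, up from 250 W
```

With a 50% transistor increase but only 30% better efficiency, a hypothetical 250W part climbs to roughly 288W; only if the efficiency gain matches the transistor growth does the TDP stay flat.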
 
Caporegime
Joined
17 Mar 2012
Posts
47,640
Location
ARC-L1, Stanton System
OC BIOS with voltage, TDP, core and mem tuned.

Driving a higher transistor density will also boost TDP if the next node isn't scaling power efficiency to match, not just clocks. Example: if your transistor budget goes up 50% but your efficiency gain is only 30% on the new node, you'll need more wattage to make up the difference.
Yeah.
Samsung 8nm is nothing like as good as TSMC 7nm, let alone 7nm P.
Get ready to see what that looks like, though Nvidia's architecture is really efficient and that will help them somewhat. AMD's RDNA architecture is pretty good too.
 
Associate
Joined
21 Apr 2007
Posts
2,487
You have to realise 4K is is extremely far off from being the standard and demand will still be high for high end products because by the time 4K is the standard people who are currently buying 2080Ti's and 3080Ti's will want to be gaming at 8-16k
I honestly don't think that will happen. Sure, you'll always get a % that will, but IMHO that % falls off dramatically compared to 4K because you're chasing ever smaller gains.

4K quite clearly is a standard (I've had 4K for over 4yrs). I accept it's not the most common standard across the board, but you have to remember a lot of Steam surveys etc. include laptops, and what use is 4K on a laptop?
 
Associate
Joined
21 Apr 2007
Posts
2,487
Yeah.
Samsung 8nm is nothing like as good as TSMC 7nm, let alone 7nm P.
Get ready to see what that looks like, though Nvidia's architecture is really efficient and that will help them somewhat. AMD's RDNA architecture is pretty good too.

Samsung 8nm EUV is supposedly much cheaper (but with lower yields), so you have to wonder how much of that saving the consumer will see vs. Nvidia pocketing the cash themselves.
 
Associate
Joined
14 Apr 2014
Posts
598
I honestly don't think that will happen. Sure, you'll always get a % that will, but IMHO that % falls off dramatically compared to 4K because you're chasing ever smaller gains.

4K quite clearly is a standard (I've had 4K for over 4yrs). I accept it's not the most common standard across the board, but you have to remember a lot of Steam surveys etc. include laptops, and what use is 4K on a laptop?

The vast majority of gamers still play at 1080p.
 
Associate
Joined
31 Jul 2019
Posts
515
You have to realise 4K is is extremely far off from being the standard and demand will still be high for high end products because by the time 4K is the standard people who are currently buying 2080Ti's and 3080Ti's will want to be gaming at 8-16k

4K at 60Hz is easily mainstream, albeit agreed it's not the de facto standard. However, 1440p at 144Hz is incredibly common, and they're not fully delivering on that.

What's interesting is that you don't see many people recommending 4K at the moment. Even taking refresh rate into account, a lot are saying that the extra resolution just isn't worth it. So whilst I'm sure some people will want an 8K screen, I can only see that being a niche until people start using bigger screens. 4K will increasingly become the standard once we get faster screens and ultrawides.

Factoring the consoles into the equation, PC has to deliver over-and-above to justify the cost surely? I mean if you're going to have a console consistently delivering 4K and promising raytracing at the same time, the only way PC can challenge is refresh rate I would think. I'm assuming consoles are planning on 30FPS or 60FPS max given current TV tech?

Long way of saying, I think NV and AMD need to properly deliver this time around.
 