
NVIDIA ‘Ampere’ 8nm Graphics Cards

Soldato
Joined
14 Jul 2003
Posts
14,492
If I didn't do the odd bit of video editing and work at home, then a console with M+K would be more suitable, right up until the point I needed an unusual peripheral like a HOTAS.

Consoles are closed platforms: as long as you don't need to step outside their approved hardware and software lists, you're golden. I would, though.
 
Soldato
Joined
19 Dec 2010
Posts
12,027
A PS5 currently has the same number of shader cores as a 2070 and fewer than a 5700 XT.

So yes, even with some RDNA 2 improvement, a PS5 will likely land at 2070/5700 XT levels of performance, which by next gen will probably be below even a 3060, with the 3070 likely at 2080 Ti levels.

Same story with the XSX, even though that has nearly 50% more shaders than the PS5.

No way. The PS5 and Xbox Series X are going to be faster than that. If the consoles only reach that level of performance, it means no uplift from the process change and no uplift from RDNA 2. AMD are claiming an up-to-50% improvement in performance per watt; even if they only deliver half that, it will still be around 2080 levels of performance.
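For reference, the shader counts being argued over can be turned into rough FP32 TFLOPS figures. This is a back-of-envelope sketch using publicly announced shader counts and clocks, not a benchmark; game performance does not scale linearly with TFLOPS:

```python
# Rough FP32 throughput: shaders * 2 ops/clock (FMA) * clock in GHz -> TFLOPS.
# Shader counts and clocks are from public spec announcements; treat the
# results as ballpark figures only, not game-performance predictions.
def tflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000

gpus = {
    "PS5 (36 CU)": tflops(2304, 2.23),    # ~10.3 TFLOPS
    "RX 5700 XT":  tflops(2560, 1.905),   # ~9.8 TFLOPS at boost clock
    "XSX (52 CU)": tflops(3328, 1.825),   # ~12.1 TFLOPS
}
for name, t in gpus.items():
    print(f"{name}: {t:.1f} TFLOPS")
```

The PS5's unusually high clock is how 36 CUs end up within spitting distance of the 5700 XT's 40.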
 
Soldato
Joined
5 Nov 2010
Posts
23,946
Location
Hertfordshire
If I didn't do the odd bit of video editing and work at home, then a console with M+K would be more suitable, right up until the point I needed an unusual peripheral like a HOTAS.

Consoles are closed platforms: as long as you don't need to step outside their approved hardware and software lists, you're golden. I would, though.

Indeed. The choice between console and PC is not all about graphics/performance.
They’re two very different environments.
 
Soldato
Joined
3 Aug 2010
Posts
3,037
No way. The PS5 and Xbox Series X are going to be faster than that. If the consoles only reach that level of performance, it means no uplift from the process change and no uplift from RDNA 2. AMD are claiming an up-to-50% improvement in performance per watt; even if they only deliver half that, it will still be around 2080 levels of performance.
He is quite spot on. The PS5 GPU had to be clocked very high to reach the TFLOPS of a 5700 XT. The Xbox Series X is about 10-15% more powerful, which brings it closer to a 2070 Super. This will be mid-range performance once RDNA 2 and Ampere GPUs are out, which has usually been the case when next-gen consoles arrive: they become outdated very quickly.
 
Associate
Joined
21 Oct 2013
Posts
2,061
Location
Ild
If I didn't do the odd bit of video editing and work at home, then a console with M+K would be more suitable, right up until the point I needed an unusual peripheral like a HOTAS.

Consoles are closed platforms: as long as you don't need to step outside their approved hardware and software lists, you're golden. I would, though.
Yeah a console is no good to me because I can't use it to work from home or edit videos :rolleyes:. I can't put frozen fish in the toaster either.
 
Soldato
Joined
19 Dec 2010
Posts
12,027
He is quite spot on. The PS5 GPU had to be clocked very high to reach the TFLOPS of a 5700 XT. The Xbox Series X is about 10-15% more powerful, which brings it closer to a 2070 Super. This will be mid-range performance once RDNA 2 and Ampere GPUs are out, which has usually been the case when next-gen consoles arrive: they become outdated very quickly.

You think the GPU in the Xbox Series X won't quite reach 2070 Super levels of performance? You know the GPU in the Xbox Series X is larger than the 5700 XT's, it's on an improved process, which gives roughly 10% more performance per watt, and it's on a new architecture, RDNA 2.

So even if there weren't any architectural improvements, you would still have a ~12% increase in die size and a 10% increase in performance per watt. That would put performance above the 2070 Super. When you add RDNA 2 into the mix, surely the performance level will be 2080 minimum; RDNA 2 will be a flop if console performance is any less.

I am not convinced that the PS5 will be much slower than the Xbox Series X in games either.
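The die-size and perf-per-watt compounding in the post above can be sketched in a few lines. The ~12% and 10% figures come from the post itself and are assumptions, not measurements; real gains don't multiply this cleanly because clocks and power budgets interact:

```python
# Back-of-envelope compounding of the claimed gains, relative to a
# 5700 XT-class baseline. Hypothetical figures from the post, not data.
baseline = 1.00          # 5700 XT-class performance
die_size_gain = 1.12     # ~12% larger GPU
process_gain = 1.10      # ~10% perf/watt from the improved node

estimate = baseline * die_size_gain * process_gain
print(f"Estimated uplift before any RDNA 2 changes: {estimate:.2f}x")
```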
 
Soldato
Joined
24 Aug 2013
Posts
4,549
Location
Lincolnshire
No way. The PS5 and Xbox Series X are going to be faster than that. If the consoles only reach that level of performance, it means no uplift from the process change and no uplift from RDNA 2. AMD are claiming an up-to-50% improvement in performance per watt; even if they only deliver half that, it will still be around 2080 levels of performance.

Performance per watt of what, exactly, is the question. That certainly won't be 50% more performance per shader, just more efficiency.

Sure, there will be some improvement, but it won't be anywhere near 50% in terms of performance increase. RDNA 2's big thing will be ray tracing.

Still, on a bad day the 2080 Ti should equal a 3070. If Ampere gets a big uplift going to 7nm, which it should, the Ti could even end up at 3060 level.
 
Last edited:
Soldato
Joined
20 Aug 2019
Posts
3,030
Location
SW Florida
This will be mid-range performance once RDNA 2 and Ampere GPUs are out, which has usually been the case when next-gen consoles arrive: they become outdated very quickly.

Meh. "Midrange", "high end": they all mean nothing without price points. If the new consoles... as complete systems... offer the same performance as single-component GPUs that cost more, those GPU sales will suffer.
 
Soldato
Joined
6 Feb 2019
Posts
17,565
Performance per watt of what, exactly, is the question. That certainly won't be 50% more performance per shader, just more efficiency.

Sure, there will be some improvement, but it won't be anywhere near 50% in terms of performance increase. RDNA 2's big thing will be ray tracing.

Still, on a bad day the 2080 Ti should equal a 3070. If Ampere gets a big uplift going to 7nm, which it should, the Ti could even end up at 3060 level.

It's important for people to realise that performance per watt is a bit of a bullshat measurement that on its own doesn't actually mean anything. Who started with it first? It was AMD, right, looking for a new way to market its products?

The main problem with the statement "50% more performance per watt" is that you don't know what wattage they are talking about. The curve is not flat and never will be: performance might be 50% improved at 20W but only 20% improved at 400W. If that's the case, is 20W enough for your GPU? No, you want 200W, so maybe it's not actually 50% improved. Who knows, because the marketing team at AMD won't tell you.
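The point about the curve can be illustrated with a toy model. The curve shapes below are invented purely for illustration, not real GPU data:

```python
import math

# Toy model: performance is a concave function of power, so a single
# "perf/watt" figure depends entirely on where on the curve you measure.
def perf_old(watts):
    return 10 * math.sqrt(watts)

def perf_new(watts):
    # Hypothetical new part: a large efficiency gain at low power that
    # shrinks as the power budget grows.
    return perf_old(watts) * (1 + 0.5 * math.exp(-watts / 100))

for w in (20, 150, 300):
    gain = perf_new(w) / perf_old(w) - 1
    print(f"{w:>3} W: perf/watt improvement {gain:+.0%}")
```

With these made-up curves, the same pair of architectures shows roughly +41% at 20W but only about +2% at 300W, which is exactly why a headline perf/watt number is meaningless without the wattage it was measured at.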
 
Soldato
Joined
9 Nov 2009
Posts
24,824
Location
Planet Earth
It's important for people to realise that performance per watt is a bit of a bullshat measurement that on its own doesn't actually mean anything. Who started with it first? It was AMD, right, looking for a new way to market its products?

It was actually Nvidia. They used it extensively in their Kepler and Maxwell slides.

Here is one from 2011 before Kepler was released:
[Image: Nvidia-Kepler-Will-Arrive-in-2012-Maxwell-in-2014-Says-Company-Slide-3.png]
 
Soldato
Joined
20 Apr 2004
Posts
4,365
Location
Oxford
It's important for people to realise that performance per watt is a bit of a bullshat measurement that on its own doesn't actually mean anything. Who started with it first? It was AMD, right, looking for a new way to market its products?

The main problem with the statement "50% more performance per watt" is that you don't know what wattage they are talking about. The curve is not flat and never will be: performance might be 50% improved at 20W but only 20% improved at 400W. If that's the case, is 20W enough for your GPU? No, you want 200W, so maybe it's not actually 50% improved. Who knows, because the marketing team at AMD won't tell you.

I don't see how.

PPW has real-world impact. If GPU Alpha gives 100% of the performance at 300W, but GPU Beta can only give you 80% of the performance at 300W, or needs 400W to reach the same performance level, which would you use? More power means more heat, a beefier VRM, a stronger PCB, and bigger cooling: all things that add to the cost even if the GPU itself is cheaper. There is a good reason we don't see Navi in laptops; the PPW is just too far behind.

And if GPU Alpha wants to fill the performance gap, it has plenty of power budget in the tank.
 
Soldato
Joined
14 Aug 2009
Posts
2,759
TFLOPS are completely meaningless for gaming.
A Vega 56 has more TFLOPS than my 5700 XT, and yet the 5700 XT is 40% faster in games.
The new Xbox GPU is around RTX 2080 level or more.

I think if you spend the time and actually code around the limitations of the architecture, you do get more performance, just as you get more if a developer puts in the time and codes for Mantle/DX12/Vulkan on GCN. That doesn't happen often on PC (and to be honest, considering AMD's market share, there's no need to bother much), but on a console it can circumvent the bottlenecks up to a point. :)
You can also see that in console titles where the devs put in that extra effort.
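The TFLOPS-vs-games point can be put in rough numbers. The ~40% game-performance gap is taken from the posts above as an assumption, not a benchmark; TFLOPS are from public spec sheets. Performance per TFLOP differs because architecture and software matter:

```python
# Per-card: (FP32 TFLOPS from spec sheets, relative game performance).
# The 1.40x game-performance figure is the claim from the thread, used
# here only for illustration.
cards = {
    "Vega 56":    (10.5, 1.00),
    "RX 5700 XT": (9.75, 1.40),
}
for name, (tf, rel_perf) in cards.items():
    print(f"{name}: {rel_perf / tf:.3f} relative game perf per TFLOP")
```

On these figures RDNA delivers roughly 50% more game performance per TFLOP than Vega, which is the whole point: TFLOPS compare throughput, not games.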
 