
NVIDIA ‘Ampere’ 8nm Graphics Cards

Soldato
Joined
6 Jan 2013
Posts
21,843
Location
Rollergirl
Meh, they're good for that, but you pay a penalty in performance as the boost clocks won't be as high. It seems Nvidia are moving away from blower cards, for their gaming range at least, as it's getting too difficult for a single fan to be acceptable acoustically while balancing how much performance you sacrifice for a quieter card. It's obvious that FE cards are almost always clock-speed limited, because the cooler is more limited than what board partners can come up with.

Personally speaking, I think the blower design outstayed its dubious welcome several years ago at least.

For my use case they are perfect, as the PC is not in the same room as me, so I can't comment on the noise; the cave is silent.

They are also ideal for water cooling as they use the reference PCB, though due to the above I don't bother with water any more.
 
Permabanned
Joined
8 Dec 2015
Posts
1,485
A 480 rad and an EK 2080 Ti block made no difference to the clocks attained on two different 2080 Tis. It was quieter, but that's it. I wouldn't recommend water cooling GPUs any more to anyone who doesn't already have a loop.

Your idea of "blows it out of the water" may be different to mine, but for me to say that, I would want an actual difference in attained OC clocks.
Well, I have 3x RX360 rads (the first edition XSPC ones, the best) and it makes a huge difference for me as long as the GPU's ASIC quality is good. Usually around a 10-15 fps increase.
 
Associate
Joined
10 May 2020
Posts
27
Location
London
If the latest rumour is to be believed, and Igor has very good sources, I don't see the upcoming cards being cheaper than Turing.

If you look at the first two quarters after Turing's launch, the 20 series definitely flopped. However, the Super refresh bringing gaming revenue back to pre-mining-boom levels, and the fact that the 2080 Ti hasn't dropped in price whatsoever, both seem to imply that consumers are OK with paying these prices as long as there is a tangible performance uplift.

I just don't see Nvidia charging Pascal prices for their 102 chips (also remember that the 1080 launched at £700 for a blower cooler). So from that leak my guess would be:

3080 - £799
3080ti - £999
3090/Titan - £3000
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,492
Location
Greater London
3090 £3000 lol. Too funny.

Why not make it £4999 at that point? People will still buy it so they can say they have the best. Might as well make them pay through the nose for it :p
 
Associate
Joined
21 Apr 2007
Posts
2,484
First point I agree on. AMD have to have better GPUs, not just the best bang-for-buck GPUs.

Second point: your whole post that I previously responded to was full of speculation and things that haven't made any difference to GPU market share before.

AMD will only win market share if their GPUs offer better performance than Nvidia's at the various price levels. And they have to have a solid launch.

Since we don't know what the performance of Nvidia's or AMD's GPUs is going to be, and we have no idea of the prices either, how can you possibly make any predictions about who will lose or gain market share?

I'll tell you how... the only way Nvidia maintains market share is if they give consumers the power of Ampere at previously unheard-of price points, and that is just not how they operate. Or AMD completely ballz it up, and OK, you have Vega and Fury to poke fun at, but it's not looking like that this time around; we don't know the details, but the power and capability is quite clearly there.

What Nvidia are most likely to do is maintain the current price points and pull an Apple: lower production costs, increase margin and maintain profit at the expense of the consumer, all wrapped in fluffy words. They will, however, lose market share this time around, I guarantee you that, because they absolutely will not give consumers any more silicon per $ than they absolutely have to, whereas AMD have more room to manoeuvre even if they ultimately fall short of the ultimate perf part, because that doesn't matter once you start hitting those high prices.
 
Man of Honour
Joined
13 Oct 2006
Posts
91,027
Or AMD completely ballz it up, and OK, you have Vega and Fury to poke fun at, but it's not looking like that this time around; we don't know the details, but the power and capability is quite clearly there.

I think this is the key point here... but I don't think it needs them to completely ballz it up. They need to hit it out of the park to shift the momentum away from nVidia, and I just don't think they will manage that, so things will likely lumber on forward, zombie-like, market-share wise.

A lot is also going to depend on games going forward and the feature levels each side brings to the table, IMO.
 
Soldato
Joined
6 Aug 2009
Posts
7,071
Going to be interesting to see what power all the new cards pull. Sounds like both sides will be pushing it. At least I've got plenty to spare with a 750W PSU. CPUs and M.2 drives don't pull much these days. Not counting the crazy high-core-count overclocked ones ;)
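For what it's worth, a back-of-the-envelope headroom check along those lines looks like this; every wattage figure below is an assumption for illustration (rumoured ballpark for a top Ampere card, a typical gaming CPU, and the rest of the system), not a confirmed spec.

```python
# Rough PSU headroom estimate; all figures are assumptions, not specs.
psu_watts = 750
gpu_watts = 350   # assumed ballpark for a top next-gen card
cpu_watts = 150   # typical gaming CPU under load
rest_watts = 100  # board, fans, drives (M.2 drives draw single-digit watts)

load = gpu_watts + cpu_watts + rest_watts
headroom = psu_watts - load

print(load)              # 600
print(headroom)          # 150
print(load / psu_watts)  # 0.8 -> 80% of rated capacity
```

Even with a power-hungry card, that leaves a comfortable margin on a 750W unit under these assumptions.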
 
Soldato
Joined
10 Oct 2012
Posts
4,420
Location
Denmark
I'll tell you how... the only way Nvidia maintains market share is if they give consumers the power of Ampere at previously unheard-of price points, and that is just not how they operate. Or AMD completely ballz it up, and OK, you have Vega and Fury to poke fun at, but it's not looking like that this time around; we don't know the details, but the power and capability is quite clearly there.

What Nvidia are most likely to do is maintain the current price points and pull an Apple: lower production costs, increase margin and maintain profit at the expense of the consumer, all wrapped in fluffy words. They will, however, lose market share this time around, I guarantee you that, because they absolutely will not give consumers any more silicon per $ than they absolutely have to, whereas AMD have more room to manoeuvre even if they ultimately fall short of the ultimate perf part, because that doesn't matter once you start hitting those high prices.

Now you've made my Vega 64 sad. You people are so mean; it's okay to be "special", you can still contribute to the world, you just need someone to have some faith in you.

Joking aside, considering what AMD has achieved with the 5700 series, I think they can supply a new set of GPUs that will satisfy everyone's needs, even those people who want to play 4K games at a reasonable level above 60 fps. I personally don't care if they steal the performance crown; it's a stupid metric to look at, IMHO. What matters to me personally is solid performance at a reasonable price. So the only way I see AMD botching anything is if they start thinking that they are Nvidia and start charging through the roof for anything other than the "halo" product. Time will tell; I'm certainly very excited to see what both companies launch and at what price point. The best performance-per-dollar card that can drive a 4K display properly will get my money.
 
Caporegime
Joined
18 Oct 2002
Posts
39,299
Location
Ireland
Now you've made my Vega 64 sad. You people are so mean; it's okay to be "special", you can still contribute to the world, you just need someone to have some faith in you.


Vega is an OK card that, as usual, gained more performance as time went on, though the expectations for it were blown out of all proportion, as is usually the case.
 
Soldato
Joined
23 Apr 2010
Posts
11,896
Location
West Sussex
What sources? It's just a rehash of the same Chiphell info from the weekend, with some educated guesses from the couple of extra bits of info that came with it.

Just seems like something to drive clicks and views.

It has been said, from legit sources, that the cooler alone costs $150 each to make. That's not $150 sold on to us with a markup; that is the cost to make each one.

When you make a cooler like that at that price you are so far up your own a**e that all reality is gone. Nvidia want to be Apple *so* bad. However, for someone to actually get away with that you need to offer something that your buyers will see as truly exclusive. Not just the fastest (which I have absolutely no doubt it will be) but also the most exquisitely built.

Whilst none of us like it, Nvidia sold a butt load of 20-series cards and made a huge profit. So when the rumours were going around that the 30 series would be much cheaper, I knew then that that particular rumour was total BS. When you are winning by such a huge lead you can totally take the pee; there is no one and nothing to give you a slap back to reality. So maybe the GPU cores are cheaper to produce? That's OK, they will make a $150 cooler.

Big Navi may well be as fast as a 2080 Ti. Big whoop (I am being sarcastic here, btw). The 2080 Ti was a cobbled-together card that replaced Ampere at the last minute. Why? Because Nvidia probably realised just how far ahead Ampere would be, so they instead made the 20 series by shrinking Pascal and bolting on some RT cores, Tensor cores, whatever you want to call them. I heard that Ampere just wasn't ready, so it was an executive decision to go with the 20 series. They were always going to win anyway.

I have also heard that actual game performance on the 3080 will be about 10-15% better than the 2080Ti. However, the RT aspect will be massively improved. That I can believe also, given that Turing was probably never even meant to exist.

So if you have a card that sits at the bottom of your high-end stack that beats Big Navi by the expected 15%, and then two more cards in your stack that utterly obliterate it, why would you sell it cheap? Sure, selling it cheap might demean AMD and make them look stupid (and cost them profit on their high-end cards), but it would also mean taking less for Nvidia, something no corporation will ever want to do. When you have proven that people will absolutely pay £1,300 for a GPU, why the heck would you stop?

There will be value where there is competition and nowhere else. The only reason Nvidia did us all the huge favour of releasing the slightly faster Super cards and dropping them to even remotely affordable levels was because of Navi. When there is no Navi they can charge whatever they want.

It also looks like they are dropping the Titan this time too and giving us the 3090 Ti. Why? Because why make a card that is so grossly overpriced relative to your desktop cards when you can just make it a desktop card and add it to the god-like performance stack? More people will buy it, because whilst they could excuse not buying a Titan over an 80 Ti, it's going to be hard for them to accept that there's a better card out there with a higher number that makes someone else's epeen look even bigger.
 
Associate
Joined
2 Nov 2006
Posts
1,636
Location
Sheffield
Was more like £620 at launch.

I don't think the Founders 1080s were £620. Those were the clown cards. The ones with decent coolers and warranties were more expensive for a couple of months.
Edit: Actually I think the clowns were under £600. I'm thinking £570, you're right.

Edit 2: Correcting for the $1.40/£ exchange rate back then, plus a bit of actual 'normal' inflation, that £620 is the wrong side of £700 today anyway.
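A rough sketch of that correction, with assumed figures (the $1.40/£ launch-era rate is from the post; the ~$1.25/£ current rate and ~1.5%/yr inflation over four years are illustrative assumptions, not thread facts):

```python
# Exchange-rate + inflation correction for the 1080's £620 launch price.
# Assumed: $1.40/£ at launch (2016), ~$1.25/£ now (2020), ~1.5%/yr inflation.
launch_price_gbp = 620.0
usd_price = launch_price_gbp * 1.40         # dollar price at launch
todays_gbp = usd_price / 1.25               # same dollar price at a weaker pound
inflation_adjusted = todays_gbp * (1.015 ** 4)  # four years of ~1.5% inflation

print(round(todays_gbp))           # 694
print(round(inflation_adjusted))   # 737
```

So under these assumptions the currency move alone pushes £620 to roughly £694, and inflation on top lands it the wrong side of £700, as the edit says.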
 
Soldato
Joined
12 May 2014
Posts
5,235
@ALXAndy
Just spitballing some ideas here. Let us assume that it is going to cost $150 for the cooler alone. Why would they design such a cooler unless they need it? They could have literally slapped on the current FE coolers, added a few visual tweaks and job done, keeping all the extra money for themselves. But they didn't. They designed a new cooler from the ground up.

What if they simply cannot use the current FE coolers on these new 3000 series cards?

Comparing the renders and images of the 3080 cooler to the RTX 2080 Ti cooler (link), it seems like there is more fin surface area on the 3080 cooler.

If you scroll down the chart here, you will see that the TDP for the A100 chip is 100W higher than the V100 it is replacing. Edit: that is with a ~20% reduction to boost speeds.

If the assumptions are correct, I reckon they've had to do this to keep temperatures in check.
 
Caporegime
Joined
18 Oct 2002
Posts
39,299
Location
Ireland
I don't think the Founders 1080s were £620. Those were the clown cards. The ones with decent coolers and warranties were more expensive for a couple of months.
Edit: Actually I think the clowns were under £600. I'm thinking £570, you're right.

Edit 2: Correcting for the $1.40/£ exchange rate back then, plus a bit of actual 'normal' inflation, that £620 is the wrong side of £700 today anyway.

The clown cards were in the £520 range, with the FE being £620 or thereabouts, going by some old articles at least.
 