NVIDIA ‘Ampere’ 8nm Graphics Cards

Permabanned
Joined
12 Sep 2013
Posts
9,221
Location
Knowhere
Oh God... this again? Turing didn't 'sell poorly'; I made a post earlier in the thread showing how it didn't. Overall, Turing has made a significant profit for Nvidia.

Unless there is some other definition of 'selling poorly' that I'm not aware of? :confused:

Turing sold well, and because it did we now have a new level of pricing that Nvidia will stick with. Plenty of people said at the time that accepting the new price levels would give Nvidia permission to stick to this level of pricing permanently, but people didn't care and bought Turing in droves, excited to try out ray tracing even though there was a lack of software & games to test it with at the time. The fact that the Pascal 1080 Ti's replacement is priced above and beyond the Pascal Titans killed any interest in Turing for me. I'd already been paying over-the-top pricing for first-gen Vega thanks to the mining craze and I can't afford to keep doing that, so near the end of last year I grabbed a first-gen RDNA card & I'll probably stick with that until it can no longer run games the way I want it to. It's the same with my CPU: I'm using a 2700X, and the next upgrade from that will be one of the last-gen CPUs that support my X570 AM4 motherboard.

If Big Navi offers a decent performance jump without adding to the price for every single frame offered over their current cards, I might change my mind, but I've been priced out of the hardware race. AMD's Vega flagship wasn't priced that badly at £650 when you consider it had 16GB of HBM2 & close to RTX 2080 performance in a lot of titles. Wouldn't it be nice if AMD stuck to that level of pricing for Big Navi.
riddle
 
Caporegime
Joined
24 Sep 2008
Posts
38,322
Location
Essex innit!
Explain how Turing sold poorly then? They had to re-launch it with the Super lineup, which was more appropriately priced.
Pascal was good at mining and was available during the mining boom, which is why cards like the 1080 Ti sold so well. Prior to the Turing launch crypto mining crashed big time, so miners had no interest in buying GPUs any longer and the market was flooded with cheap second-hand Pascal cards.
 
Caporegime
Joined
8 Sep 2005
Posts
27,421
Location
Utopia
Pascal was good at mining and was available during the mining boom, which is why cards like the 1080 Ti sold so well. Prior to the Turing launch crypto mining crashed big time, so miners had no interest in buying GPUs any longer and the market was flooded with cheap second-hand Pascal cards.
Also correct: the mining slump massively impacted Nvidia GPU sales in 2019, and Jensen outright lied about its impact to shareholders and investors too. He is a typical corporate shark.
 
Associate
Joined
21 Apr 2007
Posts
2,484
'Blind' to what? Even if it initially sold in lower volumes than Nvidia would have liked or expected, it didn't by any definition 'sell poorly' when the overall lifecycle is taken into account, which is in the end all that matters at this point. It was still a successful series of GPUs for Nvidia, and no amount of cherry-picking of the facts can change that.

If you take the overall lifecycle into account, what you do is hide the true picture. It's a bit like saying "well, Nvidia's margin on Turing is 65%" when that figure (or thereabouts) doesn't tell you it's 120% for the 2080 Ti, which would be genuinely shocking to some people. The point here is that the market doesn't hold the price/perf of a 2070 & 2080 for very long; it gets saturated quickly by those willing, or unaware of what they are buying into. That's not cherry-picking facts or Harvard business school analytics, that's common sense.

Turing made Nvidia a lot of money, and they have a good idea of what their pricing model is; we can agree on that.
 
Associate
Joined
21 Apr 2007
Posts
2,484
I believe the Turing sales were below expectations initially until the Super cards were released.

I think the exception was the 2080Ti. A Super version wasn't released and the price wasn't reduced significantly. I feel Nvidia were happy with the lower sales and bigger margins on it.

I agree. The 2080 Ti was an outlier, with a very controlled rate of production and 'just enough' in the retail channels so it never had to drop in price (one assumes it will have to post-Ampere release, unless there is a supply shortage, but we are not there yet).
 
Caporegime
Joined
17 Jul 2010
Posts
25,705
I believe the Turing sales were below expectations initially until the Super cards were released.

I think the exception was the 2080Ti. A Super version wasn't released and the price wasn't reduced significantly. I feel Nvidia were happy with the lower sales and bigger margins on it.
Their mind was made up for them when people actually bought the damned things and confirmed they could charge that price and have it not affect sales. The next cards will certainly be bumped up by £100-200 across the board, maybe even £200-300 for the top card.
 
Caporegime
Joined
8 Sep 2005
Posts
27,421
Location
Utopia
If you take the overall lifecycle into account, what you do is hide the true picture. It's a bit like saying "well, Nvidia's margin on Turing is 65%" when that figure (or thereabouts) doesn't tell you it's 120% for the 2080 Ti, which would be genuinely shocking to some people. The point here is that the market doesn't hold the price/perf of a 2070 & 2080 for very long; it gets saturated quickly by those willing, or unaware of what they are buying into. That's not cherry-picking facts or Harvard business school analytics, that's common sense.

Turing made Nvidia a lot of money, and they have a good idea of what their pricing model is; we can agree on that.

I am personally not interested in going into an in-depth analysis of Nvidia's quarter-by-quarter sales of Turing. My main point, looking at the outcomes, is that if Turing made Nvidia a lot of money then Turing did not "sell poorly", so people need to stop throwing vague and inaccurate blanket terms like that around without being specific about what they actually mean. If you have to come in and attempt to clarify someone else's post with what you think it means (which in your case seems to involve giving them a lot of generous leeway), then something isn't right with that post.

Their mind was made up for them when people actually bought the damned things and confirmed they could charge that price and have it not affect sales.

This is basically the concise summary of why prices keep going up.
 
Caporegime
Joined
17 Feb 2006
Posts
29,263
Location
Cornwall
It remains to be seen if we'll continue buying in with the 3080 at £800 to £1k and the 3070 at £600+.

I personally don't hold much faith in people's ability to restrain themselves.

And again, I have no issue with people spending more than me. It's when my beloved mid-range cards end up being £500+ (and this becomes pretty much the entry-level hardware for newly released games) that I have an issue.
 
Permabanned
Joined
12 Sep 2013
Posts
9,221
Location
Knowhere
I forgot this until just now,

Nvidia's refusal to support adaptive sync & the timing of their turnaround on the matter suggest that Turing might not have been selling as well as they would've liked. Plenty of people had been speculating for years about what it would take for Nvidia to change their minds & start supporting adaptive sync, and a popular theory was low sales. Mid-Turing it happened, which may or may not support the theory. :D
 
Soldato
Joined
19 Nov 2015
Posts
4,867
Location
Glasgow Area
Oh God... this again? Turing didn't 'sell poorly'; I made a post earlier in the thread showing how it didn't. Overall, Turing has made a significant profit for Nvidia.

Unless there is some other definition of 'selling poorly' that I'm not aware of? :confused:
It made a profit, yes. But much less so than previous generations. That's my point.

Shop makes £150Bn profit from product X.
Next year the shop sells product Y and makes £10Bn.

Yes, they made a profit. Doesn't make it a success when they were hoping for £150Bn again.
 
Caporegime
Joined
8 Sep 2005
Posts
27,421
Location
Utopia
I forgot this until just now,

Nvidia's refusal to support adaptive sync & the timing of their turnaround on the matter suggest that Turing might not have been selling as well as they would've liked. Plenty of people had been speculating for years about what it would take for Nvidia to change their minds over their refusal to support adaptive sync, and mid-Turing it happened.
Nvidia are typical of any big market leader in that they want proprietary technology to keep people hooked on their products alone, or on products specifically approved by them. When they saw that the trends were turning against them and that sticking with the proprietary strategy would actually damage their business model, they changed it. Apple etc. do the same.

It made a profit, yes. But much less so than previous generations. That's my point.
Selling worse than expected or hoped for still doesn't mean that it "sold poorly".

Also, can you please provide some tangible sales figures for us to look at to show how "much less so" it sold than Pascal, of course taking into account the Bitcoin mining surge which we all know caused a huge boom in GPU sales at the time?
 
Soldato
Joined
12 May 2014
Posts
5,235
I forgot this until just now,

Nvidia's refusal to support adaptive sync & the timing of their turnaround on the matter suggest that Turing might not have been selling as well as they would've liked. Plenty of people had been speculating for years about what it would take for Nvidia to change their minds & start supporting adaptive sync, and a popular theory was low sales. Mid-Turing it happened, which may or may not support the theory. :D
Any idea how close it was to the launch of the 5700 XT?
 
Soldato
Joined
19 Nov 2015
Posts
4,867
Location
Glasgow Area
Nvidia are typical of any big market leader in that they want proprietary technology to keep people hooked on their products alone, or on products specifically approved by them. When they saw that the trends were turning against them and that sticking with the proprietary strategy would actually damage their business model, they changed it. Apple etc. do the same.


Selling worse than expected or hoped for still doesn't mean that it "sold poorly".

Also, can you please provide some tangible sales figures for us to look at to show how "much less so" it sold than Pascal, of course taking into account the Bitcoin mining surge which we all know caused a huge boom in GPU sales at the time?

One of many, MANY available sources through the magic of Google.

"Nvidia had hoped that Turing would bolster its GPU sales, pinning its hopes on gamers being enamored with the added performance and especially the feature upgrades, such as real-time ray tracing support and DLSS capabilities. For various reasons, though, Nvidia's RTX cards have not met Nvidia's sales expectations."

https://www.pcgamer.com/nvidia-ceo-...les-says-last-quarter-was-a-punch-in-the-gut/


I.E. Sold poorly.
 
Caporegime
Joined
8 Sep 2005
Posts
27,421
Location
Utopia
One of many, MANY available sources through the magic of Google.

"Nvidia had hoped that Turing would bolster its GPU sales, pinning its hopes on gamers being enamored with the added performance and especially the feature upgrades, such as real-time ray tracing support and DLSS capabilities. For various reasons, though, Nvidia's RTX cards have not met Nvidia's sales expectations."

https://www.pcgamer.com/nvidia-ceo-...les-says-last-quarter-was-a-punch-in-the-gut/


I.E. Sold poorly.
An article from January 2019, a mere two months after Turing launched and at the end of the crypto boom. By your logic, that means that "Turing sold poorly"... in general? Sigh...
 
Permabanned
Joined
12 Sep 2013
Posts
9,221
Location
Knowhere
Nvidia are typical of any big market leader in that they want proprietary technology to keep people hooked on only their products, or products specifically approved by them. When they saw that the trends were turning against them and that it would actually damage their business model to stick with the proprietary strategy then they changed it. Apple etc do the same.

As you're saying, that's the norm unfortunately. :(

Any idea how close it was to the launch of the 5700 XT?

The first Nvidia driver with adaptive sync support released on Jan 15th 2019; the 5700 XT released six months later on July 7th 2019.

Interestingly, the Radeon VII launched in Feb 2019, a month after Nvidia's adaptive sync move, so maybe that was in part a reflex to Nvidia adding adaptive sync support to their Turing, Pascal (& Maxwell) cards. I'm only kidding, but it did finally give AMD a 1080 Ti competitor at the original 1080 Ti's level of pricing. If AMD hadn't screwed up with the VII's heatsink being too light, it would have been a great card for the money.
 
Soldato
Joined
26 Sep 2017
Posts
6,189
Location
In the Masonic Temple
Mandatory viewing for those interested in the Sept 1st Nvidia Ampere event:

you may just learn something :)
He missed a crucial part: I think the 3090 will be 1400 ish, convincing people that you get 2-4x ray tracing and 50%+ raster for only 200 more. Then they will drop a 3090 Ti next year for 1899.

Makes a lot of sense that they want people to buy consoles and keep the people with cash buying into them. I always thought they were in cahoots with AMD anyway; they conveniently don't step on each other's toes.
 