NVIDIA ‘Ampere’ 8nm Graphics Cards

Soldato
Joined
20 Aug 2019
Posts
3,028
Location
SW Florida
JayzTwoCents ran a standard Time Spy test, enabling me to update my chart.

[Chart: 8uxMOu0.png - price vs. performance by generation]

The pricing is off with this gen I think. Turing, Pascal and Maxwell all had fairly linear pricing. However with Ampere the 3080 offers significantly more performance but not at a huge price differential, skewing it over to the right of the chart. Then, oddly, for a huge price jump up to the 3090 you get barely any performance gain. Obviously talking about RRP here as well.

Based on this, the 3080 is either too fast or too cheap.
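
If anyone wants to sanity-check that with their own numbers, here's a rough sketch of the kind of fit I'm eyeballing (the MSRPs are launch prices; the relative-performance figures are placeholders, not benchmark results):

```python
# Rough sketch: fit price vs. performance for one generation's stack and
# see how far each card sits from the line. MSRPs are launch prices; the
# relative-performance numbers are placeholders - swap in real Time Spy scores.
import numpy as np

cards = {            # card: (launch MSRP in $, performance, 3080 = 100)
    "3070": (499, 80),    # placeholder figure
    "3080": (699, 100),
    "3090": (1499, 112),  # placeholder: big price jump, small perf gain
}

prices = np.array([p for p, _ in cards.values()], dtype=float)
perf = np.array([s for _, s in cards.values()], dtype=float)

# Least-squares line through the stack; a large positive residual means a
# card delivers more performance than its price "should" buy.
slope, intercept = np.polyfit(prices, perf, 1)
for card, resid in zip(cards, perf - (slope * prices + intercept)):
    print(f"{card}: {resid:+.1f} points off the fitted line")
```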

If you put the 2080 Ti in the $700-$800 range where it should have been all along, the price/performance line stays steady.

It's not that the 3080 is out of place. It's the 2080 Ti (and the non-Super Turing stuff). I think the 3090 is priced where it is to capture some of the people who bought into the 2080 Ti's offering over the 1080 Ti. If people are willing to pay a lot more money for a small performance bump, it makes sense to cash in again.

Turing was a rip-off.
 
Soldato
Joined
14 Jul 2005
Posts
8,274
Location
Birmingham
Both Turing price lines are very linear; the Pascal price line is also very linear. The Ampere line isn't at all linear.

To put the 2080 Ti in the $800 bracket would mean the rest of the range having to be lower as well, to maintain a profile similar to Pascal. It may well have been a rip-off, but it was linear.

The point I was making this time was about the non-linearity of the price/performance line in Ampere.
 
Soldato
Joined
20 Aug 2019
Posts
3,028
Location
SW Florida
Both Turing price lines are very linear; the Pascal price line is also very linear. The Ampere line isn't at all linear.

To put the 2080 Ti in the $800 bracket would mean the rest of the range having to be lower as well, to maintain a profile similar to Pascal. It may well have been a rip-off, but it was linear.

The point I was making this time was about the non-linearity of the price/performance line in Ampere.

I think there are two different observations happening here. You seem to be looking at each generation's "stack", whereas I'm looking at generational progress on a per-price-point basis over time.

I use the generational-progress approach to gauge progress and time upgrades. I don't personally see any practical use for the "linearity" of one product stack vs another.
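
A rough sketch of what I mean, assuming one card per launch price bracket per generation (the performance numbers are placeholders, not measurements):

```python
# Sketch of the per-price-point view: at a fixed launch price, how much
# faster is each generation's card than the one it replaced? Performance
# numbers are placeholders only (1080 Ti = 100).
history = {  # price bracket -> (generation, card, performance), in order
    699: [("Pascal", "1080 Ti", 100),
          ("Turing", "2080", 108),    # placeholder: small Turing bump
          ("Ampere", "3080", 170)],   # placeholder: large Ampere bump
}

for price, entries in history.items():
    for (_, old_card, old_perf), (gen, new_card, new_perf) in zip(entries, entries[1:]):
        print(f"${price}: {old_card} -> {new_card} ({gen}): {new_perf / old_perf - 1:+.0%}")
```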
 
Associate
Joined
19 Jun 2017
Posts
1,029
Both Turing price lines are very linear; the Pascal price line is also very linear. The Ampere line isn't at all linear.

To put the 2080 Ti in the $800 bracket would mean the rest of the range having to be lower as well, to maintain a profile similar to Pascal. It may well have been a rip-off, but it was linear.

The point I was making this time was about the non-linearity of the price/performance line in Ampere.

Makes sense. The 3080 wouldn't have been retailing at $699 if it wasn't for the perceived threat from AMD.

Knowing how unpredictable Nvidia is with new product development, I believe this is a one-time (or at best two-time) offer.
 
Soldato
Joined
20 Aug 2019
Posts
3,028
Location
SW Florida
Makes sense. The 3080 wouldn't have been retailing at $699 if it wasn't for the perceived threat from AMD.

Knowing how unpredictable Nvidia is with new product development, I believe this is a one-time (or at best two-time) offer.


Competition may be part of the reason, but Jensen spoke directly to Pascal owners in the 3080 launch video. Many of us have been on the sideline since *before* AMD was a threat.

Turing appears to be an experiment in what happens when Nvidia effectively scalps their own cards for way more money than they are actually worth. And it doesn't seem to have worked out as well as Nvidia had hoped. (I still think they threw the 3090 in there just to capture the tiny part of the market that is willing to pay any price for bragging rights.)

One thing that chart shows, more than anything, is that Turing was the outlier. Nvidia may have wanted it to be the new normal, but for whatever reason, we are back to the progress that we have enjoyed for pretty much every generation other than Turing.
 
Associate
Joined
19 Jun 2017
Posts
1,029
Competition may be part of the reason, but Jensen spoke directly to Pascal owners in the 3080 launch video. Many of us have been on the sideline since *before* AMD was a threat.

It wasn't. It's pretty obvious from how they binned the dies. If this was any other year, the GA102 would have been reserved for the 3080 Ti/3090 (at the same obnoxious prices) and the GA104 would have been the 3080. Unless you want to completely disregard past product segmentation and believe what the leather jacket said...
 
Soldato
Joined
20 Aug 2019
Posts
3,028
Location
SW Florida
It wasn't. It's pretty obvious from how they binned the dies. If this was any other year, the GA102 would have been reserved for the 3080 Ti/3090 (at the same obnoxious prices) and the GA104 would have been the 3080. Unless you want to completely disregard past product segmentation and believe what the leather jacket said...

The dies are simply a means to an end. Nvidia is using the dies they have to hit performance targets. Unless you want to completely disregard past generational performance, the 3080 needed to perform at this level to get back "on track" with previous generational progress. (Turing was decidedly off track)

They used the die that got the job done.
 
Soldato
Joined
14 Jul 2005
Posts
8,274
Location
Birmingham
I don't personally see any practical use for the "linearity" of one product stack vs another.

I think it can indicate where interventions to the stack have been made that wouldn't have normally been made. Costs are not linear (a 3070 will cost about the same to make as a 3090 give or take a bit, because there are considerable base costs common to all GPUs), and ideally manufacturers would like to artificially create a fairly linear pricing line between their models to segregate the range and not give performance away for free. This is common in all sorts of products. In this case we have what looks to be a considerable intervention to skew a particular card.
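
To put a number on that kind of skew, a minimal sketch, assuming launch MSRPs and placeholder performance figures (3080 = 100): interpolate where the 3080 would sit if it were priced on the straight line between the 3070 and 3090.

```python
# Sketch: where would the 3080 be priced if it sat on the straight line
# between the 3070 and 3090 price/performance points? Launch MSRPs are
# real; the relative-performance figures are placeholders.
def price_on_line(perf, price_lo, perf_lo, price_hi, perf_hi):
    """Linearly interpolate price as a function of performance."""
    t = (perf - perf_lo) / (perf_hi - perf_lo)
    return price_lo + t * (price_hi - price_lo)

# 3070 endpoint: ($499, 80); 3090 endpoint: ($1499, 112); 3080 perf: 100
implied = price_on_line(100, 499, 80, 1499, 112)
print(f"'On the line' 3080 price: about ${implied:,.0f}, versus $699 actual")
```

On those placeholder numbers, the "linear" 3080 lands well above $1,000, which is the sort of skew being described.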
 
Associate
Joined
19 Jun 2017
Posts
1,029
The dies are simply a means to an end. Nvidia is using the dies they have to hit performance targets. Unless you want to completely disregard past generational performance, the 3080 needed to perform at this level to get back "on track" with previous generational progress. (Turing was decidedly off track)

They used the die that got the job done.

All x04 dies have been xx80s since the Maxwell era. The dies and underlying targets are decided long before the SKU names (and thus signal strategy at inception). This is clearly a one-off outlier influenced purely by surprise from competition after a lot of development work was done. Nvidia will look like a fool slotting a 3080 Ti between the 3080 and the 3090.
 
Soldato
Joined
26 Aug 2004
Posts
5,032
Location
South Wales
My G-Sync 1440p 165Hz screen is not FreeSync compatible, so if I decided to swap to an AMD card, I assume that might not be a good idea?

This reminds me of 2004: I had an order in for an AMD card, but there was no sign of stock and I ended up getting a 6800 Ultra instead.
 
Soldato
Joined
20 Aug 2019
Posts
3,028
Location
SW Florida
All x04 dies have been xx80s since the Maxwell era. The dies and underlying targets are decided long before the SKU names (and thus signal strategy at inception). This is clearly a one-off outlier influenced purely by surprise from competition after a lot of development work was done. Nvidia will look like a fool slotting a 3080 Ti between the 3080 and the 3090.

Look like fools to whom? What percentage of graphics card customers do you think know, much less care, what die is on the PCB of the card they are buying?
 
Associate
Joined
19 Jun 2017
Posts
1,029
Look like fools to whom? What percentage of graphics card customers do you think know, much less care, what die is on the PCB of the card they are buying?

Lol, graphics card customers don't do product strategy (dies); that's the prerogative of the enterprise.

I'm just saying it's not sustainable. Don't expect this to happen every generation, because this looks like an intervention in response to a competitive threat.
 
Soldato
Joined
20 Aug 2019
Posts
3,028
Location
SW Florida
Lol, graphics card customers don't do product strategy (dies); that's the prerogative of the enterprise.

I'm just saying it's not sustainable. Don't expect this to happen every generation, because this looks like an intervention in response to a competitive threat.

Or an inferior node. They may have planned on using TSMC. I mean, a 320W TDP from cards that are pushed to the max when they leave the factory probably wasn't the original plan either.
 

bru

Soldato
Joined
21 Oct 2002
Posts
7,360
Location
kent
Yeah, it did seem to suggest that, but who knows. I wonder if reviewers have the cards in hand already?

Both GN and Jay stated that they didn't have cards at the time of filming. Whether others have cards yet is anyone's guess, but I would expect tomorrow to be only an announcement, with availability coming later.

On the 3070 front, the card looks quite good at the price point: pretty much equal to the 2080 Ti performance-wise, but considerably cheaper.
 
Associate
Joined
19 Jun 2017
Posts
1,029
Or an inferior node. They may have planned on using TSMC. I mean, a 320W TDP from cards that are pushed to the max when they leave the factory probably wasn't the original plan either.

Possible. Maybe someone with exposure to semis can share more insights.

From my general read, fab commitments have to be made well in advance, and Nvidia has a naming scheme which gives away revisions (GA112 would be a revision of GA102, if I remember correctly).
 