
NVIDIA ‘Ampere’ 8nm Graphics Cards

Discussion in 'Graphics Cards' started by LoadsaMoney, 4 Oct 2019.

  1. Twinz

    Wise Guy

    Joined: 20 Aug 2019

    Posts: 1,447

    Location: SW Florida

    If you put the 2080Ti in the $700-$800 range where it should have been all along, the pricing and performance line stays steady.

    It's not that the 3080 is out of place. It's the 2080Ti (and the non-super Turing stuff). I think the 3090 is priced where it is to capture some of the people that bought into the 2080Ti's offering over the 1080Ti. If people are willing to pay a lot more money for a small performance bump, it makes sense to cash in again.

    Turing was a rip-off.
     
    Last edited: 27 Oct 2020
  2. danlightbulb

    Mobster

    Joined: 14 Jul 2005

    Posts: 4,768

    Both Turing price lines are very linear, and the Pascal price line is also very linear. The Ampere line isn't linear at all.

    To put the 2080 Ti in the $800 bracket would mean the rest of the range having to be lower as well, to maintain a profile similar to Pascal's. It may well have been a rip-off, but it was linear.

    The point I was making this time was about the non-linearity of the price/performance line in Ampere.
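    A quick way to test that linearity claim is to fit a straight line through each generation's (relative performance, launch price) points and look at the worst residual. The figures below are illustrative placeholders, not actual benchmark results or MSRPs:

```python
# Hypothetical (relative_performance, launch_price_usd) points per generation.
# All values are illustrative placeholders, not measured data.
stacks = {
    "Pascal": [(100, 379), (125, 449), (160, 599), (190, 699)],
    "Ampere": [(100, 499), (135, 699), (150, 1499)],
}

def linear_fit(points):
    """Least-squares fit price = a*perf + b; returns (a, b, worst residual)."""
    n = len(points)
    sx = sum(p for p, _ in points)
    sy = sum(c for _, c in points)
    sxx = sum(p * p for p, _ in points)
    sxy = sum(p * c for p, c in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    worst = max(abs(c - (a * p + b)) for p, c in points)
    return a, b, worst

for gen, pts in stacks.items():
    slope, intercept, worst = linear_fit(pts)
    print(f"{gen}: ${slope:.1f} per perf point, worst deviation ${worst:.0f}")
```

    On numbers shaped like these, a stack with a halo card priced far above the fitted line shows up immediately as a large worst-case deviation, while a linear stack's residuals stay small.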
     
  3. Twinz

    Wise Guy

    Joined: 20 Aug 2019

    Posts: 1,447

    Location: SW Florida

    I think there are two different observations happening here. You seem to be looking at each generation's "stack", whereas I'm looking at generational progress on a per-price-point basis over time.

    I use the generational-progress approach to gauge progress and time upgrades. I don't personally see any practical use for the "linearity" of one product stack vs another.
     
  4. Nablaoperator

    Wise Guy

    Joined: 19 Jun 2017

    Posts: 1,029

    Makes sense.. the 3080 wouldn't have been retailing at $699 if it wasn't for the perceived threat from AMD..

    Knowing how unpredictable Nvidia is with new product development, I believe this is a one-time (or at best two-time) offer.
     
  5. Twinz

    Wise Guy

    Joined: 20 Aug 2019

    Posts: 1,447

    Location: SW Florida


    Competition may be part of the reason, but Jensen spoke directly to Pascal owners in the 3080 launch video. Many of us have been on the sideline since *before* AMD was a threat.

    Turing appears to have been an experiment in what happens when Nvidia effectively scalps its own cards for way more money than they are actually worth. And it doesn't seem to have worked out as well as Nvidia had hoped. (I still think they threw the 3090 in there just to capture the tiny part of the market that is willing to pay any price for bragging rights.)

    One thing that chart shows, more than anything, is that Turing was the outlier. Nvidia may have wanted it to be the new normal, but for whatever reason, we are back to the progress that we have enjoyed for pretty much every generation other than Turing.
     
  6. sharpygsxr

    Mobster

    Joined: 11 Mar 2008

    Posts: 3,151

    Location: Norn Iron

    We might be getting a refresh of stock for the FE cards soon. My Distill tracker just notified me, and the official site is now showing "coming soon" rather than out of stock. F5s at the ready, boys.
     
  7. randal

    Capodecina

    Joined: 1 Oct 2006

    Posts: 12,820

    Lol, absolutely nothing to do with the Big Navi announcement tomorrow. Nope. Not at all.
     
  8. Harlequin

    Mobster

    Joined: 17 Jun 2004

    Posts: 3,641

    Location: Eastbourne , East Sussex.

    Hardware Unboxed - `it's well worth waiting` - at the $500 price point; seems to me AMD have something rather good coming this time.
     
  9. Nablaoperator

    Wise Guy

    Joined: 19 Jun 2017

    Posts: 1,029

    It wasn't.. it's pretty obvious from how they binned the dies..
    If this were any other year, the GA102 would have been reserved for the 3080 Ti/3090 (at the same obnoxious prices) and the GA104 would have been the 3080.
    Unless you want to completely disregard past product segmentation and believe what the leather jacket said...
     
  10. TNA

    Capodecina

    Joined: 13 Mar 2008

    Posts: 18,601

    Location: London

    Yeah, it did seem to suggest that, but who knows. I wonder if reviewers have the cards in hand already?
     
  11. Twinz

    Wise Guy

    Joined: 20 Aug 2019

    Posts: 1,447

    Location: SW Florida

    The dies are simply a means to an end. Nvidia is using the dies they have to hit performance targets. Unless you want to completely disregard past generational performance, the 3080 needed to perform at this level to get back "on track" with previous generational progress. (Turing was decidedly off track)

    They used the die that got the job done.
     
  12. danlightbulb

    Mobster

    Joined: 14 Jul 2005

    Posts: 4,768

    I think it can indicate where interventions to the stack have been made that wouldn't have normally been made. Costs are not linear (a 3070 will cost about the same to make as a 3090 give or take a bit, because there are considerable base costs common to all GPUs), and ideally manufacturers would like to artificially create a fairly linear pricing line between their models to segregate the range and not give performance away for free. This is common in all sorts of products. In this case we have what looks to be a considerable intervention to skew a particular card.
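    The "costs are not linear" point can be sketched with a toy margin calculation: if the per-unit build cost is roughly flat across the stack, linearly spaced prices imply sharply rising gross margin up the range. Every figure here is invented purely for illustration:

```python
# Toy illustration: near-constant build cost with rising prices means
# gross margin grows steeply up the stack. All figures are invented.
BASE_COST = 300  # assumed common per-unit cost in USD, not a real number

tiers = {"xx70": 499, "xx80": 699, "xx90": 1499}

for name, price in tiers.items():
    margin = price - BASE_COST
    print(f"{name}: price ${price}, gross margin ${margin} ({margin / price:.0%})")
```

    Under that (assumed) flat-cost model, the top tier carries most of the margin, which is why a manufacturer would normally protect the linear price ladder rather than skew one card downward.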
     
  13. Nablaoperator

    Wise Guy

    Joined: 19 Jun 2017

    Posts: 1,029

    All x04 dies have been xx80s since the Maxwell era. The dies and underlying targets are decided long before the SKU names (and thus signal strategy at inception).. this is clearly a one-off outlier, influenced purely by surprise competition after a lot of development work was done.. Nvidia will look like a fool slotting a 3080 Ti between the 3080 and 3090.
     
  14. Perfect_Chaos

    Mobster

    Joined: 26 Aug 2004

    Posts: 4,703

    Location: South Wales

    My G-Sync 1440p 165Hz screen is not FreeSync compatible, so if I decided to swap to an AMD card, I assume it might not be a good idea in that case?

    This reminds me of 2004: I had an order in for an AMD card, but there was no sign of stock and I ended up getting a 6800 Ultra instead.
     
  15. Twinz

    Wise Guy

    Joined: 20 Aug 2019

    Posts: 1,447

    Location: SW Florida

    Look like fools to whom? What percentage of graphics card customers do you think know, much less care, what die is on the PCB of the card they are buying?
     
  16. Nablaoperator

    Wise Guy

    Joined: 19 Jun 2017

    Posts: 1,029

    Lol, graphics card customers don't do product strategy (dies); that's the prerogative of the enterprise.

    I'm just saying it's not sustainable.. don't expect this to happen every generation, because this looks like an intervention in response to a competitive threat.
     
  17. Twinz

    Wise Guy

    Joined: 20 Aug 2019

    Posts: 1,447

    Location: SW Florida

    Or an inferior node. They may have planned on using TSMC. I mean, a 320W TDP from cards that are pushed to the max when they leave the factory probably wasn't the original plan either.
     
  18. bru

    Soldato

    Joined: 21 Oct 2002

    Posts: 7,347

    Location: kent

    Both GN and Jay stated that they didn't have cards at the time of filming. Whether others have cards yet is anyone's guess, but I would expect tomorrow to be only an announcement, with availability coming later.

    On the 3070 front, the card looks quite good at its price point: pretty much equal to the 2080 Ti performance-wise but considerably cheaper.
     
  19. mattferg

    Associate

    Joined: 9 Oct 2020

    Posts: 17

    Still says out of stock my end
     
  20. Nablaoperator

    Wise Guy

    Joined: 19 Jun 2017

    Posts: 1,029

    Possible.. someone with exposure to semis could share more insight..

    From my general read, fab commitments have to be made well in advance, and Nvidia has a naming scheme which gives away revisions (GA112 would be a revision of GA102, if I remember correctly).