
AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Discussion in 'Graphics Cards' started by Gregster, 8 Nov 2019.

Thread Status:
Not open for further replies.
  1. humbug

    Caporegime

    Joined: 17 Mar 2012

    Posts: 37,432

    Nvidia are making larger GPUs to accommodate DLSS. It takes up transistors on the die, seemingly so they can say "look kids, you can run this game at 1600p and with DLSS make it look almost like 2160p, with almost 1600p performance".

    Why not use that die space to add more shaders and bring the performance up that way?

    And AMD do the same thing without dedicating transistors to it. Even if it 'only' looks 90% as good as DLSS 2, it's a better solution, no?
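
    As a rough sketch of the pixel arithmetic behind that upscaling claim (the resolutions are the ones quoted above; treating shading cost as proportional to pixel count is a simplification for illustration):

    ```python
    # Rough pixel arithmetic behind upscaling: rendering at 1600p and
    # upscaling to 2160p shades roughly half as many pixels per frame.
    # (Illustrative only: frame time is not strictly proportional to
    # pixel count, since geometry and CPU costs don't scale with it.)

    output_2160p = 3840 * 2160    # 8,294,400 pixels at native 4K output
    internal_1600p = 2560 * 1600  # 4,096,000 pixels at the internal render res

    saving = 1 - internal_1600p / output_2160p
    print(f"2160p pixels: {output_2160p:,}")
    print(f"1600p pixels: {internal_1600p:,}")
    print(f"Shading work saved by rendering internally at 1600p: ~{saving:.0%}")
    ```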
     
  2. TNA

    Capodecina

    Joined: 13 Mar 2008

    Posts: 18,510

    Location: London

    There's more to it than that, mate. That's too simple a way of looking at it.
     
  3. Jimmy Weirdarms

    Hitman

    Joined: 14 Aug 2017

    Posts: 986

    Nvidia include tensor cores on their chips, yes, but they are more general purpose than just DLSS.

    Because there are other uses for them? How much die space do they actually use, anyway? Are you sure the amount of die space the tensor cores take would actually make much difference if it were used for shaders instead? DLSS makes much more than a 12% difference to frame rates, yet Turing cards have only about 12% as many tensor cores as shader processors. It seems like they give a disproportionately good return on investment here.
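
    To put rough numbers on that claim (a back-of-envelope sketch; the core counts are the full Turing TU102 figures, and the assumption that performance and die area scale one-to-one with core count is illustrative, not real):

    ```python
    # Back-of-envelope version of the tensor-core trade-off argument.
    # Core counts are for the full Turing TU102 die; assuming performance
    # and die area scale linearly with core count is a deliberate
    # oversimplification for illustration.

    shader_cores = 4608   # CUDA cores on TU102
    tensor_cores = 576    # tensor cores on TU102

    ratio = tensor_cores / shader_cores
    print(f"Tensor cores vs shader cores: {ratio:.1%}")  # ~12.5%

    # If that budget went to shaders instead, the optimistic uplift is ~12.5%.
    # DLSS uplifts reported in reviews are often far larger, for example:
    assumed_dlss_gain = 0.40  # assumed 40% for illustration; varies per game
    print(f"Hypothetical extra-shader uplift: ~{ratio:.0%}")
    print(f"Assumed DLSS uplift:              ~{assumed_dlss_gain:.0%}")
    ```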

    No, that would make it a worse solution, by definition, as it's only 90% as good ... ?
     
  4. humbug

    Caporegime

    Joined: 17 Mar 2012

    Posts: 37,432

    That's fine, difference of opinion. :)

    Where's that guy with the two images? He had to point out that the power lines in the distance were less jaggy. OK, that was a thing, but other than that the two images in his comparison looked identical.

    I'll take that for £400 vs £500 for the Nvidia performance equivalent, and run the feature globally without having to worry about whether or not the game supports it. It just does... "it just works" :D
     
  5. Jimmy Weirdarms

    Hitman

    Joined: 14 Aug 2017

    Posts: 986

    And that's definitely a plus on the AMD side, I'm not trying to be partisan here!
     
  6. humbug

    Caporegime

    Joined: 17 Mar 2012

    Posts: 37,432

    I know :)
     
  7. EastCoastHandle

    Mobster

    Joined: 8 Jun 2018

    Posts: 2,565

    Stop being an apologist. You understand the implications being made. And no, I would not pay a premium for "new" hardware that is hamstrung, with DLSS as a crutch, when developers on average don't show any interest in it for upcoming AAA titles, and more often than not DLSS 3.0 will be used on old games.
     
    Last edited: 10 Aug 2020
  8. LePhuronn

    Soldato

    Joined: 26 Sep 2010

    Posts: 6,078

    Location: Stoke-on-Trent

    If 100% of developers implemented DLSS in 100% of their games then it would be a different story. But they're not, so now I am facing paying a premium for hardware that is underpowered because of a "feature" that I will, in very practical terms, never benefit from. Nvidia are literally sacrificing rendering performance for a gimmick that doubles up their profit margin. Tensor cores already pay for themselves in the datacenter cards, and now Nvidia are getting a second slice of the pie by implementing them on consumer gaming cards at the expense of shaders and charging a premium for it.

    Rendering an image at a reduced resolution and faking the rest is not "optimisation". And the same can be said for AMD's Radeon Image Sharpening and FidelityFX: faking an image instead of having the pure grunt to render native. At least with AMD's approaches, though, there is no premium to pay for dedicated hardware to power their cheating (although we'll see how much RDNA 2 cards cost).

    Not on a consumer gaming card there isn't. Real-time denoising for RT isn't done on the tensor cores right now, and the tensor-based memory compression touted for Ampere might not trickle down past the Titan and 3080 Ti, if it arrives at all.

    I'll give Nvidia all the applause for lighting the blue touch paper that is real-time ray tracing, but their implementation at the gaming level is nothing more than a vessel to fleece the market.
     
  9. Jimmy Weirdarms

    Hitman

    Joined: 14 Aug 2017

    Posts: 986

    There seem to be more over time, though.

    How are they sacrificing rendering performance? How much rendering performance do you think they are sacrificing, given there are under 15% as many tensor cores in a Turing chip as shader processors? And the frame-rate gain can be far more than that.

    If the same image comes out of the other end, faster, then who gives a ****?
    (and that's pretty much the definition of optimisation)

    As AI/ML software becomes more commonplace, it seems like there will be more use for these. I already use mine for playing around with GPT-2 and other such stuff. If your argument there is that it's not purely a gaming feature, then I'd ask how you feel about other non-gaming functionality like hardware video encoding etc.
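
    (For the curious, a minimal sketch of that kind of GPT-2 tinkering, assuming the Hugging Face transformers package and PyTorch with a CUDA-capable GPU installed:)

    ```python
    # Minimal GPT-2 text generation on the GPU via Hugging Face transformers.
    # pip install transformers torch
    from transformers import pipeline

    # device=0 targets the first CUDA GPU; the tensor cores accelerate the
    # underlying matrix multiplies when half precision is used.
    generator = pipeline("text-generation", model="gpt2", device=0)

    out = generator("Real-time ray tracing is", max_length=40)
    print(out[0]["generated_text"])
    ```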
     
  10. Jimmy Weirdarms

    Hitman

    Joined: 14 Aug 2017

    Posts: 986

    I genuinely don't understand why you would care, unless you're standing firmly in the red camp shouting "Buh buh buh, that's CHEATING!"

    Personally I try not to be in either camp, and I hope they both come out with some spectacular stuff this year.
     
  11. EastCoastHandle

    Mobster

    Joined: 8 Jun 2018

    Posts: 2,565

    Nvidia holds no influence in the gaming market. Developers don't care for it. Its only presence is in older games. DLSS will fail, just like the others. Besides, that's not an answer to my post. We are talking about next-gen hardware that will likely cost well above £800 and that requires DLSS to be relevant.
    And you are the one defending Nvidia in an AMD thread. That makes you more firmly in the green camp. :D

    And the sad part is that these are the same hackneyed, stilted viewpoints that were said of Turing. I'm not sure if it's a spell or if some folks just have very poor memory. Regardless, it is truly a spectacle to witness.
     
    Last edited: 10 Aug 2020
  12. bru

    Soldato

    Joined: 21 Oct 2002

    Posts: 7,328

    Location: kent


    :eek: ...not sure if serious.
     
  13. humbug

    Caporegime

    Joined: 17 Mar 2012

    Posts: 37,432

    I wouldn't go so far as to say "no influence", but Nvidia only operate in the home PC space. Consoles are a far larger market, where AMD are completely dominant, and that's what game developers care about. If game consoles don't run DLSS or Nvidia's bastardised version of ray tracing, they aren't interested. Why bother? Pay me and then we'll talk.
     
  14. BigBANGtheory

    Wise Guy

    Joined: 21 Apr 2007

    Posts: 1,787

    It matters a lot whilst there are games that can't run or don't support DLSS, and right now that happens to be pretty much every single game bar a handful of exceptions. I'm not against DLSS; it's just that its practical use seems to be a mile away from the hyperbole.
     
  15. Jimmy Weirdarms

    Hitman

    Joined: 14 Aug 2017

    Posts: 986

    Dominant PC graphics card manufacturer, plus does the Nintendo Switch.
    Totally, no influence!

    I'll have some of what you're smoking :)


    When AMD are competitive at the high end, with or without DLSS, let's have this conversation again. Right now, talking about Nvidia hardware as being "hamstrung" or "relying" on DLSS is rather hilarious.
     
  16. bru

    Soldato

    Joined: 21 Oct 2002

    Posts: 7,328

    Location: kent

    I assume you mean Microsoft's DXR. :(
     
  17. humbug

    Caporegime

    Joined: 17 Mar 2012

    Posts: 37,432

    Yes, but at the levels Nvidia run it you need their RT cores, and that's not going to happen with consoles. They are going to do it much more efficiently, or use whatever acceleration tech AMD may or may not have.

    Nvidia try to be the big influence in their respective space, but they are not as important as they think they are. Nvidia tried to play Samsung off against TSMC, who told them to take a running jump and gave almost all of the spare 7nm capacity that HiSilicon left behind to AMD. AMD are a far larger wafer consumer than Nvidia, so they get priority.
     
  18. EastCoastHandle

    Mobster

    Joined: 8 Jun 2018

    Posts: 2,565

    Then do us all a favor: buy a Switch and benchmark it, LOL. The Switch didn't become popular because it had 'the Nvidia' in it; Nintendo and the current market environment had everything to do with that. So again, Nvidia has no gaming influence.

    And let's be clear on the subject, shall we? Nvidia can entice a developer on a particular game, but that does not equate to the gaming market. CP2077 comes to mind.

    Furthermore, I do not rate Nvidia's place in the gaming market based on performance. GPU performance and the gaming market are not equivalent; that is a false equivalence. One has nothing to do with the other.

    How poorly or how well Nvidia does on performance metrics has no influence on the gaming market. That has been proven generation after generation after generation.

    Therefore, waiting to see Nvidia's performance has no bearing on what happens in the gaming market. :D
     
  19. LePhuronn

    Soldato

    Joined: 26 Sep 2010

    Posts: 6,078

    Location: Stoke-on-Trent

    Sounds like you've had enough already...
     
  20. kazuya1337

    Hitman

    Joined: 14 Apr 2014

    Posts: 598

    Well, looking at the two threads, AMD is certainly winning the race for the most toxic fanboys.
     