
AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Discussion in 'Graphics Cards' started by Gregster, 8 Nov 2019.

Thread Status:
Not open for further replies.
  1. humbug

    Caporegime

    Joined: 17 Mar 2012

    Posts: 37,431

    Yes, I know, that's always been the way. I had driver problems on and off during the 2 years I owned the GTX 1070, as did others; people have already forgotten that. The R9 290 was rock solid, as was the GTX 970. I have had issues with the 5700XT drivers but they are pretty solid now.

    Nvidia do make great GPUs, I have had 2 of them in recent years and they are good. The R9 290 was a really good GPU too, the 5700XT is turning out to be a little beast, and the drivers have some really good and useful features. The only criticism I have is the image quality when using the Radeon software for recording; it's not good. But weirdly, if I use a third-party app like OBS or Mirillis Action with the AMD encoder the image quality is excellent, so it's the software, and I wish AMD would fix that.
     
  2. doody

    Gangster

    Joined: 19 Dec 2012

    Posts: 280

    Wow that is close :p:o
     
  3. doody

    Gangster

    Joined: 19 Dec 2012

    Posts: 280

    I paid £400 for my 5700XT, so I saved about £300 for 10 frames less at most :rolleyes::o
     
  4. humbug

    Caporegime

    Joined: 17 Mar 2012

    Posts: 37,431

    It is. Take out the Unreal Engine outlier, which is basically built for Nvidia, and the difference between the RTX 2080 Super and the 5700XT is 16% overall.

    I paid £330 for mine :p yes, that's brand new...
     
  5. shankly1985

    Capodecina

    Joined: 25 Nov 2011

    Posts: 19,730

    Location: The KOP

    All I read now is Navi black screens, poor drivers, etc., but people are quick to forget that the RTX series released with some very bad GPUs dying with the 'space invaders' artifacts. It's all forgotten about lol

    Enjoy the 5700XT. I can't comment on the recording side of things, as it's working as I would expect coming from a Vega 64.
     
  6. humbug

    Caporegime

    Joined: 17 Mar 2012

    Posts: 37,431

    Actually......

    Call it 18%, Unreal Engine is a thing. No one pays $750 vs $400 for 18% higher frame rates when they buy CPUs, so why do people do it when they buy GPUs?
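
    A quick price-per-performance sketch of the comparison being made here (Python). The $750/$400 prices and the ~18% figure come from the posts above; the 100 fps baseline is just an assumed example, only the ratios matter:

    ```python
    # Price/performance from the figures quoted in the thread.
    # The 100 fps baseline is an arbitrary example value.
    baseline_fps = 100

    cards = {
        "5700 XT":    {"price": 400, "fps": baseline_fps},          # ~$400 per the thread
        "2080 Super": {"price": 750, "fps": baseline_fps * 1.18},   # ~18% faster per the post
    }

    for name, card in cards.items():
        print(f"{name}: {card['fps'] / card['price']:.3f} fps per $")

    # 5700 XT:    0.250 fps per $
    # 2080 Super: 0.157 fps per $  -> ~88% more money for ~18% more frames
    ```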
     
  7. EastCoastHandle

    Mobster

    Joined: 8 Jun 2018

    Posts: 2,565

    I think the question we need to ask ourselves is who has the real performance this time around?

    AMD
    Where we saw a big performance uptick in recent drivers that has made the 5700XT a 2070 Super competitor now, and that's using a smaller die and a lower CU count of 40.

    Nvidia
    Where we already assumed that the 3080 Ti would be fast, but for some reason it still needs to downscale the resolution for a big performance boost, using DLSS as a crutch, in order to beat RDNA 2.

    Who really has the fastest uarch? Time will tell. But I'll tell you one thing: if you have to inject into or recode games to get an advantage, something is wrong with the approach to performance.
     
  8. BigBANGtheory

    Wise Guy

    Joined: 21 Apr 2007

    Posts: 1,787

    They are constrained by the amount of production they can do though. TSMC has X number of machines that churn out wafers per process node and fab plant. I don't know the details, but for the sake of argument, if AMD can order 100k wafers that yield a certain number of chips, it stands to reason they will order the ones that deliver certainty of $$$, because shareholders trump consumers unfortunately. If AMD can order more than they think they can sell, that's good, because it encourages them to diversify and take some risks. But if they are in a 'we can't order as much as we want' position, then I guess you'll have winners and losers, particularly as they will have APU contracts for MS & Sony.
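
    As a rough illustration of the wafer arithmetic being gestured at here (Python). The 300 mm wafer size is standard and the 100k wafer figure is from the post, but the die area and defect density are purely assumed example values:

    ```python
    import math

    # Back-of-envelope dies-per-wafer and yield estimate. All inputs except the
    # standard 300 mm wafer diameter are assumed example values.
    wafer_diameter_mm = 300
    die_area_mm2 = 250            # assumed die size
    defect_density_per_cm2 = 0.1  # assumed defect density

    # Common gross dies-per-wafer approximation (second term accounts for edge loss).
    radius = wafer_diameter_mm / 2
    gross_dies = (math.pi * radius**2 / die_area_mm2
                  - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

    # Simple Poisson yield model: fraction of dies with zero defects.
    yield_fraction = math.exp(-(die_area_mm2 / 100) * defect_density_per_cm2)

    good_per_wafer = gross_dies * yield_fraction
    print(f"~{gross_dies:.0f} gross dies/wafer, ~{good_per_wafer:.0f} good dies/wafer")
    print(f"100k wafers -> ~{good_per_wafer * 100_000 / 1e6:.1f} million good chips")
    ```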
     
  9. Twinz

    Wise Guy

    Joined: 20 Aug 2019

    Posts: 1,406

    Location: SW Florida

    I always thought of AMD as using the "brute force" method while Nvidia use the "software tricks" method.

    I don't really care how it's done though, as long as it looks good, offers good price/performance, and isn't totally ridiculous with power draw.
     
  10. EastCoastHandle

    Mobster

    Joined: 8 Jun 2018

    Posts: 2,565

    History repeating itself. I recall that same sentiment being parroted back during the IQ wars. Even when shown differences in the game that the developer did not intend, it was still dismissed as being the same IQ. Even when some were caught cheating in 3DMark by lowering the LOD, the same sentiment was parroted, because 3DMark was still validating their scores.

    Now we are seeing Nvidia downscaling resolutions and, yet again, the same is parroted. And it won't be long before another Nvidia scandal pops up over IQ reductions to boost fps. Particle effects, reductions in sun reflections, etc. be damned until then.

    Brings back memories, ahhhhhh the memories.
    :D

    But what doesn't make any reasonable sense is why you are paying more for a GPU that requires a software crutch when you should be paying less. I could understand that you don't care, but you should care enough to know that you are paying more and getting less in uarch terms, one that needs software to downscale the resolution in order to bring you next-gen performance. Something should click within you to say, "hey, wait a minute... am I actually paying for DLSS as well??"
     
    Last edited: 2 Aug 2020
  11. humbug

    Caporegime

    Joined: 17 Mar 2012

    Posts: 37,431

    There is actually some truth to that, or at least there was. I think GCN hung around for too long: even its latest iteration (the Radeon VII) has 60 CUs at 1900MHz, and its performance in games is identical to Navi 10 (5700XT). That's 50% more shaders at the same clock speed for the same performance, and that's ignoring the fact that the Radeon VII has much higher memory bandwidth.

    RDNA1 is a more than 50% IPC improvement over the latest iteration of GCN.

    OK, that was then, so let's look at RDNA1 vs Turing. With 20% more shaders at the same clock speed the 2080 Super is 18% faster than the 5700XT; that's including the outlier PUBG, 16% without it. So with the same memory speed and the same 256-bit memory bus, and taking into account the 18% difference with 20% more shaders, RDNA1 has the same if not slightly higher IPC than Turing.

    Just as a note... in CPUs, Zen 1 was 52% higher IPC vs Excavator, Zen 2 is 15% higher IPC again (it's actually 12% higher than Intel's latest and greatest), and Zen 3 is touted as another 15% higher IPC.

    Edit: BFV in the video I posted: 2080 Super GPU power 250 watts, 5700XT 200 watts.
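
    A minimal sketch of the per-shader comparison made in the post above (Python). The ~1.18x performance figure and the equal-clocks simplification come from the post; the shader counts are the public specs for each card:

    ```python
    # Relative performance normalised by shader count, clocks treated as equal
    # (a simplification, as in the post above).

    def perf_per_shader(relative_perf: float, shaders: int) -> float:
        return relative_perf / shaders

    rx_5700xt  = perf_per_shader(1.00, 2560)  # Navi 10: 40 CUs x 64 = 2560 shaders (baseline)
    rtx_2080s  = perf_per_shader(1.18, 3072)  # TU104: 3072 shaders, ~18% faster per the post
    radeon_vii = perf_per_shader(1.00, 3840)  # Vega 20: 60 CUs x 64 = 3840 shaders, ~same perf as 5700XT

    print(f"RDNA1 vs Turing, per shader: {rx_5700xt / rtx_2080s:.2f}x")   # ~1.02x
    print(f"RDNA1 vs GCN, per shader:    {rx_5700xt / radeon_vii:.2f}x")  # ~1.50x
    ```

    Which is roughly the post's conclusion: per shader at the same clock, RDNA1 lands about level with Turing and roughly 50% ahead of late GCN.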
     
    Last edited: 2 Aug 2020
  12. LoadsaMoney

    Caporegime

    Joined: 8 Jul 2003

    Posts: 29,805

    Location: In a house

    I remember when AMD were lowering the quality through their drivers to boost their fps, and when they got found out: 'it's just a bug, and we'll fix it' :p

    Then there's that Radeon Boost (or whatever it is), that reduces the res on the fly.

    EDIT: Yup, Radeon Boost.

    :D

    Both as bad as each other lol.
     
    Last edited: 2 Aug 2020
  13. bru

    Soldato

    Joined: 21 Oct 2002

    Posts: 7,328

    Location: kent

    To be fair both sides have had their dodgy moments in the past.

    RDNA 1 is a good architecture, only let down by AMD trying to push it as a higher tier spec than it was supposed to be (690 anyone). We will have to wait and see just how good or bad (depending on which rumours you go by) RDNA 2 turns out to be.

    Exciting times ahead for sure.
     
  14. Kelt

    Capodecina

    Joined: 14 Nov 2007

    Posts: 11,826

    Location: With the færies wearing black cherries for rings

    Absolutely not.

    I'm struggling to see any scenario in the next decade or beyond where AMD will ever compete with Nvidia at the high end.

    I sincerely hope to be proven wrong and that AMD can pull a "Ryzen" and do to Nvidia what they're doing to Intel.
     
  15. EastCoastHandle

    Mobster

    Joined: 8 Jun 2018

    Posts: 2,565

    Haaa, that's going to be a tough pill to swallow if it gets that bad for DLSS 3.0. No worries, DLSS 3.5 to the rescue... wait, I can't enable it with my 3080 Ti, wutt!!??
    :D
     
  16. KungFuSpaghetti

    Wise Guy

    Joined: 7 Apr 2017

    Posts: 1,664

    Yeah, I'm aware that Nvidia have their share of fails, but in recent history they have simply had a massive performance advantage.
    Navi cards were okay, and I almost bought a 5700XT, but for a small extra outlay the 2070 Super felt a better value proposition (cooler, less power hungry, more stable drivers IMO).
    AMD have done amazingly well considering their budget, but as a consumer I don't really care; again, I'll just get whatever is best at my budget.
     
  17. LoadsaMoney

    Caporegime

    Joined: 8 Jul 2003

    Posts: 29,805

    Location: In a house

    AMD's gonna smash 'em this time :D
     
  18. EastCoastHandle

    Mobster

    Joined: 8 Jun 2018

    Posts: 2,565

    Hey buddy, psssst, over here...
    AMD has not released a Ti equivalent/competitor in several years.
    But keep it to yourself. You wouldn't want people to think that you didn't know that.
    :D
     
  19. Th0nt

    Capodecina

    Joined: 21 Jul 2005

    Posts: 11,657

    Location: N.Ireland

    That's the other gem: the card costs under $400 and the Nvidia is a fat chunk extra (not good value IMO, ~20% better for 75% more £).
     
  20. shankly1985

    Capodecina

    Joined: 25 Nov 2011

    Posts: 19,730

    Location: The KOP

    It's a feature that is mostly designed for APUs and users that want that extra bit of performance and don't care about image quality. It actually works quite well in Apex Legends: the faster you move the mouse, the more FPS you gain.

    There's a difference between labelling up a feature to do something and doing something behind the scenes ;)
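
    A toy sketch of the general idea being described (Python): scale the render resolution with input motion. This is just an illustration of the concept, not AMD's actual Radeon Boost implementation, and the thresholds and scale factors are made-up example values:

    ```python
    # Toy dynamic-resolution heuristic: drop the render scale while the mouse/camera
    # is moving fast (when image-quality loss is hardest to notice), restore it when
    # motion settles. All numbers are arbitrary example values.

    def choose_render_scale(mouse_speed_px_per_frame: float) -> float:
        """Return a render-resolution scale factor based on input motion."""
        if mouse_speed_px_per_frame > 40:    # fast flick
            return 0.5
        if mouse_speed_px_per_frame > 15:    # moderate motion
            return 0.75
        return 1.0                           # near-static: full resolution

    for speed in (0, 20, 60):
        print(f"mouse speed {speed:>2} px/frame -> render scale {choose_render_scale(speed)}")
    ```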
     