
AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Caporegime · Joined: 17 Mar 2012 · Posts: 47,382 · Location: ARC-L1, Stanton System
If the image quality and frame rate are there, and beating the competition, does it really matter to you how that's achieved?
Why should it?
Is all optimisation "cheating"?

Nvidia are making larger GPUs to accommodate DLSS. It takes up transistors on the die, seemingly so they can say "look kids, you can run this game at 1600p and with DLSS make it look almost like 2160p, with almost 1600p performance".

Why not use that die space to add more shaders and bring the performance up that way?

And AMD do the same thing without dedicating transistors to it; even if it 'only' looks 90% as good as DLSS 2, it's a better solution, no?
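For a rough sense of the numbers being thrown around, the pixel arithmetic is easy to sketch (a back-of-envelope illustration only; the exact resolutions behind "1600p" and "2160p" are assumed here, and real frame rates depend on far more than pixel count):

```python
# Back-of-envelope: how much per-pixel shading work each resolution implies.
# Assumption: "1600p" means 2560x1600 and "2160p" means 3840x2160.
pixels_1600p = 2560 * 1600   # 4,096,000 pixels
pixels_2160p = 3840 * 2160   # 8,294,400 pixels

ratio = pixels_2160p / pixels_1600p
print(f"2160p shades about {ratio:.2f}x as many pixels as 1600p")  # ~2.02x

# Rendering internally at 1600p and upscaling to a 2160p output therefore
# skips roughly half the per-pixel shading work, which is where the
# "almost 2160p quality at almost 1600p performance" pitch comes from.
```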
 
Associate · Joined: 14 Aug 2017 · Posts: 1,194
Nvidia are making larger GPU's to accommodate DLSS.

Nvidia include tensor cores on their chips, yes, but they are more general purpose than just DLSS.

Why not use that die space to add more shaders and bring the performance up that way?

Because there are other uses for them? How much die space do they actually use anyway? Are you sure that the amount of die space the tensor cores use would actually make much difference if it were given over to shaders instead? DLSS makes far more than a 12% difference to frame rates, yet there are only about 12% as many tensor cores as there are shader processors on Turing cards. It seems like they give a disproportionately good return on investment here.
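To put that return-on-investment point in rough numbers, here is a toy comparison (the ~12% figure is the one quoted above, and treating the core-count ratio as a stand-in for die area is a deliberate simplification; none of this is measured data):

```python
# Toy comparison: spend the tensor-core budget on extra shaders instead,
# versus keep the tensor cores and shade at a lower internal resolution.
# Generous assumption: shader performance scales linearly with shader count.
tensor_core_fraction = 0.12          # ~12% as many tensor cores as shaders (quoted above)

extra_shader_speedup = 1.0 + tensor_core_fraction     # best case ~1.12x from more shaders

# DLSS-style alternative: shade at 1440p, upscale to a 2160p output.
pixels_2160p = 3840 * 2160
pixels_1440p = 2560 * 1440
upscaling_speedup = pixels_2160p / pixels_1440p        # ~2.25x less shading work

print(f"More shaders (best case): ~{extra_shader_speedup:.2f}x")
print(f"Upscaled rendering:       ~{upscaling_speedup:.2f}x on shading work alone")
```

On those deliberately rough numbers the upscaling route comes out far ahead on shading work alone; whether the resulting image holds up is exactly what the rest of the thread is arguing about.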

And AMD do the same thing without dedicating transistors to it, even if it 'only' looks 90% as good as DLSS 2 its a better solution, no?

No, that would make it a worse solution, by definition, as it's only 90% as good ... ?
 
Caporegime · Joined: 17 Mar 2012 · Posts: 47,382 · Location: ARC-L1, Stanton System
No, that would make it a worse solution, by definition, as it's only 90% as good ... ?

That's fine, difference of opinion. :)

Where's that guy with the two images? He had to point out that the power lines in the distance were less jaggy, and OK, that was a thing, but other than that the two images in his comparison looked identical.

I'll take that for £400 vs £500 for the Nvidia performance equivalent, and run the feature globally without having to worry about whether or not the game supports it. It just does... "it just works" :D
 
Soldato · Joined: 8 Jun 2018 · Posts: 2,827
If the image quality and frame rate are there, and beating the competition, does it really matter to you how that's achieved?
Why should it?
Is all optimisation "cheating"?
Stop being an apologist. You understand the implications being made. And no, I would not pay a premium for "new" hardware that is hamstrung by relying on DLSS as a crutch, where developers on average show no interest in it for upcoming AAA titles, and where, more often than not, DLSS 3.0 will be used on old games.
 
Soldato · Joined: 26 Sep 2010 · Posts: 7,146 · Location: Stoke-on-Trent
If the image quality and frame rate are there, and beating the competition, does it really matter to you how that's achieved?
Why should it?
Is all optimisation "cheating"?
If 100% of developers implemented DLSS in 100% of their games then it would be a different story. But they're not. So now I am facing paying a premium for hardware that is underpowered because of a "feature" that I will, in very practical terms, never benefit from. Nvidia are literally sacrificing rendering performance for a gimmick that doubles up their profit margin. Tensor cores already pay for themselves in the datacenter cards, and now Nvidia are getting a second slice of the pie by implementing them on consumer gaming cards at the expense of shaders and charging a premium for it.

Rendering an image at a reduced resolution and faking it is not "optimisation". And the same can be said for AMD's Radeon Image Sharpening and FidelityFX: faking an image instead of having the pure grunt to render native. At least with AMD's approaches, though, there is no premium to pay for dedicated hardware to power their cheating (although we'll see how much RDNA 2 cards cost).
Because there are other uses for them?
Not on a consumer gaming card there aren't. Real-time denoising for RT isn't done on the tensor cores right now, and the tensor-based memory compression touted for Ampere might not trickle down past the Titan and 3080 Ti, if at all.

I'll give Nvidia all the applause for lighting the blue touch paper that is real-time ray tracing, but their implementation at a gaming level is nothing more than a vessel to fleece the market.
 
Associate · Joined: 14 Aug 2017 · Posts: 1,194
If 100% of developers implemented DLSS in 100% of their games then it would be a different story

There seem to be more over time though.

Nvidia are literally sacrificing rendering performance for a gimmick that doubles up their profit margin.

How are they sacrificing rendering performance? How much rendering performance do you think they are sacrificing, given there are under 15% as many tensor cores in a Turing chip as shader processors? And the frame rate gain can be far more than that.

Rendering an image at a reduced resolution and faking it is not "optimisation". And the same can be said for AMD's Radeon Image Sharpening and FidelityFX: faking an image instead of having the pure grunt to render native.

If the same image comes out of the other end, faster, then who gives a ****?
(and that's pretty much the definition of optimisation)

Not on a consumer gaming card there aren't.

As AI/ML software becomes more commonplace, it seems like there will be more uses for these. I already use mine for playing around with GPT-2 and other such stuff. If your argument there is that it's not purely a gaming feature, then I'd ask how you feel about other non-gaming functionality like hardware video encoding etc.
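For what it's worth, the sort of tinkering described above can be as simple as running GPT-2 in half precision on the GPU. A minimal sketch, assuming PyTorch and the Hugging Face transformers package are installed (FP16 inference is the path that actually exercises the tensor cores):

```python
# Minimal GPT-2 text generation on an Nvidia GPU in FP16 (uses the tensor cores).
# Assumes: pip install torch transformers, plus a CUDA-capable card.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2").half().to("cuda").eval()

prompt = "The next generation of graphics cards will"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda")

with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=40, do_sample=True, top_p=0.9)

print(tokenizer.decode(output[0], skip_special_tokens=True))
```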
 
Soldato · Joined: 8 Jun 2018 · Posts: 2,827
I genuinely don't understand why you would care, unless you're standing firmly in the red camp shouting "Buh buh buh, that's CHEATING!"

Personally I try not to be in either camp, and I hope they both come out with some spectacular stuff this year.

Nvidia holds no influence in the gaming market. Developers don't care for it. Its only presence is in older games. DLSS will fail, just like the others. Besides, that's not an answer to my post. We are talking about next-gen hardware that will likely cost well above 800. That requires DLSS to be relevant.
Besides, you are the one defending Nvidia in an AMD thread. That makes you more firmly in the green camp. :D

And the sad part is that these are the same hackneyed, stilted viewpoints that were trotted out for Turing. I'm not sure if it's a spell or some folks just have very poor memory. Regardless, it is truly a spectacle to witness.
 
Caporegime · Joined: 17 Mar 2012 · Posts: 47,382 · Location: ARC-L1, Stanton System
:eek:...................not sure if serious.

I wouldn't go so far as to say "no influence", but Nvidia only operate in the home PC space. Consoles are a far larger market, where AMD are completely dominant, and that's what game developers care about: if game consoles don't run DLSS or Nvidia's bastardized version of ray tracing, they ain't interested. Why bother? Pay me and then we'll talk.
 
Associate · Joined: 21 Apr 2007 · Posts: 2,483
As long as it offers performance for the end users, doesn't really matter how they're doing it. DLSS2.0 offers better performance and image quality than AMD.

It matters a lot whilst there are games that can't run or don't support DLSS, and right now that happens to be pretty much every single game bar a handful of exceptions. I'm not against DLSS, it's just that its practical use seems to be a mile away from the hyperbole.
 
Associate · Joined: 14 Aug 2017 · Posts: 1,194
Nvidia holds no influence in the gaming market.

Dominant PC graphics card manufacturer, plus does the Nintendo Switch.
Totally, no influence!

I'll have some of what you're smoking :)


We are talking about next-gen hardware that will likely cost well above 800. That requires DLSS to be relevant.

When AMD are competitive at the high end, with or without DLSS, let's have this conversation again. Right now, talking about Nvidia hardware as being "hamstrung" or "relying" on DLSS is rather hilarious.
 

bru · Soldato · Joined: 21 Oct 2002 · Posts: 7,360 · Location: Kent
I wouldn't go so far as to say "no influence", but Nvidia only operate in the home PC space. Consoles are a far larger market, where AMD are completely dominant, and that's what game developers care about: if game consoles don't run DLSS or Nvidia's bastardized version of ray tracing, they ain't interested. Why bother? Pay me and then we'll talk.

I assume you mean Microsoft's DXR. :(
 
Caporegime · Joined: 17 Mar 2012 · Posts: 47,382 · Location: ARC-L1, Stanton System
I assume you mean Microsoft's DXR. :(

Yes, but at the levels at which Nvidia run it you need their RT cores, and that's not going to happen with consoles; they are going to do it much more efficiently, or use whatever acceleration tech AMD may or may not have.

Nvidia try to be the "Respective Space Influences" but they are not as important as they think they are. Nvidia tried to play Samsung off against TSMC, who told them to take a running jump off a cliff and gave almost all of the spare 7nm capacity that HiSilicon left to AMD. AMD are a far larger wafer consumer than Nvidia, and they get priority.
 
Soldato · Joined: 8 Jun 2018 · Posts: 2,827
Dominant PC graphics card manufacturer, plus does the Nintendo Switch.
Totally, no influence!

I'll have some of what you're smoking :)




When AMD are competitive at the high end, with or without DLSS, let's have this conversation again. Right now, talking about Nvidia hardware as being "hamstrung" or "relying" on DLSS is rather hilarious.
Then do us all a favour, buy a Switch and benchmark it, LOL. The Switch didn't become popular because it had 'the Nvidia' in it; Nintendo and the current market environment had everything to do with it. So again, Nvidia has no gaming influence.

And let's be clear on the subject, shall we? Nvidia can entice a developer on a particular game, but that does not equate to the gaming market. CP2077 comes to mind.

Furthermore, I do not judge Nvidia's place in the gaming market based on performance. GPU performance and the gaming market are not equivalent; that is a false equivalence. One has nothing to do with the other.

How poorly or how well Nvidia does on performance metrics has no influence on the gaming market. That has been proven generation after generation after generation.

Therefore, waiting to see Nvidia's performance has no bearing on what happens in the gaming market. :D
 