
really getting fed up with the posts stating RTX/DLSS does not work this gen

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,172
Location
Greater London
that's sssssmokin :D

But one thing is certain. As a hobbyist with no loyalty, one shouldn't become upset enough to make accusations towards another hobbyist over concerns about certain practices revealed in an article. As you say, we come here for the discussion with fellow enthusiasts... :D
So you are telling me you are not pro-AMD and not a big fan of Nvidia? I am only saying it as it is, nothing to do with the article you keep mentioning :p

Oh and I am not upset ;)
 
Soldato
Joined
8 Jun 2018
Posts
2,827
So you are telling me you are not pro-AMD and not a big fan of Nvidia? I am only saying it as it is, nothing to do with the article you keep mentioning :p

Oh and I am not upset ;)
No more than you are pro-Nvidia and not a big fan of AMD. See how that works...

However you 'feel' about my allegiance, it still doesn't address why a top-end, next-gen card still needs games to have TAA to get a performance boost from DLSS against RDNA 2. Or why it needs the ability to alter a game's existing IQ from within the drivers.

So let me get this straight. They are going to:
A. Get additional development time on the existing games used for these Ampere press releases, and alter code to gain an advantage by changing IQ.
B. Inject TAA.
C. Add DLSS for a performance boost.

One word comes to mind: rigged. Another thought is the Crysis 2 tessellation controversy.
I hope AMD has an answer for that.
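
To illustrate the TAA point with a minimal sketch (a generic temporal upscaler, not the actual DLSS SDK - every name here is hypothetical): the buffers such an upscaler consumes are exactly the ones a TAA pipeline already produces, which is why "the game already has TAA" matters.

    // Hypothetical type standing in for an engine resource.
    struct Texture {};

    // Per-frame inputs a generic temporal upscaler consumes. A game with TAA
    // already renders with sub-pixel jitter and writes per-pixel motion
    // vectors, so wiring these up is cheap for it.
    struct UpscalerInputs {
        const Texture* color;          // current frame at reduced render resolution
        const Texture* depth;          // used for disocclusion checks
        const Texture* motionVectors;  // per-pixel motion, a TAA staple
        float jitterX, jitterY;        // sub-pixel camera jitter for this frame
    };

    struct TemporalUpscaler {
        Texture history; // accumulated full-resolution result
        // Accumulate the jittered low-res frame into the full-res history.
        Texture* evaluate(const UpscalerInputs& /*in*/) { return &history; }
    };

    Texture* upscaleFrame(TemporalUpscaler& u, const UpscalerInputs& in) {
        // An engine without TAA has none of these buffers wired up, which is
        // why the integration cost is the real barrier, not the upscale itself.
        return u.evaluate(in);
    }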

Personally, I wouldn't have expected RDNA 2 to compete with a 3080 Ti closely enough to warrant all these changes to the benchmark review just for Ampere. I mean, this is a first. Again, I'm not making that up; it's coming from a reviewer who's telling us how the review will go. At least, that's my interpretation, and it's why I believe it's a bit underhanded.

Now, if you want to continue calling me pro-AMD because of what I've read in the article, then you are only ignoring the discussion at hand. Which is why I say you are upset. :D
 
Soldato
Joined
14 Sep 2008
Posts
2,616
Location
Lincoln
I'm sort of with you on seeing potential (possibly even intended) "shady" tactics from nVidia with this. However, if the IQ is equal or greater AND it doesn't needlessly hinder competitors (and thus the consumer), then isn't it a valid technology that SHOULD be included as part of the reviews?
You bring up Crysis 2, which met the first criterion of equal/greater IQ (since it was equal), but failed the second: it hindered the competitor for no IQ benefit; in other words, it didn't help the consumer at all. You could also use GameWorks as an example - it might have improved IQ, but it needlessly hindered competitors, since it was "black box" tech that AMD were prevented from developing support for. So nVidia do have a history of being shady and anti-consumer. However, if DLSS 3.0 were to provide equal/greater IQ without hindering competitor technologies - it just provides added value to the consumer - then, as much as I might not like the gap between AMD/nVidia widening, it seems like something we'd have to accept as a valid improvement, no? As long as nVidia didn't dictate terms to developers like "you can't support AMD technologies if you use DLSS" or something; i.e. removing the level playing field.
 
Soldato
Joined
28 May 2007
Posts
10,049
I'm sort of with you on seeing potential (possibly even intended) "shady" tactics from nVidia with this. However, if the IQ is equal or greater AND it doesn't needlessly hinder competitors (and thus the consumer), then isn't it a valid technology that SHOULD be included as part of the reviews? …

The problem with using DLSS in reviews, for me, is that it doesn't really give you a good idea of how powerful the card is in the 99% of games which don't have DLSS. I would rather they tested all games equally and did a separate segment to show the benefits of DLSS, both for performance and for its image quality.
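
To put the format I mean in concrete terms, something like this toy aggregation - a main chart built from identical settings across every game, with DLSS broken out as its own labelled segment (all game names and numbers below are invented purely for illustration):

    #include <iostream>
    #include <map>
    #include <numeric>
    #include <string>
    #include <vector>

    int main() {
        // Native-resolution runs at identical settings for every game.
        std::map<std::string, std::vector<double>> baseline = {
            {"Game A", {97.2, 98.1, 96.8}},
            {"Game B", {61.5, 60.9, 62.3}},
        };
        // DLSS results kept out of the main chart and labelled as such.
        std::map<std::string, std::vector<double>> dlssSegment = {
            {"Game B (DLSS)", {84.0, 83.5, 85.1}},
        };

        auto avg = [](const std::vector<double>& v) {
            return std::accumulate(v.begin(), v.end(), 0.0) / v.size();
        };

        std::cout << "Main chart (identical settings):\n";
        for (const auto& [game, fps] : baseline)
            std::cout << "  " << game << ": " << avg(fps) << " fps\n";

        std::cout << "Separate DLSS segment:\n";
        for (const auto& [game, fps] : dlssSegment)
            std::cout << "  " << game << ": " << avg(fps) << " fps\n";
    }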
 
Soldato
Joined
14 Sep 2008
Posts
2,616
Location
Lincoln
The problem with using DLSS in reviews, for me, is that it doesn't really give you a good idea of how powerful the card is in the 99% of games which don't have DLSS. I would rather they tested all games equally and did a separate segment to show the benefits of DLSS, both for performance and for its image quality.
I don't see why they wouldn't do that. Or, if nVidia forces DLSS so it's not possible to test the same game with/without, then have a selection of games stating which are using DLSS and which aren't. It's not difficult to account for in the review process.
 
Soldato
Joined
8 Jun 2018
Posts
2,827
I don't see why they wouldn't do that. Or, if nVidia forces DLSS so it's not possible to test the same game with/without, then have a selection of games stating which are using DLSS and which aren't. It's not difficult to account for in the review process.
The reviewer clearly stated that it would be used in comparison with RDNA 2.0. So it's not hard to believe that Nvidia would indeed use it in the performance charts for just such a reason. I think the concerning part is whether or not they differentiate their regular benchmark methods from this alternate "NDA 2.0" version. Or whatever you want to call it.
 

bru

Soldato
Joined
21 Oct 2002
Posts
7,360
Location
kent
I reckon that DLSS 3.0 will bring back the original idea of better AA at the same resolution (no upscaling involved) with no performance hit at all.

Or, better yet, give the end user the choice, with a nice slider in-game to select how much DLSS they want to use: upscale for more performance, or same resolution but better visuals.
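
Just to sketch how such a slider might map onto rendering - the linear mapping and the 0.5 floor below are my own assumptions for illustration, not anything nVidia has announced:

    #include <algorithm>
    #include <cstdio>

    struct RenderResolution { int width, height; };

    // slider = 0.0 -> lowest internal resolution (most performance),
    // slider = 1.0 -> native resolution, upscaler acting purely as AA.
    RenderResolution internalResolution(int outW, int outH, float slider)
    {
        float s = std::clamp(slider, 0.0f, 1.0f);
        float scale = 0.5f + 0.5f * s; // assumed per-axis scale in [0.5, 1.0]
        return { static_cast<int>(outW * scale), static_cast<int>(outH * scale) };
    }

    int main()
    {
        for (float s : {0.0f, 0.5f, 1.0f}) {
            RenderResolution r = internalResolution(3840, 2160, s);
            std::printf("slider %.1f -> render %dx%d, output 3840x2160\n",
                        s, r.width, r.height);
        }
    }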
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,172
Location
Greater London
No more than you are pro-Nvidia and not a big fan of AMD. See how that works...
…
Now, if you want to continue calling me pro-AMD because of what I've read in the article, then you are only ignoring the discussion at hand. Which is why I say you are upset. :D
Haha. I said that based on all your posts overall. Not because of the discussion at hand ;)

Definitely not upset, quite the opposite actually, having a great day today :p:D
 
Soldato
Joined
18 Feb 2015
Posts
6,480
https://www.imgtec.com/blog/why-benchmarks-are-the-missing-piece-of-the-ray-tracing-puzzle/

The nuances of ray tracing

First, let’s examine why. It’s not often that a new real-time rendering feature requires its own dedicated analysis tooling in order to assess performance. Most new features have a limited footprint in all senses, be that the amount of hardware required to implement them, developer effort to integrate it into a rendering system, size of the driver implementation, what’s needed in any tools, and their impact on the rest of what’s happening in a frame.

Ray tracing is none of those things. A good implementation costs considerable area, and effective integration into a rendering system is only "easy" if you're using it to implement something contained in scope and complexity, such as shadows. The driver and compiler work to add support for Vulkan Ray Tracing or DirectX Ray Tracing (DXR) is involved and complex, brand new tooling is required, and it has a big impact on your rendering system as a whole.

The inherent complexity of real-time ray tracing acceleration as a feature — that big footprint — means that there’s a lot of scope for implementation having a big effect on performance, and so more than any other new addition to the real-time rendering pipeline in recent memory, ray tracing needs good tools to peek under the covers of each implementation so that developers can learn how to adapt their code.

That’s exacerbated by the fact that there’s only really one implementation on the market today: Nvidia’s Turing. While Nvidia did release a DXR driver for their prior GPU generation, Pascal, it’s really only viable for prototyping and limited experimentation; Pascal lacks any hardware acceleration for DXR or Vulkan Ray Tracing and so it’s really slow!
So, it’s a real risk for developers to only have one implementation on the market to target. Good for Nvidia arguably, since almost all DXR and Vulkan ray tracing code being written today will be running on its GPUs, but not good when developers want to take their games and run it on implementations from other vendors in the future.

They’ll quickly find out that there is a spectrum of how ray tracing can be implemented in a GPU, and understanding the differences is laden with nuance, such as performance cliffs and non-obvious bottlenecks. There’s no substitute for profiling your own workloads in depth to get the best possible view, of course, but there’s most definitely room for software to help you understand how the hardware works and what it’s capable of.

Very interesting read. Perhaps something to bear in mind when thinking about RDNA's ray tracing capabilities.
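
On the tooling point: even without vendor tools, you can at least isolate what a single ray dispatch costs using plain Vulkan timestamp queries. A minimal fragment, assuming the ray tracing pipeline, command buffer and a two-entry timestamp query pool are set up elsewhere (so not a complete program):

    #include <vulkan/vulkan.h>

    // Brackets a single ray dispatch with GPU timestamps so its cost can be
    // separated from the rest of the frame. Assumes 'pool' is a
    // VK_QUERY_TYPE_TIMESTAMP pool with at least two entries and that the
    // VK_KHR_ray_tracing_pipeline extension is enabled.
    void recordTimedTraceRays(VkCommandBuffer cmd, VkQueryPool pool,
                              const VkStridedDeviceAddressRegionKHR* raygen,
                              const VkStridedDeviceAddressRegionKHR* miss,
                              const VkStridedDeviceAddressRegionKHR* hit,
                              const VkStridedDeviceAddressRegionKHR* callable,
                              uint32_t width, uint32_t height)
    {
        vkCmdResetQueryPool(cmd, pool, 0, 2);
        vkCmdWriteTimestamp(cmd, VK_PIPELINE_STAGE_TOP_OF_PIPE_BIT, pool, 0);
        vkCmdTraceRaysKHR(cmd, raygen, miss, hit, callable, width, height, 1);
        vkCmdWriteTimestamp(cmd, VK_PIPELINE_STAGE_BOTTOM_OF_PIPE_BIT, pool, 1);
        // After the submission's fence signals, fetch both ticks with
        // vkGetQueryPoolResults and multiply the delta by
        // VkPhysicalDeviceLimits::timestampPeriod to get nanoseconds.
    }

That only gives you wall-clock dispatch time, of course; the vendor-specific counters the article is talking about are what tell you *why* a dispatch is slow.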
 
Man of Honour
Joined
13 Oct 2006
Posts
90,805
Nice - mind you, that is nothing compared to what the renderer in Quake 2 would be capable of without the geometry, material and other limitations of a 25-year-old engine :s

Knowing a bit of what is coming, I really don't have much time for people who do the tech down.
 
Soldato
Joined
18 Feb 2015
Posts
6,480
https://wowhead.com/news=317555.2/r...led-on-the-world-of-warcraft-shadowlands-beta

 