
The improvement in ray-tracing in the past 2 years

Soldato
Joined
18 Jun 2018
Posts
4,607
Location
Isle of Wight

I know I came across as saying it's defunct, but that's not the slant. Any technology needs adoption, and progress is made from the innovation - just look at F1 and road cars. What some won't see is that on these forums people cherry-pick arguments, and when it suits them the technology has to work straight out of the gate or it's a flop.

However, the price hikes in the past couple of years since Pascal have been bad for consumers; it's got out of hand. Mid-range cards commanding £400 is a joke. RTX is one of the culprits for this movement. You should see how many posts there are on this forum from people asking about an upgrade, only to decide "yeah, I got the 2060 in the end as it's faster and also has ray tracing".

Yeah, anyone suggesting getting a 2060 for the ray tracing hasn't done their research. I'm planning on seeing where the sweet spot lies in the next-gen/second-hand market for the best performance with ray tracing. I suspect it'll be next gen, but second-hand 2080s may be sufficient.
 
Soldato
Joined
19 Dec 2010
Posts
12,019
It doesn't bode well for future games, because in the last two years we've had a "handful of games", which suggests devs are not enthusiastic about ray tracing in games. Especially when the only future AAA game everyone seems to be looking forward to is Cyberpunk.

Up until 2 years ago there was no Ray Tracing hardware in consumer GPUs. These things take time. You are completely wrong if you think it's because devs aren't enthusiastic about this.

Now that the hardware is out there and everybody is on board, Ray Tracing will be seen in more and more games going forward.
 
Associate
Joined
17 Sep 2018
Posts
1,425
Yeah, anyone suggesting getting a 2060 for the ray tracing hasn't done their research. I'm planning on seeing where the sweet spot lies in the next-gen/second-hand market for the best performance with ray tracing. I suspect it'll be next gen, but second-hand 2080s may be sufficient.

Looking at the rumours, RTX is supposed to be way better in the 3000 series, so if you want RTX on, I'd speculate that a 3060 will beat a 2080. Which might make the 2000 series somewhat redundant for RTX.
 
Soldato
Joined
18 Jun 2018
Posts
4,607
Location
Isle of Wight
Looking at the rumours, RTX is supposed to be way better in the 3000 series, so if you want RTX on, I'd speculate that a 3060 will beat a 2080. Which might make the 2000 series somewhat redundant for RTX.

That's what I'm thinking too, but general performance on those cards will probably be worse, so a balance will need to be struck. We can already see that the tech has improved greatly, and 1440p with ray tracing basically gets to a playable (for me at least) number of frames on a 2080ti, so maybe I don't need better.

Not planning on getting a new card until around March next year (birthday), so there should have been enough time for drivers, the full range of cards etc. to have been released, and for a proper comparison to be done.
 
Soldato
Joined
19 Dec 2010
Posts
12,019
I don't really agree with that regarding RTX. If the new Xbox and PlayStation launched with zero games there would be uproar. Nvidia brings out RTX hardware with one or two games supporting it, and then silence. It gets ridiculed on the whole.

It shows the PC producers had no faith in Ray Tracing for RTX hardware. Nothing to do with time. Both Ray Tracing and HDR are visual enhancements; they don't stop you playing a game and aren't fundamental to the mechanics. HDR has been around for a few years now and it's not essential to a game. Neither is Ray Tracing. I don't think it's time itself, I think it's "why bother implementing something that just looks nice when it takes too much development work". After all, it's not full Ray Tracing, it's only partially used for certain reflections etc.

Nvidia did get a lot of criticism for not having any games available at launch. I criticised them myself.

Yes, it's all to do with time. Developers need time to make changes. Look at DX12: how long did it take developers to take to it?

I don't understand the rest of your post. HDR and Ray Tracing aren't essential for gaming. Of course not. Very few graphical advancements in the last 20 years have been essential for gaming. But, thankfully, game developers and GPU manufacturers keep pushing the technology so that games are looking and playing better than ever.

You don't seem to realise that these last couple of years are the first step. Any developer worth his salt is going to be excited about the progress being made.
 
Soldato
Joined
19 Dec 2010
Posts
12,019
This is the most accurate summary I have read here in a while. You have another thread where naysayers are knocking someone for posting how the 5700XT with new drivers is beating a 2070S, stating "it has to be great out of the box"; yet when it comes to Nvidia and ray tracing, there's a mammoth of a berth given, saying "give it time". No. It's had two years now and people are quite right to state it was not needed and is not ready to be rammed down people's throats. It should be labelled a damp squib and lambasted enough that even Nvidia can't get away with it, no matter how brainwashed some followers are. Lack of both games and enthusiasm from devs is now showing the cracks.

How is that the most accurate summary?

And how are you comparing getting proper drivers out to implementing a brand new technology like Ray Tracing? AMD got all the flak they deserved for their drivers; they used to take months to get a game-ready driver out. In the case of the HD 7000 cards, it took them 11 months to get a driver out that actually offered decent performance. They are a lot better now, thankfully.

You know Ray Tracing isn't Nvidia's, right? They are just the first company to have a Ray Tracing solution out there that can even get close to playable frame rates while using Ray Tracing. It's using Microsoft's DXR and Vulkan's API. The new Xbox is going to be using DXR, and AMD's RDNA 2 will be supporting both.

Look at DX12 as a comparison: how long did developers take to get up to speed on it? That's despite DX12-capable GPUs being out for a couple of years before DX12 was officially launched.

Please tell me how it was rammed down people's throats? Nobody forced anyone to buy Turing cards.

And another lack of "enthusiasm from game devs" comment. You could not be more wrong.
 
Soldato
Joined
21 Jul 2005
Posts
19,981
Location
Officially least sunny location -Ronskistats
You're missing the point @melmac in your haste to pick at snippets. When I refer to "raytracing" in this context, it's the guise of Nvidia convincing you that you require RTX. Similar to how they try to brand things like G-Sync when there was an industry standard (adaptive sync) that didn't need any proprietary fork, right? You know this!

You know Ray Tracing isn't Nvidia's, right? They are just the first company to have a Ray Tracing solution out there that can even get close to playable frame rates while using Ray Tracing. It's using Microsoft's DXR and Vulkan's API. The new Xbox is going to be using DXR, and AMD's RDNA 2 will be supporting both.
 
Soldato
Joined
19 Dec 2010
Posts
12,019
Its original use case in BF V was that it was disabled for 2080 Ti users, which was never changed in BF V itself. However, that ruleset changed, or the goalposts moved, when they included its use for the 2080 Ti in games after BF V.

My only summation is that Nvidia could not implement their version of TAA in BF V, which is why DLSS 2.0 was never implemented in BF V, the game that originally started all this. Which, to me, tells me what the Achilles heel of DLSS is.

What? BF V has TAA? DLSS did work on the 2080 Ti, but only if you were using 4K.

Where do you get your info from? This is like the other thread where you were giving out about DLSS because it has to be added per game, but praising FidelityFX because it didn't, when the actual fact is that FidelityFX does have to be added per game. Even Humbug admitted he was wrong about this.
 
Soldato
Joined
19 Dec 2010
Posts
12,019
You're missing the point @melmac in your haste to pick at snippets. When I refer to "raytracing" in this context, it's the guise of Nvidia convincing you that you require RTX. Similar to how they try to brand things like G-Sync when there was an industry standard (adaptive sync) that didn't need any proprietary fork, right? You know this!

First of all when Nvidia came up with Gsync, adaptive sync did not exist.

Second, no, it's you who is missing the point. Nvidia's RTX is using the open standards (DX12, Vulkan). And until AMD/Intel release their Ray Tracing solutions (which are going to be using the same open standards, BTW), if you want to try out Ray Tracing then you have to buy an Nvidia RTX card.
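
To illustrate what "open standard" means here, this is roughly what the hardware ray tracing check looks like in plain D3D12 (a simplified sketch of my own, not taken from any engine or from Nvidia's docs). There is nothing vendor-specific in it; any card whose driver reports DXR tier 1.0 or higher passes the same check, whoever makes it.

Code:
// Minimal, illustrative DXR capability check (link d3d12.lib and dxgi.lib).
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    // Grab the first adapter - a real engine would enumerate and pick one.
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory2(0, IID_PPV_ARGS(&factory)))) return 1;
    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1;

    // ID3D12Device5 is the interface revision that added the DXR entry points.
    ComPtr<ID3D12Device5> device;
    if (FAILED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device)))) return 1;

    // The capability query itself - part of D3D12/DXR, not of any vendor SDK.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                &opts5, sizeof(opts5));

    if (opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
        std::printf("Hardware DXR supported (tier %d)\n",
                    (int)opts5.RaytracingTier);
    else
        std::printf("No hardware DXR on this adapter\n");
    return 0;
}

Right now only the RTX cards report that tier, which is the only sense in which you "need" an RTX card; once RDNA 2 and the new consoles report it too, the same code path lights up on them as well.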
 
Soldato
Joined
8 Jun 2018
Posts
2,827
You're missing the point @melmac in your haste to pick at snippets. When I refer to "raytracing" in this context, it's the guise of Nvidia convincing you that you require RTX. Similar to how they try to brand things like G-Sync when there was an industry standard (adaptive sync) that didn't need any proprietary fork, right? You know this!
When he cannot refute what you're saying, all he will do is claim that you implied or said something that you didn't say. Then you have to stop what you're doing, correct him, and ask him to verify where you said whatever it is he's claiming you said.

He replied to me claiming that I said that BF V now has TAA / 2080 Ti at 4K. :rolleyes:
 
Soldato
Joined
21 Jul 2005
Posts
19,981
Location
Officially least sunny location -Ronskistats
..claim that you implied or said something that you didn't say. Then you have to stop what you're doing, correct him

He is a strange creature, think of it like that.

A quick scan tells me that:

Adaptive-Sync is a proven and widely adopted technology. The technology has been a standard component of VESA’s embedded DisplayPort specification since its initial rollout in 2009.

OK. So when did nvidia release G-sync?:

In 2013, nVidia proudly announced the completion of the development and actual implementation into their products of a new technology called G-Sync. The core principle of G-Sync is to synchronize display refresh rate with the refresh rate of the GPU. And this time it was not only a software improvement, but also a hardware implementation – nVidia has developed a chip to be built into the monitor that receives commands from the GPU in order to ensure a proper synchronization.

Err OK then @melmac what did you say again?:
First of all when Nvidia came up with Gsync, adaptive sync did not exist.

So quite frankly, when I say you're missing the point... you're missing the point (again).

No, it's you who is missing the point.

:rolleyes:
 

bru

Soldato
Joined
21 Oct 2002
Posts
7,360
Location
kent
You're missing the point @melmac in your haste to pick at snippets. When I refer to "raytracing" in this context, it's the guise of Nvidia convincing you that you require RTX. Similar to how they try to brand things like G-Sync when there was an industry standard (adaptive sync) that didn't need any proprietary fork, right? You know this!

You do realize that Nvidia's G-Sync was out before there was an industry standard? :rolleyes:
 
Man of Honour
Joined
13 Oct 2006
Posts
90,819
OK. So when did nvidia release G-sync?

There was no desktop monitor industry standard for it - the only uses were in professional displays and some laptop implementations (which weren't like current adaptive sync), and despite some claims otherwise there was NO interest in bringing it to the desktop environment. While I can't prove it these days, I've seen the emails relevant to this and the lack of interest from other companies (some, I think, trying to preserve their professional commercial interests, i.e. signage and air traffic control displays, etc.) in developing a full desktop solution.

Even then, the current adaptive sync standard reuses (in a hacky manner) existing features like panel self-refresh to make things work, with all the issues that come from that approach* - which speaks to it being rushed to market rather than being a ground-up development as some try to claim (some of these things would be the first thing you'd tackle if you set out to develop the feature from the ground up, rather than in a rush just to make it work).

* Problems with supporting a wide refresh range, flicker, low frame rate recovery latency, lacking adaptive dynamic overdrive, etc.
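
To make that footnote a little more concrete, here's a rough sketch (entirely made up for illustration - the function and names are hypothetical, this is nobody's actual driver code) of the basic idea behind low frame rate compensation: when the game's frame rate falls below the panel's minimum refresh, each frame has to be scanned out multiple times to keep the panel inside its supported range, and mistiming that hand-off is where the recovery latency and flicker complaints come from.

Code:
// Illustrative-only sketch of the low frame rate compensation idea.
#include <cstdio>

// Returns how many times a frame should be repeated so the effective panel
// refresh stays inside [panelMinHz, panelMaxHz], or 0 if no multiple fits.
// (Hypothetical helper for illustration, not a real driver interface.)
int lfcMultiplier(double contentFps, double panelMinHz, double panelMaxHz)
{
    if (contentFps >= panelMinHz) return 1;   // already inside the panel's range
    for (int n = 2; n <= 10; ++n) {
        double effectiveHz = contentFps * n;  // panel rate if each frame is shown n times
        if (effectiveHz >= panelMinHz && effectiveHz <= panelMaxHz)
            return n;
    }
    return 0;                                 // give up and fall back to fixed refresh
}

int main()
{
    // 25 fps content on a 48-144 Hz panel: show each frame twice -> 50 Hz.
    std::printf("multiplier = %d\n", lfcMultiplier(25.0, 48.0, 144.0));
    return 0;
}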
 
Soldato
Joined
6 Feb 2019
Posts
17,466
There was no desktop monitor industry standard for it - the only uses were in professional displays and some laptop implementations (which weren't like current adaptive sync), and despite some claims otherwise there was NO interest in bringing it to the desktop environment. While I can't prove it these days, I've seen the emails relevant to this and the lack of interest from other companies (some, I think, trying to preserve their professional commercial interests, i.e. signage and air traffic control displays, etc.) in developing a full desktop solution.

Even then, the current adaptive sync standard reuses (in a hacky manner) existing features like panel self-refresh to make things work, with all the issues that come from that approach* - which speaks to it being rushed to market rather than being a ground-up development as some try to claim (some of these things would be the first thing you'd tackle if you set out to develop the feature from the ground up, rather than in a rush just to make it work).

* Problems with supporting a wide refresh range, flicker, low frame rate recovery latency, lacking adaptive dynamic overdrive, etc.

Prior to the first G-Sync monitors releasing, and seeing Linus Tech Tips put up a video testing the first G-Sync monitor in Far Cry 3, I had never heard about this technology at all. So I'm happy to credit Nvidia for bringing it to the mainstream.

It also took some getting used to; I didn't get my first G-Sync screen until 2017, as prior to that I didn't see the point. If you've lived without it for so long, it's hard to understand the benefits. The same goes for high refresh rate monitors: I can't live under 120Hz now when using a mouse, but it's been difficult trying to convince others of the benefits.
 
Soldato
Joined
21 Jul 2005
Posts
19,981
Location
Officially least sunny location -Ronskistats
The Wayback Machine shows their website listing it in November 2013. It says monitors (that can do it) will follow shortly, so that would be 2014 onwards. I agree @Grim5, I never considered it until a couple of years ago, when I was in the position to get a new display and made sure it had this type of feature, mainly to check it out and see what it's all about.

Getting back on topic: yes, of course it's rumoured that ray tracing will be better in the 3000 series; most of the points made have been that, for the first wave of buyers in the 2000 series, it should have been better.
 