
The thread which sometimes talks about RDNA2

Caporegime | Joined: 17 Mar 2012 | Posts: 47,559 | Location: ARC-L1, Stanton System
RTX is Nvidia's hardware platform. DXR is the raytracing API in DX12 Ultimate (DX12U), which is how games access the RTX platform. Control is a DX12U game, which means it uses DXR. The 20-series GPUs were the first to support DXR in hardware on the PC. DX12U unifies Windows PCs with the upcoming Xbox Series X platform.


You keep repeating yourself. DXR and RTX are not the same thing; if devs are anything to go by, it remains to be seen whether Nvidia can actually run Microsoft DXR.
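As an aside (not from either poster): whether a card exposes hardware DXR is something any DX12 application can query for itself through the standard D3D12 API, independent of the RTX branding. A minimal sketch, assuming the standard Windows SDK headers and linking against d3d12.lib:

    // Query the DXR (raytracing) tier through plain D3D12 -- no vendor extensions.
    #include <d3d12.h>
    #include <wrl/client.h>
    #include <cstdio>

    int main()
    {
        Microsoft::WRL::ComPtr<ID3D12Device> device;
        if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                     IID_PPV_ARGS(&device))))
            return 1;  // no DX12-capable adapter found

        D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
        if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                                  &opts5, sizeof(opts5))))
        {
            if (opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_1)
                std::puts("DXR 1.1 reported by the driver");
            else if (opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
                std::puts("DXR 1.0 reported by the driver");
            else
                std::puts("No DXR support reported");
        }
        return 0;
    }

Games go through this same vendor-agnostic D3D12 path whether the card is GeForce or Radeon; the branding sits behind the driver.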
 
Associate | Joined: 15 Nov 2018 | Posts: 163
I'm just saying it would be better if you got out of that proprietary lock-in.

No one can deny that proprietary lock-in is bad, but at the same time, if he plans to stick with a G-Sync panel for a while, the 3080 is a no-brainer.

In terms of the raytracing argument, AMD and Nvidia are using different implementations to achieve the same thing, so there will be a difference in performance depending on which method the dev chooses to optimize for.
 
Associate | Joined: 9 May 2007 | Posts: 1,284
You keep repeating yourself. DXR and RTX are not the same thing; if devs are anything to go by, it remains to be seen whether Nvidia can actually run Microsoft DXR.

This is a guy who thinks he can judge the best graphics card. Who knows how much VRAM developers will need. Could the lack of understanding be any more total?

Both Turing and RDNA2 support DXR 1.1: https://www.techpowerup.com/271484/...eature-level-12-2-turing-and-rdna2-support-it

Currently, NVIDIA's "Turing" based GeForce RTX 20-series are the only GPUs capable of feature-level 12_2. Microsoft announced that AMD's upcoming RDNA2 architecture supports 12_2, too.
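For completeness, feature level 12_2 is also just a driver query. A minimal sketch, assuming a Windows SDK recent enough to define D3D_FEATURE_LEVEL_12_2:

    // Ask the D3D12 driver for the highest supported feature level.
    #include <d3d12.h>
    #include <wrl/client.h>
    #include <cstdio>

    int main()
    {
        Microsoft::WRL::ComPtr<ID3D12Device> device;
        if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                     IID_PPV_ARGS(&device))))
            return 1;

        const D3D_FEATURE_LEVEL requested[] = {
            D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_12_0,
            D3D_FEATURE_LEVEL_12_1, D3D_FEATURE_LEVEL_12_2,
        };
        D3D12_FEATURE_DATA_FEATURE_LEVELS levels = {};
        levels.NumFeatureLevels = static_cast<UINT>(sizeof(requested) / sizeof(requested[0]));
        levels.pFeatureLevelsRequested = requested;

        if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS,
                                                  &levels, sizeof(levels))) &&
            levels.MaxSupportedFeatureLevel >= D3D_FEATURE_LEVEL_12_2)
        {
            std::puts("Feature level 12_2 supported");
        }
        return 0;
    }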
 
Caporegime | Joined: 17 Mar 2012 | Posts: 47,559 | Location: ARC-L1, Stanton System
No one can deny that proprietary lock-in is bad, but at the same time, if he plans to stick with a G-Sync panel for a while, the 3080 is a no-brainer.

In terms of the raytracing argument, AMD and Nvidia are using different implementations to achieve the same thing, so there will be a difference in performance depending on which method the dev chooses to optimize for.

They use completely different async engines at the hardware level. Up till now Nvidia have been using their own version of it for RT; that is why I distinguish RTX from DXR, they are not the same thing. DXR was designed for consoles in partnership with AMD for RDNA2.
 

TNA
Caporegime | Joined: 13 Mar 2008 | Posts: 27,508 | Location: Greater London
I'm just saying it would be better if you got out of that proprietary lock-in.
No, you said I made a mistake. It was not one.

I am not locked into ****. This screen has been powered by an AMD card for longer than an Nvidia one. When the time is right I will be upgrading to an OLED that does both FreeSync 2 and G-Sync Compatible. Hell, I have a 1440p 165Hz FreeSync 2/G-Sync Compatible monitor downstairs that the missus uses. But I prefer this monitor.

Buying a monitor is not as easy as you make it sound; it is a bloody lottery in itself. I managed to get this one on my second try. For me to sell this now and buy a monitor that is 4K and does both FreeSync and G-Sync, we are talking about a lot of money. You gonna help me pay for that? Nope. Not to mention all the other facts I gave you. G-Sync was just one. What about my three most anticipated games being Nvidia sponsored and having RT and DLSS? You going to tell me I made a mistake in taste too? Should I have preferred Godfall instead of Cyberpunk?

But I made a mistake...

See, this is what I am saying. It is not black and white like you make it out to be. You need to look at an individual's needs before going on like you do. Both the 3080 FE and 6800 XT are great cards; there is no mistake. So ease up with the fanboy crap please ;)
 
Caporegime | Joined: 17 Mar 2012 | Posts: 47,559 | Location: ARC-L1, Stanton System
No, you said I made a mistake. It was not one.

I am not locked into ****. This screen has been powered by an AMD card for longer than an Nvidia one. When the time is right I will be upgrading to an OLED that does both FreeSync 2 and G-Sync Compatible. Hell, I have a 1440p 165Hz FreeSync 2/G-Sync Compatible monitor downstairs that the missus uses. But I prefer this monitor.

Buying a monitor is not as easy as you make it sound; it is a bloody lottery in itself. I managed to get this one on my second try. For me to sell this now and buy a monitor that is 4K and does both FreeSync and G-Sync, we are talking about a lot of money. You gonna help me pay for that? Nope. Not to mention all the other facts I gave you. G-Sync was just one. What about my three most anticipated games being Nvidia sponsored and having RT and DLSS? You going to tell me I made a mistake in taste too? Should I have preferred Godfall instead of Cyberpunk?

But I made a mistake...

See, this is what I am saying. It is not black and white like you make it out to be. You need to look at an individual's needs before going on like you do. Both the 3080 FE and 6800 XT are great cards; there is no mistake. So ease up with the fanboy crap please ;)

You explained you bought the screen before FreeSync was a thing; when you put things in the right order you have no reason to feel burned.
 

TNA
Caporegime | Joined: 13 Mar 2008 | Posts: 27,508 | Location: Greater London
You explained you bought the screen before FreeSync was a thing; when you put things in the right order you have no reason to feel burned.
I gave you many reasons, but they were not good enough as you kept saying I made a mistake.

There is nothing to feel burned about. I am super happy with this card as I do not take sides. I enjoy the hardware ;)
 
Associate | Joined: 9 May 2007 | Posts: 1,284
This video tests different CPUs with the AMD 6800 XT.

Just interesting, but the 9900K performance shown in the video is lower than what the reviews got for the 6800 XT with a 9900K.
https://www.guru3d.com/articles_pages/amd_radeon_rx_6800_xt_review,13.html


Assassin's Creed Valhalla, 1080p
The Core i9-9900K (8c/16t) at defaults gets 85fps at 1080p in the Guru3D 6800 XT review, but in the video it gets 74fps on a 9900K @ 5GHz with a 6800 XT? Time 2:25.

Intel Core i9-9900K @ 5.0 GHz: TechPowerUp got 97.9fps with a 3080. https://www.techpowerup.com/review/...la-benchmark-test-performance-analysis/4.html
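Purely to make the gap in those quoted figures explicit (numbers copied from the links above), a quick sketch of the relative differences:

    // Relative differences between the fps figures quoted above.
    #include <cstdio>

    int main()
    {
        const double guru3d_6800xt_9900k = 85.0;  // Guru3D review, 6800 XT + 9900K, 1080p
        const double video_6800xt_9900k  = 74.0;  // linked video, 6800 XT + 9900K @ 5GHz
        const double tpu_3080_9900k      = 97.9;  // TechPowerUp, 3080 + 9900K @ 5GHz

        std::printf("Video vs Guru3D (6800 XT + 9900K): %.1f%% lower\n",
                    100.0 * (1.0 - video_6800xt_9900k / guru3d_6800xt_9900k));
        std::printf("TPU 3080 vs Guru3D 6800 XT: %.1f%% higher\n",
                    100.0 * (tpu_3080_9900k / guru3d_6800xt_9900k - 1.0));
        return 0;
    }

That works out to roughly 13% lower in the video than in the Guru3D review, and roughly 15% higher for the 3080 figure.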

[Embedded result: 5900X + 6900 XT]
https://adrenaline.com.br/artigos/v...erformance-e-como-fica-a-disputa-com-rtx-3080

[Embedded result: 5900X, 1440p]

[Embedded result: Intel Core i9-9900K @ 5.0 GHz, 1440p]

[Embedded result: 10900K, 4K]
https://www.tweaktown.com/news/7614...alla-does-hit-4k-60fps-on-rtx-3090/index.html

[Embedded result: 5900X, 4K]
Even an AMD Ryzen 9 3900X paired with a 3080 is faster than a 5900X paired with one.



Looks like the Nvidia cards are slower on Ryzen 5000-series CPU systems.
 
Status
Not open for further replies.