
The thread which sometimes talks about RDNA2

Soldato
Joined
18 Feb 2015
Posts
6,484
I think this is without DLSS. On my 3080 FE with the Ultra preset + RT Ultra + DLSS Balanced I get 60fps on my 4K 60Hz monitor. I played about 3hrs yesterday and the lowest I saw was around 55. I haven't tried it with DLSS off.
Balanced is bad in this game, so you're really playing with a gimped version. Quality is ok, but still nothing to write home about. Even on quality, go near a puddle and compare with/without DLSS and you will notice DLSS cannot resolve reflections properly and is significantly blurrier than native.
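For context on the Quality/Balanced gap: DLSS renders internally at a reduced resolution and upscales, so the lower modes start from far fewer pixels. A minimal sketch using the commonly cited per-axis scale factors (approximate figures, not taken from this thread):

```python
# Rough internal render resolutions per DLSS 2.x mode at 4K output.
# The per-axis scale factors are the commonly cited ones and are
# approximate, not official figures.
OUTPUT_W, OUTPUT_H = 3840, 2160  # 4K output
MODES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

for mode, scale in MODES.items():
    w, h = round(OUTPUT_W * scale), round(OUTPUT_H * scale)
    print(f"{mode:11} renders {w}x{h} ({scale * scale:.0%} of output pixels)")
```

Balanced works from roughly a third of the output pixels, which is consistent with it looking noticeably softer than Quality.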

1. Is ray tracing even implemented in a huge number of games?
2. An FPS player like me who enjoys high frames would probably just turn this off.
3. Does ray tracing make THAT MUCH of an impact visually?
1. No
2. Yes
3. It depends and it's also subjective. I would say in something like Minecraft RTX or Control, it's a proper showcase technology. On the other hand, in BF V, in WD:L etc it's kinda meh. It does what it says on the tin, but not worth enabling imo.


Why ignore the DLSS charts?
Because DLSS gimps the image significantly and in particular in relation to RT. It cannot resolve reflections properly. So why would I compare a blurry image with native? It makes no sense. And I don't see how a 49 fps average WITH DLSS on is a boon either?


4K has always been more e-peen than a smart choice. Now the argument is that people will be using 4K TVs, but then they sit 2+ metres from the screen. Run it at a sensible resolution such as 1440p. Very playable.

Personally I would have preferred a drop in raster performance and more of the die be devoted to RT.

I actually do use a 4K TV as a display and sit about 1.5m away from it; it's definitely worth it 99% of the time. Sometimes you can get away with 1800p just as well (KCD) or 1440p (FH4).
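Whether 4K is worth it at a given distance comes down to geometry: pixels per degree (PPD) of visual angle. A quick sketch, assuming a hypothetical 55-inch 16:9 panel (no one in the thread stated a size); ~60 PPD is a common rule of thumb for where extra resolution stops being visible:

```python
# Pixels per degree (PPD) for a TV at two viewing distances.
# The 55" screen size is an assumption for illustration only.
import math

def ppd(h_pixels, diag_inches, distance_m, aspect=(16, 9)):
    # Horizontal pixels divided by the horizontal field of view in degrees.
    diag_m = diag_inches * 0.0254
    width_m = diag_m * aspect[0] / math.hypot(*aspect)
    h_fov = 2 * math.degrees(math.atan(width_m / 2 / distance_m))
    return h_pixels / h_fov

for distance in (1.5, 2.5):
    for name, px in (("4K", 3840), ("1440p", 2560)):
        print(f'55" {name} at {distance} m: {ppd(px, 55, distance):.0f} PPD')
```

With these assumed numbers, 1440p at 2.5 m already clears the ~60 PPD mark, while at 1.5 m it falls to roughly 58, so 4K is plausibly distinguishable there, which fits both posts above.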
 
Soldato
Joined
6 Aug 2009
Posts
7,071
Yup, you are correct: once a precedent with that kind of price is set, and people STILL buy it regardless, that price becomes the benchmark for the future and will even be exceeded, as we have seen in the case of the outrageously priced 3090. :mad:

I mean seriously wtf... just think about it... 150% extra cost for 15% extra performance. It would make me feel physically sick to my stomach and ashamed of myself to buy one and fuel that madness. :o
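In cost-per-performance terms, using the 150%/15% figures above:

```python
# Cost per unit of performance for the quoted 150% extra cost /
# 15% extra performance, with the baseline card normalised to 1.0.
cost_ratio = 2.50   # 1.0 + 150% extra cost
perf_ratio = 1.15   # 1.0 + 15% extra performance
print(f"{cost_ratio / perf_ratio:.2f}x the cost per unit of performance")  # ~2.17x
```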

So many people in the AMD Big Navi thread were saying that if AMD released a 3090 competitor it wouldn't be any cheaper, as "they're not a charity and don't want to be seen as budget". Well, AMD have completely destroyed that logic by pricing the 6900 XT fairly: a far better engineered and more efficient gaming GPU that is cheaper to produce and far smaller, yet performs on a par with a card costing 60-70% more. :)

I see all this stuff as an opportunity to buy smart and avoid getting gouged. I do the same with the supermarkets, it's become something of a sport now :cool:
 
Soldato
Joined
28 Oct 2009
Posts
5,291
Location
Earth
Well, I'm all set for the 6800 XT after looking at the 1440p charts, and that was without Smart Access Memory/Rage Mode, so it looks promising, but I will decide once reviews are out. I'm looking to buy one of the partner cards; it will be interesting to see what boost speeds they can get up to. Maybe they'll get close to 6900 XT performance?

I still read that some have concerns about going to AMD because of drivers. I have owned the Sapphire 5700 XT Nitro+ SE for going on a year now and never had issues; maybe the drivers had been fixed by the time I got one? There's also a Hardware Unboxed video saying they never encountered issues. Some of these people, I think, have never owned an AMD GPU and just follow the bandwagon.

It feels like the Nvidia camp only have ray tracing, which eats up FPS anyway. Personally I feel it will be more useful for next gen; I prefer to have more FPS. Hopefully AMD can add something similar to DLSS.
 
Associate
Joined
14 Dec 2016
Posts
958
I mean, I get what you are saying, but AMD are using DirectX, which is the standard implementation. Plus, I presume any console developers will use DXR over RTX to avoid needing two different implementations.

This very much so...

AMD positioned themselves to be in the consoles, and this gen their PC hardware is very similar to the console hardware, so they can leverage many of the features of DX12U. I wonder if there are some features of DX12U that Nvidia won't be able to use and AMD will: stuff that will be in both the Xbox Series X and the PC? Any game using those features would automatically perform better on AMD with Ryzen 3 and 6xxx GPUs, no?
 
Caporegime
Joined
8 Jul 2003
Posts
30,062
Location
In a house
Balanced is bad in this game, so you're really playing with a gimped version. Quality is ok, but still nothing to write home about. Even on quality, go near a puddle and compare with/without DLSS and you will notice DLSS cannot resolve reflections properly and is significantly blurrier than native.


1. No
2. Yes
3. It depends and it's also subjective. I would say in something like Minecraft RTX or Control, it's a proper showcase technology. On the other hand, in BF V, in WD:L etc it's kinda meh. It does what it says on the tin, but not worth enabling imo.



Because DLSS gimps the image significantly and in particular in relation to RT. It cannot resolve reflections properly. So why would I compare a blurry image with native? It makes no sense. And I don't see how a 49 fps average WITH DLSS on is a boon either?




I actually do use a 4K TV as a display and sit about 1.5m away from it; it's definitely worth it 99% of the time. Sometimes you can get away with 1800p just as well (KCD) or 1440p (FH4).

AMD's version of DLSS is supposed to look worse than that, according to RGT.

I mean, I get what you are saying, but AMD are using DirectX, which is the standard implementation. Plus, I presume any console developers will use DXR over RTX to avoid needing two different implementations.

I thought Nvidia were also using the standard DirectX DXR, as RTX isn't the RT itself; it's just the branding of their cards that have hardware RT. The R just denotes that: a card without hardware RT is GTX, a card with hardware RT is RTX.
 
Associate
Joined
19 Jun 2017
Posts
1,029
I mean, I get what you are saying, but AMD are using DirectX, which is the standard implementation. Plus, I presume any console developers will use DXR over RTX to avoid needing two different implementations.

See, what happens is that DXR is like an API (basically a library of standardised functions).
Now folks need to write code against DXR, and performance depends on the creativity of the programmer.
This code will also interact with existing (non-DXR) data structures, which makes it more complicated.

I am not confident there will be a unified implementation for both cards atm.
Let's see how it all pans out...

Edit: I feel AMD's implementation is more flexible and can therefore benefit more from a well-crafted implementation.
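To make the "library of standardised functions" point concrete: what DXR standardises is, roughly, "generate a ray per pixel, intersect it against the scene, shade on hit or miss". Here is a toy CPU sketch of that loop; real DXR is driven from C++/HLSL with hardware-built acceleration structures, and every name and number below is invented for illustration:

```python
# Toy version of the work a raygen + intersection pipeline performs.
# Real DXR replaces this brute-force hit test with hardware-traversed
# acceleration structures and runs the shading in HLSL shaders.
import math

def hit_sphere(origin, direction, center, radius):
    # Nearest positive t with |origin + t*direction - center| == radius.
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4 * c  # direction is normalised, so a == 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

# "Raygen": one ray per pixel of a tiny 4x2 image, fired towards -z
# at a unit sphere sitting 3 units in front of the camera.
center, radius = (0.0, 0.0, -3.0), 1.0
for y in range(2):
    row = ""
    for x in range(4):
        d = (x / 4 - 0.5, y / 2 - 0.5, -1.0)
        norm = math.sqrt(sum(v * v for v in d))
        d = tuple(v / norm for v in d)
        row += "#" if hit_sphere((0.0, 0.0, 0.0), d, center, radius) else "."
    print(row)
```

How fast this runs in a real engine then depends on exactly the things mentioned above: how the rays are generated, how the scene data is laid out, and how the DXR code interacts with the engine's existing structures.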
 
Soldato
Joined
12 May 2014
Posts
5,235
It's not hard to find benchmarks showing Quake 2 RTX at 1440p running at greater than 60 FPS on a 3080, and that's a fully path-traced, denoised scene. So pretending the hardware is not ready makes no sense.

I can't find any demos of AMD's RT performance, but as it uses shared hardware it may not even be meeting the disappointing 2080 Ti levels.
What's Nvidia's performance in the Marbles demo with and without DLSS? From memory that has "up to date" geometry and textures. Admittedly it is a tech demo, but it means we know the upper limit.
 
Associate
Joined
14 Dec 2016
Posts
958
Balanced is bad in this game, so you're really playing with a gimped version. Quality is ok, but still nothing to write home about. Even on quality, go near a puddle and compare with/without DLSS and you will notice DLSS cannot resolve reflections properly and is significantly blurrier than native.


1. No
2. Yes
3. It depends and it's also subjective. I would say in something like Minecraft RTX or Control, it's a proper showcase technology. On the other hand, in BF V, in WD:L etc it's kinda meh. It does what it says on the tin, but not worth enabling imo.



Because DLSS gimps the image significantly and in particular in relation to RT. It cannot resolve reflections properly. So why would I compare a blurry image with native? It makes no sense. And I don't see how a 49 fps average WITH DLSS on is a boon either?




I actually do use a 4K TV as a display and sit about 1.5m away from it; it's definitely worth it 99% of the time. Sometimes you can get away with 1800p just as well (KCD) or 1440p (FH4).

That is quite the visual hit you are sacrificing for performance. However, in a fast-paced FPS game I can see it not being an issue, unless enemy players at distance become harder to spot. Like I say, ray tracing right now is, for me, purely a screenshot mode you enable when you want to show how good something looks; definitely not something you want to leave on full time if it gimps your performance so much. A lot, I guess, varies by game type: in fast-paced games, can you honestly say you're noticing all the ray tracing effects? In slower-paced games you will, and in those types of games I guess lower FPS won't be too much of a hindrance.

Cyberpunk 2077 will be interesting, as it's generally a slower-paced game with some FPS elements.
 
Soldato
Joined
14 Jul 2005
Posts
8,343
Location
Birmingham
Because DLSS gimps the image significantly and in particular in relation to RT. It cannot resolve reflections properly. So why would I compare a blurry image with native? It makes no sense. And I don't see how a 49 fps average WITH DLSS on is a boon either?


If a game is being played to take high-quality screenshots for publication, then by all means run it with settings on max, ray tracing on, etc. However, if you're running past shooting baddies, then the reflection quality in the puddle isn't that critical, so turn off ray tracing or turn on DLSS or whatever.

Like all things, the significance depends on the use case. We all want better graphics, of course, but you've got to put things into context.
 
Man of Honour
Joined
30 Oct 2003
Posts
13,251
Location
Essex
I can't see any benefit to putting 24GB on the 6900. The 3090 having that amount of RAM is mainly of benefit to content creators, 3D artists and others using the card for productivity. AMD are targeting their cards at gamers, and they're not pretending 8K gaming is a reality, so there's nothing to be gained from more RAM. Nvidia still leads in productivity due to the architecture of their cards, with CUDA acceleration baked into a lot of productivity apps, and because the 3090 is more of a compute card than gamer-focused.

Sorry, but the 3090 is not a compute card: it doesn't have pro-level drivers like a Titan and it has half-rate FP32. It is not a compute card.
 
Associate
Joined
26 Apr 2017
Posts
1,252
How is it that two companies spend years building their own architecture, independently from each other, and they're pretty much the same speed?

Did AMD have enough time to check the speed of Nvidia's cards, and add just enough compute units to equal them? Or is something else going on?

Cards do maths, simply: pixels, triangles, etc.
Then we have latency: if the GPU has to wait to run an instruction, it takes a latency hit, so all the engineering basically tries to eliminate latency hits as much as doable.
Infinity Cache and Smart Access Memory remove as much of that latency hit as possible.
If you've seen a bakery, buns come off the line at a regular interval, and any time the machine stops or an error happens, fewer buns are produced per hour.

So the solutions AMD used to eliminate latency hits and bandwidth limitations, Infinity Cache and Smart Access Memory, showcase why they caught up to Nvidia, can surpass them, and ensure the AMDominance for gaming.
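A minimal sketch of that bakery point, with invented cycle counts (not AMD's real figures): a large on-die cache cuts the average wait per memory access, which keeps the production line moving.

```python
# Average memory latency under a cache. All numbers are hypothetical,
# purely to illustrate why a high hit rate in a big on-die cache
# (the Infinity Cache idea) reduces stalls.
CACHE_HIT_CYCLES = 20    # assumed on-die cache hit latency
DRAM_MISS_CYCLES = 300   # assumed miss-to-GDDR6 latency

def average_latency(hit_rate):
    return hit_rate * CACHE_HIT_CYCLES + (1 - hit_rate) * DRAM_MISS_CYCLES

for hit_rate in (0.0, 0.5, 0.9):
    print(f"hit rate {hit_rate:.0%}: {average_latency(hit_rate):.0f} cycles on average")
```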
 
Soldato
Joined
26 Oct 2013
Posts
4,012
Location
Scotland
It's a rumour from RGT; it was in one of his videos. He says it's coming in a future driver update and is faster than Nvidia's, as it doesn't look as good.

I have followed things pretty well and never heard any tangible information about it at all. It's just a name currently for me.

If it looks worse then it won't be used. People defending DLSS blurring by saying you don't notice it during gameplay is weird to me, but there we go.
 
Soldato
Joined
20 Oct 2004
Posts
13,059
Location
Nottingham
Which CPU are you going to get? A 5900X/5950X would do a very good job at V-Ray on its own.

5900X

You could also switch render engines.

That's not really practical from a workflow or existing-investment point of view; I don't really want to spend the equivalent of the GPU on new software. The choice is simply: forget GPU rendering and hope that the 5900X's 24 threads are acceptable (it's never going to be equal, because of hybrid rendering where you utilise both GPU and CPU), or stick with green (I don't want to do this, as I think in this instance AMD deserve the support).
 
Soldato
Joined
14 Jul 2005
Posts
8,343
Location
Birmingham
How is it that two companies spend years building their own architecture, independently from each other, and they're pretty much the same speed?

Did AMD have enough time to check the speed of Nvidia's cards, and add just enough compute units to equal them? Or is something else going on?

It's the fabs that progress the underlying chip technology itself, and as well as their own R&D budgets they're probably working with academic institutions too. Hence they are all progressing the same sorts of underlying technology at the same rate, which they then put out to market.

Then the likes of AMD and Nvidia buy into these via their contracts with the fabs, probably two years before products are out of the door. The GPUs we're seeing now are based on technologies that were being researched two years ago.
 