
AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Status
Not open for further replies.
Soldato
Joined
6 Feb 2019
Posts
17,464
Suppose it depends on resolution also. I am at 1440p 165hz. There just isn't enough of a jump to make it worthwhile for me at £650.

It's not bad, it's decent. But I was expecting better, and it curbs my urge to grab one tomorrow without waiting for Big Navi. I have come very close.

The 3080 is a 4K card. It's been mentioned in multiple reviews today: you should not buy this card if you don't game at 4K, and if you are crazy enough to use it at 1080p or 1440p, make sure you pair it with the fastest gaming CPU and overclock the crap out of it. A 3080 at 1080p should be paired with a 10900K overclocked to 5.3GHz; otherwise you should not waste your money.
 
Soldato
Joined
10 Oct 2012
Posts
4,415
Location
Denmark
Or TSMC is a bad choice, altogether, and AMD is screwed :(

"Pricing. Samsung has the best wafer pricing the industry has ever seen. Being the largest memory manufacturer does have its advantages and wafer pricing is one of them."
https://semiwiki.com/semiconductor-manufacturers/samsung-foundry/7926-samsung-vs-tsmc-7nm-update/

TSMC is a fab leader. Nvidia got greedy one way or another. Business as usual. If AMD is screwed it's of their own making and not due to TSMC 7nm node.
 
Soldato
Joined
19 Apr 2012
Posts
5,182
The 3080 is a 4K card. It's been mentioned in multiple reviews today: you should not buy this card if you don't game at 4K, and if you are crazy enough to use it at 1080p or 1440p, make sure you pair it with the fastest gaming CPU and overclock the crap out of it. A 3080 at 1080p should be paired with a 10900K overclocked to 5.3GHz; otherwise you should not waste your money.

So a 3070 would be recommended for 1440P 144hz if someone were to go down the NVidia route?
 
Soldato
Joined
30 Dec 2011
Posts
5,419
Location
Belfast
Or TSMC is a bad choice, altogether, and AMD is screwed :(

"Pricing. Samsung has the best wafer pricing the industry has ever seen. Being the largest memory manufacturer does have its advantages and wafer pricing is one of them."
https://semiwiki.com/semiconductor-manufacturers/samsung-foundry/7926-samsung-vs-tsmc-7nm-update/

I don't think so because Samsung with its low pricing enables the $700 RTX 3080.
With TSMC it could have cost north of $1000, easily.

TSMC 7nm is an established process, and if AMD's Navi is messed up it would be their own fault. The chances of that happening are slim, though, because AMD have had RDNA and Zen CPUs on 7nm for a while now. We will find out soon enough.
 
Soldato
Joined
14 Sep 2008
Posts
2,616
Location
Lincoln
So a 3070 would be recommended for 1440P 144hz if someone were to go down the NVidia route?

For today, yes -- but a year or two from now, when games are released with the next-gen consoles as the lowest common denominator, a 3080 might be the better choice... or you could get the 3070 now and upgrade in a year or two if needed. Whichever way suits :)
 
Caporegime
Joined
18 Oct 2002
Posts
39,267
Location
Ireland
Don't question the master plan. It's working perfectly... for nVidia.

Even if they did release info, people would still go with the "err meh gawd drivers" pish and buy Nvidia anyway. And we all saw what happened with the performance claims of the 3080; people would be equally sceptical of any AMD performance claims and not want to wait to find out one way or the other.
 
Soldato
Joined
19 Apr 2012
Posts
5,182
For today, yes -- but a year or two from now, when games are released with the next-gen consoles as the lowest common denominator, a 3080 might be the better choice... or you could get the 3070 now and upgrade in a year or two if needed. Whichever way suits :)

Yeah fair point.
I'll keep it in mind once I see AMD's hand. Would quite like a successor to my Vega56.
 
Associate
Joined
4 May 2012
Posts
80
The 3080 is a 4K card. It's been mentioned in multiple reviews today: you should not buy this card if you don't game at 4K, and if you are crazy enough to use it at 1080p or 1440p, make sure you pair it with the fastest gaming CPU and overclock the crap out of it. A 3080 at 1080p should be paired with a 10900K overclocked to 5.3GHz; otherwise you should not waste your money.

What about people who are trying to pull the highest frames possible? There's hardly any point getting a 3080 and using it at 4K when in most games you will be averaging 60fps or less.

I am all about high framerate with highish detail. What's the point in having a 165Hz monitor if you can't get a GPU to utilize it?
 

ljt

Soldato
Joined
28 Dec 2002
Posts
4,540
Location
West Midlands, UK
What about people who are trying to pull the highest frames possible? There's hardly any point getting a 3080 and using it at 4K when in most games you will be averaging 60fps or less.

I am all about high framerate with highish detail. What's the point in having a 165Hz monitor if you can't get a GPU to utilize it?

If all you care about is frames per second and you want to extract ridiculous frame rates at 1440p, then an R5 3600 really wasn't the best choice of CPU for the job.
 
Soldato
Joined
19 May 2012
Posts
3,633
The 3080 is a 4K card. It's been mentioned in multiple reviews today: you should not buy this card if you don't game at 4K, and if you are crazy enough to use it at 1080p or 1440p, make sure you pair it with the fastest gaming CPU and overclock the crap out of it. A 3080 at 1080p should be paired with a 10900K overclocked to 5.3GHz; otherwise you should not waste your money.


Thank you.
 
Soldato
Joined
12 May 2014
Posts
5,225

RDNA 2 info
I would recommend that everyone at least watches 11:17 to 12:26. It is a snippet from the Cerny presentation that discusses RDNA 2. It seems that some of us may have glossed over it.

He basically goes over AMD's design goals for RDNA 2. One of the goals he mentions seems to loosely "support" the massive cache that Paul mentioned. Just for the record, that doesn't mean it is true.
 
Soldato
Joined
19 May 2012
Posts
3,633
What about people who are trying to pull the highest frames possible? There's hardly any point getting a 3080 and using it at 4K when in most games you will be averaging 60fps or less.

I am all about high framerate with highish detail. What's the point in having a 165Hz monitor if you can't get a GPU to utilize it?

I'm sure there are cheaper cards which will still get you a very acceptable framerate.

If you MUST have the highest of high framerates all the time, and you can instantly tell the difference between 165Hz and 120Hz when it dips, then sure... give Nvidia your money. But with the VRR tech packed into our displays, it's fairly logical to think the money saved by going for a cheaper card can go towards the display, audio, etc.
 
Permabanned
Joined
16 Sep 2020
Posts
25
I saw a claim that Nvidia tried to bluff TSMC for a lower price. They did the arrogant, "we are Nvidia and we will go elsewhere if you don't give us a better price than all your other customers". TSMC basically said "knock yourself out, we can sell all the 7nm wafers we can make". This basically meant Nvidia could not get the 7nm wafers for Ampere and had no choice but to use the inferior Samsung 8nm process.

If it's true then fair play to TSMC not giving in to the bullyboy tactics. It left Nvidia on an inferior node with clear power issues. One thing is 100% accurate and factual... Ampere is on Samsung 8nm. The question is "why"?
  • Either Nvidia decided to up their margins by using a cheaper and inferior process
  • Or Nvidia felt they had such a lead over AMD that they could afford to cheap out and go for a cheaper and inferior node
  • Or Nvidia tried to strong-arm TSMC for a preferential wafer price, but TSMC declined their "generous offer"
In every case, Nvidia thought "profit". Pick one or all of the above, or even make up your own reason, but it won't change the fact that they did it, and it looks like that decision was a poor one.

What was funny is the AdoredTV "Overvolted" from a few weeks ago, where they laughed and declared that Ampere was 100% on TSMC 7nm because "why would they use the crap Samsung process".
Lol
 
Associate
Joined
4 May 2012
Posts
80
If all you care about is frames per second and you want to extract ridiculous frame rates at 1440p, then really a R5 3600 really wasn't the best choice of CPU for the job.

I wouldn't say it's 'all' I care about. I like a mixture of high detail and good frames. I was sick of console gaming and getting frame drops. I am aiming for 100fps+ on high detail at 1440p to give room for frame drops. I got the 3600 because it gives good gaming performance and is good in multithreaded loads when I need it.

I am prepared to go 4800-4900x later this year.
 