AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Caporegime
Joined
17 Mar 2012
Posts
47,380
Location
ARC-L1, Stanton System
There is no industry standard formula. They decide a number ahead of time and back into it. The AMD calc literally has THREE floating variables that can change the TDP itself. You're also conflating what board manufacturers are doing with what's strictly a published spec from the chip vendor itself.

Did you notice that the HSF resistance change is the only difference in how they got the 3600/3600x TDP values? That's not any kind of a standard. That's just coming up with a number ahead of time and adjusting the variables to back into it.

Here's a simple question. If you put the same heatsink on a 3600 and a 3600X, would they both consume the same power?

Depends on the heatsink.
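For context, the calc being argued over is AMD's published formula (as written up by GamersNexus): TDP = (tCase - tAmbient) / θca, where θca is the thermal resistance AMD assumes for the heatsink. Here's a minimal Python sketch of it; the temperatures and resistance values are illustrative assumptions, not AMD's actual per-SKU parameters:

```python
# AMD's published TDP formula (per the GamersNexus write-up):
#   TDP (W) = (tCase - tAmbient) / theta_ca
# tCase, tAmbient and theta_ca (the assumed heatsink thermal resistance,
# in degC/W) are the "three floating variables" mentioned above.
# NOTE: the numbers below are illustrative, not AMD's actual parameters.

def tdp_watts(t_case_c: float, t_ambient_c: float, theta_ca: float) -> float:
    """TDP as derived from temperatures and the assumed HSF resistance."""
    return (t_case_c - t_ambient_c) / theta_ca

# Same temperatures; only the assumed heatsink resistance changes,
# and the spec "backs into" two different round numbers:
print(f"65 W-class: {tdp_watts(61.8, 42.0, 0.304):.0f} W")  # -> 65 W
print(f"95 W-class: {tdp_watts(61.8, 42.0, 0.208):.0f} W")  # -> 95 W
```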
 
Caporegime
Joined
17 Mar 2012
Posts
47,380
Location
ARC-L1, Stanton System
I'll save the back and forth here.

If it's a 50 watt heatsink, both CPUs will use 50 watts.
TDP is 50 watts at 70°C: once the CPU reaches 50 watts, the heatsink will be unable to keep it under 70°C and it will throttle.
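To illustrate that claim, here's a toy steady-state model, assuming temperature = ambient + power × θca; the ambient, limit and resistance values are assumptions picked to describe a "50 watt at 70°C" cooler:

```python
# Toy steady-state thermal model: temp = ambient + power * theta_ca.
# Whatever the CPU *wants* to draw, it throttles back to whatever keeps it
# at the 70degC limit, so the heatsink -- not the TDP sticker -- sets the
# sustained power. All values are illustrative assumptions.

T_AMBIENT = 20.0   # degC, assumed room temperature
T_LIMIT = 70.0     # degC, assumed throttle point
THETA_CA = 1.0     # degC/W -> a "50 W at 70degC" cooler

def sustained_power(requested_watts: float) -> float:
    """Highest power the CPU can hold without exceeding T_LIMIT."""
    ceiling = (T_LIMIT - T_AMBIENT) / THETA_CA   # 50 W for this cooler
    return min(requested_watts, ceiling)

for cpu, wants in [("3600 (wants 65 W)", 65.0), ("3600X (wants 95 W)", 95.0)]:
    print(f"{cpu}: settles at {sustained_power(wants):.0f} W")  # both 50 W
```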
 

bru

Soldato
Joined
21 Oct 2002
Posts
7,360
Location
kent
All these companies will put out figures that make them look as good as possible, sometimes even bending what the consumer would understand those figures to mean.

You have to remember that the moment you are not running at the same settings they tested at, you won't get the same results; these caveats are normally buried in the footnotes somewhere at launch time.

TDP is an interesting one, because red, blue and even the green team can decide to care about it or not on any given day, and the fans of each will decide whether to be worried about it accordingly.

As for product placement and being paid for things: this was a big thing a year or so ago, when TechPowerUp used Steve's image in an article about it.

Bottom line, it doesn't happen on Gamers Nexus. It was also picked up by Linus and Jay, Hardware Unboxed and many others, if I recall. It was proven to be completely unfounded then, as it would be now. Apart from anything else, Nvidia wouldn't part with the money anyway.
 
Caporegime
Joined
17 Mar 2012
Posts
47,380
Location
ARC-L1, Stanton System
Product placement is a thing, and I can see how Nvidia wouldn't want Radeon boxes clearly visible in GeForce reviews.
Other than that, it's much easier just to do sponsored content.
Nothing wrong with either of those things.
 
Soldato
Joined
6 Jun 2008
Posts
11,616
Location
Finland
There is no industry standard formula. They decide a number ahead of time and back into it.
Intel does exactly the same, just on a much bigger scale.
If the cooling can keep the temperature under control, 8 cores happily chew all the way to 200 W in AVX loads.
https://www.tomshardware.com/reviews/intel-core-i7-9700k-9th-gen-cpu,5876-2.html

And here you complain about AMD going up to 100 W with 8 cores...
(and 12/16 cores to 140 W)


Intel doesn't have any enforcement with the partners, which is the problem. It's not the defining of the specs; it's that they don't penalize board partners for doing whatever they want to look good in reviews.
AMD locked PCIe 4.0 out of pre-X570 boards in AGESA, so I'm pretty certain Intel could similarly take a tighter grip on power draw.

And Intel would certainly have ways to coerce mobo makers who "do their own thing":
Withholding BIOS code for some time...
Saying next time you'll pay more for the chipset and NIC...
Or that you'll get fewer of them, or get them later, than others... (there is no other supplier of those chipsets)

But Intel doesn't want to change the situation, because they benefit from it: their CPUs get some extra heavy-load performance.
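For the curious, what board partners are playing with is Intel's documented turbo scheme: PL1 (the sustained limit, nominally the TDP), PL2 (the short-term boost limit) and Tau (the window over which average power must settle back to PL1). A toy sketch follows; the EWMA stands in for Intel's actual moving-average algorithm, and the numbers (210 W demand, the 4095 W "unlimited" value) are assumptions:

```python
# Toy model of Intel's PL1/PL2/Tau power limits. A simple exponentially
# weighted moving average (EWMA) stands in for Intel's real moving-average
# window; all numbers are illustrative assumptions.

def allowed_power(demand_w, pl1, pl2, tau_s, dt_s=1.0):
    """Yield the power the CPU may draw at each step under PL1/PL2/Tau."""
    avg = 0.0                             # moving average of recent draw
    alpha = dt_s / tau_s
    for want in demand_w:
        cap = pl2 if avg < pl1 else pl1   # boost until the average hits PL1
        grant = min(want, cap)
        avg += alpha * (grant - avg)
        yield grant

demand = [210.0] * 60  # a minute of heavy AVX load wanting 210 W

spec = list(allowed_power(demand, pl1=95, pl2=210, tau_s=8))
board = list(allowed_power(demand, pl1=4095, pl2=4095, tau_s=448))

print(f"In-spec board:     settles at {spec[-1]:.0f} W")   # ~95 W (the TDP)
print(f"'Unlimited' board: settles at {board[-1]:.0f} W")  # 210 W sustained
```

With PL1 = PL2 set sky-high, a "95 W" chip sustains whatever the VRMs and cooler allow, which is how the 200 W AVX numbers above happen.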


Your post history is just vapid talk with no self-generated data or depth of feedback in any thread. Hopefully it's clearer now.
While your posting history in this thread is straight "AMD bad" bashing and "Intel good" shilling, like from a paid troll.
 
Caporegime
Joined
17 Feb 2006
Posts
29,263
Location
Cornwall
It could be that the 5700XT is sticking around but in a lower slot; maybe the newer cards will be named 5800RXT and 5900RXT.
We've known for ages that RDNA1 was getting a "refresh" and would be part of the new line-up.

Sadly I can't help but suspect all the RDNA2 cards will be priced above the 5700XT. I highly doubt AMD will "refresh" the RDNA1 cards into a much lower price bracket. The manufacturing costs will be roughly the same as before, give or take, so they aren't going to drop the prices all that much.

So if the refreshed cards are part of the new line up, that means the only place for the newer cards is higher up the product stack at greater cost.
 
Soldato
Joined
21 Jul 2005
Posts
19,981
Location
Officially least sunny location -Ronskistats
Interested to see what AMD come away with. I'm looking to swap out a Vega 56 fairly soon to make use of the higher refresh rate of a 1440p monitor.

Similar for me. I got a cheap 4K monitor not long back, and although some older games in my dusty library ran well (Mass Effect: Andromeda), most I would run at 2K with modest settings. Given I thought I'd splashed out at £300 for it, I will only step up a notch in budget this time round, so that will rule out anything like the 3080 or above.

I am hoping a weaker flavour of Big Navi can punch above a 2080 Ti for, say, £500; that would see me out for the time being.
 
Soldato
Joined
19 Apr 2012
Posts
5,182
Similar for me. I got a cheap 4K monitor not long back, and although some older games in my dusty library ran well (Mass Effect: Andromeda), most I would run at 2K with modest settings. Given I thought I'd splashed out at £300 for it, I will only step up a notch in budget this time round, so that will rule out anything like the 3080 or above.

I am hoping a weaker flavour of Big Navi can punch above a 2080 Ti for, say, £500; that would see me out for the time being.

I would be happy with that! That would suit me perfectly for my needs. I may have missed it in the thread, but is there any idea of when an announcement could be made?
 