
AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Soldato - Joined 26 Sep 2010 - Posts: 7,146 - Location: Stoke-on-Trent
Get down off that high horse of yours, buddy; the air's a bit thin up there, so you've clearly missed the point. Let me summarise the GN TDP video for you: "AMD put this number on their box but it's higher in real life! Evil! Lying AMD!", and then not a single mention that Intel do exactly the same. So frankly you can take the pomposity about not understanding Intel's specs, or the allusions to technical ignorance, and put it elsewhere; it's irrelevant to the point actually being made.

I'm not quite sure what extensive testing you are expecting from a rumour discussion thread, but clearly the lack of meat and real numbers has been a great disappointment to you. Care to run along then?
 
Associate - Joined 28 Sep 2018 - Posts: 2,242
Get down off that high horse of yours, buddy; the air's a bit thin up there, so you've clearly missed the point. Let me summarise the GN TDP video for you: "AMD put this number on their box but it's higher in real life! Evil! Lying AMD!", and then not a single mention that Intel do exactly the same. So frankly you can take the pomposity about not understanding Intel's specs, or the allusions to technical ignorance, and put it elsewhere; it's irrelevant to the point actually being made.

I'm not quite sure what extensive testing you are expecting from a rumour discussion thread, but clearly the lack of meat and real numbers has been a great disappointment to you. Care to run along then?

Your post history is just vapid talk, with no self-generated data or depth of feedback in any thread. Hopefully it's clearer now.
 
Associate - Joined 28 Sep 2018 - Posts: 2,242
Why not? It's really simple: 3700X, 65 Watt TDP at 3.6 GHz; 9900K, 95 Watt TDP at 3.6 GHz. At those clocks they both use TDP levels of power; anything above that is power limit (PL) boosting.
Partners are instructed to use minimum TDP cooling.

Intel doesn't have any enforcement with the partners, which is the problem. It's not how the specs are defined; it's that they don't penalize board partners for doing whatever they want to look good in reviews. That's the issue Intel has and needs to correct. GN went through this in pretty good detail.

It's also a five-minute test you can do yourself on any platform.
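If anyone wants to try that five-minute test for themselves, something like the sketch below is all it takes on a Linux box with the RAPL powercap interface exposed. The sysfs path and the 60-second window are just illustrative (recent Intel and AMD platforms both expose a package domain there, but check your own machine); on Windows, HWiNFO's package power sensor does the same job.

```python
# Rough sketch of the "five-minute test": sample CPU package power via the
# Linux RAPL powercap interface and print the average draw. The sysfs path
# below assumes an intel-rapl package domain is exposed; adjust for your machine.
import time

ENERGY_FILE = "/sys/class/powercap/intel-rapl:0/energy_uj"  # package energy counter, microjoules

def read_energy_uj():
    with open(ENERGY_FILE) as f:
        return int(f.read().strip())

def average_package_power(seconds=60, interval=1.0):
    watts = []
    last = read_energy_uj()
    for _ in range(int(seconds / interval)):
        time.sleep(interval)
        now = read_energy_uj()
        if now < last:        # counter wrapped around, skip this sample
            last = now
            continue
        watts.append((now - last) / 1e6 / interval)  # uJ -> J -> W
        last = now
    return sum(watts) / len(watts)

if __name__ == "__main__":
    # Run your all-core load in another terminal, then:
    print(f"Average package power: {average_package_power():.1f} W")
```

Run an all-core load on whatever chip you like and compare the number you get against what's printed on the box.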
 
Soldato - Joined 26 Sep 2010 - Posts: 7,146 - Location: Stoke-on-Trent
Your post history is just vapid talk, with no self-generated data or depth of feedback in any thread. Hopefully it's clearer now.
Because my post history is predominantly in rumour and speculation threads. How exactly do you generate data or feedback for something that doesn't exist?

And you have the audacity to start accusing others of being toxic. Hilarious child, jog on...
 
Caporegime - Joined 17 Mar 2012 - Posts: 47,384 - Location: ARC-L1, Stanton System
Intel doesn't have any enforcement with the partners, which is the problem. It's not how the specs are defined; it's that they don't penalize board partners for doing whatever they want to look good in reviews. That's the issue Intel has and needs to correct. GN went through this in pretty good detail.

It's also a five-minute test you can do yourself on any platform.

This is exactly what Intel do and why, though to be fair AMD do exactly the same thing; they are as bad as each other.
None of this has anything to do with AMD putting 65 Watts on the box while actually using 85 Watts; that happens because AMD allows the CPU to boost past the 3.6 GHz the TDP is rated for, cooling permitting.
Intel do the same thing, but Steve Burke is trying to have his viewers believe Intel are behaving properly by explaining PL states. It's an attempt to bamboozle viewers with technical fluff as a way to explain away how it is AMD, and only AMD, who are wrong for exactly the same thing Intel do.
Clear and obvious bias.
 
Soldato - Joined 6 Aug 2009 - Posts: 7,070
I don't really see why there is any confusion regarding TDP. Just run each platform at its max and measure the power usage; forget all these massaged figures.

I like lower power use simply because it equals a lower room temperature. My overclocked 2500K and CrossFire 7970s almost gave me heat stroke!
 
Soldato - Joined 10 Oct 2012 - Posts: 4,415 - Location: Denmark
When you make two rant videos about AMD's higher-than-rated TDP power consumption, 120 Watts for a 105 Watt TDP or 85 Watts for a 65 Watt TDP, while not even mentioning Intel's 190 Watts for a 95 Watt TDP, and then try to say that is right because you don't understand PL states, it is ridiculous. Both AMD and Intel use exactly the same boost PL states; Intel just use a far more exaggerated boost difference.
That is a clear and obvious bias.

Would you be so kind as to link me to the video in question?
 
Associate - Joined 28 Sep 2018 - Posts: 2,242
Intel do the same thing, but Steve Burke is trying to have his viewers believe Intel are behaving properly by explaining PL states. It's an attempt to bamboozle viewers with technical fluff as a way to explain away how it is AMD, and only AMD, who are wrong for exactly the same thing Intel do.
Clear and obvious bias.

AMD's own equation for defining TDP has shifting variables in it, so unless you meet those exact specifications at any given time, your TDP is going to shift accordingly. Couple that with how Precision Boost works for their frequency, and it's a bit of a mess:

https://cdn.discordapp.com/attachments/657018064788783114/743475266596372630/unknown.png

https://cdn.discordapp.com/attachments/657018064788783114/743475453506879508/unknown.png

Ignore GN, Intel and whatever else for the moment. Just looking at the above methodology, you can't reasonably look at the guidance from AMD themselves and say "yeah, that's a good, repeatable way to broadcast TDP info."
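For anyone who doesn't want to open the screenshots, the formula in them boils down to something like the sketch below. The tCase, ambient and θca numbers are roughly the ones shown in GN's coverage of the AMD slide, so treat them as illustrative rather than official:

```python
# Rough sketch of the TDP definition in the screenshots above:
#   TDP (W) = (tCase_max - tAmbient) / HSF_theta_ca
# The values below are approximately the ones GN showed, for illustration only.

def amd_tdp(tcase_max_c, tambient_c, hsf_theta_ca):
    """TDP in watts from the case temp limit, assumed ambient, and cooler resistance (degC per W)."""
    return (tcase_max_c - tambient_c) / hsf_theta_ca

# Same temperatures, only the assumed cooler resistance changes:
print(round(amd_tdp(61.8, 42.0, 0.189)))  # ~105 W class part
print(round(amd_tdp(61.8, 42.0, 0.305)))  # ~65 W class part
```

Shift any one of those three inputs and the number on the box shifts with it, which is the point.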
 
Caporegime - Joined 17 Mar 2012 - Posts: 47,384 - Location: ARC-L1, Stanton System
AMD's own equation for defining TDP has shifting variables in it, so unless you meet those exact specifications at any given time, your TDP is going to shift accordingly. Couple that with how Precision Boost works for their frequency, and it's a bit of a mess:

https://cdn.discordapp.com/attachments/657018064788783114/743475266596372630/unknown.png

https://cdn.discordapp.com/attachments/657018064788783114/743475453506879508/unknown.png

Ignore GN, Intel and whatever else for the moment. Just looking at the above methodology, you can't reasonably look at the guidance from AMD themselves and say "yeah, that's a good, repeatable way to broadcast TDP info."
It's a standard way. Energy is converted into heat.
So Thermal Design Power relates to the amount of heat the cooler needs to be able to dissipate, 65 Watts or 95 Watts... The slight fib is in converting that directly into energy consumption, though they are comparable, at least close enough, and Intel use the same formula; it's an industry standard.
 
Soldato - Joined 31 Oct 2002 - Posts: 9,851
Personally I hate the way he tested that, as it's not true to what people would do. Who is going to be using a 2080 Ti with these CPUs at medium or high settings at 1080p? All that says to me is that at these settings Intel's new CPUs are better. Try 1440p with Ultra and let me see the graphs, as it's more true to what users will do with that kind of setup. It's pretty well known that Intel have a slight advantage in gaming when testing at lower settings and resolutions. What he tested was in no way an average gaming scenario for the majority with those setups.

2080 Ti performance is about to be obtainable with the 3070/3080 cards and their AMD equivalents, so frame time data from testing with a 2080 Ti is very important.
 
Associate - Joined 28 Sep 2018 - Posts: 2,242
It's a standard way. Energy is converted into heat.
So Thermal Design Power relates to the amount of heat the cooler needs to be able to dissipate, 65 Watts or 95 Watts... The slight fib is in converting that directly into energy consumption, though they are comparable, at least close enough, and Intel use the same formula; it's an industry standard.

For AMD, the HSF resistance, the Tcase temperature and the ambient temperature are all subject to change, and thus impact the overall draw of the chip in relation to what's advertised. If any of those variables changes, the TDP of the chip changes with it; the TDP of your AMD chip is determined by those factors.

On the Intel side (https://www.anandtech.com/show/13544/why-intel-processors-draw-more-power-than-expected-tdp-turbo), the measure is based on raw power draw, without those variables in play.

I fail to see how they're comparable.
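For contrast, the Intel scheme that the AnandTech article walks through works roughly like the sketch below: TDP is just PL1, the chip may draw up to PL2 while a moving average of package power over the tau window stays under PL1, and boards are free to set PL2 and tau however they like. The 95 W / 119 W / 8 s figures are the stock 9900K-style defaults as I recall them from that article, so double-check before quoting them:

```python
# Rough model of Intel's turbo power budget as described in the AnandTech
# article: power is capped at PL2 while an exponentially weighted moving
# average of package power stays below PL1 (= the TDP on the box).
# PL1/PL2/tau are illustrative defaults, not what boards actually ship with.

def simulate_turbo(demand_w, pl1=95.0, pl2=119.0, tau=8.0, dt=1.0):
    """Return the power the chip is allowed to draw each second for a given demand."""
    ewma = 0.0
    allowed = []
    alpha = dt / tau
    for want in demand_w:
        cap = pl2 if ewma < pl1 else pl1      # PL2 only while the average is under budget
        draw = min(want, cap)
        ewma += alpha * (draw - ewma)         # moving average over roughly tau seconds
        allowed.append(draw)
    return allowed

# 30 seconds of an all-core load that wants 140 W: a burst at PL2, then back to PL1.
print(simulate_turbo([140.0] * 30))
```

None of the thermal variables in the AMD method appear here at all; it's a pure electrical power budget, which is why I don't see the two as comparable.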
 
Soldato - Joined 8 Jun 2018 - Posts: 2,827
All that is wrong is the infographic put in during editing - he links to the correct one (https://imgur.com/a/ZGrGJBu) - you can't really go bleating about the whole video being botched on the basis of that.


(The only mistake is in the presentation of that section - you can view both the graphs in the video and the linked one in the correct context, and it doesn't change the story; it is just a mess-up in presentation.)
Oh, but I can, since this is your appeal-to-authority fallacy on the subject. And that's just one aspect.

I'm not saying you should watch his other videos in the context of the video I linked to - I said that in the context of your claims about the background products - not sure how you can't even get simple things like that right.
I can scrutinize the vlog for its poor delivery of the message you are trying to promote here, which causes the vlog to fall flat on its face.



There is one mistake in it, which doesn't invalidate the rest - and it is a simple misplacement of the correct graph.
Cognitive dissonance aside, it refutes your claim that he's setting AMD fanboys straight.


Not sure what you are even saying here - I posted something in passing, related to an earlier part of the thread, that somehow seems to have triggered you - if the hat doesn't fit, you don't need to get irritated by it.
You did it to prove a point. A point that fell on its own sword the moment it stumbled over its own presentation.


You are the one here that is trying to blow a simple presentation error into something bigger to try and discredit something you don't like...
Well, you don't have to take my word for it.

Steve is an arse. Not an idiot, an arse.
The "AMD are smoother" claim comes from the original Ryzen days, when AMD were up against Intel's ageing 4-core CPUs; the reason it came about is that many mainstream reviewers picked up on it and reported it 3 YEARS AGO.
Steve knows this. To pick it up now as if it's a fresh claim, when it's no longer relevant, is willful ignorance used to fabricate a "Steve Burke puts the Internet to rights" moment.
What it is is typical Steve Burke manufactured, look-at-me-the-genius narcissist snobbery.
That was worth the mobile touchscreen effort.

And to this end I would agree. :D

But I've seen these attacks on those who are either neutral or prefer AMD before, way back when there were rumours that Radeon was going to beat Nvidia. A lot of the responses I read today (strawmen, etc.) are the same cognitive dissonance I read back then.

If RDNA 2 is competitive then that would be a great boost to the PC market and should bring GPU prices back in line. But we will see ;)
 
Caporegime - Joined 17 Mar 2012 - Posts: 47,384 - Location: ARC-L1, Stanton System
For AMD, the HSF resistance, the Tcase temperature and the ambient temperature are all subject to change, and thus impact the overall draw of the chip in relation to what's advertised. If any of those variables changes, the TDP of the chip changes with it; the TDP of your AMD chip is determined by those factors.

On the Intel side (https://www.anandtech.com/show/13544/why-intel-processors-draw-more-power-than-expected-tdp-turbo), the measure is based on raw power draw, without those variables in play.

I fail to see how they're comparable.

I read the whole thing. It describes what TDP is, all of it standard TDP stuff, no different to how AMD or anyone else does it; it's an industry-standard formula.
Beyond that, and this is the purpose of the article, it explains why Intel CPUs use much more power than the TDP they put on the box, along with how different motherboard vendors use their own firmware to regulate boost behaviour, such as that acronym Asus use that I can't remember.
That's it....
 
Man of Honour - Joined 13 Oct 2006 - Posts: 90,824
Oh, but I can, since this is your appeal-to-authority fallacy on the subject. And that's just one aspect.


I can scrutinize the vlog for its poor delivery of the message you are trying to promote here, which causes the vlog to fall flat on its face.




Cognitive dissonance aside, it refutes your claim that he's setting AMD fanboys straight.



You did it to prove a point. A point that fell on its own sword the moment it stumbled over its own presentation.



Well, you don't have to take my word for it.



And to this end I would agree. :D

But I've seen these attacks on those who are either neutral or prefer AMD before, way back when there were rumours that Radeon was going to beat Nvidia. A lot of the responses I read today (strawmen, etc.) are the same cognitive dissonance I read back then.

If RDNA 2 is competitive then that would be a great boost to the PC market and should bring GPU prices back in line. But we will see ;)

LOL, if you are trying to prove Steve wrong in what he said, you aren't exactly going about it the right way.
 
Associate - Joined 28 Sep 2018 - Posts: 2,242
I read the whole thing. It describes what TDP is, all of it standard TDP stuff, no different to how AMD or anyone else does it; it's an industry-standard formula.
Beyond that, and this is the purpose of the article, it explains why Intel CPUs use much more power than the TDP they put on the box, along with how different motherboard vendors use their own firmware to regulate boost behaviour, such as that acronym Asus use that I can't remember.
That's it....

There is no industry-standard formula. They decide a number ahead of time and back into it. The AMD calculation literally has THREE floating variables that can change the TDP itself. You're also conflating what board manufacturers are doing with what is strictly a published spec from the chip vendor themselves.

Did you notice that the HSF resistance change is the only difference in how they got the 3600/3600X TDP values? That's not any kind of a standard. That's just coming up with a number ahead of time and adjusting the variables to back into it.

Here's a simple question: if you put the same heatsink on a 3600 and a 3600X, would they both consume the same power?
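To put numbers on the "back into it" point: using the formula from the screenshots earlier in the thread, the temperatures stay fixed, so the assumed cooler resistance is the only knob left, and you can pick whatever TDP you want on the box and solve backwards for θca. A rough sketch (same caveat as before, the temperature values are illustrative):

```python
# Rough sketch of "deciding the number ahead of time": hold tCase and
# tAmbient fixed and back-solve the HSF resistance that yields the TDP
# you want printed on the box. Temperatures are illustrative.

def theta_ca_for_tdp(target_tdp_w, tcase_max_c=61.8, tambient_c=42.0):
    """Cooler resistance (degC per W) implied by a chosen TDP figure."""
    return (tcase_max_c - tambient_c) / target_tdp_w

for tdp in (65, 95, 105):
    print(f"{tdp} W on the box -> assumed HSF theta_ca = {theta_ca_for_tdp(tdp):.3f} degC/W")
```

Same silicon-side temperatures, three different numbers on the box; the only thing that changed is the cooler they decided to assume.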
 
Soldato - Joined 8 Jun 2018 - Posts: 2,827
I remember the GPP debacle from about 2 years ago. Even though the media got involved and complaints were filed with the FTC/FCC, etc., there was a nagging question that was never answered: why did Nvidia do it? AMD wasn't a threat to them and there was no indication that the GPU market would change anytime soon. The attempt to make GPU branding all Nvidia was thought to be a whimsical attempt at just being evil.

But now I ponder the possibility that Nvidia was aware of RDNA's potential to disrupt the GPU/gaming (PC and console) market. Although it remains to be seen from AMD to date, it's very possible for AMD to market themselves as the de facto solution for gaming.

An "If you want to get the best experience in games, buy AMD. Developers prefer it, so should you!" marketing campaign alone could be very disruptive and shift more mindshare.
 