AMD Rant? Intel Fan? Ryzen Truth?

Associate
OP
Joined
20 May 2019
Posts
505
Location
London
Uhhh no, HUB Quality is the set of settings Hardware Unboxed use to find the best balance between graphical fidelity and FPS... Watch their video on The Division 2: you can drastically improve your FPS over the stock "Ultra" settings without gimping the look of the game by altering a few settings.

So HUB Quality is NOT the same as High / Ultra etc; it's their own set of settings, unique to each game, giving what they perceive as the best balance of fps and graphical fidelity.

Just compare picture 1 to picture 2, it's the same fps at 3200.
 
Soldato
Joined
19 Feb 2011
Posts
5,849
Just compare picture 1 to picture 2, it's the same fps at 3200.

That may be the case, but HUB Quality is their own settings; regardless of the fps being shown, it's the settings they use for the best balance of fps and graphics.

https://www.youtube.com/watch?v=3nN47ZqQ8DM&list=PL7m5C6_P_lnXQhO8YRLfVVMSGo0UwDIne&index=11&t=0s this is the video the HUB Quality FPS figures are from, which also likely means the testing was done ages ago and has probably never been redone with security fixes etc. applied...

So uhh yeah....
 
Associate
Joined
28 Jan 2003
Posts
2,379
Location
Bristol
Ninefivezero is clearly having a wind-up.

Ultra quality vs HUB quality in one shot

BF5 is DX11 vs DX12 in the memory section :D

There will be tweaking for Ryzen 3k too; it's only been out a day, give it a chance.
 
Associate
Joined
24 Jun 2019
Posts
130
Location
Aberdeen
I do not agree with what you are saying. I understand it, but I do not agree. Stock for stock is the right way to test, and after the bugs are worked out you can test OC vs OC. It would be like testing a modified BMW against a stock Merc equivalent and then saying "look, the BMW is faster!" Well, no kidding, it's modded.

You can have stock comparisons and then an OC one.
 
Caporegime
Joined
18 Mar 2008
Posts
32,747
What if people stopped using a ****** resolution to make a point? If all you own is a 1080p monitor and you spend $1000 on a GPU you're failing to use and a $500 CPU you've clearly overspent on, then that's just lolworthy.

God knows how much the RAM costs as well...

I can sort of see it for BF5 if you have a 240Hz monitor, but then you'd be testing absolute minimal competitive settings and not "ultra", so many farcical scenarios, it's hilarious.

Intel, Nvidia and their partners (Asus especially must ******* love this level of clutching), gotta justify spending $2000 to play at minimal settings to maybe get a very slight advantage and significant placebo.

Can't wait for Streaming to erase the low-mid end revenue stream as well, the next decade will be interesting.
 
Associate
OP
Joined
20 May 2019
Posts
505
Location
London
What if people stopped using a ****** resolution to make a point? If all you own is a 1080p monitor and you spend $1000 on a GPU you're failing to use and a $500 CPU you've clearly overspent on, then that's just lolworthy.

God knows how much the RAM costs as well...

I can sort of see it for BF5 if you have a 240Hz monitor, but then you'd be testing absolute minimal competitive settings and not "ultra", so many farcical scenarios, it's hilarious.

Intel, Nvidia and their partners (Asus especially must ******* love this level of clutching), gotta justify spending $2000 to play at minimal settings to maybe get a very slight advantage and significant placebo.

And you are this angry because it doesn't make sense with a 2080ti at 1440p/4K? I agree, it doesn't make sense. But what will happen when the 3080ti launches? There will come a time when Ryzens won't max out those cards. If there's a 20% gap at 1080p, there will be a gap at 1440p.

That's why I am holding off and not buying a GPU this generation; my 1070 is doing just fine, and I will buy a card that gets my CPU utilisation closer to 90% at 1440p. That might be the 3080ti, might not.
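
Roughly, here's the toy model I have in mind (a minimal sketch in Python; every FPS cap below is a number I've made up purely to illustrate the bottleneck point, not measured data):

# Toy bottleneck model: delivered FPS is capped by whichever of the CPU or GPU is slower.
# All figures are invented for illustration, not real benchmark results.
def delivered_fps(cpu_cap, gpu_cap):
    return min(cpu_cap, gpu_cap)

cpu_caps = {"CPU A": 200, "CPU B": 160}  # hypothetical CPU-bound frame rates, ~20% apart
gpu_caps = {
    "current flagship": {"1080p": 220, "1440p": 150},
    "faster next-gen":  {"1080p": 320, "1440p": 230},
}  # hypothetical GPU-bound frame rates per resolution

for gpu, per_res in gpu_caps.items():
    for res, gcap in per_res.items():
        results = ", ".join(f"{cpu} {delivered_fps(ccap, gcap)} fps" for cpu, ccap in cpu_caps.items())
        print(f"{gpu} @ {res}: {results}")

# With the current card both CPUs land on the same 150 fps at 1440p (GPU-bound),
# but with the faster hypothetical card the 1080p gap shows up at 1440p too.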
 
Soldato
Joined
9 Mar 2015
Posts
4,550
Location
Earth
Their method seems fine to me; here is what they mention on their site:

The 8th and 9th-gen Intel Core processors were benchmarked on the Gigabyte Z390 Aorus Ultra, using the same DDR4-3200 CL14 memory, but they were cooled using the Corsair Hydro H115i RGB Platinum 280mm liquid cooler. Do note the Intel CPUs are not TDP restricted as that’s not the out of the box experience, so we are showing the absolute best case scenario for out of the box performance. Finally, our graphics card of choice was the MSI Trio GeForce RTX 2080 Ti.

Next, Intel has a turbo table on their CPUs and they boost accordingly. It's hardly AMD's fault that their method gets closer to the limit out of the box than Intel's turbo table does. In terms of cache, not many people bother OCing that and it rarely changes the numbers; at least on my 8700k, overclocking the cache does nothing.
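
As a rough illustration of what a turbo table means in practice (a minimal sketch; the clocks below are approximate 9900K-style numbers from memory, not an official spec):

# Sketch of a per-core turbo table: the stock boost clock depends on how many cores are loaded.
# Values are approximate/illustrative, not an official Intel specification.
turbo_table_mhz = {1: 5000, 2: 5000, 3: 4800, 4: 4800, 5: 4700, 6: 4700, 7: 4700, 8: 4700}

def stock_boost(active_cores):
    # Clamp to the table range and look up the corresponding boost clock.
    return turbo_table_mhz[min(max(active_cores, 1), 8)]

print(stock_boost(2))  # lightly threaded load -> 5000 MHz
print(stock_boost(8))  # all-core load -> 4700 MHz

So a stock chip only holds its headline clock on light loads, which is why an all-core OC figure isn't directly comparable to the out-of-the-box behaviour.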

In regards to manually overclocking, that hardly makes things equal, especially at the 5.0 GHz you suggest. A famous site which sells chips binned by the clock speeds they hit and are stable at suggests only 35% of 9900Ks were fully stable at 5.0 GHz and 87% at 4.9 GHz in stress tests, which is a reasonable ask as the Ryzen chips are similarly stable in stress tests out of the box (I would hope). So then it's a matter of picking a suitable number to OC to that keeps everyone happy.
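
If you wanted a rule for picking that number, something like this is what I have in mind (the 5.0/4.9 GHz shares are the binning figures quoted above; the 80% cut-off is just my own assumption):

# Sketch: choose the highest all-core overclock that a large share of retail chips can hold,
# so an OC vs OC comparison doesn't rely on a golden sample.
# Stability shares per clock (GHz) are taken from the binning stats mentioned above.
stable_share = {5.0: 0.35, 4.9: 0.87}

def representative_oc(bins, required_share=0.80):
    # Keep only the clocks that enough chips can sustain, then take the highest of them.
    eligible = [clk for clk, share in bins.items() if share >= required_share]
    return max(eligible) if eligible else None

print(representative_oc(stable_share))  # -> 4.9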

Now don't mistake me, I agree there seems to be a little more overclocking headroom on the Intel side, but for the purpose of an apples-to-apples comparison, out of the box with an appropriate set of RAM seems the easiest way to compare these things.
 
Caporegime
Joined
18 Mar 2008
Posts
32,747
And you are this angry because it doesn't make sense with a 2080ti at 1440p/4K? I agree, it doesn't make sense. But what will happen when the 3080ti launches? There will come a time when Ryzens won't max out those cards. If there's a 20% gap at 1080p, there will be a gap at 1440p.

That's why I am holding off and not buying a GPU this generation; my 1070 is doing just fine, and I will buy a card that gets my CPU utilisation closer to 90% at 1440p. That might be the 3080ti, might not.

I'm not particularly angry, just bored of irrelevant benchmarks.

I know enthusiasts can fiddle themselves all day that they have a $4000 computer, but they're a tiny niche, and they're not playing at 1080p; they're at 4K like a sensible person with that budget would be, or on similarly demanding ultra-wide/multi-monitor setups.

It's not to say they shouldn't be tested at these resolutions, but it's highly disingenuous when it's used as a stick to beat others with, especially when the main driver of CPU/GPU sales is cost and sensible choices based on the resolution people are at (also cost).

A 2080ti being used at 1080p frankly looks more like a synthetic bench than a real one; bar the usual suspect, the real world matters to people.
 
Associate
OP
Joined
20 May 2019
Posts
505
Location
London
I'm not particularly angry, just bored of irrelevant benchmarks.

The main purpose of this thread was to show that the benchmarks were scaled to favour AMD, and that if this sort of thing were done by Intel, everybody would be up in arms.

I don't care what people buy or don't buy. They can grow funky moustaches, buy iMacs and pretend to be designers for all I care ;)
 
Associate
OP
Joined
20 May 2019
Posts
505
Location
London
only 35% of 9900Ks were fully stable at 5.0 GHz and 87% at 4.9 GHz in stress tests,

"top 89% of tested 9700Ks were able to hit 5.0GHz or greater"

I didn't know the 9900K did so poorly; I assumed that at least 80% of them would do 5.0 out of the box.

Testing at 4.9 then would be more realistic, I agree.
 
Soldato
Joined
13 Jun 2009
Posts
6,847
"top 89% of tested 9700Ks were able to hit 5.0GHz or greater"

I didn't know the 9900K did so poorly; I assumed that at least 80% of them would do 5.0 out of the box.

Testing at 4.9 then would be more realistic, I agree.
Later i9-9900K chips perform slightly worse on average because they are being binned for the i9-9900KS.
 
Associate
OP
Joined
20 May 2019
Posts
505
Location
London
Later i9-9900K chips perform slightly worse on average because they are being binned for the i9-9900KS.

That makes sense. I still secretly hope that they get scared of the 12/16-core chips and bring out a 10-core Comet Lake, and since no one would buy a new platform for it, they'd put it on Z390; that would be sweet :)
 
Soldato
Joined
28 May 2007
Posts
18,257
The problem is that testing a GPU to show CPU performance is a fundamentally flawed method. Added to that, injecting an artificial limit by testing at low resolutions isn't going to help. All this test would show is that CPU development is a long way ahead of GPUs in a gaming environment; it wouldn't be a good indication of future gaming performance. It's clear AMD have faster hardware than Intel.
 
Associate
Joined
28 Jan 2003
Posts
2,379
Location
Bristol
Perhaps he doesn't realise that Intel's officially supported stock memory speed is 2666, while AMD's is 3200. So with memory at 3200 for both, they are testing Intel overclocked vs AMD stock; that is being favourable to Intel, not the other way around :p
 