The 5700G (APU) gets higher or equal min. fps in all tested games vs Intel's DG1 graphics card

Soldato
Joined
30 Jun 2019
Posts
7,875
Averages of results at 1080p and 720p resolutions here:
https://cdn.mos.cms.futurecdn.net/7sUiYiG9TaqZFTU2VaLmXd-970-80.png
https://cdn.mos.cms.futurecdn.net/XhkxkKUU6LugCczeUi4qV4-970-80.png

Intel needs to pull their finger out!

From here:
https://www.tomshardware.com/uk/features/intel-xe-dg1-benchmarked

I'd expect the DG2 to max out at about 8-9 times as powerful as the DG1, given that the DG1 has a TDP of 30W.

If performance scales with power usage, 8 x 30 = 240W, plus some extra power consumption for a higher-end memory bus and up to 16GB of GDDR6 VRAM. So, that would be a TDP of about 270W for a desktop graphics card.

Power consumption will be the performance limit, as always with graphics cards. They might be able to improve on 8x the performance of the DG1 if they use a more power-efficient fabrication process than Intel's 10nm.
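The back-of-envelope estimate above can be sketched out in a few lines. Everything here follows the assumptions in the post (linear performance-per-watt scaling, a hypothetical 30W of overhead for the wider bus and VRAM); real GPUs scale sub-linearly near their power limit, so treat it as an upper bound.

```python
# Rough sketch of the TDP estimate above, assuming performance
# scales linearly with power draw (optimistic in practice).

DG1_TDP_W = 30        # DG1's stated TDP
PERF_MULTIPLIER = 8   # hoped-for DG2 speed-up over DG1
OVERHEAD_W = 30       # assumed extra draw: wider memory bus + 16GB GDDR6

# Core power needed to hit 8x DG1 performance under linear scaling
core_power_w = DG1_TDP_W * PERF_MULTIPLIER   # 240W

# Add the memory/bus overhead to get a whole-card TDP estimate
estimated_tdp_w = core_power_w + OVERHEAD_W  # 270W

print(f"Estimated DG2 desktop TDP: ~{estimated_tdp_w}W")
```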
 
Soldato
OP
Joined
30 Jun 2019
Posts
7,875
Still not great though: Intel's first graphics card (in a very long time) losing in games to an APU, which will apparently cost only ~£250.

Fair enough though, the DG1 gains some ground in other tasks.
 
Soldato
Joined
9 Aug 2013
Posts
2,701
Location
S. Wales
Not great, but it wasn't widely released IIRC; it went to system integrators instead. The DG2 is what folk are waiting for. That will tell us how good Intel will or won't be, I guess. Hopefully it won't be that much longer.
 
Associate
Joined
31 Dec 2010
Posts
2,459
Location
Sussex
DG1 is basically integrated graphics on a PCIe card, so this is integrated v integrated.
However, it does have a much higher power budget than the integrated equivalent. I think Intel call it Iris Xe Max, although Xe Max might be 96 EUs while DG1 was cut down to 80 EUs?
From earlier leaks, that didn't make much difference, with both of them about the same speed.
 
Soldato
Joined
15 Oct 2003
Posts
14,792
Location
Chengdu
Yeah, that's because Intel couldn't have made anything more powerful even if they'd wanted to; they're still in the learning phase really.

That just isn't true, and I see a bit of bias in your postings. If they really wanted to, they could have dumped more money into it and made this something else.
Or maybe you're right, and LPDDR4X and a horribly slow bus speed is absolute balls to the wall performance in Intel's mind... :D

I'm semi-disappointed in the 5000 series APUs. I really hoped AMD were going to push the limit and we'd see RDNA2 in there. Maybe there's little point while we're still on DDR4, but I'm waiting for a bigger jump.
 
Associate
Joined
31 Dec 2010
Posts
2,459
Location
Sussex
If they really wanted to, they could have dumped more money into it and made this something else.
I'm doubtful, because Intel have thrown money at problems before and it didn't work.
And we're not talking about pocket change:
  1. 5G modem effort. $billions, result: zilch. Remnants of the modem business sold to Apple for peanuts.
  2. Atom contra revenue. $billions, result: zilch. Millions of subsidised Atom tablets were made, most are probably landfill now.
  3. McAfee
  5. Larrabee
  5. Itanium
  6. Networking. Not a total failure as Intel NICs are well respected (their WiFi drivers however are junk IMO), but despite $billions the really lucrative market remains in Cisco's hands.
  7. Others I've forgotten?
For 1 & 2, see for instance:
https://www.extremetech.com/extreme/227720-how-intel-lost-10-billion-and-the-mobile-market
So, yes this time could be different and their streak of throwing money away might end, but I won't hold my breath.
 
Soldato
OP
Joined
30 Jun 2019
Posts
7,875
For a desktop/mobile APU, a Series S-like APU design should be the goal (at least for RDNA 2); that thing is a beast. As some have tried to tell me previously (alas, with great effort ;)), this would require HBM2 VRAM for a PC version, unless they can find a way to stack GDDR6 on top of the GPU, since PCB space is very limited. Although, that would make it more of a hybrid between a traditional GPU and an APU.
 