
News Intel to develop discrete GPUs

Page 31 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

mikk

Platinum Member
May 15, 2012
2,958
770
136
Just look at the driver if you don't believe me. I already linked data structure descriptions on Reddit. Download 100.8885, open iga64.dll, start at offset 0x22fab8 and subtract 0x180000c00 from every pointer. Shouldn't be that hard!
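The recipe above boils down to rebasing 64-bit pointers from the DLL's preferred image base into file offsets. A minimal Python sketch of that step (the file name, offset, and base are from the post; the function name and sample data are purely illustrative):

```python
import struct

IMAGE_BASE = 0x180000C00  # the base the post says to subtract from every pointer

def rebase_pointers(blob: bytes, count: int) -> list[int]:
    """Read `count` little-endian 64-bit pointers from `blob` and
    convert them from virtual addresses to file-relative offsets."""
    ptrs = struct.unpack_from(f"<{count}Q", blob, 0)
    return [p - IMAGE_BASE for p in ptrs]

# Synthetic example standing in for bytes read from iga64.dll at 0x22fab8:
blob = struct.pack("<2Q", IMAGE_BASE + 0x10, IMAGE_BASE + 0x240)
print(rebase_pointers(blob, 2))  # [16, 576]
```

In practice you would `open("iga64.dll", "rb")`, seek to 0x22fab8, and feed the bytes through the same subtraction.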

Do you think it makes sense that TGL-H Xe is 12_5-based while Alder Lake, which comes later than TGL-H, is 12_2-based?
 

stebler

Junior Member
Sep 10, 2020
17
58
51
Actually, Tiger Lake HP is probably a completely different product from Tiger Lake H. I updated the table to reflect that.
 

mikk

Platinum Member
May 15, 2012
2,958
770
136
Makes more sense. It's still odd that Meteor Lake is based on Gen12_7, the same as DG2 and Ponte Vecchio; it implies that it gets the same feature set. And then we might see an EU count upgrade to 128 EUs (one DG2 slice/tile).

There are some game benchmarks from Xe Max:


Bad performance; it loses to the Zenbook UX425E, which is a weak performer as we know. I wonder if there is GPU throttling involved; the Swift 3x only has one fan and two heat pipes shared between CPU and GPU. Otherwise the latency penalty must be big.
 
  • Like
Reactions: Tlh97 and Grazick

NTMBK

Diamond Member
Nov 14, 2011
8,904
1,977
136
I don't understand why Intel is trying so hard to torpedo their new graphics brand. They must have spent millions on all the rebranding to Xe, so why are they associating it with such terrible performance? Once again, they are teaching people that Intel sticker = bad graphics.
 
  • Like
Reactions: Tlh97 and Qwertilot

Shivansps

Diamond Member
Sep 11, 2013
3,110
786
136
I don't understand why Intel is trying so hard to torpedo their new graphics brand. They must have spent millions on all the rebranding to Xe, so why are they associating it with such terrible performance? Once again, they are teaching people that Intel sticker = bad graphics.
Because they rebranded and did it wrong. They launched just one GPU, and nothing makes it clear that it is a low-end GPU similar to a GT 1030/RX 550.
 

IntelUser2000

Elite Member
Oct 14, 2003
7,249
1,839
136
There's a Polish site that tested Iris Xe Max. It's 5% faster than the Xe iGPU in the Asus UX425E which, based on the Witcher test, uses 20 W less; normalized to Whr, the Xe Max cuts battery life by a significant amount. Meaning 5% faster than the Xe iGPU set at 17 W or less.

Like why?

Seems like an awful lot of hoops for nothing. They truly needed Xe to shine, because the gains weren't going to look special next to the competition (AMD/Nvidia/IMG all promised big gains).

Yet it disappoints. Tiger Lake Xe goes from "2x" to "up to 2x, but really 70%, and in optimal configurations". And with Xe Max, I hoped for performance akin to an optimal Tiger Lake configuration set at 25 W, but nope.

The LPDDR4X usage is a fail IMO. The low-load power will always be better on iGPUs. Perhaps Xe is still not power-efficient enough to pair with GDDR6. At least it would have been faster.

This does not bode well for their high end, where it won't compete with outdated competition.
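The efficiency complaint above can be put in rough numbers. A back-of-the-envelope sketch using only the figures from this post (the 17 W iGPU budget and a flat +20 W for the Xe Max machine are assumptions, not measurements):

```python
# Rough perf/W comparison from the numbers quoted above (assumptions, not data):
igpu_watts = 17.0                 # assumed Xe iGPU power limit in the UX425E
dgpu_watts = igpu_watts + 20.0    # Xe Max machine draws ~20 W more in the test
speedup = 1.05                    # ~5% faster in the Witcher test

# Ratio of (performance per watt) of the Xe Max setup vs. the plain iGPU:
perf_per_watt_ratio = speedup / (dgpu_watts / igpu_watts)
print(f"{perf_per_watt_ratio:.2f}")  # ~0.48, i.e. less than half the efficiency
```

Under those assumptions the dGPU configuration delivers under half the performance per watt, which is the "like why?" in a single number.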
 
Last edited:

guidryp

Senior member
Apr 3, 2006
590
465
136
I don't understand why Intel is trying so hard to torpedo their new graphics brand. They must have spent millions on all the rebranding to Xe, so why are they associating it with such terrible performance? Once again, they are teaching people that Intel sticker = bad graphics.

I don't think a laptop level part will hurt them when starting from where they are now.

Intel is starting from the bottom with GPU expectations; nothing they do really matters until (if/when) they deliver something compelling in terms of perf/$, and then they can start to build.

Even if Intel eventually does deliver something that has a perf/$ edge, they still have a long way to go to prove themselves on the driver front.

I know I definitely won't buy whatever they deliver until other people have played guinea pig for a year or so.
 

Tup3x

Senior member
Dec 31, 2016
446
316
106
DG1 is their test project. It allows maybe a bit more flexibility in a laptop, but essentially it's the same as the integrated version, just external. I guess its main purpose is just to be a testbed for drivers.
 
  • Like
Reactions: Tlh97 and guidryp

mikk

Platinum Member
May 15, 2012
2,958
770
136
From Sisoftware:

Thanks to the faster, dedicated LPDDR4X memory, DG1 has a 14% bandwidth gain over the integrated Xe. However, due to the PCIe 3.0 x4 connection, upload/download is slow; transfers thus need to be overlapped with compute to prevent bottlenecks. But it also supports PCIe 4.0 for the future.

Note*: PCIe3 x4 connection – not integrated. (DG1 supports PCIe4 but not used here)
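The overlap SiSoftware recommends is plain double-buffering: keep the next transfer in flight while computing on the previous one. A toy Python sketch of the pattern (a thread pool stands in for the DMA engine; all names here are illustrative, nothing is from Intel's stack):

```python
from concurrent.futures import ThreadPoolExecutor

def upload(chunk):
    """Stand-in for a PCIe host-to-device transfer."""
    return chunk

def compute(chunk):
    """Stand-in for the GPU kernel working on an uploaded chunk."""
    return sum(chunk)

def pipelined(chunks):
    """Overlap the upload of chunk i+1 with the compute on chunk i."""
    results = []
    with ThreadPoolExecutor(max_workers=2) as pool:
        pending = pool.submit(upload, chunks[0])
        for nxt in chunks[1:]:
            staged = pending.result()           # wait for the finished transfer
            pending = pool.submit(upload, nxt)  # next transfer goes in flight...
            results.append(compute(staged))     # ...while we compute on this one
        results.append(compute(pending.result()))
    return results

print(pipelined([[1, 2], [3, 4], [5]]))  # [3, 7, 5]
```

On a real narrow link the win comes from the transfer and the kernel running concurrently, so the slower of the two (rather than their sum) sets the throughput.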
 

IntelUser2000

Elite Member
Oct 14, 2003
7,249
1,839
136
I guess it's fine since they released another driver and fixed a few more bugs with crashing in games.

One can live with slow performance, but crashing, that's another story!

I wonder if they'll ever release a desktop version with GDDR6? Would it be any faster?
 

Leeea

Member
Apr 3, 2020
128
170
76
This feels like a stock market launch.

Nobody is intended to use this crap. Laptop discrete GPUs for compute are unheard of. Even for compute, Nvidia says hi with a whole lineup of better laptop-focused products. For mobile gaming this thing is a pathetic joke.

However, it makes great headlines to get the stock price up so the executives can try and cash out. That feels like the only purpose of this flaming waste of time.
 

beginner99

Diamond Member
Jun 2, 2009
4,667
1,078
136
However, it makes great headlines to get the stock price up so the executives can try and cash out. That feels like the only purpose of this flaming waste of time.
Maybe, or it's stupid OEMs. This would make sense in Comet Lake laptops, which are and will remain pretty common: better performance and media capabilities. Pairing it with the exact same iGPU as in Tiger Lake is obviously a complete waste. Agree with that.
 

jpiniero

Diamond Member
Oct 1, 2010
8,494
1,485
126

Intel announces a discrete Server GPU with 4 DG1's that have 8 GB each. Weird product but okay.
 

IntelUser2000

Elite Member
Oct 14, 2003
7,249
1,839
136
Intel announces a discrete Server GPU with 4 DG1's that have 8 GB each. Weird product but okay.
This is for media streaming servers. It had two predecessors, which used an entire Iris Pro Xeon, with the CPU cores and everything.

Apparently before Raja, these types would have been the only dGPUs for Xe. But for streaming, something like 4x 512 EUs is complete overkill.
 

mikk

Platinum Member
May 15, 2012
2,958
770
136
Acer Swift 3x: https://laptopmedia.com/review/acer-swift-3x-sf314-510g-review-discrete-graphics-from-intel-why-not/

The cooling setup is better than on the Swift 3/Swift 5 iGPU version.

Comparatively, the Acer Swift 3X is equipped with a bigger fan that provides roughly 60% airflow improvement. Multiple cooling modes and dual D6 copper heat pipes maximize the cooling efficiency of the Acer Swift 3X.
 
