News Intel GPUs - Intel launches A580


mikk

Diamond Member
May 15, 2012
4,112
2,108
136
Just look at the driver if you don't believe me. I already linked data structure descriptions on Reddit. Download 100.8885, open iga64.dll, start at offset 0x22fab8 and subtract 0x180000c00 from every pointer. Shouldn't be that hard!
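
For anyone following along, a minimal sketch of that rebasing step in Python. The offsets are the ones quoted above; the entry count and the 64-bit little-endian pointer layout are my assumptions, not something confirmed here:

Code:
import struct

IMAGE_BASE_ADJ = 0x180000C00  # VA-to-file-offset delta quoted in the post
TABLE_OFFSET = 0x22FAB8       # file offset where the table starts

def read_rebased_pointers(path, count):
    # Read `count` 64-bit little-endian pointers and rebase them from
    # virtual addresses to file offsets by subtracting the delta.
    with open(path, "rb") as f:
        f.seek(TABLE_OFFSET)
        raw = f.read(8 * count)
    ptrs = struct.unpack("<%dQ" % count, raw)
    return [p - IMAGE_BASE_ADJ for p in ptrs]

# e.g.: print([hex(p) for p in read_rebased_pointers("iga64.dll", 16)])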


Do you think it makes sense that TGL-H Xe is 12_5 based while Alder Lake, which comes later than TGL-H, is 12_2 based?
 

stebler

Member
Sep 10, 2020
25
75
61
Actually, Tiger Lake HP is probably a completely different product from Tiger Lake H. I updated the table to reflect that.
 

mikk

Diamond Member
May 15, 2012
4,112
2,108
136
Makes more sense. It's still odd that Meteor Lake is based on Gen12_7, the same as DG2 and Ponte Vecchio; it implies that it gets the same feature set. And then we might see an EU count upgrade to 128 EUs (one DG2 slice/tile).
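
Napkin math on what 128 EUs would mean, assuming the 8-wide FP32 EUs Xe uses today and a guessed clock:

Code:
# Xe EUs are 8 FP32 lanes wide; an FMA counts as 2 FLOPs per lane per clock.
eus, lanes, fma_flops = 128, 8, 2
clock_ghz = 1.35  # assumed clock, not a spec
print(eus * lanes * fma_flops * clock_ghz / 1000, "TFLOPS FP32")  # ~2.76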

There are some game benchmarks from Xe Max:


Bad performance; it loses against the Zenbook UX425E, which is a bad performer as we know. I wonder if GPU throttling is involved, since the Swift 3x has only one fan and two heat pipes shared between CPU and GPU. Otherwise the latency penalty must be big.
 

NTMBK

Lifer
Nov 14, 2011
10,208
4,940
136
I don't understand why Intel is trying so hard to torpedo their new graphics brand. They must have spent millions on all the rebranding to Xe, so why are they associating it with such terrible performance? Once again, they are teaching people that Intel sticker = bad graphics.
 

Shivansps

Diamond Member
Sep 11, 2013
3,835
1,514
136
I don't understand why Intel is trying so hard to torpedo their new graphics brand. They must have spent millions on all the rebranding to Xe, so why are they associating it with such terrible performance? Once again, they are teaching people that Intel sticker = bad graphics.

Because they rebranded and did it wrong. They launched just one GPU, and nothing makes it clear that it's a low-end GPU similar to a GT 1030/RX 550.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
There's a Polish site that tested the Iris Xe Max. It's 5% faster than the Xe iGPU in the Asus UX425E, which based on the Witcher test draws 20W less and does significantly better on battery life normalized per Wh. Meaning the Xe Max is 5% faster than an Xe iGPU set at 17W or less.
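
To make the normalization concrete (all numbers below are invented, purely to show the formula):

Code:
# Dividing runtime by battery capacity removes battery size from the comparison.
def min_per_wh(runtime_min, battery_wh):
    return runtime_min / battery_wh

print(min_per_wh(420, 58.75))  # hypothetical Xe Max machine
print(min_per_wh(450, 67.0))   # hypothetical iGPU machine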

Like why?

Seems like an awful lot of hoops for nothing. They truly needed Xe to shine, because the gains weren't going to look special next to the competition (AMD/Nvidia/IMG all promised big gains of their own).

Yet it disappoints. Tiger Lake Xe went from "2x" to "up to 2x, but really 70%, and only in optimal configurations". And with Xe Max I hoped for performance akin to an optimal Tiger Lake configuration set at 25W, but nope.

The LPDDR4X choice is a fail IMO. Low-load power will always be better on iGPUs. Perhaps Xe still isn't power-efficient enough to pair with GDDR6; at least that would have been faster.

This does not bode well for their high end, where they won't have the luxury of facing outdated competition.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,222
5,224
136
I don't understand why Intel is trying so hard to torpedo their new graphics brand. They must have spent millions on all the rebranding to Xe, so why are they associating it with such terrible performance? Once again, they are teaching people that Intel sticker = bad graphics.


I don't think a laptop-level part will hurt them, starting from where they are now.

Intel is starting from the bottom with GPU expectations; nothing they do really matters until/unless they deliver something compelling in terms of perf/$, and then they can start to build.

Even if Intel eventually does deliver something that has a perf/$ edge, they still have a long way to go to prove themselves on the driver front.

I know I definitely won't buy whatever they deliver until other people have played guinea pig for a year or so.
 

Tup3x

Senior member
Dec 31, 2016
944
925
136
DG1 is their test project. It allows maybe a bit more flexibility in a laptop but essentially it's the same as the integrated version, just external. I guess its main purpose is just to be a testbed for drivers.
 

mikk

Diamond Member
May 15, 2012
4,112
2,108
136
From SiSoftware:

Thanks to the faster, dedicated LP-DDR4X memory, DG1 has a 14% bandwidth gain over the integrated Xe. However, due to the PCIe3 x4 connection, upload/download is slow and thus needs to be overlapped with compute to prevent bottlenecks. But it also supports PCIe4 for the future.

Note: PCIe3 x4 connection – not integrated. (DG1 supports PCIe4, but it is not used here.)
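
The overlap SiSoftware mentions is classic double buffering: upload chunk N+1 while computing on chunk N. A toy sketch of the pattern, with plain Python threads standing in for the DMA engine and the GPU (`transfer` and `compute` are placeholders, not any real API):

Code:
import threading
from queue import Queue

def overlapped_pipeline(chunks, transfer, compute, depth=2):
    # Bounded queue = double (or triple) buffering: at most `depth`
    # transferred chunks wait for the "GPU" at any time.
    ready = Queue(maxsize=depth)

    def uploader():
        for c in chunks:
            ready.put(transfer(c))  # stands in for a PCIe DMA upload
        ready.put(None)             # sentinel: nothing left to send

    threading.Thread(target=uploader, daemon=True).start()
    results = []
    while (buf := ready.get()) is not None:
        results.append(compute(buf))  # stands in for the GPU kernel
    return results

# e.g.: overlapped_pipeline(range(8), transfer=lambda c: [c] * 4, compute=sum)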
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
I guess it's fine, since they released another driver and fixed a few more crashing bugs in games.

One can live with slow performance, but crashing, that's another story!

I wonder if they'll ever release a desktop version with GDDR6? Would it be any faster?
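
On paper the bandwidth math says yes (14 Gbps is an assumed GDDR6 speed grade, not anything announced):

Code:
# Bandwidth = bytes per transfer across the bus * transfers per second.
def bandwidth_gbs(bus_bits, gtps):
    return bus_bits / 8 * gtps

print(bandwidth_gbs(128, 4.266))  # DG1's LPDDR4X-4266: ~68.3 GB/s
print(bandwidth_gbs(128, 14.0))   # 14 Gbps GDDR6 on the same bus: 224 GB/s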
 

Leeea

Diamond Member
Apr 3, 2020
3,599
5,340
106
This feels like a stock market launch.

Nobody is intended to use this crap. Laptop discrete GPUs for compute are unheard of, and even for compute, Nvidia says hi with a whole lineup of better laptop-focused products. For mobile gaming this thing is a pathetic joke.

However, it makes great headlines to get the stock price up so the executives can try to cash out. That feels like the only purpose of this flaming waste of time.
 

beginner99

Diamond Member
Jun 2, 2009
5,208
1,580
136
However, it makes great headlines to get the stock price up so the executives can try to cash out. That feels like the only purpose of this flaming waste of time.

Maybe, or it's stupid OEMs. This would make sense in Comet Lake laptops, which are and will remain pretty common: better performance and media capabilities. Pairing it with the exact same iGPU as in Tiger Lake is obviously a complete waste. Agree with that.
 

jpiniero

Lifer
Oct 1, 2010
14,510
5,159
136

Intel announces a discrete Server GPU with four DG1s at 8 GB each. Weird product, but okay.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Intel announces a discrete Server GPU with four DG1s at 8 GB each. Weird product, but okay.

This is for media streaming servers. It had two predecessors, which used an entire Iris Pro Xeon, CPU cores and all.

Apparently, before Raja, these types would have been the only Xe dGPUs. But for streaming, something like 4x 512 EUs is complete overkill.
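
Presumably the host just spreads transcode sessions across the four dies; something like this toy round-robin sketch (device paths and stream count are made up for illustration):

Code:
from itertools import cycle

# Round-robin sessions across the card's four DG1 dies, shown here as
# hypothetical Linux DRM render nodes. Density is the selling point.
devices = ["/dev/dri/renderD%d" % (128 + i) for i in range(4)]
assign = cycle(devices)

for n in range(12):  # 12 hypothetical streams
    print("stream%02d -> %s" % (n, next(assign)))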
 

mikk

Diamond Member
May 15, 2012
4,112
2,108
136
Acer Swift 3x: https://laptopmedia.com/review/acer-swift-3x-sf314-510g-review-discrete-graphics-from-intel-why-not/

The cooling setup is better than on the Swift 3/Swift 5 iGPU version.

Comparatively, the Acer Swift 3X is equipped with a bigger fan that provides roughly 60% better airflow. Multiple cooling modes and dual D6 copper heat pipes maximize the cooling efficiency of the Acer Swift 3X.
 

mikk

Diamond Member
May 15, 2012
4,112
2,108
136
Hot Chips Xe GPU Architecture presentation:





Some functions are optional depending on the version, like ray tracing, which we should see in Xe HPG.

 

Mopetar

Diamond Member
Jan 31, 2011
7,797
5,899
136
Just WOW.

I'm really curious how capable this thing is.

If nothing else, it should probably put out enough heat that you could pan-fry an egg on it. Hopefully Ryan can get situated and caught up with everything else by the time this drops, because I really want to see a deep dive.
 

Saylick

Diamond Member
Sep 10, 2012
3,084
6,184
136
Wow, judging from the relative size of the HBM chips, those 16 compute chiplets must be pretty small, no?
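
Rough numbers, if you trust the photo: an HBM2e stack's footprint is about 7.75 x 11.87 mm, and the area ratio below is just eyeballed from the shot, not measured:

Code:
# HBM2e stack footprint is roughly 7.75 mm x 11.87 mm (~92 mm^2).
hbm_mm2 = 7.75 * 11.87
eyeballed_ratio = 0.5  # pure guess from the photo, not a measurement
print("~%.0f mm^2 per compute tile" % (hbm_mm2 * eyeballed_ratio))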