News: Intel GPUs - Intel launches A580


DrMrLordX

Lifer
Apr 27, 2000
Intel iGPUs are notorious for being bad at mining. The OpenCL driver stack isn't up to snuff. Unless Intel corrects that problem, DG2 will probably fail as a compute card, mining or otherwise.
 

mikk

Diamond Member
May 15, 2012
Intel iGPUs are notorious for being bad at mining. The OpenCL driver stack isn't up to snuff. Unless Intel corrects that problem, DG2 will probably fail as a compute card, mining or otherwise.


Intel has had an active OpenCL driver stack for years, unlike Nvidia, which was stuck on OpenCL 1.2 for ages, and Intel iGPUs have traditionally been quite good in OpenCL benchmarks compared to their AMD iGPU counterparts, often better than in real-world gaming. LuxMark or Geekbench Compute looks fine when I compare Iris Xe with the 4800U or 5700U, for example. Intel was first to OpenCL 3.0.
 

DrMrLordX

Lifer
Apr 27, 2000
Intel has had an active OpenCL driver stack for years, unlike Nvidia, which was stuck on OpenCL 1.2 for ages, and Intel iGPUs have traditionally been quite good in OpenCL benchmarks compared to their AMD iGPU counterparts, often better than in real-world gaming. LuxMark or Geekbench Compute looks fine when I compare Iris Xe with the 4800U or 5700U, for example. Intel was first to OpenCL 3.0.

NV doesn't care about OpenCL because CUDA.

That aside, have you tried doing any serious work with OpenCL on an Intel iGPU outside of a few canned benchmarks? Everyone I've ever heard from who has tried it has had nothing good to say about Intel's OpenCL driver stack.

edit: in case you think I'm BSing you:

Mining on Intel iGPUs is no simple task.
 

Mopetar

Diamond Member
Jan 31, 2011
Rats... Intel missed their opportunity to release their product during the "great" 2021 card shortage. Even if it was a mediocre performer, they still would have sold out of stock at or above MSRP.

Let's just hope it's the great GPU shortage of '21 instead of the great GPU shortage of the early 20's.

Even if mining tapers off a lot, Intel could still supply that market segment, to great profit for themselves and to the wholehearted cheers of the gaming community.

Let them target the data center and mining markets while working the kinks out and refining their design. Even if they aren't selling a lot of discrete cards to gamers all of their CPUs are still going into laptops or other machines used for light gaming.
 

biostud

Lifer
Feb 27, 2003
Hopefully their cards will be far superior at mining, so the gamers can buy Nvidia and AMD :p
 

Shivansps

Diamond Member
Sep 11, 2013
Mining on Intel iGPUs is no simple task.

To be fair, mining on any iGPU is just terrible, because it is not supported by some miners and not optimised in the ones that do work.
In fact, NBMiner and PhoenixMiner are the only two that I know of that work with Vega iGPUs.
 

DrMrLordX

Lifer
Apr 27, 2000
To be fair, mining on any iGPU is just terrible, because it is not supported by some miners and not optimised in the ones that do work.
In fact, NBMiner and PhoenixMiner are the only two that I know of that work with Vega iGPUs.

The Claymore Ethereum miner worked on my A10-7870K and 7700K out of the box. The point of what I pasted is that, on the current drivers, it's extremely difficult to get something as elementally simple and ubiquitous as an OpenCL-based mining program to run on most (if not all?) Intel iGPUs. If Intel brings the same driver stack to DG2/Arc, then it's reasonable to expect that, initially, DG2 will bomb out with standard OpenCL mining software.
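As a minimal sketch of the first thing any OpenCL miner does (assuming the pyopencl package; names and numbers vary by machine), enumerating platforms and devices shows whether the driver stack even exposes the iGPU for compute:

Code:
# Minimal OpenCL device enumeration, the same discovery step a miner
# performs before building kernels. Requires the pyopencl package.
import pyopencl as cl

for platform in cl.get_platforms():
    print(f"Platform: {platform.name} ({platform.version})")
    try:
        gpus = platform.get_devices(cl.device_type.GPU)
    except cl.Error:
        gpus = []  # this platform exposes no GPU devices
    for dev in gpus:
        print(f"  Device: {dev.name}")
        print(f"    OpenCL:        {dev.version}")
        print(f"    Compute units: {dev.max_compute_units}")
        print(f"    Global memory: {dev.global_mem_size / 2**30:.1f} GiB")

If the iGPU is missing from that list, or reports an ancient OpenCL version, no amount of miner-side tuning will help.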
 

VirtualLarry

No Lifer
Aug 25, 2001
I had no issue running BOINC/Einstein on an Intel iGPU, 7th-gen and up. Those were the only ones that were certified for compute; Skylake wasn't. Granted, I believe the app had to be specifically written to detect and utilize the Intel iGPUs, but they were included.
 

DrMrLordX

Lifer
Apr 27, 2000
I had no issue running BOINC/Einstein on an Intel iGPU, 7th-gen and up. Those were the only ones that were certified for compute; Skylake wasn't. Granted, I believe the app had to be specifically written to detect and utilize the Intel iGPUs, but they were included.

Apparently OpenCL support got . . . I don't want to say worse, but different with some of the 2021 drivers? At least that's what was indicated in the mining thread I linked above. Also, as you say, the software had to be specifically coded for Intel devices, which is odd. Anything that is OpenCL 1.2 compliant just runs on AMD and NV hardware, no problem.
 

mikk

Diamond Member
May 15, 2012
Sometimes it requires support from the dev side; the mining effort from them is probably low because iGPUs aren't their target.
 

Gideon

Golden Member
Nov 27, 2007
Some very juicy info:


Looks like it has a lot of matrix cores (tensor core analogs) and an RT unit in every core, plus a large L2 cache. Intel has also implemented temporal AI-based upscaling (a true DLSS analog, unlike FSR) which utilizes said tensor cores.

Intel DG2 GPUs will be built using Xe-Cores, which will be grouped into Render Slices. Each Xe-Core will be built from 16 Vector Engines (256-bit) and 16 Matrix Engines (1024-bit per engine). Under each Xe-Core there is a single Ray Tracing Unit (without a fancy name like AMD's or NVIDIA's). Thus, DG2 GPUs will have an equal number of Xe-Cores and Ray Tracing Units. The DG2 GPUs feature a Memory Fabric, which is a large L2 cache.
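As a back-of-the-envelope check, that layout implies roughly 16 FP32 TFLOPs for the rumored top part. This is a sketch only: the 32 Xe-Core count and the 2.0 GHz clock are assumptions from the rumor mill, not confirmed specs; the per-core figures are from the slide.

Code:
# Rough FP32 throughput implied by the leaked Xe-Core layout.
XE_CORES       = 32          # assumed top DG2 configuration (rumored)
VECTOR_ENGINES = 16          # per Xe-Core, per the slide
LANES_PER_VE   = 256 // 32   # 256-bit engine / 32-bit floats = 8 FP32 lanes
OPS_PER_LANE   = 2           # fused multiply-add = 2 FLOPs per cycle
CLOCK_HZ       = 2.0e9       # assumed boost clock

tflops = XE_CORES * VECTOR_ENGINES * LANES_PER_VE * OPS_PER_LANE * CLOCK_HZ / 1e12
print(f"{tflops:.1f} FP32 TFLOPs")  # -> 16.4 at 2.0 GHz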
 
Feb 4, 2009
This seems kind of nifty.


Admittedly I don't see much of a difference on some of these samples, but certainly not on all of them.
 

blckgrffn

Diamond Member
May 1, 2003
www.teamjuchems.com
Nice. 16 TFLOPs at max config puts it around AMD 6800 or Nvidia 3070 performance levels in that singular metric. Clearly there are a lot of other variables, but it's something.

Also, full N6, which I guess we already knew, but it's interesting to see in an Intel slide deck.

Jan 12, 2021

My guess is 18-20 TFLOPs; Iris Xe max is 1.65 GHz, and 1.5x that would give something between 2.4 GHz and 2.5 GHz.
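Running that guess backwards (using the same assumed 4096-lane top configuration as the sketch above, which is not confirmed) gives the clock each TFLOPs target would require:

Code:
# Clock needed for a given FP32 TFLOPs target on 4096 lanes (assumed:
# 32 Xe-Cores x 16 vector engines x 8 lanes, 2 FLOPs/lane/cycle).
FP32_LANES = 4096
for target in (16, 18, 20):
    clock_ghz = target * 1e12 / (FP32_LANES * 2) / 1e9
    print(f"{target} TFLOPs -> {clock_ghz:.2f} GHz")
# 16 -> 1.95 GHz, 18 -> 2.20 GHz, 20 -> 2.44 GHz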
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
This seems kind of nifty.


Admittedly I don't see much of a difference on some of these samples, but certainly not on all of them.
If you look closely you can see:
- The Intel one has less detail (e.g. the very small writing on the pipe).
- The Intel one is over-sharpening (e.g. all the larger text has gone bold).
- There is still some shimmer around the pipework edges, so that's not all gone.
- Some things seem to come and go; e.g. behind the pipes on the wall there's a diagonal bit with a kind of cheese-grater surface that seems to go in and out of focus.

However, for a V1 it looks pretty nifty.
 
Feb 4, 2009
If you look closely you can see:
- The Intel one has less detail (e.g. the very small writing on the pipe).
- The Intel one is over-sharpening (e.g. all the larger text has gone bold).
- There is still some shimmer around the pipework edges, so that's not all gone.
- Some things seem to come and go; e.g. behind the pipes on the wall there's a diagonal bit with a kind of cheese-grater surface that seems to go in and out of focus.

However, for a V1 it looks pretty nifty.

Good summary. The pipes were tough to tell apart other than by the text on the 1080p image. I felt the Intel image looked a little off but couldn't figure out what it was.
Overall I agree it looks great, and honestly, while playing a game or watching a video you would never perceive those differences.
 

eek2121

Platinum Member
Aug 2, 2005
2,904
3,903
136
Rats... Intel missed their opportunity to release their product during the "great" 2021 card shortage. Even if it was a mediocre performer, they still would have sold out of stock at or above MSRP.
Let's just hope it's the great GPU shortage of '21 instead of the great GPU shortage of the early 20's.

Even if mining tapers off a lot, Intel could still supply that market segment and great profit to themselves and the wholehearted cheers of the gaming community.

Let them target the data center and mining markets while working the kinks out and refining their design. Even if they aren't selling a lot of discrete cards to gamers all of their CPUs are still going into laptops or other machines used for light gaming.

The GPU shortage is expected to persist well into next year. Unfortunately, as Intel is using TSMC, don’t expect retail availability to improve much.
 

blckgrffn

Diamond Member
May 1, 2003
www.teamjuchems.com
My guess is 18-20 TFLOPs; Iris Xe max is 1.65 GHz, and 1.5x that would give something between 2.4 GHz and 2.5 GHz.

I was just quoting Ryan and his impression, in the article you linked, that clock speeds are likely to come in at ~2 GHz.

I mean, if only MHz scaled linearly with power efficiency, that would be awesome. Lower clock speeds are a safe assumption until proven otherwise, IMO :)
 

Tup3x

Senior member
Dec 31, 2016
I'm interested to see XeSS in action. On paper it is as close to DLSS 2.0 as it can get. It would be great if they can also copy NVIDIA's driver features (v-sync settings that work great with adaptive-sync displays, NULL, forcing anisotropic filtering*).

*That actually works. AMD only supports DX9 games (and possibly older), but as far as I know even that has been broken on RDNA2 cards.
If you look closely you can see:
- The Intel one has less detail (e.g. the very small writing on the pipe).
- The Intel one is over-sharpening (e.g. all the larger text has gone bold).
- There is still some shimmer around the pipework edges, so that's not all gone.
- Some things seem to come and go; e.g. behind the pipes on the wall there's a diagonal bit with a kind of cheese-grater surface that seems to go in and out of focus.

However, for a V1 it looks pretty nifty.
You should ignore the magnified sample-footage parts. If they had used a pixel resize instead of what looks like a bicubic scaler it would have been more useful, but since they didn't... The 1:1 stuff (4K video downsampled to my 1440p screen) does look really nice, I must say.
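For anyone making their own comparisons, here is a minimal Pillow sketch of the difference (the filename and crop coordinates are hypothetical). A nearest-neighbour "pixel resize" magnification leaves the pixels under inspection untouched, while a bicubic zoom smooths them and can hide or mimic upscaler artifacts:

Code:
# Magnify a crop of a screenshot two ways for A/B inspection.
# Requires the Pillow package; the filename and crop box are made up.
from PIL import Image

crop = Image.open("xess_frame.png").crop((400, 300, 528, 428))  # 128x128 crop

crop.resize((512, 512), Image.NEAREST).save("zoom_nearest.png")  # pixels intact
crop.resize((512, 512), Image.BICUBIC).save("zoom_bicubic.png")  # smoothed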
 

blckgrffn

Diamond Member
May 1, 2003
9,110
3,028
136
www.teamjuchems.com
Well, given that they are seemingly spending more silicon on matrix math (tensor-core-type workloads) than even Ampere's SMs, they had better be bringing some magic to use it.

Otherwise, if they just get smashed on raster rendering and *need* magic upscaling to save their 1440p/4K performance, it will be disappointing.

Clearly this is a price/availability/capability cocktail situation, so I am pleased they keep walking this to the finish line. It will be interesting at the very least.
 

LightningZ71

Golden Member
Mar 10, 2017
I find it interesting that their improved drivers for Xe-HP are indicated as bringing improvements to the Xe-LP products, and that the XeSS tech can use the DP4a units on Xe-LP to upscale successfully as well. I hope someone takes the time to see whether the 80/96 EU parts in Tiger Lake benefit from these improvements enough to put some distance between them and the Vega 8 in Cezanne. It'll also be interesting to see how Cezanne and Tiger Lake handle 900p -> 1080p upscaling at higher detail settings.
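For reference, DP4a is just a four-lane int8 dot product accumulated into an int32, one instruction on supporting hardware. A toy sketch of the primitive (this illustrates the instruction itself, not Intel's actual XeSS kernels):

Code:
# dp4a(a, b, acc): acc + dot(a, b) over four int8 lanes, into int32.
import numpy as np

def dp4a(a4, b4, acc):
    return np.int32(acc) + np.int32(np.dot(a4.astype(np.int32),
                                           b4.astype(np.int32)))

a = np.array([1, -2, 3, 4], dtype=np.int8)
b = np.array([5, 6, -7, 8], dtype=np.int8)
print(dp4a(a, b, 100))  # 100 + (5 - 12 - 21 + 32) = 104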
 

moinmoin

Diamond Member
Jun 1, 2017
Does nobody want to start a dedicated thread for XeSS (excess?)? That's a nice surprise, turning the upscale post-processing battle into a three-way one. I hope it runs reasonably well on other GPUs too. It's also great that it will eventually be open-sourced.