News Intel GPUs - Intel launches A580


IntelUser2000

Elite Member
Oct 14, 2003
Isn't that only because they were outcompeted?
If they'd had a better product than Nvidia and ATI, wouldn't they have stayed in business?

I got interested in their history and started looking into it. Yes, the product is part of the reason, but at the same time they bought a distributor and started manufacturing their own cards. So while the acquisition cost them massive amounts of money, it also alienated the partners making their cards. And I don't think the acquisition cost them only in terms of money. It probably changed the dynamic of the company and made it more difficult to keep the roadmap going.

It's the perfect time to launch a new video card. Even if it has some flaws, you can undercut the competition on price and still make money.

I agree with this too, but they need to focus on being a distributor of chips, like they are doing now. Even their "Intel laptop" is never sold under their own brand, since they allow interested parties to rebrand it as their own. Only the NUCs are theirs.

Actually, I know Intel is really good at that. If you have the necessary resources, manpower, and skill to set up a company building products around their chips, they will help you a ton: documents, reference PCBs, technical support, etc. The mini PC makers like GPD, how do you think they got started in the first place?

The product is only half of the story. The rest is human relations.
 

Frenetic Pony

Senior member
May 1, 2012
Yes that was a response to @Frenetic Pony



Yes, I believe it's 4 slices based on this

gen12_9 / xe2_hpg
- ELG_x1_2x4
- ELG_x2_5x4
- ELG_x3_7x4
- ELG_x4_10x4

With x4 meaning 4 slices. It jibes with the rumor mill as well. It's not a monolithic die; it's tiles from now on.
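A quick sketch of how those config names might decode. The only reading from the thread is that "x4" means 4 slices; the assumptions that the trailing `<A>x<B>` term is a grid whose product gives the Xe core count, and that each Xe core carries 16 EUs (as on Xe-HPG), are mine, and `parse_elg` is a hypothetical helper, not anything confirmed by Intel:

```python
# Hypothetical decoder for the leaked "ELG_x<slices>_<A>x<B>" config names.
# Assumption: A x B = total Xe cores, and 16 EUs per Xe core as on Xe-HPG.
import re

def parse_elg(name: str) -> dict:
    m = re.match(r"ELG_x(\d+)_(\d+)x(\d+)", name)
    if not m:
        raise ValueError(f"unrecognized config name: {name}")
    slices, a, b = map(int, m.groups())
    xe_cores = a * b            # assumed: grid product = Xe core count
    eus = xe_cores * 16         # assumed: 16 EUs per Xe core
    return {"slices": slices, "xe_cores": xe_cores, "eus": eus}

for cfg in ["ELG_x1_2x4", "ELG_x2_5x4", "ELG_x3_7x4", "ELG_x4_10x4"]:
    print(cfg, parse_elg(cfg))
```

Under those assumptions the top config works out to 640 EUs and the x2 config to 320 EUs, which happens to line up with the EU counts discussed later in the thread.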

Eh, there's no way they're making that many chips that big on TSMC 3nm until, what, 2024 at least? It's been delayed into next year at the earliest, and Apple will have bought 90% or more of the wafers, so what supply would they make them on? Sure, Intel "bought supply", but we all know no one outbids Apple.

Thus my guess being their own process. They've got to use it for something; otherwise why have it at all, rather than just shutting down every fab until it's overhauled to be good enough? Doubling performance (or, to be clear, merely doing better, as that's at least plausible) puts them right alongside where AMD and Nvidia are just a few months later, and on a process that's probably better for profit margins. If they're making anything on a new TSMC node next year, it's probably HPC stuff on N4X; you don't need mass-market cost controls there if you're charging ten thousand dollars plus per chip (well, per package these days).
 

biostud

Lifer
Feb 27, 2003
I got interested in their history and started looking into it. Yes, the product is part of the reason, but at the same time they bought a distributor and started manufacturing their own cards. So while the acquisition cost them massive amounts of money, it also alienated the partners making their cards. And I don't think the acquisition cost them only in terms of money. It probably changed the dynamic of the company and made it more difficult to keep the roadmap going.



I agree with this too, but they need to focus on being a distributor of chips, like they are doing now. Even their "Intel laptop" is never sold under their own brand, since they allow interested parties to rebrand it as their own. Only the NUCs are theirs.

Actually, I know Intel is really good at that. If you have the necessary resources, manpower, and skill to set up a company building products around their chips, they will help you a ton: documents, reference PCBs, technical support, etc. The mini PC makers like GPD, how do you think they got started in the first place?

The product is only half of the story. The rest is human relations.

With the margins potential partners can earn on video cards at the moment, it shouldn't be that hard to find someone willing to make your products, especially not when you are Intel.
 

IntelUser2000

Elite Member
Oct 14, 2003
Eh, there's no way they're making that many chips that big on TSMC 3nm until what, 2024 at least?

At 3nm, 640 EUs won't be that big. It'll probably be about 250mm2 each, considering that 512 EUs on 6nm is 400mm2, and I am thinking there will be feature enhancements on top of the move to 640 EUs, so it could end up being under 250mm2.
 

Glo.

Diamond Member
Apr 25, 2015
At 3nm, 640 EUs won't be that big. It'll probably be about 250mm2 each, considering that 512 EUs on 6nm is 400mm2, and I am thinking there will be feature enhancements on top of the move to 640 EUs, so it could end up being under 250mm2.
According to AdoredTV's sources, a 320 EU tile would be about 80 mm2.

So a 640 EU tile on the same node would be under 200mm2, more in the range of 160-175 mm2.

Remember, TSMC's 3 nm process is two full node shrinks below 6 nm. There's still the 5/4 nm process in between, and then comes 3 nm.
 

IntelUser2000

Elite Member
Oct 14, 2003
According to AdoredTV's sources, a 320 EU tile would be about 80 mm2.

So a 640 EU tile on the same node would be under 200mm2, more in the range of 160-175 mm2.

Remember, TSMC's 3 nm process is two full node shrinks below 6 nm. There's still the 5/4 nm process in between, and then comes 3 nm.

See, the iGPUs are different. 96 Tiger Lake EUs on Intel 7 take up slightly under 45mm2. By that logic 512 EUs should be 250-300mm2 at most, right? Maybe even smaller, if TSMC 6nm is slightly denser. But it ends up being 400mm2.

That's because I/O scales poorly, and a high-end dGPU requires a lot more of it. I got the 200-250mm2 numbers purely by scaling down from the 400mm2 on TSMC 6nm.
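The back-of-envelope argument above can be written down as a tiny model that splits a die into a logic portion, which shrinks with the node, and an I/O portion, which stays roughly fixed. The 400mm2 base figure and the 512-to-640 EU ratio come from the thread; the 30% I/O share and the 1.6x logic-density gain for 6nm-to-3nm are illustrative assumptions of mine, which is why estimates in the thread vary so widely:

```python
# Back-of-envelope die-area model: logic scales with node density,
# I/O area is treated as fixed across the shrink.

def scaled_area(base_area_mm2: float, io_fraction: float,
                logic_shrink: float, eu_ratio: float = 1.0) -> float:
    """Estimate a new die's area from a known one.

    base_area_mm2: known die area on the old node
    io_fraction:   fraction of that area that is I/O (assumed not to shrink)
    logic_shrink:  density gain of the new node (1.6 = 1.6x denser)
    eu_ratio:      new EU count / old EU count (scales the logic portion)
    """
    logic = base_area_mm2 * (1 - io_fraction)
    io = base_area_mm2 * io_fraction
    return logic * eu_ratio / logic_shrink + io

# 512 EU die on TSMC 6nm ~= 400 mm^2; estimate a 640 EU die on 3nm,
# assuming ~30% of the die is I/O and ~1.6x logic density.
est = scaled_area(400, io_fraction=0.30, logic_shrink=1.6, eu_ratio=640 / 512)
print(f"estimated area: {est:.0f} mm^2")
```

The fixed I/O term is the whole point of the argument: it is why a dGPU cannot be extrapolated from iGPU density, and why the area does not fall as fast as the node shrink alone would suggest.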
 

IntelUser2000

Elite Member
Oct 14, 2003
Vega 8 vs Xe comparison.

Look at the CPU clock speed, power, and GPU clock speed results. AMD's GPU clocks are very stable, and its CPU clocks are very low, generally under 2GHz.

Intel's CPUs often go over 3GHz, and some reach 4GHz! Comparing the power meter, you can see that where the Intel setup uses more power is generally where the CPU clock frequency is high.

Also, where it performs worse is where the iGPU is clocked low, in the 1-1.1GHz range rather than the 1.3GHz it's supposed to run at. On AMD, by contrast, the GPU clocks are very stable.

So this is the traditional Intel iGPU issue: the CPU is prioritized over the iGPU, so it steals the available power and the iGPU doesn't perform as well in games. Also, Zen mobile has a very efficient CPU architecture; running Tiger Lake at 4GHz certainly doesn't help.
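The power-stealing behavior described above can be illustrated with a toy shared-TDP model. All the wattages, the 1.3GHz base clock, and the linear clock-vs-power relation are hypothetical simplifications of mine, not measured figures:

```python
# Toy model of a shared package power budget: the CPU is served first,
# and whatever it takes comes out of the iGPU's share, dropping its clock.

def split_budget(tdp_w: float, cpu_demand_w: float,
                 gpu_base_w: float = 15.0,
                 gpu_base_clock: float = 1.30):
    """Return (cpu_w, gpu_w, gpu_clock_ghz) under a shared TDP.

    The GPU clock is assumed to scale linearly with its power share,
    capped at its base clock -- a deliberate simplification.
    """
    cpu_w = min(cpu_demand_w, tdp_w)          # CPU boosts first
    gpu_w = max(tdp_w - cpu_w, 0.0)           # GPU gets the leftovers
    gpu_clock = gpu_base_clock * min(gpu_w / gpu_base_w, 1.0)
    return cpu_w, gpu_w, round(gpu_clock, 2)

# 28 W shared budget: a modest CPU load leaves the GPU at full clock;
# a heavy CPU boost starves it.
print(split_budget(28, cpu_demand_w=10))   # GPU keeps its 1.3 GHz
print(split_budget(28, cpu_demand_w=20))   # GPU clock drops
```

It is only a caricature, but it captures why high CPU clocks and low iGPU clocks show up together in the traces: under a fixed package limit, one side's boost is the other side's throttle.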
 

Shivansps

Diamond Member
Sep 11, 2013
Could the fact that it was never released to the public be why there are no good drivers?
Just spitballing, I have no idea.

They are working on the drivers; the latest driver is from December 14 and mentions this:
Error message seen when running Rise of the Tomb Raider* (DX12) on Intel® Iris® Xe Discrete graphics.

And known issues:
• [Intel® Iris® Xe Discrete graphics]: Intermittent crash or hang may be seen in Conan Exiles* ("Low End Laptop Mode" in game settings), Forza Horizon 4* (DX12), Forza Motorsport 6* (DX12), Spyro: Reignited Trilogy* (DX11).
• [Intel® Iris® Xe Discrete graphics]: Minor graphic anomalies may be observed in Assassin's Creed Valhalla* (DX12), Code Vein* (DX11), Death Stranding* (DX12), Microsoft Flight Simulator* (DX11), GRID 2019* (DX12).

The other reason they never launched this to the public is (I believe) that they don't want to create the impression that Intel GPUs are low end, so there's still a chance of the DG1 launching to the public after DG2. As I said, the BIOS ROM chip is on the GPU; the cards just need to be flashed with the BIOS.

But here is the thing: the DX11 and OpenGL implementation is good. I haven't seen any DX11 or OGL issues so far, though I know they exist because they're in the driver notes. And the performance is actually good for an 80EU GPU with 68GB/s; it performs like a GTX 750 Ti at times. I can't make a direct comparison because I no longer have a 750 Ti, but Witcher 3 performance on the DG1 is equal or slightly faster, that's for sure.

But Vulkan is just broken; every game I tried on Vulkan is unplayable due to graphical errors. DX12 also has these issues, but they are a lot less frequent. If DG2 gets delayed, I'm pretty sure it will be due to drivers, if the DG1's status is any indication.
 

Insert_Nickname

Diamond Member
May 6, 2012
But here is the thing: the DX11 and OpenGL implementation is good. I haven't seen any DX11 or OGL issues so far, though I know they exist because they're in the driver notes. And the performance is actually good for an 80EU GPU with 68GB/s; it performs like a GTX 750 Ti at times. I can't make a direct comparison because I no longer have a 750 Ti, but Witcher 3 performance on the DG1 is equal or slightly faster, that's for sure.

That's reassuring for compatibility with older titles, so I would say good news.

The DX12 and Vulkan implementations can always be fixed going forward, but I doubt anyone would put much effort into the older APIs after the fact.
 

Dayman1225

Golden Member
Aug 14, 2017
XMG has a laptop with Intel Arc launching in late Q2.

Edit: the article has been updated; the roadmaps are inaccurate and the Arc models are still in the planning stage.


 

Hulk

Diamond Member
Oct 9, 1999
Seems like GPUs are even more of a moving target than CPUs, and Intel, already being behind, is having trouble just jumping on the train and landing a product. There is no better time than right now to launch a GPU, though. If it's actually available and decent, they'll sell.
 

LightningZ71

Golden Member
Mar 10, 2017
Knowing Intel and their track record with discrete video cards, they'll get their card out about one week after the complete implosion of the crypto market, resulting in absolutely cratering GPU prices and them losing billions on the first generation. They'll rapidly exit the market, vowing never to try again, and we'll be left with the current duopoly to strangle us again in the future.
 

moinmoin

Diamond Member
Jun 1, 2017
Knowing Intel and their track record with discrete video cards, they'll get their card out about one week after the complete implosion of the crypto market, resulting in absolutely cratering GPU prices and them losing billions on the first generation. They'll rapidly exit the market, vowing never to try again, and we'll be left with the current duopoly to strangle us again in the future.
Right now it certainly seems like they are trying to get the timing right for such an event to happen.
 

Insert_Nickname

Diamond Member
May 6, 2012
Knowing Intel and their track record with discrete video cards, they'll get their card out about one week after the complete implosion of the crypto market, resulting in absolutely cratering GPU prices and them losing billions on the first generation. They'll rapidly exit the market, vowing never to try again, and we'll be left with the current duopoly to strangle us again in the future.

We're not that lucky... :(
 

gdansk

Golden Member
Feb 8, 2011
Knowing Intel and their track record with discrete video cards, they'll get their card out about one week after the complete implosion of the crypto market, resulting in absolutely cratering GPU prices and them losing billions on the first generation. They'll rapidly exit the market, vowing never to try again, and we'll be left with the current duopoly to strangle us again in the future.
Gelsinger, even before being offered the CEO position, did an interview where he seemed disappointed that Intel ceded data center compute to Nvidia. I don't think they will give up until after his retirement.
 

KompuKare

Golden Member
Jul 28, 2009
Right now it certainly seems like they are trying to get the timing right for such an event to happen.
This kind of thing is so hard to time (in)correctly, so it's good that Intel has so many resources it can bring to bear on this.
Maybe there are advantages in being so big that they could shrug off losing billions on Atom contra revenue, Larrabee, 5G modems, and so on: they didn't have to fire anyone for any of that (AFAIK), so maybe the geniuses responsible can be employed again?
 
Jul 27, 2020
Raja is such a COMPULSIVE lying POS. Millions of GPUs into the hands of miners most likely. Intel has blatantly told the world that it won't limit crypto-mining on their GPUs AT ALL, basically endorsing it.
 
Jul 27, 2020
Maybe there are advantages in being so big that they could shrug off losing billions on Atom contra revenue, Larrabee, 5G modems, and so on: they didn't have to fire anyone for any of that (AFAIK), so maybe the geniuses responsible can be employed again?
Too bad none of those "geniuses" had the courage to tell their superiors that their plans were crap and would lead to nothing but loss of time, money, and most importantly, precious man-hours.
 

jpiniero

Lifer
Oct 1, 2010
This kind of thing is so hard to time (in)correctly, so it's good that Intel has so many resources it can bring to bear on this.
Maybe there are advantages in being so big that they could shrug off losing billions on Atom contra revenue, Larrabe, 5G modems, and so on: they didn't have to fire anyone for any of that (AFAIK) so maybe the geniuses responsible can be employed again?

Assuming mining continues to be popular enough, they shouldn't have any problem making decent money on this.
 

Shivansps

Diamond Member
Sep 11, 2013
Raja is such a COMPULSIVE lying POS. Millions of GPUs into the hands of miners most likely. Intel has blatantly told the world that it won't limit crypto-mining on their GPUs AT ALL, basically endorsing it.

There is no software support for Intel GPU mining at the moment, at least not publicly. lolMiner definitely changed the OpenCL platform for Intel Arc GPUs, but mining doesn't work yet.
 
Jul 27, 2020
If Intel is serious about mining, they might get their best open-source programmers working on getting it running on their GPUs. Their software support has always been great, other than GPU drivers.
 

gdansk

Golden Member
Feb 8, 2011
They don't have to do anything. If it is profitable, it'll be ported quickly. For compute, Intel focuses on oneAPI rather than specific applications (like mining).

Gaming is a different story, since it will not be profitable for game developers to test on Xe. Intel has to pay that price to get their foot in the door.