News Intel GPUs - Intel launches A580

NTMBK

Lifer
Nov 14, 2011
10,233
5,014
136

Jon Peddie estimates that Intel has invested about $3.5 billion in its discrete GPU development and that these investments have yet to pay off. In fact, Intel's AXG has officially lost $2.1 billion since its formal establishment in Q1 2021. Given the track record of Pat Gelsinger, Intel's chief executive, who has scrapped six businesses since early 2021, JPR suggests that AXG might be next.

I hope they're wrong... I want the group to last long enough to release GPUs built on Intel processes.
 

moinmoin

Diamond Member
Jun 1, 2017
4,944
7,656
136
Well, great. Never investing enough into drivers for iGPUs, then building up from zero on short notice and being surprised that with lackluster software the hardware it's supposed to support appears lackluster as well. That this results in little acceptance in the market and little income to offset all the short-term investment, making it look like a straight loss, should surprise no one. I really have to wonder what Intel actually did between announcing dGPU development nearly 5 years ago and whenever (last year?) it noticed it had to actually show something. Too little, too late seems like a fitting summary.
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,691
136
Nearly 50 games and not a single one with DX11. Come on Intel, there are still more than enough old and even new titles without Vulkan/DX12. Just show us already how atrocious your drivers are for DX11 titles.

It might be easier to just use a DX11-to-Vulkan shim at this point. Optimizing directly for DX11 could be a year-long process. There is a performance hit, but it may be less than using the native drivers.

My 2c worth of opinion.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Nearly 50 games and not a single one with DX11. Come on Intel, there are still more than enough old and even new titles without Vulkan/DX12. Just show us already how atrocious your drivers are for DX11 titles.

Intel has clearly stated that tier-one games are going to be DX12/Vulkan only, and DX11 will be tier two, which still has a lot of optimization left to be done.
 

Tup3x

Senior member
Dec 31, 2016
960
942
136
It might be easier to just use a DX11-to-Vulkan shim at this point. Optimizing directly for DX11 could be a year-long process. There is a performance hit, but it may be less than using the native drivers.

My 2c worth of opinion.
Like what Samsung does on Android for the Xclipse 920: they use ANGLE for OpenGL ES. Doesn't seem to be too optimised... Real-world gaming performance is quite meh. I expected much more (even the Vulkan performance is lackluster).
 

gdansk

Platinum Member
Feb 8, 2011
2,081
2,562
136

I hope they're wrong... I want the group to last long enough to release GPUs built on Intel processes.
Jon Peddie said:
The best thing Intel could do at this juncture is to find a partner and sell off the group.
Who is going to want to buy a failed unit? Intel needs the IP for integrated graphics. One massive mistake (PV) and one under-performer (Alchemist) don't doom the group to future failure. Who else would pay for that? Intel needs to develop Xe anyway. If cost is an issue, they don't need to develop any more custom supercomputer versions of it -- that was Raja's Sisyphean task.

Intel needs to compete rather than constantly abandoning ship. ARM competition will slowly erode the lucrativeness of the server market. They need something else.
 
Last edited:

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,691
136
Like what Samsung does on Android for the Xclipse 920: they use ANGLE for OpenGL ES. Doesn't seem to be too optimised... Real-world gaming performance is quite meh. I expected much more (even the Vulkan performance is lackluster).

Shims are not a magic bullet by any means, but they may be the path of least resistance while they optimize the native drivers. I suppose it lets them get to "know" the architecture, what it can do and what doesn't work well. Don't forget NV/AMD have a 20-year head start in the optimization department.

If the game gets old enough, you can just brute force it. DX9-era games run on potatoes. Even something pushing DX9 to the limit like The Witcher 2 runs just fine on a modern IGP. The challenge there is making sure they run at all.
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,841
3,189
126
Can't say I didn't see this coming.
You don't need an Ivy League education to know how poorly Intel handled this.

Another site theorizing Intel will ditch dGPUs.


Seriously, the entire board at Intel should be fired.
Everyone from product manager all the way to CEO needs to be replaced.

They took a company that was considered a monopoly into a circus fiasco where learning why Raja is called Raja was more important than actually talking about real launch dates.
 

Tup3x

Senior member
Dec 31, 2016
960
942
136
I don't know what kind of monkeys are working as analysts there, but is there anyone who would expect them to make a profit while creating new things that they obviously can't sell yet? Sure, they are late and they clearly underestimated the software side, but to me at least it looks like things are definitely starting to change in the not-so-distant future.
 

biostud

Lifer
Feb 27, 2003
18,242
4,755
136
I don't know what kind of monkeys are working as analysts there, but is there anyone who would expect them to make a profit while creating new things that they obviously can't sell yet? Sure, they are late and they clearly underestimated the software side, but to me at least it looks like things are definitely starting to change in the not-so-distant future.
In a market as competitive as this, timing is everything. If you launch a product that is worse than your competitors' last-gen hardware, you will find it hard to earn any money. Imagine if they had had the product ready for sale a year ago with working drivers.
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,841
3,189
126
It wouldn't have mattered if it was a poor gaming card.
If it mined, it would have sold faster than Intel can play their iconic "ding ding ding ding".

The raw fact is they did a horrible job at timing.
It's like how Star Citizen will never get released, because they keep changing it, while the whole sci-fi MMO genre is almost dead, with the possible exception of Starfield, which is slated to come out later this year.

Back in 2019, the people who dropped a lot of money on video cards did not care if it pushed 4K, DLSS, and high frame rates.
They mostly cared about how many megahashes the card could pull and at what wattage.

I said this way back: they could have sold dookie, as long as it mined Ethereum while sipping electricity, sold it in droves at a 300% markup, and still probably would have had supply issues since the demand was there.
Then they could have worked on fixing the card for gamers after they profited from the miners.
I'm sure the gamers would have thanked the miners for this... funding R&D on a card so it can play games properly.

But no, they wanted a pretty doll which is not even high-tier but low to mid-low, and they still ended up with massive delays because the lipstick color didn't work out too great.

Intel needs to realize they are no longer a monopoly... and trying to step into dGPUs against NVIDIA and AMD, who have mega platforms established, is a losing battle without lots of life preservers, which miners could have provided.
 

maddie

Diamond Member
Jul 18, 2010
4,739
4,668
136
I don't know what kind of monkeys are working as analysts there, but is there anyone who would expect them to make a profit while creating new things that they obviously can't sell yet? Sure, they are late and they clearly underestimated the software side, but to me at least it looks like things are definitely starting to change in the not-so-distant future.
Nobody expected them to make a profit by now. The question being asked is: how much more do we need to spend, and for how many years, to at least start getting some positive returns? Then you start thinking about recouping the total investment. Intel is losing money now, and surely (hopefully) the insiders know the true state of affairs. They will be cutting; what and how much is the unknown.
 

moinmoin

Diamond Member
Jun 1, 2017
4,944
7,656
136
The question being asked is: how much more do we need to spend, and for how many years, to at least start getting some positive returns?
Right now they are spending, all over again, all the money their penny pinchers until very recently thought they had saved on the iGPU business. I really hope Pat realizes this madness.
 
  • Like
Reactions: Tlh97 and coercitiv

Heartbreaker

Diamond Member
Apr 3, 2006
4,226
5,228
136
In a market as competitive as this, timing is everything. If you launch a product that is worse than your competitors' last-gen hardware, you will find it hard to earn any money. Imagine if they had had the product ready for sale a year ago with working drivers.

Yes, in a fantasy world it turned out better. So?

Back in reality they are having a rough go of it.

Outsiders suggesting they give up and sell the unit are out to lunch. It's worthless outside Intel; the only way a third GPU player gets off the ground is with Intel supporting it.

Intel's big GPU failures were not buying ATI, and not pursuing GPUs sooner.

The need for GPU is only going to increase going forward. Intel needs to bite the bullet and power through and get generation two working better, and generation three better than generation two...

IMO if Intel bails out of GPUs this time, that points to a declining future.
 
  • Like
Reactions: MangoX

maddie

Diamond Member
Jul 18, 2010
4,739
4,668
136
Yes, in a fantasy world it turned out better. So?

Back in reality they are having a rough go of it.

Outsiders suggesting they give up and sell the unit are out to lunch. It's worthless outside Intel; the only way a third GPU player gets off the ground is with Intel supporting it.

Intel's big GPU failures were not buying ATI, and not pursuing GPUs sooner.

The need for GPU is only going to increase going forward. Intel needs to bite the bullet and power through and get generation two working better, and generation three better than generation two...

IMO if Intel bails out of GPUs this time, that points to a declining future.
Intel's problem is not a declining future but a declining present. The future is now. Lower-margin gamer GPUs are not a good investment when you're hemorrhaging money and will almost certainly continue to for several more years. Just compare the silicon needed for CPUs versus GPUs at a given sales price, plus all the essential software work required for each big game.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,783
7,117
136
Intel's big GPU failures were not buying ATI, and not pursuing GPUs sooner.

-It's weird to read this.

I feel like Intel would have bought ATI, seen the disastrous HD 2900 series release, then canned ATI dGPUs after a gen or two and relegated them to iGPU grunt work.

While everyone was definitely in DAAMIT mode for a while there with the AMD/ATI merge, AMD is the kind of company that seems to enter into markets with some degree of discretion and forethought.

They didn't have the luxury of just dropping a couple billion on a whim like Intel did.

In short, if Intel had bought ATI in 2006, we would have ended up with a GPU monopoly for Nvidia.
 

beginner99

Diamond Member
Jun 2, 2009
5,210
1,580
136
I don't know what kind of monkeys are working as analysts there but is there anyone that would expect them to make profit while creating new things that they obviously can't sell yet? Sure they are late and they clearly underestimated the software side but to me at least it looks like things are definitely starting to change in not so distant future.

Agree. They aren't really selling anything in big numbers, and a new product always has huge upfront R&D costs that on your books you should expect to take 10+ years to recoup, or else you are simply being unrealistic.

The reason for Intel entering the dGPU market is of course compute in the server space, but let's not forget it's also about volume. Modern fabs are simply darn expensive, as we all know here, and the more expensive they get, the more volume you need to make them worthwhile. dGPUs don't need to be super profitable as long as they provide additional volume to make the cost of the fabs worth it. Of course, with Intel's process woes, that currently doesn't really add up. But I guess the hope is it will once they sort that out.

Still, I have to say I'm not shocked it is not a smooth ride given the leader they chose for the dGPU division. It's something I simply can't wrap my head around: why would you think that guy would be helpful in the mission? See how things started to go better for AMD after he left?
 
  • Like
Reactions: MangoX

KompuKare

Golden Member
Jul 28, 2009
1,014
926
136
Still, I have to say I'm not shocked it is not a smooth ride given the leader they chose for the dGPU division. It's something I simply can't wrap my head around: why would you think that guy would be helpful in the mission? See how things started to go better for AMD after he left?

While I'm certainly no Koduri fan, I'm sure he was around when some of the initial goals of RDNA and RDNA2 were set. And while I'm also certain he doesn't deserve all the credit, wasn't it while he was there that Radeon drivers really started to become as polished as they are now?

Of course, I am always very suspicious of someone who seems to play far too much organisational politics, indulge in self-promotion, and say exactly what people want to hear (a yes-man). That last attribute may have got him into Intel in the first place, and from a lot of accounts Intel do seem to live by the first attribute: playing (organisational) politics. So most of the criticism levelled at Koduri is possibly correct.
 
  • Like
Reactions: igor_kavinski