News Intel GPUs - Intel launches A580


maddie

Diamond Member
Jul 18, 2010
4,740
4,674
136
I'm not sure whether to take this seriously. Due to all the delays and resulting contract penalties, Aurora is already known to be at cost at best, if not an outright loss leader for Intel. So Aurora GPUs raising the overall ASP seems off. Also, how small does the overall shipment have to be if 60k units are this significant?
It appears they took the total sales and divided by the consumer ASP to get sales volumes. A few very expensive GPUs will give falsely high unit numbers.
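A quick back-of-the-envelope sketch of that failure mode (all numbers purely hypothetical, just to show the mechanics of dividing revenue by the consumer ASP):

Code:
# Hypothetical figures only -- illustrating why a few very expensive
# datacenter GPUs inflate a units-from-revenue estimate.
consumer_units = 250_000     # assumed consumer cards shipped
consumer_asp = 300           # assumed consumer ASP, USD

datacenter_units = 60_000    # an Aurora-sized shipment
datacenter_asp = 10_000      # assumed datacenter ASP, USD

total_revenue = (consumer_units * consumer_asp
                 + datacenter_units * datacenter_asp)

# naive estimate: treat all revenue as if it were consumer cards
estimated_units = total_revenue / consumer_asp
actual_units = consumer_units + datacenter_units

print(f"estimated {estimated_units:,.0f} vs actual {actual_units:,}")
# estimated 2,250,000 vs actual 310,000 -- wildly inflated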
 

moinmoin

Diamond Member
Jun 1, 2017
4,952
7,661
136
It appears they took the total sales and divided by the consumer ASP to get sales volumes. A few very expensive GPUs will give falsely high unit numbers.
Makes sense. Sounds like there may well be some bookkeeping tricks involved on Intel's part, like still counting the whole revenue for Aurora as planned and billing the penalty somewhere else. Now that I write this, I think I actually recall that something like the latter already happened?

Edit: Yeah, we had just that back in July 2021...
 

NTMBK

Lifer
Nov 14, 2011
10,237
5,020
136
I'm not sure whether to take this seriously. Due to all the delays and resulting contract penalties, Aurora is already known to be at cost at best, if not an outright loss leader for Intel. So Aurora GPUs raising the overall ASP seems off. Also, how small does the overall shipment have to be if 60k units are this significant?

I'd expect a high ASP for specialised data centre GPUs, even if Intel is taking a bath due to low yields, R&D costs, etc. ASP is not the same as profit margin.
 

moinmoin

Diamond Member
Jun 1, 2017
4,952
7,661
136
I'd expect a high ASP for specialised data centre GPUs, even if Intel is taking a bath due to low yields, R&D costs, etc. ASP is not the same as profit margin.
The point was that revenue, margins, and "profit" for Aurora are all vastly diminished by the penalty Intel had to pay (around 50% of Aurora's revenue, see the link in my previous post) for the repeated delays. But it looks like it's perfectly fine to book revenue and "profit" and pretend the penalty never happened.

So, going back to JPR's correction of the number: not only is the Aurora effect a one-time thing, but once you account for the penalty it may well not exist at all.
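Rough sketch of that adjustment (the Aurora revenue figure is purely hypothetical; the ~50% penalty ratio is the one from the linked report):

Code:
# Hypothetical Aurora revenue; penalty ratio per the post above.
aurora_revenue = 300_000_000      # assumed booked revenue, USD
penalty = 0.5 * aurora_revenue    # delay penalty, ~50% of revenue

net_contribution = aurora_revenue - penalty
print(f"booked ${aurora_revenue:,} vs net ${net_contribution:,.0f}")
# booked $300,000,000 vs net $150,000,000 -- booking the full revenue
# while billing the penalty elsewhere makes Aurora's apparent ASP lift
# look twice as large as its net economics would justify.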
 

NTMBK

Lifer
Nov 14, 2011
10,237
5,020
136

On Friday afternoon, Intel published a letter by Jeff McVeigh, the company’s interim GM of their Accelerated Computing Systems and Graphics group (AXG). In it, McVeigh offered a brief update on the state of Intel’s server GPU product lineups and customer adoption. But, more importantly, his letter offered an update to Intel’s server GPU roadmap – and it’s a bit of a bombshell. In short, Intel is canceling multiple server GPU products that were planned to be released over the next year and a half – including their HPC-class Rialto Bridge GPU – and going all-in on Falcon Shores, whose trajectory has also been altered, delaying it until 2025. There’s a lot to digest here, so let’s dive right in.
 

blckgrffn

Diamond Member
May 1, 2003
9,127
3,066
136
www.teamjuchems.com
I thought the AnandTech article was a good read. I am guessing that we will still see the desktop refresh, as there have to be some hardware-level changes that they want to implement sooner rather than later.

Then again, Intel's real revenue driver on the GPU front has to be datacenter and laptop solutions; the desktop, in my mind, is an opportunistic grab that you know appealed to management. You are already building drivers for tens or hundreds of millions of laptop GPUs, and you're building out serious GPUs for the datacenter, so why not take the "easy" route and develop add-in GPUs that use all this "synergy"? Of course, it's never that easy.

Even Nvidia gives us a big tell about where they aim their solutions by adding a bunch of largely useless silicon to every GPU now and inventing reasons why we should want it. It's all technology that has to be enabled and is off by default (haha, unless it stealthily isn't).

In this case, with the datacenter skipping a refresh until 2025, are we really going to see a desktop cadence like the one that was mapped out? In my mind, only if it meshes with their laptop roadmaps. And if we don't get new Intel desktop GPUs until 2025? That seems fairly ludicrous, because they are likely to get gapped pretty hard by the cards from AMD and Nvidia through the end of the year.
 
  • Like
Reactions: Tlh97 and Aapje

maddie

Diamond Member
Jul 18, 2010
4,740
4,674
136
Maybe Intel sees that they have to gamble hard to catch up in both process and design. Risky.
 

coercitiv

Diamond Member
Jan 24, 2014
6,201
11,903
136
Maybe Intel sees that they have to gamble hard to catch up in both process and design. Risky.
Gambling got them where they are today. In the past 10 years they always prioritized leapfrogging the competition over delivering boring upgrades like clockwork. I believe this is their corporate mentality: they think they are the best and therefore must dominate the competition through sheer innovation and technical skill. Even their business model was built around domination. This behavior lingers on even today, with top management still talking about regaining leadership in X years. It's getting really old though, like an empire always threatening to strike back while sinking resources into secret Death Star projects instead of sending in the conventional army.

I hope what's happening now is less about a leadership moonshot and more about administering painful medicine, jettisoning compromised projects to make room for the return of the tick tock machine... or the tock machine, whatever works for them.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,227
5,228
136
Can we get an Intel subforum? ;)

Can you stop changing the thread title?

It's annoying when strange threads pop into my list that aren't recognizable as something I responded to.

I now understand why other boards lock that down...
 

moinmoin

Diamond Member
Jun 1, 2017
4,952
7,661
136
In the past 10 years they always prioritized leapfrogging the competition over delivering boring upgrades like clockwork.
But since the beginning of 14nm there has also always been that strange disconnect: wanting to leapfrog, delays while trying, and fillers that are even more boring upgrades, like all the Skylake derivatives.

Can you stop changing the thread title?
I think it's fine. The first part "Intel GPUs - " never changes, and the latter part is usually informative or good fun.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Then again, Intel's real revenue driver on the GPU front has to be datacenter and laptop solutions; the desktop, in my mind, is an opportunistic grab that you know appealed to management.

That's not how it worked, according to Q3 marketshare data and the availability of products.

Contrary to the expectation that Intel could bundle and sell a bunch of laptops and GPUs that way, there are almost no high-end Arc laptops (a handful of A380Ms).
Tell me, where are those numerous OEMs using A770Ms in laptops? At best there are one or two A730M designs.

OEMs are actually more worried about Intel's execution and driver issues. According to the marketshare data, it's in desktop dGPUs that they gained share.

Yes, they are very important. Also, do people realize that despite the revision Intel has more than 50% of AMD's volume in marketshare?

The established market doesn't matter as much, since that's already been sold. What if AMD had 1% and Intel had 8%? AMD's financials are not affected by already-sold Radeons. You can argue shipped vs. sold, but they are offering very good deals right now, which even big channels like HWUB have acknowledged.

In one video they even said that, in terms of volume, it did not sell poorly. It's just that, at best, they are breaking even per product to make that happen, when in reality it takes profit at the product level to even begin thinking about making the investment back.

But if trends continue until Battlemage, and since drivers will be much more mature by then, this could change quickly. They just need to stick to it.
 
Last edited:

blckgrffn

Diamond Member
May 1, 2003
9,127
3,066
136
www.teamjuchems.com
That's not how it worked, according to Q3 marketshare data and the availability of products.

Contrary to the expectation that Intel could bundle and sell a bunch of laptops and GPUs that way, there are almost no high-end Arc laptops (a handful of A380Ms).
Tell me, where are those numerous OEMs using A770Ms in laptops? At best there are one or two A730M designs.

I wasn't talking about laptop dGPUs. I was talking about the massive number of Xe/G7/etc. APUs/CPUs they sell. It's staggering. Writing good drivers for those is good business too.

OEMs are actually more worried about Intel's execution and driver issues. According to the marketshare data, it's in desktop dGPUs that they gained share.

They went from 0% to something more than 0%. Where are all the Intel-powered pre-builts? XPS? Legion? etc. That's just as valid as your laptop point above. Where are the OEM wins? They do have a lot of desktop integrated GPUs, yes, but it's way less likely that those serve double duty as business and pleasure the way the ones in a laptop do, because it's trivial to add something better to nearly any desktop. As for the OEM solutions, they might exist, but I haven't seen them and they aren't in my Costco coupon book. But which is it: OEM dGPU wins matter, or they don't?

Yes, they are very important. Also, do people realize that despite the revision Intel has more than 50% of AMD's volume in marketshare?

In value terms, one supercomputer accounting mistake wiped out half of the value of dGPUs shipped? lmao, clearly they should try to get more supercomputer wins, because if they had two Auroras they could be just as well off.

It's also still shipped-to-channel. Who is buying them? Are a bunch of them going into some back-office Dells on a hot buy as Intel looks to move stock? What long-term value does that have? I am sticking to my guns that AMD getting the consoles is huge, as it allows them near-permanent relevance. It essentially guarantees that all RDNA GPUs are optimization targets for all major game engines and titles. For day-one bugs and all the rest, it matters. How many titles are doing any performance or bug testing on Intel dGPUs right now unless they are paid to do so? I would imagine they test on some laptop and desktop integrated GPUs and call it good. That's all fine, but Intel themselves have told us how divergent the performance and capabilities of these products are. Supposedly.

The established market doesn't matter as much, since that's already been sold. What if AMD had 1% and Intel had 8%? AMD's financials are not affected by already-sold Radeons. You can argue shipped vs. sold, but they are offering very good deals right now, which even big channels like HWUB have acknowledged.

In one video they even said that, in terms of volume, it did not sell poorly. It's just that, at best, they are breaking even per product to make that happen, when in reality it takes profit at the product level to even begin thinking about making the investment back.

But if trends continue until Battlemage, and since drivers will be much more mature by then, this could change quickly. They just need to stick to it.

If deals sold hardware, AMD would have had plenty of great market winners in the past. The truth seems to be you have to be competitive at a high level - heck, maybe at the halo level. Merely sorta competitive at absolute performance while burning more power and using more silicon is a 290x approach to solving the problem. That didn't do AMD much good, and it's unlikely to do Intel much good in the coming months.

Yes, they need to keep at it if they want to be truly relevant. Of course, the real question is: will they? At least the drop in silicon demand likely bought them some reprieve on volume pricing. It'd be awesome for the market if they had a killer SKU, even for a few months. They desperately needed to launch 6-9 months earlier, if not sooner; we would have been happy with anything at that point. Bad for Intel is a softening hardware market and either AMD or, worst case, Nvidia getting serious about the $200-$300 price point on 5nm while Intel waits a year or two. They'll be simply outclassed.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,487
20,580
146
I'd very much like to hear from any Arc owners whether they were asked to submit to the Steam survey. I have done 7 or 8 fresh Windows installs in the last 6 months, and the only one that asked me to submit the survey when I installed Steam was the Nvidia build. I am not saying there is something fishy about the survey... no wait, yes I am. :p Where is Arc on it?

Dom has done a revisit of Arc, including a quick look at the BiFrost. I don't think he mentions that its biggest advantage over the LE is the ease of maintaining it later. He had one crash in Cyberpunk'd. And after testing another card and using DDU, the A770 required a CMOS clear every time. That sounds like something they can fix with a firmware update on the GPU?

The rest of the TLDW is that he saw the same performance gains as other sites. Though, as only a few have pointed out, there are still frustrating frame pacing issues in some games that need fixing, Gears 5 among them. TW: Warhammer 3 is a huge improvement on the order of CSGO, with the graphical issues eliminated too = well done. That is a massive time-sink series, not a Tomb Raider, where most people only run the benchmark now, making it silly to even use for testing.

 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,815
7,171
136
I'd very much like to hear from any Arc owners whether they were asked to submit to the Steam survey. I have done 7 or 8 fresh Windows installs in the last 6 months, and the only one that asked me to submit the survey when I installed Steam was the Nvidia build. I am not saying there is something fishy about the survey... no wait, yes I am. :p Where is Arc on it?

Dom has done a revisit of Arc, including a quick look at the BiFrost. I don't think he mentions that its biggest advantage over the LE is the ease of maintaining it later. He had one crash in Cyberpunk'd. And after testing another card and using DDU, the A770 required a CMOS clear every time. That sounds like something they can fix with a firmware update on the GPU?

The rest of the TLDW is that he saw the same performance gains as other sites. Though, as only a few have pointed out, there are still frustrating frame pacing issues in some games that need fixing, Gears 5 among them. TW: Warhammer 3 is a huge improvement on the order of CSGO, with the graphical issues eliminated too = well done. That is a massive time-sink series, not a Tomb Raider, where most people only run the benchmark now, making it silly to even use for testing.


- Funny you mention that. I was running my 2200G system for MONTHS with no request for a survey. Put in my old 980 Ti so my wife could play HWL and boom, survey request on the next start-up.

Could be that it just detected another user or some change to the system, but it's odd that it never thought to capture the Vega in the 2200G yet immediately looked to grab the 980 Ti.

Edit: For that matter my Sig Rig has never been polled by Steam either...
 
Last edited:

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,691
136
Writing good drivers for those is good business too.

That's exactly where Intel has always fallen short. It's not that their (IGP) drivers are -bad-*, it's that they could be so much better with a bit of effort. Their hardware is pretty capable; they just don't seem to care much about extracting maximum performance from it.

*Whatever else, they've (almost) always been both rock solid and rock stable. Intel also had a worrying habit of dropping support for perfectly good hardware after a few years. That at least seems to be in the past.
 
  • Like
Reactions: Tlh97 and blckgrffn

Leeea

Diamond Member
Apr 3, 2020
3,625
5,368
136
*Whatever else, they've (almost) always been both rock solid and rock stable. Intel also had a worrying habit of dropping support for perfectly good hardware after a few years. That at least seems to be in the past.
Um, no.

That statement is just massively delusional.

Intel's iGPU drivers are so awful they had to straight up disable dx12 on some chips:


Intel didn't get as much flak as it should have for its broken iGPU drivers, because most users just disabled the iGPU altogether and installed something from Nvidia, AMD, Matrox, VIA, etc. Basically anything but Intel.


Intel even breaks web browsers these days:
 
  • Like
Reactions: Tlh97 and moinmoin

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
I wasn't talking about laptop dGPUs. I was talking about the massive number of Xe/G7/etc. APUs/CPUs they sell. It's staggering. Writing good drivers for those is good business too.

OK, I get you. But you see, they started ramping up efforts as soon as the dGPU effort started. If you listen to what Tom Petersen said, before this they were oblivious to things that were obvious to us DIY enthusiasts.

But they are putting in real effort now. Sure, they should have done so before; "better late than never". And Battlemage will have a much better foundation. They are already putting out Game On driver support ahead of both companies, and in some titles that puts them within striking distance of the 3070.

They went from 0% to something more than 0%. Where are all the Intel-powered pre-builts? XPS? Legion? etc.

I am not talking about those OEMs. OEMs in general are not using them, hence why there are almost no Intel dGPU laptops, since with laptops there is no market outside of prebuilts.

Their sales are through individual cards at retailers like Newegg and Micro Center. The last report showed that A750/A770 availability lined up with when they got the 3-4% marketshare. It's clearly not Dell or prebuilts; it's people who want to support the third player. The DIY market.

If deals sold hardware, AMD would have had plenty of great market winners in the past. The truth seems to be you have to be competitive at a high level - heck, maybe at the halo level.

Yes, but AMD wasn't at zero. And overall I agree that this is not sustainable. They are trying to gain presence, and with the price cuts they are acknowledging the mistakes they made.

DIY is buying. Nowhere near Nvidia, but in terms of volume a significant portion of AMD's.

That's why I posted a while ago that it was a surprise. AMD has very little brand presence; Intel actually still has a lot. That's why I can believe them selling half of AMD's volume.
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,691
136
Um, no.

That statement is just massively delusional.

Wait, what? I've used Intel IGPs, and quite a lot of them, since the original "Extreme" Graphics*, and almost never had stability issues. True, the drivers are not fancy or necessarily suited for gaming. But they work, and are usually rock solid.

*Those things had so many other funny issues, but they weren't driver related as such.

Intel's iGPU drivers are so awful they had to straight up disable dx12 on some chips:

It says it right there in the link itself: it's a security issue. If you need DX12 on 4th gen, or can live with the risk, don't update your driver. I get why they'd quick-fix it this way for enterprise. It's probably not worth investing the resources to fix a bug in an almost 10-year-old IGP that is too slow to make use of DX12 anyway.
 

mikk

Diamond Member
May 15, 2012
4,140
2,154
136
4th gen is too slow for basically any DX12 game, so who cares? And that's 4th gen Core with Gen7.5 graphics; it doesn't tell us anything about iGPU drivers for newer architectures.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,487
20,580
146
ETAprime tests the A750. I don't understand why he keeps talking about used 3070s at the end. Completely different market here in the U.S. One is NIB with a 3-year warranty; the other is a crapshoot for how long it works correctly. That is, if it doesn't show up with the VRAM already worn out, like the 3060 I got from eBay.

 
  • Like
Reactions: Tlh97 and NTMBK
Feb 4, 2009
34,576
15,789
136
ETAprime tests the A750. I don't understand why he keeps talking about used 3070s at the end. Completely different market here in the U.S. One is NIB with a 3-year warranty; the other is a crapshoot for how long it works correctly. That is, if it doesn't show up with the VRAM already worn out, like the 3060 I got from eBay.

VRAM wears out?
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,487
20,580
146
VRAM wears out?
It degrades/becomes faulty if run long enough and hot enough, sometimes above its TJmax. Cryptominers trying to juice every hash are prime candidates for that kind of abuse.

The first card I ever had VRAM degrade on was an nV Ti 4600. By editing its firmware to underclock the VRAM enough, I was able to use it for another few months until I got a new card. With the 3060 I ordered, it went from code 43 at the desktop while trying to install drivers, to not being able to make it to Windows or complete a fresh install. The card could not have been more than 2 years old, almost certainly less than that. I bought it from a miner who was selling a good number of the exact same model in individual auctions. Correlation does not equal causation, but the smart money says that's what happened.