News Intel GPUs - Intel launches A580


Dayman1225

Golden Member
Aug 14, 2017
1,152
974
146
The latest 4148 beta driver brings Game On support for Deceive Inc. and the Diablo 4 beta. It also includes performance optimizations for Sons of the Forest.

Driver sizes
4148 Beta: 890MB
4146: 1GB
4125 Beta: 1.1GB
4123 Beta: 1.1GB
4091: 1.1GB
4090 Beta: 1.2GB
4032: 1.2GB
3975 Beta: 1.2GB - Unifies dGPU and iGPU drivers
3959: 1.2GB - DX9 optimization driver
3802: 1.2GB
3793 Beta: 1.2GB
3491 Beta: 1.3GB
3490: 1.3GB
The latest driver packages are noticeably smaller now.
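
A quick back-of-the-envelope check on that, comparing the newest package against the oldest in the list (a minimal sketch; 1.3GB treated as 1300MB):

```python
# Compare the newest driver package size against the oldest one listed.
sizes_mb = {"4148 Beta": 890, "3490": 1300}

newest = sizes_mb["4148 Beta"]
oldest = sizes_mb["3490"]

print(f"Newest is {newest / oldest:.0%} of the oldest "
      f"({1 - newest / oldest:.0%} smaller)")
# Output: Newest is 68% of the oldest (32% smaller)
```

So about a third smaller rather than half, but still a welcome trend.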
 

Exist50

Platinum Member
Aug 18, 2016
2,445
3,043
136
I think that Intel has to split up. Both the manufacturing and design sides have become so complex that each requires the full attention of management. Worse, the two sides seem to be making each other fail. The design problems leave the foundries waiting a long time for products to be ready for manufacturing, while the foundry problems cause issues for the design teams. The IFS plan is undermined by the foundries being focused on helping internal customers, which means a lack of good tooling and access for external customers, as well as apprehension among external customers about choosing Intel, for fear of helping products that compete with their own. External customers may also fear getting a second-rate experience, with the best stuff going to internal customers.

By splitting up the company, neither side can hide behind the failings of the rest of the company anymore. Intel Design would become a direct competitor to Nvidia and AMD, and would have to start shaping up so it can meet market demand with fewer designs, requiring a single stepping, or at least not a dozen, per design.

IFS has to learn how to attract customers that aren't forced to use them.
There's certainly some merit to that kind of split, but I don't see how it would work financially. At the end of the day, their design side is currently paying for all the manufacturing side's expansion. I think this would only be feasible years down the road when IFS is established.
They've mentioned they want to do a Mobileye-like spin-off for the foundry biz (where it's a public company with the majority owned and controlled by Intel; Intel owns 90-98% of Mobileye after the IPO, IIRC),
and they talked about the transition to an "internal foundry model" several months ago: https://www.intel.com/content/www/us/en/newsroom/news/intel-embraces-internal-foundry-model.html
That doesn't seem to be what they're saying. Instead, it seems to be more about holding both the design teams and fabs to a customer vs seller accountability model. So if the design team has a bug and needs to do another stepping, they have to pay for it. Likewise, if the fab misses targets, they have to eat the cost.
 

Mopetar

Diamond Member
Jan 31, 2011
7,837
5,992
136
Not his goal, at least for now. He mentioned he's leaving to start a software company around generative AI.

So he's hoping to get a bunch of venture capital to burn through on what will no doubt be a disappointing product whose best hope is being acquired by an even bigger and more clueless company?

He should have just thrown in something about using the blockchain as part of this generative AI to create a metaverse, and he'd have given everyone a blackout on their BS bingo card.
 

Mopetar

Diamond Member
Jan 31, 2011
7,837
5,992
136
Can he go to NVIDIA next? He would fit right in with Jensen.

JHH was one of the people who founded Nvidia. If he comes across as an arrogant prick, it's because he's earned it. No one who's founded a company would hire someone they think would only harm their company. Only a deranged parent would knowingly hurt their baby.
 
  • Like
Reactions: Lodix

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
JHH was one of the people who founded Nvidia. If he comes across as an arrogant prick, it's because he's earned it. No one who's founded a company would hire someone they think would only harm their company. Only a deranged parent would knowingly hurt their baby.

Remember how people say engineers should be selected as CEOs?

Well, unlike AMD/Intel, Jensen has been CEO from the very beginning of Nvidia, and he's an engineer at heart.
 

Aapje

Golden Member
Mar 21, 2022
1,382
1,864
106
Well, unlike AMD/Intel, Jensen has been CEO from the very beginning of Nvidia, and he's an engineer at heart.

And it shows, because Nvidia has never really lost their way when it comes to their actual technology, unlike Apple, AMD and Intel.
 

Mopetar

Diamond Member
Jan 31, 2011
7,837
5,992
136
Remember how people say engineers should be selected as CEOs?

Well, unlike AMD/Intel, Jensen has been CEO from the very beginning of Nvidia, and he's an engineer at heart.

Someone who founded a company, or has essentially been there since the start, is going to have a completely different perspective than someone with no real connection who has been managing or leading half a dozen other companies for decades before taking on the CEO role at company X.

I don't think you need to be an engineer to be a good CEO, but you do need to understand the company's core products, and if it's a company where engineers create those products, the CEO needs to know when to defer to their experience and expertise.

For a company like Intel, the CEO also needs to be someone the various lead engineers and upper-middle management respect and fear. Otherwise they get too busy trying to one-up each other to move up the corporate ladder.

A lot of Intel's woes likely come down to internal politicking and petty squabbles where the company ends up sabotaging itself. Microsoft was notorious for doing this while Ballmer was in charge because everyone thought he was an idiot and the various division heads were all trying to make themselves look like the next choice for CEO.
 

mikk

Diamond Member
May 15, 2012
4,140
2,154
136


RedGamingTech claims 64 Xe cores for Battlemage.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
I don't think you need to be an engineer to be a good CEO, but you do need to understand the company's core products, and if it's a company where engineers create those products, the CEO needs to know when to defer to their experience and expertise.

It's an engineer, a fab guy, or a finance guy. Those are the big choices. If you look at Intel: fab guy = Barrett, Krzanich; finance = Otellini, Swan.

It seems fab experience isn't necessarily good for the company, because there's a difference between relentless execution and future vision. They excel at the former but fail at the latter (of course, Krzanich failed miserably at both).

It's certainly not finance, because Otellini's mistakes are costing the company today. So from a practical perspective it has to be an engineer. When it comes to respect, Pat Gelsinger has loads of it. Groupthink always exists, though, and he's part of that group.

One article said there has to be a balance between vision and management, and his experience at VMware means he has both, plus the needed software perspective, which he didn't have back at Intel.

There's absolutely no doubt in my mind that a big part of the drivers actually improving (and rapidly) has to do with that, indirectly, through Gelsinger bringing in Lavender and reorganizing the group.
 
  • Like
Reactions: Tlh97 and Mopetar
Mar 11, 2004
23,075
5,557
146
And it shows, because Nvidia has never really lost their way when it comes to their actual technology, unlike Apple, AMD and Intel.

Except for doing their own custom CPUs, modems (most Tegra in general really), bumpgate, FX series...

Nvidia has screwed up their tech plenty. They rarely do so catastrophically, but there are instances of that. In fact, Nvidia might have sunk as much money as Intel into trying to get traction in the mobile space, only to fail for similar but also distinct reasons. Nvidia somehow failed even while leveraging ARM CPUs, primarily due to their own business practices: they kept screwing up their custom designs, then couldn't get their modem sorted, then screwed up custom CPUs again (I think they did that what, three times?), then tried forcing adoption through litigation, only for that to backfire as well, before they then tried to just buy ARM itself, and that failed too. So it's not like you have to look far to see Nvidia's failures.
 
Last edited:

KompuKare

Golden Member
Jul 28, 2009
1,016
932
136
Except for doing their own custom CPUs, modems (most Tegra in general really), bumpgate, FX series...

Nvidia has screwed up their tech plenty. They rarely do so catastrophically, but there are instances of that. In fact, Nvidia might have sunk as much money as Intel into trying to get traction in the mobile space, only to fail for similar but also distinct reasons. Nvidia somehow failed even while leveraging ARM CPUs because they kept screwing up their own custom designs, then couldn't get their modem sorted, then screwed up custom CPUs again (I think they did that what, three times?), then tried forcing adoption through litigation, only for that to backfire as well, before they then tried to just buy ARM itself, and that failed too. So it's not like you have to look far to see Nvidia's failures.
Of course, as a consumer, I mostly only cared about bumpgate, which caused me and people I know a lot of grief.

I guess the difference between the actual damage caused by all those millions of parts with defective solder and what Nvidia eventually paid out (only in the US; the rest of the world got nothing) was the money they could then squander on Tegra and modems!
 

KompuKare

Golden Member
Jul 28, 2009
1,016
932
136
They are "optimizing" it further: https://www.club386.com/cyberpunk-2...verdrive-mode-featuring-path-tracing-visuals/

Oh my! How the 4090 owners will gloat now.
That reads like an Nvidia press release. It probably is.
I guess they want a new meme: "but can it run Cyberpunk 2077?"

The difference being, Crysis wasn't designed just to be a vendor-sponsored GPU showcase.

Without fake frames, even a 4090 would be lucky to hit movie frame rates.

Anyway, aside from the fancy lighting and reflections everywhere, am I the only one who thinks the graphics in this hundreds-of-millions-of-dollars game look quite low-polygon after all the hype?
 

linkgoron

Platinum Member
Mar 9, 2005
2,298
818
136
A bit late to the party, but it's not very surprising that Koduri left/was fired after being demoted. I think that Intel can make Arc work with enough effort on the drivers etc., but it's hard to deny that it has been very late, probably missed many internal goals as well as performance goals, and that all the classic Koduri behavior from AMD was on display at Intel. You would expect the software and driver issues in particular to have been taken more seriously and executed better when someone with Koduri's experience is one of the leads of the project.
 
  • Like
Reactions: Vattila

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
I think Koduri being a problem is just a consequence of the real problem that has been plaguing Intel for the past two decades: culture and management.

If they were actually on the ball, then they would have been better a long time ago.

Ah well. Based on what @Exist50 is saying, Koduri's role has been minimal since December of last year. The general manager is now former driver head Lisa Pearce.
 
Jul 27, 2020
16,326
10,337
106
Anyway, aside from the fancy lighting and reflections everywhere, am I the only one who thinks the graphics in this hundreds-of-millions-of-dollars game look quite low-polygon after all the hype?
You are right. I was wondering why this game looks like some old game from the 2010s with fancy effects. Maybe they were going for a depressing-looking city. If so, yeah, it looks highly depressing.
 
  • Haha
Reactions: moinmoin

KompuKare

Golden Member
Jul 28, 2009
1,016
932
136
You are right. I was wondering why this game looks like some old game from the 2010s with fancy effects. Maybe they were going for a depressing-looking city. If so, yeah, it looks highly depressing.
There is a slight thing with sci-fi games: lots of surfaces can be super smooth, with no bumps, cracks, or bumpmaps, so they can get away with being quite low-polygon. Mass Effect scenes look similar, though.

But against that, something like Fallout 4 does have plenty of bashed, rusted, and run-down surfaces.
I think that in CP2077, having so many surfaces look perfect distracts from any kind of dystopian feel.
 

beginner99

Diamond Member
Jun 2, 2009
5,210
1,580
136
I think that Intel can make Arc work with enough effort on the drivers etc.
Maybe, maybe not. Currently Arc underperforms given its die size (and with that, its manufacturing cost). Even Intel's iGPUs have had this problem for years, so I suspect it's not just the drivers; the entire uArch needs to change. In general it would make sense for them to create affordable mid-range GPUs in large volume. They don't need to be the fastest or most efficient, but they do need to be profitable, which means better performance/mm² is required.
Such a high-volume part can fill up the factories better than CPUs alone.
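
As a rough illustration of the perf/mm² point, here's a minimal sketch. The die sizes are approximate public figures; the relative performance index is a hypothetical placeholder, so swap in your own benchmark averages:

```python
# Back-of-the-envelope perf-per-area comparison.
# Die sizes are approximate public figures; the relative performance
# index is a hypothetical placeholder, not measured benchmark data.
cards = {
    # name: (die area in mm^2, relative performance index)
    "Arc A770 (ACM-G10)":   (406, 1.00),  # ~406 mm^2, TSMC N6
    "RTX 3060 Ti (GA104)":  (392, 1.05),  # ~392 mm^2, Samsung 8N
    "RX 6700 XT (Navi 22)": (335, 1.05),  # ~335 mm^2, TSMC N7
}

base_area, base_perf = cards["Arc A770 (ACM-G10)"]

for name, (area, perf) in cards.items():
    relative = (perf / area) / (base_perf / base_area)
    print(f"{name}: {relative:.2f}x the A770's perf/mm^2")
```

Even with generous numbers, Arc needs the biggest die for roughly the same tier of performance, which is exactly the margin problem described above.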
 
  • Like
Reactions: Tlh97 and coercitiv

coercitiv

Diamond Member
Jan 24, 2014
6,203
11,909
136
Battlemage and Celestial are apparently on the books.

TSMC has reportedly won some very large production orders from Intel. The orders are not only for the next-generation Battlemage graphics processing units, but also for the follow-up Celestial architecture GPUs. Taiwan’s Commercial Times cites industry sources for its insider info, which also includes some tantalizing news nuggets regarding processes, timings and volume.
 

KompuKare

Golden Member
Jul 28, 2009
1,016
932
136
Battlemage and Celestial are apparently on the books.

Both a long way off apparently:
According to the source, “Intel will launch the second-generation Battlemage graphics chip with the Xe2 architecture in the second half of 2024, and the third-generation Celestial graphics chip with the Xe3 architecture in the second half of 2026.”
So that sounds like over a year until Battlemage, and around three years until Celestial.

Plenty of time to sort out legacy drivers then!
 
  • Like
Reactions: ZGR

IEC

Elite Member
Super Moderator
Jun 10, 2004
14,330
4,918
136
2H2024? That really doesn't bode well.

It's not like their competitors will be resting on their laurels. I'd expect the RTX 5000 series and RX 8000 series to have launched by then...
 
  • Like
Reactions: Lodix and ZGR