News Intel GPUs - Intel launches A580


Aapje

Golden Member
Mar 21, 2022
1,379
1,855
106
To me it's unclear why the DG2 driver is behind their regular iGPU driver. Not only is the build number much lower, the driver files like the Vulkan or the oneVPL GPU driver are older. The iGPU 1660 driver supports Vulkan 1.3, whereas DG2 is on Vulkan 1.2 with this newly released driver. However, I know that there is a unified driver in the works, although I don't know about the release schedule.

The much lower build number is what you expect if they made a separate driver based on the existing one and started counting at 1 again.

And Vulkan 1.3 support was released on April 1st for the iGPU driver, so it's no surprise that it's not yet in the other driver.
 

mikk

Diamond Member
May 15, 2012
4,133
2,136
136
And Vulkan 1.3 support was released on April 1st for the iGPU driver, so it's no surprise that it's not yet in the other driver.


The new DG2 driver came out 9 days later and it's not a surprise it didn't come with the iGPU driver feature set? :laughing: I would say it's shocking that their DG2 driver is so much behind after all these delays. We are not just talking about the 3D driver code; the media files are older as well. People were saying the iGPU issues and DG2 issues must be the same, but it might not be so easy, otherwise they could have easily used the iGPU driver. Apparently they cannot.
 

Aapje

Golden Member
Mar 21, 2022
1,379
1,855
106
The new DG2 driver came out 9 days later and it's not a surprise it didn't come with the iGPU driver feature set? :laughing: I would say it's shocking that their DG2 driver is so much behind after all these delays.

In a situation where there are different driver branches, it is indeed not a surprise that features added to one branch take a bit of time to be ported to the other. 'So much behind' is less than two weeks, which isn't actually a long time at all.

That they haven't given more priority to merging these drivers does suggest that they have been struggling with the DG2 driver, which is also shown by the driver not even properly identifying the Arc GPU.
 

mikk

Diamond Member
May 15, 2012
4,133
2,136
136
I think it's too early to say how Arc performs; this is not only a graphics driver issue at the moment. DTT slows down the Samsung Galaxy Book 2 Pro; in some cases it's twice as fast without DTT. And it's installed by default. What a failure.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
@mikk You are failing to realize that Dynamic Tuning is part of Deep Link and uses power sharing to optimize thermals.

Of course it performs better without DTT. The stuttering part needs to be tweaked, but with DTT the TDP is no longer separate between the two main chips but treated as one budget - say 50W for CPU + GPU combined. Without DTT it's 30W + 30W, and 60 is greater than 50!

I'm pretty sure if the CPU is set to 45W and the GPU set to 50W it would perform even better. What a surprise!
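
To make that arithmetic concrete, here is a minimal sketch in Python of the difference between two separate per-chip caps and a single shared budget. This is purely illustrative and not Intel's actual DTT algorithm; the 30W and 50W figures are just the example numbers from above.

```python
# Hypothetical sketch: separate per-chip caps vs. a shared DTT-style budget.
# The wattages are just the example numbers from the post, not real DTT policy.

def separate_caps(cpu_demand_w, gpu_demand_w, cpu_cap_w=30, gpu_cap_w=30):
    """Each chip is limited only by its own cap (no sharing)."""
    cpu = min(cpu_demand_w, cpu_cap_w)
    gpu = min(gpu_demand_w, gpu_cap_w)
    return cpu, gpu  # up to 30 + 30 = 60 W combined


def shared_budget(cpu_demand_w, gpu_demand_w, total_w=50):
    """CPU and GPU draw from one combined envelope (DTT-like idea)."""
    cpu = min(cpu_demand_w, total_w)
    gpu = min(gpu_demand_w, total_w - cpu)  # GPU only gets what is left over
    return cpu, gpu  # never more than 50 W combined


if __name__ == "__main__":
    # Both chips want full power, as in a demanding game:
    print(separate_caps(30, 30))  # -> (30, 30): 60 W total
    print(shared_budget(30, 30))  # -> (30, 20): 50 W total, the GPU gets squeezed
```

The second print is the "60 is greater than 50" point: under a shared 50W envelope, whatever the CPU takes is no longer available to the GPU, so the combined draw can never reach the 60W that two separate 30W caps would allow.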

This is also a good example of why discrete systems will always be faster, even with identical specs. The average frame rate problem may be resolved, but things like stuttering might never be, because you are sharing something. Ideally you can use power sharing between the two chips to allocate power between them and gain perf/W, but computers and AI are dumb, so you lose more than the theory suggests.

Thermals are an issue for notebook systems and manufacturers tend to be on the conservative side to avoid warranty issues. "Oh let's just unlock TDP levels to the maximum on a T&L!" That's actually a bit stupid for a reviewer to say. He should have said "DTT needs more tuning by Samsung and Intel to resolve stuttering issues and abnormally low frame rates in some cases".

Perhaps Intel needs to give users more knobs on Dynamic Tuning rather than assuming all the users have bricks for brains. I'd like to be able to adjust the total TDP and the TDP of the CPU and GPU. And you'd have a system-level thermal benchmark to tell the user whether it's safe or not to run at the user-set settings. Too high a TDP setting will cause problems down the road.
 
Last edited:

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Some are saying the launch might be as late as Q4 for desktop variants, with a special edition in Q3. So maybe an extremely limited launch in July, with BTS availability for the rest.
 
  • Wow
Reactions: NTMBK

mikk

Diamond Member
May 15, 2012
4,133
2,136
136
@mikk You are failing to realize that Dynamic Tuning is part of Deep Link and uses power sharing to optimize thermals.

Of course it performs better without DTT. The stuttering part needs to be tweaked, but with DTT the TDP is no longer separate between the two main chips but treated as one budget - say 50W for CPU + GPU combined. Without DTT it's 30W + 30W, and 60 is greater than 50!


It's clearly not working, don't you realize this? With this kind of performance penalty and stuttering, the game playability is far below that of an Xe LP 96EU ADL-P iGPU, so what is the point of a dGPU? In high-performance mode DTT should be disabled. It works in the 3DMark GPU test but not in real-world gaming. ADL-P 4+8 at 15-20W cannot properly feed the A350M in higher-fps games, which is the culprit for the low GPU utilization. This is not a graphics driver issue.


[Attached image: GY123bo.jpg]
 

eek2121

Platinum Member
Aug 2, 2005
2,930
4,026
136
It's clearly not working, don't you realize this? With this kind of performance penalty and stuttering, the game playability is far below that of an Xe LP 96EU ADL-P iGPU, so what is the point of a dGPU? In high-performance mode DTT should be disabled. It works in the 3DMark GPU test but not in real-world gaming. ADL-P 4+8 at 15-20W cannot properly feed the A350M in higher-fps games, which is the culprit for the low GPU utilization. This is not a graphics driver issue.


[Attached image: GY123bo.jpg]

Hey, it isn’t Intel’s fault Alder Lake is a power hog! oh wait…

Also, it could absolutely be a graphics driver issue. There are too many unknowns to rule any single thing out.

Is that the game’s CPU usage or system load?
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
It works in the 3DMark GPU test but not in real-world gaming. ADL-P 4+8 at 15-20W cannot properly feed the A350M in higher-fps games, which is the culprit for the low GPU utilization. This is not a graphics driver issue.

It IS working. With DTT on, the CPU is capped at 15W. They chose to set it at 15W. If they changed the DTT settings to 20W+, it would perform much better.

Again, 3DMark Time Spy is representative of the performance of games running at very low fps, at least for low-end GPUs. It won't be CPU-limited in Time Spy. Fire Strike is a different case; that's why it performs relatively poorly there.

Stuttering is a driver-based issue (yes, graphics as well, since it'll be part of the dynamic power-sharing system), but average fps would increase if they upped the power allocation for the CPU.

Laptops have thermal issues and are limited for a reason. If you look at idiotic reviews like that one, or the user responses, you'd think everyone would be happy with a 2-inch-thick, 10 lb desktop replacement running the CPU and GPU each at 100W. But no, they want that and expect it to be paper thin and featherlight too.
 

mikk

Diamond Member
May 15, 2012
4,133
2,136
136
It IS working. With DTT on, the CPU is capped at 15W. They chose to set it at 15W. If they changed the DTT settings to 20W+, it would perform much better.

Again, 3DMark Time Spy is representative of the performance of games running at very low fps, at least for low-end GPUs. It won't be CPU-limited in Time Spy. Fire Strike is a different case; that's why it performs relatively poorly there.

Stuttering is a driver-based issue (yes, graphics as well, since it'll be part of the dynamic power-sharing system), but average fps would increase if they upped the power allocation for the CPU.

Laptops have thermal issues and are limited for a reason. If you look at idiotic reviews like that one, or the user responses, you'd think everyone would be happy with a 2-inch-thick, 10 lb desktop replacement running the CPU and GPU each at 100W. But no, they want that and expect it to be paper thin and featherlight too.


By "doesn't work" I mean it makes no sense, because with DTT enabled it will be slower than the iGPU. 3DMark GPU scores are not affected, but CPU scores are. I'm not just talking about stuttering; the low GPU utilization in the first video with DTT enabled was obviously caused by a CPU bottleneck and is not a graphics driver issue.

The CPU can go higher than 15W in the video; it depends. Overall it seems to be limited to 35W for CPU+GPU combined with DTT. The A350M TGP is advertised by Intel at 25-35W; if I buy such a device I expect 25-35W and not 20W or less in the highest performance mode. If the OEM cannot cool it down properly they shouldn't use a dGPU. The Galaxy Book 2 Pro only uses one heatpipe, while the upcoming Acer 16x features two.
 

Aapje

Golden Member
Mar 21, 2022
1,379
1,855
106
The A350M TGP is advertised by Intel at 25-35W; if I buy such a device I expect 25-35W and not 20W or less in the highest performance mode. If the OEM cannot cool it down properly they shouldn't use a dGPU.

It's normal for laptop makers to set a lower TDP if their cooling cannot cope. The deceptive part is that they don't publicize it, so consumers don't know that two laptops with the same CPU and/or GPU actually don't perform the same.
 
Last edited:

xpea

Senior member
Feb 14, 2014
429
135
116
How many more delays before they just cancel it and move straight to Battlemage?
At this point it's really alarming and it will end up fighting against Lovelace. Good luck with that!
Basically, it looks like Alchemist is just a (very costly) beta test for Battlemage. Let's hope Pat won't sack this Gaming GPU division. We need competition...
 
  • Like
Reactions: psolord

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
By "doesn't work" I mean it makes no sense, because with DTT enabled it will be slower than the iGPU. 3DMark GPU scores are not affected, but CPU scores are. I'm not just talking about stuttering; the low GPU utilization in the first video with DTT enabled was obviously caused by a CPU bottleneck and is not a graphics driver issue.

If it's severely CPU-bottlenecked, they can just change the allocation so that more goes to the CPU. So a 35W total can mean 15W CPU + 20W GPU, or 20W CPU + 15W GPU. There's no point in having a 25W GPU if it's CPU-limited and 20W is fine.

Obviously the performance isn't as high as 25W+25W but it'll be better.
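
Purely as an illustration of that idea (not anything Intel has documented), a naive rebalancing rule might look like the sketch below; the 35W total, 5W step, 15W GPU floor and 90% utilization threshold are all assumed numbers.

```python
# Hypothetical sketch: shift a fixed 35 W budget toward the CPU when the GPU
# sits under-utilized (i.e. a CPU bottleneck). Illustrative numbers only.

def rebalance(cpu_w, gpu_w, gpu_util, total_w=35, step_w=5, min_gpu_w=15):
    """Return a new (cpu_w, gpu_w) split; the sum always stays at total_w."""
    assert cpu_w + gpu_w == total_w
    if gpu_util < 0.90 and gpu_w - step_w >= min_gpu_w:
        # GPU is waiting on the CPU: hand part of the shared budget to the CPU.
        cpu_w += step_w
        gpu_w -= step_w
    return cpu_w, gpu_w


if __name__ == "__main__":
    # Start at 15 W CPU + 20 W GPU with the GPU only 60% busy:
    print(rebalance(15, 20, gpu_util=0.60))  # -> (20, 15): more CPU, same 35 W total
```

The point is only that when the GPU sits under-utilized because it's waiting on the CPU, shifting part of the fixed envelope toward the CPU can raise average fps without the combined draw ever exceeding the 35W total.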

And a driver problem is a good speculation, as Intel drivers traditionally had problems allocating power between the two properly in an iGPU. Of course, the difference is that back then it prioritized the CPU too much.

Yes, the low utilization is a TDP allocation issue, but that's potentially just another part of the driver not handling things properly. Stuttering is a bigger issue though.

It's normal for laptop makers to set a lower TDP if their cooling cannot cope. The deceptive part is that they don't publicize it, so consumers don't know that two laptops with the same CPU and/or GPU actually don't perform the same.

Not being able to power it up fully is acceptable. The problem here is that it's not allocating the power properly. There are also stutters, which are a driver issue - not necessarily the GPU driver alone, but in this case the whole system, which includes the GPU drivers, DTT, etc.

At this point it's really alarming and it will end up fighting against Lovelace. Good luck with that!
Basically, it looks like Alchemist is just a (very costly) beta test for Battlemage. Let's hope Pat won't sack this Gaming GPU division. We need competition...

The only inkling of hope I have that they'll stick around a little longer is that Gelsinger seems to be taking the necessary risks that they should have taken 10 years ago. Pretty much after the reign of the founder CEOs ended, they were way too comfy in their position. Craig Barrett, Paul Otellini and Brian Krzanich just went about business as usual.

I like the company as a whole in that they can do much better, and their over-focus on margins was a potential issue. I wanted someone who could say "short-term sacrifice, long-term gain". Look at how the share price is tanking now. Average investors are slow to catch on and also a little stupid. But at least the CEO is making good decisions. He pretty much told the board and investors "screw your margins when the future of the company is in jeopardy".

They say they want leadership in graphics, AI and compute, and naturally if you have leadership in graphics it won't be a stretch for that to turn into a decent dGPU.
 
Last edited:

Aapje

Golden Member
Mar 21, 2022
1,379
1,855
106
How many more delays before they just cancel it and move straight to Battlemage?
The grapevine says that they've been churning them out and just aren't selling them yet (probably due to the driver issues). If so, this silicon surely will be sold eventually.

However, I can see them doing a paper launch for the discrete cards, where they make barely enough for the reviewers, but normal consumers can't actually get them. And them putting nearly all of them in laptops.
 

Tup3x

Senior member
Dec 31, 2016
959
942
136
The positive thing is that it doesn't look like the hardware is the issue. They sure overlooked the software though. I do wonder how far along they are with Battlemage...
 

Aapje

Golden Member
Mar 21, 2022
1,379
1,855
106
Wasn't that the plan all along? Just goes to show you shouldn't buy the first generation of anything.
It was never going to be competitive across the spectrum, but the optimistic take was that Intel was going to provide competitive low/mid end products, perhaps by making zero profit or a loss, to buy market share.

My expectation was that the drivers would sink the product, but I didn't expect it to be this bad.
 

mikk

Diamond Member
May 15, 2012
4,133
2,136
136
The positive thing is that it doesn't look like the hardware is the issue. They sure overlooked the software though. I do wonder how far along they are with Battlemage...


At this rate probably not before late 2023. Even the Arrow Lake iGPU is on Xe HPG, and that is a (H1?) 2024 platform. Lunar Lake should be Intel's first Xe2 iGPU platform, in H2 2024 or later. I would expect roughly 1 year between a dGPU implementation and a CPU+iGPU implementation. Maybe the gap can be narrowed with chiplets in the future, but not for now, including the first chiplet platforms MTL, ARL and LNL.
 

Ajay

Lifer
Jan 8, 2001
15,431
7,849
136
Wasn't that the plan all along? Just goes to show you shouldn't buy the first generation of anything.
Yes, that is often the case. I didn’t like the first Radeon product or the GeForce 256, and stuck with my Matrox (G200/400) & Voodoo2 combo for a while. GF2 was better - although the display circuitry was still terrible.
 

jpiniero

Lifer
Oct 1, 2010
14,585
5,208
136
It was never going to be competitive across the spectrum, but the optimistic take was that Intel was going to provide competitive low/mid end products, perhaps by making zero profit or a loss, to buy market share.

My expectation was that the drivers would sink the product, but I didn't expect it to be this bad.

Coulda made a boatload of money by selling mining AI cards...
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,691
136
Yes, that is often the case. I didn’t like the first Radeon product or the GeForce 256, and stuck with my Matrox (G200/400) & Voodoo2 combo for a while. GF2 was better - although the display circuitry was still terrible.

Having used both the 256/2/2MX, I still think my (Creative) Voodoo Banshee had better image quality than any of them. With Matrox right alongside.
 

Ajay

Lifer
Jan 8, 2001
15,431
7,849
136
Having used both the 256/2/2MX, I still think my (Creative) Voodoo Banshee had better image quality than any of them. With Matrox right alongside.
I’ll have to take your word for it. Number 9 and the Matrox Millennium series were excellent. Matrox G series kicked it up a notch, which impressed me.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Having used both the 256/2/2MX, I still think my (Creative) Voodoo Banshee had better image quality than any of them. With Matrox right alongside.

Back then displays were analog, so having proper display circuitry was critical. Ever since they moved to digital interfaces such as DVI and HDMI, it's not so important anymore.

Matrox was generally known to be top, and Creative was not too bad.

My first dGPU was a Creative Annihilator 2 MX. In my high school years, the box image showing a planet with the sun about to rise was pretty amazing to me. Also the name Annihilator. That was $199 CAD at London Drugs.

The GeForce 2 MX had the picture right before dawn, and with the GTS it was sunrise. It looked damn amazing.
 
Last edited: