News Intel GPUs - Battlemage rumoured cancelled (again)


Hitman928

Diamond Member
Apr 15, 2012
5,392
8,280
136

Hope this masquerading option comes to the Arc Windows drivers too. Would be particularly interesting to see what kind of optimizations games turn on (or off) when they detect a certain vendor.
This is kind of funny. It's not that pretending to be another vendor turns on AMD/NV optimizations that were artificially blocked from Intel GPUs, as was previously the case with AMD CPUs and Intel-optimized software. Rather, pretending to be another vendor turns off the hardware-accelerated XeSS code path, which is broken on Linux. So, in order to avoid their own broken features, they pretend to be a different vendor to get the game to turn them off. Glad they found a workaround, but the solution is pretty humorous.
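For anyone curious what this looks like on the game side, here's a rough Vulkan sketch (purely illustrative, not taken from any actual title or from Intel's drivers): the game reads the PCI vendor ID out of VkPhysicalDeviceProperties and only enables its hardware XeSS path when it sees Intel's 0x8086, so a driver that reports AMD's 0x1002 or NVIDIA's 0x10DE never hits that branch.

    /* Illustrative only: a game-style vendor check over Vulkan. If the driver
       masquerades as AMD/NVIDIA, the Intel branch (and with it the hardware
       XeSS path) is simply never taken. */
    #include <vulkan/vulkan.h>
    #include <stdint.h>
    #include <stdio.h>

    #define PCI_VENDOR_INTEL  0x8086u
    #define PCI_VENDOR_AMD    0x1002u
    #define PCI_VENDOR_NVIDIA 0x10DEu

    int main(void)
    {
        VkApplicationInfo app = {
            .sType = VK_STRUCTURE_TYPE_APPLICATION_INFO,
            .pApplicationName = "vendor-check-demo",
            .apiVersion = VK_API_VERSION_1_0,
        };
        VkInstanceCreateInfo ici = {
            .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO,
            .pApplicationInfo = &app,
        };
        VkInstance inst;
        if (vkCreateInstance(&ici, NULL, &inst) != VK_SUCCESS) {
            fprintf(stderr, "no Vulkan instance available\n");
            return 1;
        }

        uint32_t count = 0;
        vkEnumeratePhysicalDevices(inst, &count, NULL);
        if (count > 8) count = 8;
        VkPhysicalDevice gpus[8];
        vkEnumeratePhysicalDevices(inst, &count, gpus);

        for (uint32_t i = 0; i < count; i++) {
            VkPhysicalDeviceProperties props;
            vkGetPhysicalDeviceProperties(gpus[i], &props);
            /* The decision a game might make: hardware XeSS only on Intel. */
            int xess_hw = (props.vendorID == PCI_VENDOR_INTEL);
            printf("%s: vendorID=0x%04x -> hardware XeSS %s\n",
                   props.deviceName, (unsigned)props.vendorID,
                   xess_hw ? "enabled" : "skipped");
        }

        vkDestroyInstance(inst, NULL);
        return 0;
    }

Once the reported vendor ID changes, branches like that simply never fire, which is all the Linux workaround needs.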
 
Jul 27, 2020
16,820
10,767
106
It’s not that pretending to be another vendor turns on AMD/NV optimizations that were artificially blocked from Intel GPUs
I was thinking about the interesting stuff that could be found, especially with GeForce-sponsored games, if such an option were exposed in the Arc Windows drivers.
 

H433x0n

Senior member
Mar 15, 2023
926
1,013
96
Looks like Battlemage will have 2 dies, with the top die being comparable in size to ACM-G10 (Arc A770). Battlemage could perform at the 4070 Ti tier (or higher) if it does end up being ~400mm^2 of N4 silicon on a 256-bit memory bus.

source
 

Dayman1225

Golden Member
Aug 14, 2017
1,152
974
146
Intel has released a new driver that improves DX11 performance quite a bit across a wide range of games, as well as a new update/beta of PresentMon for performance analysis.


 

Heartbreaker

Diamond Member
Apr 3, 2006
4,238
5,244
136
This is kind of funny. It's not that pretending to be another vendor turns on AMD/NV optimizations that were artificially blocked from Intel GPUs, as was previously the case with AMD CPUs and Intel-optimized software. Rather, pretending to be another vendor turns off the hardware-accelerated XeSS code path, which is broken on Linux. So, in order to avoid their own broken features, they pretend to be a different vendor to get the game to turn them off. Glad they found a workaround, but the solution is pretty humorous.

Weird. Why would XeSS be running automatically? It's an optional upscaling path.
 
Jul 27, 2020
16,820
10,767
106

To provide further specifics, the layoffs include 10 GPU software development engineers, eight system software development engineers, six cloud software engineers, six product marketing engineers, and six system-on-chip design engineers.
That's not gonna help, Pat. That's not gonna help you at all!
 

Aapje

Golden Member
Mar 21, 2022
1,434
1,954
106
This is exactly the kind of card you'd expect to do well for the price, as it has to be priced cheaply relative to the power of the hardware because the drivers aren't that good. But optimizing for just one use case (Stable Diffusion) is relatively easy, so there the card can actually perform to the level of its hardware.
 

DrMrLordX

Lifer
Apr 27, 2000
21,709
10,983
136
Why? And if not, why even make entry-level GPUs, which would be perfect as fab filler?

They don't have the excess capacity they expected, and if past products are any indicator, their nodes aren't working well for GPUs for some reason (Ponte Vecchio was moved entirely off Intel nodes, and Intel is moving aggressively to take N3 wafers for GPUs etc.).
 
Jul 27, 2020
16,820
10,767
106
their nodes aren't working well for GPUs for some reason (Ponte Vecchio was moved entirely off Intel nodes, and Intel is moving aggressively to take N3 wafers for GPUs etc.).
No leaks regarding what the actual reason could be? They've been putting out iGPUs on the same silicon slab as the CPU for decades. What changes when it's just GPU on the silicon?
 

DrMrLordX

Lifer
Apr 27, 2000
21,709
10,983
136
No leaks regarding what the actual reason could be? They've been putting out iGPUs on the same silicon slab as the CPU for decades. What changes when it's just GPU on the silicon?

You got me. Nobody from Intel is saying why they ripped Intel 7 and Intel 4 out of Ponte Vecchio (I think they use one of the 10nm node variants for the interposer or whatever, but nothing else). Pretty sure the Meteor Lake iGPU is on a TSMC node, and it looks like all of Arrow Lake's iGPUs will be TSMC as well.

Battlemage will also probably be on TSMC.

The only node they seem to have significant capacity for right now is Intel 7 or Super 7 or whatever you want to call the latest 10nm iteration (10ESF+?). And they have already demonstrated that they would rather use N6 for GPU/iGPU over that node.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,238
5,244
136
Why? And if not, why even make entry-level GPUs, which would be perfect as fab filler?

IMO, Intel is far enough behind on GPUs that they can't afford to handicap their GPU efforts by using anything less than a top process, and their own process is still behind.

If/when they truly catch up to TSMC on process, they can do GPU on their in house process.
 

beginner99

Diamond Member
Jun 2, 2009
5,211
1,582
136
The only node they seem to have significant capacity for right now is Intel 7 or Super 7 or whatever you want to call the latest 10nm iteration (10ESF+?). And they have already demonstrated that they would rather use N6 for GPU/iGPU over that node.
So basically still the same issue of new processes not yielding well enough with larger dies. If their newer processes actually worked, I doubt they would go with TSMC.