igor_kavinski
Lifer
- Jul 27, 2020
- 28,173
- 19,203
- 146
Hope this masquerading option comes to the ARC Windows drivers too. Would be particularly interesting to see what kind of optimizations games turn on (or off) when they detect a certain vendor.
This is kind of funny. It’s not that pretending to be another vendor turns on AMD/NV optimizations that were artificially blocked from Intel GPUs, as was previously the case with AMD CPUs and Intel-optimized software. Rather, pretending to be another vendor turns off the hardware-accelerated XeSS code path, which is broken in Linux. So, in order to avoid their own broken features, they pretend to be a different vendor to get the game to turn them off. Glad they found a workaround, but the solution is pretty humorous.
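The workaround hinges on games branching on the GPU vendor ID the driver reports. A minimal sketch of that gating logic — the vendor IDs are the real PCI-SIG assignments, but the function name and the "Intel only" XeSS rule are illustrative assumptions, not any actual game's code:

```python
# Well-known PCI vendor IDs (real PCI-SIG assignments).
VENDOR_INTEL = 0x8086
VENDOR_NVIDIA = 0x10DE
VENDOR_AMD = 0x1002

def hw_xess_enabled(reported_vendor_id: int) -> bool:
    """Hypothetical game-side check: only enable the hardware-accelerated
    XeSS path when the driver reports an Intel GPU."""
    return reported_vendor_id == VENDOR_INTEL

# Masquerading: the Linux driver reports AMD's vendor ID instead of
# Intel's, so the game's check fails and the broken-on-Linux hardware
# XeSS path never activates.
assert hw_xess_enabled(VENDOR_INTEL)
assert not hw_xess_enabled(VENDOR_AMD)
```

The same mechanism cuts both ways, which is what makes the Windows-driver speculation above interesting: any vendor-keyed optimization (or de-optimization) would flip along with the reported ID.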
> It’s not that pretending to be another vendor turns on AMD/NV optimizations that were artificially blocked from Intel GPUs

I was thinking about the interesting stuff that could be found, especially with GeForce-sponsored games, if such an option were exposed in the ARC Windows drivers.
> Is fixing XeSS in Linux an option?

Now don't go saying crazy things.
> Weird. Why would XeSS be running automatically? It's an optional upscaling path.

It's not automatic, it's running fine on Windows.
> To provide further specifics, the layoffs include 10 GPU software development engineers, eight system software development engineers, six cloud software engineers, six product marketing engineers, and six system-on-chip design engineers.

That's not gonna help, Pat. That's not gonna help you at all!

> Doesn't exactly feel like they're investing in the GPU business :/

True but then they don't need to beat a 4090. All they need is the best deal in the $300-$500 space to fill up their factories.
> True but then they don't need to beat a 4090. All they need is the best deal in the $300-$500 space to fill up their factories.

And that is a reasonable goal to have as a new player in the game.
> True but then they don't need to beat a 4090. All they need is the best deal in the $300-$500 space to fill up their factories.

But their GPUs aren't using Intel nodes?
> But their GPUs aren't using Intel nodes?

They likely never will.
> But their GPUs aren't using Intel nodes?

Not yet. But I think that was the plan. Without using their own fabs it indeed doesn't seem to make much sense.
> They likely never will.

Why? And if not, why even make entry-level GPUs, which would be perfect as a fab filler?
> their nodes aren't working well for GPUs for some reason (Ponte Vecchio was moved entirely off Intel nodes, and Intel is moving aggressively to take N3 wafers for GPUs etc.)

No leaks regarding what the actual reason could be? They've been putting out iGPUs on the same silicon slab as the CPU for decades. What changes when it's just the GPU on the silicon?
> Nobody from Intel is saying why they ripped Intel 7 and Intel 4 out of Ponte Vecchio.

Possibly because of horrible power efficiency and the GPU thermal throttling without water cooling or something.
> The only node they seem to have significant capacity for right now is Intel 7 or Super 7 or whatever you want to call the latest 10nm iteration (10ESF+?). And they have already demonstrated that they would rather use N6 for GPU/iGPU over that node.

So basically still the same issue with new processes not yielding high enough with larger dies. If their newer processes would actually work, I doubt they would go with TSMC.
