
News Intel GPUs - we've given up on B770, where's Celestial already

It's not implausible since they pushed Alchemist back again and again. At a certain point, the next gen is ready.

Intel uses 1 year cycles for their CPUs and seems to want to have the same frequency for their GPU releases. It seems quite possible that Alchemist was actually aimed at Q4 at first, so then the next release would be aimed at the next Q4.

Of course, getting new hardware ready doesn't help with fixing the drivers.
 
I'm seeing a lot of 6700 XTs and 3070s on Amazon at not-that-horrible prices. Looks like Intel will have a tough time selling their cards if they price them high.
 
Intel missed an excellent opportunity, and by a wide margin.

Yeah. If they had launched 6 months ago, they could have made high margins selling every single card and gained market share. Now? I, and probably many others, will simply wait as prices come down, and the big question mark around their drivers will only really be overcome with more aggressive pricing.
 
If drivers are the cause of the delay now, they most assuredly were even more so back then. So what use would launching a product that would have been made a mockery of, with reviews showing a parade of visual and other hiccups, have been? The result would probably have been people saying "Tsk tsk Intel, first impressions.." It's unfortunate for us and for INTC that drivers are proving hard to master, but "Q1" has been questioned for a long time, and now, if we're lucky, it's late Q2. I just don't see the point of releasing an unfinished product.
 
Coulda sold it as a mining/AI card.
 
I don't understand how their Xe iGPU is playing games yet their dGPU with the same DNA is having driver issues. What did they change in the dGPU architecture that made it so different from the iGPU that their driver team can't wrap their heads around?
 
As I said earlier in this thread- at some point, it makes more sense to just cancel Alchemist and skip straight to Battlemage.
 
Not if they have warehouses full of chips already. Dumping most of them into laptops makes most sense to me now.
 
Xe iGPU also has issues. It's acceptable for an iGPU that costs very little but not for a dGPU.

Yes, but that doesn't excuse them taking this long.
 
Two different driver paths, from the looks of it. Far Cry 6 is a lot less broken on the dGPUs (only moderately broken skybox) than it is on the iGPUs too (entire screen is epilepsy inducing, sun turns into a black hole slowly consuming the entire screen until the game crashes).
 
IMHO, Intel can’t skip Alchemist due to OEM contracts and the need to keep using it as a learning experience in the field. They just need to keep pushing on, year after year, if they ever hope to catch up to the moving targets that AMD and NV represent. I remember buying the first Radeon card: for six months, it was agony.
 
When do they announce that the entire project has been scrapped?
Listening to Pat Gelsinger interviews from before he was offered the CEO position, it sounds like he considers allowing Nvidia into the data center to be Intel's biggest mistake. I don't think he'd cancel their GPGPU efforts.

Consumer GPUs might not be strictly necessary for that goal but they are good pipe cleaner products for any GPU Intel wants to make at TSMC.
 
One difference compared to Intel's other projects is that their GPU lineup also includes the iGPUs. So having a very competitive dGPU helps with their iGPUs and vice versa.

Also, many companies that used the excuse "we always wanted the datacenter, not client" said it shortly before their fall into oblivion. A competitive client product is much more difficult to make, and success there is easier to translate into workstation products than the other way around.
 
Rumor has it that AMD's next iGPU will be much more powerful, perhaps by adding a small RDNA2-based die to an MCM CPU.

In general I think that MCM blurs the lines between iGPUs and discrete GPUs, so it makes much more sense to do both.
 
It all depends on when someone at Intel presses the BDSM button (as they will be getting whipped raw by reviewers).
 