News Intel GPUs - Intel launches A580


IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
At this rate it will launch together with Battlemage. A bit sad that a company like Intel needs to use Chinese/Korean buyers as alpha testers for its drivers. It doesn't look like hardware is the issue, so I'd imagine this might not push back the Battlemage release date.

Battlemage is "2023-2024", so Alchemist won't overlap with BM at all.
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,839
3,174
126

You see, this is why we voted against paying the board. The Intel board is nothing but absolute TRASH that has no idea what timing and speculation are.

Delays again...

They completely missed the HOT segment where they could have sold dookie just because it's a GPU, even if it performed badly, as long as it was priced so people could afford it and was at least as good as a 3050.

But no...

The entire Intel board needs to be laid off... even if we have to pay them to get the hell out.
We need a Lisa Su counterpart @ Intel, because current Intel is NOT working.
They are taking the entire company under; soon it's gonna be a Cyrix moment all over again, watch.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Delays again...

The bad guys are already out. It wasn't just because of the "bad guys", but because they FIRED 20K employees or something over stupid reasons like "oh, you're too old" or "I don't like you", the latter of which was BK's legacy. Gelsinger hired 13K employees, not just high-profile ones like the Nehalem architect.

They are still in the process of changing the company. You don't change a 100K+ employee company overnight, especially when the problems had been brewing for more than a decade.

Also, the article about the delay is dated May 10th. It's June 1 now.
 
Jul 27, 2020
15,745
9,810
106
[attached image: driver performance chart]

Look at how abysmal the driver performance was before and how much it improved subsequently. They still have to figure out splatting.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,222
5,224
136
They completely missed the HOT segment where they could have sold dookie just because it's a GPU, even if it performed badly, as long as it was priced so people could afford it and was at least as good as a 3050.

But no...

The entire Intel board needs to be laid off... even if we have to pay them to get the hell out.
We need a Lisa Su counterpart @ Intel, because current Intel is NOT working.
They are taking the entire company under; soon it's gonna be a Cyrix moment all over again, watch.

You can't just jump into a new technology space by willing it.

Pat Gelsinger appears to be moving Intel in the right direction, bringing the focus back to engineering, but he's only been there since 2021.

If you want to compare him to Lisa Su, you need to give him a comparable amount of time.

Lisa Su took over AMD in 2014. It really wasn't until 2020 that their GPU division became competitive (RDNA) again. So she gets 6 years, but you won't give Gelsinger 2 years?

Plus AMD already had discrete GPUs as a core business, and it's a new business for Intel, so if anything Intel should get more time.
 

jpiniero

Lifer
Oct 1, 2010
14,510
5,159
136
The main thing is that Intel isn't likely to remain interested for long... especially if they feel like they would need to fab the gaming GPUs externally to have something sellable.

It's an easy thing to kill to cut costs.
 

JasonLD

Senior member
Aug 22, 2017
485
445
136
The main thing is that Intel isn't likely to remain interested for long... especially if they feel like they would need to fab the gaming GPUs externally to have something sellable.

It's an easy thing to kill to cut costs.

Unlikely, since big APUs are most likely to happen in the future.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,222
5,224
136
The main thing is that Intel isn't likely to remain interested for long... especially if they feel like they would need to fab the gaming GPUs externally to have something sellable.

It's an easy thing to kill to cut costs.

Intel should have been more forward-looking back in 2006 and snapped up ATI.

It's kind of stunning that it took them this long to realize the importance of GPUs.

I don't think anyone was realistically expecting great things from their first generation.

Hopefully they are in it for the long haul, and can make a credible product in a few years.
 

moinmoin

Diamond Member
Jun 1, 2017
4,933
7,619
136
It's kind of stunning that it took them this long to realize the importance of GPUs.
Intel knew all along the importance of GPUs. Intel is the company that put iGPUs in all its consumer chips; AMD is just catching up to that. Intel saw the danger of GPUs in datacenters early on, and that's why they pushed development of Larrabee/Xeon Phi starting in 2006.

The big mistake was first starving and then shuttering Xeon Phi without a direct replacement available in the product portfolio.
 

maddie

Diamond Member
Jul 18, 2010
4,723
4,627
136
Intel knew all along the importance of GPUs. Intel is the company that put iGPUs in all its consumer chips; AMD is just catching up to that. Intel saw the danger of GPUs in datacenters early on, and that's why they pushed development of Larrabee/Xeon Phi starting in 2006.

The big mistake was first starving and then shuttering Xeon Phi without a direct replacement available in the product portfolio.
Intel saw GPUs as trivial. Basic gaming, video output, etc. They thought that they could keep x86 as the center for parallel computing, which they did see as important, no doubt adding patented instructions to ensure dominance and competitor lock-out. The rapid scaling of GPUs must have sent shockwaves through management.
 

moinmoin

Diamond Member
Jun 1, 2017
4,933
7,619
136
Intel saw GPUs as trivial. Basic gaming, video output, etc. They thought that they could keep x86 as the center for parallel computing, which they did see as important, no doubt adding patented instructions to ensure dominance and competitor lock-out. The rapid scaling of GPUs must have sent shockwaves through management.
Larrabee/Xeon Phi was intended to be Intel's x86-based answer to the rapid scaling of GPUs. The Aurora exascale supercomputer was originally to be built around Xeon Phi and to be finished back in 2018(!). Intel's major mismanagement in that area was shuttering the Xeon Phi line before having an equivalent replacement. And the equivalent replacement still hasn't launched, and we are in 2022!

(@jpiniero, I still don't know what you were referring to with "Skylake Servers". Standard server chips never were and never could have been a replacement capable of running Aurora alone. Care to explain your response?)
 

maddie

Diamond Member
Jul 18, 2010
4,723
4,627
136
Larrabee/Xeon Phi was intended to be Intel's x86-based answer to the rapid scaling of GPUs. The Aurora exascale supercomputer was originally to be built around Xeon Phi and to be finished back in 2018(!). Intel's major mismanagement in that area was shuttering the Xeon Phi line before having an equivalent replacement. And the equivalent replacement still hasn't launched, and we are in 2022!

(@jpiniero, I still don't know what you were referring to with "Skylake Servers". Standard server chips never were and never could have been a replacement capable of running Aurora alone. Care to explain your response?)
When did Larrabee start being conceptualized? The 2004-2005 timeframe? GPUs had something like 100-200 shader cores. Intel thought they could stay in the game with x86. I can see the oracles at Intel saying, "control the language and you control the world".

I guess it's the same human failing that doomed so many companies: the inability to obsolete your own products. In this case, it's not even that, as both CPU & GPU can profitably coexist. Just myopic, stupid greed, "WE WANT ALL", and you end up way behind.

edit: Corrected shader core count
 

jpiniero

Lifer
Oct 1, 2010
14,510
5,159
136
(@jpiniero, I still don't know what you were referring to with "Skylake Servers". Standard server chips never were and never could have been a replacement capable of running Aurora alone. Care to explain your response?)

Intel did get several HPC deals that used just Skylake Server and no accelerators. I think they realized they would never be able to catch Nvidia in performance because of how big the CPU cores are, hence the GPU project.
 
Jul 27, 2020
15,745
9,810
106

"I always think we're 30 days from going out of business," Huang says. "That's never changed. It's not a fear of failure. It's really a fear of feeling complacent, and I don't ever want that to settle in."

Intel is the opposite of that. Until a few years ago, they thought themselves invincible and so big that no one could challenge them. That's the beginning of any company's downfall.
 

moinmoin

Diamond Member
Jun 1, 2017
4,933
7,619
136
When did Larrabee start being conceptualized? The 2004-2005 timeframe? GPUs had something like 100-200 shader cores. Intel thought they could stay in the game with x86. I can see the oracles at Intel saying, "control the language and you control the world".

I guess it's the same human failing that doomed so many companies: the inability to obsolete your own products. In this case, it's not even that, as both CPU & GPU can profitably coexist. Just myopic, stupid greed, "WE WANT ALL", and you end up way behind.

edit: Corrected shader core count
And no plan B, which I find most mind-boggling of all. An integral part of the myopic overconfidence, I guess.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Remember, it was the 10nm delay that led to the demise of Xeon Phi. The same management that lost the leadership of their strongest area won't have the foresight to do things like create a replacement.

The process delay actually started with 14nm. It was delayed by 6 months. That's why the first product was Core M, and it was pretty disappointing.
 

Saylick

Diamond Member
Sep 10, 2012
3,084
6,184
136
And no plan B, which I find most mind-boggling of all. An integral part of the myopic overconfidence, I guess.
Speaking of no plan B, the same thing happened with 10nm. There was really no fallback to hedge if 10nm didn't meet schedule.

They knew 10nm was super aggressive with its scaling targets, and they thought they could do it without EUV (to give them credit, EUV wasn't mature enough anyway when the 10nm targets were established). However, they really shot themselves in the foot by being headstrong and thinking that if they just threw more resources at the problem, they could get 10nm resolved. In hindsight, they should have pivoted away sooner from the aggressive scaling targets and/or decided to use EUV. Meanwhile, the stars aligned for TSMC, who were previously a year or two behind Intel, because EUV started to mature right when they were setting scaling targets that could benefit from it. They started placing orders for EUV machines while Intel still thought it didn't need them yet.

What Intel should have done is set up two separate teams: one designing 10nm without EUV and with relaxed scaling targets, the other designing 10nm with EUV and the original scaling targets. Whichever one looked more likely to meet schedule would get used first; ideally it would be the traditional DUV option. Then, when the EUV option became viable, it would get used. This is no different from what TSMC did by introducing small changes across iterative nodes. The point is, TSMC develops multiple nodes simultaneously. Intel did not.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Speaking of no plan B, the same thing happened with 10nm. There was really no fallback to hedge if 10nm didn't meet schedule.

I think what really compounded the issue is what you said, in addition to firing experienced staff and engineers in general.

You cannot fire people with invaluable experience, kill morale, and expect the most ambitious target to be met.
 

Leeea

Diamond Member
Apr 3, 2020
3,599
5,340
106
The dream of an Intel GPU seems to be fading further into irrelevancy every day now.

A marketing department shout fading to an echo, and then into vaporware.
 

NTMBK

Lifer
Nov 14, 2011
10,208
4,940
136

[image: Intel Arc desktop graphics card]


Look, a desktop card! Shame about the drivers.