News Intel GPUs - Intel launches A580


TheELF

Diamond Member
Dec 22, 2012
3,967
720
126
Man, if these numbers are to be believed... this isn't looking good for Intel compared to what many had hoped.

They are comparing the A770 to a mobile 85 W 3060, and it wins by an average of 12%. Keep in mind the desktop 3060 is about 20% faster on average than the mobile version, so the desktop 3060 ends up around 10% faster than the Intel A770.
It's also a mobile 770, and while the average is 12%, when the 770 wins it wins by about 30%, and when it loses it loses by around 10-15%.
It's looking like Intel is well over half a decade behind NV and AMD, who aren't going to slow down.
For several years now, AMD and Nvidia have moved to ray tracing and AI resolution scaling because the straight-up performance increase has been incredibly slow.
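The arithmetic above can be sanity-checked quickly. This is just a sketch using the figures assumed in the post (12% average win over the mobile 3060, desktop 3060 ~20% faster than mobile); compounding the ratios rather than subtracting percentages gives roughly 7%, the same ballpark as the ~10% quoted:

```python
# Relative-performance sketch using the figures quoted above
# (assumptions from the post, not measurements).
mobile_3060 = 1.00                  # baseline: mobile 85 W RTX 3060
a770 = mobile_3060 * 1.12           # A770 wins by ~12% on average
desktop_3060 = mobile_3060 * 1.20   # desktop 3060 is ~20% faster than mobile

# Compare desktop 3060 against the A770 by dividing the ratios:
gap = desktop_3060 / a770 - 1
print(f"Desktop 3060 vs. A770: ~{gap:.0%} faster")  # prints ~7%
```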
 
  • Like
Reactions: Leeea

Thunder 57

Platinum Member
Aug 19, 2007
2,647
3,706
136
I wonder how you defended Prescott back in the day. I mean, it seems like Intel can do no wrong even when they release a turd like this. Why else would they limit it to China, just like Cannon Lake? That was another piece of crap.

I also have to disagree about the performance increase being incredibly slow. Since GPUs are so parallel in nature, as long as density improves, performance will follow. AMD's current products make a mockery out of Vega. NVIDIA has been excelling (except on price) for generations now.

Raja seems like a village idiot after the Vega fiasco. No idea why Intel wanted him; just look at what we are seeing: delayed garbage. I suspect drivers can improve performance quite a bit, but I wouldn't recommend one until they show they can do that.
 

mikk

Diamond Member
May 15, 2012
4,111
2,105
136
It was to be expected. Someone with connections in Intel (I need to search the forum to give credit) said that they outsourced product validation, hence the countless delays.

It would have been nice to have more competition in the GPU market, but it is not happening any time soon.

Edit: https://forums.anandtech.com/thread...ure-lakes-rapids-thread.2509080/post-40770411
Thanks @Exist50 !


He speaks about the server validation team; how is this related to GPUs, and in particular GPU drivers?

I watched several of the Chinese reviews, and from what I could see there weren't any obvious compatibility issues or major graphics glitches. Also, their known-issues list for the Arc driver is short compared to the iGPU driver's (which runs on much newer build numbers).

I think Intel mainly tried to improve compatibility; once this is sorted out they can tackle the performance issues. They have to, because this is not only dGPU related: they also have at least two upcoming CPU generations using this architecture for the iGPU. Battlemage is also a Gen12 derivative, even though it's Gen 12.9.
 
  • Like
Reactions: Leeea
Jul 27, 2020
15,738
9,796
106
Intel A770:
21.7B transistors

GeForce 3060:
12B transistors
Almost double the number of transistors and it's still having a hard time? What did they spend their transistor budget on? The performance deficit would make sense if it's chock-full of enterprise-specific features and gaming workloads run on it almost as an afterthought, in some kind of emulation mode.
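Putting rough numbers on the transistor-budget complaint (a sketch; the die figures are the public 21.7B/12B counts, and the ~10% desktop-3060 performance edge is the estimate from earlier in the thread, not a measurement):

```python
# Perf-per-transistor back-of-the-envelope, using public die specs.
a770_transistors = 21.7e9     # Intel ACM-G10 (Arc A770)
rtx3060_transistors = 12.0e9  # NVIDIA GA106 (GeForce RTX 3060)

transistor_ratio = a770_transistors / rtx3060_transistors  # ~1.81x
# If the desktop 3060 is also ~10% faster on top of using fewer transistors:
perf_per_transistor_gap = transistor_ratio * 1.10          # ~2.0x
print(f"A770 uses {transistor_ratio:.2f}x the transistors of a 3060")
print(f"Rough perf-per-transistor edge for NVIDIA: ~{perf_per_transistor_gap:.1f}x")
```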
 
Jul 27, 2020
15,738
9,796
106
Does anyone know anything about the rumored Arc Special Edition that they have been showing off inside a glass box? Is it going to be faster than the A770, and maybe have 24GB of RAM or more?
 
  • Like
Reactions: Tlh97 and Leeea

GodisanAtheist

Diamond Member
Nov 16, 2006
6,715
7,004
136
I've been thinking, a lot of people are saying Intel launched at the wrong time... but maybe they launched at the right time for us.

Nvidia and AMD have a *STRONG* incentive to launch the next gen in high volume and low prices to strangle Intel "in the cradle" so to speak. I'm sure none of the established players would be happy with a disruptive third player, especially one like Intel.
 
Jul 27, 2020
15,738
9,796
106
  • XeSS (Xe Super Sampling) AI-Assisted Super Sampling Technology
  • XeGTAO (Ground-Truth Ambient Occlusion)
  • Xe DeepLink Technology
  • XeXMX with Xe-Cores
These technologies, if they turn out to be very effective, may gain a following among Intel fans, despite lower GPU performance.
 
  • Like
Reactions: Tlh97 and Leeea

randomhero

Member
Apr 28, 2020
180
247
86
He speaks about the server validation team; how is this related to GPUs, and in particular GPU drivers?

I watched several of the Chinese reviews, and from what I could see there weren't any obvious compatibility issues or major graphics glitches. Also, their known-issues list for the Arc driver is short compared to the iGPU driver's (which runs on much newer build numbers).

I think Intel mainly tried to improve compatibility; once this is sorted out they can tackle the performance issues. They have to, because this is not only dGPU related: they also have at least two upcoming CPU generations using this architecture for the iGPU. Battlemage is also a Gen12 derivative, even though it's Gen 12.9.
Cough*Ponte Vecchio*cough

Being serious, Intel's GPU efforts show a lack of resources; the best example of that pattern is their competitor AMD from 2013 to 2020.
What resources they lack can be argued, but that post from Exist50 is a great clue, with money being last on the list.
 
  • Like
Reactions: Tlh97 and Leeea

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
They laid off engineers, many experienced in all areas, not just server.

They've been struggling with GPUs for years, and that was just with iGPUs. This will be one big test of whether the company can get out of its comfort zone and expand.

I still think blaming it entirely on Raja is simple-minded. Nothing is usually that simple, especially for a company as big as Intel, and especially when they have had crap GPUs since forever. When you have greater influence, like a CEO does, it's different, and even then it takes years to shift things. At a small, underfunded company like AMD during the Vega days, it's much easier for one person to influence things.

As SemiAnalysis says, he was handed the industry's worst GPU architecture, despite what some believe. Scaling up and expecting everything to work smoothly isn't realistic. You need to work on the low-level details to do it properly.
 
Jul 27, 2020
15,738
9,796
106
As SemiAnalysis says, he was handed the industry's worst GPU architecture, despite what some believe.
Wasn't it his job to fix what was wrong with that architecture? Or design a new architecture for them? Going with something that was broken from the start would obviously be a predictable disaster. I don't believe that Raja had no choice in the matter. He did influence Intel's GPU design and he made some bad short-sighted decisions. He was unable to predict where AMD/Nvidia would be performance-wise by the time Intel could ship their GPU. Or he was correct but the delays messed things up for his predictions. If it's the latter, I still have a hard time believing that his hands were completely tied. Maybe he just sat back and drank his tequila, enjoying the sight of Intel bunnies running around frantically with one defective tape-out after another and grinning to himself thinking, "Hey! I think I'm really gonna enjoy my time here."

They say if you deliver stuff too quickly, everyone will think it wasn't that hard. But if you mix it up with delays and some drama, you will be given an award.
 

Mopetar

Diamond Member
Jan 31, 2011
7,797
5,899
136
I don't even get why they've struggled so badly. A GPU is by no means trivial to design, but Apple managed to make a fairly competent GPU for their M1 chips. It doesn't run DX12, so it's difficult to compare it directly against AMD and NVidia, but it does show that it's possible for another company to design a product that competes well enough.

The only thing I can think of is that upper management is either too incompetent to find someone who can run a successful GPU division, or too toxic to allow something like that to happen.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,222
5,224
136
I've been thinking, a lot of people are saying Intel launched at the wrong time... but maybe they launched at the right time for us.

Nvidia and AMD have a *STRONG* incentive to launch the next gen in high volume and low prices to strangle Intel "in the cradle" so to speak. I'm sure none of the established players would be happy with a disruptive third player, especially one like Intel.

NVidia and AMD strangling Intel "in the cradle" not good for us.

What would be good for us, is Intel staying in the market and iterating until they have a competitive product to create a three way competition.
 

Leeea

Diamond Member
Apr 3, 2020
3,599
5,340
106
Apple managed to make a fairly competent GPU for their M1 chips
I would disagree.

Apple's $2,000 (Best Buy, cheapest) M1 Max GPU loses out to a 3050 Ti Laptop GPU at 1440p in Borderlands.
In Total War: Three Kingdoms, the M1 Max loses out to a GTX 1080 Mobile laptop GPU.

Apparently the M1 Ultra is the best Apple has, but I have not located any gaming benchmarks for it.

Edit:
I found an M1 Ultra benchmark:
For a mere $4,000 at Best Buy, you can get a system that will hit 90 fps in Shadow of the Tomb Raider at 1440p. Basically a 3060 Ti level of performance.
 
Last edited:

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
I would disagree.

Apple's $3,500 M1 Max GPU loses out to a 3050 Ti Laptop GPU at 1440p in Borderlands.
In Total War: Three Kingdoms, the M1 Max loses out to a GTX 1080 Mobile laptop GPU.

Apparently the M1 Ultra is the best Apple has, but I have not located any gaming benchmarks for it.

LTT just posted a video today with game benchmarks. However, there are very few games that run natively on Apple silicon. The link you posted doesn't take this into account. Running games in emulation is always going to be slower.
 

Leeea

Diamond Member
Apr 3, 2020
3,599
5,340
106
LTT just posted a video today with game benchmarks. However, there are very few games that run natively on Apple silicon. The link you posted doesn't take this into account. Running games in emulation is always going to be slower.
Yes, what you say is true.

But it is a straight-up slaughter in the LTT video. WoW was the only game it was OK with, although driver issues were apparent.

Interestingly, Shadow of the Tomb Raider does use Apple's Metal graphics API.
 

linkgoron

Platinum Member
Mar 9, 2005
2,286
809
136
Sadly, I can't say I'm surprised by how bad Arc is. Hopefully by Battlemage they'll have the drivers sorted out and be way more competitive. Having a 6nm 150mm^2 card perform worse than a three-year-old 12nm 200mm^2 card (the 1650), or a 75% cut-down 6nm 100mm^2 card (the 6400), is just sad. I believe they can extract way more performance, but it seems like they're not really close.

Hopefully now that Raja Koduri has been promoted to EVP and is in the driver's seat (no pun intended), things will get fixed!
 
  • Like
Reactions: Tlh97 and Leeea

itsmydamnation

Platinum Member
Feb 6, 2011
2,743
3,069
136
Great point! And Apple did it using PowerVR tech, which used to be so bad that Imagination Technologies had to bow out of the PC market.
That's not remotely an accurate reflection.
Funny how AMD and NV GPUs became far more like TBDRs as pixel complexity increased.

Sometimes it's about the right technology at the right time. TBDR local-storage costs were too high back in the day; now large data movement is way too expensive.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Yes, what you say is true.

But it is a straight-up slaughter in the LTT video. WoW was the only game it was OK with, although driver issues were apparent.

Interestingly, Shadow of the Tomb Raider does use Apple's Metal graphics API.

The graphical glitches in WoW actually impact all video cards. It's a bug during that particular fight.
 

HurleyBird

Platinum Member
Apr 22, 2003
2,670
1,250
136
Hopefully now that Raja Koduri has been promoted to EVP and is in the driver's seat (no pun intended), things will get fixed!

Or it dooms them to mediocrity.

The combination of high-profile failures (there are so many similarities between Arc and Vega), his penchant for self-promotion, his aptitude for ladder climbing... my intuition tells me that Raja is probably the problem and not the solution, or at the very least, if he isn't steering into the rough water personally, he's not the man to right the ship. I could be, and hope that I am, completely wrong. A viable third option would be a godsend. Admittedly, I don't know much about the man, but what I can observe gives me a very slight Elizabeth Holmes vibe.