News Intel GPUs - more reviews coming in!

Page 142 - AnandTech Forums

DAPUNISHER

Super Moderator and Elite Member
Moderator
Aug 22, 2001
25,122
10,026
146

StinkyPinky

Diamond Member
Jul 6, 2002
6,725
718
126
Quite impressive in Metro Exodus. It actually does look like Intel has more ray tracing oomph than NVIDIA, and it can really shine in games like these. Also, the higher the resolution, the better it does relative to the competition.
View attachment 68729
Somewhat similar case in Control with RT enabled.

It probably should be trading blows with RTX 3070 class cards if the drivers didn't hold it back.
Crazy, the 750 destroys them in this benchmark and yet dribbles along like a man with a bad prostate in some other games. I think Civ 6 has DX11 and DX12 versions; benchmarks of both would be interesting.
 

blckgrffn

Diamond Member
May 1, 2003
8,593
2,105
136
www.teamjuchems.com
LIMITED EDITION is just the name, not an attribute of the SKU. The LIMITED EDITION cards are as limited as Sapphire TOXIC cards are toxic! :p
Yes, yes, yes. OK.

They said stock of the 16GB was going to be really limited and that the generally available stock was going to be the 8GB variants of both cards.

Can you imagine if Nvidia had dropped a 3070 Ti in 8GB and 16GB flavors, and then, when the original shipment of 16GB cards sold out, never made any more, but the first run of reviews lauded the 3070 Ti for alleviating the memory issues that the lesser SKUs faced? That wouldn't age well either.

Hopefully Intel bothers to create more 16GB cards. That will be a test of their commitment to the bit, imo.

Written KitGuru reviews for both models:

https://www.kitguru.net/components/graphic-cards/dominic-moass/intel-arc-a770-limited-edition-review/

https://www.kitguru.net/components/graphic-cards/dominic-moass/intel-arc-a750-limited-edition-review/

He notes that despite okay fps, the cards were borderline unplayable in Days Gone, not Cyberpunk as I first wrote. It gets worse from there.
I am going to quote the conclusion:

"Throughout my testing, I experienced incredibly poor frame times in certain games, visual glitches that affected two of the twelve games I wanted to benchmark, as well as game crashes and even system BSODs. Performance in DX11 titles is also a huge problem for Arc, while Rebar is absolutely essential for a hope of a smooth gaming experience. I'd add to that by saying I wasn't trying to go out of my way to find problems. I simply set out to benchmark a wide variety of titles, and this was my experience.

The problems are so varied and significant that it becomes impossible to recommend buying an ARC GPU right now. While the $349 price point for the A770 Limited Edition certainly looks good on paper, multiple RX 6650 XTs are currently selling on Newegg.com for between $300-329. That GPU offers performance that is broadly similar but on a platform that is just head and shoulders above Arc in terms of stability and consistency.

I'd also add to that with a word of caution on the A770 itself. My testing doesn't show a particularly large delta between the A750 and the A770, with the latter card 9% faster on average at 1080p, despite being priced 21% higher. The A770 8GB model may well make more sense, but certainly, the A750 looks more attractive on paper.

That point is entirely academic at present though, as currently, we are not in a position to recommend any of the Arc lineup – Intel has plenty more work to do before we can consider making nuanced recommendations. We remain optimistic that Arc could be a success in the future, as we say there certainly are glimpses of strong potential here, and we look forward to testing the A750 and A770 as major updates land and hopefully change the picture.

Right now, however, Intel Arc isn't ready for the mainstream market."
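The A750 vs. A770 value point in that conclusion is easy to sanity-check with a quick sketch. The 9% performance delta and 21% price delta come from the quoted review; the absolute baseline figures (fps, price) below are hypothetical placeholders, not measured data:

```python
# Sketch of the A750 vs A770 frames-per-dollar comparison.
# The 9% perf delta and 21% price delta are from the quoted review;
# the baseline fps and price figures are hypothetical placeholders.
a750_price = 289.0               # USD (assumed baseline)
a770_price = a750_price * 1.21   # review: A770 LE priced ~21% higher
a750_fps = 100.0                 # hypothetical 1080p average
a770_fps = a750_fps * 1.09       # review: A770 ~9% faster at 1080p

a750_value = a750_fps / a750_price  # frames per dollar
a770_value = a770_fps / a770_price

# The baselines cancel out, so the ratio is just 1.09 / 1.21:
ratio = a770_value / a770_value * (a770_value / a750_value)
print(f"A770 frames/$ relative to A750: {ratio:.3f}")  # 1.09 / 1.21 ≈ 0.901
```

In other words, whatever the absolute numbers, the A770 delivers roughly 10% fewer frames per dollar than the A750, which is the review's point.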

 

SteveGrabowski

Diamond Member
Oct 20, 2014
4,983
3,711
136
I will focus on the negatives from the reviews I have seen and read so far, for those who are pressed for time and need the quick and dirty.

Quality-of-life issues are numerous:
- Audio cutting out, audio devices disappearing, and reboots required for troubleshooting.
- VR rendering problems, plus GPU installation and detection problems, some requiring experience and extra hardware or an iGPU to troubleshoot.
- Among its biggest weaknesses are the most popular things, i.e. 1080p gaming (which something like 65% of gamers still use).
- Performance issues and bugginess in some of the most popular games, like GTA V, CS:GO, and Apex Legends.
- No fan speed control for the end user yet. Though in all fairness, the cards don't run hot or loud.
- A PITA to disassemble.
- Worthless for systems without ReBAR. Everybody has known this one, and Intel practically screams it at you, but it is a very real drawback for those owners.

The biggest issue of all: price. It doesn't matter why Intel has to charge this much; the prices are bad. AMD wins there. Nvidia wins on quality of life and good ray tracing capability.
Not just 1080p, the 1440p performance is a bit weak too, while it's not strong enough to use as a 4K card. The 4K result, which kind of splits the difference between the 3060 and 3060 Ti, tells me this has the potential to be a decent 1440p card, but as of right now it's weaker than a 6650 XT at 1440p, and I can get the 6650 XT cheaper and with much better driver support today.
 

coercitiv

Diamond Member
Jan 24, 2014
5,257
8,588
136
Yes, yes, yes. OK.
I fell for it too.

They said stock of the 16GB was going to be really limited and that the generally available stock was going to be the 8GB variants of both cards.
Here's the box for the Acer A770 OC... 16GB model :tearsofjoy:

1664998150380.png

I can already imagine the marketing brainstorm: We'll call it LIMITED EDITION. You know, like EVGA Classified.
 

DAPUNISHER

Super Moderator and Elite Member
Moderator
Aug 22, 2001
25,122
10,026
146
@blckgrffn

If you watch the coil whine part of the video, it is like nails on a chalkboard for me. He said it happens in menus without frame caps. Dominic has a really good testing methodology and format, IMO. Spider-Man has broken lighting. Looks like Ozzy Steve couldn't hope to find it, since he does his walking-simulator, 2-pump-chump web swinging. :p And Dominic was inside a building with Peter out of his suit. TW: Warhammer 3 has serious issues too.
 
  • Wow
Reactions: blckgrffn

IntelUser2000

Elite Member
Oct 14, 2003
8,513
3,557
136
Not just 1080p, the 1440p performance is a bit weak too, while it's not strong enough to use as a 4K card.
It looks like Arc performs consistently better at 4K than at 1440p, and better at 1440p than at 1080p, relative to the competition.

4K performance relative to the competition might be a better indicator of the hardware's potential, because lower resolutions expose more of the driver bottleneck, which caps performance much like a CPU limit does (though not always). Remember Intel saying that reusing the iGPU driver stack was an oversight?

GTA V is weak on Iris Xe iGPUs, so it makes sense that it performs badly here too.

Looking at TPU results:
- Doom, RDR, Control, and Metro Exodus are examples of well-optimized games.
- In Metro Exodus, a game that performs exceptionally well on Arc, the "scaling" at higher resolutions falls relative to the competition, because optimizations generally matter less at higher settings and/or resolutions.
- In general, in the well-optimized games it scales just as well as (or even slightly worse than) the competition.
- Games that perform very poorly at 1080p generally catch up to the RTX 3060 at 4K, pointing to driver weaknesses again. Examples include Battlefield V and Borderlands 3.
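The "catches up at 4K" pattern is easy to quantify: compute Arc's fps as a fraction of the 3060's at each resolution and watch the ratio climb with resolution in driver-limited titles. A minimal sketch with invented fps figures (not TPU's actual measurements):

```python
# Relative-performance scaling across resolutions, as described above.
# All fps figures are hypothetical illustrations, not TPU's data.
results = {
    # game: {resolution: (arc_a770_fps, rtx_3060_fps)}
    "driver_limited_title": {"1080p": (70, 100), "1440p": (55, 70), "4k": (33, 36)},
    "well_optimized_title": {"1080p": (110, 100), "1440p": (80, 72), "4k": (40, 37)},
}

for game, res_data in results.items():
    rel = {res: arc / rtx for res, (arc, rtx) in res_data.items()}
    print(game, {res: round(r, 2) for res, r in rel.items()})
    # In the driver-limited case the ratio rises with resolution
    # (0.70 -> 0.79 -> 0.92): Arc "catches up" at 4K because driver/CPU
    # overhead matters less once the GPU itself is the bottleneck.
```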

The hardware is also at fault for lower-than-expected performance. Semianalysis said Raja was handed the worst GPU architecture out there. It's true. It seems like the performance is last-gen per shader/fillrate/bandwidth.
 
  • Like
Reactions: Leeea

LightningZ71

Golden Member
Mar 10, 2017
1,478
1,691
136
The SemiAccurate statement shouldn't be that far off. Intel stated that the Xe architecture was the basis of Arc. Granted, it is a second generation of the design, but it isn't too far removed from it. As we see in benchmarks, with relatively similar memory bandwidth, Tiger Lake Xe is close to Cezanne in gaming performance, and Cezanne's iGPU is Vega based. Rembrandt is significantly faster than Alder Lake mobile's iGPU with similar bandwidth.
 
  • Like
Reactions: Tlh97 and Leeea

IntelUser2000

Elite Member
Oct 14, 2003
8,513
3,557
136
@LightningZ71 Semianalysis, not Semiaccurate.

It does tell you how far behind they were before, since Iris Xe was a tremendous step in perf/watt and perf/mm2 over Gen 11 in Ice Lake and Gen 9 in Skylake derivatives.

I think there's a generation gap in mentality as well, because they were so used to making only iGPU designs.

That's why the expectation that they could simply scale up their iGPU and roll over the competition was naive: yes, at the very high level it's true, but the details are what allow the scaled-up part to perform as the high-level analysis would predict.

And by high-level analysis, I mean the kind you'd use to explain it to your mom, for example. :grin:
 

LightningZ71

Golden Member
Mar 10, 2017
1,478
1,691
136
Gen 11 graphics felt like Intel thought, "Hmm, we've got a larger transistor budget available... let's just throw more of the same design at it, just scaled up. That should do it!"

It was, admittedly, a notable improvement over what had been largely stagnant for half a decade.
 

igor_kavinski

Diamond Member
Jul 27, 2020
6,074
3,754
106
Semianalysis said Raja was handed the worst GPU architecture out there. It's true. It seems like the performance is last gen per shader/fillrate/bandwidth.
His job was to improve it, or to outright tell them it was a bad idea to go ahead with it. If it were me, I would have gotten it in writing that I would do my best with the non-optimal architecture handed to me. Also, my mind keeps going back to my suspicion that Raja just isn't as good as the GPU architects at AMD/NVIDIA. He doesn't really understand the subtleties and nuances of what makes a GPU efficient. His approach is more brute force, hence the bigger die size and higher power draw.
 

JustViewing

Member
Aug 17, 2022
49
108
66
His job was to improve it, or to outright tell them it was a bad idea to go ahead with it. If it were me, I would have gotten it in writing that I would do my best with the non-optimal architecture handed to me. Also, my mind keeps going back to my suspicion that Raja just isn't as good as the GPU architects at AMD/NVIDIA. He doesn't really understand the subtleties and nuances of what makes a GPU efficient. His approach is more brute force, hence the bigger die size and higher power draw.
I agree about Raja. He was given a free hand and a blank check. He was hired for his supposed expertise and the belief that he would deliver the goods. While I really appreciate his honesty in that interview, it felt like an apology and a "please don't fire me, I will do better next time."
 
  • Haha
Reactions: igor_kavinski

JustViewing

Member
Aug 17, 2022
49
108
66
Enabling Resizable BAR won't be an issue for systems people bought in the last 4-5 years. Even my old B350 + 2700X + RX 6600 XT combo has Resizable BAR enabled and working. To my surprise, that motherboard even supports the 5950X.
 

KompuKare

Senior member
Jul 28, 2009
853
557
136
I agree about Raja. He was given a free hand and a blank check. He was hired for his supposed expertise and the belief that he would deliver the goods. While I really appreciate his honesty in that interview, it felt like an apology and a "please don't fire me, I will do better next time."
And to think he was rumoured to have been in with a shot at becoming Intel CEO. No wonder Intel are in trouble!
 

eek2121

Platinum Member
Aug 2, 2005
2,201
2,844
136
The SemiAccurate statement shouldn't be that far off. Intel stated that the Xe architecture was the basis of Arc. Granted, it is a second generation of the design, but it isn't too far removed from it. As we see in benchmarks, with relatively similar memory bandwidth, Tiger Lake Xe is close to Cezanne in gaming performance, and Cezanne's iGPU is Vega based. Rembrandt is significantly faster than Alder Lake mobile's iGPU with similar bandwidth.
His job was to improve it, or to outright tell them it was a bad idea to go ahead with it. If it were me, I would have gotten it in writing that I would do my best with the non-optimal architecture handed to me. Also, my mind keeps going back to my suspicion that Raja just isn't as good as the GPU architects at AMD/NVIDIA. He doesn't really understand the subtleties and nuances of what makes a GPU efficient. His approach is more brute force, hence the bigger die size and higher power draw.
I have not worked for a single company that would allow you to throw out an existing project and start from scratch. The GPU team (including Raja) has to take what they have and iterate from that.

I suspect drivers will be fixed eventually, and hardware revisions (Battlemage) will fix other stuff.

The ReBAR requirement will likely stay, though I suspect they will do what they can to minimize the impact.

It is a good first effort. GPUs are complex beasts. If they fix all the driver issues and bump perf/watt by 40-50% next gen, they will be in good shape.

I suspect OEMs will do all-Intel builds.
 
  • Like
Reactions: Tlh97 and Tup3x

Insert_Nickname

Diamond Member
May 6, 2012
4,820
1,491
136
Enabling Resizable BAR won't be an issue for systems people bought in the last 4-5 years. Even my old B350 + 2700X + RX 6600 XT combo has Resizable BAR enabled and working. To my surprise, that motherboard even supports the 5950X.
Even my original first-gen 1700 + ASRock B350M Pro4 + WX2100 supports ReBAR with the latest BIOS.

I didn't even think either one of those supported it, but they do and run just fine with it enabled.
 

Hulk

Diamond Member
Oct 9, 1999
3,560
1,163
136
Speaking of OEMs: Intel has been the big supplier for OEMs, primarily due to their ability to meet demand and software compatibility requirements. With all of these driver issues, I wonder if the normal partners, HP, Dell, and others, are going to be supporting these Arc GPUs? If they end up having trouble moving them in retail channels, it seems doubtful system integrators will want the tech support issues of people contacting them with driver issues. Just wondering.
 

LightningZ71

Golden Member
Mar 10, 2017
1,478
1,691
136
Enabling Resizable BAR won't be an issue for systems people bought in the last 4-5 years. Even my old B350 + 2700X + RX 6600 XT combo has Resizable BAR enabled and working. To my surprise, that motherboard even supports the 5950X.
Just because the combo has the ability to enable ReBAR doesn't mean it can use it optimally. I've been looking at people who have benched their 2700X chips with ReBAR on and off, and it doesn't appear to do so well with it. I'm interested to see if someone actually tests the Arcs with a 2700X and how they perform, specifically because I have one myself. I currently have a 1070 Ti, so it might be a decent upgrade for me that offers 16GB of VRAM.
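Quantifying that kind of on/off test is straightforward: run each game with ReBAR disabled and enabled and report the percentage delta. A minimal sketch (all fps values below are invented for illustration, not real 2700X measurements):

```python
# Percent uplift (or regression) from enabling ReBAR, per game.
# All fps values below are made-up placeholders for illustration.
rebar_results = {
    # game: (fps_rebar_off, fps_rebar_on)
    "game_a": (60.0, 78.0),   # big win: +30%
    "game_b": (90.0, 92.0),   # marginal gain
    "game_c": (75.0, 71.0),   # regression, plausible on older platforms
}

for game, (off, on) in rebar_results.items():
    uplift = (on - off) / off * 100
    print(f"{game}: {uplift:+.1f}%")
```

On a platform where ReBAR hurts as often as it helps, the per-game spread matters more than the average, which is exactly why a 2700X-specific Arc test would be interesting.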
 

DAPUNISHER

Super Moderator and Elite Member
Moderator
Aug 22, 2001
25,122
10,026
146
If all Arc 7-series cards have the level of coil whine Dominic's did, that's completely unacceptable.

Every reviewer talks about how, when it shines, you can see the potential. Great, but I don't potentially game. ;) They are failing the most important metrics: 1080p, and performance in some of the most popular games on the planet.

I am certain you guys are on point about OEMs using the cards. But when? If they ship them in the current state, the return rate will be bonkers. They would get roasted in user reviews like we have rarely seen. Gamers who buy OEM builds aren't going to be okay with BSODs, system hard locks, broken games, nearly unplayable frame times, black screens, broken graphics, their favorite e-sports title running like garbage, or their old games, AA, and indie titles being all over the place, from fine to completely broken. Some of which depends on Microsoft to fix. Good luck with that.

It all comes down to ROI before the bean counters' axe finds their neck. I doubt the goodwill of the community is enough to stop that. For all the well-wishing and "Welcome, player 3" I am reading around the web, everyone is saying, "I'll wait until they get the drivers ironed out, then buy it." Meanwhile, the financial hemorrhaging continues to be a gusher. Time will tell, but right now the promise of a bright future doesn't pay stockholder dividends.
 

igor_kavinski

Diamond Member
Jul 27, 2020
6,074
3,754
106
If they end up having trouble moving them in retail channels, it seems doubtful system integrators will want the tech support issues of people contacting them with driver issues. Just wondering.
Maybe Intel will offer to cover their support costs? Raja has access to a vast pool of Indian call center talent :)
 
