News Intel GPUs - Intel launches A580


exquisitechar

Senior member
Apr 18, 2017
657
871
136
It's a shame, but the drivers make these unusable as a gaming GPU to me. Intel needs to make huge strides on both the hardware and software side quickly. This huge N6 die, with its appropriately large shader count and wide memory bus, struggling to beat the 3060 and 6650 XT while drawing more power is not a pretty sight. If it weren't for the drivers, that wouldn't be such a problem for consumers, but it's disastrous for Intel and not going to work in the long run, as I said before.

Someone should test how it works with oneAPI, maybe it'll be usable for that.
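For anyone who does want to poke at that, a minimal SYCL sketch (assuming the oneAPI DPC++ compiler, i.e. icpx -fsycl; the file name is just an example) would be a sensible first smoke test. It only shows whether the Level Zero/OpenCL runtime sees the Arc card and its memory at all, not how well it actually computes:

// Hypothetical smoke test: list the devices the oneAPI runtime exposes.
// Build (assumption): icpx -fsycl list_devices.cpp -o list_devices
#include <sycl/sycl.hpp>
#include <iostream>

int main() {
    for (const auto& platform : sycl::platform::get_platforms()) {
        std::cout << platform.get_info<sycl::info::platform::name>() << "\n";
        for (const auto& dev : platform.get_devices()) {
            std::cout << "  " << dev.get_info<sycl::info::device::name>()
                      << " (" << (dev.is_gpu() ? "GPU" : "other") << ", "
                      << dev.get_info<sycl::info::device::global_mem_size>() / (1024 * 1024)
                      << " MiB)\n";
        }
    }
}

If the card shows up there with its full VRAM, the next step would be real compute workloads, which is where the driver maturity would actually get tested.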
 

blckgrffn

Diamond Member
May 1, 2003
9,128
3,069
136
www.teamjuchems.com
Tom's Hardware (Jarred) had a pretty good review too: positive, but with a different flavor of optimism than Ars, with more focus on performance and less attention to the terrible driver issues. This was noted, however:

"Finally, as with the Arc A380, my older MSI MEG Z390 Ace WiFi motherboard running the latest BIOS (with Resizable BAR enabled) still won't POST without some other graphics card installed. I put in a GTX 1650 Super and it booted right up, so there's still some sort of VBIOS or firmware problem. Other Z390 boards may work fine, or they might not, but every other graphics card I have works without any complaint in the PC. "


I thought I had read somewhere that Intel really only wanted these used with 10th Gen and newer chipsets. I wonder how much of that is issues like this and how much of it is just wanting more recent hardware as a baseline. Intel should have a lot of insight into compatibility with their own chipsets, and presumably started making some chipset-level optimizations at some point in the past to ensure these cards were not hamstrung when launched.
 

KompuKare

Golden Member
Jul 28, 2009
1,016
934
136
Well, it looks like their optimisations worked in Spider-Man, considering how things were with A380 + old drivers.
View attachment 68722

Basically, the results are all over the place. It kinda makes sense, though, and just shows the driver situation: games that are optimised can run really nicely, while the rest range from less than optimal to outright horrible.

Yet CB specifically called out issues with Spider-Man, for instance the Very High preset not working:

As regards VRAM management, PCGH found another potential issue. In Guardians of the Galaxy, the 8GB A750 falls really far behind the A770:
 
  • Like
Reactions: DAPUNISHER

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,500
20,622
146
Paul's review shakes things up. With an all-AMD platform of a 7950X, DDR5, and a 6600 XT, the 6600 XT doesn't lose a single benchmark at 1080p or 1440p in his test suite. With the 6600 XT being x8, do you think the DDR5 is helping it perform better at 1440p?

 

clemsyn

Senior member
Aug 21, 2005
531
197
116
Tom's Hardware (Jarred) had a pretty good review too: positive, but with a different flavor of optimism than Ars, with more focus on performance and less attention to the terrible driver issues. This was noted, however:

"Finally, as with the Arc A380, my older MSI MEG Z390 Ace WiFi motherboard running the latest BIOS (with Resizable BAR enabled) still won't POST without some other graphics card installed. I put in a GTX 1650 Super and it booted right up, so there's still some sort of VBIOS or firmware problem. Other Z390 boards may work fine, or they might not, but every other graphics card I have works without any complaint in the PC. "


I thought I had read somewhere that Intel really only wanted these used with 10th Gen and newer chipsets. I wonder how much of that is issues like this and how much of it is just wanting more recent hardware as a baseline. Intel should have a lot of insight into compatibility with their own chipsets, and presumably started making some chipset-level optimizations at some point in the past to ensure these cards were not hamstrung when launched.

Interesting, I have the same issue with my A380 and my z97 4770k, looks like old chipsets will give you issues regardless of rebar.
 

Panino Manino

Senior member
Jan 28, 2017
821
1,022
136
Quite impressive in Metro Exodus. It actually does look like Intel has more ray tracing oomph than NVIDIA, and it can really shine in games like these. Also, the higher the resolution, the better it does relative to the competition.
View attachment 68729
Somewhat similar case in Control with RT enabled.

It probably would be trading blows with RTX 3070-class cards if the drivers weren't holding it back.


AMD should be scared, with better drivers this GPU feels solid.
 

nicalandia

Diamond Member
Jan 10, 2019
3,330
5,281
136
AMD should be scared, with better drivers this GPU feels solid.
It's DOA

"The cost per frame graph above nails it -- as in nailed the A770 to the point where it's dead on arrival"

 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,500
20,622
146
Jay tested without ReBAR, and unlike the 380, there was no hitching or stuttering. You do lose a bunch of performance, however, like 30-40+ percent. That's probably the 256-bit bus coming into play. The 750 couldn't ray trace Guardians of the Galaxy while the 770 did great; more weirdness. He had a bunch of issues with Hitman, but they went away the more he reran it. A bug of note was that after exiting every title, it reset the monitor back to 60Hz from 144.
 
  • Like
Reactions: Tlh97 and Leeea

linkgoron

Platinum Member
Mar 9, 2005
2,300
821
136
It seems as though Intel put this in the middle of nowhere. The performance is OK, just over the 5700 XT at 1080p and just over the 6600 XT at 1440p. Perf/watt is worse than AMD and Nvidia, and price/performance is competitive at best. I don't really see why someone wouldn't prefer a more performant 6700 XT for ~$50 more or a similarly performant 6600 XT for ~$50 less, with way better drivers and better perf/watt.

This is just 3 years late, at about the same price it would have had 3 years ago. Even a year ago at this price would've been way better. Now, with the RTX 40 series and RDNA3 on the doorstep, I just don't see why anyone would go for one, except as a curiosity.


Ugh, what a frustrating release with the A770. You can see the seeds of a good card being there, but I play too many old games and too many newer low-budget Japanese games that aren't going to get the optimization that an RDR2 or a Spider-Man will get, for this card to be worthwhile for me. But I have to admit it is tempting at 25% faster than the 6600 XT at 4K in TechPowerUp's test suite. That CPU bottleneck, though, is also kind of concerning, seeing that it only leads the 6600 XT by 3% at 1440p and is 9% slower at 1080p in the same test suite.

You could easily just get a 6700 XT, although it costs $50 more - it brings you more performance, better perf/watt and better drivers...
 

blckgrffn

Diamond Member
May 1, 2003
9,128
3,069
136
www.teamjuchems.com
Jay tested without ReBAR, and unlike the 380, there was no hitching or stuttering. You do lose a bunch of performance, however, like 30-40+ percent. That's probably the 256-bit bus coming into play. The 750 couldn't ray trace Guardians of the Galaxy while the 770 did great; more weirdness. He had a bunch of issues with Hitman, but they went away the more he reran it. A bug of note was that after exiting every title, it reset the monitor back to 60Hz from 144.

Ars review stated some games in their setup became basically unplayable with major hitching and stuttering without rebar. And, weirdly, lots of corruption in browsers.

"I then rebooted, disabled ReBAR on the BIOS level, and played the same Cyberpunk segment again. The result was nigh unplayable, thanks to constant multi-second pauses and chugs. To give this scenario a fair shake, I immediately reloaded the save file in question and tried again in case this was a matter of one-time shader compilation causing the stutters. The bad numbers persisted between the tests.

Should your favorite games revolve around tight corridors or slower runs through last-gen 3D environments, the Arc GPU difference between ReBAR enabled and disabled can range from a margin-of-error sliver to a 10–15 percent dip. But even if you can stomach those issues, you might run into significant quirks outside of gaming. In my case, Google Chrome and Microsoft Edge would both routinely glitch with ReBAR disabled while videos played in any tab."


So it's probably a big YMMV, with issues ranging from your PC not booting at all to ReBAR not mattering much.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,500
20,622
146
Ars review stated some games in their setup became basically unplayable with major hitching and stuttering without rebar.

So it's probably a big YMMV, with issues ranging from your PC not booting at all to ReBAR not mattering much.
I need to read that full article, only skimmed a few of the last pages. I have no idea what Jay's test system was either. Heck, Paul said he had zero driver issues. o_O
 
  • Like
Reactions: Tlh97 and Leeea

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,500
20,622
146
Nvidia with its usual gamesmanship. Bunch of tech tubers are doing 4090 unboxings. Welcome to the big leagues Intel.
 
  • Like
Reactions: Tlh97 and blckgrffn
Mar 11, 2004
23,077
5,559
146
Ooof, people taking Ars seriously as a hardware test site. Even as far as Ars' weak reviewing goes, this is a farce. It's sad how much they're trying to gloss over serious issues so as to give a glowing review. If this was literally any other company or any other product, they'd have given a stern "do not buy" (they've literally done that over much more minor issues), and yet they're trying to fluff Intel here for some reason. I don't get it at all.

And that people are acting like they're doing it as some pro-consumer "competition!" argument is even more absurd. How is a product that isn't just worse in basically every normal performance regard (perf/W, perf/$), but also has straight-up broken aspects for many people, "pro-consumer"? Consumers shouldn't want garbage like this. Heck, people still cite more minor instances of issues from AMD as though they were so broken that they refuse to ever buy another AMD product, and yet they're going "eh, not bad, Intel" with this. So why this complete and utter hypocrisy?

AMD should be scared, with better drivers this GPU feels solid.

How so? It costs Intel more to make, it consumes more energy, and it's literally broken in many instances?

PC Enthusiasts have become absolutely insane.
 

LightningZ71

Golden Member
Mar 10, 2017
1,628
1,898
136
Paul's review shakes things up. With an all-AMD platform of a 7950X, DDR5, and a 6600 XT, the 6600 XT doesn't lose a single benchmark at 1080p or 1440p in his test suite. With the 6600 XT being x8, do you think the DDR5 is helping it perform better at 1440p?


I wouldn't expect the memory being DDR5 to have too much of an impact, as PCIe 4.0 x8 has a maximum throughput of about 16GB/sec and DDR4-3200 is already easily above that. There may be some sort of CPU performance dependency in AMD's drivers that enables the 6600 XT to gain performance against the Arc A770 when it has a leading-edge CPU to handle the thread, but that's purely speculation. It's possible that DDR5 is allowing enough available bandwidth to keep the CPU from hogging so much memory bus time that direct writes to the PCIe controller aren't hampered. There's also the possibility that having the faster IF on the 7000 series CPUs allows more efficient PCIe root controller operation, which may benefit the 6600 XT more. It's tough to tell without profiling the whole system.
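Just to put rough numbers on that bandwidth point (back-of-the-envelope theoretical peaks, not measured figures; the DDR5-6000 speed is only an example, I don't know what Paul's bench actually ran):

#include <cstdio>

int main() {
    // PCIe 4.0: 16 GT/s per lane with 128b/130b encoding -> ~1.97 GB/s per lane, per direction
    const double pcie4_x8  = 16.0 * (128.0 / 130.0) / 8.0 * 8;   // ~15.8 GB/s
    // Dual-channel system memory: GT/s * 8 bytes per channel * 2 channels
    const double ddr4_3200 = 3.2 * 8 * 2;                        // 51.2 GB/s
    const double ddr5_6000 = 6.0 * 8 * 2;                        // 96.0 GB/s (example speed)

    std::printf("PCIe 4.0 x8   : %5.1f GB/s\n", pcie4_x8);
    std::printf("DDR4-3200 2ch : %5.1f GB/s\n", ddr4_3200);
    std::printf("DDR5-6000 2ch : %5.1f GB/s\n", ddr5_6000);
}

Either memory type has several times the bandwidth of the x8 link, so any DDR5 benefit would have to come from the CPU-side effects described above rather than from feeding the PCIe link itself.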

In general, it seems that Intel is still having VRAM and host DMA pool memory management issues with their drivers. That the 16GB card doesn't suffer as much as the 8GB cards seems to be a tell when benchmarks are being run at resolutions and texture levels that should work well within 8GB of VRAM. They may be caching textures in VRAM more aggressively than other cards do (which might explain why they are so dependent on ReBAR), which manages to hide their issues on the 16GB card but not on the lower-VRAM cards. It also raises the question of why they didn't supply 8GB A770 cards to reviewers; on paper, it should be very close to the 16GB card in most situations. If this is a driver memory management problem, the 8GB card will probably be the same inconsistent dumpster fire that the A750 is and show itself to not be worth the money asked for it.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,500
20,622
146
@LightningZ71

Thanks for the insight. I am certain some will get around to testing everything now that the cards are here.

On the Ars review: Dude is using a 5800X with 32GB of 3200. Doesn't even say if it's CL14 dual rank. Not what I'd call a professional reviewer's GPU test bench.
 
  • Like
Reactions: ZGR

KompuKare

Golden Member
Jul 28, 2009
1,016
934
136
AMD should be scared, with better drivers this GPU feels solid.
AMD of old might have been scared, and AMD are a testament to how you can survive (just about) on low margins.

Back then AMD regularly used way more transistors / die area to remain competitive, but at far lower margins than Nvidia (the obvious exception being Hawaii, which was way smaller than GK110).

Now it's Intel who are heading for lower margins using transistors like they are going out of fashion:
 

blckgrffn

Diamond Member
May 1, 2003
9,128
3,069
136
www.teamjuchems.com
@LightningZ71

Thanks for the insight. I am certain some will get around to testing everything now that the cards are here.

On the Ars review: Dude is using a 5800X with 32GB of 3200. Doesn't even say if it's CL14 dual rank. Not what I'd call a professional reviewer's GPU test bench.

Ars is relevant due to their reach, but yeah. They regularly espouse some firm recommendations with some eyebrow-raising test kit.

@darkswordsman17 I mostly use Ars as a place to see what the "masses" of "tech enthusiasts" - not necessarily computer hardware geeks - are being told. I've seen the takes published there parroted by many, so it's interesting to see what these "informed" folks are going to be told to think.

At least the comments there seem to be fairly realistic.
 
Last edited:

Karnak

Senior member
Jan 5, 2017
399
767
136
AMD should be scared, with better drivers this GPU feels solid.
How so? It's 237mm² (6650XT) vs. 406mm² (A770). And the 6650XT/N23 would be even smaller on TSMC N6.

Not to mention the rumors about N33. Even if it's only somewhere in the middle of N22 and N21 performance-wise, that'd still be a massive lead vs. the A770 at allegedly ~200mm².

I'm not impressed with the performance given the huge die size, whether it's raster or RT. Not to mention power consumption vs. the competition.
 
  • Like
Reactions: Tlh97 and KompuKare

blckgrffn

Diamond Member
May 1, 2003
9,128
3,069
136
www.teamjuchems.com
Nvidia with its usual gamesmanship. Bunch of tech tubers are doing 4090 unboxings. Welcome to the big leagues Intel.

I was waiting for the new MSRP for the 3060 at $299. If they don't bother, it's because they don't care.

If it takes the A770 16GB to have good performance (I did watch that roundtable and they really do have poor memory performance scaling) and the 8GB is much closer to the A750 in worst case scenarios, they likely don't need to bother.

According to Intel, the 16GB is a limited SKU as well.
 

StinkyPinky

Diamond Member
Jul 6, 2002
6,766
784
126
Let me express it in meme -

View attachment 68730

I am exaggerating a bit, but just a bit. I get it though. We all want the extra competition, so they are looking for a silver lining.

I think they're going a little easy because it's the first realistic competitor in 15 years.

Of course they shouldn't really be doing that and should just judge the product as it is, but human nature is such that we all want the scrappy underdog to punch above its weight (and with GPUs, that's Intel). But yeah, the drivers are a hot mess. Pity, because the hardware looks pretty decent.