News Intel GPUs - Intel launches A580


Terzo

Platinum Member
Dec 13, 2005
2,589
27
91
I got the Intel Arc A770 a few weeks ago; sharing my experiences for anyone who's considering picking one up.

Setting up drivers was weird... I put it into a computer connected to a TV via HDMI. As soon as I installed drivers via Arc Control, I lost display output. I verified that ReBAR was on and that virtualization was off, since I had seen some issues related to that. It turns out it's an HDMI thing; for some reason outputting to HDMI 4 on the TV doesn't work (post driver install), but HDMI 1 is fine.

Unfortunately I didn't learn that until after doing a clean Windows install. If you're thinking about using the A770 for an HTPC/console PC and you lose display output, the fix may be as simple as switching to a different port.

One annoying thing with the Arc Control software is that Windows (11) prompts me to give it permission to make changes every time I start the PC. I can probably just grant it permanent permission but haven't bothered to Google that one out yet.

In terms of performance, I can only speak to two games, and only in a subjective sense.
Jedi: Fallen Order I haven't played much, but it has worked without issue.
Forza Horizon 4, however, has run into some problems. It mostly works fine, but sometimes it will freeze the PC and I have to force a restart. I've seen that happen loading the game, during a race, or navigating the menus. Knock on wood, if I'm able to make it 5 minutes into a session then I'm usually safe, though there's been one painful exception to that.

I'm assuming (hoping?) those crashes will disappear as Intel continues to mature their drivers. Until then I'll just accept that starting up Forza may sometimes require a reboot or two to actually be able to play.
 
Jul 27, 2020
15,738
9,806
106
It turns out it's an HDMI thing; for some reason outputting to HDMI 4 on the TV doesn't work (post driver install), but HDMI 1 is fine
It's a TV manufacturer skimping on features thing. Which model is it? Usually they will reduce the HDMI version of one or two ports to save cost, or not support HDR or full chroma subsampling on the lesser ports. The big ones (Sony, LG, Samsung) mention the reduced functionality in the spec sheet of the TV.
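
For a back-of-the-envelope feel for why a cost-reduced port falls over, here's a quick calculation (a sketch: the 594 MHz 4K60 pixel clock and the usable HDMI data rates are standard figures; nothing here is specific to any particular TV):

    # Uncompressed data rate needed for 4K60 at a given bit depth,
    # vs the usable payload of each HDMI generation.
    def needed_gbps(pixel_clock_mhz, bits_per_component):
        return pixel_clock_mhz * bits_per_component * 3 / 1000  # RGB: 3 components

    PIXEL_CLOCK_4K60 = 594  # MHz, standard CTA timing for 3840x2160 @ 60 Hz

    print(needed_gbps(PIXEL_CLOCK_4K60, 8))   # 14.256 -> needs a full HDMI 2.0 port (~14.4 Gbps usable)
    print(needed_gbps(PIXEL_CLOCK_4K60, 10))  # 17.82  -> 10-bit Deep Color needs chroma subsampling or HDMI 2.1
    # An HDMI 1.4-class port (~8.16 Gbps usable) can't even do 4K60 at 8-bit.

So a port that's been quietly downgraded to HDMI 1.4-class bandwidth can't handle 4K60 at all, which would match the "works on HDMI 1, dead on HDMI 4" behavior.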
 
Jul 27, 2020
15,738
9,806
106
I'm assuming (hoping?) those crashes will disappear as Intel continues to mature their drivers. Until then I'll just accept that starting up Forza may sometimes require a reboot or two to actually be able to play.
Don't worry about FH4. The internet is full of people experiencing crashes with this game, even with Nvidia/AMD cards.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Raja reiterated that an improved DX11 driver is coming soon. To get DX11 on par with DX12 in a relative sense, a 15-25% gain is needed (e.g., if a game runs at 72 fps in DX12 but 60 fps in DX11, DX11 needs a 20% uplift just to match).

I know you refer to the iGPU driver (or don't you?) but having "Cyberpunk" in the same sentence as "well optimized" is just plain funny.

No, I am referring to Arc. There are reports from users saying performance is lower than expected in Cyberpunk, a DX12 title that should be decently optimized according to Intel. But with lower-end CPUs it performs quite a bit worse, probably because even in DX12 the driver still carries a lot of CPU overhead.
 

Terzo

Platinum Member
Dec 13, 2005
2,589
27
91
It's a TV manufacturer skimping on features thing. Which model is it? Usually they will reduce the HDMI version of one or two ports to save cost, or not support HDR or full chroma subsampling on the lesser ports. The big ones (Sony, LG, Samsung) mention the reduced functionality in the spec sheet of the TV.

One of the LG OLEDs, from a few years ago. The previous video card (an AMD 5700) worked fine on that HDMI port, though. I'm not familiar with chroma subsampling; maybe the 5700 didn't support it, so it was a moot issue.
 
Jul 27, 2020
15,738
9,806
106
One of the LG OLEDs, from a few years ago. The previous video card (an AMD 5700) worked fine on that HDMI port, though. I'm not familiar with chroma subsampling; maybe the 5700 didn't support it, so it was a moot issue.
Seems like an HDMI Deep Color issue. That feature is probably not supported on port 4, and you didn't notice it because the 5700 probably doesn't support HDR.

This is the setting detail for my LG C8:

[attached screenshot: LG C8 HDMI Deep Color settings]

One interesting thing that I noticed: with my GeForce GTX Titan X, the TV always ran at 3840x2160, so I was stuck at 50 Hz without knowing it. With the 3090, my resolution changed to 4096x2160 in Windows, so I got some extra pixels and also a 60 Hz refresh rate.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
An editor for PCGameshardware.de says Intel has identified a bottleneck and that a major update is coming which will improve performance across the board.

I know that even in DX12 (and probably Vulkan, its best API) it has significant bottlenecks. Users with AMD APUs report games like Cyberpunk significantly underperforming, like 20%+. So the driver issue is exposed when the card is bottlenecked by the CPU; that's why it doesn't perform badly with the 13100, which is Golden Cove with much better performance than Zen 2.

Based on 1080p vs 4K results, 5-20% gains are possible at 1080p.

At 4K it commonly performs at the 3060 Ti/RX 6600 XT level.
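
To make the "exposed when CPU-bound" point concrete, here's a rough sketch (the fps numbers are made up purely for illustration): if two cards tie at 4K but one trails at 1080p, the gap is extra CPU/driver time per frame.

    # Hypothetical fps numbers: two cards tie at 4K (GPU-bound),
    # but one trails the other at 1080p (CPU-bound).
    def extra_driver_ms(fps_fast, fps_slow):
        # Difference in per-frame time (ms), attributable to CPU-side
        # driver cost once the GPU is no longer the limit.
        return 1000.0 / fps_slow - 1000.0 / fps_fast

    print(round(extra_driver_ms(90.0, 72.0), 1))  # 2.8 ms of extra CPU/driver work per frame

That per-frame cost is roughly constant, which is why it vanishes into the noise at 4K but caps the frame rate hard on a slower CPU.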
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,271
19,907
146
A less flattering review. Arc is still having issues at the basic install-and-setup level. A number of QoL issues as a daily driver have yet to be fixed. The control software needs to be refined. Games are still hit or miss.


I agree with him that it needs fire-sale pricing to be worthwhile. I'd seriously consider the ASRock A770 at $300; at $250 I'd definitely pull the trigger.
 

blckgrffn

Diamond Member
May 1, 2003
9,110
3,028
136
www.teamjuchems.com
Big annoyance for me, anyway, on PCs with Intel graphics and the Intel Driver Update utility installed: constant updates. I know it's a "good thing," but it's really a crazy cadence. I should have to opt in to get every little update; at least do a monthly stable release or something.

Second: the nice Intel driver interface (like I'm used to with Radeons) hits a UAC prompt on login. Every time. I Googled this and it's simply a "known issue"; frankly, having a stop to click through every time like that is just not acceptable. I believe once I hit "no" it stops asking each time? I don't know exactly, and I don't feel like shaking it out.
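
The workaround I've seen suggested (untested by me, and the install path below is a guess; point it at wherever ArcControl.exe actually lives) is to disable the app's own startup entry and instead launch it from a scheduled task, which is allowed to run elevated without prompting:

    schtasks /Create /TN "Arc Control (elevated)" /SC ONLOGON /RL HIGHEST /TR "\"C:\Program Files\Intel\Arc Control\ArcControl.exe\""

/RL HIGHEST makes the task run with the highest privileges available to the account, so Windows skips the UAC prompt at logon.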

My $.02. Relatively minor, I guess, but these are signs that they aren't even at Radeon levels of driver delivery yet.
 
  • Like
Reactions: Leeea

amenx

Diamond Member
Dec 17, 2004
3,848
2,013
136
I hope Intel don't begin to feel they are falling victim to the sunk cost fallacy with Arc.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,271
19,907
146
I hope Intel don't begin to feel they are falling victim to the sunk cost fallacy with Arc.
Everything I have read says they have to stick it out for the data center. Given that, I also can't see them completely abandoning the consumer market. The highest sales volume is traditionally the sub-$300 market; as things have changed, maybe that is the sub-$400 market now? They could target it with good bang for the buck and be golden as their software matures.

It also makes sense to have capable desktop and mobile gaming graphics so they can sell OEMs complete solutions. With AMD doing the same, the two of them might start elbowing Nvidia out of some more market share over time.
 

coercitiv

Diamond Member
Jan 24, 2014
6,151
11,680
136
According to communication between Linus and Intel, they will get rid of the "overlay" app. Unspecified timeline, though, so Arc owners will have to endure it a while longer.

I'll link the video below for anyone who wants to learn about even more bugs in Intel's drivers.
 

mikk

Diamond Member
May 15, 2012
4,111
2,105
136
According to communication between Linus and Intel, they will get rid of the "overlay" app. Unspecified timeline, though, so Arc owners will have to endure it a while longer.

I'll link the video below for anyone who wants to learn about even more bugs in Intel's drivers.


Sounds like there will be an option to toggle between overlay and non-overlay. See the comment from bizude.

 
  • Like
Reactions: coercitiv

blckgrffn

Diamond Member
May 1, 2003
9,110
3,028
136
www.teamjuchems.com
You can get used (well, mined-on, wink-wink: "just received as a present, opened just to check the contents") GTX 1660 Super/Ti cards nowadays on the bay for $100-120. Never mind $200.

Yup. And used 2060s all day long for less than $200 too. Maybe they were mined on, but you know darn well the software will work and every PC game is optimized for it at some level.

It still costs more than a new 6600 to boot.

If in some way this puts pressure on at least AMD to lower prices, or to release a more powerful part while holding the line on 6600/6600 XT pricing, then that's good too.

If AMD is supposed to get us cheaper Nvidia cards, I am fine with Intel existing to keep AMD honest at the entry level. ;)
 

GunsMadeAmericaFree

Golden Member
Jan 23, 2007
1,240
290
136
That's a pretty good price. Anyone looking to build a cheap gaming PC has to be thinking about the A750, despite all its warts.

Not me. As a casual gamer who is very focused on power use, I'm much more interested in seeing a sale on the A380, maybe down around $120. Its power draw is about 50% more than the usual 50 W I try to stick to (roughly 75 W), but with a sale I might be interested.
 

KompuKare

Golden Member
Jul 28, 2009
1,012
923
136
That's a pretty good price. Anyone looking to build a cheap gaming PC has to be thinking about the A750, despite all its warts. Neither AMD nor NV is pushing cards in that price range anymore.

edit: okay you can get some 6600 XTs in that price range. Or an old 1660 SUPER.
While Arc has been improving, back when TPU reviewed the A750 it was slower than the 6600 XT and consumed 50 W more (150 W vs 200 W). I really fail to see the value at $250.
That it is a 406 mm² part on 6nm (vs Navi 23's 237 mm² on 7nm) is only a problem for Intel. Consumers shouldn't care.

The only plus of the A750 at $250 is that it is far better at RT than RDNA2: in Cyberpunk at 1080p, TPU had it at 31.2 fps vs 19.4 for the 6600 XT. Both are far too slow for me, but that and the video codec support are the only real pluses IMO.
If Intel wants me to beta test, then the discount vs Radeon had better be a lot more than $0.
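
Quick value arithmetic on those TPU numbers (the $250 price for both cards is an assumption based on street prices at the time):

    # fps-per-watt and fps-per-dollar from the Cyberpunk RT figures above
    cards = {"A750": (31.2, 200, 250), "6600 XT": (19.4, 150, 250)}
    for name, (fps, watts, usd) in cards.items():
        print(f"{name}: {fps/watts:.3f} fps per watt, {fps/usd:.3f} fps per dollar")
    # A750:    0.156 fps per watt, 0.125 fps per dollar
    # 6600 XT: 0.129 fps per watt, 0.078 fps per dollar

In RT the A750 comes out ahead on both metrics; in raster, where it was slower overall while drawing 50 W more, the perf-per-watt comparison flips the other way.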
 

QueBert

Lifer
Jan 6, 2002
22,378
706
126
While Arc has been improving, back when TPU reviewed the A750 it was slower than the 6600 XT and consumed 50 W more (150 W vs 200 W). I really fail to see the value at $250.
That it is a 406 mm² part on 6nm (vs Navi 23's 237 mm² on 7nm) is only a problem for Intel. Consumers shouldn't care.

The only plus of the A750 at $250 is that it is far better at RT than RDNA2: in Cyberpunk at 1080p, TPU had it at 31.2 fps vs 19.4 for the 6600 XT. Both are far too slow for me, but that and the video codec support are the only real pluses IMO.
If Intel wants me to beta test, then the discount vs Radeon had better be a lot more than $0.

Arc's AV1 encoding is on another level and blows away AMD's and Nvidia's offerings. If you wanted a decent card for gaming that excelled at AV1, the A750 would be far better than a 6600 XT. While I don't know a whole lot about AV1, a lot of people are making a big deal about it, and apparently Intel nailed it with the Arc cards.
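
For anyone wondering what using it looks like, a hardware AV1 encode is a one-liner (a sketch: assumes an ffmpeg build with Intel QSV support; the filenames and bitrate are placeholders):

    ffmpeg -i gameplay.mp4 -c:v av1_qsv -b:v 8M -c:a copy gameplay_av1.mkv

av1_qsv is the hardware encoder exposed by Arc; pushing the same clip through a software encoder like libaom-av1 is dramatically slower.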
 

blckgrffn

Diamond Member
May 1, 2003
9,110
3,028
136
www.teamjuchems.com
Arc's AV1 encoding is on another level and blows away AMD's and Nvidia's offerings. If you wanted a decent card for gaming that excelled at AV1, the A750 would be far better than a 6600 XT. While I don't know a whole lot about AV1, a lot of people are making a big deal about it, and apparently Intel nailed it with the Arc cards.

It seems like the only positive thing, so folks are hanging their hats on it.

The fact is that you probably don't want to stream games with it, because it's gonna break more often than, say, an Nvidia card, and streamers tend to want their streams to work. You can get a "big boy" gaming card and get AV1 encoding too.

If you’re a content creator odds are you are probably quite aware how important - or not - AV1 encoding is for your work flow and likely already have one if it’s truly make or break for you.

In my mind these are gaming cards, and they win or lose on that proposition. Features outside of absolute gaming performance are ancillary to the vast majority of purchasers.

On the flip side, I'm wondering what all these driver improvements are bringing to all those Xe laptop chips - really, all of the Tiger Lake and newer chips. Those G7 parts were almost gaming-worthy; surely they are much better now? In my mind, that's actually exciting in terms of how many PCs out there are gaming-ready, if it pans out.
 
Last edited: