Info: The Intel ARC A750 LE and A380 Blog (and general Arc owners' thread). 55 games tested and counting. Hulk's results with Topaz and Deep Link added.


DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
27,994
19,014
146
ARC with FSR3 has a serious cool factor to it. Look how well it works even at 1440p on the A750. Again, not my testing.

 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
27,994
19,014
146
It works a treat; ARC paired with AMD FSR3 and frame generation is seriously impressive.

I am going to try this on the A380.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
27,994
19,014
146
Got the AC: Mirage code that came with the A380 activated. Going to install the latest drivers, play for a bit on the A750, and update the OP.
 
  • Like
Reactions: igor_kavinski

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
27,994
19,014
146
Best optimized game I have played since Dead Island 2. Like that game, it is obviously meant for consoles. No ray tracing of any kind, and the graphics settings menu is sparse. CPU utilization never hit 50%, but if it's like Odyssey that will change as the game goes on and you get to bigger cities and fights.

The A750 LE runs 1080p max settings averaging in the 70s for fps. TAA, FSR, and XeSS are the choices available when using ARC. I chose XeSS at native resolution as the upscaling method. After I saw how well it ran, I played around with ultra quality and quality; it looks really good in this game and adds some fps. Adaptive sync works perfectly for me: a flatline frame graph at 60fps that never budged.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
27,994
19,014
146
Starfield is finally playable. I haven't had a crash yet, and I'm able to get 60fps in many situations now; it looks good enough to enjoy playing with FSR at 60% sharpening and 60% resolution scaling. Before, it looked like an OG PlayStation game and could barely manage 30fps, with crashes on top.

The ARC team keeps impressing with constant updates and improvements. The pieces are falling into place for a holiday push.

Concerning Mirage: I expect this to be a game where the A380 thoroughly trounces the 6400 and 1650. The 1630 will always be a bad joke at the prices it sells for; the worst part is that it is technically the closest-priced card.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,541
6,740
136
This may or may not be appreciated here, but TPU recently did a re-review of the Sparkle A750


At the end of the day, not much has really changed between the A750 and its peer competitors from over a year ago from a performance standpoint.

I'm sure game compatibility and the plug-and-play experience have definitely improved, but in terms of ultimate performance the needle hasn't really moved at all (although I guess if the choice is between 0 FPS because the game doesn't even launch and it at least working now, then there has inarguably been an improvement on that front).

Take from it and @DAPUNISHER's blog what you will.

If I was building a low-end PC for someone today, I'd still stick with a used 6600, 3060, or 2060S over one of Intel's offerings as is.
 
Jul 27, 2020
14,776
9,040
106
I'm sure game compatibility and the plug-and-play experience have definitely improved, but in terms of ultimate performance the needle hasn't really moved at all
That's a small selection of games. I wouldn't draw many conclusions from those. I would trust DAPUNISHER's "seat of the pants" experience more than that review.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
27,994
19,014
146
That's a small selection of games. I wouldn't draw many conclusions from those. I would trust DAPUNISHER's "seat of the pants" experience more than that review.
Having owned three 6600s and a 6600XT, I prefer the A750 for modern gaming. Better ray tracing, and a hardware upscaler that is superior to FSR. It's also more capable at 1440p, as W1z showed.

For old games RDNA2 is much better because it's PnP.

I think the 7600 is the real competition. When it is under $250 it is the best budget deal around. It also finally has a feature difference from RDNA2, which will mature; it won't always be this initial demo-driver stuff.

Used vs. new is apples and oranges. Having gotten a nerfed 3060 off eBay, I avoid the used market now. You have to assume any card with 6GB or more was mined on and is now prematurely on the wrong side of the bathtub curve. It dies down the road and you are right back where you started, looking for a new card, only this time out of the money you spent. ARC comes with a 2-3 year warranty, and maybe a game bundle.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,541
6,740
136
Power consumption is another big one with Arc. I'm about the last person to pitch a fit over the minor efficiency differences between AMD/NV top-end cards (especially when the argument devolves into cost), but ARC is on another level.

Consuming as much power as a 7800XT or 4070Ti is just otherworldly for the performance on tap. Power = heat = possible throttling, especially in the kind of $40 cases and $40 power supplies ARC can expect to be paired with in general.

Nevertheless: no bad products, only bad pricing. IMO Arc can still stand to shed a few pounds on the pricing front.

1696625394364.png
 

H433x0n

Senior member
Mar 15, 2023
818
863
96
This may or may not be appreciated here, but TPU recently did a re-review of the Sparkle A750


At the end of the day, not much has really changed between the A750 and its peer competitors from over a year ago from a performance standpoint.

I'm sure game compatibility and the plug-and-play experience have definitely improved, but in terms of ultimate performance the needle hasn't really moved at all (although I guess if the choice is between 0 FPS because the game doesn't even launch and it at least working now, then there has inarguably been an improvement on that front).

Take from it and @DAPUNISHER's blog what you will.
I disagree that performance hasn't moved; it's always improving. For example, if they were to redo this review today, the results in TLOU and Resident Evil 4 would be improved by 12% and 27% respectively. They've managed to improve performance like this multiple times this past year.
If I was building a low-end PC for someone today, I'd still stick with a used 6600, 3060, or 2060S over one of Intel's offerings as is.

I could make a case that the A750 is a viable option, but it'd have to come with caveats. It's got Ampere-tier RT performance, XeSS is an excellent upscaler, and encoding performance is second only to Nvidia (albeit there are still some productivity applications that need more development from the Arc team). If you're gaming on a newer platform with ReBAR support and your game library is filled with DX12 and Vulkan titles, it's not a bad choice. If you're on an older system without ReBAR support and most of your games are DX11 or older, it's best to skip it.
 

H433x0n

Senior member
Mar 15, 2023
818
863
96
Power consumption is another big one with Arc. I'm about the last person to pitch a fit over the minor efficiency differences between AMD/NV top-end cards (especially when the argument devolves into cost), but ARC is on another level.
It's nothing like the power usage differential in the top end cards.

Consuming as much power as a 7800XT or 4070Ti is just otherworldly for the performance on tap. Power = heat = possible throttling, especially in the kind of $40 cases and $40 power supplies ARC can expect to be paired with in general.

Nevertheless: no bad products, only bad pricing. IMO Arc can still stand to shed a few pounds on the pricing front.

View attachment 86759
It's relatively minor since these are low-end cards. The default PL is 190W, with an option to increase it to 228W. The gap between the 3060 pulling 183W and the A750 pulling 220W isn't a huge difference.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
27,994
19,014
146
There aren't many games where my A750 uses more than 150-170W. I mostly use a frame cap, since I see no point in outpacing the TV too much. A few games like Starfield make it work super hard, but they are outliers.

And it has never throttled despite being in an NR200 mini-ITX case with only two fans. That includes playing hours of Starfield last night, which did have my LE over 190W at times.

Power is cheap in the vast majority of the U.S. I think citing an extra 50-75W like it's a deal breaker for residents is silly, as is using cheap cases and cooling as some kind of deciding factor. I can't speak for the Sparkle, but the LE is an overbuilt card. It always hits its 2,400MHz boost and never throttles.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,541
6,740
136
I disagree that performance hasn't moved; it's always improving. For example, if they were to redo this review today, the results in TLOU and Resident Evil 4 would be improved by 12% and 27% respectively. They've managed to improve performance like this multiple times this past year.

You're right, performance has improved, but market positioning vs. its nearest competitors really hasn't. Below are the September 2022 review and the October 2023 review.

The A750 (using the stock Intel version for consistency) started 16% behind the 6600XT and 4% behind the 3060 12GB.

Today, the A750 is 8% behind the 6600XT and 2% behind the 3060 12GB (and back to being 16% behind the 7600).

The mix of games has changed somewhat to better favor the A750 in the newer October 2023 review (including Resident Evil 4 and The Last of Us, which as mentioned had huge performance upgrades), but the actual market positioning of the A750 hasn't really changed in the grand scheme of things.
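As a quick sanity check on how those relative-position percentages work, here's a minimal sketch of the arithmetic. The fps inputs below are made-up numbers for illustration, not TPU's measured averages:

```python
def gap_percent(card_fps: float, rival_fps: float) -> float:
    """How far a card sits behind (negative) or ahead of (positive) a rival,
    as a percentage of the rival's average framerate."""
    return (card_fps / rival_fps - 1.0) * 100.0

# Hypothetical averages mirroring the quoted positions.
print(round(gap_percent(84.0, 100.0)))   # card 16% behind rival -> -16
print(round(gap_percent(92.0, 100.0)))   # card 8% behind rival -> -8
```

Note the baseline matters: "16% behind the 6600XT" and "16% behind the 7600" are measured against different rivals, so the absolute fps gaps differ even when the percentages match.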

I agree that there are circumstances and reasons why I would buy an ARC card (I've been personally tempted from the tweaking and testing standpoint alone, much like DAPUNISHER), but I would still be hesitant to recommend it over equivalently priced alternatives from AMD/NV at this time.

Review from September 2022:

1696628889180.png

Review from October 2023:

1696628910096.png
 
  • Like
Reactions: Tlh97

Thunder 57

Platinum Member
Aug 19, 2007
2,632
3,674
136
There aren't many games where my A750 uses more than 150-170W. I mostly use a frame cap, since I see no point in outpacing the TV too much. A few games like Starfield make it work super hard, but they are outliers.

And it has never throttled despite being in an NR200 mini-ITX case with only two fans. That includes playing hours of Starfield last night, which did have my LE over 190W at times.

Power is cheap in the vast majority of the U.S. I think citing an extra 50-75W like it's a deal breaker for residents is silly, as is using cheap cases and cooling as some kind of deciding factor. I can't speak for the Sparkle, but the LE is an overbuilt card. It always hits its 2,400MHz boost and never throttles.

I've said this many times, but I will say it again: I really don't care about extra power use as far as costs go, because it will be minimal. The extra heat output, though, can be a deal breaker in places that get damn hot and have short winters. A PC can warm up a room to an uncomfortable level if it is run all out for too long.

I'm sure you knew that, but I still point it out because others may not even think of it. That is why I prefer more efficient products. I go out of my way to purchase more expensive, efficient PSUs just to keep waste heat down, for example. If it saves a bit of money over its useful life, that's nice, but I don't really think or care about it.
 

H433x0n

Senior member
Mar 15, 2023
818
863
96
You're right, performance has improved, but market positioning vs. its nearest competitors really hasn't. Below are the September 2022 review and the October 2023 review.

The A750 (using the stock Intel version for consistency) started 16% behind the 6600XT and 4% behind the 3060 12GB.

Today, the A750 is 8% behind the 6600XT and 2% behind the 3060 12GB (and back to being 16% behind the 7600).

The mix of games has changed somewhat to better favor the A750 in the newer October 2023 review (including Resident Evil 4 and The Last of Us, which as mentioned had huge performance upgrades), but the actual market positioning of the A750 hasn't really changed in the grand scheme of things.

I agree that there are circumstances and reasons why I would buy an ARC card (I've been personally tempted from the tweaking and testing standpoint alone, much like DAPUNISHER), but I would still be hesitant to recommend it over equivalently priced alternatives from AMD/NV at this time.

Review from September 2022:

View attachment 86761

Review from October 2023:

View attachment 86762
The A770 went from 10% behind the 6600 XT to 5% ahead of it.

The A770 went from only 2% ahead of the 3060 to now being 12% ahead.

That's not insignificant, and at 1440p the delta is probably even larger.
 

Thunder 57

Platinum Member
Aug 19, 2007
2,632
3,674
136
The A770 went from 10% behind the 6600 XT to 5% ahead of it.

The A770 went from only 2% ahead of the 3060 to now being 12% ahead.

That's not insignificant, and at 1440p the delta is probably even larger.

I would take AMD or Nvidia over Intel because I play a lot of older games. Still, very impressive work by Intel's driver team. Hopefully they keep going with Arc and don't just give up on it.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,541
6,740
136
The A770 went from 10% behind the 6600 XT to 5% ahead of it.

The A770 went from only 2% ahead of the 3060 to now being 12% ahead.

That's not insignificant, and at 1440p the delta is probably even larger.

-True, but the A770 sells for roughly the same price as a 6700 XT, in which case we're back to being ~17% slower again...
 
Jul 27, 2020
14,776
9,040
106
-True, but the A770 sells for roughly the same price as a 6700 XT, in which case we're back to being ~17% slower again...
But with 33% more VRAM, better RT, and pretty good Stable Diffusion performance. For anyone who values those things, it's an easy buy over the 6700 XT.

I really think the A770 should drop further in price by year end, especially if Alchemist Refresh comes out.

EDIT: Forgot to mention AV1 decoding.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
27,994
19,014
146
I am stoked we are even having this discussion. As I have previously written, a year ago I would have lost money if we wagered on the odds of it happening. I thought this initial foray into dGPUs was a Greek tragedy unfolding.

Context is also important when it comes to ARC. The amount of progress that's been made since its release is damned impressive. It has already pushed AMD into 3rd place for ray tracing and upscaling. Some A770 16GB models have been as low as $300. Being able to crank up textures and use ray tracing, plus having good hardware upscaling in new titles, all without worrying about running out of VRAM, brings some value to the table.

I also like that the number of partner cards continues to grow. The new ASRock low-profile, slot-powered A380 fills a niche that has been largely neglected outside of workstation models; only costing $120 is the cherry on top. I have the RX 6400 LP model and it has negatives the A380 doesn't. The A380's only must-haves are ReBAR and two slots. With the 6400 you don't get much of a media block, it needs PCIe 4.0, and it has 50% less RAM. I own both, and I'd sell my 6400 before my A380. The ARC was the superior value too. I understand part of that is due to its hands-on nature; nevertheless, that's how the scales balance for me.

Playing older games on my A750 has so far meant two games were nerfed: Watch Dogs 2 (fixable by using the cracked version that lacks DRM) and Delta Force: Black Hawk Down, an ancient DX 8.1 game you can play on an iGPU. Everything else runs really well with DXVK. For our crowd, adding the files is trivial, so not a deal breaker by any means.
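For anyone wondering what "adding the files" means: DXVK is a set of Vulkan-backed replacement DLLs dropped next to a game's executable, which the game loads in preference to the stock D3D runtime. A minimal sketch of that copy step in Python, with hypothetical paths (grab the real DLLs from an extracted DXVK release):

```python
import shutil
from pathlib import Path

# DXVK ships Vulkan-backed replacements for these Direct3D DLLs.
DXVK_DLLS = ("d3d9.dll", "d3d10core.dll", "d3d11.dll", "dxgi.dll")

def install_dxvk(dxvk_dir: Path, game_dir: Path) -> list:
    """Copy whichever DXVK DLLs exist into the game folder and
    return the names of the ones installed."""
    installed = []
    for dll in DXVK_DLLS:
        src = dxvk_dir / dll
        if src.exists():
            # The game picks up the local copy instead of the system runtime.
            shutil.copy2(src, game_dir / dll)
            installed.append(dll)
    return installed
```

Point `dxvk_dir` at the x64 (or x32, for 32-bit games) subfolder of an extracted DXVK release; deleting the copied DLLs from the game folder reverts it to the stock D3D path.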
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
27,994
19,014
146
But with 33% more VRAM, better RT, and pretty good Stable Diffusion performance. For anyone who values those things, it's an easy buy over the 6700 XT.

I really think the A770 should drop further in price by year end, especially if Alchemist Refresh comes out.

EDIT: Forgot to mention AV1 decoding.
The AV1 encoding is the more important feature for content creators and streamers.

I think it is hard to recommend the A770 over the 6700XT for most. What's cool, though, is that there is even a viable alternative in this price range that doesn't cheese on RAM. That alone is a win for gamers, IMO: more choices, more competition, more pressure on the other vendors to deliver better bang for buck or risk market share slipping away.