Info The Intel ARC A750 LE and A380 Blog (and general Arc owners' thread). 55 games tested and counting. Hulk's results with Topaz and Deep Link added.


sniffin

Member
Jun 29, 2013
141
22
81


Intel Xe Matrix Extensions (needed for Intel Extension for PyTorch) working on ARC according to this post: https://towardsdatascience.com/util...trix-extensions-xmx-using-oneapi-bd39479c9555

They don't explicitly state it, but these instructions are for Linux.

However, I did get their Docker container working. I have no idea what I did wrong the first time but I blame lack of sleep.

Anyway, moving the training to the Arc GPU was easy once I got in. The attached screenshot shows training an older NN (one of my first) with a very small batch size (10) that I had been running on the CPU. Utilisation is pretty poor, and it is much slower at these small batch sizes than my 3080. Huge batch sizes (1000s) work much better.

I will need to set up a proper benchmark to compare them. There is very little data about this online.
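Since there's so little published data, something like the sketch below could standardize the comparison. It's a minimal, hypothetical harness (the `step_fn` callable and the batch sizes are placeholders, not from my actual runs) that times one training step at each batch size, which is enough to expose the small-batch utilisation gap:

```python
import time

def time_training_step(step_fn, batch_sizes, repeats=5):
    """Return {batch_size: best wall-clock seconds} for one call of step_fn.

    step_fn(batch_size) should run one forward/backward pass. The first
    call per batch size is a warm-up, since initial kernel compilation
    can dominate on a freshly initialized device.
    """
    results = {}
    for bs in batch_sizes:
        step_fn(bs)  # warm-up, not timed
        timings = []
        for _ in range(repeats):
            t0 = time.perf_counter()
            step_fn(bs)
            timings.append(time.perf_counter() - t0)
        results[bs] = min(timings)  # best-of-N filters scheduler noise
    return results

def throughput(results):
    """Convert per-step timings to samples/second for easy comparison."""
    return {bs: bs / secs for bs, secs in results.items()}
```

Running the same harness on the Arc and the 3080, and comparing samples/second rather than raw step time, would make the batch-size-10 vs batch-size-1000s behaviour directly comparable between the two cards.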
 

Attachments

  • Screenshot 2023-06-27 192729.png (190.3 KB)


DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Aug 22, 2001
27,994
19,014
146
Had a spontaneous system reboot while on Amazon Prime Video. It turned CSM back on for some weird reason. I have the 6/1 security-update BIOS, so I am leaning towards the latest ARC beta drivers being responsible. It hasn't happened again, and the system is on 24/7. No issues coming out of sleep mode, as some report, either.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,542
6,740
136

-Nice to see another "current gen" LP card. Pickin's are pretty slim for anything remotely modern in a LP setting.
 
  • Like
Reactions: DAPUNISHER

Pohemi

Diamond Member
Oct 2, 2004
8,360
10,315
146
I had initially purchased an A770 16GB with the rest of the parts for my new build a few months back. I decided shortly after assembly that it was likely overkill for the older games I still play, and decided to return it and buy an A750.

For my limited use, it's done very well. I'm mostly playing WoW and The Division 2, both at 4K and medium-high to high settings. If I drop the res to 1440p, I can set it to high/ultra. No stutter or hang-ups, even in WoW's 25-man raids with a ton happening onscreen.

I haven't done statistical testing like DAPUNISHER but have been more than happy with it, and glad I downgraded from the A770 and saved a Benjie.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Aug 22, 2001
27,994
19,014
146

-Nice to see another "current gen" LP card. Pickin's are pretty slim for anything remotely modern in a LP setting.
The extra 2GB over the 6400 and 1650 LP cards is nice. If it weren't for needing ReBAR, it'd be great for all those 6th and 7th gen Intel SFF OEM systems selling for dirt cheap on eBay and other marketplaces.

@Pohemi

I will add your game results to the OP with attribution, if it is ok with you. If you have any other game results I haven't tried, I would like to add those too.

I got around to watching Bryan/TYC's latest video on ARC. Combined with all of the other content I have seen on ARC, the display and resolution used, how many displays are used, and the display connection type can all be problematic.

Have to agree with his conclusion that it isn't for the average gamer looking for a smooth, trouble-free ownership experience. I haven't run into the alt-tab issue or the vertical screen tearing he did, but I don't own either Diablo 4 or The Last of Us, the games he had the tearing in.
 
  • Like
Reactions: Tlh97 and Pohemi

Pohemi

Diamond Member
Oct 2, 2004
8,360
10,315
146
Yep, no problem. I have a ton of other (also older) games on different services like Steam, Bethesda, Ubisoft, etc. but hadn't installed anything but the two I mentioned. If I do, I'll report back with results.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Aug 22, 2001
27,994
19,014
146
I keep getting the spontaneous reboots where it resets CSM. It has happened 3 more times since rolling back to the last WHQL drivers. When I did the drivers, I disabled ECO mode and turned off the 200MHz extra boost for the 5600, just to eliminate the possibility of an unstable overclock. It always happens on the web, with an interface like Amazon Prime Video or Fanatical.

I am now connecting it directly to the TV without the A/V receiver doing passthrough. If it happens again, I'll have to move to a monitor with DisplayPort to see if it's an HDMI issue. If it happens there, I'll try a different system. If it still happens, I will RMA the card.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Aug 22, 2001
27,994
19,014
146
I have been upgrading and using my main PC, so the ARC has sat idle this last week.

Vex has an even-handed "30 days with ARC" video up that's worth the watch. Covers some content creation -

 
  • Like
Reactions: Pohemi

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Aug 22, 2001
27,994
19,014
146
We'll have to see how things look when everyone has a game-ready driver, but it's an outstanding showing by the A770. It says they are using an in-house engine. I guess that will make this game an outlier?
 
  • Like
Reactions: Pohemi

Pohemi

Diamond Member
Oct 2, 2004
8,360
10,315
146
Vex has an even-handed "30 days with ARC" video up that's worth the watch. Covers some content creation -
....

Nice comparison review he does, I watched the entire thing.

Only one thing really surprised me, and it wasn't the performance of any of the tests...it was the fact that he's using a (relatively) old system with "only" PCI-e 3.0. He's honest and straightforward about it and mentions it in the vid, but I went...wait, what?

Being a hardware reviewer, I'm just a little surprised he doesn't run something newer. Most boards and CPUs now support 4.0, and 5.0 will become more common as the next year or two progresses. 6.0 is already out as a standard, but obviously isn't being utilized yet.

It makes me curious how limiting using PCI-e 3.0 could be compared to PCI-e 4.0. It'd affect all GPUs I would think so it isn't as if it'd necessarily favor one GPU over another. Still...testing current gen GPUs on an old gen mobo and CPU PCI-e standard...c'mon, man. lol
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Aug 22, 2001
27,994
19,014
146
Nice comparison review he does, I watched the entire thing.

Only one thing really surprised me, and it wasn't the performance of any of the tests...it was the fact that he's using a (relatively) old system with "only" PCI-e 3.0. He's honest and straightforward about it and mentions it in the vid, but I went...wait, what?

Being a hardware reviewer, I'm just a little surprised he doesn't run something newer. Most boards and CPUs now support 4.0, and 5.0 will become more common as the next year or two progresses. 6.0 is already out as a standard, but obviously isn't being utilized yet.

It makes me curious how limiting using PCI-e 3.0 could be compared to PCI-e 4.0. It'd affect all GPUs I would think so it isn't as if it'd necessarily favor one GPU over another. Still...testing current gen GPUs on an old gen mobo and CPU PCI-e standard...c'mon, man. lol
Here's the deal: he is a 21-year-old kid just getting his channel going. He can't afford stuff yet. The A770 was a loaner from a viewer who reached out to him; otherwise he wouldn't even be covering it.

PCIe 3.0 x16 has plenty of bandwidth for ARC; it doesn't hold it back to any significant degree. Check out the PCIe scaling tests on TechPowerUp: even an x8 card like the 6600 XT doesn't see any real-world difference. The total difference is like 4%, which is barely outside margin-of-error territory. It amounts to a couple of frames in most titles, nothing to worry about.
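To put numbers on "plenty of bandwidth": usable throughput per PCIe generation can be approximated from the raw transfer rate and the line-code efficiency. A back-of-the-envelope sketch (packet/protocol overhead is ignored, so real-world figures are slightly lower):

```python
# Approximate usable PCIe bandwidth in GB/s: raw transfer rate (GT/s)
# per lane, scaled by line-code efficiency, times the lane count.
PCIE_GEN = {
    # generation: (GT/s per lane, line-code efficiency)
    1: (2.5, 8 / 10),     # 8b/10b encoding
    2: (5.0, 8 / 10),
    3: (8.0, 128 / 130),  # 128b/130b encoding
    4: (16.0, 128 / 130),
    5: (32.0, 128 / 130),
}

def pcie_gbps(gen: int, lanes: int = 16) -> float:
    """Usable bandwidth in GB/s for a PCIe generation and lane count."""
    gts, eff = PCIE_GEN[gen]
    # One transfer moves one bit per lane; divide by 8 for bytes.
    return gts * eff * lanes / 8

# 3.0 x16 ~= 15.8 GB/s each way -- exactly what a 4.0 x8 link gives,
# and double what an x8 Gen3 card like the 6600 XT gets (~7.9 GB/s).
```

So a full x16 Gen3 slot gives ARC roughly 15.8 GB/s each direction, and the ~4% scaling deltas TechPowerUp measures suggest games rarely saturate even that.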

We have already seen that game storage sees no significant benefit even going from a SATA SSD to NVMe, never mind the even smaller differences between NVMe gens and speeds.
 
Feb 4, 2009
34,381
15,600
136
Here's the deal: he is a 21-year-old kid just getting his channel going. He can't afford stuff yet. The A770 was a loaner from a viewer who reached out to him; otherwise he wouldn't even be covering it.

PCIe 3.0 x16 has plenty of bandwidth for ARC; it doesn't hold it back to any significant degree. Check out the PCIe scaling tests on TechPowerUp: even an x8 card like the 6600 XT doesn't see any real-world difference. The total difference is like 4%, which is barely outside margin-of-error territory. It amounts to a couple of frames in most titles, nothing to worry about.

We have already seen that game storage sees no significant benefit even going from a SATA SSD to NVMe, never mind the even smaller differences between NVMe gens and speeds.
Yeah, time and time again it's the same story for PCIe, AGP, and the PCI bus: the newer, faster bus speed typically doesn't matter until years later, when hardware is designed to use it.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Aug 22, 2001
27,994
19,014
146
Yeah, time and time again it's the same story for PCIe, AGP, and the PCI bus: the newer, faster bus speed typically doesn't matter until years later, when hardware is designed to use it.
It's one of those checkboxes for OEMs to sell the new shiny.

Prosumers and content creators may benefit from wicked-fast storage and GPU bandwidth, but for us gamers? Not so much. It's nice to have, just not something worth spending money on soon after it hits the market.
 

Pohemi

Diamond Member
Oct 2, 2004
8,360
10,315
146
The newer, faster bus speed typically doesn't matter until years later, when hardware is designed to use it.

This is why I wondered about it, since 4.0 has been relatively 'common' for 4-5 years now. I know the newest NVMe drives are the only devices nearing the limits of 4.0, but 3.0 is an old standard when 5.0 is already being utilized. I wasn't sure if 3.0 would be limiting for Gen 4 GPUs, because they've been out for a while now.
 
Feb 4, 2009
34,381
15,600
136
@DAPUNISHER
New, new Arc drivers. This release seems to target two specific games, but I read something about "general Xe Super Sampling improvements" or something to that effect.
Oops here:
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Aug 22, 2001
27,994
19,014
146
Remnant 2 is a hot mess. It runs terribly on everything, and it's the first game to state it is designed to be used with upscaling. Everyone here who said it would happen: you should be venerated. :beercheers:

ARC support is especially bad. The devs say they are working to fix it and have a beta specifically for ARC/A750 in the meantime; otherwise there is graphical corruption and even worse performance. No one should buy this game until a bunch of optimizations are added. ARC owners in particular should give it a hard pass.

 

Ranulf

Platinum Member
Jul 18, 2001
2,293
1,077
136
Just skimming the video, it doesn't seem to matter much between low, medium, high, or XeSS: lows in the 30s, highs in the mid-60s in some parts.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Aug 22, 2001
27,994
19,014
146
Just skimming the video, it doesn't seem to matter much between low, medium, high, or XeSS: lows in the 30s, highs in the mid-60s in some parts.
A guy in the comments says it's like that on his 1080 Ti too, and that he can go past 8GB of VRAM after playing for hours.

Digital Foundry has a video on the game. Seems better on console, though far from great. I think they speculate it uses Epic's upscaler on console.
 
Jul 27, 2020
14,776
9,040
106
Both Remnant II and Ratchet & Clank: Rift Apart integrate support for Intel XeSS technology (Xe Super Sampling) to improve gaming performance without sacrificing image quality, and the implementations show amazing scaling in our testing. On Remnant II, when running at 1080p High we see a 57% uplift in average frame rates and at 1440p Medium settings there is an 80% increase in FPS! And in Ratchet & Clank, the uplift at 1080p Very High is 44% and at 1440p High is 88% – amazing work from our developer partners and the Intel Arc graphics architecture!
I say BRAVO!

The higher you go, the more you get!

I really wish I knew who the mastermind is behind this awesome architecture that is so hungry for MOAR pixels!

And no, it's not Raja Koduri. Else Intel would have done everything to keep him.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Aug 22, 2001
27,994
19,014
146
I say BRAVO!

The higher you go, the more you get!

I really wish I knew who the mastermind is behind this awesome architecture that is so hungry for MOAR pixels!

And no, it's not Raja Koduri. Else Intel would have done everything to keep him.
I'm playing the Poe's Law card. Are you serious? Bravo for a broken game that all but forces you to use upscaling?
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Aug 22, 2001
27,994
19,014
146
Nah, I was more impressed with their upscaling performance improvements. Who cares about the game? :D
XeSS looks better than FSR on Nvidia cards, in both this steaming pile and Ratchet & Clank, from what I've seen and read. I haven't seen it compared on Radeon yet.