News Intel GPUs - Intel launches A580


DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,498
20,622
146
Steam Deck users may care. :p
Gamers with an all-AMD PC care too. SteamOS works great on them. ETA Prime often mentions that Cyberpunk 2077 runs better on AMD in Linux than in Windows. Wintel is still a thing.

ARC has been facing an uphill fight the whole way. Ceasing operations in Russia put them even further behind. (Please, no one drag the politics of it into the discussion, this isn't the venue.) Gandalf/Battlemage will hopefully fix some of the issues on the hardware side. And as they continue to reorganize and focus, the drivers will hopefully come along much faster.
 

mikk

Diamond Member
May 15, 2012
4,141
2,154
136
This was supposed to be fixed a while ago with the launch of the A380. I guess they de-prioritized it? @mikk mentioned it. TAP talked about it in one of the interviews too, I think.


They haven't fixed it yet. According to PCGH they are working on it. The problem seems to be the memory clock speed; on my A380 it runs at full speed, 1936 MHz, all the time. I hope they can improve this.
 
  • Like
Reactions: Leeea
Feb 4, 2009
34,583
15,799
136
From what I've read, these don't appear to be horrible cards, just not cards for hardcore gamers. IMO, as long as Intel does "good enough" with these cards there will be another generation, and those will improve upon what we have now; maybe those will be suitable for more than occasional gaming.
 

Hulk

Diamond Member
Oct 9, 1999
4,228
2,016
136
This is a tough one. I really want Intel to gain a foothold in the GPU market because the competition will drive each manufacturer to create better products at lower prices. But the problem I see isn't performance as much as the drivers. If you purchase a GPU knowing from the reviews what framerates it can achieve, and you are good with that performance for your dollar, then fair enough. But if the stability of the drivers isn't there, then it's game over. Ultimately Intel will live or die on the drivers. The hardware is obviously good enough for the low to midrange GPU market if the price is right and the drivers are solid.

So the big question is: will Intel see this out for the long run, or bail in two years? If they stay the course and ultimately develop a solid driver, then people will be having good, stress-free times with these cards in two years. Otherwise they will be in a dump somewhere.

I might take a risk on checking one of these out if there is a good return policy.
 
  • Like
Reactions: Leeea

Aapje

Golden Member
Mar 21, 2022
1,385
1,865
106
From what I've read, these don't appear to be horrible cards, just not cards for hardcore gamers.

They are not cards for any gamers at this point. They are just worse than a 6600 at actually playing games, unless you perhaps play only one game and that game works really well on Arc.

Right now the only real niche where they can actually be better is AV1 encoding. That will last until lower-end next-gen AMD/Nvidia cards come out.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
I think that is because it should have been a 3070/6800 competitor. Now it's performing like two tiers below.

I think it's only MLID's misguided projections that had almost everybody conclude that ARC had to be at that level.

At best I think the 3060 Ti would have been its rival. Their hardware IS behind; that's why they are slower than competitors relative to their specs on paper. Needing 400mm² for that level of performance is another metric telling you they are behind. Xe/Gen 12.x is a Vega competitor, not even an RDNA1 one.

And according to Raja, they are bottlenecked by driver/batch counts per frame. So at 4K, this bottleneck will matter less.

That's what they meant when they said they used the iGPU driver as the basis. Basically, with an iGPU the performance is entirely limited by the GPU hardware, so you can use more of the CPU to make the final performance better.*

But because with dGPUs the level of performance is that much higher, that optimization of using the CPU to boost performance comes back to bite you. It looks like even the A380 is affected.

*Although I even dispute this claim somewhat. There have been several instances where their driver overly favors the CPU side.
 
Last edited:

mikk

Diamond Member
May 15, 2012
4,141
2,154
136
Apparently Intel failed with their GDDR6 memory controller; according to AnandTech it's actually a licensed GDDR6 controller IP from outside the company. Imagine this GDDR6 controller without ReBAR's help. It wouldn't surprise me if MTL, with the same GPU core architecture, turned out much more competitive, not to mention that Intel's big driver overhead is less of an issue on iGPUs. IMHO even the older Xe LP is more competitive considering how low Xe LP clocks on Intel's node versus RDNA2 on TSMC 7/6nm; the bigger problem is Intel's higher CPU power usage on ADL at lower wattages like 15W.
 
  • Like
Reactions: Tlh97 and Leeea

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,498
20,622
146
unless you play only one game and that game works really well on Arc.
At the moment, that seems very difficult to determine. During the Linus live stream, Luke could not get Forza Horizon 5 to run. He guessed it was because of his save point, but couldn't say for certain. FH5 is one of the most impressive games on ARC. Yet, when not benchmarking but actually trying to play the game, it can fail to even load.

Some reviewers calling buyers beta testers is generous. For normal users, these cards would have Xbox 360 RROD return rates. In their current state, at least.

I will not be participating in the psyop campaign either. Intel is releasing a hot mess to the public, with so many caveats about its use that it seems like a parody/lampooning. That they are trying to minimize the financial damage by charging so much is an insult. They should be taking a huge L. At $200 for the 770 I'd join in. At $350 I am ready to play the funeral dirge and start shoveling dirt on it.
 

DeathReborn

Platinum Member
Oct 11, 2005
2,746
741
136
At the moment, that seems very difficult to determine. During the Linus live stream, Luke could not get Forza Horizon 5 to run. He guessed it was because of his save point, but couldn't say for certain. FH5 is one of the most impressive games on ARC. Yet, when not benchmarking but actually trying to play the game, it can fail to even load.

Some reviewers calling buyers beta testers is generous. For normal users, these cards would have Xbox 360 RROD return rates. In their current state, at least.

I will not be participating in the psyop campaign either. Intel is releasing a hot mess to the public, with so many caveats about its use that it seems like a parody/lampooning. That they are trying to minimize the financial damage by charging so much is an insult. They should be taking a huge L. At $200 for the 770 I'd join in. At $350 I am ready to play the funeral dirge and start shoveling dirt on it.

Quite a few have pointed out that Intel, instead of trying to shoehorn extra features into the drivers, could have spent that time on bug fixes/performance. They may look good on a shelf, but inside a PC we expect them to work, and in more than a few cases they either make the RX 6500 XT look good or crap the bed.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,228
5,228
136
Quite a few have pointed out that Intel, instead of trying to shoehorn extra features into the drivers, could have spent that time on bug fixes/performance. They may look good on a shelf, but inside a PC we expect them to work, and in more than a few cases they either make the RX 6500 XT look good or crap the bed.

It's often the case that the teams working on core functionality and on extra features are completely separate, and it isn't as simple as "fix this first, then do that."
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Intel drivers relying on the CPU is nothing new. It has been that way since the original Extreme Graphics.

Well, they had to back then, because it didn't have hardware vertex shaders. That was fixed ages ago with the X3000 (although it had a weak one and didn't perform well enough until Sandy Bridge).

But I'm talking about how the CPU runs at, say, 3GHz+ while running a game, taking up most of the power budget and starving the iGPU. If you took ThrottleStop and simply forced how the chip allocates its power budget, your game would often end up running better.

That's the difference between back then and now. Back then they had to; nowadays they are sort of choosing to. But they are somehow still stuck in that mentality.

The recent pace of driver improvements in ARC (while still not enough) is very different from the past 20+ years of driver work, because things are actually being dealt with. Even with Iris Xe the driver improvements were the Intel usual: snail-paced. It took forever for issues to get fixed, if they ever got fixed at all. So this is the silver lining.

Apparently Intel failed with their GDDR6 memory controller; according to AnandTech it's actually a licensed GDDR6 controller IP from outside the company. Imagine this GDDR6 controller without ReBAR's help.

Remember the fancy video they put out on Twitter saying they'll "unleash" their iGPU and make discrete cards? I believe that was in 2019, since they said it would be released in 2020.

That's Cyberpunk 2077 levels of lunacy. Because TAP said they've been working on ARC for three years. So we're in 2022, meaning work started at the same time they claimed they were going to release dGPUs. How would a release in 2020 have been possible? In 2019 ARC would have been on the drawing board!

Intel has traditionally been very good at memory controllers. Maybe they need to learn extra details for high-end graphics workloads, but licensing sounds like them using it to save time. Things like the ReBAR problem, and how they didn't foresee issues, could come down to this unrealistic timeframe they had.
 
Last edited:

VirtualLarry

No Lifer
Aug 25, 2001
56,353
10,050
126
Linus must be raking in that Intel money. Yet another video pimping ARC. This time it is the 380 as a secondary GPU in your system for handling AV1.
You have to remember, Linus has a back room filled with people and PCs to do his video editing. For those people, the addition of a fairly inexpensive discrete card that can handle AV1 playback/encoding (the future of YouTube is AV1, after all, because it's royalty-free) makes a lot more sense than it does for you or me, the ordinary gamer / miner / non-YouTube persona.
 

PingSpike

Lifer
Feb 25, 2004
21,732
561
126
You have to remember, Linus has a back room filled with people and PCs to do his video editing. For those people, the addition of a fairly inexpensive discrete card that can handle AV1 playback/encoding (the future of YouTube is AV1, after all, because it's royalty-free) makes a lot more sense than it does for you or me, the ordinary gamer / miner / non-YouTube persona.

Can you do comparable AV1 encoding with a CPU? I'm wondering if just throwing more cores at the problem makes more sense, since they're also good at more than one thing and don't have 20 watts of idle power burn even when they aren't doing anything.
 
  • Like
Reactions: Leeea

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,498
20,622
146
You have to remember, Linus has a back room filled with people and PCs to do his video editing. For those people, the addition of a fairly inexpensive discrete card that can handle AV1 playback/encoding (the future of YouTube is AV1, after all, because it's royalty-free) makes a lot more sense than it does for you or me, the ordinary gamer / miner / non-YouTube persona.
Yet that is exactly the audience this is targeted at. It is nothing more than an Intel infomercial. It keeps weirding me out watching him promote them while also having a segment covering all the downsides. It reminds me of the warnings about side effects in those drug commercials. :p
 

Aapje

Golden Member
Mar 21, 2022
1,385
1,865
106
You have to remember, Linus has a back room filled with people and PCs to do his video editing. For those people, the addition of a fairly inexpensive discrete card that can handle AV1 playback/encoding (the future of YouTube is AV1, after all, because it's royalty-free) makes a lot more sense than it does for you or me, the ordinary gamer / miner / non-YouTube persona.

I also remember that Linus uploads in 4K and that YouTube transcodes to AV1 for the most popular videos, which surely includes LTT's videos. So he has little reason to bother, since YouTube will do the work for him. I consider it highly unlikely that he is actually adding Arc cards to his machines. Besides, next-gen Nvidia/AMD cards have an AV1 encoder too, so he can just upgrade to a single new Nvidia/AMD card for the same benefit, while also having a stable system.

But... I also remember that Linus has taken about $100k of bribe money from Intel that his minions get to spend on their home setups, so that all of LTT has an incentive to treat Intel better, as that effectively increases their salary.
 

mikk

Diamond Member
May 15, 2012
4,141
2,154
136
Can you do comparable AV1 encoding with a CPU? I'm wondering if just throwing more cores at the problem makes more sense, since they're also good at more than one thing and don't have 20 watts of idle power burn even when they aren't doing anything.


For 900p-1080p 60 fps live streaming, no, because it's awfully slow. Even x264 slow is a challenge. That's why some streamers have a dedicated streaming PC; in such a case the additional 20W of an Arc A380 is nothing.
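For anyone who wants to sanity-check that on their own hardware, here is a minimal sketch of how the two paths could be timed with ffmpeg from Python. The encoder names are real ffmpeg encoders (libsvtav1 for CPU SVT-AV1, av1_qsv for Arc's hardware AV1 block), but the input file name, presets, and quality values are placeholder assumptions, not a tuned streaming configuration.

```python
# Rough sketch: time a CPU (SVT-AV1) encode against an Arc QSV AV1 encode.
# Assumptions: ffmpeg is on PATH and was built with libsvtav1 and av1_qsv;
# "input.mkv" is a hypothetical 1080p60 test clip.
import subprocess
import time

CLIP = "input.mkv"

ENCODERS = {
    # CPU path: SVT-AV1, preset 8 is a middling speed/quality trade-off
    "cpu_svt_av1": ["-c:v", "libsvtav1", "-preset", "8", "-crf", "35"],
    # GPU path: Intel QSV AV1 encoder exposed by Arc cards
    "arc_av1_qsv": ["-c:v", "av1_qsv", "-preset", "veryfast", "-global_quality", "30"],
}

for name, args in ENCODERS.items():
    start = time.time()
    subprocess.run(["ffmpeg", "-y", "-i", CLIP, *args, "-an", f"{name}.mkv"], check=True)
    print(f"{name}: {time.time() - start:.1f} s")
```

If the CPU run can't get through a 60-second 1080p60 clip in comfortably under 60 seconds, it won't keep up as a live encoder, which is the point being made above; the A380's fixed-function encoder doesn't have that problem.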
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Linus must be raking in that Intel money. Yet another video pimping ARC. This time it is the 380 as a secondary GPU in your system for handling AV1.


But what he says actually makes sense for a lot of people. It makes sense for streamers or video content creators. The A380 is super cheap, and at AV1 encoding it beats almost any GPU/CPU out there.

But, it is a niche market. And useless to most of us.
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,691
136
You have to remember, Linus has a back room filled with people and PCs to do his video editing. For those people, the addition of a fairly inexpensive discrete card that can handle AV1 playback/encoding (the future of YouTube is AV1, after all, because it's royalty-free) makes a lot more sense than it does for you or me, the ordinary gamer / miner / non-YouTube persona.

If the A310 ever shows up at retail, it'd be perfect for this. If it comes in a low-profile version, so much the better.
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,846
3,190
126
I wonder if the A380 will be better than NVENC in transcoding on things like Handbrake and Plex @ H.265.
If that's the case, I wouldn't mind getting one to replace my old Quadro P2200.
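If anyone wants to test that before buying, here is a minimal sketch of how the two transcode paths could be timed with ffmpeg. hevc_qsv and hevc_nvenc are the real ffmpeg encoder names for Intel QSV and NVENC, but the hardware-decode flags, preset, and quality values are assumptions that may need adjusting for a given ffmpeg build and driver stack.

```python
# Rough sketch: time a 1080p -> HEVC transcode on the Arc/QSV path and the
# NVENC path (e.g. a Quadro P2200) so the two can be compared directly.
# Assumptions: ffmpeg on PATH with QSV and NVENC support; "sample.mkv" is a
# hypothetical test file similar to what Plex/Handbrake would be chewing on.
import subprocess
import time

SOURCE = "sample.mkv"

PIPELINES = {
    # Intel path: QSV hardware decode + hevc_qsv encode
    "arc_hevc_qsv": ["-hwaccel", "qsv", "-i", SOURCE,
                     "-c:v", "hevc_qsv", "-global_quality", "24"],
    # NVIDIA path: CUDA hardware decode + hevc_nvenc encode
    "nvidia_hevc_nvenc": ["-hwaccel", "cuda", "-i", SOURCE,
                          "-c:v", "hevc_nvenc", "-preset", "p5", "-cq", "24"],
}

for name, args in PIPELINES.items():
    start = time.time()
    subprocess.run(["ffmpeg", "-y", *args, "-c:a", "copy", f"{name}.mkv"], check=True)
    print(f"{name}: {time.time() - start:.1f} s")
```

Elapsed time alone won't settle quality per bit, so a visual or VMAF comparison of the outputs is still worth doing, but it answers the raw throughput question.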
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,498
20,622
146
But, it is a niche market. And useless to most of us.
That is my point.

However, it didn't stop him from advertising to us anyway. Saying things like (and I'm paraphrasing): "You probably have a free PCIe slot in your system already." And: "All of your favorite streaming services are going to be using it."

But wait! There's more! It is only $140. Never mind that most of you watching this are doing it on a phone, laptop, smart TV, Fire Stick, Roku, Chromecast, etc. Some of you are on PC. And some of those that are, stream while gaming. Yes! We're talking to YOU! ARC will change your life! Aren't you tired of Twitch partners getting all the bandwidth? Of course you are! ARC will make your low-bitrate stream look as good as the pros'! Buy the A380 now, and when you get it (disclaimer appears at the bottom of the infomercial for one second: if they actually ever fill all the back orders), ARC will make your life better! Order now! Don't wait! Because supplies are limited! Very limited.

Intel can keep the advertising and marketing machine going full blast. But in the end, they have to make customers happy with the products. As of now, I don't think they can do that. Returns, RMAs, buyer's remorse, and bad user reviews will roll in.

And to conclude my musings with more infomercial jargon: That's a lot of damage!

Flex Tape won't be able to fix that one though. :p
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
And to conclude my musings with more infomercial jargon: That's a lot of damage!

Flex Tape won't be able to fix that one though. :p

It did have the infomercial feeling to it. Though to be fair, it did have valid information and side-by-side footage of various encodings.

Funny enough, Linus' voice isn't that far off from some of the infomercial guys'. So I could read the "That's a lot of damage!" in his voice, and it still sounds the part ;)
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,498
20,622
146
It did have the infomercial feeling to it. Though to be fair, it did have valid information and side-by-side footage of various encodings.
It did. But it is still so esoteric an area of interest that they sought help from EposVox. And yet, it has over a million views already. That's why Intel is paying him to do it now, instead of continuing the traveling medicine show.
 

coercitiv

Diamond Member
Jan 24, 2014
6,213
11,954
136
This card only worked in a specific slot on our motherboard, and if we dared to plug the monitor into our iGPU while the ARC card was installed the system would blue-screen almost immediately.

These cards are not for games; they ARE the game.