News Intel GPUs - Intel launches A580


mikk

Diamond Member
May 15, 2012
4,133
2,136
136
Intel will publish a "best experience" guide for getting the best out of their ARC GPUs, because they are so sophisticated, you actually need to read to understand how to use them best.

By the way, I miss Lisa Pearce. Wish she would post daily updates on her driver nightmares on her blog :p


She has been the head of the driver team for many years and has failed; it's dubious that Intel hasn't replaced her yet. She has not only failed on driver quality, she also constantly fails at communication; she is always too late.

The big question is what's in the pipeline: is there a new driver branch or any major improvement in the works, and when will it be ready? Or maybe it isn't even in the pipeline yet, which would mean no real improvements for many months and certainly not this year. We have no clue.

I think they need a complete rework of the driver in the end; if not, performance will always vary a lot between games, I'm afraid.
 
  • Like
Reactions: Grazick

mikk

Diamond Member
May 15, 2012
4,133
2,136
136
Raja is not worried. He's hard at work getting Battlemage ready. Once it's ready, he will sit back and blame everything on Lisa. What a comfy job.


He won't blame everything on Lisa, even though they should do something and replace her, to be honest. She is clearly not up to the task.
 

Saylick

Diamond Member
Sep 10, 2012
3,127
6,299
136
That's the same type of behavior as when my technically illiterate dad would simply pull the plug on our printer when it acted up and started printing pages and pages of gibberish. Instead of trying to diagnose and solve the problem, he just shut off the machine to get it to stop doing what it wasn't supposed to do.
 
Jul 27, 2020
16,164
10,240
106
Reminds me of my technically illiterate uncle. I was using my PC. I just walked away for a few minutes, came back and saw that it was off. The main power switch had been turned off. He was the closest person. So I asked him if he had anything to do with it. He replied innocently, "It's wasting power. If you are not using it, turn it off to save on the electric bill". You can't imagine the meltdown I had. This was during the Win98 era when sudden shutdowns were more likely to corrupt data.
 

Leeea

Diamond Member
Apr 3, 2020
3,617
5,363
136
If the Flood comes, Newegg/Amazon/Bestbuy may start bundling RX6400/RX6500XT with PSUs for the price of the PSU. And I have no idea what they would do with ARC GPUs. Maybe donate them to African countries.
I would rather the Arcs go into the dumpster.

I want PC gaming to succeed, not sour an entire continent on the experience. Arc without ReBAR really is that bad.
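For anyone who wants to check their own box: on Linux you can infer whether ReBAR is active from the GPU's PCI BAR sizes, since with Resizable BAR on, one memory region is sized near the full VRAM instead of the legacy 256M window. A rough sketch only; the parsing is naive and the controller-name filter is my assumption:

```python
import re
import subprocess

# Rough ReBAR check on Linux: with Resizable BAR enabled, one of the GPU's
# PCI memory regions is sized near the full VRAM (e.g. [size=8G]) rather
# than the legacy 256M aperture.
out = subprocess.run(["lspci", "-vvv"], capture_output=True, text=True).stdout
for dev in out.split("\n\n"):
    if "VGA compatible controller" not in dev and "3D controller" not in dev:
        continue
    sizes = re.findall(r"Region \d+: Memory at .*\[size=(\w+)\]", dev)
    print(dev.splitlines()[0])
    print("  BAR sizes:", ", ".join(sizes) or "none visible (try as root)")
```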
 

KompuKare

Golden Member
Jul 28, 2009
1,014
924
136
As I've said before, what we don't know about the performance fiasco is whether it is all down to drivers, or even whether the driver team failed (well, failed more than Intel's driver team usually fails).

It's only hypothetical, but imagine the driver team had been promised certain specs or features that were then held back by hardware revisions. Or that they were told at the last minute that a specific feature they had spent ages writing the drivers around doesn't actually work.

A scenario like that would mean the blame could be fairly shared between the software and hardware teams!

Besides, Raja is the head of the GPU division, so in the end the drivers not being ready is ultimately his fault.

Plus the failures in terms of perf/power and perf/transistor are very much on the hardware team.
 
Jul 27, 2020
16,164
10,240
106
Besides, Raja is the head of the GPU division, so in the end the drivers not being ready is ultimately his fault.
Problem is, he won't incur extra costs to fix things (like hiring better driver writers) so he can show the management how frugal he is and keep collecting his bonuses.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,451
20,462
146
Not a bad first try from Intel. It will only get better from here, if they bite down on their mouthpiece and keep swinging for a change, and don't say no mas. It looks like forced obsolescence is what both Intel and AMD are choosing for their entry-level GPUs: using the cards in older systems means being penalized, more so with the Arc if this testing extends to lower visual settings. The performance penalties on the 6400 are rarely from playable to unplayable. Only testing at settings that exceed the 4GB frame buffer, in games that are the most intensive and poorly optimized, really lets reviewers do the Leonardo DiCaprio pointing meme. The rest is working hard to foster drama.

Steve does a good job of explaining the way they test. I still don't care for it. "Understanding behaviors" is great "for science"; it is pointless for us budget gamers. Test it the way we would use it: 1080p mixed settings in demanding games, with resolution scaling or 900p if there's no other choice. High and Ultra are for older games when using cards like this. ETAPrime does this the best of the bigger channels.
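The quick math on why 900p (or equivalent resolution scaling) is such a big lever on cards in this class:

```python
# GPU load scales roughly with pixels shaded per frame.
pixels_1080p = 1920 * 1080   # 2,073,600 pixels
pixels_900p = 1600 * 900     # 1,440,000 pixels
print(f"900p is {pixels_900p / pixels_1080p:.0%} of the 1080p pixel load")
# -> 69%, i.e. roughly a third fewer pixels to render each frame
```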

And if you are doing it for science, then don't compare performance in the conclusion. You can't say the games are unplayable the way you tested them on the cards, then conclude whether they are worth gaming on. I don't care what the testing might reveal about the higher-end card coming out, either. I want to know, for example, if the card being tested can do a 60fps average in F1 at 1080p, and if so, which settings get it there.
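Something like this is all I'm asking for: take a per-frame capture and report the numbers that matter. A minimal sketch, assuming a CSV of frame times in milliseconds; the file and column names are hypothetical, though PresentMon-style tools export something similar:

```python
import csv

def fps_summary(path, column="frame_time_ms"):
    """Average fps and 1% low from a per-frame time log (milliseconds)."""
    with open(path, newline="") as f:
        frame_ms = [float(row[column]) for row in csv.DictReader(f)]
    avg_fps = 1000.0 * len(frame_ms) / sum(frame_ms)
    # One common definition of the 1% low: average fps over the slowest 1% of frames.
    worst = sorted(frame_ms, reverse=True)[: max(1, len(frame_ms) // 100)]
    low_1pct = 1000.0 * len(worst) / sum(worst)
    return avg_fps, low_1pct

avg, low = fps_summary("f1_1080p_mixed_settings.csv")  # hypothetical capture
print(f"avg {avg:.1f} fps / 1% low {low:.1f} fps -- 60 avg is the bar")
```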

I lulzed at the build quality compliment at the end. It is as dumb as TPU making the PCIe version a plus/pro. The 6400 sips power; it doesn't need a beefy build. Why do I need a big shroud and heatsink, or two fans, for a card that is mostly going to use about 50W? GTfudgeO! Yet another reviewer who is out of touch with the hobby. That piece of paper, as he called the 6400, can go in a tiny build; that's a good thing for the niche. Again, ETAPrime represents us best in these areas.
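Back-of-the-envelope on why ~50W needs so little metal (the temperature figures below are my assumptions, not measurements):

```python
# How capable a cooler does a ~50W card actually need?
power_w = 50       # rough board power under load
t_case = 35        # warm case interior, assumed
t_gpu_max = 85     # conservative GPU temperature target, assumed
theta = (t_gpu_max - t_case) / power_w   # required thermal resistance
print(f"needed: {theta:.1f} deg C per watt")
# -> 1.0 C/W: comfortably small single-fan cooler territory
```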

Anyways, I am encouraged by what Intel has done here. They still have a lot of wood to chop, but it's a start.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,226
5,228
136
Not a bad first try from Intel. It will only get better from here, if they bite down on their mouthpiece and keep swinging for a change, and don't say no mas.

This is the crux of it. I never expected I would have the slightest interest in generation 1, but can their commitment hold when uncompelling cards released with unfortunate timing lead to a big sales flop?
 

PingSpike

Lifer
Feb 25, 2004
21,730
561
126
Reminds me of my technically illiterate uncle. I was using my PC. I just walked away for a few minutes, came back and saw that it was off. The main power switch had been turned off. He was the closest person. So I asked him if he had anything to do with it. He replied innocently, "It's wasting power. If you are not using it, turn it off to save on the electric bill". You can't imagine the meltdown I had. This was during the Win98 era when sudden shutdowns were more likely to corrupt data.

LOL, my mom once did the same thing when I left my PC running a hard disk defrag while I went to the store. I got to install Windows again after that.
 
Jul 27, 2020
16,164
10,240
106
LOL, my mom once did the same thing when I left my PC running a hard disk defrag while I went to the store. I got to install Windows again after that.
Yeah. Normally, I bet she would have left it alone but on that particular day, she saw the weird colorful activity on the screen (defragging the bits and pieces to make them contiguous) and probably thought you had left some animation thingy running or maybe even a screensaver and decided it should be turned off, for reasons only she knows. Did you ask her why?
 

Mopetar

Diamond Member
Jan 31, 2011
7,831
5,981
136
Reminds me of my technically illiterate uncle. I was using my PC. I just walked away for a few minutes, came back and saw that it was off. . . . This was during the Win98 era when sudden shutdowns were more likely to corrupt data.

To be fair, it was the Win98 era, so it probably would have blue-screened in the next half hour anyway.
 

Panino Manino

Senior member
Jan 28, 2017
820
1,022
136
Bigger cards are coming soon?



Edit:

This video also has a discussion with Intel Graphics Engineer Tom Petersen.

00:00 - Intel Arc GPU Design
02:15 - Intel's Naming Scheme Explained
03:51 - 2 Different Chips (ACM-G10 & ACM-G11 Specs)
05:43 - Engineering Discussion on Core Width & Design
07:00 - Cache, PCIe Generation, Lane Count Limits
08:18 - Why Only 8 PCIe Lanes?
09:23 - Intel A380 Specs & Price
10:55 - Arc Frequency Explained
14:11 - Power Consumption & Overclocking
17:13 - Driver Challenges & Reality
20:04 - Xe Core Deeper Dive (Architecture)
23:19 - Vector Units, Matrix Units, & Shaders
26:07 - Timelines, Launches
 
Jul 27, 2020
16,164
10,240
106
To be fair, it was the Win98 era, so it probably would have blue-screened in the next half hour anyway.
Nah. Win98SE didn't hate my hardware that much, but then it wasn't that powerful either: just a Pentium 133 without MMX. Windows ME, though, man! Installed it, tried to use it, and it was unstable. Went back to Win98SE in less than a day. The next OS I loved using was Win2K; especially moving the mouse around felt smoother and more elegant in motion than on Win98SE. I read somewhere that they moved the mouse-related code into the kernel (or was it the other way around? Seriously don't remember). I've got Windows Server 2003 in a VM and love using it way more than any of the later Windows Server releases.
 
  • Love
Reactions: MangoX

Frenetic Pony

Senior member
May 1, 2012
218
179
116
Huh? Yes, bigger cards are coming; how soon is up to Intel. The biggest card's real-world performance can be estimated to be similar to a 3070 Ti's. This would've been great early this year with complete drivers.