News Intel GPUs - Intel launches A580

LightningZ71

Golden Member
Mar 10, 2017
1,628
1,898
136
By all means please upgrade and take one for the team. After that let us know about the performance. :innocent:
LOL!

I'm more daydreaming at this point. I'm not going near an Intel GPU until they have proven that they can go at least 18 months after volume retail release while still shipping actual GPU drivers that make actual improvements, and don't just abandon the things once sold. They have done this multiple times in the past with GPU products that weren't their in-house iGPUs (which have all sorts of nasty volume contracts with OEMs that stipulate support).
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,846
3,190
126
I'm telling you, they utterly botched everything.
They should have launched it during the Ethereum mining period, taken as much profit from it as they could, and then hammered out the perfected drivers while the miners were funding the entire project.

This way they could have sold garbage wrapped in gold paper, and it would have sold as long as it could mine ETH at the same power draw as a 1660 Super with the same hash rate, and no one would care, because gamers wouldn't even have been able to get their hands on one due to the miners.

But instead they decided to launch video cards at the worst possible time, when GPU mining is dead and gamers are all pissed off at the vendors for charging more than what the cards are worth, while still thinking people will pay the current prices for this gen, and the next.

As Steve from Gamers Nexus says... "thanks a lot, Intel"....
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
@LightningZ71 I don't believe they will abandon it, and pretty much everything related to cancellation came from MLID in one form or another.

They said that having only iGPUs is a problem because it doesn't allow them to directly capitalize on the investments. That means client dGPUs, since development for iGPUs and client dGPUs is much closer than it is to server GPUs; the overlap with server GPUs certainly isn't there on the driver side.

Also, their CTO and head of the software and advanced technology division, Greg Lavender, has said that it wasn't until the middle of last year that they started organizing all the software groups under one umbrella. When Greg asked Pat whether they had a software group, Pat said, "No, they are all over the place."
 
Last edited:

moinmoin

Diamond Member
Jun 1, 2017
4,956
7,675
136
Also, their CTO and head of the software and advanced technology division, Greg Lavender, has said that it wasn't until the middle of last year that they started organizing all the software groups under one umbrella. When Greg asked Pat whether they had a software group, Pat said, "No, they are all over the place."
Oh god, that certainly shows. Greg Lavender is somebody Pat essentially took with him from VMware. This explains why everything appears to have happened on short notice: it actually did, with Pat and Lavender both only joining last year and making those (correct) changes.
 

Leeea

Diamond Member
Apr 3, 2020
3,626
5,368
136
@LightningZ71 I don't believe they will abandon it, and pretty much everything related to cancellation came from MLID in one form or another.
MLID said they would release the chips they already made / contracted for first. Apparently TSMC has already made a bunch of chips for Intel, and is under contract to make Battlemage for Intel, so they want to sell those before killing the project.

Which kind of makes sense: Intel is going to want to recoup as much as they can from all the silicon they have already had manufactured or have already signed contracts to manufacture. That is all money already spent.


Battlemage is a ways off in the future, so odds are we will not know for another two years if MLID was right.
 
Last edited:

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,498
20,622
146
In short, a crazily designed LED lamp that can also be used for occasional gaming. :mask:

It doesn't even matter if the software achieves AMD fine wine status now. When it comes time to re-paste, change pads, or replace a fan, the vast majority are just going to either sell it or throw it in the e-waste bin. Don't forget this comes with a 3-year warranty. They must anticipate that, with the number they produce and sell, the financial hit from having to take RMAs on these will be acceptable. That, or they failed to take it into account, like almost everything else about this product.

This is an unqualified disaster for the DIY market. I have worked on some annoying laptops and ALL-IN-ONEs over the years, but this thing is beyond stupid.
 

PingSpike

Lifer
Feb 25, 2004
21,732
561
126
Well, the A770 also has horrendous idle power consumption, so if driver fixes for this are going to materialize, it doesn't seem like it will be in the immediate future.

 
  • Like
Reactions: Leeea
Jul 27, 2020
16,340
10,352
106
From the TPU review, I can see that the A770 has real potential, particularly in raytracing. Plus, the transient spikes are not insane. I'm actually going to prefer the A770 over the 6700 XT for DX12 games. Not bad, Intel. Not bad at all.
 

Leeea

Diamond Member
Apr 3, 2020
3,626
5,368
136
From the TPU review, I can see that the A770 has real potential, particularly in raytracing. Plus, the transient spikes are not insane. I'm actually going to prefer the A770 over the 6700 XT for DX12 games. Not bad, Intel. Not bad at all.
I think you're being overly optimistic. It still has failures in DX12 games, just not in the ones it is optimized for.


Well, the A770 also has horrendous idle power consumption, so if driver fixes for this are going to materialize, it doesn't seem like it will be in the immediate future.

https://www.techpowerup.com/review/intel-arc-a770/38.html
44 watts idle? ouch!

Also, the AMD 6800 XT at 29 watts idle? Also ouch!


I am happy to pay the power tax when I am gaming, but at idle* I would rather not!


*For reference, according to the above chart my RX 6900 XT will cost me $15.77 more per year in electricity than an RTX 3080 Ti. I leave my computer on all the time. Contrary to my desires, most of the time I am not gaming :(.

edit: see Stuka87's post below:
 
Last edited:

Leeea

Diamond Member
Apr 3, 2020
3,626
5,368
136
That must be from when they tested it at launch. Which is weird, because TPU has actually tested the updated drivers?! The chart below shows how far the power consumption has dropped. Multi-monitor is of course higher, which is typical of larger GPUs.

[Attachment 68813: idle power consumption chart]
oh, sweet! Nice!

Thank you AMD!

Those driver updates saved me $26 per year :).

notes:
RX 6900 XT:
start idle wattage: 28 W
end idle wattage: 8 W
difference: 20 W
(20 W / 1000) kW * 24 h/day * 365 days * $0.15/kWh = $26.28/year
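
Since this same back-of-the-envelope idle-cost math comes up a couple of times in the thread, here is a minimal sketch of it in Python (the helper name is my own, and it assumes a constant draw around the clock at a flat $0.15/kWh rate):

```python
# Minimal sketch of the idle-power cost arithmetic above.
# Assumes the card idles 24/7 at a constant draw and a flat electricity rate.
def annual_idle_cost(watts: float, price_per_kwh: float = 0.15) -> float:
    """Yearly electricity cost (USD) of a constant draw of `watts`."""
    kwh_per_year = watts / 1000 * 24 * 365
    return kwh_per_year * price_per_kwh

# The 20 W the driver update shaved off the RX 6900 XT's idle draw:
print(f"${annual_idle_cost(20):.2f} per year")  # -> $26.28 per year
```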


edit:
Just realized I am multi-monitor. On the other hand, at those rates I should put my second monitor on an old GPU I have lying around and save some money...
 
Last edited:

blckgrffn

Diamond Member
May 1, 2003
9,128
3,069
136
www.teamjuchems.com
From the TPU review, I can see that the A770 has real potential, particularly in raytracing. Plus, the transient spikes are not insane. I'm actually going to prefer the A770 over the 6700 XT for DX12 games. Not bad, Intel. Not bad at all.

I thought that round table with Raja was really eye-opening. It doesn't really look like a balanced design. Some games will run at 3070 levels, but if you hit some memory-intensive bits it could be worse than a 3060 by a fair margin, and that's just the hardware. There are other tradeoffs noted there.

Given how memory performance nearly always matters and RT only matters in a small number of cases... IDK. I don't think I'd go for that now.

The next 3-6 months are the best time for the A770 to exist, and then even its strengths will likely be weaker than those of its segment peers.

If I can get one for sub $100 at some point I still will :D

Also, multimonitor power consumption on Radeons still gets me. @Leeea I think running a display at "high refresh rates" still kicks the memory into high gear and you get idle usage similar to the dual monitor cost.

I do put my monitors to sleep at home pretty quickly now, hoping that brings the clock speeds down, but TBH I haven't been able to observe or confirm that is a thing.
 
  • Like
Reactions: Tlh97 and Leeea

PingSpike

Lifer
Feb 25, 2004
21,732
561
126
That must be from when they tested it at launch. Which is weird, because TPU has actually tested the updated drivers?! The chart below shows how far the power consumption has dropped. Multi-monitor is of course higher, which is typical of larger GPUs.

[Attachment 68813: idle power consumption chart]

Glad to see this, I had the same feeling as Leeea about the 6800xt. It was especially weird since their latest low end cards have crazy low idle power.

IIRC AMD cards had (have?) a problem where they couldn't downclock the memory when running multimonitor.
 
  • Like
Reactions: Tlh97 and Leeea

blckgrffn

Diamond Member
May 1, 2003
9,128
3,069
136
www.teamjuchems.com
Glad to see this, I had the same feeling as Leeea about the 6800xt. It was especially weird since their latest low end cards have crazy low idle power.

IIRC AMD cards had (have?) a problem where they couldn't downclock the memory when running multimonitor.

It's easy to see. Just fire up the overlay and watch the memory clock go to the moon. I am running the latest WHQL drivers; I'll look a little later and see what is going on now.
 

Thunder 57

Platinum Member
Aug 19, 2007
2,675
3,801
136
Glad to see this, I had the same feeling as Leeea about the 6800xt. It was especially weird since their latest low end cards have crazy low idle power.

IIRC AMD cards had (have?) a problem where they couldn't downclock the memory when running multimonitor.

I thought it was fixed, but maybe not. Here is a bit about it from back when AnandTech reviewed video cards.
 

AnitaPeterson

Diamond Member
Apr 24, 2001
5,947
404
126
Ugh... those 4+3 power ports... all that energy consumption... for such a modest showing and poor stability in games?
Did Intel get hit with the Bulldozer stick or what? Was Raja Koduri the original carrier of the "mo' powah" virus?

I remember how much I hated the RX 570's power consumption, occasional glitches, and overall size. The GTX 1650 with GDDR6 that replaced it is a miniature jewel by comparison, and requires no external power.
 
  • Like
Reactions: NTMBK

blckgrffn

Diamond Member
May 1, 2003
9,128
3,069
136
www.teamjuchems.com
I am running 22.5.1 from late April... Really AMD, no WHQL drivers for nearly 6 months?

I can confirm that just with my one display at 165 Hz my memory is at ~1450 MHz to ~2000 MHz and probably soaking up the power. Even better, the fans don't spin, so the card heats way up.

Anyway, I guess I can just change from "Recommended" to "Recommended + Optional" in the AMD control panel and see what happens...

Found my culprit: running Idle Champions in the background. Otherwise it runs at 192 MHz. GPU power listed as 10 W.
 
Last edited:

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,498
20,622
146
I have been listening to Linus and crew test ARC in any game the chat wants, while getting things done. I am an hour and 45 minutes in. What a disaster so far. Linus was doing his best, but it's putting lipstick on a pig at that point in the live stream they did. Rocket League is broken. Rocket League! Unbelievable.

Most of the games have frame pacing issues or worse. I appreciate that they pointed out what I always rant about: even 1% lows sometimes don't tell you about terrible gameplay. Turning on HairWorks in Witcher 3 took away like 66% of the performance. The 3060 they are comparing it against is crushing it in GTA V. BeamNG was nerfed. Anno 1800 had issues. Linus was a try-hard at that point, trying to help out ARC; no sale, sorry. They kept slipping in jabs at AMD, which had me doing the -

[reaction GIF]


AMD is killing it as of today in bang for buck. Listening to used-car salespeople try to convince the viewer that AMD sucks, in the middle of watching ARC crap the bed, is surreal.
 

DeathReborn

Platinum Member
Oct 11, 2005
2,746
741
136
Glad to see this, I had the same feeling as Leeea about the 6800xt. It was especially weird since their latest low end cards have crazy low idle power.

IIRC AMD cards had (have?) a problem where they couldn't downclock the memory when running multimonitor.

That bug goes all the way back to GCN 1.0. A few times AMD (or at least their vocal supporters) have claimed it has been fixed, only for the bug to still be there. I still remember my old R9 290 running the VRAM at 1350 MHz stock at idle on multi-monitor, and 1500 MHz at idle when overclocked.
 
  • Like
Reactions: Leeea

PingSpike

Lifer
Feb 25, 2004
21,732
561
126
AMD is killing it as of today in bang for buck. Listening to used-car salespeople try to convince the viewer that AMD sucks, in the middle of watching ARC crap the bed, is surreal.

Yeah, the "crap on AMD to try and make ARC look good" show is beyond ridiculous at this point. AMD ain't perfect, but it's like a guy who got caught cheating on his wife pointing out that his neighbor sometimes leaves his dirty underwear on the floor. One of the most annoying things I kept reading was people saying Intel has a niche advantage because they have open-source drivers for Linux. I'm like, you know AMD has had those for 5 years or more, right? And nobody outside of Linux users even cares. (It's actually a really good thing, though, and a pretty strong advantage if you use the card on Linux.)
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Well, the A770 also has horrendous idle power consumption, so if driver fixes for this are going to materialize, it doesn't seem like it will be in the immediate future.

This was supposed to be fixed a while ago with the launch of the A380. I guess they de-prioritized it? @mikk mentioned it. TAP talked about it in one of the interviews too, I think.
 
  • Like
Reactions: Leeea

Leeea

Diamond Member
Apr 3, 2020
3,626
5,368
136
So it finally occurred to me, I can just check in the driver to see what idle wattage it is using.

For me, with 2x monitors, 6900 XT: 14 watts
(Hyper-V, LibreOffice, and Firefox running in the background)
I am feeling happy about that

IIRC AMD cards had (have?) a problem where they couldn't downclock the memory when running multimonitor.
I just realized that seems to be fixed on mine. Memory is running at 192 MHz as I write this.

Found my culprit: running Idle Champions in the background. Otherwise it runs at 192 MHz. GPU power listed as 10 W.
nice!

Also, multimonitor power consumption on Radeons still gets me. @Leeea I think running a display at "high refresh rates" still kicks the memory into high gear and you get idle usage similar to the dual monitor cost.
I have two displays attached at the moment: 1440p at 144 Hz, and an old 1024x768 60 Hz VGA monitor on a DisplayPort adapter. The DisplayPort-to-VGA adapter likely pulls 2 watts for its own function. 14 watts for those two displays,

but:
if I activate my LG C1 at 4K 120 Hz FreeSync, idle wattage jumps to 38 watts.
 
Last edited:

Tup3x

Senior member
Dec 31, 2016
965
951
136
That bug goes all the way back to GCN 1.0, a few times AMD (or at least their vocal supporters) have claimed it has been fixed only for the bug to still be there. I still remember my old R9 290 running the VRAM at 1350MHz stock on multi monitors @ idle & 1500Mhz oc'd @ idle.
Yeah, when I had an R9 290, that thing ran hot while idling. Fans turned on while browsing and so on. On top of that, it had an occasional black screen issue and other annoyances. I got rid of it rather quickly.

I have had pretty bad experiences with Radeon and ATI products. I have no fond memories of the Rage IIc. My X1800 XT had weird video decoding issues (some kind of green artifacts) and died shortly after one year. It was also loud and ran very hot. It was also the first time I had to upgrade my PSU, because the whole system froze in TES: Oblivion. Old bad experiences still haunt me.
 
  • Like
Reactions: Leeea

JustViewing

Member
Aug 17, 2022
135
232
76
From the TPU review, I can see that the A770 has real potential, particularly in raytracing. Plus, the transient spikes are not insane. I'm actually going to prefer the A770 over the 6700 XT for DX12 games. Not bad, Intel. Not bad at all.
I think that is because it should have been a 3070/6800 competitor. Now it is performing like two tiers below. And according to Raja, they are bottlenecked by driver/batch counts per frame, so at 4K this bottleneck will matter less.
 

moinmoin

Diamond Member
Jun 1, 2017
4,956
7,675
136
One of the most annoying things I kept reading was people saying Intel has a niche advantage because they have open-source drivers for Linux. I'm like, you know AMD has had those for 5 years or more, right? And nobody outside of Linux users even cares.
Steam Deck users may care. :p

Intel has been doing open-source drivers longer than AMD, but looking at how the Windows side of the drivers is unfolding (abandonment of decades of driver expertise that should already have been there for the iGPUs, using emulation for DX versions older than 12, building support from scratch game by game...) it looks like that expertise either isn't being used or isn't worth much on the whole.