News Intel GPUs - Intel launches A580


KompuKare

Golden Member
Jul 28, 2009
1,015
930
136
The A770 die is 47% larger than a 3060 die and it's on a much better process. It shouldn't be surprising that it can beat a 3060 when the drivers aren't in the way.
Indeed, as pointed out many times in this thread: Intel are providing good cost/transistor even if they are losing in perf/area or perf/power.
TPU lists DG2-512 at 21.7 billion transistors and GA106 at 12 billion. With nearly twice the transistors, I would expect better performance.
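For anyone who wants to sanity-check those ratios, here is a quick back-of-envelope sketch. The transistor counts are the TPU figures quoted above; the die areas (~406 mm² for DG2-512, ~276 mm² for GA106) are my own approximate numbers, so treat them as assumptions:

```python
# Rough comparison of DG2-512 (Arc A770) vs GA106 (RTX 3060).
# Transistor counts from the post above; die areas are approximate assumptions.
dg2_512 = {"transistors_b": 21.7, "die_mm2": 406}
ga106   = {"transistors_b": 12.0, "die_mm2": 276}

area_ratio = dg2_512["die_mm2"] / ga106["die_mm2"]          # ~1.47x, the "47% larger" figure
xtor_ratio = dg2_512["transistors_b"] / ga106["transistors_b"]  # ~1.81x, "nearly twice"

# Transistor density in MTr/mm^2
density_dg2   = dg2_512["transistors_b"] * 1e3 / dg2_512["die_mm2"]
density_ga106 = ga106["transistors_b"] * 1e3 / ga106["die_mm2"]

print(f"Die area ratio:   {area_ratio:.2f}x")
print(f"Transistor ratio: {xtor_ratio:.2f}x")
print(f"Density: {density_dg2:.0f} vs {density_ga106:.0f} MTr/mm^2")
```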

Kudos to Intel, as they really have been cranking out some decent driver fixes, and not just in the top 10 most popular modern games as I had feared.
 

GunsMadeAmericaFree

Golden Member
Jan 23, 2007
1,245
290
136
The latest drivers address poor performance in some popular older titles. That will keep getting better. And I agree anything over the monitor refresh rate is unnecessary for those of us that are not the highly competitive online type.

A lot of people might not care, but we mostly play older & casual titles. It could really affect us if we installed one of the Arc A380 cards...
 

Hulk

Diamond Member
Oct 9, 1999
4,225
2,015
136
I'm not a gamer, but looking at those reviews, it seems like the 6700 XT is a much better buy for $30 more.
 

mikk

Diamond Member
May 15, 2012
4,140
2,154
136


RedGamingTech says Intel is aiming for a midrange card at 225W with Battlemage. This is more believable than the dGPU cancellation story. Raja also said he thinks 225W is the sweet spot for gamers. Of course he says this because the Arc A770 uses this amount, but it could also be a hint that Battlemage won't go into the high end. However, 225W on an upgraded architecture and TSMC 3nm/4nm/5nm or whatever they are using for Battlemage should be a lot faster than the current Arc. What I don't believe is RedGamingTech's release claim... 2023 is not a realistic target. If someone spreads this, it will be another delay festival. That's why Intel hasn't specified the year yet.
 

GunsMadeAmericaFree

Golden Member
Jan 23, 2007
1,245
290
136


RedGamingTech says Intel is aiming for a midrange card at 225W with Battlemage. This is more believable than the dGPU cancellation story. Raja also said he thinks 225W is the sweet spot for gamers. Of course he says this because the Arc A770 uses this amount, but it could also be a hint that Battlemage won't go into the high end. However, 225W on an upgraded architecture and TSMC 3nm/4nm/5nm or whatever they are using for Battlemage should be a lot faster than the current Arc. What I don't believe is RedGamingTech's release claim... 2023 is not a realistic target. If someone spreads this, it will be another delay festival. That's why Intel hasn't specified the year yet.

50 - 55 Watts is the sweet spot for this gamer. Just give me as much performance as possible for the 50 watts. It's my way of helping the environment, and also ensuring that true advancement is happening - rather than just pumping more electricity into the card.
 
  • Like
Reactions: JustViewing

Thunder 57

Platinum Member
Aug 19, 2007
2,675
3,801
136
50 - 55 Watts is the sweet spot for this gamer. Just give me as much performance as possible for the 50 watts. It's my way of helping the environment, and also ensuring that true advancement is happening - rather than just pumping more electricity into the card.

Amazing that we used to have GPUs that did not require external power. And used a single slot, like the 9700. I'm OK with using a bit more power, but not to the extreme. Intel says one 8-pin for 225W is their target, and I like that. I have never used a GPU requiring more than that, and I do not want to, considering the heat it would put out. I run A/C year round.
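For context, that 225W target lines up with the PCIe power-delivery ceilings: 75W from the x16 slot plus 150W from a single 8-pin connector. A minimal sketch of that arithmetic (the connector limits are the standard spec numbers, nothing Intel-specific):

```python
# Board-power ceiling implied by the slot plus auxiliary connectors
# (PCIe spec limits: x16 slot 75W, 6-pin 75W, 8-pin 150W).
SLOT_W = 75
CONNECTOR_W = {"6-pin": 75, "8-pin": 150}

def max_board_power(connectors):
    """Upper bound on sustained board power for a given connector loadout."""
    return SLOT_W + sum(CONNECTOR_W[c] for c in connectors)

print(max_board_power([]))                  # 75W  - slot-only cards
print(max_board_power(["8-pin"]))           # 225W - the single 8-pin target mentioned above
print(max_board_power(["6-pin", "8-pin"]))  # 300W
print(max_board_power(["8-pin", "8-pin"]))  # 375W - typical dual 8-pin high-end cards
```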
 

Mopetar

Diamond Member
Jan 31, 2011
7,837
5,992
136
Amazing that we used to have GPUs that did not require external power. And used a single slot, like the 9700. I'm OK with using a bit more power, but not to the extreme. Intel says one 8-pin for 225W is their target, and I like that. I have never used a GPU requiring more than that, and I do not want to, considering the heat it would put out. I run A/C year round.

~200W is a good spot to land in, particularly now that CPUs are growing their power use as well.

Frankly, Intel doesn't need to have a high-end GPU if they can make a competitive midrange product. That's where most consumers are shopping, and picking up some market share is more important than having some $1,500 halo product that only Intel brand loyalists would buy.
 

GunsMadeAmericaFree

Golden Member
Jan 23, 2007
1,245
290
136
Amazing that we used to have GPUs that did not require external power. And used a single slot, like the 9700. I'm OK with using a bit more power, but not to the extreme. Intel says one 8-pin for 225W is their target, and I like that. I have never used a GPU requiring more than that, and I do not want to, considering the heat it would put out. I run A/C year round.

What do you mean "used to"? I've never bought or used one that required any external power hookup.
 

Thunder 57

Platinum Member
Aug 19, 2007
2,675
3,801
136
What do you mean "used to"? I've never bought or used one that required any external power hookup.

So you have never used a video card that used more than 75W? Or closer to 50W with AGP? And almost no watts with PCI? I would say you are in the minority. As much as I like to joke about people thinking PCs are just for games, I still think most have a video card that requires extra power. If it was self-built, anyway.
 

GunsMadeAmericaFree

Golden Member
Jan 23, 2007
1,245
290
136
So you have never used a video card that used more than 75W? Or closer to 50W with AGP? And almost no watts with PCI? I would say you are in the minority. As much as I like to joke about people thinking PCs are just for games, I still think most have a video card that requires extra power. If it was self-built, anyway.
Nearly all of mine have been self-built, though a couple were just majorly upgraded. We rarely play any 3D-type games. We do lots of gaming, but it's mostly casual and 2D titles. I don't like games that have 3D movement - I much prefer side-scrolling or top-down turn-based games.

My current video cards on 3 systems use 50 watts. Before that, I used a low-profile card that drew just over 30 watts. I like performance, but I won't consider anything that makes my video card use more power than the CPU does - which is 65 watts max.

Actually, the one I'm building my newest system from has the video built into the processor, so it's 65 watts for CPU and GPU combined.
 
  • Like
Reactions: scineram

Thunder 57

Platinum Member
Aug 19, 2007
2,675
3,801
136
Nooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooo

*sob*

Seriously, when was the last time that anyone actually punished or fired Koduri? AMD never did it; he left on his own. Ouch for him.

Were you looking for this?

 

QueBert

Lifer
Jan 6, 2002
22,395
722
126
I follow technology but don't know much about the people involved. Is this Raja fellow being demoted a bad thing, and maybe potential doom for ARC? I'm still planning to buy an A770 card, but I'm wondering how this news will affect ARC in the future.

I know Intel is quoted in the Tom's Hardware link as saying they'll remain fully committed to desktop ARC GPUs. That sounds reassuring, but it also sounds like a company giving the right answer.
 

maddogmcgee

Senior member
Apr 20, 2015
384
303
136
I follow technology but don't know much about the people involved. Is this Raja fellow being demoted a bad thing, and maybe potential doom for ARC? I'm still planning to buy an A770 card, but I'm wondering how this news will affect ARC in the future.

I know Intel is quoted in the Tom's Hardware link as saying they'll remain fully committed to desktop ARC GPUs. That sounds reassuring, but it also sounds like a company giving the right answer.

Probably a good thing, but it's hard to really know without working there. People love to hate on him, which is both amusing and probably justified. Nothing he works on seems to be especially good.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
I follow technology but don't know much about the people involved. Is this Raja fellow being demoted a bad thing, and maybe potential doom for ARC? I'm still planning to buy an A770 card, but I'm wondering how this news will affect ARC in the future.

I know Intel is quoted in the Tom's Hardware link as saying they'll remain fully committed to desktop ARC GPUs. That sounds reassuring, but it also sounds like a company giving the right answer.

Raja used to work for AMD, before he left for Intel.

He is known for making some questionable comments and design decisions. But it wasn't all bad. Polaris was a great GPU. But Vega wasn't. And he was there for some of the very early design work on Navi.

I think this is basically Intel pulling on his reins a bit after a less-than-stellar launch of their desktop GPUs, which has been hurt badly by some pretty poor software. And having a hardware guy in charge of the software team was perhaps not a great idea.
 
  • Like
Reactions: Leeea

maddie

Diamond Member
Jul 18, 2010
4,740
4,674
136
Looks like Pat knows how to put someone in their right place.
Look at what they do, not at what they say.

This is what a company hedging their future decisions looks like. If they do decide to divest/close the consumer side and keep the much higher-margin server/HPC side, then this would allow the smallest disruption internally and for the clients.

They appear to want Raja to continue, as he will be overseeing the tech for both. Maybe they simply think he makes a terrible overall manager. Nothing wrong with that; it's a sort of Peter Principle minus one step. Wish more organisations did that.
 
  • Like
Reactions: igor_kavinski

Thunder 57

Platinum Member
Aug 19, 2007
2,675
3,801
136
Raja used to work for AMD, before he left for Intel.

He is known for making some questionable comments and design decisions. But it wasn't all bad. Polaris was a great GPU. But Vega wasn't. And he was there for some of the very early design work on Navi.

I think this is basically Intel pulling on his reins a bit after a less-than-stellar launch of their desktop GPUs, which has been hurt badly by some pretty poor software.

Polaris was probably the best GPU that I've had since the 8800GT. I'm not sure what went wrong with Vega.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Polaris was probably the best GPU that I've had since the 8800GT. I'm not sure what went wrong with Vega.

Yeah, I bought a Sapphire RX480 Nitro+ on the day they came out, and kept it until I got my current 5700XT. The RX480 had a fantastic price to performance ratio. Probably something we won't see again.
 

Thunder 57

Platinum Member
Aug 19, 2007
2,675
3,801
136
Yeah, I bought a Sapphire RX480 Nitro+ on the day they came out, and kept it until I got my current 5700XT. The RX480 had a fantastic price to performance ratio. Probably something we won't see again.

I got a reference 480 as soon as I could. They went out of stock quickly. Oddly enough, I got a 5700 (non-XT) soon after as well. Great value at the time.