News Intel GPUs - Intel launches A580


IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
We'll have to see on the node, but wouldn't using Samsung 7 make it easier to migrate to Intel 7 later, since both use EUV?

Even between joint-venture foundries (like the Samsung/GF/IBM alliance), you can't do a straight port. Between Intel and anyone else it will be way harder still.

Pretty much the only time you'll ever see a main product line from Intel moving to a foundry node is if they give up on developing their own entirely. I'm not sure why you think Xe, which builds on a decade of development of an in-house architecture, can simply be moved to a foundry as a transitional thing.

Yea, outside of startups, no CPU/GPU is a truly ground-up chip. Every new chip builds upon foundations laid by the previous chips.
 

jpiniero

Lifer
Oct 1, 2010
14,509
5,159
136
Pretty much the only time you'll ever see a main product line from Intel moving to a foundry node is if they give up on developing their own entirely. I'm not sure why you think Xe, which builds on a decade of development of an in-house architecture, can simply be moved to a foundry as a transitional thing.

Didn't say it would be easy... but if they actually intend to sell GPUs, it can't be on Intel 10 nm, and if they intend to be competitive, Intel 14 isn't going to cut it.
 

Despoiler

Golden Member
Nov 10, 2007
1,966
770
136
Didn't say it would be easy... but if they actually intend to sell GPUs, it can't be on Intel 10 nm, and if they intend to be competitive, Intel 14 isn't going to cut it.

I saw a rumor that they were going to fab GPUs @ TSMC. It would make sense if they don't have a workable in-house process. It might also make sense if they're throwing as much money as they can at the problem and buying up fab space to try to crowd out AMD and Nvidia.
 

NTMBK

Lifer
Nov 14, 2011
10,208
4,940
136
Considering Tiger Lake has an Xe graphics engine and that's based on 10 nm... I'm pretty sure the scaled-up version will be too.

I thought that there was more than one Xe architecture? Perhaps the "client optimized" IGP will be on Intel, while the data-center optimized discrete GPU doesn't necessarily have to be.
 

jpiniero

Lifer
Oct 1, 2010
14,509
5,159
136
I should point out that the Rocket Lake rumor suggests there will be a 14 nm IGP chiplet along with a 10 nm version, although it's possible that the 14 nm one is just Gen 9.5.

I guess that would open up the possibility that the first dGPUs are actually on Intel 14, but it would be tough to be competitive on that node.
 
Mar 11, 2004
23,031
5,495
146
I should point out that the Rocket Lake rumor suggests there will be a 14 nm IGP chiplet along with a 10 nm version, although it's possible that the 14 nm one is just Gen 9.5.

I guess that would open up the possibility that the first dGPUs are actually on Intel 14, but it would be tough to be competitive on that node.

I don't think it'd be that tough. I'd guess Intel's 14 nm is better than TSMC's 12 nm, and it's not like Turing is garbage. Honestly, if Intel hits Pascal levels of efficiency, makes the chip large enough to offer good performance, and prices it solidly (say a 1080 Ti-performing chip for $300-400), with maybe some interesting features (a cutting-edge video processing block, maybe some ray tracing capability, or the ability to drive more or higher-res displays, say 8+ 1080p displays off a single card or multiple 8K displays), it could be a decent alternative for quite a few people.
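Quick back-of-the-envelope on that multi-display claim (a minimal sketch; raw active-pixel rates only, ignoring blanking intervals, bit depth, and compression like DSC):

Code:
# Raw pixel throughput: 8x 1080p60 vs a single 8K60 stream.
def pixel_rate(width, height, hz, streams=1):
    """Active pixels per second across `streams` identical displays."""
    return width * height * hz * streams

eight_1080p = pixel_rate(1920, 1080, 60, streams=8)  # ~1.0 Gpx/s
one_8k = pixel_rate(7680, 4320, 60)                  # ~2.0 Gpx/s

print(f"8x 1080p60: {eight_1080p / 1e9:.2f} Gpx/s")
print(f"1x 8K60:    {one_8k / 1e9:.2f} Gpx/s")

So a single 8K60 stream is roughly double the raw pixel rate of eight 1080p60 displays; "multiple 8K displays" is the harder feature by a wide margin.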

I wouldn't be surprised at all if Intel is very aggressive on pricing when they first come out, just to try and gain market share, while being very strict about market segmentation (pro vs. consumer, tiers of chips; actually, I could see them offering efficiency-focused parts that can't overclock and then enthusiast ones, much like they do with CPUs). I also expect they'll push form factors (they already have the NUCs, and a leg up in laptops and probably all-in-one PCs).

Something that will be really interesting and a potential measuring stick is if Apple brings out any Intel CPU+GPU devices.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
After seeing the announcement for the Navi release, I can see why Koduri bailed for Intel. Clearly something is very, very wrong in AMD's GPU organization, since their CPU team is executing flawlessly. I hope Koduri left to escape the malaise that results in a 2016-era GPU in 2019, and didn't bring those problems with him to Intel. As far as I'm concerned, Intel is now the only game in town to give nVidia a run for their money.
 
  • Like
Reactions: DooKey

beginner99

Diamond Member
Jun 2, 2009
5,208
1,580
136
After seeing the announcement for the Navi release, I can see why Koduri bailed for Intel. Clearly something is very, very wrong in AMD's GPU organization, since their CPU team is executing flawlessly. I hope Koduri left to escape the malaise that results in a 2016-era GPU in 2019, and didn't bring those problems with him to Intel. As far as I'm concerned, Intel is now the only game in town to give nVidia a run for their money.

Well, he was the boss of RTG, so any malaise existing there is ultimately his fault.
 
  • Like
Reactions: Tlh97 and ksec

sirmo

Golden Member
Oct 10, 2011
1,012
384
136
After seeing the announcement for the Navi release, I can see why Koduri bailed for Intel. Clearly something is very, very wrong in AMD's GPU organization, since their CPU team is executing flawlessly.

I got just the opposite impression from the presentation. I mean, it remains to be seen, but if RDNA is truly a graphics-optimized ISA, an arch without the unnecessary compute features GCN carries that don't get leveraged in most games, AMD could very well appease gamers with it.

A while back I saw an AMD patent that hinted at the return of some VLIW instructions, which could mean some really cool possibilities.
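For anyone unfamiliar, the VLIW idea is that the compiler packs independent operations into one wide instruction word and the hardware just issues the bundle, instead of a scheduler finding the parallelism at runtime. A toy sketch of the packing (purely illustrative; not AMD's patent, GCN, or any real ISA):

Code:
# Toy VLIW2 "compiler": pack adjacent independent ops into 2-wide bundles.
# Each op is (opcode, dest, src); ops that share registers can't co-issue.
ops = [("add", "r0", "r1"), ("mul", "r2", "r3"),
       ("add", "r4", "r0"), ("mul", "r5", "r2")]

def independent(a, b):
    """A pair can co-issue if neither writes a register the other touches."""
    return a[1] not in b[1:] and b[1] not in a[1:]

bundles, i = [], 0
while i < len(ops):
    if i + 1 < len(ops) and independent(ops[i], ops[i + 1]):
        bundles.append((ops[i], ops[i + 1]))  # both issue this cycle
        i += 2
    else:
        bundles.append((ops[i], None))        # second slot wasted (a nop)
        i += 1

print(f"{len(ops)} ops in {len(bundles)} issue slots")  # 4 ops in 2 slots

The upside is simpler, cheaper scheduling hardware; the classic downside is that when the compiler can't fill the slots they're just wasted, which is part of why AMD dropped VLIW (TeraScale) for GCN in the first place.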

This is exactly what AMD needed to do: address their efficiency deficit in graphics workloads. And by the looks of it, they did just that.

Don't read too much into the fact that the 5700 is only targeting ~2070 performance levels; the chip is relatively small (252 mm²).
 
  • Like
Reactions: Tlh97 and ksec

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
After seeing the announcement for the Navi release, I can see why Koduri bailed for Intel. Clearly something is very, very wrong in AMD's GPU organization, since their CPU team is executing flawlessly. I hope Koduri left to escape the malaise that results in a 2016-era GPU in 2019, and didn't bring those problems with him to Intel. As far as I'm concerned, Intel is now the only game in town to give nVidia a run for their money.

Huh??

What mid-range GPU in 2016 had RTX 2070 performance? nVidia's mid-range card at that time was the GTX 1060, and it can't even come close to a 2070. AMD's roadmap has shown that their 'big chip' release will be next year. Navi is the replacement for Polaris.

AMD getting the mid-range card out first is by far the more important release than the big chip. The VAST majority of GPUs sold are mid-range cards. The 2080 Ti is a niche product that sells in significantly lower volumes than the lower-end cards.

Intel would be smart to target this same area.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Huh??

What mid-range GPU in 2016 had RTX 2070 performance? nVidia's mid-range card at that time was the GTX 1060, and it can't even come close to a 2070. AMD's roadmap has shown that their 'big chip' release will be next year. Navi is the replacement for Polaris.

AMD getting the mid-range card out first is by far the more important release than the big chip. The VAST majority of GPUs sold are mid-range cards. The 2080 Ti is a niche product that sells in significantly lower volumes than the lower-end cards.

Intel would be smart to target this same area.

I never said mid-range. In 2016 you could buy a 1080 and get about the same performance for about the same price as either the 2070 or the new top-end Navi cards, meaning neither does anything to move the performance and price curve forward, meaning they're both hot garbage. Any other conclusion is nitpicking or denial.

Almost no meaningful improvement over 3 years = fail
 

Dayman1225

Golden Member
Aug 14, 2017
1,152
973
146
Intel has hired the chief architect of Xbox SoCs, John Sell. They have also hired Manisha Pandya from Apple.

Article said:
Intel has hired John Sell, the chief architect of next-generation console chips at Microsoft. Sell worked on the chips for the Xbox One, Xbox One X, and the upcoming Project Scarlett.

Sell changed his LinkedIn profile to a new job at Intel, and the company confirmed that it has hired him.

Article said:
Additionally, Manisha Pandya is joining Intel after 10 years at Apple, where she led multiple hardware technology teams. These teams included those responsible for platform power delivery hardware for all Apple products, several sensor technologies, battery charging, analog IPs, and custom silicon. She will also be on Raja’s staff, helping with our overall power and performance methodology strategy and working on key analog and power IP for discrete GPUs.
 

DrMrLordX

Lifer
Apr 27, 2000
21,582
10,785
136
@Dayman1225

Hmmm, interesting . . . that's some real humble pie Intel is eating. First they hire a guy responsible for integrating AMD's semi-custom stuff into the Xbox, and then they hire Apple's power delivery expert.
 

itsmydamnation

Platinum Member
Feb 6, 2011
2,743
3,073
136
I never said mid-range. In 2016 you could buy a 1080 and get about the same performance for about the same price as either the 2070 or the new top-end Navi cards, meaning neither does anything to move the performance and price curve forward, meaning they're both hot garbage. Any other conclusion is nitpicking or denial.

Almost no meaningful improvement over 3 years = fail

This is really poor logic. AMD could sell this card for $300-ish and likely make more margin than the RX 480 did.

That's the problem here: this is a mid-range card with high-end pricing. As an actual SoC it looks fine.
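Rough napkin math on the margin point, using the standard dies-per-wafer approximation. The wafer prices here are placeholders for illustration, not real TSMC/GloFo quotes:

Code:
import math

def dies_per_wafer(die_area_mm2, wafer_diam_mm=300):
    """Standard approximation; ignores yield and scribe lines."""
    d = wafer_diam_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

navi10 = dies_per_wafer(252)    # Navi 10 die size quoted upthread
polaris = dies_per_wafer(232)   # Polaris 10 (RX 480) is ~232 mm^2

# Placeholder wafer costs, assuming a 7 nm wafer runs ~2x a 14 nm one.
for name, dies, wafer in [("Navi 10 @ 7nm", navi10, 10_000),
                          ("Polaris 10 @ 14nm", polaris, 5_000)]:
    print(f"{name}: {dies} dies/wafer, ~${wafer / dies:.0f} per die")

Even if the die cost is roughly twice Polaris's, a $300+ price on a 2070-class part leaves a lot more room than the RX 480's $199-239 did.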
 
  • Like
Reactions: Tlh97

ondma

Platinum Member
Mar 18, 2018
2,718
1,278
136
Huh??

What mid-range GPU in 2016 had RTX 2070 performance? nVidia's mid-range card at that time was the GTX 1060, and it can't even come close to a 2070. AMD's roadmap has shown that their 'big chip' release will be next year. Navi is the replacement for Polaris.

AMD getting the mid-range card out first is by far the more important release than the big chip. The VAST majority of GPUs sold are mid-range cards. The 2080 Ti is a niche product that sells in significantly lower volumes than the lower-end cards.

Intel would be smart to target this same area.

Yea, we have heard this excuse for the lack of a competitive top-end card ever since the 1080 came out, and all AMD had to answer with was mid-range, inefficient cards.
 

ondma

Platinum Member
Mar 18, 2018
2,718
1,278
136
@Dayman1225

Hmmm, interesting . . . that's some real humble pie Intel is eating. First they hire a guy responsible for integrating AMD's semi-custom stuff into the Xbox, and then they hire Apple's power delivery expert.

Didn't hear it being called humble pie when AMD hired Keller.
 

DrMrLordX

Lifer
Apr 27, 2000
21,582
10,785
136
Didn't hear it being called humble pie when AMD hired Keller.

You didn't?

By the time AMD turned to Keller, it was obvious to everyone but a few die-hards that AMD couldn't design CPUs worth using (with the possible exception of the cat cores . . . barely).
 

maddie

Diamond Member
Jul 18, 2010
4,722
4,626
136
You didn't?

By the time AMD turned to Keller, it was obvious to everyone but a few die-hards that AMD couldn't design CPUs worth using (with the possible exception of the cat cores . . . barely).

Poor Mike Clark and his team. Forever to be overlooked?

As an aside, Raja seems very extravagant. Anyone else wonder about the politics now at play inside Intel's GPU division?