Significant leaks: Intel working on true 3D packaging and Raja wants to enter dGPU "with a bang"


ksec

Senior member
Mar 5, 2010
420
117
116
Ashraf Eassa, long-time Intel analyst at The Motley Fool (https://twitter.com/TMFChipFool), has built a reputation as one of the most trustworthy and consistent leakers of insider information from Intel, such as code names.

And you need to mention code names ONLY. He has also been consistently wrong about every Apple-related Intel leak: wrong on node, wrong on yield, wrong on his 10nm prediction. I presume he does have a few leads, but that is about it. And if you look back at Intel's execution over the past 3-4 years, I bet none of what he leaks will arrive on time.
 

positivedoppler

Golden Member
Apr 30, 2012
1,103
171
106
If Intel is serious about a gaming GPU, I expect them to top Vega or Navi, since Raja should have enough inside info.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
If they release a GPU in 2020, I would guess it would be as fast as today's GTX 1070. In 2020 that would be the lower end.
 
  • Like
Reactions: moonbogg

epsilon84

Golden Member
Aug 29, 2010
1,142
927
136
If they release a GPU in 2020, I would guess it would be as fast as today's GTX 1070. In 2020 that would be the lower end.

As long as it's priced accordingly, I wouldn't mind a third player to break up the current duopoly. Of course, Intel's track record WRT GPUs (or iGPUs) isn't exactly stellar. Then again, they never recruited anyone of the calibre of Raja, though how much difference one person can make to a historically mediocre graphics division remains to be seen.
 

Ventanni

Golden Member
Jul 25, 2011
1,432
142
106
I think Intel could create a reasonably competitive product given what we've seen in the progression of their iGPUs over the years, but it's a whole new ball game to compete with Nvidia and even AMD in the high-end market. At the end of the day though, I still see it as this:

1. Intel has the resources, but lacks the IP and expertise to develop a dGPU to compete with Nvidia in future developing markets (mobile, AI, datacenter, etc).
2. Nvidia has the expertise and reasonable resources, but lacks the IP to develop a CPU to compete against Intel in any fashion.
3. AMD has the expertise and the IP, but lacks the resources to effectively compete with both at the same time in a meaningful way.

Strangely, the only company that has the expertise, IP, and resources to compete in all markets is Apple. They just don't want to.
 

Rifter

Lifer
Oct 9, 1999
11,522
751
126
No way this isn't an April Fools'. Intel releasing a high-end gaming GPU? Yeah right, when pigs fly.

Even if by some miracle they released the hardware, there's no way they would devote the resources to driver development to keep it going; they never have in the past. Intel GPU drivers are so far behind Nvidia's/AMD's that they aren't even in the same race.

This is just PR BS IMO.
 

DeathReborn

Platinum Member
Oct 11, 2005
2,746
740
136
Intel can release all the GPUs it likes; until they follow through on drivers and support, they will fall as flat as their past efforts. Interested to see where they take their 3D stacking, though.
 
  • Like
Reactions: Gideon and ksec

Genx87

Lifer
Apr 8, 2002
41,095
513
126
I have heard this story before. Intel turned one into their integrated graphics; the other turned into Phi. I expect similar results once Intel weighs competing with Nvidia in a shrinking market against selling these into data centers for big profits. Some of this will of course filter down into their integrated stuff. When all is said and done, I wouldn't be surprised if Raja takes another sabbatical and then leaves Intel.
 

Shivansps

Diamond Member
Sep 11, 2013
3,851
1,518
136
$300-$350 wasn't a bad price for the i7 at the time. The problem was severely limited quantities, which drove prices up, along with poor motherboard support.

The 5775C punches well above its weight once the power limit is unlocked and it's overclocked. The 65W cap really hurt it in benchmarks.

Then Intel took Skylake Skull Canyon and released it in even more limited quantities as a NUC.

Now the only eDRAM Iris Pro CPUs are mobile.

I was super excited about what eDRAM could bring to the table, especially for games. But it seems Intel has other plans.

Who wants to pay $350 for an IGP that performs like a $90 dGPU? If Intel had released a sub-$150 i3 with Iris Pro, it would have been a real success; there is a BIG market for well-performing IGPs in the $50 to $150 range. But no, a $350+ i5/i7, and in limited quantities.
 
Last edited:

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
The focus of this discussion is on the GPU side of things, but there are some interesting developments on the '3D' packaging side as well. AMD has effectively used 2.5D packaging for GPUs and now for EPYC.

EPYC isn't 2.5D.
 

oak8292

Member
Sep 14, 2016
82
67
91
EPYC isn't 2.5D.

Are 'chiplets' in a multi-chip package 2.5D?

From the article I linked:

"AMD did not talk the next greatest FinFET or nanowire transistors. Instead, they discussed how they utilized 3D-SIC to get the power efficiency gains in the GPU above. Then, they discussed how they split up their 32-core EPYC server-class chip to four 8-core “chiplets” (Figure 6). Even with a 10% overhead to add I/O to the 4 chiplets, AMD was able to reduce their overall cost, owing to basic die yield—chip yield goes down very quickly with chip area (somewhere between a square law and an exponential), so if you can test for known good chiplets, you can come out way ahead in simple cost for chips this size. In AMD’s case below, they achieved a 41% cost savings figure."
 
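To put the yield arithmetic in that quote in concrete terms, here is a rough back-of-the-envelope sketch in Python (a minimal Poisson yield model; the defect density and die areas below are assumed round numbers for illustration, not AMD's published figures):

import math

# Purely illustrative assumptions, not AMD's actual data
defect_density = 0.1       # defects per cm^2 (assumed)
chiplet_area = 2.1         # cm^2, hypothetical 8-core chiplet incl. ~10% I/O overhead
monolithic_area = 7.8      # cm^2, hypothetical 32-core single die

def yield_fraction(area_cm2, d0=defect_density):
    # Simple Poisson model: fraction of defect-free dies, Y = exp(-A * D0)
    return math.exp(-area_cm2 * d0)

# Relative silicon cost per good product; wafer cost cancels as a common factor
monolithic_cost = monolithic_area / yield_fraction(monolithic_area)
chiplet_cost = 4 * chiplet_area / yield_fraction(chiplet_area)

print(f"relative monolithic cost: {monolithic_cost:.1f}")
print(f"relative 4-chiplet cost:  {chiplet_cost:.1f}")
print(f"chiplet saving: {1 - chiplet_cost / monolithic_cost:.0%}")

With these assumed numbers the saving comes out around 40%, in the same ballpark as the ~41% figure AMD quoted; the exact result obviously depends on the defect density and die areas you plug in.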

beginner99

Diamond Member
Jun 2, 2009
5,210
1,580
136
Then again, they never recruited anyone of the calibre of Raja, though how much difference one person can make to a historically mediocre graphics division remains to be seen.

Let's be honest: AMD GPUs started going downhill the moment he joined in 2013, the year AMD released its last truly competitive GPU, the R9 290. His track record looks pretty weak. I always wonder why such managers get hired elsewhere so easily...
 

Atari2600

Golden Member
Nov 22, 2016
1,409
1,655
136
Look at the clapping seals thinking Intel can deliver on this. They are just making themselves look absurdly stupid.

If Intel cannot make drivers that work for their integrated graphics at reduced resolutions, with reduced latency requirements, reduced visual effects and the reduced CPU load that results, just how on earth do you expect them to produce drivers that can deliver AAA games at sufficient frame rates and percentile rates, and do so with all effects properly displayed and without clogging up the CPU?

It wouldn't matter if they had God himself (or flying spaghetti monster if that's your disposition) coding - they'd need to be recruiting a team of over 100 software (driver) engineers, minimum, to deliver on this. Such a recruitment drive would be clear and obvious within the industry, yet we don't hear a peep.

Silly little clapping seals.

Comments underlined by moderator.
This is not allowed in our forums, and
is unnecessary in civilized technical
discussion/debate.


AT Mod Usandthem
 
Last edited by a moderator:

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
Look at the clapping seals thinking Intel can deliver on this. They are just making themselves look absurdly stupid.

If Intel cannot make drivers that work for their integrated graphics at reduced resolutions, with reduced latency requirements, reduced visual effects and the reduced CPU load that results, just how on earth do you expect them to produce drivers that can deliver AAA games at sufficient frame rates and percentile rates, and do so with all effects properly displayed and without clogging up the CPU?

It wouldn't matter if they had God himself (or flying spaghetti monster if that's your disposition) coding - they'd need to be recruiting a team of over 100 software (driver) engineers, minimum, to deliver on this. Such a recruitment drive would be clear and obvious within the industry, yet we don't hear a peep.

Silly little clapping seals.
I can't see anything in the thread that rates such an attack on board members.
 

krumme

Diamond Member
Oct 9, 2009
5,952
1,585
136
Look at the clapping seals thinking Intel can deliver on this. They are just making themselves look absurdly stupid.

If Intel cannot make drivers that work for their integrated graphics at reduced resolutions, with reduced latency requirements, reduced visual effects and the reduced CPU load that results, just how on earth do you expect them to produce drivers that can deliver AAA games at sufficient frame rates and percentile rates, and do so with all effects properly displayed and without clogging up the CPU?

It wouldn't matter if they had God himself (or flying spaghetti monster if that's your disposition) coding - they'd need to be recruiting a team of over 100 software (driver) engineers, minimum, to deliver on this. Such a recruitment drive would be clear and obvious within the industry, yet we don't hear a peep.

Silly little clapping seals.
Let's not overdo this. They have lousy but mostly working drivers now, and DX12 is coming fast. They need the GPU for professional solutions anyway, as that's an emerging market. It's coming. It's not like they need to be creative or highly innovative; the goals are firm. It just needs funding and, above all, capacity. They've got both in spades.

Personally I don't care if it's big and inefficient for what it does, and five years out. More competition is always better and good for consumers.

Heck, we had eight years on this forum with a lot of guys who seemed to enjoy the non-competitive situation on the CPU front, all that one-sided, stupid AMD bashing. One even claimed competition was bad for consumers.
 
  • Like
Reactions: KompuKare

Atari2600

Golden Member
Nov 22, 2016
1,409
1,655
136
Let's not overdo this. They have lousy but mostly working drivers now, and DX12 is coming fast.

They have lousy, inefficient drivers on slow, inefficient hardware.

When you speed up the latter, you must speed up the former - otherwise you'll be seriously deficient relative to the competition. Furthermore, they'll need to properly display more complex effects and do so in a timely manner.

AMD and Nvidia have big driver teams constantly moving the bar forward, and they are building on decades of knowledge and infrastructure. You simply cannot bypass that by hiring a few PowerPoint rangers at the top.
 

coercitiv

Diamond Member
Jan 24, 2014
6,187
11,855
136
Heck, we had eight years on this forum with a lot of guys who seemed to enjoy the non-competitive situation on the CPU front, all that one-sided, stupid AMD bashing. One even claimed competition was bad for consumers.
But that's just it: many voices that are now (overly) enthusiastic about Intel's GPU efforts were quite pessimistic about AMD's Zen potential and cited the old mantra that past performance is the best indicator of future performance. It's pure irony, especially considering AMD had no choice but to laser-focus on their CPU effort, while Intel has quite a few (golden) eggs to juggle.

I look at Intel and see past performance issues when facing new markets/segments, starting with their GPU history and continuing with their latest attempt at securing a position in the mobile market. They literally ended up throwing money at it in the hope of making it stick. And now we're supposed to believe they will manage to focus their attention on building a successful discrete gaming GPU while their main revenue line is under direct pressure from both AMD and their ARM friends. Meanwhile Nvidia has figured out that the big money is in compute, yet Intel wants a piece of the gaming pie.

Wouldn't it make a lot more sense for Intel to start building a compute-oriented GPU and only use it in consumer products where integration would offer them the upper hand? Why fight your competitor's strengths when you can build on yours?

We keep discussing Raja in the context of his gaming GPU expertise, but what about the professional products that were brought to market under his tenure?
 
  • Like
Reactions: krumme

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Are 'chiplets' in a multi chip package 2.5D?

Their GPUs are 2.5D but their CPUs are not, including EPYC.

It's called "2.5D" because the silicon interposer used to pair HBM with a GPU provides very high-density connections, allowing for the massive bandwidth HBM offers, and it is a precursor to true 3D packaging, where logic dies are stacked on top of each other. You can also see that the HBM stacks themselves are "3D", because their dies are stacked on top of each other.
 

oak8292

Member
Sep 14, 2016
82
67
91
Their GPUs are 2.5D but their CPUs are not, including EPYC.

It's called "2.5D" because the silicon interposer used to pair HBM with a GPU provides very high-density connections, allowing for the massive bandwidth HBM offers, and it is a precursor to true 3D packaging, where logic dies are stacked on top of each other. You can also see that the HBM stacks themselves are "3D", because their dies are stacked on top of each other.

I am not sure if you are being very specific about 2.5D or if you are simply not reading the quotes and links I have posted. AMD is not using a monolithic die for EPYC. They are using 8-core 'chiplets' in a multi-chip package. Here is an image from an AMD slide deck:

[Image: AMD IEDM 2017 slide showing the EPYC package built from four 8-core chiplet dies]
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
I am not sure if you are being very specific about 2.5D or if you are simply not reading the quotes and links I have posted. AMD is not using a monolithic die for EPYC. They are using 8-core 'chiplets' in a multi-chip package. Here is an image from an AMD slide deck:

No, I don't think you have researched the topic enough. I'll leave it for you to figure it out.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
The Iris Pro 6200 handily beat AMD's fastest APU in gaming benches.

How bad could the drivers have been?
 

oak8292

Member
Sep 14, 2016
82
67
91
I'll leave it for you to figure it out.

Seriously. I have provided you with a link to a technical article co-authored by AMD, a slide from an AMD slide deck, and a blog post by an industry insider on AMD's IEDM 2017 presentation on 3D-IC, and you are going to suggest that I need to research this more?

Here is a link to an article on Anandtech;

"On the package are four silicon dies, each one containing the same 8-core silicon we saw in the AMD Ryzen processors."

https://www.anandtech.com/show/1155...w-7000-series-cpus-launched-and-epyc-analysis

That is rich. I think I have it figured out. What do you think 2.5D means?

P.S. For people uninterested in this back-and-forth, there is another good article from Greg Yeric at ARM, installment 2 on where 3D is going.

https://community.arm.com/arm-research/b/articles/posts/three-dimensions-in-3dic-part-ii
 
Last edited:

beginner99

Diamond Member
Jun 2, 2009
5,210
1,580
136
Seriously. I have provided you with a link to a technical article co-authored by AMD, a slide from an AMD slide deck, and a blog post by an industry insider on AMD's IEDM 2017 presentation on 3D-IC, and you are going to suggest that I need to research this more?

Here is a link to an article on Anandtech;

"On the package are four silicon dies, each one containing the same 8-core silicon we saw in the AMD Ryzen processors."

https://www.anandtech.com/show/1155...w-7000-series-cpus-launched-and-epyc-analysis

That is rich. I think I have it figured out. What do you think 2.5D means?

His point is correct: EPYC is classic MCM, or in other words 2D. There is no interposer or anything else special. And I don't mean this in a negative way; in fact, this is good, because it works and is cheap.
 

oak8292

Member
Sep 14, 2016
82
67
91
His point is correct and it is that Epyc is classic MCM or said otherwise 2D. There is no interposer or anything else special. And I don't mean this in a negative way in fast this is good because it works and is cheap.

Thanks, beginner99. The AMD paper compared the performance of EPYC with an active or a passive interposer. I guess they are leaving themselves room to improve EPYC with the same die on an interposer in a future iteration.

Why are we discussing AMD/EPYC in this thread?

Only in terms of where the industry is with respect to 3D and how Intel will transition to 'chiplets'.