Significant leaks: Intel working on true 3D packaging and Raja wants to enter dGPU "with a bang"


EXCellR8

Diamond Member
Sep 1, 2010
3,206
141
126
#27
NVIDIA definitely needs to be knocked down a few pegs; AMD isn't going to be the one to do it as we all know already. Intel certainly has the resources and funding to take on the green team but, like so many others, I'm skeptical that they'll be able to deliver a solid product right out of the gate. You never know though.
 

moinmoin

Senior member
Jun 1, 2017
918
403
106
#28
With Raja on board, Arctic Sound has been "split into two", with the second side of the coin being the gaming market. He is said to want to "enter the market with a bang".
Is this "Poor Volta" Raja that we are talking about?
That's exactly what came to my mind when reading the above quote. I sure hope Raja's bullish attitude results in a competitive product instead of a bizarre ad.
 

moonbogg

Diamond Member
Jan 8, 2011
9,759
64
126
#29
I don't think Intel will aim high enough for their first gaming card. I expect a low end solution, like an upgrade card to replace an APU's graphics or something like that. Probably cost like $100. I don't expect them to actually impress anyone with their big bang market entry. That's the practical side of me.
The hopeful side is very excited and I'd absolutely LOVE to have a badass, high end Intel GPU to match whatever sick new 8 core CPU I buy from them for a future gaming rig. I like matching stuff and I'd love to go Intel for both CPU and GPU. I'd use blue themed décor and skulls everywhere. Like RGB skulls and stuff. BRING IT ON INTEL! Come on Raja! Don't blow it man. Push them HARD and make it happen!
 

witeken

Diamond Member
Dec 25, 2013
3,868
11
106
#30
I don't think Intel will aim high enough for their first gaming card. I expect a low end solution, like an upgrade card to replace an APU's graphics or something like that. Probably cost like $100. I don't expect them to actually impress anyone with their big bang market entry. That's the practical side of me.
The hopeful side is very excited and I'd absolutely LOVE to have a badass, high end Intel GPU to match whatever sick new 8 core CPU I buy from them for a future gaming rig. I like matching stuff and I'd love to go Intel for both CPU and GPU. I'd use blue themed décor and skulls everywhere. Like RGB skulls and stuff. BRING IT ON INTEL! Come on Raja! Don't blow it man. Push them HARD and make it happen!
From the press release:
Koduri will expand Intel’s leading position in integrated graphics for the PC market with high-end discrete graphics solutions for a broad range of computing segments.
@Arachnotronic many people are still in doubt!
 

moonbogg

Diamond Member
Jan 8, 2011
9,759
64
126
#31
^ Well, when Intel says, "high-end discrete graphics solutions", I interpret that to actually mean, "high end when compared only to the usual warm sick we spew on people with our weak integrated solutions". That's surely what they mean by "high end". It's only high end in a vacuum, right? As in, only when compared to what Intel has in the market right now. Heck, any low end graphics card is high end compared to their integrated stuff.
So you see, this is all about reading between the lines and realizing that Intel is likely making products that Best Buy customers may consider "high end" when compared to the other word processors in the PC aisle. Sorry, but this is my honest expectation. I WANT much better. I did not choose to think this way. My opinion has been formed by external events that are out of my control, therefore, my opinion is out of my control. If Intel makes a good GPU that gamers actually want, then my opinion will swiftly change, and that will also have been out of my control.
 

mindless1

Diamond Member
Aug 11, 2001
4,645
108
126
#32
The hopeful side is very excited and I'd absolutely LOVE to have a badass, high end Intel GPU to match whatever sick new 8 core CPU I buy from them for a future gaming rig.
Forget gaming, there's no way Intel is going to invest in that level of driver support.
 

jpiniero

Diamond Member
Oct 1, 2010
6,577
317
126
#33
Forget gaming, there's no way Intel is going to invest in that level of driver support.
Yeah, that's the biggest issue right now with Intel's IGP gaming - the lack of driver optimization (cheats).
 

itsmydamnation

Golden Member
Feb 6, 2011
1,862
255
136
#34
Yeah, that's the biggest issue right now with Intel's IGP gaming - the lack of driver optimization (cheats).
But if we build a sea of x86 cores we wouldn't need driv......... oh wait..........
 

moonbogg

Diamond Member
Jan 8, 2011
9,759
64
126
#35
Forget gaming, there's no way Intel is going to invest in that level of driver support.
I know, it's like a HUGE operation, right? It's not just like, "Ok, let's make a high end GPU". It's a big, huge deal that requires tons of resources and talent and all sorts of insane stuff. That's why I know we shouldn't get our hopes up, because this thing is going to suck!
Also, I have to add that I don't actually have a lot of faith in Raja. He thought Vega was good and Vega was crap, so something isn't right there. I'd expect him to make something weak and crappy and then be all, "WOW, look at this thing fly!"
 

Shivansps

Platinum Member
Sep 11, 2013
2,587
324
126
#36
Not to sound overly critical, but I don't remember them trying. Ever. They always talk a big game, but have never actually delivered on a single GPU design worth having. I think that if they were serious and actually trying, they would have had at least one competitive product in the past 40 years.

To be fair, they absolutely dominate the entry-level market with IGP.
They did try with the first Iris Pro; they had the best performing IGP for a while, but at a price no one sane would pay.
 

mindless1

Diamond Member
Aug 11, 2001
4,645
108
126
#37
Yeah, that's the biggest issue right now with Intel's IGP gaming - the lack of driver optimization (cheats).
I was thinking bugfixes and features, but if driver optimization gets it to an acceptable framerate it wouldn't otherwise achieve, then yeah, they ought to think about that too.
 

ZGR

Golden Member
Oct 26, 2012
1,828
43
126
#38
They did try with the first Iris Pro; they had the best performing IGP for a while, but at a price no one sane would pay.
$300-$350 wasn't a bad price for the i7 at the time. The problem was severely limited quantities, which increased prices, along with poor motherboard support.

The 5775C punches well above its weight once you unlock the power limit and overclock it. The 65W cap really hurt it in benchmarks.

Then Intel took Skylake Skull Canyon and released it in even more limited quantities as a NUC.

Now the only eDRAM Iris Pro CPUs are mobile.

I was super excited about what eDRAM could bring to the table, especially for games. But it seems Intel has other plans.
 

IntelUser2000

Elite Member
Oct 14, 2003
6,257
357
126
#39
$300-$350 wasn't a bad price for the i7 at the time. The problem was severely limited quantities, which increased prices, along with poor motherboard support.
No one used Iris Pro parts because they sucked. They had to scale the usage of Iris Pro parts down because fewer and fewer manufacturers were using them.

The original Iris Pro in Haswell chips was used in a handful of products. The review of a laptop with Iris Pro 5200 showed it had no advantage in light-usage battery life, the laptop was expensive, and compared to similar-class Nvidia GPUs it used more power under load. So who wants to use that?!? You were much better off using Maxwell GPUs. An integrated GPU that consumes more power and costs more is a terrible integrated GPU.

The Broadwell-based Iris Pro was used even less. Sure, it had the C desktop chips. But for the same price as Intel's other desktop chips, do you think they'll bundle 1.5x the die size plus an extra die for eDRAM? That's part of the reason why the 5775C was a low-clocked part. Outside of that, barely anyone touched it, because it was only 20% faster than the anemic Iris Pro in Haswell.

The Skylake Iris Pro was the worst. Despite all the advancements, it was only 20-30% faster. The NUC was literally the only implementation of the Iris Pro 580, because the product sucked.

The regular non-Pro Iris parts are OK. The evidence of that is that they're used in quite a few products.
 

epsilon84

Senior member
Aug 29, 2010
978
158
136
#40
Not to sound overly critical, but I don't remember them trying. Ever. They always talk a big game, but have never actually delivered on a single GPU design worth having. I think that if they were serious and actually trying, they would have had at least one competitive product in the past 40 years.

To be fair, they absolutely dominate the entry-level market with IGP.
Perhaps I'm showing my age a bit here, but I actually had an Intel i740 AGP card, one of the first AGP cards ever released. It was my first gaming PC, and it was much cheaper than the Voodoo 2 at the time, though performance was obviously a lot lower too. It still actually played games well, though, and for a first attempt it wasn't bad. They never built on it, though, and kind of gave up.

http://www.tomshardware.com/reviews/graphic-chips-review-april-98,64-3.html
 
Jul 25, 2014
140
0
101
#41
If they're stacking logic, then they need to put something better than toothpaste under the IHS.
 
Oct 16, 2015
176
0
71
#44
Even if Intel were able to magically summon a fast GPU for gamers, moving to such a card would be risky for consumers, as Intel really has a horrible reputation when it comes to their drivers. And unfortunately that reputation changes very slowly.
 
Mar 27, 2009
12,968
36
106
#45
Even if Intel were able to magically summon a fast GPU for gamers, moving to such a card would be risky for consumers, as Intel really has a horrible reputation when it comes to their drivers. And unfortunately that reputation changes very slowly.
High end Workstation (or Server) card.

Think about an area where AMD is particularly weak that Intel could go after.
 

IRobot23

Senior member
Jul 3, 2017
601
18
76
#46
LOL. I'll believe Intel taking the "Gamer market by storm", when I actually see it. I'm not giving them any benefit of the doubt in this area, since they've tried MULTIPLE times, and FAILED.
The gaming market is big.
 

krumme

Diamond Member
Oct 9, 2009
5,786
172
136
#47
Intel needs something that uses their excess capacity. GPUs do that in spades.

New APIs require slimmer drivers and place more of the burden on engines.

It's a perfect time to enter the market.
 
Sep 14, 2016
48
0
51
#48
Significant leaks: Intel working on true 3D packaging and Raja wants to enter dGPU "with a bang"

Next, on the packaging side, Ashraf has come with the major scoop that Intel is working on a 3D packaging technology called Foveros, which will be able to package "smaller dies together as well as 3D stacking". It will be used post-Sapphire Rapids (so starting with 7nm Granite Rapids); EMIB will be used until SPR.

For some context as to what this significant news implies: it will be the successor of EMIB. EMIB is a 2.5D packaging technology that allows for connecting dies (placed micrometers apart from each other) with a low-power and very high-bandwidth interconnect (potentially over 4TB/s). EMIB's claim to fame, and its differentiator compared to the competition, is that it doesn't use a very big (expensive) interposer, nor does it use yield-destroying TSVs (through-silicon vias), but just a small embedded bridge in the package (as the name obviously implies).

Now, I have to say 3D stacking is already done: it is used in HBM. But as everyone knows Intel is not a DRAM manufacturer, so this implies that Intel is going to 3D stack logic dies (or 3D XPoint?). This would be a first. Everyone in the industry is interested in how well cooling of 3D stacked logic dies will go.
The focus of this discussion is on the GPU side of things, but there are some interesting developments on the '3D' packaging side as well. AMD has effectively used 2.5D packaging for GPUs and now for EPYC. Intel claims to have the most cost-effective solution with EMIB, but it isn't really showing up in products. It is supposed to be in the Stratix 10 and maybe a Xeon? AMD is using a 'more expensive' solution for 2.5D products and realizing cost savings.

Greg Yeric posted part 1 of what he says will be a three-part series on 3DIC. Greg shows the economics around the 'chiplet' strategy of EPYC.

https://community.arm.com/arm-research/b/articles/posts/three-dimensions-in-3dic-part-1

There is a link in the Yeric article to a technical article on yield versus chip partitioning done by AMD for EPYC from UCSB/AMD.

https://seal.ece.ucsb.edu/sites/sea.../2017-iccad-stow-activepassiveinterposers.pdf
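The yield argument behind those chiplet-economics links can be sketched with a toy Poisson yield model (the defect density here is an illustrative value, not AMD's actual figure):

```python
import math

def die_yield(area_cm2, defect_density=0.2):
    """Poisson yield model: fraction of good dies at a given die area,
    with defect_density in defects per cm^2 (illustrative value)."""
    return math.exp(-area_cm2 * defect_density)

# One big 2.0 cm^2 monolithic die vs. a 0.5 cm^2 chiplet. Because good
# chiplets can be picked from anywhere on the wafer and combined at
# packaging time, the fraction of usable silicon is just the per-die
# yield, so the smaller die wins.
print(die_yield(2.0))  # ~0.67
print(die_yield(0.5))  # ~0.90
```

That gap is the core of the chiplet pitch: the same total silicon area yields noticeably more sellable parts when cut into smaller dies, at the cost of the 2.5D/3D packaging discussed above.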
 

Dolan

Junior Member
Dec 25, 2017
14
7
41
#49
Is anyone here optimistic about their plans? I mean, even if it's better this time (Larrabee), do you think they will offer it at a lower price?

I would guess that this will be an extension of EMIB: Intel is working on multiple generations of EMIB, for instance we know Falcon Mesa will use EMIB2, which will shrink the bump pitch to 35um from 55um for 2.5x the bandwidth, and Intel has previously said they were seeing as low as 10um in the labs.
For comparison: 65nm interposers have 45um bumps (these were used back in 2011). You probably forgot to mention this detail.
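Those pitch figures line up with simple geometry: bump density, and hence aggregate bandwidth at a fixed per-bump data rate, scales as the inverse square of the pitch. A quick back-of-the-envelope check using the pitch values quoted in the posts above:

```python
def density_scaling(old_pitch_um, new_pitch_um):
    """Connections per unit area scale as 1/pitch^2, so shrinking the
    bump pitch multiplies achievable density by (old/new)^2."""
    return (old_pitch_um / new_pitch_um) ** 2

print(density_scaling(55, 35))  # ~2.47, matching the quoted ~2.5x for EMIB2
print(density_scaling(55, 10))  # ~30x if the 10um lab pitch ever ships
```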
 

ksec

Senior member
Mar 5, 2010
353
6
91
#50
Perhaps I'm showing my age a bit here, but I actually had an Intel i740 AGP card, one of the first AGP cards ever released. It was my first gaming PC, and it was much cheaper than the Voodoo 2 at the time, though performance was obviously a lot lower too. It still actually played games well though, and for a first attempt it wasn't bad, they never built on that though and kind of gave up.

http://www.tomshardware.com/reviews/graphic-chips-review-april-98,64-3.html
It was value for money, but at the time Glide still reigned. Intel had driver issues (as with all other non-major manufacturers). And the market changed very quickly: Riva 128, TNT, TNT2, GeForce. By the time Intel got their i740 out they were already late. And then they gave up, because they were not willing to invest in driver development.
 

