Significant leaks: Intel working on true 3D packaging and Raja wants to enter dGPU "with a bang"


EXCellR8

Diamond Member
Sep 1, 2010
3,982
839
136
NVIDIA definitely needs to be knocked down a few pegs; AMD isn't going to be the one to do it as we all know already. Intel certainly has the resources and funding to take on the green team but, like so many others, I'm skeptical that they'll be able to deliver a solid product right out of the gate. You never know though.
 

moinmoin

Diamond Member
Jun 1, 2017
4,944
7,656
136
With Raja on board, Arctic Sound has been "split into two", with the second side of the coin being the gaming market. He is said to want to "enter the market with a bang".

Is this "Poor Volta" Raja that we are talking about?
That's exactly what came to my mind when reading the above quote. I sure hope Raja's bullish attitude results in a competitive product instead of a bizarre ad.
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
I don't think Intel will aim high enough for their first gaming card. I expect a low end solution, like an upgrade card to replace an APU's graphics or something like that. Probably cost like $100. I don't expect them to actually impress anyone with their big bang market entry. That's the practical side of me.
The hopeful side is very excited and I'd absolutely LOVE to have a badass, high end Intel GPU to match whatever sick new 8 core CPU I buy from them for a future gaming rig. I like matching stuff and I'd love to go Intel for both CPU and GPU. I'd use blue themed décor and skulls everywhere. Like RGB skulls and stuff. BRING IT ON INTEL! Come on Raja! Don't blow it man. Push them HARD and make it happen!
 
  • Like
Reactions: pcp7, ZGR and IEC

witeken

Diamond Member
Dec 25, 2013
3,899
193
106
I don't think Intel will aim high enough for their first gaming card. I expect a low end solution, like an upgrade card to replace an APU's graphics or something like that. Probably cost like $100. I don't expect them to actually impress anyone with their big bang market entry. That's the practical side of me.
The hopeful side is very excited and I'd absolutely LOVE to have a badass, high end Intel GPU to match whatever sick new 8 core CPU I buy from them for a future gaming rig. I like matching stuff and I'd love to go Intel for both CPU and GPU. I'd use blue themed décor and skulls everywhere. Like RGB skulls and stuff. BRING IT ON INTEL! Come on Raja! Don't blow it man. Push them HARD and make it happen!
From the press release:
Koduri will expand Intel’s leading position in integrated graphics for the PC market with high-end discrete graphics solutions for a broad range of computing segments.

@Arachnotronic many people are still in doubt!
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
^ Well, when Intel says "high-end discrete graphics solutions", I interpret that to actually mean "high end when compared only to the usual warm sick we spew on people with our weak integrated solutions". That's surely what they mean by "high end". It's only high end in a vacuum, right? As in only when compared to what Intel has in the market right now. Heck, any low end graphics card is high end compared to their integrated stuff.
So you see, this is all about reading between the lines and realizing that Intel is likely making products that Best Buy customers may consider "high end" when compared to the other word processors in the PC aisle. Sorry, but this is my honest expectation. I WANT much better. I did not choose to think this way. My opinion has been formed by external events that are out of my control, therefore my opinion is out of my control. If Intel makes a good GPU that gamers actually want, then my opinion will swiftly change, and that will also have been out of my control.
 
  • Like
Reactions: PingSpike

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
Forget gaming, there's no way Intel is going to invest in that level of driver support.

I know, it's like a HUGE operation, right? It's not just like, "Ok, let's make a high end GPU". It's a big, huge deal that requires tons of resources and talent and all sorts of insane stuff. That's why I know we shouldn't get our hopes up, because this thing is going to suck!
Also, I have to add that I don't actually have a lot of faith in Raja. He thought Vega was good and Vega was crap, so something isn't right with that. I'd expect him to make something weak and crappy and then be all like, "WOW look at this thing fly!"
 
Last edited:

Shivansps

Diamond Member
Sep 11, 2013
3,851
1,518
136
Not to sound overly critical, but I don't remember them trying. Ever. They always talk a big game, but have never actually delivered on a single GPU design worth having. I think that if they were serious and actually trying, they would have had at least one competitive product in the past 40 years.

To be fair, they absolutely dominate the entry-level market with IGP.

They did try with the first Iris Pro; they had the best-performing IGP for a while, but at a price no one sane would pay.
 

mindless1

Diamond Member
Aug 11, 2001
8,052
1,442
126
Yeah, that's the biggest issue right now with Intel's IGP gaming - the lack of driver optimization (cheats).
I was thinking of bugfixes and features, but if driver optimization gets it to an acceptable framerate it wouldn't otherwise achieve, then yeah, they ought to think about that too.
 

ZGR

Platinum Member
Oct 26, 2012
2,052
656
136
They did try with the first Iris Pro; they had the best-performing IGP for a while, but at a price no one sane would pay.

$300-$350 wasn't a bad price for the i7 at the time. The problem was severely limited quantities, which increased prices, along with poor motherboard support.

The 5775C punches well above its weight once you unlock the power limit and overclock it. The 65W cap really hurt it in benchmarks.

Then Intel took Skylake's Iris Pro and released it in even more limited quantities as the Skull Canyon NUC.

Now the only eDRAM Iris Pro CPUs are mobile.

I was super excited about what eDRAM could bring to the table, especially for games. But it seems Intel has other plans.
 
  • Like
Reactions: Drazick

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
$300-$350 wasn't a bad price for the i7 at the time. The problem was severely limited quantities, which increased prices, along with poor motherboard support.

No one used Iris Pro parts because they sucked. They had to scale the usage of Iris Pro parts down because fewer and fewer manufacturers were using them.

The original Iris Pro in Haswell chips was used in a handful of products. The review of a laptop with Iris Pro 5200 showed it had no advantage in light-usage battery life, the laptop was expensive, and compared to similar-class Nvidia GPUs it used more power under load. So who wants to use that?!? You were much better off using Maxwell GPUs. An integrated GPU that consumes more power and costs more is a terrible integrated GPU.

The Broadwell-based Iris Pro was used even less. Sure, it had the C desktop chips. But for the same price as Intel's other desktop chips, do you think they'll bundle a 1.5x die size and an extra die for eDRAM? That's part of the reason why the 5775C was a low-clocked part. Outside of that, barely anyone touched it, because it was only 20% faster than the anemic Iris Pro in Haswell.

The Skylake Iris Pro was the worst. Despite all the advancements, it was only 20-30% faster. The NUC was literally the only implementation of the Iris Pro 580, because the product sucked.

The regular non-Pro Iris parts are OK. The evidence of that is that they're being used in quite a few products.
 
Last edited:

epsilon84

Golden Member
Aug 29, 2010
1,142
927
136
Not to sound overly critical, but I don't remember them trying. Ever. They always talk a big game, but have never actually delivered on a single GPU design worth having. I think that if they were serious and actually trying, they would have had at least one competitive product in the past 40 years.

To be fair, they absolutely dominate the entry-level market with IGP.

Perhaps I'm showing my age a bit here, but I actually had an Intel i740 AGP card, one of the first AGP cards ever released. It was in my first gaming PC, and it was much cheaper than the Voodoo 2 at the time, though performance was obviously a lot lower too. It still actually played games well though, and for a first attempt it wasn't bad; they never built on that though and kind of gave up.

http://www.tomshardware.com/reviews/graphic-chips-review-april-98,64-3.html
 

Excessi0n

Member
Jul 25, 2014
140
36
101
If they're stacking logic, then they need to put something better than toothpaste under the IHS.
 

Dygaza

Member
Oct 16, 2015
176
34
101
Even if Intel were able to magically summon a fast GPU for gamers, moving to such a card would be risky for consumers, as Intel really has a horrible reputation when it comes to their drivers. And unfortunately, that reputation changes very slowly.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Even if Intel were able to magically summon a fast GPU for gamers, moving to such a card would be risky for consumers, as Intel really has a horrible reputation when it comes to their drivers. And unfortunately, that reputation changes very slowly.

High-end workstation (or server) card.

Think about an area where AMD is particularly weak that Intel could help them with.
 
Last edited:

krumme

Diamond Member
Oct 9, 2009
5,952
1,585
136
Intel needs something that uses their excess capacity. GPUs do that in spades.

New APIs require slimmer drivers and place more burden on engines.

It's the perfect time to enter the market.
 
  • Like
Reactions: Gikaseixas

oak8292

Member
Sep 14, 2016
82
67
91
Significant leaks: Intel working on true 3D packaging and Raja wants to enter dGPU "with a bang"

Next, on the packaging side, Ashraf has come out with the major scoop that Intel is working on a 3D packaging technology called Foveros, which will be able to package "smaller dies together as well as 3D stacking". It will be used post-Sapphire Rapids (so starting with 7nm Granite Rapids); EMIB will be used until SPR.

For some context as to what this significant news implies: it will be the successor of EMIB. EMIB is a 2.5D packaging technology that allows for connecting dies (sitting micrometers apart from each other) with a low-power and very high-bandwidth interconnect (potentially over 4TB/s). EMIB's claim to fame and differentiator compared to the competition is that it doesn't use a very big (expensive) interposer, nor does it use yield-destroying TSVs (through-silicon vias), but just a small embedded bridge in the package (as the name implies, obviously).

Now, I have to say 3D stacking is already done: it is used in HBM. But as everyone knows, Intel is not a DRAM manufacturer, so this implies that Intel is going to 3D stack logic dies (or 3D XPoint?). This would be a first. Everyone in the industry is interested in how well cooling of 3D-stacked logic dies will go.

The focus of this discussion is on the GPU side of things, but there are some interesting developments on the '3D' packaging side as well. AMD has effectively used 2.5D packaging for GPUs and now for EPYC. Intel claims to have the most cost-effective solution with EMIB, but it isn't really showing up in products. It is supposed to be in the Stratix 10 and maybe a Xeon? AMD is using a 'more expensive' solution for 2.5D products and realizing cost savings.

Greg Yeric posted part 1 of what he says will be a three-part series on 3DIC. Greg shows the economics around the 'chiplet' strategy of EPYC.

https://community.arm.com/arm-research/b/articles/posts/three-dimensions-in-3dic-part-1

There is a link in the Yeric article to a technical article from UCSB/AMD on yield versus chip partitioning done for EPYC.

https://seal.ece.ucsb.edu/sites/sea.../2017-iccad-stow-activepassiveinterposers.pdf
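
The gist of that yield argument can be sketched in a few lines of Python. This is just a toy Poisson defect model with illustrative numbers I picked myself, not the model or figures from the UCSB/AMD paper:

Code:
import math

# Toy Poisson yield model with made-up numbers (my assumption, not the paper's data).
D0 = 0.001                 # defect density in defects per mm^2 (assumed)

def yield_of(area_mm2):
    """Probability that a die of the given area has zero defects."""
    return math.exp(-D0 * area_mm2)

mono_area = 800.0          # one big monolithic die (illustrative)
chip_area = mono_area / 4  # four chiplets carrying the same total logic

# Silicon spent per *good* product: defective dies are discarded, so divide by yield.
mono_cost    = mono_area / yield_of(mono_area)
chiplet_cost = 4 * chip_area / yield_of(chip_area)

print(f"monolithic: {yield_of(mono_area):.0%} yield, ~{mono_cost:.0f} mm^2 of silicon per good product")
print(f"chiplets:   {yield_of(chip_area):.0%} yield each, ~{chiplet_cost:.0f} mm^2 of silicon per good product")

With these numbers the chiplet route needs roughly half the silicon per good product, before you account for duplicated I/O area and the packaging/interconnect cost, which is the kind of trade-off the paper analyzes.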
 

Dolan

Junior Member
Dec 25, 2017
14
10
51
Is anyone here optimistic about their plans? I mean, even if it will be better this time (than Larrabee), do you think they will offer it at a lower price?

I would guess that this will be an extension of EMIB: Intel is working on multiple generations of EMIB, for instance we know Falcon Mesa will use EMIB2, which will shrink the bump pitch to 35um from 55um for 2.5x the bandwidth, and Intel has previously said they were seeing as low as 10um in the labs.
For comparison: 65nm interposers have 45um bumps (these were used back in 2011). You probably forgot to mention this detail.
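
A quick sanity check on that 2.5x figure: if you assume aggregate bandwidth scales with bump density, i.e. with the inverse square of the bump pitch (my assumption, the scaling isn't spelled out in the article), the numbers line up:

Code:
# Assumption: bandwidth ~ bumps per unit area ~ 1 / pitch^2
old_pitch_um = 55
new_pitch_um = 35
print(f"{(old_pitch_um / new_pitch_um) ** 2:.2f}x")  # ~2.47x, matching the quoted 2.5x
print(f"{(old_pitch_um / 10) ** 2:.0f}x")            # ~30x if the 10um lab pitch ever ships

The same back-of-envelope applied to those 45um interposer bumps shows they are actually a bit denser than first-generation 55um EMIB, which is why the pitch roadmap matters.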
 

ksec

Senior member
Mar 5, 2010
420
117
116
Perhaps I'm showing my age a bit here, but I actually had an Intel i740 AGP card, one of the first AGP cards ever released. It was in my first gaming PC, and it was much cheaper than the Voodoo 2 at the time, though performance was obviously a lot lower too. It still actually played games well though, and for a first attempt it wasn't bad; they never built on that though and kind of gave up.

http://www.tomshardware.com/reviews/graphic-chips-review-april-98,64-3.html

It was value for money, but at the time Glide still reigned. Intel had driver issues (as did all other non-major manufacturers), and the market changed very quickly: Riva 128, TNT, TNT2, GeForce. By the time Intel got the i740 out they were already late. And then they gave up, because they were not willing to invest in driver development.
 
  • Like
Reactions: nathanddrews