
News: Intel to develop discrete GPUs

Page 26

NTMBK

Diamond Member
Nov 14, 2011
8,809
1,855
136
Do you realize that it's a socketed chip? Yes, the HPC version might come in sockets!

Though it's not relevant to graphics, especially on client.
Huh, interesting! Should certainly make it easier for server vendors to build custom chassis for it.

What makes you say that? The gold "this way up" tab on the card?
 

IntelUser2000

Elite Member
Oct 14, 2003
7,095
1,628
136
Huh, interesting! Should certainly make it easier for server vendors to build custom chassis for it.

What makes you say that? The gold "this way up" tab on the card?
Yeah, and somebody mentioned that. There's no reason to develop a BGA socket like that.

Earlier tweets also showed the picture of the second package upside down. It was LGA.
 

MrTeal

Platinum Member
Dec 7, 2003
2,837
404
126
Yeah, and somebody mentioned that. There's no reason to develop a BGA socket like that.

Earlier tweets also showed the picture of the second package upside down. It was LGA.
Even BGAs have an orientation indication, so it's not impossible that they'd use an arrow on a BGA.
[attached image]
That definitely looks like a socketed chip though.
 

IntelUser2000

Elite Member
Oct 14, 2003
7,095
1,628
136
@MrTeal That's true. But their BGA server chip looks like this:
[attached image]

(Xeon Platinum 9200 is BGA)

The Xe looks like the second one. Not only that, the earlier leaked picture was LGA. If no one had said anything, I would have confused the smallest one with a future desktop chip.


Of course it could all be for testing purposes and end up being BGA anyway.
 

mikk

Platinum Member
May 15, 2012
2,836
671
136
This is a good example of the ongoing Gen12 driver work:


This avoids some performance regressions on Gen12 platforms caused by SIMD32 fragment shaders reported in titles like Dota2, TF2, Xonotic, and GFXBench5 Car Chase and Aztec Ruins.
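The trade-off behind that patch can be sketched with a toy cost model (all numbers and function names here are invented for illustration; Mesa's actual compiler measures real register pressure and spill counts per shader): wider SIMD amortizes instruction-issue cost, but register-heavy fragment shaders start spilling at SIMD32, and the spill traffic can more than cancel the win, hence the regressions.

```python
# Toy model of a SIMD-width heuristic for fragment shaders.
# Hypothetical numbers -- purely to illustrate why always forcing
# SIMD32 can regress some titles.

def cycles_per_pixel(simd_width, spills):
    base = 100 / simd_width          # wider SIMD amortizes issue cost
    spill_penalty = spills * 4       # each spill adds memory traffic
    return base + spill_penalty

def pick_simd_width(spills_at_16, spills_at_32):
    """Choose the width with the lower modeled cost per pixel."""
    c16 = cycles_per_pixel(16, spills_at_16)
    c32 = cycles_per_pixel(32, spills_at_32)
    return 32 if c32 < c16 else 16

# A simple shader: no spills either way, so SIMD32 wins.
print(pick_simd_width(spills_at_16=0, spills_at_32=0))   # 32
# A register-heavy shader: SIMD32 spills, so SIMD16 wins.
print(pick_simd_width(spills_at_16=0, spills_at_32=2))   # 16
```

The point of the driver work is exactly this kind of per-shader decision rather than a blanket SIMD32 policy.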
 

mikk

Platinum Member
May 15, 2012
2,836
671
136
But what about Arctic Sound? It was supposed to be on Intel's 10nm process in 2021, so this version might come as planned. As for DG2 and the 7nm versions, they'd have to switch to either TSMC or Samsung. If it's TSMC it could end up better in the end; I mean, they could use a state-of-the-art process node. Why do you think it doesn't bode well for Intel's graphics line?
 

NTMBK

Diamond Member
Nov 14, 2011
8,809
1,855
136
Why? It might save their dGPU effort entirely. They'll actually have a working node, unlike the CPU design teams.
I see a couple of reasons.

Firstly, I don't think Intel can really compete just on the strength of their design. They're up against AMD and NVidia, who have been refining their discrete GPUs for decades, and have deep experience, connections with game developers, and highly optimized drivers. If Intel had a process advantage, that could cancel out their difficulties... but not on the same node.

Secondly, I believe that that dGPU project was conceived as a way to increase Intel's volume, fill their fabs, and make further process development more viable. Much like Intel's attempts at mobile SoCs, modems, AI, and more, they wanted to branch out and find areas outside of servers and PCs to fill the fabs. If they are just going to competitors instead (and helping fund the R&D of new processes for their rivals!), what is the long term justification? Why keep sinking resources into competing in a market which isn't your area of expertise?
 

DrMrLordX

Lifer
Apr 27, 2000
16,371
5,282
136
Why keep sinking resources into competing in a market which isn't your area of expertise?
1). Because Raja (heh)
2). If they can score net positive revenue from the effort, it helps the company in the short term
3). Intel needs an HPC product to fill the gap that their Xeons, the dead Phi lineup, and FPGAs can't

Nothing that Intel has worked on up to this point has produced a major HPC product outside of their own Xeons. Phi found its way into some supercomputers, but Phi is dead.

Ponte Vecchio was supposed to be Intel's shot at entering the GPGPU market which has been essentially closed to them. They had a chance to disrupt that market by producing purpose-built accelerators to supplant nVidia's and AMD's GPGPU products, but that never happened. For now, the HPC market is still essentially dominated by dGPU tech repurposed for compute (and given enough time, eventually nVidia and AMD will fully split research efforts on dGPUs and compute cards into separate product lines). If someone (Fujitsu?) is going to change that, it might happen, but it does not look like anything Intel has produced to-date will fit the bill.

There was more to Raja's tenure at Intel than just filling out fab capacity and producing pipecleaner products. As it stands, Intel doesn't have any spare 10nm capacity (apparently) thanks to Tiger Lake, and with 7nm being such a mess, there's no way Raja is going to waste his time throwing designs at a broken process.
 

jpiniero

Diamond Member
Oct 1, 2010
8,289
1,355
126
Firstly, I don't think Intel can really compete just on the strength of their design. They're up against AMD and NVidia, who have been refining their discrete GPUs for decades, and have deep experience, connections with game developers, and highly optimized drivers. If Intel had a process advantage, that could cancel out their difficulties... but not on the same node.
Yep, I was just thinking about it. Especially with AMD getting their act together, Intel has no shot in dGPUs. It'd be an easy thing to shut down to cut costs. You'd still need to develop the IGP, but that can be done at a much reduced scale.

The GPGPU products can stick around but they need to be on a competitive node. Now it could be on the chopping block if they lose Aurora.
 

IntelUser2000

Elite Member
Oct 14, 2003
7,095
1,628
136
The GPGPU products can stick around but they need to be on a competitive node. Now it could be on the chopping block if they lose Aurora.
Intel are talking about Tiger Lake (and its Xe GPU) this week. Looking forward to hearing about the architecture!
I think they might lose Ponte Vecchio for Aurora. This is very likely what the six-month delay is about; it's the only timeframe that makes sense. Remember, Aurora was already delayed because Xeon Phi died thanks to 10nm. I think they lost a lot of goodwill there already, and another six months might well be enough to consider AMD/Nvidia instead.

The rest of the stack with Sapphire Rapids and 3rd Gen Optane persistent memory looks okay.

It doesn't bode well for their dGPUs. Even the rumors are not too positive on the only known dGPU, the DG1. In contrast, Tiger Lake's variant is seen in a much better light. So it probably doesn't scale well, and at the high end that means it loses perf/watt to competitors.

Again, as I mentioned, they can make decent iGPUs, but they've never shown that the architecture can scale efficiently! That's why all the Iris Pros failed.
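A rough way to see that scaling concern (made-up exponent and constants, purely to illustrate the perf/watt argument, not measured Xe figures): if performance grows sublinearly with EU count while power grows roughly linearly, perf/watt gets worse the further up you scale the design.

```python
# Illustrative scaling model: performance grows sublinearly with
# execution-unit (EU) count while power grows linearly, so
# perf/watt degrades at the high end. All constants are invented.

def perf(eus, alpha=0.8):
    return eus ** alpha                      # diminishing returns from more EUs

def power(eus, watts_per_eu=0.3, base_watts=10):
    return base_watts + watts_per_eu * eus   # power tracks EU count

def perf_per_watt(eus):
    return perf(eus) / power(eus)

# Efficiency drops as the hypothetical design is scaled up.
for eus in (96, 256, 512):
    print(eus, round(perf_per_watt(eus), 3))
```

Under a model like this, an architecture that looks fine at iGPU sizes (≈96 EUs) can still lose perf/watt badly at dGPU sizes, which is the shape of the Iris Pro complaint above.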
 

gorobei

Diamond Member
Jan 7, 2007
3,066
216
106
Moore's Law Is Dead put out his two-year breakdown on graphics chips, with some bits on multi-chip GPUs and Intel's infighting (Murthy was autocorrected to Murphy). Xe DG2/3 may be OK, but way too late if RDNA3/Hopper is already out.
[attached image]
 

beginner99

Diamond Member
Jun 2, 2009
4,597
958
136
Moore's Law Is Dead put out his two-year breakdown on graphics chips, with some bits on multi-chip GPUs and Intel's infighting.
It really surprises me that Intel even hired Raja. It was clear his focus at AMD was playing games and not actually making good GPUs. He seems to be repeating the same marketing blunders, like the "poor Volta" thing, while focusing on playing games. However, it does seem the Murthy guy really was a huge problem, and most likely the reason Keller left.
 

FaaR

Golden Member
Dec 28, 2007
1,057
407
136
Moore's Law Is Dead put out his two-year breakdown on graphics chips.
Oh, giant-head-pencil-neck guy who makes up weird crap out of thin air. Or well, not thin air; the directions fans are turned... lol

I'm not putting any stock in anything that guy is saying. He seems like a bigheaded (lol) guy who's in love with the sound of his own voice. If he's right about something, I'll chalk it up to sheer coincidence.
 
