News Intel GPUs - more reviews coming in!


jpiniero

Lifer
Oct 1, 2010
Historically they announced very low volume chips (high end, usually) before ramping volume.
Correct, but as you saw with the Lenovo leak, there's only going to be a tiny amount of Ice Lake out there. Broadwell's ramp was way higher than this.

Aurora's performance target must be at least 5 times that of Summit, which uses 14nm POWER9 and 12nm V100; it won't be achievable on 14nm unless they make it ridiculously bigger.
That's why I was suggesting that the GPU has to be on a non-Intel node. The GPU can't be on 14 nm.
 

JasonLD

Senior member
Aug 22, 2017
That's why I was suggesting that the GPU has to be on a non-Intel node. The GPU can't be on 14 nm.
If they can get a many-core Xeon on 10nm, they will be able to use their 10nm for their GPU as well. Unless you have very concrete evidence that Intel won't be able to release any large 10nm dies in 2020, it would be very safe to assume that they will release their GPU in 2020 on 10nm.
 

jpiniero

Lifer
Oct 1, 2010
All the leaks so far suggest that even client CPUs on 10nm are going to be tough to make and not that plentiful. "Big die" Xeons, etc. are just for shareholder PR and will be covered by 14nm products like Cooper Lake.
 

jpiniero

Lifer
Oct 1, 2010
There was a guy on Twitter who went to the OCP Summit and said not only were the 10nm problems fixed
I take "fixed" as "they might be able to ship some," which is a big improvement from where they were with Cannon Lake, which was completely unusable.

I don't think it's a bad thing if they fab the GPUs at TSMC, for instance. It lends some credibility to the idea that they are serious about not screwing this up.
 

JasonLD

Senior member
Aug 22, 2017
All the leaks so far suggest that even client CPUs on 10nm are going to be tough to make and not that plentiful. "Big die" Xeons, etc. are just for shareholder PR and will be covered by 14nm products like Cooper Lake.
Ice Lake Xeon and the successor to Cascade Lake-AP are slated for 2020. Their latest roadmap suggests they will be released just in time for Aurora. I know you are skeptical, but I would trust an actual planned product with a concrete timeline more than your personal assumption.
 

jpiniero

Lifer
Oct 1, 2010
Aurora isn't likely using the AP line... it seems they are using Xe dGPUs as the main focus. The CPU is more likely a Rapids on the Tinsley platform, but it could even be Cooper.
 
Mar 10, 2006
Aurora isn't likely using the AP line... it seems they are using Xe dGPUs as the main focus. The CPU is more likely a Rapids on the Tinsley platform, but it could even be Cooper.
If Intel will be able to manufacture a Rapids chip in time for the 2021 supercomputer, why would a 10nm GPU be out of the question?
 

IntelUser2000

Elite Member
Oct 14, 2003
If Intel will be able to manufacture a Rapids chip in time for the 2021 supercomputer, why would a 10nm GPU be out of the question?
Because that would require believing 10nm can work.

The old guard with the crazy CEO is gone. A mere few months can change things a lot, and it's hard to see that if you are adamant about seeing it one way. 22nm had problems, but in a short time it became good enough that they were able to mass produce at Intel scale.
 

jpiniero

Lifer
Oct 1, 2010
22nm had problems, but in a short time it became good enough that they were able to mass produce at Intel scale.
You mean 14nm... and 14nm had a very slow ramp. They were obviously able to fix it eventually, but it took several quarters.

If Intel will be able to manufacture a Rapids chip in time for the 2021 supercomputer, why would a 10nm GPU be out of the question?
The CPU tile doesn't need to be on 10nm. In fact, for Intel's sake, it had better not be.
 

NTMBK

Diamond Member
Nov 14, 2011
If it uses HBM2, it could have a very small board (like the AMD Nano cards).
 

IntelUser2000

Elite Member
Oct 14, 2003
It's merely an artist's rendition anyway.

HBM integration is still too expensive for consumers. They'll probably use GDDR6, or its variants, for most products.
 

Mopetar

Diamond Member
Jan 31, 2011
Intel could try to use HBM in order to drive further investment into production, with the long-term goal of reducing costs so that it becomes affordable for consumers. It obviously won't replace GDDR or DDR across all segments, but I can see Intel wanting to produce something that could be considered an x86 SoC, where everything is combined into a single package, which would make a compelling product.

They know that AMD wants to develop that kind of product (see that old concept where they had a server chip with Zen cores, GPU chiplets, and HBM all on a single package), so it's something that they need to test out and consider as well. Intel already has a little bit of experience with this in Hades Canyon, which included HBM for the on-package Vega GPU.

I think it's fair to say that they have some interest in HBM and that it's not impossible to imagine them using it in some products.
 

Ajay

Lifer
Jan 8, 2001
Because that would require believing 10nm can work.

The old guard with the crazy CEO is gone. Mere few months can change things a lot, and its hard to see it if you are adamant seeing it one way. 22nm had problems, but in a short time it became good enough that they were able to mass produce in Intel-scale.
So, are you saying Intel won’t have a 10nm server CPU by 2020?
 

Ajay

Lifer
Jan 8, 2001
Last I read, Intel had two fabs that could do 10nm (can't remember the source). That's down from the originally projected eight fabs that were supposed to ramp.
 

IntelUser2000

Elite Member
Oct 14, 2003
So, are you saying Intel won’t have a 10nm server CPU by 2020?
No, that's not what I said.

Last I read, Intel had two fabs that could do 10nm (can't remember the source). That's down from the originally projected eight fabs that were supposed to ramp.
Eight? Don't you mean they had 8 total? They always have the fewest at the beginning of a process ramp and bring more online as time passes.

You can see it's true for previous processes as well in this example: https://www.crn.com/news/components-peripherals/213403027/intel-fires-up-u-s-fabs-for-32nm-westmere-chips.htm
 

EXCellR8

Diamond Member
Sep 1, 2010
Did Intel ever reveal what sort of market they were aiming for with discrete GPUs? I mean, it would make sense to target the midrange in order to test the waters and mix things up a bit, but I have a feeling that a "for gaming" graphics product may be a ways off.

I've been loosely following so apologies if this has been previously discussed.
 

jpiniero

Lifer
Oct 1, 2010
Did Intel ever reveal what sort of market they were aiming for with discrete GPUs? I mean, it would make sense to target the midrange in order to test the waters and mix things up a bit, but I have a feeling that a "for gaming" graphics product may be a ways off.
Still quite hazy, although what obviously attracted Intel in the first place is the margins on high-end products. That being said, I think there was an Alienware exec who said the top product would be slower than the 2080 Ti?
 

IntelUser2000

Elite Member
Oct 14, 2003
Did Intel ever reveal what sort of market they were aiming for with discrete GPU's?
A company like Intel typically doesn't enter a new market just to test the waters. They fully intend to dominate it.

(second slide)
https://www.anandtech.com/show/13699/intel-architecture-day-2018-core-future-hybrid-x86/5

Their revenue is heading for $70 billion this year. They must believe graphics is necessary to protect the massive market they already have, or that they'll be able to take enough share to add tangibly to their revenue.

I hope their discrete graphics effort ends up as more than just a side project that gets tossed away after a few years, and that it's the result of organic growth stemming from their integrated graphics efforts.

But I'm not so sure.
 
