Intel Chips With “Vega Inside” Coming Soon?


Topweasel

Diamond Member
Oct 19, 2000
5,436
1,654
136
And hopefully, for the last time. The issues related to managing multiple GPUs still have to be dealt with, even if you put them in the same package. The interconnect may be faster, but at the logical level, it is still the same problem.

But it's not an interconnect in the way you're treating it. It's a mesh that lets the multi-die device be treated as one; aside from being multiple dies, it isn't anything like CrossFire, which is just instruction sharing.
 

NTMBK

Lifer
Nov 14, 2011
10,232
5,013
136
The Chinese rumor mill states it will be "Palo Alto" which is a...

24 CU/1536 ALU @ <1.1 GHz
w/ a single 4 GB HBM2 stack @ <1.5 GHz.

So, essentially RX 660 or RX Vega 24. ~180 mm² to 210 mm²... so 123 mm² (KBL-H) + 210 mm² (RX Vega 24) + 91.99 mm² (HBM2 die)... so about ~430 mm² to ~460 mm².
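As a sanity check, a straight sum of those die areas comes out a bit under the quoted total, so the ~430 to ~460 mm² range presumably budgets some extra substrate for spacing between the dies. A minimal sketch using the figures above (all values taken from this post, not official):

```python
# Straight sum of the die areas quoted above, in mm^2.
# Spacing between dies on the package substrate is not included.
kbl_h = 123.0                  # Kaby Lake-H CPU die
gpu_lo, gpu_hi = 180.0, 210.0  # RX Vega 24 estimate, low/high
hbm2 = 91.99                   # one HBM2 stack

low = kbl_h + gpu_lo + hbm2
high = kbl_h + gpu_hi + hbm2
print(f"{low:.2f} to {high:.2f} mm^2 of silicon")
```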

Going from the pictures in the Intel press release, it looks like the Chinese rumour mill nailed it:

Intel-8th-Gen-CPU-discrete-graphics-2.jpg


https://newsroom.intel.com/editoria...nce-cpu-discrete-graphics-sleek-thin-devices/

A single HBM2 package, and a die that is roughly the right size for a GPU of that power.
 

tamz_msc

Diamond Member
Jan 5, 2017
3,772
3,596
136
That's ~80% more GPU "teraflops" than the base PS4. So what about TDP? 25W cTDP on KBL-R + 50W GPU?
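The "~80% more" figure checks out against the usual peak-FP32 metric (ALUs × clock × 2 FLOPs per FMA). A quick sketch, using the rumored specs above and the base PS4's 1152 ALUs at 800 MHz:

```python
def peak_tflops(alus, clock_ghz):
    """Peak FP32 throughput: each ALU retires one FMA (2 FLOPs) per cycle."""
    return alus * clock_ghz * 2 / 1000

kbl_g = peak_tflops(1536, 1.1)  # rumored "Palo Alto" GPU
ps4 = peak_tflops(1152, 0.8)    # base PS4 GPU (18 CUs at 800 MHz)
print(f"{kbl_g:.2f} vs {ps4:.2f} TFLOPS: {kbl_g / ps4 - 1:.0%} more")
```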
 
  • Like
Reactions: Drazick

Glo.

Diamond Member
Apr 25, 2015
5,705
4,549
136
1536 GCN5 cores? 1.1 GHz? It will be around GTX 1060 3 GB/GTX 1060 6 GB performance...
 

Bouowmx

Golden Member
Nov 13, 2016
1,138
550
146
It's real :eek:

Safe to say this is the Intel "dGPU" NUC, though if the dedicated GPU is bigger than Polaris 11, 100 W might be cutting it close.
 

Glo.

Diamond Member
Apr 25, 2015
5,705
4,549
136
Well, roughly speaking, 1 AMD SP ≈ 1 NVIDIA CUDA core at the same clocks. I think it will land between the mobile GTX 1050 Ti and 1060.
The Radeon Pro 555, with 768 GCN cores, has exactly the same performance as the GTX 1050 Ti, which has 768 CUDA cores, when both GPUs run at the same clock speeds.

1536 GCN5 cores at 1.1 GHz with 204.8 GB/s of bandwidth (800 MHz HBM2) will land around GTX 1060 3 GB performance.
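For reference, a single HBM2 stack's bandwidth follows from its 1024-bit bus running at double data rate; a quick check:

```python
def hbm2_bandwidth_gbs(clock_mhz, bus_bits=1024):
    """Bandwidth of one HBM2 stack: bus bytes x 2 transfers/cycle (DDR) x clock."""
    return bus_bits / 8 * 2 * clock_mhz * 1e6 / 1e9

print(f"{hbm2_bandwidth_gbs(800):.1f} GB/s")  # single stack at 800 MHz
```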
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
The Radeon Pro 555, with 768 GCN cores, has exactly the same performance as the GTX 1050 Ti, which has 768 CUDA cores, when both GPUs run at the same clock speeds.

1536 GCN5 cores at 1.1 GHz with 204.8 GB/s of bandwidth (800 MHz HBM2) will land around GTX 1060 3 GB performance.
It will probably be power limited, though.
 

tamz_msc

Diamond Member
Jan 5, 2017
3,772
3,596
136
The Radeon Pro 555, with 768 GCN cores, has exactly the same performance as the GTX 1050 Ti, which has 768 CUDA cores, when both GPUs run at the same clock speeds.

1536 GCN5 cores at 1.1 GHz with 204.8 GB/s of bandwidth (800 MHz HBM2) will land around GTX 1060 3 GB performance.
Sure they do, going by the "teraflops" number from the frequency × core count × 2 metric, but those numbers are bogus.
 
  • Like
Reactions: Drazick

Glo.

Diamond Member
Apr 25, 2015
5,705
4,549
136
Sure they do, going by the "teraflops" number from the frequency × core count × 2 metric, but those numbers are bogus.
No. Both GPUs have EXACTLY the same performance level in games. The Radeon Pro 555, with 768 GCN cores at around 900 MHz, and the GTX 1050 Ti, with 768 CUDA cores at 900 MHz, will perform exactly the same in games.

My company and I tested this months ago, with the Radeon Pro 455 from the 2016 15-inch MacBook Pro (which is the same as the Radeon Pro 555) and a GTX 1050 Ti downclocked to that level. The GTX 1050 Ti was faster in Overwatch at 1080p, but only by 3 FPS. In the rest of the games the two were within 1 FPS of each other, one way or the other.

Core for core, clock for clock, there is no difference between the vendors. Nvidia just runs higher clock speeds.
 
  • Like
Reactions: Drazick and prtskg

tamz_msc

Diamond Member
Jan 5, 2017
3,772
3,596
136
No. Both GPUs have EXACTLY the same performance level in games. The Radeon Pro 555, with 768 GCN cores at around 900 MHz, and the GTX 1050 Ti, with 768 CUDA cores at 900 MHz, will perform exactly the same in games.

My company and I tested this months ago, with the Radeon Pro 455 from the 2016 15-inch MacBook Pro (which is the same as the Radeon Pro 555) and a GTX 1050 Ti downclocked to that level. The GTX 1050 Ti was faster in Overwatch at 1080p, but only by 3 FPS. In the rest of the games the two were within 1 FPS of each other, one way or the other.

Core for core, clock for clock, there is no difference between the vendors. Nvidia just runs higher clock speeds.
Good to know, in that case this looks more promising.
 
  • Like
Reactions: Drazick

Jan Olšan

Senior member
Jan 12, 2017
278
297
136
Have you never seen PR departments in action? Stuff that is not ready to be publicly announced is always denied. The first line only talks about licensing and the second one is broad enough to not rule out selling GPU chips to Intel.

In any case, "debunked" has somewhat stronger connotations than "their PR contact didn't admit it exists".
Well, I have no interest in you believing this, so think what you want. I'll just say that this poster that turned out to be a fluke was not the actual evidence for Kaby Lake-G in the first place, so claiming "victory over the rumours" based on it is shortsighted. It was just the impulse for this thread, nothing more, nothing less.
There are three pieces of actual info that say the plan does exist (or used to at least). Out of those, two are public and not "debunked": the BenchLife leak, and the NUC roadmap that matches it.

Anyway, I'm all for dropping this topic and waiting for confirmation next year. In the meantime, something to think about:
Kaby Lake-H has a package size of 42 mm × 28 mm, which is the same package used for Skylake-H CPUs with the GT4 GPU and eDRAM on package (see ARK). Now, according to BL, the purported Kaby Lake-G chip comes in a package of 58.5 mm × 31.0 mm. And it is not because of the PCH moving onto the substrate, because there is a note saying "2-chip platform type" (which means it is like the current Kaby-H with an external PCH).
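Comparing the two quoted footprints, the purported Kaby Lake-G package adds roughly 640 mm² of substrate over Kaby Lake-H, comfortably enough room for a ~200 mm² GPU die plus an HBM2 stack; a quick check:

```python
# Package footprints quoted above, in mm.
kbl_h_area = 42.0 * 28.0  # Kaby Lake-H package
kbl_g_area = 58.5 * 31.0  # purported Kaby Lake-G package (per BenchLife)
print(f"extra substrate: {kbl_g_area - kbl_h_area:.1f} mm^2 "
      f"({kbl_g_area / kbl_h_area:.2f}x the footprint)")
```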


So, why the big package all of a sudden? Let's brainstorm: what do you guys think? Should be fun :)

FCWPARx.jpg


For the "denials" see my previous opinion.

Heh, feels nice to be right on the internet. But I heard a confirmation of this from somebody a month or two ago, so I was cheating a bit. I tried to warn you though :)
 

FIVR

Diamond Member
Jun 1, 2016
3,753
911
106
Most people on this forum were wrong about this because it is an obviously boneheaded move for AMD to make. This product will vastly outperform Raven Ridge and make it look like a budget solution. It will give Intel a huge amount of credibility in the notebook market. The only possible good things for AMD are more revenue and a reduction in Nvidia's notebook GPU market share.


It is an incredibly short-sighted decision from AMD management that betrays their lack of competence. It is a huge signal to all OEMs and customers that AMD is a "budget" CPU solution that should be avoided if you want real performance.


I never thought that AMD would be foolish enough to do this, or that Intel would be clever enough to trick AMD management into making such a deal. I was very, very wrong.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
Most people on this forum were wrong about this because it is an obviously boneheaded move for AMD to make. This product will vastly outperform Raven Ridge and make it look like a budget solution. It will give Intel a huge amount of credibility in the notebook market. The only possible good things for AMD are more revenue and a reduction in Nvidia's notebook GPU market share.


It is an incredibly short-sighted decision from AMD management that betrays their lack of competence. It is a huge signal to all OEMs and customers that AMD is a "budget" CPU solution that should be avoided if you want real performance.


I never thought that AMD would be foolish enough to do this, or that Intel would be clever enough to trick AMD management into making such a deal. I was very, very wrong.

You really tend to overreact in both directions. One day Raven Ridge is going to drive Intel out of business and the next, an expensive niche part with AMD inside is going to destroy AMD.

This really doesn't compete with integrated GPU/APU parts. It's a dGPU competitor, and it will be even more expensive than a conventional dGPU solution.

This niche product and Raven Ridge are not remotely competitors.
 

NTMBK

Lifer
Nov 14, 2011
10,232
5,013
136
Most people on this forum were wrong about this because it is an obviously boneheaded move for AMD to make. This product will vastly outperform Raven Ridge and make it look like a budget solution. It will give Intel a huge amount of credibility in the notebook market. The only possible good things for AMD are more revenue and a reduction in Nvidia's notebook GPU market share.


It is an incredibly short-sighted decision from AMD management that betrays their lack of competence. It is a huge signal to all OEMs and customers that AMD is a "budget" CPU solution that should be avoided if you want real performance.


I never thought that AMD would be foolish enough to do this, or that Intel would be clever enough to trick AMD management into making such a deal. I was very, very wrong.

Raven Ridge is aimed at 15 W SoCs, a market segment this will never compete in. Consider this: Apple could have a lineup with Raven Ridge in the 13" MBP and still have this part reserved for the 15" MBP.