Intel Chips With “Vega Inside” Coming Soon?


DeathReborn

Platinum Member
Oct 11, 2005
2,743
734
136
They have made server versions of the Minis in the past. I'd agree that it probably isn't going into an Apple product. Just that it could, is all.

I have a generic Bluetooth keyboard that is identical to that keyboard, so it's probably just a universal BT keyboard and not an Apple one.
 

StinkyPinky

Diamond Member
Jul 6, 2002
6,761
777
126
I'm pretty sure that's an Apple keyboard; it has the Apple layout, which is different from standard keyboards.
 

ksec

Senior member
Mar 5, 2010
420
117
116
Ahaha... and there is the Xeon E-21xxG, which is Coffee Lake, and I suppose the G means Radeon Vega graphics.

I have a suspicion that all the "Pro"-moniker Apple products will get Xeons. Christ, that means MBPs are going to get more expensive.

New Intel Kaby Lake-G performance numbers at GDC 2018.

Not much of a surprise. I hope AnandTech or RealWorldTech will do an article on what the benefits of the combined package are.
 

NostaSeronx

Diamond Member
Sep 18, 2011
3,683
1,218
136
Intel is planning to swap over to chiplets pretty quickly: prototype on 14nm, product on 14nm, etc.

8 * 2 * 8 EUs => 128 EUs ÷ 4 => 32 CUs (AMD equivalent) -> <200 mm²
7 * 2 * 8 EUs => 112 EUs
8 * 2 * 6 EUs => 96 EUs ÷ 4 => 24 CUs (AMD equivalent) -> <140 mm²
7 * 2 * 6 EUs => 84 EUs
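A quick sketch of that arithmetic in code, if it helps (the ÷4 EU-to-CU equivalence is just my rough rule of thumb, not an official conversion):

```python
# Gen9-style EU count: slices * sub-slices per slice * EUs per sub-slice.
# The "// 4 = AMD CU equivalent" is only a rough rule of thumb, not official.
def eu_count(slices, subslices, eus_per_subslice):
    return slices * subslices * eus_per_subslice

configs = [(8, 2, 8), (7, 2, 8), (8, 2, 6), (7, 2, 6)]

for s, ss, eu in configs:
    total = eu_count(s, ss, eu)
    print(f"{s} * {ss} * {eu} => {total} EUs ~= {total // 4} AMD-CU equivalent")
```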

UPI interconnect, with EMIB between the dGPU Gen 9.X (Gen9 with ML instructions) and HBM2 G3 (2.666 GHz).

Coffee Lake-G will definitely be dropping AMD. FYI, CFL-8C has no GPU; it's an 8+0 die.
 
Last edited:

NostaSeronx

Diamond Member
Sep 18, 2011
3,683
1,218
136
They don't have big Gen 9 GPUs
Intel does; it is called GT4, which is 3 * 3 * 8 => 72 EUs.

CHA/Coherent Home Agent for UPI, etc. (Added in the prototype and tested; it was expected to work, so the move to production should be rather fast after ISSCC: http://ieeexplore.ieee.org/document/8310172/)
3 * 3 * 8 for GT4 in old Gen9 versus 6 * 2 * 6 for GT4 in new Gen9; the switch to smaller sub-slices allows roughly 2x the clock speed: 800 MHz to 1.6 GHz, or 1.0~1.15 GHz to 2.0~2.3 GHz, etc.
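Quick sanity check on that, as a sketch (assuming throughput scales roughly as EUs × clock and ignoring bandwidth limits):

```python
# Both GT4 layouts end up at 72 EUs, so the gain here would come from clocks.
old_eus = 3 * 3 * 8   # 72 EUs, old Gen9 GT4 layout
new_eus = 6 * 2 * 6   # 72 EUs, smaller-sub-slice layout
old_ghz, new_ghz = 0.8, 1.6

speedup = (new_eus * new_ghz) / (old_eus * old_ghz)
print(f"{old_eus} EUs vs {new_eus} EUs, ~{speedup:.1f}x from the clock bump alone")
```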

The IP is finished; time to actual product is any moment after Kaby Lake-G, which Intel is definitely aiming to get rid of, since AMD gave Intel Polaris with no ML instructions, which is a step back for Intel: https://software.intel.com/en-us/ar...rning-inference-with-intel-processor-graphics
https://software.intel.com/en-us/articles/background-on-ai-and-the-move-to-the-edge
 
Last edited:

ksec

Senior member
Mar 5, 2010
420
117
116
I am not buying this.

The Gen9 72-EU part is Iris Pro Graphics 580. That is now nearly 2 years old, and it is slow, very slow. It was used in the previous Skull Canyon NUC, and even Intel has said their newer Hades Canyon is much faster.

http://gpu.userbenchmark.com/Compar...-Iris-Pro-580-Mobile-Skylake/m422266vsm132950

And this shows it is 2.5x as fast, and that is with the Iris Pro's 128 MB eDRAM advantage.

Even with your 72 EU to 128 EU scaling, there's still a gap between ~1.8x and 2.5x.
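Spelling out the scaling math, assuming naive linear scaling by EU count only:

```python
# Naive EU-count scaling vs. the roughly 2.5x gap in the benchmark link above.
eu_scaling = 128 / 72   # ~1.78x
observed = 2.5
print(f"EU count alone: {eu_scaling:.2f}x; observed: {observed}x; "
      f"the rest has to come from clocks, memory, architecture, etc.")
```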
 

NostaSeronx

Diamond Member
Sep 18, 2011
3,683
1,218
136
And this shows it is 2.5x as fast, and that is with the Iris Pro's 128 MB eDRAM advantage.

Even with your 72 EU to 128 EU scaling, there's still a gap between ~1.8x and 2.5x.
All I have to say for now is iGPU capabilities != dGPU capabilities.

- More space in a chiplet, lower cost.
- More space => binned primitive shaders, JIT in drivers (implicit); wider binned RBEs, etc., accelerated by discrete units on die, which take space. (Bigger caches, more registers, etc. can be there as well.)
- Lower cost => HBM2 can be deployed; HBM2 G3 => 8 GB @ 341 GB/s (2666 MHz), HBM2 G2 => 4/8 GB @ 307 GB/s (2400 MHz).
(HBM2 G3 costs less than HBM2 G2, and HBM2 G2 is cheaper than HBM2 G1, etc.)

128 MB of eDRAM @ 102 GB/s... games use more than that, so you fall back to ~35 GB/s DDR4, etc.
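For reference, the raw peak-bandwidth math behind those numbers (a minimal sketch assuming one 1024-bit HBM2 stack and dual-channel 64-bit DDR4-2133):

```python
# Peak bandwidth in GB/s: bus width in bytes * transfers per second.
def peak_gbps(bus_bits, mega_transfers_per_s):
    return bus_bits / 8 * mega_transfers_per_s / 1000

print(f"HBM2 G3, 2666 MT/s, 1024-bit: {peak_gbps(1024, 2666):.0f} GB/s")  # ~341
print(f"HBM2 G2, 2400 MT/s, 1024-bit: {peak_gbps(1024, 2400):.0f} GB/s")  # ~307
print(f"DDR4-2133, dual channel:      {peak_gbps(128, 2133):.0f} GB/s")   # ~34
```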

There is also the clock rate increase, etc. Gen 9 (14nm) != Gen 9++.x++ or whatever (14nm++ to 14nm# or 14nm-rust; 14nm# is 14nm++ with the Gen 1 ML design, 14nm-rust is 14nm++ with the Gen 2 ML design). ((PnR (place-and-route) design, not an AI doing everything, etc.))
 
Last edited:

jpiniero

Lifer
Oct 1, 2010
14,510
5,159
136
I would assume those Xeon E "G" parts just have the Intel iGPU enabled. Fitting the RTG GPU would be kind of hard with an LGA socket, I would think.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
For a total of 1490 USD: that's pretty funny

$1440 actually; the base is $1320.

Laptop Magazine has a short review of the XPS 15 2-in-1; they say the full review is coming later today. It's $1499 for now; in a month a $1299 configuration will be available.
 

zinfamous

No Lifer
Jul 12, 2006
110,512
29,099
146
Seems like a reasonable price? That is what ultrabooks cost. Or am I missing something that makes it funny?

Yes, it is a reasonable price for ultrabooks. I forget, but I think the issue with the Spectre was some other crappy components: a slow SSD, not enough RAM, a sub-par display... I forget, to be honest, but I do recall a lukewarm reception in the first reviews.
 

Bouowmx

Golden Member
Nov 13, 2016
1,138
550
146
The Dell page for the XPS 15 2-in-1 says it's rated for up to 15 hours of battery life. I bet a GTX 1070 system can't do that.
No, especially if, for example, the system has G-SYNC: ASUS GL502VS-WS71.

My confusion is why no Taiwanese brand has come out with the Intel Core i7-8809G in a standard-size laptop chassis at a price competitive with offerings using NVIDIA GPUs.