Intel Chips With “Vega Inside” Coming Soon?


PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
Not that I buy into the rumor, but technically they only denied licensing deals, which, as was previously mentioned, would not prevent AMD from selling Intel finished GPU dies that Intel then integrates into its product. I doubt this, but there could be some reasons:

- pipe-cleaner for EMIB

It's a low-volume product, and not a huge issue if it fails. It's a way to gain actual experience with EMIB in a real product on the market. That has value too, although of course the chips sold will be of tiny quantity and won't recoup the R&D costs directly.

You scored closest.

Though I think it is more placeholder than pipe-cleaner.

This kind of part is a long-term strategic direction Intel is taking, but they don't have a dGPU part of their own, YET.

I think the first discrete GPU part from Intel will be aimed at taking the Radeon's place.
 

stockolicious

Member
Jun 5, 2017
80
59
61
You scored closest.

Though I think it is more placeholder than pipe-cleaner.

This kind of part is a long-term strategic direction Intel is taking, but they don't have a dGPU part of their own, YET.

I think the first discrete GPU part from Intel will be aimed at taking the Radeon's place.

"I think the first discrete GPU part from Intel will be aimed at taking the Radeons place"

how old will we be when that happens? it took Raja and AMD 4 years or more to create Vega and the barrier to entry in dGPU space is as difficult as I have ever seen in tech. IMHO Raja is there to help develop semi custom products eventually in the data center where intel can add AMD's GPU as well as their own technology for a differentiated product. I think building a dGPU from the ground up is vapor.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
"I think the first discrete GPU part from Intel will be aimed at taking the Radeons place"

how old will we be when that happens? it took Raja and AMD 4 years or more to create Vega and the barrier to entry in dGPU space is as difficult as I have ever seen in tech. IMHO Raja is there to help develop semi custom products eventually in the data center where intel can add AMD's GPU as well as their own technology for a differentiated product. I think building a dGPU from the ground up is vapor.

You are assuming Intel has been doing nothing until Raja shows up, which is a faulty assumption IMO.

Also, the first-generation dGPU from Intel just needs to be a scaled-up IGP part with more units.

With 8 times the EUs of the IGP and connected to HBM memory, it should make a reasonable entry-level dGPU capable of replacing the Radeon chip they are buying from AMD for the integration project. So what if it is a bigger chip? Since it is made in-house, they don't have to share the money with AMD.
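
For a rough sense of what that scale-up would mean on paper, here is a back-of-envelope sketch. Every figure in it is an assumption for illustration (a Gen9 GT2-class IGP with 24 EUs, 16 FP32 FLOPs per EU per clock, ~1.1 GHz sustained), and a real chip would lose something to bandwidth and power limits:

```python
# Back-of-envelope check of the "8x the EUs" idea. All figures are
# assumptions for illustration, not Intel specs for any real part.
IGP_EUS = 24                 # assumed Gen9 GT2-class IGP
FLOPS_PER_EU_PER_CLK = 16    # assumed: 2 FPUs x SIMD-4 x 2 (FMA)
CLOCK_GHZ = 1.1              # assumed sustained GPU clock

for scale in (1, 8):
    eus = IGP_EUS * scale
    tflops = eus * FLOPS_PER_EU_PER_CLK * CLOCK_GHZ / 1000
    print(f"{eus:3d} EUs -> ~{tflops:.1f} TFLOPS FP32")

# Prints ~0.4 TFLOPS for the IGP and ~3.4 TFLOPS for the 8x part,
# which on paper lands in entry-level dGPU territory.
```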
 

stockolicious

Member
Jun 5, 2017
80
59
61
You are assuming Intel has been doing nothing until Raja shows up, which is a faulty assumption IMO.

Also, the first-generation dGPU from Intel just needs to be a scaled-up IGP part with more units.

With 8 times the EUs of the IGP and connected to HBM memory, it should make a reasonable entry-level dGPU capable of replacing the Radeon chip they are buying from AMD for the integration project. So what if it is a bigger chip? Since it is made in-house, they don't have to share the money with AMD.

"You are assuming Intel has been doing nothing "

No, I'm aware they have been trying to make a dGPU for many years now; Apple tried too. Without a deal where they are protected by AMD/Nvidia patents, I just don't buy that they can make one competitive enough, and they would not need Raja for that either.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
"You are assuming Intel has been doing nothing "

No, I'm aware they have been trying to make a dGPU for many years now; Apple tried too. Without a deal where they are protected by AMD/Nvidia patents, I just don't buy that they can make one competitive enough, and they would not need Raja for that either.

The first generation doesn't have to match GPU cards; it just has to be able to replace the die they get from AMD, which AMD will be taking a profit on. So Intel can afford to build their own larger die and brute-force their way to a competitive product, because it will still be cheaper than buying from AMD.
 

firewolfsm

Golden Member
Oct 16, 2005
1,848
29
91
It's not die size but power efficiency that matters most. Simply scaling up the IGP, which isn't nearly as simple as you think, wouldn't give a very efficient chip, and wouldn't fit in the required TDP.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
It's not die size but power efficiency that matters most. Simply scaling up the IGP, which isn't nearly as simple as you think, wouldn't give a very efficient chip, and wouldn't fit in the required TDP.

You have power specifications based on Intel's IGP? It seems to me that low power use is job one for an IGP, and laptops with a dGPU usually let you shut it off and run on the IGP to save power.
 

stockolicious

Member
Jun 5, 2017
80
59
61
The first generation doesn't have to match GPU cards; it just has to be able to replace the die they get from AMD, which AMD will be taking a profit on. So Intel can afford to build their own larger die and brute-force their way to a competitive product, because it will still be cheaper than buying from AMD.

You're way off on this. If you think Intel's first iteration will replace the AMD dGPU, that's not going to happen. AMD will be taking profits here, but not HUGE profits; their semi-custom deals are not that margin-rich. Oh, and INTC has been trying to "brute force" themselves into their own GPU product for years; it's just not possible. INTC is in a bit of a quagmire, as they should have bought NVDA a long time ago. Nobody really saw what NVDA was becoming, and they goofed that one. They instead paid NVDA for their IP, which I would think NVDA was not really very friendly about helping them with; I only say that because INTC did not renew that IP deal in March. So the graphics IP INTC controls right now is not that strong, which is why they made a deal with AMD. I think they want a strategic partnership with AMD to do some custom GPU deals in the DC and get the jump before NVDA gets wise and creates an interconnect between Ryzen/EPYC and their very popular GPUs.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
You're way off on this. If you think Intel's first iteration will replace the AMD dGPU, that's not going to happen. AMD will be taking profits here, but not HUGE profits; their semi-custom deals are not that margin-rich. Oh, and INTC has been trying to "brute force" themselves into their own GPU product for years; it's just not possible. INTC is in a bit of a quagmire, as they should have bought NVDA a long time ago. Nobody really saw what NVDA was becoming, and they goofed that one. They instead paid NVDA for their IP, which I would think NVDA was not really very friendly about helping them with; I only say that because INTC did not renew that IP deal in March. So the graphics IP INTC controls right now is not that strong, which is why they made a deal with AMD. I think they want a strategic partnership with AMD to do some custom GPU deals in the DC and get the jump before NVDA gets wise and creates an interconnect between Ryzen/EPYC and their very popular GPUs.

You are the one way off, repeating myths.

The Intel deal for Nvidia patents doesn't expire; it was a perpetual license to the GPU portfolio, which is almost certainly stronger than AMD's GPU portfolio, so they have access to very good GPU IP.

Choosing an AMD part to integrate has nothing to do with Nvidia patents. AMD was chosen because AMD was essentially building a similar product for themselves, making only tiny changes for Intel, so it was the quickest option to integrate.

This part also makes sense as the first target for Intel to go after with its own dGPU, since they control it as a captive platform.

What is a better market segment to try? A standalone GPU card? Not really; then they have to stand toe-to-toe with both AMD and Nvidia.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
You have power specifications based on Intel's IGP? It seems to me that low power use is job one for an IGP, and laptops with a dGPU usually let you shut it off and run on the IGP to save power.

If you'd been paying attention, you'd know that while they are quite efficient on the low end, they aren't on the high-power end.

Case in point is the Iris Pro part. The Iris Pro 580, using the Gen 9 GPU, achieves only 3500 in 3DMark11, despite having 3x the slices (3 slices), 128MB of eDRAM, and a 45W TDP. That also happens to be less than 30% faster than the 45W Gen 8 based GPU that had only 2 slices.

Now imagine doubling or tripling the Iris Pro 580.
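
To put a rough number on that extrapolation, here is a crude sketch that fits a power-law model (perf ∝ slices^alpha) to the single data point above. The 1.3x figure is an assumed reading of "less than 30% faster", so the outputs are illustrative only:

```python
import math

# Crude power-law model, perf ~ slices**alpha, fitted to one data point:
# 2 slices (Gen 8) -> 3 slices (Iris Pro 580) gave at most ~1.3x the score.
observed_speedup = 1.3      # assumed upper bound: "less than 30% faster"
resource_ratio = 3 / 2      # 2 slices -> 3 slices

alpha = math.log(observed_speedup) / math.log(resource_ratio)
print(f"scaling exponent alpha ~ {alpha:.2f}")  # ~0.65, well below linear

# Extrapolate "doubling or tripling the Iris Pro 580" under the same model:
for scale in (2, 3):
    print(f"{scale}x the slices -> ~{scale ** alpha:.2f}x the performance")
# ~1.57x and ~2.04x, far short of what a linear scale-up would promise.
```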
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
If you'd been paying attention, you'd know that while they are quite efficient on the low end, they aren't on the high-power end.

Case in point is the Iris Pro part. The Iris Pro 580, using the Gen 9 GPU, achieves only 3500 in 3DMark11, despite having 3x the slices (3 slices), 128MB of eDRAM, and a 45W TDP. That also happens to be less than 30% faster than the 45W Gen 8 based GPU that had only 2 slices.

Now imagine doubling or tripling the Iris Pro 580.

That 45 W is shared with the CPU.
 

stockolicious

Member
Jun 5, 2017
80
59
61
You are the one way off, repeating myths.

The Intel deal for Nvidia patents doesn't expire; it was a perpetual license to the GPU portfolio, which is almost certainly stronger than AMD's GPU portfolio, so they have access to very good GPU IP.

Choosing an AMD part to integrate has nothing to do with Nvidia patents. AMD was chosen because AMD was essentially building a similar product for themselves, making only tiny changes for Intel, so it was the quickest option to integrate.

This part also makes sense as the first target for Intel to go after with its own dGPU, since they control it as a captive platform.

What is a better market segment to try? A standalone GPU card? Not really; then they have to stand toe-to-toe with both AMD and Nvidia.

"The Intel deal for NVidia patents doesn't expire, it was a perpetual license to the GPU porfolio"

clueless - look up perpetual and then look at the expiration - so you saying intel has a right to anything NVDA comes up with tomorrow? if so you dont know what perpetual means.
INTC can reuse patents up till a certain point and it appears they gave up on that and went with AMD.

please dont keep thinking intel is going to roll out a new dGPU soon - it looks like you might be in denial.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
"The Intel deal for NVidia patents doesn't expire, it was a perpetual license to the GPU porfolio"

clueless - look up perpetual and then look at the expiration - so you saying intel has a right to anything NVDA comes up with tomorrow? if so you dont know what perpetual means.
INTC can reuse patents up till a certain point and it appears they gave up on that and went with AMD.

please dont keep thinking intel is going to roll out a new dGPU soon - it looks like you might be in denial.

They have all those Nvidia patents licensed until those patents expire, after which everyone has access to them, so it is effectively perpetual.

I have no investment in Intel, monetary or otherwise, and I never said when I expected an Intel GPU, so what is there to be in denial about?

I just wrote that it is ridiculous to think Intel was doing nothing until they hired Raja. What if he had said no? Were they just going to keep doing nothing?

Raja is a high-level, high-profile manager, but he won't be designing GPUs; he will just be high up the org chart.

The work before/after Raja isn't really likely to change all that much.
 

theeedude

Lifer
Feb 5, 2006
35,787
6,198
126
Intel's problem is they aren't seen as a cool company to work for in the Valley. It's kind of like working for the government: very hierarchical, run by beancounters, where the rank-and-file engineer is a cog in a machine. Right or wrong, that's the perception. Look at Brian K. I am sure he is a nice guy, but as a recent college grad, are you excited to join his team after hearing him talk? The Valley is scorching hot for parallel/GPU computing talent because of AI. Intel is pretty far down the totem pole for most recruits, so they mainly get whatever others passed on. That's why a guy who led a losing GPU team and left with his tail between his legs is considered good enough to lead theirs. That's why they have to keep spending tens of billions buying into AI instead of having successful home-built efforts for much less. Those are their options for bringing in talent: settle for second best, or buy into the top tier until the retention stock is done vesting. AMD is not much better, but at least they have the David vs. Goliath goodwill going for them.
 

eek2121

Diamond Member
Aug 2, 2005
3,472
5,147
136
It has NOTHING to do with patents. Intel is big enough to litigate any patent case into a settlement. These chips have two uses that require immediate development: 1) an Apple hardware refresh, and 2) the next-gen Intel NUC.

This was all Apple's doing, I guarantee it.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
That 45 W is shared with the CPU.

How much better do you expect it to do?

The original expectations were that the Skylake GT4e would do 50% better in 3DMark than the predecessor. It does 20-30%.

Barely any manufacturers used the Haswell Iris Pro because it sucked relative to how much power it consumed and how much it cost. Even fewer used the Broadwell Iris Pro, and with Skylake, outside of a single Intel NUC, not even Apple uses it.

You could pair an Nvidia discrete GPU and end up with better performance than the Iris Pro parts at the same power. The Iris Pro didn't deliver better battery life at idle because somehow its idle power was higher, comparable to discrete GPUs, and it didn't deliver better battery life under load when gaming because its perf/watt sucked.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
According to this guy on YouTube, Kaby-G is using Vega. The 8709G has a base of 3.1 GHz and a turbo of 3.9 GHz.

https://www.youtube.com/watch?v=D5P_uT4sjXs

Interesting. Geekbench results had both the 694C and GFX804 designations. The latter is used on Polaris cards, while things are unclear for 694C.

There are rumors that they are also planning a GDDR5 variant. Was Polaris an early version for testing?
 

FIVR

Diamond Member
Jun 1, 2016
3,753
911
106
Intel's problem is they aren't seen as a cool company to work for in the Valley. It's kind of like working for the government: very hierarchical, run by beancounters, where the rank-and-file engineer is a cog in a machine. Right or wrong, that's the perception. Look at Brian K. I am sure he is a nice guy, but as a recent college grad, are you excited to join his team after hearing him talk? The Valley is scorching hot for parallel/GPU computing talent because of AI. Intel is pretty far down the totem pole for most recruits, so they mainly get whatever others passed on. That's why a guy who led a losing GPU team and left with his tail between his legs is considered good enough to lead theirs. That's why they have to keep spending tens of billions buying into AI instead of having successful home-built efforts for much less. Those are their options for bringing in talent: settle for second best, or buy into the top tier until the retention stock is done vesting. AMD is not much better, but at least they have the David vs. Goliath goodwill going for them.

Most of what you said is accurate except for the last part. AMD is actually much better in terms of talent; Tesla, Apple, and Intel are all trying to poach AMD employees. AMD has a much, much better perception as a company that can grow and offer opportunities in emerging markets like AI. They are miles ahead of Intel in that regard.

Right now, Intel looks like a slowly-failing company that is trying everything it can to find something beyond its core CPU business to drive revenue. So far they have not succeeded.
 

formulav8

Diamond Member
Sep 18, 2000
7,004
523
126
Actually, I believe Intel's issue has not been any lack of talent, but a lack of leadership.
 

NTMBK

Lifer
Nov 14, 2011
10,521
6,037
136
Most of what you said is accurate except for the last part. AMD is actually much better in terms of talent; Tesla, Apple, and Intel are all trying to poach AMD employees. AMD has a much, much better perception as a company that can grow and offer opportunities in emerging markets like AI. They are miles ahead of Intel in that regard.

Right now, Intel looks like a slowly-failing company that is trying everything it can to find something beyond its core CPU business to drive revenue. So far they have not succeeded.

What has AMD got that makes them special at AI? They have a GPU architecture which is way behind Nvidia for AI, with no software infrastructure... And that's it. No dedicated inferencing hardware, no dedicated training hardware.
 

FIVR

Diamond Member
Jun 1, 2016
3,753
911
106
What has AMD got that makes them special at AI? They have a GPU architecture which is way behind Nvidia for AI, with no software infrastructure... And that's it. No dedicated inferencing hardware, no dedicated training hardware.

AMD has a massively larger and better IP library for GPU tech than Intel. Most AI hardware at this point is custom repurposed GPU hardware, which is why Nvidia and AMD are so well positioned in the sector. Intel is a complete joke in comparison: zero quality GPU hardware and zero quality IP.
 

FIVR

Diamond Member
Jun 1, 2016
3,753
911
106
Actually, I believe Intel's issue has not been any lack of talent, but a lack of leadership.

It's both, and the latter has led to the former. Intel has always viewed itself as a manufacturing company, not an engineering or "tech" company, and they are structured accordingly. The highest positions at Intel are all related to their process engineering, not their architecture team. Why would top talent want to work at a company where they will be lorded over by Jim the bean counter from manufacturing so they can maintain mandatory 60% margins on brand-new products?