Confirmed by AMD & Intel - Rivals Intel and AMD Team Up on PC Chips to Battle Nvidia


beginner99

Diamond Member
Jun 2, 2009
5,210
1,580
136
The main problem with this is that it shows EMIB is superior to interposer tech, and by superior I mean offering the same thing at a much lower price. R&D isn't the reason AMD can't release such an APU; it's price and volume. Interposers are expensive, and the assembly process is slow and complicated, meaning low volume, which further hurts price and especially the chance of getting design wins. In other words: no market. If they could build it cheaper and in higher volume, then I'm sure the R&D would not be an issue even for AMD.
 

HurleyBird

Platinum Member
Apr 22, 2003
2,684
1,268
136
You are right, AMD won't be entertaining any thoughts about selling off RTG; that is a ridiculous notion. Their GPU division is fully integrated into the company and can't just be split off.

The only way I could see it working is if RTG were to become a joint venture. I expect the odds of that are quite slim, though.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
You're kidding, right?

They can control the fabrication costs by using their own process. They are not exclusive to their own products anymore.

But they aren't fabbing Radeon chips, they are buying them from AMD, so mentioning Rockchip has nothing to do with this discussion.
 

coercitiv

Diamond Member
Jan 24, 2014
6,204
11,909
136
Honestly, sometimes I wonder if there are still competent people here at the AT forums. Of course Intel signed a licensing agreement. Do you think AMD is just going to hand over a boatload of GPU parts and say give me money?
Intel openly denied signing a licensing agreement with AMD; this type of message has legal repercussions too. Sometimes not seeing competent people around you can say more about yourself than about your environment.
 

coercitiv

Diamond Member
Jan 24, 2014
6,204
11,909
136
I still wanna know how this is Intel and AMD taking on Nvidia??? o_O
You mean, besides the fact that we don't know cost and volume? Well, this dGPU will outperform the 1050 Ti in laptops, and probably also compete with the 1060 *if* small Vega ships in a better state than big Vega did. Make of this what you will.
 

ZGR

Platinum Member
Oct 26, 2012
2,052
656
136
If anything, this should be an indication that the Iris product line with the eDRAM cache is going to get the axe. That product required a relatively unique package and die setup that likely never achieved sufficient volume to cover its costs. This is fundamentally where Intel wanted to go with it.

If this works out, I could see Intel and AMD extending the product line both higher and lower. This would allow Intel to pull the iGPU from all of its mainstream dies and use it only on its low-end mobile and desktop dies. That would give upper-mainstream products more space for more cores at 10nm and below, with an optional MCM GPU of considerable performance available where needed in mobile and SFF. A 10nm Intel CPU paired with a 7nm AMD GPU and an HBM2 stack could likely fit in the same footprint as the current i3/i5/i7/i9 line. With such a product line, Nvidia could be essentially shut out of almost all of its lower-end volume sales and starved of the volume it needs to spread its costs.

I'm really sad to see Iris with its sweet L4 eDRAM cache go. I've been wanting a successor to the i7-5775C for years, and now it looks like it may never happen... We may still see the Iris name live on, however, but it will most likely forgo the L4 cache.

Nvidia will have to really increase their performance gap if they want to be in notebooks. Tough times ahead for them!

I still wanna know how this is Intel and AMD taking on Nvidia??? o_O

Nvidia's notebook GPU lineup is getting smaller and smaller every year. Soon there will be no more low-end and mid-range dGPUs in laptops; it will be fast APUs/iGPUs from here on out. For Nvidia to remain dominant, they must release more powerful mid-range GPUs or keep holding onto the high-end market.

Intel has been going after Nvidia for years with its HD Graphics. AMD was never much of a threat in notebooks until Raven Ridge. Now AMD and Intel both have strong CPU cores and strong GPU cores. This looks really dire for Nvidia right now.

The only hope left is for Volta to save the day. If mobile Vega is much slower than Volta, then Nvidia will be completely fine and remain in its dominant position. But for how long?
 

StinkyPinky

Diamond Member
Jul 6, 2002
6,766
784
126
Regarding the comments about Apple: looking at the huge performance gains they seem to be getting from their own A-series line of CPUs, I'm not so sure they will want to partner with Intel for much longer. Apple loves controlling its whole ecosystem, and cutting out Intel would help. Those A11 CPUs are quite impressive.
 
Apr 20, 2008
10,161
984
126
But they aren't fabbing Radeon chips, they are buying them from AMD, so mentioning Rockchip has nothing to do with this discussion.
Who says they aren't fabricating Radeon in-house for these chips? They're fabricating for Rockchip, so fabricating for AMD is not only plausible, it's probable.
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
When they're only just starting to port modems about 3 years after buying the relevant company? I deeply doubt it.

The economics/timescales involved in porting a design to Intel's processes appear to be formidable, and this really isn't going to be a very high-volume part.
 

Abwx

Lifer
Apr 2, 2011
10,949
3,462
136
Who says they aren't fabricating Radeon in-house for these chips? They're fabricating for Rockchip, so fabricating for AMD is not only plausible, it's probable.

Don't think so, because it's trivial to extract the electrical schematic from the layout files...
 

scannall

Golden Member
Jan 1, 2012
1,946
1,638
136
I still wanna know how this is Intel and AMD taking on Nvidia??? o_O
I don't see that part either. It seems like a very custom part, and Apple doesn't like CUDA. They do, however, like OpenCL, which AMD supports.
 

iBoMbY

Member
Nov 23, 2016
175
103
86
Intel openly denied signing a licensing agreement with AMD; this type of message has legal repercussions too. Sometimes not seeing competent people around you can say more about yourself than about your environment.

They made a contract to buy a semi-custom GPU, not a license deal to implement any AMD graphics technology into their own SoCs. So technically the statement "the recent rumors that Intel has licensed AMD's graphics technology are untrue" is still true. Over-specific denials are a real art form.
 

maddie

Diamond Member
Jul 18, 2010
4,740
4,674
136
The main problem with this is that it shows EMIB is superior to interposer tech, and by superior I mean offering the same thing at a much lower price. R&D isn't the reason AMD can't release such an APU; it's price and volume. Interposers are expensive, and the assembly process is slow and complicated, meaning low volume, which further hurts price and especially the chance of getting design wins. In other words: no market. If they could build it cheaper and in higher volume, then I'm sure the R&D would not be an issue even for AMD.
I'm not sure this is a valid assertion.

EMIB uses a small bridge interposer embedded at the interface between the individual dies, versus placing them entirely on a monolithic interposer.

You will still have the microbumps and their assembly problems.

For a given bandwidth or signal routing requirement, you will still need an identical number of connections. EMIB is not magic; it cannot reduce this.

This next point I have not seen discussed.
If you are connecting more than 2 chips, the signals route as [IC>EMIB>IC>EMIB>IC] vs [IC>SI>IC]. With a monolithic interposer, the signal is routed once into the SI and on to the final destination IC. With EMIB, you now have to ensure pathways are etched through the intermediate IC to maintain a continuous connection. This means a more costly IC with denser layers and/or more metal layers [more masking steps].

I have seen old price estimates for SI at about US$1 per 100mm^2. Not a showstopper.
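
For scale, a minimal back-of-the-envelope sketch using that US$1 per 100mm^2 figure. The die areas below are illustrative guesses for a CPU + GPU + one HBM2 stack, not known numbers for this part:

```python
# Rough silicon interposer (SI) cost at the old estimate of
# US$1 per 100 mm^2. All die areas are illustrative guesses.
COST_PER_MM2 = 1.0 / 100  # US$ per mm^2

dies_mm2 = {
    "CPU": 150,   # hypothetical mobile CPU die
    "GPU": 230,   # hypothetical ~24 CU semi-custom GPU die
    "HBM2": 92,   # ballpark footprint of a single HBM2 stack
}

ROUTING_MARGIN = 1.2  # extra interposer area for die spacing/routing
area = sum(dies_mm2.values()) * ROUTING_MARGIN
print(f"Interposer area ~{area:.0f} mm^2 -> ~US${area * COST_PER_MM2:.2f}")
```

Even with a generous routing margin, the interposer silicon itself comes out to a few dollars, hence not a showstopper.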
 

DrMrLordX

Lifer
Apr 27, 2000
21,634
10,847
136
This is a low volume product, as almost every poster here has acknowledged.

Low volume? Nobody said that. They said it occupied a niche, that's all. Rest assured that Apple could find any number of creative ways to utilize this chip in their products.

Do you actually think AMD will sell millions of these highly expensive and specialized chips?

Remember, this is Apple handling the sales. The "fruity cargo cult" (lulz) is great at selling things. They sell a lot of MacBook Pro models.

Even if they did, do you really think AMD's margins are going to be anywhere near Intel's?

Well you know, Intel did approach AMD about this, not the other way around . . .

What it does is destroy any mindshare AMD might gain with its APU solutions. They will now be branded as "low cost alternatives".

Bollocks. It's a chip going into MacBooks and such. You won't be able to get it anywhere else. It's an OS X product, not a Win10 product. AMD's APUs aren't even supported in that ecosystem. I doubt you'll see these chips running Win10 (though you might see some with hacked Linux installs). No consumer or OEM will look at "wintel" APUs as "low cost alternatives" to the Mac products featuring these Intel/AMD hybrid chips. They are not alternatives at all, since you can't really do the same thing with them.

I'm really sad to see Iris with its sweet L4 eDRAM cache go. I've been wanting a successor to the i7-5775C for years, and now it looks like it may never happen... We may still see the Iris name live on, however, but it will most likely forgo the L4 cache.

With the advent of faster DDR4 speeds, I was under the impression that eDRAM was losing its lustre anyway, unless Intel was going to improve its performance.

I do sort of agree, though. The i7-5775C was the not-so-little chip that could.
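
To put rough numbers on the DDR4-vs-eDRAM point, here's a minimal sketch. It assumes the ~50 GB/s per-direction figure commonly cited for Crystal Well's eDRAM (my assumption from old reviews, not something established in this thread) and standard dual-channel DDR4 math:

```python
# Dual-channel DDR4 bandwidth vs. the ~50 GB/s (per direction)
# commonly cited for Crystal Well's 128MB eDRAM L4 cache.
EDRAM_GBPS = 50.0  # assumed per-direction eDRAM figure

def ddr4_dual_channel_gbps(mt_per_s: int) -> float:
    # 2 channels x 64-bit bus x transfers per second, in GB/s
    return 2 * (64 / 8) * mt_per_s / 1000

for speed in (2133, 2666, 3200):
    bw = ddr4_dual_channel_gbps(speed)
    print(f"DDR4-{speed}: {bw:.1f} GB/s ({bw / EDRAM_GBPS:.0%} of eDRAM)")
```

By DDR4-3200, plain system memory is already in the same ballpark as the L4's per-direction bandwidth, so latency would be the eDRAM's main remaining advantage.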

Regarding the comments about Apple: looking at the huge performance gains they seem to be getting from their own A-series line of CPUs, I'm not so sure they will want to partner with Intel for much longer. Apple loves controlling its whole ecosystem, and cutting out Intel would help. Those A11 CPUs are quite impressive.

It is my understanding that Intel had to rope in RTG to supply GPUs to help stave off that inevitable transition.
 

CatMerc

Golden Member
Jul 16, 2016
1,114
1,149
136
Regarding the comments about Apple: looking at the huge performance gains they seem to be getting from their own A-series line of CPUs, I'm not so sure they will want to partner with Intel for much longer. Apple loves controlling its whole ecosystem, and cutting out Intel would help. Those A11 CPUs are quite impressive.
It's a lot easier to catch up than it is to cut your own path.

There will come a point where Apple, just like Intel, will slow down their massive progress. There's only so much IPC you can extract.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
Your posts on the Intel IGP sound a bit like you're possibly losing a loved one.
Sorry, team green here as far as video goes. My 4 boxes have 750ti / 750 / 730 GDDR5 / 710 cards.

Other posters pointed out that this chip will still have the Intel IGP for low power use and would still need the Intel IGP drivers along with the Radeon drivers.

Now, that doesn't make any sense to me, but that is apparently the case.
 

neblogai

Member
Oct 29, 2017
144
49
101
I wonder about this GPU having 24 CUs and the rumors of it being based on the Polaris architecture. The thing is, that would probably leave it with 12 CUs per compute engine and geometry processor, and have the CUs underutilised like on Fiji. Other Polaris chips had 8-10 CUs per CE (7 for the cut-down P11).
So:
1) maybe this is not an issue for the applications Apple intends it for? or
2) could it come in a strange 24 = 3 x 8 CU configuration (see the sketch below)? or
3) could it have primitive shader (or more) hardware from Vega to help the geometry processors feed the CUs?
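
To illustrate option 2, a quick sketch of the arithmetic. The candidate engine counts are just my guesses; it only checks which even splits of 24 land in the historical Polaris range:

```python
# How 24 CUs could split across compute engines (CEs), and whether
# each split stays inside the 7-10 CUs/CE seen on other Polaris chips.
TOTAL_CUS = 24
POLARIS_CUS_PER_CE = range(7, 11)  # 7-10 CUs per CE on shipping parts

for engines in (2, 3, 4):
    per_engine, remainder = divmod(TOTAL_CUS, engines)
    ok = remainder == 0 and per_engine in POLARIS_CUS_PER_CE
    print(f"{engines} CEs -> {per_engine} CUs/CE "
          f"({'within' if ok else 'outside'} Polaris norms)")
```

Only the 3 x 8 split lands inside what Polaris has shipped with so far, which is why the 24 CU rumour makes me lean towards option 2 or option 3.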
 

senseamp

Lifer
Feb 5, 2006
35,787
6,195
126
Why?
- there might be good reasons to spin off RTG, but imo the number one reason to do so is so RTG gets access to more funds, especially for software development. This imo is a way to address that issue: more income, and a bigger userbase follows. You can say that makes RTG more attractive, but it also makes RTG more attractive for AMD.

As zlatan says in gpu forum just look at this as another oem deal.

They are allowing Intel to get a position in a market niche they could have all to themselves with Ryzen and Radeon on an MCM. They are settling for OEM margins over selling their own solution into this market. Yes, an OEM deal, but not just another one. It serves the interests of Radeon at the expense of the CPU group at AMD. They are effectively acting as a separate company at cross purposes with the rest of AMD.
 

Yakk

Golden Member
May 28, 2016
1,574
275
81
They are allowing Intel to get a position in a market niche they could have all to themselves with Ryzen and Radeon on an MCM. They are settling for OEM margins over selling their own solution into this market. Yes, an OEM deal, but not just another one. It serves the interests of Radeon at the expense of the CPU group at AMD. They are effectively acting as a separate company at cross purposes with the rest of AMD.

What if AMD and Ryzen had already lost this OEM (Apple) bid for the CPU? Salvaging the GPU portion instead of taking a complete loss is still a great gain.