Confirmed by AMD & Intel - Rivals Intel and AMD Team Up on PC Chips to Battle NVidia

Page 6 - AnandTech Forums

senseamp

Lifer
Feb 5, 2006
35,787
6,197
126
This only makes sense to me if AMD is looking to spin off RTG in the not too distant future.
 

FIVR

Diamond Member
Jun 1, 2016
3,753
911
106
I certainly can adjust outlooks. But I don't need to.

You are missing my point. My point is that you were absolutely certain that AMD could only go up, dismissing any reason that AMD could go down. Now you seem to be absolutely certain that AMD will go down, dismissing any reason that AMD could go up. You are overly confident (which is why I included that last quote of yours), and thus you have to continuously flip-flop from one viewpoint to another.

If your outlook includes uncertainty (less confidence), then your outlook doesn't need to flip-flop. My outlook, which includes the good and bad about a stock, gets tweaked slightly as data comes and goes. Your outlook appears like a fish out of water, flopping around aimlessly.

Part of trading short-term options is the necessity to adjust your outlook quickly. If a stock with a high beta like AMD isn't going up, it will likely go down. I have already made money on the $12 calls I sold early this morning and likely will cover within the next few days and write another set of calls.
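The covered-call exit being described works roughly like the following sketch. All numbers here are hypothetical illustrations, not the actual fills discussed in this thread:

```python
# Sketch of a covered-call position: hold the shares, sell a call against them.
def covered_call_pnl(entry_price, strike, premium, price_at_expiry, shares=100):
    """P&L of holding `shares` while short one call per 100 shares.

    If the stock finishes above the strike, the shares are called away
    at the strike; either way the call writer keeps the premium.
    """
    effective_exit = min(price_at_expiry, strike)
    return (effective_exit - entry_price) * shares + premium * shares

# Bought at $11.50, wrote the $12 call for a hypothetical $0.40 premium:
print(round(covered_call_pnl(11.50, 12.00, 0.40, 11.80), 2))  # 70.0 (stock below strike, keep shares + premium)
print(round(covered_call_pnl(11.50, 12.00, 0.40, 13.00), 2))  # 90.0 (called away at $12, the desired exit)
```

Note how the upside is capped at the strike: that is exactly why writing the $12 calls amounts to committing to sell at $12.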


Also, I never said that AMD "could only go up", merely that I was fairly sure it would go up to $20 in six months. I still think that is possible. There is no such thing as "certainty" in predicting the price movements of equities. Nobody can predict the future, but I can make a best guess based on the information I have, and usually I turn out to be correct. I definitely was correct on my most recent AAPL trade, which is up 270% since October 19th.


Maybe I should adjust my posting style to convey less "certainty" in my predictions... but I figured you and others would understand that I am not clairvoyant and cannot predict the future.
 

caswow

Senior member
Sep 18, 2013
525
136
116
This only makes sense to me if AMD is looking to spin off RTG in the not too distant future.

If there is no one who will buy an AMD CPU/GPU SoC this big and expensive, why would they even bother putting R&D into it? Just sell them to Intel and take the lower part with RR & co. Win win win.

AMD RTG is in a unique position, which they should not break up.
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,595
136
This only makes sense to me if AMD is looking to spin off RTG in the not too distant future.
Why?
- there might be good reasons to spin off RTG, but IMO the number one reason to do so is so RTG gets access to more funds, especially for software development. This IMO is a way to address that issue: more income, and a bigger userbase follows. You can say that makes RTG more attractive, but it also makes it more attractive for AMD.

As zlatan says in the GPU forum, just look at this as another OEM deal.
 

dahorns

Senior member
Sep 13, 2013
550
83
91
The most interesting thing about all this is that we may finally get a good comparison of competing nodes. It also seems to indicate that Intel's foundry capability is getting a bit more sophisticated.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
The most interesting thing about all this is that we may finally get a good comparison of competing nodes. It also seems to indicate that Intel's foundry capability is getting a bit more sophisticated.

What makes you think Intel is fabbing the Radeon chip? They could just be buying them from AMD like any other OEM.

Plus given how unique the solution is, it won't be directly comparable for power anyway.
 
Apr 20, 2008
10,065
984
126
I wrote them because I want out of AMD NOW, but I want to sell at $12. How do you suggest I do that today?


There are tons of other companies I can invest in that won't sell their own competitive advantage away to their biggest competitors for a slight uptick in revenue.

I don't think you understand what's really going on. AMD might be indirectly killing off Intel's IGP development, making Intel dependent on integrating an AMD GPU into future products. This will pay off massively in a couple of years.
 
Apr 20, 2008
10,065
984
126
What makes you think Intel is fabbing the Radeon chip? They could just be buying them from AMD like any other OEM.

Plus given how unique the solution is, it won't be directly comparable for power anyway.
Isn't Intel fabbing for Rockchip anyway?
 

swilli89

Golden Member
Mar 23, 2010
1,558
1,181
136
What makes you think Intel is fabbing the Radeon chip? They could just be buying them from AMD like any other OEM.

Plus given how unique the solution is, it won't be directly comparable for power anyway.
You are correct. Per Anandtech
The agreement between AMD and Intel is that Intel is buying chips from AMD, and AMD is providing a driver support package like they do with consoles. There is no cross-licensing of IP going on

So they are selling them graphics chips with an open HBM2 interface. Also mentioned in Anandtech's updated article
it can basically be confirmed that EMIB is only being used between the GPU and the HBM2. The distance between the CPU and GPU is too far for EMIB, so is likely just PCIe through the package which is a mature implementation.

So this is probably a Vega 24: a 1536-shader chip with an HBM2 interface. Intel, instead of putting it into an AIB like other partners, will instead throw it on an interposer with a PCIe link to a CPU.

Also telling is that this might be what we see released in the not so distant future to replace the RX 580. A Vega 24 with 4GB HBM2.
 

tamz_msc

Diamond Member
Jan 5, 2017
3,865
3,729
136
Who says it will be that big? RR isn't that big.
This isn't RR. Full-fat Vega is 484 mm²; per the Chinese rumors someone posted, the GPU alone would be ~200 mm². Add another ~100 mm² for the HBM and you get over 400 mm² total for the CPU (4+2) + GPU + HBM2. No chance of fitting that in the LGA1151 package.
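As a back-of-the-envelope check on the area argument above (the GPU and HBM figures are the rumored values quoted in the post; the ~126 mm² Kaby Lake 4+2 die size is my own assumption for illustration):

```python
# Rough package-area estimate from the figures discussed above.
gpu_mm2 = 200   # rumored ~200 mm^2 semi-custom Vega die
hbm_mm2 = 100   # one HBM2 stack plus surrounding keep-out area
cpu_mm2 = 126   # assumed 4+2 Kaby Lake die size
total_mm2 = gpu_mm2 + hbm_mm2 + cpu_mm2
print(total_mm2)  # 426 -- well over 400 mm^2, too much for an LGA1151 package
```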
 

sirmo

Golden Member
Oct 10, 2011
1,014
391
136
AMD's interposer shouldn't be expensive either. AMD has talked about it being built on a much cheaper process, likely passive 65nm, and it doesn't need 9 metal layers either, so it is likely pretty cheap to manufacture compared to real chips. AMD has said in the past that the biggest factors driving costs with HBM were the HBM memory itself and the tooling (the machines that can package these parts). The reason AMD opened HBM as a standard was to address these costs, because wider adoption helps lower them. Another notch in AMD's victory belt.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
You are correct. Per Anandtech

So they are selling them graphics chips with an open HBM2 interface. Also mentioned in Anandtech's updated article

So this is probably a Vega 24: a 1536-shader chip with an HBM2 interface. Intel, instead of putting it into an AIB like other partners, will instead throw it on an interposer with a PCIe link to a CPU.

Also telling is that this might be what we see released in the not so distant future to replace the RX 580. A Vega 24 with 4GB HBM2.

Vega 24 as a replacement for the RX 580 (36 CUs) doesn't seem like it would be an upgrade; it actually seems like it would be slower.

But I expect it will be a great Laptop part for OEMs.
 

DrMrLordX

Lifer
Apr 27, 2000
22,129
11,819
136
the real competition Intel is afraid of might be ARM, not Nvidia

there have always been rumors that Apple could go to a fully custom ARM product stack. This could be Intel's desperate attempt to stave that off by addressing their main weakness, one that Apple has always been unhappy about

Bingo. Apple has their own in-house ARM designs that are starting to get really beefy. Intel doesn't want to lose Apple, but Iris isn't keeping them very happy, so here you go.

There are tons of other companies I can invest in that won't sell their own competitive advantage away to their biggest competitors for a slight uptick in revenue.

Who said anything about this being a "slight" uptick in revenue? Besides, you seem to be operating under some paranoid delusion that Intel is going to produce lower-end APU products using AMD GPUs connected to Intel CPUs via EMIB, which is probably not going to happen. I know there are many who dream of interposer-connected HBM2 APUs from AMD someday, and that this product from Intel sort of end-runs those, but remember that AMD may actually have a competitive advantage on the CPU side once they switch to 7nm LP. At that point will OEMs demand an Intel CPU + AMD GPU hybrid since Apple got one custom-made for them? I don't think so.

Plus if AMD gets access to EMIB for their own projects then things could get interesting. Not that full interposers are undesirable in comparison. They have better bandwidth from what I understand . . .

This is a solid win for RTG. This is a solid win for AMD as a whole.

Overall I agree. I just don't think we'll see Intel APUs featuring AMD GPUs crushing Raven Ridge or any other AMD APU product in the future.

Yeah. The drivers better be good, because switchable graphics can be a pain.

Hmm, AMD + Intel drivers. Could be driver hell.

I don't think you understand what's really going on. AMD might be indirectly killing off Intel's IGP development, making Intel dependent on integrating an AMD GPU into future products. This will payoff in a couple years massively.

Depends on AMD's goals here. If they want to be the common provider of all desktop/laptop graphics chips then that isn't really a bad goal, especially if Intel can use their muscle to push Nvidia out of that space.

On the flipside, if every Intel iGPU becomes AMD-based, the APU competition comes down to: who has the best CPU cores? And that may be where AMD would be in some trouble. But then again, maybe not.
 

Dayman1225

Golden Member
Aug 14, 2017
1,160
996
146
https://browser.geekbench.com/v4/compute/811174
PN8YVSn.png


Looks to be semi-custom Polaris with HBM IP. 24 CUs would be 1536 SPs, correct?
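The CU-to-shader arithmetic checks out; GCN groups a fixed 64 stream processors into each Compute Unit:

```python
# GCN organizes 64 stream processors (SPs) per Compute Unit (CU),
# so a 24-CU part indeed comes out to 1536 SPs.
SPS_PER_CU = 64  # fixed by the GCN architecture
cus = 24
print(cus * SPS_PER_CU)  # 1536
```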


LQXVAE7.png


People are also speculating it is Hades Canyon - 100W/66W 4-core/8-thread parts with a dGPU (Polaris MCM) and Optane.


Oh and here is supposed performance.

V766bhp.png

Source

Not bad if it's real!
 

Roland00Address

Platinum Member
Dec 17, 2008
2,196
260
126
What are the pros and cons of this compared to a traditional discrete graphics card in a 45W laptop form factor?
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
Great to see an iGPU really being pushed, of course, but I'm not remotely sure about the less-power bit at iso performance.

From that leak, the power draw etc. is roughly comparable to Pascal. Great, except that the timing will put it up against Volta, and it'll be a chunk behind that. You'll have to really, really want the single-chip solution.
(Or love AMD, as per Apple :)).
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
Pros:
Less power.
Less board space.
Con:
More expensive.

Remember AMD is selling semi-custom silicon, which has much lower margins than an AMD discrete notebook GPU, which in turn has lower margins than an Nvidia discrete notebook GPU. Nvidia is the king of margins. For the same graphics performance Intel is probably getting the silicon dirt cheap. The challenge for Intel is EMIB complexity and HBM2 yield/costs. I think Intel's margins on Kaby G should be higher than if they just sold a Kaby CPU with Nvidia graphics.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
Great to see an iGPU really being pushed, of course, but I'm not remotely sure about the less-power bit at iso performance.

From that leak, the power draw etc. is roughly comparable to Pascal. Great, except that the timing will put it up against Volta, and it'll be a chunk behind that. You'll have to really, really want the single-chip solution.
(Or love AMD, as per Apple :)).
I think the benefit of Kaby G with EMIB is z-height. Volta will probably provide better performance at the same power, but I think fitting a Volta chip with similar performance to Kaby G in an ultrathin <= 15mm is probably difficult. BTW, I am sure this chip was built for Apple. It's almost definitely going to be found in 2018 iMacs and MacBook Pros.
 

Dayman1225

Golden Member
Aug 14, 2017
1,160
996
146
I think the benefit of Kaby G with EMIB is z-height. Volta will probably provide better performance at the same power, but I think fitting a Volta chip with similar performance to Kaby G in an ultrathin <= 15mm is probably difficult. BTW, I am sure this chip was built for Apple. It's almost definitely going to be found in 2018 iMacs and MacBook Pros.

Oh, Apple will definitely eat this up. I wonder if they'll even finally refresh the Mac Mini with it.
 

firewolfsm

Golden Member
Oct 16, 2005
1,848
29
91
This should definitely be lower power than having a discrete GPU, because shorter interconnects (even if they are PCIe) will consume less power, as will HBM, which isn't currently found in laptop GPUs.

It seems Intel passed up the opportunity to use the HBM as an L4 cache for the processor. That could increase compute performance and reduce power consumption by cutting down calls to system RAM, potentially even allowing the RAM to idle, or removing it from the laptop entirely (if there were 8GB models).

This is a 14nm++ processor on the Intel side, right?
 

Glo.

Diamond Member
Apr 25, 2015
5,829
4,824
136
I think the benefits of Kaby G with EMIB is z height. Volta will probably provide better performance at same power. But i think fitting a Volta chip with similar performance to Kaby G in an ultrathin <= 15mm is probably difficult. btw I am sure this chip was built for Apple. Its almost definitely going to be found in 2018 iMacs and Macbook Pros.
At best, a Volta GV107 chip, which will compete with Vega 24, will have performance between the GTX 1060 and the GTX 980 Ti.