TPU: First Intel Processor with AMD Radeon Graphics Within 2017


VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
As to AMD, as mentioned above, this is a familiar field for them. They also have an interest in fending off NVIDIA, and a presence in Apple's hardware has no downside, as long as they are not losing money. They also share this boat called x86 with Intel - whether they like it or not, the truth is that AMD would be in a worse situation than Intel were x86 to falter in its fight against ARM, an ecosystem NVIDIA is heavily invested in. So against ARM, and particularly NVIDIA, the two are united.

Good point. I don't put a huge amount of stock in these rumors, but it would be pretty sweet for AMD, I think, if they were true.
 

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
You guys make some valid points. The more I think about it, the more plausible this seems. I guess it depends on how well Raven Ridge performs and whether AMD needs the win, but I could see Intel throwing AMD a pile of cash to integrate an AMD GPU into upcoming Intel CPUs, doubly so if Apple is involved. Nvidia doesn't have AMD's experience integrating GPUs with x86, and Intel GPUs have always lagged behind AMD's unless expensive EDRAM is used.

Thinking strategically, AMD is in the driver's seat, but they have a tough choice to make. Either let Intel re-license with Nvidia, shutting themselves out of any potential design wins, or partner with Intel but potentially risk losing out on Raven Ridge APU adoption, which could make up a huge amount of notebook sales. However, Nvidia already has massive adoption in notebooks, and I'm sure AMD would be happy even if their GPUs were bound to Intel, in order to claw back GPU market share. This would also likely lower the ASP by removing the need for a discrete card, potentially attracting a lot more ODM wins.

Hmm....
 

plopke

Senior member
Jan 26, 2010
238
74
101
Aren't people going a bit overboard? The Intel-Nvidia deal expired, which leaves Intel with some options.
1) Go without a license and let the lawyers fight it out, because according to some Intel still has access to Nvidia patents - but will that be enough? http://seekingalpha.com/article/402...reement-nvidia-will-continue-well-beyond-2017
2) Renew the license with Nvidia.
3) Find another player who can license what they need, by agreement or by buying them out.
....
Then again, it is fun to speculate about Intel's future. Sometimes I wonder if they actually still want to bother with the consumer Mac/PC market as a priority, especially if the Apple-Intel relationship goes south one day. There must be some people looking at Nvidia cashing in on machine learning and the automotive industry and then looking at their own future. If Intel focused fully on HPC, the server market, and machine learning, and only handed cut-down parts to PC consumers, would that kind of Intel even need its own iGPU division?
 

lopri

Elite Member
Jul 27, 2002
13,314
690
126
EDRAM size was, what, 128 MB? That is a rather modest size.

1080p = 1920 x 1080 = 2073600 pixels

Each pixel needs 32-bit of data for displaying True Color

2073600 x 32-bit = 66355200 bits

Bits to bytes conversion

66355200 / 8 = 8294400 Bytes
8294400 / 1024 = 8100 Kilobytes
8100 / 1024 = 7.91 Megabytes

Therefore a 1080p image is approx. 8 MB in size. Now, for screen display without flickering, Double-buffering is mandatory (8 x 2 = 16 MB) and many apps call for Triple-buffering to avoid tearing. (8 x 3 = 24 MB)

Add 4 x FSAA for OS X's desktop composition which is necessary for sub-pixel / font rendering, etc. (24 x 4 = 96 MB)

That is before we get into the Z-buffer and texture storage for 3D, which is up to each individual app's requests. Modern APIs also call for extra frames to "render-ahead." For 3D, 128 MB quickly becomes a bottleneck unless other techniques are employed to reduce the frame buffer size (memory compression? Tile-based rendering?). In any case, Intel could not have soldiered on with 128 MB of EDRAM as a high-end option even for 1080p, and even then it would have to license the graphics IPs to make it work. Furthermore, monitor resolutions have been rising rapidly since the introduction of EDRAM, and there does not seem to be an end to it just yet. Add more demanding use cases for professionals/enthusiasts (e.g. dual-monitor setups), and the size of EDRAM required for high-end SKUs would have to grow larger and larger, with no end in sight.
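To make the arithmetic above easy to poke at, here is a minimal Python sketch of the same framebuffer math; the buffer counts and the 4x FSAA factor are just this post's working assumptions for illustration, not figures from Intel or Apple.

```python
# Back-of-the-envelope framebuffer sizing, following the arithmetic above.
# Resolution, bit depth, buffer count and FSAA factor are illustrative
# assumptions, not published figures from any vendor.

def framebuffer_mb(width, height, bits_per_pixel=32, buffers=1, fsaa=1):
    """Approximate framebuffer footprint in megabytes."""
    bits = width * height * bits_per_pixel * buffers * fsaa
    return bits / 8 / 1024 / 1024  # bits -> bytes -> KiB -> MiB

if __name__ == "__main__":
    print(framebuffer_mb(1920, 1080))                     # ~7.91 MB, single 1080p frame
    print(framebuffer_mb(1920, 1080, buffers=3))          # ~23.7 MB, triple-buffered
    print(framebuffer_mb(1920, 1080, buffers=3, fsaa=4))  # ~94.9 MB, triple-buffered + 4x FSAA
```

(The post rounds each frame up to 8 MB, which is why it arrives at 96 MB rather than ~95 MB; either way, 128 MB of EDRAM leaves little headroom.)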

I guess this is where HBM comes into play. Doesn't AMD hold some sort of key technology in enabling HBM?
 

MajinCry

Platinum Member
Jul 28, 2015
2,495
571
136
EDRAM size was, what, 128 MB? That is a rather modest size.

-snip-

And that's just focusing on the backbuffers, the LDR image being sent to the display. You also have to factor in the HDR buffers, which can be 64-bit (SM3) or 128-bit (SM4+) per pixel. Fallout 4 uses 7 HDR buffers, IIRC.
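For a rough sense of what those HDR render targets add on top of the backbuffers, here is a small sketch using the same kind of arithmetic; the seven-buffer count comes from the Fallout 4 recollection above and the 64-bit/128-bit depths are the SM3/SM4+ figures mentioned, so treat all the numbers as illustrative.

```python
# Rough HDR render-target footprint at 1080p, extending the framebuffer
# arithmetic from the post above. The buffer count (7, per the Fallout 4
# recollection) and per-pixel depths (64-bit SM3, 128-bit SM4+) are
# illustrative assumptions.

def buffer_mb(width, height, bits_per_pixel, buffers):
    return width * height * bits_per_pixel * buffers / 8 / 1024 / 1024

print(buffer_mb(1920, 1080, 64, 7))    # ~110.7 MB of 64-bit HDR buffers
print(buffer_mb(1920, 1080, 128, 7))   # ~221.5 MB of 128-bit HDR buffers
```

Either case already blows past a 128 MB EDRAM pool before textures or the Z-buffer are counted.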
 

Despoiler

Golden Member
Nov 10, 2007
1,968
773
136
A fat AMD GPU core with Intel CPU + Intel's fabrication + EDRAM and DDR4 would be pretty nice. :)

There is no way in heck this is happening. AMD is about to drop some serious hurt on Intel on the consumer front. Why would AMD allow Intel to maintain their current position in the market? AMD's APUs are already competitive with Intel's equivalent chips in gaming loads - within 1 FPS, and that is with their weak Excavator cores. That's how strong AMD's GPUs are. Once Raven Ridge, with its Ryzen cores, gets released there will be no question who to buy in that space. Intel is going to be flipped to the value position overnight. It's going to be a slaughter.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
If they licensed anything, it was because prior patent licenses ran out and they needed those to proceed with their own development.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
There is no way in heck this is happening. AMD is about to drop some serious hurt on Intel on the consumer front. Why would AMD allow Intel to maintain their current position in the market? AMD's APUs are already competitive with Intel's equivalent chips in gaming loads - within 1 FPS, and that is with their weak Excavator cores. That's how strong AMD's GPUs are. Once Raven Ridge, with its Ryzen cores, gets released there will be no question who to buy in that space. Intel is going to be flipped to the value position overnight. It's going to be a slaughter.

I'm not saying it will happen, I have no opinion on that one way or another. Just saying something like that would be a curious piece of hardware, an APU even enthusiasts might enjoy.
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
AMD's GPU is the only advantage they have over Intel's iGPU offerings; I don't see AMD selling that advantage off!
 

ksec

Senior member
Mar 5, 2010
420
117
116
Even in the rare case this does happen, I think the custom chip will be Apple-only. It would be a joint effort by Intel and AMD to keep the Mac in the x86 ecosystem. I am hoping there may be a chance for Ryzen to appear in an updated iMac.

Apple hasn't been able to get any cost reduction from Intel, which effectively has a monopoly on x86 chips, especially in the mobile segment. It wasn't until the last two years that Apple finally caught up with their own SoCs, not to mention the control they have over their GPU as well.

And in the grand view of things, an Intel modem inside the iPhone is 10x more important than whatever ends up in the Mac.

But I still think it would be too much pride for Intel to swallow, even if it is only "AMD's ATI Radeon".

Edit: And Intel's GPUs are simply crap. Last year, during the 12,000-person layoff, rumour has it thousands of those cuts were from the iGPU department.
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
AMD is finally on the cusp of producing an APU that can potentially crush Intel in the 25-45W notebook segment. They purchased ATI for exactly this reason. Why in blazes would they offer their GPU IP to Intel now? It would be the ultimate triple-facepalm moment for AMD in a long history of facepalm moments.