Question Would AMD make more or less money, if every CPU (APU) had integrated graphics (put it in the I/O die, with a side-band to GDDRx, or maybe some HBM2)?

VirtualLarry

No Lifer
Aug 25, 2001
56,348
10,048
126
I was thinking about this, after someone posted a thread in OT about HDTVs and whole-floor A/C units, and not being able to take full advantage of what they bought.

That got me thinking about people who buy discrete GPUs and then only use them to browse the web, rather than game or do DC/mining with them.

Then that got me to thinking about AMD's Ryzen CPUs, and how they *require* a GPU to go with them. Aaaaand... not to mention, AMD makes those separate GPU cards, well, maybe not the cards, but the GPU chips that go in them.

So, that got me thinking, on some economic theories, about things like "attach rates", and "complementary products", and how maybe it really was genius, to make your major product line, "require" one of your other products to work. (*Yes, I know that you can use an NVidia GPU in an AMD CPU system too, but let's go with the example presented for now.)

(Makes me wonder, why Peanut Butter and Jelly companies haven't merged, or why we don't have squeezable peanut-butter-and-jelly in one container. Or maybe we do, and I just didn't notice yet.)

So I was therefore also wondering about the feasibility of expanding the I/O die used in the consumer AM4 Zen2-based products, so that they could provide a minimalist (non-gaming) iGPU alongside powerful Ryzen Zen2 CPU chiplets.

I also wondered about business systems and OEM rigs, and how many of them supposedly wouldn't use AMD CPUs until their APUs came out (Ryzen Pro APUs being a Good Thing here), because most desktop business systems want integrated graphics, and often don't have the thermal, noise, or power allowance for a dGPU, even something like an RX 550.
 

beginner99

Diamond Member
Jun 2, 2009
5,210
1,580
136
For OEM adoption of Ryzen products it would obviously help, but in the big picture I doubt it would make AMD more money. People who buy Ryzen are usually power users and/or gamers who need a dGPU anyway. A small iGPU would be nice for cases where your dGPU breaks, or you sell off your old one and have to wait a couple of weeks or a month until you get a new one.
 

DrMrLordX

Lifer
Apr 27, 2000
21,634
10,852
136
iGPU might be an option with Zen3 if AMD does move to TSMC 7nm for the I/O die; however, I wouldn't expect it before a platform change. Remember that a lot of existing AM4 boards out there do not have the proper outputs for iGPUs. iGPU would have been a really bad idea for Zen/Zen+ so I can see why AMD built their AM4 ecosystem around top-performing chips without iGPUs.
 

naukkis

Senior member
Jun 5, 2002
706
578
136
Remember that a lot of existing AM4 boards out there do not have the proper outputs for iGPUs. iGPU would have been a really bad idea for Zen/Zen+ so I can see why AMD built their AM4 ecosystem around top-performing chips without iGPUs.

What? AMD did build their AM4 ecosystem around APUs. Almost all motherboards have display outputs. Zeppelin had no provision for an iGPU, but it's a little odd that Matisse doesn't either - probably because the I/O chip design is shared with Rome.
 

naukkis

Senior member
Jun 5, 2002
706
578
136
They might throw in a basic iGP, Intel GMA-style, to boost OEM adoption the next gen.
Area isn't free tho.

Actually it is. That's the reason for Intel iGPUs: they have to make some use of that free silicon space. The Matisse I/O chip is what, 120 mm² or so - there's plenty of space for a GPU if used for that instead of massive cache coherency tables.
 

Yotsugi

Golden Member
Oct 16, 2017
1,029
487
106
if used for that instead of massive cache coherency tables.
Sacrificing the directory, and thus memory-controller scaling, for a basic iGP is an unworthy tradeoff.
Also, Intel iGPs haven't been a "spare area usage" thing since Sandy Bridge.
We can all thank Apple for that.
 

Abwx

Lifer
Apr 2, 2011
10,953
3,474
136
This will be addressed by the next-gen APU, which is supposed to have 8C/16T and up to 20 CUs. Possibly, on the mobile side, only Microsoft will get a fully enabled GPU, with other OEMs being limited to 16-18 CUs.
 

Yotsugi

Golden Member
Oct 16, 2017
1,029
487
106
This will be addressed by the next-gen APU, which is supposed to have 8C/16T and up to 20 CUs. Possibly only Microsoft will get a fully enabled GPU, with other OEMs being limited to 16-18 CUs.
Renoir is like 10 CUs, and everything after that is about anything but ALU count.
 

DrMrLordX

Lifer
Apr 27, 2000
21,634
10,852
136
What? AMD did build their AM4 ecosystem around APUs.

No they didn't. They made the low-end chipsets and low-end boards around APUs. AM4 launched with Bristol Ridge using the A320 chipset.

Almost all motherboards have display outputs.

No they don't. Some X370 boards had no display outputs at all. I know my ASRock X370 Taichi didn't have one, and the C6H doesn't have one either. I think MSI, Gigabyte, and Biostar did have them on their X370 boards, though. The X370 Gaming K7 has an... HDMI out, I think? Biostar provided all kinds of display outputs on theirs.

Area isn't free tho.

Area becomes less of a problem when you move to a denser node.

Kinda goes against the whole point.

No it doesn't. If AMD needs more transistors on their I/O die (such as for an iGPU), then moving to a denser node makes sense. By the time of Zen3's launch, the price per wafer on TSMC's 7nm should be lower, perhaps low enough to be competitive with whatever 12nm process GF is still schlepping next year. The only reason to keep GF around is if the WSA forces AMD to do so. Which it may not, by that point.
 

Atari2600

Golden Member
Nov 22, 2016
1,409
1,655
136
It is something I've mentioned on here in the past.

IIRC, the counter argument was the added expense of every mobo carrying a display head.

But on consideration, I'm not sure that's really a strong enough argument. If someone has no intention of ever using the 2D GPU, they don't have to get a mobo that has the capability. I know I (and, I'd guess, most others) would get a motherboard with the display outputs, as a backup graphics display can be utterly invaluable at times.


Incorporating a simple 2D "GPU" into the I/O chiplet
Pros:
- The current I/O die has around 2.1 billion transistors. An old ATi RV410 (which may still be more powerful than what we want) has around 120 million transistors. So you are looking at less than 6% more transistors in the I/O die.
- The pads are already there on the AM4 socket for the existing APUs, so no additional work needed on that end.
- It would be an extremely efficient way for most office machines to operate, in both power and cost.
- The external dimensions of the I/O die are defined as much by the connections to and from the die as they are by the logic within it - that's why it scales so poorly. Which means finding that 6% free area **may** not actually result in an increase in overall die area (i.e., in terms of manufacturing, recurring costs are free - defects excluded).

Cons:
- Additional design work required.
- Some users would never be happy that the GPU was so weak and would try to use it outside its intended purpose.
- The transistor budget may be better used for something else (cache coherency for the CPUs for example).
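The back-of-the-envelope transistor figure in the Pros list can be sanity-checked with a couple of lines of arithmetic (the 2.1 billion and 120 million counts are the rough estimates quoted above, not exact die specifications):

```python
# Rough transistor-budget arithmetic for adding an RV410-class
# 2D block to the Matisse I/O die (figures from the post above).
io_die_transistors = 2_100_000_000   # ~2.1 billion in the current I/O die
rv410_transistors = 120_000_000      # ~120 million in an old ATi RV410

overhead = rv410_transistors / io_die_transistors
print(f"Added transistors: {overhead:.1%} of the I/O die")
# prints: Added transistors: 5.7% of the I/O die
```

So the "less than 6%" claim holds up, with the caveat that logic, cache, and I/O transistors don't all cost the same die area.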
 
  • Like
Reactions: VirtualLarry

VirtualLarry

No Lifer
Aug 25, 2001
56,348
10,048
126
Adding a rudimentary 2D graphics capability to the chipset would be nice, if for no other reason than to have a backup display for troubleshooting.
Another thing: it would cut down on the 1-egg / 1-star reviews on Newegg / Amazon from people who buy Ryzen CPUs, plug their monitors into the back of the board (most have at least an HDMI output), get nothing, and declare the CPU bad, leaving a bad review. (They're used to Intel boards, where nearly all Intel CPUs have at least a minimal iGPU present.)
 
  • Like
Reactions: DarthKyrie

chrisjames61

Senior member
Dec 31, 2013
721
446
136
What? AMD did build their AM4 ecosystem around APUs. Almost all motherboards have display outputs. Zeppelin had no provision for an iGPU, but it's a little odd that Matisse doesn't either - probably because the I/O chip design is shared with Rome.


A lot of the high-end boards have no display outputs.
 

moinmoin

Diamond Member
Jun 1, 2017
4,952
7,666
136
Sorry for the late post on an already dropping thread, but the thread title...
"Would AMD make more or less money, if every CPU (APU) had integrated graphics (put it in the I/O die, with a side-band to GDDRx, or maybe some HBM2)?"
...made me laugh out loud; adding GDDRx and/or HBM2 would make it even costlier.

To answer the question posed: AMD would make a lot less money in that hypothetical case. OEMs are still slow to pick up AMD's chips, with or without an iGPU. The high-volume audience is already well served by AMD's current APUs. The money is in the data centers anyway, and that's how AMD's product lineup is designed: big money-making server chips first, iGPU-free desktop chips as a byproduct of that, and APUs last, since they need a separate die design for a mostly low-margin market.

Also, regarding smaller nodes, the issue is not the per-chip cost (which does fall as the node matures and yields improve) but the ever-increasing upfront cost for masks and validation. The former can be reduced by reusing existing parts/IP already on the same node; the latter stays costly due to the unchanging complexity.