What's wrong with APU on Desktop?

otinane

Member
Oct 13, 2016
68
13
36
What's wrong with APU on desktop?

Could it be that the overall technology of such a part today can't overcome the efficiency of a discrete, dedicated CPU + GPU system?

OR

is it simply that the estimated revenues from an equally efficient standalone APU (vs. CPU + GPU) are going to be significantly lower in the mid term? I mean, we are all familiar with the frequent "upgrades" the average consumer makes to discrete GPUs and CPUs over the mid/long term.
 

scannall

Golden Member
Jan 1, 2012
1,946
1,638
136
What's wrong with APU on desktop?

Could it be that the overall technology of such a part today can't overcome the efficiency of a discrete, dedicated CPU + GPU system?

OR

is it simply that the estimated revenues from an equally efficient standalone APU (vs. CPU + GPU) are going to be significantly lower in the mid term? I mean, we are all familiar with the frequent "upgrades" the average consumer makes to discrete GPUs and CPUs over the mid/long term.

Nothing wrong with an APU in a desktop computer. They work just fine for a majority of computer users out there. Not so much for enthusiasts, or people who do 'heavy lifting' type computing. But that isn't the largest segment of the market.
 
  • Like
Reactions: sandorski

jpiniero

Lifer
Oct 1, 2010
14,599
5,218
136
Basically, the Construction cores draw too much power and are slow; the GCN cores draw too much power as well. Then there are the bandwidth issues...

Intel's problem is that its driver support is terrible and its optimization isn't anywhere near nVidia's or AMD's.
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
As noted already :) The basic idea is utterly fine, even attractive for a fair chunk of people in theory. Current implementations are a long way off being attractive in practice for the moment.
 
Aug 11, 2008
10,451
642
126
"APUs" from either Intel or AMD are fine for normal, casual usage. The problem with APUs is that they are simply too limited by bandwidth and TDP to compete well against a discrete card. In the desktop, it is easy to substitute a discrete card and get nearly twice the performance for a very minimal additional cost. Igpus are more suited for laptops, but the limited TDP makes them even weaker than in desktop, and a discrete card is still needed for anything more than lite gaming. The new 14/16 nm dgpus with higher efficiency will raise the bar even further for "APUs".

I am sure eventually we will see the "but look, this APU can certainly play X game at X settings and save a few dollars" argument. Certainly APUs can play games. There are even videos on YouTube of Intel APUs playing a lot of games. The hard truth, though, is that except in a few very limited scenarios, a discrete card is simply a better overall solution for any graphically demanding task.
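To put rough numbers on that bandwidth gap, here is a back-of-the-envelope sketch. The transfer rates are illustrative round figures for the era, not the specs of any particular card:

```python
# Rough peak-bandwidth comparison: dual-channel desktop DDR3 (shared
# between CPU and iGPU) vs. a typical entry-level GDDR5 card.
# All figures are approximate and for illustration only.

def peak_bandwidth_gb_s(bus_width_bits: int, transfer_rate_gt_s: float) -> float:
    """Peak theoretical bandwidth in GB/s: bytes per transfer x transfer rate."""
    return (bus_width_bits / 8) * transfer_rate_gt_s

# 128-bit (dual-channel) DDR3-2133, shared by the whole APU
apu_bw = peak_bandwidth_gb_s(128, 2.133)

# A ~$100 GDDR5 card: 128-bit bus at ~6.0 GT/s, dedicated to the GPU
dgpu_bw = peak_bandwidth_gb_s(128, 6.0)

print(f"APU shared memory: {apu_bw:.1f} GB/s")
print(f"Entry dGPU GDDR5:  {dgpu_bw:.1f} GB/s ({dgpu_bw / apu_bw:.1f}x)")
```

Even before the CPU takes its share of that shared pool, the iGPU is working with roughly a third of the bandwidth a cheap discrete card gets to itself.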
 

SPBHM

Diamond Member
Sep 12, 2012
5,056
409
126
It's basically too slow for some people (a 128-bit regular memory interface is going to prevent any serious gaming performance), and it fails to show an advantage for many others who don't game... Pricing is also not ideal at times; look at the launch price of the 7850K and how long it took for the A8-7600 to reach decent availability.
Even if they had Intel CPU performance, the GPU performance is simply not there; it fails badly to deliver a good experience even compared to the typical $100 VGA card.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
I am sure eventually we will see the "but look, this APU can certainly play X game at X settings and save a few dollars" argument. Certainly APUs can play games.

I agree, but I'll bet the niche for this will always be very small (e.g., A8-7670K vs. Athlon X4 860K + GT 710).

Ideally, I think we would see AMD consolidate all quad-core APU dies to mobile (BGA).

But for that to happen, I believe AMD will need to increase the uptake of 35W APUs in laptops... then it's probable the 65W (and higher-wattage) desktop APUs won't need to exist anymore.
 
  • Like
Reactions: Chicken76

crashtech

Lifer
Jan 4, 2013
10,524
2,111
146
APUs are often less than the sum of their parts, mainly because of thermal and bandwidth constraints. Depending on one's needs, a few of them have entered "good enough" territory, but as has been mentioned, discrete components offer far more performance for only a small additional investment.
 

CHADBOGA

Platinum Member
Mar 31, 2009
2,135
832
136
What's wrong with APU on desktop?

Could be that the overall technology of such an artifact today, can't overcome the efficiency of a discrete and dedicated CPU + GPU based system

OR

simply the estimated revenues, from an equally efficient APU(vs CPU+GPU) standalone are going to be significant lower in the mid term? I mean we are all familiar with the frequent "upgrades" of an average consumer over discrete GPU and CPU in mid/long term.
Who said there was anything wrong with APU on desktop? :confused:
 

techne

Member
May 5, 2016
144
16
41
It's nice to have a desktop APU. If you don't game or work with graphics, or if you just want to assemble a 1080p HTPC or a torrent box, you simply won't have to buy a discrete GPU. And even if you can't live without a GPU, it will be much easier to sell or donate your desktop without necessarily selling or donating your GPU.

Also, unless you work with graphics or video, you will not have to interrupt your work even if your GPU dies. And if you live in a city where your new GPU may take a week to arrive, this is a big advantage. Not to mention that you will have more time to research and decide which GPU will replace the dead one. For those reasons, it is clear that desktop APUs have incredible value in many cases.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
It's nice to have a desktop APU. If you don't game or work with graphics, or if you just want to assemble a 1080p HTPC or a torrent box, you simply won't have to buy a discrete GPU. And even if you can't live without a GPU, it will be much easier to sell or donate your desktop without necessarily selling or donating your GPU.

How do you feel about a 35W BGA APU (used in a SFF desktop) replacing a 65W socketed APU for that purpose?
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
From a previous thread:

https://forums.anandtech.com/thread...-sales-in-recent-years.2487953/#post-38499352

At least three problems I see with AMD APUs for desktop:

1. OEM desktops with APUs carry higher-than-expected pricing--> https://forums.anandtech.com/thread...industry-source.2456970/page-14#post-37937094

2. At the DIY level, the lower-cost Athlon X4 860K + GT 730 GDDR5 combo beats the more expensive A10-7870K @ 1080p gaming-> https://forums.anandtech.com/threads/a10-7870k-vs-athlon-860k-gt730-atenra-cbn.2464083/

3. "APU size desktops" (ie, small footprint on table) with potential for 12.2" and 13" video cards can be made using chassis mentioned in this thread--> https://forums.anandtech.com/threads/what-cases-use-a-riser-for-the-video-card.2487944/

With that mentioned, I did find a socketed desktop APU to have lower idle power than an Athlon X4 860K + small dGPU:

https://forums.anandtech.com/thread...x-4-860k-dgpu-and-apus.2488541/#post-38518300

(However, I would imagine a 35W BGA desktop APU would idle as low or lower... especially with an AC adapter providing power rather than a 350W ATX PSU.)
 

techne

Member
May 5, 2016
144
16
41
How do you feel about a 35W BGA APU (used in a SFF desktop) replacing a 65W socketed APU for that purpose?

For that purpose (1080p HTPC) I use the E35M1-M (15W). However, it will become obsolete in the near future because of HEVC. I'll be forced to replace it or install a cheap HEVC-capable GPU.
 

superstition

Platinum Member
Feb 2, 2008
2,219
221
101
What's wrong with APU on desktop?
Bigger CPU die and/or corners cut in design to keep the die size down.

More concentrated heat.

More throttling requirements.

Higher stress on power delivery circuitry since it's now having to feed both a CPU and a GPU.

Designing an APU so that the iGPU can be disabled to increase yields is one partial work-around.

When an expensive CPU like the Broadwell 5775C has half its die taken up by integrated graphics, and the four actual CPU cores are only a small portion of all the circuitry on the die, that's weird. (The eDRAM helps CPU performance efficiency by acting as a victim cache, but it's a separate chip on the package.) When the performance of the graphics is easily beaten by a relatively inexpensive discrete GPU, that's even worse. However, if Intel had done enthusiasts a favor by releasing a harvested Broadwell-C with the iGPU disabled for yields and a higher TDP, that could have made the chip less disappointing.

Even in laptops it's not optimal to have all the heat concentrated into such a small area, unless it's just to cut costs.

That's really what APUs are about, nothing more. They're about cutting costs at the expense of efficiency. The only way that will change is when something like HBM2 can be used for both the APU graphics and system RAM. Unified system memory at very high speeds will finally make the APU model more useful.

An APU avoids the cost of a discrete GPU. Discrete GPUs need VRM circuitry, RAM, PCBs, connectors, separate processors, etc. But they offer less concentration of heat and VRM stress. They offer the ability not to sacrifice graphics performance nearly as much to fit into die-size and thermal-envelope (including cooling) constraints. Discrete GPUs are also easily replaceable in the desktop form factor.

BDW-U_575px.png

BDW-H-Map_575px.png

A jack of both trades and a master of neither.
 
Last edited:
  • Like
Reactions: cbn

cbn

Lifer
Mar 27, 2009
12,968
221
106
Bigger CPU die and/or corners cut in design to keep the die size down.

More concentrated heat.

More throttling requirements.

Higher stress on power delivery circuitry since it's now having to feed both a CPU and a GPU.

Designing an APU so that the iGPU can be disabled to increase yields is one partial work-around.

When an expensive CPU like the Broadwell 5775C has half its die taken up by integrated graphics, and the four actual CPU cores are only a small portion of all the circuitry on the die, that's weird. (The eDRAM helps CPU performance efficiency by acting as a victim cache, but it's a separate chip on the package.) When the performance of the graphics is easily beaten by a relatively inexpensive discrete GPU, that's even worse. However, if Intel had done enthusiasts a favor by releasing a harvested Broadwell-C with the iGPU disabled for yields and a higher TDP, that could have made the chip less disappointing.

Even in laptops it's not optimal to have all the heat concentrated into such a small area, unless it's just to cut costs.

That's really what APUs are about, nothing more. They're about cutting costs at the expense of efficiency. The only way that will change is when something like HBM2 can be used for both the APU graphics and system RAM. Unified system memory at very high speeds will finally make the APU model more useful.

An APU avoids the cost of a discrete GPU. Discrete GPUs need VRM circuitry, RAM, PCBs, connectors, separate processors, etc. But they offer less concentration of heat and VRM stress. They offer the ability not to sacrifice graphics performance nearly as much to fit into die-size and thermal-envelope (including cooling) constraints. Discrete GPUs are also easily replaceable in the desktop form factor.

BDW-U_575px.png

BDW-H-Map_575px.png

A jack of both trades and a master of neither.

I agree with almost everything you wrote.

However, HSA could be really neat for laptops, mobile computing and servers eventually....with servers most likely to benefit first.
 
Last edited:

DrMrLordX

Lifer
Apr 27, 2000
21,633
10,845
136
I own an APU and am an avid APU-user. I like my current 7870k, and I do not feel like I am currently "thermally limited". Throw enough big-boy cooling at it and it stays quite reasonably cool during even the most extreme operation.

I am currently mostly limited by the die size and memory bandwidth. There's only so much you can cram in there on a 28nm process, and DDR3 has its limits as well. HSA - one of the great promises of GCN-based APUs - is still MIA, and the widely-available OpenCL 2.0 technology is currently hideously under-utilized. Even DX12/Vulkan which promised to "change all that" haven't really changed it, either. Sadly.

There won't be a "serious" APU entry until maybe Snowy Owl. That guy may be largely unobtainable for us desktop folks unless we're ready to spend some serious cashola. Though Raven Ridge also looks interesting . . .
 

The Stilt

Golden Member
Dec 5, 2015
1,709
3,057
106
until maybe Snowy Owl

Snowy Owl is a server platform: two Zeppelin dies (MCM) on a BGA package. Raven is the first "next gen" APU; however, I don't expect its relative position against the current dGPUs to change much. There is just no way you can deliver sufficient memory bandwidth over a 128-bit memory interface unless you have QDR / ODR instead of DDR. Raven will certainly be significantly faster than current-gen APUs, but the entry-level dGPUs will be faster too by the time Raven is out.
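The DDR vs. QDR/ODR point can be sketched numerically: on a fixed 128-bit bus, peak bandwidth scales directly with transfers per clock. The memory clock below is an illustrative figure, not a spec for any shipping part:

```python
# Illustrative only: how transfers-per-clock would scale peak bandwidth
# on a fixed 128-bit interface at a fixed (hypothetical) memory clock.

BUS_BYTES = 128 // 8   # 128-bit bus -> 16 bytes per transfer
CLOCK_GHZ = 1.6        # illustrative memory clock in GHz

for name, transfers_per_clock in [("DDR", 2), ("QDR", 4), ("ODR", 8)]:
    bw = BUS_BYTES * CLOCK_GHZ * transfers_per_clock  # GB/s
    print(f"{name}: {bw:.1f} GB/s")
```

Doubling transfers per clock is the only lever left once the bus is pinned at 128 bits, which is exactly why a conventional DDR interface caps what an APU's iGPU can do.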
 

crashtech

Lifer
Jan 4, 2013
10,524
2,111
146
I own an APU and am an avid APU-user. I like my current 7870k, and I do not feel like I am currently "thermally limited". Throw enough big-boy cooling at it and it stays quite reasonably cool during even the most extreme operation.

I am currently mostly limited by the die size and memory bandwidth. There's only so much you can cram in there on a 28nm process, and DDR3 has its limits as well. HSA - one of the great promises of GCN-based APUs - is still MIA, and the widely-available OpenCL 2.0 technology is currently hideously under-utilized. Even DX12/Vulkan which promised to "change all that" haven't really changed it, either. Sadly.

There won't be a "serious" APU entry until maybe Snowy Owl. That guy may be largely unobtainable for us desktop folks unless we're ready to spend some serious cashola. Though Raven Ridge also looks interesting . . .
I haven't looked seriously at an APU for my own use in a couple of years, but at the time I thought the A10 being considered (can't remember the number) would throttle under combined CPU/GPU loads even if adequate cooling was supplied, and some kind of hack (and perhaps beefy VRMs) was needed to prevent this behavior...
 

BonzaiDuck

Lifer
Jun 30, 2004
15,725
1,455
126
I'll insert my milquetoast response to a very lively discussion.

I'd always been partial to a dGPU even in a desktop used for "mainstreamer" activities, but I like the power savings of using an iGPU. My choice of a dGPU has always been a bit labored; I don't pick "top-of-the-line" necessarily.

So having a system with an iGPU -- APU or Intel graphics accelerator -- is an advantage for taking one's time building a system. In fact, when I overclocked my 6700K system, it was using the iGPU with no dGPU, and you would expect the processor to run hotter. I don't think I even noticed it, though. Dropping in a GTX 1070 "OC mini" wasn't a game-changer for the tweaking, only for the graphics performance.
 

DrMrLordX

Lifer
Apr 27, 2000
21,633
10,845
136
Snowy Owl is a server platform, two Zeppelin dies (MCM) on a BGA package.

That's why it's going to be so spendy. And the boards will likely require some kind of a custom/hacked UEFI to support any kind of overclocking. OCing an MCM chip like that will also be a technical nightmare, if it's even possible!

But man, the power that those things will have . . . whew!

Raven is the first "next gen" APU, however I don't expect it's relative position against the current dGPUs change too much. There is just no way you can deliver sufficient memory bandwidth with a 128-bit memory interface, unless you have QDR / ODR instead of DDR. Raven will certainly be significantly faster than current gen APUs, however the entry level dGPUs will be faster too by the time Raven is out.

Yeah, I expect it to be competent but not a major game-changer. It'll be a good upgrade for Kaveri nuts and the like.

I haven't looked seriously at an APU for my own use for a couple years, but at the time I thought the A10 being considered (can't remember the number) would throttle under combined CPU/GPU loads even if adequate cooling was supplied, and some kind of hack was needed (and perhaps beefy VRMs) to prevent this behavior...

Thanks to The Stilt, there are many UEFIs out there for numerous FM2+ boards that defeat GeAPM throttling. Some boards still have issues with the total amount of power that can be delivered to the socket (which can induce throttling that a GeAPM defeat will not stop), but my A88X-Pro is pretty robust in that regard. I really have to beat the heck out of my 7870K to induce that sort of throttling... think FurMark + Prime95 Blend and the like.
 

superstition

Platinum Member
Feb 2, 2008
2,219
221
101
I like the power savings of using an iGPU.
I'm not sure how much power savings can be had just by putting a GPU onto a CPU chip, beyond what one gets from Intel's 14nm process (since Intel doesn't sell discrete GPUs).

Closer proximity might improve latency related to throttling schemes, though.

How much power is wasted by having a second power delivery system, versus the costs of concentrated heat and the trade-offs made to cram both onto one chip? Do iGPUs lead in performance per watt necessarily, or mainly because discrete GPUs are tuned for clocks rather than performance per watt?

Hruska said:
The other problem with trying to evaluate Intel’s claim is that its most powerful Iris and Iris Pro chips are confined to mobile processors. In many cases, chips in the 15-28W range will throttle if you try to run the GPU at full speed for very long in an intense gaming scenario. The laptops that carry discrete cards from AMD and Nvidia also tend to be thicker, bulkier systems with better overall cooling.

Performance per watt also goes out the window if the system simply can't do what it's needed to do because it can't offer the necessary minimum level of performance. There is also the issue of performance per dollar.
 

nerp

Diamond Member
Dec 31, 2005
9,866
105
106
Distilled from everything above:

Nothing inherently wrong with APUs. The implementations available today still leave something to be desired.
 

superstition

Platinum Member
Feb 2, 2008
2,219
221
101
Nothing inherently wrong with APUs. The implementations available today still leave something to be desired.
It depends on what you're using them for. iGPUs for gaming, for instance, are substandard. But if Kaby's HEVC encoding is high-quality and needs the iGPU circuitry to provide that functionality, then it's not something that leaves much to be desired, since it greatly speeds up the encoding process. The last time I checked, though, GPU acceleration for HEVC (and H.264 as well) yielded low-quality results despite the high speed.

Whether or not it's worthwhile to devote so much of a CPU chip to enhancing HEVC encoding is also debatable.
 

The Stilt

Golden Member
Dec 5, 2015
1,709
3,057
106
Generally, the current-gen fixed-function HEVC encoders are completely useless, regardless of the vendor (AMD, Intel, nVidia): limited CTU sizes (32 instead of 64), no B-frames, and no SAO. Basically, you could use x264 just as well.