[Mynavi.jp] AMD reveals plans through 2020.

Page 3 - AnandTech Forums

swilli89

Golden Member
Mar 23, 2010
1,558
1,181
136

Interesting breakdown. Most interesting thing of note is the increase in Samsung's R&D spend in just one year. 2013 they were a third of Intel's spend. Just one year later they are half.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
It seems like AMD is banking on the slow death of the discrete GPU and that the future is APUs, which is why they are going for 300W APUs.

Isn't this going to kill the upgrade path? They would have to increase the variety of APU products a lot. Today you can combine any CPU with any GPU, which is a great thing. Maybe the GPU you want is only available paired with a specific CPU that you don't want. How do you solve that? This is why I do not think the dGPU will ever die. The midrange might shrink a lot, but the dGPU will live on at the high end of the market.

Nowhere in any of the articles linked does it say AMD plans to abandon dGPUs in favour of APUs. This is a typical case on AT forums: someone adds new information not found in the editorial, and then other people respond to that information without reading the original sources.

AMD never issued any official statement that they are abandoning dGPUs in favour of 200-300W TDP APUs. AMD can simply diversify their business and manufacture both products to fit specific market needs. It makes no sense for AMD to abandon dGPUs and design 200-300W TDP APUs when they can simply design and manufacture both types of products. It's not as if the dGPU market will die in the next 5 years.

The articles about AMD's 5-year plan simply suggest that AMD will align GPU architectural updates in APUs on a 2-year cadence:
http://www.fudzilla.com/news/processors/37395-amd-announces-glorious-five-year-plan

It is no secret that AMD is diversifying away from the volatile and stagnant PC market, but that doesn't mean they will abandon it entirely. This idea that AMD is leaving the x86 CPU business and soon the dGPU business is pure propaganda regurgitated by a few vocal anti-AMD members who are either short sellers of AMD stock, employees of a competing firm, bitter ex-AMD employees, or simply fanboys of a competing firm. AMD is clearly committed to future x86 CPU designs and has no plans to abandon dGPU designs for mobile or desktop segments.

At the same time, we should continue to see AMD pushing hard to win new business in the embedded/custom space. Design wins with MediaTek and Nintendo's NX console are evidence of that. AMD is most definitely going to do whatever it can to get design wins for PS5/XB2 as well, and for Nintendo's next console in 2021.

Interesting breakdown. Most interesting thing of note is the increase in Samsung's R&D spend in just one year. 2013 they were a third of Intel's spend. Just one year later they are half.

I think you meant Qualcomm. :D Qualcomm messed up badly with 810. That SoC is throttling, under-performing garbage that can barely beat 805 under prolonged real world workloads. 815 can't get here fast enough. The inclusion of 810 in LG G4, Sony Z4 and HTC One M9 single-handedly takes all of those phones out of contention against the S6 / iPhone 6S imo. I really feel sorry for LG/Sony/HTC.

What stands out is just how inefficient NV is when it comes to R&D spending. It's no wonder they jacked up prices on us, moving mid-range from $250 to $500-550 and high-end from $500 to $700-1000. All that R&D expenditure needs to be covered somehow. If you look at NV's gross margins over the last 7 years, they skyrocketed from the low-to-mid 30s, to the mid-to-high 40s, and now sit at 55-56%. All this talk about NV having to raise prices to account for higher wafer costs, inflation and lower yields on cutting-edge nodes is pure nonsense. The financial statements do not back that theory up.

NV could only dream of 53-56% gross margins during the GeForce 6/7/8/9/GTX200/400/500 series. What changed is that NV increased its marketing efforts, worked a lot more closely with reviewers to prop up the NV brand in a more positive light, and pushed prices way higher to see what its loyal customer base could bear. That's why the defense of the 680 and 980 at $500-550 -- that NV/AMD historically haven't raised prices much to account for inflation, and that newer nodes now cost more -- doesn't at all align with the financial reality of past GPU generations from NV. NV was never a business with 55% gross margins. Simply put, NV raised prices not to maintain historical margins against higher costs, but because they wanted to, and combined with their marketing efforts they convinced the market those prices are "fair." Too bad most of the PC gaming market bought into the marketing BS. Unfortunately I saw the exact same thing happen in the audiophile space, and I called this 3-4 years ago.

Going a bit off topic, but basically the only way to re-align yourself on the price/technology curve now is either to wait for AMD to force NV to drop prices, OR to wait for the second half of a generation, when NV brings out an appropriately priced high-end flagship after having milked the market for 9-12 months with a mid-range next-gen product sold as "high-end" (aka 680, 980, insert Pascal GP204 in 2016, repeat).
 
Last edited:

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,250
136
Interesting breakdown. Most interesting thing of note is the increase in Samsung's R&D spend in just one year. 2013 they were a third of Intel's spend. Just one year later they are half.

I find AMD not even making the list more interesting. They seem to be doing OK once a person ignores all the doom and gloom topped off with marketing hype.

Two or three times the power of the current consoles would more than satisfy most of the market I'd think. Guess maybe keeping the crossfire option open could whittle away at the rest of the market.
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
I find AMD not even making the list more interesting. They seem to be doing OK once a person ignores all the doom and gloom topped off with marketing hype.

Two or three times the power of the current consoles would more than satisfy most of the market I'd think. Guess maybe keeping the crossfire option open could whittle away at the rest of the market.

PS4 APU: 1152 GCN cores
R9 290X: 2816 GCN cores
R9 280X: 2048 GCN cores

3x what's in current consoles would be 25% faster than the best discrete card AMD has. 2x is almost what a 290 has.
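Those core counts are easy to sanity-check. A rough sketch (by raw core count only; clocks and memory bandwidth are ignored, so this is an approximation, not a performance claim):

```python
# Rough core-count comparison; clocks and bandwidth ignored (assumption).
ps4_cores = 1152      # GCN cores in the PS4 APU
r9_290x_cores = 2816  # AMD's biggest discrete GPU at the time

tripled = 3 * ps4_cores            # a hypothetical "3x console" APU
ratio = tripled / r9_290x_cores    # compare against the R9 290X
print(tripled, round(ratio, 2))    # 3456 cores, ~1.23x the 290X
```

By raw core count, 3x the PS4 lands roughly 23% above the 290X, close to the "25% faster" figure above, and 2x (2304 cores) sits just under an R9 290's 2560.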

Frankly, if I could get 1152 GCN cores with enough bandwidth to feed them in an APU, I'd be pretty happy. Seems a shame to need to buy 8-16GB of fast memory when only 2-3GB (at present) needs to be fast though. Hence discrete cards.

We'll see what HBM does, though.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Interesting breakdown. Most interesting thing of note is the increase in Samsung's R&D spend in just one year. 2013 they were a third of Intel's spend. Just one year later they are half.

You mean Qualcomm
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
3x what's in current consoles would be 25% faster than the best discrete card AMD has.

It's actually much worse than that. The GPU performance in a PS4 is roughly equivalent to an R7 265 / GTX660. R9 290X (Max) is about 2X faster. To get 3X faster, you need a GPU 50% faster than an R9 290X, or about Titan X. Sure enough, Titan X is 293% (2.93X) the performance of an R7 265.

http://www.computerbase.de/2015-03/...-test/4/#diagramm-rating-1920-1080-4xaa-16xaf

It's going to take a long time before we have an APU 3X more powerful than the one in the PS4. That's why it makes absolutely no sense to release a next-generation PS5 until 2018-2019 at the earliest.

Let's assume each successor of GM200 is 50% faster: March 2017 (x1.5), March 2018 (x1.5), March 2020 (x1.5). We would have:

293% x 1.5^3 = 988% of an R7 265 (or nearly 10X faster than the GPU in the PS4).

However, that's a 250W TDP GPU. I would say 6-7X the performance increase of PS4 by 2019 is more realistic. That means it's probably going to take at least 3 years before we have an AMD APU with Titan X's performance. I think it'll actually take longer.
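That compounding projection can be sketched in a couple of lines (assuming, as above, 2.93x as the Titan X baseline and a purely hypothetical 50% gain per generation):

```python
# Hypothetical projection: each successive flagship is 50% faster.
titan_x_vs_ps4 = 2.93   # Titan X ~2.93x an R7 265 (~PS4-class GPU), per the link above
per_gen_gain = 1.5      # speculative 50% per generation
generations = 3         # speculative 2017 / 2018 / 2020 flagships

projected = titan_x_vs_ps4 * per_gen_gain ** generations
print(round(projected, 2))  # ~9.89x the PS4's GPU by 2020
```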

we will? I appreciate upgrading my GPU separately from my CPU.

It's not possible for an APU to beat an Intel-designed 130-140W TDP CPU + a 250-300W TDP NV/AMD dGPU combo. Again, not sure why people think APUs will completely displace dGPUs. There is an insatiable demand for higher-end graphics, and as VR moves closer to reality and 4K and 5K-8K adoption takes off over the next 10 years, the demand for 250-300W TDP cards (and SLI/CF) will continue to grow. APUs simply cannot compete with high-end 250W TDP graphics cards. For the next 15+ years the existence of dGPUs is assured.
 
Last edited:
Dec 30, 2004
12,553
2
76
Hawaii is obviously not part of any APU from 2014. Secondly, it says dGPU on the slide, so the red part is their discrete GPUs.
Pretty much confirms rebrands from AMD, and most likely a 390X with high power consumption in 2015.

(slide image: 005l.jpg)

I like how the first line on that slide serves to dispel the concern that they have no idea what they're doing or where to go next.
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
I like how the first line on that slide serves to dispel the concern that they have no idea what they're doing or where to go next.

Hah, truly!

"We have plans for the coming 10 years"
The plan: release a new product every 2 years.
 
Dec 30, 2004
12,553
2
76
I kinda wonder if NVidia's high-priced video cards are a result of the loss of the low end? Defective chips can no longer simply be sold as lesser models, meaning they have to make up that loss through markups on the high end.

It's because they know they have a superior product and capitalize on it like a wise company would, securing enough profit to invest aggressively in the future.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
As well as another 10 years of hype about how great the next chip will be. Who knows, it *might* actually come true.

So you are implying that HD5870, HD7970Ghz, R9 290X were not good products against the competition? Right now Maxwell looks great because it's competing against chips 1 generation behind, but once R9 390 is unveiled, things should be different. If we just look at 970/980 and ignore the $1K Titan aimed at < 1% of the entire GPU market, neither the 970 nor the 980 moved the market much in terms of next level of gaming performance, unless we just look at GW titles.

Sure, they brought HDMI 2.0 and lower power usage, but in terms of the actual performance bar being raised, there's hardly anything worth talking about. A 980 @ 1.45GHz is good, but a stock one is meh for $530-550 imo. The 970 basically did nothing against an after-market R9 290 other than a slightly lower price and power usage. Performance is more or less identical. Since the R9 290 launched for $399 1.5 years ago, the entire desktop dGPU market has been very unexciting imo when it comes to the next level of GPU performance.

I'll just use myself as an example. My cards are now 3 years old. 970/290X are only 26-35% faster. After 3 years, that's a joke of an upgrade. The Titan X is $1K. I am pretty sure there are plenty of 680/7970Ghz owners that are extremely unhappy with all of the upgrade options up to this point. After 3 years, I should be able to get a card 70%+ faster for $550. There is no such product right now. That's why we need AMD to be competitive.
 
Last edited:

Genx87

Lifer
Apr 8, 2002
41,091
513
126
The desktop dGPU market has been pretty boring because of process issues moving past 28nm. What are we going on, the 5th year of this process?
 

Shehriazad

Senior member
Nov 3, 2014
555
2
46
300W TDP APUs?


That sounds like AMD is going to produce huge semi-SoC-style chips that you just slap into your PC and be done with it. I mean, if they feature a fully specced GCN 1.whatever powered by HBM, then this has the potential to be great.

I'm just worried about how you are going to cool that chip... mandatory AIO water cooling, or a die big enough to need a new socket design/size? Even going down to 14nm FinFET, I'm not sure a 300W APU can be cooled in any decent manner unless the chip itself also has a large physical size.

Just imagine an FX-8350 + R9 290 crammed into a single chip... cooling that sounds like a nightmare.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
The desktop dGPU market has been pretty boring because of process issues moving past 28nm. What are we going on, the 5th year of this process?

The HD7970, the first 28nm GPU, came out Dec 22, 2011. That means 3.5 years of 28nm. By the time 14nm/16nm GPUs come out, it'll be almost 5 years. We are now 1.5 years or less away from 14nm/16nm GPUs, and there is still nothing 70% faster than a 7970GHz for $550. It's shocking. At this point I am seriously considering skipping this generation entirely.
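The 28nm timeline is simple date arithmetic (taking mid-2015 as the approximate date of this thread, which is an assumption):

```python
from datetime import date

hd7970_launch = date(2011, 12, 22)  # HD7970, the first 28nm GPU
thread_date = date(2015, 6, 1)      # approximate thread date (assumption)

years_on_28nm = (thread_date - hd7970_launch).days / 365.25
print(round(years_on_28nm, 1))  # ~3.4 years, closing in on 5 by a 2016 14/16nm launch
```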
 
Aug 11, 2008
10,451
642
126
So you are implying that HD5870, HD7970Ghz, R9 290X were not good products against the competition? Right now Maxwell looks great because it's competing against chips 1 generation behind, but once R9 390 is unveiled, things should be different. If we just look at 970/980 and ignore the $1K Titan aimed at < 1% of the entire GPU market, neither the 970 nor the 980 moved the market much in terms of next level of gaming performance, unless we just look at GW titles.

Sure, they brought HDMI 2.0 and lower power usage, but in terms of the actual performance bar being raised, there's hardly anything worth talking about. A 980 @ 1.45GHz is good, but a stock one is meh for $530-550 imo. The 970 basically did nothing against an after-market R9 290 other than a slightly lower price and power usage. Performance is more or less identical. Since the R9 290 launched for $399 1.5 years ago, the entire desktop dGPU market has been very unexciting imo when it comes to the next level of GPU performance.

I'll just use myself as an example. My cards are now 3 years old. 970/290X are only 26-35% faster. After 3 years, that's a joke of an upgrade. The Titan X is $1K. I am pretty sure there are plenty of 680/7970Ghz owners that are extremely unhappy with all of the upgrade options up to this point. After 3 years, I should be able to get a card 70%+ faster for $550. There is no such product right now.

I have no problem with AMD's dGPUs. In fact, I am running an AMD dGPU now. It is their CPU/APU lineup that has no appeal at all to me. AMD had better get on board with efficiency in dGPUs though, or it is going to come back to bite them just like it already has in the CPU space. If the 300 lineup is one super-high-power, expensive HBM flagship and the rest is a bunch of rebrands (for the second time), they are in serious trouble in the dGPU market as well.

As for the "next chip syndrome": we have been hearing for years from AMD, and now from Intel, how great the next generations of iGPUs will be and how HSA will dominate, but we are still far from a solution that rivals even low-end cards like the HD7750 on desktop or the GT 730 in mobile, and HSA is still a very niche market at best.
 

DownTheSky

Senior member
Apr 7, 2013
800
167
116
APUs will replace everything up to mainstream GPUs. For the performance/enthusiast segment you'll still need a dGPU. For good VR you'll need CFX/SLI. In short, everything stays the same, except the low end gets better performance (because of HBM).
 

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
If their APU's GPU can deliver even 50% of a current-gen midrange GPU (960 or 280), I am sold.
 
Feb 19, 2009
10,457
10
76
A 200-300W APU at 14-10nm FF means it will have a huge die with lots of HBM memory capacity, huge heat-sinks and extremely complicated and expensive motherboards making it very expensive for the consumer market.
Also AMD would not want to cannibalize its High-End CPU + dGPU products and they will most likely only produce such a chip for the HPC and Server market where they can sell at high margins.

Edit: BUT, a low power, low cost derivative with performance that could rival a 100mm2 dGPU could actually be made for the Mobile/Desktop consumer market.

HBM would be soldered onto the MB, so the large chunk of $ would be the motherboard itself: typically ~$100 now, it could well be ~$300 (4GB HBM + better VRMs for 300W APUs).

As for the APU itself? A 500-600mm² die on 28nm has been estimated to cost only ~$80. Obviously on 14nm FF that cost would be a lot higher.

But the question is MSRP, with so much potential for margins..

I would happily pay $500 for an APU that packs 980-class GPU performance. That's getting a good CPU + great GPU combo for a reasonable price.

Now you plug it in, it's fine for most games but if you want more, due to AMD's hybrid XDMA CF, you can add in a dGPU if you need more grunt.

Of course it will be for consumers; many gamers would love a slim PC setup like a small console. To AMD, selling you part of a wafer as an APU or a dGPU makes no difference. It's all revenue; as long as margins are good, it will come to consumers.
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
I also would not mind a 300W APU, provided the CPU performance were adequate (2500K or better in games).

But AMD is not selling me a 300W APU. This is an HPC product.