AMD Fury X Postmortem: What Went Wrong?

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
A couple of weeks ago, I gave best- and worst-case estimates for AMD's GPU releases. What I described as the "most pessimistic plausible scenario" is, essentially, what actually happened:

How bad can it get? Pretty bad. If AMD is unable to tame the challenges of first-generation HBM, then Fiji could prove to be more of a tech demo than a viable commercial product. We could be looking at Bulldozer 2.0, and that's a blow AMD may not recover from. Fiji is a massive, go-for-broke chip; if despite its technical advancements it fails to recover the performance crown (falling short of Titan X by 5%-10%), then it will be a failure. AMD won't be able to sell it for more than $499, and that price won't be sufficient to recoup R&D costs. Meanwhile, let's suppose that the pessimistic rumors are true and all the cards except Fiji are straight rebrands. No port to GloFo process, just better binning, higher clocks, and more RAM. This leaves AMD's entire lineup in an uncompetitive position; without bringing anything new to the table, they'll continue to have to compete on price with the legion of ex-mining cards out there, not to mention the cheap R9 200 series clearance sales. AMD continues to hemorrhage market share for 18 months or more, waiting for the FinFET+ process to arrive... assuming they survive that long. The only real hope at that point is a buyout offer from Samsung.

The Bulldozer analogy may be harsh; while the construction equipment CPU cores were a complete failure, Fury X isn't too far behind its strongest competitor (the GTX 980 Ti) in either performance or perf/watt. And while Bulldozer proved to be a complete architectural dead end that did nothing but waste AMD's limited R&D budget, the HBM expertise that AMD gained with Fiji is likely to be helpful to them in designing the next generation of GPUs. But in one important respect, the analogy works. Bulldozer managed to barely keep up with Sandy Bridge in heavily multi-threaded benchmarks, but in single-threaded performance, it fell significantly behind. Likewise, Fury X manages to barely keep up with the GTX 980 Ti in 4K performance on most titles, but at 1080p, it falls significantly behind. In both cases, AMD optimized for "the future" at the expense of present-day performance, and naively expected everyone else to play along. Note how low-level APIs (formerly Mantle, now DirectX 12) have often been touted as the magic cure-all for AMD's performance shortcomings in both arenas.

There is a strong connection between performance per transistor, performance per shader, and performance per watt. And because there is a maximum feasible die size for any manufacturing process, the most efficient architecture will usually win the battle for highest raw gaming performance, as well. This is what's been killing AMD ever since Maxwell came out. 2048 Maxwell shaders on a 398 sq. mm. die (GM204) can handily beat 2816 GCN 1.1 shaders on a much more densely packed 438 sq. mm. die (Hawaii). Likewise, the GM206 die is only a few square millimeters bigger than Pitcairn, has about the same number of transistors, and has the same number of shaders, TMUs, and ROPs (1024/64/32) as the *cut-down* Pitcairn. Yet even the full Pitcairn chip can't come close to GM206's performance.
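To put rough numbers on the die-efficiency point, here is a back-of-the-envelope sketch using only the figures quoted above; the "GM204 at least matches Hawaii" framing is an illustrative assumption, not a measured result:

```python
# Shader density and implied per-shader efficiency, GM204 vs. Hawaii,
# using the shader counts and die sizes quoted in the post above.
gm204 = {"shaders": 2048, "die_mm2": 398}
hawaii = {"shaders": 2816, "die_mm2": 438}

density_gm204 = gm204["shaders"] / gm204["die_mm2"]    # shaders per mm^2
density_hawaii = hawaii["shaders"] / hawaii["die_mm2"]

# If GM204 merely *matches* Hawaii's performance, Maxwell is extracting
# at least this much more work per shader (illustrative lower bound):
per_shader_advantage = hawaii["shaders"] / gm204["shaders"] - 1

print(f"GM204:  {density_gm204:.2f} shaders/mm^2")
print(f"Hawaii: {density_hawaii:.2f} shaders/mm^2")
print(f"Implied Maxwell per-shader advantage: >= {per_shader_advantage:.1%}")
```

Hawaii packs more shaders per square millimeter, yet loses: the deficit has to come from how much work each shader does per clock, not from density.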

Why does this happen? In some cases, AMD is shorting their chips on ROPs. This is definitely the case with Fiji compared to GM200, and Tonga compared to GM204. AMD obstinately continues to pretend that everyone should write games to favor their architecture, despite the fact that they have 20% market share. It's absurd when AMD and their fans act as if tessellation is some kind of dirty trick.

But as we saw with Pitcairn vs. GM206, even when AMD's chips have adequate ROPs, they still fall short of their Maxwell counterparts. Clock speeds are another part of the equation. At 1080p, the GTX 960 at stock clocks has a 35.7% lead over the R7 265. But the GTX 960 has a base clock of 1127 MHz and a boost clock of 1178 MHz, while the R7 265 has only a 900 MHz base clock and a 925 MHz boost clock. If the R7 265 is overclocked to 1140 MHz, it gains 21.3% in performance, which doesn't completely close the gap, but does indicate that the gap isn't as great as it might first appear. This raises the question: why do Nvidia's Maxwell cards overclock so much better than AMD's GCN offerings? One possibility is that GCN has too short a pipeline, and this is hindering clock rates.
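The overclocking arithmetic above can be made concrete with a quick sketch (all inputs are the clocks and percentages quoted in this paragraph; the GTX 960 is assumed to stay at stock):

```python
# How much of the GTX 960's 1080p lead over the R7 265 is just clock speed?
lead_stock = 0.357       # GTX 960 lead over R7 265, both at stock
r7_boost_stock = 925     # MHz, R7 265 stock boost clock
r7_oc = 1140             # MHz, overclocked
perf_gain_oc = 0.213     # measured performance gain from the overclock

clock_gain = r7_oc / r7_boost_stock - 1      # ~23.2% more clock
scaling = perf_gain_oc / clock_gain          # ~0.92: near-linear scaling

# Remaining gap after the overclock, with the GTX 960 still at stock:
remaining_gap = (1 + lead_stock) / (1 + perf_gain_oc) - 1

print(f"Clock increase: {clock_gain:.1%}")
print(f"Perf scaling vs. clock: {scaling:.2f}")
print(f"Remaining gap: {remaining_gap:.1%}")
```

Performance scales almost linearly with clock here, and the overclock shrinks the gap from about 36% to about 12%; the rest of the deficit is Maxwell's untouched overclocking headroom.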

So what needs to happen next? Well, for one thing, AMD needs 14nm FinFET *yesterday*. AMD has to get the drop on Nvidia, beating them to market on FinFETs, preferably by at least 6 months. It might not be feasible to get FinFET GPUs out the door by the end of the year (the Financial Analyst Day presentation has them scheduled for 2016), but if AMD can do it, they should. Even if it's a small part that uses standard GDDR5, it would still give AMD a temporary jump on Nvidia, and paper over their architectural shortcomings for the moment. Nvidia has won 28nm; AMD needs to move on to the next node immediately. But this will only give them some breathing room. Nvidia will eventually get to 16nm FinFET with HBM2, and at that point, AMD needs to have a competitive architecture. They can't stick with little-modified die-shrinks of GCN for too long, or they'll get bulldozed by the competition yet again.

If AMD doesn't have the R&D necessary to do this, they need to hurry up and sell the company to Samsung.
 

DarkKnightDude

Senior member
Mar 10, 2011
981
44
91
This part actually competes, unlike Bulldozer. The problem is they need to reduce the price for it to be viable.
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
GCN is just old at this point. It's as old as Kepler, but Kepler has been replaced and GCN has not.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,949
504
126
GCN is just old at this point. It's as old as Kepler, but Kepler has been replaced and GCN has not.
GCN has held up extremely well, though; the 390X is nearly on par with the 980, which I find impressive. Maybe Fury will age better than its Nvidia counterparts, I don't know, hard to say. Maybe AMD has extracted nearly all they can out of GCN, or maybe AMD hasn't got a handle on the memory controller in Fury.

Time will tell.
 

.vodka

Golden Member
Dec 5, 2014
1,203
1,537
136
Bulldozer vs. Fury X would've been a fine comparison if Fury X were another 2900XT/FX5800U. That isn't the case. Subpar results considering the specs on paper, but not a failure by any means. It manages to compete just fine with the 980 Ti/TX, if priced a little lower.

AMD got to 980 Ti/TX performance levels (depending on the game/task) while consuming a little less power than a 290X. A step in the right direction.

As ATM said, time will tell. Drivers could make it better (HBM is a completely new paradigm vs GDDR5, and this is priceless experience for next gen cards), overvolting could make it OC better (1150MHz on stock voltage is already higher than what Hawaii usually does), etc. Too early to tell.
 
Feb 19, 2009
10,457
10
76
Where to begin? How about sites that test a variety of games instead of piling on crap like Project Cars, Dying Light, etc.?

It basically ties the 980 Ti, while running cooler and a lot quieter, and it exhausts heat out of the case; i.e., all the benefits of water cooling at the same price.

[attached benchmark charts: relative performance at 3840×2160 and other review results]

"In UHD scenarios this card is easily able to trade blows with the GTX 980 Ti (if it wasn’t for a lack of optimizations in Dying Light, it would actually be ahead by 2%) and gives the uber expensive TITAN X a swift kick in the pants."


And a compute beast too, destroying the competition on performance and perf/w.

[attached compute benchmark charts: performance and perf/W]


As to why it doesn't shine more? Easy.

Custom 980Ti models. Great overclockers.
More GameWorks titles being tested.
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
GCN has held up extremely well, though; the 390X is nearly on par with the 980, which I find impressive.

The problem is that Hawaii needs to push voltage and clocks to marginal levels in order to compete with stock GM204. The R9 390X is clearly a factory-overclocked card, and the power consumption figures demonstrate that. Hawaii is 40 square millimeters bigger than GM204 and has nearly 20% more transistors. It also has a memory bus twice as wide, uses about 40% more power in normal gaming scenarios, and can use over 2× as much power on synthetics.
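As a rough sketch of what those figures imply for efficiency (using only the ratios quoted above, and treating the two cards' gaming performance as roughly equal, which is an approximation):

```python
# Rough efficiency ratios for Hawaii (R9 390X) vs. GM204 (GTX 980),
# assuming approximately equal gaming performance.
transistor_ratio = 1.20   # Hawaii has nearly 20% more transistors
power_ratio = 1.40        # ~40% more power in normal gaming scenarios

perf_per_transistor = 1 / transistor_ratio   # Hawaii relative to GM204
perf_per_watt = 1 / power_ratio

print(f"Hawaii perf/transistor vs. GM204: {perf_per_transistor:.0%}")
print(f"Hawaii perf/watt vs. GM204:       {perf_per_watt:.0%}")
```

In other words, even before overclocking enters the picture, Hawaii is delivering only around 80% of Maxwell's performance per transistor and around 70% of its performance per watt.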

Sure, if AMD had access to FinFET and Nvidia didn't, then they would be temporarily competitive despite their architectural disadvantages. But even if they do manage to steal a march on Nvidia, it won't last forever. Once both companies are on FinFET and HBM2, AMD will need to compete on the basis of its architecture once more.

Maybe Fury will age better than its Nvidia counterparts, I don't know, hard to say. Maybe AMD has extracted nearly all they can out of GCN, or maybe AMD hasn't got a handle on the memory controller in Fury.

Time will tell.

There is nothing wrong with incremental improvements. After all, Intel's current CPU architecture is the result of incremental improvements from the original P6 (Pentium Pro) way back in the mid-1990s. Their two most prominent attempts to do fully new architectures (NetBurst and Itanium) were ignominious failures. And of course AMD's Bulldozer was even worse; in hindsight we know they would have been far better off continuing to die-shrink, optimize, and improve Thuban.

Nvidia hasn't had a truly new, ground-up architecture since Fermi. As Anandtech describes: "Kepler itself was a natural evolution of Fermi, further building on NVIDIA’s SM design and Direct3D 11 functionality. Maxwell in turn is a smaller evolution yet."

GCN is clearly a viable architecture; it's no Bulldozer. It needs some serious work, but a ground-up redesign is probably not necessary. AMD has obviously made fewer changes from GCN 1.0->1.1->1.2 than Nvidia has done with Fermi->Kepler->Maxwell, but Nvidia's use of unique code names can make this difference look bigger than it really is.

AMD needs to take a good look at Maxwell and see what makes it tick, and make similar improvements to GCN. And they need to do it fast.
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
One word: Price

If it were $599, many people would still be disappointed, but could still fly the AMD value flag. I think this price still wouldn't attract customers.

If it were $549, it would be considered a good card, period.

If it were $499, it would be hailed as the price-to-performance champ. Not quite the return of the 4870 king, but close enough, considering it outperforms the 980.

$649 is a dud! That's it! Anyone calling it Bulldozer needs to frak off. That was not a good CPU for most common tasks (especially gaming), no matter how much it cost.
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
One word: Price

If it were $599, many people would still be disappointed, but could still fly the AMD value flag. I think this price still wouldn't attract customers.

If it were $549, it would be considered a good card, period.

If it were $499, it would be hailed as the price-to-performance champ. Not quite the return of the 4870 king, but close enough, considering it outperforms the 980.

$649 is a dud!

Yes, the release definitely could have been salvaged with a $549 or lower price point. But AMD didn't do that because they would have trouble making any profit on it if they did so. This is bleeding-edge technology, and the problem is that it doesn't offer the bleeding-edge performance necessary to justify the added cost of manufacture.
 

Mako88

Member
Jan 4, 2009
129
0
0
The scariest part is that the "nightmare scenario" for consumers is now here, after many years of it hiding in the shadows.

Intel, and Nvidia, have absolutely dominant marketshare positions that AMD is no longer able to counter.

Zen is a myth, Intel will blow through two more die-shrinks before that hope-on-a-star product sees the light of day, and Fury X is just one SKU...it's not an entire product line with nice profit margins in it that AMD can use to refuel their exhausted bankroll.

Within 24 months we'll see Nvidia's high-end card (NON Titan, whatever replaces the 980) priced at $799 at least, meanwhile Intel hasn't needed to improve true performance on the desktop in three years due to AMD not being able to compete...Skylake benches show it barely beating an overclocked 2500K at the same frequency.

No competition means no progress and higher prices.

Sigh.
 

iiiankiii

Senior member
Apr 4, 2008
759
47
91
GameWorks is the reason why AMD is so far behind. Don't get me wrong: AMD screwed up with the Fury X. It just isn't as good, with or without GameWorks. At least in non-GW games, AMD is competitive. But GameWorks is straight up killing them. AMD needs to perform well in GameWorks titles. That's hard, because Nvidia is skewing the feature set to take advantage of their hardware advantages (duh!). Coincidentally, some of those features are AMD's weaknesses. It's going to be tough from here on out for AMD.

AMD has a hard enough time building competing hardware. They can't compete with Nvidia's software prowess. AMD just doesn't have enough money to do that. Of course, one of the solutions is to fight fire with fire. Some suggested AMD should start releasing their own proprietary features. It's not that simple. I really don't think AMD has enough financial resources to "encourage" some of these developers. That's the main problem.
 

Mako88

Member
Jan 4, 2009
129
0
0
GameWorks is the reason why AMD is so far behind. Don't get me wrong: AMD screwed up with the Fury X. It just isn't as good, with or without GameWorks. At least in non-GW games, AMD is competitive. But GameWorks is straight up killing them. AMD needs to perform well in GameWorks titles. That's hard, because Nvidia is skewing the feature set to take advantage of their hardware advantages (duh!). Coincidentally, some of those features are AMD's weaknesses. It's going to be tough from here on out for AMD.

AMD has a hard enough time building competing hardware. They can't compete with Nvidia's software prowess. AMD just doesn't have enough money to do that. Of course, one of the solutions is to fight fire with fire. Some suggested AMD should start releasing their own proprietary features. It's not that simple. I really don't think AMD has enough financial resources to "encourage" some of these developers. That's the main problem.

Agree. And there's nothing AMD can do about that now, nor in the future.

Game development is costly, more costly for AAA titles than making an Avengers movie, for example, and the only way to get by is to take in investment where you can... even if it comes in the form of dollars from Nvidia to enhance features for their GPUs while simultaneously locking their principal competitor out of the source code.

This isn't going to change, even an anti-trust lawsuit would take half a decade to produce a result, and it's awful for all of us, the consumers.
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
GameWorks is the reason why AMD is so far behind. Don't get me wrong: AMD screwed up with the Fury X. It just isn't as good, with or without GameWorks. At least in non-GW games, AMD is competitive. But GameWorks is straight up killing them. AMD needs to perform well in GameWorks titles. That's hard, because Nvidia is skewing the feature set to take advantage of their hardware advantages (duh!). Coincidentally, some of those features are AMD's weaknesses. It's going to be tough from here on out for AMD.

Skimping on geometry and tessellation performance was a choice that AMD made when developing GCN in general and Fury in particular. In retrospect, it was a poor choice. Yes, if they'd skimped somewhere else, then Nvidia might have targeted that instead. So they need a good all-around architecture without any major weaknesses.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
The problem is, you can't go out there and say "you can't test that game or this game or that one" just because it shows AMD in a negative light. If it's a modern game that is popular and uses advanced features, it should be tested, period. This was mentioned in the other thread as well. A reviewer chooses games based on popularity and the advanced features of the game engine used. Testing Witcher 3 is fair game, for example.
 

Mako88

Member
Jan 4, 2009
129
0
0
The problem is, you can't go out there and say "you can't test that game or this game or that one" just because it shows AMD in a negative light. If it's a modern game that is popular and uses advanced features, it should be tested, period. This was mentioned in the other thread as well. A reviewer chooses games based on popularity and the advanced features of the game engine used. Testing Witcher 3 is fair game, for example.

For sure. Anandtech tested all the major games, and in nearly all of them the Fury X loses to a stock-clocked 980 Ti at 4k. If they were both overclocked the margin of loss would be even bigger for the FX:

http://www.anandtech.com/bench/product/1496?vs=1513
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
GCN has held up extremely well, though; the 390X is nearly on par with the 980, which I find impressive. Maybe Fury will age better than its Nvidia counterparts, I don't know, hard to say. Maybe AMD has extracted nearly all they can out of GCN, or maybe AMD hasn't got a handle on the memory controller in Fury.

Time will tell.

The 390X sucks down nearly twice the power of the 980, has a 10% larger die, twice the bus width, and over a billion more transistors, has very little OC headroom vs. the 980's plethora of it, and can only manage to almost tie the 980 when neither is overclocked. The 390X is the fat fifth-year senior being outclassed by the sophomore half his size.

If you call that on par, have another drink. Nvidia could go to town on 980 pricing, still clear some decent margins, and absolutely nullify any profit AMD would be able to make if forced to price match.


AMD is clearly being out-engineered now, and the only thing they can do to compete is price themselves out of profit.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Yes, the release definitely could have been salvaged with a $549 or lower price point. But AMD didn't do that because they would have trouble making any profit on it if they did so. This is bleeding-edge technology, and the problem is that it doesn't offer the bleeding-edge performance necessary to justify the added cost of manufacture.

AMD doesn't make profit on cards. They make profit on chips.

Everyone seems to forget the price includes a water cooler. Wait for the air cooled cards, they will be $100 cheaper.
 

SolMiester

Diamond Member
Dec 19, 2004
5,331
17
76
WOW, this forum is a tough crowd. I don't think Fury is a failure; as long as it's competitive, it gives users a choice. Perhaps the price needs reviewing.
What I found interesting from the data so far is that AMD usually widens the gap the higher the resolution, but in this instance it's the other way around.
I never thought GCN was memory-bandwidth-starved, and I think today's results show that, hence the little improvement at lower resolutions.
With that in mind, I don't think HBM did much for Fury, so AMD had better come up with something more powerful next time, because when NV gets Pascal with HBM2, they will leave AMD behind.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
For sure. Anandtech tested all the major games, and in nearly all of them the Fury X loses to a stock-clocked 980 Ti at 4k. If they were both overclocked the margin of loss would be even bigger for the FX:

http://www.anandtech.com/bench/product/1496?vs=1513

What I'm really saying is that you can't (shouldn't) say "hey, don't test these games because they use GameWorks." It's important to show the whole picture. Tossing out benchmarks because AMD doesn't have a very good driver for a game yet rubs me the wrong way. The more information, the better for prospective buyers.
 

master_shake_

Diamond Member
May 22, 2012
6,430
291
121
AMD doesn't make profit on cards. They make profit on chips.

Everyone seems to forget the price includes a water cooler. Wait for the air cooled cards, they will be $100 cheaper.


And slower than a full-fledged Fury X.

Remember, the air version is a cut-down Fiji.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
WOW, this forum is a tough crowd. I don't think Fury is a failure; as long as it's competitive, it gives users a choice. Perhaps the price needs reviewing.
What I found interesting from the data so far is that AMD usually widens the gap the higher the resolution, but in this instance it's the other way around.
I never thought GCN was memory-bandwidth-starved, and I think today's results show that, hence the little improvement at lower resolutions.
With that in mind, I don't think HBM did much for Fury, so AMD had better come up with something more powerful next time, because when NV gets Pascal with HBM2, they will leave AMD behind.

I don't think Fury X is a failure; I think it's a failure at $649 given its current performance. It will do nothing to win over new customers at its price, and that is what AMD needs right now.
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
Zen is a myth, Intel will blow through two more die-shrinks before that hope-on-a-star product sees the light of day

Didn't Intel just delay 10nm yet again?

Leaked partner roadmaps indicate Zen is supposed to debut around the middle of 2016. If they meet that timeline, they should be able to reclaim at least some market share. If things start dragging out past October 2016 and Zen is still nowhere to be seen, that's when to start worrying.

Raven Ridge has the potential to be AMD's next blockbuster product. Intel will probably still have a better x86 architecture, and Nvidia may still have a better GPU. But neither company can put a decent CPU *and* a decent GPU together on one chip, using shared HBM2 to unify them. Intel's graphics are still way too far behind, and Nvidia lacks an x86 license. 2017 could see AMD bring out a mid-range gaming PC on a chip, something only they can provide.

One thing that AMD benefits from in the CPU market is lowered expectations. In graphics, AMD is expected to be fully competitive with Nvidia, and fans are very disappointed when it falls short. On the other hand, Bulldozer and its progeny were such epic failures that if AMD can match the single-thread performance of a 2011 Sandy Bridge chip, it will be a major leap ahead and enough to satisfy a lot of buyers.
 
Feb 19, 2009
10,457
10
76
The problem is, you can't go out there and say "you can't test that game or this game or that one" just because it shows AMD in a negative light. If it's a modern game that is popular and uses advanced features it should be tested period. In the other thread this was mentioned as well. A reviewer chooses games based on popularity and the advanced features of the game engine used. Testing Witcher 3 is fair game for example.

Sure, reviewers can test any game, but if they justify it with "popular games," then for sure Witcher 3 is fair (though turning on NV features like HairWorks for AMD isn't). Dying Light and Project Cars are not popular at all.

Look for yourself:
http://steamcharts.com/top

But the point is in reviews that test a lot of games (including GW titles), Fury X matches 980Ti at 1440p and 4K.

In regards to tossing out games:
Tech Report, for example, tossed out Dirt Showdown over one feature, Global Illumination (which they could disable), because it ran badly on Kepler. They said it's a clearly AMD-favored game, so they couldn't include it in their tests. Why, then, do they include Project Cars? There are no features to turn on or off that would help AMD; the game just runs like crap and is even more biased towards NV.
 

Mako88

Member
Jan 4, 2009
129
0
0
And slower than a full-fledged Fury X.

Remember, the air version is a cut-down Fiji.

Do we know that for sure, that the air version is cut down?

Damn, that would have been really sharp if it were a full Fiji, because you could Crossfire two of them for just $1100 and have a sweet 4k setup for not too big a $$$ hit.
 

utahraptor

Golden Member
Apr 26, 2004
1,052
199
106
And slower than a full-fledged Fury X.

Remember, the air version is a cut-down Fiji.

The scariest part is that, I think, during the initial unveiling AMD made comments that the air-cooled version would likely throttle and not perform as well as the Fury X. I will look for the source.

Found it:

http://www.tomshardware.com/news/amd-fury-x-fiji-preview,29400.html

AMD says one of the biggest differences between the Fury X and the rest of the Fury line is that its water-cooled card shouldn't throttle under load (a big issue we identified with the Radeon R9 290X when it first launched). Instead, this board should always run at its highest clock rate. The air-cooled cards, by comparison, are expected to scale back as their heat sinks and fans are saturated with thermal energy.