AMD Fury X Postmortem: What Went Wrong?


Azix (Golden Member, joined Apr 18, 2014)
The only problem with the Fury X might be the price. The thing uses less power than a 980 Ti and ties with it on many sites. It's quieter and cooler. How on earth does this equal a failure that needs a thread about where AMD went wrong?

For a first showing of the card, that is good in my book. I say wait a few months to see how things turn out before you start with this whole "AMD needs to do better than Nvidia to be equal" crap.

Oh, another thing they "did wrong" might be that their involvement in games doesn't screw things up for gamers in general or for the competition. Unfortunately that works against them, because a lot of us are so daft we can't see that there is no bacon on the pizza and the cat is actually a dog. They are transparent in improving games, unlike Nvidia.

AMD did good. Better engineering than Nvidia; maybe they need to lower the price because of the initial results. Over 8 teraflops that could be realized with DX12, higher compute. Come on. Sure, if they had cut DP further they might have been much faster at pinning the tail on the donkey, but they still did good.
 

ZGR (Platinum Member, joined Oct 26, 2012)
Nothing wrong...

This release feels like the Titan X release. Pretty quiet for most of us, except for the few buying them. Both the Fury X and Titan X are low volume cards. I am looking forward to seeing Fury X with voltage unlocked vs unlocked Titan X under water. Should be a very interesting comparison.

I see it like this:

Titan X - $1000 vs Fury X - $650
980 Ti - $650 vs Fury Pro - $500(?)
 

Abwx (Lifer, joined Apr 2, 2011)
In fact quite a few of the biggest recent/upcoming games are. Second, you are trying to show the Fury in the best possible light...


What exactly is this "best possible light"...?


There were 18 games used at Computerbase.de, whose chart you are downplaying as the "best possible light". Do you mean that reviews which use 5-7 games are more likely to give an accurate idea...?

http://www.computerbase.de/2015-06/amd-radeon-r9-fury-x-test/

I guess you wrote your post out of some kind of automatism, but more seriously, let's forget this "light". What about the numbers?
 

parvadomus (Senior member, joined Dec 11, 2012)
AMD has tremendous driver overhead with DX11; that's why it catches up as the resolution rises (also, has anyone tested GPU utilization at low resolutions?). The other thing is that it only has 4GB of VRAM when it needs to really shine at 4K. I really don't know what AMD was thinking (if only there were 8GB versions...).
Maxwell's efficiency is quite simple to understand. Kepler and Maxwell derivatives do DP calculation with dedicated shader units, while AMD does it on the same shaders; those are obviously more complex, and that's why they are power hungry (and maybe their pipelines are shorter). On top of that come the more efficient memory subsystem (texture compression) and the tessellation performance.
Nvidia has invested in texture compression. AMD caught up to Kepler with Tonga, but Maxwell is way ahead; that's why it doesn't need HBM to shine.
[b3d-bandwidth.gif: Beyond3D suite bandwidth test, compressed vs. uncompressed texture bandwidth]

Look at the difference between compressed and uncompressed textures: Fury X has almost the same ratio as the 780 Ti. I think AMD reverse-engineered Kepler's lossless color compression to make Tonga, and it will probably do the same with Maxwell's for the next iterations.
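To illustrate the idea (a toy delta-compression scheme, not AMD's or Nvidia's actual hardware algorithm): smooth gradients cost only a few bits per pixel once you store deltas against a reference pixel, while noisy data doesn't compress at all, which is why those bandwidth tests run compressible and incompressible textures separately.

[CODE=python]
# Toy delta color compression: store one 8-bit reference pixel per block,
# then per-pixel deltas. Gradients need few bits per delta; noise needs
# more than the raw format. (Hypothetical scheme, for illustration only.)

def delta_bits(block):
    """Bits to store a block as one 8-bit reference plus signed deltas."""
    ref = block[0]
    deltas = [p - ref for p in block[1:]]
    worst = max((abs(d) for d in deltas), default=0)
    bits_per_delta = max(1, worst.bit_length() + 1)  # +1 for the sign bit
    return 8 + bits_per_delta * len(deltas)

smooth = [100, 101, 102, 103, 104, 105, 106, 107]  # gradient: tiny deltas
noisy = [100, 13, 250, 7, 199, 42, 180, 90]        # random: full-size deltas

for name, block in [("smooth", smooth), ("noisy", noisy)]:
    raw = 8 * len(block)
    print(f"{name}: {raw} raw bits -> {delta_bits(block)} delta-coded bits")
[/CODE]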
BTW, I don't understand what profit AMD expects from Fiji itself (won't the professional version pack more than 4GB?).
 

tviceman (Diamond Member, joined Mar 25, 2008)
Where to begin? How about sites that test a variety of games instead of piling on crap like Project Cars, Dying Light, etc.?

It basically ties the 980 Ti while running cooler and a lot quieter, exhausting heat out of the case, i.e. all the benefits of water cooling at the same price.

Hay guyz, these cards can't run 4K well but they almost tie, so it matters, rite?! 4K is still a nonstarter. Get. Over. It. 1440p and 1080p are where it's still at for 99.5% of all PC gamers. And the Fury X sucks there. It's slower, consumes more power, has less OC headroom, and costs the same.

Lose. Lose. Lose. Lose. Nvidia = QUAD DAMAGE. The EVGA SC+ is 3dB louder (a trivial amount), absolutely spanks Fury X out of the box at any resolution, and only costs $30 more. AMD wins on heat dump; since when, in the entire existence of this ultra-high-end market, has that ever been an issue? If it were one, the 295X2 would never have seen the light of day. What will AMD do when Nvidia refreshes/rebadges Maxwell this fall/winter with the GTX 1000 series and brings the pain with a GTX 1080, a fully unlocked GM200 running 100MHz higher? And with Fury X just now coming out, it'd be crazy to think AMD is going to beat Nvidia to FinFETs. The R&D to make Fury X had to be pretty high... 597mm², first HBM, first interposer... surely it won't have a shorter lifespan than GM200.
 

exar333 (Diamond Member, joined Feb 7, 2004)
Nothing wrong...

This release feels like the Titan X release. Pretty quiet for most of us, except for the few buying them. Both the Fury X and Titan X are low volume cards. I am looking forward to seeing Fury X with voltage unlocked vs unlocked Titan X under water. Should be a very interesting comparison.

I see it like this:

Titan X - $1000 vs Fury X - $650
980 Ti - $650 vs Fury Pro - $500(?)

Fury X competes against the 980 Ti...

The Pro hopefully will be a bit better than the 980, with good OC capabilities.
 

RampantAndroid (Diamond Member, joined Jun 27, 2004)
Where to begin? How about sites that test a variety of games instead of piling on crap like Project Cars, Dying Light, etc.?

It basically ties the 980 Ti while running cooler and a lot quieter, exhausting heat out of the case, i.e. all the benefits of water cooling at the same price.

Frankly, I don't care how cool it runs. That's just a measure of how well the cooler dissipates heat. All reviews I've seen have shown the Fury X draws more power. So sure, the cooler dissipates heat faster, but if you were to do an apples to apples comparison of the heat being thrown out by these cards, the Fury would put out more. (and I get it, it's not a linear calculation or anything, but the Fury will be putting out more heat...)
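As a back-of-envelope (treating essentially all board power draw as heat dumped into the room; the wattages below are made-up round numbers for illustration, not review measurements):

[CODE=python]
# A GPU's electrical draw all ends up as heat in the room, no matter how
# good the cooler is. Wattages here are assumed round numbers, not data.

fury_x_watts = 275     # assumed typical gaming draw
g1_980_ti_watts = 250  # assumed typical gaming draw
hours = 3              # one evening of gaming

for name, watts in [("Fury X", fury_x_watts), ("980 Ti G1", g1_980_ti_watts)]:
    kwh = watts * hours / 1000
    print(f"{name}: {watts} W -> {kwh:.2f} kWh of heat into the room per session")
[/CODE]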

The Gigabyte G1 card stays below its maximums quite well. My case runs really cool (an FT02), so the temperature of the GPU itself is of *zero* concern as long as it isn't going to toast itself. Total heat output, however, IS a concern. I try not to turn the AC on...

And that said, it isn't really proper water cooling, and there won't be any hybrid coolers or versions that just come with a WC plate.

Given all of the hype, I find this release to be pretty underwhelming...
 

YBS1 (Golden Member, joined May 14, 2000)
What exactly is this "best possible light"...?


There were 18 games used at Computerbase.de, whose chart you are downplaying as the "best possible light". Do you mean that reviews which use 5-7 games are more likely to give an accurate idea...?

http://www.computerbase.de/2015-06/amd-radeon-r9-fury-x-test/

I guess you wrote your post out of some kind of automatism, but more seriously, let's forget this "light". What about the numbers?

I felt I was perfectly clear: the ultra-high resolutions are "the best possible light". That is not reality; most gamers are not running 4K, nor will they be anytime soon. At the resolutions over 90% of gamers play at, the Ti led handily.
 

cbrunny (Diamond Member, joined Oct 12, 2007)
How does a GPU that was released TODAY get a postmortem?? It barely makes sense to do a postmortem on a 7970, given how many people still actually use it. Does the OP understand how dying works?

Be patient, people. I'm as disappointed as the rest, but we still know basically nothing about how this card will perform in the medium term.
 

JDG1980 (Golden Member, joined Jul 18, 2013)
You're missing the point, we've seen this all before. Every AMD architecture preview gets this "OH BOY OH BOY OH BOY I CANT WAIT" treatment...

...and every single time for most of the last decade by the time it arrives, long delayed, it's already blown away by the Intel price point competitor.

I'm not suggesting any miracles on AMD's part. They have certainly disappointed often enough, with today being only the most recent example.

But I don't think it's unreasonable to expect AMD's clean-sheet CPU design debuting in 2016 to match the per-clock performance of Intel's Sandy Bridge design from 2011 - especially when AMD has explicitly focused on improved IPC and cache performance in their marketing slides. If AMD can't even do that, then sure, stick a fork in 'em - they're done.

And on the GPU side, even if a die-shrunk GCN 1.2 offering doesn't match up to Nvidia's Pascal discrete cards, it will still beat the pants off of Intel's iGPUs. With 14nm FinFET, AMD could stick a 2048-core GPU on die with four Zen CPU cores and still wind up with a die size no bigger than today's discrete Tonga GPU. Put that on an interposer with 8GB of HBM2, and you've got a midrange gaming PC on a chip.

AMD already has experience with a unified memory subsystem from Kaveri and Carrizo, and experience with HBM from Fiji. So this is just a matter of incremental engineering - they don't need any new technical breakthroughs to get there.
 

Azix (Golden Member, joined Apr 18, 2014)
I think if there really is a failure, it's not getting open-source stuff out there. Not getting physics out there, etc. This is more about removing the cancer that is GameWorks by showing that there are better-performing open alternatives. They are making strides with companies like Square Enix, showing TressFX working on consoles (where HairWorks would probably cause an explosion) and how good it could be for performance on PCs. But there needs to be more, if only to stop Nvidia putting rubbish into these games and hurting AMD's performance. PhysX needs to be challenged. Hopefully they have a plan there.

https://youtu.be/lV-ZvAIfBW0?t=11m35s

Look at that hair, bruh. I'm assuming that gameplay is XB1.

People talking about 1080p with these cards need to check that view. 1080p does not matter, not until graphics effects get more demanding. These cards push such high fps at that resolution that it's silly to fight there. 4K is a problem, since once effects do get more demanding we're back to no GPU being able to manage. 1440p seems the ideal ground for higher-end cards to compete.
 

tviceman (Diamond Member, joined Mar 25, 2008)
People talking about 1080p with these cards need to check that view. 1080p does not matter, not until graphics effects get more demanding. These cards push such high fps at that resolution that it's silly to fight there. 4K is a problem, since once effects do get more demanding we're back to no GPU being able to manage. 1440p seems the ideal ground for higher-end cards to compete.

Exactly. 1440p is where the performance should be centered and compared for single-GPU setups of this caliber. Unfortunately, 1080p should probably also be used, because 1080p users are still the overwhelming majority. There will be more 1080p 120/144Hz users buying cards of this caliber (980 Ti / Fury X) than 4K users.
 

x3sphere (Senior member, joined Jul 22, 2009)
I think if there really is a failure, it's not getting open-source stuff out there. Not getting physics out there, etc. This is more about removing the cancer that is GameWorks by showing that there are better-performing open alternatives. They are making strides with companies like Square Enix, showing TressFX working on consoles (where HairWorks would probably cause an explosion) and how good it could be for performance on PCs. But there needs to be more, if only to stop Nvidia putting rubbish into these games and hurting AMD's performance. PhysX needs to be challenged. Hopefully they have a plan there.

https://youtu.be/lV-ZvAIfBW0?t=11m35s

Look at that hair, bruh. I'm assuming that gameplay is XB1.

People talking about 1080p with these cards need to check that view. 1080p does not matter, not until graphics effects get more demanding. These cards push such high fps at that resolution that it's silly to fight there. 4K is a problem, since once effects do get more demanding we're back to no GPU being able to manage. 1440p seems the ideal ground for higher-end cards to compete.

When people bring up 1080p, I assume they are referring to 1080p @ 144Hz. Running 1080p at 144 fps is around the same workload as 1440p at 60Hz, so...
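For rough numbers (pure pixel throughput; per-frame CPU/driver costs, which hit high-fps 1080p harder, aren't counted):

[CODE=python]
# Raw pixel throughput per second for common monitor targets.
modes = {
    "1080p @ 144 Hz": (1920, 1080, 144),
    "1440p @ 60 Hz":  (2560, 1440, 60),
    "4K @ 60 Hz":     (3840, 2160, 60),
}

for name, (w, h, hz) in modes.items():
    mpix_per_s = w * h * hz / 1e6
    print(f"{name}: {mpix_per_s:,.0f} Mpix/s")

# 1080p144 ~ 299 Mpix/s, 1440p60 ~ 221 Mpix/s, 4K60 ~ 498 Mpix/s:
# 1080p/144 is in the same ballpark as 1440p/60, if anything a bit heavier.
[/CODE]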
 

tviceman (Diamond Member, joined Mar 25, 2008)
I'm not suggesting any miracles on AMD's part. They have certainly disappointed often enough, with today being only the most recent example.

Fury X will sell OK, at least initially. The 300 series will sell OK. But if FinFETs aren't coming until Q3/Q4 of next year, then Nvidia will likely rebadge the Maxwell series with higher clocks and new names, giving them a fresh jolt of sales and bigger performance leads over AMD before new cards arrive.
 

JDG1980 (Golden Member, joined Jul 18, 2013)
And with Fury X just now coming out, it'd be crazy to think AMD is going to beat Nvidia to FinFETs. The R&D to make Fury X had to be pretty high... 597mm², first HBM, first interposer... surely it won't have a shorter lifespan than GM200.

It's possible that Fiji was basically a tech demo that was primarily designed to hone AMD's R&D expertise on HBM. Even if it has a very short life, the R&D expenses don't look too bad if they are averaged across all the upcoming Arctic Islands HBM GPUs as well.

On the other hand, AMD has a history of releasing half-done tech demos and then trying to drag them out as long as possible. (Tonga is a good example of this.) Someone really needs to teach their executives about the sunk cost fallacy.

Another thing AMD might try is to debut 14nm FinFET with smaller chips that still use GDDR5. This would let them get the most out of the new process node (which presumably wouldn't have the best yields at first) and also avoid having to deal with the new challenges of both FinFET and HBM2 simultaneously. As an example, Bonaire is a 160mm² chip with 896 shaders; a new GPU of similar die size on 14nm FinFET should be able to cram in 1792 shaders, the same as cut-down Tonga. If clocks and voltages were kept restrained, it could come in under 75W; it would beat the GTX 960 in performance while not requiring a 6-pin PCIe connector. If it was packed with 4GB of RAM (probably necessary to maximize performance), then a retail price of $250 wouldn't be unrealistic. And a chip like this would be great in laptops, too.
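Napkin math for that scaling claim (the ~2x density factor from 28nm planar to 14nm FinFET is an assumption, and I/O and analog blocks shrink much worse than logic):

[CODE=python]
# Die-shrink sketch: what Bonaire-class silicon might hold at 14nm FinFET.
# The 2x logic-density gain is an assumed round number, not a foundry spec.

bonaire_area_mm2 = 160
bonaire_shaders = 896
density_gain = 2.0  # assumed 28nm planar -> 14nm FinFET logic scaling

shaders_in_same_area = int(bonaire_shaders * density_gain)
print(f"~{bonaire_area_mm2}mm² at 14nm: ~{shaders_in_same_area} shaders")
# -> ~1792 shaders, i.e. cut-down-Tonga class in a Bonaire-sized die
[/CODE]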
 

at80eighty (Senior member, joined Jun 28, 2004)
VR is coming soon. Perhaps they can make a dent there with LiquidVR and the HBM throughput.
 

Abwx (Lifer, joined Apr 2, 2011)
I felt I was perfectly clear: the ultra-high resolutions are "the best possible light". That is not reality; most gamers are not running 4K, nor will they be anytime soon. At the resolutions over 90% of gamers play at, the Ti led handily.

1080p and 1440p are ultra-high res?

Because those resolutions are also at Computerbase.de. But if you insist on writing by automatism, so be it; it must be a hard habit to break, given the redundancy...

http://www.computerbase.de/2015-06/.../#diagramm-rating-1920-1080-normale-qualitaet
 

YBS1 (Golden Member, joined May 14, 2000)
When people bring up 1080p, I assume they are referring to 1080p @ 144Hz. Running 1080p at 144 fps is around the same workload as 1440p at 60Hz, so...

Of course they are; the so-called high-end users buying these cards are the same people buying the 144Hz screens. They buy these "overpowered for 1080p" cards specifically to try to maintain 120-144fps minimums. There is of course going to be some overlap into 4K and multi-monitor resolutions, but don't fool yourself into thinking 1080p isn't the primary battleground. Hey, I'm all for higher resolutions, IPS, etc. I'm going to jump on a 34" 21:9 120+Hz curved IPS with blur reduction just as soon as they are out of the gate (take my money!), but we aren't there yet.
 

Azix (Golden Member, joined Apr 18, 2014)
Most of what AMD needs, as I see it, is on the software side. Their chips seem solid. It will all hinge on DX12 features on both sides and how quickly those games come out.

I predict AMD will be ahead in all segments there, but that hinges on what Maxwell 2 is offering. Asynchronous compute is a guaranteed fps boost; that's one key factor, and it's already in console games (that look damn good). There should be other features. They NEED to come good there.
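To picture why async compute is close to free fps (a toy model with made-up milliseconds; the real win depends on how much shader idle time a frame actually has):

[CODE=python]
# Toy frame-time model of async compute: independent compute work gets
# scheduled into shader idle gaps inside the graphics workload instead of
# running after it. All millisecond figures are made up for illustration.

graphics_ms = 12.0  # assumed shader-busy graphics time per frame
idle_ms = 4.0       # assumed shader idle gaps within the graphics phase
compute_ms = 3.0    # assumed independent compute work (postfx, physics)

serial = graphics_ms + idle_ms + compute_ms
overlap = graphics_ms + idle_ms + max(0.0, compute_ms - idle_ms)

print(f"serial: {serial:.1f} ms/frame = {1000 / serial:.0f} fps")
print(f"async:  {overlap:.1f} ms/frame = {1000 / overlap:.0f} fps")
[/CODE]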

The second area is just pushing things like TressFX (it seems every game should be using it, given how well it works). They need to attach themselves to these technologies and be shown prominently when games use them. Even if they run just as well on Nvidia, it would help AMD's image.

I take back what I said about 1080p. Some of these games don't really reach that high an fps at max settings. The Fury X is 6% behind at 1440p and 9% behind at 1080p on Computerbase, and around 12% behind a max-OC 980 Ti at 1080p. Not massive numbers, and within reach.
 

YBS1 (Golden Member, joined May 14, 2000)
1080p and 1440p are ultra-high res?

Because those resolutions are also at Computerbase.de. But if you insist on writing by automatism, so be it; it must be a hard habit to break, given the redundancy...

http://www.computerbase.de/2015-06/.../#diagramm-rating-1920-1080-normale-qualitaet

Let me ask you something: if you can take off your team glasses, what resolutions do you see in the post of his that I originally quoted? I wasn't referring to the entire article, and you know it. Didn't you get enough of this in the high-end Intel/AMD thread?
 

blastingcap (Diamond Member, joined Sep 16, 2010)
WOW, this forum is a tough crowd. I don't think the Fury is a failure; as long as it's competitive, it gives users a choice. Perhaps the price needs reviewing.
What I found interesting from the data so far is that AMD usually widened the gap the higher the resolution, but in this instance it's the other way around.
I never thought GCN was memory-bandwidth starved, and today's results I think show that, hence the little improvement at the lower resolutions.
With that in mind, I don't think HBM did much for Fury, so AMD had better come up with something more powerful next time, because when NV gets Pascal with HBM2, they will leave AMD behind.

Of course it is not a failure. The pricing needs work, though. At 1080p the Fury X is about as fast as a GTX 980 (comparing minimum framerates to minimum framerates; see http://www.anandtech.com/bench/product/1513?vs=1442 for example). At 4K the Fury X is a little slower than or about as fast as the GTX 980 Ti (there is a lot of variation among games). Fury X also lacks things like DVI and HDMI 2.0 and has 4GB vs 6GB... that can't be good for resale value.

I think a fair price for Fury X would be about $500-600, and you could justify either number or anything in between. Most people are closer to 1080p than 4K, which might suggest $500 (making it a little more expensive than some GTX 980s that come with a free game bundled). But you can argue that people buying $500+ cards may have at least 2560x1440 monitors, in which case $600 might be justified despite the lack of DVI/HDMI 2.0 and 2GB less VRAM. The 980 Ti also probably has more OC headroom, as it's not even water-cooled and already beats the Fury X.
 

digitaldurandal (Golden Member, joined Dec 3, 2009)
Frankly, I don't care how cool it runs. That's just a measure of how well the cooler dissipates heat. All reviews I've seen have shown the Fury X draws more power. So sure, the cooler dissipates heat faster, but if you were to do an apples to apples comparison of the heat being thrown out by these cards, the Fury would put out more.

Agreed. People don't realize this, but all the energy becomes heat eventually. Also, in stress testing the VRMs still reached 100°C even with water cooling. The review said the fan wouldn't kick into a higher gear, either; why wouldn't you let the fan run for people who want to OC the cards?

I disagree with people crying about 4GB. It has not been shown to impact anything, no matter how many times [H] tries to assert it (in benchmarks where something like 5 of the 7 games are GameWorks titles). If a 295X2 is not impacted by 4GB at 4K, the Fury X won't be; that is common sense. Sites trying to point to the size of reserved VRAM on the card are either ignorant or just trying to blow smoke.

The card's performance is fine, but unfortunately they probably aren't making the profit that Nvidia is on the 980 Ti, which is at least marginally better; people in this thread have mentioned that a few times. I made a post two days ago asking what people thought the comparative margins would be, got asked if I was a shill, and a mod locked the thread (apparently it is hard to review my post history and see that I have 290s in CF and have repeatedly defended AMD from BS on this forum). The fact remains, however, that they are likely not making that profit; their entire lineup is mispriced, and I don't see any volume sales for them outside of consoles and Apple at this point, which is far from ideal.

With such a small performance lead in many games over even a 290X, I do not see how any of the cut-down Fury products are going to be purchases for anyone unless they are priced near a 980, which is where the 390X currently sits. And if it has 75% as many cores as the Fury X, I think it will be around the performance of a 390X anyway.

I am... disappointed. We need real competition in this space. Hawaii was an awesome chip: the custom cards were much more powerful than Nvidia's offerings, and the 290 and 290X continue to be fairly competitive in today's games against current products. AMD needs a win. Consumers need AMD to have a win.
 

meloz (Senior member, joined Jul 8, 2008)
IMO there is nothing wrong with not claiming the outright performance crown. As long as performance-per-$ is competitive, a card will sell.

What ruins the Fury X is the stuttering, or frame lag, issue. TechReport has documented this nicely in their review of the new card. This is a driver problem (not a hardware problem) that AMD has refused to fix months (years?) after it was first discovered. Users notice this; gameplay is much smoother on a similarly priced Nvidia card than on AMD's Fury X. Yet AMD does not take these things seriously...

IMO, were it not for the random frame lag issue, this new card would have sold in decent numbers until the next-generation AMD GPU was ready. But right now things are not looking good.