AMD Fury X Postmortem: What Went Wrong?


Abwx

Lifer
Apr 2, 2011
11,885
4,873
136

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Take back what I said about 1080p. Some of these games don't really go too high in fps at max settings. The Fury X is 6% behind at 1440p and 9% behind at 1080p on ComputerBase, and around 12% behind a max-OC 980 Ti at 1080p. Not massive numbers, and within reach.

It's too close to the 980 though. At 1080p high it's only 7% faster than an OC 980 (basically aftermarket 980s), or 14% over the stock 980. Given the prices and the OC headroom of the 980 (vs. the OC levels we have seen so far from the Fury X), I really can't see a spot for this card at 1080p.
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
IMO there is nothing wrong with not claiming the outright performance crown. As long as performance-per-$ is competitive, a card will sell.

What ruins the Fury X is the stuttering or frame-lag issue. TechReport have documented this nicely in their review of the new card. This is a driver problem, not a hardware problem, that AMD refuses to fix months (years?) after it was first discovered. Users notice this: gameplay is much smoother on a similarly priced Nvidia card than on AMD's Fury X. Yet AMD do not take these things seriously...

IMO, were it not for the random frame-lag issue, this new card would have sold in decent numbers until the next-generation AMD GPU was ready. But right now things are not looking good.
What frame lag is there for a single GPU without hitting memory capacity?
 

YBS1

Golden Member
May 14, 2000
1,945
129
106
And what about the numbers at lower resolutions, since high res doesn't suit you?

Or are they irrelevant as well?

http://www.computerbase.de/2015-06/.../#diagramm-rating-1920-1080-normale-qualitaet

What are you seeing that I'm not seeing there??? I'm seeing the 980 Ti winning at least 75% of those benchmarks in that link. Though I will say that site is the closest I have seen the two at those resolutions. Since we're on AnandTech, let's try this link...

http://www.anandtech.com/bench/product/1496?vs=1513

Which is correct? Not sure there is a way to tell as of yet. Either way, the Fury is still behind at this early juncture.

And by the way, I never even implied ultra-high resolutions were irrelevant. However, since the two cards seem to be in a near dead heat there, I don't really see the need for much of a debate. The majority of buyers, though, are playing at 1080p-1440p.
 
Last edited:

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
This is a nice read on how Nvidia got where they are with Maxwell, and why I think AMD has done better engineering / has a better architecture. If Fury does end up beating Maxwell in the next few months, it's really something.

http://www.anandtech.com/show/9059/the-nvidia-geforce-gtx-titan-x-review/2

It's worth pointing out that Fury X only supports double precision at 1/16 the single precision rate. It's better than Maxwell in that regard, but only barely.

It's also worth pointing out that 99% of users don't give a rat's ass about double-precision support, and of those who do, the 4GB RAM limit for 1st generation HBM will usually be an insuperable obstacle.

For the handful of HPC tasks where this matters, the contest will still be between GK210 and Hawaii until FinFET becomes mature. For everyone else - gamers, HTPC users, and professionals who use applications like AutoCAD and SolidWorks - double precision support never mattered, doesn't matter now, and will continue not to matter in the future.
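
To put that 1/16 ratio in rough numbers, here is a minimal back-of-the-envelope sketch in Python. The shader counts are the published ones, but the clocks are approximate and the outputs are theoretical peaks, not measured throughput.

Code:
# Rough theoretical peak: 2 FLOPs per shader per clock (FMA),
# scaled by the architecture's FP64:FP32 ratio. Clocks are approximate.
def peak_tflops(shaders, clock_ghz, fp64_ratio):
    fp32 = 2 * shaders * clock_ghz / 1000.0   # single-precision TFLOPS
    return fp32, fp32 * fp64_ratio            # (FP32, FP64) TFLOPS

print(peak_tflops(4096, 1.05, 1/16))  # Fiji / Fury X: roughly (8.6, 0.54)
print(peak_tflops(3072, 1.00, 1/32))  # GM200 / Titan X: roughly (6.1, 0.19)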
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
This is a nice read on how Nvidia got where they are with Maxwell, and why I think AMD has done better engineering / has a better architecture. If Fury does end up beating Maxwell in the next few months, it's really something.

http://www.anandtech.com/show/9059/the-nvidia-geforce-gtx-titan-x-review/2

Any and every Maxwell part beats any and every GCN part in perf/w. Perf/w has nothing to do with double precision when double precision isn't being used. In fact, Nvidia's best perf/w chip (GM204) is 26-50% more efficient than Fiji depending on the resolution. There is no way Fiji will catch that. In fact, there is no way Fiji will catch any Maxwell in perf/w numbers. The only thing Fiji is close to competing at in performance is 4K, but the 980 TI is still a sizeable 20% more efficient at that resolution. At 1440p the 980 TI is 27% more efficient, and at 1080p it's 33% more efficient.

Nvidia went straight graphics with Maxwell, and for good reason. They knew it was going to be a while before FinFETs were viable. They knew there was only so much die size left to exploit before chips became impossible to manufacture. They saw a huge opportunity in notebooks. Nvidia executed perfectly.
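
For anyone who wants to sanity-check those percentages against a review of their choice, perf/W is just average FPS divided by board power. A minimal sketch in Python; the FPS and wattage values below are hypothetical placeholders, so substitute whichever review's numbers you trust.

Code:
# Perf/W = average FPS / measured board power in watts.
# The inputs here are illustrative placeholders, not measured data.
def perf_per_watt(fps, watts):
    return fps / watts

fury_x    = perf_per_watt(fps=60.0, watts=275.0)  # hypothetical
gtx_980ti = perf_per_watt(fps=62.0, watts=250.0)  # hypothetical

advantage = 100 * (gtx_980ti / fury_x - 1)
print(f"980 Ti is {advantage:.0f}% more efficient")  # ~14% with these made-up inputs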
 

pj-

Senior member
May 5, 2015
501
278
136

WhoBeDaPlaya

Diamond Member
Sep 15, 2000
7,415
404
126
GCN is just old at this point. It is as old as Kepler, but Kepler has been replaced and GCN has not.
Have you seen how Kepler has gone / is slowly going to sh*t in newer titles?
Compared to that, the 280X / 7970 is doing very well for its age!

GCN should remain well supported for a while yet, since it is in the consoles after all.
 
Last edited:

Abwx

Lifer
Apr 2, 2011
11,885
4,873
136
I don't see how that link helps your argument. The 980 Ti is 10% ahead of the Fury X at 1080p high quality. The 980 Ti "OC Karte" (Karte is just German for "card") is 20% ahead.

Those are significant numbers. How is the Fury X possibly going to catch up?

I say that these are not that significant...

Look at the detailed scores at 1080p: the 10% difference is mainly due to 4 games where the Fury performs nowhere near what it is supposed to, nothing that can't soon be improved by a driver...

It would have been different if the 10% were evenly distributed across the games...
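
A quick way to check that claim against any review's per-game numbers is to recompute the overall rating (sites like ComputerBase use a geometric mean) with and without the outlier titles. A minimal sketch in Python; the per-game ratios below are made up purely to illustrate the effect.

Code:
import math

# Fury X performance relative to the 980 Ti per game (1.00 = parity).
# Made-up numbers: most titles near parity, four large outliers.
results = {
    "Game A": 0.99, "Game B": 0.98, "Game C": 1.01, "Game D": 0.97,
    "Outlier 1": 0.78, "Outlier 2": 0.80, "Outlier 3": 0.82, "Outlier 4": 0.79,
}

def geomean(values):
    values = list(values)
    return math.exp(sum(math.log(v) for v in values) / len(values))

overall = geomean(results.values())
trimmed = geomean(v for k, v in results.items() if not k.startswith("Outlier"))

# With these inputs: ~0.89 overall vs ~0.99 once the outliers are excluded.
print(f"Overall rating: {overall:.2f}, excluding outliers: {trimmed:.2f}")

Whether a driver can actually recover those few titles is a separate question, but the arithmetic shows how heavily a handful of games can skew a summary rating.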
 
Last edited:

Azix

Golden Member
Apr 18, 2014
1,438
67
91
Any and every Maxwell part beats any and every GCN part in perf/w. Perf/w has nothing to do with double precision when double precision isn't being used. In fact, Nvidia's best perf/w chip (GM204) is 26-50% more efficient than Fiji depending on the resolution. There is no way Fiji will catch that. In fact, there is no way Fiji will catch any Maxwell in perf/w numbers. The only thing Fiji is close to competing at in performance is 4K, but the 980 TI is still a sizeable 20% more efficient at that resolution. At 1440p the 980 TI is 27% more efficient, and at 1080p it's 33% more efficient.

Nvidia went straight graphics with Maxwell, and for good reason. They knew it was going to be a while before FinFETs were viable. They knew there was only so much die size left to exploit before chips became impossible to manufacture. They saw a huge opportunity in notebooks. Nvidia executed perfectly.

DP takes up die space, and I am not sure they can turn those parts off, so they would also contribute to power consumption.

All Nvidia did was cut some stuff out.

You're using TechPowerUp figures. Other sites show the Fury X using less power than a stock 980 Ti, or only a small bit more:

http://www.hardwarecanucks.com/foru...9682-amd-r9-fury-x-review-fiji-arrives-8.html

http://www.tomshardware.com/reviews/amd-radeon-r9-fury-x,4196-7.html

I am not sure any of them show as big a difference as TechPowerUp. All I was saying is that it's impressive if AMD can keep a smaller, more complete chip that ends up with better performance than the larger, less complete chip.
 

meloz

Senior member
Jul 8, 2008
320
0
76
What frame lag is there for a single GPU without hitting memory capacity?

I cannot answer this, I am not qualified. I was summing up my feelings on the new AMD graphics card after a quick glance at the Techreport review.

IIRC this stutter issue was also present in 1080p gaming with many titles, and how many of those games were running against memory limits? To me this looks more like a driver issue than a hardware problem.
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
This is a nice read on how Nvidia got where they are with Maxwell, and why I think AMD has done better engineering / has a better architecture. If Fury does end up beating Maxwell in the next few months, it's really something.

http://www.anandtech.com/show/9059/the-nvidia-geforce-gtx-titan-x-review/2
I read that a while ago and can't be bothered re-reading it... but I can't understand where you get AMD doing a better job, when Maxwell uses GDDR5 (higher power use) against AMD's HBM, yet still draws less power, and the NV GPU appears to be more powerful in graphics workloads.
If high-bandwidth memory doesn't help the AMD GPU, their architecture can't be that flash!
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
IMO there is nothing wrong with not claiming the outright performance crown. As long as performance-per-$ is competitive, a card will sell.

What ruins the Fury X is the stuttering or frame-lag issue. TechReport have documented this nicely in their review of the new card. This is a driver problem, not a hardware problem, that AMD refuses to fix months (years?) after it was first discovered. Users notice this: gameplay is much smoother on a similarly priced Nvidia card than on AMD's Fury X. Yet AMD do not take these things seriously...

IMO, were it not for the random frame-lag issue, this new card would have sold in decent numbers until the next-generation AMD GPU was ready. But right now things are not looking good.

Frame lag, with what CPU? What game? I don't believe you when you say this bug has existed for "years" in AMD's drivers. They did a lot of work on frame pacing and XDMA-based CrossFire, to the point that review sites stopped using FCAT, because it would have shown NV to be inferior in terms of frame pacing.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Frame lag, with what CPU? What game? I don't believe you when you say this bug has existed for "years" in AMD's drivers. They did a lot of work on frame pacing and XDMA-based CrossFire, to the point that review sites stopped using FCAT, because it would have shown NV to be inferior in terms of frame pacing.

Do you have links proving this? Because I don't recall seeing that happening.
 

RampantAndroid

Diamond Member
Jun 27, 2004
6,591
3
81
Consumers need AMD to have a win.

Agree. I do think we saw something good happen (for the consumer, not for AMD) with the 980 Ti being released at this price point (I suspect to undercut AMD, or at least to force them to reposition their cards?).

As for 4GB, I'm looking more at the future. 4GB is fine today, and people can guess all they want, but I'm going to bet you'll see newer games exceed 4GB. It might be for good reasons, or it might be due to inefficient resource management, but I honestly think it'll happen. All else being equal, I see no reason to go with a card that has 2GB less RAM... and the Fury X isn't totally equal right now.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
I bet new GameBarelyWorks games will fill the VRAM up to 6GB. It is nv marketing 101 to cripple the competitor so they will fill the VRAM with 101 values, just for lulz.
 

DarkKnightDude

Senior member
Mar 10, 2011
981
44
91
I bet new GameBarelyWorks games will fill the VRAM up to 6GB. It is nv marketing 101 to cripple the competitor so they will fill the VRAM with 101 values, just for lulz.

Ironically they would be crippling their own 980 and 970 cards if that is the case.
 

meloz

Senior member
Jul 8, 2008
320
0
76
Frame lag, with what CPU? What game? I don't believe you when you say this bug has existed for "years" in AMD's drivers. They did a lot of work on frame pacing and XDMA-based CrossFire, to the point that review sites stopped using FCAT, because it would have shown NV to be inferior in terms of frame pacing.

Sigh, just read the Techreport review. Maybe frame lag is not the correct term? What do you call this?

[TechReport frame-time charts: Crysis 3 time spent beyond 50 ms, Battlefield 4 beyond 33 ms, GTA V beyond 16.7 ms]

And their conclusion is also damning. An excerpt:

TechReport said:
Speaking of which, if you dig deeper using our frame-time-focused performance metrics—or just flip over to the 99th-percentile scatter plot above—you'll find that the Fury X struggles to live up to its considerable potential. Unfortunate slowdowns in games like The Witcher 3 and Far Cry 4 drag the Fury X's overall score below that of the less expensive GeForce GTX 980. What's important to note in this context is that these scores aren't just numbers. They mean that you'll generally experience smoother gameplay in 4K with a $499 GeForce GTX 980 than with a $649 Fury X. Our seat-of-the-pants impressions while play-testing confirm it.
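
For reference, the "99th percentile" and "time spent beyond X ms" figures in those charts are straightforward to reproduce from a raw frame-time log. A minimal sketch in Python of roughly how such metrics are computed; the frame-time values are made up for illustration.

Code:
import numpy as np

# One entry per rendered frame, in milliseconds (made-up values).
frame_times_ms = np.array([16.2, 16.8, 17.1, 16.5, 48.0, 16.4, 35.2, 16.7])

# 99th-percentile frame time: 99% of frames complete at or below this value.
p99 = np.percentile(frame_times_ms, 99)

# Time spent beyond a threshold (33.3 ms ~ a dropped frame at 30 Hz):
# sum only the portion of each frame time that exceeds the threshold.
threshold = 33.3
time_beyond = np.sum(np.clip(frame_times_ms - threshold, 0, None))

print(f"99th percentile: {p99:.1f} ms; time beyond {threshold} ms: {time_beyond:.1f} ms")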
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
Fake or ill-thought-out outrage over paying a few dollars more for the AIO cooling. IMO, save the rage in case AMD doesn't allow voltage control on the GPU core, because then they are denying full use of that AIO. That's assuming the outraged forum posters would even consider buying it, which seems unlikely given the hyperbole used. This and other threads have set a new level of GPU trash talking; IMO, most of it boils down to "it should be priced $50 less".

Some really perverse logical pretzels get formed around AMD vs Nvidia.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
In fact, Nvidia's best perf/w chip (GM204) is 26-50% more efficient than Fiji depending on the resolution. There is no way Fiji will catch that. In fact, there is no way Fiji will catch any Maxwell in perf/w numbers.

I suggest you look at the perf/watt of Fiji vs Maxwell again after the Fury Nano is released ;)
 

flopper

Senior member
Dec 16, 2005
739
19
76
Where to begin? How about sites that test a variety of games instead of piling on crap like Project Cars, Dying Light, etc.

It basically ties the 980 Ti while running cooler and a lot quieter, and it exhausts heat out of the case, i.e. all the benefits of water cooling at the same price.

I buy AMD stuff as I can't stand Nvidia as a company; their moral and ethical compass is all crap. However, it's frustrating to see AMD shoot themselves in the foot by failing to understand that if you create expectations, you need to deliver flawlessly.

They didn't.
They needed to, but couldn't.

Still, I am buying a Fury as I find it better for what I do with my computer, and frankly, who buys old hardware like Nvidia's?

If they had presented Fury with proper expectations it would have been a great launch.
I find the card superb with early drivers,
but the internet crowd can't see that, with HardOCP and others frankly killing them with their review talk.
 

RampantAndroid

Diamond Member
Jun 27, 2004
6,591
3
81
People are saying it's priced too high. It's completely sold out. o_O

If I make 5 widgets and convince 5 people to give me their money, but am still thousands of sales away from my widget business being profitable, does that mean I'm doing well?

People already said there'd be a supply issue on day 1. Let's take a real look back in 2-3 months.

who buys old hardware like Nvidia's?


Yes, the 980 Ti is old. :rolleyes:
 

showb1z

Senior member
Dec 30, 2010
462
53
91
Ironically they would be crippling their own 980 and 970 cards if that is the case.

Nvidia has no qualms about that. GameWorks is pretty much a dial that lets them obsolete hardware at whatever pace they see fit.

Fury Pro gives AMD another chance to turn this into a win. There has to be more performance on the table, especially at lower resolutions. Hopefully they improve their drivers by the time the Fury Pro reviews come around.
 
Feb 19, 2009
10,457
10
76
Sigh, just read the Techreport review. Maybe frame lag is not the correct term? What do you call this?

And their conclusion is also damning. An excerpt:

I call it shilling. Nothing from TR is to be trusted given their record against AMD. These are the same people who refused to include Dirt in their benchmark suite because it's "biased towards AMD", yet have no problem including Project Cars, which is even more biased towards NV.

But not only that: Guru3D, which has kept doing FCAT testing all this time (unlike TR, which stopped during the 970/980 period when SLI was broken on FCAT), finds the Fury X fine. So do Sweclockers and ComputerBase.