AMD Fury X Postmortem: What Went Wrong?

So DX12 will be the "saviour" of Fury?

Even if, just for fun, we imagine this to somehow be true, it's not really working out for the Fury, is it?

Stop bargaining for excuses. The result is what the result is.

And in late 2016/early 2017 everyone will be busy with their new 14/16nm flagships with something like 16GB VRAM and the Fury will be long forgotten.

Sort of like how Win 8 would be the savior of Bulldozer...
 

Stormflux

Member
Jul 21, 2010
I do agree banking on DX12 is a shortfall right now. However, the low-level API stuff can't be dismissed. Apparently most of the VR showcases at E3 (and there were plenty) were running on Fury setups with LiquidVR, and the impressions were mostly glowing.

Between now and the 14/16nm GPUs with HBM2, there are major launches for VR: first the Valve/HTC Vive in the fall of 2015, followed by Oculus in the spring of 2016.

It's been pretty much confirmed that Mantle lives on as AMD's internal experimental API, and LiquidVR is a direct spawn of that. I've honestly heard zero buzz about nVidia and/or GameworksVR at E3, but if someone else has, some information would be welcome.

Of course, it remains to be seen, but AMD seems better poised for this new market with LiquidVR and its recent architectural choices. You can add "Wait for VR" to your savior list, I suppose.

 

looncraz

Senior member
Sep 12, 2011
It seems you only read what you wanted to read; I said currently available 780s at the time of the 290 series launch. As you stated in your post, the 290X was pretty much neck and neck with the original Titan. The problem with that is that some of the custom 780s (the Classified, for one) were already outpacing the Titan (some models substantially while overclocked). Is this a completely fair comparison? No... but that is what happens when you get beaten to market by several months: vanilla versus custom. It's happening again right now.

You could get 1.15GHz+ on a 290X with max fans and overtake the best 780s on the day of release, though it still took some time before all the utilities supported changing the voltage, and even more time before the drivers matured. While some overclocked 780s could overtake the Titan, the 290X did that at stock clocks.

You just had to deal with going deaf, but you could beat or match the best 780s with a 290X :\

Today, the performance difference between the GTX 780 and the 290X is even greater, as the 290 series has aged very well and the 780 has aged comparatively poorly. And since the same GPUs are in the 390/390X, they will continue to improve. Of course, if you upgrade yearly, that is a moot point. But I, like the overwhelming majority of people, only upgrade when it makes sense to do so... or when the need actually arises... or just when I get the itch :biggrin:
 

looncraz

Senior member
Sep 12, 2011
So DX12 will be the "saviour" of Fury?

Even if, just for fun, we imagine this to somehow be true, it's not really working out for the Fury, is it?

Stop bargaining for excuses. The result is what the result is.

And in late 2016/early 2017 everyone will be busy with their new 14/16nm flagships with something like 16GB VRAM and the Fury will be long forgotten.

https://www.youtube.com/watch?v=XzFe5OOHZko
 

looncraz

Senior member
Sep 12, 2011
Kind of agree here. I don't expect any large number of DX12 games for at least 2 years, probably 3.

By that time, unless devs have smartened up on their ports, the Fury X simply will not have enough VRAM for 4K gaming. The massive shader array will help a lot, but I expect a number of games to have problems at 4K.

If 14nm GPUs get a buttload of VRAM, devs will simply bloat VRAM requirements even more, and GPUs with 4GB will be left in the dust, similar to how GPUs with 2GB are today.

I think DirectX 12's adoption rate will be drastically faster than prior versions for a few reasons:

1. It makes accomplishing existing tasks from DX11 easier.
2. It can have a fairly drastic increase in performance.
3. XBox One is going to get it, so console ports will be much easier...
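
On point 2, the performance story is mostly CPU-side: DX12 lets many threads record command lists in parallel instead of funneling every draw through one DX11 immediate context. Here's a minimal conceptual sketch of that threading model, in plain C++ with a hypothetical CommandList stand-in, not real D3D12 calls:

```cpp
#include <cstddef>
#include <cstdio>
#include <string>
#include <thread>
#include <vector>

// Stand-in for a D3D12-style command list; NOT the real D3D12 API.
// The point: each thread records into its own list with no shared
// immediate context, which is where DX12 saves CPU time over DX11.
struct CommandList {
    std::vector<std::string> commands;
    void draw(int meshId) {
        commands.push_back("draw mesh " + std::to_string(meshId));
    }
};

int main() {
    const int numThreads = 4;
    const int drawsPerThread = 8;
    std::vector<CommandList> lists(numThreads);
    std::vector<std::thread> workers;

    // DX12 model: command lists recorded concurrently, one per thread.
    for (int t = 0; t < numThreads; ++t) {
        workers.emplace_back([&lists, t] {
            for (int m = 0; m < drawsPerThread; ++m)
                lists[t].draw(t * drawsPerThread + m);
        });
    }
    for (auto& w : workers) w.join();

    // "Submission" then happens once, in order, analogous to
    // ExecuteCommandLists on a D3D12 command queue.
    std::size_t total = 0;
    for (const auto& l : lists) total += l.commands.size();
    std::printf("recorded %zu draws across %d threads\n", total, numThreads);
}
```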
 

looncraz

Senior member
Sep 12, 2011
If anything, it will only increase faster as we move to real 64-bit games.

64-bit will have very little effect on GPU RAM requirements, but it will have an effect on system RAM and storage requirements. Most of what is stored on the GPU consists of frame buffers, textures, meshes, etc.
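
Rough numbers to back that up (my own back-of-envelope figures, purely illustrative): render targets are megabytes, not gigabytes, and none of it depends on CPU pointer width.

```cpp
#include <cstdio>

int main() {
    // Back-of-envelope VRAM figures; illustrative assumptions only.
    const double MiB = 1024.0 * 1024.0;
    const double w = 3840.0, h = 2160.0;    // 4K render target

    double colorTarget = w * h * 4.0 / MiB;  // 32-bit RGBA: ~31.6 MiB
    double gBuffer     = 5.0 * colorTarget;  // e.g. a 5-target deferred G-buffer
    double bigTexture  = 4096.0 * 4096.0 * 4.0 * (4.0 / 3.0) / MiB; // uncompressed 4096^2 + mip chain

    std::printf("4K color target:              %6.1f MiB\n", colorTarget);
    std::printf("5-target G-buffer:            %6.1f MiB\n", gBuffer);
    std::printf("one 4096^2 texture (w/ mips): %6.1f MiB\n", bigTexture);
    // These are pixels and texels, not pointers, so moving game code
    // from 32-bit to 64-bit leaves the VRAM math unchanged.
}
```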
 

tential

Diamond Member
May 13, 2008
Kind of agree here. I don't expect any large number of DX12 games for at least 2 years, probably 3.

By that time, unless devs have smartened up on their ports, the Fury X simply will not have enough VRAM for 4K gaming. The massive shader array will help a lot, but I expect a number of games to have problems at 4K.

If 14nm GPUs get a buttload of VRAM, devs will simply bloat VRAM requirements even more, and GPUs with 4GB will be left in the dust, similar to how GPUs with 2GB are today.
Kind of? I definitely agree. Purchasing a card for a future improvement that may or may not come is ridiculous, especially for DX12. By that time, like Shintai said, there will be far better cards. I mean, how many times have people here said to get a stopgap card?

If you get a 980 Ti/Fury X to hold on to for a long period of time, you're just stupid; there's no polite way of saying it. I wanted both cards badly, but meh, I'm not a big enough gamer to hold on to those cards for only 1-2 years and then eat a decent loss. I'll probably eat that same loss anyway, but on some R9 200s, where my initial investment is less but I get more performance (specific to my situation).
I'm not changing my purchasing decision for DX12 games; there will surely be vastly better cards out by then.
 

tential

Diamond Member
May 13, 2008
The console generation determines cross-platform game development; everything revolves around that unless the studio is very PC-centric.

Notice how VRAM requirements suddenly spiked once games came out that were designed for the PS4/Xbone from the ground up?

So even if you have 16GB VRAM GPUs in the next 2 years, it would be useless for gaming in most titles. We can't even tell the difference between the 980 Ti's 6GB and the Titan X's 12GB at 4K. It's not going to suddenly spike when devs will be limited by console hardware for the next 4-5 years of the cycle.
People will of course skew your words and pretend they don't understand the meaning, but I do, and yes, I agree. This is exactly what I said in the GTX 770 2GB days.

Funny to see the people who said 2GB was fine for a while now say that VRAM will continuously shoot up and won't level off at any point in time.

Whatever they need to make their argument, though.
 

railven

Diamond Member
Mar 25, 2010
Kind of? I definitely agree. Purchasing a card for a future improvement that may or may not come is ridiculous, especially for DX12. By that time, like Shintai said, there will be far better cards. I mean, how many times have people here said to get a stopgap card?

If you get a 980 Ti/Fury X to hold on to for a long period of time, you're just stupid; there's no polite way of saying it. I wanted both cards badly, but meh, I'm not a big enough gamer to hold on to those cards for only 1-2 years and then eat a decent loss. I'll probably eat that same loss anyway, but on some R9 200s, where my initial investment is less but I get more performance (specific to my situation).
I'm not changing my purchasing decision for DX12 games; there will surely be vastly better cards out by then.

Yerp, pretty much. I bought a GPU for today, not two years from now. Two years from now I'll be on my third card (I'm on a 1-year upgrade cycle, but my old card goes to the GF when I buy a new one; she doesn't care about bleeding edge, just 1080p@60 in The Sims 4).

Hopefully, by my next upgrade cycle, AMD will have a card that doesn't make me second-guess wanting to buy it.
 

looncraz

Senior member
Sep 12, 2011
Yes, and AMD's Win10 driver isn't full-featured yet, so it may simply not process certain parts.

You can also conclude the opposite from this:
[benchmark chart]


So it's better to wait for an actual full-featured driver, not one distributed only through Windows Update as beta support.


Windows 10 just uses the Windows 8 driver. I just installed Windows 10 on another partition and installed the Omega driver so I can do a direct comparison. I'll be back :cool:
 

at80eighty

Senior member
Jun 28, 2004
I don't get this. DX12 changes things, but this GPU finds its foundation in an older DX11 GPU, right? I might believe some driver stuff was up, but I doubt that DX12 (when compared to a 980 Ti in DX12, so apples to apples) will close the gap all by itself.

I'd like to see a deep dive by AT at some point to understand how much of an evolution GCN has gone through. No one is expecting miracles, but perhaps the approach with Fiji will make more sense as time goes by.

In particular, seeing a now-realistic convergence of LiquidVR, TrueAudio, and whatever HBM offers in terms of latency for new platforms like Oculus will be interesting to watch.
 

at80eighty

Senior member
Jun 28, 2004
I think DirectX 12's adoption rate will be drastically faster than prior versions for a few reasons:

1. It makes accomplishing existing tasks from DX11 easier.
2. It can have a fairly drastic increase in performance.
3. XBox One is going to get it, so console ports will be much easier...

4. MS making it practically a no-brainer for everyone from Win7 onwards to upgrade for free within the year means devs will have a near-guarantee of a large user base
 

.vodka

Golden Member
Dec 5, 2014
DX12 isn't Fury's savior, nor is it one for the rest of AMD's lineup. DX11 is nV's trophy, for several reasons (better-threaded/lower-overhead drivers, GameWorks, etc.).


For those of us who don't upgrade hardware as often and are using AMD cards, there seems to be much to gain from the DX12 transition, from the hardware itself (GCN's particular features, discussed here and in other threads) to the software (DX12 means bye-bye to AMD's poorly threaded, high-overhead DX11 driver, versus nV's, going forward). That was my point a few pages back. Of course, if you're in the market for a card right now, you'll choose what performs best for your intended usage, and the better all-round card right now seems to be the 980 Ti, at least at lower resolutions. Higher resolutions are another story. But the vast majority of us are on 1080p-1440p, and there the 980 Ti is king.


In the future? I don't know. Fury is bound to get some driver-side polishing, as the 7970 and 290X did back in their time. As stated in a few posts above, DX12 adoption will be much faster than for any DX version before it, but it will not happen overnight. Nor will DX12 games be released overnight.

There's a long way to that future. DX12 affords AMD's graphics side a new start, if you like. Let's hope, for their sake and ours, that they make good use of the opportunity and provide better competition going forward. (Not that Fury isn't competitive with the 980 Ti; it's not as competitive as the paper specs suggested, but it shows up to the fight, usually ties, and sometimes wins, given the necessary conditions.) If it gets a price cut, it suddenly becomes *much* more attractive... and we still have to see the rest of the Fiji lineup. AMD executed much better this time than with the 290/290X in this regard: no noisy blower from hell and no poorly performing product because of it. They are learning... slowly, but they are learning and getting better at it. Still, there's some basic stuff they need to take care of, like not calling your product an overclocking monster when the necessary tools aren't available at launch.

Time will tell.
 

parvadomus

Senior member
Dec 11, 2012
This, combined with poor performance at lower resolutions (not CPU overhead, because 295X2 scaling is still great, with no CPU bottlenecks), suggests the Fury X has rushed, unoptimized drivers.

The 295X2 renders 2 frames at the same time, probably using 2 driver threads, each one having its own DX11 context.
IMHO, Fury is bottlenecked at lower resolutions. And the throttling the card does to stay under 300W is insane; that's why games have spikes.
AMD needs to unlock the beast and let it consume whatever it wants. Overclocking doesn't even raise power consumption (almost). :hmm:
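
For anyone unfamiliar, that two-frames-at-once scheme is alternate-frame rendering (AFR): frame N is assigned to GPU N mod 2, so each GPU (and, if the speculation above is right, each driver thread) owns a whole frame. A toy sketch of just the dispatch logic (my illustration, not AMD's driver code):

```cpp
#include <cstdio>

// Toy alternate-frame rendering (AFR) dispatcher for a dual-GPU card:
// even frames go to GPU 0, odd frames to GPU 1, so both GPUs can be
// working on different frames at the same time.
struct Gpu {
    int id;
    void renderFrame(int frame) const {
        std::printf("GPU %d renders frame %d\n", id, frame);
    }
};

int main() {
    const Gpu gpus[2] = {{0}, {1}};
    for (int frame = 0; frame < 6; ++frame) {
        // AFR assignment: frame N -> GPU (N % 2). In a real driver each
        // GPU would get its own submission thread/context, per the post.
        gpus[frame % 2].renderFrame(frame);
    }
}
```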
 

Stuka87

Diamond Member
Dec 10, 2010
I think DirectX 12's adoption rate will be drastically faster than prior versions for a few reasons:

1. It makes accomplishing existing tasks from DX11 easier.
2. It can have a fairly drastic increase in performance.
3. XBox One is going to get it, so console ports will be much easier...

4. Windows 10 is free and is better than Windows 8/8.1 in every way. I actually like it more than 7 as well.
 

Headfoot

Diamond Member
Feb 28, 2008
The 295X2 renders 2 frames at the same time, probably using 2 driver threads, each one having its own DX11 context.
IMHO, Fury is bottlenecked at lower resolutions. And the throttling the card does to stay under 300W is insane; that's why games have spikes.
AMD needs to unlock the beast and let it consume whatever it wants. Overclocking doesn't even raise power consumption (almost). :hmm:

You just turn your PowerTune limit up on the 295X2; problem solved.
 

Mako88

Member
Jan 4, 2009
4. Windows 10 is free and is better than Windows 8/8.1 in every way. I actually like it more than 7 as well.

Looking forward to it; I never made the jump to 8 as it wasn't needed (7 was solid, why risk it, right?).

But 10 looks like it has some fun toys to play with; it could be a good one.
 

mizzou

Diamond Member
Jan 2, 2008
The days of new designs gaining a huge advantage over the competition are over. The onus for the next big leap is in software's court, IMO.

This doesn't look remotely like a catastrophic failure. If the price is right, it will sell.

Will it make a huge profit? Probably not.
 

Sunaiac

Member
Dec 17, 2014
There's nothing wrong with the Fury.

People telling us it's useless because it has 10% less performance than a 980 Ti for the same price, while ignoring its size, thermals, and noise, are the same people who told us the 7970 GHz was less interesting than a 10% slower 680 that cost more because the latter had better size, thermals, and noise.
 

shady28

Platinum Member
Apr 11, 2004
There's nothing wrong with the Fury.

People telling us it's useless because it has 10% less performance than a 980 Ti for the same price, while ignoring its size, thermals, and noise, are the same people who told us the 7970 GHz was less interesting than a 10% slower 680 that cost more because the latter had better size, thermals, and noise.

OK, size. Yes, it's a half-slot card, but it has an external water cooler that needs an external fan bracket to fit, along with the extra piping. That's a major loss for most people.

On thermals, the only advantage is keeping your GPU cool, which in and of itself means nothing to most people. Conservation of energy basically says that if you put 400W of power into something, that energy doesn't just disappear.

So where do you suppose this power goes?

[power consumption chart]

I'll tell you where it goes: into your living room/office/bedroom, etc. The 'thermal advantage' is only in keeping the chip itself cool, because the cooler moves the heat away more efficiently. All of that power is still converted to heat and dumped into the surrounding environment, and Fury is going to put out more heat than its competitors due to its higher power draw.
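
Quick sanity check on the numbers (the ~400W figure is from the text above; the 275W is, to the best of my knowledge, Fury X's rated typical board power): essentially every watt a card draws ends up as heat in the room.

```cpp
#include <cstdio>

int main() {
    // Conservation of energy: virtually all electrical power drawn by a
    // GPU is dissipated as heat into the room. 1 W = 3.412 BTU/hr.
    const double wattsToBtuPerHr = 3.412;
    const double boardPowerW[] = {275.0, 400.0}; // rated TBP vs. the ~400W figure above

    for (double w : boardPowerW) {
        std::printf("%3.0f W -> ~%4.0f BTU/hr of heat into the room\n",
                    w, w * wattsToBtuPerHr);
    }
    // For scale: a small space heater on low is ~750 W (~2550 BTU/hr).
}
```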

On noise, that's a tradeoff: at idle, Fury is louder than any of the competition; under load, it's quieter.

The only place Fury is a clear winner is 4K gaming. Even then, it can lose in situations where the textures exceed its 4GB of memory, which is arguably not enough for 4K. Many games aren't really playable at 4K anyway. So you've got a pretty narrow set of use cases where Fury makes sense.

Now if this were a $550 card, it would be a contender. But it isn't $550.

I think the Fury Nano is the only card that might be a good price/perf/power combination, but only if it comes in at ~$450 with near-GTX 980 levels of performance. If it comes in at GTX 970 levels of performance, which is what I'm beginning to suspect, it'll be a dud as well.
 

railven

Diamond Member
Mar 25, 2010
There's nothing wrong with the Fury.

People telling us it's useless because it has 10% less performance than a 980 Ti for the same price, while ignoring its size, thermals, and noise, are the same people who told us the 7970 GHz was less interesting than a 10% slower 680 that cost more because the latter had better size, thermals, and noise.

I bought an original 7970, before I knew there would be a GHz edition and months before the GTX 680 launched (and recently I was laughed at for A) not waiting for the GTX 680 to launch before making my decision, and B) not waiting for the GHz edition, which was $50 cheaper and came with 3 games... because I was supposed to know all of that back in 01/12?)

Anyway, this time I did wait... I waited for the Fury X, which was teased for almost 3 weeks before it finally launched. I waited through E3 and then for the release. This is what I saw:

7970 vs. GTX 680:
more VRAM
faster
more power consumption
louder and hotter, but
more options for cooling
cheaper

So if I had waited until June 2012 to finally buy, the 7970 was a no-brainer.

This time, Fury X vs. 980 Ti:
less VRAM
slower
more power consumption
quieter and cooler, because of the included water cooler
same price

Ryan said it best:
Once you get to a straight-up comparison, the problem AMD faces is that the GTX 980 Ti is the safer bet. On average it performs better at every resolution, it has more VRAM, it consumes a bit less power, and NVIDIA’s drivers are lean enough that we aren’t seeing CPU bottlenecking that would impact owners of 144Hz displays. To that end the R9 Fury X is by no means a bad card – in fact it’s quite a good card – but NVIDIA struck first and struck with a slightly better card, and this is the situation AMD must face. At the end of the day one could do just fine with the R9 Fury X, it’s just not what I believe to be the best card at $649.

See, you can be critical without being negative. He isn't saying the card is a "failure OMG11!!!1"; he's pointing out that at the same price point, there is a better alternative.

And I agree, as my sig reflects.