AMD Fury X Reviews


ocre

Golden Member
Dec 26, 2008
1,594
7
81
And you'd be wrong. There are code copy-pastes. Some of the code documentation is copy-pasted over. This topic has been exhaustively covered in threads dedicated to it. Vulkan is a fork of Mantle, DX12 is Mantle's spiritual successor/co-development, and the evidence is strong that it uses the exact same code in places.

Well, if that is the case...

there goes all hope for Fury and DX12. The assertion is that the chip was designed for DX12, and then you say that DX12 is the same code as Mantle?

Well, do the math there.

Fury's Mantle performance is... well, let's just say it absolutely is not helping at the moment. Since DX12 is Mantle code, as you say, I can't see how it is safe to assume that all of a sudden there will be this boost in performance. One that isn't even there for Fury and Mantle.

The logic fails.
 
Feb 19, 2009
10,457
10
76
There's a reason why many devs who had experience with both the Mantle SDK and DX12 have said they are very similar. It's no coincidence that the code, documentation and even terminology are similar.

Whether you believe AMD helped MS develop DX12 since they worked closely on the XBONE, or whether MS developed it themselves, taking the XBONE API and modifying it for Windows, it's still developed around the same uarch: GCN.

You can keep your head in the sand and deny it all you want. GCN is in a better position for future-proofing due to the new APIs, and it is the dominant uarch targeted by studios making cross-platform games (pretty much every AAA title). This future-proofing has already happened; we've seen how strong GCN is against Kepler. We'll see the same of GCN vs Maxwell come DX12.

@ocre
Fury X Mantle works great in Civ BE, Dragon Age Inq and Thief. It doesn't work for BF4. It has done its job in showcasing the technology. It lives on in Vulkan, Metal, LiquidVR and DX12. It would be foolish for AMD to spend the little resources they have maintaining Mantle and game-specific optimizations for it. They should focus on DX12 instead.
 

desprado

Golden Member
Jul 16, 2013
1,645
0
0
There's a reason why many devs who had experience with both the Mantle SDK and DX12 have said they are very similar. It's no coincidence that the code, documentation and even terminology are similar.

Whether you believe AMD helped MS develop DX12 since they worked closely on the XBONE, or whether MS developed it themselves, taking the XBONE API and modifying it for Windows, it's still developed around the same uarch: GCN.

You can keep your head in the sand and deny it all you want. GCN is in a better position for future-proofing due to the new APIs, and it is the dominant uarch targeted by studios making cross-platform games (pretty much every AAA title). This future-proofing has already happened; we've seen how strong GCN is against Kepler. We'll see the same of GCN vs Maxwell come DX12.

@ocre
Fury X Mantle works great in Civ BE, Dragon Age Inq and Thief. It doesn't work for BF4. It has done its job in showcasing the technology. It lives on in Vulkan, Metal, LiquidVR and DX12. It would be foolish for AMD to spend the little resources they have maintaining Mantle and game-specific optimizations for it. They should focus on DX12 instead.

By Henry Moreton on March 20, 2014

Speaking to a crowd of about 300 developers and press, Anuj Gosalia, development manager of DirectX at Microsoft, described DX12 as the joint effort of hardware vendors, game developers and his team. Our work with Microsoft on DirectX 12 began more than four years ago with discussions about reducing resource overhead. For the past year, NVIDIA has been working closely with the DirectX team to deliver a working design and implementation of DX12 at GDC.
Happy?

AMD said in 2012 that there is no DX12, and we know that is not the first time AMD has lied.
 
Feb 19, 2009
10,457
10
76
Ask a PR person before they are allowed to reveal some new tech, and of course they will deny its existence. Always take PR with a grain of salt. ;)

But certainly, AMD and MS worked together on DX12, which is similar to Mantle and the XBONE's API. NV joined the bandwagon late and added FL12.1 since they wanted something unique as a selling point. I bet you DX12 GameWorks will push 12.1 to the max. :D
 

desprado

Golden Member
Jul 16, 2013
1,645
0
0
Ask a PR person before they are allowed to reveal some new tech, and of course they will deny its existence. Always take PR with a grain of salt. ;)

But certainly, AMD and MS worked together on DX12, which is similar to Mantle and the XBONE's API. NV joined the bandwagon late and added FL12.1 since they wanted something unique as a selling point. I bet you DX12 GameWorks will push 12.1 to the max. :D
Wrong. AMD said clearly in 2012 that there is no DX12. That is why people do not take AMD seriously.
 
Feb 19, 2009
10,457
10
76
Looks like CF just wrecks SLI. In dual-GPU configs, as we've seen, it beats the reference Titan X and 980Ti easily and matches OC'd 980Ti models.

In quad-GPU setups, it blows NV away. XDMA scaling is awesome at 4K.

Since no single GPU is enough for 4K, I would have to conclude that multi-GPU is required, and as such Fury X wins the 4K contest. This would also apply to 1440p with DSR/VSR from 4K.

https://www.youtube.com/watch?v=fFpy2L3B8lk

xBbMaxh.jpg


XsDlGeU.jpg


Reference Titan X cards have a tendency to throttle due to temps in multi-GPU configs. Gotta max that fan speed!
FC4 is benched with NV GameWorks features enabled. Fury X doesn't care!
KzBvqnr.jpg
 

desprado

Golden Member
Jul 16, 2013
1,645
0
0
Looks like CF just wrecks SLI. In dual-GPU configs, as we've seen, it beats the reference Titan X and 980Ti easily and matches OC'd 980Ti models.

In quad-GPU setups, it blows NV away. XDMA scaling is awesome at 4K.

Since no single GPU is enough for 4K, I would have to conclude that multi-GPU is required, and as such Fury X wins the 4K contest. This would also apply to 1440p with DSR/VSR from 4K.

https://www.youtube.com/watch?v=fFpy2L3B8lk

xBbMaxh.jpg


XsDlGeU.jpg


Reference Titan X cards have a tendency to throttle due to temps in multi-GPU configs. Gotta max that fan speed!
KzBvqnr.jpg

http://forums.overclockers.co.uk/showthread.php?t=18678073&page=113
See what Kaap said there.

The guy doing the benchmark is really upset after the Fury X's disastrous launch, so he is doing a benchmark that people won't take seriously.

Btw, there were some users saying Fury X would be 20% or 40% faster than Titan X, including you, so did you edit your post?
 
Feb 19, 2009
10,457
10
76
Nah, some people just aren't happy that Fury X in the top configs for 4K stomps all over the 980Ti and Titan X.

I never said it would be that much faster. I said I HOPE it's ~15-20% faster, as that would justify me upgrading. At this point, if I were to go 4K gaming, CF Fury X is the only solution worthwhile.

Reference 980Ti cards get too hot and noisy in multi-GPU setups; Hardware.fr found major throttling on auto fan. An open-air 980Ti dumps 500-600W of heat into my case, which isn't clever for multi-GPU setups, recycling hot air. The only other contender is EVGA's Hybrid 980Ti, but it comes with a $100 premium. Seeing as Fury X CF scales so well, it's a no-brainer.
 

n0x1ous

Platinum Member
Sep 9, 2010
2,574
252
126
if I were to go 4K gaming, CF Fury X is the only solution worthwhile.

Pretty ridiculous statement. Are you just searching for any edge case where Fury comes out ahead? CF at 4K with 4GB of VRAM: GCN may be future-proof, but that setup sure wouldn't be. Nothing like 4-way CF/SLI numbers to show us who's better! /s

Not to mention the frame times are brutal for any 4-way config, and relying on AMD's 4-way CF drivers? lol
 

desprado

Golden Member
Jul 16, 2013
1,645
0
0
Nah, some people just aren't happy that Fury X in the top configs for 4K stomps all over the 980Ti and Titan X.

I never said it would be that much faster. I said I HOPE it's ~15-20% faster, as that would justify me upgrading. At this point, if I were to go 4K gaming, CF Fury X is the only solution worthwhile.

Reference 980Ti cards get too hot and noisy in multi-GPU setups; Hardware.fr found major throttling on auto fan. An open-air 980Ti dumps 500-600W of heat into my case, which isn't clever for multi-GPU setups, recycling hot air. The only other contender is EVGA's Hybrid 980Ti, but it comes with a $100 premium. Seeing as Fury X CF scales so well, it's a no-brainer.
You want me to show the post where you claimed that Fury X would be 20% faster than Titan X?

That is the reason I was not posting in the rumor topic. I believe in fact and reality, where NVIDIA > AMD.
 

desprado

Golden Member
Jul 16, 2013
1,645
0
0
Pretty ridiculous statement. First, are we taking some non-professional, non-repeatable YouTuber as gospel here? Are you just searching for any edge case where Fury comes out ahead? CF at 4K with 4GB of VRAM: GCN may be future-proof, but that setup sure wouldn't be. Nothing like 4-way CF/SLI numbers to show us who's better! /s

Not to mention the frame times are brutal for any 4-way config, and relying on AMD's 4-way CF drivers? lol
I don't take these guys seriously. They were out of control in the Fury X rumor thread, with some guys saying Fury X would be 20% faster than Titan X and others saying it would be 40% faster.
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
Of course, it's your choice. I just don't see it as a no-brainer. People didn't have to hang on to their 780/780 Ti very long for Hawaii to beat them pretty badly. NVIDIA's overclocking headroom has been touted as far back as the 680. In the end, it hasn't helped. And it's not Tahiti's or Hawaii's VRAM advantage that has allowed them to mature better. Bandwidth is more likely having a bigger effect, and Fury has that in spades.

Again, I'm not disagreeing with what you are saying, except for the no-brainer part.

I can remember the exact same type of comment being made about the 680. In the end, though, the 7970 was the better purchase. Currently there is virtually no performance difference in neutral games, so that's more or less a wash. There is good evidence from the last few generations that AMD will outperform NVIDIA's offerings in the future.

Good that you brought that up. Remember back during the 780 vs R9 290/X days, when lots of members here touted the 780's ability to OC (Balla!) as the main factor in why the 780 was superior to the R9 290, and likewise the 780Ti to the R9 290X? How quickly it changes, and GCN leaves Kepler in the dust.

People who think the 980Ti with its 6GB of VRAM has more future-proofing fail to understand that DX12 is built with Mantle as a foundation and has mostly Mantle-like features. GCN being the core that future APIs were designed for, as well as what cross-platform developers cater to due to the consoles... obviously GCN is the more future-proof uarch.

In regards to VRAM, none of these GPUs are capable of running playable, fluid settings where 4 vs 6GB of VRAM matters. Even in dual-GPU configs, turning on 4x MSAA at 4K kills performance in newer titles (GTA V), while older titles don't stress VRAM (Tomb Raider, Metro etc).

So...
Clearly we have seen a few games since Hawaii that have performed very, very poorly on Kepler cards. But these statements are only true when you pick one of those games. The big picture, going by more than a couple of console ports that favor GCN (hmmm)... you see that the situation is not exactly what you guys are projecting.

I am sure you can find a few games where Hawaii beats the 780Ti "pretty bad". But overall, and over many titles, we see the whole story.
I am not trying to take away the great accomplishment AMD has had when it comes to GCN drivers. They have made some pretty significant gains over the years. But I will just post some modern reviews with many modern and popular games.

https://www.techpowerup.com/reviews/AMD/R9_Fury_X/31.html
http://www.techpowerup.com/reviews/MSI/GTX_980_Ti_Gaming/30.html

With 20+ games, Hawaii isn't beating Kepler badly. It isn't leaving Kepler in the dust. Even GK104 is keeping pace with Tahiti.

It is an inescapable fact that AMD has made some great improvements with GCN performance. They have gained a lot on Kepler. But beating NVIDIA pretty badly... on a game-by-game basis, there are cases such as these. For the bigger picture, though, you probably need to tone it down a little bit.

When the 290X launched, it was slower than the 780Ti. Today, it's as fast or faster in most cases. This is great. We have games that favor GCN. There are several modern console ports that favor GCN over Kepler. AMD said that winning both next-gen consoles would give them an advantage, and it makes sense that it would. Games built specifically for x86 consoles with GCN graphics... it is not beyond reason to think this would be beneficial when those games get ported to PC.
So the fact that there are games that favor GCN over Kepler is not so surprising.
 
Feb 19, 2009
10,457
10
76
Pretty ridiculous statement. Are you just searching for any edge case where Fury comes out ahead? CF at 4K with 4GB of VRAM: GCN may be future-proof, but that setup sure wouldn't be. Nothing like 4-way CF/SLI numbers to show us who's better! /s

Not to mention the frame times are brutal for any 4-way config, and relying on AMD's 4-way CF drivers? lol

If YouTubers aren't your thing, go read hardware.fr; they are pretty respectable. CF Fury X pwns SLI 980Ti there.

Or go read TweakTown; CF Fury X pwns SLI 980Ti there also.

Or read Digital Storm; CF Fury X pwns SLI Titan X there as well.

Until other sites cover multi-GPU, what we have so far shows CF pwns SLI at 4K.
 
Feb 19, 2009
10,457
10
76
You want me to show the post where you claimed that Fury X would be 20% faster than Titan X?

That is the reason I was not posting in the rumor topic. I believe in fact and reality, where NVIDIA > AMD.

In a speculation thread? It's speculation, and it's something I had hoped for to make it worth my $ to upgrade.

As it is, in top multi-GPU configs, Fury X wins. The more GPUs, the bigger the win.
 

desprado

Golden Member
Jul 16, 2013
1,645
0
0
If YouTubers aren't your thing, go read hardware.fr; they are pretty respectable. CF Fury X pwns SLI 980Ti there.

Or go read TweakTown; CF Fury X pwns SLI 980Ti there also.

Or read Digital Storm; CF Fury X pwns SLI Titan X there as well.

Until other sites cover multi-GPU, what we have so far shows CF pwns SLI at 4K.
TweakTown is not a respectable site; it is just like PClab.

Digital Storm, yes.

But I want to ask: how many owners of 4K multi-GPU setups are there? A vanishingly small fraction of users.

AMD has failed to win over the majority of users and has only improved the brand image of the GTX 980 Ti.
 

desprado

Golden Member
Jul 16, 2013
1,645
0
0
In a speculation thread? It's speculation, and it's something I had hoped for to make it worth my $ to upgrade.

As it is, in top multi-GPU configs, Fury X wins. The more GPUs, the bigger the win.
Worth it, yes, but speculation like "Fury X will be 20% or 40% faster than Titan X" is just childish, or a scam.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Nah, some people just aren't happy that Fury X in the top configs for 4K stomps all over the 980Ti and Titan X.

I never said it would be that much faster. I said I HOPE it's ~15-20% faster, as that would justify me upgrading. At this point, if I were to go 4K gaming, CF Fury X is the only solution worthwhile.

Reference 980Ti cards get too hot and noisy in multi-GPU setups; Hardware.fr found major throttling on auto fan. An open-air 980Ti dumps 500-600W of heat into my case, which isn't clever for multi-GPU setups, recycling hot air. The only other contender is EVGA's Hybrid 980Ti, but it comes with a $100 premium. Seeing as Fury X CF scales so well, it's a no-brainer.

Let's be honest here. The reference 980 Ti is off the table for anyone intelligent buying 980 Ti SLI. And I would hazard a guess that for someone like yourself, who is very knowledgeable about GPUs, it was NEVER on the table to begin with. You would go aftermarket right off the bat.

Lots of people said not to compare with the reference Hawaii; now we are getting comparisons to the reference 980 Ti.
 

desprado

Golden Member
Jul 16, 2013
1,645
0
0
Let's be honest here. The reference 980 Ti is off the table for anyone intelligent buying 980 Ti SLI. And I would hazard a guess that for someone like yourself, who is very knowledgeable about GPUs, it was NEVER on the table to begin with. You would go aftermarket right off the bat.

Lots of people said not to compare with the reference Hawaii; now we are getting comparisons to the reference 980 Ti.

lol... Well said, bro.

G1 GTX 980 Ti SLI vs Fury X CF 4K
https://www.youtube.com/watch?v=XJYWXHOUoFY
 

nsavop

Member
Aug 14, 2011
91
0
66
I don't take these guys seriously. They were out of control in the Fury X rumor thread, with some guys saying Fury X would be 20% faster than Titan X and others saying it would be 40% faster.

It's best to ignore them. The same people were claiming 4GB of VRAM would be a fail for Fury X when the "8GB dual-link interposer" rumors were running rampant, yet now 4GB is no problem.
 
Feb 19, 2009
10,457
10
76
Let's be honest here. The reference 980 Ti is off the table for anyone intelligent buying 980 Ti SLI. And I would hazard a guess that for someone like yourself, who is very knowledgeable about GPUs, it was NEVER on the table to begin with. You would go aftermarket right off the bat.

Lots of people said not to compare with the reference Hawaii; now we are getting comparisons to the reference 980 Ti.

My prior multi-GPU setup was put on water. I didn't like open-air cards squashed next to each other; the top card gets massively warmer and louder. The ambient temperature inside a closed case rises, causing my CPU temps to go 10C hotter.

So multi-GPU, for cards in the ~250W range, works best with reference blowers, or best of all with water. Fury X definitely has a major advantage being already under water with a full warranty, unlike DIY solutions. I give a nod to the EVGA Hybrid 980Ti, a great solution and one I would seriously recommend to others who go multi-GPU.
 

desprado

Golden Member
Jul 16, 2013
1,645
0
0
It's best to ignore them. The same people were claiming 4GB of VRAM would be a fail for Fury X when the "8GB dual-link interposer" rumors were running rampant, yet now 4GB is no problem.
This... Seriously, I have never seen this kind of mess in an NVIDIA GPU rumor thread, so why does it happen with AMD?
 

desprado

Golden Member
Jul 16, 2013
1,645
0
0
My prior multi-GPU setup was put on water. I didn't like open-air cards squashed next to each other; the top card gets massively warmer and louder. The ambient temperature inside a closed case rises, causing my CPU temps to go 10C hotter.

So multi-GPU, for cards in the ~250W range, works best with reference blowers, or best of all with water. Fury X definitely has a major advantage being already under water with a full warranty, unlike DIY solutions. I give a nod to the EVGA Hybrid 980Ti, a great solution and one I would seriously recommend to others who go multi-GPU.

Do not bring up the EVGA Hybrid GTX 980 Ti, because it is nearly 30% faster than Fury X, water to water.

Fury X's real competitor is the reference GTX 980 Ti, so let's keep it that way.
 

n0x1ous

Platinum Member
Sep 9, 2010
2,574
252
126
If YouTubers aren't your thing, go read hardware.fr; they are pretty respectable. CF Fury X pwns SLI 980Ti there.

Or go read TweakTown; CF Fury X pwns SLI 980Ti there also.

Or read Digital Storm; CF Fury X pwns SLI Titan X there as well.

Until other sites cover multi-GPU, what we have so far shows CF pwns SLI at 4K.

My point was that 4-way results are meaningless. The only results that matter are 2-way, because that's the only practical config 99% of multi-GPU users will run. Yes, the Fury X scales well and catches up to the 980Ti, but it only goes from mostly losing in single-card configs to about even at 4K.
 
Feb 19, 2009
10,457
10
76
It's best to ignore them. The same people were claiming 4GB of VRAM would be a fail for Fury X when the "8GB dual-link interposer" rumors were running rampant, yet now 4GB is no problem.

Russian showed me the error of my judgement regarding 4GB being a fail for 4K. When faced with evidence to the contrary, a man changes his mind.

It's essentially a very simple concept: these GPUs lack the grunt to push 4K with AA settings that would bottleneck the VRAM.