FuryX now = 980ti 1080p/1440p > 4k

Status
Not open for further replies.

mysticjbyrd

Golden Member
Oct 6, 2015
1,363
3
0
History repeats itself!

First Fury X review by TPU:
AMD : Catalyst 15.5 Beta
NVIDIA: 353.06 WHQL

This GTX 980Ti review by TPU:
NVIDIA : 358.50 WHQL
AMD : Catalyst 15.9.1 Beta

1440P Before.
GTX 980Ti was 10% ahead
[Image: TPU relative performance chart, 2560×1440]


1440P Now
GTX 980Ti and Fury X are now equal
[Image: TPU relative performance chart, 2560×1440]


4K Before
GTX 980Ti was 2% ahead
[Image: TPU relative performance chart, 3840×2160]


4K Now
Fury X is now 4% ahead
[Image: TPU relative performance chart, 3840×2160]


Source: Newest review from TPU
Source: OCnet
 
Last edited:

VulgarDisplay

Diamond Member
Apr 3, 2009
6,193
2
76
I think most people understood that the Fury X was underperforming relative to its specs, likely due to drivers.

I expect it will be another case of it surpassing its similar NVIDIA counterpart as time passes, just like the 7970 and 290X.
 

Hi-Fi Man

Senior member
Oct 19, 2013
601
120
106
These tests aren't equal. The games being tested aren't the same in both reviews, so any conclusions drawn from comparing the total percentages are unreliable.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
You have to be careful in this case because TPU removed Project Cars and Wolfenstein that were crippling ALL AMD cards. It doesn't mean that suddenly GCN got a huge boost in performance. What it actually shows is why in statistics we remove significant outliers. It's obvious that Project CARS and Wolfenstein weren't accurately representing the average performance of AMD cards. Don't forget that 980Ti has 20-25% OCing headroom and 6GB as a bonus so it's still a better card.

Anyway, I don't think much changes overall. AMD still has the best price/performance from $100-400, while NV's 980Ti is untouched.

TechSpot has a newer article up comparing various AMD vs. NV cards as well:
http://www.techspot.com/review/1075-best-graphics-cards-2015/

I think right now NV continues to sell on brand value and perception in the $100-400 range. 380 2GB > 950, 380 4GB/280X > 960, 290 has no competition, 390 > 970. Yet, NV completely outsells AMD with 950/960/970 cards.

What's most surprising is just how much better the 280X is against the 950/960 cards, and how poorly the 780 aged. Those are far more eye-opening for me than Fury X getting slightly better against a reference 980Ti. The crazy part is how overhyped the 780 was, how people purchased it over the mostly cheaper 290, and how sites like TechReport and HardOCP completely failed consumers by not warning them about the 2GB limits on the 960, while downplaying the performance advantage of the 280X all this time, despite the latter often being within a similar price range. Once this generation is done, in 5 years, no one will care about any of these cards per se, but I'll never forget review sites that failed to point out glaring product flaws and prioritized NV's perf/watt marketing over raw GPU horsepower and VRAM. As far as I am concerned this generation was a reputation killer for certain sites that lost all the credibility they had built up over the last 10 years.

TPU FTW for listening to the consumers and understanding what outliers are.
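As an aside, the statistical point here is easy to sketch. The following Python snippet uses made-up relative-performance numbers (purely illustrative, not TPU's actual data) to show how a couple of outlier titles drag down an arithmetic average, and how a simple standard-deviation filter changes the aggregate:

```python
# Hypothetical per-game performance of Card A relative to Card B (1.0 = parity).
# These numbers are invented for illustration; they are not from any review.
results = [1.02, 0.98, 1.01, 0.97, 1.00, 0.99, 0.66, 0.74]  # last two: outlier titles

def mean(xs):
    return sum(xs) / len(xs)

def drop_outliers(xs, k=1.0):
    """Keep only results within k population standard deviations of the mean."""
    m = mean(xs)
    sd = (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5
    return [x for x in xs if abs(x - m) <= k * sd]

print(f"with outliers:    {mean(results):.3f}")                 # 0.921 -> "8% behind"
print(f"without outliers: {mean(drop_outliers(results)):.3f}")  # 0.995 -> rough parity
```

Two crippled titles are enough to move the headline number by roughly 7%, which is the same order of magnitude as the swing between the two TPU summaries in this thread.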
 
Last edited:

mysticjbyrd

Golden Member
Oct 6, 2015
1,363
3
0
These tests aren't equal. The games being tested aren't the same in both reviews, so any conclusions drawn from comparing the total percentages are unreliable.

Look at the difference in specs,
[Image: GPU spec comparison table]


Even if you want to try to argue that, it's only a matter of time.
 

mysticjbyrd

Golden Member
Oct 6, 2015
1,363
3
0
You have to be careful in this case because TPU removed Project Cars and Wolfenstein that were crippling ALL AMD cards. It doesn't mean that suddenly GCN got a huge boost in performance. What it actually shows is why in statistics we remove significant outliers. It's obvious that Project CARS and Wolfenstein weren't accurately representing the average performance of AMD cards. Don't forget that 980Ti has 20-25% OCing headroom and 6GB as a bonus so it's still a better card.

Anyway, I don't think much changes overall. AMD still has the best price/performance from $100-400, while NV's 980Ti is untouched.

TechSpot has a newer article up comparing various AMD vs. NV cards as well:
http://www.techspot.com/review/1075-best-graphics-cards-2015/

I think right now NV continues to sell on brand value and perception in the $100-400 range. 380 2GB > 950, 380 4GB/280X > 960, 290 has no competition, 390 > 970. Yet, NV completely outsells AMD with 950/960/970 cards.

What's most surprising is just how much better the 280X is against the 950/960 cards, and how poorly the 780 aged. Those are far more eye-opening for me than Fury X getting slightly better against a reference 980Ti. The crazy part is how overhyped the 780 was, how people purchased it over the mostly cheaper 290, and how sites like TechReport and HardOCP completely failed consumers by not warning them about the 2GB limits on the 960, while downplaying the performance advantage of the 280X all this time, despite the latter often being within a similar price range. Once this generation is done, in 5 years, no one will care about any of these cards per se, but I'll never forget review sites that failed to point out glaring product flaws and prioritized NV's perf/watt marketing over raw GPU horsepower and VRAM. As far as I am concerned this generation was a reputation killer for certain sites that lost all the credibility they had built up over the last 10 years.

TPU FTW for listening to the consumers and understanding what outliers are.

I think it's right to remove outliers, especially when their integrity is in question, or they were bent over a barrel with gameworks.

http://www.pcgamer.com/project-cars...entionally-crippled-performance-on-amd-cards/
 

dark zero

Platinum Member
Jun 2, 2015
2,655
138
106
I think most people understood that the Fury X was underperforming relative to its specs, likely due to drivers.

I expect it will be another case of it surpassing its similar NVIDIA counterpart as time passes, just like the 7970 and 290X.
Also, it seems that Fury X is not fully using the HBM properly... used well, it becomes a beast.

Even if I like to bash AMD, I can't bash their support. They are still supporting some old tech, like nVIDIA (hopefully), and that keeps those cards competitive for some time more.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Also, it seems that Fury X is not fully using the HBM properly... used well, it becomes a beast.

Even if I like to bash AMD, I can't bash their support. They are still supporting some old tech, like nVIDIA (hopefully), and that keeps those cards competitive for some time more.

While possible this is true, do you have any evidence of this?
 

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
You have to be careful in this case because TPU removed Project Cars and Wolfenstein that were crippling ALL AMD cards. It doesn't mean that suddenly GCN got a huge boost in performance. What it actually shows is why in statistics we remove significant outliers. It's obvious that Project CARS and Wolfenstein weren't accurately representing the average performance of AMD cards. Don't forget that 980Ti has 20-25% OCing headroom and 6GB as a bonus so it's still a better card.

Anyway, I don't think much changes overall. AMD still has the best price/performance from $100-400, while NV's 980Ti is untouched.

TechSpot has a newer article up comparing various AMD vs. NV cards as well:
http://www.techspot.com/review/1075-best-graphics-cards-2015/

I think right now NV continues to sell on brand value and perception in the $100-400 range. 380 2GB > 950, 380 4GB/280X > 960, 290 has no competition, 390 > 970. Yet, NV completely outsells AMD with 950/960/970 cards.

What's most surprising is just how much better the 280X is against the 950/960 cards, and how poorly the 780 aged. Those are far more eye-opening for me than Fury X getting slightly better against a reference 980Ti. The crazy part is how overhyped the 780 was, how people purchased it over the mostly cheaper 290, and how sites like TechReport and HardOCP completely failed consumers by not warning them about the 2GB limits on the 960, while downplaying the performance advantage of the 280X all this time, despite the latter often being within a similar price range. Once this generation is done, in 5 years, no one will care about any of these cards per se, but I'll never forget review sites that failed to point out glaring product flaws and prioritized NV's perf/watt marketing over raw GPU horsepower and VRAM. As far as I am concerned this generation was a reputation killer for certain sites that lost all the credibility they had built up over the last 10 years.

TPU FTW for listening to the consumers and understanding what outliers are.
So what you are saying is that TPU removed the outliers in their latest review and gave a fairer result this time around? I am shocked, I tell you, shocked!
 

IllogicalGlory

Senior member
Mar 8, 2013
934
346
136
It's not an apples-to-apples comparison, unfortunately.

Though I think there are still some improvements, unless Wolfenstein alone accounts for that 5% difference in the first graph.

Then again, considering Project Cars accounted for 3%, it might actually be true.
 

mysticjbyrd

Golden Member
Oct 6, 2015
1,363
3
0
Also, it seems that Fury X is not fully using the HBM properly... used well, it becomes a beast.

Even if I like to bash AMD, I can't bash their support. They are still supporting some old tech, like nVIDIA (hopefully), and that keeps those cards competitive for some time more.

Yeah, I never really understood the complaints about AMD drivers. Maybe long ago that was a big issue, but these days they are on top of their game. I believe they are coming out with a huge set of drivers soon, Omega 2. Who knows how much that will change things up.

https://www.techpowerup.com/217086/amd-readies-catalyst-omega-2015-drivers-for-november.html
 
Last edited:

96Firebird

Diamond Member
Nov 8, 2010
5,711
316
126
"This game's performance doesn't line up with how I think the card should be performing, therefore it shouldn't be included in reviews!"

What a sad way to view things... I wonder what TPU's reason for removing Project CARS was? If it was because of complainers, that's upsetting. It's the best PC racing game out there, and now their lineup doesn't include any racing games.
 

mysticjbyrd

Golden Member
Oct 6, 2015
1,363
3
0
"This game's performance doesn't line up with how I think the card should be performing, therefore it shouldn't be included in reviews!"

What a sad way to view things... I wonder what TPU's reason for removing Project CARS was? If it was because of complainers, that's upsetting. It's the best PC racing game out there, and now their lineup doesn't include any racing games.

No, it's called removing outliers. It's a standard method in quantitative analysis.

What's more, there are some ethical concerns surrounding some of Nvidia's deals, past and future.

Project Cars, for instance, yields absolutely horrific performance on AMD GPUs, but only because of the way the game was designed. Now, why was the game designed in a way that cripples AMD cards? The developers were either bought off, didn't care about AMD users due to their lower market share, or were incompetent. Which sounds more likely? Personally, I would go with #1 and/or #2.

Either way, it's a poor benchmark for comparing AMD and Nvidia, and thus should be removed. The only reason to care about an outlier such as Project Cars is if you plan on playing that game.
 
Last edited:

Hi-Fi Man

Senior member
Oct 19, 2013
601
120
106
Look at the difference in specs,
[Image: GPU spec comparison table]


Even if you want to try to argue that, it's only a matter of time.

Huh? I'm not sure what you're trying to say here. Quoting high-level specs to compare different architectures is almost entirely pointless (again, I'm not sure what your post is getting at).
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
No, it's called removing outliers. It's a standard method in quantitative analysis.

What's more, there are some ethical concerns surrounding some of Nvidia's deals, past and future.

Project Cars, for instance, yields absolutely horrific performance on AMD GPUs, but only because of the way the game was designed. Now, why was the game designed in a way that cripples AMD cards? The developers were either bought off, didn't care about AMD users due to their lower market share, or were incompetent. Which sounds more likely? Personally, I would go with #1 and/or #2.

Either way, it's a poor benchmark for comparing AMD and Nvidia, and thus should be removed. The only reason to care about an outlier such as Project Cars is if you plan on playing that game.

The bolded part is why doing quantitative analysis on game benchmarks is completely idiotic in the first place. The only benchmarks that should matter to any gamer are the ones for games they actually play. If by some coincidence all they play are the outliers, then removing them to make the other side look better would give the user a false representation of what their individual experience would be. Even if you play every game in the benchmark collection, these aggregate benchmark summations are useless.

As for your Project Cars "analysis": the majority of money for game developers comes from consoles, which both run AMD hardware. That pretty much eliminates your options #1 and #2 and makes #3 the most likely choice. Somehow the developers screwed something up when porting to the PC, or got overambitious.
 

tential

Diamond Member
May 13, 2008
7,355
642
121
"This game's performance doesn't line up with how I think the card should be performing, therefore it shouldn't be included in reviews!"

What a sad way to view things... I wonder what TPU's reason for removing Project CARS was? If it was because of complainers, that's upsetting. It's the best PC racing game out there, and now their lineup doesn't include any racing games.
So few PC players bought this game that it actually makes me mad it's reviewed so much and gets so much attention.

It should have a review, sure. But it gets so much extra attention versus games people are actually playing and purchasing.

It shouldn't be a benchmark-suite game, that's for sure. I don't know what the thinking is in including it.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Project Cars, for instance, yields absolutely horrific performance on AMD GPUs, but only because of the way the game was designed. Now, why was the game designed in a way that cripples AMD cards? The developers were either bought off, didn't care about AMD users due to their lower market share, or were incompetent. Which sounds more likely? Personally, I would go with #1 and/or #2.

What is your explanation for Ryse running much better on AMD hardware?
Or why would it be okay to use Star Wars Battlefront, which shows a clear bias towards AMD and comes from a company that has publicly disparaged nVidia?

When you start to put bias into the choice of games for benchmarking, you have a huge problem.
 
Last edited:

zlatan

Senior member
Mar 15, 2011
580
291
136
What is your explanation for Ryse running much better on AMD hardware?
Or why would it be okay to use Star Wars Battlefront, which shows a clear bias towards AMD and comes from a company that has publicly disparaged nVidia?

GCN is very strong in full PBR engines like CryEngine 3.4+ or the newest Frostbite, so this is not game-specific. The primary reason for this is the robust cache design and the memory bandwidth.
 
Last edited:

tolis626

Senior member
Aug 25, 2013
399
0
76
What is your explanation for Ryse running much better on AMD hardware?
Or why would it be okay to use Star Wars Battlefront, which shows a clear bias towards AMD and comes from a company that has publicly disparaged nVidia?

When you start to put bias into the choice of games for benchmarking, you have a huge problem.

Well, to be fair, I haven't seen any reviews using Ryse in their benchmarking suites, so there's that.

Battlefront, on the other hand... That's another story. The reason I too despise NVidia is that they have quite the history of cheating and crippling the opposition on purpose. I like their products; they perform well and have some interesting technologies behind them, but still... Anyway, AMD are not saints by any means, as much as some people here like to believe, but they aren't even close to NVidia in that regard. My only fear is that if this crap continues, the only competition we're gonna be seeing is in crippling each other, and whoever has more games crippling the competition wins. Seriously, it's disgusting.

If it were up to me, both companies would receive some hefty fines with even the slightest proof of intentionally messing with the other's performance somehow (think Gameworks). Good thing I'm not in charge of much... :p

On topic now... I think AMD has made some performance improvements in their latest beta drivers, especially for the Fury line. 390X performance has surely improved in some games, so I see no reason the Fury X shouldn't see a similar or even bigger improvement. But as others have pointed out, the Fury X seemingly only wins against a stock reference 980Ti. Overclock both and the Fury X looks midrange. Sad... I really wanted that card to succeed. Overclocking HBM has shown some big performance improvements, but I don't think there's a way for consumers to do it yet, so it's a moot point.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
GCN is very strong in full PBR engines like CryEngine 3.4+ or the newest Frostbite, so this is not game-specific. The primary reason for this is the robust cache design and the memory bandwidth.

Designed for GCN? That makes it an engine biased towards AMD.

This makes it very hard to recommend any game based on these engines for benchmarking.

Like I said: opening Pandora's box is never a good idea.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
The bolded part is why doing quantitative analysis on game benchmarks is completely idiotic in the first place. The only benchmarks that should matter to any gamer are the ones for games they actually play. If by some coincidence all they play are the outliers, then removing them to make the other side look better would give the user a false representation of what their individual experience would be. Even if you play every game in the benchmark collection, these aggregate benchmark summations are useless.

As for your Project Cars "analysis": the majority of money for game developers comes from consoles, which both run AMD hardware. That pretty much eliminates your options #1 and #2 and makes #3 the most likely choice. Somehow the developers screwed something up when porting to the PC, or got overambitious.

It's the CPU PhysX that's choking AMD in that game.

People use the overall score as a prediction of likely overall performance in future games. While we can see that it's not always going to be accurate, as demonstrated by games like Project Cars, it's really the only metric we can go by. That's why it's important not to have outlier games skewing the results.

So few PC players bought this game that it actually makes me mad it's reviewed so much and gets so much attention.

It should have a review, sure. But it gets so much extra attention versus games people are actually playing and purchasing.

It shouldn't be a benchmark-suite game, that's for sure. I don't know what the thinking is in including it.
Think real hard. Since there has to be a reason or they wouldn't do it, who benefits from it? ;)

What is your explanation for Ryse running much better on AMD hardware?
Or why would it be okay to use Star Wars Battlefront, which shows a clear bias towards AMD and comes from a company that has publicly disparaged nVidia?

When you start to put bias into the choice of games for benchmarking, you have a huge problem.

Ryse = 8% faster for AMD @ 1440p
P Cars = 34% faster for nVidia @ 1440p
Not even close to comparable.

Not seeing SWBF in their suite. Have I missed it? Where's the clear bias you are alleging?
 

Mondozei

Golden Member
Jul 7, 2013
1,043
41
86
The bolded part is why doing quantitative analysis on game benchmarks is completely idiotic in the first place.


Except that quantitative analysis is completely rational, and to believe otherwise is idiotic - here's why.

Most people buying a GPU are buying for at least 3 years. The overwhelming majority of people are NOT buying for a single game or even a few games.

Performance indexes give a weighted average that tells a potential buyer what the ballpark estimate is in terms of performance. This is then used to extrapolate. It's an imperfect process: just look at how much better AMD's GCN has aged compared to Kepler. Yet this was no natural law. NV more or less abandoned Kepler until the outcry started, and then they gave it the minimum attention required.

Nevertheless, the point remains that people buy GPUs on the understanding that they'll need them for games that have likely not even been announced yet, and as such they need to see how a GPU performs across as wide a selection of games as possible to get an approximate sense of what to expect.
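For what it's worth, how the index is aggregated matters too. Here's a sketch (invented FPS numbers, not taken from any review) comparing an arithmetic mean of per-game ratios against a geometric mean, which is commonly preferred for ratio data precisely because one extreme title pulls on it far less:

```python
import math

# Hypothetical average FPS for two cards across five games (invented numbers).
card_a = [60, 55, 72, 48, 90]
card_b = [58, 57, 70, 50, 30]  # the last game is an extreme outlier for card B

ratios = [a / b for a, b in zip(card_a, card_b)]  # per-game A-vs-B ratio

arith = sum(ratios) / len(ratios)
geo = math.exp(sum(math.log(r) for r in ratios) / len(ratios))

print(f"arithmetic mean index: {arith:.2f}")  # 1.40: one title dominates the index
print(f"geometric mean index:  {geo:.2f}")    # 1.24: the outlier's pull is damped
```

Neither aggregate answers "how will this card run the games I play," but the geometric mean at least keeps a single lopsided title from rewriting the headline number.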
 

96Firebird

Diamond Member
Nov 8, 2010
5,711
316
126
So few PC players bought this game that it actually makes me mad it's reviewed so much and gets so much attention.

It should have a review, sure. But it gets so much extra attention versus games people are actually playing and purchasing.

It shouldn't be a benchmark-suite game, that's for sure. I don't know what the thinking is in including it.

It is the best PC racing game out there; it should for sure be included. According to Steam Charts (credibility unknown), it has about as many players as Far Cry 4, and a LOT more players than Ryse. Of course, the accuracy of these stats is up in the air, but it's all I can really go on as far as game popularity.

My personal opinion is to include the most recent popular game from each genre (maybe a couple for the more popular genres such as FPS and RPG) to get a wide spread of flavors for plenty of gamers. Having both BF3 and BF4 in their suite still confuses me, but I'm not going to complain and call them biased because I don't agree with it. That's for the children.
 

tential

Diamond Member
May 13, 2008
7,355
642
121
It is the best PC racing game out there; it should for sure be included. According to Steam Charts (credibility unknown), it has about as many players as Far Cry 4, and a LOT more players than Ryse. Of course, the accuracy of these stats is up in the air, but it's all I can really go on as far as game popularity.

My personal opinion is to include the most recent popular game from each genre (maybe a couple for the more popular genres such as FPS and RPG) to get a wide spread of flavors for plenty of gamers. Having both BF3 and BF4 in their suite still confuses me, but I'm not going to complain and call them biased because I don't agree with it. That's for the children.
I'm not a huge racing fan, but if it is the most played racing game then it should be included.

Really, I don't care about a card winning a roundup of 10 games. I just care about the games I plan on playing. I think the issue is people looking for a card that "wins" rather than finding the best card for the games they play.
 