FuryX now = 980ti 1080p/1440p > 4k


iiiankiii

Senior member
Apr 4, 2008
759
47
91
You're not understanding what I said at all. It isn't bombing, because performance hasn't regressed. I did say, however, that Kepler needs careful optimization from the driver to perform at max efficiency. These optimizations are still happening, just not always at launch time, much like AMD's cards.

I think the only way to tell if it's regressing in performance is to have something to compare it to. It's all relative. You have to compare it to something. You can't just say performance hasn't regressed. It hasn't regressed in comparison to what? You can say that performance has regressed relative to its AMD counterpart. The numbers don't lie.

Compared to the competition, it isn't nearly as competitive. It might be due to better AMD driver optimization, and/or a lack of Nvidia driver optimization. Regardless, one thing is clear: Kepler isn't as competitive as it once was compared to its AMD counterpart in current games. In the end, Kepler just isn't keeping up as well as it once did. That's the bottom line.
 

Hi-Fi Man

Senior member
Oct 19, 2013
601
120
106
I think the only way to tell if it's regressing in performance is to have something to compare it to. It's all relative. You have to compare it to something. You can't just say performance hasn't regressed. It hasn't regressed in comparison to what? You can say that performance has regressed relative to its AMD counterpart. The numbers don't lie.

Compared to the competition, it isn't nearly as competitive. It might be due to better AMD driver optimization, and/or a lack of Nvidia driver optimization. Regardless, one thing is clear: Kepler isn't as competitive as it once was compared to its AMD counterpart in current games. In the end, Kepler just isn't keeping up as well as it once did. That's the bottom line.

Just to be clear, I was referring to performance regressing in Kepler alone, not in relation to GCN. Although I would debate your last point: in some titles Kepler hasn't regressed in performance at all relative to GCN, while other titles do show regressed performance. I just don't think it's the majority, from what I've seen and experienced. What bothers me is people using hyperbole to generalize about the entire architecture.
 
Last edited:

Azix

Golden Member
Apr 18, 2014
1,438
67
91
Kepler doesn't have a "software scheduler"; it's a static scheduler. Only the scheduling of instructions is done in software (by the compiler).

So it doesn't use a software scheduler, but it schedules instructions in software. Got it... :rolleyes:

Maxwell is more efficient, but it will still need the software babysitting.

I don't think people are necessarily saying there is a drop in performance in individual titles (e.g. worsening performance in Tomb Raider).
 

Hi-Fi Man

Senior member
Oct 19, 2013
601
120
106
So it doesn't use a software scheduler, but it schedules instructions in software. Got it... :rolleyes:

Maxwell is more efficient, but it will still need the software babysitting.

I don't think people are necessarily saying there is a drop in performance in individual titles (e.g. worsening performance in Tomb Raider).

You do know there is more to scheduling than just scheduling instructions, right?

For reference:

[Image: Scheduler_575px.jpg]


Ryan Smith said:
However based on their own internal research and simulations, in their search for efficiency NVIDIA found that hardware scheduling was consuming a fair bit of power and area for few benefits. In particular, since Kepler’s math pipeline has a fixed latency, hardware scheduling of the instruction inside of a warp was redundant since the compiler already knew the latency of each math instruction it issued. So NVIDIA has replaced Fermi’s complex scheduler with a far simpler scheduler that still uses scoreboarding and other methods for inter-warp scheduling, but moves the scheduling of instructions in a warp into NVIDIA’s compiler. In essence it’s a return to static scheduling.

And no, Maxwell doesn't need "software babysitting" because its compiler was completely rewritten.
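To make the static-scheduling idea concrete, here's a minimal toy sketch (my own illustration in Python, not how NVIDIA's compiler actually works): because every math instruction's latency is fixed and known, a compiler can decide at build time when each instruction in a warp is safe to issue, instead of the hardware tracking dependencies with a scoreboard at run time. The opcodes, latencies, and program below are invented for the example.

```python
# Toy illustration of static (compile-time) instruction scheduling.
# It relies on the property mentioned in the quote above: every math
# instruction has a known, fixed latency, so no hardware scoreboard is
# needed to decide when a dependent instruction can issue.
# Opcodes, latencies, and the program itself are made up for this example.

FIXED_LATENCY = {"mov": 2, "fmul": 4, "fadd": 4}

# (name, opcode, names of instructions whose results it consumes)
program = [
    ("i0", "mov",  []),
    ("i1", "fmul", ["i0"]),
    ("i2", "fadd", ["i1"]),   # depends on i1 -> must wait out its latency
    ("i3", "mov",  []),       # independent -> can be slotted into the gap
]

def static_schedule(insns):
    """Return (issue_cycle, name) pairs chosen purely from the latency table."""
    result_ready = {}             # name -> cycle its result becomes available
    schedule = []
    pending = list(insns)
    cycle = 0
    while pending:
        ready = [ins for ins in pending
                 if all(result_ready.get(src, float("inf")) <= cycle
                        for src in ins[2])]
        if ready:
            name, op, _ = ready[0]
            schedule.append((cycle, name))
            result_ready[name] = cycle + FIXED_LATENCY[op]
            pending.remove(ready[0])
        cycle += 1
    return schedule

print(static_schedule(program))
# -> [(0, 'i0'), (1, 'i3'), (2, 'i1'), (6, 'i2')]
# The stall before i2 is known entirely at compile time.
```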
 
Last edited:

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
How does that prove Kepler performance has bombed (meaning it got worse)? All you're proving is that AMD dropped the ball with launch driver performance and allowed nVIDIA to get away with using GK104 as a flagship. How do I know these graphs you posted are even comparable? The ones in OP aren't.

It's also easy for me to find games that show Kepler outperforming equivalent AMD cards (GameWorks or otherwise, I don't care) but I don't feel like filling this thread with more stupid graphs.
wow, that is a really nice spin :) I am impressed! very. :thumbsup:
 

SPBHM

Diamond Member
Sep 12, 2012
5,066
418
126
From 102 fps for reference specs to 138 fps with OC

[Image: perf_oc.png]


That's quite impressive.

As for the relative performance of the 980 Ti and Fury X at stock, I remember other reviews with the old drivers showing them within about 1% of each other, so as others have said it's mostly about the game selection.
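Just to put a number on that jump (simple arithmetic on the figures quoted above, nothing more):

```python
# Overclocking gain implied by the numbers above (102 fps stock -> 138 fps OC).
stock_fps = 102
oc_fps = 138
print(f"OC gain: {(oc_fps - stock_fps) / stock_fps:.1%}")  # ~35.3%
```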
 

readers

Member
Oct 29, 2013
93
0
0
From 102 fps for reference specs to 138 fps with OC

[Image: perf_oc.png]


That's quite impressive.

As for the relative performance of the 980 Ti and Fury X at stock, I remember other reviews with the old drivers showing them within about 1% of each other, so as others have said it's mostly about the game selection.

Maxwell OCs extremely well. The Fury X might catch up a bit when it comes to non-OC reference cards, but once you OC a custom 980 Ti, the difference is still huge.
 

Hi-Fi Man

Senior member
Oct 19, 2013
601
120
106
wow, that is a really nice spin :) I am impressed! very. :thumbsup:

How about actually reading what's been posted and discussing it, instead of posting dumb sarcastic remarks that don't add any value to this thread?
 

Azix

Golden Member
Apr 18, 2014
1,438
67
91
You do know there is more to scheduling than just scheduling instructions, right?

For reference:

[Image: Scheduler_575px.jpg]




And no, Maxwell doesn't need "software babysitting" because its compiler was completely rewritten.

Yes, I know there's more. But from the same page you got your quote from:

With that said, in discussing Kepler with NVIDIA’s Jonah Alben, one thing that was made clear is that NVIDIA does consider this the better way to go. They’re pleased with the performance and efficiency they’re getting out of software scheduling, going so far to say that had they known what they know now about software versus hardware scheduling, they would have done Fermi differently. But whether this only applies to consumer GPUs or if it will apply to Big Kepler too remains to be seen.

So, does Kepler use a software scheduler or not? Does Maxwell use a software scheduler or not?
 

Hi-Fi Man

Senior member
Oct 19, 2013
601
120
106
yes I know there's more. But from the same page you got your quote from



So, does Kepler use a software scheduler or not? Does Maxwell use a software scheduler or not?

I thought I was clear by stating it has a static scheduler, meaning scheduling is only partially done in software; the GPU itself still has actual hardware to do some scheduling. That is why I corrected you when you stated it had a software scheduler.
 
Last edited:

Azix

Golden Member
Apr 18, 2014
1,438
67
91
I thought I was clear that it's only partially done in software; the GPU itself still has actual hardware to do some scheduling. That is why I corrected you by stating it has a static scheduler when you stated it had a software scheduler.

You weren't correcting me, because I was not wrong. If you want to ADD to what I said, sure, but what's the point of saying it doesn't use what it uses?
 

Hi-Fi Man

Senior member
Oct 19, 2013
601
120
106
You weren't correcting me, because I was not wrong. If you want to ADD to what I said, sure, but what's the point of saying it doesn't use what it uses?

I was correcting you, and you are wrong; all you need to do is look at the articles. That NV employee is clearly referring to the software part of their scheduling. You are implying that all scheduling is done in software, which is not true; to refute this would mean you disagree with what Ryan Smith has stated.
 

Azix

Golden Member
Apr 18, 2014
1,438
67
91
I was correcting you, and you are wrong; all you need to do is look at the articles. That NV employee is clearly referring to the software part of their scheduling. You are implying that all scheduling is done in software, which is not true; to refute this would mean you disagree with what Ryan Smith has stated.

You were implying that I was implying that all scheduling is done in software. I did not say that. Again, does it use a software scheduler or not?
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
How does that prove Kepler performance has bombed (meaning it got worse)? All you're proving is that AMD dropped the ball with launch driver performance and allowed nVIDIA to get away with using GK104 as a flagship. How do I know these graphs you posted are even comparable? The ones in OP aren't.

It's also easy for me to find games that show Kepler outperforming equivalent AMD cards (GameWorks or otherwise, I don't care) but I don't feel like filling this thread with more stupid graphs.

So AMD now beating nVidia is because they bombed at the beginning and not because nVidia has now dropped the ball? Please? Talk about spin.

So it doesn't use a software scheduler, but it schedules instructions in software. Got it... :rolleyes:

Maxwell is more efficient, but it will still need the software babysitting.

I don't think people are necessarily saying there is a drop in performance in individual titles (e.g. worsening performance in Tomb Raider).
In gaming loads only. In all-around performance it sucks.

^^So now it's AMD's fault that their cards are getting better?

Anything that anyone can say to change the facts is good enough, even if it's pure rationalizing. The fact that was pointed out in the OP is that the 980 Ti is not any faster overall than Fury X. Looking back at recent history, it's only going to get worse.
 

96Firebird

Diamond Member
Nov 8, 2010
5,738
334
126
So AMD now beating nVidia is because they bombed at the beginning and not because nVidia has now dropped the ball? Please? Talk about spin.

Poor AMD. They work so hard to improve their performance via drivers, yet even their own fans don't believe that. Company can't catch a break!
 
Feb 19, 2009
10,457
10
76
This talk of optimization of Kepler has been covered before.

The gist of it is that Kepler has a suboptimal wavefront-to-shader ratio, so it loses ~1/3 of its maximum efficiency (128/192) if drivers are not optimized for it. For more info you can search the posts of Zlatan on the topic.

This is also exactly what we've seen in many NV GameWorks titles, where the 980 is 25-30% or more faster than the 780 Ti, but in neutral titles it's ~5% faster. What has happened is that Maxwell has pulled far ahead of Kepler in newer games, but ONLY if it's NV sponsored. Neutral titles still show excellent Kepler vs GCN/Maxwell performance.

This led to the suggestion that NV has shifted their own driver optimization focus to Maxwell since its launch, neglecting Kepler. Because we have results that show such discrepancies, it's hard to argue against that suggestion.

GCN matures better due to the "console effect" that some of you disregarded when we brought it up over a year ago. As more modern game engines and games are released that are built cross-platform and therefore must be optimized for the GCN architecture on Xbone/PS4, their PC releases will skew GCN upwards. Likewise, as the DX12/Vulkan era approaches, we will see a similar upward skew.

Fable, built with Epic's UE4 (a major NV-sponsored game engine), is a prime example. The normal gap in UE4 games has NV leading by 25-50% in the current DX11 era, but with DX12, Fable has GCN basically beating Maxwell below the 980 Ti, with the Fury X and 980 Ti neck and neck at 1080p, where it's normally 10-20% slower. That kind of uplift, I believe, is no coincidence, as game engines built for DX12 shine on GCN, which was designed for a console-like API, aka Mantle/Vulkan/DX12.
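To spell out the arithmetic behind that ~1/3 figure, here's a small sketch. The shader count is the real per-SM number (a Kepler SMX has 192 CUDA cores); treating "128 of 192 kept busy" as the unoptimized case is the claim from this post, not something I've measured.

```python
# Arithmetic behind the 128/192 figure quoted above.
kepler_smx_shaders = 192   # CUDA cores per Kepler SMX
easily_fed_shaders = 128   # cores kept busy without extra compiler/driver help (per the post above)

utilization = easily_fed_shaders / kepler_smx_shaders
print(f"Unoptimized utilization: {utilization:.0%}")           # ~67%
print(f"Efficiency left on the table: {1 - utilization:.0%}")  # ~33%
```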
 

Hi-Fi Man

Senior member
Oct 19, 2013
601
120
106
So AMD now beating nVidia is because they bombed at the beginning and not because nVidia has now dropped the ball? Please? Talk about spin.

They're coming out of the woodwork now...

It's funny because you're the one spinning what I said. AMD's increased performance is due to better drivers; that was all that was stated, and it has nothing to do with nVIDIA dropping the ball or anything. It's really simple to understand.

The only other thing I've mentioned is that Kepler's performance hasn't regressed in relation to itself.
 

Hi-Fi Man

Senior member
Oct 19, 2013
601
120
106
This talk of optimization of Kepler has been covered before.

The gist of it is that Kepler has a suboptimal wavefront-to-shader ratio, so it loses ~1/3 of its maximum efficiency (128/192) if drivers are not optimized for it. For more info you can search the posts of Zlatan on the topic.

This is also exactly what we've seen in many NV GameWorks titles, where the 980 is 25-30% or more faster than the 780 Ti, but in neutral titles it's ~5% faster. What has happened is that Maxwell has pulled far ahead of Kepler in newer games, but ONLY if it's NV sponsored. Neutral titles still show excellent Kepler vs GCN/Maxwell performance.

This led to the suggestion that NV has shifted their own driver optimization focus to Maxwell since its launch, neglecting Kepler. Because we have results that show such discrepancies, it's hard to argue against that suggestion.

GCN matures better due to the "console effect" that some of you disregarded when we brought it up over a year ago. As more modern game engines and games are released that are built cross-platform and therefore must be optimized for the GCN architecture on Xbone/PS4, their PC releases will skew GCN upwards. Likewise, as the DX12/Vulkan era approaches, we will see a similar upward skew.

Fable, built with Epic's UE4 (a major NV-sponsored game engine), is a prime example. The normal gap in UE4 games has NV leading by 25-50% in the current DX11 era, but with DX12, Fable has GCN basically beating Maxwell below the 980 Ti, with the Fury X and 980 Ti neck and neck at 1080p, where it's normally 10-20% slower. That kind of uplift, I believe, is no coincidence, as game engines built for DX12 shine on GCN, which was designed for a console-like API, aka Mantle/Vulkan/DX12.

I applaud your understanding of the issues I'm talking about and the ability to discuss them. It seems others on here are unable to understand and simply scream "shill, spin, or whatever".
 

Boze

Senior member
Dec 20, 2004
634
14
91
You have to be careful in this case because TPU removed Project CARS and Wolfenstein, which were crippling ALL AMD cards. It doesn't mean that suddenly GCN got a huge boost in performance. What it actually shows is why in statistics we remove significant outliers. It's obvious that Project CARS and Wolfenstein weren't accurately representing the average performance of AMD cards. Don't forget that the 980 Ti has 20-25% OCing headroom and 6GB as a bonus, so it's still a better card.

Anyway, I don't think much changes overall. AMD still has the best price/performance from $100-400, while NV's 980Ti is untouched.

TechSpot has a newer article up comparing various AMD vs. NV cards as well:
http://www.techspot.com/review/1075-best-graphics-cards-2015/

I think right now NV continues to sell on brand value and perception in the $100-400 range. 380 2GB > 950, 380 4GB/280X > 960, 290 has no competition, 390 > 970. Yet, NV completely outsells AMD with 950/960/970 cards.

What's most surprising is just how much better the 280X is against the 950/960 cards, and how poorly the 780 aged. Those are far more eye-opening for me than the Fury X getting slightly better against a reference 980 Ti. The crazy part is how overhyped the 780 was, how people purchased it over the mostly cheaper 290, and how sites like TechReport and HardOCP completely failed the consumer by failing to warn them about 2GB limits on the 960, while downplaying the performance advantage of the 280X all this time, despite the latter often being within a similar price range. Once this generation is done, in 5 years, no one will care about any of these cards per se, but I'll never forget review sites that failed to point out glaring product flaws and prioritized NV's perf/watt marketing over raw GPU horsepower and VRAM. As far as I am concerned, this generation was a reputation killer for certain sites that lost all the credibility they have built up over the last 10 years.

TPU FTW for listening to the consumers and understanding what outliers are.

You're not factoring load power usage into any of these scenarios... an R9 390X at load under high-end games like Crysis 3 can use as much as 487 watts according to AnandTech Bench. The GTX 980 uses 308. What's the price difference in these cards? $100? $50 if you catch a GTX 980 at $480 with a $30 rebate?

How long do you think it'll take to recoup that $50 extra cost for the NVIDIA card through lower power usage? Sure, to a lot of people it just won't matter. It doesn't matter to me (clearly), but I know there are some people who value cost per watt even in their video cards.

In some scenarios it would absolutely make sense to get the AMD product... if you live in an apartment where electricity is covered through rent, or, say, in a college dorm. Otherwise, at some point, you really do need to seriously consider power cost.

Furthermore, as you said, there's no real offering from AMD that can match the GTX 980 Ti... they can be had for $609.99 from Newegg right now, and they are power sippers compared to Fury X.

If you ask me, unless you have access to free power, AMD's offerings are a tough sell if you tend to hang onto video cards for a while, since you can recoup the NVIDIA card's higher initial cost through power savings over time.
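For anyone who wants to run those numbers themselves, here's a rough break-even sketch using the load wattages quoted above. The electricity rate and weekly gaming hours are my own assumptions for illustration, not figures from the post.

```python
# Rough break-even time for a ~$50 price gap, using the load power figures
# quoted above (Crysis 3 load, per AnandTech Bench). Electricity price and
# weekly gaming hours are assumptions, not numbers from the post.

r9_390x_load_w = 487
gtx_980_load_w = 308
price_gap_usd = 50.0

price_per_kwh_usd = 0.12   # assumed electricity rate
hours_per_week = 20        # assumed gaming time

extra_kw = (r9_390x_load_w - gtx_980_load_w) / 1000.0
extra_cost_per_hour = extra_kw * price_per_kwh_usd
hours_to_break_even = price_gap_usd / extra_cost_per_hour

print(f"Extra cost per gaming hour: ${extra_cost_per_hour:.3f}")
print(f"Hours to recoup $50: {hours_to_break_even:.0f} "
      f"(~{hours_to_break_even / hours_per_week / 52:.1f} years at {hours_per_week} h/week)")
```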
 

Hi-Fi Man

Senior member
Oct 19, 2013
601
120
106
Again, does it use a software scheduler or not?

I already answered whether it does or doesn't (again, it uses a static scheduler, which uses a combination of both software and hardware), so either you are having trouble understanding me or I don't know what.

You were implying that I was implying that all scheduling is done in software. I did not say that.

But you did; it's right here, and it's why I replied.

software scheduler, why not drop the improvements on Kepler too? Maybe Maxwell sucks less relatively in a year or two.

So it doesn't use a software scheduler, but it schedules instructions in software. Got it... :rolleyes:

Maxwell is more efficient, but it will still need the software babysitting.

If you don't believe it uses a software scheduler, then why are you arguing with me, and what is your point?
 
Last edited:

Magic Carpet

Diamond Member
Oct 2, 2011
3,477
233
106
I did say, however, that Kepler needs careful optimization from the driver to perform at max efficiency. These optimizations are still happening, just not always at launch time, much like AMD's cards.

[Image: crysis3_1920_1080_1.gif]

This is what I call "optimizations".


[Image: farcry4_1920_1080_2.gif]

Far Cry 4 sucked on launch day, and guess what, it still sucks.

GameWorks - the new business model by Nvidia. The way it's meant to be played :cool:
 

dogen1

Senior member
Oct 14, 2014
739
40
91
Now, why was the game designed to cripple AMD cards? They were either bought off, they didn't care about the AMD users due to their lower market share, or they are incompetent. Now which sounds more likely? Personally, I would go with #1 and/or #2.

You say incompetent, and yet the game runs quite well on console.

Have you considered that perhaps they are telling the truth?

I don't know of any information that rules out this possibility.
 

thesmokingman

Platinum Member
May 6, 2010
2,302
231
106
You say incompetent, and yet the game runs quite well on console.

Have you considered that perhaps they are telling the truth?

I don't know of any information that rules out this possibility.


How in the world did they break the game, then, if, as you wrote, it runs quite well on the console? The console is AMD hardware, so somewhere along the line they DID something to break the game. It runs fine on Nvidia, yet it completely FAILED on AMD. Seems obvious, unless you're one of those users who loves him some GameWorks, ya?
 

dogen1

Senior member
Oct 14, 2014
739
40
91
How in the world did they break the game, then, if, as you wrote, it runs quite well on the console? The console is AMD hardware, so somewhere along the line they DID something to break the game. It runs fine on Nvidia, yet it completely FAILED on AMD. Seems obvious, unless you're one of those users who loves him some GameWorks, ya?

I was referring to the quote about it being mainly a driver issue.
 
Status: Not open for further replies.