Why Doesn't Anyone Get to the Bottom of the Aging of Kepler and GCN?

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
In the video card market there seem to be clear trends in how GPUs age over time, but the explanations all seem to be speculation. Why doesn't someone (i.e., one of these sites that review GPUs) actually do the testing to figure it out?

Take Kepler, for example. It is obviously not doing as well as it once did in new games versus both GCN and Maxwell, but the explanations vary: some blame a lack of driver optimization for Kepler, others blame overall trends in game development. This seems easy to test. Go back and benchmark older games (say, from 2013) on Maxwell to make sure there wasn't some across-the-board driver boost, then retest slightly older games (say 2014-2015) to see whether subsequent driver updates eventually help Kepler. If Maxwell runs better in old games, or Kepler only runs better six to eight months after a game's release, then obviously there is some truth to the claims that the priority is Maxwell.

Or take GCN vs. Nvidia aging overall. It is obvious that GCN has aged well, especially versus Kepler and somewhat versus Maxwell, but the explanations vary: some credit developers targeting consoles, others say the drivers got better. Again, this seems easy to test. Go back and retest old games to see if performance got better or worse versus the competition and versus the original results. If old games now run better, then clearly the driver got better.
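To make the bookkeeping concrete for either comparison, here's a rough sketch of what I have in mind, assuming someone recorded FPS for each game/card/driver combination in a CSV; the file name, column names, and card names below are just placeholders:

```python
# back_test.py -- sketch of the "retest old games on old and new drivers" comparison
# (hypothetical CSV with columns: game, card, driver, fps)
import pandas as pd

def relative_gap(df: pd.DataFrame, card_a: str, card_b: str, driver: str) -> pd.Series:
    """Per-game % lead of card_a over card_b on a given driver."""
    sub = df[df["driver"] == driver].pivot(index="game", columns="card", values="fps")
    return (sub[card_a] / sub[card_b] - 1.0) * 100.0

results = pd.read_csv("benchmarks.csv")  # one row per benchmark run

# Same set of 2013-era games, tested twice: on the launch-day driver and on today's driver.
launch_gap = relative_gap(results, "GTX 780 Ti", "R9 290X", driver="launch")
today_gap = relative_gap(results, "GTX 780 Ti", "R9 290X", driver="current")

# Positive numbers mean the 780 Ti gained ground since launch; negative means it lost ground.
print((today_gap - launch_gap).round(1).sort_values())
```

If the per-game gaps move in old titles between the launch driver and today's driver, that points at the drivers; if they only move in new titles, that points at game development trends.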

I get that these sites are focused on new cards and the enthusiasm around new technology, and at some level there might be a hesitation to revisit the past because it could contradict recommendations they made at the time. That said, there is no new hardware most people can buy for months, and with a new generation of GPUs coming, observations from the previous generations can help us anticipate what unexpected outcomes to expect. At the very least, there are a million clicks waiting for whoever finds actual proof of what is going on, while the two camps of online fans battle each other with "leaked" or made-up evidence of what the new hardware will do.

I just don't understand.
 

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
The truth is out there, and I would still totally bang Scully.


Off-topic and thread crapping. Also, this is a technical forum, not Off Topic or P&N; this is uncalled for.
Markfw900
 
Last edited by a moderator:

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
It's pretty clear Nvidia doesn't do much Kepler-specific driver work anymore. With The Witcher 3, performance on the game-ready driver was identical to a year-old driver. Then, after a lot of complaining, Nvidia did some Kepler fixes too.

But most games aren't popular enough to get enough people to complain.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Look at, say, an HD 6970 vs. an HD 7970, back then and now. If AMD weren't stuck with GCN, we wouldn't see this thread. Nobody is going to spend more resources than they absolutely have to on old products.
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
Look at, say, an HD 6970 vs. an HD 7970, back then and now. If AMD weren't stuck with GCN, we wouldn't see this thread. Nobody is going to spend more resources than they absolutely have to on old products.

So you don't think the fact that the consoles are based on GCN, and that they are the primary targets for development for economic reasons, has anything to do with it?
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
So you don't think the fact that the consoles are based on GCN, and that they are the primary targets for development for economic reasons, has anything to do with it?

The consoles are GCN 1.1, and we are starting to see GCN 1.1 perform better and better relative to GCN 1.0 and 1.2. But for non-console ports it's different.

What I'm saying is that if AMD had a truly new uarch, GCN would be tanking as well, just like in the past.

Richland (2013) and Trinity (2012) lost support completely for the same reason.
 
Last edited:

Ma_Deuce

Member
Jun 19, 2015
175
0
0
If AMD weren't stuck with GCN, we wouldn't see this thread. Nobody is going to spend more resources than they absolutely have to on old products.

Nice spin. AMD isn't "stuck" with anything. The more forward-thinking GCN design just holds up much better than anything Nvidia had to offer. It doesn't look like that's changing any time soon.

Maybe nvidia's new stuff can reverse the trend.
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
The consoles are GCN 1.1, and we are starting to see GCN 1.1 perform better and better relative to GCN 1.0 and 1.2. But for non-console ports it's different.

What I'm saying is that if AMD had a truly new uarch, GCN would be tanking as well, just like in the past.

Maybe, but I think there is some truth to the idea that AMD is able to extract more from old hardware over time.

The Crimson driver gave my old 7850 a 10% boost in benchmarks. We go nuts over 10% differences on this forum. I wonder, if we went back and tested, whether Maxwell has seen the same kind of gains since it launched.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
So you don't think the fact that the consoles are based on GCN, and that they are the primary targets for development for economic reasons, has anything to do with it?

Or that Kepler's poor compute capabilities are showing more with modern games?
 

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
From a gamer's perspective, Kepler being slow is a bad thing and it makes me less confident in their current and future products. But if I owned substantial stock in Nvidia, I would be pissed off every time I heard of a single software engineer wasting time on products no longer being sold. If I owned Nvidia, Kepler would be ignored beyond belief. Two gens old? GTFO, not on my dime.
 

Goatsecks

Senior member
May 7, 2012
210
7
76
In the video card market there seem to be clear trends in how GPUs age over time, but the explanations all seem to be speculation. Why doesn't someone (i.e., one of these sites that review GPUs) actually do the testing to figure it out?

Take Kepler, for example. It is obviously not doing as well as it once did in new games versus both GCN and Maxwell, but the explanations vary: some blame a lack of driver optimization for Kepler, others blame overall trends in game development. This seems easy to test. Go back and benchmark older games (say, from 2013) on Maxwell to make sure there wasn't some across-the-board driver boost, then retest slightly older games (say 2014-2015) to see whether subsequent driver updates eventually help Kepler. If Maxwell runs better in old games, or Kepler only runs better six to eight months after a game's release, then obviously there is some truth to the claims that the priority is Maxwell.

Or take GCN vs. Nvidia aging overall. It is obvious that GCN has aged well, especially versus Kepler and somewhat versus Maxwell, but the explanations vary: some credit developers targeting consoles, others say the drivers got better. Again, this seems easy to test. Go back and retest old games to see if performance got better or worse versus the competition and versus the original results. If old games now run better, then clearly the driver got better.

I get that these sites are focused on new cards and the enthusiasm around new technology, and at some level there might be a hesitation to revisit the past because it could contradict recommendations they made at the time. That said, there is no new hardware most people can buy for months, and with a new generation of GPUs coming, observations from the previous generations can help us anticipate what unexpected outcomes to expect. At the very least, there are a million clicks waiting for whoever finds actual proof of what is going on, while the two camps of online fans battle each other with "leaked" or made-up evidence of what the new hardware will do.

I just don't understand.

I have been wondering this myself.

What I would like to see is a review where, for each game, there is a chart with Kepler driver version on the x-axis and *insert performance metric* on the y-axis. I also love your idea of "back testing" the AMD hardware as well, to pin down where the improvements have come from.
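Something like this, roughly; a minimal sketch with made-up numbers just to show the shape of the chart (the driver versions, game names, and FPS values are all hypothetical):

```python
# driver_trend.py -- one line per game: Kepler FPS as a function of driver release
# (all numbers below are made up; real data would come from re-running each game on each driver)
import matplotlib.pyplot as plt

drivers = ["344.11", "347.88", "353.06", "358.91", "364.72"]  # oldest to newest
fps_by_game = {
    "Game A (2013)": [62, 63, 61, 64, 64],
    "Game B (2015)": [45, 44, 44, 48, 49],
}

for game, fps in fps_by_game.items():
    plt.plot(drivers, fps, marker="o", label=game)

plt.xlabel("GeForce driver version")
plt.ylabel("Average FPS (hypothetical GTX 780 run)")
plt.legend()
plt.tight_layout()
plt.show()
```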

This is also an extremely controversial topic, which will definitely factor into some reviewers' decisions.
 

Goatsecks

Senior member
May 7, 2012
210
7
76
Or that Kepler's poor compute capabilities are showing more with modern games?

Probably a mix of both, depending on the title. There are many potential reasons for this dynamic between Kepler and GCN, and I suspect none of them are mutually exclusive.
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
Nice spin. AMD isn't "stuck" with anything. The more forward-thinking GCN design just holds up much better than anything Nvidia had to offer. It doesn't look like that's changing any time soon.

Maybe nvidia's new stuff can reverse the trend.

Read up on the development of the PS4. The forward thinking was actually done by Sony. This is one area where AMD really did benefit from the consoles. Console makers have to predict architecture needs years in advance, since that's the expected lifespan of a console; PC video card makers think in much shorter terms, since they typically replace architectures every couple of years. Sony predicted that by the middle of the PS4's lifespan async compute would be a very important feature, so they pushed for it in GCN, among many other features.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
From a gamer's perspective, Kepler being slow is a bad thing and it makes me less confident in their current and future products. But if I owned substantial stock in Nvidia, I would be pissed off every time I heard of a single software engineer wasting time on products no longer being sold. If I owned Nvidia, Kepler would be ignored beyond belief. Two gens old? GTFO, not on my dime.

If we go by your logic of no optimizations for products no longer being sold, then the flagship big-die GPUs (Gx100) would have a life of 15-18 months before being EOL'd and replaced by the mid-range Gy104 of the next generation. That's just completely unacceptable from a consumer point of view. :thumbsdown:
 

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
If we go by your logic of no optimizations for products no longer being sold, then the flagship big-die GPUs (Gx100) would have a life of 15-18 months before being EOL'd and replaced by the mid-range Gy104 of the next generation. That's just completely unacceptable from a consumer point of view. :thumbsdown:

That's what I would do if I could get away with it. That's what appears to be happening anyway. Most rational people would do the same thing if they could get away with it without hurting future sales. You pretend like Nvidia cares about gamers. That's a little cute, actually :wub:
 
Aug 11, 2008
10,451
642
126
I suppose they are testing what is available to buy new at retail. Granted, there are used cards from previous generations for sale, but I don't know what part of the market that is.

And it would increase testing sites' workload severalfold to test cards from three or four generations from each maker across a lot of different games. That is just the way it is; pretty much every testing outlet in any field tests the "latest and greatest".
 

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
That's what I would do if I could get away with it. That's what appears to be happening anyway. Most rational people would do the same thing if they could get away with it without hurting future sales. You pretend like Nvidia cares about gamers.

Capitalism 101:

Companies aren't your friend or your sports team (even though your sports team is a company).
 

tential

Diamond Member
May 13, 2008
7,348
642
121
That's what I would do if I could get away with it. That's what appears to be happening anyway. Most rational people would do the same thing if they could get away with it without hurting future sales. You pretend like Nvidia cares about gamers. That's a little cute, actually :wub:

How long until Nvidia asks people not to bench its older hardware?
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
I don't have an answer for you, so feel free to ignore me. But in older games it was far more likely to see the 780 Ti pull away from the 970, with the 970 just a bit ahead of the 780.

You can peruse some sites for reviews that use 2013 games. Guru3d does this. They reuse old results, but for an older game it shouldn't matter as much.

Tomb Raider 2013. 780 Ti > 290X > Titan > 970 > 290 > 780:
http://www.guru3d.com/index.php?ct=...dmin=0a8fcaad6b03da6a6895d1ada2e171002a287bc1

Bioshock I. An Nvidia-favored game, so AMD lags. But 970 = 780 here, surprisingly. There isn't much separation among the Titan, 780, and 970, though; they are all essentially the same:
http://www.guru3d.com/index.php?ct=...dmin=0a8fcaad6b03da6a6895d1ada2e171002a287bc1

Battlefield 3. 2011 game. 980 is barely ahead of a 780 Ti; 970 is barely ahead of a 780 and loses to a Titan:
https://tpucdn.com/reviews/NVIDIA/GeForce_GTX_980/images/bf3_2560_1600.gif

Far Cry 3 (2012). 970 ahead of the Titan by a hair, but the 780 Ti is still 11% faster:
https://tpucdn.com/reviews/NVIDIA/GeForce_GTX_980/images/farcry3_2560_1600.gif

Crysis (2007): the 970 loses to the Titan, and the 780 Ti is still much faster:
https://tpucdn.com/reviews/NVIDIA/GeForce_GTX_980/images/crysis_2560_1600.gif

The original TPU GM204 review uses plenty of games from 2007-2014.
https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_980/1.html

There we see the 970 matching a 290X even at 1440p. And while the 970 still lagged behind the 780 Ti by quite a bit at launch, even then it was well ahead of the 780: more than 10%:
https://tpucdn.com/reviews/NVIDIA/GeForce_GTX_980/images/perfrel_2560.gif

But if in the aggregate the 970 is faster than the 780 by more than 10%, why does the 970 have less than a 10% lead in most of the older 2007-2013 games I just listed? Because the TPU GM204 review also used new games from 2014 and popular late-2013 games (and I'm sure Nvidia worked harder to optimize Maxwell for 2013's BF4 because of its continued popularity in 2014 and beyond).
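To show how that can happen arithmetically (I'm not claiming this is TPU's exact averaging method, just a common way to aggregate relative performance), here's a toy geometric mean over made-up per-game ratios:

```python
# toy example: a few newer games with big leads pull the aggregate above the old-game leads
from math import prod

# hypothetical 970-vs-780 per-game ratios; >1.0 means the 970 is ahead
old_games = [1.05, 1.06, 1.07, 1.07, 1.05]   # 2007-2013 titles: leads under 10%
new_games = [1.18, 1.20, 1.22]               # 2014-era titles: much bigger leads

ratios = old_games + new_games
geo_mean = prod(ratios) ** (1 / len(ratios))
print(f"aggregate lead: {100 * (geo_mean - 1):.1f}%")  # ~11%, above most individual old-game leads
```

A handful of newer titles with big leads is enough to drag the average above what most of the older games show individually.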

Go back to the lead the 970 had over the 780 in BF3. Just 7%. But in BF4 it is 12%.
https://tpucdn.com/reviews/NVIDIA/GeForce_GTX_980/images/bf4_2560_1600.gif

Obviously Frostbite changed somewhat. But again, I have no answers. Is it drivers, or is the Maxwell architecture better suited to more recent games (and GCN, in turn, the best suited)?

In 2016 you are much less likely to see a Titan beat the 970, and far more likely to see a 970 beat a 780 Ti, which just did not happen at the GM204 launch.

In all 17 of the TPU games from the GM204 launch review, the 970 loses to the 780 Ti at 2560x1600. In two of the games even the 980 loses to the 780 Ti at 2560x1600. Who knows, but the former is no longer a sure thing and the latter is virtually unheard of nowadays.
 
Last edited:

Termie

Diamond Member
Aug 17, 2005
7,949
48
91
www.techbuyersguru.com
TechSpot is the only site that does this kind of testing, and they don't do it often. Here's their two-part series from over a year ago, which comes as close as you're going to find:

Nvidia: http://www.techspot.com/article/928-five-generations-nvidia-geforce-graphics-compared/
AMD: http://www.techspot.com/article/942-five-generations-amd-radeon-graphics-compared/

As others have stated, there are some VERY good reasons that tech sites generally don't do this, as follows:
(1) they must have the older-gen cards in their inventory, as they cannot get them from vendors
(2) writing such articles is extremely time-intensive, and is likely impossible to make profitable based on generally low interest in the subject except at a new product launch
(3) plenty of game-specific reviews already provide data for the past two generations of cards, such as...
here: http://www.techspot.com/review/1128-rise-of-the-tomb-raider-benchmarks/page3.html
here: http://www.techspot.com/review/1096-star-wars-battlefront-benchmarks/page2.html
here: http://www.techspot.com/review/1089-fallout-4-benchmarks/page3.html
and here: http://www.techspot.com/review/1148-tom-clancys-the-division-benchmarks/page2.html
(4) exposing potential areas of concern will jeopardize the sites' ability to procure review samples in the future.

In my own testing of the 780 Ti vs. GCN vs. Maxwell, it became quite clear that Kepler had an issue, and I sold off my cards, which I always buy at retail (i.e., I don't get them for free).

You can see my last round of 780 Ti vs. GCN vs. Maxwell data here: http://techbuyersguru.com/sapphire-radeon-r9-fury-tri-x-4gb-review
 
Last edited:

ozzy702

Golden Member
Nov 1, 2011
1,151
530
136
I have owned SLI GTX 570s and was suckered into buying a GTX 780, and I have zero confidence that NVIDIA's future cards will age any better than they have in the past.

My next card(s) will more than likely be AMD cards. My 7970 has aged very, very well. In my opinion it's one of the best-value cards ever made.
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
I suppose they are testing what is available to buy new at retail. Granted, there are used cards from previous generations for sale, but I don't know what part of the market that is.

Maybe that explains why they don't retest Kepler, but AMD's current cards are mostly rebranded 290s, 7870s, and 7850s. That excuse doesn't fly for the GCN side.

From a gamer's perspective, Kepler being slow is a bad thing and it makes me less confident in their current and future products.

Exactly why I think it is the perfect time to have the conversation.

At this point, wherever the new AMD and Nvidia GPUs overlap this year, I pretty much expect the Nvidia card to seem faster at first, but I might still buy the AMD card because I can expect that difference to disappear over its life.

A point was made that AMD doesn't refresh their lineup as much, and maybe they can't given their financial situation. I personally see that as a good thing, especially if each new generation of cards means mine is left in the dust. It is so weird reading old reviews for, say, the 660, and there it is competing with the AMD 7870; then years later the 960 is competing with another rebranded 7870.

In 2016 you are much less likely to see a Titan beat the 970, and far more likely to see a 970 beat a 780 Ti, which just did not happen at the GM204 launch.

Exactly. The unsaid point there is that the Titan was the first $1,000 consumer GPU, but today it is sometimes beaten by a competitor just as old that cost almost half as much: the 7970. That is an amazing loss of value over the life of that card.

Some people like to say that only current performance matters, because by the time something like a Titan is relegated to the midrange it doesn't matter whether it's upper midrange or lower midrange, since those buyers have already bought another high-end card. My retort is that not everyone is a high-end GPU consumer, and it is obvious that unless VR explodes, the PC gaming market is starting to stagnate on the hardware side. There are some people on this forum still rocking original Titans, and they got screwed worse than any PC gamers who have ever lived (unless they needed the DP compute).
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
TechSpot is the only site that does this kind of testing, and they don't do it often. Here's their two-part series from over a year ago, which comes as close as you're going to find

Thank you very much. I think you are right; that is as close as I will get.

(1) they must have the older-gen cards in their inventory, as they cannot get them from vendors

Seems like a GPU review site would keep around a card from each generation or so to compare against what is coming, like how CNET used to always roll out its eight-year-old Kuro plasma to compare against new TVs.

(2) writing such articles is extremely time-intensive, and is likely impossible to make profitable based on generally low interest in the subject except at a new product launch

In general I agree, but right now there isn't a lot of other solid concrete news. In other parts of journalism these are the periods when information gets backfilled.

And the interest wouldn't be low if the headline was clickbaity enough. An article like "Past Analysis Shows Why Upcoming Nvidia/AMD Hardware Might Be a Really Poor Investment" would be HUGE all over the internet; it would be used as a source here, on Tom's, and on Reddit. It would be the centerpiece of the AMD vs. Nvidia fanboy battle for months, until we have real benchmarks.

(3) plenty of game-specific reviews already provide data for the past two generations of cards, such as...

Sure, some sites do a great job measuring new games with old cards. All I am asking for is the inverse: measure old games with new drivers and compare against the first time you tested. It would tell us a lot.

(4) exposing potential areas of concern will jeopardize the sites' ability to procure review samples in the future.

This statement worries the crap out of me. If I am decoding it properly, it basically boils down to "if a site did that, Nvidia would punish them."

If sites are worried about pointing out historical failings, then how the HELL are they going to point out current failings? At least with historical failings you can excuse it some by saying "they did better since then," but with current gen stuff there is nowhere to run.

As a sidenote, I really like your site. The fact that you buy your own GPUs makes me trust your results more.
 

Termie

Diamond Member
Aug 17, 2005
7,949
48
91
www.techbuyersguru.com
...



This statement worries the crap out of me. If I am decoding it properly, it basically boils down to "if a site did that, Nvidia would punish them."

If sites are worried about pointing out historical failings, then how the HELL are they going to point out current failings? At least with historical failings you can excuse it some by saying "they did better since then," but with current gen stuff there is nowhere to run.

By the way, saying bad things about AMD can also get you yanked from the review list. TechReport's investigation into frame pacing did not make it any friends at AMD, and it was eventually denied a Nano for review. Ironically it did lead to Scott Wasson leaving the site to work for AMD.

As a sidenote, I really like your site. The fact that you buy your own GPUs makes me trust your results more.

Thanks! I actually just purchased a GTX 970 simply to use it for testing... I didn't have a need for it other than for 290/390/980 (and next-gen) face-offs. I'm beginning to build up a stockpile of older cards for the kind of tests you're talking about. I currently have a 560 Ti, HD 7870, 290, 390X, 970, 980, and 980 Ti. I wish I hadn't sold my 670 and 780 Ti... but that's hindsight. And while I got a Fury as a loaner for testing, I intend to buy one for future face-offs.