In the video card market there seem to be recurring claims about how GPUs age over time, but the explanations all seem to be speculation. Why doesn't someone (i.e., one of these sites that review GPUs) actually do the testing to figure it out?
Take Kepler, for example. It is obviously not holding up as well as it once did in new games against both GCN and Maxwell, but the explanations vary: some blame a lack of driver optimization for Kepler, others blame broader trends in game development. This seems easy to test. Go back and benchmark older games (say, from 2013) on Maxwell to check whether there was some across-the-board driver boost, then retest slightly older games (say, 2014-2015) on Kepler to see whether subsequent driver updates actually helped it. If Maxwell gains in old games while Kepler doesn't, or if Kepler only picks up performance six to eight months after a game ships, then there is obviously some truth to the claim that Maxwell gets priority.
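Here is a rough sketch of that first test as a few lines of Python; every card, game, driver label, and fps figure below is a made-up placeholder, just to show the shape of the comparison:

```python
# Hypothetical driver-regression check: did newer drivers help one
# architecture but not the other in the same game? All fps values
# are invented placeholders, not real benchmark results.

results = {
    # (gpu, game, driver): average fps from a repeatable benchmark run
    ("GTX 780", "Crysis 3", "old_driver"): 62.0,
    ("GTX 780", "Crysis 3", "new_driver"): 61.5,
    ("GTX 970", "Crysis 3", "old_driver"): 64.0,
    ("GTX 970", "Crysis 3", "new_driver"): 67.0,
}

def driver_delta(gpu: str, game: str) -> float:
    """Percent fps change for one card in one game across the two drivers."""
    before = results[(gpu, game, "old_driver")]
    after = results[(gpu, game, "new_driver")]
    return (after - before) / before * 100.0

for gpu in ("GTX 780", "GTX 970"):  # Kepler card vs Maxwell card
    print(f"{gpu}: {driver_delta(gpu, 'Crysis 3'):+.1f}% old -> new driver")
```

If the Maxwell card shows a gain in the same game where the Kepler card shows nothing (or a loss), that is the prioritization story in hard numbers.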
Or GCN vs Nvidia aging overall. It is obvious that GCN has aged well, especially against Kepler and somewhat against Maxwell, but again the explanations vary: some credit developers targeting the consoles, others say AMD's drivers simply got better. This also seems easy to test. Go back and retest old games to see whether performance improved or regressed, both against the competition and against the original numbers. If old games now run better, the driver clearly improved; if only new games shifted, development trends are the likelier cause.
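The aging question reduces to tracking the same performance ratio at two points in time. A minimal sketch along the same lines (again, every number here is a placeholder I made up):

```python
# Hypothetical aging check: track the AMD/Nvidia fps ratio in the
# *same* game at two points in time. Placeholder numbers only.

snapshots = {
    # (test_year, game): {gpu: average fps}
    (2013, "Tomb Raider"): {"R9 290": 70.0, "GTX 780": 72.0},
    (2016, "Tomb Raider"): {"R9 290": 74.0, "GTX 780": 71.0},
}

for (year, game), fps in sorted(snapshots.items()):
    ratio = fps["R9 290"] / fps["GTX 780"]
    print(f"{year} {game}: R9 290 runs at {ratio:.2f}x the GTX 780")

# Ratio climbing in an old game points at drivers; ratio climbing only
# in new games points at development trends (e.g. console targeting).
```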
I get that these sites are focused on new cards and the excitement around new technology, and that at some level there may be a reluctance to second-guess old conclusions, since hindsight might contradict recommendations they made at the time. That said, there is no new hardware most people can actually buy for months, and with a new generation of GPUs arriving, observations from the previous generations would help us anticipate which unexpected outcomes to expect. At the very least, there are a million clicks waiting for whoever produces actual proof of what is going on, while the two camps of online fans battle each other with "leaked" or made-up evidence of what the new hardware will do.
I just don't understand.