[Techspot] Nvidia vs Nvidia: 5 generations of cards tested

Feb 19, 2009
Good article, but one problem: using system power consumption for the comparison is invalid due to the massive performance deltas. Higher fps = more power use on the CPU & motherboard/memory subsystem.
 

digitaldurandal

Golden Member
Dec 3, 2009
The 980 isn't much of an upgrade from the 780 Ti.
Still might be worth it if you play at very high resolutions.
I was originally trying to wait for everyone to dump their 780 Ti for the small upgrade and get two of them used on the cheap, but the AMD cards are so cheap now that I just got new cards instead, and they are pretty close to Nvidia's offering, so I am fine with it.

Hopefully Nvidia has another card up their sleeve and they are just stalling, because, as you stated, it isn't a great upgrade for top-tier users.
 

SPBHM

Diamond Member
Sep 12, 2012
it would be interesting to have an AMD version of this,

but it's pretty interesting to see: 780 Ti to 980 is not huge, but if you accept the 980 as more of a successor to the 680 it looks good, and 580 to 980 is a huge difference after 4 years, which makes sense. A shame we didn't see more changes with DX12 and such, but that's coming soon.

Performance/watt, comparing the 480/580 to the 980, is just amazing.
 

digitaldurandal

Golden Member
Dec 3, 2009
Good article, but one problem: using system power consumption for the comparison is invalid due to the massive performance deltas. Higher fps = more power use on the CPU & motherboard/memory subsystem.

I think it is completely relevant. Users are going to want to know the power consumed by their machine in real-world situations. I don't think the point of the power section is to discuss per-frame efficiency specifically, but rather the power consumption of a system using the card in games at 1080p.
 

SolMiester

Diamond Member
Dec 19, 2004
The 980 isn't much of an upgrade from the 780 Ti.
Still might be worth it if you play at very high resolutions.
I was originally trying to wait for everyone to dump their 780 Ti for the small upgrade and get two of them used on the cheap, but the AMD cards are so cheap now that I just got new cards instead, and they are pretty close to Nvidia's offering, so I am fine with it.

Hopefully Nvidia has another card up their sleeve and they are just stalling, because, as you stated, it isn't a great upgrade for top-tier users.
The 980 was never meant as an upgrade to the 780 Ti; even NV stated that. And yes to an AMD comparison, please!
 

OlyAR15

Senior member
Oct 23, 2014
Are you reading the same article I am? Saying the 980 isn't an upgrade to the 780Ti?

How can you say that when the 980 is faster? If it was slower, then it wouldn't be an upgrade, but it is faster.
 

SolMiester

Diamond Member
Dec 19, 2004
Because when NV targets upgraders, they always go after owners of the previous generation... Just because some journo says it's the upgrade path doesn't mean those were NV's targets... The 780-series crowd needs to wait for GM200, not GM204!
 

tviceman

Diamond Member
Mar 25, 2008
Pretty sure that article was using one of the mature 480 cards, with power consumption not nearly as bad as the initial release cards. Still, the GM204 (GTX 980) coming close to 2.5x faster than GF110 (GTX 580), all while being significantly smaller, consuming considerably less power, and having only a one-node-jump advantage, is pretty cool to see.

Progress!
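Rough math on that perf/watt jump, assuming the article's ~2.5x performance figure and the published reference TDPs (~244W for the GTX 580, ~165W for the GTX 980) rather than this review's measured in-game draw:

Code:
# Back-of-the-envelope perf/watt gain, GF110 (GTX 580) -> GM204 (GTX 980).
# Inputs are assumptions: ~2.5x relative performance and reference TDPs,
# not the measured numbers from the article.
perf_ratio = 2.5      # GTX 980 performance relative to GTX 580
tdp_580 = 244.0       # watts, GTX 580 reference TDP
tdp_980 = 165.0       # watts, GTX 980 reference TDP

perf_per_watt_gain = perf_ratio * (tdp_580 / tdp_980)
print(f"Approx. perf/watt improvement: {perf_per_watt_gain:.1f}x")  # ~3.7x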
 

Grooveriding

Diamond Member
Dec 25, 2008
Are you reading the same article I am? Saying the 980 isn't an upgrade to the 780Ti?

How can you say that when the 980 is faster? If it was slower, then it wouldn't be an upgrade, but it is faster.

Because what is the point? 15% is actually generous imo, and they tested a limited subset of games. I've seen a lot of tests show sub-10%. The middle ground is that the 980 is about a 10% improvement over a 780 Ti. That is a worthless upgrade path and won't provide a discernible difference between the two.

Since the GTX 680 it's become apparent that Nvidia is on a small-die weak flagship, then big-die strong flagship release schedule. The big upgrade will be GM200.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
Good article, but one problem: using system power consumption for the comparison is invalid due to the massive performance deltas. Higher fps = more power use on the CPU & motherboard/memory subsystem.

And sadly the 480 still tops most of the power charts despite this.
 

Deders

Platinum Member
Oct 14, 2012
Interesting to see how efficient the memory bandwidth compression is with the 980.
 

RussianSensation

Elite Member
Sep 5, 2003
I think it is completely relevant. Users are going to want to know the power consumed by their machine in real-world situations. I don't think the point of the power section is to discuss per-frame efficiency specifically, but rather the power consumption of a system using the card in games at 1080p.

I tend to strongly agree with your point of view. You cannot play a game with a videocard without the rest of the system. Looking at total system power usage actually tells the user roughly what PSU they would need with a given CPU. So many times we hear online of people being afraid to buy a 290X or 780 Ti because they read AIB requirements calling for a 700W-850W PSU and get all paranoid. Articles that focus on total system power usage, especially on a per-game basis, are very useful for calming those gamers' fears: you can run a top-of-the-line GPU with a top-of-the-line i7 on Socket 1150 on a solid 450-500W PSU. You do not need a 750W PSU to run a 290X/780 Ti-style card!
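As a quick sanity check, here's the kind of back-of-the-envelope math I mean (the wall figure and efficiency are assumed round numbers, not this article's measurements):

Code:
# Sketch: turn a review's "total system power at the wall" figure into the
# DC load the PSU actually has to deliver.  All inputs are assumptions.
wall_draw = 420.0      # watts at the wall, i7 + 290X/780 Ti while gaming
psu_efficiency = 0.88  # roughly what an 80 Plus Gold unit does at this load

dc_load = wall_draw * psu_efficiency
print(f"DC load on the PSU: {dc_load:.0f}W")  # ~370W

# Add margin for transients and you are still comfortably inside a quality
# 450-500W unit - no 750W PSU needed for a single 290X/780 Ti.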

However, since not everyone is running a 4770K/4790K, we often quote single-card power usage, because it would be hard to find comparable total system power figures online for, say, an i7-2600K + 980, as such a system is rarely tested.

Good article, but one problem: using system power consumption for the comparison is invalid due to the massive performance deltas. Higher fps = more power use on the CPU & motherboard/memory subsystem.

I feel the opposite. It actually tells a user with a modern i7 how much power they can save by upgrading to a 980. While many on this forum disagree, to me total system power consumption is also a meaningful metric for single-GPU cases, since you cannot run a GPU on its own in a vacuum. Single-GPU power measurement (TPU style) is helpful when we want to compare the efficiency of a GPU architecture/SKU. However, it has led to many users incorrectly applying a multiplier when discussing 2/3/4-GPU setups (i.e., they take the power usage of a single card and triple it for a 3-GPU setup, which is NOT how Tri-SLI works). For example, at the 4:37 mark:

https://www.youtube.com/watch?v=UnS0xWtoRzk

Despite how useful single-GPU power usage can be, it is exactly because of the factors you mentioned, plus diminished SLI/CF scaling, that total system power usage is often a much better way to tell us what PSU we need and what it will actually cost to run the whole rig. For marketing/performance-per-watt reasons it also sounds A LOT more impressive to say that the GTX 980 uses 90W less power than a 780 Ti than to state that one system uses 250W and another 340W.
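To put that framing difference in percentages (the ~250W 780 Ti board power is an assumed round number; the system totals are the ones above):

Code:
# Same ~90W saving, two framings.
system_980, system_780ti = 250.0, 340.0   # total system draw, watts
card_780ti = 250.0                        # assumed 780 Ti board power, watts
saving = system_780ti - system_980        # 90W either way

card_level_cut = saving / card_780ti      # 90/250 -> 36% "less power"
system_level_cut = saving / system_780ti  # 90/340 -> ~26% less at the wall
print(f"Card-level framing:   {card_level_cut:.0%} reduction")
print(f"System-level framing: {system_level_cut:.0%} reduction")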

Too bad they didn't include other recent games such as COD: AW, Far Cry 4, Ryse: Son of Rome and AC Unity in their test, but instead used oldies such as Bioshock Infinite, or used The Crew, which they didn't even count in the conclusion. One point I always make is how little there is to gain by buying the fastest NV/AMD card/refresh: as we see, the 580 didn't really last any longer than a 480, and I would imagine the 670 and 680 would be similarly close on that chart.

Are you reading the same article I am? Saying the 980 isn't an upgrade to the 780Ti?

How can you say that when the 980 is faster? If it was slower, then it wouldn't be an upgrade, but it is faster.

Because who in their right mind would sell their $650-700 780 Ti for $350-400 (its current market price) to get 12% more performance at 2560x1600, as per the review? It's doubtful a user would feel a 12% difference in gameplay on average without resorting to actual benchmarking. While it is an upgrade in the sense that if you skipped the 780 Ti and got the 980 you do end up with a faster card, the 980 is not a legitimate, worthwhile upgrade for 780 Ti users, unlike moving from a 580 to a 680 was.
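Quick ballpark on what that sidegrade actually costs (street prices assumed, not taken from the review):

Code:
# Cost per percent of extra performance for a 780 Ti owner moving to a 980.
resale_780ti = 375.0   # midpoint of the assumed $350-400 used price
price_980 = 550.0      # assumed GTX 980 street price
perf_gain_pct = 12.0   # ~12% faster at 2560x1600 per the review

net_cost = price_980 - resale_780ti
print(f"Net outlay: ${net_cost:.0f} for {perf_gain_pct:.0f}% more performance")
print(f"About ${net_cost / perf_gain_pct:.0f} per percentage point")  # ~$15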

Grooveriding covered this point above. My biggest gripe is the naming convention. In the past NV would never label a mid-range GeForce4 Ti 4200 as a 4600, a 5600 Ultra as a 5800 Ultra, a 6600 GT as a 6800 GT, or a GTX 460 1GB as a GTX 480, despite those next-gen mid-range cards outperforming the previous-gen flagships. However, since the GTX 680 they have basically been labelling x60/x60 Ti-class cards as x80-series. It's grossly misleading, since they end up marketing 2 flagship cards in what is essentially just 1 GPU architecture generation. AMD isn't much better, sometimes creating new generations out of thin air, with the HD 6970 being basically an HD 5890, or the 3870 really just a 2970 XT if you will.

Wish more reviews focused on inter-generational performance increases, as that data is often hard to find.
 

Bateluer

Lifer
Jun 23, 2001
Because who in their right mind would sell their $650-700 780 Ti for $350-400 (its current market price) to get 12% more performance at 2560x1600, as per the review?

The same people who said 'Screw it, I'm buying $1,400 ($2,800) worth of GPUs to get the absolute best performance possible.'
 
Feb 19, 2009
You cannot play a game with a videocard without the rest of the system.

There definitely needs to be both; for serious review sites, it's a non-issue to measure over the 12V rails to the GPU.

Because looking at these charts, an uninformed reader would assume the 780ti is a terribly inefficient card like the 480 was, which just isn't true.
 

RussianSensation

Elite Member
Sep 5, 2003
The same people who said 'XXXX it, I'm buying $1,400 ($2,800) worth of GPUs to get the absolute best performance possible.'

Ya, I forgot about the top 1%. You might want to edit your post though. :D

There definitely needs to be both; for serious review sites, it's a non-issue to measure over the 12V rails to the GPU.

Because looking at these charts, an uninformed reader would assume the 780ti is a terribly inefficient card like the 480 was, which just isn't true.

I agree, both data points are valid. Of course the 780 Ti has better performance/watt, but its power usage is not much lower than the 480's.

480 peaks at 272W


780Ti peaks at 268W


The 290X/780 Ti/480 are all power-hungry cards, in reference form at least.

Side note: as I pointed out before, TPU made a gross error with the 273W power figure for the 7970 GHz, as they carried over incorrect chart data. It should be 238W, but instead they used the 273W FurMark value from their 780 Ti review. Not one member of TPU, nor W1zzard, noticed this error for nearly a year, and yet it's commonly quoted by even the most experienced/senior members of TPU, such as HumanSmoke here. ;)
 

guskline

Diamond Member
Apr 17, 2006
OP, nice article. We can all be "hyper-critical" of bits of the article, but overall it shows a nice pattern of improvement by Nvidia.

I would also join the chorus in asking for a similar AMD gpu card review.
 

RussianSensation

Elite Member
Sep 5, 2003
I like how articles like these prove that buying a flagship $500-550 card and keeping it for 5 years is worse than upgrading more often with slower cards:

March 2010 480/November 2010 580 for $500 to March 2015

Vs.

Sept 2009 5850 for $270 or 470 for $275 by July 2010, then 7970 1Ghz/GTX 670 for $300 by March-April 2013.

Maybe a lot of people like to buy the flagship and keep it for 4-5 years to have that "buy it and forget it" mentality. Using the same idea today, an after-market $250-350 290/970 will provide 83-87% of the performance of the 980 for the next 2-2.5 years, and then a $250-300 card bought around December 2016-June 2017 will slaughter the 980 for the following 2-2.5 years. The 15-20% gap in today's games will be hardly noticeable, but 50%+ faster performance by early 2017 will come in handy for next-gen games. Plus, you can resell the 290/970 and get some value out of it by the end of 2016, while in 5 years a 980 will have lost 70-80% of its value.
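Rough 5-year math, with prices and resale values as illustrative assumptions:

Code:
# One $550 flagship kept 5 years vs. two ~$300 cards with a mid-cycle resale.
# All prices and resale values below are assumptions, not quotes.
flagship = 550.0
flagship_resale_5yr = 0.25 * flagship   # loses ~70-80% of its value

midrange_1 = 300.0                      # e.g. 290/970-class card today
midrange_1_resale = 150.0               # sold after ~2.5 years (assumed)
midrange_2 = 300.0                      # replacement in late 2016/2017

flagship_net = flagship - flagship_resale_5yr
midrange_net = midrange_1 - midrange_1_resale + midrange_2

print(f"Flagship, 5 years:   ${flagship_net:.0f} net")   # ~$413
print(f"Two mid-range cards: ${midrange_net:.0f} net")   # ~$450
# Similar net outlay, but the staggered path ends the 5 years on a much
# faster card (and the second card still holds some resale value).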
 

kasakka

Senior member
Mar 16, 2013
I think the interesting part is what it says about current gen consoles. With their integrated GPUs they perform roughly at the GTX480/580 levels at 1080p (or less). Too bad the tests don't show minimum fps as that might be where the newer cards pull ahead a lot more.
 

raghu78

Diamond Member
Aug 23, 2012
I like how articles like these prove that buying a flagship $500-550 card and keeping it for 5 years is worse than upgrading more often with slower cards:

March 2010 480/November 2010 580 for $500 to March 2015

Vs.

Sept 2009 5850 for $270 or 470 for $275 by July 2010, then 7970 1Ghz/GTX 670 for $300 by March-April 2013.

Maybe a lot of people like to buy the flagship and keep it for 4-5 years to have that "buy it and forget it" mentality. Using the same idea today, an after-market $250-350 290/970 will provide 83-87% of the performance of the 980 for the next 2-2.5 years, and then a $250-300 card bought around December 2016-June 2017 will slaughter the 980 for the following 2-2.5 years. The 15-20% gap in today's games will be hardly noticeable, but 50%+ faster performance by early 2017 will come in handy for next-gen games. Plus, you can resell the 290/970 and get some value out of it by the end of 2016, while in 5 years a 980 will have lost 70-80% of its value.

5 years is too long a period for a PC gamer who plays the latest games and wants a good gameplay experience; ideally 2-3 years. So it makes more sense to buy the n-1 SKU, or the card below the flagship: buy the card for USD 300-350 and upgrade after 2-3 years rather than spend USD 550-650 only once and wait 5 years.
 

RussianSensation

Elite Member
Sep 5, 2003
I think a Core i5/i7 with a 580 will be faster due to the CPU bottleneck being lifted. The GPU in the PS4 is at 7850 level, so it shouldn't be surprising that it performs at 480/580 level in less CPU-limited games. Given the low-end hardware in the current consoles, the graphics in games are often very close to the PC versions (FC4, DAI, AC Unity come to mind). Given the drought of amazing PS4/XB1 exclusives, I think they could have waited 1 more year to release the next-gen consoles. But from a business point of view, given that the PS4/XB1 have already sold 25+ million in just 1 year, the timing worked out for them.
 

NTMBK

Lifer
Nov 14, 2011
Woah, from Fermi flagship to Kepler flagship is almost double the performance. Wish we were still getting node shrinks. :(
 

n0x1ous

Platinum Member
Sep 9, 2010
480 to 780 is pretty much 100% in all scenarios, which was my upgrade path. When you look at chip-to-chip improvements it's very consistently good. You just have to ignore Nvidia's new-midrange-as-x80 double dip.