Updated List of Video Card GPU OVERALL Performance VP Ratings - TITAN update!


BoFox

Senior member
May 10, 2008
689
0
0
Anybody want to score GTX 780 and GTX 770?

GPU Boost 2.0 just makes it complicated like hell for me, especially as one would be hard pressed to find an American site that ensures a GPU Boost 2.0 card is properly warmed up at the start of benchmark runs.
 

Elfear

Diamond Member
May 30, 2004
7,081
596
126
Anybody want to score GTX 780 and GTX 770?

GPU Boost 2.0 just makes it complicated like hell for me, especially as one would be hard pressed to find an American site that ensures a GPU Boost 2.0 card is properly warmed up at the start of benchmark runs.

3DCenter averaged a bunch of review sites and found the following:

The GTX 770 is dead even with the 7970GHz at 1080p 4xAA and 3% slower at 1600p 4xAA.

The GTX 780 is 14.7% faster than the 7970GHz at both 1080p 4xAA and 1600p 4xAA.

At least two of the review sites (Computer Base and Hardware France) tested both cards at warmed up speeds.

I'd probably rate the 770 and the 7970GHz the same and put the 780 at 7970GHz + 15%.
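
For reference, here's a minimal sketch of how those relative-performance averages translate into VP numbers. The ~272 VP baseline for the 7970 GHz Edition is an assumption back-calculated from the ~313 VP GTX 780 figure that comes up later in the thread, not an official rating:

```python
# Sketch: turn relative-performance averages into VP ratings.
# The 7970 GHz Edition baseline is an assumption (back-calculated from the
# ~313 VP GTX 780 figure mentioned later in the thread), not an official number.
HD7970GHZ_VP = 272.0

relative_perf = {
    "GTX 770": 1.000,   # dead even with the 7970GHz on average
    "GTX 780": 1.147,   # ~14.7% faster at 1080p/1600p 4xAA
}

for card, factor in relative_perf.items():
    print(f"{card}: ~{HD7970GHZ_VP * factor:.0f} VP")
# GTX 770: ~272 VP
# GTX 780: ~312 VP
```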
 

BoFox

Senior member
May 10, 2008
689
0
0
I was thinking of just doing an approximate rating of the GTX 780 and 770 with the two sliders set to the right (which is how AlienBabelTech's review was done), rather than flooding the rankings with two separate ratings for each card.

Is that ok with you guys?

With the temperature target set at 85C rather than the default 80C (merely 5 degrees higher), most of the throttling appears to subside. Nvidia likely kept the 80C default in order to keep retail PC-system wholesalers happy with a much higher percentage of satisfied Tri-SLI customers than if the default were 85C (which would mean much higher fan noise, since most Tri-SLI setups have at least two cards right next to each other, with variable ambient temperatures especially during the summer). At such a price premium, Nvidia probably wanted the rest of the amenities to come across as "premium" too, such as incredibly quiet fan noise and fairly modest power consumption - especially after the Fermi dilemma of the last 40nm generation. Of course, Nvidia probably also knew that at least 95% of American review sites would only test Boost 2.0 cards with quick benchmark runs rather than spending roughly 5x as much time ensuring the cards were properly warmed up before each benchmark (5x the time being hard to justify against an impending deadline, usually around 7 days after the card(s) are received).

What do you guys think?
 

BoFox

Senior member
May 10, 2008
689
0
0
Correction - it does not actually take 5x the time just to ensure that Boost 2.0 cards are properly warmed up at the start of a run; rather, the nature of many benchmark tests requires up to 30 seconds of loading time (give or take). Even when repeating these tests, there is still some loading time during which a card quickly cools down by 20 degrees or more. For some reviewers, it would mean overhauling the benchmark suite toward specific scenarios that can be repeated or looped without a 30-second load each time. Otherwise, one would have to loop a test maybe 5 times or even more until the card reaches a realistic temperature plateau representative of a regular gaming session of 20 minutes or longer.
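
As a rough illustration of that loop-until-plateau approach, here's a minimal sketch, assuming an Nvidia card with nvidia-smi available on the PATH; run_benchmark_pass() is just a hypothetical stand-in for whatever test is being looped:

```python
import subprocess
import time

def gpu_temp_c() -> int:
    """Read the current GPU core temperature (Celsius) via nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"]
    )
    return int(out.decode().strip().splitlines()[0])

def run_benchmark_pass():
    """Hypothetical stand-in for one pass of the benchmark scene."""
    time.sleep(30)

# Repeat warm-up passes until the temperature stops climbing (a plateau),
# then record results from a properly heat-soaked card.
prev = gpu_temp_c()
for _ in range(10):                 # cap the number of warm-up passes
    run_benchmark_pass()
    cur = gpu_temp_c()
    if cur - prev <= 1:             # within ~1C of the previous pass: plateau
        break
    prev = cur

run_benchmark_pass()                # the pass whose numbers you actually keep
```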

So, I guess I will just have to assume that the temperature target slider was set to the right, since the vast majority of reviewers benched the cards before they had time to fully warm up, due to the nature of most benchmark tests. There will be an asterisk for each of these GPU Boost 2.0 cards, pointing to a footnote indicating that the cards were effectively tested as if the default temperature target had been moved to the right, simply because temperature throttling did not yet fully come into effect in the usual benching scenarios.

The power target is yet another thing, but keep in mind that while the card is still running cool, there is far less power leakage. As the temperature goes up, power usage can easily rise by 20W or even more. So there should not be a problem with assuming that both sliders are set to the right.

Of course, the European sites that reviewed the cards with GPU Boost 2.0 variability under scrutiny will be properly accounted for.
 
Last edited:

BoFox

Senior member
May 10, 2008
689
0
0
Perhaps I am taking this too seriously, but would I be making the ratings "unfair" in favor of Nvidia?

Say the GTX 780 is rated at 312-313 VP - that is just what the review sites are giving us, and we can safely assume that the majority of GTX 780 owners would simply move the Temp Target and Power Target sliders to the right rather than let the card throttle below even the advertised Boost clock spec in many games.

The Radeon cards here are also rated with the assumption that the Power Control slider is set to +20% (mainly because, at the beginning of a benchmark run, a card can easily consume 20% less power than when fully warmed up, due to temperature-related power leakage).

While Nvidia might gain an advantage in the VP ratings this way (their cards being rated as running faster than default spec in a normal gaming session of 15 minutes or longer), I'll add an asterisk next to the VP scores for GPU Boost 2.0 cards starting now.

After all, some cards are spec'd to run near their maximum overclock, while others run far below their overclocking potential (like GTX 460s or HD 7850s on one end of the spectrum vs. Titan or HD 6970 on the other). So it's not the end-all, be-all.
 
Last edited:

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
I think you have valid points, and the rating should reflect the end user's experience. Boost is still pretty primitive (a year old) and currently takes advantage of cool cards to skew benchmarks, so it doesn't accurately reflect the end user's experience. I suspect that someday it will be more stable, but currently the cards are all over the place (generally boosting higher than rated at stock and dipping when warm). It's not like the card is ~5% faster out of the box - you have to adjust it.
 

BoFox

Senior member
May 10, 2008
689
0
0
Yeah, I agree completely! I guess it's not too big of a deal to just rate the Boost 2.0 cards as the review sites benched them, as long as there's an asterisk with a footnote explaining the variable nature of Boost 2.0 results.
 

szvwxcszxc

Senior member
Nov 29, 2012
258
0
76
You're missing:
2x 7950 CROSSFIRE
2x 7970 CROSSFIRE
10x 999999 CRXSSFXRE999 (kidding about this one!! of course!! ;))

You also neglected resale value... $1000 cards aren't going to be worth much of anything (if you can even sell them) a few years from now - no one's going to pay $1000 for an old card, which drives resale value down a LOT, losing you more than 50% or even 90% of what you paid - but $250 cards won't depreciate nearly as much and will hold their value pretty well.
 
Last edited:

BoFox

Senior member
May 10, 2008
689
0
0
Sorry for the delay - been on a summer vacation in Florida. I'll try to manage during my stay here over the summer. I think I can do it.. *OHMMMM* I can do it!!
 

elizabel

Junior Member
Jul 17, 2013
1
0
0
I am not an expert in video cards, but I recently upgraded my video card and it is working great. Things are faster and much clearer than before. However, I am surprised to learn that my new card, a GeForce 210, ranks 208 out of the 212 video cards mentioned. That is the bottom of Tier N, which says that cards at this tier do not play modern games. I tried 2-3 games and they play fine, so I'm not sure what "modern games" means.
I am not a gamer, and I only played Crysis to test my new video card; things looked great. What exactly am I missing here?

I would also like to know which one would be better (HP is offering this choice):
4GB Nvidia GeForce GT 640 vs. 2GB Nvidia GeForce GTX 645

Is GTX always better than GT?
Are CUDA cores similar to the cores on a CPU, like dual-core or quad-core?

Thanks
 

ruhtraeel

Senior member
Jul 16, 2013
228
1
0
I am not an expert in video cards, but I recently upgraded my video card and it is working great. Things are faster and much clearer than before. However, I am surprised to learn that my new card, a GeForce 210, ranks 208 out of the 212 video cards mentioned. That is the bottom of Tier N, which says that cards at this tier do not play modern games. I tried 2-3 games and they play fine, so I'm not sure what "modern games" means.
I am not a gamer, and I only played Crysis to test my new video card; things looked great. What exactly am I missing here?

I would also like to know which one would be better (HP is offering this choice):
4GB Nvidia GeForce GT 640 vs. 2GB Nvidia GeForce GTX 645

Is GTX always better than GT?
Are CUDA cores similar to the cores on a CPU, like dual-core or quad-core?

Thanks

If they are similar in price, go for the GTX 645. The GT 640 won't be powerful enough to make use of its 4GB of VRAM (for gaming, at least). Since the GT 640 is comparatively underpowered, having 4GB of VRAM instead of 2GB wouldn't make any difference when the card is held back by the rest of its hardware.


What's the best one I can get for $100-200?

GTX 650 Ti Boost.
 
Last edited:

BoFox

Senior member
May 10, 2008
689
0
0
Working on it now after another busy and interesting summer - first, I'll do a combined Titan rating (based on ALL reviews put together), and then use it as an additional reference point for other new cards.

If anybody has any suggestions on how to make the ratings more clear and less confusing, or whatever - please feel free to share. TIA!
 

BoFox

Senior member
May 10, 2008
689
0
0
Titan score updated, with just one rating at 343 VP (and an updated footnote in the OP stating):

* - Note for non-overclockers (those who do not want to adjust the Temperature Target slider): these stock GPU Boost 2.0 cards can perform several percent faster at default settings during "cold starts," especially if benchmark runs are under 2 minutes long, let alone 30 seconds (for the reasons behind this, see: http://forums.anandtech.com/showpost.php?p=34664254&postcount=54 ). The rating only reflects an average of ALL reviews put together, even though most reviewers did not ensure the cards were warmed up in advance for a more accurate reflection of normal gameplay lasting more than 15-30 minutes.

Using Titan as an example, if we look at just the review sites that compared warmed-up results against "cold starts" across their benchmarks, and apply that difference to the overall average, we get this:
111%-- Geforce GTX TITAN (warmed up, default*) 6GB (DX11) -- 320 VP
instead of:
120%-- Geforce GTX TITAN 6GB (DX11) -- 343 VP
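
As a rough sketch of the adjustment being applied here (the percentages shown are rounded, so recomputing from them lands a few VP short of the quoted 320):

```python
# Scale the all-reviews Titan average by the warmed-up vs. cold-start
# performance indices reported by the sites that tested both states.
overall_vp = 343      # average of ALL reviews (mostly cold-start runs)
cold_index = 120      # relative performance, cold start
warm_index = 111      # relative performance, warmed up

warmed_vp = overall_vp * warm_index / cold_index
print(f"warmed-up Titan: ~{warmed_vp:.0f} VP")   # ~317 VP (rounded indices; the OP quotes 320)
```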
 
Last edited:

BoFox

Senior member
May 10, 2008
689
0
0
UPDATE - added:

GTX 780
GTX 770
GTX 760

Removed the Ultra High-End class for now, reserving it for dual-GK110 and HD 9970x2 cards if they come out soon enough (and are good enough)...

Also downgraded most cards in the mid and high-end classes by at least one class, and created a new class, "Mid-Lower Mid-Range" - what used to be just Middle Mid-Range is now split into Mid-Upper and Mid-Lower.

The GTX 580 X2 (Mars II edition) is in Upper High-End just to occupy that class for now, until the GTX 780 gets pushed down - probably very soon (as NV starts releasing all kinds of GK110 derivatives)..
 
Last edited:

BoFox

Senior member
May 10, 2008
689
0
0
Now, we're finally seeing some retail cards giving you more than 1 VoodooPower per dollar!!!!

That is, for less than 1 dollar per VP! This is a wonderful milestone that took 28nm cards forever to reach. The 40nm GTX 460 768MB was already approaching it quite fast with the price discounts, but never really got there except for one or two odd deep rebates.

For starters, without rebate - there is a HD 7870 GHz Edition, rated at 171 VP, for $169 with free s/h + 2 games:
http://www.newegg.com/Product/Produc...82E16814161404

(The same goes for the HD 7850 - even better bang for the buck without a rebate, rated at 141 VP, for $135 with free s/h + 2 games:
http://www.newegg.com/Product/Produc...82E16814161406 )

But the prices keep on dropping, and today there's another 7870 with a rebate, for $154.99 AR:
http://www.newegg.com/Product/Produc...82E16814131492

(Same thing is happening for HD 7790 pretty much!!

HD 7950 too, with 3 games - like this Boost Edition especially: http://www.newegg.com/Product/Product.aspx?Item=N82E16814161420 )


On the Geforce side, we have yet to see a card break that $1/VP milestone....

One of the closest that I can find with a quick search is:

GTX 660 (rated at 161 VP + a tiny pre-OC), for $169 AR with free s/h and Batman: Origins:
http://www.newegg.com/Product/Produc...82E16814130826

Wait, there's actually one that just barely touches the milestone after rebate (although it's a lower class card):
GTX 650 (non-Ti) 1GB (75 VP, $75 AR)
http://www.newegg.com/Product/Produc...82E16814127703

Yay????
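
To make the milestone concrete, here's the dollar-per-VP arithmetic for the deals listed above (prices as listed, after rebate where noted):

```python
# Dollars per VoodooPower for the deals listed above; anything at or
# below $1.00/VP crosses the milestone discussed in this post.
deals = [
    ("HD 7870 GHz Edition", 171, 169.00),
    ("HD 7850",             141, 135.00),
    ("HD 7870 (AR)",        171, 154.99),
    ("GTX 660",             161, 169.00),
    ("GTX 650 1GB (AR)",     75,  75.00),
]

for card, vp, price in deals:
    print(f"{card}: ${price / vp:.2f} per VP")
# HD 7870 GHz Edition: $0.99 per VP
# HD 7850: $0.96 per VP
# HD 7870 (AR): $0.91 per VP
# GTX 660: $1.05 per VP
# GTX 650 1GB (AR): $1.00 per VP
```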
 
Last edited:

BoFox

Senior member
May 10, 2008
689
0
0
Hey enthusiasts,

Sorry for being away during the summer - I was waiting until AMD actually delivered some fixes for Crossfire frame pacing issues before trying to work out how much of a penalty should be applied to dual-GPU Radeon cards against their actual aggregate/overall FPS scores derived from reviews all over the internet.

It's a bit more complicated now that the 13.10 beta adds some more game fixes on top of the 13.8 beta, IIRC.. with reviews so far only showing a handful of the games fixed by the 13.8 beta (DX10+ games only, excluding Eyefinity)..

What should the overall penalty be? In the past I gave all dual-GPU cards roughly a ~15% penalty against their actual FPS performance, except for the GTX 690, which showed considerable frame pacing improvements against microstuttering.

Thanks in advance - links and benches and stuff would be appreciated!
 

BoFox

Senior member
May 10, 2008
689
0
0
Not enough data, it seems.. Should it be a 20%, 25%, or 30+% penalty for dual-GPU Radeon cards, instead of the rough ~15% that I applied in the past to everything (both dual-GPU Radeon and Geforce cards) other than the GTX 690? If more reviews had tested the 13.10 beta it would have been much more helpful, since it brought some more frame pacing game fixes, but then again those only apply to DX10+ games at non-Eyefinity resolutions. Instead of picking an arbitrary number before there are enough reviews to give a better idea, I'll just add a footnote with an asterisk for these dual-GPU Radeon cards for now...
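
For anyone following along, here's roughly how such a penalty would be applied to a dual-GPU rating; the base VP below is just an illustrative number, not an actual rating from the list:

```python
# Illustration: applying a frame-pacing/microstutter penalty to a dual-GPU
# card's FPS-derived rating. The base VP is a made-up example value.
raw_vp = 300.0
for penalty in (0.15, 0.20, 0.25, 0.30):
    print(f"{penalty:.0%} penalty -> {raw_vp * (1 - penalty):.0f} VP")
# 15% penalty -> 255 VP
# 20% penalty -> 240 VP
# 25% penalty -> 225 VP
# 30% penalty -> 210 VP
```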
 

BoFox

Senior member
May 10, 2008
689
0
0
Updated with a footnote for dual-GPU Radeon cards, stating:

* - Note for Crossfire users: these dual-GPU Radeon cards still have frame pacing issues (extreme microstuttering with runt frames) due to the lack of frame metering, except in some DX10+ games at single-monitor resolutions only. Currently, only a handful of reviews have covered the frame pacing fixes introduced by the Catalyst 13.8 beta drivers, and only across a few game benchmarks. In addition, Catalyst 13.10 beta "includes a number of frame pacing improvements for the following titles: Tomb Raider, Metro Last Light, Sniper Elite, World of Warcraft, Max Payne 3, Hitman Absolution," and Catalyst 13.10 beta2 adds "frame pacing improvements for CPU-bound applications." So far, there is not enough data for a more accurate representation of these dual-GPU Radeon cards, so these ratings should only be considered valid for DX10+ games with resolved frame-pacing issues for now. AMD seems to be working on it, so more remains to be seen; hopefully more reviews will cover more than just a handful of games using FCAT, for a clearer picture of just how severe a penalty should be applied to the ratings overall.
 
Last edited: