Tom's Graphics Card Guide: 32 Mid-Range Cards Benchmarked


rockyjohn

Member
Dec 4, 2009
104
0
0
If the cooler works and you have a decent PSU, who cares what it draws?

Please, DO NOT say "it costs more to own it" - I don't feel like proving for the millionth time that an extra 100W costs you 4 coffees a year to run.

You must be purchasing your coffee at the Ritz. I would like to see the calculations.

Using the full power - or an extra 100 W - for just 6 hours per day, 365 days per year uses an extra 219 kWh. At 15 cents per kWh (assuming I am not exceeding my monthly allowance, beyond which the cost per kWh jumps up substantially), that is $33 per year. That would be about $8 per cup.

Over a three-year card life that would be almost $100 - which I could perhaps instead use to purchase a more powerful but more efficient card.
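
A quick sketch of that math, for anyone who wants to plug in their own numbers (the 100 W draw, 6 h/day, $0.15/kWh and the "4 coffees" figure are all taken from the posts above; treat this as a back-of-the-envelope check, not a definitive cost model):

# Back-of-the-envelope electricity cost check.
# The inputs below are the figures quoted in this thread, not measured values.
extra_watts = 100        # extra draw of the hungrier card, in watts
hours_per_day = 6        # assumed full-load hours per day
days_per_year = 365
rate_per_kwh = 0.15      # dollars per kWh
coffees_per_year = 4     # from the "4 coffees a year" claim

extra_kwh = extra_watts * hours_per_day * days_per_year / 1000  # ~219 kWh
annual_cost = extra_kwh * rate_per_kwh                          # ~$32.85
cost_per_coffee = annual_cost / coffees_per_year                # ~$8.21
three_year_cost = annual_cost * 3                               # ~$98.55

print(f"{extra_kwh:.0f} kWh/yr -> ${annual_cost:.2f}/yr, "
      f"${cost_per_coffee:.2f} per 'coffee', ${three_year_cost:.2f} over 3 years")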
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Using the full power - or an extra 100 W - for just 6 hours per day, 365 days per year uses an extra 219 kWh. At 15 cents per kWh (assuming I am not exceeding my monthly allowance, beyond which the cost per kWh jumps up substantially), that is $33 per year. That would be about $8 per cup.

His point is pretty valid in the real world actually.

No offense but who uses their GPU at 100% load 6 hours a day / 365 days a year? If you are using it for distributed computing projects, you have already accepted the electricity cost as worth your cause. If you are using it for games only.....who games 6 hours a day / 365 days a year? No vacations? No social life?

If you are gaming 6 hours a day all year long and worry about $33 a year in extra electricity costs, you have bigger problems to worry about - like graduating from high school or getting a real job!

Realistically, the power consumption difference between high-end cards is a tiny fraction of the depreciation the very same cards experience over the course of 1 year. Put it this way, if you air dry all your laundry, I bet you'll save a lot more in electricity costs over 12 months. Do you do that?
 
Last edited:

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
Yes. If a site wants to be considered a serious review/news site then they have to put the readers first. Some of these reviews need to have "advertisement" written across the top.

Video cards aren't that bad. The current king of advertisements goes to motherboard "reviews" these days. How long are they going to pretend a $200+ board is no faster than a $50 board just because they don't touch the latter?
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
I wouldn't touch the factory OC variants with a 20 foot pole, especially not nVidia parts. They're often faulty from the factory at those clocks, and we found out last time it's because there's only a handful of games they're tested in before they're considered "compliant".

I have always believed this. Even more recently, I am debating never overclocking a video card again. In the end, it just doesn't pay. Anecdotally - I have lost 3 GTX 280's and now my 460 GTX just died on me after 4 months of use. Zotac has sent me my replacement and I should receive it Monday.

BTW - Zotac had some great turnaround time (better than eVGA) on my RMA. EVGA took weeks (like 3+ for one of them) to get me my part and never really gave me a status update. It only took Zotac 3-4 business days to send out my replacement card. After this RMA experience, I'll definitely consider Zotac in my future purchases.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
There's enough time in the day to have a full time job and game 6 hours.

Yes, but 6 hours a day x 365 days? :sneaky: (and still have enough time to "worry" about $33 a year of electricity costs?). That's not an "average gamer" example, more like an addict. It's biased to use such an extreme example to try to prove that the electricity costs are significant. In that case, the differences between running a stock vs. overclocked CPU, esp. across AMD and Intel would be even more significant. But let's not even go there...

Anecdotally - I have lost 3 GTX 280's and now my 460 GTX just died on me after 4 months of use. Zotac has sent me my replacement and I should receive it Monday.

Your failure rate is very high and unusually so. 4 videocards? I have built PCs for > 10 years and had 0 videocards fail on me. Every single one of them ran overclocked its entire useful life. But not only that, my 4890, GTX470 and my current card were/are loaded to 99% 24/7 between distributed computing projects and gaming almost from day 1 -- yet not one of them has ever failed.

Do you maintain reasonable temperatures (<85°C) on your cards with fan speed control when you overclock? What's your PSU? Are you overclocking them to reasonable levels or perhaps increasing GPU voltage by more than 10% above stock levels? Are you using a good surge protector in your house? Did you have any other electric / electronic devices fail on you in your house?
 
Last edited:

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Yes, but 6 hours a day x 365 days? :sneaky: (and still have enough time to "worry" about $33 a year of electricity costs?). That's not an "average gamer" example, more like an addict. It's biased to use such an extreme example to try to prove that the electricity costs are significant. In that case, the differences between running a stock vs. overclocked CPU, esp. across AMD and Intel would be even more significant. But let's not even go there...



Your failure rate is very high and unusually so. 4 videocards? I have built PCs for > 10 years and had 0 videocards fail on me. Every single one of them ran overclocked its entire useful life. But not only that, my 4890, GTX470 and my current card were/are loaded to 99% 24/7 between distributed computing projects and gaming almost from day 1 -- yet not one of them has ever failed.

Do you maintain reasonable temperatures (<85°C) on your cards with fan speed control when you overclock? What's your PSU? Are you overclocking them to reasonable levels or perhaps increasing GPU voltage by more than 10% above stock levels? Are you using a good surge protector in your house? Did you have any other electric / electronic devices fail on you in your house?

I just knew someone was going to say something if I posted. Actually, go ahead and search all my posts at AT (with search) and you will find that my failure rate is 2 by your standards, out of 16 different video cards over the years. The reason it is actually 4 is that the eVGA RMA department kept giving me faulty cards back (different serial numbers). In fact, if you read their forums, there were several people with the exact same problem. We all figured we were just getting each other's faulty cards in some grand joke.

Anyway, I know how to build a PC, I know how to keep it cool, and I know how to connect the proper connectors. So let's not go there, OK?
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
4,310
8
81
I think it was an okay review. One thing I didn't like is that they don't seem to know the markets for these cards. For example, they call the Performance category "Gamers" and say the article is a comparison of mid-range cards, when the only cards that qualify as such in the review are the Radeon HD 6790 and GTX 550 Ti. Everything higher than that was in Performance (6850, 6870, 560, 560 Ti) or Enthusiast (6950). Another thing is that they didn't include a price/performance metric like TPU always does.

But it was interesting to see the performance differences between the cards, nonetheless.
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
4,310
8
81
I just knew someone was going to say something if I posted. Actually, go ahead and search all my posts at AT (with search) and you will find that my failure rate is 2 by your standards, out of 16 different video cards over the years. The reason it is actually 4 is that the eVGA RMA department kept giving me faulty cards back (different serial numbers). In fact, if you read their forums, there were several people with the exact same problem. We all figured we were just getting each other's faulty cards in some grand joke.

Anyway, I know how to build a PC, I know how to keep it cool, and I know how to connect the proper connectors. So let's not go there, OK?

That still seems kinda high. Even the GTX 260M in my laptop, which has constantly been run overclocked through 12-hour stretches of Folding@home plus gaming for a long time now, has yet to show any signs of failing. I did change out the stock thermal paste and enable PowerMizer, though. And this is in a 15" chassis laptop with a much smaller cooling system than a desktop.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I think, typically, O/C'd cards draw too much extra power for the performance advantage they give. It's not unusual to see a ~10% performance gain for a ~30% power increase. It shortens the life of the card, makes it hotter and louder, and offers less stability. There are exceptions - typically cards that are "downclocked" at factory settings to stop them from competing with their big brothers; the GTX 460 and HD 6950 come to mind. Even then, I wouldn't push them to their limits, or you run into diminishing returns.
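
To put rough numbers on that (purely illustrative - the ~10% / ~30% figures are the ballpark ones from the post above, not measurements of any particular card):

# Rough perf-per-watt comparison for a factory-OC card vs. stock,
# using the ballpark ~10% performance / ~30% power figures above.
perf_gain = 0.10
power_increase = 0.30

perf_per_watt_ratio = (1 + perf_gain) / (1 + power_increase)  # ~0.85
print(f"OC card delivers {perf_per_watt_ratio:.0%} of the stock card's perf/W "
      f"(about {1 - perf_per_watt_ratio:.0%} worse efficiency)")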
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
Yes, but 6 hours a day x 365 days? That's not an "average gamer" example, more like an addict.

Did he specifically say that was the average gamer? No, he didn't. I would call 6 hours a day a dedicated gamer; the hobby is simply part of his lifestyle. And it wouldn't be too difficult to average 6 hours a day. On off days a person can game for 8-10 hours. Some people watch TV; other people game. For an "average" gamer 3 hours a day isn't too unreasonable.

(and still have enough time to "worry" about $33 a year of electricity costs?)
Time? Worry? It's not that big of a burden to watch your power consumption. Shutting off unnecessary lights and using appliances as little as possible while still maintaining comfort is a lifestyle decision. You approach the situations as they come, and the same lifestyle can be applied to buying video cards as to buying energy-efficient light bulbs.

It's biased to use such an extreme example to try to prove that the electricity costs are significant.
It's not biased. He was specific about the parameters he was using and didn't try to make the scenario out to be something it wasn't; it's a real situation that only applies to those parameters. Because of the variables involved - hours of use and cost per kWh - every situation will be different.

In that case, the differences between running a stock vs. overclocked CPU, esp. across AMD and Intel would be even more significant. But let's not even go there...

Yeah, don't go there because it has nothing to do with what we're talking about - video cards. Stock vs. overclocked CPUs, AMD vs. Intel would bring in entirely new variables to consider. All his argument concerned was video cards.
 

formulav8

Diamond Member
Sep 18, 2000
7,004
522
126
I have built PCs for > 10 years and had 0 videocards fail on me. Every single one of them ran overclocked its entire useful life. But not only that, my 4890, GTX470 and my current card were/are loaded to 99% 24/7 between distributed computing projects and gaming almost from day 1 -- yet not one of them has ever failed.


Good for you. I've built computers longer than you and have had multiple video cards go bad. Although his is a somewhat high amount.
 
Last edited:

GaiaHunter

Diamond Member
Jul 13, 2008
3,695
387
126
Yes, but 6 hours a day x 365 days? :sneaky:

(...)

Every single one of them ran overclocked its entire useful life. But not only that, my 4890, GTX470 and my current card were/are loaded to 99% 24/7 between distributed computing projects and gaming almost from day 1 -- yet not one of them has ever failed.

I can see how it is impossible for you to imagine someone loading their cards 6 hours per day, 365 days a year... :)
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Good for you. I've built computers longer than you and have had multiple video cards go bad. Although his is a somewhat high amount.

I was not trying to brag. I was just pointing out that his failure rate of 3x GTX280 (same generation) seems very unusual. It wasn't like his failure rate was spread out across a 10 year span either. Think about it, if I overclocked all my CPUs at 30% voltage above stock, I'd also have a huge failure rate. That doesn't mean that "I should stop overclocking CPUs because they fail for no reason". It just means overclocking should be done with care. That's why I was curious as to his overclocking techniques and cooling profiles used.

I can see how it is impossible for you to imagine someone loading their cards 6 hours per day, 365 days a year... :)

If you read my post, I said those of us who are using the GPUs for DC / science projects have accepted the added electricity costs as a downside already. The performance difference between architectures is so massive depending on the project, that power consumption differences between brands are almost irrelevant given the difference in performance. :)

My point is it's possible to load GPUs for 24 hours a day, but I don't walk into a thread and state how massive the electricity cost differences are based on my rather abnormal GPU usage. Similarly, I think it's just as unreasonable to say that the electricity costs will be significant by using some extreme outlier such as gaming for 6 hours x 365 (obviously then it makes a bigger difference, but how many people game that much?).

Tom's hardware already ran a huge comparison and the power consumption differences between top end cards for the average gamer were pretty small over a period of 1 year.
 
Last edited:

GaiaHunter

Diamond Member
Jul 13, 2008
3,695
387
126
If you read my post, I said those of us who are using the GPUs for DC / science projects have accepted the added electricity costs as a downside already. The performance difference between architectures is so massive depending on the project, that power consumption differences between brands are almost irrelevant given the difference in performance. :)

My point is it's possible to load GPUs for 24 hours a day, but I don't walk into a thread and state how massive the electricity cost differences are based on my rather abnormal GPU usage. Similarly, I think it's just as unreasonable to say that the electricity costs will be significant by using some extreme outlier such as gaming for 6 hours x 365 (obviously then it makes a bigger difference, but how many people game that much?).

Tom's hardware already ran a huge comparison and the power consumption differences between top end cards for the average gamer were pretty small over a period of 1 year.

20:00-0:00 or 21:00-1:00 is 4 hours. Increase the load on weekends, and 5 hours per day averaged over a year might not be too crazy.

Anyway, the point is why would someone get a card that consumes so much more for minimal performance gains? If the entry cost was quite different, like in the prime days of 5870 vs GTX470, I could understand.

But we are talking about a 70 W difference compared to a card like the Palit GTX 560 Ti that gives the same gameplay experience.
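
Running that 70 W gap through the same formula as earlier in the thread (assuming ~5 h/day of load; the $0.15/kWh rate is carried over from the earlier example, not from this post):

# Same annual-cost formula as before, with the 70 W / ~5 h per day figures.
extra_watts = 70
hours_per_day = 5
rate_per_kwh = 0.15   # carried over from the earlier example in this thread

extra_kwh = extra_watts * hours_per_day * 365 / 1000  # ~128 kWh/yr
annual_cost = extra_kwh * rate_per_kwh                # ~$19/yr
print(f"{extra_kwh:.0f} kWh/yr -> ${annual_cost:.2f}/yr")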
 
Last edited:

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
I was not trying to brag. I was just pointing out that his failure rate of 3x GTX280 (same generation) seems very unusual. It wasn't like his failure rate was spread out across a 10 year span either. Think about it, if I overclocked all my CPUs at 30% voltage above stock, I'd also have a huge failure rate. That doesn't mean that "I should stop overclocking CPUs because they fail for no reason". It just means overclocking should be done with care. That's why I was curious as to his overclocking techniques and cooling profiles used.

I could have said the same thing as you prior to my first card failing. I have owned a Voodoo Rush, Voodoo 2, TNT 2, TNT 2 Ultra, GeForce 256, GeForce 2, GeForce 3 Ti 4200, ATI Radeon 9600 Pro, nVidia 6800 LE, 7800 GTX, 8800 GTX 320, 8800 GT, 8800 GTS 512MB, 260 GTX, 280 GTX*, 460 GTX*

My 280 was my first card that failed, and they kept giving me failing replacements - that is where three of my failures came from (one from the onset). My 460 also failed; it was replaced and my new one is working fine now. So prior to January 2010, I could have said the same thing - "I've had XXX and never had a video card fail"... blah blah blah. Just because it hasn't happened doesn't mean it will not. These cards are getting higher and higher TDPs, so an increase in failing parts should be no surprise; the VRMs seem to be the biggest problem, especially on non-reference designs.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,001
126
The only two cards I've had fail on me were an 8800 GTS and an 8800 Ultra. Both failed shortly after one year, and both were G80 parts. I'd suspect bump-gate.

As for the power costs, they're blown out of proportion. Even if it were $33 a year (which is no doubt an exaggeration), somebody who buys a $300 GPU every 12-18 months is not going to worry about $33 a year. If $33 in one year is too much for you, then you can't afford these kinds of cards to begin with.

What is a concern is the TDP, especially if a slower card is hotter and louder than a faster card. A high TDP needs to be dealt with every time you game, in the form of noise and heat.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
All concerns, or lack thereof, are relative to the user and not blanketed concerns across the board. Power usage, heat, noise, performance, price, color (sure, it matters to some), features. All relative.