AnandTech Forums > Hardware and Technology > Video Cards and Graphics


View Poll Results: How often do you normally upgrade a component? (Choose the closest please.)
Every 6 months 6 6.32%
Yearly 20 21.05%
2 years 42 44.21%
3 years 18 18.95%
4 years 5 5.26%
5 years + 4 4.21%
Voters: 95. You may not vote on this poll

Old 08-21-2011, 11:51 AM   #76
jhansman
Platinum Member
 
 
Join Date: Feb 2004
Posts: 2,205
Default

Quote:
Originally Posted by NUSNA_Moebius View Post
And yes, AMD pretty much slaps Nvidia around when it comes to performance per watt. Let's just see them do it with their CPUs vs Intel
If AMD had Intel's fab resources, market share, and influence, they might. Any way you slice it, that's a David-and-Goliath situation, and likely always will be. But I digress. This thread is about energy efficiency; for my part, I pulled two rotational drives out of my system and put in a boot SSD. My vidcard, admittedly low end, is passively cooled.

This discussion interests me because I am readying a GPU upgrade, and until now I never considered power consumption beyond the PSU. Oh, and I am at the 2.5 yr mark since my last build, which squares with the chart.
__________________
Blessed is the man who has nothing to say, and cannot be compelled to say it.
jhansman is offline   Reply With Quote
Old 08-21-2011, 12:14 PM   #77
Soulkeeper
Diamond Member
 
 
Join Date: Nov 2001
Posts: 5,939
Default

my power bill recently went down by $100 because I stopped running distributed computing

to think of all the money wasted over the last 12yrs that i've run distributed computing non-stop nearly 24/7

geeze ... what a costly addiction
__________________
A8-3870 @3.3GHz 1.3125v
16GB 1866 9-9-9-23 1t 1.35v
Linux software/gaming exclusively
linuxsociety.org
Soulkeeper is offline   Reply With Quote
Old 08-21-2011, 08:28 PM   #78
RussianSensation
Elite Member
 
 
Join Date: Sep 2003
Location: Dubai, UAE
Posts: 14,464
Default

Quote:
Originally Posted by Soulkeeper View Post
my power bill recently went down by $100 because I stopped running distributed computing

to think of all the money wasted over the last 12yrs that i've run distributed computing non-stop nearly 24/7

geeze ... what a costly addiction
Wow, that's a lot.

How many GPUs are you running? Even at 400W @ 24 hours x 30 days = 288 kWh.

$100 / 288 kWh = $0.35 per kWh.
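For anyone who wants to redo that back-of-envelope math with their own numbers, here's a quick sketch (the 400W draw and the 30-day month are the assumptions from the post above):

```python
# Back-of-envelope check of the distributed-computing cost estimate above.
# Assumed figures from the post: 400 W continuous draw, 30-day month.
draw_kw = 0.400                      # 400 W expressed in kilowatts
hours = 24 * 30                      # hours in a 30-day month
monthly_kwh = draw_kw * hours        # energy used per month
implied_rate = 100 / monthly_kwh     # $/kWh implied by a $100 bill drop

print(monthly_kwh)                   # 288.0 kWh
print(round(implied_rate, 2))        # 0.35 $/kWh
```

Swap in your own wattage and bill numbers to see what rate your utility is effectively charging you.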
__________________
i5 2500k | Asus P8P67 Rev.3 | Sapphire Dual-X HD7970 1150/1800 1.174V CFX | G.Skill Sniper 8GB DDR3-1600 1.5V
SeaSonic Platinum 1000W | OCZ Vertex 3 120GB + HITACHI 7K1000.B 1TB | Windows 7
Westinghouse 37" 1080P | X-Fi Platinum | Logitech Z5300E 5.1
RussianSensation is online now   Reply With Quote
Old 08-21-2011, 08:44 PM   #79
Soulkeeper
Diamond Member
 
 
Join Date: Nov 2001
Posts: 5,939
Default

2 computers, was pulling 700-800 kWh per month
the tiered price was like $0.25 per kWh in the upper bracket
i've got to stay under 350 kWh per month (baseline) to get the lowest price (under $0.10)

not counting the 2 roommates and other appliances
the breakdown per computer/component was something like this:
225W workstation (full load)
160W server (full load)
100W crt tv
100W crt monitor

i've gotten down to the following:
135W workstation (idle @stock)
85W server (idle under-clocked)
20W lcd replaced the tv
100W crt monitor still

I also upgraded the cable box ... but unfortunately it still pulls the same 20W even when turned off
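As a rough sketch of what that trimming is worth, assuming (unrealistically, but simply) that each device ran continuously at the listed draw and using the $0.25/kWh upper-bracket rate from the post:

```python
# Rough monthly savings from the wattage figures listed above, assuming
# (simplistically) that each device runs continuously at the stated draw.
before_w = 225 + 160 + 100 + 100   # workstation, server, CRT TV, CRT monitor
after_w = 135 + 85 + 20 + 100      # after the downgrades/replacements

hours_per_month = 24 * 30
saved_kwh = (before_w - after_w) / 1000 * hours_per_month
cost_at_top_tier = saved_kwh * 0.25   # upper-bracket rate from the post

print(saved_kwh)                      # 176.4 kWh/month
print(round(cost_at_top_tier, 2))     # ~$44.10/month at $0.25/kWh
```

In practice the real savings sit somewhere below this, since the "before" figures are full-load draws and the machines weren't loaded 24/7.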

Last edited by Soulkeeper; 08-21-2011 at 08:52 PM.
Soulkeeper is offline   Reply With Quote
Old 08-24-2011, 04:33 AM   #80
pandemonium
Golden Member
 
 
Join Date: Mar 2011
Location: Augusta, GA
Posts: 1,404
Wink Real numbers comparative performance

Quote:
Originally Posted by Lonbjerg View Post
You seem to have a short term memory...the price of running cards has many times (falsely) been used to attack GPU vendors on this forum.

I hope this thread will kill that strange misconception...even if it was unintentional, the OP seemed to have different plans with this thread:

The perf/watt has been inflated past its real world value...FOTM...now let it die.
Look below for "false attacks" and "inflated past real world value."

Quote:
Originally Posted by RussianSensation View Post
Great!!

In other words, most of the difference comes from Price of Card/Setup vs. FPS achieved.
Indeed, however, the numbers that you're citing are nearly as minimal as they can be. See the following.

Here is the clearest purpose for why I created these spreadsheets:

For all intents and purposes, let's compare the HD6870 in CF versus the GTX470 in SLI.
As per 08-20-11, they're very similar in average FPS over all benches: 96.25 versus 98.82
For someone who spends $.21 per kWh, the differences in operational costs per day will be: $.59430 versus $.81564 (or $.22134)
On a monthly basis of ~30.4 days (365/12) this equates to: $6.728

Also consider the fact that GTX470 SLI costs $475.72 initially compared to a pair of HD6870s at $349.98, $125.74 less.

Also include that the HD6870 CF runs idle at 45C, load at 76C, versus a pair of GTX470s idling at 54C, load at 96C. This will make a large impact on A/C costs.

Quite a few people in this thread have mentioned paying more than $.21 per kWh, so this gets multiplied. For instance, Concillian mentioned paying $.32 per kWh if they reach tier 3. For a house with a family of 4, reaching tier 3 really isn't that unrealistic. What are the operational costs at $.32 per kWh versus $.21 per kWh? $10.252/month. All for a setup that provides relatively the same framerate (an unnoticeable increase of 2.57 FPS over all benches).
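The monthly figures fall straight out of the quoted daily costs; a quick sketch (daily costs taken from this post, with $0.32/kWh being Concillian's tier-3 rate):

```python
# Reworking the monthly cost deltas from the daily figures quoted above.
daily_diff = 0.81564 - 0.59430   # extra $/day for GTX470 SLI at $0.21/kWh
days_per_month = 30.4            # ~365/12, as used in the post

monthly_at_21c = daily_diff * days_per_month
# The same energy gap rescaled to the $0.32/kWh tier-3 rate:
monthly_at_32c = daily_diff * (0.32 / 0.21) * days_per_month

print(round(monthly_at_21c, 2))   # ~6.73 $/month
print(round(monthly_at_32c, 2))   # ~10.25 $/month
```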

Hopefully this will provide perspective for those that still think GPU performance efficiency isn't something to consider.
__________________
..:: |We are but shadows of our achievements and dust of the stars; empowering the universe to have consciousness.| ::..
..:: |Fighting ignorance is bliss.| ::..

If you're curious about getting into vaping, I'd be glad to help! I quit smoking with vaping! /truestory
pandemonium is offline   Reply With Quote
Old 08-24-2011, 05:02 AM   #81
pandemonium
Golden Member
 
 
Join Date: Mar 2011
Location: Augusta, GA
Posts: 1,404
Default

Quote:
Originally Posted by Ben90 View Post
I believe you should pull the Heat/FPS column as it really doesn't make sense as it stands. Heat isn't a unit of measurement. We can measure a card using 100 watts, but you can't measure a card creating 100 heat.

Second, the premise of the column is off. I understand why you wanted to put it there, to inform customers that 480SLI does indeed generate more heat than a 5770. However, the amount of heat generated has nothing to do with the temperature of the card as you are using it.

Fortunately, the information you want to convey is already in the graph. Since nearly all of the electricity running through our video cards gets turned into thermal energy, the "Watts/FPS" column tells the user nearly exactly how much heat is generated using watts as our unit of measurement.

A very very informative chart despite my nitpickings
Thanks for the feedback and praise. Much appreciated!

I'm confused about what you're saying. Watt consumption does not directly equal energy loss in the form of heat when comparing different cards. Efficiency of the card's architecture and how it processes data, and the design and volume of air flow (or heat dispersal via water cooling), modify those numbers. The comparison comes into play for purposes of cooling the heat generated (A/C costs).

Oddly enough, you got me thinking about how to present energy lost through heat per watt used. This would need to be done by comparing the two columns of Watts/FPS and Heat generated. It would show another aspect of how efficiently a card is utilizing the energy it draws.

I've also considered using the average amount of heat generated and wattage used for all the cards compared and creating an [unfortunately] subjective scale on top of FPS/costs to get at a more universal comparison level. Subjective complications of this are deterring me from doing so, however. Say, anything above average would receive a penalty. The penalty scale would be where issues arise.

Yet more columns...lol. This thing's getting ridiculously large. I hate leaving anything out, but... Ugh...

Thoughts anyone?
pandemonium is offline   Reply With Quote
Old 08-24-2011, 05:39 AM   #82
Gryz
Senior Member
 
 
Join Date: Aug 2010
Posts: 417
Default

Quote:
Originally Posted by pandemonium View Post
Quite a few people in this thread have mentioned paying more than $.21 kWh, so this gets multiplied.
FYI, here in the Netherlands we pay 21 eurocents per kWh. For technical reasons, I am forced to heat my water with an electric boiler (instead of using gas, like everybody else here does). Because of that I have a subscription where I pay 14 eurocents during the night, and 28 eurocents during the day.

28 eurocents = $0.40.

Over half of that price is a special energy tax. Because of the mild climate, it is very very rare for people to have AC in their homes.
Gryz is offline   Reply With Quote
Old 08-24-2011, 09:39 AM   #83
RussianSensation
Elite Member
 
 
Join Date: Sep 2003
Location: Dubai, UAE
Posts: 14,464
Default

Quote:
Originally Posted by pandemonium View Post
Indeed, however, the numbers that you're citing are nearly as minimal as they can be. See the following.
Disagree. 6 Hours of gaming a day 300 days a year is definitely on the high-end for most people. You are providing one of the worst case scenarios outside of people doing distributed computing. Like I said before, people who are gaming 6 hours a day / 300 days a year and are complaining about electricity cost should invest that time into getting a better paying job, upgrading their skills, etc. 6 hours a day x 300 days a year of gaming is an addiction, not a typical scenario.

Quote:
Originally Posted by pandemonium View Post
For all intents and purposes, let's compare the HD6870 in CF versus the GTX470 in SLI. Also include that the HD6870 CF runs idle at 45°C, load at 76°C, versus a pair of GTX470s idling at 54°C, load at 96°C.
My GTX470 idled at 38-42*C and loaded at 76-78*C. So I can't agree with the data you provided. I provided screenshots on our forum many times and can do so again. That's with the stock TIM too. I always stress how important it is to have a case with excellent air ventilation. Most people don't.

Regardless, your correlating GPU temperatures with A/C costs is questionable. US homes tend to have central air. So if you have 4-5 rooms and 1 room is hot, the A/C won't keep working just to cool that 1 room. You are making it sound like the A/C will keep running because 1 room has a computer in it.

Quote:
Originally Posted by pandemonium View Post
Also consider the fact that GTX470 SLI costs $475.72 initially compared to a pair of HD6870s at $349.98, $125.74 less.
I got 2 GTX470s for $404 in July 2010. HD6870 didn't launch until October 22, 2010 (nearly 5 months later). Its launch price was $239. I would have had to wait another 5 months before I could buy 2 of those for $350. HD6870s didn't dip to $175 until GTX560 Ti launched. This comparison is completely flawed because:

1) People who bought GTX470 bought it when HD6870 wasn't on the market at $175
2) People who bought GTX470 bought it because it cost $100+ less than HD5870 and performed within 10% of that card, so the value was actually greater than the HD5870.
3) NV shipped 2-3 games with the GTX470 when it launched for a period of 3+ months, providing additional value in the US.

You shouldn't compare older setups with newer setups, because people who are buying today are choosing between an HD6870 and a GTX560 Ti; the comparison should be made among what's available today. It wouldn't make any sense to compare the operational and FPS costs of an HD7970 to an HD6970 8 months from today, would it? But you just did that with a GTX470 vs. HD6870. No one is going to be choosing an HD6970 vs. HD7970 8 months from now. It's a good theoretical comparison.


Quote:
Originally Posted by Gryz View Post

28 eurocents = $0.40.
Ok, but you earn your salary in euro right?

Last edited by RussianSensation; 08-24-2011 at 09:53 AM.
RussianSensation is online now   Reply With Quote
Old 08-24-2011, 10:23 AM   #84
apoppin
Lifer
 
 
Join Date: Mar 2000
Posts: 34,903
Default

Quote:
Originally Posted by RussianSensation View Post
Disagree. 6 Hours of gaming a day 300 days a year is definitely on the high-end for most people. ...6 hours a day x 300 days a year of gaming is an addiction, not a typical scenario.
How many people watch TV 6 hours a day, 365 days a year? Think of the electricity used for those 50" screens going 24/7 in some households.
- And there are eighteen other hours per day; 6-8 are usually taken by sleep. There is always plenty of time for work if you have a job.

i'd say gaming is far more productive than TV watching and you could get into the industry.
- just for a counter-point.
__________________
.


AMD's Upcoming HD 7970 Exposed: a Short-lived Video Card?


Core i7 920 @ 3.8 GHz
/Noctua NH-U12P SE2/3x2 GB Kingston KHX1800/GTX 590, GTX 580 SLI or HD 6990 + HD 6970 TriFire-X3/Gigabyte GA-EX58-UD3R/2xThermaltake ToughPowerXT-775W/OCZ 850W/Thermaltake Element G/Klipsch v.2 400w/128 GB Kingston VNow 100 SSD and 2x500 GB 7200.12 Seagate HDDs/640 GB WD USB 2.0/Win 7 64/HP LP 3065 2500x1600 LCD/3 x Asus VG236H 23"
- 5760x1080 120Hz

Last edited by apoppin; 08-24-2011 at 10:28 AM.
apoppin is offline   Reply With Quote
Old 08-25-2011, 12:51 AM   #85
pandemonium
Golden Member
 
 
Join Date: Mar 2011
Location: Augusta, GA
Posts: 1,404
Default

RussianSensation, you're regurgitating a lot of points we've already discussed.

Quote:
Originally Posted by RussianSensation View Post
Disagree. 6 Hours of gaming a day 300 days a year is definitely on the high-end for most people.
The amount is not meant to be an exact figure that represents all computer users. It's a representation of the differences found between each GPU in their efficiencies. As I stated before, it doesn't matter if it was 5 minutes or 24 hours a day. The number is there to easily show you the scale of the differences.


Quote:
Originally Posted by RussianSensation View Post
My GTX470 idled at 38-42*C and loaded at 76-78*C. So I can't agree with the data you provided. I provided screenshots on our forum many times and can do so again. That's with the stock TIM too. I always stress how important it is to have a case with excellent air ventilation. Most people don't.
I can't take your values for one particular card as gospel. The numbers I have on the spreadsheets are directly from Anandtech's bench tool. If you have a problem with their results I suggest you talk to their benchmarkers.

Quote:
Originally Posted by RussianSensation View Post
Regardless you correlating GPU temperatures with A/C is questionable. US homes tend to have central air. So if you have 4-5 rooms and 1 room is hot, A/C won't continue working just to keep that 1 room cold. You are making it sound like A/C will continue to work because 1 room has a computer in it.
Irrelevant. Not all computers are in rooms that are closed off to general air flow. Not only that, but with this logic, you're saying heat generated isn't worth considering at all since central air doesn't run specifically for 1 room that's hot.



Quote:
Originally Posted by RussianSensation View Post
I got 2 GTX470s for $404 in July 2010. HD6870 didn't launch until October 22, 2010 (nearly 5 months later). Its launch price was $239. I would have had to wait another 5 months before I could buy 2 of those for $350.
None of these cards came out at the same time, but the point is they are all available NOW. You can't argue past and future in a present situation, lol. How would you have known? You wouldn't have! Perhaps you need to read the disclaimers again in post #2.

Quote:
Originally Posted by RussianSensation View Post
No one is going to be choosing an HD6970 vs. HD7970 8 months from now. It's a good theoretical comparison.
You don't know that. Honestly, you don't. Not everyone buys the latest and the greatest right when it comes out. I for one don't. I wait until the prices drop off that steep ledge and the software matures to meet the capabilities of the hardware. In 8 months, the HD6970 could very well be the most efficient buy for your money depending on revamping technologies of aftermarket designs, what the next generation brings, price fluctuations and market values. What if the 7xxx series completely bombs and the HD6970 still rocks? You can't theorize the future and deny anything from being possible. That's incredibly poor rationalization.

Quote:
Originally Posted by RussianSensation View Post
Ok, but you earn your salary in euro right?
Now you're just trollin' for the sake of trollin', lol.
pandemonium is offline   Reply With Quote
Old 08-25-2011, 02:26 AM   #86
Ben90
Platinum Member
 
Join Date: Jun 2009
Posts: 2,824
Default

Quote:
Originally Posted by pandemonium View Post
I'm confused on what you're saying. Watt consumption does not directly equal energy loss in the form of heat; when comparing different cards.
Yes it does
Quote:
Originally Posted by pandemonium
Efficiency in architecture of the card and how it processes data, design of air flow and volume of air flow (or heat dispersal via water cooling) modify those numbers. The comparison comes into play for purposes of cooling the heat generated (A/C costs).
Actually, electronics are essentially 100% efficient at converting electricity into heat. Even if you don't want to believe me, it's very easy to see why the column is useless by applying some common sense.

The first and easiest way to show that GPU temperature isn't really related to power consumption as you are suggesting is to stick your finger into your GPU fan while running Furmark. It's not suddenly using a crapton more power because the fan stopped.

Another fun example is to use different scales of temperature. What if we measured temps using something made-up called a BATTSECKS unit? The freezing and boiling points of water on the BATTSECKS scale are -50* and 50* respectively. The idle temperature of a 5770 is 313.15 kelvin, or -10* BATTSECKS. According to the same math as your chart, a 5770 uses negative heat when idling. While the values don't mean anything, since power consumption again has nothing to do with GPU temperature, at least use the Kelvin scale so you don't get outputs that break the laws of physics.

Let's look back at the load temps for a bit. A GTX 480 gets up to 94*C in Crysis. Adding a second card for SLI brings it up to 96*C. If we use the same metric that heat produced = GPU temperature, that second card only adds 2.1% more heat.
Ben90 is offline   Reply With Quote
Old 08-26-2011, 05:25 AM   #87
pandemonium
Golden Member
 
 
Join Date: Mar 2011
Location: Augusta, GA
Posts: 1,404
Default

I understand where you're going, Ben, and you're right. I'm definitely not disputing Joule's first law or anything here. The point I'm trying to make is that no two cards are equal when it comes to their productivity versus the energy consumed - then in parallel - the heat dispersed.

I'll try to illustrate what I'm saying. In order to keep the comparison as objective as possible, I'll pick 2 cards that have nearly the same load noise levels. This is to reduce the comparative amount of heat dispersal that is being done by heat-sinking and fan assemblies. I'm not comparing FPS directly between these two cards, but relating their performance measures based on how much energy they consume and the heat retained on the card, baselined by their ambient dB levels. They're the HD5870 in CF and the GTX470 (non-SLI).

Per the 8-20-11 charts:
-HD5870CF averages 97.55FPS, uses 460 watts under load, runs at 61.7 dB and 80°C.
-GTX470 averages 58.16FPS, uses 366 watts under load, runs at 61.5 dB and 93°C.

The HD5870CF is generating a higher FPS than the GTX470 (+39.39FPS), it's using more energy (+94 watts), runs .2 dB louder (negligible for this comparison), and runs 13°C cooler. Can you tell me that the .2 dB increase in ambient volume makes up for the 13°C difference in heat dispersal? Will an increase of 39.39FPS and 94watts consumed scale even remotely close to a drop in 13°C against other card comparisons? If heat dispersal efficiency (even at a negligible audible level of +.2 dB) is the only consideration for weighing Watt/FPS against Heat/FPS, then how is this possibly so adverse against other comparisons?
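Setting temperature aside for a moment, the efficiency gap in those two rows can be expressed as watts per frame (numbers copied from the figures above):

```python
# Watts-per-FPS under load for the two setups compared above
# (figures from the 8-20-11 charts quoted in the post).
hd5870cf = {"fps": 97.55, "watts": 460, "temp_c": 80}
gtx470 = {"fps": 58.16, "watts": 366, "temp_c": 93}

for name, card in [("HD5870 CF", hd5870cf), ("GTX470", gtx470)]:
    print(name, round(card["watts"] / card["fps"], 2), "W per FPS")
# HD5870 CF -> ~4.72 W per FPS; GTX470 -> ~6.29 W per FPS
```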

Does that make sense?
pandemonium is offline   Reply With Quote
Old 08-26-2011, 07:23 AM   #88
Ben90
Platinum Member
 
Join Date: Jun 2009
Posts: 2,824
Default

Quote:
Originally Posted by pandemonium View Post
I'll try to illustrate what I'm saying. In order to keep the comparison as objective as possible, I'll pick 2 cards that have nearly the same load noise levels. This is to reduce the comparative amount of heat dispersal that is being done by heat-sinking and fan assemblies. I'm not comparing FPS directly between these two cards, but relating their performance measures based on how much energy they consume and the heat retained on the card, baselined by their ambient dB levels. They're the HD5870 in CF and the GTX470 (non-SLI).
You can't just use dB levels as a baseline of heatsink performance. If the above was true, the Intel stock cooler would be a beast and Honda Civics with Fartpipes would be fast.

Imagine a video card that had a huge phase change unit integrated within itself for a cooler. The GPU temperature could be below zero. According to your chart the card is creating negative heat. That makes no sense.

Last edited by Ben90; 08-26-2011 at 07:29 AM.
Ben90 is offline   Reply With Quote
Old 08-26-2011, 01:35 PM   #89
kalrith
Diamond Member
 
 
Join Date: Aug 2005
Location: Missouri
Posts: 6,586
Default

First of all, THANKS OP! The spreadsheet is great. I'm very much on board with this. For those of you who keep saying you don't care about power consumption, that's bull (unless you don't pay for your electricity). If you're comparing two cards of similar performance and one costs $40 more than the other, of course you'll choose the cheaper card. It would be the same thing with two equally priced and equally performing cards with one of them using $40 more in electricity per year.

I built a low-power HTPC/file server that draws 40W at the wall on idle, so I did a lot of research about the impact of energy costs. I also am very much an advocate for not wasting money for "unnecessary" (IMO) things. I canceled my satellite, use free TV only, and switched to a combination of prepaid cell phones and Magic Jack for my communications. Those changes have saved me $1500 per year for the past 3 years, and my only costs for going that route were about $500 - 600 for the HTPC and TV antenna, plus about $30 per year in electricity to run the HTPC.

I'm considering upgrading my gaming PC, and I hate how much it heats up the room even at idle.

I am going to make a few recommendations for the spreadsheet, which might be a lot of work, but I'll recommend them anyways. All of us have different computer-usage amounts and different electricity costs. I recommend adding three fillable fields for electricity cost, idle usage per day, and load usage per day. Then, base your formulas on these amounts. You can prefill them with your amounts so that the spreadsheet will start off populated, but people will be able to tweak your findings to suit their own situation.

The other recommendation is to make the rankings column dependent on one of the other columns (probably total cost per FPS). That will allow the rankings to change based on that column.
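A minimal sketch of how those fillable fields could drive the formulas; the function name, default values, and example card numbers here are illustrative, not from the actual spreadsheet:

```python
# Sketch of the fillable-field idea: one set of user inputs drives every
# card's yearly-cost figure. All names and defaults here are illustrative.
def yearly_cost(idle_w, load_w, rate_per_kwh=0.21,
                idle_h_per_day=4.0, load_h_per_day=2.0, days=365):
    """Yearly electricity cost ($) for a card, from idle/load draw (W)."""
    daily_kwh = (idle_w * idle_h_per_day + load_w * load_h_per_day) / 1000
    return daily_kwh * days * rate_per_kwh

# Example: a hypothetical card idling at 30 W and loading at 200 W
print(round(yearly_cost(30, 200), 2))   # ~$39.86/year at the defaults
```

The same idea maps directly onto spreadsheet input cells: change the rate or the hours once and every row recalculates.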

Thanks for all the hard work!

Edit: Adding a fourth fillable field for frequency of GPU upgrade would be helpful as well.

Edit2: And as a caveat to my statement above about why people should care about efficiency, I finished reading the thread, and I'm not saying anything about people buying one 6950 instead of a 580 SLI setup for efficiency. I'm saying that efficiency should be looked at as much as initial purchase price. The initial price of card A might be less than card B, but the total cost of ownership could be more for card A if it's less efficient. And I think total cost of ownership is what pandemonium is trying to drive home. If you spend $3,000 on a computer, then you might not care about total cost of ownership (much like someone who buys an AMG doesn't care that it's a gas guzzler). However, I don't think it's a stretch to say that the majority of forum members fall into the sub-$300 video card category, in which efficiency can make a very big difference for the total cost of ownership.
__________________
Heatware 24+ eBay 22+

Last edited by kalrith; 08-26-2011 at 01:45 PM.
kalrith is offline   Reply With Quote
Old 08-26-2011, 02:18 PM   #90
kalrith
Diamond Member
 
 
Join Date: Aug 2005
Location: Missouri
Posts: 6,586
Default

Well, I had a few extra minutes, so I made my proposed changes at the following link:



Since the inputs can be changed, only one set of data needs to be included, which I think makes it a little less overwhelming. I also froze the first pane, so you'll always know what cards you're looking at.

Anyways, let me know what you guys think.

will fix link in a minute
kalrith is offline   Reply With Quote
Old 08-26-2011, 03:04 PM   #91
gevorg
Diamond Member
 
 
Join Date: Nov 2004
Location: Bay Area
Posts: 4,749
Default

Awesome analysis! This should go on the main Anandtech page. Maybe a new section for hand-picked reviews by AT forum members.
gevorg is offline   Reply With Quote
Old 08-26-2011, 06:14 PM   #92
dualsmp
Golden Member
 
 
Join Date: Aug 2003
Location: SC
Posts: 1,524
Default

Can someone upload pandemonium's spreadsheet to Rapidshare or another site? Thanks.
dualsmp is offline   Reply With Quote
Old 08-27-2011, 01:14 AM   #93
pandemonium
Golden Member
 
 
Join Date: Mar 2011
Location: Augusta, GA
Posts: 1,404
Default

Quote:
Originally Posted by Ben90 View Post
You can't just use dB levels as a baseline of heatsink performance. If the above was true, the Intel stock cooler would be a beast and Honda Civics with Fartpipes would be fast.
Hah, definitely no. You're missing the other variables in those analogies that are oh-so important for the comparison; such as displacement, mpg, bhp, whp, curb weight, etc.

The point was to eliminate, as much as theoretically possible, the differences in heat-dispersal performance between the two cards so we could compare how Watt/FPS aligns with Heat/FPS between different cards. If all cards were equal, they'd scale along roughly the same line with power consumed and power lost through heat. Since we know they're not equal due to design differences, we can then compare Watt/FPS and Heat/FPS as a property of efficiency as well as how each card affects our budgets. That's how I see it anyways.

Quote:
Originally Posted by Ben90 View Post
Imagine a video card that had a huge phase change unit integrated within itself for a cooler. The GPU temperature could be below zero. According to your chart the card is creating negative heat. That makes no sense.
True, but you're going too extreme for comparative purposes. All of the cards on this chart are actively air cooled. In order to equally use my comparison you'd have to apply the same cooling techniques to all the video cards.
pandemonium is offline   Reply With Quote
Old 08-27-2011, 01:25 AM   #94
Ben90
Platinum Member
 
Join Date: Jun 2009
Posts: 2,824
Default

Quote:
Originally Posted by pandemonium
In order to keep the comparison as objective as possible, I'll pick 2 cards that have nearly the same Load noise levels. This is to reduce the comparative amount of heat dispersal that is being done by heat-sinking and fan assemblies.
Quote:
Originally Posted by pandemonium View Post
Hah, definitely no. You're missing the other variables in those analogies that are oh-so important for the comparison; such as displacement, mpg, bhp, whp, curb weight, etc.
Perfectly phrased; there are other things that determine how well a cooler cools than just the noise it makes. Case in point: GTX 470 vs GTX 580.

The 580 uses more power yet has a lower load temperature and lower noise signature. This is due to being switched to vapor chamber cooling.

The way you calculate "Heat" would incorrectly put the 470 as using more power when it doesn't.
Ben90 is offline   Reply With Quote
Old 08-27-2011, 01:53 AM   #95
pandemonium
Golden Member
 
 
Join Date: Mar 2011
Location: Augusta, GA
Posts: 1,404
Default

Quote:
Originally Posted by kalrith View Post
First of all, THANKS OP! The spreadsheet is great. I'm very much on board with this.
You're welcome! I'm glad another person sees use for the sheets.

Quote:
Originally Posted by kalrith View Post
I am going to make a few recommendations to the spreadsheet, which might be a lot of work, but I'll recommend them anyways. All of us have different computer-usage amounts and different electricity costs. I recommend putting three fillable fields to contain electric costs, idle usage per day, and load usage per day. Then, make your formulas based on these amounts. You can prefill them with your amounts so that the spreadsheet will start off being populated, but people will be able to tweak your findings to suit their own situation.

The other recommendation is to make the rankings column dependent on one of the other columns (probably total cost per FPS). That will let the rankings change based on that column.

Thanks for all the hard work!

Edit: Adding a fourth fillable field for frequency of GPU upgrade would be helpful as well.
I agree that fillable fields would be perfect and would actually shrink the sheets, since I wouldn't have to calculate the separate frequencies out. I'm actually learning how Excel works by doing these sheets, so I apologize if they're not done the best way. I love learning it, but if anyone is an expert, you're more than welcome to modify the sheets. Send me a message and I'll send them your way.
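For anyone curious what the fillable-field version of the calculation would look like outside Excel, here's a minimal sketch in Python. The formula mirrors the spreadsheet's idea (idle and load hours per day, per-kWh rate); the wattages, hours, and rate in the example are purely illustrative assumptions, not figures from the sheets.

```python
# Sketch of the parameterized cost formula the fillable fields would drive.
# All input values below are illustrative assumptions, not thread data.

def annual_electricity_cost(idle_watts, load_watts,
                            idle_hours_per_day, load_hours_per_day,
                            rate_per_kwh):
    """Annual electricity cost in dollars for one card."""
    daily_kwh = (idle_watts * idle_hours_per_day +
                 load_watts * load_hours_per_day) / 1000.0
    return daily_kwh * 365 * rate_per_kwh

# Example: hypothetical card at 30 W idle / 200 W load,
# 18 h idle and 6 h gaming per day, at $0.11/kWh:
cost = annual_electricity_cost(30, 200, 18, 6, 0.11)
print(round(cost, 2))  # 69.86
```

Swapping any of the five inputs recomputes the whole column, which is exactly what the proposed fillable cells would do.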

Quote:
Originally Posted by kalrith View Post
Well, I had a few extra minutes, so I made my proposed changes at the following link:
I'm not seeing a link. :/
__________________
..:: |We are but shadows of our achievements and dust of the stars; empowering the universe to have consciousness.| ::..
..:: |Fighting ignorance is bliss.| ::..

If you're curious about getting into vaping, I'd be glad to help! I quit smoking with vaping! /truestory
pandemonium is offline   Reply With Quote
Old 08-27-2011, 05:41 AM   #96
Silverforce11
Diamond Member
 
Silverforce11's Avatar
 
Join Date: Feb 2009
Location: Australia
Posts: 4,642
Default

Crazy how cheap electricity is for Americans.

It's around 1/6 to 1/10 of what a lot of EU countries pay.
__________________

Rig 1: 3570K | Z77 E4 | Crossfire R290 | 840 250GB + 840EVO 250GB | 8GB G.Skill Ares 2133 | OCZ 850W Gold+ | Nanoxia DS1 | Ghetto Water
Hobby: Mobile Game Dev & Cryptocoin day-trader
Silverforce11 is offline   Reply With Quote
Old 08-30-2011, 06:30 AM   #97
FredGamer
Banned
 
Join Date: Jul 2011
Posts: 7
Default

Quote:
Originally Posted by pandemonium View Post
For someone who spends $.21 per kWh, the differences in operational costs per day will be: $.59430 versus $.81564 (or $.22134)
On a monthly basis of ~30.4 days (365/12) this equates to: $6.728

It would be difficult to spend $.21 per kWh in the USA, where residential electricity averages $.11 per kWh across the country:

http://www.eia.gov/cneaf/electricity...able5_6_a.html

So I say "HD6990 buyers, enjoy."
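The quoted numbers are easy to double-check. This snippet reproduces the arithmetic from the quoted post (the two daily cost figures and the ~30.4 days/month factor are taken directly from it):

```python
# Re-checking the quoted figures at $0.21/kWh.
daily_low, daily_high = 0.59430, 0.81564   # daily cost of each card, from the post
daily_delta = daily_high - daily_low
monthly_delta = daily_delta * 30.4         # post uses ~30.4 days/month (365/12)

print(round(daily_delta, 5))    # 0.22134
print(round(monthly_delta, 3))  # 6.729
```

At the $.11/kWh US average the monthly gap would be roughly half that, which is the point: the rate assumption drives the whole comparison.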
FredGamer is offline   Reply With Quote
Old 08-30-2011, 07:32 PM   #98
kalrith
Diamond Member
 
kalrith's Avatar
 
Join Date: Aug 2005
Location: Missouri
Posts: 6,586
Default

I apologize for taking so long. I uploaded my changes (referenced in a previous post) to Rapidshare at the following link:

https://rapidshare.com/files/6259904...eet_083011.xls

Edit: In case it's not completely obvious, I only made changes to the first worksheet. You input your entries in the highlighted fields, and it changes the data for you.
__________________
Heatware 24+ eBay 22+

Last edited by kalrith; 08-30-2011 at 07:50 PM.
kalrith is offline   Reply With Quote
Old 08-30-2011, 08:37 PM   #99
RussianSensation
Elite Member
 
RussianSensation's Avatar
 
Join Date: Sep 2003
Location: Dubai, UAE
Posts: 14,464
Default

Quote:
Originally Posted by pandemonium View Post
None of these cards came out at the same time, but the point is they are all available NOW. You can't argue past and future in a present situation, lol. How would you have known? You wouldn't have! Perhaps you need to read the disclaimers again in post #2.
You are missing my point because of the statement you made. For instance, sure, the GTX480 is available NOW for $300. But hardly anyone would buy that card when you can get a GTX570 for the same price.

I mean, yes, I suppose it's useful to see how much more efficient modern cards are. However, since you brought price/performance (FPS) and wattage into consideration, it's almost a certainty that newer generations will beat older ones, because performance at a given price point increases from one generation to the next.

Essentially, in order for you to make a point, you have to compare similar setups and see the difference in electricity costs. I have gone ahead and done this, as outlined below.

Quote:
You don't know that. Honestly, you don't. Not everyone buys the latest and the greatest right when it comes out. I for one don't.
I get that. I can tell you that the HD6870 for $150 is one of the best bang-for-the-buck cards right now based on price per FPS. The electricity cost difference between an HD6870 and an HD6770 won't suddenly make the HD6770 more attractive. Just like, 8 months from now, a $350-400 HD7970 isn't going to be a better bang for the buck than a used HD6970 for $180 based on electricity costs.

My point is that in North America, electricity is a very small fraction of the total cost of ownership (bar the extremely power-hungry GTX480). From your own graphs it's evident it amounts to $5-15 annually between the worst and best cases (often even less than that).

Look at what happens when you compare apples to apples as a buyer would do today:




Annual Electricity Costs - using YOUR data:

Single GPUs: Annual electricity costs for comparable single GPUs

Ultra-High End = GTX580 = +$6.64 over the HD6970, but there is already a $100-150 price premium for this card over the HD6970/GTX570. GTX580 buyers tend to be price-inelastic. Do you think they would care about a $7-a-year electricity premium over the HD6970 at 6 hours of gaming per day, 300 days a year?

High-End = GTX570 vs. HD6970 = +$2.94 difference
Mid-High-end = GTX560 Ti vs. HD6950 = +$3.10 difference
Low-Mid-range = GTX460 vs. HD6850 = +$4.41 difference

Older generation:

HD5870 vs. GTX470 (+$7.43), but the HD5870 cost $350 at a time when the GTX470 cost $280-300, so hardly anyone would have chosen the 5870 for electricity "savings". The GTX480 cost $500, so hardly anyone eyeing it over the $350 HD5870 would have cared either. Again, electricity costs between the GTX470/480/5870 wouldn't have mattered (heat and noise would have).

Now let's move on to SLI/CF:

Multiple GPUs: Annual electricity costs for comparable Multiple GPU setups

Ultra-High End = GTX580 SLI vs. HD6970 CF = +$8.48 (when the cash outlay is $850-1000 vs. $700 on the AMD side, electricity cost is irrelevant). Regardless, I would say the proper opponent based on price is GTX570 SLI.

High-End = GTX560 Ti SLI vs. HD6950 CF = You don't have any data for GTX560 Ti SLI, so fair enough.

Let's look at HD6870 CF vs. HD6950 CF vs. HD6970 CF = $59.43 vs. $66.42 (+$6.89) vs. $78.25 (+$11.83)

Mid-Range = GTX460 SLI vs. HD6850 CF = +$6.59 difference

Your own analysis shows that the annual electricity cost difference between comparable setups ranges from about $3-12 per annum.

The only serious outliers are the GTX470/480. However, both of those cards were special cases: the GTX470 was significantly cheaper than the HD5870, and the GTX480 significantly more expensive, on release. And again, most GTX470/480/5870 owners aren't going to side-grade to the current generation for electricity savings.
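To put the scale of these gaps in perspective, here's a quick back-of-the-envelope check. The usage scenario (6 h/day, 300 days/yr) comes from the post above; the 100 W power delta and the $0.11/kWh US-average rate are illustrative assumptions, not measured figures for any pair of cards:

```python
# Sanity check: annual electricity-cost gap between two hypothetical cards.
# Power delta and rate below are illustrative assumptions.

def annual_cost_delta(watt_delta, hours_per_day, days_per_year, rate_per_kwh):
    """Annual cost difference in dollars for a given load-power gap."""
    kwh = watt_delta * hours_per_day * days_per_year / 1000.0
    return kwh * rate_per_kwh

# Even a large 100 W gap at US-average rates:
print(round(annual_cost_delta(100, 6, 300, 0.11), 2))  # 19.8
```

Since most of the comparable pairs above differ by far less than 100 W under load, the $3-12 per annum figures fall straight out of this formula.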
__________________
i5 2500k | Asus P8P67 Rev.3 | Sapphire Dual-X HD7970 1150/1800 1.174V CFX | G.Skill Sniper 8GB DDR3-1600 1.5V
SeaSonic Platinum 1000W | OCZ Vertex 3 120GB + HITACHI 7K1000.B 1TB | Windows 7
Westinghouse 37" 1080P | X-Fi Platinum | Logitech Z5300E 5.1

Last edited by RussianSensation; 08-30-2011 at 09:11 PM.
RussianSensation is online now   Reply With Quote
Old 08-30-2011, 09:14 PM   #100
RussianSensation
Elite Member
 
RussianSensation's Avatar
 
Join Date: Sep 2003
Location: Dubai, UAE
Posts: 14,464
Default

Quote:
Originally Posted by kalrith View Post
First of all, THANKS OP! The spreadsheet is great. I'm very much on board with this. For those of you who keep saying you don't care about power consumption, that's bull (unless you don't pay for your electricity). If you're comparing two cards of similar performance and one costs $40 more than the other, of course you'll choose the cheaper card. It would be the same thing with two equally priced and equally performing cards with one of them using $40 more in electricity per year.
Except, if you look at the data presented for comparable setups, the electricity cost differences between NV and AMD tend to hover around $5-10 (+/- $2-3). So it's nowhere near $40, and therefore hardly material, despite being projected at electricity rates higher than the US average.
__________________
i5 2500k | Asus P8P67 Rev.3 | Sapphire Dual-X HD7970 1150/1800 1.174V CFX | G.Skill Sniper 8GB DDR3-1600 1.5V
SeaSonic Platinum 1000W | OCZ Vertex 3 120GB + HITACHI 7K1000.B 1TB | Windows 7
Westinghouse 37" 1080P | X-Fi Platinum | Logitech Z5300E 5.1
RussianSensation is online now   Reply With Quote
Tags
cost, efficiency, energy consumption, fps, gaming
