
thinking of buying a new videocard - for power savings?!

Toadster

Senior member
Right now I'm running twin GTX 260s - and from what I've read - I'm burning about 70-180W (idle to load) with both cards in the system.

If I change to, say, a single (newer) ATI HD 5870 - I would be burning about 25W idle and about 180-200W during 3D... plus less noise & heat.

I currently run a single 24" LCD (1920x1200), so I don't really 'need' to scale my GPU beyond that resolution.

has anyone else gone this route?
 
If you're interested in reducing the heat/noise in your room or case, a new video card can be worth it. Obviously you will have new tech and features with a new card. As far as saving power, sure you will save some, but you're going to be spending over $300 to save it. You will most likely not recoup your investment. It's better to keep your old setup if the only reason you're upgrading is for power savings.
 
Depends. If he sells his 260s for say $130 each (should be possible) and finds a 5870 deal for $361 (sapphire vapor-x abing @ TD) then the upgrade out of pocket is $100.

100 watts for 10 hours saves about 20 cents. 50 hours for a buck, so 5000 hours (or about 2.5 years at 40 hours a week) is the break-even point. Divide that in half if you have obscenely expensive electricity (Hawaii, some parts of Europe).

A 5850 is closer in price to 2 used GTX260s, uses even less power and might be enough for 19x12.
 
Definitely closes the gap if he sells both 260s. $100 is more in the ballpark, but not sure what his rig is pulling.
 
I'd really like to see the calculations showing just how much you'd actually save - over here the difference would be nominal!
 
The only way you're really gonna save money is to sell one GTX 260 and use the other one.
This is a dumb idea, no offence.

By the time you start saving money, we will be on 12nm chips, that consume 90 watts, that will eat 2 gtx 480's for breakfast.
 
Did a quick search: 260 in SLI use ~75w (DC) and 5870 uses ~25w (DC). You'll save ~50w (DC) or ~60w AC (at the wall).

(60w*24hr*365days)/1000 * $0.15 per kWh (for CA) = $78.84 per year (24hrs a day).

If you use your PC 1/2 the day and turn it off at night, then you'll save $39.42 per year. Will take 2.5 years to save $100 using your PC 12hrs a day. This is only an approximation though.
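The figures in that approximation work out like this (assuming the 60W-at-the-wall savings, the $0.15/kWh CA rate, and a $100 upgrade cost):

```python
# Step-by-step version of the savings estimate above.
# Assumptions: 60 W saved at the wall, $0.15/kWh (CA), $100 upgrade cost.
watts_saved = 60
rate = 0.15  # $/kWh

per_year_24x7 = watts_saved * 24 * 365 / 1000 * rate  # running 24 hours a day
per_year_12h = per_year_24x7 / 2                      # PC on 12 hours a day
years_to_recoup = 100 / per_year_12h                  # time to save $100

print(f"${per_year_24x7:.2f}/yr 24/7, ${per_year_12h:.2f}/yr at 12h/day, "
      f"{years_to_recoup:.1f} years to break even")
```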
 

you aren't including the times that I turn down the A/C in my house because the PC room gets too warm 🙂 I think the break even point is much faster if you add that factor
 

Damn it's that hot. Go for it then. 😀 Your electricity rates are fairly high.
 
it's not like it's 100F in my room, but you can tell when the PC has been on all day 🙂

I finally upgraded to an i3 530 and dumped my old toasty P4, so I know what you mean. My whole system now, including the monitor, uses a little less than 60w.
 
exactly! this is what I'm after!

Do you use CUDA at all? If so this could be one of those times where even though you're pulling more from the wall, you're getting the job done quicker. If not, I say go for it just for the fun new tech (and I imagine more quietness).

I doubt you'll notice a real difference on your bill, but you'd certainly be more efficient. And are you really going for sub-100 watts with a 4.2 quad? 🙂
 


savings were already calculated above. Guesses for his electricity rates and hours of use but they seem reasonable.


good point about the cpu though. Maybe software overclocking so he can idle it very low and ramp up only when needed?
 
Like the others said, being power efficient should be a side benefit, not the main reason you are swapping. Get it to make the room more comfortable, to reduce noise, etc. but don't get it primarily to save money. If you do, you will be sorely disappointed as you likely won't recoup the cost before it's time to upgrade the new card.
 

I used to run BOINC on this puppy, but that REALLY heats things up - SETI cranking on both GPUs and 12-threads makes a toasty box! I should check my other apps though, I think some of the video processing runs CUDA...

quad? 🙂 this thing has a six-pack!

it would be pretty awesome though - I'll have to dig around for some info on the lowest idle power 980X setup...
 
It's about 50 cents per kilowatt-hour where we are in California. I did the math, and currently it takes $350 a month, give or take, to run my entire milkyway@home farm at those rates. I'm so glad I'm not the one paying for power, and the one who is is running 3 to 4 times as much stuff as me :awe:
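As a quick sanity check on those numbers (assuming a 30-day month and round-the-clock operation), the implied average draw works out to roughly a kilowatt:

```python
# Back-of-envelope check on the farm figures above:
# $350/month at $0.50/kWh implies the average continuous draw.
monthly_bill = 350.0  # dollars
rate = 0.50           # $/kWh
hours_per_month = 30 * 24

kwh_per_month = monthly_bill / rate                      # 700 kWh/month
avg_draw_watts = kwh_per_month / hours_per_month * 1000  # average watts, 24/7

print(f"~{avg_draw_watts:.0f} W average draw, 24/7")
```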
 
Changing to fluorescent bulbs can actually make a noticeable difference if you have the lights on much, as far as the heat factor. I work from home, and one of the apartments we lived in had a very hot 3rd bedroom (the office) since it was upstairs and on the sunny side of the building. I looked at lots of factors to keep it cool in there without having to make the main floor ice cold by turning the AC up. I ended up making a wooden frame to house a window AC unit in the sliding door in that room, and that made a nice difference. Not particularly cost effective, but I think I recouped a good portion of the cost by not having to crank the main AC.
 

Don't know about your hex, but I was able to get my 920 stable at 0.8725V @ 2.8GHz. It'd drop to 1.6GHz idle. Had to bump it up to 0.925V with hyperthreading off, which is truly bizarre, but it ratcheted down to the same 0.8725V idle anyway. I couldn't get the memory to downclock or downvolt below 1.5V either.

Let's just say at those settings the amount of heat coming from the box was pretty darn minimal. I'd feel perfectly comfortable using that for an HTPC. Might even turn a few cores off in the BIOS for that task when the time comes.
 
Like the others said, being power efficient should be a side benefit, not the main reason you are swapping. Get it to make the room more comfortable, to reduce noise, etc. but don't get it primarily to save money. If you do, you will be sorely disappointed as you likely won't recoup the cost before it's time to upgrade the new card.

This!
 