Power consumption of G92 when you lower clocks

panfist

Senior member
Sep 4, 2007
343
0
0
I have an 8800GT that I typically run at 680MHz core 975MHz RAM when I play games, and then lower the clocks to 300MHz/600MHz when not running any 3D applications.

I was wondering if there were any reports detailing how much power you save by doing this. If I had an ammeter or a Kill A Watt meter or the like, I'd measure it myself.

Frankly, I'm surprised that NVIDIA cards don't do this automatically... unless the power saving is negligible.
 

DarkRogue

Golden Member
Dec 25, 2007
1,243
3
76
I tried it with a Kill A Watt.

It made zero difference.

When you're idling on the desktop the GPU isn't going nuts rendering it, and I saw zero difference in power consumption with the core clock at 700MHz vs. 600MHz.

Edit:
I never tried going below the stock frequency though.
 

ajaidevsingh

Senior member
Mar 7, 2008
563
0
0
I have tried it; there was almost no difference on a 9800 GTX... "We were thinking that if we lowered the clock enough, maybe the 2D work would be shifted to the mobo, but that's not how it works."

 

panfist

Senior member
Sep 4, 2007
343
0
0
Originally posted by: DarkRogue
I tried it with a Kill A Watt. It made zero difference.

Edit:
I never tried going below the stock frequency though.

Well... if you want to try it again, underclocking below stock, that would be sweet :D.

Originally posted by: OmegaShadow
If you do that, would it just save you a few cents per month?

I'm trying to find out how many cents per month I would save. A penny saved is a penny earned. Every cent counts.

Originally posted by: ajaidevsingh
I have tried it; there was almost no difference on a 9800 GTX... "We were thinking that if we lowered the clock enough, maybe the 2D work would be shifted to the mobo, but that's not how it works."

I'm not sure what you mean by the quoted part. As for the first part, what was your delta between high and low clock speeds? I'm cutting my clock speed by more than half; I can't imagine that I would get a ZERO watt difference.

Originally posted by: ShadowFlareX
Yeah, I wish my 9800GTX throttled down when not 3D gaming. It should run at least a bit cooler.

With RivaTuner or ATITool, you can underclock the graphics card and then reset it to stock clocks (or overclock it) when you launch a 3D application. Either program can try to automatically detect when you launch a 3D app, or you can program in a list of specific applications. I'm trying to figure out how much power I actually save by doing this.
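To make the detect-and-switch idea concrete, here is a rough Python sketch of the logic those profiles implement. It is not RivaTuner's or ATITool's actual code: the set_gpu_clocks() helper is a hypothetical stand-in for whatever mechanism your tweaking tool exposes, and the watchlist of game executables is made up; only the process polling (via the psutil library) is real.

# Rough sketch of watchlist-based clock profile switching (hypothetical helper below).
import time
import psutil

GAME_EXES = {"crysis.exe", "ut3.exe"}   # hypothetical watchlist of 3D apps
GAMING_CLOCKS = (680, 975)              # core/memory MHz from the original post
DESKTOP_CLOCKS = (300, 600)

def set_gpu_clocks(core_mhz, mem_mhz):
    # Placeholder: a real tool (RivaTuner, ATITool, etc.) would apply the clocks here.
    print(f"Applying clocks: {core_mhz}/{mem_mhz} MHz")

def game_running():
    for proc in psutil.process_iter(attrs=["name"]):
        name = (proc.info.get("name") or "").lower()
        if name in GAME_EXES:
            return True
    return False

current = None
while True:
    wanted = GAMING_CLOCKS if game_running() else DESKTOP_CLOCKS
    if wanted != current:           # only touch the card when the profile changes
        set_gpu_clocks(*wanted)
        current = wanted
    time.sleep(5)                   # poll every few seconds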
 

Throckmorton

Lifer
Aug 23, 2007
16,830
3
0
What about the wear on the video card? Does reducing the clock reduce wear, or would you have to reduce voltage too?
 

BlueWeasel

Lifer
Jun 2, 2000
15,940
474
126
I have 2 profiles set up in RT. One for 2D/desktop use that underclocks my 8800GT as low as it will go (315/785/450) and another for gaming that overclocks the hell out of it.

As I'm in 2D 90% of the time, I figured the same -- the lower clock speeds would result in a cooler card with less draw.
 

OmegaShadow

Senior member
Dec 12, 2007
231
0
0
Originally posted by: panfist
Originally posted by: OmegaShadow
If you do that, would it just save you a few cents per month?

I'm trying to find out how many cents per month I would save. A penny saved is a penny earned. Every cent counts.


Well, 1000 watts = 1 kilowatt.
For me, every kilowatt-hour up to 600 kWh per month costs 3.5 cents.


Let's say you use your computer for 12 hours a day. That works out to 3.6 kWh/day of electricity you're using. If your monitor is on for the same amount of time, the monitor alone uses another 2.4 kWh/day.

3.6 + 2.4 = 6 kWh/day

That's about $75 of electricity you pay per month for your computer usage, based on the assumption that you use your computer 12 hours a day for a month.



You should try to get an energy efficient PSU; that will probably save you more energy. Or turn off your monitor if you're downloading and not at your computer.




Ahh, don't take my figures literally lol. I just Googled up some stuff to do my calculations. You should ask an expert if you want to be sure.
 

panfist

Senior member
Sep 4, 2007
343
0
0
Originally posted by: OmegaShadow
<some calculations>

I already have an 80+ PSU.

I use my computer 24 hours a day. Well, I'm not in front of it all the time, but it's on to seed my torrents, transcode video, and record TV shows. I turn off my monitors when I'm not in front of the computer.

Besides the monitors, which I already do my best to turn off when I'm not using them, the 8800GT is the most power-hungry component in my computer, so if I want to minimize the money I spend on power, it's the next logical thing to go after.
 

panfist

Senior member
Sep 4, 2007
343
0
0
I do that, too. Putting the graphics card into a standby state, however, isn't an option when I'm web surfing, word processing, or the like.
 

PingSpike

Lifer
Feb 25, 2004
21,732
561
126
Originally posted by: BlueWeasel
I have 2 profiles set up in RT. One for 2D/desktop use that underclocks my 8800GT as low as it will go (315/785/450) and another for gaming that overclocks the hell out of it.

As I'm in 2D 90% of the time, I figured the same -- the lower clock speeds would result in a cooler card with less draw.

Do you have a Kill A Watt? I'm curious what the power usage difference is. I was under the impression that the 8800GT already did some kind of downclocking that was hidden from the user, unlike, say, the X19xx cards. I always thought the ATI cards didn't really clock down low enough... your average PCI graphics card from 5 years ago does most Windows stuff fine.
 

Denithor

Diamond Member
Apr 11, 2004
6,300
23
81
Originally posted by: OmegaShadow
Well, 1000 watts = 1 kilowatt.
For me, every kilowatt-hour up to 600 kWh per month costs 3.5 cents.

Let's say you use your computer for 12 hours a day. That works out to 3.6 kWh/day of electricity you're using. If your monitor is on for the same amount of time, the monitor alone uses another 2.4 kWh/day.

3.6 + 2.4 = 6 kWh/day

That's about $75 of electricity you pay per month for your computer usage, based on the assumption that you use your computer 12 hours a day for a month.

Something is off in your math there... I think it's closer to 6 kWh/day * 30 days/month * $0.035/kWh = $6.30/month in electricity. And that's assuming a CRT monitor on for 12 hours, not an LCD (which are much more energy efficient).

A better estimate would be 200W for the rig and 50W for an LCD. If the rig is on 24 hours/day and the LCD is on, say, 6 hours/day, that's 5.1 kWh/day * 30 days/month * $0.035/kWh = $5.35 per month.

But I also doubt your energy costs that little. Check out the graph of average cost per kWh on this page. The average in the US in 2006 was $0.104/kWh.

For me, my rig is on 24 hours/day (F@H, so call it 250W) and my LCD is on about 8 hours/day (36W from the specs). So my rig under these conditions consumes about 6 kWh/day, and at NC rates it probably costs me $18/month in electricity.
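To make the arithmetic easy to rerun with your own numbers, here's a small Python sketch of the same watts-to-dollars calculation. The 250W, 36W, hours, and $0.104/kWh rate are just the example figures from this post, not measurements.

# Back-of-the-envelope monthly electricity cost: watts -> kWh/day -> dollars/month.
def monthly_cost(watts, hours_per_day, dollars_per_kwh, days_per_month=30):
    kwh_per_day = watts * hours_per_day / 1000.0
    return kwh_per_day * days_per_month * dollars_per_kwh

RATE = 0.104  # $/kWh, the 2006 US average cited above; substitute your own rate

rig = monthly_cost(250, 24, RATE)   # rig folding 24/7 at an assumed 250W draw
lcd = monthly_cost(36, 8, RATE)     # LCD at 36W for 8 hours/day
print(f"Rig: ${rig:.2f}/month  LCD: ${lcd:.2f}/month  Total: ${rig + lcd:.2f}/month")

With those inputs the rig works out to roughly $18-19/month, in line with the estimate above.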
 

Denithor

Diamond Member
Apr 11, 2004
6,300
23
81
And to the OP:

I doubt that you would see any difference in power consumption from down-clocking your card as the voltage does not drop because of reduced speeds. However, you will probably generate a little less heat, meaning less wear on the card, and also less heat for your AC to have to deal with (thereby saving a tiny amount on your power bill). But we are literally splitting pennies here.
 

panfist

Senior member
Sep 4, 2007
343
0
0
Originally posted by: Denithor
I doubt that you would see any difference in power consumption from down-clocking your card as the voltage does not drop because of reduced speeds. However, you will probably generate a little less heat

You just contradicted yourself, because less heat means less power dissipated. The heat doesn't just get lowered by itself. If heat is lowered, it must be as a consequence of lower power dissipation.

Imagine your CPU or GPU as a guy on a bike who is constantly pedaling. RPM in this case is directly analogous to clock speed. When idle, you are biking on a flat surface; at load, you are biking uphill. Increasing or decreasing the clock speed is like pedaling faster or slower. Increasing or decreasing the voltage is like "pumping up" this imaginary guy's muscles, so that he pedals harder, but not faster. A weak guy might not have the juice to climb uphill at a certain speed... hence you have to overvolt.

Anyway...that's my understanding of the whole process. Correct me if I'm wrong, please.
 

Raider1284

Senior member
Aug 17, 2006
809
0
0
As someone mentioned, lowering the clocks doesn't lower the voltage, though. How can your card use less energy if it's pulling the same voltage from the PSU? Yes, the clock speed is different, but the card is demanding the same amount of energy. I would think you would have to lower the voltage of your card to see any power difference.
 

panfist

Senior member
Sep 4, 2007
343
0
0
Originally posted by: Raider1284
As someone mentioned, lowering the clocks doesn't lower the voltage, though. How can your card use less energy if it's pulling the same voltage from the PSU? Yes, the clock speed is different, but the card is demanding the same amount of energy. I would think you would have to lower the voltage of your card to see any power difference.

wrong, wrong, wrong.

Power consumption goes up as the square of the clock speed. Every time the clock ticks, the processor does work. The more voltage you push, the "harder" it does the work. If you have fewer clock ticks, you do less work.

Edit: here's another example. You buy a brand-new E8400 that normally runs at 3.0GHz. You can overclock it to 3.6GHz without increasing the voltage. Do you think you're getting that extra 600MHz for free? No; even though you are at the same voltage, you are using more power.
 

PingSpike

Lifer
Feb 25, 2004
21,732
561
126
Speed, voltage, and loaded versus idle are all going to have an effect on power usage. But just dropping the speed, all other things being equal, will probably not significantly decrease power consumption at idle. It would probably have a more noticeable impact at load.

Voltage probably has a larger impact, and running at a lower speed should allow you to run at a lower voltage while still being stable... assuming you have a way to lower the voltage!
 

katank

Senior member
Jul 18, 2008
385
0
0
Originally posted by: panfist

Power consumption goes up as the square of the clock speed.

Actually, I'm fairly certain that power consumption goes up linearly w/ clock speed or frequency. It goes up quadratically w/ voltage.
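For reference, the usual first-order model behind that statement is dynamic power scaling roughly as capacitance * voltage^2 * frequency. Here's a small Python sketch of what that model predicts for the clock drop the OP describes; the 1.1V and 0.9V figures are made-up example voltages, and the model ignores static/leakage power and the memory clock.

# First-order dynamic power model: P is proportional to C * V^2 * f.
# Relative comparison only, so the capacitance term cancels out.
def relative_dynamic_power(freq_mhz, volts, ref_freq_mhz, ref_volts):
    return (freq_mhz / ref_freq_mhz) * (volts / ref_volts) ** 2

# OP's core clocks: 680MHz for gaming vs 300MHz on the desktop, voltage unchanged.
same_voltage = relative_dynamic_power(300, 1.1, 680, 1.1)
print(f"300MHz, same voltage: {same_voltage:.0%} of the 680MHz dynamic power")

# Hypothetical: if the voltage could also be dropped, say from 1.1V to 0.9V.
lower_voltage = relative_dynamic_power(300, 0.9, 680, 1.1)
print(f"300MHz at 0.9V: {lower_voltage:.0%} of the 680MHz dynamic power")

Under that assumed model the clock drop alone cuts dynamic power to about 44% of the gaming figure, and a voltage drop on top of it would cut it to about 30%; how much of the card's idle draw is actually dynamic power is a separate question.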

 

OCGuy

Lifer
Jul 12, 2000
27,227
36
91
Yes, you will save power....

The new GT200 cards automatically revert to a 300MHz core clock and a low memory clock while in 2D mode.
 

Zap

Elite Member
Oct 13, 1999
22,377
2
81
Originally posted by: ajaidevsingh
"We were thinking if we lower the clock enough maybe the 2D will shifted to the mobo. but thats not how it works"

You're thinking of NVIDIA HybridPower. It's making its debut with the 55nm 9800 GT cards, but there's no functionality without a supporting IGP motherboard. How it works is that you hook your monitor up to the IGP. When graphics power is needed, the PCI Express GPU (9800 GT or others in the future) does the work but outputs through the IGP. When graphics power isn't needed, the PCI Express GPU is shut off completely through the SM Bus.

^^^ I just Googled it ;)