How Important is Hybrid Power?

taltamir

Lifer
Mar 21, 2004
13,576
6
76
AMD originally came up with the idea of turning off the video card when it is not in use and using the onboard one instead. It was supposed to be in the HD38xx series, but was dropped. They will not be bringing it back for the 48xx series either.

nVidia quickly announced that they had also been working on something similar, calling it Hybrid Power, and have actually managed to make a working system... unfortunately it only works with nForce 7 chipsets for AMD motherboards and 9800GTX and above video cards.

The benefits seem obvious. Significant power savings, heat reduction, and noise elimination.

So here is a little poll: how important is hybrid power to you? Do you think AMD dropped the ball by canceling this feature? Did nVidia mess up by not getting it on Intel platforms?

EDIT: Just adding some math here to make things more clear...
Originally posted by: taltamir
Here in Texas the CHEAPEST electricity is 14 cents per kWh (thanks to nationalpowerco going out of business due to their 11 cents/kWh contracts, and everyone else raising prices).
Some places are as cheap as 6 cents per kWh... I know of some places in the US where it costs 24 cents per kWh.

I game for 2 hours a day (no savings there), use the computer for other things for 10 hours a day (this is where PowerPlay/Hybrid Power reduces power), and leave it off for the remaining 12 or so (zero power drawn, but not thanks to PowerPlay/Hybrid Power).

An idling G92 single-GPU card takes about 60-70 watts... an idle, underclocked (SpeedStep-like tech) 38xx single-GPU card takes about 40-50, BECAUSE AMD already implemented a SpeedStep-like function of underclocking the card (unlike the HD2xxx series... those can go OVER 100 watts while idle).
An IGP takes an additional 5 watts at most.

That's a 35 to 65 watt reduction for a current-gen card, higher for the G2xxx, probably higher for the 48xx series, and definitely higher for additional GPUs (not to mention you don't have to subtract the 5 extra watts the IGP takes again).

(40 - 5) watts * 0.001 kW/watt * 10 hours/day * 365 days/year * $0.14/kWh = $17.885 a year FOR ME, and I turn them off when not in use
(70 - 5) watts * 0.001 kW/watt * 10 hours/day * 365 days/year * $0.14/kWh = $33.215 a year FOR ME, and I turn them off when not in use

Semi-worst case scenario (no point in using the HD2xxx cards... they are obsolete):
leave the computer on 24 hours a day (I used to do this), with 3- or 4-card SLI/CF taking 200+ watts when IDLE, again not accounting for AC costs.
(200 - 5) watts * 0.001 kW/watt * 22 hours/day * 365 days/year * $0.14/kWh = $219.219 a year... in electricity costs just for idling a serious mGPU setup, for a person who leaves it on 24/7

But since I turn it off, let's recalculate at only 10 hours a day:
(200 - 5) watts * 0.001 kW/watt * 10 hours/day * 365 days/year * $0.14/kWh = $99.645 a year FOR ME, and I turn them off when not in use

Since this is measured power draw from the wall, some PSU inefficiency loss is accounted for. But increased AC costs are not.

This is the REAL dollar amount I will personally save... a person with a multi-GPU setup will save a multiple of this... a person who leaves his computer on 24 hours a day will save more than twice this, or exactly twice if he plays for 4 hours a day EVERY DAY.
I was being generous by saying 2 hours a day... I play 10 hours a day when a new game like Mass Effect comes out, and then I don't play anything for a month or so...

Note that Hybrid Power and the PowerPlay tech DO NOTHING while you are gaming, or while the computer is off (or in S3 sleep). They only help while you are using the computer but not gaming, which is the majority of the time for most people.
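
For anyone who wants to plug in their own numbers, here is a minimal Python sketch of the formula above (the function name and the 5 W IGP default are my own illustration; the wattages are the idle figures quoted above):

# Yearly savings from fully powering down a discrete card at idle:
# the net reduction is (idle draw - IGP draw), since the ~5 W IGP
# keeps running to drive the display.
def annual_savings(idle_watts, hours_per_day, price_per_kwh, igp_watts=5):
    kwh_per_year = (idle_watts - igp_watts) / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# The four cases worked out above:
print(round(annual_savings(40, 10, 0.14), 3))   # 17.885  -> underclocked 38xx, 10 h/day
print(round(annual_savings(70, 10, 0.14), 3))   # 33.215  -> idle G92, 10 h/day
print(round(annual_savings(200, 22, 0.14), 3))  # 219.219 -> 200 W mGPU rig left on 24/7
print(round(annual_savings(200, 10, 0.14), 3))  # 99.645  -> same rig at only 10 h/day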
 

solog

Member
Apr 18, 2008
145
0
0
It's important to two people: you and me. About 1000 or so people use these forums, so about 0.2% of the tech populace cares. Doesn't look good.

The 9800GTX and X2 are the only two cards that use it now, but I think that all new Nvidia cards will support it.
 

s44

Diamond Member
Oct 13, 2006
9,427
16
81
Honestly, I'm amazed this hasn't been done sooner, considering how much heat and power high-end cards guzzle. I assume it'll be on Intel soon enough.
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
It would not affect my personal decision to buy a card. However, I can see that some people might find it useful for keeping noise down and conserving power. Here's something interesting from AT concerning AMD:

However, AMD does not offer Hybrid power capabilities nor does the flagship 790FX offer integrated graphics capabilities. We will have to wait a few more months for the AMD 790GX to arrive for those two features.

So we will be seeing Hybrid Power features from AMD in the 790GX, which is due out pretty soon, last I heard.
 

n7

Elite Member
Jan 4, 2004
21,281
4
81
There are many things higher on my priority list.

I haven't followed this closely, but you need a matching nV chipset + card or AMD chipset + card for this to work, right?

If so, zero interest here, as I'll most likely be sticking with Intel chipsets for the next while.
 

vanvock

Senior member
Jan 1, 2005
959
0
0
If you're looking at something on the screen, then the card is in use; the difference is 2D vs. 3D, and how much less an on-board solution would use than an actual card in 2D, which I doubt is significant. The other bonus is said to be SLI with the on-board when in 3D, which might be good but could be outdone with another card. If you're worried about fan noise, use RivaTuner. I'm sure this has appeal to some who may think hybrid cars are a good idea, and Al Gore would encourage its use (as he jets around the world and drives his muscle car around Nashville); it just doesn't excite me.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Yes, you need a matching chipset with an IGP. nVidia actually went as far as stating that they will never make a non-IGP chipset again: only high-end chipsets with an IGP for Hybrid Power, and low-end chipsets with an IGP as the main display and for Hybrid SLI for cheap performance.

AMD claimed it would be available with the 780G + 3870... but it got cut.

And Intel will not be able to offer such a feature until they make their own video card, since you need a single driver controlling both the IGP and the main video card and switching between them.
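
To illustrate why that single driver matters, here is a purely hypothetical sketch in Python (none of these classes or methods are a real driver API; it just shows the switching logic): one component owns both GPUs, so it can hand the display off and cut power behind the OS's back.

# Hypothetical Hybrid Power switching logic. All names are illustrative.
class Gpu:
    def __init__(self, name, idle_watts):
        self.name = name
        self.idle_watts = idle_watts
        self.powered = True

    def power(self, on):
        self.powered = on
        print(self.name, "on" if on else "off (~0 W)")

class HybridDriver:
    # One driver owns both GPUs, so the handoff is atomic; two vendors'
    # separate drivers would have no shared switching point like this.
    def __init__(self, igp, discrete):
        self.igp, self.discrete = igp, discrete
        self.discrete.power(False)          # discrete card off at the desktop

    def on_load_change(self, gaming):
        if gaming and not self.discrete.powered:
            self.discrete.power(True)       # spin up the big card for 3D
        elif not gaming and self.discrete.powered:
            self.discrete.power(False)      # back to IGP-only idle savings

driver = HybridDriver(Gpu("IGP", 5), Gpu("9800GTX", 65))
driver.on_load_change(gaming=True)    # a game launches
driver.on_load_change(gaming=False)   # back to the desktop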

I would not want to turn off, or lower, the fan on my video card while it idles; they are hot enough as it is. Actually, I had to INCREASE the idle fan speed on my 8800GTS 512 from the default, because it was running too hot and I fried two different cards in a short period of time.

And the ideal car is a plug-in electric / hydrogen fuel cell hybrid (fully electric motors, just two different sources of power: batteries charged overnight, or a hydrogen fuel cell for range extension).
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
When the MCP7A (the Intel version of the MCP78) arrives, it will be available for Intel systems.

Hybrid power is an excellent idea. I think businesses that use top-of-the-line workstation cards will be very interested in the next lineup of Quadros that could support hybrid power.
 

vanvock

Senior member
Jan 1, 2005
959
0
0
I idle my GT at higher than stock as well, since I can't hear the difference. As far as the battery cars go, it sounds good until you have to shell out around $5k to replace the batteries and you realize that neither they nor the power to recharge them "grow on trees".
 

imported_Scoop

Senior member
Dec 10, 2007
773
0
0
It's about time this sort of technology came around. What I don't understand is how Nvidia still hasn't adopted the clock drops while idle like AMD already did with the 38xx series. What's up with that? Seems like they don't give a crap.
 

SniperDaws

Senior member
Aug 14, 2007
762
0
0
Pointless. Just make graphics cards use a SpeedStep like Intel does for CPUs; it will work better than hybrid motherboards.
 

betasub

Platinum Member
Mar 22, 2006
2,677
0
0
Originally posted by: Scoop
What I don't understand is how Nvidia still hasn't adopted the clock drops while idle like AMD already did with the 38xx series.

Nvidia has used them in the past - even my old 5900 card has Standard (2D) and Performance (3D) clocks. And ATI didn't just recently start using them either - look back at their earlier high-end cards from previous generations.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: SniperDaws
Pointless. Just make graphics cards use a SpeedStep like Intel does for CPUs; it will work better than hybrid motherboards.

Incorrect. A "speedstep"-type graphics setup can significantly reduce heat and power usage from the given graphics card, but not a 100% reduction. So. There is a "point" after all.
After all these years of people placing so much emphasis and importance on power consumption, heat, and noise, you'd think this would be a most welcome feature.
Instead, I'm amazed to see some folks call it pointless.
 

SniperDaws

Senior member
Aug 14, 2007
762
0
0
Originally posted by: keysplayr2003
Originally posted by: SniperDaws
Pointless. Just make graphics cards use a SpeedStep like Intel does for CPUs; it will work better than hybrid motherboards.

Incorrect. A "speedstep"-type graphics setup can significantly reduce heat and power usage from the given graphics card, but not a 100% reduction. So. There is a "point" after all.
After all these years of people placing so much emphasis and importance on power consumption, heat, and noise, you'd think this would be a most welcome feature.
Instead, I'm amazed to see some folks call it pointless.


You would still be using power to run the onboard graphics, so it's no different than implementing a SpeedStep technology for graphics cards. IMO, I would prefer a SpeedStep config instead of having onboard graphics as well as a graphics card; it's like going back to the Voodoo 2 series of cards, where you had a crappy 2D card for desktop use and a 3D graphics card for gaming.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: SniperDaws
Originally posted by: keysplayr2003
Originally posted by: SniperDaws
Pointless. Just make graphics cards use a SpeedStep like Intel does for CPUs; it will work better than hybrid motherboards.

Incorrect. A "speedstep"-type graphics setup can significantly reduce heat and power usage from the given graphics card, but not a 100% reduction. So. There is a "point" after all.
After all these years of people placing so much emphasis and importance on power consumption, heat, and noise, you'd think this would be a most welcome feature.
Instead, I'm amazed to see some folks call it pointless.


You would still be using power to run the onboard graphics, so it's no different than implementing a SpeedStep technology for graphics cards. IMO, I would prefer a SpeedStep config instead of having onboard graphics as well as a graphics card; it's like going back to the Voodoo 2 series of cards, where you had a crappy 2D card for desktop use and a 3D graphics card for gaming.

That's kind of a reach. How much power can onboard graphics possibly use? Little more than the chipset itself, and certainly less than a high-end card clocked down.

And you could not shut down the Voodoo 2 series when not gaming. So how is this like going back to that era?
 

Jumpem

Lifer
Sep 21, 2000
10,757
3
81
Originally posted by: keysplayr2003
Incorrect. A "speedstep"-type graphics setup can significantly reduce heat and power usage from the given graphics card, but not a 100% reduction. So. There is a "point" after all.
After all these years of people placing so much emphasis and importance on power consumption, heat, and noise, you'd think this would be a most welcome feature.
Instead, I'm amazed to see some folks call it pointless.

Nvidia and ATI should spend their resources on improving performance. I couldn't care less about saving $2 on my electric bill.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: Jumpem
Originally posted by: keysplayr2003
Incorrect. A "speedstep"-type graphics setup can significantly reduce heat and power usage from the given graphics card, but not a 100% reduction. So. There is a "point" after all.
After all these years of people placing so much emphasis and importance on power consumption, heat, and noise, you'd think this would be a most welcome feature.
Instead, I'm amazed to see some folks call it pointless.

Nvidia and ATI should spend their resources on improving performance. I couldn't care less about saving $2 on my electric bill.

I'm pretty sure that's exactly what they did. Boy, electricity is cheap where you are, eh?
 

imported_Scoop

Senior member
Dec 10, 2007
773
0
0
Originally posted by: betasub
Originally posted by: Scoop
What I don't understand is how Nvidia still hasn't adopted the clock drops while idle like AMD already did with the 38xx series.

Nvidia has used them in the past - even my old 5900 card has Standard (2D) and Performance (3D) clocks. And ATI didn't just recently start using them either - look back at their earlier high-end cards from previous generations.

I don't really care what happened in the past, I care about the present. One of the very strong points of today's Radeon cards is the power-saving feature, and I am very happy to own a Radeon 3850 knowing how little it consumes while idle compared to an 8800GT.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Here in Texas the CHEAPEST electricity is 14 cents per kWh (thanks to nationalpowerco going out of business due to their 11 cents/kWh contracts, and everyone else raising prices).
Some places are as cheap as 6 cents per kWh... I know of some places in the US where it costs 24 cents per kWh.

I game for 2 hours a day (no savings there), use the computer for other things for 10 hours a day (this is where PowerPlay/Hybrid Power reduces power), and leave it off for the remaining 12 or so (zero power drawn, but not thanks to PowerPlay/Hybrid Power).

An idling G92 single-GPU card takes about 60-70 watts... an idle, underclocked (SpeedStep-like tech) 38xx single-GPU card takes about 40-50, BECAUSE AMD already implemented a SpeedStep-like function of underclocking the card (unlike the HD2xxx series... those can go OVER 100 watts while idle).
An IGP takes an additional 5 watts at most.

That's a 35 to 65 watt reduction for a current-gen card, higher for the G2xxx, probably higher for the 48xx series, and definitely higher for additional GPUs (not to mention you don't have to subtract the 5 extra watts the IGP takes again).

(40 - 5) watts * 0.001 kW/watt * 10 hours/day * 365 days/year * $0.14/kWh = $17.885 a year FOR ME, and I turn them off when not in use
(70 - 5) watts * 0.001 kW/watt * 10 hours/day * 365 days/year * $0.14/kWh = $33.215 a year FOR ME, and I turn them off when not in use

Semi-worst case scenario (no point in using the HD2xxx cards... they are obsolete):
leave the computer on 24 hours a day (I used to do this), with 3- or 4-card SLI/CF taking 200+ watts when IDLE, again not accounting for AC costs.
(200 - 5) watts * 0.001 kW/watt * 22 hours/day * 365 days/year * $0.14/kWh = $219.219 a year... in electricity costs just for idling a serious mGPU setup, for a person who leaves it on 24/7

But since I turn it off, let's recalculate at only 10 hours a day:
(200 - 5) watts * 0.001 kW/watt * 10 hours/day * 365 days/year * $0.14/kWh = $99.645 a year FOR ME, and I turn them off when not in use

Since this is measured power draw from the wall, some PSU inefficiency loss is accounted for. But increased AC costs are not.

This is the REAL dollar amount I will personally save... a person with a multi-GPU setup will save a multiple of this... a person who leaves his computer on 24 hours a day will save more than twice this, or exactly twice if he plays for 4 hours a day EVERY DAY.
I was being generous by saying 2 hours a day... I play 10 hours a day when a new game like Mass Effect comes out, and then I don't play anything for a month or so...

Note that Hybrid Power and the PowerPlay tech DO NOTHING while you are gaming, or while the computer is off (or in S3 sleep). They only help while you are using the computer but not gaming, which is the majority of the time for most people.