Hybrid video... $35 to $350 a year realistic cost reduction?

taltamir

Lifer
Mar 21, 2004
13,576
6
76
So AMD's hybrid CF was announced some months ago, but they were quite clear that the first generation of it will NOT support turning off your video card while on the desktop.

nVidia just announced their version of it, and theirs WILL...

AMD:
+ increase performance
+ multiple monitors
- No turning off GPU

nVidia:
+ increase performance
+ turn off GPU
- single monitor only.


Seems like AMD misses the mark yet again. I was going to switch to AMD for this technology, but with them not including it in the first gen and nVidia coming up with a version that does, I guess I will be buying nVidia. In fact I am currently waiting: I was itching to upgrade my video card, but I will hold off until this tech is out and buy whichever comes up with it first (because I am tired of my 7900GS).
This does not bode well for the competitive market, since AMD has once again come up with something amazing and missed the mark on its implementation.

EDIT:
People keep on arguing that power reduction is only useful in laptops so let me use some MATH to back up what I say...
I live in Texas. The Texas average cost is 14 cents plus extra charges per kWh (if you use TXU; most people just haven't switched after deregulation). I pay a few tenths of a cent above the lowest rate in Texas, coming to 12 cents per kWh (including all extra charges in that figure).

A reasonable reduction for a mid-range card is 100 watts by turning off the card and using the on-board GPU (probably as high as 200 watts for the truly beastly cards).
A 100-watt drop, assuming your comp is on 24/7 and you use the video card for an average of 4 hours a day, gives you 20 hours at 100 watts, aka 2 kWh a day. That's 730 kWh a year, and at 12 cents per kWh that's $87.60 a year...

If you are using a beastly card that takes 200 watts MORE than the onboard GPU at IDLE, then you should save twice that... If you have two beastly cards then it's four times the savings ($350.40 a year).

$87.60 minimum savings per year for a person who uses his computer 24/7 is the most tangible savings I have EVER seen from an energy-efficient product.

Now, not everyone leaves their computer on 24/7.

I turn off my computer when not in use... on an average day I spend 8 hours doing general computing and 4 hours playing games. That's 8 hours of general computing where I could save 100 watts with a MID-RANGE (not even a high-end) card...

8 hours of 100 watts saved = 0.8 kWh power reduction per day... 292 kWh per year... multiply by the low, low price of 12 cents per kWh (remember, lowest in Texas!) and it comes out to $35.04 a year.
Multiply by the Texas average price of 15 cents and you get $43.80...
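
If you want to plug in your own rate and hours, here is a quick sketch of the same arithmetic in Python (the function name and the example scenarios are just mine for illustration, nothing official):

```python
# Back-of-the-envelope sketch of the savings math above; names and scenarios are illustrative.
def annual_savings(watts_saved, hours_per_day, price_per_kwh):
    """Dollars saved per year by powering down the discrete GPU for the given hours."""
    kwh_per_day = watts_saved / 1000.0 * hours_per_day
    return kwh_per_day * 365 * price_per_kwh

# 24/7 machine, 4 hours of gaming a day -> 20 idle hours at 100 W saved, 12 cents/kWh
print(annual_savings(100, 20, 0.12))  # ~87.60
# My own usage: 8 hours a day of non-gaming desktop time
print(annual_savings(100, 8, 0.12))   # ~35.04
```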

That is with no change whatsoever to the computer usage... all I would need to do for this is buy a compatible mobo and video card and install the driver.

So while this is not good justification for upgrading, it IS a good reason to choose what to upgrade to.
 

shabby

Diamond Member
Oct 9, 1999
5,782
45
91
Originally posted by: taltamir
nVidia:
+ increase performance
+ turn off GPU
- single monitor only.


Seems like AMD misses the mark yet again.

And so did nvidia...
 

CP5670

Diamond Member
Jun 24, 2004
5,660
762
126
Why would you want to turn it off if it supports multiple monitors either way? :confused:

From my point of view, one problem with Crossfire is that the compositing engine can still only output at 350MHz over analog as far as I know, which means I can't max out my monitor on it (I can only use 2048x1536 at 75Hz instead of 85Hz). To be fair though, this won't affect most people, and I wouldn't consider any sort of multi-card setup again anyway, after my experience with SLI.
 

nismotigerwvu

Golden Member
May 13, 2004
1,568
33
91
It's 1st-generation technology (not in multiple cards but in mixed cards); compromises have to be made. As long as it drops the clocks in desktop mode I'd be happy, as I'm sure the card being on is what allows multi-monitor anyway.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: shabby
Originally posted by: taltamir
nVidia:
+ increase performance
+ turn off GPU
- single monitor only.


Seems like AMD misses the mark yet again.

And so did nvidia...

Where are you guys getting single monitor only? The attached graphic in the article clearly shows 2 monitors from the mainboard outputs, although the 2nd will most likely be VGA so analog only to a 2nd LCD.

It all sucks imo, and it's something no one with a high-end GPU will even bother with. It's just a marketing gimmick to increase % share of GPUs in the market imho. Most with a high-end GPU would settle for a driver/OC'ing utility that properly throttled in 3D vs. 2D/desktop mode.
 

gorobei

Diamond Member
Jan 7, 2007
3,960
1,444
136
did you read the article? http://www.anandtech.com/tradeshows/showdoc.aspx?i=3193


The performance benefit only applies if you are running a discrete card of equivalent performance to the IGP (i.e. crappy low-end cards, so you now have the equivalent of 2 crappy cards that kinda come close to the lowest mid-range card's performance). Performance in general drops with a higher-end GPU as it is saddled with the IGP crossover process.

The main objective with high-end cards is to be able to shut down the 200W-guzzling idle draw of an 8800-series SLI setup so they can claim to save power. (Most high-end gamers don't care how much power they are using unless it affects OC temps or the PSU they need to buy.) It's supposed to throttle up the dedicated card when the system calls for 3D processing, meaning your expensive GPU gets to heat up and cool down hundreds of times rather than staying at a nice constant temp during 2D idle.

All of this is just a play to make nvidia motherboards more exclusive and justify their existence.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: gorobei
did you read the article? http://www.anandtech.com/tradeshows/showdoc.aspx?i=3193


The performance benefit only applies if you are running a discrete card of equivalent performance to the IGP (i.e. crappy low-end cards, so you now have the equivalent of 2 crappy cards that kinda come close to the lowest mid-range card's performance). Performance in general drops with a higher-end GPU as it is saddled with the IGP crossover process.

The main objective with high-end cards is to be able to shut down the 200W-guzzling idle draw of an 8800-series SLI setup so they can claim to save power. (Most high-end gamers don't care how much power they are using unless it affects OC temps or the PSU they need to buy.) It's supposed to throttle up the dedicated card when the system calls for 3D processing, meaning your expensive GPU gets to heat up and cool down hundreds of times rather than staying at a nice constant temp during 2D idle.

All of this is just a play to make nvidia motherboards more exclusive and justify their existence.

Not sure who this was directed to, but yes, I did read the article. My point is that no one with a high-end GPU is going to bother with it, for many of the reasons you pointed out, but also that they don't really care how much power their card is using as long as it's giving them the best possible performance without sacrifice.

This multi-GPU solution has a LOT of question marks in that regard:
1) passing more data through an already hot and underwhelming MCP,
2) forcing the frame buffer to pass through much slower system RAM rather than the main card's GDDR, and
3) decreasing system memory by at least 256MB, which is especially significant for 32-bit OS users.

So, given the choice between losing 256MB of system RAM, a potential performance hit from saturating the PCI-E bus, or instability with FSB overclocks due to extra heat in the NB, the solution would be to simply disable the IGP via the BIOS and run as if there were no IGP at all.

But like I said, most high-end GPU users would simply settle for a driver/profile that allowed you to properly throttle clock speeds in 2D/desktop mode vs. 3D mode, which would significantly cut down idle power use. Up until the 163 drivers you could do this reliably in RivaTuner, but the latest drivers broke that as well, forcing you to run 2D and 3D at linked clock speeds.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: gorobei
did you read the article? http://www.anandtech.com/tradeshows/showdoc.aspx?i=3193


The performance benefit only applies if you are running a discrete card of equivalent performance to the IGP (i.e. crappy low-end cards, so you now have the equivalent of 2 crappy cards that kinda come close to the lowest mid-range card's performance). Performance in general drops with a higher-end GPU as it is saddled with the IGP crossover process.

The main objective with high-end cards is to be able to shut down the 200W-guzzling idle draw of an 8800-series SLI setup so they can claim to save power. (Most high-end gamers don't care how much power they are using unless it affects OC temps or the PSU they need to buy.) It's supposed to throttle up the dedicated card when the system calls for 3D processing, meaning your expensive GPU gets to heat up and cool down hundreds of times rather than staying at a nice constant temp during 2D idle.

All of this is just a play to make nvidia motherboards more exclusive and justify their existence.

I am a high-end gamer and I care... 200 watts is about 50 cents a day... so dropping the 200-watt idle draw of a beastly 8800-series card means saving roughly $180 a year if you leave your comp on 24/7. (50 cents a day based on 12 cents per kWh, which is the LOWEST available in Texas and what I pay... the Texas average is 14 cents plus extra charges, aka 15 cents per kWh.)

However, I think it will probably only drop 100-150 watts in reality. But that is still a lot.
A 100-watt drop, assuming your comp is on 24/7 and you use the video card for an average of 4 hours a day, gives you 20 hours at 100 watts, aka 2 kWh a day. That's 730 kWh a year, and at 12 cents per kWh that's $87.60 a year...

$87.60 minimum savings per year for a person who uses his computer 24/7 is the most tangible savings I have EVER seen from an energy-efficient product.
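
Here's the same arithmetic done inline for the beastly-card case (the 20 idle hours, the 150-200 W figures, and the 12 cents/kWh rate are just the assumptions from above):

```python
# Inline version of the same math for a high-end card; all figures are the assumptions above.
price = 0.12        # $/kWh, the lowest rate I can get in Texas
idle_hours = 20     # 24/7 machine minus ~4 hours of gaming a day
print(0.200 * idle_hours * 365 * price)  # ~175.2 -> the roughly-$180-a-year case
print(0.150 * idle_hours * 365 * price)  # ~131.4 for the more realistic 150 W drop
```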
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
It's much less than that in reality if you have S1 standby enabled, which it is by default both on your board and in Windows. My PC is snoozing about an hour after I am, or after I head to work.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Except that I have seen the following happen on standby (each happened to me MORE than once), ranked from most to least severe:
1. OS corruption due to standby and auto-update/setup not playing nice (Windows goes into standby while something that messes with OS files is being installed... for example, SP1 for Microsoft Visual Studio took an average person 4 hours to install... sometimes more).
- I know this isn't SUPPOSED to happen, there are safeties to prevent it from going into standby when it shouldn't, but it happens nonetheless.
2. Standby entered while a P2P application was running, causing data corruption.
3. Various bugs caused by standby (i.e., mouse/monitor/etc. not working when coming back from standby).
4. Downloads (not P2P; the IE download window, Firefox, etc.) disrupted by standby.
5. Network file sharing disrupted by standby.
6. Distributed computing doesn't work in sleep mode, so there is absolutely no reason to use it.

Standby is the buggiest and most unstable aspect of Windows to date. I disable it on every computer I handle. Every now and then I try it again and it ruins my stuff again.


Regardless, if I am not using my computer I don't want it in sleep mode; I simply turn it off OR hibernate it. The savings for a person who USES his computer many hours a day are significant as well...

Yesterday I played games for 6 hours and spent 10 hours doing other things with my computer; for the other 8 hours it was off (I slept 5 hours and did other things for 3 hours)... with hybrid SLI saving 100 watts while in 2D mode, that would have saved me 12 cents during those 10 hours of general usage... in ONE DAY. Now, that day was unusual in that I played 6 hours... I usually play less (and sleep more), and use less general computing...

Multiply by 365 and it comes out to $43.80 in extra savings for a person who turns off his computer when not using it... So a person who leaves the comp on 24/7 would save a MINIMUM of $80+, with probably a max of $150 per card if they have something truly beastly.

A person who turns his computer OFF when not in use and uses it 10 hours a day NOT playing games saves $43.80+ a year (again, assuming the lowest-priced electricity in Texas, not even the average cost, much less the most expensive).
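
And for the turn-it-off-when-not-in-use case, the same math with my own day plugged in (10 non-gaming hours, 100 W saved, 12 cents/kWh; again just the numbers from this post):

```python
# Same arithmetic for someone who powers the machine off when not in use.
price = 0.12          # $/kWh
desktop_hours = 10    # hours per day the machine is on but not gaming
per_day = 0.100 * desktop_hours * price
print(per_day)        # ~0.12 -> the 12 cents a day mentioned above
print(per_day * 365)  # ~43.80 -> the $43.80-a-year figure
```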
 

mruffin75

Senior member
May 19, 2007
343
0
0
Originally posted by: taltamir
So AMD's hybrid CF was announced some months ago, but they were quite clear that the first generation of it will NOT support turning off your video card while on the desktop.

nVidia just announced their version of it, and theirs WILL...

AMD:
+ increase performance
+ multiple monitors
- No turning off GPU

nVidia:
+ increase performance
+ turn off GPU
- single monitor only.


Seems like AMD misses the mark yet again.

Turning off the GPU is really only useful for laptops (yes, it can be handy for desktops, but not as much)...

I'd much rather have multiple monitors myself... seems like Nvidia has missed it, actually. Theirs will be better for laptops (but then again, how much of an increase do you think you'll actually get?)
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Turning off the GPU is really only useful for laptops (yes, it can be handy for desktops, but not as much)...

So... I updated the post with some math, but basically speaking, you can realistically save $35 (8 hours of desktop use a day with the lowest electricity cost in TX) to $350+ (2 beastly video cards on a computer that is on 24/7 with an average of 4 hours a day of game play and the lowest electricity cost in Texas) a year with hybrid SLI, depending on your usage. That's with REALISTIC usage and REALISTIC power prices (i.e., I am always assuming you have the absolute cheapest power, not even the average, and realistic hours of operation). That's way more than you would save with a laptop... personally, if I am gonna use a laptop for more than 10 minutes I have it plugged into a power source anyway. So I would consider this a greater benefit to desktops than laptops.
 

mruffin75

Senior member
May 19, 2007
343
0
0
Originally posted by: taltamir
Yesterday I played games for 6 hours and spent 10 hours doing other things with my computer; for the other 8 hours it was off (I slept 5 hours and did other things for 3 hours)... with hybrid SLI saving 100 watts while in 2D mode, that would have saved me 12 cents during those 10 hours of general usage... in ONE DAY. Now, that day was unusual in that I played 6 hours... I usually play less (and sleep more), and use less general computing...

Ahah! So that's why you've got so much time writing junk on here! :)

Anyone who can play games for 6 hours a day is not going to care about power consumption...their parents are paying the power bill anyway! :)

And I'm assuming if you were sleeping more, the computer would be turned off more (since you're such a power miser)... so any power savings with hybrid SLI would be useless then, as the computer would be off anyway.

The only time hybrid SLI would be of any benefit would be the "general computing" time you mentioned... but most people don't have 16 hours a day to play on their computer; they have this unusual thing called "work" that gets in the way :) And work computers usually don't even have a dedicated video card, so there goes the power saving! MOST people only have a couple of hours a day to use their computer at all, so the power-saving features of hybrid SLI are, again, useless.

Now, on the other hand, the multi-monitor support from ATI would come in VERY handy in those couple of hours...


 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: mruffin75
Originally posted by: taltamir
Yesterday I played games for 6 hours and spent 10 hours doing other things with my computer; for the other 8 hours it was off (I slept 5 hours and did other things for 3 hours)... with hybrid SLI saving 100 watts while in 2D mode, that would have saved me 12 cents during those 10 hours of general usage... in ONE DAY. Now, that day was unusual in that I played 6 hours... I usually play less (and sleep more), and use less general computing...

Ahah! So that's why you've got so much time writing junk on here! :)

Anyone who can play games for 6 hours a day is not going to care about power consumption...their parents are paying the power bill anyway! :)

Umm, he posted on 1/7. A Monday. Stating that "yesterday I played games for 6 hours etc. etc." Which would be a "Sunday". No?

 

mruffin75

Senior member
May 19, 2007
343
0
0
Originally posted by: keysplayr2003

Umm, he posted on 1/7. A Monday. Stating that "yesterday I played games for 6 hours etc. etc." Which would be a "Sunday". No?

He uh... well...err... could work on weekends? ;)

But still that wasn't really the point of the post... :)


 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
How hard is it to turn off your PC when you go to work or sleep? All I have to do is hit the power button when I leave and it shuts down. Hit the button again when I'm back and in 30 seconds it's running again. Is it really so hard to do?
 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
Originally posted by: munky
How hard is it to turn off your PC when you go to work or sleep? All I have to do is hit the power button when I leave and it shuts down. Hit the button again when I'm back and in 30 seconds it's running again. Is it really so hard to do?
Yes. :p

I have Vista's sleep mode working in conjunction with Media Center. The computer automatically wakes itself 8 min. before a scheduled recording and goes back to sleep immediately after.

The trick to getting sleep mode working was in tracking down the bugs and doing stability testing. A BIOS update (ver. F6 -> F9) fixed the random lockups in sleep mode. The F10 BIOS then broke sleep mode, so I had to revert back to F9. But if you boot and test sleep mode to make sure that it will work 100% of the time and not just by sheer luck, it's very reliable.

Disabling hybrid sleep mode (i.e. where all RAM is copied to HDD) and getting a UPS really cut down on the length of time needed to sleep and resume. Now it takes just a few seconds. And most of my computer still shuts down: the processor enters a very low power state, the fans shut off (even to the PSU), the video card turns off, etc.

I have my PC in my room, and this REALLY cuts down on the heat.

nVidia's new SLI modes are interesting, but (as I said in the article's comments), they really need to fix the fundamental flaws in normal SLI before multi-GPU stuff will really take off. Users should not require hacks and workarounds (even if they are in nice profiles) just to get SLI working in certain games. Support should be a part of the API. Why can't nVidia, ATI, and Microsoft get this stuff sorted out? R700 is supposed to be multi-core, right?
 

thilanliyan

Lifer
Jun 21, 2005
12,040
2,256
126
Originally posted by: taltamir
EDIT:
People keep on arguing that power reduction is only useful in laptops so let me use some MATH to back up what I say...

8 hours of 100 watts saved = 0.8 kWh power reduction per day... 292 kWh per year... multiply by the low, low price of 12 cents per kWh (remember, lowest in Texas!) and it comes out to $35.04 a year.
Multiply by the Texas average price of 15 cents and you get $43.80...

What exactly is the point of the math you brought up?

If you need to save $45 a year you shouldn't be gaming on an 8800 card anyway.
 

mruffin75

Senior member
May 19, 2007
343
0
0
Originally posted by: thilan29
Originally posted by: taltamir
EDIT:
People keep on arguing that power reduction is only useful in laptops so let me use some MATH to back up what I say...

8 hours of 100 watts saved = 0.8 kWh power reduction per day... 292 kWh per year... multiply by the low, low price of 12 cents per kWh (remember, lowest in Texas!) and it comes out to $35.04 a year.
Multiply by the Texas average price of 15 cents and you get $43.80...

What exactly is the point of the math you brought up?

If you need to save $45 a year you shouldn't be gaming on an 8800 card anyway.

That's also assuming you have an average of 8 hours a day of gaming...which is way too high.
 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
Originally posted by: mruffin75
Originally posted by: thilan29
Originally posted by: taltamir
EDIT:
People keep on arguing that power reduction is only useful in laptops so let me use some MATH to back up what I say...

8 hours of 100 watts saved = 0.8 kWh power reduction per day... 292 kWh per year... multiply by the low, low price of 12 cents per kWh (remember, lowest in Texas!) and it comes out to $35.04 a year.
Multiply by the Texas average price of 15 cents and you get $43.80...

What exactly is the point of the math you brought up?

If you need to save $45 a year you shouldn't be gaming on an 8800 card anyway.

That's also assuming you have an average of 8 hours a day of gaming...which is way too high.

...

:Q
 

JimiP

Senior member
May 6, 2007
258
0
71
Originally posted by: thilan29
Originally posted by: taltamir
EDIT:
People keep on arguing that power reduction is only useful in laptops so let me use some MATH to back up what I say...

8 hours of 100 watts saved = 0.8 kWh power reduction per day... 292 kWh per year... multiply by the low, low price of 12 cents per kWh (remember, lowest in Texas!) and it comes out to $35.04 a year.
Multiply by the Texas average price of 15 cents and you get $43.80...

What exactly is the point of the math you brought up?

If you need to save $45 a year you shouldn't be gaming on an 8800 card anyway.

Basically what he's trying to say is, why spend the extra money if you don't have to? Just because he's gaming with an 8800 GPU doesn't mean that he can't try to save some money.
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
I don't know about you, but I spend very little time with my Wintendo in 2D mode. And as another poster pointed out, for people who do, the hundreds of extra heat-up/cool-down cycles may just eat up the energy savings in equipment failures.

Need a 24/7 server box? Then don't stick an 8800GT into it. Get a low-power VIA or Core/Celeron-based box with a nice energy-efficient chipset and call it a day.
 

thilanliyan

Lifer
Jun 21, 2005
12,040
2,256
126
Originally posted by: JimiP
Basically what he's trying to say is, why spend the extra money if you don't have to? Just because he's gaming with an 8800 GPU doesn't mean that he can't try to save some money.

I suppose.
 

TC777

Member
May 12, 2005
62
0
0
Originally posted by: JimiP

Basically what he's trying to say is, why spend the extra money if you don't have to? Just because he's gaming with an 8800 GPU doesn't mean that he can't try to save some money.

No, it's more like this thread is just another Nvidia fanboy ATI/AMD-bashing thread. Fact is, ATI has PowerPlay in their new cards for saving power when it's not needed.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
@mruffin75: No, I am assuming 8 hours of regular usage that is NOT gaming (i.e., talking on the AnandTech forum, IRC, porn, webcomics, chat, AIM, etc.)... the 4 hours of gaming a day I assumed has 0 savings... this does not reduce cost while gaming, only while NOT gaming.

Originally posted by: tcool93
Originally posted by: JimiP

Basically what he's trying to say is, why spend the extra money if you don't have to? Just because he's gaming with an 8800 GPU doesn't mean that he can't try to save some money.

No, it's more like this thread is just another Nvidia fanboy ATI/AMD-bashing thread. Fact is, ATI has PowerPlay in their new cards for saving power when it's not needed.

No, I am merely disappointed in AMD's choice to drop the most important feature. I didn't even KNOW nVidia was gonna have any hybrid SLI until recently... I was extremely hyped when I heard a few months ago that AMD was making hybrid CF with the ability to turn off a card's power, and decided because of that that my next card would be an AMD card... but then I got notice that AMD is not gonna support that in the first generation, and then nVidia said they will have something similar that WILL. I went from deciding to buy an AMD card (instead of an nVidia one, because of hybrid CF promising to turn off the video card) back to buying an nVidia card (when hybrid CF said it wouldn't happen at first, and nVidia's version DID)... basically, my next card is gonna be the one with the ability to turn off the video card when not in use.
AMD needs any advantage it can get right now to keep the market competitive; they came up with a feature that was gonna lure me from buying nVidia cards back to buying AMD cards. And then they go and mess it up and nVidia comes up with the same thing done RIGHT. That is a really bad thing for the market, but it just goes to show how mismanaged AMD is right now. (I say mismanaged and not differently prioritized because they are targeting the average Joe to buy a low-end video card + motherboard combo for a hybrid CF setup that increases performance and allows multiple monitors... the average Joe doesn't care or comprehend. It's the hardcore Moe who buys a super expensive card who really cares and who would be more likely to buy such a technology.)

Originally posted by: thilan29
Originally posted by: JimiP
Basically what he's trying to say is, why spend the extra money if you don't have to? Just because he's gaming with an 8800 GPU doesn't mean that he can't try to save some money.

I suppose.

What's with the "if you can afford an expensive card you can afford to flush money down the toilet" attitude? I know some guys (knew them from high school) who are now on WELFARE and buy top-end computer parts to play in their TRAILER (literally). They don't do anything but play video games anyway.

@nullpointerus: If you had to upgrade and then downgrade your BIOS just to make sleep mode not crash on you... then it's not perfectly stable... and it still doesn't address the OTHER problems I had with sleep mode... and it's all completely irrelevant if you turn off your computer when not in use... The whole point of turning off your video card when not in use is NOT sleep mode; sleep mode can ALREADY turn off your video card! The point of this is that you could be sitting there using your computer extensively, getting 100% CPU usage from compressing files while reading some forums or articles, and the computer will be fully active, awake, in use, and with the video card turned off (because you are not playing a game, it is not needed).

Originally posted by: mruffin75
Anyone who can play games for 6 hours a day is not going to care about power consumption...their parents are paying the power bill anyway! :)
Even when I lived with my parents I still cared... anyway, I have been living on my own for 5 years now, and I work to pay for my stuff.

Granted, that one was a flawed example since it was a Sunday... but on an average work day I work 6 hours and use the computer for 10, and of those 10, about 8 are general computing... I revised the math in the original post... that was indeed flawed.